Authentication has been one of the hardest problems in security since the early days of the internet. In most cases, we use passwords to authenticate. But passwords cause problems: people choose weak ones, reuse the same password across platforms, or simply give them away to phishing scams. And we don't use them only for websites and applications; we also use them to unlock our mobile phones. Companies like Apple have provided more user-friendly authentication options such as Touch ID and Face ID, which unlock your phone with your biometric data.
Authentication with biometric data is cool, but I'm not a big fan of it. It's easy to force someone's finger onto their iPhone. That's a serious risk for people living under oppressive regimes, or for criminals who want to negotiate after being captured.
What about regular passcodes? A passcode exists only in your mind. But that doesn't mean you are safe: someone can torture you for it, and eventually you will give it up.
What do we need, then? We need an authentication mechanism that can't be extracted even under torture. Assuming we will eventually give in under torture, the mechanism itself should be aware that we are under stress.
An Experiment: Authentication via Body Motions
My base hypothesis was that everyone has a unique wrist motion. I thought we could build a machine learning model around that, and differentiate and authenticate people by analyzing wrist motion data coming from smartwatches. How would this help in the torture case? Since you will be under enormous stress during or after torture, your wrist movements will be different even if you force yourself to act normal. Hence, you won't be able to authenticate.
To test this, I created an Apple Watch application that records my wrist movements when I walk from the kitchen to my computer. I recorded 100 walking events and trained a model on them. My expectation was that when I walked to my computer again, it would correlate the new data with the model and authenticate me, and that when another person walked in, it would detect the anomaly and refuse to authorize. The experiment failed for two reasons:
1) To get a proper model, 100 recordings aren't enough; you may need 500. This kills the usability.
2) It turned out it's not that hard to mimic someone else's movement. This kills the security, and with it the entire concept.
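To make the idea concrete, here is a minimal sketch of the kind of one-class anomaly check such a system could use: build a per-feature mean/standard-deviation profile from the enrollment walks, then flag a new walk whose features deviate too far. The feature names and the z-score threshold are my own illustrative assumptions, not what my Apple Watch app actually computed.

```python
import math

def enroll(samples):
    """Build a per-feature (mean, std) profile from enrollment samples.

    samples: list of feature vectors extracted from walking events,
    e.g. [mean acceleration, step cadence] (hypothetical features).
    """
    n = len(samples)
    dims = len(samples[0])
    means = [sum(s[d] for s in samples) / n for d in range(dims)]
    stds = [math.sqrt(sum((s[d] - means[d]) ** 2 for s in samples) / n) or 1e-9
            for d in range(dims)]
    return means, stds

def is_anomalous(profile, sample, z_threshold=3.0):
    """Flag the sample if any feature deviates more than z_threshold sigmas
    from the enrolled profile."""
    means, stds = profile
    return any(abs(sample[d] - means[d]) / stds[d] > z_threshold
               for d in range(len(means)))

# Enroll on a few walks, then score a familiar and an unfamiliar one.
profile = enroll([[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [1.0, 2.0]])
print(is_anomalous(profile, [1.0, 2.0]))  # resembles enrollment
print(is_anomalous(profile, [5.0, 2.0]))  # clearly different gait
```

Both failure modes above show up directly in this sketch: with too few enrollment samples the standard deviations are unreliable, and an impostor who reproduces the feature values passes the check.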
A Futuristic Approach
We need a microcomputer that goes under the skin and stays there for a long time. This little computer will hold our password (or private key, etc.) for authentication and must be physically destroyed when it leaves the skin. The password will be transferred to the device you want to authenticate to over a wireless medium such as NFC or Bluetooth (or some future protocol).
This computer will also measure the user's heart rate and cortisol levels. If a measurement goes beyond the defined threshold, authentication won't occur until the user calms down. There will be no backdoor to bypass this control. Once this becomes common knowledge, kidnappers won't torture you, since they will know there is no way to trigger the authentication mechanism by force.
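The gating logic described above is simple to state in code. This is a sketch under assumed thresholds; the constant names and the specific cutoff values are hypothetical, not a real device API or medically validated limits.

```python
# Hypothetical stress thresholds for the implant idea (illustrative values).
HEART_RATE_MAX_BPM = 100    # assumed ceiling for a calm heart rate
CORTISOL_MAX_NMOL_L = 550   # assumed ceiling for a calm cortisol level

def may_authenticate(heart_rate_bpm, cortisol_nmol_l):
    """Release the stored credential only when both stress signals are
    below their thresholds. There is deliberately no override path:
    under stress, the function simply refuses."""
    return (heart_rate_bpm <= HEART_RATE_MAX_BPM
            and cortisol_nmol_l <= CORTISOL_MAX_NMOL_L)

print(may_authenticate(70, 300))   # calm user: credential may be released
print(may_authenticate(140, 900))  # stressed user: authentication refused
```

The key design choice is that the check is conjunctive and has no bypass parameter at all, mirroring the "no backdoor" requirement in the text.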
More Realistic Approaches
We don't know when we will have implantable microcomputers with heart rate sensors. But some things can be done with today's technology (or in the near future).
Touch ID with heart rate measurement: Apple Watches and various other devices can already measure heart rate from outside the skin. So when a user tries to authenticate with a fingerprint, the device should check their heart rate and refuse if it's above the defined threshold.
Face ID with a crazy machine learning model: Imagine a machine learning model that can tell your normal face from a stressed one. This may be possible in the near future. When you try to unlock your phone with a stressed face, the device won't let you.
Panic button: When the user thinks zhe's about to be arrested (or kidnapped), zhe presses a button that locks all authorization attempts on all devices. This can't be undone by the user zhimself. The user should define three trusted people who can trigger the unlock event; the devices unlock only when all three of them agree.
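The panic-button scheme can be sketched as a small state machine: once the user triggers the lock, only unanimous approval from all three trusted contacts releases it, and the user has no path back in. The class and method names here are my own illustration of the scheme, not any existing API.

```python
class PanicLock:
    """Sketch of the panic-button scheme described above."""

    def __init__(self, trusted_contacts):
        # The scheme as described requires exactly three trusted people.
        assert len(trusted_contacts) == 3
        self.trusted = set(trusted_contacts)
        self.locked = False
        self.approvals = set()

    def panic(self):
        """User-triggered: lock everything and reset any prior approvals."""
        self.locked = True
        self.approvals.clear()

    def approve_unlock(self, contact):
        """A trusted contact votes to unlock; strangers are ignored."""
        if contact in self.trusted:
            self.approvals.add(contact)
        return self.is_unlocked()

    def is_unlocked(self):
        """Unlock only on unanimous approval; the user cannot self-unlock."""
        if self.locked and self.approvals == self.trusted:
            self.locked = False
        return not self.locked

lock = PanicLock(["alice", "bob", "carol"])
lock.panic()
lock.approve_unlock("alice")
lock.approve_unlock("bob")
print(lock.is_unlocked())  # two of three approvals: still locked
lock.approve_unlock("carol")
print(lock.is_unlocked())  # unanimous: unlocked
```

Note there is intentionally no `unlock()` method callable by the user, which is the whole point of the design: the coercion target holds no key that force can extract.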