It should come as no surprise that breaches continue to occur despite the presence of two-factor authentication (2FA) controls. The latest well-publicised breach of this kind is, of course, Facebook. Such breaches can be very damaging both to the identities affected and to the provider of the affected service. Consumers are often lulled into a false sense of security by the presence of a 2FA process, believing that additional authentication challenges are all that is needed to protect their identities from misuse.
One of the problems here is the misconception that the holy grail of authentication, and of securing identities, is simply to deploy 2FA controls. Organisations accustomed to requiring only a username and password rush to introduce an additional factor, believing that 2FA will be enough to protect them and their users from attack. However, not all 2FA implementations should be considered equal.
One of the main concerns when implementing additional security controls of any kind is the balancing act between usability and security, especially for a consumer-facing application.
A poor user experience, no matter how secure, will inevitably reduce user uptake of a service and discourage repeat use if the friction is too high.
With this in mind, it is perfectly understandable that relatively low-friction, easy-to-deploy second-factor mechanisms (such as SMS) are chosen. However, deploying such mechanisms without additional intelligence can leave an organisation open to attack, even while it believes it is securing its applications and implementing the correct controls.
Here at SecureAuth we have been advising our customers, and the marketplace in general, that 2FA is not enough. The recent well-publicised breaches give further credence to this position.
Setting aside for a moment the specific attack vectors associated with technologies such as SMS, the way an authentication challenge is presented to an end user deserves discussion. We see second-factor authentication processes delivered to end users without any recognition that legitimate and illegitimate users could both be shown the same challenges. I refer to this scenario as simple, static 2FA. An example would be:
- First factor – password (something you know)
- Second factor – OTP delivered by SMS (something you have)
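To make the "static" part concrete, here is a minimal sketch of such a flow in Python. Everything in it is hypothetical (the user store, `send_sms` gateway stub and OTP store are invented for illustration); the point is that the same challenge is issued to whoever passes the first factor, with no risk checks at all.

```python
import hmac
import secrets

# Hypothetical in-memory stores, purely for illustration.
USERS = {"alice": "correct horse battery staple"}  # plaintext only for the sketch
PENDING_OTPS = {}  # username -> OTP awaiting confirmation


def send_sms(phone_number: str, message: str) -> None:
    # Stand-in for a real SMS gateway call.
    print(f"SMS to {phone_number}: {message}")


def login_first_factor(username: str, password: str) -> bool:
    # First factor: something you know.
    stored = USERS.get(username)
    return stored is not None and hmac.compare_digest(stored, password)


def issue_static_challenge(username: str, phone_number: str) -> None:
    # The challenge is identical for every request: no device, location or
    # behaviour check. Whoever holds the password triggers the SMS.
    otp = f"{secrets.randbelow(1_000_000):06d}"
    PENDING_OTPS[username] = otp
    send_sms(phone_number, f"Your login code is {otp}")


def verify_second_factor(username: str, submitted_otp: str) -> bool:
    # Second factor: something you have. The OTP is single-use.
    expected = PENDING_OTPS.pop(username, None)
    return expected is not None and hmac.compare_digest(expected, submitted_otp)
```

Because nothing in this flow varies with risk, an attacker who has phished the password sees exactly the same OTP prompt as the legitimate user, which is what makes it phishable in turn.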
As usernames and passwords (or any knowledge-based approach) are such low-hanging fruit, we can assume that for any given consumer (or enterprise) application, an identity and password have been compromised somewhere along the way. This leaves just the "something you have" element to protect the identity and act as the user challenge, so the attack vector shifts from the password to the second factor. Malicious attackers can craft various forms of attack to capture OTPs from an end user (typically through phishing) or exploit weaknesses in the architecture of the delivery method – with SMS, for example: SS7 vulnerabilities, IMSI catchers, malware and SIM swaps. The second factor as a deterrent is therefore exactly that: a nuisance to the attacker, not a solution to the problem.
If the same challenge is presented often enough to a determined attacker (static 2FA), the chances are the method will eventually be bypassed, architecture exploits aside.
If we agree that this is a valid, real-world problem, what can we do to reduce this exposure?
Building intelligence into the authentication process now has to be the standard when implementing any 2FA or multi-factor authentication (MFA) method. By utilising adaptive techniques during the authentication process, we can remove the easily attacked static elements and introduce dynamic controls in real time. This allows us to change the authentication process before presenting options to the end user, and to remove or block authentication requests considered high risk.
The decision points that make up the risk are derived from a unique blend of adaptive layers such as anonymous sources, malicious IP addresses, geo controls, device controls, unusual access patterns and privileged access accounts, to name a few. Once these adaptive layers are passed, the authentication controls can responsibly be presented to the user. If, for example, SMS is to be used, then at least deploy protection around the SMS OTP delivery process by utilising the above adaptive layers and SecureAuth's own phone number fraud prevention checks.
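The adaptive layers above can be sketched as a simple pre-authentication risk score. This is a hypothetical illustration, not SecureAuth's actual engine: the signal names, weights and thresholds are invented to show the shape of the decision (block outright, step up the challenge, or allow) before any factor is presented.

```python
from dataclasses import dataclass


@dataclass
class AuthContext:
    """Hypothetical signals gathered before any challenge is shown."""
    ip_on_threat_list: bool       # anonymiser / known-malicious IP feed
    country_allowed: bool         # geo policy check
    device_recognised: bool       # previously registered device fingerprint
    unusual_access_pattern: bool  # e.g. impossible travel, odd hours
    privileged_account: bool      # higher-value target, stricter bar


def risk_score(ctx: AuthContext) -> int:
    # Illustrative weights only; a real deployment would tune these.
    score = 0
    if ctx.ip_on_threat_list:
        score += 60
    if not ctx.country_allowed:
        score += 30
    if not ctx.device_recognised:
        score += 20
    if ctx.unusual_access_pattern:
        score += 25
    if ctx.privileged_account:
        score += 10
    return score


def authentication_decision(ctx: AuthContext) -> str:
    score = risk_score(ctx)
    if score >= 60:
        return "deny"      # block the attempt before any challenge is sent
    if score >= 20:
        return "step_up"   # require a stronger or additional factor
    return "allow"         # low risk: proceed with the normal flow
```

The key design point is that the decision runs before the OTP is ever delivered, so a high-risk request never even triggers the SMS that an attacker could intercept.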
SecureAuth’s multi-factor authentication solutions offer the richest pre-authentication adaptive risk analysis and authentication options available, protecting the identity and the authentication process while keeping the user experience low-friction and secure. This is backed up by the findings of Gartner’s recent Access Management Magic Quadrant report, in which Gartner noted that SecureAuth has “the broadest set of adaptive access features.”
Without a doubt, a 2FA process is better than a username and password alone. However, if it is implemented poorly, without the appropriate checks before presenting authentication challenges, the risk of attack and breach remains. Feeling the job is done because static 2FA is in place is a dangerous position to be in.
Two-factor and multi-factor authentication solutions should be deployed responsibly, leveraging real-time intelligence, with both the customer and security in mind, not just as a “good enough” practice.