The increasing number of security breach incidents in recent times has highlighted phishing (especially spear phishing) as a key risk for which organizations need to develop specific control measures. People controls are a key focus area because phishing attacks target unsuspecting users. Of course, technology and process controls also need to be deployed, but they will be ineffective if the people controls are weak.
The security industry has used the concept of “build”, “test” & “fix” since its inception. A lot of time and money is spent identifying vulnerabilities in technologies and processes, with the idea that flaws will be found and fixed. However, there aren’t many ways to test people, which is why people (users) have been called the weakest link in security.
The most effective people control against phishing is user education. We educate users on the risks of phishing, how it happens, how to identify phishing attempts, and so on. More often than not, this tends to be generic training. Such training is not as effective as training tailored specifically to the user’s behavior pattern. For example, I don’t learn much by watching Federer play, but I learn a lot more when my tennis coach observes my play and gives me specific pointers and corrections.
Aujas has used its Phishing Diagnostic Solution (Phishnix) in several organizations to capture and analyze user behavior against phishing attacks. Phishnix focuses on the “Fall rate” and “Fail rate” based on actual employee behavior and helps organizations to develop specific control measures to address potential problem areas.
In a typical phishing attack, the target is enticed to read an email, visit a website and reveal information. A common misconception is that the attack is successful only if the target reveals information. This is not true. An attacker essentially looks for information to plan the next move, and user actions can provide it even when no private data is revealed. For instance, by the mere act of visiting the malicious website, the target reveals information that can be used for fingerprinting and for understanding what kind of lure attracts targets.
Phishnix has performed behavior analysis across thousands of users, across multiple industry segments using various attack patterns. These attacks are designed to enable observant targets to easily identify a phishing attempt. The attacks are of specific interest to the target while still being generic in nature.
Fall Rate is defined as the percentage of users (targets) who “fall” for the attack and visit the fake website.
Fail Rate is defined as the percentage of users (targets) who “fail” the attack: they visit the fake website and go on to reveal sensitive information.
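As a minimal sketch of how these two definitions translate into computation (the record format and data here are hypothetical, not part of Phishnix), both rates can be derived from per-target test records:

```python
# Hypothetical sketch: computing Fall and Fail Rates from test records.
# Each record notes whether a target visited the fake site ("fell")
# and whether they went on to reveal information ("failed").

def fall_rate(records):
    """Percentage of all targets who visited the fake website."""
    return 100 * sum(r["visited"] for r in records) / len(records)

def fail_rate(records):
    """Percentage of all targets who visited AND revealed information."""
    return 100 * sum(r["visited"] and r["revealed"] for r in records) / len(records)

# Toy data: 100 targets; 61 visit the fake site, 21 of those reveal credentials.
records = (
    [{"visited": True, "revealed": True}] * 21
    + [{"visited": True, "revealed": False}] * 40
    + [{"visited": False, "revealed": False}] * 39
)

print(fall_rate(records))  # 61.0
print(fail_rate(records))  # 21.0
```

Note that Fail Rate is computed against all targets, not just those who fell, so it can never exceed the Fall Rate.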
The following Fall and Fail Rate statistics were compiled using Phishnix:
The two ends of the spectrum, Best and Worst, are results from specific tests, whereas the Average is computed across all tests. All percentages are computed against the total number of target users in each test.
The average Fall Rate of 61% is startlingly high once you realize that some percentage of users took no action at all. Mostly, this inactivity is not because the user spotted the attack; it is more likely that they have an email backlog or simply hadn’t seen the email yet. As mentioned earlier, visiting a malicious website by itself gives attackers enough knowledge to plan their next move.
The average Fail Rate of 21% is a huge concern, as these users revealed sensitive information such as passwords. 65% of the users who “fell” didn’t “fail”, which means that either they realized it was an attack (the most likely case) or simply didn’t proceed further due to other priorities. This highlights the need for tailored user education that leverages the “teaching moment” created. If we educate users with examples of good and bad behavior based on their own actions, retention of that knowledge will be far greater than retention of knowledge gathered from generic training.
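The 65% figure follows directly from the two averages. Treating the rounded averages as exact for illustration:

```python
# Quick arithmetic check of the figures above: of the 61% of targets
# who fell, 21 percentage points also failed, so the share of
# "fallers" who stopped short of revealing data is:
fall, fail = 61, 21
recovered = (1 - fail / fall) * 100  # fallers who did not reveal data

print(f"{recovered:.1f}%")  # 65.6%, i.e. roughly the 65% cited
```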
These statistics change as we perform follow-up tests and/or vary the test parameters. For instance, Fall and Fail Rates increase as the services (e.g. social networking) and end devices (e.g. mobiles) change. They also increase when tests are conducted around specific events (e.g. a 9/11 memorial or company-specific events). Organizations can learn about user behavior and adjust their control strategies against Fall and Fail Rate benchmarks. For example, if the Fail Rate is high in a specific department or location and doesn’t decrease after education, then technology or process controls need to be enhanced.
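One way to operationalize that last point is to compare post-training Fail Rates per department against an acceptable threshold. The sketch below is purely illustrative; the threshold, department names and data are assumptions, not Phishnix output:

```python
# Hypothetical sketch: flag departments whose Fail Rate stays high after
# a round of training, suggesting technology/process controls instead.
from collections import defaultdict

THRESHOLD = 15.0  # assumed acceptable Fail Rate, in percent

def fail_rates_by_dept(results):
    """results: list of (department, failed: bool), one entry per targeted user."""
    totals, fails = defaultdict(int), defaultdict(int)
    for dept, failed in results:
        totals[dept] += 1
        fails[dept] += failed
    return {d: 100 * fails[d] / totals[d] for d in totals}

# Toy post-training results: 30 targets per department.
post_training = (
    [("Finance", True)] * 9 + [("Finance", False)] * 21
    + [("HR", True)] * 2 + [("HR", False)] * 28
)

for dept, rate in fail_rates_by_dept(post_training).items():
    if rate > THRESHOLD:
        print(f"{dept}: {rate:.0f}%, escalate beyond training")  # Finance: 30%
```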
Risk management and control mechanisms against social engineering attacks need to be dynamic to keep up with evolving security risks. Benchmarking is the first step in analyzing and improving metrics. Since you can’t improve what you can’t measure, tracking Fall and Fail Rates becomes critical for any organization interested in driving positive changes in user behavior.