One cloud security specialist featured at InfoSec this year argued that, despite the industry focusing its marketing on a string of discrete events, real cyber security requires an ongoing, permanent process, matching the persistent nature of the attacks, rather than a reactive program.
They referred to Verizon’s 2014 Data Breach Investigations Report (DBIR), which found that the number of confirmed security incidents grew by 34 percent from 2012 to 2013, a year Verizon tagged ‘the year of the retailer breach’ for its large-scale attacks on payment card systems.
Most breaches could be classified under one of nine basic incident patterns, and for many of these Verizon’s ‘Recommended Controls’ include the use of two-factor authentication and the monitoring of suspicious traffic patterns. As the report puts it, “The writing’s on the wall for single-factor, password-based authentication on anything Internet-facing”.
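To make the two-factor recommendation concrete, here is a minimal sketch of how a time-based one-time password (TOTP, RFC 6238) is computed, the mechanism behind most authenticator apps. This is an illustrative implementation, not any vendor's product; the function name and parameters are our own.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, at=None, digits=6, step=30):
    """Illustrative TOTP (RFC 6238) code generator using HMAC-SHA1.

    secret_b32: the shared secret, base32-encoded (as in most QR enrolments).
    at: Unix timestamp to compute the code for (defaults to now).
    """
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

The point of the time step is that a stolen code is only useful for about 30 seconds, which is why two-factor authentication blunts the credential-theft attacks that dominate the DBIR's Internet-facing breach patterns.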
It was claimed that 97 percent of all attacks could have been prevented using relatively low-level standard controls. That figure is worth considering alongside the Ponemon Institute’s 2014 Cost of Data Breach Study, which estimates the average cost of a breach at $145 per record, with an average annualised total cost per breach of $5.85m in the US and $3.68m in the UK.
Another exhibitor outlined what they called the ‘new threat landscape’. As PwC’s Global State of Information Security Survey 2014 put it, “You can’t fight today’s threats with yesterday’s strategies”. CEOs were typically more confident (84 percent) in their organization’s security capabilities than the average respondent (74 percent). However, PwC estimated that just 17 percent of organizations could be classed as ‘true leaders’ in this respect, and even with improved capabilities, respondents still reported that security incidents had risen by 33 percent.
According to this exhibitor’s research, attackers were typically present on a victim’s network for an average of 229 days before being detected. And although Verizon’s 2014 DBIR report marked the first year when internal fraud detection had outnumbered external detection, the usual pattern had been for the victim of a breach to be first notified of it by a third party.
Another cyber security specialist highlighted a worrying survey of IT security professionals that showed that 57 percent of respondents did not think their organization was protected from advanced cyber attacks. In terms of their grasp of the overall risk, only 41 percent believed they had a good understanding of the threat landscape facing their company, with 37 percent not sure whether they had lost sensitive or confidential information as a result of a cyber attack. Of those who had lost such information, 35 percent did not know exactly what data had been stolen.
One vendor quoted Gartner’s recommendation that “All organizations should now assume they are in a state of continuous compromise”, while another drew attention to the findings in InfoSecurity’s own European Information Security Survey 2014, based on responses from information security professionals gathered from December 2013 to March 2014. Phishing attacks were identified as the biggest risk, closely followed by attacks motivated by financial gain, which meant that banks in particular had to do more to protect themselves and their customers.
Returning to the issue of trust, although 58.6 percent of respondents thought that the Snowden revelations had improved their business’s ability to understand potential threats, 49.6 percent said they were somewhat or much less likely to trust US technology companies with sensitive information.
In the light of this, Google’s Keynote Presentation on ‘Securing and Protecting User Information Online’ was likely to be keenly followed, and that’s what we’ll look at in our next blog.