The 12 Days of Cyber Christmas

December 19, 2014

As 2014 draws to a close, we will not make any predictions for 2015. We’ll leave that to astrologers and Time Lords (and we don’t believe in either!). What we can do, however, is learn from the past. As an industry, we haven’t been good at educating those outside it about the key things they should be doing, yet the same basic security measures, either poorly implemented or missing completely, pop up time and again. So here is our take on the 12 days of Christmas: the 12 Days of Cyber Christmas, 12 essential security measures that organisations both large and small should be taking.


On the 12th day of Christmas,
Cyber Security sent to me:
12 remotes removing
11 policies persuading
10 defaults denying
9 apps a patching
8 coders coding
7 keys safeguarding
6 risks a weighing
5 unsolicited pings
4 complex words
3 Varied Pens
2 console dumps
And No View access to SNMP

On the 12th day of Christmas, Cyber Security sent to me: 12 remotes removing

It’s truly shocking how many accounts have remote access to resources which don’t require it. Think about this from both a technical and a “need to know” business case perspective. On the technical side, do you really need Remote Desktop (in Windows) via RDP? If you conclude you really do, make sure to remove the local admin account from RDP and use specific user accounts with strong passphrases (see the 4th Day of Christmas). In the United States Senate report on the 2013 Target breach, published earlier this year, one of the key findings was that Target had given remote access to a third-party vendor which did not appear to be following information security practices.
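Auditing which of your hosts actually answer on the RDP port is a quick first check. A minimal sketch (the host addresses are placeholders, and an open TCP port is only a first approximation of “RDP enabled”, not proof of who can log in):

```python
import socket

def rdp_exposed(host: str, port: int = 3389, timeout: float = 2.0) -> bool:
    """Return True if the host accepts TCP connections on the RDP port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical list of hosts that should NOT expose RDP:
hosts = ["10.0.0.5", "10.0.0.17"]
# exposed = [h for h in hosts if rdp_exposed(h)]
```

Anything the probe flags should then be checked against the business case: who has access, and do they still need it?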

11 policies persuading

As we reported in September, organisations large and small need to put in place security policies, user training and quarterly security KPIs, monitored by their HR teams. If you are looking for somewhere to start in early 2015, physical security and social media usage are two good areas. It’s simply too easy to get into some organisations on the physical side, whilst we’ve seen numerous politicians and would-be politicians forced to resign following faux pas on social media in recent months.

10 defaults denying

The constant revelations from Edward Snowden in 2014 remind us all of the dangers of giving one individual in the organisation too much access to data. According to the Verizon 2014 Data Breach Investigations Report, 88% of security incidents were caused by insider misuse, either accidental or malicious. Review your access permissions to data too, and ensure you have appropriate technical controls in place both to grant access only to appropriate personnel and to audit and track access to data. I’ve been really impressed with Vormetric’s Data Security Platform, which specifically addresses this issue using file-level encryption and key management.

9 apps a patching

According to Secunia’s Vulnerability Review 2014, “the more widespread a program is, and the higher the unpatched share, the more lucrative it is for a hacker to target this program, as it will allow the hacker to compromise a lot of victims”. This is reflected in their research, where Microsoft IE (99% market share) had 270 known vulnerabilities (at their time of publication), whereas Apple Safari (11% market share) had 75 known vulnerabilities. They also discovered that 33% of users running IE were using unpatched browsers, and 14% of users running Apple’s Safari browser were unpatched. Ensure you are patching all your applications, in-house and third-party, regularly in 2015!

8 Coders coding

If you are developing any applications in-house, be sure to encourage secure coding best practices. A wealth of information is available on this subject; the Microsoft Security Development Lifecycle is a good starting point.
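As one small illustration of what secure coding practice looks like in code, the sketch below contrasts a parameterised SQL query with the injection-prone string concatenation it replaces (the table and data are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name: str):
    # Parameterised query: the user-supplied value is bound, never
    # concatenated, so input like "' OR '1'='1" cannot alter the SQL.
    cur = conn.execute("SELECT name, role FROM users WHERE name = ?", (name,))
    return cur.fetchall()

find_user("alice")        # matches the one real row
find_user("' OR '1'='1")  # the injection attempt matches nothing
```

Contrast this with `"... WHERE name = '" + name + "'"`, where the same input would rewrite the query and return every row.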

7 Keys Safeguarded

In the eBay breach announced on 21st May 2014, hackers stole the personal information of 145 million active eBay users after compromising the database containing encrypted passwords. This suggests that there wasn’t proper key management in place to safeguard the data. Remember never to give users access to keys they don’t require. Encryption of data, both in-house and in the cloud, along with key management, is handled very efficiently by Vormetric’s Data Security Platform.

6 Risks a weighing

Any organisation, from SME to large enterprise, should be regularly reviewing its risks. To determine where your resources are best spent on mitigation, weigh the value of the assets you are protecting, the cost of the risks of a breach and the legislation you must comply with, then decide where to spend your resources. A good starting point for a risk management framework is ISO 31000.
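One common way of weighing risks as described above is annualised loss expectancy (ALE = single loss expectancy × annual rate of occurrence). A minimal sketch, with invented figures:

```python
def annualised_loss_expectancy(asset_value, exposure_factor, annual_rate):
    sle = asset_value * exposure_factor  # single loss expectancy per incident
    return sle * annual_rate             # ALE = SLE x annual rate of occurrence

# Hypothetical risk register entries:
risks = {
    "customer database breach": annualised_loss_expectancy(500_000, 0.6, 0.2),
    "laptop theft":             annualised_loss_expectancy(2_000, 1.0, 3.0),
}

# Spend the mitigation budget on the largest expected annual loss first.
priority = sorted(risks, key=risks.get, reverse=True)
```

A mitigation costing more per year than the ALE it removes is, on this simple model, not worth buying; frameworks like ISO 31000 add the qualitative and legal context this arithmetic leaves out.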

5 Unsolicited Pings

When was the last time you did a network and asset discovery exercise on your network? If you handle credit card data you are required to run a quarterly network vulnerability scan. Can you account for all the devices discovered on your network?
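A basic discovery pass can be sketched in a few lines. The subnet below is a placeholder, and a single-port TCP probe is far cruder than a real vulnerability scanner, but it illustrates the reconciliation step against your asset register:

```python
import ipaddress
import socket

def discover(subnet: str, port: int = 80, timeout: float = 0.2):
    """Probe every address in the subnet on one TCP port; return responders."""
    live = []
    for ip in ipaddress.ip_network(subnet).hosts():
        try:
            with socket.create_connection((str(ip), port), timeout=timeout):
                live.append(str(ip))
        except OSError:
            pass
    return live

# Substitute your own ranges, then reconcile against the asset register:
# unknown = set(discover("192.168.1.0/24")) - set(asset_register)
```

Anything left in `unknown` is a device on your network you cannot account for, which is exactly the question above.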

4 Complex Words

If there’s one message we need to get across to end users in eCommerce in 2015, it’s to avoid default, vendor-supplied passwords and weak passwords. As the ICO warned us this November, when a Russian website provided live footage from thousands of webcams across the UK, we should never leave vendor-supplied passwords on our devices. Make sure all the devices on your network have strong passphrases, comprising not just words but phrases, containing numbers and special characters. An example would be: MerryChr1stmas2You! Remember also to use a different phrase per device and logon account.
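The passphrase advice above can be encoded as a rough screening rule; the thresholds here are assumptions for illustration, not a standard:

```python
import re

def strong_passphrase(p: str) -> bool:
    """Rough check in the spirit of the advice above: a phrase rather than
    a word, with length, mixed case, digits and special characters."""
    return (len(p) >= 12
            and re.search(r"\d", p) is not None
            and re.search(r"[A-Z]", p) is not None
            and re.search(r"[^A-Za-z0-9]", p) is not None)

strong_passphrase("MerryChr1stmas2You!")  # passes all four rules
strong_passphrase("password")             # too short, no digits or symbols
```

A rule like this is a floor, not a ceiling: it rejects the obviously weak while leaving room for the memorable phrases the article recommends.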

3 Varied Pens

It’s good security practice to change your penetration tester regularly: the black art of ethical hacking is partly creative and intuitive, and what one pen tester finds, another may not. People have their favourite methods and utilities, and over time, if you keep re-using the same penetration testers, you may find you get the same results. Change your penetration testers regularly so that no stone is left unturned in the search for vulnerabilities. As well as the human element, mid-tier to large organisations should also invest in some automated threat analysis. This year we’ve been very impressed with the offering from AlienVault, with their large OTX database and threat analysis.

2 Console Dumps

One of the key findings in the 2014 Senate report on the 2013 Target breach, which affected 110m credit card numbers, was Target’s failure to respond to multiple warnings from the company’s anti-intrusion software. Ensure your SIEM is properly configured, and that you have the right resources to monitor its logs in real time and on a proactive basis.

And no View Access to SNMP

SNMP (Simple Network Management Protocol) is used in systems management to monitor devices such as routers, switches, and printers. On your systems management consoles, it enables agents to report the status of IP devices. Unfortunately, a large number of SNMP v1/v2 configurations give every user an unrestricted view of the entire agent tree, which can yield sufficient credentials to obtain read-write or admin-level SNMP access.
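A simple first step is sweeping a device inventory for SNMP v1/v2c and default community strings. The device list below is hypothetical; in practice it would come from your network management system or device configs:

```python
# Community strings that ship as vendor defaults and should never survive
# into production.
DEFAULT_COMMUNITIES = {"public", "private"}

# Hypothetical inventory entries:
devices = [
    {"host": "core-sw-01", "snmp_version": "2c", "community": "public"},
    {"host": "edge-rtr-01", "snmp_version": "3", "community": None},
]

def flag_weak_snmp(devices):
    """Flag devices still on SNMP v1/v2c or using a default community string."""
    weak = []
    for d in devices:
        if d["snmp_version"] in {"1", "2c"} or d["community"] in DEFAULT_COMMUNITIES:
            weak.append(d["host"])
    return weak
```

Devices the sweep flags are candidates for migration to SNMPv3 or, at minimum, restricted views and non-default communities.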

Wishing all our readers a Merry Christmas and Happy & Prosperous 2015.


What Price, Privacy?

October 15, 2014

At IP Expo last week in London, the inventor of the World Wide Web, Tim Berners-Lee, outlined his vision for the future of the web, and data ownership in his keynote speech.


© Victoria – Fotolia.com

Over the past few years there have been numerous concerns about big data and its use. Tim Berners-Lee argues that whilst he is perfectly happy for his personal data to be used by professionals where it benefits him or mankind (e.g. healthcare professionals in a road traffic accident), this does not extend to commercial organisations using it for targeted marketing. Tim argues that the owner of personal data is the data subject, not large corporations, and that we should have a say in how it can be used and where it can be sold or merged with other data sets.

Others take a different view. An interesting panel discussion with Raj Samani, Andrew Rose and Josh Pennell ensued later in the conference. There was disagreement amongst the panelists over whether the ‘reputational damage’ argument regarding data loss actually means anything to organisations these days, since data breaches are seemingly ubiquitous and those outside the industry, especially the Facebook generation, simply don’t care if their personal data is lost. Time will tell whether there is long-term damage to Target, for example, from their breach, although early indications suggest that Target’s share price has mostly recovered.

Earlier in the month, Raj Samani had argued in a webcast to Forbes that, as a consumer, he gives consent for his personal data to be used and shared when he signs up for services, and that most consumers will happily do so, given consent, perceived value and transparency. After all, services such as Twitter and Facebook are free for the end user. Raj does concede, however, that the terms and conditions are rarely clear, being long legal documents, and that few end users will actually read them. There is rarely effective transparency in these cases if end users do not realise what they are signing up to.

Tim’s proposal might work in practice by changing legislation to afford personal data the same status as copyrighted data: use and context require specific owner consent, and re-use requires a royalty paid to the data subject. It may also solve the forthcoming conundrum about the “right to erasure” in the new EU Data Protection legislation: if I ask to be removed from one database, in theory the deletion should cascade into third-party databases where the data has been sold on. It would also take some of the burden away from over-worked regulators like the ICO, who are severely under-resourced.

I’m sure many organisations will say such a model is unworkable, but it may just make organisations think about how and where they use our personal data, especially if over-use of personal data directly affected the bottom line via royalty payments. Forty years ago, in the Cold War era, if you had suggested an interconnected network of disparate computers upon which banking, finance, government, education, retail and leisure would all rely, one which would become both ubiquitous and intrinsic to our daily lives, the idea would probably have been dismissed as science fiction and unworkable. Yet in 2014 we rely upon the internet to sustain all these things and more. Our industry needs such radical thinking today for a new model of data protection.

Phil Stewart is Director, Excelgate Consulting


Compliant But Not Necessarily Secure? How HR Can Help..

September 25, 2014

As the recent Target breach has shown, being compliant does not necessarily mean being secure. As the United States Senate’s recent analysis of the theft of personal and financial information of 110 million Target customers illustrates, key warning signals that Target’s network was being infiltrated were missed within the organisation. Both anti-malware and SIEM systems identified suspicious behaviour on the network, yet two days later the attackers began exporting customer data from it.


© Rawpixel – Fotolia.com

The fact that Target was certified as compliant with the PCI DSS standard just two months before the breach took place has certainly caused a lot of debate in the industry. I’ve always argued that simply pursuing an annual ‘checkbox exercise’ is not enough: there must be a lasting, ongoing cultural awareness with regards to data security.

I’ve also long argued that whilst responsibility and drive for data security lie at board level, the entire business needs to be aware and on board too. That’s why HR is vital to achieving this. KPIs for data security, along with employee training programmes, need to reinforce the goals of the CISO. Data security KPIs should be SMART: Specific, Measurable, Achievable, Realistic and Time-orientated. Across all business units there should be at least annual (and perhaps quarterly, depending upon the nature of the business) employee awareness training programmes.

The KPIs should be pertinent to the business unit they operate in and relevant to the data security standard your organisation is working towards (or maintaining) compliance with. For PCI DSS, for example, the requirement to ‘Protect Cardholder Data’ clearly has different implications for different business units. For a call centre, the requirements should be around not storing PANs in a readable format; not recording CV2 numbers taken over the phone; not writing cardholder details down on bits of paper for subsequent entry; and the secure physical storage of paper records. For development, they would pertain to the encryption methods used and key storage and protection; how PANs are rendered unreadable electronically, in adherence to the standard; and how cardholder data is transmitted across networks. For IT support, they could relate to maintaining up-to-date anti-malware and intrusion prevention systems; hourly monitoring of SIEM information; and weekly reports to senior management on monitoring status and patch levels.
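As one concrete example of rendering PANs unreadable for display, a minimal masking sketch (this keeps only the last four digits; the standard itself permits at most the first six and last four, and specifies further controls this sketch does not cover):

```python
def mask_pan(pan: str) -> str:
    """Mask a PAN for display, keeping only the last four digits."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    return "*" * (len(digits) - 4) + digits[-4:]

mask_pan("4111 1111 1111 1111")  # only '1111' remains readable
```

Masking on display is distinct from rendering stored PANs unreadable (truncation, tokenisation or strong encryption); a call centre screen should only ever see the masked form.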

Whilst line managers are usually responsible for setting KPIs in a business, I strongly believe HR can make a valuable contribution by enforcing company policy, ensuring all line managers include quarterly data security KPIs in their target setting. A function of HR is to safeguard the reputation of the business, which is also the function of an information security professional.

A recent study by insurance firm Beazley analysed more than 1,500 data breaches it serviced between 2013 and 2014, and discovered that employee errors accounted for 55% of them. HR is also about people management and defining the culture of an organisation. Whilst I’m not suggesting that HR take on responsibility for information security, they certainly have a part to play in ensuring that the correct mindset operates across all parts of the business. Changing an organisation’s culture requires a sustained effort from numerous players across the business: and HR is one of the key ones.

Phil Stewart is director of Excelgate Consulting.


Bring Your Own Pandora’s Box

May 1, 2012

Bring Your Own Device (BYOD) seems to be spreading through the industry as a debating point like a virulent virus. Is it fast becoming the 21st-century version of Pandora’s Box for the workplace?


In ancient Greek mythology, Pandora’s box contained all the evils of the world. Compelled by her own curiosity, Pandora opened it, and all the evil contained therein escaped and spread all over the world. Is “Bring Your Own Device” (BYOD) the 21st-century equivalent?
Image: © foaloce - Fotolia.com

I’ve long been an advocate of information security professionals adopting a more flexible approach when giving end-users the functionality and flexibility they want: a happier user is a more productive user. Indeed, a recent study published by O2 last month [1], the largest ever flexible working pilot of its kind in the UK, found that workers were more productive, happier and richer if they did not go into the office.

Users want the functionality, applications and interfaces that they are familiar with from their private lives. They want devices that allow them to work remotely, outside the office and on the move. All things can be taken too far, however, and this is exactly what is happening with BYOD. Some organisations seem intent on throwing out the corporate security baby with the data bathwater.

BYOD should be treated with the same three basic principles as any other information security issue: confidentiality, integrity and availability. Vast sums were spent on secure laptop rollouts in the noughties to give workers the ability to work from home and on the road securely. Did we ask or allow users to bring their own home laptops (along with their malware!), plug them into the corporate network and hope for the best? Of course we didn’t!

Some good reasons why BYOD may not be such a great idea:

  • What happens if the device is lost or stolen? Along with the device goes (unencrypted) corporate data and contacts, with no ability to remotely wipe or kill it.
  • Removable media is targeted by would-be remote hackers as an easy way of bypassing network firewalls: it is the assumed route, for example, for the Stuxnet virus.
  • Smartphones usually nowadays have both voice recording and high definition camera capabilities: easily allowing for industrial espionage. There’s good reason most high-security organisations don’t allow them on their premises.
  • There are legal ramifications in most geographies to monitoring employees’ personal web behaviour.
  • Laws on monitoring employees’ personal data and behaviour vary from geography to geography: a potential headache for corporate risk departments.
  • How will you monitor the mobile devices for malware, and prevent its introduction into the corporate network?
  • How will you identify which device is ‘rogue’ and which is ‘approved’ in a true BYOD environment?
  • How do you monitor for acceptable use on an employee-owned device? (assuming there are no legal blockers)
  • How will you prevent the employee walking out the door with your customer and supplier contact lists?
  • How do you prevent illicit or unlicensed content from entering the corporate network?
  • Which devices are deemed to be supported and unsupported?
  • Does your IT department have the right skills to support cross-platform mobile applications in a secure fashion?

The more enlightened organisations are, of course, adopting a sensible compromise: rolling out well-known tablets to their workforce in a controlled manner, whilst not allowing private devices to access the corporate network. According to Apple, nine months after the iPad launched, 80% of Fortune 100 companies had either deployed or were piloting the device [2]. When deploying mobile devices you should consider all the areas you would have for a laptop, and more:

  • How will the data on the device be encrypted?
  • How will you transmit data encrypted to your internal network and external service providers?
  • How will you ‘remote kill’ the device should it be lost or stolen?
  • How will you prevent malware and inappropriate content entering the network?
  • How will the user authenticate securely onto both the device itself and also the corporate network?
  • How will you identify vulnerabilities on the mobile Operating System?

It is perhaps my last question which causes me the most concern, since some in the industry appear to think that certain mobile platforms are not open to vulnerabilities and exploits in the same way as their desktop and server counterparts. A recent mystery shopper exercise confirmed my opinion: most do not take the threat posed by mobile devices as seriously as they should. The SANS Institute thinks otherwise: a whitepaper they published in 2011 [3] stated that whilst Apple has done a great job by allowing only digitally signed software to be installed on a non-jailbroken iPhone, there were still vulnerabilities that could be exploited, particularly in the web browser (Safari). Earlier this year Reuters reported on a flaw in the Android operating system which allowed hackers to eavesdrop on phone calls or monitor the location of the device. [4]

To reduce the risk posed by BYOD, organisations should develop an acceptable usage policy for tablets which clearly identifies the steps end users should take to complement the security built into the device by the IT department. This should include sensible advice on the dangers of shoulder-surfing when logging on in a public area; not leaving the device in a public place; and connecting regularly to the corporate network to allow updates.

The sensible compromise of rolling out familiar, but secured, tablets in the workplace, combined with an acceptable usage policy, will prevent your employees from seeing management as “tone deaf” and may well realise significant increases in both end-user productivity and creativity, whilst maintaining a sensible approach to safeguarding against the unguarded use of mobile devices in the workplace.

Phil Stewart is Director, Excelgate Consulting & Secretary & Director, Communications, ISSA UK

Sources:

[1] Working From Home More Productive, Telegraph 3rd April 2012:
http://www.telegraph.co.uk/technology/news/9182464/Working-from-home-more-productive.html

[2] Morgan Stanley: Tablet Demand & Disruption:
http://www.morganstanley.com/views/perspectives/tablets_demand.pdf

[3] SANS Institute: Security Implications of iOS:
http://www.sans.org/reading_room/whitepapers/pda/security-implications-ios_33724

[4] Android Bug Opens Devices to Outside Control, Reuters, 24th February 2012
http://www.reuters.com/article/2012/02/24/us-google-android-security-idUSTRE81N1T120120224


2011: The Year that News Made the News

December 15, 2011

2011: A year that started with the continued aftermath of the WikiLeaks saga, and ended with the Leveson Inquiry investigating the phone hacking scandal in the UK. Along the way many big names saw data breaches in 2011 including Citibank, Epsilon Marketing, RSA, Google, and Sony.

News in the News

2011 was the year that the news made the news. We started the year talking about classified information finding its way into the press and ended the year with an inquiry into the press. Alleged offences under section 55 of the Data Protection Act are at the heart of the phone hacking scandal.
Image: © Claudia Paulussen - Fotolia.com

Back in January, Excelgate reported that, following the ongoing WikiLeaks saga, a coordinated global response across governments was necessary if a repeat of such a massive leak of classified material was to be avoided. The industry is just coming to terms with the ramifications of WikiLeaks, and at the East West Institute in London in June it was recognised that there needs to be greater co-ordination in future, not just between technical and legal practitioners and politicians, but also across geographies, in drafting new data protection and privacy legislation.

2011 saw the launch of some new standards and assurance schemes in the UK. In March, ISSA UK launched ISSA 5173 – a new security standard aimed at improving information security for small and medium-sized enterprises. I’ve been involved in the formation of the standard since the foundation of the workgroup back in 2010, and in the collation of feedback since the publication of the draft standard in March. To date feedback has been overwhelmingly positive, and the standard has been well received by the IT press. 2012 will see the publication of a series of guidance documents to accompany the standard. 2011 also saw the launch of CESG’s Commercial Product Assurance, the replacement for the CCTM product assurance scheme for security solutions in the UK public sector. It aims to provide two levels of assurance for security solutions, depending on where the proposed solution is to be used. This should open up the relatively closed market of IL3 (for example local authorities) by encouraging competition and innovation in this space.

August saw the rioting across London and other parts of the UK. Social media was widely reported to have been used by some rioters in coordinating some of the riots. As reported in June, the misuse of social media comes in a number of forms: from criminal acts to damaging reputations – both corporate and private. In November, speaking at the London Conference on Cyberspace, The Foreign Secretary, William Hague, whilst warning against the dangers of government censorship with regards to the use of social media, did state that global, coordinated action is needed to deal with social media misuse and cyber attacks. In short he said “that behaviour that is unacceptable offline is also unacceptable online, whether it is carried out by individuals or by governments.”

The Leveson Inquiry, investigating the culture, practice and ethics of the press, started its formal evidence hearings in November. At the heart of the phone hacking scandal is that somewhere along the chain personal data was potentially procured illegally. The What Price Privacy? report of 2006, published by the Information Commissioner’s Office, had already highlighted widespread abuse of the Data Protection Act in the UK and recommended a custodial sentence for Section 55 offences. As reported in my interview with Christopher Graham in August, the Information Commissioner will continue to push for custodial sentences for the most serious offences.

The message of the What Price Privacy? report has somehow become lost, and the focus has shifted to journalists rather than the illegal trade in personal data. I certainly don’t want to see restrictions placed upon a free press: it is the foundation of any democracy. Wrongdoing and fraud should be exposed. There is already a large carve-out of exceptions in the Data Protection Act for journalistic purposes. The trade and procurement of personal data is already illegal, however, and in my view the law is not being enforced severely enough, either in the severity of sentences for the most serious cases or in the number of prosecutions.

I have been impressed with the ICO’s response to date with regards to education and the issuing of fines. They have taken a very reasonable position: that people need education on how to comply with the Data Protection Act. It would indeed have been foolish to adopt a mass fining policy when the ICO’s powers to issue a monetary penalty notice only came into effect in April 2010. On the other hand, April 2012 will mark the second anniversary of those powers, and to continue indefinitely in this mode would, in my opinion, be a mistake. I certainly don’t want to see a situation where every single human error results in a fine from the ICO. The Data Protection Act is, however, now a 13-year-old piece of legislation, and organisations have had two years to ensure they comply with the law in this regard. If HMRC operated on a mainly notice-to-improve basis, I’m fairly sure we’d see a significant decline in tax revenues (i.e. non-compliance). They don’t, however: they operate a sliding scale of penalties depending upon the severity of the mistake or non-compliance. For mistakes on VAT returns, for example, penalties are categorised as follows:

  • Careless : you failed to take reasonable care
  • Deliberate: you knowingly sent HMRC an incorrect document
  • Deliberate and concealed: you knowingly sent HMRC an incorrect document and tried to conceal the inaccuracy

In each category there is also a sub-category: prompted and unprompted (i.e. HMRC discovers the discrepancy, or you notify HMRC of it). The fines vary from 100% of the tax due for deliberate, concealed and prompted, down to 0% for careless and unprompted. I’d like to see a similar sliding scale defined for data protection offences, and then enforced, since I simply don’t believe there have only been seven serious data protection offences in the UK since April 2010. I’ve come across more than that myself this year alone, ranging from the potentially criminal to the careless error.
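Such a sliding scale is, at heart, just a lookup table of category and disclosure type against a penalty rate. A sketch of the idea: the 100% and 0% endpoints match the HMRC figures quoted above, but the intermediate rates here are illustrative assumptions, not HMRC’s published bands:

```python
# Maximum penalty as a fraction of the tax due, keyed by
# (category, prompted?). Intermediate rates are assumptions.
MAX_PENALTY = {
    ("careless", True):                  0.30,
    ("careless", False):                 0.00,  # unprompted: can fall to nil
    ("deliberate", True):                0.70,
    ("deliberate", False):               0.20,
    ("deliberate and concealed", True):  1.00,
    ("deliberate and concealed", False): 0.30,
}

def penalty(tax_due: float, category: str, prompted: bool) -> float:
    """Maximum penalty for an inaccuracy under this illustrative scale."""
    return tax_due * MAX_PENALTY[(category, prompted)]
```

A comparable table for data protection offences, with “tax due” replaced by a measure of breach severity, would give regulators the graduated response argued for here.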

I’ve also, over the course of 2011, become convinced of the need for a mandatory data breach notification law in the UK, as already exists in some US states. The naysayers are already out in full force, saying the UK doesn’t need another piece of legislation regulating business. It is worth bearing in mind that this legislation originated in California – that US state well known for over-regulating businesses and stifling innovation (not!). Similarly, the criticism of data breach notification laws is not based upon any real-world experience. A study from the University of California, Berkeley of the views of CISOs in the US showed that data breach notification laws have put data protection and information security firmly into the public eye, and have in some cases actually fostered dialogue between consumer and data controller regarding their data. They also empower consumers to protect themselves, either by asking awkward questions of their data controllers or simply by shopping elsewhere. We need this raising of awareness and dialogue in the UK too. Why should we either trust or trade with an organisation that doesn’t safeguard our privacy?

Phil Stewart is Director, Excelgate Consulting & Secretary & Director, Communications for ISSA-UK


The Apple of my i

December 7, 2011

As 2011 draws to a close, inspiration for the way forward in the information security industry is drawn from the past and a man outside of it: Apple’s former CEO – Steve Jobs, who passed away on 5th October, 2011.

Apple: Think different

Apple’s iconic logo is as well known globally today as their products. Many urban myths have sprung up over the years about its origins: the apple of knowledge from the Garden of Eden; the claim that the logo resulted from Steve Jobs having worked in an apple orchard; and the death of the founder of computer science, Alan Turing, who bit into an apple laced with cyanide. An interview with the logo’s designer, Rob Janoff, in 2009 revealed that none of these was the inspiration for the bitten apple: it was purely a design decision, to give the logo scale and make the apple instantly recognisable as an apple rather than another fruit, such as a cherry. This version of the logo was used by Apple from the mid-1970s until 1998; the “Think Different” slogan was retired in 2002.

It was interesting to see the huge number of tributes to Steve Jobs when he passed away in October this year. Has there ever been such a response to the death of a technologist? Steve Jobs’ death produced tributes from people from every walk of life and all corners of the globe: from presidents to prime ministers; from rock stars to everyday users of Apple’s products. Steve Jobs was a special person who instinctively knew what his customers wanted, without asking them. He saw the value of technology as an enabler in improving people’s lives, and he absolutely understood that most people are not fascinated by the technology in itself but by what it can do for them in their everyday lives (in the same way the motor car was an enabler in the last century: few people care how they work). Apple’s products all have an underlying intuitiveness that really does allow users to unbox them and use them straight away, without assuming any prior technical knowledge.

Most of us, however, do not possess the instinctive insight that Steve Jobs had: we have to do our market research with our target audience first. Historically, many information security professionals would not engage with end users at all, since these are the people, they believe, who will ask for things they can’t have and do all the things they don’t want them to do.

Some years ago I was brought into an American investment bank as project manager on a desktop platform refresh programme that had previously failed miserably in its objectives. Successive project managers had come and gone, and it always seemed a case of one step forward and two back. Typically for the IT department, the team was kept out of sight and out of mind in the lower dungeons of the bank, safely away from daylight and customers. There wasn’t even a telephone for the team: all communication had previously been done via email!

I decided that as well as talking to the head of each business unit to determine what their requirements were, I would also – shock, horror! – talk to the end users in each team to capture which applications they used and how they used them in their day-to-day tasks. I also initiated a series of end-user tests, ensuring that a representative from each team came down and tested the applications before the desktop builds were approved and shipped back to the respective end users’ desks. When business managers asked why we would be asking for 30 minutes of one staff member’s time for testing, I explained that this was improving staff productivity by reducing helpdesk calls and eliminating the need to recall failed builds. This strategy paid off: not only did we retain the business, we went on to win further rollouts for other parts of the bank.

This year I’ve heard and seen things which beggar belief. A consultant proudly boasted that the organisation he was contracted to work for “deserved data breaches” because “their staff were uneducated” (worse still, he meant this in a general sense, not specifically in relation to information security education). I’ve also heard all security vendors branded as “snake oil vendors”. An interesting concept – we wouldn’t have much of an industry without security vendors, and I’ve come across one or two unscrupulous practitioners in my time, with scant regard for data privacy themselves, to whom the adjective could just as easily apply.

Certainly there are some security vendors to whom the label “snake oil” can easily be applied (and a reputable reseller recently reminded me of one): those that have little respect for their customers’ product feedback; that are in the business purely to make money without advancing genuine information security; and whose products are so desperately clunky that they require reams of documentation, greatly reduce user productivity, and encourage end users to find workarounds – thus bypassing security policy. Equally, however, there are innovative vendors on the market that are genuinely interested in advancing information security: by helping to develop new standards; by taking the laborious task of log oversight away from the SME community and outsourcing it to specialists; or by helping to secure the use of the cloud. I’ve come across all these types of vendors this year too. To tar all security vendors with the same brush is not only disingenuous but also rather childish.

Earlier this year, when I interviewed the UK’s Information Commissioner, Christopher Graham, for the ISSA, he remarked that he felt end users were just not getting the message regarding data protection. Too often we see the same old problems: users not being educated and making basic mistakes. Personally, I think we have an industry whose messaging is aimed mainly at board and manager level and at legal compliance, so is it any wonder? Who is teaching end users how to handle personal data correctly in their day-to-day jobs, and what should and shouldn’t be stored regarding credit cards? Similarly, I’ve always disliked the term “evangelist” – widely used in our industry – since it implies preaching. Who on earth likes being preached to? Perhaps that’s why few end users are listening.

We urgently need an approach in which information security professionals think of themselves as business enablers who enhance security, and can talk in a language their end users understand. For more than twenty years now, we’ve been assuming that all our problems will be solved if only we throw more technology at them. Yet still we see data breaches. Similarly, we need security products that are focused on improving end-user productivity rather than working against the business. Then users might stop looking for workarounds to the solutions – and hence to their security policy.

If only Apple did iSec!

Phil Stewart is Director, Excelgate Consulting  and Secretary and Director, Communications for ISSA UK.


The Insider Threat

September 5, 2011

Last month the Information Commissioner, Christopher Graham, gave an interview to the ISSA ahead of his address to the association later this week, in which he looked at how most data breaches start with an employee within the organisation:

In 2006, the ICO uncovered the organised and illegal trade of confidential personal information in the report, What Price Privacy? How widespread do you believe this problem is today in the UK?

“I’ve described it as a modern scourge. The headlines, both in 2006 and more recently, have all been about the behaviour of the press, but I think it goes much further … Basically we’ve got pretty systematic trashing of our rights under the Data Protection Act.  My predecessor, in flagging the blagging, made the case for a much stronger penalty that would act as a deterrent but also send a very strong signal that data protection offences are not a victimless crime. It’s very important now that parliament gets on and activates section 77 of the Criminal Justice & Immigration Act 2008 which allows for the custodial penalty of up to 6 months in a magistrates court and up to 2 years in the crown court, but has not been commenced. It wasn’t commenced because of a stand-off between the politicians and the press.  I think we can now get through that, because the terms of the debate have changed a bit.  This isn’t something that should wait for the Leveson Inquiry because frankly it isn’t about newspapers – it’s about debt recovery, claims management companies, matrimonial disputes, child custody battles, you name it.”

Yes, I remember that when you became Information Commissioner in 2009, you reiterated at a Parliamentary Select Committee the need for a custodial sentence for convictions under section 55 of the Data Protection Act.

“Indeed, I did.  I didn’t get very far, though, because basically the politicians and the press had agreed this was going to be a sort of Sword of Damocles hanging over the press and if they misbehaved then it would be activated. I think the whole point was the 2006 report – yes, it talks about the behaviour of the tabloid press because the particular private investigator that the ICO raided, that was his main line of business.  But that’s  not what all the report was about. The idea that you don’t have to take any action against staff in NHS walk-in centres selling information to claims management companies because of some arcane dispute about investigative journalism is clearly nonsense. I didn’t get very far two years ago but I’m determined to go on pushing. There’s the human factor in all of this. We can have wonderful systems and policies for data protection and data security, but if the men and women on the ground don’t take it seriously and don’t think it matters – none of those systems are going to work. A small fine in a magistrates court is simply not a deterrent. I think understanding that you might go to prison is more like it, but it also enables the courts to look at the whole range of possible penalties which might be somewhere between a small fine and the threat of going to prison or having a community sentence.”

I noticed in the ICO’s latest annual report – you mentioned the NHS – that, as an organisation, they had the largest number of data breaches.

“They are about the largest organisation so I’m not surprised by that – they are quite good at reporting breaches, it’s part of their procedure. I recently met with the chief executive of the National Health Service, Sir David Nicholson. We had a very good, workmanlike discussion. There’s a lot of change in the NHS and that makes for a particularly dangerous time, but you’ve got to distinguish between the trucks and the tracks. We’re much better at thinking about the trucks: these are the great security initiatives and projects. The tracks are the routine: the day-to-day. My experience of a lot of organisations, not just the NHS, is that the messages haven’t got down to the grassroots. Data protection is seen as the sole concern of a few geeks, and as a result terrible things happen. People have heard the messages but haven’t internalised them. Every week I’m dealing with laptops going missing, not encrypted; portable devices and papers left on the bus; sensitive files dumped in a skip. In the health service of course by definition the information is almost certainly going to be sensitive information, so we’re working very closely with the NHS so that they can get the big things right – summary care records etc. – but they can also get the smaller things right – which actually aren’t that small – such as persuading the receptionist in the GP surgery that you don’t give things out over the phone just because somebody rings you up and sounds persuasive.”

What can be done in the health sector and other sectors to generate a culture where data protection becomes second nature, rather than seen as an annual event, or a burdensome task?

“I think organisations both in the public sector and the private sector have so much at stake in terms of their reputation, which of course in commercial terms you can put a value on, and in the public sector it’s all about the threat of reversing all the work that’s gone into citizen engagement.  The fact that it’s a real issue at the top of the organisation means that the message then needs to be taken to the whole organisation: it’s not just something of peripheral concern. Yes, it’s about training, but then it’s about auditing, it’s about going back and making sure people are practising what they know they are supposed to be doing – it’s about mainstreaming the whole thing. It should be absolutely part of the performance review system. We shouldn’t have the situations we’ve had over the past few years or so where people dealing with very sensitive information are treating it in such a cavalier way.  Our first civil monetary penalty was imposed on Hertfordshire County Council where they’d faxed highly sensitive court papers in a child welfare case to what they thought was Watford County Court but unfortunately it wasn’t and they’d got the number wrong. You wouldn’t do that if you were thinking about what the material was you were handling, that it was very sensitive, personal information about vulnerable children, so faxing wasn’t a very good idea anyway. You needed to have made sure you had got the right fax number, and that someone was waiting for the fax at the other end and that you didn’t just have finger trouble and were about to send it elsewhere. The civil monetary penalties which we’ve had at [the ICO’s] disposal since April of last year have certainly had a sobering effect and made people sit up and take notice. We’ve only imposed 6 of them – we’re not trigger happy – but it’s certainly made it very real to organisations who have focused on the reputational damage – being hit with a penalty.”

Following the use of social networking sites such as Twitter to reveal the details of super-injunctions earlier this year, the Prime Minister has called for a review of Data Privacy legislation in the UK. Will the ICO be contributing to the work of any new parliamentary committee in shaping any new data privacy legislation?

“Well, the Data Privacy legislation will be reviewed anyway in the context of the European Directive – that’s a process that’s going ahead. The Commissioner Viviane Reding is leading that process and the ICO is very much engaged with our European colleagues in the Article 29 working party – we’ve been inputting into that study. We expect to see a draft of a directive in about November and then the legislative process will follow. In a few years’ time there will be changes to data protection law, because the directive on which it’s based will have changed. We don’t think there is much wrong with the principles, but we’re looking for legislation that is much more modern and realistic in terms of what actually happens in the world of global information exchange.”


The legislation has been behind the technology in terms of usage of it, hasn’t it?

“Absolutely. It’s very important that Brussels produces a legislative proposal which is reasonably future-proof – if you get the principles right then the principles can take the technological changes on board. If you’re overly prescriptive then you’ll come up with something that is highly relevant for 2013 but by the time it comes into law it’s probably outdated because of all of these other developments. The ICO, working with anyone who will listen to us, is stressing the accountability principle – that the legal responsibilities lie with the data controller, and that the role of the data protection authority is to regulate that relationship and intervene as and when necessary on the basis of risk, rather than pretend the data protection authorities can be like some latter day King Canute holding back the waves, and let’s not kid ourselves that no information moves across borders without some tick in the box from the Data Protection Authority. The current Directive, of course, is pre-cloud, but it ought to be clear that’s it’s a very outdated text that doesn’t take into account the realities of the modern world.”

Would you like to see greater powers for the ICO, such as the power to audit an organisation to investigate a serious data breach?

“We gained some extra powers under the Coroners & Justice Act, and we have found that doing consensual audits is going really well, more in the public sector than the private sector. We can run the ruler over a company’s compliance, which can then be a badge of pride: “we’ve been checked over by the ICO”. There are powers to compulsorily audit government departments. I will go as far and as fast with the existing powers that I’ve got, but if I come to the conclusion that I’m not able to get anywhere – that I can’t audit organisations – then I will certainly return to the Secretary of State. I can get warrants – I signed a warrant today. It would be more satisfactory to require an audit, probably as part of an undertaking to improve.”

At a recent ISSA chapter meeting, one speaker remarked that social engineering over the phone is often the seed for an attacker, allowing them either to guess a password or to find a weakness in a system or process to exploit. Many of the data breaches we have touched on this afternoon started with an internal employee of an organisation. Do you think we are doing enough to educate employees in organisations to create a security culture?

“No, I think we’re not, and you absolutely put your finger on it when, in relation to this row about hacking, you have to ask the question: ‘how is it possible for the phones to be hacked?’ and the answer is: somebody has blagged, which they shouldn’t have done. That gets us back to section 55 of the Data Protection Act – it’s just too easy to blag and the penalty isn’t very impressive if you get caught. So I believe very strongly we’ve got to push for that [a custodial sentence] and I’m trying to get the politicians to see that this is something we need to do anyway – it’s going to take some time. Frankly we can’t wait [for the Leveson Inquiry]. We’ve got information leaking from databases every day – and not, as I said earlier, to journalists particularly – because information is valuable and it’s making people a lot of money. That’s the root of our problems – so if we’re concerned about cyber security then getting these basic things right is absolutely essential, and members of staff in all organisations need to see the connection between something which seems to them a bit naughty, but not terribly bad, and the terrible things that happen as a result.”

Christopher Graham – thank you for your time this afternoon and we look forward to your address at the ISSA next month.


The Information Commissioner, Christopher Graham, was talking to Phil Stewart, Director, External Communications, ISSA UK. Christopher Graham will be addressing the ISSA at their next meeting on 8th September 2011 in London.