This article will explore five serious problems preventing information security policies from being implemented, even though these policies may have been written with the best of intentions. Cutting across all five of these causative factors is a theme involving a lack of understanding about the nature of policies. All too often policies are written in a rushed and narrowly-scoped effort, responding to circumstances such as: (a) a request or complaint issued by a business partner or important customer, (b) an adverse audit finding, (c) a necessary step on the way to obtaining some type of security certification (such as the Payment Card Industry's Data Security Standard), (d) a legal or regulatory requirement, or (e) a serious security breach or some other costly operational problem.
Information security policies are by their very nature cross-disciplinary, cross-departmental, cross-organizational, and cross-national. That means that, to be successful, they must embrace many different considerations. These considerations may on occasion be at odds with each other. For example, an access control policy at a bank may need to be rigorous: on one hand it must satisfy prevailing laws and regulations, keep fraud and other problems to a minimum, and allow participation in inter-bank networks. On the other hand, this policy must also be simple and easy to use, not so onerous that workers are prevented from efficiently doing their work, and not so burdensome that staff at customer organizations accessing the bank's systems are encouraged to do business elsewhere. So the best overall advice, to avoid implementation problems such as those discussed below, is to study the big picture in advance, to understand all the requirements that may have a potential influence on the final version of a policy.
While the performance of such a holistic review of requirements may at first seem economically infeasible, much of this information should already have been collected in the course of performing a risk assessment. Some other essential pieces of information, needed to paint a holistic picture of the environment into which a policy will be placed, will be gathered by well-run information security operations. For example, a loss history database, documenting the incidents occurring over time (with those incidents categorized, analyzed, and ranked), will provide a perspective that is essential to understanding the real-world operating environment into which a new policy will be inserted.
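The loss history database described above can be sketched in a few lines. This is a minimal illustration, not a production design; the field names, categories, and dollar figures are all hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Incident:
    """One entry in a loss history database (fields are illustrative)."""
    date: str          # when the incident occurred
    category: str      # e.g. "phishing", "lost media", "insider misuse"
    direct_cost: float # estimated loss in dollars

def rank_categories(incidents):
    """Aggregate losses by category and rank them, highest total loss first."""
    totals = defaultdict(float)
    for inc in incidents:
        totals[inc.category] += inc.direct_cost
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

history = [
    Incident("2023-02-01", "phishing", 12000.0),
    Incident("2023-05-17", "lost media", 4000.0),
    Incident("2023-09-03", "phishing", 8000.0),
]
print(rank_categories(history))  # [('phishing', 20000.0), ('lost media', 4000.0)]
```

Even a simple ranking like this tells a policy writer which loss categories dominate the real-world operating environment, and therefore which policies deserve attention first.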
(1) Failure To Explicitly Define Long-Term Implications: While keeping presentations about proposed policies both short and to the point is certainly desirable, sometimes information security specialists go overboard with this objective. They may then leave out the important long-term implications of adopting a specific policy. Management may later be dismayed to discover that these implications contradict, or are otherwise in conflict with, other organizational objectives. This description of the problem of course assumes that the information security specialist making a request for approval has himself or herself thought through these long-term implications; certainly there are many cases where this has not been done before a request is made of management. Either way, the net result is the same: management is angry or upset because they feel as though they have been led through a "bait and switch" process, where they thought they agreed to X, but really what they got was X, Y, and Z.
The top managers who approve policies should not be expected to ask about long-term implications. In many cases, because they are not technical, these managers cannot imagine the implications of proposed policies. Instead, the responsibility should rest on the shoulders of the middle manager proposing an information security policy. Even if this information is never mentioned in a presentation, it should nonetheless still be delivered to management, perhaps as an appendix to a written proposal to upgrade a policy.
(2) Full Cost Analysis Not Employed: With the severe economic pressure that so many IT shops are facing these days, it can be tempting to only request the first step in a long implementation process. For example, next year’s budget proposal, in support of a newly adopted policy, may request only the purchase of new security software and training for operations staff to use that same software. The requested budget may fail to mention the cost to handle frequent fixes and patches, on-going maintenance charges for the software, end-user training required in order to properly utilize the software, etc. Management is never going to be happy to hear that there are hidden costs, and this unhappiness may mean that the whole project gets dropped midway across the stream toward implementation.
While in some instances the requestor of funds may not have understood the full cost of a proposed policy, failure to employ a full-cost approach is clearly out of step with current IT management practices. For example, management is increasingly looking at TCO, or the total cost of ownership. Management really needs to know the long-term costs and benefits associated with a proposed policy. This includes the implications for many things including: production system downtime to install a new security system, future upgrade and scalability expenses associated with a new system, and the impact of the new system on response time.
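The TCO arithmetic mentioned above is straightforward but easy to leave out of a budget request. The sketch below, with entirely hypothetical dollar figures, shows how recurring costs (maintenance, patch-handling labor) can dwarf the up-front license fee over a multi-year horizon:

```python
def total_cost_of_ownership(license_fee, training, upgrade_cost,
                            annual_maintenance, annual_patch_labor, years):
    """Rough TCO: one-time costs plus recurring costs over the planning horizon."""
    one_time = license_fee + training + upgrade_cost
    recurring = (annual_maintenance + annual_patch_labor) * years
    return one_time + recurring

# Hypothetical numbers for a security software purchase over five years.
tco = total_cost_of_ownership(
    license_fee=50_000,        # the only figure in many first-step budget requests
    training=15_000,           # end-user and operations staff training
    upgrade_cost=20_000,       # anticipated future upgrades
    annual_maintenance=10_000, # vendor maintenance contract
    annual_patch_labor=8_000,  # staff time for fixes and patches
    years=5)
print(tco)  # 175000 -- more than triple the up-front license fee
```

Presenting a figure like this up front, rather than letting the recurring costs surface year by year, avoids the "hidden cost" surprise that can kill a project midway through implementation.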
It is thus advisable that the requestor perform this research before a management request is made. The requestor should have identified the technology that can be used to implement a policy, before that policy is proposed. This will give the requestor an opportunity to talk with the vendor about full life cycle costs, as well as the other implications of adopting a particular approach to improve security or privacy. This background research will also prevent the occasional (and highly credibility-eroding) situation where it comes to light that there is in fact no commercially available software that can be used to implement an adopted policy, and that an in-house solution must instead be developed.
(3) User Training & Acceptance Efforts Not Undertaken: Failure to convince those who will be affected by a policy that the new policy is indeed in their best interests is most likely going to be a serious problem. As much as those responsible for meeting a deadline might like to autocratically dictate that such-and-such a policy will be adopted by everyone — period, this approach sets up a resistance dynamic that will interfere with the consistent implementation of a new policy. Users need to be respected, and convinced that the new policy is in their best interests, and that it also protects the organization as a whole, and then they will be much more likely to go along with a changed operating environment.
Granted, sometimes user resistance is illogical or out-of-touch with the facts. But it will nonetheless be encountered more frequently if we do not take time to explain and sell new policies and the systems that implement these policies. For example, many years ago, a programmer didn't like a new policy that required an automatic session time-out to kick in after a certain number of minutes had elapsed without any key on the keyboard being pressed. He was not part of the discussion about adopting this policy, and he hated having to repeatedly sign in because he had been signed out automatically. Though it was certainly not a good use of organizational resources, he designed, while at work, a small script to automatically issue a space whenever 14 minutes had elapsed without any typing on his keyboard (the auto-logout feature would kick in at 15 minutes). Because his job involved programming, and a random space interspersed into his code would not affect the operation of the code that he was developing, this solution was an elegant one.
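The anecdote above can be made concrete with a small simulation. This is a sketch of the idle-timeout logic and the workaround's timing, not a reconstruction of the actual script; the 15-minute limit comes from the story, and everything else (function names, the simulated clock in seconds) is assumed for illustration.

```python
IDLE_LIMIT = 15 * 60        # policy: auto-logout after 15 minutes of inactivity
KEEPALIVE_PERIOD = 14 * 60  # the workaround injects a space one minute sooner

def session_active(keystroke_times, now, idle_limit=IDLE_LIMIT):
    """True if the most recent keystroke (in seconds) is within the idle limit."""
    return (now - max(keystroke_times)) < idle_limit

# A user who stops typing at t=0 is logged out once 15 minutes have passed...
assert not session_active([0], now=15 * 60)

# ...but a script injecting a space every 14 minutes keeps the session alive
# indefinitely, defeating the control without ever tripping the timeout.
injected = [k * KEEPALIVE_PERIOD for k in range(4)]  # spaces at t = 0, 14, 28, 42 min
assert session_active(injected, now=50 * 60)
print("timeout defeated")
```

The point of the simulation is how cheaply the control was defeated: one synthetic keystroke per 14 minutes was enough, which is exactly why winning user acceptance matters more than the technical strength of the control itself.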
It would have been much better for information security (especially because his workaround became a popular topic of internal discussion), if he had understood why an auto-logout feature was now being required. Those folks proposing policies really need to be intimately in touch with the user community, and they need to know how the proposed policies will affect the user community. To meet this objective, and to get the best results, have representatives from the user community provide input to the writing of policies, the implementation of policies, and the development of related user training.
(4) Discovery Of Unforeseen Implementation Conflicts: Failure to research the cultural, ethical, economic, and operational implications of the policy implementation process is often a problem. This is particularly serious in multi-national organizations where the local way of doing things may be at great variance from a newly adopted policy. For example, while consolidation of data centers may seem like a good idea from a cost containment standpoint, and may also provide new opportunities to consistently apply privacy measures to all human resources data from various countries, such an approach may run afoul of so-called trans-border data flow laws. More specifically, privacy laws in western Europe may prevent human resource data from leaving the country where it is now stored, because the government's ability to oversee and regulate this data would then be eliminated. While there are some workarounds, such as the "safe harbor" agreements that permit the international movement of such data, this example illuminates what may become an unforeseen conflict blocking the implementation of a particular policy: the existence of a different local way of doing things.
This particular example also points to what has become one of the most complex areas of modern information security work. We are talking about the harmonization of security and privacy laws and regulations across countries, so that organizations can have clarity about what is permitted and what is not, and so that organizations can then go on to adopt a certain policy across-the-board throughout the world. Being required to develop special approaches for users in certain countries, or for certain departments within an organization, violates the secure systems design principle of "consistent application." Security is eroded each time an exception is made; to increase the level of security, all users should be required to abide by the same policy.
(5) Communications Gap Between Technologists & Management: In many instances, the information systems technical infrastructure is modified regularly to respond to new security threats, to adjust new software so that it operates better, to accommodate a new business partner, to improve user response time, etc. These changes don’t necessarily go through a formal change control process, instead being a reflection of the formal duties assigned to technical staff such as systems administrators. While these technical staff may be diligently attempting to keep their own portion of the information systems infrastructure secure, reliable, and responsive, the changes that they adopt locally may close doors, and thus prevent the future changes required in order to implement an organization-wide policy.
The communications gap here is between the technical and administrative staff who are often running around handling problems, sometimes in crisis mode, and talking with each other. These people don’t necessarily discuss their activities with the top managers who approved a policy, who in turn may be occasionally talking to middle managers. This gap in internal communications can be especially problematic if the Chief Information Security Officer, the Chief Privacy Officer, or some other middle manager who ordinarily would be expected to act as a bridge, is non-technical in his or her orientation.
This communications gap may also come about when one department decides it needs a specific control, and it goes ahead and writes a local policy, and then unilaterally implements a control supporting that local policy. This local implementation may later be shown to be in conflict with an organization-wide policy and/or security architecture. For example, at one high-tech company where the author worked, the research & development group was very concerned about their latest product designs walking out the door on removable storage media. They accordingly adopted a policy mandating a digital rights management (DRM) system implemented on thin workstations, so that the sensitive information could only be accessed via those workstations physically located within the office. This approach, although effective in terms of meeting departmental goals, ended up being incompatible with the new encryption policy adopted for the organization as a whole.
Communication is Key: By paying attention to these five common mistakes, the information security practitioner can save both themselves and their organization a lot of wasted time and effort. As the reader will note, the common thread running through these mistakes is communication. More and more, the information security team must have both technical and interpersonal skills to be successful.