Security is inextricably linked with electronic and human processes. Yet process is one of the most difficult things to enforce with any degree of reliability or effectiveness, and an unreliable process is itself a source of risk. The flow of information is too rapid and voluminous to address in the traditional way, which calls for a wholesale shift in thinking about the process of security management.
A security policy is only as good as the paper it is written on unless the procedures that support it can be enforced in reliable, effective and consistent ways. For users, that may mean leading them through a very linear process that culminates in a logical end. For the security response team, that may mean a multidimensional workflow with lots of “choose your own adventure” variables along the way, dictated by different types of security events and the weighted risk of the assets involved.
Consider the variables introduced by a single type of user role: an adviser working in a large financial services firm, for example. This type of user has access to an incredible amount of sensitive client data. The adviser may not be aware of all the ways an information breach can occur. Desiring a mobile office, he or she may frequently put data in motion: copying files to portable, online or internal storage, emailing them, and so on. What controls are there to enforce process around good data hygiene? Are there tools to detect the movement of sensitive data? If movement is detected, is a data owner alerted? Will that alert be silenced amidst all the other noise one must respond to? Is there a closed-loop process to educate the adviser about the company’s policy regarding this data? Is it even possible to maintain regulatory compliance in this scenario?
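To make the detect-and-alert question concrete, here is a deliberately minimal sketch, assuming sensitive records can be spotted by pattern (a US SSN-like string in this illustration) and that each classification of data has a known owner. Real data loss prevention tools use fingerprinting and content classification far beyond a single regular expression; every name here is hypothetical.

```python
import re

# Hypothetical pattern: a US SSN-like string stands in for "sensitive data."
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_outbound(payload: str, data_owner: str) -> list[str]:
    """Return alert messages for the data owner if sensitive data is found."""
    hits = SSN_PATTERN.findall(payload)
    if not hits:
        return []
    # One alert per detection, addressed to the data owner rather than a
    # shared queue, so it is less likely to drown in the general noise.
    return [f"ALERT to {data_owner}: possible SSN {h} in outbound data"
            for h in hits]

alerts = scan_outbound("Client file: 123-45-6789 attached", "owner@example.com")
```

The design point is the routing, not the regex: an alert delivered straight to an accountable data owner closes the loop that a generic log entry never will.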
There may be exceptions that you want to allow. For instance, certain individuals may have a valid business reason to move sensitive data outside the trusted network. In that case, the exceptions need to be handled with the appropriate approvals and tracking, and the data must be verified as secure.
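The approval-and-tracking requirement above can be sketched as a simple record type. This is an illustrative shape, not any particular product’s schema: the field names and the 90-day window are assumptions, but the principle is that every exception names its approver and expires so it must be re-verified.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical exception record: who, why, who approved it, and for how long.
@dataclass
class DataMovementException:
    user: str
    justification: str
    approver: str
    granted: date
    valid_days: int = 90  # assumed review window

    def is_active(self, today: date) -> bool:
        # An exception without a live approval window is no exception at all.
        return today <= self.granted + timedelta(days=self.valid_days)

exc = DataMovementException("adviser01", "client onboarding",
                            "ciso", date(2024, 1, 1))
```

Because the expiry check is part of the record itself, a lapsed exception fails closed rather than lingering as permanent, forgotten access.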
The term “security response” refers not only to the reactive processes that happen when an event occurs, but also to the proactive processes that mitigate the risk of exposure in the first place. By orchestrating processes more effectively, it is possible to deal with security events more quickly, enforce secure user and system provisioning, and ensure ongoing compliance with security policy.
Effective process orchestration is even more vital during times of downsizing, because the individuals who remain employed end up with a greater span of control. This not only increases their workload, it increases the scope of their access. Companies will sometimes employ contractors during downsizing in an effort to shed some of the fully burdened costs associated with FTEs. If processes are not automated and secure, however, this invites additional risk.
In short, security is process.
(See Figure 1 at end)
The following are questions every organization should consider about their processes around security:
- Are users accountable for their actions, and are your processes themselves auditable at every turn?
- Do your processes allow for personnel gaps (i.e., process participants who are not available)?
- Is there data missing about critical assets or configuration items that slows down response? Might that data live in other systems?
- Has management accepted certain risks simply because the process cannot address them without an extraordinary amount of manual effort? (cost vs. benefit)
- Is the mean time to recover too slow, or worse, are incidents backed up so badly that only the most egregious ones can be addressed?
- Can personnel missteps in the process cause disastrous ramifications downstream?
- Do internal silos get in the way of process efficiency?
- Are the processes too rigid, or can they adapt quickly as changes in the environment dictate?
- Is an owner assigned to each process, and is there management commitment?
- Will the process and underlying procedures stand up in court if needed? (ISO/IEC 27002 provides a good blueprint for security incident response and evidence-gathering procedures.)
- Are your processes able to be measured and reported upon?
- Is it difficult to train new employees because your processes are antiquated and require many disconnected manual steps?
- Can regulatory compliance really be assured?
- Is the fire-fighting so bad that continuous improvement gets neglected?
- Are you aware of advances in technology that serve to orchestrate both human and electronic resources in order to fulfill processes – technology that makes possible what you thought was impossible?
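Several of the questions above (mean time to recover, measurement, reporting) can only be answered if the process emits timestamps as it runs. As a trivial illustration, assuming a ticketing export with open and close times (the field names are invented), the metric is a one-liner:

```python
# Illustrative incident records: opened/closed timestamps in hours.
incidents = [
    {"opened": 0.0, "closed": 4.0},
    {"opened": 1.0, "closed": 9.0},
    {"opened": 2.0, "closed": 5.0},
]

def mean_time_to_recover(records: list[dict]) -> float:
    """Average duration from open to close across all incidents."""
    durations = [r["closed"] - r["opened"] for r in records]
    return sum(durations) / len(durations)

mttr = mean_time_to_recover(incidents)  # (4 + 8 + 3) / 3 = 5.0 hours
```

If your process cannot produce even this much data, that gap is itself one of the findings the question list is meant to surface.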
These last two points are critical if your organization is to progress. Often an organization will simply accept the state of its processes because the exasperated individuals involved feel they have nothing left to give after the intense firefighting they endure. Or process improvement fails to get management commitment because the returns seem softer than they really are, making managers reluctant to invest. The result can be high turnover and a greater probability of vulnerabilities. The point? Automate your processes now.
“Great, but I need a blueprint.”
What follows are some logical actions that need to be taken when deciding to automate processes:
Prioritize your use cases. You probably have dozens of processes that you know could be more efficient, and almost any automation will be an improvement (the caveat being that making a bad process faster and more efficient is actually a step backward). Consider the following characteristics when choosing which processes to automate first:
- Highly visible but not mission critical. You want the success of your process automation to be recognized when completed, but you want to do your first couple of automation projects on safer use cases where you can gain experience while still gaining visibility.
- Too many personnel are required to complete the process. Achieving cost savings by no longer requiring people to touch processes is a big reason to retool them in the first place. Far too many processes hinge on simple approvals that break down when people are out of the office.
- System integration is required but will be minimal. Most automation will require some kind of data or UI federation across multiple systems. For your first use cases you want some of this, but you don’t want it to be overly complicated at this point.
- Can be divided up into logical chunks. An excellent approach to process automation is to automate and pilot only portions of a process at a time, rather than to boil the ocean. Look for processes that can be broken up.
- Low-hanging fruit. A couple of quick successes can pave the way for additional funding and resource commitments.
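The five criteria above can be turned into a rough prioritization exercise. The weights and scores below are entirely illustrative; the point is to make the trade-offs explicit rather than to produce a precise number.

```python
# Hypothetical weights for the selection criteria described above.
WEIGHTS = {
    "visibility": 2,          # highly visible but not mission critical
    "manual_effort": 3,       # many people currently required
    "simple_integration": 2,  # some integration, but not too much
    "chunkable": 2,           # can be divided into logical pieces
    "quick_win": 3,           # low-hanging fruit
}

def score(process: dict) -> int:
    """Weighted sum of 0-5 scores against each criterion."""
    return sum(WEIGHTS[k] * process[k] for k in WEIGHTS)

candidates = {
    "access-request approvals": {"visibility": 4, "manual_effort": 5,
                                 "simple_integration": 4, "chunkable": 5,
                                 "quick_win": 5},
    "full incident response":   {"visibility": 5, "manual_effort": 5,
                                 "simple_integration": 1, "chunkable": 2,
                                 "quick_win": 1},
}
ranked = sorted(candidates, key=lambda n: score(candidates[n]), reverse=True)
```

Here the glamorous end-to-end incident response process scores below the humble approval workflow, which is exactly the counterintuitive result this kind of scoring is meant to expose.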
It is important in this initial stage not to try to take on, or even identify, the dozens of processes you know exist. Just settle on your top one to three processes and make plans to automate them.
Evaluate tools that help with workflow. Many vendors have tools to help facilitate workflow. With the right tool, it is possible to cut out months or even years of development work, not to mention the cost of additional products that purport to solve all your woes. The positive budget implications of this undertaking are enormous, but you should prove them with a detailed cost/benefit analysis after the tool evaluation process and discovery sessions have been completed.
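A back-of-envelope version of that cost/benefit analysis is straightforward. All the figures below are placeholders; substitute numbers from your own discovery sessions and vendor quotes before drawing conclusions.

```python
# Illustrative numbers only -- replace with your own estimates.
tool_and_build_cost = 120_000.0   # licensing plus implementation effort
hours_saved_per_month = 400       # manual steps the automation removes
loaded_hourly_rate = 60.0         # fully burdened cost per staff hour

monthly_savings = hours_saved_per_month * loaded_hourly_rate
payback_months = tool_and_build_cost / monthly_savings
```

With these assumptions the tool pays for itself in five months; the real value of the exercise is forcing each input to be defended, which is where the "softer than they really are" returns become concrete.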
Assign a process owner and lead technical resource. The process owner should be the individual who knows the most about the intricacies of the process itself, and is likely the only person who can tell you how it flows from end to end. Every process should have one, and you should have commitment from the process owner’s management to do the automation work. The technical resource must understand the deep technical workings of the workflow tool and should have a broad range of experience with other technologies such as database languages, scripting/coding and Web services. Another must for this individual is excellent business analysis skills. Such people are a rare breed, but they are worth their weight in gold.
Hold discovery sessions. You will have to dig deep to know the intricacies of your processes. Any given process may span several departments and multiple individuals. It will likely need to span many disparate systems as well to maximize your ability to act on the data. Look for procedural gaps in the process that have required manual data input, approvals, exceptions, status logging, etc. Plan for the types of SLAs you will need to report on. All of these things can be automated, or at least routed more efficiently. Determine what triggers processes so that you can build in invocation methods (e.g., a user, a monitored event, an email, a set schedule, etc.).
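The invocation methods named above can all be funneled into a single process entry point, so the downstream workflow is identical regardless of how it was kicked off. The sketch below is hypothetical scaffolding, not any workflow product’s API:

```python
# One entry point for the process, however it is triggered.
def start_process(trigger: str, context: dict) -> str:
    return f"process started by {trigger}: {context.get('detail', 'n/a')}"

# Each invocation method normalizes its own input into the same context
# shape before handing off, so the workflow logic lives in one place.
invocations = {
    "user":     lambda form: start_process("user", {"detail": form["request"]}),
    "event":    lambda evt: start_process("event", {"detail": evt["alert_id"]}),
    "email":    lambda msg: start_process("email", {"detail": msg["subject"]}),
    "schedule": lambda job: start_process("schedule", {"detail": job["cron"]}),
}

result = invocations["event"]({"alert_id": "SEC-1042"})
```

Normalizing triggers this way is what lets a monitored security event, a user request and a nightly schedule all feed the same audited workflow.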
Avoid the temptation to build sexy UIs and opt instead to spend time on process design and efficiency. Where UIs are needed, function should be emphasized over form. A single process will require multiple discovery sessions, plus ad hoc consultations where needed. It is important to perfect the steps of the process before ever entering the technical design phase. That said, it has to be approached with the idea that the workflow tool and integrated systems will replace many of the steps.
Investigate APIs for systems to be integrated. Many people do not realize it, but software vendors have been coding rich application programming interfaces into their products for years. An application’s integration interface will come under a variety of monikers: API, SDK (software development kit), connector, plug-in, etc. Despite the confusing nomenclature, the easiest of all products to integrate with are going to be those that have Web service APIs. Web services conform to open standards and abstract the back-end work one would normally have to do in order to read and write data, which also provides a far greater degree of protection. The lowest common denominator is to integrate directly with the external system’s database, but this is a last resort only and usually cannot be done without substantial planning and political wrangling. It is also an extremely technical undertaking.
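As a small illustration of why Web service APIs are the easiest route, here is a sketch of updating a configuration item over a hypothetical REST endpoint using only Python’s standard library. The URL, resource path and payload fields are invented; the point is that the service validates and persists the change, with no direct SQL against someone else’s database.

```python
import json
import urllib.request

def build_update_request(base_url: str, ci_id: str,
                         status: str) -> urllib.request.Request:
    """Build a PUT request updating one configuration item's status."""
    payload = json.dumps({"status": status}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/configuration-items/{ci_id}",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="PUT",  # the remote service does the validation and writes
    )

req = build_update_request("https://cmdb.example.com", "CI-2001", "quarantined")
# urllib.request.urlopen(req) would send it; omitted to keep the sketch offline.
```

Contrast this with the direct-database alternative: the API call needs no knowledge of the vendor’s schema, survives their upgrades, and passes through whatever authorization the service enforces.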
Design the process with flexibility in mind. This is not the end; it is merely a start to creating a finely honed process. You will need to design the process with flexibility in mind so that you can easily iterate the process as business needs dictate.
Pilot the new process in chunks. As mentioned, an excellent approach to process automation is to automate and pilot only portions of a process at a time. Choose smaller groups of people to pilot pieces of the process. Gradual assimilation will help with better acceptance levels for your larger processes. Smaller and less complicated processes could easily be done all at once.
Get help. There are many excellent third-party consulting firms that can help you at every step to redesign workflows. Take advantage of their wide range of skills from helping many other clients.
Link process effectiveness to employee performance. Once you have automation in place, you want to continuously iterate the process until it is a well-oiled machine. Nobody knows the inefficiencies of a process like the employee who must endure it. Employees should make material contributions to the ongoing streamlining of the process, and it is wise to measure this and review it in performance evaluations.
Writer and management consultant Peter Drucker said, “Efficiency is doing things right; effectiveness is doing the right things.” If your organization is in a state in which continuous improvement gets neglected because of the day-to-day toil of just trying to keep up, don’t accept it. Make the time you need to kick off a process reengineering program. Through time and effort, you will increase the security posture of your organization and the quality of life of your employees. The upfront costs need not be high, and the returns can be astronomical if you really take the time to compare your present state to the state of your environment with better process automation.