Posted by Mission Assurance
The purpose of this two-part blog series is to discuss five challenges that often plague insider threat programs and, more specifically, the analysts working in insider threat hubs. I am in a unique position to discuss this area because I have many years of experience working directly with operational insider threat programs of varying maturity levels. Thus, I have a front-row vantage point on the challenges that analysts face on a daily basis. In this blog post, I will discuss some of the key challenges facing many organizations, along with associated recommendations (i.e., quick wins).
As you read this blog, think about these questions: (1) How many of these challenges are you facing today? (2) Are there any challenges on this list that lead to an "aha" moment? (3) Are there challenges that you are facing that did not make it onto this list? (4) Do you need assistance (from inside or outside your organization) with combating any of these challenges? Let us know your answers and thoughts via email at firstname.lastname@example.org.
One important and often overlooked aspect is training the analysts to know what to look for in the data that is pushed or pulled into the hub. The data could consist of HR records, network activity, badge access, and a myriad of other sources useful for the analyst to examine. Frequently, determining an indicator of concern can be thought of as finding a needle in a stack of needles. It is imperative that the insider threat program team set the tone, expectations, rules, and measures of success for the analysts to follow.
Another concern is the breadth of the insider threat problem, as it encompasses the technical, behavioral science, and counterintelligence domains. Within the technical domain there are specialty areas such as networking, databases, modeling, and statistics. Given budget and hiring restrictions, it is difficult to hire for all of these separate positions. Thus, it is imperative that training is provided to the analysts to ensure they are up to speed on as many of the domains as possible.
Quick Win #1: Enroll in the brand new SEI Insider Threat Analyst Course
Quick Win #2: Enroll in the NITTF Insider Threat Hub Operations Course
Consider a situation whereby the hub is made up of various analysts from different organizations working on different contracts, with each having a different role and responsibility within the insider threat program team. It is important to understand that insider threat is a team sport and requires collaboration. Another concern is how to best handle a particular concerning event. For example, suppose the hub data shows a high frequency of printing during off-hours, immediately before foreign travel. The analysts with a cyber background might recommend a different course of action from those with a counterintelligence (CI) background; deciding whose recommendation prevails is often known as the right of first refusal. In simple terms, an insider threat hub comprised of one type of analyst may think the best course of action is to disable access, notify management, and request that the employee of concern be terminated. However, another set of analysts may recommend that management take a wait-and-see approach. The rationale is to see what else the insider is capable of, whom else they might be colluding with, and whether there is a foreign nexus at play.
Quick Win #1: Create an insider threat playbook and action plan. This playbook should be developed before there is an incident to ensure that the processes are well understood, tested, and revised, and that it is clear who holds the authority of first refusal.
Quick Win #2: Review the SEI Common Sense Guide to Mitigating Insider Threat, Fifth Edition, focusing on the section "Organization Wide Participation."
It is quite difficult to perform insider threat detection without the necessary data in place. Often the data is obsolete or perhaps does not cover all networks or employees. Additionally, data is held closely by the data owner, and breaking down the barriers to allow seamless sharing of data is a challenge. The challenge for the analysts is to know the process for data collection and data sharing. They must delicately balance both the frequency and amount of data they are requesting. I have seen many situations where analysts requested a mountain of data but never actually used it. On the flip side, I have seen analysts who were hesitant to request information for fear of "rocking the boat," perhaps due to the culture of the organization.
Once data authorization is granted, several subsequent issues arise: How is the data secured in transit and at rest? Who has access to the data? How long is access granted? How often is the data updated? Is the data being pushed, pulled, or is it a hybrid approach? All of these challenges should be discussed ahead of time with the insider threat program management office, legal/privacy, and the data owners to reduce the impact on the stakeholders.
Quick Win #1: Create a data sharing and handling agreement.
Quick Win #2: Leadership buy-in. Ensure that you have the appropriate leadership in place that can assist with getting the agreements in place and enforcing the agreements for data sharing that you have put into place. Related, senior leadership should help pave the way by negotiating and promoting information sharing.
Once the analysts have access to the data, an entirely new set of challenges may arise. Many organizations--either through the use of commercial tools or in-house methods--strive to develop a ranking of the riskiest employees. The risk equation is fluid and consists of many different variables such as clearance held (top secret), position (system admin), and account privileges (super admin). It also includes different risk indicators such as frequent use of "bad" keywords, accessing blacklisted sites, accessing file shares and networks without a need to know, frequent printing, etc. With that said, how does the analyst calibrate the data to show the riskiest person in the organization? If an employee has numerous minor violations, does that score get calculated higher or lower than that of an employee who has one egregious violation? Who is the employee of concern? Stated another way, is quantity or quality scored higher? Another challenge is making sense of the data. A particular employee of concern may be ranked high on a list of most anomalous users. All of this information should be analyzed and compared to a baseline, which can be that same employee's previous computer usage or a baseline of a peer doing the same type of job. The challenge is understanding why that particular person is anomalous and what a change in their baseline really means.
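To make the quantity-versus-quality calibration question concrete, here is a minimal sketch of an additive risk score. All weights, indicator names, and the per-indicator cap are hypothetical illustrations, not a reference scoring model:

```python
# Hypothetical weighted risk score combining static attributes (clearance,
# position, privileges) with behavioral indicators. All weights are
# illustrative only -- each organization must calibrate its own.
STATIC_WEIGHTS = {"top_secret_clearance": 10, "system_admin": 15, "super_admin": 20}
INDICATOR_WEIGHTS = {"bad_keyword_hit": 5, "blacklisted_site": 8,
                     "unauthorized_share_access": 12, "frequent_printing": 4}

def risk_score(attributes, indicator_counts, cap_per_indicator=3):
    """Sum static attribute weights, then add weighted indicator counts.

    Capping each indicator's contribution keeps many minor violations
    from automatically outranking one egregious violation -- one possible
    answer to the quantity-vs-quality question above.
    """
    score = sum(STATIC_WEIGHTS[a] for a in attributes)
    for name, count in indicator_counts.items():
        score += INDICATOR_WEIGHTS[name] * min(count, cap_per_indicator)
    return score

# Many minor violations vs. one egregious one:
minor = risk_score(["top_secret_clearance"], {"frequent_printing": 10})
major = risk_score(["top_secret_clearance"], {"unauthorized_share_access": 1})
```

Whether `minor` or `major` ends up higher is entirely a function of the chosen weights and cap, which is exactly why calibration is an analyst and policy decision, not just a tooling one.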
Quick Win #1: Foster communication among all insider threat hub analysts so that decisions are made with all appropriate information.
Quick Win #2: Leverage technology (but don't 100% rely on it) to help you make sense of the data that you are seeing. Be cognizant of developing a baseline over time and comparing any deviations to it.
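One way to operationalize "comparing deviations to a baseline" is to keep a per-user history of an activity metric and flag large statistical deviations. The z-score approach, metric, and threshold below are simplifying assumptions for illustration, not a prescribed detection method:

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Flag today's count if it deviates from the user's own baseline
    by more than `threshold` standard deviations (a simple z-score test)."""
    if len(history) < 2:
        return False  # not enough history to form a baseline
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean  # flat baseline: any change is a deviation
    return abs(today - mean) / stdev > threshold

# Hypothetical metric: pages printed per day over the past week.
baseline = [12, 9, 11, 10, 13, 8, 12]
print(is_anomalous(baseline, 95))  # True -- a large spike over baseline
print(is_anomalous(baseline, 11))  # False -- within normal variation
```

The same structure works for a peer-group baseline: substitute the history of employees doing the same type of job for the individual's own history. The harder, human part remains interpreting *why* the deviation occurred.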
False positives, resulting from the analysis of the data, can be quite frustrating and time consuming for the analyst. Generally speaking, false positives in the context of insider threat can be thought of as a system firing an alert when there is nothing malicious there. For example, consider an insider threat hub that uses a particular "bad" keyword list. Now suppose one of the words on the list is "DWI" (as in driving while intoxicated). Is the system going to generate an alert every time it encounters the word "bandwidth," which contains the substring "dwi"? The ability of the system and the analyst to reduce the amount of false positives is paramount for success. However, there is also the conflicting pressure of the organization's appetite for risk and simply not wanting to miss a single potential threat.
Quick Win #1: Review the blog post titled: "Navigating the Insider Threat Tool Landscape"
A related step is to familiarize yourself with natural language processing, the use of Regular Expressions, and other techniques to reduce false positives.
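For instance, anchoring keywords with regular-expression word boundaries prevents a substring such as "dwi" inside "bandwidth" from firing an alert. A minimal Python sketch (the sample sentences are invented for illustration):

```python
import re

# Naive substring matching fires on "bandwidth" because it contains "dwi".
assert "dwi" in "bandwidth".lower()

# Word-boundary matching (\b) only fires on the standalone token "DWI".
pattern = re.compile(r"\bDWI\b", re.IGNORECASE)

print(bool(pattern.search("Employee arrested for DWI last weekend")))   # True
print(bool(pattern.search("The network bandwidth report is attached"))) # False
```

Word boundaries are only a first step; NLP techniques such as tokenization and part-of-speech context can reduce false positives further when simple patterns are not enough.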
Quick Win #2: Ensure that you understand what the organization (or the insider threat program's designated approving authority) considers an acceptable level of risk. The organization should have completed a risk assessment at various points during the implementation and operation of its hub.
We recommend that you consider each of these challenges and have the appropriate conversations with the members of the insider threat program and specifically those working with or in the hub.
Be sure to check back for part two of this blog series where I will be covering five additional challenges facing insider threat programs and hub analysts, including: (1) false negatives, (2) effectiveness measures, (3) use of insider threat tools, (4) types of insider incidents (malicious or unintentional), and (5) privacy, legal, civil liberty, and GDPR considerations.
Please send questions, comments, or feedback to email@example.com.