
5 Best Practices to Prevent Insider Threat

Randy Trzeciak

Insider threat continues to be a problem, with approximately 50 percent of organizations experiencing at least one malicious insider incident per year, according to the 2017 U.S. State of Cybercrime Survey. Although the attack methods vary depending on the industry, the primary types of attacks identified by researchers at the CERT Insider Threat Center--theft of intellectual property, sabotage, fraud, and espionage--continue to hold true. In our work with public and private industry, we continue to see that insider threats are influenced by a combination of technical, behavioral, and organizational issues. To address these threats, we have published the fifth edition of the Common Sense Guide to Mitigating Insider Threats, which highlights policies, procedures, and technologies to mitigate insider threats in all areas of the organization. In this blog post, excerpted from the latest edition of the guide, I highlight five best practices that are important first steps for an organization establishing a program to prevent and detect insider threats.

What's New in the Latest Guide

While intellectual property (IP) theft, IT sabotage, fraud, and espionage have continued to appear as the primary forms of malicious insider threats, new research has led us to understand the patterns related to unintentional insider threats. These threats represent a significant risk for organizations and potential attack vectors for malicious insiders and external adversaries.

The best practices included in the fifth edition of the Common Sense Guide are reordered to better align with the development of an insider threat program. Significant updates have been made to the best practices "Know and protect your critical assets," "Develop a formalized insider threat program," "Deploy solutions for monitoring employee actions and correlating information from multiple data sources," and "Establish a baseline of normal behavior for both networks and employees."

The revisions of practices focused on data analysis provide insider threat programs with potential data sources and methods of analysis. These practices reflect our recent experience with monitoring and analysis capabilities in operational environments.

The remainder of this post details five practices, with emphasis on six groups within an organization--Human Resources, Legal, Physical Security, Data Owners, Information Technology, and Software Engineering--and provides quick-reference tables noting to which of these groups each practice applies.

  • Know and protect your critical assets
  • Develop a formalized insider threat program
  • Clearly document and consistently enforce policies and controls
  • Deploy solutions for monitoring employee actions and correlating information from multiple data sources
  • Incorporate malicious and unintentional insider threat awareness into periodic security training for all employees

Know and Protect Your Critical Assets

Reference table indicating an organization's Human Resources, Legal, Physical Security, Data Owners, Information Technology, and Software Engineering departments are responsible for knowing and protecting critical assets.

The basic function of an insider threat program is to protect the assets that provide your organization with a competitive advantage. A critical asset is something of value that, if destroyed, altered, or otherwise degraded, would impair its confidentiality, integrity, or availability and have a severe negative effect on the organization's ability to support essential missions and business functions.

Critical assets can be both physical and logical and can include facilities, systems, technology, and people. An often-overlooked aspect of critical assets is intellectual property. This may include proprietary software, customer data for vendors, schematics, and internal manufacturing processes. The organization must keep a close watch on where its data resides, both at rest and in transit.

Current technology allows seamless collaboration, but also allows the organization's sensitive information to be easily removed from the organization. A complete understanding of critical assets (both physical and logical) is invaluable in defending against attackers who will often target the organization's critical assets.

The following questions help the organization to identify and prioritize the protection of its critical assets:

  • What critical assets do we have?
  • Do we know the current state of each critical asset?
  • Do we understand the importance of each critical asset and can we explain why it is critical to our organization?
  • Can we prioritize our list of critical assets?
  • Do we have the authority, money, and resources to effectively monitor our critical assets?
  • Do we know who has (and who should have) authorized access to those assets?
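The prioritization questions above can be turned into a simple, repeatable scoring exercise. The sketch below is a hypothetical illustration, not part of the guide: the asset fields, weights, and example inventory are all assumptions.

```python
# Hypothetical sketch: ranking critical assets by impact and exposure.
# The fields, weights, and inventory are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    impact: int      # 1-5: harm to the mission if lost or degraded
    exposure: int    # 1-5: how broadly the asset is accessible
    monitored: bool  # do we currently watch access to it?

def priority(asset: Asset) -> int:
    """Simple risk score: impact weighted over exposure; unmonitored
    assets are bumped so they surface first for review."""
    score = asset.impact * 2 + asset.exposure
    return score + (3 if not asset.monitored else 0)

inventory = [
    Asset("customer database", impact=5, exposure=3, monitored=True),
    Asset("source code repository", impact=4, exposure=4, monitored=False),
    Asset("badge access system", impact=3, exposure=2, monitored=True),
]

# Review the inventory highest-risk first.
for asset in sorted(inventory, key=priority, reverse=True):
    print(f"{priority(asset):>2}  {asset.name}")
```

The exact weights matter less than having an agreed-upon, documented rubric that the organization revisits as assets and missions change.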

Develop a Formalized Insider Threat Program

Chart indicating organizations that have been protected from insider threat.

The trust that organizations place in their workforce can leave them vulnerable to malicious insiders, who often use particular methods to hide their illicit activities. Only by taking commensurately specialized action can organizations effectively detect, prevent, and respond to the unique threat from insiders. The best time to develop a process for mitigating malicious insider incidents and the unintentional insider threat is before they occur, not as one is unfolding. When an incident does occur, the process can be modified as appropriate based on postmortem results from prior incidents.

Increasingly, organizations, including the federal government, are recognizing the need to counter insider threats and are doing it through specially focused teams. In October 2011, President Barack Obama signed Executive Order (E.O.) 13587, Structural Reforms to Improve the Security of Classified Networks and the Responsible Sharing and Safeguarding of Classified Information. It requires all federal agencies that have access to classified information and systems to have a formal insider threat program. In addition, in May 2016, the Department of Defense published Change 2 to DoD 5220.22-M, "National Industrial Security Program Operating Manual (NISPOM)." NISPOM Change 2 requires "contractors to establish and maintain an insider threat program to detect, deter and mitigate insider threats."

The CERT Insider Threat Center, along with other organizations such as the Intelligence and National Security Alliance, has documented the most common components found in insider threat programs within government as well as non-government organizations. This practice recommends that a program include, at a minimum, the following components:

  • Formalized and Defined Program: Directives, authorities, mission statement, leadership intent, governance, budget.
  • Organization-wide Participation: Active participation from all components, which eases data access and sharing and provides visible senior-leader support for the program, especially when data necessary to an insider threat program is siloed (HR, Security, IA, CI, LE, IG, Finance, etc.).
  • Oversight of Program Compliance and Effectiveness: Governance structure, such as an Insider Threat Program Working Group/Change Control Board that helps the program manager produce standards and operating procedures for the insider threat program and recommends changes to existing practices and procedures. Also, an Executive Council/Steering Group that approves changes recommended by the working group/change control board. Oversight includes annual self-assessments, as well as third-party assessments of the compliance and effectiveness of the program.
  • Confidential Reporting Mechanisms and Procedures: Not only enable reporting of suspicious activity, but when closely coordinated with the insider threat program (InTP), these ensure that legitimate whistleblowers are not inhibited or inappropriately monitored by an insider threat program.
  • Insider Threat Incident Response Plan: More than just a referral process to outside investigators, these plans detail how alerts and anomalies will be identified, managed, and escalated, including timelines for every action and formal disposition procedures.
  • Communication of Insider Threat Events: Appropriate sharing of event information with the correct components, while maintaining confidentiality and protecting privacy until allegations are fully substantiated. Includes communication of insider threat trends, patterns, and probable future events so that policies, procedures, training, etc., can be modified as required.
  • Protection of Employees' Civil Liberties and Rights: Legal Counsel review at all stages of program development, implementation, and operation.
  • Policies, Procedures, and Practices that support the InTP: Formal documents that detail all aspects of the program (including mission, scope of threats, directives, instructions, standard operating procedures).
  • Data Collection and Analysis Techniques and Practices: The user activity monitoring (UAM) data collection and analysis portion of a program. Requires detailed documentation of all aspects of data collection, processing, storage, and sharing to ensure compliance with privacy and civil liberties requirements.
  • Insider Threat Training and Awareness: Provides training for three aspects of the program: insider threat awareness training for all organization personnel; training for InTP personnel; and role-based training for mission specialists who are likely to observe certain aspects of insider threat events (e.g., HR, IA, CI, LE, Behavioral Sciences, IG, Finance).
  • Prevention, Detection, and Response Infrastructure: Network defenses, host defenses, physical defenses, tools and processes, and other components.
  • Insider Threat Practices Related to Trusted Business Partners: Agreements, contracts, and processes reviewed for insider threat prevention, detection, and response capabilities.
  • Insider Threat Integration with Enterprise Risk Management: Ensure all aspects of risk management include insider threat considerations (not just outside attackers) and possibly a standalone component for insider threat risk management.

Clearly Document and Consistently Enforce Policies and Controls


A consistent, clear message on all organizational policies and procedures will reduce the chance that employees will inadvertently damage the organization or lash out at it for a perceived injustice. Organizations must ensure that policies are fair and that sanctions for any violation are not disproportionate.

Policies or controls that are misunderstood, not communicated, or inconsistently enforced can breed resentment among employees and potentially result in harmful insider actions. For example, in multiple cases in the CERT insider threat database, insiders took IP they had created to a new job, not understanding that they did not own it. They were quite surprised when they were arrested for a crime they did not know they had committed.

Organizations should ensure policies and controls provide:

  • concise and coherent documentation, including reasoning behind the policy, where applicable
  • consistent and regular employee training on the policies and their justification, implementation, and enforcement

Organizations should be particularly clear on policies regarding

  • acceptable use and disclosure of the organization's systems, information, and resources
  • use of privileged or administrator accounts
  • ownership of information created as a work product
  • evaluation of employee performance, including requirements for promotion and financial bonuses
  • processes and procedures for addressing employee grievances

As individuals join the organization, they should receive a copy of organizational policies that clearly lay out what is expected of them and the consequences of violations. Organizations should retain evidence that each individual has read and agreed to organizational policies. System administrators and anyone with privileged access to information systems present a unique challenge to the organization.

Employee disgruntlement has been a recurring factor in insider compromises, particularly in cases of insider IT sabotage. In many cases, the insider's disgruntlement was caused by some degree of unmet expectation, including

  • insufficient salary increase or bonus
  • limitations on use of company resources
  • diminished authority or responsibilities
  • perception of unfair work requirements
  • perceptions of being treated poorly by co-workers

Clear documentation of policies and controls can prevent employee misunderstandings that can lead to unmet expectations. Consistent enforcement can ensure that employees do not feel they are being treated differently from or worse than other employees.

Deploy Solutions for Monitoring Employee Actions and Correlating Information from Multiple Data Sources


Effective insider threat programs collect and analyze information from many different sources across their organizations. Simply logging all network activity is not sufficient to protect an organization from malicious insider activity. As the number of data sources used for insider threat analysis increases, so too does an organization's ability to produce more relevant alerts and make informed decisions regarding potential insider activity. The volume of data that must be collected, aggregated, correlated, and analyzed drives the need for tools that can fuse data from disparate sources into an environment where alerts can be developed that identify actions indicative of potential insider activity. Solutions for monitoring employee actions should be implemented using a risk-based approach and focusing first on the organization's critical assets.

User activity can be monitored at two levels: at the network and at the host. Many actions performed on computers involve network communications, often allowing network-based analysis to provide a sufficient view into user activity. The volume of information necessary for network-based monitoring is often much less than is required for collecting host-based logs and other information from every system on the network.

Insider-threat-related activity identifiable through network analysis can include authentication, access to sensitive files, unauthorized software installations, web browsing activity, email/chat, printing, and many others. However, there are some actions the organization may be interested in monitoring that do not leave any traces on the network. These can include copying local files to removable media, local privilege escalation attempts, and many others. These actions can be monitored through host-based log collection as well as through host-based monitoring systems.

One of the most powerful tools an organization can use to perform event correlation is a security information and event management (SIEM) solution. SIEM tools are designed to provide a centralized view of a wide array of logs from sources including databases, applications, networks, and servers. SIEM tools provide the ability to write queries or generate alerts that pull together data from previously disparate data sources, enhancing potential analytic capabilities for insider threat prevention, detection, and response. A SIEM system allows an organization to continuously monitor employee actions. This further allows the organization to establish a baseline level of normal activity as well as detect irregular events. Organizations can use a SIEM system to conduct more granular monitoring of privileged accounts. The SIEM system should be able to highlight events related to any actions a normal user cannot perform, such as installing software or disabling security software. Increasing the auditing level for certain events will create additional audit records that must be reviewed.

The SIEM system will facilitate sorting through these events by highlighting those that need further review and discarding background noise. Organizations can also use a SIEM system for enhanced monitoring. This is especially important for employees who are leaving the organization or who have violated or are suspected of violating organizational policy. Based on the CERT Insider Threat Center's research and feedback from industry, malicious insiders often conduct illicit activities within 90 days of their termination.
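A SIEM rule of the kind described above, highlighting actions a normal user should not be able to perform, can be sketched in a few lines. The event schema, action names, and privileged-user list below are illustrative assumptions, not a real SIEM API.

```python
# Illustrative sketch of a SIEM-style rule: flag privileged actions
# performed by accounts not on the privileged-user list. The event
# fields and action names are assumptions for this example.

PRIVILEGED_ACTIONS = {"install_software", "disable_security_agent"}
PRIVILEGED_USERS = {"admin.jones"}  # hypothetical authorized admins

def privileged_action_alerts(events):
    """Return events where a non-privileged account performed an
    action normal users should not be able to perform."""
    return [
        e for e in events
        if e["action"] in PRIVILEGED_ACTIONS
        and e["user"] not in PRIVILEGED_USERS
    ]

events = [
    {"user": "jdoe", "action": "install_software", "host": "ws-114"},
    {"user": "admin.jones", "action": "install_software", "host": "srv-02"},
    {"user": "jdoe", "action": "open_document", "host": "ws-114"},
]

for alert in privileged_action_alerts(events):
    print(f"ALERT: {alert['user']} performed {alert['action']} on {alert['host']}")
```

In practice this kind of filter runs continuously inside the SIEM system, with the alert volume tuned so analysts review exceptions rather than raw logs.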

When an employee submits his or her resignation, the HR team should notify the insider threat program, which should then notify the information assurance (IA) team so that its staff can review the employee's actions over at least the past 90 days, and going forward, to detect potential insider activity. HR should also alert IA if an employee is reprimanded or counseled for violating a work policy. Ideally, the communication between HR and IA should take place between representatives from each division working in the insider threat program.
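The resignation-triggered review described above might be sketched as follows; the log format, field names, and dates are assumptions for illustration.

```python
# Hedged sketch: when HR reports a resignation, select that employee's
# logged activity from the preceding 90 days for analyst review.
# The event records and dates below are illustrative assumptions.

from datetime import date, timedelta

REVIEW_WINDOW = timedelta(days=90)

def lookback_events(events, user, notice_date):
    """Events for `user` in the 90 days up to the resignation notice."""
    start = notice_date - REVIEW_WINDOW
    return [e for e in events
            if e["user"] == user and start <= e["date"] <= notice_date]

log = [
    {"user": "jdoe", "date": date(2017, 11, 20), "action": "bulk_download"},
    {"user": "jdoe", "date": date(2017, 5, 1), "action": "login"},
    {"user": "asmith", "date": date(2017, 11, 25), "action": "login"},
]

# jdoe resigns December 1; only events on or after September 2 qualify.
review = lookback_events(log, "jdoe", notice_date=date(2017, 12, 1))
```

The same query, run going forward from the notice date, supports the continued monitoring of the departing employee until separation.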

The insider threat program provides a way to quickly and seamlessly respond to insider incidents by including representation from all key stakeholders within an organization. SIEM tools are not limited to information security events. Physical security events should also be sent to the SIEM system for analysis, creating a more complete set of events to detect insider activity. For example, if an organization sends employee badge access records to a SIEM system, it would be possible to detect unauthorized account usage by checking to see if an employee who is logged into a workstation locally is physically present within the facility. This same method could also be used to detect unauthorized remote access if an employee is physically in the facility. It would also be possible to detect after-hours physical access and correlate it with logical access logs. It should be noted that many alerts, triggers, and indicators will be organization specific. Successful insider threat indicator development depends on an understanding of the organization's culture and behavioral norms.
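The badge/login correlation described above can be sketched as a simple cross-check. The record formats below are assumptions for the example; a real deployment would pull both feeds from the SIEM system.

```python
# Illustrative correlation of physical and logical access: a local
# console login by someone whose badge never entered the building that
# day is suspicious, as is a remote login by someone who is badged in.
# Record formats are assumptions for this sketch.

def correlate(badge_ins, logins):
    """badge_ins: set of (user, day) pairs with a badge-in record.
    logins: list of {"user", "day", "kind"} with kind "local"/"remote".
    Returns human-readable anomaly descriptions."""
    anomalies = []
    for e in logins:
        badged = (e["user"], e["day"]) in badge_ins
        if e["kind"] == "local" and not badged:
            anomalies.append(f"{e['user']}: local login without badge-in on {e['day']}")
        if e["kind"] == "remote" and badged:
            anomalies.append(f"{e['user']}: remote login while badged in on {e['day']}")
    return anomalies

badge_ins = {("asmith", "2017-12-04")}
logins = [
    {"user": "jdoe", "day": "2017-12-04", "kind": "local"},    # no badge-in
    {"user": "asmith", "day": "2017-12-04", "kind": "remote"}, # badged in
    {"user": "asmith", "day": "2017-12-04", "kind": "local"},  # consistent
]

for anomaly in correlate(badge_ins, logins):
    print("ALERT:", anomaly)
```

As the post notes, thresholds and exceptions for rules like this are organization specific: shift patterns, shared workstations, and travel policies all change what counts as an anomaly.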

Incorporate Malicious and Unintentional Insider Threat Awareness into Periodic Security Training for All Employees


Without broad understanding and buy-in from the organization, technical or managerial controls will be short lived. Periodic security training that includes malicious and unintentional insider threat awareness supports a stable culture of security in the organization.

All employees need to understand that insider incidents do occur and have severe consequences. In addition, it is important for them to understand that malicious insiders do not fit a particular profile. Their technical abilities have ranged from minimal to advanced, and their ages have ranged from late teens to retirement age. No standard profile exists that can be used to identify a malicious insider.

The CERT Insider Threat Center's collection of insider threat incidents reveals a wide range of people who have harmed the organization, from low-wage earners to executives, and new hires to seasoned company veterans. There is no way to use demographic information to easily identify a potentially malicious insider. However, there are ways to identify higher risk employees and implement mitigation strategies to reduce their impact on the organization should they choose to attack. The same can be said of non-malicious insider incidents.

Cases reveal that those who cause harm without malicious intent also fail to fit a particular profile. Their behaviors and technical skills vary drastically. Security awareness training should encourage employees to identify insider threats not by stereotypical characteristics but by their behavior, including

  • threatening the organization or bragging about the damage the insider could do to the organization or coworkers
  • downloading sensitive or proprietary data within 30 days of resignation
  • using the organization's resources for a side business or discussing starting a competing business with co-workers
  • attempting to gain employees' passwords or to obtain access through trickery or exploitation of a trusted relationship (often called "social engineering")

Awareness training for the unintentional insider threat should encourage employees to identify potential actions or ways of thinking that could lead to an unintentional event, including

  • level of risk tolerance--someone willing to take more risks than the norm
  • attempts at multi-tasking--individuals who multi-task may be more likely to make mistakes
  • large amounts of personal or proprietary information shared on social media
  • lack of attention to detail

Managers and employees should be trained to recognize social networking in which an insider engages other employees to join his or her schemes, particularly to steal or modify information for financial gain. Alerting employees to this possibility and its consequences may make them more aware of such manipulation and more likely to report it to management. Social engineering is often associated with attempts to gain physical or electronic access to an organization's system via accounts and passwords. For example, an attacker who has gained remote access to a system may need to use another employee's account to access a server containing sensitive information.

In addition, some cases in the CERT insider threat database reveal that social engineering is sometimes an intermediary step to malicious access or an attempt to obfuscate the malicious insider's activities. Organizations should train their employees to be wary of unusual requests, even ones that do not concern accounts and passwords. This includes social engineering by outsiders to gain access to an insider's credentials. Training programs should create a security culture appropriate for the organization and include all personnel. The training program should be offered at least once a year.

Looking Ahead: Workplace Violence in Insider Threat

Along with the increased focus on unintentional insider threats, CERT's formal definition of an insider threat has recently been updated.

Insider Threat - the potential for an individual who has or had authorized access to an organization's assets to use their access, either maliciously or unintentionally, to act in a way that could negatively affect the organization.

Our intent was to develop a single definition for insider threat that

  • covers malicious and non-malicious (unintentional) insider threats
  • covers cyber and physical impacts
  • applies to both government and industry
  • is clear, concise, consistent with existing definitions of 'threat', and broad enough to cover all insider threats

The CERT Insider Threat Center recognizes workplace violence and physical threats as an important area for potential future work, and our current corpus is expanding to include these incidents.

Meanwhile, insider threat will continue to be a problem. A recent survey conducted by SpectorSoft shows that insider attacks are on the rise, with organizations experiencing an average of 3.8 attacks per year.

Additional Resources

Read the latest edition of the Common Sense Guide to Mitigating Insider Threats, which includes a revised list of information security best practices, a new mapping of the guide's practices to established security standards, a new breakdown of the practices by organizational group, a new look at considerations for employee privacy, and new checklists of activities for each practice.

Read the 2017 U.S. State of Cybercrime Survey, which was conducted in partnership with Forcepoint, CSO, U.S. Secret Service, and the CERT Division of the Software Engineering Institute at Carnegie Mellon University.

Read other posts in our Best Practices Series on the SEI Blog.
