Just as online marketplaces made shoplifting worse by providing thieves with an easy avenue to sell their stolen goods, the dark web is raising risk by offering employees a venue for doing the dirty business of selling stolen information. According to a new report, the number of employees who did just that doubled last year. And not all retailers know how to keep data secure.
Whether it’s information about upcoming corporate earnings or trade secrets, the dark web is helping to monetize employee theft of data and information. The dark web is full of individuals recruiting retail employees to sell customers’ credit card numbers, according to “Monetizing the Insider: The Growing Symbiosis of Insiders and the Dark Web,” a report by security firms RedOwl and Intsights. “Our research also showed continued recruiting of retail workers that have access to consumer credit card information. Sophisticated actors will then engage in carding, a generic name referring to the extraction of money from stolen credit cards, for personal profit. Typically, recruiters target lower-ranked employees, such as cashiers, whose help is needed to use stolen credit cards.”
More ambitious schemes include recruiting insiders to provide information that a threat actor can use to place better-informed stock market bets, for which the insider receives a commission. Or sophisticated hackers on the dark web will arm insiders with the tools and knowledge they need to steal data and commit fraud.
“The dark web has an active community of sophisticated buyers and collaborators who are aiding in the monetization and even weaponization of malicious insider activity,” according to the 2017 RedOwl report. “The ease with which employees can now learn about and access the dark web means that it will continue to grow in its impact over the coming years.”
Do retail organizations really know how to keep data secure? To combat the problem, risk and security teams “should join the growing number of organizations that are actively building insider threat programs,” the report advises. According to a 2016 Gartner survey, presented at the 2016 Gartner Security Summit, only 18 percent of enterprises have a formal insider threat program in place, which “means many insider threat events often go completely undetected.”
Do You Need an Insider Threat Program?
There are several aspects of insider threat mitigation, including risk indicators of malicious insider activity; lessons learned from real-life fraud, theft, and sabotage incidents; and risk reduction strategies and controls, such as a comprehensive employee termination procedure, data security agreements for any cloud services, and separation of duties. Strategies and controls can and should be implemented within departments where malicious insiders can cause harm.
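To make one of those controls concrete, the sketch below shows a separation-of-duties check in the simplest possible form: the employee who requests a payment can never be the one who approves it. The record layout, field names, and employee IDs are assumptions for illustration, not part of any specific system.

```python
# Illustrative separation-of-duties check (assumed record layout and IDs).
def approve_payment(request: dict, approver_id: str) -> bool:
    """Reject approval when the requester and the approver are the same employee."""
    if approver_id == request["requested_by"]:
        raise PermissionError("Separation of duties: requester cannot self-approve")
    return True

# Example usage with a hypothetical purchase order record.
payment = {"id": "PO-1041", "amount": 12500.00, "requested_by": "emp-207"}
approve_payment(payment, approver_id="emp-318")   # OK: a different employee approves
# approve_payment(payment, approver_id="emp-207") # would raise PermissionError
```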
But should you take the next step? What does rolling those strategies into a formal program entail?
If an organization has few insider incidents, a dedicated insider threat team may be unnecessary. A risk analysis should determine whether the threat justifies the funding and the time it demands of insider threat team members. But if the risk is sufficient, a program may be worthwhile because the consequences are often substantial: real insider incidents have ended companies and crushed stock prices.
The primary value of a formal program is to push insider threat detection upstream, suggests John Wetzel, president of Securas Consulting Group. Security measures can deter malicious insiders and provide evidence of insider wrongdoing, but a program—under which leaders from different units meet and discuss the insider threat—is a better avenue for identifying an individual who is exhibiting behavioral indicators that portend trouble.
Wetzel said an insider threat program should include representatives from many departments, such as ethics, IT and data owners, human resources, security, asset protection and loss prevention, legal, business unit managers, and internal audit. The team should meet quarterly and act as a fusion center for information on personnel. “Talk about people and the threats they pose. About the world and changes that impact risk. About employees and their emotional problems,” said Wetzel. “Suddenly people are late, or showing signs of disloyalty … those are ordinarily things that security wouldn’t learn about.”
Those types of discussions have obvious employee-relations ramifications, so organizations need to decide whether an insider threat program is a good cultural fit. Wetzel advises that a program be an “open procedure, closed process,” meaning employees are notified that an insider threat program exists but are assured of confidentiality and privacy.
Such a program also raises legal issues, including those relating to federal and state harassment and discrimination laws and the Privacy Act of 1974, and Wetzel says a legal representative must be involved in developing an insider threat program. Organizations must also prepare for the fact that “when you dig into people’s lives, you’re going to find out odd things,” said Wetzel. He warns against legalese hijacking the presentation of the program to staff. “The policies should be simple. Legal stuff is important, but you want people to actually read it and get it,” said Wetzel.
Wetzel noted several recent examples of insiders at US companies providing corporate secrets to firms in China. The primary goal of a program is to identify these high-risk individuals, which is not the same as discrimination. “Am I more likely to be approached if I travel to Asia and I am Asian? Of course,” said Wetzel. “These people should be identified, and security should go to them and warn them that they might be approached and to give them tools to handle it. It’s not discrimination, it’s a recognition of risk.”
How to Keep Data Secure: Additional Best Practices
Like any program, one targeting the insider threat requires strong senior management support and a knowledgeable manager in charge. The Software Engineering Institute (SEI) at Carnegie Mellon University offers training on program development and certificate programs for program managers. In addition to the best practices for keeping data secure outlined above, SEI suggests the following:
- Establish a vision and define roles and responsibilities for everyone on the insider threat team.
- Develop criteria and thresholds for conducting inquiries, referring to investigators, and requesting prosecution.
- Establish an incident response plan that addresses incidents perpetrated by insiders, has an escalation chain, and delineates authorities for deciding disposition.
- Follow procedures for security and discretion when using email; people outside the team, such as system administrators and administrative assistants, might have access to team emails and could themselves be persons of interest or be friends with a person of interest.
- Request data carefully when the insider threat team is conducting an inquiry. For example, if the team is inquiring about a person in the accounting department and needs system logs to establish login and logoff times, it should request logs for a larger data set, such as the entire accounting department plus another team within the organization, to avoid tipping off either the suspect or the data owner (see the sketch following this list).
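The sketch below illustrates that last practice: scope the log request to two whole departments, then filter for the subject only on the insider threat team’s own systems. The department names, date range, CSV layout, and field names are assumptions for the example, not any particular product’s API.

```python
# Hypothetical sketch: request logs broadly, filter for the subject locally.
import csv
from datetime import datetime

def build_log_request(subject_department: str, decoy_department: str) -> dict:
    """Request login/logoff logs for two whole departments, never a single user,
    so neither the suspect nor the data owner can infer who is under inquiry."""
    return {
        "departments": sorted([subject_department, decoy_department]),
        "event_types": ["login", "logoff"],
        "date_range": ("2017-01-01", "2017-03-31"),  # quarter under review (example)
    }

def filter_subject_events(log_csv_path: str, subject_user_id: str) -> list:
    """Once the broad export arrives, extract only the subject's events
    on the insider threat team's own systems."""
    events = []
    with open(log_csv_path, newline="") as f:
        for row in csv.DictReader(f):  # expected columns: user_id, event_type, timestamp
            if row["user_id"] == subject_user_id:
                events.append((row["event_type"],
                               datetime.fromisoformat(row["timestamp"])))
    return sorted(events, key=lambda e: e[1])

# Example: ask for all of Accounting plus Facilities, then filter for one user later.
request = build_log_request("Accounting", "Facilities")
```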
The RedOwl report stresses the need for vigilance across the employee base and for examining motivation. “Security teams must monitor employee behavior across a broad array of channels that identify suspicious employee activity, but also help understand negative employee sentiment.”
The right technology is also important. “Building an effective insider threat program requires a robust security ecosystem built on a foundational capability to see across all employee activity and spotlight unwanted behavior while respecting employee privacy,” according to RedOwl.
Detection technology is a key component of insider threat programs, according to the Insider Threat Task Force of the Intelligence and National Security Alliance (INSA). The most common elements are content monitoring; custom site blocking; a data loss prevention (DLP) tool with checks for “dirty words”; web monitoring tools; a specialized insider threat tool to capture keystrokes and voice; foreign travel tracking; email monitoring; and log aggregation. “Some of these tools are built solely for detecting insider threats, while others are re-purposed from existing network or security-monitoring tools,” according to INSA. The organization recommends specialized technologies that detect and prevent insider threats by mining diverse data sets for anomalies, such as employees’ computer logins or physical access times that fall outside normal work hours.
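As a rough illustration of the kind of anomaly mining INSA describes, the minimal sketch below flags logins that fall outside normal business hours or on weekends. The 8:00–18:00 window and the sample records are assumptions for the example, not values from the report or any vendor tool.

```python
# Minimal sketch: flag login events outside assumed business hours or on weekends.
from datetime import datetime

WORK_START_HOUR = 8   # assumed start of normal business hours
WORK_END_HOUR = 18    # assumed end of normal business hours

def is_anomalous(login: datetime) -> bool:
    """A login is anomalous if it happens on a weekend or outside business hours."""
    outside_hours = login.hour < WORK_START_HOUR or login.hour >= WORK_END_HOUR
    weekend = login.weekday() >= 5  # Saturday = 5, Sunday = 6
    return outside_hours or weekend

# Example login events (user, timestamp) that a log-aggregation tool might export.
events = [
    ("alice", datetime(2017, 11, 13, 9, 15)),   # Monday morning: normal
    ("bob",   datetime(2017, 11, 11, 23, 42)),  # Saturday night: flagged
    ("carol", datetime(2017, 11, 14, 3, 5)),    # 3 a.m. Tuesday: flagged
]

for user, ts in events:
    if is_anomalous(ts):
        print(f"Review: {user} logged in at {ts.isoformat()}")
```

In practice a program would baseline each employee’s own schedule rather than apply a single fixed window, but the principle of comparing activity against expected patterns is the same.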
This post was originally published in 2017 and was updated November 14, 2017.