Sentiment Analysis: Are You Feeling Risky?

How do you feel when you are at work? Are you happy, sad or stressed out? Are you thinking about quitting, breaking a few rules or even committing fraud?

Be careful: Companies are increasingly using sentiment analysis technology to monitor internal communications in order to better understand employees’ moods and assess any potential risks. “Sentiment analysis has become a form of risk management and is emerging as a useful risk control tool for a variety of businesses,” said Vasant Dhar, a data scientist and director of New York University’s Center for Business Analytics at the Stern School of Business. Firms in highly regulated, compliance-oriented or risk-focused industries, such as financial services, health care and insurance, are starting to use the technology to identify and address regulatory risk issues, compliance problems and potential fraud.

What Is Sentiment Analysis?

Sentiment analysis studies the mood, opinions and attitudes expressed in written text. It aims to discover the emotions behind words in order to determine whether a communication suggests a positive, negative or neutral sentiment.
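In its simplest form, this classification can be sketched as a toy lexicon-based scorer, where words are counted against small positive and negative vocabularies. The word lists and function name below are illustrative only; production systems rely on machine learning models and far richer linguistic features.

```python
# Toy lexicon-based sentiment classifier -- an illustrative sketch,
# not a production approach. Word lists here are hypothetical.

POSITIVE = {"happy", "great", "excellent", "pleased", "good"}
NEGATIVE = {"sad", "stressed", "angry", "terrible", "bad"}

def classify_sentiment(text: str) -> str:
    """Label text positive, negative, or neutral by word counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify_sentiment("I am happy with this great team"))    # positive
print(classify_sentiment("I am stressed and sad about work"))   # negative
```

Even this sketch hints at the limitations discussed later in the article: a simple word count cannot detect sarcasm, euphemism or context.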

When first developed, sentiment analysis was conducted manually, but it is now performed on corporate communications automatically or in partnership with human analysts. It is also used in combination with other computer-based tools like text mining, text analytics, natural language processing, machine learning, statistical modeling, big data analytics and computational linguistics.

This analysis can help organizations quickly and easily study large quantities of unstructured data culled from employee communications, including company chat rooms, wikis, customer call logs, emails and instant messages. Such information can then be used to provide evidence of employee satisfaction levels, anticipate signs of employee churn, identify new business opportunities or sales strategies, and determine if employees are a source of serious risk.

Various factors are fueling the growth of this technology as an employee-monitoring platform. First, U.S. regulations impacting industries such as banking, financial services, insurance and pharmaceuticals often require ongoing monitoring of communications as a consequence of infractions. Regulatory frameworks such as the Foreign Corrupt Practices Act also mandate that U.S. firms operating in global markets closely monitor their global workforce for instances of bribery, fraud or money laundering.

In addition, concerns about data security in the wake of recent leaks from internal actors like Edward Snowden and Chelsea Manning have caused many firms to explore their options for preventative measures. Some companies have already used sentiment analysis tools on social media to gauge consumer brand preferences, leading many to wonder if such tools would also work internally.

With the increased focus, sentiment analysis continues to evolve and improve. Some sentiment analysis providers are starting to offer facial coding of video, speech analysis of audio streams and assessment of affective states. Others are even gearing up for the application of neuroscience and wearables to study customer or employee physiological states.

Limitations and Data Privacy Considerations

Despite the enthusiasm for sentiment analysis, many are also quick to highlight its limitations. “Sentiment analysis by itself is not really indicative of a particular risk,” said Rob Metcalf, president of Digital Reasoning. “You have to pair it with tools that can identify audience, content and tone—who and what I am talking about—and understand the context of the communication.” For example, instances of code switching—the use of substitute words or euphemisms to try to hide the true meaning of a communication, like using “baseball” for “financial instrument”—are often indicators of risk and are dependent on context cues. Humor and sarcasm in text are also difficult for automated sentiment analysis platforms to identify and parse.
