The fact that data is doubling every two years mimics one of electronics’ most famous laws: Moore’s law. In 1965 Gordon Moore stated that the number of transistors on an integrated circuit doubled approximately every two years and he expected the trend to continue “for at least 10 years.” Forty-five years later, Moore’s law still influences many aspects of IT and electronics. As a consequence of Moore’s law, technology is more affordable and the latest innovations help engineers and scientists capture, analyze, and store data at rates faster than ever before.
The cost of storage for all of this data has decreased exponentially, from $228/GB in 1998 to $0.06/GB in 2010. Changes like this, combined with the advances in technology that result from Moore's law, undoubtedly fuel the Big Data phenomenon and raise the question, "How do we extract meaning from that much data?" Unfortunately, Moore's law applies only to technology, not to humans.
The cost of a person reading through and analyzing data will not go down anytime soon. Imagine costs anywhere from $10 an hour to nearly $1,000 an hour for subject matter experts in fields such as law, finance, and healthcare. Even adjusting for inflation, hourly rates for knowledge workers will continue to rise as the cost of technology falls.
From human communications such as email, social media, and instant messaging to documents, spreadsheets, and many other forms of information, the challenge of analyzing this continually growing mountain of data has become daunting for the people, organizations, and companies that need to unlock the critical knowledge within it.
When you think about how much time there is in a day and start to run the numbers, the problem quickly becomes apparent. A well-trained analyst takes anywhere from 5 to 10 minutes to review an average-sized document, or 2 to 3 minutes to review an email, so one person can analyze only 40-50 documents or a few hundred emails a day (assuming no time for lunch or breaks).
Then consider how many emails an average-sized company generates in a day: anywhere from thousands to millions of individual messages that need to be analyzed for relationships and risks, which in many cases are being intentionally concealed by the person who wrote the email. Now run the numbers at scale. A large company that generates several million emails a day, and needs to compare them against several months of previous email activity, faces more than 200 million records. Reviewing that volume would take a team of 2,000 analysts, working 8 hours a day, approximately 3 years to complete. The same task would take our machine learning-based analytics platform, Synthesys®, just 1 day.
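The back-of-envelope arithmetic above can be checked with a short script. The email count, team size, and per-email review time come from the text; the 250-workday year is our own assumption, and we take the upper end of the 2-3 minute review estimate:

```python
# Back-of-envelope check of the large-company review estimate.
# Assumptions: 250 workdays per year; 3 minutes per email
# (the upper end of the 2-3 minute estimate).
EMAILS = 200_000_000        # records to review
ANALYSTS = 2_000            # size of the review team
MINUTES_PER_EMAIL = 3
HOURS_PER_DAY = 8
WORKDAYS_PER_YEAR = 250     # assumed business calendar

emails_per_analyst = EMAILS / ANALYSTS               # 100,000 emails each
hours = emails_per_analyst * MINUTES_PER_EMAIL / 60  # 5,000 hours each
days = hours / HOURS_PER_DAY                         # 625 workdays each
years = days / WORKDAYS_PER_YEAR

print(f"About {years:.1f} years of full-time review per analyst")
```

The result, roughly two and a half years of nonstop full-time work per analyst, is in the same ballpark as the approximately three years cited above; looser assumptions about review speed or working days push it higher.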
The real advantage lies in having man and machine work together. Imagine the possibilities of simply using machines to amplify human intelligence, freeing up our time to act on the insights and outcomes from the data while making our organizations more productive, valuable, and competitive. It's a compelling strategy and an even more compelling ROI. To learn more about Synthesys and our visionary approach, visit us at: http://www.digitalreasoning.com/visionary-approach