Add 'value' to the Vs of big data
Value matters in big data investments, but money isn't the only measure.
The Memphis Police Department saw violent crime drop dramatically over a span of five years following the launch of Blue CRUSH, a predictive analytics system using data from cameras and police records to predict future crime.
Big data has traditionally been described by a series of words that begin with the letter ‘V’: volume, velocity, variety, and veracity.
The newest addition to that list is “value,” a term gaining critical importance in times of tough budget constraints and pressure on agencies to better analyze the data they have, said Dante Ricci, director of federal innovation at SAP.
“What agencies should be thinking of is, ‘What value could come out of it,’ not just creating a big data solution for the sake of big data,” Ricci said. “Big data is about the value you get out of it. It’s not tough to gauge the outcome [of big data investments] if you understand what the use case is going to be.”
Clay Richardson, senior analyst for Forrester Research, said agencies he works with are “trying to connect big data to get more value” by linking that data to their operations.
In essence, Richardson said agencies are seeing success by making big data analytical processes “frictionless,” allowing data to be collected, analyzed and disseminated to clients or customers with ease.
“A big piece of value is agencies trying to connect the big data to their core processes and operational processes so they can make changes to processes as data changes,” Richardson said. “So as more data and information comes in from the government, agencies are able to make quicker changes, and we’re definitely seeing that in some agencies.”
Data that can be analyzed and dispersed quickly helps an agency’s bottom line, Richardson said, and it keeps customers who use the data happy.
“A lot of what agencies want to understand is how data flows, who it goes to, who consumes it and why they need it. Agencies want to figure out how to infuse more intelligence and smarts into data and make it more consumable across the organization,” Richardson added. “Right now, a lot of the data is dumb – it isn’t smart in the sense that you’re not going to do anything with it other than build reports or a bit of analysis. Customers want more value out of the data.”
Some agencies already boast successful big data initiatives despite the relative newness of the technology.
The Internal Revenue Service used big data analytics to fix tax-filing errors in 2012, saving approximately $100 million in erroneous claims. The Department of Defense’s global shared service center, the Defense Finance and Accounting Service (DFAS), has saved approximately $4 billion in improper vendor payments since 2008 using a business activity monitoring software tool that continuously monitors several terabytes of Department of Defense transactions.
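The article does not describe how the DFAS tool actually works, but a common business activity monitoring check is flagging apparent duplicate vendor invoices as transactions stream in. The sketch below is purely illustrative; the field names, data layout and duplicate rule are assumptions, not DFAS's logic.

```python
from collections import defaultdict

def flag_suspect_payments(payments):
    """Flag vendor payments that look like potential duplicates.

    `payments` is a list of dicts with hypothetical fields:
    vendor_id, invoice_number and amount. Any (vendor, invoice, amount)
    combination that appears more than once is returned as a candidate
    improper payment for a human reviewer.
    """
    seen = defaultdict(list)
    for p in payments:
        key = (p["vendor_id"], p["invoice_number"], p["amount"])
        seen[key].append(p)
    # Keep everything after the first occurrence of a repeated key.
    return [p for group in seen.values() if len(group) > 1 for p in group[1:]]

# Example: the second $12,500 invoice from vendor V-001 gets flagged.
sample = [
    {"vendor_id": "V-001", "invoice_number": "INV-42", "amount": 12500.00},
    {"vendor_id": "V-002", "invoice_number": "INV-07", "amount": 980.00},
    {"vendor_id": "V-001", "invoice_number": "INV-42", "amount": 12500.00},
]
print(flag_suspect_payments(sample))
```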
Dollars and cents are not the only currency for measuring value. Big data initiatives have been undertaken successfully at local and state levels, Ricci said, in some cases saving lives through programs at hospitals or by law enforcement agencies.
Violent crime in Memphis, Tenn., dropped 31 percent from 2006 to 2011 following the launch of Memphis Police Department’s Blue CRUSH (Crime Reduction Utilizing Statistical History) pilot program in partnership with the University of Memphis. The predictive analytics tool combined data from surveillance cameras, crime records and numerous other sources and used it to predict crime hotspots and to provide officers with real-time information about suspects and victims.
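Blue CRUSH's models are not published in the article, but the basic hotspot idea it relies on can be sketched simply: aggregate historical incidents by location and rank the densest areas as the likeliest trouble spots. The grid-cell approach, cell size and coordinates below are illustrative assumptions, not the department's actual method.

```python
from collections import Counter

def rank_hotspots(incidents, cell_size=0.01, top_n=5):
    """Rank map grid cells by historical incident count.

    `incidents` is a list of (latitude, longitude) pairs; each point is
    snapped to a square grid cell of `cell_size` degrees. Cells with the
    most past incidents are treated as the likeliest future hotspots.
    """
    counts = Counter(
        (round(lat // cell_size), round(lon // cell_size)) for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Example with made-up coordinates: the repeated cell rises to the top.
history = [(35.149, -90.049), (35.150, -90.048), (35.149, -90.050), (35.118, -90.071)]
print(rank_hotspots(history))
```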
A study by Nucleus Research showed the city of Memphis spent about $400,000 per year on the initiative, including personnel costs, and calculated a return of roughly $7 million, a return on investment of about 860 percent.
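For readers unfamiliar with how such percentages are derived, the conventional ROI formula is net benefit divided by cost. The snippet below only illustrates that arithmetic; the figures are hypothetical, and the article does not spell out which benefit and cost base Nucleus Research used in its calculation.

```python
def roi_percent(benefits, costs):
    """Standard ROI: net benefit expressed as a percentage of cost."""
    return (benefits - costs) / costs * 100

# An 860 percent ROI means benefits are roughly 9.6 times costs.
print(roi_percent(9.6, 1.0))            # 860.0
# Hypothetical dollar figures (not taken from the Nucleus study):
print(roi_percent(3_840_000, 400_000))  # 860.0
```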
“If you’re able to save lives because of better data and situational awareness, that’s invaluable,” Ricci said.