Q and A: How Big Data Is Revolutionizing City Government
Harvard Professor Stephen Goldsmith wants to build a market for municipal data analytics.
Powerful new data analysis tools coupled with shrinking city budgets present a once-in-a-lifetime opportunity to fundamentally change how government operates, according to Harvard Professor Stephen Goldsmith.
Goldsmith, a former mayor of Indianapolis and deputy mayor of New York City, directs the Innovations in Government Program at Harvard’s Kennedy School of Government. In mid-June he launched Data-Smart City Solutions, a website dedicated to studying and publicizing big data projects that are making city government smarter and more efficient.
Big data refers to new analysis tools that make it easier to spot patterns and pull meaning from information -- such as sensor readings and video feeds -- that can't be neatly organized in a spreadsheet.
At the federal level, officials are exploring how big data can democratize scientific research and better spot waste, fraud and abuse. Some mayors and city officials are using similar tools to make data-driven decisions about where police patrol and how they hand out permits for restaurants and other businesses.
Nextgov spoke with Goldsmith recently about the state of municipal big data and where he sees it heading. The Q and A is edited for length and clarity.
What do you hope to accomplish with Data-Smart City Solutions?
Generally in government something innovative is always going on, but there's not a very good market for those innovations, and people are slow to find out about them.
There’s an increasing amount of activity in this area, but it’s often a project here and a project there. What I’m hoping to do is act as a catalyst for the use of big data, data analytics and predictive analytics to change the way government operates by connecting experiments that are going on in a relatively isolated way now to the market of public leaders across the country.
So you’re trying to create a more efficient market?
Well, to create a market. Efficient would be the next step. This is a slow market for new ideas because people are not really sure if what’s being promised will come through or if the cost will be worth it.
A lot of the big data market in municipal government right now is managed by vendors such as IBM's Smarter Cities. Are you hoping to shift the power balance more toward government consumers?
I think the two are complementary. There are a number of vendors that are doing pretty neat things and they’re beginning to sell to the marketplace and that’s great. At the same time, we can create a trusted marketplace for those ideas, communities of public sector actors talking to each other about ‘did this do what you thought it was going to?’
What cities are doing a good job with big data now?
There are a couple different models. New York City has this great group with Mike Flowers. Basically it’s a skunk works. They get a problem and they go solve it. With fire inspection, for instance, they get data -- they get it from everywhere -- and they show you can integrate and mine that data. Even though everyone says you can’t, they always do. But that’s still a problem-oriented team.
By contrast, in Chicago they’re really trying to infuse data analytics into the very functions of government in a more structured, comprehensive way, whether it’s the way traffic moves or child welfare. There are a lot of other places where a lot of stuff is going on, but not many where it’s going on so comprehensively to change the very functions of government.
Some municipal big data efforts have focused on predictive policing, which has concerned a lot of civil libertarians and privacy advocates. Revelations about the National Security Agency’s PRISM program also have people thinking a lot more about how their data’s being used. Do you think these concerns are justified and will that make big data a tougher sell to cities?
I think it definitely will make it a tougher sell. My way of thinking about this is: to assume there aren't substantial privacy issues is a mistake, but to assume there aren't benefits is an equally large mistake. There are tradeoffs. We can handle the privacy issues through opt-in options and other things. People are going to be more scared because of what's going on, but the chances to improve the quality of governance and quality of life are very dramatic. We just need to make the case that we'll explicitly pay attention to privacy issues.
Maybe 25 years ago it made sense to ask police officers to drive around waiting for a crime to occur and then hope that by some miracle they'd be in the right place at the right time. But that doesn't make nearly as much sense as mining millions of pieces of data to figure out where a crime is likely to occur and going there to wait for it to happen, or intervening before it happens.
A lot of data is also anonymous. If you want to change New York City traffic signals by gathering data from drivers, that doesn’t have to be personalized. You can take GPS information from taxis and that doesn’t have to be personalized either. But if you’re driving down that street, you benefit from that information being gathered and analyzed.
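(To make the anonymity point concrete, here is a minimal, hypothetical sketch of how anonymized taxi GPS pings could be aggregated into average speeds by intersection and hour -- the kind of summary a traffic engineer might use when deciding where signal retiming could help. The record format, field names and sample values are invented for illustration; no city system is being quoted.)

```python
from collections import defaultdict
from statistics import mean

# Illustrative only: each record is an anonymized taxi GPS ping with no
# driver or passenger identifiers -- just a nearby intersection, an hour
# of day, and an observed speed in miles per hour (hypothetical fields).
pings = [
    {"intersection": "5th Ave & 23rd St", "hour": 8, "speed_mph": 6.5},
    {"intersection": "5th Ave & 23rd St", "hour": 8, "speed_mph": 4.0},
    {"intersection": "5th Ave & 23rd St", "hour": 14, "speed_mph": 17.0},
    {"intersection": "Broadway & Canal St", "hour": 8, "speed_mph": 11.0},
]

# Aggregate average speed per intersection and hour; the slowest cells are
# candidates for a closer look at signal timing.
speeds = defaultdict(list)
for p in pings:
    speeds[(p["intersection"], p["hour"])].append(p["speed_mph"])

for (intersection, hour), values in sorted(speeds.items(), key=lambda kv: mean(kv[1])):
    print(f"{intersection} @ {hour:02d}:00 -> avg {mean(values):.1f} mph")
```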
What level of government is doing the best work with big data?
There is significant activity going on at the federal level just because it's so large, but as a percentage of total activity, we're seeing the most at the city level. More and more cities have innovation offices and are using the power of data analytics. More and more often there's someone at the top level of the executive office who's driving these initiatives. If you have a mayor who's very vocal, you can see a lot of activity.
There are great things going on at the federal level too, but in terms of changing day-to-day government, that’s happening more at the local level.
What are the biggest stumbling blocks for big data in government now?
I think the biggest problem is that technology has progressed so much that it now exceeds the understanding and capacity of even [chief information officers]. A lot of folks have very professional and technical skills in how to manage their agency’s legacy systems, but the science of data mining has moved so much that things are possible now that many people don’t know are possible. The biggest restriction is imagination. Imagining a totally new way for government to operate, that’s what’s holding us back.
The opportunity here is that big data analytics will allow us to target scarce resources where they’ll make the most difference. Not all restaurant owners are equally bad. Some are really bad and some are really good. So why don’t we mine the data to figure out the indicia of who’s good and who’s bad? What contractors can we trust and which ones can’t we? Who’s not paying their taxes?
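(Here, too, a hypothetical sketch may help. The snippet below ranks establishments for inspection by a simple weighted risk score built from past violations, complaints and time since the last inspection. The fields and weights are invented for illustration; a real analytics team would derive them from the city's own inspection and complaint records.)

```python
# Illustrative sketch of risk-scoring inspections from historical data.
# All names, fields and weights are hypothetical.
restaurants = [
    {"name": "Cafe A", "past_violations": 0, "complaints": 1, "years_since_inspection": 0.5},
    {"name": "Diner B", "past_violations": 6, "complaints": 4, "years_since_inspection": 2.0},
    {"name": "Grill C", "past_violations": 2, "complaints": 0, "years_since_inspection": 1.5},
]

def risk_score(r):
    # Simple weighted sum: prior violations and complaints are the "indicia"
    # of risk, and time since the last inspection raises the urgency.
    return 3 * r["past_violations"] + 2 * r["complaints"] + 1 * r["years_since_inspection"]

# Send inspectors to the highest-scoring establishments first.
for r in sorted(restaurants, key=risk_score, reverse=True):
    print(f"{r['name']}: score {risk_score(r):.1f}")
```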
What do you see on the horizon for municipal big data?
Almost every aspect of government, from how we fight crime, to how we take care of our kids, to how we fix potholes, will change as a result of analytics. We have a chance to make government more targeted, efficient and effective at solving problems before they occur. We can know where someone is next likely to fall or a child is likely to get hurt, where the next car accident will happen and where ambulances should be staged.
There will be a lot of work with transportation, more sophisticated methods of integrating sensors into how we manage mobility and reduce congestion. We’ll find ways to make markets more efficient and ways to provide data to help people make better decisions such as letting them know when the school bus will arrive so their child won’t have to wait out in the cold. I think the whole way we regulate will change dramatically. We’ll be better able to guess the cost of regulation and to hyper-personalize it.