Law enforcement agencies explore semantics
Technology could help in the collection and analysis of data.
Semantic technology is poised to become the next evolutionary step in helping law enforcement agencies automatically collect and analyze pertinent information on suspects and criminals from a wide range of data sources.
Semantic technology, which includes software standards and methodologies, can be a shortcut to finding and sharing relevant data across agencies more intelligently by using ontologies.
Ontologies are models that describe information and the full range of relationships among pieces of data.
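To make that concrete, the short sketch below uses the open-source rdflib library for Python (one of several toolkits that work with these standards; the article does not name a specific one) to build a tiny ontology: two classes, one relationship between them, and a record described against the model. All class, property and instance names are invented for illustration.

```python
# A minimal sketch of an ontology: classes, a property linking them,
# and one instance record. All names here are illustrative only.
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/justice#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Classes: the kinds of things the model describes.
g.add((EX.Person, RDF.type, OWL.Class))
g.add((EX.Vehicle, RDF.type, OWL.Class))

# A property: the relationship between those kinds of things.
g.add((EX.registeredOwnerOf, RDF.type, OWL.ObjectProperty))
g.add((EX.registeredOwnerOf, RDFS.domain, EX.Person))
g.add((EX.registeredOwnerOf, RDFS.range, EX.Vehicle))

# Instance data expressed against the model.
g.add((EX.JohnDoe, RDF.type, EX.Person))
g.add((EX.JohnDoe, EX.registeredOwnerOf, EX.Plate_ABC123))
g.add((EX.Plate_ABC123, RDF.type, EX.Vehicle))

# Print the model in the Turtle notation used with RDF data.
print(g.serialize(format="turtle"))
```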
Dave McComb, president of Semantic Arts, a software architecture firm based in Fort Collins, Colo., said semantic technology has three main elements. The first is infrastructure, such as inference engines that perform automated classification or entity extraction. The second covers tools that people use, he said.
The third element is content, either ontologies or tagged content.
“Infrastructure stuff tends to run on servers and doesn’t have to have a human involved for every single transaction necessarily, but tools are things like ontology editors that people use interactively to design and build,” he said.
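McComb's first element, inference, can be shown in miniature. The sketch below is not a real inference engine; it applies a single subclass rule by hand so the automated-classification idea is visible. The ontology and data in it are invented.

```python
# Toy illustration of automated classification: if the ontology says
# every ParoleViolator is a Person, and the data says JohnDoe is a
# ParoleViolator, an inference engine concludes JohnDoe is a Person.
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/justice#")  # hypothetical namespace
g = Graph()
g.add((EX.ParoleViolator, RDFS.subClassOf, EX.Person))  # ontology
g.add((EX.JohnDoe, RDF.type, EX.ParoleViolator))        # raw data

# Apply one subclass-entailment rule until nothing new is inferred.
changed = True
while changed:
    new = []
    for instance, _, klass in g.triples((None, RDF.type, None)):
        for _, _, superklass in g.triples((klass, RDFS.subClassOf, None)):
            if (instance, RDF.type, superklass) not in g:
                new.append((instance, RDF.type, superklass))
    changed = bool(new)
    for triple in new:
        g.add(triple)

print((EX.JohnDoe, RDF.type, EX.Person) in g)  # True: inferred, not stated
```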
Experts say the use of semantic technology is growing among consultants and application developers. The World Wide Web Consortium’s adoption of two semantic standards — the Resource Description Framework (RDF) and Web Ontology Language (OWL) — has further spurred the use of the technology in the past two years. Although the intelligence community is probably the most advanced in using such tools, experts note that deployment in other sectors is still sporadic.
Paul Wormeli, executive director of the Integrated Justice Information Systems Institute, a nonprofit organization that prompts the technology industry to develop new standards and practices in the public safety sector, said several companies are beginning to deploy semantic technology, but it is still new to state and local law enforcement agencies.
He said law enforcement officials are still struggling with implementing Extensible Markup Language-based messaging standards such as the Global Justice XML Data Model, and 200 similar projects are probably under way.
“But the future thrust is toward implementing a service-oriented architecture and the ontologies that will need to be implemented in such a service-oriented approach to [exchange] information,” he wrote in an e-mail message. SOA “will benefit considerably from the application of semantic concepts.”
Mike Kinkead, chief executive officer at Metatomix, based in Waltham, Mass., said the company has developed and deployed several modules — mostly in Florida — using semantic technology and the RDF and OWL standards specifically for law enforcement and justice agencies. He said the technology acts more like a sophisticated human analyst than a program.
“When an analyst is trying to figure something out, typically they’re not going in a straight line like a program does,” he said. “A program does straight-line processing, doing one thing after the other and kicking out exceptions, [while] an analyst is [saying], ‘Gee, where should I start?’ And then [analysts] start pulling a string, [and] they learn something. It’s an iterative process that’s discovery-based in nature, and you can’t do that with traditional programs.”
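That “pulling a string” style of work corresponds roughly to walking a graph of linked records. The sketch below, with entirely made-up data, starts from one entity and follows whatever relationships turn up, rather than running a fixed sequence of steps.

```python
# Simplified illustration of discovery-based analysis: start with one
# entity and keep following whatever links appear, breadth-first,
# instead of executing a fixed, straight-line sequence of lookups.
from collections import deque

# Made-up linked records standing in for data pulled from many systems.
links = {
    "John Doe":      [("owns vehicle", "Plate ABC-123"), ("co-defendant", "Jane Roe")],
    "Plate ABC-123": [("seen near", "Warehouse on 5th St")],
    "Jane Roe":      [("employed at", "Warehouse on 5th St")],
}

def explore(start):
    seen, queue = {start}, deque([start])
    while queue:
        entity = queue.popleft()
        for relation, other in links.get(entity, []):
            print(f"{entity} --{relation}--> {other}")
            if other not in seen:          # a new lead: pull that string next
                seen.add(other)
                queue.append(other)

explore("John Doe")
# Both strings lead to the same warehouse -- the kind of connection an
# analyst discovers iteratively rather than through a predefined query.
```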
For example, an investigator looking for information about John Doe might have to visit 20 agencies’ databases, such as corrections and motor vehicles, to get a complete picture of the individual’s criminal record, child support payment record and civil cases, said Tim Perkins, Metatomix’s chief marketing officer.
“What you want to do is lift all that data out of all those systems and bring it together in one model, one understanding, and now I can begin to relate those pieces of information together to infer more relationships, find out more insights about that individual that could drive me in another direction,” he said.
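A rough sketch of that idea, again using rdflib and invented records: triples derived from several separate systems are merged into one graph, and a single query over the combined model returns a picture that no individual source holds.

```python
# Sketch of pulling records from separate systems into one RDF model and
# querying across them. All data and names here are invented.
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/justice#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Triples derived from a corrections system.
g.add((EX.JohnDoe, RDF.type, EX.Person))
g.add((EX.JohnDoe, EX.hasConviction, EX.Case_2004_17))

# Triples derived from a motor vehicles system.
g.add((EX.JohnDoe, EX.registeredOwnerOf, EX.Plate_ABC123))

# Triples derived from a child-support system.
g.add((EX.JohnDoe, EX.owesSupportIn, EX.Case_CS_88))

# One query over the merged model returns the combined picture.
results = g.query("""
    PREFIX ex: <http://example.org/justice#>
    SELECT ?relation ?record WHERE { ex:JohnDoe ?relation ?record . }
""")
for relation, record in results:
    print(relation, record)
```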
Metatomix officials said their technology sits on top of existing systems in an abstract layer. Data resides with the owner, and the technology does not require a data warehouse. A common understanding is created across disparate databases by building an ontology that captures knowledge. Because it’s in an abstract layer, the technology is loosely linked to all the source applications and data. So when one of the applications changes, the whole system isn’t destroyed, Perkins said.
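One common way to achieve that loose coupling, sketched below with invented system and field names, is a per-source mapping from each application's native schema to the shared ontology's terms; when a source application changes its schema, only its mapping entry needs to be updated.

```python
# Sketch of an abstraction layer: each source system keeps its own schema,
# and a per-source mapping translates native field names into shared
# ontology terms. All system and field names here are invented.

# One mapping per source system; this is the only piece that changes
# when a source application changes its schema.
FIELD_MAPPINGS = {
    "corrections": {"inmate_name": "fullName", "dob": "dateOfBirth"},
    "dmv":         {"owner": "fullName", "plate_no": "licensePlate"},
}

def to_ontology(source, record):
    """Translate one source record into shared ontology terms."""
    mapping = FIELD_MAPPINGS[source]
    return {mapping[field]: value for field, value in record.items() if field in mapping}

# Records stay in their owning systems; only the translated view is merged.
print(to_ontology("corrections", {"inmate_name": "John Doe", "dob": "1970-01-01"}))
print(to_ontology("dmv", {"owner": "John Doe", "plate_no": "ABC-123"}))
```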
Sarkar is a freelance writer based in Washington, D.C.