Vendors building high-performance knowledge system

Server could help intelligence agencies find relationships, connect the dots

In a bid to greatly increase officials' ability to manage interconnected information, employees at Ontology Works Inc. and database developer Objectivity Inc. are working together on the High Performance Knowledge Server.

The system will manage large volumes of information, according to developers at the two companies. As Ontology Works' name suggests, the system will be based on ontology, a philosophical concept that pertains to the essential nature of things and the relationships among them.

In information technology, building an ontology means cataloging the data that people in a particular discipline use and charting the relationships among those pieces of data, said Bill Andersen, chief scientist at Ontology Works.

"The majority of our customers are intelligence community customers," he said. "The High Performance Knowledge Server is designed to deal with large amounts of complex, structured information, pretty much about anything."

An ontology-driven database is more complex than a traditional relational database, Andersen said, and that distinction is the key selling point for companies such as Ontology Works. A relational database lays data out in rows and columns. An ontology-based system uses a richer web of interconnected relationships to aid information analysis.
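To make that distinction concrete, here is a minimal Python sketch with invented names and data; it illustrates the general idea rather than Ontology Works' actual design. The same facts appear as relational-style tables and as a web of subject/relationship/object links that analysis tools can walk in any direction.

# Minimal, hypothetical sketch (not Ontology Works' actual schema) contrasting a
# relational layout with an ontology-style web of relationships over the same facts.

# Relational view: fixed tables of rows and columns, joined through key columns.
people = [{"id": 1, "name": "Alice"}]
organizations = [{"id": 7, "name": "Acme Shipping"}]
employment = [{"person_id": 1, "org_id": 7}]

# Ontology-style view: the same facts as (subject, relationship, object) links
# that form a graph an analyst's tools can traverse in any direction.
links = [
    ("Alice", "works_for", "Acme Shipping"),
    ("Acme Shipping", "based_in", "Karachi"),
    ("Acme Shipping", "owns", "cargo ship Meridian"),
]

def neighbors(entity):
    """Everything directly connected to an entity, in either direction."""
    out = [(rel, obj) for subj, rel, obj in links if subj == entity]
    out += [(rel, subj) for subj, rel, obj in links if obj == entity]
    return out

print(neighbors("Acme Shipping"))
# [('based_in', 'Karachi'), ('owns', 'cargo ship Meridian'), ('works_for', 'Alice')]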

However, such a facile explanation may make the challenge seem simpler than it is, said Jonathan Eunice, principal analyst at Illuminata Inc.

"In philosophy, ontology is the study of what lies beneath the real world," he said. "Ontologies that you see in computing are generally attempts to represent the deep structure of knowledge."

An ontology-based system, for example, could find that two seemingly unconnected people once appeared together in a photograph and thus infer that they know each other. The conclusion would not be definite, of course, but would be as good an inference as a human investigator could draw in the absence of additional evidence. Or, in a climatology application, an ontology could be used to help a researcher find articles about hurricanes and related phenomena, including tornadoes, tropical depressions and floods.
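The photograph example can be sketched in a few lines of Python. This is a hedged illustration of the general inference pattern, with invented data and relationship names, not the companies' software.

# Facts are (subject, relationship, object) triples; one hand-written rule infers
# a tentative "possibly_knows" link when two people appear in the same photo.
facts = {
    ("Alice", "appears_in", "photo-10"),
    ("Bob", "appears_in", "photo-10"),
    ("Carol", "appears_in", "photo-11"),
}

def infer_possible_acquaintances(facts):
    """Return tentative (person, 'possibly_knows', person) links, never certainties."""
    people_in = {}
    for person, rel, photo in facts:
        if rel == "appears_in":
            people_in.setdefault(photo, set()).add(person)
    inferred = set()
    for photo, people in people_in.items():
        for a in people:
            for b in people:
                if a < b:  # avoid duplicates and self-links
                    inferred.add((a, "possibly_knows", b))
    return inferred

print(infer_possible_acquaintances(facts))
# {('Alice', 'possibly_knows', 'Bob')}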

"That's the theory," Eunice said. "In practice, it hasn't worked very well. Going back 500 years now, there have been many attempts to categorize human knowledge, and it is intractable at a fundamental level. It's not challenging to come up with some categories, but it is challenging to build all of the relationships in."

The problem is that "knowledge is a slippery thing," he said. People understand many relationships intuitively, while even the best computers still only know what they're programmed to know.

Ready in 2005

However, Andersen said there is a tremendous need for the high-performance system that Ontology Works and Objectivity officials are developing and expect to release no later than the third quarter of next year. In a sense, he said, much of the work is already done.

"What the High Performance Knowledge Server does that our current Knowledge Server does not do is simply a matter of scale," he said. "It's going to have the same feature set."

Some significant engineering challenges exist, though, he said.

Chief among those challenges is figuring out how to manage memory at the higher scale, said Joshua Engel, chief software architect at Ontology Works. Given the intertwined nature of the data and the business rules that apply to the data, it makes sense to keep everything in memory at once. At the upper end, though, that won't be possible.

"In a lot of ways, it's a complete rewrite," he said. "The existing system assumed copious amounts of memory. We're going to have to figure out how to use the same amount of memory for a much greater amount of data."

The programming team wants customers to avoid a learning curve, he said. Although almost everything may change on the back end, the front end should look and behave the same.

Objectivity's role

Objectivity officials' role is to create an object database that can handle the complex information for the high-performance server, said Leon Guzenda, the company's chief technology officer.

"We're going to help them take their current application and make everything parallel and multithreaded," he said. "That's been a limiting factor for them."

If a word has multiple meanings in a particular discipline, for example, a parallel and multithreaded system can run hundreds or thousands of database searches simultaneously to find additional data related to each of the meanings.
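As a rough illustration only, the fan-out Guzenda describes resembles the following Python sketch, in which the search function and the word senses are stand-ins rather than anything from the actual product.

from concurrent.futures import ThreadPoolExecutor

def search_database(sense):
    # Placeholder for a real query against the knowledge store.
    return f"results for '{sense}'"

# One ambiguous term, several senses, each searched concurrently.
senses_of_tank = ["armored vehicle", "storage container", "aquarium"]

with ThreadPoolExecutor(max_workers=len(senses_of_tank)) as pool:
    results = list(pool.map(search_database, senses_of_tank))

for sense, result in zip(senses_of_tank, results):
    print(sense, "->", result)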

"If you're an FBI guy, the ontologies you're building will be different than someone in bioinformatics," Guzenda said. Objectivity will store the ontology data, defining the terms and relationships to be unique to users' needs."

The fact that "a lot of the relationships are not hard-wired," he added, is one of the complexities that make ontologies challenging. Although the relationship between a type of aircraft and the engine it uses is fixed, the same aircraft may be a military plane in Iraq and a medical transport plane in South America, he said.

The project is a stretch for Objectivity, too, Guzenda said. "It's going to take us into soft relationships, where we've always dealt with hard relationships," he said.
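The aircraft example suggests how hard and soft relationships might be modeled. The Python sketch below is purely illustrative; the link names and the context field are assumptions, not Objectivity's data model.

# Hard relationship: holds everywhere, regardless of context.
hard_links = {
    ("C-130", "uses_engine", "T56"),
}

# Soft relationships: the same aircraft plays different roles in different contexts.
soft_links = [
    {"subject": "C-130", "relationship": "serves_as",
     "object": "military transport", "context": "Iraq"},
    {"subject": "C-130", "relationship": "serves_as",
     "object": "medical transport", "context": "South America"},
]

def roles_of(aircraft, context):
    """Roles that hold only within the given context."""
    return [link["object"] for link in soft_links
            if link["subject"] == aircraft and link["context"] == context]

print(roles_of("C-130", "Iraq"))           # ['military transport']
print(roles_of("C-130", "South America"))  # ['medical transport']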

What is ontology?

Officials at Ontology Works Inc. define ontologies as "sophisticated, high-fidelity information models of an application domain." The High Performance Knowledge Server, now under development at Ontology Works and Objectivity Inc., will allow users to run analyses that involve uncertainty, the passage of time, security and changing hypotheses.
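What such annotations might look like on a single assertion can be sketched as follows; every field name in this Python example is invented for illustration and is not drawn from the product.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Assertion:
    subject: str
    relationship: str
    obj: str
    confidence: float            # uncertainty, expressed as 0.0 to 1.0
    valid_from: date             # when the assertion started being true
    valid_to: Optional[date]     # None means "still true as far as we know"
    classification: str          # security marking controlling who may see it
    hypothesis: str              # the working hypothesis this assertion supports

claim = Assertion("Alice", "possibly_knows", "Bob",
                  confidence=0.6,
                  valid_from=date(2004, 3, 1), valid_to=None,
                  classification="secret", hypothesis="photo-analysis-1")
print(claim)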

The philosopher Aristotle described such a study as "the science of being qua being," or the study of beings insofar as they exist. Centuries later, René Descartes grounded his own account of existence in the Latin "cogito, ergo sum," or "I think, therefore I am."

Source: Ontology Works Inc. and Wikipedia.com
