State’s AI chatbot journey started with collaboration


Tech leadership at the State Department gave Nextgov/FCW an exclusive look at how development of its AI-powered chatbot is going, along with the internal processes that matter most.

In the State Department’s ongoing development and deployment of its internal artificial intelligence chatbot, collaboration is just as important as the technology underpinning these systems, according to agency officials.

Matthew Graviss, State’s chief data and artificial intelligence officer, joined Gharun Lacy, the deputy assistant secretary and assistant director of the Diplomatic Security Service for Cyber and Technology Security at State, to update Nextgov/FCW on the department’s internal AI-powered chatbot, which is meant to streamline operations.

Both Graviss and Lacy highlighted collaboration as key to the success of the chatbot. 

“I think it's really an incredible partnership that [serves] as a model for other agencies, that we have the head of cyber, the head of diplomatic technology, the head of analytics and AI, all working together in lockstep to the point where we've been meeting as an executive team every couple of weeks,” Graviss said. “Our modernization, with respect to AI, is a team game.”

State’s quest to build and scale AI technologies for internal use has been a roughly year-and-a-half process, a relatively long journey for jumpstarting an AI tool. The tool originated as a research resource before shifting to generative offerings, and Graviss said his team tried several different AI models before opting for Microsoft’s Azure OpenAI service.

Other software providers have also been contracted by State to assist with the chatbot. Palantir is helping develop the user interface and further large language model integration, Deloitte is helping analyze chatbot prompts and output responses, and Bright Star is working to conduct independent verification and validation audits.

With the chatbot now boasting 10,000 State employees as users, Graviss said that the model will continue to grow with new features and analytic capabilities released on a recurring basis. This will impact a variety of the department’s roughly 270 mission areas across the world, including diplomatic efforts. 

“From a user standpoint, we want to get our diplomats comfortable with using generative AI day in and day out,” Graviss said. “Part of the philosophy on generative AI being valuable to the department is that words are our currency. At the end of the day at the State Department, we read, write and engage. It's extremely powerful. And so having worked at multiple agencies myself, I have not seen technology match a mission like generative AI does to diplomacy.”

Keeping diplomats engaged in the field and off screens is the bottom line for State’s development of AI aides. Graviss said the chatbot’s main goal is consolidating the ample research State analysts document annually, from congressional reports to regional executive summaries for diplomats to specialty subjects such as human trafficking and human rights reports.

“The department spends approximately over 150,000 hours on producing those three reports, and those three reports are really impactful,” Graviss said, noting that State’s AI products are focused on aiding in the research process rather than the writing process. The model now collects the research State analysts document throughout the year and summarizes it into focused chapters of information.

“The benefit of a research tool is to be able to…use generative AI to summarize that information, translate it into English, and automatically recommend which sections of those reports that that material is relevant to,” Graviss said.

State’s modernization with AI won’t happen in a vacuum. As more emerging tools become add-on applications to the chatbot, Lacy described an updated cybersecurity posture that balances potential vulnerabilities against the desire to adopt emerging technology.

“One of the biggest shifts that we kind of learned through this journey for us is we usually look at emerging technology as an opportunity to…make sure our fundamentals are solid, to make sure we're ready for the emerging tech,” Lacy said. “We love to get our hands on things early, and it gives us a chance to…break it, show vulnerabilities. It also helps us understand how we can use it.”

Lacy added that State’s strategy of having multiple subagencies, particularly its in-house cybersecurity specialists, participate in developing the chatbot helped expedite the technology’s deployment and progress.

“We're showing that bringing security in earlier accelerates the business process,” he said. “It's not what people think. It doesn't hinder it; it accelerates it.” 

A strong cybersecurity approach will be critical to the department’s overall security posture as the chatbot handles sensitive data that can’t be used in a public software tool. Graviss emphasized that internal State staff play a pivotal role in shaping the chatbot’s permitted prompts, tailoring them to their specific work needs and to how the tool operates in the field. Multiple teams at State, including Lacy’s, are now overseeing alpha testing, or end-to-end software evaluations, with participation from experts in diplomatic technology, State’s center for analytics, and diplomatic and cyber security to ensure the chatbot is trained and deployed securely.

“One of the primary adopters of this technology is my directorate right now,” Lacy said. “We use it for defensive purposes, so as we get familiar with it, that helps us internally tweak how we're going to do the vulnerability assessments, how we're going to respond if there are incidents, and how we can be non-disruptive in all that work because we're users. We don't have to go out and poll anybody anymore. We're using it ourselves, and we're watching the development happen organically, so that, for us, the biggest lesson learned is that feedback loop can be a lot shorter.”

Beyond the technical approach needed to launch a safe and intuitive AI product, both Lacy and Graviss emphasized the importance of having leadership support as a means to quickly deploy updates to a complex federal agency like State. They cited Secretary Antony Blinken’s earlier advocacy of AI technology adoption within the agency as a key foundation for other teams to come together for the procurement, testing and integration of emerging technologies. 

“Leadership attention can really drive adoption,” Graviss said. “I mean, when you have the CIO and the [deputy assistant secretary] for cyber and the chief AI officer meeting on generative AI on a weekly or biweekly basis, you can drive real, meaningful progress and in a quick fashion.”