Empowering Generative AI with Trusted Federal Data: Insights from a NISS–FCSM Webinar

Date: Friday, January 30, 2026 - 12:00pm to 1:00pm ET

Event Page: NISS/FCSM Empowering Generative AI with Trusted Federal Data: Strategies for Quality & Usability | National Institute of Statistical Sciences 

The National Institute of Statistical Sciences (NISS) and the Federal Committee on Statistical Methodology (FCSM) co-hosted a seminar exploring the responsible use of federal data in generative AI systems, with presentations on data accessibility, metadata improvements, and cross-agency collaboration. Bella Mendoza (Data Scientist, Office of the Under Secretary for Economic Affairs, Department of Commerce) presented on the Model Context Protocol (MCP) project, which enhances AI models' ability to retrieve and interpret federal data. Dr. Dominique Duval-Diop (Deputy Chief Data Officer & Acting Chief Data Officer, Department of Commerce) discussed the Department's efforts to make government data AI-ready through new guidelines and empirical evaluation tools. Luke Keller (Chief Innovation Officer & Bureau AI Lead, U.S. Census Bureau) presented on MCP servers as intermediaries between large language models and federal data APIs, highlighting how this approach can improve data access, preserve semantic integrity, and reduce hallucinations in AI-generated responses. The seminar concluded with a panel discussion that emphasized the need for further evaluation and for collaboration across federal agencies to advance this technology.

Federal Data in Generative AI 

Dr. David S. Matteson, Director of the National Institute of Statistical Sciences (NISS), welcomed attendees to a seminar exploring the responsible and effective use of trusted federal data in generative AI systems, a collaboration between NISS and the Federal Committee on Statistical Methodology (FCSM). Dr. Travis Hoppe, representing FCSM, outlined the seminar's purpose, emphasizing the need to ensure that federal data remains accessible and semantically intact as AI technologies evolve. Travis highlighted three key steps for agencies to enhance data accessibility and metadata, which the seminar's speakers would address. The seminar featured presentations by Bella Mendoza, Dr. Dominique Duval-Diop, and Luke Keller, focusing on baseline assessments, API and metadata enhancements, and cross-agency learning, respectively. Attendees were encouraged to submit questions via chat for a panel discussion and Q&A session. 

Enhancing Federal Data Accessibility 

Bella presented on a project they worked on as a U.S. Digital Corps Fellow to improve public access to federal data using the Model Context Protocol (MCP). They demonstrated how MCP enhances AI models' ability to retrieve and interpret federal data, experimenting with MCP servers for CDC PLACES and USAspending. The team evaluated MCP's effectiveness by collaborating with subject matter experts to create a comprehensive question panel and comparing model responses with and without MCP. Bella highlighted MCP's potential to transform public engagement with federal data, elevate authoritative data, and prompt re-examination of data structures and metadata quality.
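The side-by-side comparison described above can be sketched as a small evaluation harness: a panel of questions with expert-supplied "gold" answers is posed to a model twice, once with MCP-grounded retrieval and once without, and exact-match accuracy is compared. Everything here (the questions, the gold answers, and the two answer functions) is an invented placeholder, not the team's actual evaluation code.

```python
# Hypothetical sketch of a with/without-MCP evaluation over a question panel.
# The panel, gold answers, and answer functions are invented placeholders.

QUESTION_PANEL = [
    {"question": "adult smoking prevalence, County A", "gold": "18.1%"},
    {"question": "FY2024 obligations, Agency B", "gold": "$4.2B"},
]

def answer_without_mcp(question):
    # Stand-in for an ungrounded model response (may hallucinate a figure).
    return {"adult smoking prevalence, County A": "22%",
            "FY2024 obligations, Agency B": "$4.2B"}[question]

def answer_with_mcp(question):
    # Stand-in for a response grounded in data retrieved via an MCP server.
    return {"adult smoking prevalence, County A": "18.1%",
            "FY2024 obligations, Agency B": "$4.2B"}[question]

def accuracy(answer_fn, panel):
    """Fraction of panel questions answered exactly as the gold answer."""
    hits = sum(answer_fn(item["question"]) == item["gold"] for item in panel)
    return hits / len(panel)

baseline = accuracy(answer_without_mcp, QUESTION_PANEL)  # 0.5 on this mock data
grounded = accuracy(answer_with_mcp, QUESTION_PANEL)     # 1.0 on this mock data
print(f"without MCP: {baseline:.0%}, with MCP: {grounded:.0%}")
```

A real evaluation would also need graded (not just exact-match) scoring and repeated trials, since model responses vary between runs.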

AI-Ready Government Data Initiatives 

Dr. Dominique Duval-Diop presented on the Department of Commerce's efforts to make government data AI-ready, outlining challenges like data silos, inconsistent formatting, and privacy concerns. She introduced guidelines released in January 2025 to enhance data quality and accessibility, emphasizing the shift from machine-readable to machine-understandable data. Dominique highlighted a collaboration with the National Science Foundation to develop an open-source empirical evaluation tool measuring LLM accuracy with AI-ready data. This collaboration will help Commerce better understand the impact of data quality and AI-readiness on the ability of LLMs to interact with federal data. The tool will be made available as a shared service of the National Secure Data Service. 
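The shift from machine-readable to machine-understandable data can be illustrated with a toy example (this is not Commerce's actual schema): the same statistic published as bare codes versus enriched with the definitions, units, geography labels, and provenance an LLM needs to interpret it correctly.

```python
# Illustrative only: contrasting a merely machine-readable record with a
# machine-understandable one. Field names and values are invented.
import json

# Machine-readable: parseable, but the meaning of GEO and VAL is opaque.
machine_readable = {"GEO": "24510", "VAL": 18.1}

# Machine-understandable: the same value, carrying its own semantics.
machine_understandable = {
    "value": 18.1,
    "unit": "percent",
    "variable": {
        "name": "adult_smoking_prevalence",
        "definition": "Share of adults (18+) who currently smoke cigarettes",
    },
    "geography": {"fips": "24510", "label": "Baltimore city, MD"},
    "source": {"agency": "CDC", "program": "PLACES", "vintage": 2023},
}

print(json.dumps(machine_understandable, indent=2))
```

The point of the richer form is that a model (or any downstream consumer) no longer has to guess units, variable definitions, or provenance from external documentation.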

MCP Servers for Federal Data Access 

The presentation focused on the use of Model Context Protocol (MCP) servers for accessing and interacting with federal data, particularly in the context of large language models (LLMs). Luke Keller, Chief Innovation Officer at the U.S. Census Bureau, explained how MCP servers serve as intermediaries between LLMs and data APIs, optimizing data access and reducing hallucinations in LLM responses. During the panel discussion, Bella noted the need for further evaluation of MCP servers, including measuring accuracy, user behavior, and adoption impact. The panelists encouraged external stakeholders to experiment with MCP servers and provide feedback, emphasizing the importance of collaboration across federal agencies in advancing this technology.
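The intermediary pattern described above can be sketched in a few lines: the server exposes named tools, and a tool call issued by the model is dispatched to a handler that queries the data source and returns the data with its metadata intact, so the model cites a retrieved figure rather than inventing one. This is a stdlib-only stand-in; the tool name, handler, and returned fields are invented, and a real deployment would use the MCP SDK and the agencies' actual APIs.

```python
# Simplified sketch of an MCP-style intermediary between an LLM and a data
# API. Tool names, arguments, and the fetcher below are invented placeholders.

def fetch_places_measure(measure: str, county_fips: str) -> dict:
    # Stand-in for a call to a public-health data API; returns the value
    # together with its unit and source so the model can quote, not guess.
    return {"measure": measure, "county_fips": county_fips,
            "value": 18.1, "unit": "percent", "source": "CDC PLACES"}

# The "server" is a registry of tools the model is allowed to call.
TOOLS = {
    "get_places_measure": {
        "description": "Retrieve a county-level health measure",
        "handler": fetch_places_measure,
    },
}

def dispatch(tool_call: dict) -> dict:
    """Route a model-issued tool call {'name': ..., 'arguments': {...}}."""
    tool = TOOLS[tool_call["name"]]
    return tool["handler"](**tool_call["arguments"])

result = dispatch({"name": "get_places_measure",
                   "arguments": {"measure": "smoking", "county_fips": "24510"}})
print(result["value"], result["unit"], "-", result["source"])
```

Keeping the unit and source attached to the returned value is what preserves semantic integrity end to end: the grounding travels with the data into the model's context.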

Acknowledgments 

NISS gratefully acknowledges the seminar speakers, Bella Mendoza (Department of Commerce), Dr. Dominique Duval-Diop (Department of Commerce), and Luke Keller (U.S. Census Bureau), for sharing their expertise and leadership on advancing the use of trusted federal data in generative AI. We also thank the moderator and the Federal Committee on Statistical Methodology (FCSM) for their partnership in organizing this event and for their continued commitment to improving the quality, accessibility, and responsible use of federal statistical data.

Monday, February 2, 2026 by Megan Glenn