NIH puts the kibosh on generative AI

July 20, 2023

Last month, NIH issued a policy statement prohibiting the use of generative AI to analyze or critique NIH grant applications and contract proposals. Specifically, NIH Notice NOT-OD-23-149 states, “NIH prohibits NIH scientific peer reviewers from using natural language processors, large language models, or other generative Artificial Intelligence (AI) technologies for analyzing and formulating peer review critiques for grant applications and R&D contract proposals.” The problem with using generative AI in peer review is that it compromises confidentiality. As the notice explains, once information is loaded onto a generative AI platform, “AI tools have no guarantee of where data are being sent, saved, viewed, or used in the future, and thus NIH is revising its Confidentiality Agreements for Peer Reviewers to clarify that reviewers are prohibited from using AI tools in analyzing and critiquing NIH grant applications and R&D contract proposals. Such actions violate NIH’s peer review confidentiality requirements.”

In a blog post, NIH's Michael Lauer, deputy director for extramural research; Stephanie Constant, Ph.D., review policy officer; and Amy Wernimont, Ph.D., chief of staff of the NIH Center for Scientific Review, summarize the serious consequences of breaching confidentiality. A breach, they say, “could lead to terminating a peer reviewer’s service, referring them for government-wide suspension or debarment, as well as possibly pursuing criminal or civil actions based on the severity.” They provide links to NIH guide notice NOT-OD-22-044, the Integrity and Confidentiality in NIH Peer Review page, and an NIH All About Grants podcast for further explanation.

Lauer, Constant, and Wernimont also point out that generative AI could introduce bias into peer reviews. NIH carefully selects peer reviewers for their expertise and original thinking. AI, which relies on preexisting information and conclusions, negates these reviewers' unique qualities and may introduce bias into the process that the reviewer would otherwise have avoided.

The trio also warns sternly about using AI when writing grant applications.

artificial intelligence, nih