Sen. John Hickenlooper, D-Colo., is planning to introduce legislation that would require the National Institute of Standards and Technology to help agencies develop voluntary guidelines for third-party AI evaluations.
The Validation and Evaluation for Trustworthy (VET) AI Act would mandate an interagency effort to set standards and recommendations for certifying third-party evaluators, who would work with AI developers to assess and verify their systems, Hickenlooper’s office said Monday.
Under the bill, NIST would partner with the Department of Energy and the National Science Foundation to create AI testing specifications and guidelines, focusing on data privacy protections, mitigation of potential risks from AI systems, dataset quality, and governance.
The VET AI Act would also establish an advisory committee to review and recommend criteria for auditors seeking certification to conduct internal or external AI evaluations.
“AI is moving faster than any of us thought it would two years ago,” said Hickenlooper. “But we have to move just as fast to get sensible guardrails in place to develop AI responsibly before it’s too late. Otherwise, AI could bring more harm than good to our lives.”