With the application of artificial intelligence still in its relatively early stages, North Carolina's public sector technology leaders on Wednesday urged caution and curiosity before deploying it, and warned against testing it on sensitive data.
In some cases, independent exploration is already underway, city, county and state IT leaders said in a TechConnect-sponsored webinar, “The Local Imperative: Policy and Use of AI.” But they cautioned that much remains to be learned about the technology.
“There's a lot we don't know about how these models work. It's really complex. It's pretty chaotic,” said Jonathan Feldman, chief information officer for Wake County, North Carolina. “Nobody wants to end up in the newspaper because they did something crazy with the data using an AI model.”
The county is leading an effort known as the Enslaved Peoples Project, in which volunteers scour records to better understand the history of enslaved people. County information services officials are considering using AI to assist with document review.
“It's in flight now. And it's really exciting because it has the potential to save a lot of volunteer time,” Feldman said.
The City of Chapel Hill is using generative AI to rewrite documents and policies in a way that is more accessible to the public, translating government jargon into language that is easier to understand. Officials can also tell the technology what grade level the document should be written at, said CIO Chris Butts. However, the data fed into the generative AI, such as commissioner notes and job descriptions, is information that is already publicly available.
“We're still researching this. And some of those use cases are just low-hanging fruit,” he said, adding that for now the guiding rules are still being worked out.
Keith Briggs, Director of Enterprise Architecture and Innovation for the North Carolina Department of Information Technology, provided attendees with a straightforward blueprint for use.
“Don’t use open, internet-based generative AI with sensitive data,” Briggs said. “And when you use secure AI capabilities, be sure to use them responsibly. And part of that responsible use is validating the output.”
Mark Wittenburg, Raleigh's chief information officer, also said AI is a tool, not an outcome, and called for “sound guidelines and guardrails” rather than fear.
“But I actually think it’s important for us, especially as IT leaders, to really explore what we can do with technology,” Wittenburg said. “And again, be very careful about the community, the impact on the community, the positive and negative impacts that that could potentially have.”