‘Sometimes they will just make up facts’: UNL professor on the pitfalls of new AI
LINCOLN, Neb. (KLKN) – A new feature has rolled out on Snapchat called My AI, and it’s causing some worry among users, as well as law enforcement.
The artificial intelligence chatbot is similar to ChatGPT. You can ask it questions or give it a prompt to respond to.
But some users have noticed the AI doesn’t always give the best advice and can make unsafe suggestions.
Bryan Wang, a professor at the University of Nebraska-Lincoln, said these types of programs can sometimes misunderstand context and provide incorrect information.
“Sometimes they will just make up facts when they don’t know the answers,” he said. “And so, a lot of the answers seem like they’re real, but they’re made-up facts.”
As a result, Wang said, chatbots can become misinformation generators.
Wang said AI programs often let users train the algorithms to improve them, which can backfire if users feed them the wrong information.
He said the programs make these mistakes because they haven't been fully tested before release.
“I think it’s not really ethical or socially responsible to release AI-based systems or these chatbot systems when the company has not really tested for adverse societal impacts,” he said.
On its website, Snapchat has acknowledged that the AI can provide “biased, incorrect, harmful, or misleading content.”
The Merrick County Sheriff’s Office and the Alliance Police Department are alerting parents and guardians about the potential threat posed by the AI.
They find the “friend suggestions” particularly worrisome.
The feature uses artificial intelligence to recommend new buddies based on your child’s activity and interests.
Tyler Sherlock, a school resource officer with the police department, said that while some of these features may seem harmless, officers have seen cases where kids were connected to people with harmful intentions.
He said the “Snap Map” live location feature is also causing concern, as it can give precise details about where someone can be found at any time.
“It starts to take in that privacy factor,” he said. “You know, who’s watching your location, where are you at, things like that. And it can put children or even adults at risk through strangers or other people for stalking, harassment or just unwanted attention.”
Police recommend enabling the app’s “Ghost Mode” to turn off location sharing altogether.
The department also asks parents and guardians to review their children’s accounts and make sure their privacy settings are set to the highest level.
If you have any remaining questions or concerns, both agencies urge you to contact local law enforcement.