Shortly after Matador Network announced its generative AI travel assistant GuideGeek early last year, an employee testing the tool asked about her hometown of Pittsburgh.
After a while, she changed gears and asked about Crete. But instead of typing “Crete, Greece,” she accidentally typed “Crete Freeze.”
GuideGeek was thrown off by the typo and interpreted the query as being about an ice cream parlor in Pittsburgh. Unable to find a real parlor called Crete Freeze, the tool invented one, complete with a backstory about its founder and a claim that the shop makes its own ice cream, said Ross Borden, CEO of Matador Network, who relayed the story.
“She realized it was completely invented and this ice cream parlor didn't exist,” Borden said of the employee.
Similarly, Brian Schultz, chief information officer at Cruise Planners, which introduced Maxx Intelligence, a generative AI tool for travel advisors, in December, said he once queried ChatGPT about himself.
“It actually invented a whole career for me that I never had,” he said. “It told me I was the chief technology officer of Royal Caribbean.”
Such fabricated responses are known as hallucinations in generative AI parlance. They are among the pitfalls that travel companies and travel advisors must manage and keep in mind as they begin implementing and using AI-powered tools.
“Hallucinations are a big problem,” said Michael Coletta, senior manager of research and innovation at Phocuswright, who plans to publish a report on generative AI in the coming weeks. “It will have 10 correct answers, and it just makes up the 11th one because it wants to give you an answer.”
In the world of travel, for example, Coletta said he's seen generative AI create ferry and rail routes where none exist.
According to Megan Hastings, head of customer insight strategy at digital analytics platform Quantum Metric, hallucinations in the context of travel queries can manifest as inaccurate flight information, misleading hotel descriptions or incomplete travel recommendations.
“The frequency of these hallucinations varies depending on the AI's understanding of the domain, its ability to accurately process and interpret data, and the level of human oversight and quality control,” said Hastings, who works with digital teams at major travel brands.
Suppressing hallucinations
One way companies can reduce hallucinations when deploying generative AI tools is by properly training their interfaces on their own data.
Travel companies typically layer their own generative AI solutions on top of OpenAI's ChatGPT, Google's Gemini or Microsoft's Copilot. But the system must be taught when and how to retrieve information from the company's own datasets rather than from the foundation models' much larger training data, Coletta said.
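As a rough illustration of that layering, here is a minimal sketch, assuming a hypothetical in-house knowledge base and the OpenAI Python client; the search function and prompt wording are placeholders, not how GuideGeek, Maxx Intelligence or any other product actually works.

```python
# Illustrative sketch only: ground a travel chatbot's answers in a company's own
# vetted data before letting the base model respond. The knowledge-base search
# below is a hypothetical placeholder, not any vendor's real implementation.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable


def search_company_data(query: str) -> list[str]:
    """Hypothetical lookup against the company's own content
    (destination guides, supplier feeds, editorial articles).
    A real version might run a vector-database similarity search."""
    return []  # placeholder: no matches


def grounded_answer(query: str) -> str:
    snippets = search_company_data(query)
    context = "\n".join(snippets) if snippets else "No matching company data found."
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": (
                    "You are a travel assistant. Answer only from the company "
                    "data provided. If the data does not cover the question, "
                    "say you don't know instead of guessing."
                ),
            },
            {"role": "user", "content": f"Company data:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content
```

The point of the system prompt is to push the model toward “I don't know” rather than an invented answer when the company's own data comes up empty.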
Intensive human intervention is also key to minimizing hallucinations, Borden said, adding that Matador has reduced GuideGeek's rate of hallucinations and other confusions from 14% last April to just over 2%.
Ross Borden, CEO of Matador Network. Photo by Robert Silk
Speaking this month at the Mountain Travel Symposium, a ski industry conference held at California's Palisades Tahoe Ski Resort, Borden said Matador has hired six employees to root out and resolve hallucinations.
In total, they have read more than 900,000 GuideGeek conversations over the past nine months.
“They were key to understanding the conditions under which hallucinations occur,” he said in a later interview. “There's only so much you can do without human eyes.”
Borden said one situation in which GuideGeek was prone to hallucinating is exemplified by the problem the Matador employee ran into with her Pittsburgh questions: the tool was likely to become confused when a user asked a long run of questions on one topic and then abruptly switched gears, especially if a typo was involved.
Questions about certain topics also caused hallucinations, especially queries about coffee shops in small towns.
Matador used a software solution to address these and other GuideGeek issues, Borden said.
He noted that in addition to layering on top of ChatGPT-4, GuideGeek generates responses using direct connections to platforms such as Skyscanner and Expedia, providing real-time information on flight schedules, hotel availability, weather and currency conversion. Queries can be made on WhatsApp, Instagram, Messenger or within the GuideGeek website.
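A rough sketch of that routing pattern follows, assuming hypothetical lookup helpers in place of whatever Skyscanner and Expedia integrations a company actually licenses; live facts are fetched first, and the model only phrases the reply.

```python
# Illustrative only: fetch real-time facts from live sources, then have the model
# write the reply from those facts. The lookup helpers are hypothetical stand-ins,
# not actual Skyscanner or Expedia client code.
from openai import OpenAI

client = OpenAI()


def lookup_flights(origin: str, dest: str, date: str) -> dict:
    """Placeholder for a real flight-search integration."""
    return {"origin": origin, "dest": dest, "date": date, "results": []}


def lookup_hotels(city: str, check_in: str) -> dict:
    """Placeholder for a real hotel-availability integration."""
    return {"city": city, "check_in": check_in, "results": []}


def answer_with_live_data(query: str, intent: str, **params) -> str:
    # Facts come from live feeds; the model only handles the wording.
    if intent == "flights":
        facts = lookup_flights(**params)
    elif intent == "hotels":
        facts = lookup_hotels(**params)
    else:
        facts = {}
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer the traveler's question using only the structured "
                    "data provided. Do not invent schedules, prices or availability."
                ),
            },
            {"role": "user", "content": f"Data: {facts}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content
```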
Matador also customizes GuideGeek for travel agents, airlines, and other travel companies. Currently, 12 destination marketing organizations are using customized versions of the tool.
With Maxx Intelligence, Cruise Planners has taken a less labor-intensive approach to addressing the hallucination risks that travel advisors face. Schultz said that when using the platform, Cruise Planners franchisees must first click a box accepting responsibility for the information they choose to disseminate. They must also confirm that they have reviewed the material each time they respond to a query.
The company also trains agents on best practices for using Maxx Intelligence, including fact-checking certain responses, particularly those related to restaurants, hotels and other commercial establishments. Schultz noted that ChatGPT-4, which Maxx Intelligence is layered on top of, is only up to date through this past September, something advisors need to be aware of.
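For illustration only, a workflow like the one Schultz describes might gate AI-generated drafts behind an explicit advisor confirmation; the function and reminder text below are hypothetical, not Maxx Intelligence code.

```python
# Hypothetical sketch of an advisor confirmation gate, not Maxx Intelligence code.
CUTOFF_REMINDER = (
    "Reminder: the underlying model's knowledge has a cutoff date. "
    "Verify anything time-sensitive, such as prices, hours or availability."
)


def release_to_client(draft: str, advisor_confirmed: bool) -> str:
    """Only release an AI-generated draft after the advisor has reviewed it
    and accepted responsibility for its contents."""
    if not advisor_confirmed:
        raise ValueError("Advisor must review and confirm the draft before sending.")
    return draft


# Usage: show the draft alongside CUTOFF_REMINDER, then call
# release_to_client(draft, advisor_confirmed=True) once the box is checked.
```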