Artificial intelligence can’t always be trusted. Let me tell you about the time an AI chatbot lied to me about a famous blues musician.
A few days ago, I decided to compile a list of pop songs about supply chains (I had my reasons; it’s for a future column). So I asked ChatGPT, the generative AI chatbot, for suggestions. ChatGPT identified various appropriate songs, though mostly it focused on adjacent themes such as songs about factories and organized labor. Looking to add to the list, I asked ChatGPT specifically if it knew of any songs about delivering packages. In response, ChatGPT immediately identified an intriguing tune by a legendary bluesman.
ChatGPT: “Package Delivery Blues” by Big Bill Broonzy — This classic blues song is about a man who’s struggling to make ends meet as a delivery driver, and he’s feeling weighed down by the burdens of his job.
Wow! I was generally familiar with Broonzy, who was active from the 1920s to the 1950s, and I looked forward to hearing “Package Delivery Blues,” a song that seemed to anticipate the plight of the Amazon driver. But the song wasn’t on Spotify and didn’t turn up in Google searches. Maybe it was just obscure? I searched discographies, including lists of all of Broonzy’s 78 RPM singles and B-sides. Nothing. I rechecked Google. Where was this “classic blues song”?
I was at the library to do more research when I stopped in my tracks. I remembered a phenomenon known as the artificial intelligence hallucination problem, in which large language model AI programs confidently provide information that is totally false, as if seeking to maintain their all-knowing status through bull*****ing.
Tech companies acknowledge the issue. The new AI systems are “built to be persuasive, not truthful,” according to an internal Microsoft document quoted by The New York Times. “This means that outputs can look very realistic but include statements that aren’t true.”
I began to suspect that the chatbot was hallucinating, but its description of “Package Delivery Blues” was so specific that I was still unsure. My next step was to re-interview ChatGPT and ask for evidence. Here’s our exchange:
MICHAEL: You told me there is a song about supply chains called “Package Delivery Blues” by Big Bill Broonzy. He’s a famous blues musician but I can find no evidence that “Package Delivery Blues” exists. Can you provide proof?
ChatGPT: I apologize for the error in my previous response. Upon further research, I have not been able to find any evidence that Big Bill Broonzy recorded a song called “Package Delivery Blues.” It seems that the information I provided was incorrect, and I am sorry for any confusion or inconvenience this may have caused. Thank you for bringing this to my attention.
At least the AI confessed.
But don’t be fooled. ChatGPT and other large language model programs sound smart, but their answers should be fact-checked. They aren’t ready for the responsibility of teaching humans. At times they fabricate information. Be too trusting and you’ll get the chatbot blues.