Late last week, MSN.com’s Microsoft Travel section posted an AI-generated article about the “cannot miss” attractions of Ottawa that includes the Ottawa Food Bank, a real charitable organization that feeds struggling families. In its recommendation text, Microsoft’s AI model wrote, “Consider going into it on an empty stomach.”
Titled “Headed to Ottawa? Here’s what you shouldn’t miss!” (archive here), the article extols the virtues of the Canadian city and recommends attending the Winterlude festival (which only takes place in February), visiting an Ottawa Senators game, and skating in “The World’s Largest Naturallyfrozen Ice Rink” (sic).
As the No. 3 destination on the list, Microsoft Travel suggests visiting the Ottawa Food Bank, likely drawn from a summary found online but capped with an unfortunate turn of phrase.
“The organization has been collecting, purchasing, producing, and delivering food to needy people and families in the Ottawa area since 1984. We observe how hunger impacts men, women, and children on a daily basis, and how it may be a barrier to achievement. People who come to us have jobs and families to support, as well as expenses to pay. Life is already difficult enough. Consider going into it on an empty stomach.”
That last line is an example of the kind of empty platitude (or embarrassing mistaken summary) one can easily find in AI-generated writing, inserted thoughtlessly because the AI model behind the article cannot understand the context of what it is doing.
The article is credited to “Microsoft Travel,” and it is likely the product of a large language model (LLM), a type of AI model trained on a vast scrape of text found on the Internet. Microsoft partner OpenAI made waves with LLMs called GPT-3 in 2020 and GPT-4 in 2023, both of which can imitate human writing styles but have frequently been used for unsuitable tasks, according to critics.
Since the announcement of deep investments and collaborations with ChatGPT-maker OpenAI in January and the emergence of Bing Chat the month after, Microsoft has been experimenting with integrating AI-generated content into its online publications and services, such as adding AI-generated stories to Bing Search and including AI-generated app review summaries on the Microsoft Store. “Microsoft Travel” appears to be another production use of generative AI technology.
First noticed by tech author Paris Marx on Bluesky, the post on the Ottawa Food Bank began to gain traction on social media late Thursday. In response to Marx’s post, frequent LLM critic Emily Bender noted, “I can’t find anything on that page that marks it overtly as AI-generated. Seems like a major failing on two of their ‘Responsible AI’ principles.”
Bender also pointed to two of Microsoft’s “Responsible AI” principles: “Transparency,” which asks, “How might people misunderstand, misuse, or incorrectly estimate the capabilities of the system?” and “Accountability,” which asks, “How can we create oversight so that humans can be accountable and in control?”
Judging by the Ottawa article’s content, it’s likely that no human wrote the article and that no one fully reviewed it before publication, which means that Microsoft is publishing AI-generated content on the Internet with little-to-no oversight.
Microsoft was not available for comment by press time.