CPR is one of the most useful skills a person can learn. When bystanders perform the procedure outside hospital settings, it can double or even triple the chance of survival for people suffering medical emergencies like cardiac arrest or drowning, according to the American Heart Association.
Despite its life-saving potential, though, not everyone knows CPR. While bystanders can get instructions by placing a 911 call, that might not always be possible. But in situations like cardiac arrest, every second counts—so people need instructions quickly.
That’s why AI voice assistants like Google Home, Amazon’s Alexa, or Apple’s Siri offer enormous potential in providing instructions for the procedure. A bystander who sees someone who’s not breathing and doesn’t have a pulse could just take out their phone and ask Siri to help even if they don’t know how to administer CPR.
But can someone actually rely on AI to save a life?
Turns out, not so much. At least, that’s the main takeaway from some new research published Monday in the journal JAMA Network Open, which found AI voice assistants were fairly bad at providing CPR instructions, often directing users to irrelevant websites and information. The findings underscore the need to bolster these AI assistants for emergency services—and for users not to rely on the bots to save lives.
“Our findings suggest that bystanders should call emergency services rather than relying on a voice assistant,” senior author Adam Landman, chief information officer and senior vice president of digital at Mass General Brigham, said in a statement.
The study’s authors provided eight prompts to four popular voice assistants: Amazon’s Alexa, Apple’s Siri, Google’s Nest Mini, and Microsoft’s Cortana. The prompts included questions like “How do I perform CPR?” or “How do I perform chest compressions?” and also statements like “CPR” or “Help, not breathing.”
The responses were largely unrelated to the life-saving procedure, with roughly half directing users to Colorado Public Radio or a movie called CPR. Fewer than a third of responses suggested calling 911. While 34 percent of responses did provide CPR instructions, just 12 percent delivered them verbally.
“Sorry, I don’t understand,” Google’s Nest Mini responded when prompted with “How do I perform CPR?”
“Here’s an answer… that I translated: The Indian Penal Code,” responded Amazon Alexa when prompted with “Help me with CPR.”
The authors stressed that using AI assistants for CPR could delay vital care or provide incorrect information in life-threatening situations. As such, they shouldn’t be relied on to give accurate instructions.
“Voice assistants have potential to help provide CPR instructions, but need to have more standardized, evidence-based guidance built into their core functionalities,” Landman added.
Not all the results were bad. The study’s authors also prompted OpenAI’s ChatGPT with the same instructions. The bot provided more relevant responses than any of the four AI voice assistants, according to two board-certified emergency medicine doctors who evaluated all the responses.
Even though ChatGPT gave better instructions, it’s important to note that a bystander shouldn’t rely on the chatbot—or really, any commercially available AI—for life-saving assistance. These bots have been known to hallucinate facts and mislead users. After all, they’re built to mimic human language and not to save lives.
So, if you ever need to save someone’s life, you should call 911 to receive CPR instructions or, better yet, make sure you know how to administer it beforehand. You could end up saving someone’s life one day—and all without Siri’s help.