
Amazon’s Alexa Offers 10-Year-Old a Shocking and Dangerous Suggestion

Amazon’s Alexa can certainly be very useful, but not when it’s offering absurdly dangerous suggestions to 10-year-olds. A concerned mother took to Twitter this week to warn fellow moms about the incident, and her screenshot has since received nearly 15,000 likes.

This wild suggestion came after the young girl asked Alexa for a “challenge” to take on. Alexa’s response? A “don’t try this at home” bit of viral TikTok rubbish involving an electrical socket.

What Amazon’s Alexa Suggested

Kristin Livdahl, the 10-year-old’s mother, explained exactly what took place in her tweet thread. Her daughter had been doing some indoor physical challenges from a YouTube physical education instructor. Looking for additional challenges, she asked Alexa.

Wording, apparently, makes all the difference with AI, but a 10-year-old certainly doesn’t know that. Goodness, plenty of adults don’t even know that.

Amazon’s Alexa responded to the young girl with this: “According to ourcommunitynow.com, the challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Um, what?! Understandably, this mom was shell-shocked by the suggestion.

Apparently, what Alexa had dredged up from the bowels of the internet was a viral TikTok “penny” stunt from years ago, reports the New York Post. Those who attempted it were at times injured, and the stunt created sparks, damaged electrical sockets, and even started fires.

What’s bizarre is that Alexa suggested this challenge even though the article it pulled the suggestion from was actually warning parents about the stunt.

Amazon responded on Twitter regarding the incident with a pretty stock social media reply: “Hi there. We’re sorry to hear this! Please reach out to us directly via the following link so that we can look into this further with you: https://amzn.to/3sGEtkT. We hope this helps. -Daragh.”

Doesn’t sound terribly helpful, does it?

Has Amazon’s Alexa ever made a similar kind of suggestion to you or a loved one? What would be your response to such a suggestion? Let us know in the comments below.
