Virtual assistants, such as Amazon Alexa and Apple’s Siri, are supposed to make our busy lives slightly easier.
Rather than wasting valuable seconds setting alarms, checking facts and writing shopping lists, we can now just ask our artificially intelligent devices to do it for us.
However, as is so often the case with new technology, things were bound to go wrong.
Alexa fails & other virtual assistant mishaps
For the most part, smart home devices like Alexa are extremely helpful. However, having a virtual assistant listening to your every word at all times isn’t always as convenient as it sounds, as these incidents prove:
Parrot places Amazon order
A clever parrot used its owner’s Alexa to place an Amazon order for itself earlier this week. However, the African grey wasn’t smart enough to order itself something useful.
Having heard its owner calling out to Alexa, the pet used its broadened vocabulary to order itself a set of golden gift boxes.
After questioning her family over the mystery purchase, owner Corienne Pretorius discovered audio clips of the mimicking bird squawking: “Alexa! Oh, um, hang on! Alexa!”
South Park pranks viewers
Hit Comedy Central show South Park is well known for breaking rules. It delivered once again earlier this year with an episode that heavily featured the Amazon Alexa and Google Home virtual assistant devices.
Viewers took to social media to moan as the likes of Cartman, Kyle and Stan spoke to their devices throughout the episode. Some reported their alarms going off at 7am the next morning, while others claimed to have found a set of “hairy balls” on their shopping lists.
Rising demand for dollhouses
There have been plenty of stories of children using their parents’ virtual assistants to order themselves some treats.
One six-year-old asked Alexa: “Alexa, can you play dollhouse with me and get me a dollhouse?”
The device delivered, sending a $160 dollhouse mansion to her home. The order came complete with a huge tin of cookies.
When the story was reported on a San Diego TV station, one Alexa mishap quickly became two. As the news anchor repeated what the child had said, Alexa devices across the state went on a shopping spree of their own.
Digger digger kid
In most cases, virtual assistants pick up what you’re saying fairly accurately. However, some parents have found that this isn’t always the case.
One unknown YouTube user uploaded footage of his son asking their Alexa device to “play Digger, Digger”. Mishearing the toddler, Alexa delivers a far-from-PG response as his parents scream for her to stop.
We’re warning you – it’s probably best not to watch this one at work: