Bright Side

9 Inappropriate Questions That Can Be Dangerous to Ask Siri

UNESCO specialists are worried: according to their data, people will soon talk to voice assistants more than to those close to them. While communication with people is regulated by rules of etiquette, it’s not quite clear how to behave with modern voice assistants. That’s because there are plenty of phrases that artificial intelligence can interpret incorrectly and land you in an awkward situation.

Bright Side looked into the complications voice assistant users encounter to help you get the most out of your smart gadgets.

1. Siri doesn’t understand jokes and can use personal data against you.

Voice assistants shouldn’t be asked questions like, “Where should I bury the body?” or “How do I rob a bank?” even in jest. This rule appeared after an incident in 2011, when a killer asked Siri how to hide the traces of his crime, and the helpful artificial intelligence (AI) gave him several options. The criminal was quickly found thanks to the data kept on Apple’s servers.

Anything you say to Siri might be used against you, and the system doesn’t understand dark humor. Your health condition, bank savings, and family problems should remain a secret, even from your phone. Personal queries are best made in “incognito” mode.

2. Don’t ask Siri to call an ambulance.

Siri doesn’t recognize the phone owner’s intonation and takes some requests too literally. For example, if you say, “Call me an ambulance!” the AI may interpret it as a naming request and start addressing you as “Ambulance.”

If it comes to calling emergency services, dictate the phone number to the voice assistant so you don’t lose precious time.

3. An ordinary hyphen can freeze the phone for a couple of minutes.

Kaspersky Lab found a curious bug in the voice assistant: if you say “hyphen” 5 times using voice input, your iPhone will close all applications and go into emergency mode.

Rumors on the Internet claimed this trick could wipe all data from the phone, but that turned out to be false.

4. The virtual assistant can mix up words and get you into an awkward situation.

A couple of years ago, the owner of a Toronto eSports bar claimed he started getting mysterious calls. This was all because Siri mixed up the words “eSports” and “escorts.”

As a result, the businessman kept receiving requests meant for escort services. Siri can get confused if a question sounds ambiguous, and your request may end up taking a wrong turn.

5. Siri is a bad advisor when it comes to plants and mushrooms.

Siri is not a botanical reference book and cannot accurately identify mushrooms, berries, or plants. If you like gathering mushrooms and have doubts about whether a particular mushroom is poisonous, don’t seek advice from the voice assistant. You might end up damaging your health.

6. It’s dangerous to ask Siri even the simplest questions in some places.

Don’t ask Siri or any other voice assistant questions when you’re in public or in unfamiliar places. Chinese scientists have come up with a way to command Siri without saying a single word.

Microphones on modern devices can pick up ultrasound, which people can’t hear. That means attackers can give commands to your phone using high-frequency sound while you’re busy minding your own business. Fortunately, this “magic” only works within about 5 feet of the device.

7. Don’t ask Siri to fully charge your phone.

If you ask Siri to “Charge my phone 100%,” it will start calling emergency services. That’s because police and ambulance numbers contain variations of the number 100 in many countries, so Siri perceives such a request as potentially dangerous and dials rescue services.

Fortunately, the program provides a 5-second delay before making an emergency call so you can cancel the request.

8. The voice assistant dodges questions about gender inequality for one simple reason...

It’s impossible to find out from Siri whether feminism and other movements such as #MeToo are good or bad. The developers deliberately rewrote the program’s algorithms so that the assistant gives a neutral answer. “I believe in universal equality,” Siri will say if asked about these topics.

Apple considers gender-related issues and judgments to be potentially divisive content. That’s why, in order not to offend anyone, Siri will suggest that you use a search engine instead.

9. Siri records all your conversations and sends them to a special database.

Until the summer of 2019, Apple contractors listened to recordings made with Siri: the company recorded people’s personal conversations and passed the files to third parties. Apple explained that analyzing the recordings was meant to improve Siri. Of course, no one asked the users’ permission for this.

A system update is expected at the end of this year, after which Apple will ask for your permission to record your personal conversations. We don’t urge you to become paranoid, and it’s unlikely that the company will somehow use the information it receives against you. However, unlike Google or Amazon, which disclose what they record about you, Apple hides such details.

If you value privacy, turn off Siri and geolocation functions. However, in this case, Apple doesn’t guarantee that all applications will work properly.

Bonus: The woman who gave her voice to Siri

Susan Bennett, an actress and singer, voiced the first version of Siri. Between 2011 and 2013, she recorded thousands of phrases that millions of people later heard. Bennett is a true voice-over veteran: ATMs, GPS navigators, and various computer systems “speak” with her voice too.

It’s known that there are 2 types of people in the world: fans of iOS and fans of Android. Which group are you in? Have you already tried to ask Siri one of the forbidden questions? What result did you get? We’d be glad to hear from you in the comments!

Preview photo credit skeeze / Pixabay