Burton Kelso, Tech Expert

How to Avoid ChatBot Cyber Attacks



If you're like most people, you appreciate the efficiency and convenience of chatbots because they can respond nearly as well as humans. But even though chatbots are a major benefit to you and your family, as well as to the businesses you buy products and services from, they can also put you and your personal information at risk. New technology doesn't just improve our daily lives; unfortunately, it also improves how cybercriminals trick you into falling for scams. Tools like WormGPT and FraudGPT are the evil cousins of ChatGPT. You also need to make sure your smart home assistants aren't spying on you, as criminals are trying to inch their way into devices such as Google Home and Amazon Alexa. As more AI threats emerge, you will have to watch out for the ways chatbots can be used to scam you. Here's what you need to know.


I bet you thought you knew it all when it came to cyber scams, right? Well, cybercrime is ever-changing, so throw everything you knew about online scams out the window. Here are the chatbot scams you need to watch for:

  • Chatbot social media scams use sponsored ads and posts to get you to download viruses and malware.

  • Chatbot phishing scams contact you with a sense of urgency, pretending to be a legitimate company or bank claiming there is a problem with your online accounts.

  • Chatbot voice cloning scams fool you into thinking a loved one is hurt, in jail, or facing an emergency that only money can solve.

  • Chatbot investment scammers pose as cryptocurrency experts who promise huge returns on fake investments.

  • Fake chatbot websites and apps look and respond just like the real thing.

How to stay safe while using online chatbots

Overall, chatbots can be hugely valuable and are typically very safe. However, you need to recognize how to protect yourself from malicious chatbots online.

  • Use chatbots only on websites you have navigated to yourself. Never click on links in emails or texts to visit them.

  • If you receive an email with a link to a chatbot, always verify the “from” address before clicking on any links.

  • Only click on links in texts or emails you were expecting or from senders you know.

  • Ignore tempting offers and incredible prizes, especially when they appear out of nowhere.

  • If you receive a suspicious message, do an internet search for the company name and the offer or message in question. If the offer is real and valid, you will likely find more information about it on the company’s website. If it is fake, you might find news reports about the scam or no information at all.

  • If you receive frequent texts from unknown numbers that contain suspicious links, you can adjust your carrier settings to filter out spam calls and texts. You can also download mobile apps to block shady numbers and texts.


AI Home Assistant Chatbot Risks and Vulnerabilities

Chatbot threats aren’t only online. Many of you have already opened your homes to chatbots in the form of virtual personal assistants like Amazon Alexa, Google Home, and Apple’s Siri. Your voice-activated assistants aren't after your passwords, but criminals will try to log into your web-based Amazon, Google, and Apple accounts to gain access to the personal conversations in your home. They might even try to extort you once they are logged into your smart home accounts.

Chatbot scams and attacks on your smart home assistants usually fall into one of the following categories:

  • Overriding: Devices rely on ultrasonic frequencies to pair with other electronics, so voice commands encoded above 20 kHz are inaudible to humans but still received by compliant smart speakers. These so-called “dolphin attacks” can be broadcast on their own or embedded within other audio. Researchers have also triggered assistant actions with light commands: fluctuating lasers that devices interpret as voices, allowing control from afar. Eighty-eight percent of AI assistant owners in our study had never heard of these dastardly tactics.

  • Eavesdropping: The most basic attack on an always-on microphone is turning it into a spying device. Hijacking this capability would effectively plant an evil ear right in your home. Beyond creepily invading your privacy, such intrusions could capture financial details, password clues, and blackmail material, and even confirm that a residence is empty.

  • Imposters: Hackers can employ a method called “voice squatting,” where unwanted apps launch in response to commands that sound like legitimate requests. An alternate approach called “voice masquerading” involves apps pretending to close or connect elsewhere. Rather than obeying requests to shut down or launch alternate apps, corrupt programs feign execution, then collect information intended for others. According to our study, only 15 percent of respondents knew of these possible hacks.

  • Self-hacks: Researchers recently uncovered a method to turn smart speakers against themselves. Hackers within Bluetooth range can pair with an assistant and use a program to force it to speak audio commands. Since the chatbot is chatting with itself, the instructions are perceived as legitimate and executed – potentially accessing sensitive information or opening doors for an intruder.

Ways to secure your AI home assistant devices

Manufacturers issue updates to address security flaws, but when personal assistant divisions like Alexa lose billions of dollars, such support is likely to be slashed. Luckily, there are simple steps that consumers can take to help safeguard their devices.

  1. Mute the mic: Turning off an assistant’s microphone when it’s not actively in use may reduce fun and convenience (you can’t randomly request a song or a weather report), but it cuts off the access that many hacking attacks depend on.

  2. Add a PIN or voice recognition: Most AI assistants can require voice matching, a personal identification number, or two-factor authentication before executing costly commands. Activating these safeguards keeps a device from obeying unauthorized users and stops children from making unapproved purchases.

  3. Delete and prevent recordings: Users can revoke manufacturers’ permission to record or review audio commands. Within the “Alexa Privacy” (Amazon) or “Activity Controls” (Google) settings, owners can find the option to disallow saving or sending recordings. Apple devices no longer record users by default, though owners can opt in and allow it.

  4. Set listening notifications: Configuring assistants to emit audible alerts when actively listening or acknowledging commands provides a reminder when ears are open…and can uncover external ultrasonic or laser attacks.

  5. Disable voice purchasing: Impulse buying with only a sentence spoken aloud is a modern marvel that’s rarely necessary. Given their security shortcomings, allowing virtual assistants to access or execute financial transactions is risky. Have the assistant place items on a list instead, then confirm the purchase from a more secure terminal. Users might even save money by reconsidering late-night splurges.

Additionally, when it comes to online and smart home chatbots, you need to follow these baseline safety protocols:

  • Keep your smart home devices protected by keeping firmware updated, using strong passwords, and connecting them only to secured, password-protected routers.

  • Never share information like social security numbers, credit card numbers, bank information or account numbers, and medical information.


Hopefully, this post has given you the information you need to keep yourself safe from Chatbot scams. If you need further assistance, please reach out to me with any questions you might have. I am always happy to help!

Looking for More Useful Tech Tips?


My Tuesday Tech Tips Blog is released every Tuesday. If you like video tips, I LIVE STREAM new episodes of 'Computer and Tech Tips for Non-Tech People' every Wednesday at 1:00 pm CST on Facebook, Instagram, LinkedIn, and Twitter. Technology product reviews are posted every Thursday. You can view previous episodes on my YouTube channel.


Sign Up for My Tech Tips Newsletter! Click this link to sign up and subscribe, and you will receive every tip directly in your inbox each week.


Want to ask me a tech question? Send it to burton@burtonkelso.com. I love technology. I've read all of the manuals and I'm serious about making technology fun and easy to use for everyone.


Need computer repair service near you? My company Integral offers the highest quality computer repair service nationwide. If you need on-site or remote tech support for your Windows/Macintosh computers, laptops, Android/Apple smartphones, tablets, printers, routers, smart home devices, and anything that connects to the Internet, please feel free to contact my team. Our team of friendly tech experts can help you with any IT needs you might have. Reach out to us at www.callintegralnow.com or by phone at 888.256.0829.


Please share this with your friends and family! If you found this post useful, would you mind helping me out by sharing it? Just click one of the handy social media sharing buttons below.


The above content is provided for information purposes only. All information included therein is subject to change without notice. I am not responsible for any direct or indirect damages, arising from or related to the use of or reliance on the above content.








