AI WON'T TELL?

Five facts an AI must never learn about you – even your birthday is too dangerous to give away in a chat

One popular AI has already been used by nearly 500 million people


THERE are at least five facts about yourself that you should never let artificial intelligence know.

If you’re using regular AI-powered chatbots – even trusted ones from respected brands – you still need to be very careful.

Abstract view of a generative AI, with lines of code being absorbed by a blue face (Image: Getty)

The Sun spoke to top security experts who warned over the dangers of over-sharing with AI bots.

Chatbots seem to be everywhere these days: not just OpenAI’s ChatGPT, but even in common Meta apps like Facebook Messenger and WhatsApp.

They offer humanlike conversation, but the problem is that you don’t know where the info you’re sharing will end up.

“When it comes to using large language models and chatbots it’s important to realise that your information is going to be processed by the platform and used for further model training,” said security expert James McQuiggan, speaking to The Sun.

“Anything you upload should not be sensitive or private information otherwise it could end up in someone else’s prompt response.

“It’s best to use general information and, if sensitive information is needed for the response, then mask it or use disguised names or information.”

James, a security awareness advocate at KnowBe4, told The Sun that there are at least five facts you should never share with AI bots.

“You never want to share credit card info, addresses, names, birthdays, identification numbers,” he explained.

James added that it’s important to not reveal “anything else that identifies you or someone else”.

It’s best to use fake names and info when asking for more personalised advice from chatbots.
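If you’re comfortable with a little code, here’s roughly what that masking step could look like before a prompt is sent off. This is a minimal sketch in Python using only the standard library – the patterns and the mask helper are illustrative assumptions, not a complete privacy filter, and names still need to be swapped by hand, as James suggests.

import re

# Patterns for a few of the details the experts say to keep back.
# These are illustrative assumptions, not a complete privacy filter.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),       # rough credit card shapes
    "phone": re.compile(r"\b\d[\d -]{7,}\d\b"),          # rough phone number shapes
    "dob": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),   # birthdays like 14/02/1990
}

def mask(text: str) -> str:
    # Swap anything matching a known pattern for a placeholder tag
    # before the text is pasted into a chatbot.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()}]", text)
    return text

# Names can't be caught by a simple pattern - use a made-up one,
# as the experts suggest ("Jane Doe" here stands in for a real name).
prompt = ("My name is Jane Doe, born 14/02/1990, card 4111 1111 1111 1111, "
          "email jane.doe@example.com - what travel insurance suits me?")

print(mask(prompt))
# My name is Jane Doe, born [DOB], card [CARD], email [EMAIL] - what
# travel insurance suits me?

Everything the patterns catch is replaced before the question ever leaves your machine, so the chatbot only sees the placeholders rather than your real details.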


RULE OF THUMB

One of the easiest ways to stay safe when using chatbots is to pretend that your conversations aren’t private at all.

Imagine you’re literally posting online for the whole world to see.


That’s the advice from Paul Bischoff, Consumer Privacy Advocate at Comparitech, who told The Sun that this mindset helps you avoid having personal info scooped up by the AI machine.

“A good rule of thumb is that if you wouldn’t post it publicly on social media, then you shouldn’t share it with an AI chatbot,” Paul said.

STAY AI SAFE

Here’s the advice from The Sun’s tech expert Sean Keach

Chatbots are on the rise – and soon it’s going to be very hard to avoid them.

They’ll be built into lots of the apps you use. For some, they’re already there.

So it’s important to get ahead of the AI revolution and build those good habits now.

Start off strong by understanding why it’s important to not overshare with chatbots.

After all, you don’t really know where that information is going to end up.

Even tech companies are still getting to grips with their own AI products.

That’s why you should treat these chatbots with caution, and limit what you say to them.

If you must ask more personal questions, be sure to change key details so that the AI isn’t absorbing real info about you.

And try to stick to well-known and well-reviewed chatbots from reputable brands.

It’s not a guarantee of your safety, but it helps if you know you’re at least chatting to a legitimate AI bot.

Paul added: “Any information you share with an AI chatbot could be added to that chatbot’s corpus of data that’s used to generate answers for other users.

“You could inadvertently give personal info to the chatbot that it then shares with other users, not to mention the chatbot’s administrator.”

Meta chief Mark Zuckerberg says that nearly 500 million people have used his Meta AI.

And this number is sure to rise in the future, so it’s important that users know how to use their AI helpers safely.

Even reputable chatbots like OpenAI’s ChatGPT should be used with caution (Image: Getty)