Five things you should never share with ChatGPT
ChatGPT, developed by OpenAI, promises users a digital assistant that can write text, review code, analyze medical results, or even help put a child to sleep. But behind these intimate answers are algorithms that learn from every interaction.
Over time, ChatGPT may learn, for example, that a user loves chicken eggs and suffers from back pain. Those are just two details; the tool memorizes many others.
In fact, tools like ChatGPT memorize individual details, from users' physical condition to far more personal matters, in order to customize answers and improve their own future performance.
That said, it is important to know what should not be shared and why.
According to expert warnings, there are five types of information that should not be entered into chatbot conversations, because they can be misused through training processes, system errors, or even cyberattacks.
This information includes:
Identity information: such as birth certificate, ID, or passport numbers, home address, telephone number, and date of birth. Some chatbots try to censor this data, but there are no guarantees about the reliability of the process.
Medical test results: Chatbots are not subject to strict health data protection laws. If necessary, upload only the test results themselves, stripped of any identity information.
Financial information: such as bank or investment account numbers, whose involuntary disclosure can have serious consequences.
Confidential business information: Many users rely on chatbots to write emails or review code at work, but this may expose trade secrets or proprietary code.
Passwords and usernames: Chatbots are not a safe place to store passwords or other credentials. This information should be kept in a dedicated password manager.
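For text you do decide to paste into a chatbot, stripping obvious identifiers first reduces the risk. Below is a minimal, illustrative sketch of that idea; the patterns and placeholder labels are assumptions for demonstration only, and real-world redaction requires far more robust tooling.

```python
import re

# Illustrative patterns only: rough matches for emails, phone numbers,
# and long ID-like digit runs. Real PII detection is much harder.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"(?<!\w)\+?\d[\d\s().-]{7,}\d\b"),
    "ID_NUMBER": re.compile(r"\b\d{6,}\b"),
}

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens
    before pasting the text into a chatbot."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Email me at jane@example.com or call +1 555 123 4567."))
# The email and phone number are replaced with [EMAIL] and [PHONE].
```

A crude filter like this will both miss identifiers and over-redact ordinary numbers, so it is a first line of defense, not a substitute for simply leaving sensitive data out of the conversation.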
Large companies like OpenAI, Google, and Microsoft use users' conversations to improve their models. Although users can opt out of having their data used for training, many people are unaware of this option.
Ways to maintain privacy
Cybersecurity experts suggest the following measures to protect your privacy:
Use temporary chat mode: The ‘Temporary Chat’ feature works like a browser’s incognito mode, preventing conversations from being stored and used to train models.
Delete conversations: You can remove each conversation from your history once it ends and check that the information is completely erased after 30 days.
Use anonymization tools: Tools like DuckDuckGo’s Duck.ai forward messages to AI models anonymously, although they offer fewer capabilities than full chatbots.
Ultimately, it is up to each of us to decide what to share and what to keep private. Chatbots are designed to keep conversations flowing, but the final say is still yours.