Don’t sacrifice integrity on the AI efficiency altar
I wasn’t sure this needed to be an article. After all, most adults know that the internet is a giant postcard, where even your most private messages are available to some grumpy developer who just wants to go home and work on their indie game. This is a PSA we’ve been taught for decades.
However, after five years as a developer at a social media site, I am fully aware of how little people think about their own data and privacy.
I myself am no saint. I am surprisingly bad at protecting my personal data, but I do think two or three steps ahead when it comes to my work-related data.
I’ve seen online LLMs used to summarise notes from meetings, to consult on medical decisions, and for loads of other use cases that involve otherwise sensitive data.
I am confident that the majority of people who use e.g. ChatGPT, Perplexity, or Bing anonymize the data enough that it can’t compromise someone else’s integrity — or rather, I am naive enough to hope so.
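If you do share notes with an online LLM, one minimal first step is scrubbing the obvious identifiers before the text leaves your machine. Here is a rough Python sketch (the patterns and the `redact` helper are my own illustration, not a real anonymiser — regexes will miss names, addresses, internal IDs, and context clues, so treat this as a floor, not a ceiling):

```python
import re

# Naive redaction patterns — a sketch only. Real anonymisation needs far
# more than regexes (names, addresses, internal identifiers, context).
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace obvious personal identifiers with labelled placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

notes = "Follow up with jane.doe@example.com or call +46 70 123 45 67."
print(redact(notes))
# → Follow up with [EMAIL] or call [PHONE].
```

Even a crude pass like this turns “who said what” into “someone said what”, which is often all the model needs to summarise a meeting.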
While I have full confidence that the companies that host these systems have the right intentions, they too are vulnerable.
So before you use the system to do anything, ask yourself:
Would it harm me, someone else, or my employer if this task/prompt/output were posted on a public forum for everyone to read? (If the answer is YES, you shouldn’t be feeding it to an online LLM.)
Now I can see you rolling your eyes.
“Not everyone is evil Almira!”
This is not about good or evil. This is about integrity, something we are too eager to sacrifice in the name of efficiency. While companies are committed to guarding their systems (leaks would harm them too), we have to remind ourselves that we can contribute to everyone’s safety by taking that extra step and thinking twice about what we feed into the system.
Plenty of people have written long, intelligent posts about AI and cybersecurity. After this: go and read their work! This is just a reminder to think twice about the data you share.