- cross-posted to:
- [email protected]
Federated services have always had privacy issues, and I expected Lemmy to have the fewest, but it's visibly worse for privacy than even Reddit.
- Deleted comments remain on the server, merely hidden from non-admins, and the username remains visible (a rough sketch of how this works follows the list)
- Deleted account usernames remain visible too
- Everything remains visible on federated servers!
- When you delete your account, media does not get deleted on any server
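For anyone wondering how "deleted but still on the server" works in practice, here's a minimal soft-delete sketch. This is illustrative only, not Lemmy's actual schema or code: the idea is that the row gets flagged rather than removed, so the body and username stay in the database and only what non-admins see changes.

```rust
// Hypothetical soft-delete pattern (illustrative, not Lemmy's real code).
// The comment row is flagged as deleted instead of being removed, so the
// data stays on the server and admins can still read it.
struct Comment {
    author: String,
    body: String,
    deleted: bool,
}

// Rendering only checks a local flag; the underlying data never goes away,
// and copies already federated to other servers are untouched.
fn render(comment: &Comment, viewer_is_admin: bool) -> String {
    if comment.deleted && !viewer_is_admin {
        // The body is hidden, but the author's username is still shown.
        format!("{}: [deleted]", comment.author)
    } else {
        format!("{}: {}", comment.author, comment.body)
    }
}

fn main() {
    let c = Comment {
        author: "alice".into(),
        body: "hot take".into(),
        deleted: true,
    };
    println!("{}", render(&c, false)); // alice: [deleted]
    println!("{}", render(&c, true));  // alice: hot take
}
```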
Anything that is visible to another party can be captured and redistributed - even a 1:1 communication does not guarantee that the other party won't capture the data and then spread it. The only things that are private are thoughts you have that are not shared with others in any fashion. As soon as information is shared in any fashion, it is not private.
Past this point it's a matter of what you consider reasonably private. You could design a system where users control their own data through a set of public and private keys, requiring a key to be active to view content, but as stated above, even in such a case, revoking keys does not stop other people from making copies of the data while they still had access. This is akin to screenshotting an NFT. For all intents and purposes, a copy of the data as it existed at the time of copying is now publicly available.
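To make that concrete, here's a hypothetical key-gated store (the names and structure are made up for the example, not any real protocol): revoking a key blocks future reads, but any copy a reader made while their key was active is already out of your hands.

```rust
use std::collections::HashSet;

// Hypothetical key-gated content store. Access is checked at read time
// against a set of currently active keys.
struct ContentStore {
    body: String,
    active_keys: HashSet<String>,
}

impl ContentStore {
    // Serve the content only if the presented key is still active.
    fn read(&self, key: &str) -> Option<String> {
        self.active_keys.contains(key).then(|| self.body.clone())
    }

    // Revocation only affects reads that happen after this point.
    fn revoke(&mut self, key: &str) {
        self.active_keys.remove(key);
    }
}

fn main() {
    let mut store = ContentStore {
        body: "my post".into(),
        active_keys: HashSet::from(["reader-key".to_string()]),
    };

    // Nothing stops a reader from keeping their own copy while the key is valid.
    let local_copy = store.read("reader-key");

    store.revoke("reader-key");
    assert_eq!(store.read("reader-key"), None);          // future reads are blocked
    assert_eq!(local_copy.as_deref(), Some("my post"));  // the earlier copy survives
}
```

Revocation governs the gate, not the copies that already passed through it, which is the whole point.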
Quibbling over who "truly owns" the data when it comes to something like social media feels like a mostly pointless endeavor, because the outcome (the data is available for others to view/consume/read/etc.) is the same regardless of who "owns" it. Copyright law applies to anything you produce if it comes to legal problems (someone copies your artwork and sells it, for example), and having a system to prove ownership is primarily a formality that makes that proof easier. Generally people aren't arguing through this lens, however; they're arguing through the privacy/security lens - they don't want people stealing/selling their data, which, lol, good luck. AI models are proof that no one in the world actually cares about this ownership if they reasonably think they can get away with using your data and there's no real incentive not to. Interestingly, copyright law and models being trained on corporate data such as movies are a vector by which the legality of this might actually stop or slow AI development and protect end users' data.