With recent developments in AI technology, we are entering a new era of manipulation through captology. Mainstream websites and apps are well known to use captology to manipulate their users. But now captology is AI-driven and will reach a new dimension of manipulation. In the very near future, every mainstream OS will be AI-infested, which means manipulation will start as soon as you turn on your PC or smartphone.

The internet, and computing in general, is no longer a harmless activity. Using mainstream websites, operating systems, and software is now an objective danger to users' mental self-determination, because I honestly don't believe that anyone can outsmart AI-driven captology. People who claim they can probably also believe they can consume heroin without becoming addicted.

This is why I believe that now is the last chance to escape this pernicious trend. Now is the time to fundamentally rethink and reorganize our habits of using the internet and computer technology.

I don't claim to know the perfect way out; I can only share what I am doing. I believe some kind of off-grid computing is the solution.

My setup:
- Tech: laptop with Linux Mint, Nokia dumbphone, FM tuner
- No home internet connection; I only use public wifi. (Psychological reason: it keeps me from mindless surfing.)

A few years ago, I would have called someone with this lifestyle an insane luddite. But today, I would rather be an insane luddite than a dopamine-addicted consoomer who is being controlled by AI-driven captology.

  • eskimofry@lemmy.ml · 1 year ago

    I guess most people can't afford (in both time and money) to run an LLM just to have a saner browsing experience. So I suspect the key to adoption would be for the community to band together and spawn some LLM servers, backed by donations or other sustainable practices.

    The future is probably once again crowd computing. We could use the federated network to distribute LLM requests from the community and load-balance among those who are running LLM endpoints.
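The crowd-computing idea in the comment above could be sketched as a simple client-side dispatcher that rotates requests across community-run endpoints. This is a minimal round-robin sketch, not any existing federated protocol; the endpoint URLs and the `EndpointPool` class are hypothetical placeholders for whatever discovery mechanism the community actually used.

```python
import itertools


class EndpointPool:
    """Round-robin load balancer over community-run LLM endpoints.

    The URLs below are hypothetical; a real federated client would
    discover endpoints over the network and drop unhealthy ones.
    """

    def __init__(self, endpoints):
        self._endpoints = list(endpoints)
        self._cycle = itertools.cycle(self._endpoints)

    def next_endpoint(self):
        # Pick the next endpoint in rotation, spreading load evenly.
        return next(self._cycle)

    def dispatch(self, prompt):
        # A real client would POST the prompt to the chosen endpoint;
        # here we just return which server would handle the request.
        return (self.next_endpoint(), prompt)


pool = EndpointPool([
    "https://llm1.example.community",
    "https://llm2.example.community",
])
server, prompt = pool.dispatch("summarize this page")
```

Round-robin is the simplest possible policy; a donation-backed pool would more likely weight endpoints by capacity or current queue depth, but the dispatch structure stays the same.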