Problem
With this thread I want to raise awareness of the advanced technological methods used by Facebook, Twitter and Google to addict their visitors and manipulate their subconscious, so as to maximize profit from their actual customers (advertisers), and to discuss possible solutions to this problem.
The combination of infinite scroll with the ability to serve visitors an endless stream of targeted content has proven dangerous. [1]
Targeted content on its own leads to the well-known tunnel effect, causing a disconnection from reality, and is already a threat in itself.
Combined with infinite scroll, which constantly reaffirms whatever the platform assumes your opinion to be, it becomes an even bigger threat.
This constant reaffirmation is a stimulus that causes the viewer's brain to continually reward the behavior and reinforce existing patterns, much like Pavlov's dogs. [2]
As you are probably aware, a rewarding stimulus is usually much less effective than a negative one. Poker players know this effect well: it manifests as selective memory, where a player remembers every bad beat but not the hands where they sucked out themselves. [3]
It is therefore not in the interest of platforms using this combination of patterns to show their visitors anything they may not like, such as a post from someone with an opposing opinion on a topic.
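The reinforcement loop described above can be sketched in a few lines of code. This is a deliberately simplified toy model, not any platform's actual algorithm: all function names, the "leaning" scale from -1 to +1, and the update rule are my own illustrative assumptions. It shows how ranking purely for predicted engagement, while nudging the platform's estimate of the user toward whatever was shown, keeps the range of opinions the user actually sees narrow.

```python
import random

def predicted_engagement(user_leaning, post_leaning):
    # Toy assumption: a user engages most with posts matching their inferred leaning.
    return 1.0 - abs(user_leaning - post_leaning)

def rank_feed(user_leaning, posts):
    # Sort candidate posts by predicted engagement, highest first.
    return sorted(posts, key=lambda p: predicted_engagement(user_leaning, p), reverse=True)

def simulate(initial_leaning, rounds=50, feed_size=5, seed=0):
    # Each round the user sees the top-ranked posts, and the platform nudges
    # its estimate of the user's leaning toward the average of what was shown.
    rng = random.Random(seed)
    leaning = initial_leaning
    spread_per_round = []
    for _ in range(rounds):
        candidates = [rng.uniform(-1, 1) for _ in range(100)]  # post leanings in [-1, +1]
        shown = rank_feed(leaning, candidates)[:feed_size]
        spread_per_round.append(max(shown) - min(shown))  # opinion spread actually seen
        leaning = 0.9 * leaning + 0.1 * (sum(shown) / len(shown))  # reinforcement step
    return spread_per_round

spread = simulate(0.3)
# Although candidate posts span the full range from -1 to +1 (a spread of 2.0),
# the spread of opinions actually shown each round stays a small fraction of that.
```

In this toy setup the feed never needs to "censor" anything; simply ranking by predicted engagement is enough to filter opposing views out of the top results, which is the tunnel effect in miniature.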
Furthermore, there have been massive data leaks in the past, the famous Cambridge Analytica scandal for example, where data gathered by a platform was abused by political actors to intentionally cause harm to society.
Another problem is that once someone understands the mechanics at play, they are easily abused, for example by foreign adversaries such as Russia, which has a long history of psy-ops. In one case, Russian operatives successfully organized two opposing protests in the same city, entirely through Facebook and from thousands of miles away. [4]
What are the potential solutions to this problem?
The difficulty is that forcing the platforms to expose their algorithms and fix them to display topic-based results (as the Google search engine does) would certainly make their products less addictive, resulting in fewer visitors and thus less advertising revenue.
Currently the platforms have very little incentive to police themselves, for example to shut down the many YouTube streams posing as Fox News or CNN, usually combined with a bot-filled chat where bots constantly repeat specific messages. [5] YouTube artificially slows the takedown process by requiring the copyright owner to report such streams; the process for ordinary users reporting a channel impersonating another one is much slower and in many cases leads to no takedown at all.
One possible solution that comes to mind is a tax, perhaps a digital tax on platforms that use this specific combination of patterns (infinite scroll plus audience targeting). Most other addictive products, such as tobacco and alcohol, are taxed as well in order to offset their negative effects. Another possibility would be to mandate topic-based rather than opinion-based algorithms and to fine platforms for non-compliance.
Any other ideas?
References
[1] https://www.wired.com/story/rants-an...-social-media/
[2] https://en.wikipedia.org/wiki/Classical_conditioning
[3] https://en.wikipedia.org/wiki/Reward_system
[4] https://www.texastribune.org/2017/11...ussian-page-l/
[5] https://www.youtube.com/results?sear...JAAQ%253D%253D