Manipulation 2.0. How does social media turn you against your friends?
It is crucial to understand that you can lie using only the truth.
Imagine you are having a coffee or lunch with your friend. You talk about your daily lives, and at some point the topic turns to social issues or politics. Then, your friend’s eyes widen. He is silent for a while, then he asks, surprised:
Wait, do you really think so? I was sure I would hear exactly the opposite from you.
You ask where he got that idea, and he replies that he had a very different impression from what you shared and commented on social media.
You slowly realize that the posts your friend saw painted a picture of someone whose views are completely different from yours. He saw some of your posts, but never the others.

You two are not the only ones surprised. Every day, millions of people face the same problem: they draw the wrong conclusions about their friends based on what they see on social media.
How does it happen?
When the truth can also be a lie
Two things are crucial when using the truth to lie: selecting which information to present, and controlling how often it appears.

There is no need to provide false information. It is enough to select a handful of true events while reducing the amount of conflicting information. This effectively misleads the recipient.
How do social media do it?
Social media platforms use you, and the genuine information you share yourself, to manipulate your friends.
The easiest way to use you to influence others is to make some of the articles and blog posts you share appear less often on your friends’ walls.
The algorithm scans the articles and blog posts you share for keywords that are "inappropriate" or inconvenient for the platform. It then applies a reach multiplier.

To keep it simple, let's say the multiplier makes some posts show up on your friends' walls half as often as normal. As a result, those posts collect fewer comments and reactions, so they look less popular and appear to be ignored by other users. Seeing them rarely, your friend concludes they are the exceptions among your shares.

Other articles you share, however, are welcomed by the platform and get a 2x multiplier. Such an article shows up for your friends twice as often as usual, and four times as often as an article suppressed with a 0.5x multiplier. It also collects more comments. Your friend sees that article, the lively discussion under it, and concludes that this is who you REALLY are.
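The mechanism described above can be sketched as a small simulation. Everything here is hypothetical: the multiplier values, the baseline reach, and the post labels are assumptions for illustration, not the actual parameters of any real platform.

```python
import random

# Hypothetical reach multipliers, as in the example above
# (0.5x suppressed, 2x favored). Not any real platform's values.
MULTIPLIERS = {"favored": 2.0, "neutral": 1.0, "suppressed": 0.5}

BASE_REACH = 0.40  # assumed baseline chance that a friend sees any given post

def simulate_feed(posts, n_friends, seed=0):
    """Count how many friends see each shared post once multipliers apply."""
    rng = random.Random(seed)
    views = {}
    for post_id, label in posts:
        # Probability of a friend seeing this post, capped at 1.0.
        p = min(1.0, BASE_REACH * MULTIPLIERS[label])
        views[post_id] = sum(rng.random() < p for _ in range(n_friends))
    return views

# You share two posts with opposite slants; the platform favors one.
posts = [("pro_topic_A", "favored"), ("anti_topic_A", "suppressed")]
views = simulate_feed(posts, n_friends=1000)
```

With these assumed numbers, the favored post reaches roughly four times as many friends as the suppressed one, even though you shared both. Each friend's sample of your posts is skewed, so the impression of "who you really are" is skewed with it.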
As a consequence, your friends (depending on their views) will see only a part of you: the part that confirms their beliefs and aligns with what the platform wants them to think.
The more friends you have, the stronger the effect. When algorithms falsify the image of hundreds or thousands of people, psychology does the rest. We are social creatures, and research shows that what others think matters to us, even when we wish it did not.

And the closer you are to the person to whom you send a friend request, the more effective a tool of manipulation you become.
Manipulation 2.0 is horrible, but this is only one of the mechanisms used, and there are many more of them.
And even if you are one of the least prone to this kind of manipulation, are you sure that you can say the same about your friends?
Manipulating the world "for its own good" is not progress.
I am not sure what you think about this, but we at ccFound strongly condemn such practices.
We believe that we all have the right to know our friends' opinions and views, even when those views are silly. We also believe that we all have the right to access expert knowledge, even when the experts' opinions do not appeal to big media corporations or governments.
First of all — we do not accept lying and manipulation of people, even “for a good cause”. It never ends well because power always corrupts.
The question is — can we somehow counteract such actions?
Yes, by cutting yourself off from the people and businesses that do this.
We are building a new social medium that will be completely resistant to this type of manipulation. Using special tokens, we will transfer authority over the platform to its users. We will let people, not members of a management board, decide which direction the platform should take.

This will make the goals of the platform and its users one and the same.
A nice bonus of this solution is that every token owner will… earn a share of the platform's profits. But that is a topic for another article ;)
A platform created of the people, by the people, and for the people. And completely uncensored.
If this sounds interesting to you, click HERE and find out more about the project.