15. The Skinner Box

Among the behaviourists, Burrhus Frederic Skinner (1904–1990) is another important representative. In his experiments with pigeons and rats, he observed that the animals – placed in a box isolated from the outside world (the “Skinner box”) – were more likely to repeat a spontaneously occurring behaviour if they were rewarded immediately afterwards (positive reinforcement), and more likely to refrain from it if they were punished immediately afterwards with electric shocks.
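The core of operant conditioning can be pictured as a simple update rule: each reward nudges the probability of repeating the behaviour upwards, each punishment nudges it downwards. The following is a minimal sketch; the update rule, the learning rate and the function name are illustrative assumptions, not Skinner's own formalism.

```python
def condition(p_action, reward, learning_rate=0.1):
    """Update the probability of repeating a behaviour after feedback.

    reward > 0 (positive reinforcement) pulls the probability towards 1,
    reward < 0 (punishment) pulls it towards 0. Purely illustrative.
    """
    target = 1.0 if reward > 0 else 0.0
    return p_action + learning_rate * (target - p_action)

# A "pigeon" that initially pecks a lever half the time (p = 0.5):
p = 0.5
for _ in range(20):
    p = condition(p, reward=+1)  # every peck is rewarded with food

print(round(p, 3))  # close to 1.0 after repeated rewards
```

With punishment instead of reward, the same rule drives the probability towards zero – the two halves of the experiment described above.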

The principle of positive reinforcement is omnipresent in our society: every action is followed by a reaction. As a result, we learn very quickly what is socially acceptable and what is not. It shapes education, personnel management and the way we treat each other.

On social media, any form of feedback affects behaviour – whether we like it or not – because social recognition is one of users' most important motivations. Even if many claim they post things just for fun, over time their drive turns into a hunt for praise and recognition.

So anyone who is praised for a post or comment tends to post similar things again in anticipation of renewed positive feedback, whereas anyone who receives a negative reaction will be wary of posting the same thing again.

In this way, others have a strong influence on our behaviour, our opinions and our feelings, and can even skilfully direct them.

Influence by platform operators is part of the business strategy. A post usually triggers a chain reaction of feedback (post – feedback – counter-feedback – new post, and so on), but with some users this automatism never really gets going. In that case, social bots run by the platform operators, disguised as fake accounts, take on the role of a “user” and give algorithmically generated feedback to motivate further posts and set the feedback loop in motion again – all to keep the thread of communication going, so that people stay as long as possible or keep coming back.
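The mechanism described here can be sketched in a few lines: when a post receives no genuine feedback, a platform bot steps in so the loop never stalls. All names and messages below are hypothetical; this is not any platform's actual implementation.

```python
def run_feedback_loop(posts, real_reactions, bot_enabled=True):
    """Return the feedback each post receives.

    Posts with genuine user reactions keep them; for posts that would
    otherwise stay silent, a bot (if enabled) injects generated feedback
    so the post-feedback cycle continues. Illustrative sketch only.
    """
    feedback = {}
    for post in posts:
        if post in real_reactions:
            feedback[post] = real_reactions[post]   # genuine feedback
        elif bot_enabled:
            feedback[post] = "bot: Great post!"     # loop stalled: bot fills in
    return feedback

posts = ["holiday photo", "lunch photo"]
reactions = {"holiday photo": "like from a friend"}

print(run_feedback_loop(posts, reactions))
```

With the bot enabled, every post receives some feedback and the loop keeps turning; with it disabled, the silent post simply gets none.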

However, this possibility of exerting influence is increasingly being exploited by individual users and entire organisations. The AfD in Germany, for example, is one of the most active parties on Facebook. Its growing electoral success has been attributed in part to the way it deliberately spreads opinions, fake news and fear, persuading potential voters and mobilising its own supporters.

So-called “sock puppets” – fake accounts – are often used to disguise one's identity. These are additional user accounts that serve to circumvent rules under cover of a different name, or to make a single opinion appear to be voiced by many.

Negative reinforcement can also be found on social media, for example in “catfishing”: the unprovoked rejection, ridicule or ignoring of users by other users. Users who deliberately post provocative or negative comments in order to “draw out” other users are called “trolls”; in online games this behaviour is known as “griefing”.

The same is done professionally by so-called “troll farms”, which use fake accounts to represent the interests of other companies and thus influence users on a large scale. In Russia, a veritable troll army has been spreading propaganda on social media since 2013; it is accused of having influenced the Brexit referendum and the 2016 US presidential election in this way.

Viral contagion: Users can become both perpetrators and victims of cyberbullying. Often one is a consequence of the other.
