We try to behave well because of what people might say about us. According to a game theory model, the social game is governed in the long run by the logic of gossip.

Researchers from the University of Pennsylvania used mathematical tools to investigate the social psychological assumption that our reputation, and what people say about each other behind their backs, plays an important role in maintaining social order.

Biomathematical research in Professor Joshua Plotkin's lab modeled what happens when gossip comes from a single source or from several random sources. The analysis also established how much gossip is needed to sustain consensus and cooperation.

Research into the social spread of information and research into cooperative behavior are both scientifically mature fields, but the two have rarely been combined. By bringing them together, we were able to build a mechanistic model of how cooperative behavior spreads along with information

- pointed out Mari Kawakatsu, the author of the research paper.

Previous research has shown that people are more likely to cooperate when they believe their peers are talking about them behind their backs. Gossip punishes freeloaders and helps people avoid cheaters. The other side of the phenomenon is that those who are kinder and more helpful to others earn a better reputation and therefore receive more social support, such as better job offers. It is a kind of social feedback,

which is called indirect reciprocity.

The system of morality and gossip ensures that the good are rewarded and the bad are punished. This way the right behavior spreads rather than the wrong one

- points out Taylor A. Kessinger, who joined the research as a physicist.

When someone wants to punish a wrongdoer, they must be sure that others agree the person is guilty; otherwise the punisher would be the one doing wrong. Gossip helps solve this problem

- he added.

The research was based on a game theory model, a game if you like, in which one player could decide whether or not to give something to another. Before the donor decided whether the recipient would receive anything, they weighed information about how the recipient had previously behaved toward other players.

The research revealed that individual players followed quite different strategies. Some were always cooperative (that is, always gave), some always refused to give, and there were discriminators who, when deciding, relied on the gossip shaping their partner's reputation, rewarding good behavior and punishing bad.
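To make the setup concrete, below is a minimal, hypothetical Python sketch of such a donation game with a single shared reputation ("gossip") channel and the three strategy types described above. It is not the Penn team's actual model: the payoff values, noise rate, and judgment rule are illustrative assumptions.

```python
# Illustrative sketch only: a donation game with shared ("gossiped") reputations
# and three strategy types. Parameters and the judgment rule are assumptions,
# not the study's actual specification.
import random

N_PLAYERS = 60
N_ROUNDS = 20000
BENEFIT, COST = 2.0, 1.0      # value of receiving help vs. cost of giving it
GOSSIP_NOISE = 0.05           # chance a reported action gets flipped (noisy gossip)

STRATEGIES = ["always_give", "never_give", "discriminator"]


class Player:
    def __init__(self, strategy):
        self.strategy = strategy
        self.payoff = 0.0


def decide(donor, recipient_is_good):
    """Return True if the donor gives to the recipient."""
    if donor.strategy == "always_give":
        return True
    if donor.strategy == "never_give":
        return False
    # Discriminators reward a good reputation and punish a bad one.
    return recipient_is_good


def simulate():
    players = [Player(random.choice(STRATEGIES)) for _ in range(N_PLAYERS)]
    reputation = [True] * N_PLAYERS   # one shared view of each player's standing

    for _ in range(N_ROUNDS):
        d, r = random.sample(range(N_PLAYERS), 2)
        donor = players[d]

        gave = decide(donor, reputation[r])
        if gave:
            donor.payoff -= COST
            players[r].payoff += BENEFIT

        # Gossip then updates the donor's own reputation. Under this assumed
        # judgment rule, giving to a good player or refusing a bad one counts
        # as good; with some probability the report is noisy and gets flipped.
        judged_good = (gave and reputation[r]) or (not gave and not reputation[r])
        if random.random() < GOSSIP_NOISE:
            judged_good = not judged_good
        reputation[d] = judged_good

    # Average payoff per strategy shows which behavior the gossip dynamic favors.
    totals = {}
    for p in players:
        totals.setdefault(p.strategy, []).append(p.payoff)
    for strategy, payoffs in sorted(totals.items()):
        mean = sum(payoffs) / len(payoffs)
        print(f"{strategy:>13}: n={len(payoffs):2d}  mean payoff={mean:.2f}")


if __name__ == "__main__":
    simulate()
```

In a sketch like this, discriminators tend to do well because gossip lets them withhold help from players with bad standing while still receiving help themselves, which echoes the dynamic the researchers describe.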

The researchers found that in the long term

the game came to be dominated by the discriminators.

Players following the unconditional strategies were pushed to the margins, while an equilibrium formed around reputations. It turned out that the spread of false information can either hinder or enhance cooperation, depending on how widely it spreads. The equilibrium was undermined only when the circulating information became excessively abundant and noisy.

Kessinger highlighted X, formerly known as Twitter, where the logic of indirect reciprocity is reversed and the platform ends up rewarding bad behavior instead of good.

In further research, Kawakatsu wants to investigate how social information influences altruism, that is, self-sacrifice; where the limits of in-group and out-group biases lie; and what happens when gossip from two different sources produces a split in how an individual is judged socially.

Index

Featured image: University of California