BBC sounds the alarm – Corriere.it

by Velia Alevich

School protests and street riots are amplified by the algorithm. "It's normal for people to be interested in certain issues," the app's spokespeople counter. But former employees say the company is not equipped to address the problem.

It's the fault of TikTok's algorithm if videos lead to riots. That is what a BBC investigation has revealed: content that encourages antisocial behavior is pushed by the app, in some cases reaching as many as 20 million views.

The technology is thus blamed when certain news stories spiral out of control. It happened this summer in the UK, when a "call to arms" went out on social media to loot shops on Oxford Street in London. Young people duly appeared in front of the store windows, ready to answer a call launched on the internet and passed from one phone to another.

Or the protests that arose at Rainford High School, where students said they felt humiliated by the way their skirt lengths were measured. First sixty schools joined the protest, then a hundred. Here too the word spread from the Chinese social network, where the protest was filmed and then replicated.

The same happened in France, where the unrest that broke out after the death of the young Nahil M., killed this summer by a police bullet, found space to multiply online.

Even crime cases fall into this technological trap, which prompts users to create content that goes viral but has also, at times, led to baseless accusations against innocent people. It happened to Jack Showalter, falsely accused of the killing of four students in Idaho the previous January. The finger points directly at the Chinese app: the same video uploaded to Snapchat had just over 150,000 views, but it reached 850 million on TikTok.


So whose fault is it? The app's spokespeople throw up their hands: it is "normal" for people to be interested in such cases, and the 40,000 employees responsible for content safety are doing their best to block harmful material. But some former employees of the social network disagree: the algorithm was built to make dance routines go viral, and it is not equipped to stop antisocial behavior from turning into real-world mayhem.

21 September 2023 (updated 21 September 2023 | 10:35 pm)
