Digital media such as social media, messenger groups or comment columns in online media have a predominantly negative influence on political processes. They can encourage populist movements, increase polarization and undermine trust in institutions.
I’m glad this is finally being looked at because I’ve been saying this for years and I feel like the vast majority doesn’t realize how bad this is.
If you look at social media use and the rise of the far right over the last 15 years, they pretty much go hand in hand. Sure, that isn’t proof for anything, but it’s alarming and needs to be studied further.
In Germany, we have safeguards that prevent a single media outlet from gaining too much power over public opinion. We have this for TV stations, newspapers etc. Just not for social media, because social media wasn’t even on the horizon when those laws were put in place.
I think it’s insane that we just let foreign, profit-driven companies steer public opinion and discourse in our society. We finally have to start regulating them heavily.
Correlation is not causation. This is Statistics 101… I can point to other things that are correlated with the rise in the far right. For example, centralization, the increase in monopolies, the number of years since World War II, the average temperature of the earth, the number of years into the new millennium.
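To make that point concrete, here’s a toy Python sketch (the series names and numbers are completely made up, purely for illustration): any two series that simply trend upward over the same years will show a high Pearson correlation, even though neither causes the other.

```python
import numpy as np

years = np.arange(2010, 2025)
rng = np.random.default_rng(0)

# Hypothetical, invented numbers: both series just grow over the same period.
social_media_users = 1.0 + 0.25 * (years - 2010) + rng.normal(0, 0.1, len(years))
temperature_anomaly = 0.6 + 0.02 * (years - 2010) + rng.normal(0, 0.02, len(years))

# Pearson correlation between the two unrelated trends
r = np.corrcoef(social_media_users, temperature_anomaly)[0, 1]
print(f"Pearson r = {r:.2f}")  # close to 1, yet neither trend causes the other
```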
Anyway, when I read your comment as a whole, what I actually see is that you’re concerned that social media is too centralized and therefore ripe for abuse. That’s vastly different from saying that social media itself is inherently going to be abused.
Correlation is not causation, but in this case it is very suspicious, and studies like these show that there might be causation. That’s why I said it’s not proof but needs to be studied further.
And yes, of course I mean centralized, profit driven social media. But that is the overwhelming majority of social media.
I think where social media is particularly insidious is the use of algorithms.
No normal person wakes up and thinks “I think we should bring back the Nazis”, but “like” a post on Facebook about Remembrance Sunday or patriotism or even just catapults, and suddenly you’re on a fast track to white supremacist groups.
I’ve seen people go from chewing out a stranger in the supermarket for whining about foreigners, to spreading made up rumours about immigrants killing white girls.
In the past you’d only hear about hate groups when they were marching through London in Union Jack T-shirts, smashing curry shop windows. Social media lets people in on the bottom rung and carries them all the way up to that point.
I’m not defending social media or its algorithms, but you’ve also got to look at what predated it. People used to get their news and information from news anchors and newspapers that were mouthpieces for capitalists and conservatives. So maybe not as extremist or even as effective at influence as a social media algorithm, but still not great. Personally, I don’t want to go back to relying on corporate news outlets either.
In Europe we have public news broadcasters in every country, so relying on corporate news isn’t a huge problem.
Anyway, a news outlet pushing propaganda and misinformation is not nearly as dangerous as a social media platform doing it via algorithm. The fact that the algorithm can give you exactly the kind of propaganda that will resonate with you is extremely dangerous.
Not saying privately owned news outlets with a huge reach aren’t a problem, they definitely are; it’s just a whole other dimension.
Well put, this is exactly the problem. I remember the early days of Facebook, Twitter and Instagram, when there were no algorithms and your feed was just posts by people you follow in chronological order. Discourse was much tamer back then. We urgently need stricter laws that regulate these algorithms.