
Facebook is Taking Serious Steps to Evolve its Content Approach

A few weeks back, when Facebook and Twitter sparked a new round of controversy by banning then-US President Donald Trump, I noted that the important thing to focus on in this process was not the banning of Trump itself, but the lessons learned from the Trump era, and how the platforms look to evolve their approaches as a result.

And last week, we saw the first key hints of just how Facebook is looking to adjust, with two potentially critical updates relating to its post-Trump shift.

First, we got the initial rulings from Facebook’s new, independent Oversight Board, which ruled on five cases, and laid the foundation for how it will look to influence Facebook policy moving forward.

As per the Oversight Board:

“We believe the first case decisions by the Oversight Board demonstrate our commitment to holding Facebook to account, by standing up for the interests of users and communities around the world, and by beginning to reshape Facebook’s approach to content moderation. This is the start of a process that will take time, and we look forward to sharing our progress through the Board’s many subsequent case decisions.”

Indeed, in four of its initial rulings, the Oversight Board overruled Facebook’s original enforcement decisions, and it criticized Facebook’s approach in all five cases. That, in itself, could lead to an improvement in Facebook’s process – but more importantly, the Oversight Board’s rulings also largely aligned with what human rights organizations have long been calling for with respect to Facebook’s approach.

That, in essence, could see Facebook regulated by proxy. It’s not official regulation via a government-appointed body, but if the Oversight Board is able to influence Facebook’s approach in line with broader community expectations, then the result could be much the same. That would be a massive shift, and could help Facebook avoid further political scrutiny.

If Facebook does change its approach, that is. The Social Network says that it will honor the Oversight Board’s decisions on individual cases, but it’s less committed to the Board’s suggested policy revisions. Facebook says that it will take the Board’s recommendations under consideration, but it won’t necessarily be bound to implement them.

It’s impossible to know at this stage how influential the Board will ultimately be, but these first cases do suggest that it could end up being a major impetus for change at The Social Network, and may even show a way forward for more effective regulation across the entire social media sector.

It’s worth noting, too, that Facebook’s VP of Global Affairs Nick Clegg also reiterated the company’s call for a new approach to independent social platform regulation. Maybe the Oversight Board will become the template for change in this respect.

The other significant update last week was Facebook CEO Mark Zuckerberg noting that Facebook will no longer recommend civic and political groups to its users, as part of a broader effort to lessen political debate within the app.

As Zuckerberg said on Facebook’s Q4 earnings call:

“One of the top pieces of feedback that we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services.”

We’ll have to wait and see whether Facebook follows through with this – and how hard it actually tries to squeeze out divisive political content – but if Zuckerberg is serious about reducing such debate, that could also be a major shift for The Social Network.

The prevailing view over time has been that Facebook doesn’t really want to get rid of divisive political content, no matter how loudly it might state such intent publicly, because such content sparks discussion, which prompts more engagement, and keeps people on the platform for longer.

That, seemingly, is what the daily listings of the platform’s most engaging posts generally indicate.

But maybe, that’s no longer the case.

As per Facebook’s latest results, its daily active user count is flatlining in the US, which could support what Zuckerberg is now saying: that users have had enough of the political debates on the platform, and that such content could even be turning people away.

Facebook wants active engagement, but not at the expense of overall users. If the balance is shifting, and more people are using Facebook less because of that content, maybe the time has come for Facebook to de-emphasize those posts.

Which it can do. In the days after the 2020 US election, amid rising political tensions, Facebook deliberately reduced the reach of more partisan, divisive news outlets on the platform, in favor of more reputable providers, to ensure improved balance in political news coverage. This led to what Facebook staffers internally referred to as the ‘nicer’ News Feed, reducing the intensity of debate and division across the board, while also keeping people who rely on the platform for news adequately informed.

Several staffers reportedly asked if they could keep the nicer feed beyond the post-election period. Perhaps that’s now where Zuckerberg is leaning.

Make no mistake: I would credit this change in approach to what’s best for business, not a sudden attack of conscience. But if the result is a less divisive, less angst-inducing platform – which we now know has the power to spark full-scale civil disorder – then that is still a positive outcome.

Again, there’s a long way to go, and much remains to be seen before we can assume that Facebook is serious about changing its ways. But these are potentially important indicators of internal change, and a new way forward for the world’s largest social media platform.
