After the ‘Facebook Files’, the social media giant must be more transparent
Facebook faces significant challenges in moderating content on its platform amid conflicting demands from users, advertisers and civil society organisations. In this cross-posting from The Conversation, Nicolas Suzor looks at the need for transparency from the social media giant.
Most people on Facebook have probably seen something they wish they hadn’t, whether it be violent pictures or racist comments.
How the social media giant decides what is and isn’t acceptable is often a mystery.
Internal content guidelines, recently published in The Guardian, offer new insight into the mechanics of Facebook content moderation.
The slides show the rules can be arbitrary, but that shouldn’t be surprising. Social media platforms like Facebook and Twitter have been around for less than two decades, and there is little regulatory guidance from government regarding how they should police what people post.
The social networks have long argued that they are not publishing anything, merely enabling others to self-publish using their platforms. This, from where I am looking, is the grey area. If an old skool publisher allowed comments on their site like the ones you can read on Facebook, there would be uproar and complaints to the newspaper / broadcaster / publisher…
The answer? Google’s Local Guides is building and building and building. It is an online community, and when you reach a certain level you can suggest edits to addresses, contact details; all sorts! ‘Community policing’ is important in our real communities and, it seems, is just as important online today, where it is sorely lacking. With such a mammoth operation facing Facebook to clean up its comment threads, perhaps it should take a leaf from Google’s book and start rewarding its users for participation. Give a bit back and, hey presto, slowly the community can become far more civil. We all want to live in a civil society, and some would say that FB has enabled a monster like Trump to get into power. (Flippant comment or the truth?) Let’s hope FB do the right thing, start showing us all that they care, and empower the good users to do good and help FB clean up its act.
Hah, Google Local Guides.
Ever read some of the threads on a YouTube video?
Vile, disgusting and hateful.
Google aren’t innocent in all of this.
Hearing you re YouTube. I referred to Local Guides as a model to watch. Slowly but surely, it would appear that Google will be identifying credible online users and empowering them to police. Rome wasn’t built in a day, and it has got out of hand on all social networks. With an empowerment and reward model, the long tail of credible, qualified (by algorithm) users will slowly clean up the web. It is really the only way. Wikipedia’s model can also work, again by only allowing trusted, credible folks to edit certain pieces. My bet is that in five years’ time ‘vile, disgusting and hateful’ comments will be a thing of the past. Let’s hope so.
Facebook policing the internet – we’re all screwed.
As for Facebook’s ongoing concern… they had better figure out their future audience, because my late-teens kids and all their friends steer well clear of it.
They’re all on Insta (owned by FB…)?