On 26th August, Facebook published a blog post announcing that the Trending feature would become more automated to help it scale:
“Our goal is to enable Trending for as many people as possible, which would be hard to do if we relied solely on summarizing topics by hand. A more algorithmically driven process allows us to scale Trending to cover more topics and make it available to more people globally over time. This is something we always hoped to do but we are making these changes sooner given the feedback we got from the Facebook community earlier this year.” – via newsroom.fb.com
Back in 2014, Facebook’s Trending was launched to help users find “interesting and relevant conversations”. It was run by a team of “news curators”, who wrote headlines and descriptions for the trending posts. The involvement of real-life human beings led to controversy earlier this year, when Gizmodo accused Facebook of political bias for disfavouring news about conservative topics; Facebook denied this, stating that it “found no evidence that the anonymous allegations are true”.
When the algorithms get it wrong
Facebook has been working to reduce human involvement in Trending, apparently to lower the chance of “unintentional bias”. That all sounds good, especially in answer to allegations of actively keeping some political views off the list, but it rather fell apart over the weekend.
Just days after the switch to an automated algorithm was announced, Facebook’s Trending Topics promoted a story claiming that Fox News presenter Megyn Kelly was secretly affiliated with Hillary Clinton and had lost her job over it. Completely false.
Although Facebook claims humans are still involved in the Trending process, it’s unclear how. The fake news story was eventually removed from Trending Topics, but with 62% of U.S. adults getting their news from social media, that’s little solace. Luckily, many realised the story was fake:
Megyn Kelly is trending on Facebook for an article that has no basis in reality. pic.twitter.com/31f4ERnzHI
— Kyle Blaine (@kyletblaine) August 29, 2016
Is personalisation the answer?
Admittedly, the topics you see are already slanted towards your location, the Pages you’ve ‘liked’ and the Trending Topics you’ve engaged with in the past, as well as topics that are suddenly being talked about more than usual. But could personalisation go further still?
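The signals above (location, liked Pages, past engagement, and a sudden spike in discussion) could in principle be combined into a per-user ranking. A toy sketch follows; the signal names, weights, and formula are purely illustrative assumptions, not Facebook’s actual algorithm:

```python
# Illustrative sketch only: the weights and fields below are assumptions,
# not a description of Facebook's real Trending ranking.

def trend_score(topic, user):
    """Combine a global 'spike' signal with per-user relevance."""
    # How much discussion of the topic has surged versus its usual volume
    spike = topic["mentions_now"] / max(topic["mentions_usual"], 1)

    # Per-user signals: location match, overlap with liked Pages,
    # and how often the user engaged with this topic before
    relevance = (
        1.0 * (topic["region"] == user["region"])
        + 0.5 * len(topic["pages"] & user["liked_pages"])
        + 0.3 * user["past_engagement"].get(topic["name"], 0)
    )
    return spike * (1 + relevance)

topics = [
    {"name": "Election", "mentions_now": 900, "mentions_usual": 300,
     "region": "US", "pages": {"CNN", "Fox News"}},
    {"name": "Transfer news", "mentions_now": 400, "mentions_usual": 100,
     "region": "UK", "pages": {"BBC Sport"}},
]
user = {"region": "UK", "liked_pages": {"BBC Sport"},
        "past_engagement": {"Transfer news": 2}}

ranked = sorted(topics, key=lambda t: trend_score(t, user), reverse=True)
print([t["name"] for t in ranked])  # → ['Transfer news', 'Election']
```

Even in this toy version, the trade-off is visible: the more weight the per-user terms carry, the less the list reflects what is genuinely trending for everyone.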
Personalisation in marketing can be great, so what if Facebook adopted a personalised list of Trending Topics for each user? It would involve some form of human input: from users, who would need to “tell” Facebook the kind of news they’re interested in and the sources they deem trustworthy; and from a team of employees keeping a watchful eye on the stories being promoted. That team would need to distinguish “news” items from “opinion” pieces to ensure quality and avoid mishaps like the McChicken incident.
The issue with relying on personalisation is that it can cut people off from wider news, and by assuming too much it risks alienating users by placing them in a demographic bubble. Failing to match the ‘right’ users with the ‘right’ news could render Trending Topics useless.
If Facebook’s robots can successfully differentiate between fact and fiction, Trending Topics will be a force to be reckoned with in news distribution; until then, actual real-life people are needed to mediate it.