So lots of folks are asking me about Facebook’s news today. I’ve read what I can, and here’s my initial take, in a thread. First, here’s the news from FB:
This contextual information is “pulled from across Facebook and other sources, such as information from the publisher’s Wikipedia entry… trending articles or related articles about the topic”. The hope is that this contextual information will give users the information they need to decide whether the link in question is reliable or not.
This is the best move I have seen any of the platforms take in response to fake news so far. It is a non-invasive approach that helps people make up their own minds about what to believe and share, rather than deciding for them what is and is not true.
I was wrong.
It’s still unclear exactly what Google has implemented, since the company rarely explains how its algorithms work, but it did announce back in April that it would take measures to demote “low-quality content” such as misleading information, offensive results, hoaxes, and conspiracy theories.
Like you, I am on Facebook. In two ways, actually. There’s this public page, which Facebook gives to people who are “public figures.” My story of becoming a Facebook public figure is tortured (years ago, I went Facebook bankrupt after reaching my “friend” limit), but the end result is a place that feels a bit like Twitter, but with more opportunities for me to buy ads that promote my posts (I’ve tried doing that, and while it certainly increases my exposure, I’m not entirely sure why that matters).
Then there’s my “personal” page. Facebook was kind enough to help me fix this up after my “bankruptcy.” On this personal page I try to keep my friends to people I actually know, with mixed success. But the same problems I’ve always had with Facebook are apparent here — some people I’m actually friends with, others I know, but not well enough to call true “friends.” But I don’t want to be an ass…so I click “confirm” and move on.
Stung by charges that it had become a lie amplifier, Facebook has announced a set of experiments to combat “fake news.” Users will be able to flag stories for review by third-party fact-checkers, and those confirmed as problematic will get tagged with warnings. In Steven Levy’s Backchannel piece on the move, Facebook leaders say their goal is to go after “clear black-and-white hoaxes, the bottom of the barrel, worst of the worst part of the fake news” — like the stories that claim the Pope endorsed Donald Trump or that Hillary Clinton kept sex slaves in a pizzeria.
Good luck to you, Facebook! You’ve just brought an algorithmic knife to a partisan machine-gun fight. You hope you can avoid taking sides, but the conservative part of your customer base is already rejecting the third parties — like Snopes, Politifact, and the AP — to whom you have outsourced the fact-checking (Business Insider). And you’ve now incentivized partisan hordes to click “dispute” on all the stories they don’t like, true or false.