A year after it launched in the US, Facebook is taking its Watch video platform international. The company announced in a blog post this morning that the platform will be available globally, meaning partners can now reach an audience of billions of people across countries and languages. That’s a massive audience addition, and it gives Facebook access that’s on par with YouTube and Netflix. While those new audience numbers don’t necessarily mean more partners are going to jump on board Watch, it at least gives them a better reason to do so.
It also means advertisers can reach those billions of users, and partners can profit from the resulting ad breaks. Congrats to all the partners. Ad break partners can publish ads in English and other local languages. Watch is available on iOS and Android in the Facebook app. It’s also available through Apple TV, Samsung Smart TV, Amazon Fire TV, Android TV, Xbox One, and Oculus TV.
Facebook’s blog post says more than 50 million people in the US watch at least a minute’s worth of Watch videos each month and that total watch time has increased 14-fold since the beginning of 2018. Still, one-minute views are a low bar, and they aren’t an especially reassuring indicator of the platform’s prospects.
Since its launch, Facebook has designed multiple new formats for publishers to try, including live game shows and series that feature interactive portions like quizzes and polls. The company seems to want to create a truly social TV network where creators and viewers can interact with one another in real time. Now, that audience participation will be worldwide.
Facebook has started rating its users’ trustworthiness in order to help the social network know how much to value user reports that a certain news story might be fake. The Washington Post has details on the system and confirmation from Facebook that it’s been put in place. The system certainly sounds a touch dystopian, but Facebook sees it as a valuable tool for weeding out disinformation.
The trust ratings went into place over the last year, according to the Post, and were developed as part of Facebook’s fight against fake and malicious stories. Facebook relies, in part, on reports from users to help catch these stories. If enough people report a story as false, someone on a fact-checking team will look into it. But checking every story that racks up “fake news” reports would be overwhelming, so Facebook uses other information to figure out what it should and shouldn’t bother looking into.
One of those signals is the trust rating. Facebook didn’t tell the Post everything that goes into the score, but it’s partly based on a user’s track record of reporting stories as false. If someone regularly reports stories that a fact-checking team later confirms to be false, their trust score goes up; if they regularly report stories that turn out to be true, it goes down.
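As described, the mechanism resembles a simple accuracy tally for each reporter. Facebook hasn’t published its actual formula, so this is purely an illustrative sketch; every name and the neutral starting value here are invented:

```python
# Hypothetical sketch of a reporter-accuracy score, based only on the
# behavior described above. Facebook's real system is not public; all
# names and values here are assumptions made for illustration.

class ReporterScore:
    def __init__(self):
        self.correct = 0    # reports later confirmed false by fact-checkers
        self.incorrect = 0  # reports on stories later found to be true

    def record_verdict(self, story_was_false: bool):
        """Update the tally once fact-checkers rule on a reported story."""
        if story_was_false:
            self.correct += 1
        else:
            self.incorrect += 1

    @property
    def trust(self) -> float:
        total = self.correct + self.incorrect
        # Assume a neutral 0.5 until a reporter has any track record.
        return 0.5 if total == 0 else self.correct / total


score = ReporterScore()
score.record_verdict(True)   # reported story confirmed fake
score.record_verdict(True)   # another accurate report
score.record_verdict(False)  # reported story turned out to be true
print(round(score.trust, 2))  # 0.67
```

A score like this could then be one weight among many when deciding which reported stories get routed to human fact-checkers first.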
“People often report things that they just disagree with,” Tessa Lyons, Facebook’s product manager for fighting misinformation, told the Post.
In that sense, this may be less of a “trust” score and more of a “fact-check” score, and the name isn’t likely to do it any favors. Algorithms are often flawed and can have larger, deleterious effects that aren’t immediately visible, so Facebook will have to be careful about what other information it factors in and how else this score is used, lest it accidentally discount reports from a specific community of people.
Facebook pushed back on the score’s eeriness factor in a statement to Gizmodo, saying that the company doesn’t maintain a “centralized ‘reputation’ score.” Instead, the system is just part of “a process to protect against people indiscriminately flagging news as fake and attempting to game the system … to make sure that our fight against misinformation is as effective as possible.”
Right now, it isn’t clear whether the trust score is used for anything beyond reports on news stories and reports that another Facebook user has posted something inappropriate or otherwise in need of the company’s attention.
If it’s used as advertised, the scores could help Facebook home in more quickly on disinformation that’s spreading around the network. While bad reports can come from all over, President Donald Trump and other Republican leaders have made a habit out of calling any story they dislike “fake news,” which could influence others to abuse the term. That could lead to fact-checkers wasting time on stories that are obviously correct.
The real backstop here is the fact-checkers. Facebook largely seems to rely on third-party fact-checking services like Snopes and PolitiFact to determine what is and isn’t real. That means the final determinations ought to be trustworthy, but there’s still a layer of Facebook’s algorithm in the way.
The Columbia Journalism Review published a report back in April that looked at Facebook’s fact-checking efforts. It found that many fact-checkers were frustrated with Facebook’s lack of transparency. Fact-checkers weren’t clear on how Facebook was determining which stories to show or hide from them and in which order. That means that even though widely accepted fact-checkers have a shot at monitoring these stories — and therefore a direct impact on users’ trust scores — it still comes down to Facebook to pick out the right stories to show them in the first place.
A small, useful Facebook feature is doing a lot of good. CEO Mark Zuckerberg wrote in a post yesterday that birthday fundraisers, in which users can request that their friends donate to a cause for their birthday, have raised over $300 million for more than 750,000 nonprofits. The service only launched last year. He said those organizations range from food banks to animal shelters to Alzheimer’s research. In June, Facebook said it would donate $5 for every birthday fundraiser started, so long as it supports one of those 750,000 vetted US nonprofits.
This isn’t the only fundraising option users have. They can also create Pages to raise money for charity, for example, but the clear success of birthday fundraisers makes sense. People love Facebook’s birthday reminders; for many users, they’re the platform’s most valuable feature aside from event invites. Letting people parlay all that attention into donations was a wise move.
Facebook’s newest Messenger feature allows users to play augmented reality games with up to six friends. It’s rolling out today with two games: Don’t Smile, which challenges users not to smile first, and Asteroids Attack, which involves navigating a spaceship.
To use them, make sure you’re on the latest version of Messenger. Then open an existing conversation or find the person you’d like to chat with, and tap the video icon in the upper-right corner of the screen. Tap the star button and select one of the AR games. The person or group you’re video chatting with will get a notification that it’s game time. If they accept, everyone joins a live group video chat, and the game starts with the cameras pointed at everyone’s faces.
Snapchat has also been experimenting with AR games that show up over users’ faces. The company launched its version, called Snappables, back in April, and it’s reportedly considering a bigger push into the gaming space. Meanwhile, Facebook already had regular games in Messenger, and it’s easy to imagine the company porting these AR games over to Instagram where lots of people are already chatting.
Facebook has removed four pages run by conspiracy theorist Alex Jones, explaining that the channels repeatedly violated the site’s policies against hate speech and bullying.
The decision to take down the four pages is the strongest censure of Jones’ behavior by Facebook yet. Last week, the company removed four videos shared on Jones’ channels and gave the radio host’s personal profile a 30-day suspension. However, this did not affect the output of pages run by Jones and his associates, which kept on uploading new content.
Now, though, four of these Facebook pages have been removed altogether. These include the Alex Jones Channel Page, the Alex Jones Page, the Infowars Page, and the Infowars Nightly News Page. Visiting any of these pages now shows the message: “Sorry, this content isn’t available right now.”
In an apparently unconnected move, Apple has also removed content made by Jones. Five of Infowars’ six podcasts were removed from the iPhone maker’s iTunes and Podcasts app. The company said in a statement: “Apple does not tolerate hate speech.”
Facebook explained its decision to remove the pages in a blog post titled “Enforcing Our Community Standards.” The post makes clear that the pages were unpublished not because they shared fake news (an activity that Facebook executives have repeatedly defended), but because they violated the company’s community standards, particularly its rules against hate speech and bullying.
“While much of the discussion around Infowars has been related to false news, which is a serious issue that we are working to address by demoting links marked wrong by fact checkers and suggesting additional content, none of the violations that spurred today’s removals were related to this,” said the blog post.
The rest of the post details how Facebook’s “strike” system works to judge which pages and individuals have broken the site’s rules enough to warrant a ban. This system has been criticized in recent months for its seemingly arbitrary and opaque nature. A recent undercover documentary detailing the work of Facebook’s moderators added to the criticism when it showed that the company repeatedly let far-right fringe groups exceed the usual number of strikes for bad behavior.
According to Facebook, Jones’ four pages were taken down for “glorifying violence” and “using dehumanizing language to describe people who are transgender, Muslims and immigrants.”
Facebook has officially launched a new feature called Watch Party that will let users simultaneously watch Facebook videos together. Their streams will be synced so they can comment and react while the host adds videos to the watch queue and controls playback. Multiple people can “host” the Watch Party, which gives them the ability to choose videos to add. Attendees can suggest videos but not actually play them. For now, Watch Parties can only be started within Groups — not Pages — and the videos have to be hosted on Facebook itself, although they can be Live or prerecorded. The company first announced and started testing Watch Party in January.
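The host/attendee split described above is essentially a permission model over a shared queue. As a purely illustrative sketch (not Facebook’s implementation; all names here are invented), it could be modeled like this:

```python
# Hypothetical model of the Watch Party roles described above:
# hosts queue videos and control playback; attendees can only suggest.
# All class and method names are assumptions made for illustration.

class WatchParty:
    def __init__(self, creator: str):
        self.hosts = {creator}           # users who can queue and approve
        self.queue: list[str] = []       # videos everyone watches, in order
        self.suggestions: list[str] = [] # attendee picks awaiting approval

    def add_host(self, user: str):
        self.hosts.add(user)

    def add_video(self, user: str, video: str):
        if user in self.hosts:
            self.queue.append(video)        # hosts queue directly
        else:
            self.suggestions.append(video)  # attendees can only suggest

    def approve_suggestion(self, user: str, video: str):
        # Only a host may promote a suggestion into the shared queue.
        if user in self.hosts and video in self.suggestions:
            self.suggestions.remove(video)
            self.queue.append(video)


party = WatchParty("alice")
party.add_video("alice", "cooking-show")  # host: goes straight to the queue
party.add_video("bob", "cat-video")       # attendee: suggestion only
party.approve_suggestion("alice", "cat-video")
print(party.queue)  # ['cooking-show', 'cat-video']
```

The synced playback itself would sit on top of a model like this, with the host’s play/pause/seek actions broadcast to every participant.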
This type of feature is one I’ve heard lots of people request from Netflix and Spotify. People want to enjoy content with friends or family even when they’re apart. Facebook has invested heavily in its Watch content, so it’s not surprising to see the company building out additional features. I just don’t know how many people want to sit on Facebook and watch videos with their random Groups. Facebook says it built the feature to help video streaming become more of an experience than a passive action.