Facebook offers limited detail on formula behind News Feed
		
		 [June 30, 2016] 
		By Yasmeen Abutaleb 
SAN FRANCISCO (Reuters) - Facebook Inc on Wednesday offered a rare glimpse into how it ranks and shows content in its News Feed, part of an effort to provide more transparency about its operations as the social network's cultural and political influence continues to grow.
 
 The disclosures, though lacking in detail, are notable in part because 
		they come in the wake of a May news report alleging liberal political 
		bias in a Facebook feature called Trending Topics.
 
 “News Feed is a system that’s designed and built by people, but people 
		have values and those values are reflected in how we make decisions on a 
		regular basis,” Adam Mosseri, vice president of product management for 
		News Feed, told a press briefing.
 
 Mosseri said the core principle of News Feed - the place most people see 
		postings on Facebook - is that posts from family and friends get ranked 
		first. That is followed by "informative" content, which can range from 
		news articles to recipes and is determined by what types of posts an 
		individual tends to click on.
 
 "Entertaining" content is the third priority, and is similarly based on 
		past preferences.
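
Facebook has not published the actual formula behind this ranking; what follows is a minimal illustrative sketch of the tiered prioritization Mosseri described - friends and family first, then "informative" and then "entertaining" content inferred from a user's past clicks. The class names, categories, weights and click counts are hypothetical, not Facebook's.

```python
# Illustrative sketch only: Facebook has not disclosed its ranking formula.
# The names, categories and scoring scheme below are hypothetical stand-ins
# for the tiered priority described in the article.

from dataclasses import dataclass


@dataclass
class Post:
    author: str
    category: str   # "friend_or_family", "informative", or "entertaining"
    topic: str      # e.g. "news", "recipes", "sports"


# Lower tier number = shown earlier in the feed.
TIER = {"friend_or_family": 0, "informative": 1, "entertaining": 2}


def rank_feed(posts, click_history):
    """Order posts by priority tier, then by how often the user has
    clicked on that topic in the past (a stand-in for personal relevance)."""
    def sort_key(post):
        affinity = click_history.get(post.topic, 0)
        return (TIER.get(post.category, 3), -affinity)
    return sorted(posts, key=sort_key)


if __name__ == "__main__":
    history = {"news": 12, "recipes": 3, "sports": 7}  # hypothetical click counts
    feed = rank_feed(
        [
            Post("LocalPaper", "informative", "news"),
            Post("Alice", "friend_or_family", "recipes"),
            Post("MemePage", "entertaining", "sports"),
        ],
        history,
    )
    for post in feed:
        print(post.author, post.category)
```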
 
		
		 
Although News Feed is separate from Trending Topics, it is the central feature of the Facebook experience, and any hint that it was influenced by a political agenda could be hugely damaging to the company. The heated U.S. presidential election has heightened concerns about possible attempts to influence elections.
 Facebook denied the May allegations about Trending Topics, but the 
		claims spurred a Congressional letter demanding answers. Facebook then 
		provided a first-ever explanation of how Trending Topics articles were 
		chosen and also made changes in its process.
 
 “We realize we need to be more and more proactive” in communicating how 
		News Feed operates, said Mosseri.
 
Facebook launched News Feed in 2006 as a way to help users see the content that would be most important to them from their friends, family and pages they choose to follow. It uses an algorithm that it says it constantly updates, along with human editors, to decide what content it should show users.
 
(Photo: Computer screens display the Facebook sign-in screen in this photo illustration taken in Golden, Colorado, United States, July 28, 2015. REUTERS/Rick Wilking)
Facebook stressed in a blog post Wednesday that it does not favor 
		certain sources or ideas. “We are not in the business of picking which 
		issues the world should read about.”
The company also said it is working to better identify content that users find authentic and surface it higher in the News Feed, as well as to remove more "click bait," which it said users find misleading.
 
 Responding to criticism that Facebook and other social networks create 
		an "echo chamber" in which people see only stories that reflect their 
		views, Mosseri said the team tries to help users find new pages to 
		follow that could diversify their feeds.
 
In the United States, he added, 25 percent of a person's friends who report a political affiliation report one different from that person's own.
 
 “We’re trying to figure out what people find interesting," Mosseri said. 
		"People find opposing views interesting.”
 
 (Reporting by Yasmeen Abutaleb. Editing by Jonathan Weber and Andrew 
		Hay)
 
				 
© 2016 Thomson Reuters. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.