Cognizant, perhaps, that its algorithm doesn’t give users enough control over what they see, Facebook has announced a subtle update to its much-criticized news feed: For the first time, users will be able to choose for themselves which posts and pages appear at the top of their feeds.
“We know that ultimately you’re the only one who truly knows what is most meaningful to you,” product manager Jacob Frantz said in a statement, “and that is why we want to give you more ways to control what you see.”
To try out the new feature, users on iOS (Android and desktop versions are rolling out later) can open “news feed preferences” and tap “prioritize” to see a list of friends and followed pages whose posts appear in their feed.
Selecting preferred friends puts a star above their photos. Those friends’ posts will then appear above the algorithmically ranked news feed, in their entirety.
This is a significant change from how the news feed works now. Facebook’s home stream is, by all accounts, a mysterious beast: 30 percent of American adults get their news there, according to a recent study, but most don’t understand its mechanisms, or, when it comes to controlling the feed, how little agency they actually have.
Facebook uses a slate of factors, including “whom you tend to interact with, and what kinds of content you tend to like and comment on,” to advance the posts that it thinks you’re most likely to read. But users often don’t know quite how their inputs map to Facebook’s outputs. (According to one oft-quoted paper, more than 60 percent of Facebook users don’t even realize that a system algorithmically ranks and filters the posts they see.)
All of this has made the news feed a common target of critics, who argue that it is fundamentally disempowering: While algorithms help power much of what we see online, this one makes choices for users without their conscious or considered input.
“The questions that concern me are how these algorithms work, what their effects are, who controls them, and what are the values that go into the design choices,” the sociologist Zeynep Tufekci wrote in May. “At a personal level, I’d love to have the choice to set my newsfeed algorithm to ‘please show me more content I’d likely disagree with.’ ”
This change doesn’t go quite that far, of course, nor does it give users the ability to see exactly what signals they’re sending to Facebook, or to correct misinterpretations in that data. (Just because you clicked on a high-school classmate’s baby picture once does not mean you want to see every post from that person.)
Everything you tell Facebook, from your hometown to your LGBT-ally status to the posts you want to see in your news feed, is a new data point in the constellation that Facebook uses to monetize you.
Source: Dispatch