
Top Facebook Stories: Your Choice?

Posted by David Harley on April 8, 2015.

This doesn’t have an unequivocal connection with security, except insofar as everything Facebook does has an impact on privacy. But bear with me…

You may or may not be as confused and irritated as I am by Facebook’s habit of preferring to show you what it considers to be Top Stories (or Top News) rather than the live feed of Most Recent stories. If you’re one of those rational-and-not-necessarily-friendless people who avoid Facebook altogether, you won’t care about this and I don’t blame you. But it means that Facebook prefers to show you the Top Stories it chooses rather than list all the ‘stories’ that hit your feed in ‘latest-first’ chronological order. If you don’t log in for a while, it will invariably reset your feed to show Top Stories the next time you do. And while it’s not difficult to reset it to Most Recent, you do so in the certain knowledge that Facebook will prompt you to change it back every time you log in, and eventually it will decide that enough is enough and change it for you. ‘Eventually’ being even more frequently than it resets your security settings…

While there are people who are convinced that all their (FB) friends read everything they post and throw a wobbly if something they consider important doesn’t get its share of Likes and comments, it’s sad but true that for anyone who makes a lot of use of Facebook – that is, has lots of Facebook friends, is subscribed to several groups and pages, and so forth – there will be a lot of material of comparatively little interest even at the time it’s posted, let alone hours or days later. Still, I’ve been wondering what algorithm Facebook uses to prioritize certain ‘stories’ (which seems like a strangely portentous word to describe what may be as trivial as very brief notes about what someone in the Far East or the Midwest had for lunch). And I finally got around to doing a little digging.

A Facebook Tip by Matt Hicks tells us that Top News

… uses factors such as how many friends are commenting on a post to aggregate content that you’ll find interesting. It displays stories based on their relevance, rather than in chronological order.

No-one should be surprised that Facebook keeps a close enough (albeit automated) eye on my Facebook activity to hazard a guess as to what interests me. But ‘relevance’ doesn’t really explain the sudden reappearance of a throwaway witticism a friend posted on April 1st which I didn’t Like and on which I didn’t comment, though it did attract about a dozen comments from others at the time. Or, come to that, why switching to Top Stories reactivates notifications for stories for which I’d previously turned off notifications.

The Facebook Help Centre tells me that stories are chosen on the basis of my connections and activity – no surprise there. The choice is also influenced by how many comments and likes a post gets and ‘what kind of story’ it is. The implication is, I guess, that photos and videos, for example, stand a better chance of being prioritized, which explains the sudden presence of a music video posted to a group I rarely visit.
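Facebook’s actual ranking algorithm is proprietary and no doubt far more complicated, but purely as an illustration, here’s a toy Python sketch of the kind of scoring the Help Centre’s description implies: engagement, connection strength, and the type of story, with older posts fading. Every field name, weight and formula below is invented for the example, not taken from Facebook.

```python
# Toy illustration (NOT Facebook's actual algorithm, which is proprietary)
# of relevance scoring along the lines the Help Centre hints at: posts are
# ranked by engagement, connection strength, and a per-type boost.

from dataclasses import dataclass

# Hypothetical per-type weights: photos and videos get a boost,
# plain status updates rather less.
TYPE_WEIGHTS = {"photo": 1.5, "video": 1.5, "link": 1.0, "status": 0.8}

@dataclass
class Post:
    post_type: str          # "photo", "video", "link" or "status"
    likes: int
    comments: int
    friend_affinity: float  # 0.0-1.0: how often you interact with the poster
    age_hours: float

def relevance(post: Post) -> float:
    """Score a post: engagement x affinity x type boost, decayed by age."""
    engagement = post.likes + 2 * post.comments   # comments weighted higher
    decay = 1.0 / (1.0 + post.age_hours / 24.0)   # older posts fade slowly
    return engagement * post.friend_affinity * TYPE_WEIGHTS[post.post_type] * decay

feed = [
    Post("status", likes=3, comments=1, friend_affinity=0.9, age_hours=2),
    Post("video", likes=40, comments=12, friend_affinity=0.1, age_hours=72),
]

# "Top Stories": highest relevance first, regardless of recency.
for p in sorted(feed, key=relevance, reverse=True):
    print(p.post_type, round(relevance(p), 2))
```

Even with these made-up numbers, a three-day-old video with plenty of Likes and comments can outrank a fresh status update from a close friend, which is at least consistent with that music video surfacing from a group I rarely visit.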

Then there are status updates. Status updates? Aren’t they all status updates? Well, a web search for status updates gets me hits on the funniest, silliest, even the worst status updates, and I suppose that’s the sort of content being aimed at. That, after all, is the sort of content that keeps a lot of Facebook users (presumably) locked into Facebookland so as to maximize their exposure to Facebook’s real business.

Ah, I hear you say, you mean advertising? Well, yes, up to a point. Though you may have found that there’s a lot less direct advertising cluttering up your feed nowadays. And that’s because there’s more to Facebook’s business than selling stuff to you on behalf of third parties. As Paul Levy put it a few months ago: Your life is Facebook’s business model – like it or not. And indeed, one of my very last articles on behalf of (ISC)2 made somewhat similar points: Privacy wars – personal data and the social contract.

Why, you may ask, am I so well acquainted with a service I don’t seem to like very much? To quote from a longer article in progress:

I originally opened a FB account because I felt that as a so-called security expert I ought to know more about the FB world from that point of view. However … I do value the fact that I’ve connected/reconnected this way with people I rarely get to talk to non-virtually. That’s the trade-off: as they say, if you’re not paying for the service, you’re the product. Or, rather, a depressingly high volume of your personal data is. (That applies to many of the services we loosely refer to as social media, not just Facebook, of course.)

I was never naïve enough to believe that the Internet – let alone social media – was a totally free lunch. And presumably lots of people are prepared to accept the same trade-off (if they think about it at all), even if it involves the same “inadvertent algorithmic cruelty” flagged by Eric Meyer after Facebook offered him a ‘Year in Review’ update to edit and post that featured a picture of his daughter, who had recently died.

I’m sure that Facebook didn’t intend to cause him that pain, of course, let alone invite the bad press Facebook received as a result. And I don’t mean to minimize its efforts to improve the security and privacy of its users. But I suspect that it sees PR glitches and security breaches as another trade-off: worth it for the chance to demonstrate to its paying customers the extent to which it can manipulate its non-paying subscribers, even to the point of carrying out an ethically dubious and decidedly flawed experiment in psychological manipulation and emotional contagion. The Facebook/Cornell paper states that:

Although these data provide, to our knowledge, some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network, the effect sizes from the manipulations are small (as small as d = 0.001).
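Assuming the d here is Cohen’s d, as is conventional in this literature, it measures the difference between the group means in units of the pooled standard deviation:

$$ d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}} $$

By Cohen’s own rule of thumb, d = 0.2 already counts as a ‘small’ effect, so 0.001 – a thousandth of a standard deviation – is vanishingly small.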

That might be seen as a spark of recognition that Facebook and its algorithms might not be quite as smart as they’d like us to think. But I’m not sure we should take too much comfort from that. Algorithms and manipulation make the media (and indeed the whole online world) go round, and mere accuracy is a side issue.

David Harley 
Reluctant social engineer and media manipulator



