
Facebook’s not-so-smart algorithms

Posted by David Harley on June 26, 2016.

Facebook’s suggestions for pages I might like are a constant source of amusement and bewilderment.

  • Music stuff, OK. I’m not quite at the stage yet where my life is more focused on music than on security-related writing, but it’s getting there.
  • IT stuff, OK. Some of it is even relevant to what I do.
  • Stuff related to Shropshire, OK: I don’t live there any more, but I didn’t go out of my way to advise Facebook that I moved. (I don’t go out of my way to tell Facebook anything…) And I still have an interest in the place: I lived there for much of my life, and I have family and friends there.
  • But USA Patriots for Donald Trump (or something like that)? Wha-a-a-t???? Where the heck did that come from???

Well, a few seconds’ thought provided an explanation for that. Yesterday I posted a link to Scottish reactions to The Donald’s ill-considered tweet about the Scottish vote in the EU referendum. I’m sure I’ll regret that when Trump is president, but it amused me at the time. ‘Scotland voted Remain, you weapons-grade plum’ was one of the kinder comments…

There is, of course, a serious point to this rant, and it even has a vague connection to security, or at any rate to privacy. I’ve addressed it on this blog before (here and here), but I’m going to quote the first of those articles again, just because it’s more relevant to the state of online privacy than the Trump fan club is to me.

The Facebook/Cornell paper* states that:

Although these data provide, to our knowledge, some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network, the effect sizes from the manipulations are small (as small as d = 0.001).

That might be seen as a spark of recognition that Facebook and its algorithms might not be quite as smart as they’d like us to think. But I’m not sure we should take too much comfort from that. Algorithms and manipulation make the media (and indeed the whole online world) go round. And mere accuracy is a side issue.
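For readers unfamiliar with the notation, the paper’s d is Cohen’s d, a standardised effect size: the difference between two group means divided by the pooled standard deviation. The sketch below is illustrative only, with made-up numbers rather than data from the study, but it shows how vanishingly small a difference has to be to come out around d = 0.001.

    # Illustrative only: how Cohen's d (the effect-size measure quoted above)
    # is computed. The group values are made up; they are not from the study.
    import statistics

    def cohens_d(group_a, group_b):
        """Standardised mean difference between two groups."""
        mean_diff = statistics.mean(group_a) - statistics.mean(group_b)
        # Pooled standard deviation of the two groups
        pooled_sd = (
            (statistics.stdev(group_a) ** 2 + statistics.stdev(group_b) ** 2) / 2
        ) ** 0.5
        return mean_diff / pooled_sd

    # Hypothetical groups whose means differ by a tiny fraction of the spread.
    control = [10.0, 10.2, 9.8, 10.1, 9.9]
    manipulated = [10.0002, 10.2002, 9.8002, 10.1002, 9.9002]
    print(round(cohens_d(manipulated, control), 4))  # ~0.0013, the same order of magnitude as the quoted d = 0.001

In other words, an effect that small is real only in the statistical sense; any individual user would be hard pressed to notice it.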

Unfortunately, it’s not ‘just’ social media that rely heavily on behavioural algorithms to determine how services are delivered. But some services are better than others at taking note of feedback on how well such algorithms are working out in the real world.

*A paper entitled ‘Experimental evidence of massive-scale emotional contagion through social networks’ based on ethically suspect research carried out on Facebook-supplied data.

David Harley



