Posted by David Harley on June 26, 2016.
Facebook’s suggestions for pages I might like are a constant source of amusement and bewilderment.
Well, a few seconds' thought provided a solution for that. Yesterday I posted a link to Scottish reactions to The Donald's ill-considered tweet about the Scottish vote in the EU referendum. I'm sure I'll regret that when Trump is president, but it amused me at the time. 'Scotland voted Remain, you weapons-grade plum' was one of the kinder comments…
There is, of course, a serious point to this rant, and it even has a vague connection to security, or at any rate privacy. I've already addressed it on this blog (here and here), but I'm going to quote the first of those articles again, just because it's more relevant to the state of online privacy than the Trump fan club is to me.
The Facebook/Cornell paper* states that:
Although these data provide, to our knowledge, some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network, the effect sizes from the manipulations are small (as small as d = 0.001).
That might be seen as a spark of recognition that Facebook and its algorithms are not quite as smart as they'd like us to think. But I'm not sure we should take too much comfort from that. Algorithms and manipulation make the media (and indeed the whole online world) go round. And mere accuracy is a side issue.
Unfortunately, it's not 'just' social media that rely heavily on behavioural algorithms to determine how services are delivered. But some services are better than others at taking note of feedback on how well those algorithms are working out in the real world.
*A paper entitled ‘Experimental evidence of massive-scale emotional contagion through social networks’ based on ethically suspect research carried out on Facebook-supplied data.