Algorithmic transparency is not the solution you’re looking for – algorithmic accountability is

When Angela Merkel addressed a media conference last month she lambasted internet companies like Google and Facebook for a lack of algorithmic transparency.

“I’m of the opinion that algorithms must be made more transparent, so that one can inform oneself as an interested citizen about questions like ‘what influences my behaviour on the internet and that of others?’.

“Algorithms, when they are not transparent, can lead to a distortion of our perception, they can shrink our expanse of information.”

This is terribly naive.

Opening the door to algorithmic transparency just means that our feeds and searches will instead be full of content from those organisations best equipped to game the system. A return to the bad old days of spammy drug advertisements and content farms!

The game of cat and mouse that Google and Facebook play when they tweak their algorithms to protect us from spammers would become weighted heavily in the mouse’s favour!

Yet, the obvious pitfalls of this approach haven’t stopped her from pushing for regulation and guidance. The article continues:

A cross-party working group is compiling recommendations urging more openness by internet platforms, including making details on how their algorithms collect, choose, evaluate and present information available to users.

As gamification gurus we need to head off this very damaging call for regulation that could destroy a large section of our nascent gamification industry.

In particular danger from this potential regulation are any apps you are designing with “under the bonnet” gamification features, such as those in Lithium and Quora, which hide the points system from end users.

So what should we be campaigning for instead?

There are strong cases for ranking algorithms, particularly immature ones, to remain opaque in their early days. Indeed, in the post Scoring Opacity Considerations I argue that hybrid models are usually more effective, as they balance the benefits of transparency (letting players see which behaviours to turn into good habits) against the benefits of opacity (providing automatic blocks to prevent spammy or undesired behaviour).

However, I fully support the case for users (or players, in our terminology) to have adequate insight into the ranking algorithm. This transfers some control to individuals within a system and prevents abuses of power.

That’s why I believe we should instead champion “Algorithmic Accountability”.

The principles should be:

  • explained – the algorithm should be explained to all players in general terms
  • declared – where transparency is beneficial, metrics and weightings can be fully shared; where metrics and weightings must be hidden to prevent gaming of the system, the existence of those hidden metrics should be declared by the algorithm designer. This can be done in aggregate, e.g. “Hidden metrics represent 34% of the algorithm’s points.”
  • auditable – a third party under NDA should be able to audit the algorithm (at Rise we built a role for this, called the Steward)
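To make the “declared” principle concrete, here is a minimal sketch of how an algorithm designer might publish the aggregate share of hidden metrics. The metric names and weight values are entirely hypothetical, chosen only so the example reproduces the 34% figure above:

```python
# Hypothetical scoring weights: visible weights are fully shared with
# players; hidden anti-gaming weights are withheld, but their aggregate
# share of the total is declared.
VISIBLE_WEIGHTS = {"posts": 0.40, "helpful_votes": 0.26}
HIDDEN_WEIGHTS = {"spam_signal": 0.20, "burst_penalty": 0.14}

def hidden_share(visible, hidden):
    """Fraction of the total scoring weight carried by hidden metrics."""
    total = sum(visible.values()) + sum(hidden.values())
    return sum(hidden.values()) / total

def declaration(visible, hidden):
    """The aggregate statement the algorithm designer would publish."""
    pct = round(hidden_share(visible, hidden) * 100)
    return f"Hidden metrics represent {pct}% of the algorithm's points"

print(declaration(VISIBLE_WEIGHTS, HIDDEN_WEIGHTS))
# prints: Hidden metrics represent 34% of the algorithm's points
```

The point of the design is that players never see the hidden metric names or weights, only the one aggregate number, which is enough to know how much of their score is governed by undeclared rules.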

What to do now.

  • Define your own professional ethos on algorithmic accountability on your blog
  • Write to your MEP (UK citizens can use https://www.writetothem.com)
  • Ask your industry association, e.g. GamFed, to represent your views at European level

Thanks!

FURTHER RESOURCES

I’ll provide links to good quality additional resources as I discover them:

Toby Beresford

Toby is founder and CEO of Rise, the success tracking network to track, publish and share success. He was the 2013 chair of Gamfed.com, the International Gamification Confederation, and organises the UK Gamifiers meetup. As a gamification leader, he speaks at conferences and hosts workshops. Follow him on Twitter @tobyberesford and subscribe to this blog at the Gamification Of Work blog feed.
