Our Governance Algorithms

Friday, June 12, 2015

by Nick Charney

I spent a couple of days this week attending the Marcom Forum in Ottawa. I've been on the conference's advisory board for almost as long as I've been a public servant, and it's always a nice change of pace to spend a couple of days locked in a conference room full of marketing and communications people. It gives me the opportunity to apply a totally different lens to my work and is a healthy signal check on how different vocations prioritize strategies, tactics, and ultimately, outcomes.

One of the things that stuck out for me from Fab Dolan's (Google) keynote address was the idea that the pace of technological growth is no longer linear but exponential; that single-purpose technology is now being widely diffused and applied to other areas (e.g. IBM's Watson to health care); and that consumers (and thus citizens) are increasingly interested in immersive experiences rather than static ones. In many ways this isn't a new insight, but I found being reminded of it in a different context extremely helpful.

The other thing Dolan mentioned — the thing I want to reflect on today — was the idea that ethnography generates insights too slowly for today's marketplace, whereas data allows you to test all of your hypotheses immediately and at a fraction of the cost. Dolan went on to illustrate his point by talking about machine learning in the context of video games and explaining its marketing corollary, dynamic creative.

Dynamic creative is a fancy way of saying continuous and concurrent A/B testing. Think 'nudge' not in the policy innovation sense but in the classic marketing sense of generating conversions. Essentially, dynamic creative allows marketers to perpetually test and improve their marketing algorithm in real time using the behavioural data people generate when using the web.
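To make the idea concrete, here is a minimal sketch of what "continuous and concurrent A/B testing" can look like in code. This is an illustrative epsilon-greedy bandit, not anything Dolan or Google described; the class name, variants, and parameters are all hypothetical. Each impression either explores a random creative variant or exploits the one with the best observed conversion rate, so the "algorithm" improves itself as behavioural data accumulates.

```python
import random


class DynamicCreativeTester:
    """Illustrative epsilon-greedy sketch of continuous A/B testing.

    Each call to choose() either explores a random variant (with
    probability epsilon) or exploits the variant with the highest
    observed conversion rate so far.
    """

    def __init__(self, variants, epsilon=0.1, seed=None):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.shows = {v: 0 for v in variants}        # impressions per variant
        self.conversions = {v: 0 for v in variants}  # conversions per variant

    def rate(self, variant):
        # Observed conversion rate; 0.0 until the variant has been shown.
        shown = self.shows[variant]
        return self.conversions[variant] / shown if shown else 0.0

    def choose(self):
        # Explore occasionally, otherwise exploit the current leader.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.shows))
        return max(self.shows, key=self.rate)

    def record(self, variant, converted):
        # Feed behavioural data back in, updating the model in real time.
        self.shows[variant] += 1
        if converted:
            self.conversions[variant] += 1


# Simulate traffic against two hypothetical variants with made-up
# true conversion rates; over time the tester shifts impressions
# toward the better-performing variant.
tester = DynamicCreativeTester(["banner_A", "banner_B"], epsilon=0.1, seed=42)
true_rates = {"banner_A": 0.05, "banner_B": 0.15}
traffic = random.Random(0)
for _ in range(5000):
    variant = tester.choose()
    tester.record(variant, traffic.random() < true_rates[variant])
```

After enough impressions, the tester concentrates traffic on the stronger variant without ever "finishing" the experiment — that perpetual, self-adjusting quality is what distinguishes dynamic creative from a one-off A/B test.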

But here's the rub.

Yes, our current institutional array – our governance algorithm – is under pressure because in many ways it remains linear while everything around it becomes exponential, but it's also the result of hundreds of years of the evolution of our thinking. To be perfectly honest, I'm not sure there's a natural corollary in the world of public policy.

What does dynamic governance – or cognitive government (See: Open Gov, Values, and the Social Contract) – actually look like?

Where's the data that governments can tap into, test hypotheses against, and leverage to improve their governance algorithms in real time?

And, more importantly, even if we could do such a thing, should we?

Food for thought.