Data:Value — trust, ethics, equity and the needs of society.

Image credit: Mike Rose

In a year that’s seen the introduction of GDPR, the Cambridge Analytica/Facebook scandal, and news that Streams will be joining Google, even more opportunities have opened up to discuss the role of data in our ever-changing society. The public’s engagement with data is growing as it becomes more and more mainstream.

This year’s ODI Summit, curated by Anna Scott (Head of Content) covered a broad range of topics — from trust to ethics, the values and bias we embed in data, identity to diversity — brought to life by speakers from governments, businesses, civil society and NGOs, academics, artists and poets.

Jemima has already shared her thoughts on the main themes that emerged. I’ll add more ODI Summit blog posts to the bottom of this blog post as I discover them.

So in true Jeni style, here are my three key takeaways:

Data trusts are one potential data access model. They’ve been defined by the ODI as ‘a legal structure that provides independent third-party stewardship of data’. Data trusts will be further explored by the ODI with the Government’s Office for AI (a joint unit between DCMS and BEIS). You can also read more here.

In the State of the nation opening session Sir Nigel Shadbolt suggested that data trusts may become a fundamental utility for ‘pooling’ data held by trusted organisations. However, ‘rules’ will be essential to building people’s trust — for example, guaranteeing that data assets/sources that are part of a data trust can’t be privatised or sold off in the future. Those rules need to be stated openly from the start and adhered to throughout.

Dr Nora Loideain further discussed data trusts in the Designing for fairness panel, stating that bringing data from multiple sources into one place can bring serious data privacy risks, and that data trusts must show how they operate. Data trusts should be transparent about the data assets, the processes used, the people and organisations involved — especially with who provides oversight.

“Wherever you use data, in whatever way you use data, you must always think of the ethics” — Sir Tim Berners-Lee

In the Building Trust in Data and Tech panel, Catherine Miller shared insights from Doteveryone’s* first annual Digital Attitudes report. People believe that they get a lot from technology as individuals, but that technology doesn’t give back to society.

There’s a need for clear rules and social infrastructure. Starting with how we talk about trust is important, because conversations are mainly from an individualistic view and we need to find the language that bridges the self/society gap. Much work remains to be done in getting value from data for our collective social good.

And on the topic of ‘good’, Catherine received a mid-panel applause for being bold and calling out some of the unethical and untrustworthy behaviour that happens when companies build ‘good’ products.

“Think very carefully about your definition of ‘good’. Who is it good for? Could you explain it to anyone? Could you tell your mum about this? And would you feel a bit uncomfortable and ashamed when you did so? If so, then it’s not okay” — Catherine Miller

Catherine also asked attendees to reflect on who decides what is good and good for whom. Should we give people ‘nudges’ to help them make ‘better’ data decisions? Don’t obvious nudges/manipulation cause us to distrust? And who even defines what a better decision is?

Martin Tisné reflected on the power and inequality that comes with data, and Kit Collingwood on how we protect the most vulnerable people in society — some of whom may not have internet access, or may not be able to fully understand and provide consent.

Value has never just been about the data; it’s about the people, the infrastructure, the skills that can be developed and the culture change that can be delivered. And as Kit said (using emercoleman’s incredible achievements as an example), opening up data for everyone to benefit from usually rests on one person’s tireless efforts — how do we make that scalable and sustainable?

The Data and Diversity panel was just wonderful. Zamila Bunglawala led the Race Disparity Audit and shared the benefits she’d seen in publishing the data and coming together as a ‘sector’ to solve data problems:

  • It helps to understand gaps and missed opportunities
  • It helps with representation
  • It brings together different skills and creativity
  • By consistently measuring and publishing the data, people can tell stories and show the impact — and one small first step is to expand on the systems you already have and put things in one place.

Christine Forde outlined that understanding diversity and data works at two levels:

  • Building blocks — real leadership, governance, mainstreaming and understanding what things really look like.
  • Actions — specific initiatives that actually make an organisation take note. What’s your burning platform? How can you translate data into a series of actions you can take and monitor?

Other points from the panel:

  • We should turn conversations on their head, so rather than just focus on low numbers of girls and women in STEM, talk openly about the toxic male representation we see. And rather than focus on saying we should be inclusive, we have to call out where people are being actively excluded and campaign to change things.
  • We should ‘test for everyone’ — but it’s not enough to just listen to a diverse set of users. We have to actually act on what the data is showing us, whether that’s through changing services, policies, products, etc.
  • Start with what is already there. Use and build on existing standards.

It’s not enough just to have the right data to understand diversity — initiatives must be diverse and teams must reflect the society that they serve. We need to balance formal governance with more informal communities of practice to bring more voices into the conversation. We need to be more open in sharing our thinking about the problems we’re trying to solve, to get more ideas and answers. We need to make data relevant to people other than those who actually collected it.

  • ‘Be trustworthy, don’t be evil. Once you’ve lost trust it’s gone forever. Be transparent. Sunlight is the best disinfectant. When dealing with data, be fair’ — Patrick Fagan
  • ‘Data is an expression of power’ — Martin Tisné
  • ‘How do we properly exploit data without exploiting humans?’ — Kit Collingwood
  • ‘It’s incredibly hard to know at what specific point a policy changes culture’ — Georgia Thompson
  • ‘We need to make ‘data destinations’ much more transparent’ — Nigel Shadbolt
  • ‘Ethics is not an end state. It’s an ongoing practice’ — Alix Dunn

* Rachel, Cassie and the Doteveryone team have been doing exceptional work in exploring community tech, social infrastructure and social impact. Further recommended reading is here, here and here.

Design, leadership, open culture, data, ethics, justice. These are my personal thoughts on work.
