[Book review] Technology vs. Humanity: the coming clash between man and machine

Amanda
4 min read · Nov 22, 2018

NB: I’ve been on the sofa all day wrapped up in blankets after coming down with a horrific cold, so today’s #NaBloPoMo blog post is some quick reflections/takeaways from one of the books I’ve recently read.

Gerd Leonhard describes himself as a futurist and a humanist. When we deliver Forward Leadership training to teams, we play one of his talks to spark ideas and discussion around the room, particularly to dive deep into empathy, ethics, digital transformation and automation.

I recently finished reading the book that the talk is based on, ‘Technology vs. Humanity: the coming clash between man and machine’, and wanted to share some of the points that stood out to me.

  • The future can’t be created based on blind optimism or paralysing fear.
  • Technology should serve humanity and further human flourishing.
  • Human happiness and wellbeing should be at the heart of decision-making and governance processes.
  • We must be open, yet critical; scientific, yet humanistic; adventurous and curious, yet armed with precaution; and entrepreneurial, yet collectively minded.
  • We must invest more energy in furthering humanity than we do in developing technology.
  • We lead our lives largely according to our values, beliefs and mindsets, not according to data and algorithms.

‘By far, the greatest danger of AI is that people conclude too early that they understand it’ — Eliezer Yudkowsky

  • Humanity will change more in the next 20 years than in the previous 300 years

‘You may live to see man-made horrors that are beyond your comprehension’ — Nikola Tesla

  • Dasein = the German word for human existence/reality

‘Technologies are morally neutral until we apply them’ — William Gibson

  • We will not be able to comprehend how bots have arrived at their decisions, and yet they will unceasingly run our lives. This is fundamentally wrong and unethical.
  • Technology used to simply enhance our actions and outward possibilities, but now technology changes us inside in deeply irreversible, neurological, biological, psychological and/or spiritual ways.

‘What I’m saying now is that we are as gods and that we have to get good at it’ — Stewart Brand

  • Public officials need to become ‘future stewards’ ← I like this A LOT
  • Technology is neither good nor bad, ‘it’ simply is. But as humans we must decide and agree which uses are evil and which are not. How do we agree on some kind of ethical foundation globally? How do we (could we?) get all nations to agree on defining or constraining the dark sides of tech development?
  • How should we define where ‘the magic ends’?

‘Any sufficiently advanced technology is indistinguishable from magic’ — Arthur C Clarke

  • To truly maintain an environment of human flourishing we must give deep consideration to unintended consequences; we must pay attention to the side effects and impacts outside of our own organisational mission and business model.
  • Will intelligent digital assistants lead to the increased deskilling of humans? Will we give up authentic human experiences in return for an increasing desire to be in several places at once?
  • More is better, but less is best.
  • We need to create a sustainable balance between precaution and proaction
  • We accept hedonic (as opposed to eudaemonic) happiness as good enough because it can be organised/provided instantly by technology. When do we really give ourselves the time to focus on deeper happiness, contentment and human flourishing?
  • Technology will only consider our values as data feeds explaining our behaviour. They will always be approximations, simulations and simplifications. Useful? Yes. Real? No.
  • In the future, could true free will only be available to the rich?

Gerd lists a set of questions against which we should gauge new scientific/technological breakthroughs:

  1. Does the idea violate the human rights of anyone involved?
  2. Does the idea seek to replace human relationships with machine relationships?
  3. Does this idea put efficiency over humanity?
  4. Does it replace happiness with mere consumption?
  5. Does the idea automate core human activities or interactions that should not be automated?

And he lists five new human rights for the digital age:

  1. The right to remain natural (biological) — to exist in an un-augmented state, i.e. not being forced to wear AR/VR devices or use apps as a condition of employment
  2. The right to be inefficient if and where it defines our basic humanness, i.e. the choice to be slower than tech, the right to go and see a doctor in person rather than use a quick app
  3. The right to disconnect
  4. The right to be anonymous
  5. The right to employ or involve people instead of machines.

Lots to continue thinking about! ❤
