Ground rules for discussing technological advance (part 4)

LeeAnn Felder-Heim
Published in O’Humanity
Jul 3, 2017


To recap: in this series, I am debunking a few recurring assumptions that I have found in recent readings about the potential benefits of technological advance, namely:

  1. That “a rising tide lifts all boats”
  2. That technological advance is benign because it is “natural”
  3. That the adoption of new technologies is simply an individual choice, and
  4. That ethics has no place in the hard sciences.

In part 3 we talked about the idea that the adoption of new technologies is simply an individual choice. Now let’s talk about the fourth and last assumption: that ethics has no place in the hard sciences.

In his article enumerating the dangers of transhumanism, Francis Fukuyama argues that humans possess a unifying human essence created by an “interlinked package of traits” that would be modified by transhumanist efforts. Fukuyama states that modification of our humanness would lead to unpredictable outcomes because “our good characteristics are intimately connected to our bad ones.” Fukuyama is not only warning of the unpredictable outcomes of modifying the human essence here; he is also postulating that humanness is valuable and should be preserved in all of its imperfect glory.

Francis Fukuyama

Nick Bostrom dismisses Fukuyama’s warnings of the unpredictable outcomes of transhumanism by stating: “Moral progress in the last two millennia has consisted largely in our gradually learning to overcome our tendency to make moral discriminations on such fundamentally irrelevant grounds.” Here, Bostrom evades a discussion of transhumanism’s impact on the human essence because Fukuyama’s definition of human essence lacks objectivity. But what could be more important to discussions of transhumanism than the impact it will have on humans? Shouldn’t human happiness be our goal — rather than the maintenance of objectivity and rationality?

I imagine that Bostrom would counter that transhumanism objectively improves the lives of humans by increasing life expectancy and improving human efficiency. I would respond that I value the substance of my life — not just the length of it — and that I value a great many things above my efficiency — many things that Bostrom would probably call “fundamentally irrelevant.”

In Player Piano, Kurt Vonnegut writes of a dystopian future of almost complete automation, in which the vast majority of people live unfulfilling lives. Vonnegut writes:

Without regard for the wishes of men, any machines or techniques or forms of organization that can economically replace men do replace men. Replacement is not necessarily bad, but to do it without regard for the wishes of men is lawlessness.

I wholeheartedly agree with Vonnegut’s definition of lawlessness here — the society described in Player Piano is lawless because its laws were created to protect economic efficiency rather than to protect humans.

This leads me to my fourth and final rule for discussions of technological advance: don’t dismiss other people’s ideas because they are moral judgments. Instead, engage with them by arguing about the effect those ideas will have on human happiness.
