On his book The Coevolution: The Entwined Futures of Humans and Machines
Cover Interview of May 20, 2020
In a nutshell
We are all used to the idea that humanity shapes technology. After all, we humans are the designers, right? Wait. Maybe we are being a bit arrogant here. The French philosopher Émile-Auguste Chartier, known as Alain, wrote this about fishing boats in Brittany:
Every boat is copied from another boat. ... Let’s reason as follows in the manner of Darwin. It is clear that a very badly made boat will end up at the bottom after one or two voyages and thus never be copied. ... One could then say, with complete rigor, that it is the sea herself who fashions the boats, choosing those which function and destroying the others.
In this view, boat designers are more agents of mutation than designers, and sometimes their mutations result in a “badly made boat.” Could it be that Facebook has been fashioned more by teenagers than by software engineers?
My book takes the position that digital technology coevolves with humans. Facebook changes its users and its designers, who then change Facebook. The thinking of software engineers is shaped by the tools they use, themselves earlier outcomes of software engineering. And the success of each mutation depends less on its technical excellence than on its ability to “go viral.” The techno-cultural context has more effect on the outcome than all of the deliberate decisions of the software engineers. And this context evolves.
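To make that selection dynamic concrete, here is a minimal sketch, a toy model invented for illustration rather than anything taken from the book: each design carries an arbitrary “quality” and “virality” score, copies are made in proportion to how readily a design spreads, every copy mutates a little, and the environment simply discards the designs that fail outright. All names, scores, and thresholds are stand-ins.

```python
import random

# Toy model of "the sea fashions the boats" (illustrative only, not from
# the book): each design is a (quality, virality) pair. Copies are made in
# proportion to virality -- how readily the design spreads -- each copy
# mutates slightly, and very low-quality designs "sink" and are never
# copied again. Nothing ever optimizes quality directly.

random.seed(42)

def evolve(population, generations=50, sink_threshold=0.2):
    for _ in range(generations):
        # Selection: designs spread in proportion to how much they get copied.
        weights = [virality for _, virality in population]
        parents = random.choices(population, weights=weights, k=len(population))
        # Mutation: every copy is an imperfect copy of another design.
        offspring = [(q + random.gauss(0, 0.05),
                      max(v + random.gauss(0, 0.05), 0.01))
                     for q, v in parents]
        # The environment discards designs that fail outright.
        survivors = [(q, v) for q, v in offspring if q > sink_threshold]
        population = survivors or offspring
    return population

# Start from mediocre designs with random traits.
pop = [(random.uniform(0.2, 0.8), random.uniform(0.2, 0.8)) for _ in range(200)]
final = evolve(pop)
print("mean quality:  %.2f" % (sum(q for q, _ in final) / len(final)))
print("mean virality: %.2f" % (sum(v for _, v in final) / len(final)))
```

In this sketch, quality only has to clear the bare survival bar the environment sets; what the process actually selects for is virality. That is the point about the techno-cultural context shaping the outcome more than the engineers’ deliberate decisions.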
All of this implies that we humans are less in control of the trajectory of technology than we tend to think. My book tries to help us understand this trajectory as a Darwinian coevolution. To do that, I had to take a deep dive into how evolution works, how humans are different from computers, and how technology today resembles the emergence of a new life form on our planet.
This latter idea, that digital technology is a new life form, is likely to be the most controversial in the book. Computers are made of silicon and wires, not meat and leaves. Sure, the mechanisms and the chemistry are different, but what we need to focus on is not how they are made, but rather how they work.
Life is a process, not a thing. In the words of Daniel Dennett, “It ain’t the meat, it’s the motion.” The digital processes that surround us, like living creatures, respond to stimuli from their environment. They grow. Think about how Wikipedia started on one server in 2001 and has grown to run on hundreds of servers scattered around the planet. The machines, and most especially the software, even reproduce (mostly with our help, for now). They also inherit traits from their forebears (“Every boat is copied from another boat”).
Don’t get me wrong. To consider the machines to be “living” is not to assign them rights or agency. It is simply to recognize that they have a certain autonomy and an ability to sustain their own processes. Some are capable of behaviors that we can call “intelligent,” but most are not.
Even if we view them as “living” in some sense, we have to recognize that they are not biological beings, and they differ from us in important ways. Digital machines, defined by software, can be copied perfectly and “travel” at the speed of light. No biological being can do that. Also, no AI software has a body like ours. To the extent that our own cognitive selves depend on our embodiment, the AIs will never be like us. But the machines are acquiring bodies. Consider a self-driving car. Will it ever reach the point that we must hold it accountable for its actions?