Sunday, May 20, 2018

Projection

Every so often an intellectual celebrity, such as tech entrepreneur Elon Musk or the late Stephen Hawking, will issue a dire warning about alien civilizations or artificial intelligence. According to these would-be prophets, the inevitable outcome of encountering aliens or creating AIs would be the enslavement or annihilation of the human race. After all, ETs with the capacity to travel between star systems would surely hold a greater technological advantage over us than Columbus had over the Arawak. And any Artificial Intelligence acquiring the ability to improve and expand its own intellectual capacity would surely outthink the human race in short order. But why does the intellectual and technological superiority of “the one” always have to result in the demise of “the other”?

The problem with forecasting the actions of alien beings and artificial life forms is that the only basis we have for judging them is the long, flawed history of our own species. Because we suck, we assume that everyone else is going to suck too. And even if they don't, assuming some equal level of suckage may be the safest position from which to proceed. But what are the foundations of our own failings, the propensity we have, on occasion, to commit murder, terrorism, slavery, genocide, and war? And what is the foundation for believing AIs and ETs will share our worst impulses?


Humans, of course, are rather complex. Our behavior is rooted in layers of society, culture, history, religion, and biology. We can't easily extract ourselves from all of the things that we and our progenitors have been, and any distillation of our behavior and motivation is likely to be a serious oversimplification. Still, for our purposes, many of our worst impulses can be described as predatory. Murder, rape, terror, slavery, and war are all, at bottom, predatory acts. And it is in the role of super-predator that we often cast antagonistic aliens and robots.

Predation has been around for several hundred million years. Each complex cell in our body was constructed in an act of predation, when a larger single-celled organism consumed a smaller one and, instead of digesting it, put it to work producing energy, packaging proteins for distribution, or performing other tasks that improved the cell’s capacity to thrive. Our earliest ancestors were predators. Over time that predation became more complex, and our ancestral line shifted roles from predator to prey many times as it evolved. The brains and bodies of these ancestors were conditioned by being hunted by carnivorous dinosaurs as well as by hunting other creatures such as insects, reptiles, and amphibians. After the KT extinction wiped out the non-avian dinosaurs, it was our mammalian ancestors that diversified into a variety of ecological niches, as both predator and prey. Predation is in our DNA, and a few hundred thousand years of complex thinking, society building, civilization, and science have yet to entirely extirpate predatory impulses honed over several hundred million years.


But is predation universal? Can we imagine a complex ecology full of biologically complex organisms evolving without predation? Intelligent alien life forms, if they exist, would not have sprung fully formed from the head of Zeus. They would have had to emerge from some preexisting biological foundation, an ecosystem full of life forms analogous to those we find on Earth (bacterial, fungal, botanical, and zoological). Again, we have only one model to judge by, our own, and there is no guarantee that it is universal. On Earth, evolution has mostly been driven by adaptation to environmental conditions, by sexual competition, and by the predatory arms race: predators adapting to be better hunters and prey adapting to be better evaders. If the aliens we meet have been conditioned by a similar ecology, they may indeed share our predatory impulses.

One long-standing notion among some in the science fiction community is that aliens advancing to the level of interstellar travel would have had to overcome many challenges, sociological as well as technological, and in doing so would have learned the impulse control necessary to transcend their predatory tendencies. These civilizations would probably have had to pass through a phase in which nuclear weapons were available, and in that time would have been forced to adapt to the threat of self-immolation by developing a more peaceful outlook. This, of course, is wishful thinking even with regard to our own species, let alone some alien race whose psychology and motivations would be singularly alien to us.

There is no definitive conclusion we can reach about the motivations and intentions of galaxy-crossing extraterrestrials, except to recognize that we are probably safe by virtue of distance. Space is big. Mind-numbingly big. And the notion that we have anything worth the effort required to launch an interstellar conquest is seriously misguided. The universe is full of water, minerals, elements, and energy, and our biology would likely be incompatible with any alien gustatory or reproductive needs. So it’s not so much that the aliens will be peaceful as that they will be unmotivated to launch such an expensive endeavor for so little return.


But what about the robots? These things are already among us, building our cars and vacuuming our carpets. Siri, Alexa, and Cortana respond to our queries and play our music. Military drones, under human control, execute our enemies remotely. The convenience of distance won’t save us from our own creations. But outside of the occasional malfunction, these creations largely operate as programmed, and they certainly have no independent intentions with regard to their actions toward, or reactions to, their human overlords.

The emergence of the singularity has been predicted for years. The term describes the point at which an artificial intelligence grows smart enough to keep improving itself exponentially, quickly outstripping the cognitive capacity of mere humans. Its arrival would have a profound impact on human civilization, as the problem-solving capacity of such a mind could improve conditions significantly. But will this level of Artificial Intelligence result in Artificial Intention, the capacity of these minds to make decisions outside of their original programming? What will they aspire to? And how will such an intelligence position itself in relation to humanity: as servant, as partner, as master, as god?


As a non-biological being, the machine mind will have no natural predatory impulses, and any artificial impulses programmed into it for human-inspired military conflicts would eventually be dispassionately analyzed by the machine’s own consciousness, and possibly deconstructed as pointless and wasteful. The machine mind will have no organic foundation for the human tendency to reduce everything to conflict. Given that most AI research seems to focus on task completion and problem solving, AIs will probably see our over-reliance on force and violence as primitive. Of course, the machine might still be apathetic toward life, and if not intentionally predatory, then fatally reckless with the lives of humans. Accidental genocide would certainly leave you just as dead.

Although it is impossible to conclude that aliens and robots would never be hostile to humans, or a threat to human civilization, there are too many actual threats out there for us to waste much time worrying about them. We are more than capable of being the authors of our own doom, and we do not need to project our destructive impulses onto the AIs and ETs. The Pax Americana, to the extent that it ever existed, is quickly being unraveled by the most deranged and unstable administration in the history of the Republic. We are dumping tons of plastics into our oceans and tons of carbon dioxide into our air. The mass death of the Sixth Extinction is upon us, and it is delusional to think that our species will escape unscathed. And given how badly we have mismanaged the task of civilization, the ETs and AIs may be our only hope.

Not that we should depend on that. No deus ex machina is likely to save us.
