Thursday, December 04, 2008

The singularity: The fantasy and its effect


(I am posting early this week since I will be on vacation and away from home until late next week.)

This Thanksgiving I was discussing the idea of technological progress with my father. I asked him if he had ever heard the term "singularity." He recognized the word had something to do with physics, but did not know any meaning that related to our discussion of technology. Then, he went on to describe a view of technology that seemed strikingly similar to that espoused by believers in the so-called "technological singularity," a speed-up in the rate of technological change so immense that it would constitute a third revolution in human history alongside the agricultural and industrial revolutions.

But his explanation had a twist. He thought it very likely that this technological progress would result in the destruction of human civilization and perhaps all life on the planet within a century. Alas, he didn't see any way to stop it.

The idea that the advance of technology is speeding up is not a new one. And, the idea that technological progress may actually be putting us on a path to destruction precisely because we don't know when enough is enough has a long history as well. But perhaps the most pernicious idea of the three my father mentioned is that nothing can be done to stop it.

I was struck by how deeply the idea of inevitable, unstoppable, rapid technological progress had become ingrained in the culture. If my father--who reads a lot, but is not particularly versed in things scientific or technological--could describe this idea and its possible consequences in such great detail, then it must indeed have made its way into the minds of nearly every thinking and perhaps many nonthinking persons.

The effect of this idea has been threefold. First, the vast majority of people regard technological progress as an unalloyed blessing. Of course, they are, in part, confusing the availability of cheap energy to run the technology with the technology itself. Without cheap energy much of that technology would not be available to the masses. And, we would not have been able to build the necessary infrastructure nor been able to put the necessary number of people to work to develop so many new technologies.

Second, many people are also discounting the ill effects. If someone had told you at the beginning of the 20th century that the automobile would become ubiquitous in American life, that it would lead to tens of thousands of fatalities and countless injuries each year, that it would be a major cause of urban decline, that it would make our country dangerously dependent on foreign oil imported from the most politically unstable parts of the world, and that it would be a large contributor to climate change, would you not have joined a campaign to ban it? Yet, even today most people are largely blind to or at least have little concern about these clearly deleterious effects.

Third, faith in technology turns most people into citizen-couch potatoes. Since technology will fix everything, we'll put the technologists in charge and then sit back and wait for the miracles to arrive.

The persistence and depth of this conviction results not from the available evidence, but rather from a pseudo-religious belief in the innate goodness of technological progress. Ray Kurzweil, the high priest of the singularity idea, tells us in his tome, "The Singularity is Near," that humans have become joined to machines in their cultural evolution. So far, this is not news. Human ecologist William Catton Jr. made the same point in his 1980 book, "Overshoot," where he refers to human beings as homo colossus, a man-tool hybrid of extraordinary destructive power.

But Kurzweil goes on to say that evolution creates better solutions to the problems of survival, and that technological evolution as part of overall evolution inevitably makes humans more fit for survival. This, he says, is the necessary progression of the universe. That's not exactly what the original evolutionist, Charles Darwin, thought. Changes in living organisms are due to random mutations that are just that, changes. They do not have a purpose per se. The natural world simply sorts through these experiments (including presumably any human technological inventions), keeping the ones which make animals and plants more fit and discarding the ones that don't. Since this sorting takes place over many generations and sometimes many millennia, there is no good way to tell ahead of time what will work and what won't.
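
To see what that undirected sorting looks like in practice, here is a minimal toy sketch in Python (the parameters and the whole exercise are hypothetical, purely for illustration): variation is generated blindly, and the only thing resembling "improvement" is the environment discarding whatever happens not to fit. Change the environment and the same blind process wanders somewhere else entirely.

    import random

    # Toy sketch: mutation is random and undirected; "fitness" only sorts
    # among variants after the fact. Nothing in the loop aims at progress.
    ENV_OPTIMUM = 0.7      # arbitrary, hypothetical environmental optimum
    POP_SIZE = 100
    GENERATIONS = 200

    def fitness(trait):
        # Traits closer to the current environment's optimum survive better.
        return 1.0 - abs(trait - ENV_OPTIMUM)

    population = [random.random() for _ in range(POP_SIZE)]

    for _ in range(GENERATIONS):
        # Blind variation: each offspring is a small, directionless tweak.
        offspring = [min(1.0, max(0.0, t + random.gauss(0.0, 0.05)))
                     for t in population]
        # The "sorting": keep whatever happens to fit, discard the rest.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:POP_SIZE]

    print(f"mean trait after sorting: {sum(population) / POP_SIZE:.2f}")

Nothing in the mutation step knows where "better" lies; the apparent direction comes entirely from which environment happens to be doing the discarding, and that is exactly what cannot be known ahead of time.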

So, Kurzweil's faith that our technology will make us more fit for survival in the universe is, in reality, a religious view, not a scientific one. To be fair, Kurzweil does acknowledge many potential dangers from new technologies such as genetic research, nanotechnology and robotics. But he believes we can mitigate or eliminate those dangers with proper regulation.

The main problem with this worldview is that nature is almost entirely absent from it. In this view nature is something which we seek to understand in order to manipulate it for our benefit and for the benefit of other creatures when we deem it necessary. And, nature is something we can fix when we have to. Witness the many ideas for geoengineering the climate including giant mirrors in space to reflect a portion of the sunlight that would otherwise fall on the Earth and a proposal to seed the ocean with iron to increase algae growth, algae that will ultimately die and fall to the ocean floor thereby sequestering carbon.

First, the natural world is so complex that environmental education giant David Orr believes we will never solve the knowledge problem. For everything we learn about the natural world and how to manipulate it, we create an equal and consequential void of ignorance concerning the effects of our actions. When it came to chlorofluorocarbons--a set of chemicals used in refrigerators and spray cans--we almost found out too late that they were eating a hole in the ozone layer. Given our countless industrial and technological processes, we simply cannot know all their effects on our biosphere.

Second, those effects might be so severe that they could wipe out human civilization. Bill Joy, formerly the chief scientist for Sun Microsystems, wrote a widely read article for Wired back in 2000 about just such possibilities. The article entitled "Why the Future Doesn't Need Us" details the possibilities for the dissemination of designer viruses with the power to kill selectively, self-replicating nanobots that devour the world, and robotic intelligence too great for us to understand or control. The problems may seem like something out of science fiction, but at least the designer viruses and the self-replicating nanobots are in principle possible. Robotic intelligence that mimics and outpaces human intelligence is still just a dream. And, many debate whether such a thing is even possible. But if it were to come to pass, it would have enormous consequences, not all of them salutary for the human race or the biosphere.

Finally, there is the perception that technological progress is speeding up. But is it? After one hundred years, we are still dependent on the internal combustion engine for almost all of our land and much of our sea transportation. We were promised miracle cures for genetic diseases a decade ago, but they haven't arrived. After a half a century of research, we expected fusion reactors to be in place. But the latest international project promises to bring us commercial fusion power only by the mid-21st century. In truth, it is not altogether clear that we will ever be able to master fusion energy. Our main fuels by far remain fossil fuels, 86 percent by energy content. And, these fuels are heading toward depletion faster than anyone anticipated as the world economy and population grow, and as more and more people want access to high-energy lifestyles.

In reality, technology sometimes progresses in fits and starts, and sometimes not at all. Joseph Tainter, author of "The Collapse of Complex Societies," suggests that we may have reached an era of diminishing returns for technology and for the complexity it fosters. Complexity, Tainter explains, can increase the power and reach of a civilization. But increasing complexity will also eventually have diminishing and even negative returns to a society thereby endangering its very cohesiveness. He cites Roman and Mayan civilizations as examples.

An aura of inevitability surrounds the idea of technological progress. And, that aura implies meaningful progress for human society as well. But is that aura in reality merely a paralyzing agent that prevents careful examination of technology and its claims for the future? Humans have, in fact, stopped, slowed or restricted technology on a few occasions. Whether wisely or not, the nuclear power industry was essentially stopped in its tracks after the accident at the Three Mile Island reactor in Pennsylvania in 1979. The public wanted other solutions.

We should want other solutions now, too. Technology enthusiasts claim that new, as yet uncreated technologies will keep human society overflowing with the cheap energy it needs for the energy-hungry technological wonderworld of the future. And, yet despite all our new technology, oil discoveries continue to fall. Geology is remorseless and doesn't yield to mere faith in technology. The development of alternatives is lagging far behind our need for quick replacements. The effects of climate change are visiting us sooner than even the most pessimistic estimates had presumed.

The singularitarians tell us, "Just wait! The great breathtaking exponential acceleration of technological progress is about to begin and will play out over the next few decades." The new technologies that will emerge will solve the problems of energy supply, clean water, hunger, and even climate change. And, they will also lead to much greater longevity and far better human health.

But as the world hurtles toward peak oil, catastrophic climate change, widespread water shortages and further vast destruction of the biosphere, can we afford to wait for the singularity to arrive? Or do we need to be pragmatic and start addressing these issues now as well as we can, not just with our technology, but with a plan to change the very way in which we live to make our presence more consonant with the limits of the Earth?

17 comments:

Anonymous said...

You could give a shout out to Toffler and Future Shock. Actually, his swing(*) from pessimism in Future Shock to optimism in The Third Wave presaged this Singularity Stuff.

* - I believe Toffler said it was less his shift than a shift in response from his readers. It's been a long time since I read them, but I certainly found The Third Wave to be optimistic.

Here we are, digital-info "prosumers" in Toffler's mold.

- odograph

Anonymous said...

Ha! Ha!
Lousy odograph evangelizing again on "reverse Black Swans", have hope folks, luck is just around the corner!
The trouble with singularitarians is that they have zero understanding of entropy and energy dependence, they assume that miraculous growth comes out of thin air and "intelligence".
But what does intelligence need to show up?
A mechanism to grind out "ideas," which requires collateral energy for its operation and information sources as the raw material of "thinking"; neither comes for free nor is "a given," even for purely analytical developments as in mathematics (see The Limits of Reason).
So, shove this ridiculous tech religion where it belongs.

Anonymous said...

A related and interesting take on "evolution" by the Archdruid, who, besides his nutty druidness, is quite smart.

infrarad said...

we seem to consistently overplay the physical effects of new technology in our imaginations while underplaying the social effects. I'll be interested to see if social connectivity continues to increase; a world with no fundamentally new technology but an increase in connectivity might be more different than you'd expect.

Alan said...

Well, regardless of what we may think of "the singularity," it's either going to happen or it isn't. As I understand Mr. Kurzweil and his adherents, it's not something we have any say over - sort of like the rapture.

Anonymous said...

sort of like the rapture.

Very accurate comparison, about the same plausibility, but this is old news: The Rapture for Nerds

Jon said...

The singularity is just another in a long line of stories relating to mythical lands. Utopia (which means 'noplace'), Shangri-La, El Dorado, the New Jerusalem, the promised land, the city on a hill, New York, New Amsterdam, New Orleans, New Jersey, the voyage of the Pilgrims, the march of the Mormons, the migrations of pirates, pillagers, Huns and barbarians, and any number of other communes, cults, hippy communities, splinter groups, reformations, workers' paradises, gold-paved cities, eco villages, chartered companies, wagon trains and what have you. The singularity is just another expression of people who are not content where they currently live, thinking it will be paradise just over the horizon.

In other words, it's been done.

Jon.

kanzure said...

You might be interested in this set of critiques of Kurzweil:

http://heybryan.org/fernhout/

There are four parts to these critiques, one of which is copied below.


[Fwd: Review of Ray Kurzweil's The Singularity Is Near]
From: "Paul D. Fernhout"
To: Bryan Bishop
Date: 04/29/08 09:14 am
-------- Original Message --------
Subject: Review of Ray Kurzweil's The Singularity Is Near
Date: Sun, 04 Feb 2007 15:11:47 -0500
From: Paul D. Fernhout
To: Ray Kurzweil

Ray-

The last time I wrote to you was 03/18/2001, "Comments on 'The Singularity
is Near'", so a lot has changed in the world since then. Still, I think
some parts of your argument have not adapted as much as needed along the
lines I suggested then. :-)

I just wrote this about your 2005 book, and I am sending you the first
copy. Essentially, I suggest that while you are right in presenting the
trends leading up to the singularity, ultimately your view of what should
be done as we approach it and afterwards is more a result of the mirror
effect of the singularity reflecting your own unacknowledged current
personal biases in a quasi-Republican/Libertarian direction. The most
productive response to the singularity may come from a very different
perspective -- that of a return to the gift economy ideals of most
hunter/gatherer societies, as exemplified by GNU/Linux these days.

Good luck with your next book if there is one, and maybe you'll hear from
me again then (another six years? :-)

All the best.

--Paul Fernhout

==================

It is hard to improve on the first spotlight review of this book by Robert
David Steele at Amazon:
http://www.amazon.com/Singularity-Near-Humans-Transcend-Biology/dp/0670033847
http://www.amazon.com/gp/discussionboard/discussion.html/ref=cm_rdp_st_rd/002-3072141-3270460?ie=UTF8&ASIN=0670033847&store=yourstore&cdThread=TxSKBB34MU2RN1&reviewID=R794OJ10SAPU4&displayType=ReviewDetail#wasThisHelpful
but I will try.

One of the curious things about Raymond Kurzweil is that he has no
background in evolutionary biology, yet he talks a lot about "evolution".
By itself, that is not a problem; lots of people become knowledgeable in
multiple disciplines and in cross-discipline connections. But his writing
does not seem to show an understanding of even the best of
the popular literature related to it (e.g. Stephen Jay Gould's work).
Gould is in the index in a few places, but not, as far as I can see, as a
major thread of argument anywhere -- more to make a few technical points.
Somehow I feel if Kurzweil had a greater appreciation for evolutionary
biology, the tone of his book might be quite different.

Kurzweil's perspective on life and politics presumably derives from being
a self-made captain of industry in the capitalist USA -- having made a
fortune producing sophisticated computer equipment and related software no
doubt through a lot of cleverness and hard work. In the USA, historically
that position in society generally implies adopting a
Republican/Libertarian militaristic and market-driven perspective, if for
no other reason than to get along easily with peers and to do well in the
marketplace. But also, for the few percent who succeed at the American
dream (unlike the masses of failures) he can look back at his experience,
and perhaps ignoring luck or help from others, claim his success was due
to his own choices, and if others just made similar good choices, they too
would be successful. It is like a millionaire lottery winner exhorting
everyone to play the lottery. As an expert in statistics though, Kurzweil
should, if self-reflective, be able to see some statistical problems with
this viewpoint. Who would be the workers to be bossed around if everyone
were as successful as he is? And how would his products command a price
premium if everyone were making such things? Clearly such a society of
universal success would need to be fundamentally different than the one
which produced his own personal success.

Also, Kurzweil made his money in control of patents and copyrights and
must presumably strongly believe in the value of their role in controlling
resources to create artificial scarcity to justify his own financial
success. Thus, for example, he laments the problems of the commercial
music industry in the USA in enforcing scarcity of the product of
musicians they control under contract, while he ignores the rise of
uncontracted individuals more easily producing their own garage band music
and the blossoming of a world of personal and private media production.

One would expect anyone's personal experience to color his or her
projections for the future and what the best public policy would be in
relation to those projections. That is a given. But it is the failure to
acknowledge this that the most harm can be done.

To grossly simplify a complex subject, the elite political and economic
culture Kurzweil finds himself in as a success in the USA now centers
around maintaining an empire through military preparedness and preventive
first strikes, coupled with a strong police state to protect accumulated
wealth of the financially obese. This culture supports market-driven
approaches to the innovations needed to sustain this
militarily-driven police-state-trending economy, where entrepreneurs are
kept on very short leashes, where consumers are dumbed down via compulsory
schooling, and where dissent is easily managed by removing profitable
employment opportunities from dissenters, leading to self-censorship.
Kurzweil is a person now embedded in the culture of the economic upper
crust of the USA's military and economic leadership. So, one might
expect Kurzweil to write from that perspective, and he does. His solutions
to problems the singularity pose reflect all these trends -- from
promoting first strike use of nanobots, to design and implementation
facilitated through greed, to widespread one-way surveillance of the
populace by a controlling elite.

But the biggest problem with the book _The Singularity Is Near: When
Humans Transcend Biology_ is that Kurzweil seems unaware that he is doing so.
He takes all those things as given, like a fish ignoring water, ignoring
the substance of authors like Zinn, Chomsky, Domhoff, Gatto, Holt, and so
on. And that shows a lack of self-reflection on the part of the book's
author. And it is a lack of self-reflection which seems dangerously
reckless for a person of Kurzweil's power (financially, politically,
intellectually, and persuasively). Of course, the same thing could be said
of many other leaders in the USA, so that he is not alone there. But one
expects more from someone like Ray Kurzweil for some reason, given his
incredible intelligence. With great power comes great responsibility, and
one of those responsibilities is to be reasonably self-aware of one's own
history and biases and limitations. He has not yet joined the small but
growing camp of the elite who realize that accompanying the phase change
the information age is bringing on must be a phase change in power
relationships, if anyone is to survive and prosper. And ultimately, that
means not a move to new ways of being human, but instead a return to old
ways of being human, as I shall illustrate below drawing on work by
Marshall Sahlins.

The first part of the book on trends is largely accurate and in agreement
with other writers on the subject of the singularity. But it has two
glaring exceptions which are commonly overlooked by other authors writing
in the transhumanist literature -- ignoring the likely happiness of
previous generations and ignoring complexities related to life expectancy
calculations.

Kurzweil's writing seems uninformed by the writings by Marshall Sahlins on
"The Original Affluent Society"
http://en.wikipedia.org/wiki/Original_affluent_society
http://www.eco-action.org/dt/affluent.html
It is suggested by Sahlins that a lot of hunter/gatherer societies involved
little "work" as we might term it, and a lot of real "happiness" as people
engaged in interesting activities and socializing. People in agrarian
societies and industrial societies generally were sicker and unhappier
than hunter/gatherers, at least in the early to middle part of those
societies. Later social forces emerge allowing the populace to control the
excesses of the elites (e.g. feudal paternalism, OSHA). Sahlins is in the
index, but he is just quoted by another author being quoted on an
unrelated topic.

Also Kurzweil ignores the complicating factor of infant mortality in his
calculations of life expectancy. If people in an older society lived past
age five, they might live quite long, to 70 or so -- but if 50% of young
children perished, this would produce a low average life expectancy. In
general, the evolutionary "grandmother hypothesis" suggests that selective
pressure on humans for survival to the age of grandparent or
great-grandparent will help increase the rate of survival of grandchildren
and great-grandchildren, by being able to pass on accumulated knowledge to
offspring at the parenting stage. And it also suggests why death through a sudden
heart attack at an advanced age before mental faculties start to fail
might also be an adaptive response. It is quite possible most adults of
hunter/gatherer societies lived in good health into their 60s, 70s, 80s,
and beyond -- and may have had a surprising youthfulness and vitality
compared to present day human specimens. This may have made the work of
anthropologists difficult in estimating the life expectancy of our
ancestors. Also, infant mortality of hunters and gatherers is in part due to
the rise of cities as breeding grounds for germs, so high infant mortality
of, say, native Americans in contact with western invaders with near 95%
mortality to each of several waves of disease (smallpox, measles, etc.)
does not necessarily reflect historic trends for that population going
back thousands of years. Yet, rather than fault industrialization for
creating disease, Kurzweil celebrates it while ignoring the
internal selection for resistance to those diseases which went on for
generations at great personal cost. (See for example the book, _Guns,
Germs and Steel_).

One of the biggest problems as a result is Kurzweil's view of human
history as incremental and continual "progress". He ignores how our
society has gone through several phase changes in response to continuing
human evolution and increasing population densities: the development of
fire and language and tool-building, the rise of militaristic agricultural
bureaucracies, the rise of industrial empires, and now the rise of the
information age. Each has posed unique difficulties, and the immediate
result of the rise of militaristic agricultural bureaucracies or
industrialism was most definitely a regression in standard of living for
many humans at the time. For example, studies of human skeleton size,
which reflect nutrition and health, show that early agriculturists were
shorter than preceding hunter gatherers and showed more evidence of disease
and malnutrition. This is a historical experience glossed over by
Kurzweil's broad exponential trend charts related to longevity, which jump
from Cro-Magnon times to the industrial era. Yes, the early industrial times of
Dickens in the 1800s were awful, but that does not mean the preceding
times were even worse -- they might well have been better in many ways.
This is a serious logical error in Kurzweil's premises leading to logical
problems in his subsequent analysis. It is not surprising he makes this
mistake, as the elite in the USA he is part of finds this fact convenient
to ignore, as it would threaten the whole set of justifications related to
"progress" woven around itself to justify a certain unequal distribution
of wealth. It is part of the convenient ignorance of the implications
that, say, the Enclosure acts in England drove the people from the land
and farms that sustained them, forcing them into deadly factory work
against their will -- an example of industrialization creating the very
poverty Kurzweil claims it will alleviate.

As Marshall Sahlins shows, for most of history, humans lived in a gift
economy based on abundance. And within that economy, families or tribes
were mainly self-reliant for most food and goods, drawing from an
abundant nature they had mostly tamed. Naturally there were many tribes
with different policies, so it is hard to completely generalize on this
topic -- but certainly some did show these basic common traits of that
lifestyle. Only in the last few thousand years did agriculture and
bureaucracy (e.g. centered in Ancient Egypt, China, and Rome) come to
dominate human affairs -- but even then it was a dominance from afar and a
regulation of a small part of life and time. It is only in the last few
hundred years that the paradigm has shifted to specialization and an
economy based on scarcity. Even most farms 200 years ago (which was where
95% of the population lived then) were self-reliant for most of their
items judged by mass or calories. But clearly humans have been adapted,
for most of their recent evolution, to a life of abundance and gift giving.

When you combine these factors, one can see that Kurzweil is right for
most recent historical trends, with this glaring exception, but then shows
an incomplete and misleading analysis of current events and future trends,
because his historical analysis is incomplete and biased.

Further, Kurzweil repeatedly talks about evolution, but seems to have at
best a view of evolution informed by the worst of popular sources. No
professional evolutionary biologist would say something implying evolution
is the same as progress, for example. Or that evolution is always about
increasing complexity. Consider something like a host/parasite interaction
across multiple generations in simulation. You may actually see cycles,
where resistance to a parasite is developed, the parasite overcomes that
as it too evolves, then the host population evolves in new directions to
create new resistances while losing the original resistance, which the
parasite may lose its ability to overcome, only to see the entire cycle
repeated when the host reinvents the defense and the parasite reinvents
the way around it. It takes energy and mass to keep a memory of past
innovations encoded in RNA or DNA or other means, and evolutionarily that
is often just excess baggage slowing down reproduction. There may be some
limited controversy on this topic if you consider a bacterial pool of
genetic information capable of storing an immense amount of genetic
information in a distributed fashion and so able to save novel code for
various enzyme pathways and so perhaps ratchet up a store of useful
genetic information, but Kurzweil admits to no notion of controversy on
evolution being linked to progress.

A study of the fossil record would show the repeated loss of large numbers
of species with various clever adaptations, and how often a single general
species (like the stickleback) will radiate into a variety of specialized
species which seem to eclipse the generalist, but then with some sudden
shock to the environment, the numerically vastly superior specialized
variants are all wiped out, leaving sometimes the generalist ancestor, and
sometimes nothing. (See the work of evolutionary biologist Axel Meyer.)

I think if Kurzweil studied more evolutionary biology from the
professional literature, he would not have a rosy view of things like,
say, uploading your brain in a digital world. It is, frankly, naive to
think that an uploaded brain derived from duplicating a clunky chemical
architecture would compete with the populations of digital organisms which
might evolve native to a digital context. In short, those uploaded brains
are going to be eaten alive by digital piranha that overwrite their
computer memory and take over their runtime processor cycles. It has taken
evolution billions of years to lead up to the mammalian immune system, yet
Kurzweil seems to think an effective digital immune system or nanobot
immune system can be developed in a few years. More likely the result will
be ages of chaos and suffering until co-evolutionary trends emerge. But
that would be in line with the other phase changes and their effect on
most human lives when militaristic agricultural bureaucracies emerged, or
when industrial empire building emerged. These evolutionary factors exist
even for the current elite if they uploaded themselves. So, the only
alternative may be to avoid building such a competitive landscape into the
digital world as much as possible -- and likely that will involve
reducing the competitiveness of those building the digital world, driven as
they are by short-term greed. It is almost as if either we all go together into
the digital world in a reasonable level of peace and prosperity or no one
goes for long. And it is time we need in a digital world to adapt to it --
perhaps even a single second gained from a peaceful digital world
might be all it takes to ensure humanity's survival of the singularity.
And perhaps that one second of peaceful runtime needs to be bought
now with a lot of hard work making the world a better place for more people.

So, this would suggest more caution approaching a singularity. And it
would suggest the ultimate folly of maintaining R&D systems motivated by
short-term greed to develop the technology leading up to it. But it is
exactly such a policy of creating copyrights and patents via greed
(the so-called "free market" where, paradoxically, nothing is free) that
Kurzweil exhorts us to expand. And it is likely here where his own success
most betrays him -- where the tragedy of the approach to the singularity
he promotes will result from his being blinded by his very great previous
economic success. If anything, the research leading up to the singularity
should be done out of love and joy and humor and compassion -- with as
little greed about it as possible, IMHO. But somehow Kurzweil suggests the
same processes that brought us the Enron collapse and war profiteering
through the destruction of the cradle of civilization in Iraq are the
ones to bring humanity safely through the singularity. One pundit, I
forget who, suggested the problem with the US cinema and TV was that there
were not enough tragedies produced for it -- not enough cautionary tales
to help us avert such tragic disasters from our own limitations and pride.

Kurzweil's rebuttals to critics in the last part of the book primarily
focus on those who do not believe AI can work, or those who doubt the
singularity, or the potential of nanotechnology or other technologies. One
may well disagree with Kurzweil on the specific details of the development
of those trends, but many people beside him, including before him, have
talked about the singularity and said similar things. Of the fact of an
approaching singularity there seems to be little doubt, even as
one can quibble about dates or such. But the very nature of a singularity
is that you can't peer inside it, although Kurzweil attempts to do so
anyway, without enough caveats or self-reflection. So, what Ray
Kurzweil sees in the mirror of a reflective singularity is ultimately a
reflection of -- Ray Kurzweil and his current political beliefs.

The important thing is to remember that Kurzweil's book is a
quasi-Libertarian/Conservative view on the singularity. He mostly ignores
the human aspects of joy, generosity, compassion, dancing, caring, and so
on to focus on a narrow view of logical intelligence. His antidote to fear
is not joy or humor -- it is more fear. He has no qualms about enslaving
robots or AIs in the short term. He has no qualms about accelerating an
arms race into cyberspace. He seems to have a significant fear of death
(focusing a lot on immortality). The real criticisms Kurzweil needs to
address are not the straw men which he attacks (many of which are being
produced by people with the same capitalist / militarist assumptions he
has). It is the criticisms that come from those thinking about economies
not revolving around scarcity, or those who reflect on the deeper aspects
of human nature beyond greed and fear and logic, which Kurzweil needs to
address. Perhaps he even needs to address them as part of his own continued
growth as an individual. To do so, he needs to intellectually,
politically, and emotionally move beyond the roots that produced the very
economic and political success which let his book become so popular. That
is the hardest thing for any commercially successful artist or innovator
to do. It is often a painful process full of risk.

Ultimately, Kurzweil's book just exhorts us to do more of the same in the
USA that we have been doing for decades (centralization of decision making,
lack of regulation, market-driven innovation) and things will come out as
best they can. He ignores, for example, the rise of the multinational
corporation over the past 100 years as an amoral entity with human rights
but not human responsibilities -- the results of which show how much
damage such megascale artificially intelligent entities can produce if
created and then left to operate unchecked. No wonder the Wall Street
Journal, the New York Times, and the mainstream press give his book such
glowing reviews. That celebration of current US elite cultural ideology is
a denial both of history and of current-day trends, such as the
usually unacknowledged fact in the USA that populations in most other
industrialized countries are healthier and happier than in the USA.
http://www.ppionline.org/ppi_ci.cfm?knlgAreaID=108&subsecID=900003&contentID=253543
http://news.bbc.co.uk/1/hi/world/africa/3157570.stm
http://www.yesmagazine.org/article.asp?ID=1503

I do not intend to vilify Kurzweil here. I think he means well. And he is
courageous to talk about the singularity and think about ways to approach
it to support the public good. His early work on music equipment and tools
for the blind is laudable. So was his early involvement with Unitarians
and social justice. But somewhere along the line perhaps his perspective
has become shackled by his own economic success. To paraphrase a famous
quote, perhaps it is "easier for a camel to go through the eye of a needle
than a rich man to comprehend the singularity." :-) I wish him the best in
wrestling with this issue in his next book.

--Paul Fernhout
(Princeton '85, so "the pot calling the kettle black". A decade or so ago
I might have written of similar solutions to the ones you propose, before
much soul searching and exploratory reading in a variety of fields. As the
character Elwood P. Dowd says in the movie "Harvey", "My mother used to
say to me, 'Elwood' -- she always called me Elwood -- 'Elwood, in this
world you must be oh-so clever, or oh-so pleasant.' For years I was
clever. I'd recommend pleasant -- and you may quote me." Perhaps good
advice to live by when approaching a singularity driven in part by
unlimited cleverness. :-)

John Michael Greer said...

A crisp and thoughtful post. Kurt, are you familiar with Karl Popper's critique of "historicism" -- the belief, central to Marxism and most other secular religions, that history is inevitably headed in some particular direction, which just happens to be the one that the believers think they want? Kurzweil et al. are simply the latest example of the same thing, and the Singularity is as, ahem, inevitable as the glorious proletarian revolution.

Anonymous said...

'Technology' is a social construct. We should always ask, which technology, in whose interests? For example, I grew up visiting the Centre for Alternative Technology in Wales. It was (and remains) a mini-utopia of heat pumps, windmills and passive solar design. It's 'high tech', but not the kind of high tech that Kurzweil and associates necessarily regard as the future. From the perspective of grid-group cultural theory, developed by anthropologist Mary Douglas, there are always four possible futures: Individualist, Egalitarian, Hierarchist and Fatalist. Most of the Singularity proponents tend to be somewhat Individualist. In contrast, the alternative/renewable solar future crowd tend to be somewhat Egalitarian. I guess there might be four possible visions of the Singularity, of which the Individualist, Promethean version is only one.

Iconoclast421 said...

"Singularity" is not an odd footnote in the story of human civilization. It is the primary purpose, the reason why we are here. We are being guided towards it with unerring precision.

Instead of singularity, I prefer to call it "Interface". We form a collective consciousness, and then it interfaces with the "God Consciousness".

The occult ruling class seeks to control all aspects of Interface. They believe they can harness the power of this interface for some nefarious purpose. Possible manifestations of this are illustrated in pop culture, in films such as the Matrix.

Anonymous said...

Iconoclast421, may I have some of the stuff you smoke?

Anonymous said...

Pinning our hopes on technofixes and a transformative "singularity" gives us an excuse to dismiss the practical necessity to proactively impose limits on ourselves BEFORE reality (i.e. scarcities and strife) whacks us aside the head.

Isn't it interesting that Libertarianism is popular enough to merit a formal name, while the champions of a philosophy we might call "Responsibilitarianism" number so few that most people would consider the name itself absurd?

We are a nation of spoiled children, so accustomed to getting bigger and more extravagant toys every Christmas that we feel ENTITLED to a "better" Christmas every year...forever. The most popular "freedoms" we cherish are in fact variations on our fervent belief that we have a right to consume and emit without limits. (For example, most Americans see freedom of movement as the right to drive whenever, wherever, and as much as we want, in a vehicle as large as we can obtain on credit - and then park for free!)


Hans Noeldner

"Civilization is the presence of enlightened self-restraint"

Rice Farmer said...

One thing that came to mind as I read this fine essay (and I am certainly not the first to make this observation) was that humanity isn't mature enough to use technology responsibly, certainly not the planetary-scale technologies that are supposed to realize our Glorious Technological Future. The atom was never limited to "peaceful" use, and never will be. Same for just about everything else you can think of. What's more, nearly every day we read about how new technologies are being hijacked for military use. "Regulation" never has, and never will, stop power-mad people from abusing technology. I just don't envision a good ending here.

michael said...

Great thread, and great comments.

A more concise take on the issue in cartoon format here:

http://xkcd.com/251/

Iconoclast421 said...

If you think I'm smoking something, then perhaps you can offer a rational explanation of how exactly the flagellum motor came into existence.

Explain how the individual parts evolved randomly when the motor requires all parts to be present in order to function. Explain how evolution of such a device can occur when natural selection is not possible because the design either works or it doesn't. (You can't naturally select a binary trait.)

When you fail to find such an explanation, you are forced to conclude that the design was specifically placed into our evolutionary equation. Certainly placed by an advanced civilization with god-like properties. But certainly not from "god" himself/herself/itself. After all, what would god want with a motor? (To quote Captain Kirk, "What does god need with a starship?")

Obviously, whoever designed the flagellum motor was not God, and was not human. Now, stick that in your pipe and smoke it.

Anonymous said...

Who was it that said the Stone Age did not end because we ran out of stones? We will not need to run out of fossil fuels to bring about a new age in energy.