Friday, January 05, 2007

the evolutionary ability of humankind to do the right things, even though it sometimes happens only after all possible mistakes are exhausted.

reposted from Edge.org. Chris Street highlights/edits in bold.

HAIM HARARI
Physicist, former President, Weizmann Institute of Science

The Evolutionary Ability of Humankind To Do the Right Things

I am optimistic about the evolutionary ability of humankind to do the right things, even though it sometimes happens only after all possible mistakes are exhausted.

I am optimistic about technology and world leaders (in that order) discovering ways to combine energy savings and alternative sources of energy (in that order), so that our planet is saved, while we still have a reasonable standard of living.

I am optimistic about the irreversible trend of increasing the economic value of knowledge and decreasing the relative economic importance of raw materials, reducing the power of ruthless primitive dictators and increasing the rewards for education and talent.

I am optimistic about the emerging ability of the life sciences to use mathematics, computer science, physics, and engineering in order to understand biological mechanisms, detect and prevent medical problems and cure deadly diseases.

I am optimistic that more scientists will understand that public awareness and public understanding of science and technology are the only weapons against ignorance, faith healers, religious fanaticism, fortune tellers, superstitions and astrology, and that serious programs will emerge in order to enhance the contribution of the scientific community to this effort.

I am optimistic that, in the same way that Europe understood during the last fifty years that education for all and settling disputes peacefully are good things, while killing people just because of their nationality or religion is bad, so will the Muslim world during the new century.

I am optimistic that we will soon understand that wise medical and genetic ethics mean that we should not absolutely forbid any technology and we should not absolutely allow any technology, but find ways to extract the good and eliminate the bad, from every new scientific development.

I am optimistic about the fact that an important fraction of the nations on this planet succeeded in refuting the extrapolations concerning a population explosion and I hope that the remaining nations will do likewise, for their own advancement and survival.

I am optimistic about the power of education to alleviate poverty and advance health and peace in the third world, and I am hopeful that the affluent world will understand that its own survival on this planet depends on helping the rest of humanity advance its education.

I am not at all optimistic that any of the above will happen soon. All possible mistakes and wrong turns will probably be attempted. The weakest link is our chronic short-sightedness, which is bad in the case of the general public and is much worse among its elected political leaders, who think in terms of months and years, not decades and certainly not centuries.

A film is worth a thousand words

reposted from Edge.org. Chris Street highlights/edits in bold.

JEAN PIGOZZI

Collector, Contemporary African Art; High-Tech Ecological Researcher & Director, Liquid Jungle Lab, Panama

Breaking Down the Barriers Between Artists and the Public

For me, the most interesting development in the art world is what Charles Saatchi is doing with The Saatchi Gallery (www.saatchi-gallery.co.uk), his new online gallery, which opens in Summer 2007. It is an immensely powerful tool, as it is open on both sides, to the artists and to the public, without the interference of curators, editors, dealers, critics, etc. This is exactly the way Contemporary Art should be presented. The artists can show whatever they want, and the public can see whatever they choose to look at; the barrier of the museum, the gallery, or the art magazine no longer stands between the artists and the public. I find this immensely refreshing and interesting.

The recent creation of YouTube is another very interesting and important development, one that provides Internet users with a cheap and easy way to make and post short videos. I think we have only begun to scratch the surface of this huge iceberg. It means that anyone who sees a policeman beating someone up, or someone kicking their dog, or Paris Hilton kissing a young man in a car, or someone being mistreated in a hospital, etc. can post it on YouTube and have the entire world see it in less than ten minutes. People can write great editorials and post great blogs, but the power of a short film is a thousand times stronger than any well-written anything anywhere. I am excited and also terrified by this new opportunity.

God may come at the end

reposted from Edge.org. Chris Street highlights/edits in bold.

Interesting stuff! I wish I knew what he was talking about.

MARTIN E.P. SELIGMAN

Psychologist, University of Pennsylvania; Author, Authentic Happiness


The First Coming

I am optimistic that God may come at the end.

I've never been able to choke down the idea of a supernatural God who stands outside of time, a God who designs and creates the Universe. There is, however, an alternate notion of God relevant to the secular community, the skeptical, evidence-minded community that believes only in nature.

Isaac Asimov wrote a short story in the 1950s called "The Last Question." The story opens in 2061 with the Earth cooling down. Scientists ask the giant computer, "Can entropy be reversed?" and the computer answers, "Not enough data for a meaningful answer." In the next scene, Earth's inhabitants have fled the white dwarf that used to be our sun for younger stars, and as the galaxy continues to cool, they ask the miniaturized supercomputer, which contains all of human knowledge, "Can entropy be reversed?" It answers, "Not enough data." This continues through more scenes, with the computer ever more powerful and the cosmos ever colder. The answer, however, remains the same. Ultimately trillions of years pass, and all life and warmth in the Universe have fled. All knowledge is compacted into a wisp of matter in the near-absolute zero of hyperspace. The wisp asks itself, "Can entropy be reversed?"

"Let there be light," it responds. And there was light.

There is a theory of God embedded in this story that is based not on faith and revelation, but on hope and evidence. God in the Judeo-Christian theory has four properties: omnipotence, omniscience, goodness, and the creation of the universe. I think we need to give up the last property, a supernatural creator at the beginning of time. This is the most troublesome property in the Judeo-Christian theory: it runs afoul of evil in the universe. If God is the designer, and also good, omniscient, and omnipotent, how come the world is so full of innocent children dying, of terrorism, and of sadism? The creator property also contradicts human free will. How can God have created a species endowed with free will, if God is also omnipotent and omniscient? And who created the creator anyway?

There are crafty, involuted theological answers to each of these conundrums. The problem of evil is allegedly solved by holding that God's plan is inscrutable: 'What looks evil to us isn't evil in God's inscrutable plan.' The problem of reconciling human free will with the four properties of God is a very tough nut. Calvin and Luther gave up human will to save God's omnipotence. In contrast to this Reformation theory, modern "process" theology holds that God started things off with an eternal thrust toward increasing complexity (so far, so good). But mounting complexity entails free will and self-consciousness, and so human free will is a strong limitation on God's power. This theory of God gives up omnipotence and omniscience to allow human beings to enjoy free will. To circumvent 'who created the creator,' process theology gives up creation itself by claiming that the process of becoming more complex just goes on forever: there was no beginning and will be no end. So the process theology God allows free will, but at the expense of omnipotence, omniscience, and creation.

There is a different way out of these conundrums: it acknowledges that the creator property is so contradictory to the other three properties as to mandate jettisoning the property of Creator. Importantly, this very property is what makes God so hard to swallow for the scientifically minded person. The Creator is supernatural, an intelligent and designing being who exists before time and who is not subject to natural laws; a complex entity that occurs before the simple entities, thereby violating almost every scientific process we know about. Let the mystery of creation be consigned to the branch of physics called cosmology. 'Good riddance.'

This leaves us with the idea of a God who had nothing whatever to do with creation, but who is omnipotent, omniscient, and righteous. Does this God exist?

Such a God cannot exist now, because we would be stuck once again with two of the same conundrums: how can there be evil in the world now if an existing God is omnipotent and righteous? And how can humans have free will if an existing God is omnipotent and omniscient? So there was no such God, and there is no such God now.

Consider now the principle of Nonzero that Robert Wright (2000) articulates in his book of the same name. Wright argues that the invisible hand of biological and cultural evolution ineluctably selects for the complex over the simple, because positive-sum games have the survival and reproductive edge over zero-sum games, and that over epochal time more and more complex systems, bulkily but necessarily, arise. Space does not allow me to expand on Wright's thesis, and I must refer the justifiably unconvinced reader to his very substantial arguments.
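To make Wright's contrast concrete, here is a minimal sketch in Python; it is my illustration rather than Wright's model, and all payoffs are hypothetical round numbers. It shows the one piece of arithmetic the argument rests on: repeated positive-sum interactions leave the pair with a surplus, while zero-sum interactions cannot.

```python
# Toy comparison of zero-sum and positive-sum interactions.
# All payoff numbers are hypothetical; this illustrates the idea, not Wright's model.

def total_payoff(interactions):
    """Sum both parties' payoffs over a series of pairwise interactions."""
    return sum(a + b for a, b in interactions)

# Zero-sum: one party's gain is exactly the other's loss (e.g. raiding).
raids = [(+5, -5), (-3, +3), (+2, -2)]

# Positive-sum: both parties come out ahead (e.g. trade, division of labour).
trades = [(+3, +2), (+4, +4), (+1, +5)]

print(total_payoff(raids))   # 0  -> no surplus for the pair as a whole
print(total_payoff(trades))  # 19 -> a surplus that can support more complexity
```

The surplus in the second case is the raw material on which, in Wright's account, increasing complexity can feed.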

A process that selects for more complexity is ultimately aimed at nothing less than omniscience, omnipotence, and goodness. Omniscience is, arguably, the literally ultimate end product of science. Omnipotence is, arguably, the literally ultimate end product of technology. Righteousness is, arguably, the literally ultimate end product of positive institutions. So, in the very longest run, the principle of Nonzero heads toward a God who is not supernatural, but who ultimately acquires omnipotence, omniscience and goodness through the natural progress of Nonzero. Perhaps, just perhaps, God comes at the end.

So I am optimistic that there may be, in the fullness of time, a First Coming. I am optimistic that this is the door through which meaning may enter our lives. A meaningful life is a life that joins with something larger than the self, and the larger that something is, the more meaning. I am optimistic that as individuals we can choose to be a tiny part of this process. Partaking of a process that has as its ultimate end the bringing about of a God who is endowed with omniscience, omnipotence, and goodness joins our tiny, accidental lives to something enormously larger.

the madness of Scientific Method

reposted from Edge.org. Chris Street highlights/edits in bold.

PIET HUT

Professor of Astrophysics, Institute for Advanced Study, Princeton

The Real Purity of Pure Science

I grew up reading heroic stories about progress in science, the absolute superiority of the scientific method, the evil of superstition, and other one-dimensional optimistic views.

Almost half a century later, I have a much more nuanced view of progress, method, and ways of looking at the world. What has been presented as the scientific method, at any given time, has been a simplified snapshot of an intrinsically much more opportunistic enterprise. As such, much damage has been done by suggesting that other areas, from social science and economics to politics, should adopt such a simple and always outdated picture.

The strength of science is not at all its currently accepted method. The strength is the fact that scientists allow the method to change.

The method changes in exactly the same way that progress is made by applying the method in everyday research. Change of method takes place slowly and carefully, through long and detailed peer discussions, and may be almost imperceptible in any given field during the lifetime of a scientist. The scientific method is like spacetime in general relativity: it provides the stage for matter to play on, but the play of matter in turn affects the stage.

The real basis for the success of science is its unique combination of progressive and conservative elements. A scientist gets brownie points for crazy new ideas, as long as they are really interesting and stimulating, and also for being extremely conservative in criticizing any and all new ideas, as long as the criticism can be shown to be valid. What is interesting in new ideas and what is valid in criticism thereof is determined solely by peer review, by the collective opinions of the body of living scientists, not by falling back on some kind of fixed notion of a method.

My optimism is that other areas of human activities can learn from science to combine conservative and progressive approaches, taming the usual black-white duality in a collaborative dance of opposites.

Pure science has been held up as a beacon of hope, as a way to allow scientists to pursue their own intuitions, and thus to find totally new solutions to old problems. This is seen in contrast to applied science, where short-term goals do not allow sufficient room for finding really new approaches. Indeed, the irony here is that the best applications of sciences are ultimately based on pure, rather than applied research.

The moral of the story has been to say that long-term research should not focus on goals, but rather it should let the scientific method follow its own course. Purified from goals, the scientific method is held up as the beacon to follow. But I think this story is still misleading. The greatest breakthroughs have come from a doubly pure science, purified from goals and methods alike. In small and large ways, each major breakthrough was exactly a breakthrough because it literally broke the rules, the rules of the scientific method as it had been understood so far. The most spectacular example has been quantum mechanics, which changed dramatically even the notion of experimental verification.

I am optimistic that all areas of human activities can be inspired by the example of science, which has continued to thrive for more than four centuries, without relying on goals, and without even relying on methods. The key ingredients are hyper-critical but non-dogmatic conservatism, combined with wildly unconventional but well-motivated progressiveness. Insofar as there is any meta-method, it is to allow those ingredients to be played off against each other in the enactments of scientific controversies, until consensus is reached.

A Proper Scientific Understanding of Irrationality In General, and of Religion In Particular

reposted from Edge.org. Chris Street highlights/edits in bold.

ANDREW BROWN

Journalist, The Guardian; Author, The Darwin Wars

A Proper Scientific Understanding of Irrationality In General, and of Religion In Particular

I'm not actually optimistic about anything very much, but it's clear that if civilisation is to survive, we need a proper scientific understanding of irrationality in general, and of religion in particular. To be optimistic about that is a precondition for optimism about anything else. What might such an understanding look like?

For a start, it would be naturalistic and empirical. It would not start from definitions of religion or faith, but from a careful study, in the spirit of William James, of how it is that religious people actually behave and believe. What would be found, again in a Jamesian spirit, is that there are varieties of religious behaviour, as there are varieties of religious experience. We would need to know how these are related to each other, and to other things that are not described as religious. It may well be that "religion" is a concept no more useful than phlogiston.

It would take seriously Dan Dennett's distinction between beliefs and opinions—more seriously, I think, than he sometimes does himself. A belief, in Dennett's sense, is a kind of behaviour or a propensity to behave as if certain things were true. It need not be conscious at all. The kind of conscious, articulable propositions about the world which most people mean by "belief" he calls an "opinion".

In this sense, an enquiry into religious belief would be distinct from an enquiry into religious opinions: religious "belief" would involve all of the largely unconscious mechanisms which lead people to behave superstitiously, or reverently, or with a disdain for heretics; religious opinions would be the reasons that they give for this behaviour. We need to understand both. It may be that their opinions correspond to their beliefs, but that is something to be established in every case by empirical enquiry. It's obvious that in most cases they don't. Intellectuals are supposed to be motivated by their opinions; some of them actually are. But everyone is motivated by their beliefs and prejudices as well.

In particular, such an enquiry would be very careful about what counts as evidence. A friend of mine who does consciousness research once said sourly that "The problem with the brain is that if you go looking for something in there, you're very liable to find it." Similarly, if you go looking for some particular quality in religious belief you are likely to find it there, as well as its opposite. What's needed is the distinctly scientific attitude that takes disconfirming evidence seriously, and doesn't respond to it by simply repeating the confirming evidence.

I happened to see a play, "On Religion", by the British atheist philosopher AC Grayling last night, which is an excellent dramatisation of some of these issues. The atheist character, a woman lecturer, is given a speech in which she recounts the story of a scientist who has spent fifteen years arguing that the Golgi apparatus does not in fact exist: it is an artifact of the inadequacies of our microscopes. Finally, he attends a lecture by a visiting cell biologist who proves conclusively that the Golgi apparatus does exist. And, just as the whole department is trying to avoid his eye from sympathetic shame, he rushes up to the lecturer, grabs his hand, and says, "My dear fellow, I wish to thank you. I have been wrong these fifteen years." It is an improving and inspiring story, which pitches over into bathos as soon as the atheist spells out the moral. "No religious person could ever say that," she says. Has she really never heard of the phenomenon of conversion? What do the converted say, if not that some evidence has convinced them they were wrong all their lives before?

So I think, if I am to be optimistic, that there will be a real breakthrough in the empirical study of religion, at the end of which no scientist will ever feel able to assert what "no religious person could ever say" without making a careful enquiry into what religious people actually do say and what they mean by it.

The Increasing Coalescence of Scientific Disciplines

reposted from Edge.org. Chris Street highlights/edits in bold.

GERALD HOLTON
Mallinckrodt Research Professor of Physics and Research Professor of History of Science, Harvard University; Author, Thematic Origins of Scientific Thought

The Increasing Coalescence of Scientific Disciplines

Under our very eyes, research in science has been taking a courageous and promising turn, to realize in our time an ancient dream.

Since Thales and the other Ionian philosophers, over 2,500 years ago, there has been an undying hope that, under all the diverse and fluctuating phenomena, a grand, majestic order could be found in Nature. This fascination, the "Ionian Enchantment," has persisted ever since in various forms.

Thus, Isaac Newton thought the mechanical forces that explained the motions of the solar system would also turn out to run all else, including human senses. After Darwin's magnificent synthesis, many attempts were made to extend it to include all societal phenomena. The influential Austrian polymath Ernst Mach, whom young Einstein cited as one of his most important influences, taught that the true task of scientific research is to establish a form of fundamental science, an Einheitswissenschaft, on which every different specialty is based. From about 1910 on, an increasing number of scientists in Europe and America gave allegiance to the idea of the "Unity of Science," a widespread movement hoping to find functioning bridges not only between different sciences but also between science and philosophy—Niels Bohr being one of the prominent promoters.

But, by and by, it became clear that such hopes were at best premature and that there was not enough of what William James had called "cash value": too few actual accomplishments had been secured, not least the failure to attain a Unified Field Theory. At one of the last meetings devoted to discussions of the Unity of Science, in 1956, J. Robert Oppenheimer, with typical eloquence, offered a valedictory to the Ionian Enchantment, with these words:

"It may be a question [whether there] is one way of bringing a wider unity in our time. That unity, I think, can only be based on a rather different kind of structure than the one most of us have in mind....The unity we can seek lies really in two things. One is that the knowledge that comes to us in such terrifyingly inhumanly rapid rate has some order in it....The second is simply this: We can have each other to dinner. We ourselves, and with each other by our converse, can create, not an architecture of global scope,but an immense, intricate network of intimacy, illumination, and understanding."

But even as such opinions were being accepted with resignation, something new had been born, had quietly grown, and in our time has become the source of increasing optimism about the value of the old dream—by turning in a new direction. I mean that scientific research, at first only sporadically during the last century, but more and more in our time, has been successfully reaching out for a new sort of unity—in practice, an integration among disciplinary fragments. This time the movement is not driven by a philosophy of science or a search for the Ur-science. Rather, it is appearing as if spontaneously in the pursuit and progress of research science itself.

There is an increasing coalescence of scientific disciplines in many areas. Thus the discovery of the structure of the genome not only required contributions from parts of biology, physics, chemistry, mathematics, and information technology, but in turn it led to further advances in biology, physics, chemistry, technology, medicine, ecology, and even ethics. And all this scientific advance is leading, as it should, to the hopeful betterment of the human condition (as had been also one of the platform promises of the Unity of Science movement, especially in its branch in the Vienna Circle).

Similar developments are happening in the physical sciences—a coalescence of particle physics and large-scale astronomy, of physics and biology, and so forth. It is a telling and not merely parochial indicator that about half of the 45 colleagues in my Physics Department, owing to their widespread research interests, now have joint appointments with other departments at the University: with Molecular and Cellular Biology, with Mathematics, with Chemistry, with Applied Sciences and Engineering, with History of Science. Just now, a new building is being erected next to our Physics Department. It has the acronym LISE, which stands for the remarkable name Laboratory of Integrated Science and Engineering. Although in industry, here and there, equivalent labs have existed for years, even the most fervent follower of the Unity of Science movement would not have hoped then for such an indicator of the promise of interdisciplinarity. But, as the new saying goes, most of the easy problems have been solved, and the hard ones need to be tackled by a consortium of different competences.

From other parts of this university, plans are under way to set up a program for higher degrees in the new field of Systems Biology, which has the goal of reaching "an integrated understanding" of biological/medical processes; that program is to bring together faculty and students from biology, medicine, chemistry, physics, mathematics, computation and engineering. And these parochial examples are indications of a general trend in many universities. The new password to success is now "integration" and "interdisciplinarity." If an "official" sacralization of this movement were needed, it would be the 2005 release of a big volume by the National Academy of Sciences, with the title "Facilitating Interdisciplinary Research."

All this is not precisely what the philosophers and scientists, from Thales on, were hoping for. We will not, at least not for a long time, have that grand coalescence of all sciences and more. What has come lacks exalted philosophical pretensions, being instead a turn to weeks and years of many-heads-together, hands-on work on specific, hard problems of intense scientific interest, many of them also of value to society at large.

And, of course, these co-workers can also still have each other to dinner.

The tools for cultural production and distribution ... may lead to Religion's decline

reposted from Edge.org. Chris Street highlights/edits in bold.

This reminds me of Daniel Dennett's idea, "The Evaporation of the Powerful Mystique of Religion":

"Why am I confident that this will happen? Mainly because of the asymmetry in the information explosion. With the worldwide spread of information technology (not just the internet, but cell phones and portable radios and television), it is no longer feasible for guardians of religious traditions to protect their young from exposure to the kinds of facts (and, yes, of course, misinformation and junk of every genre) that gently, irresistibly undermine the mindsets requisite for religious fanaticism and intolerance."

HOWARD RHEINGOLD
Communications Expert; Author, Smart Mobs

The tools for cultural production and distribution are in the pockets of 14-year-olds

The tools for cultural production and distribution are in the pockets of 14-year-olds. This does not guarantee that they will do the hard work of democratic self-governance: the tools that enable the free circulation of information and communication of opinion are necessary but not sufficient for the formation of public opinion. Ask yourself this question: Which kind of population seems more likely to become actively engaged in civic affairs — a population of passive consumers, sitting slack-jawed in their darkened rooms, soaking in mass-manufactured culture that is broadcast by a few to an audience of many, or a world of creators who might be misinformed or ill-intentioned, but in any case are actively engaged in producing as well as consuming cultural products? Recent polls indicate that a majority of today's youth — the "digital natives" for whom laptops and wireless Internet connections are part of the environment, like electricity and running water — have created as well as consumed online content. I think this bodes well for the possibility that they will take the repair of the world into their own hands, instead of turning away from civic issues, or turning to nihilistic destruction.

The eager adoption of web publishing, digital video production and online video distribution, social networking services, instant messaging, multiplayer role-playing games, online communities, virtual worlds, and other Internet-based media by millions of young people around the world demonstrates the strength of their desire — unprompted by adults — to learn digital production and communication skills. Whatever else might be said of teenage bloggers, dorm-room video producers, or the millions who maintain pages on social network services like MySpace and Facebook, it cannot be said that they are passive media consumers. They seek, adopt, appropriate, and invent ways to participate in cultural production. While moral panics concentrate the attention of oldsters on lurid fantasies of sexual predation, young people are creating and mobilizing politically active publics online when circumstances arouse them to action. Some 25,000 Los Angeles high school students used MySpace to organize a walk-out from classes to join street demonstrations protesting proposed immigration legislation. Other young people have learned how to use the sophisticated graphic rendering engines of video games as tools for creating their own narratives; in France, disaffected youth, the ones whose riots are televised around the world, but whose voices are rarely heard, used this emerging "machinima" medium to create their own version of the events that triggered their anger (search for "The French Democracy" on video hosting sites). Not every popular YouTube video is a teenage girl in her room (or a bogus teenage girl in her room); increasingly, do-it-yourself video has been used to capture and broadcast police misconduct or express political opinions. Many of the activists who use Indymedia — ad-hoc alternative media organized around political demonstrations — are young.

My optimism about the potential of the generation of digital natives is neither technological determinism nor naive utopianism. Many-to-many communication enables but does not compel or guarantee widespread civic engagement by populations who never before had a chance to express their public voices. And while the grimmest lesson of the twentieth century is to mistrust absolutist utopians, I perceive the problem to be in the absolutism more than the utopia. Those who argued for the abolition of the age-old practice of human slavery were utopians.

The historical Rise of Science v The Decline of Religion

reposted from Edge.org. Chris Street highlights/edits in bold.

MICHAEL SHERMER
Publisher of Skeptic magazine, monthly columnist for Scientific American; Author, Why Darwin Matters

Science and The Decline of Magic

I am optimistic that science is winning out over magic and superstition. That may seem irrational, given the data from pollsters on what people believe. For example, a 2005 Pew Research Center poll found that 42 percent of Americans believe that "living things have existed in their present form since the beginning of time." The situation is even worse when we examine other superstitions, such as these percentages of belief published in a 2002 National Science Foundation study:

ESP 60%
UFOs 30%
Astrology 40%
Lucky numbers 32%
Magnetic therapy 70%
Alternative medicine 88%

Nevertheless, I take the historian's long view, and compared to what people believed before the Scientific Revolution, there is much cause for optimism. Consider what people believed a mere four centuries ago, just as science began lighting candles in the dark. In 16th- and 17th-century England, for example, almost everyone believed in sorcery, werewolves, hobgoblins, witchcraft, astrology, black magic, demons, prayer, and providence. "A great many of us, when we be in trouble, or sickness, or lose anything, we run hither and thither to witches, or sorcerers, whom we call wise men…seeking aid and comfort at their hands," noted Bishop Latimer in 1552. Saints were worshiped. Liturgical books provided rituals for blessing cattle, crops, houses, tools, ships, wells, and kilns, not to mention the sick, sterile animals, and infertile couples. In his 1621 book, Anatomy of Melancholy, Robert Burton explained, "Sorcerers are too common; cunning men, wizards, and white witches, as they call them, in every village, which, if they be sought unto, will help almost all infirmities of body and mind."

Just as alcohol and tobacco were essential anesthetics for the easing of pain and discomfort, superstition and magic were the basis for the mitigation of misfortune. As the great Oxford historian of the period, Keith Thomas, writes in his classic 1971 work Religion and the Decline of Magic, "No one denied the influence of the heavens upon the weather or disputed the relevance of astrology to medicine or agriculture. Before the seventeenth century, total skepticism about astrological doctrine was highly exceptional, whether in England or elsewhere." And it wasn't just astrology. "Religion, astrology and magic all purported to help men with their daily problems by teaching them how to avoid misfortune and how to account for it when it struck." With such sweeping power over nearly everyone, Thomas concludes, "If magic is to be defined as the employment of ineffective techniques to allay anxiety when effective ones are not available, then we must recognize that no society will ever be free from it." The superstitious we will always have with us.

Nevertheless, the rise of science ineluctably attenuated this near universality of magical thinking by proffering natural explanations where before there were only supernatural ones. Before Darwin, design theory (in the form of William Paley's natural theology, which gave us the "watchmaker" argument) was the only game in town, so everyone believed that life was designed by God. Today fewer than half believe that in America, the most religious nation of the developed democracies, and in most other parts of the world virtually everyone accepts evolution without qualification. That's progress.

The rise of science even led to a struggle to find evidence for superstitious beliefs that previously needed no propping up with facts. Consider the following comment from an early 17th-century book that shows how even then savvy observers grasped the full implications of denying the supernatural altogether: "Atheists abound in these days and witchcraft is called into question. If neither possession nor witchcraft (contrary to what has been so long generally and confidently affirmed), why should we think that there are devils? If no devils, no God."

Magic transitioned into empirical magic and formalized methods of ascertaining causality by connecting events in nature—the very basis of science. As science grew in importance, the analysis of portents was often done meticulously and quantitatively, albeit for purposes both natural and supernatural. As one diarist privately opined on the nature and meaning of comets: "I am not ignorant that such meteors proceed from natural causes, yet are frequently also the presages of imminent calamities."

Science arose out of magic, which it ultimately displaced. By the 18th century, astronomy replaced astrology, chemistry succeeded alchemy, probability theory dislodged belief in luck and fortune, city planning and social hygiene attenuated disease, and the grim vagaries of life became less grim, and less vague. As Francis Bacon concluded in his 1626 work, New Atlantis: "The end of our foundation is the knowledge of causes and the secret motions of things and the enlarging of the bounds of human empire, to the effecting of all things possible."

Sic itur ad astra — Thus do we reach the stars.

The Decline of Violence

reposted from Edge.org. Chris Street highlights/edits in bold.



STEVEN PINKER
Psychologist, Harvard University; Author, The Blank Slate

The Decline of Violence

In 16th century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted on a stage and was slowly lowered into a fire. According to the historian Norman Davies, "the spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized."

As horrific as present-day events are, such sadism would be unthinkable today in most of the world. This is just one example of the most important and underappreciated trend in the history of our species: the decline of violence. Cruelty as popular entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, genocide for convenience, torture and mutilation as routine forms of punishment, execution for trivial crimes and misdemeanors, assassination as a means of political succession, pogroms as an outlet for frustration, and homicide as the major means of conflict resolution—all were unexceptionable features of life for most of human history. Yet today they are statistically rare in the West, less common elsewhere than they used to be, and widely condemned when they do occur.

Most people, sickened by the headlines and the bloody history of the twentieth century, find this claim incredible. Yet as far as I know, every systematic attempt to document the prevalence of violence over centuries and millennia (and, for that matter, the past fifty years), particularly in the West, has shown that the overall trend is downward (though of course with many zigzags). The most thorough is James Payne’s The History of Force; other studies include Lawrence Keeley’s War Before Civilization, Martin Daly & Margo Wilson’s Homicide, Donald Horowitz’s The Deadly Ethnic Riot, Robert Wright’s Nonzero, Peter Singer’s The Expanding Circle, Stephen Leblanc’s Constant Battles, and surveys of the ethnographic and archeological record by Bruce Knauft and Philip Walker.

Anyone who doubts this by pointing to residues of force in America (capital punishment in Texas, Abu Ghraib, sex slavery in immigrant groups, and so on) misses two key points. One is that statistically, the prevalence of these practices is almost certainly a tiny fraction of what it was in centuries past. The other is that these practices are, to varying degrees, hidden, illegal, condemned, or at the very least (as in the case of capital punishment) intensely controversial. In the past, they were no big deal. Even the mass murders of the twentieth century in Europe, China, and the Soviet Union probably killed a smaller proportion of the population than a typical hunter-gatherer feud or biblical conquest. The world’s population has exploded, and wars and killings are scrutinized and documented, so we are more aware of violence, even when it may be statistically less extensive.
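Pinker's proportion-versus-absolute-count point can be made concrete with a small sketch; the populations and death tolls below are hypothetical round numbers chosen only to show the logic, not figures from the studies he cites.

```python
# Hypothetical round numbers showing how absolute death tolls can grow
# while the *rate* of violent death falls as populations expand.

def rate_per_100k(deaths, population):
    """Violent deaths per 100,000 people."""
    return deaths / population * 100_000

# A 500-person band losing 5 members to a feud...
feud_rate = rate_per_100k(deaths=5, population=500)             # 1,000 per 100,000

# ...versus a 50-million-person state losing 10,000 people to a war.
war_rate = rate_per_100k(deaths=10_000, population=50_000_000)  # 20 per 100,000

print(f"feud: {feud_rate:,.0f} per 100,000")
print(f"war:  {war_rate:,.0f} per 100,000")
```

On these toy numbers the war kills two thousand times as many people in absolute terms, yet the feud is fifty times deadlier per capita, which is the sense in which the trend can be downward even as headlines grow bloodier.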

What went right? No one knows, possibly because we have been asking the wrong question—"Why is there war?" instead of "Why is there peace?" There have been some suggestions, all unproven. Perhaps the gradual perfecting of a democratic Leviathan—"a common power to keep [men] in awe"—has removed the incentive to do it to them before they do it to us. Payne suggests that it’s because for many people, life has become longer and less awful—when pain, tragedy, and early death are expected features of one’s own life, one feels fewer compunctions about inflicting them on others. Wright points to technologies that enhance networks of reciprocity and trade, which make other people more valuable alive than dead. Singer attributes it to the inexorable logic of the golden rule: the more one knows and thinks, the harder it is to privilege one’s own interests over those of other sentient beings. Perhaps this is amplified by cosmopolitanism, in which history, journalism, memoir, and realistic fiction make the inner lives of other people, and the contingent nature of one’s own station, more palpable—the feeling that "there but for fortune go I."

My optimism lies in the hope that the decline of force over the centuries is a real phenomenon, that it is the product of systematic forces that will continue to operate, and that we can identify those forces and perhaps concentrate and bottle them.

Using Sunlight to power all our energy consumption

reposted from Edge.org
Chris Street highlights in bold

ALUN ANDERSON
Senior Consultant (and Former Editor-In-Chief and Publishing Director), New Scientist

The Sunlight-Powered Future

I'm optimistic about… a pair of very big numbers. The first is 4.5 x 10^20. That is the current world annual energy use, measured in joules. It is a truly huge number and not usually a cause for optimism, as 70 per cent of that energy comes from burning fossil fuels.

Thankfully, the second number is even bigger: about 30,000 x 10^20 joules. That is the amount of clean, green energy that pours down on the Earth, totally free of charge, every year. The Sun is providing roughly 7,000 times as much energy as we are using, which leaves plenty for developing China, India and everyone else. How can we not be optimistic? We don't have a long-term energy problem. Our only worries are whether we can find smart ways to use that sunlight efficiently, and whether we can move quickly enough from the energy systems we are entrenched in now to the ones we should be using. Given the perils of climate change and dependence on foreign energy, the motivation is there.
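As a quick back-of-the-envelope check on how the two figures relate (a sanity check using the essay's own round numbers, not a precise insolation calculation):

```python
# Back-of-the-envelope check on the essay's energy figures (joules per year).
# Both inputs are the essay's round numbers, not measured values.

world_use = 4.5e20      # stated world annual energy use, in joules
solar_factor = 7_000    # the Sun delivers roughly 7,000 times what we use

implied_solar_input = world_use * solar_factor
print(f"implied usable sunlight: {implied_solar_input:.1e} J/year")
# -> about 3.2e24 J/year, i.e. roughly 30,000 x 10^20 joules
```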

Can it be done? I'm lucky that as a writer I get to meet some of the world's brightest scientists each year, and I know that out there are plenty of radical new ideas for a future in which sunlight is turned straight into the forms of energy we need. Here are just three of my favourites out of scores of great ideas. First, reprogramming the genetic make-up of simple organisms so that they directly produce usable fuels (hydrogen, for example). That will be much more efficient than today's fashionable new bioethanol programs, because it will cut out all the energy wasted in growing a crop, then harvesting it and then converting its sugars into fuel. Second, self-organizing polymer solar cells. Silicon solar cells may be robust and efficient, but they are inevitably small and need a lot of energy to make. Self-organizing polymer cells could be ink-jetted onto plastics by the hectare, creating dirt-cheap solar cells the size of advertising hoardings. Third, there's artificial photosynthesis. Nature uses a different trick from silicon solar cells to capture light energy, whipping high-energy electrons away from photo-pigments into a separate system in a few thousand-millionths of a second. We are getting much closer to understanding how it's done, and even to using the same principles in totally different nano-materials.

But what of the pessimist's view that we are just too entrenched in our current energy systems to change? There is a world-wide boom in investment in green technology already under way. And there are many transition technologies coming into operation that enable practice runs for more radical genome reprogramming and the creation of new nano-structures. Although the consensus view is that the sunlight-powered future won't take over until 2050, I'd place an optimistic bet that one of the many smart ideas being researched now will turn out to be an unforeseen winner much earlier.

The Secret of Life, the Universe and Everything

reposted from Edge.org

LAWRENCE KRAUSS

Physicist, Case Western Reserve University; Author, Atom

Renewal of Science for the Public Good

I am optimistic that after almost 30 years of sensory deprivation in the field of particle physics, during which much hallucination (e.g. string theory) has occurred among theorists, within 3 years, following the commissioning next year of the Large Hadron Collider in Geneva, we will finally obtain empirical data that will drive forward our understanding of the fundamental structure of nature, its forces, and of space and time.

My biggest optimism is that the data will be completely unexpected, forcing revisions in all our carefully prepared ideas about what will supplant the Standard Model of elementary particle physics. Since 1975 or so, every single experiment done at the microscopic forefront has been consistent with the predictions of the Standard Model, giving little or no direction to what lies behind it: what is the origin of mass, why there are three families of elementary particles, why some quarks are heavy, and why neutrinos are very light.

Yes, neutrino masses were discovered, but that was no big surprise, and no insight at all into their origin has been obtained thus far. With empirical data, theoretical particle physics might once again return to the days when the key to distinguishing good theory from bad was how many empirical puzzles the theory might resolve, rather than how fancy it might look.

I am also completely optimistic that within what I hope will be my lifetime we will unlock the secret of life, and finally take our understanding of evolutionary biology back to that remarkable transition from non-biological chemistry to biology. Not only will we be able to create life in the laboratory, but we will be able to trace our own origins back, and gain insight into the remarkable question of how much life there is in the universe. We will surely discover microbial life elsewhere in our solar system, and I expect we will find that it is our cousin, from the same seed, if you will, rather than being truly alien. But all of this will make living even more fascinating.

A Good Death - a gutsy, scientifically informed existential courage in the face of personal extinction.

reposted from Edge.org


GEOFFREY MILLER
Evolutionary Psychologist, University of New Mexico; Author, The Mating Mind

Death

I'm optimistic about death. For the first time in the history of life on Earth, it is possible—not easy, but possible—for conscious animals like us to have a good death. A good death is a great triumph, and something to be sought, accepted, and cherished. Indeed, a good death should be recorded and broadcast as a moral example to us all.

What do I mean by a good death? I do not mean opiate-fuelled euthanasia, or heroic self-sacrifice during flash-bang tactical ops, or a grudgingly tolerated end to a millennium of grasping longevity. I do not mean a painless, clean, or even dignified death. I mean a death that shows a gutsy, scientifically informed existential courage in the face of personal extinction. I mean a death that shows the world that we secular humanists really mean it.

There is, of course, no way to escape the hardwired fears and reactions that motivate humans to avoid death. Suffocate me, and I'll struggle. Shoot me, and I'll scream. The brain stem and amygdala will always do their job of struggling to preserve one's life at any cost.

The question is how one's cortex faces death. Does it collapse in mortal terror like a deflated soufflé? Or does it face the end of individual consciousness with iron-clad confidence in the persistence of virtually identical consciousnesses in other human bodies? My optimism is that in this millennium, well-informed individuals will have a realistic prospect of sustaining this second perspective right through the end of life, despite death's pain and panic.

When I die in 50 years, or next week, or whenever, here's what I hope I remember:

  • My genes, proteins, neural networks, beliefs, and desires are practically identical to those sustaining the consciousness of 6 billion other humans, and countless other animals, whose experiences will continue when mine do not.
  • Since life must be common throughout the universe and resilient across time, such subjective experiences will continue not just on Earth in the short term, but across many worlds, for billions of years.
  • There is no spooky personal after-life to fear or hope for, only this wondrous diversity of subjectivity that trillions of individuals get to partake in.
  • The more science one knows, the more certain and comforting this knowledge is.

These life-lessons are, to me, the distilled wisdom of evolutionary psychology.

Many people resist this knowledge. They listen only to the hair-trigger anxieties of the amygdala—which constantly whispers 'fear death, fear death'. They construct pathetic ideologies of self-comfort to plug their ears against such mortal terror. They nuzzle through reality's coarse pelt for a lost teat of supernatural succor. I call them the Gutless, because they aren't bright enough or brave enough to understand their true place in the universe. A whole new branch of psychology called Terror Management Theory studies the Gutless and their death-denying delusions.

A great ideological war is raging between the Godless—people like me, who trust life—and the Gutless—the talking heads of the extreme, religious right, who fear death, and fear the Godless, and fear ongoing life in the future when they no longer exist. I'm also optimistic about the outcome of this war, because people respect guts and integrity. People want moral role models who can show them how to live good lives and die good deaths. People want to believe that they are participating in something vastly greater and more wonderful than their solipsism. Science quenches that thirst far more effectively, in my experience, than any supernatural teat sought by the Gutless.

'Proof' of methane lakes on Titan

reposted from BBC News. Chris Street highlights/edits in bold.

The lakes should grow and shrink with Titan's seasons

The Cassini probe has spotted what scientists say is unequivocal evidence of lakes of liquid methane on Titan, Saturn's largest moon.

Radar images reveal dark, smooth patches that range in size from 3km to 70km across (two to 44 miles).

The team says the features, which were spied in the moon's far north, look like crater or caldera lakes on Earth.

The researchers tell the journal Nature that everything about the patches points to them being pools of liquid.

"They look very similar to lakes on Earth," explained Dr Ellen Stofan, a Cassini radar team member from Proxemy Research in Washington DC, US.

"They have channels feeding into them just like you have rivers feeding into lakes on the Earth. Their shapes, their shore-lines, all of those geologic aspects are actually very familiar."

Northern strip

The atmospheric chemistry on Titan is dominated by nitrogen and carbon-based compounds.

And with temperatures on the Saturnian satellite rarely venturing above -179C (-290F), it has long been hypothesised that abundant volumes of methane should pool on the surface into lakes, and even large seas.

Cassini has been investigating Saturn and its moons since 2004
But evidence for current bodies of liquid material on the surface has until now been sparse and equivocal.

Cassini must use radar to pierce the photochemical haze that obscures Titan's surface from its optical camera system.

The latest data was obtained last July, when the probe made its most northern radar pass of Titan to date.

The spacecraft imaged a narrow strip about 250km wide and over 1,000km long. It was found to contain more than 75 lakes.

Everything scientists know about the atmospheric chemistry on Titan suggests the liquid in the lakes should be predominantly methane, with some ethane also mixed in.

Some of the liquid would be expected to rain out of the sky; some could have welled up from below the surface.

Methane cycle

"The methane-ethane would become transparent, the way water is on Earth; it would be behaving like water, the lakes could have small waves on the surfaces," speculated Dr Stofan.

"So if it was possible for you to stand on Titan and look at the lakes, you wouldn't really know it's this weird chemistry."

Scientists have long predicted the existence of lakes

On Earth, the cycling of water between the atmosphere, the land and oceans is known as the hydrological cycle. Titan would appear to be the only other place in the Solar System to have a similar, active fluid cycle. Scientists have already dubbed it the "methane-ologic cycle".

Last month, it was announced that the radar instrument on Cassini had found an enormous mountain range on Titan.

The range lies south of the equator and is about 150km long (93 miles), 30km (19 miles) wide and about 1.5km (nearly a mile) high.

Scientists told the American Geophysical Union (AGU) Fall Meeting that the range was probably as hard as rock, but made of icy materials.

The mountains appeared in the radar images to be coated with layers of material that researchers thought could be methane "snow".

The Cassini-Huygens mission is a cooperative project of the US space agency (Nasa), the European Space Agency (Esa) and the Italian Space Agency (Asi).

Heart deaths fall by 36% - Statins save 9,700 lives in 2005

reposted from BBC News. Chris Street highlights in bold.

Heart deaths 'continue to fall'
Heart disease death rates are falling
The government is on track to meet its target to reduce deaths from heart disease, official figures indicate.

Data from 2003-2005 shows that the death rate has fallen 35.9% since 1996; the government is aiming for a 40% fall by 2010.

The data also showed patients were getting quicker treatment and there were more consultants.

Experts said changes in lifestyle had also had an impact on death rates, which have been falling for decades.

The British Heart Foundation said heart disease deaths have been on a downward trend since the 1970s and while improved services had played a part, they were not the only cause.

The government published its National Service Framework for Coronary Heart Disease in 2000, which set out a 10-year plan to improve standards of care.

The latest report, Shaping the Future, gives an update on how care is being delivered.

As well as showing the fall in the death rate, latest figures also revealed a narrowing in the gap between the poorest areas and the national average.

In 1996 it stood at 36.7 extra deaths per 100,000, but has now dropped to 26.4 per 100,000.

Other data shows the number of lives saved through cholesterol-busting drugs called statins has tripled since 2000, to 9,700 in 2005.

The number of consultant cardiologists has risen by nearly 300, to 725, in the last six years, and no patient now waits more than three months for surgery, compared with the 5,663 who did in 2002.

Facilities

Professor Roger Boyle, national clinical director for heart disease and stroke, said: "The National Service Framework continues to set the standard in a local NHS which now has greater financial and decision-making power than ever before.

"Increased specialist facilities and better frontline treatments for heart attack victims continue to improve services for patients."

But he added: "It is still not perfect. We would like to get better rehabilitation and we need to look at how people are cared for at the end of their life."

Health Secretary Patricia Hewitt said: "This report shows the fantastic achievements the NHS has made since 2000, not only in treating CHD patients - with better use of statins and faster access to heart surgery - but also in helping to prevent it."

Professor Peter Weissberg, medical director of the British Heart Foundation, said: "It is true that significant improvements in services have helped, but death rates have been falling for a long time and that has also been due to changes in lifestyle, such as people giving up smoking."