Bio · Human-ities

Anything But Human

The very idea of an “ought” is foreign to evolutionary theory. It makes no sense for a biologist to say that some particular animal should be more cooperative, much less to claim that an entire species ought to aim for some degree of altruism. If we decide that we should neither “dissolve society” through extreme selfishness, as Wilson puts it, nor become “angelic robots” like ants, we are making an ethical judgment, not a biological one. Likewise, from a biological perspective it means nothing to claim that I should be more generous than I usually am, or that a tyrant ought to be deposed and tried. In short, a purely evolutionary ethics makes ethical discourse meaningless.

Excerpt from an article by Richard Polt at The New York Times. Continue HERE

Human-ities · Philosophy · Social/Politics · Technology · Theory

We’re Underestimating the Risk of Human Extinction

Unthinkable as it may be, humanity, every last person, could someday be wiped from the face of the Earth. We have learned to worry about asteroids and supervolcanoes, but the more likely scenario, according to Nick Bostrom, a professor of philosophy at Oxford, is that we humans will destroy ourselves.

Bostrom, who directs Oxford’s Future of Humanity Institute, has argued over the course of several papers that human extinction risks are poorly understood and, worse still, severely underestimated by society. Some of these existential risks are fairly well known, especially the natural ones. But others are obscure or even exotic. Most worrying to Bostrom is the subset of existential risks that arise from human technology, a subset that he expects to grow in number and potency over the next century.

Despite his concerns about the risks posed to humans by technological progress, Bostrom is no luddite. In fact, he is a longtime advocate of transhumanism—the effort to improve the human condition, and even human nature itself, through technological means. In the long run he sees technology as a bridge, a bridge we humans must cross with great care, in order to reach new and better modes of being. In his work, Bostrom uses the tools of philosophy and mathematics, in particular probability theory, to try to determine how we as a species might achieve this safe passage. What follows is my conversation with Bostrom about some of the most interesting and worrying existential risks that humanity might encounter in the decades and centuries to come, and about what we can do to make sure we outlast them.

Excerpt from an article by Ross Andersen at The Atlantic. Continue HERE

Bio · Human-ities · Philosophy · Podcast · Science · Sonic/Musical · Technology

Transhumanism and Posthumanism

What is the future of humanity? What limits should we impose on our biotechnological and other scientific developments – and what will happen if we don’t? Grant Bartley from Philosophy Now asks Debra Shaw from the University of East London, Blay Whitby from the University of Sussex, and David Gamez from Imperial College London, for answers. With live music from Bucky Muttel on the Chapman Stick. First broadcast on 14 February 2012 on Resonance FM.

Via Philosophy Now Radio Show