Human-ities · Science

How the brain creates visions of God

For most of recorded history, human beings situated the mind — and by extension the soul — not within the brain but within the heart. When preparing mummies for the afterlife, for instance, ancient Egyptian priests removed the heart in one piece and preserved it in a ceremonial jar; in contrast, they scraped out the brain through the nostrils with iron hooks, tossed it aside for animals, and filled the empty skull with sawdust or resin. (This wasn’t a snarky commentary on their politicians, either—they considered everyone’s brain useless.) Most Greek thinkers also elevated the heart to the body’s summa. Aristotle pointed out that the heart had thick vessels to shunt messages around, whereas the brain had wispy, effete wires. The heart furthermore sat in the body’s center, appropriate for a commander, while the brain sat in exile up top. The heart developed first in embryos, and it responded in sync with our emotions, pounding faster or slower, while the brain just sort of sat there. Ergo, the heart must house our highest faculties.

Meanwhile, though, some physicians had always had a different perspective on where the mind came from. They’d simply seen too many patients get beaned in the head and lose some higher faculty to think it all a coincidence. Doctors therefore began to promote a brain-centric view of human nature. And despite some heated debates over the centuries—especially about whether the brain had specialized regions or not—by the 1600s most learned men had enthroned the mind within the brain. A few brave scientists even began to search for that anatomical El Dorado: the exact seat of the soul within the brain.

Read full article written by Sam Kean at SALON.
Image above: Eugene Thirion’s “Jeanne d’Arc” (1876)

Bio · Science · Technology

Seeing In The Pitch-Dark Is All In Your Head

A few years ago, cognitive scientist Duje Tadin and his colleague Randolph Blake decided to test blindfolds for an experiment they were cooking up.

They wanted an industrial-strength blindfold to make sure volunteers for their work wouldn’t be able to see a thing. “We basically got the best blindfold you can get,” Tadin tells Shots. “It’s made of black plastic, and it should block all light.”

Tadin and Blake pulled one on just to be sure and waved their hands in front of their eyes. They didn’t expect to be able to see, yet both of them felt as if they could make out the shadowy outlines of their arms moving.

Being scientists, they wondered what was behind the spooky phenomenon. “We knew there wasn’t any visual input there,” Tadin says. They figured their minds were instinctively filling in images where there weren’t any.

After conducting several experiments involving computerized eye trackers, they proved themselves right. Between 50 and 75 percent of the participants in their studies showed an eerie ability to “see” their own bodies moving in total darkness. The research, put together by scientists at the University of Rochester and Vanderbilt University, is published in the journal Psychological Science.

How were they so sure? “The only way you can produce smooth eye movements is if you’re following a target,” Tadin tells Shots. When our eyes aren’t tracking something very specific, they tend to jerk around randomly. “If you just try to make your eyes move smoothly, you can’t do it.” The researchers used this knowledge to test whether people could really distinguish their hand movements in the dark.

Text and Image via Neuromorphogenesis

Bio · Science

Einstein’s Brain (…and the neuroscientist who studied it)

Marian Diamond began her graduate work in 1948 and was the first female student in the department of anatomy at UC Berkeley. The first thing she was asked to do when she got there was sew a cover for a large magnifying machine (?!?!?!?!).

“They didn’t know what to do with me because they weren’t used to having a woman. They thought I was there to get a husband. I was there to learn.”

Such challenges were not uncommon. Years later she requested tissue samples of Albert Einstein’s brain from a pathologist in Missouri. He didn’t trust her.

“He wasn’t sure that I was a scientist. This is one thing that you have to face being a woman. He didn’t think that I should be the one to be looking at Einstein’s brain.”

Marian persisted for three years, calling him once every six months, and received four blocks of the physicist’s brain tissue (each about the size of a sugar cube).

Her research found that Einstein had twice as many glial cells as normal males — the discovery caused an international sensation as well as scientific criticism.

What are glial cells? Scientists previously believed that neurons were responsible for thinking and that glial cells were merely support cells in the brain. Researchers now believe glial cells play a critical role in brain development, learning, memory, aging and disease.

All text and Images via UC Research

Bio · Design · Science · Technology · Vital-Edible-Health

Miniature brains grown in test tubes – a new path for neuroscience?

Scientists have grown miniature human brains in test tubes, creating a “tool” that will allow them to watch how the organs develop in the womb and, they hope, increase their understanding of neurological and mental problems.

Just a few millimetres across, the “cerebral organoids” are built up of layers of brain cells with defined regions that resemble those seen in immature, embryonic brains.

The scientists say the organoids will be useful for biologists who want to analyse how conditions such as schizophrenia or autism occur in the brain. Though these are usually diagnosed in older people, some of the underlying defects occur during the brain’s early development.

Human brain ‘organoid’ grown from human pluripotent stem cells. This is a cross-section of the entire organoid showing development of different brain regions. All cells are in blue, neural stem cells in red, and neurons in green. Photograph: Madeline A Lancaster.

The organoids are also expected to be useful in the development and testing of drugs. At present this is done using laboratory animals or isolated human cells; the new organoids could allow pharmacologists to test drugs in more human-like settings.

Scientists have previously made models of other human organs in the lab, including eyes, pituitary glands and livers.

In the latest work researchers at the Institute of Molecular Biotechnology in Vienna started with stem cells and grew them into brain cells in a nourishing gel-like matrix that recreated conditions similar to those inside the human womb. After several months the cells had formed spheres measuring about 3-4mm in diameter.

Text by Alok Jha, science correspondent at The Guardian. Continue article THERE

Animalia · Human-ities · Science

One of Us: On Animal Consciousness

These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”

Excerpt from an essay written by John Jeremiah Sullivan at Lapham’s Quarterly. Continue HERE

The Cambridge Declaration on Consciousness in Non-Human Animals

Social/Politics · Technology

Good News Beats Bad on Social Networks

BAD NEWS SELLS. If it bleeds, it leads. No news is good news, and good news is no news.

Those are the classic rules for the evening broadcasts and the morning papers, based partly on data (ratings and circulation) and partly on the gut instincts of producers and editors. Wars, earthquakes, plagues, floods, fires, sick children, murdered spouses — the more suffering and mayhem, the more coverage.

But now that information is being spread and monitored in different ways, researchers are discovering new rules. By scanning people’s brains and tracking their e-mails and online posts, neuroscientists and psychologists have found that good news can spread faster and farther than disasters and sob stories.

“The ‘if it bleeds’ rule works for mass media that just want you to tune in,” says Jonah Berger, a social psychologist at the University of Pennsylvania. “They want your eyeballs and don’t care how you’re feeling. But when you share a story with your friends and peers, you care a lot more how they react. You don’t want them to think of you as a Debbie Downer.”

Excerpt from an article written by JOHN TIERNEY, at the NYT. Continue THERE

Human-ities · Science

Amnesia and the Self That Remains When Memory Is Lost

Daniel Levitin: Tom was one of those people we all have in our lives — someone to go out to lunch with in a large group, but not someone I ever spent time with one-on-one. We had some classes together in college and even worked in the same cognitive psychology lab for a while. But I didn’t really know him. Even so, when I heard that he had brain cancer that would kill him in four months, it stopped me cold.

I was 19 when I first saw him — in a class taught by a famous neuropsychologist, Karl Pribram. I’d see Tom at the coffee house, the library, and around campus. He seemed perennially enthusiastic, and had an exaggerated way of moving that made him seem unusually focused. I found it uncomfortable to make eye contact with him, not because he seemed threatening, but because his gaze was so intense.

Excerpt from an article written by Daniel Levitin at The Atlantic. Read it HERE

Human-ities · Philosophy · Science · Technology

A Look at the Original Roots of Artificial Intelligence, Cognitive Science, and Neuroscience. Steven Pinker moderates Noam Chomsky, Marvin Minsky, Barbara H. Partee, Patrick H. Winston, and Emilio Bizzi

Moderator: Steven Pinker, Harvard College Professor and Johnstone Family Professor, Department of Psychology, Harvard University

* Emilio Bizzi, MIT Institute Professor; Founding Member, McGovern Institute for Brain Research

* Sydney Brenner, Senior Distinguished Fellow, Crick-Jacobs Center, Salk Institute for Biological Studies

* Noam Chomsky, MIT Institute Professor, Emeritus; Department of Linguistics and Philosophy

* Marvin Minsky, Professor of Media Arts and Sciences, Emeritus, MIT

* Barbara H. Partee PhD ’65, Distinguished University Professor Emerita of Linguistics and Philosophy, University of Massachusetts

* Patrick H. Winston ’65 SM ’67 PhD ’70, Ford Professor of Artificial Intelligence and Computer Science, MIT; Principal Investigator, Computer Science and Artificial Intelligence Laboratory; Chairman and Co-founder, Ascent Technology

See this panel HERE. Photo above via

Human-ities · Videos · Vital-Edible-Health

Stress: Portrait of a Killer

National Geographic Documentary. Over the last three decades, science has been advancing our understanding of stress—how it impacts our bodies and how our social standing can make us more or less susceptible. From baboon troops on the plains of Africa, to neuroscience labs at Stanford University, scientists are revealing just how lethal stress can be. Research tells us that the impact of stress can be found deep within us, shrinking our brains, adding fat to our bellies, even unraveling our chromosomes. Understanding how stress works can help us figure out ways to combat it and how to live a life free of the tyranny of this contemporary plague. In Stress: Portrait of a Killer, scientific discoveries in the field and in the lab prove that stress is not just a state of mind, but something measurable and dangerous.

Human-ities · Science · Theory

What Happens to Consciousness When We Die

Where is the experience of red in your brain? The question was put to me by Deepak Chopra at his Sages and Scientists Symposium in Carlsbad, Calif., on March 3. A posse of presenters argued that the lack of a complete theory by neuroscientists regarding how neural activity translates into conscious experiences (such as redness) means that a physicalist approach is inadequate or wrong. “The idea that subjective experience is a result of electrochemical activity remains a hypothesis,” Chopra elaborated in an e-mail. “It is as much of a speculation as the idea that consciousness is fundamental and that it causes brain activity and creates the properties and objects of the material world.”

Excerpt of an article written by Michael Shermer, at Scientific American. Continue HERE

Human-ities · Science

Empathy’s surprising roots in the sense of touch

When a friend hits her thumb with a hammer, you don’t have to put much effort into imagining how this feels. You know it immediately. You will probably tense up, your “Ouch!” may arise even quicker than your friend’s, and chances are that you will feel a little pain yourself. Of course, you will then thoughtfully offer consolation and bandages, but your initial reaction seems just about automatic. Why?

Neuroscience now offers you an answer: A recent line of research has demonstrated that seeing other people being touched activates primary sensory areas of your brain, much like experiencing the same touch yourself would do. What these findings suggest is beautiful in its simplicity—that you literally “feel with” others.

Excerpt from an article written by Jakub Limanowski, at Scientific American. Continue HERE

Bio · Human-ities · Science · Technology · Vital-Edible-Health

Neuroscience: The mind reader. Communicating with vegetative patients

Adrian Owen still gets animated when he talks about patient 23. The patient was only 24 years old when his life was devastated by a car accident. Alive but unresponsive, he had been languishing in what neurologists refer to as a vegetative state for five years, when Owen, a neuroscientist then at the University of Cambridge, UK, and his colleagues at the University of Liège in Belgium, put him into a functional magnetic resonance imaging (fMRI) machine and started asking him questions.

Incredibly, he provided answers. A change in blood flow to certain parts of the man’s injured brain convinced Owen that patient 23 was conscious and able to communicate. It was the first time that anyone had exchanged information with someone in a vegetative state.

Excerpt of an article written by David Cyranoski, at Nature. Continue HERE

Human-ities · Science · Vital-Edible-Health

Deconstructing Dad

Having children changes a man. All of us know examples of that. I’m pretty sure, for instance, that the only time I ever saw my father sing was to his kids. It wasn’t always pretty, but it was pure Dad.

But is there something about fatherhood that actually changes the male brain? Studies suggest that it does, including one published a few years ago which found that new sets of neurons formed in brains of mouse dads that stayed around the nest after their pups were born.

Still, there’s much yet to be learned about the effects of being a father. And so scientists continue to explore the eternal question: “What’s with this guy?”

HERE are 10 recent studies deconstructing dad. Text and Image via Smithsonian.

Book-Text-Read-Zines · Human-ities

BBC Four: Why Reading Matters

Science writer Rita Carter tells the story of how modern neuroscience has revealed that reading, something most of us take for granted, unlocks remarkable powers. Carter explains how the classic novel Wuthering Heights allows us to step inside other minds and understand the world from different points of view, and she wonders whether the new digital revolution could threaten the values of classic reading.

Animalia · Bio · Human-ities · Science

The Original Colonists: ‘The Social Conquest of Earth,’ by Edward O. Wilson

This is not a humble book. Edward O. Wilson wants to answer the questions Paul Gauguin used as the title of one of his most famous paintings: “Where do we come from? What are we? Where are we going?” At the start, Wilson notes that religion is no help at all — “mythmaking could never discover the origin and meaning of humanity” — and contemporary philosophy is also irrelevant, having “long ago abandoned the foundational questions about human existence.” The proper approach to answering these deep questions is the application of the methods of science, including archaeology, neuroscience and evolutionary biology. Also, we should study insects.

Insects? Wilson, now 82 and an emeritus professor in the department of organismic and evolutionary biology at Harvard, has long been a leading scholar on ants, having won one of his two Pulitzer Prizes for the 1990 book on the topic that he wrote with Bert Hölldobler. But he is better known for his work on humans. His “Sociobiology: The New Synthesis,” a landmark attempt to use evolutionary theory to explain human behavior, was published in 1975. Those were strange times, and Wilson was smeared as a racist and fascist, attacked by some of his Harvard colleagues and doused with water at the podium of a major scientific conference. But Wilson’s days as a pariah are long over. An evolutionary approach to psychology is now mainstream, and Wilson is broadly respected for his scientific accomplishments, his environmental activism, and the scope and productivity of his work, which includes an autobiography and a best-selling novel, “Anthill.”

In “The Social Conquest of Earth,” he explores the strange kinship between humans and some insects. Wilson calculates that one can stack up log-style all humans alive today into a cube that’s about a mile on each side, easily hidden in the Grand Canyon. And all the ants on earth would fit into a cube of similar size. More important, humans and certain insects are the planet’s “eusocial” species — the only species that form communities that contain multiple generations and where, as part of a division of labor, community members sometimes perform altruistic acts for the benefit of others.

Excerpt from an article written by PAUL BLOOM, NYT. Continue HERE
Image above: Edward O. Wilson holds a jar of ant specimens from a dig in Puerto Rico.

Human-ities · Performativity · Science · Sonic/Musical

Did Humans Invent Music?

Did Neanderthals sing? Is there a “music gene”? Two scientists debate whether our capacity to make and enjoy songs comes from biological evolution or from the advent of civilization.

Music is everywhere, but it remains an evolutionary enigma. In recent years, archaeologists have dug up prehistoric instruments, neuroscientists have uncovered brain areas that are involved in improvisation, and geneticists have identified genes that might help in the learning of music. Yet basic questions persist: Is music a deep biological adaptation in its own right, or is it a cultural invention based mostly on our other capacities for language, learning, and emotion? And if music is an adaptation, did it really evolve to promote mating success as Darwin thought, or for other benefits such as group cooperation or mother-infant bonding?

Excerpt of an article written by Gary Marcus and Geoffrey Miller, at The Atlantic. Continue HERE

Image above: A Neanderthal instrument: a 40,000-year-old flute from Divje Babe, Slovenia. Via Glen Morton.

Human-ities · Science · Videos

James Fallon: Confessions of a Pro-Social Psychopath

Neuroscientist James Fallon is a self-styled “hobbit scientist.” The rules are simple: Don’t talk to the press and don’t go out of your area of expertise. But when a fascinating new brain scanner enters the lab, Fallon can’t resist. He ends up breaking both rules, and learns a lot more about himself than he bargained for.

Bio · Science · Sonic/Musical

Attention tunes the mind’s ear. Brain activity shows how one voice pattern stands out from the crowd

The brain’s power to focus can make a single voice seem like the only sound in a room full of chatter, a new study shows. The results help explain how people can pick out a speaker from a jumbled stream of incoming sounds.

A deeper understanding of this feat could help scientists better treat people who can’t sort out sound signals effectively, an ability that can decline with age.

“I think this is a truly outstanding study, which has deep implications for the way we think about the auditory brain,” says auditory neuroscientist Christophe Micheyl of the University of Minnesota, who was not involved in the new research.

For the project, engineer Nima Mesgarani and neurosurgeon Edward Chang, both of the University of California, San Francisco, studied what happens in the brains of people who are trying to follow one of two talkers, a scenario known to scientists as the cocktail party problem.

Excerpt from an article written by Laura Sanders, Science News. Continue HERE

Book-Text-Read-Zines · Human-ities · Philosophy · Theory

Is Free Will an Illusion?

Free will has long been a fraught concept among philosophers and theologians. Now neuroscience is entering the fray.

For centuries, the idea that we are the authors of our own actions, beliefs, and desires has remained central to our sense of self. We choose whom to love, what thoughts to think, which impulses to resist. Or do we?

Neuroscience suggests something else. We are biochemical puppets, swayed by forces beyond our conscious control. So says Sam Harris, author of the new book, Free Will (Simon & Schuster), a broadside against the notion that we are in control of our own thoughts and actions. Harris’s polemic arrives on the heels of Michael S. Gazzaniga’s Who’s In Charge? Free Will and the Science of the Brain (HarperCollins), and David Eagleman’s Incognito: The Secret Lives of the Brain (Pantheon), both provocative forays into a debate that has in recent months spilled out onto op-ed and magazine pages, and countless blogs.

What’s at stake? Just about everything: morality, law, religion, our understanding of accountability and personal accomplishment, even what it means to be human. Harris predicts that a declaration by the scientific community that free will is an illusion would set off “a culture war far more belligerent than the one that has been waged on the subject of evolution.”

The Chronicle Review brought together some key thinkers to discuss what science can and cannot tell us about free will, and where our conclusions might take us.

You Don’t Have Free Will
Jerry A. Coyne

The Case Against the Case Against Free Will
Alfred R. Mele

Free Will Is an Illusion, but You’re Still Responsible for Your Actions
Michael S. Gazzaniga

Want to Understand Free Will? Don’t Look to Neuroscience
Hilary Bok

The End of (Discussing) Free Will
Owen D. Jones

Free Will Does Not Exist. So What?
Paul Bloom

Text by The Chronicle Review
Image via Wired’s earlier article on Free Will

Human-ities · Theory

Freud’s Radical Talking

Death is supposed to be an event proclaimed but once, and yet some deaths, curiously enough, need to be affirmed again and again, as if there were a risk that the interred will crawl back up into the world of the living if fresh handfuls of dirt are not tossed on their graves. One such member of the living dead, accompanying the likes of God and Karl Marx, is Sigmund Freud. How often one hears, thanks recently to the fetishization of neuroscience, that psychoanalysis is now bunk, irrelevant, its method decadent and “dangerous,” as the recent David Cronenberg film, “A Dangerous Method,” informs us.

Over the years, the “talking cure” — so dubbed by Bertha Pappenheim, a.k.a. “Anna O.,” who became Freud’s first psychoanalytic case study — has received quite a bit of ridicule and reworking. With countless children and adults taking behavior-altering drugs, many are again tolling the bell for psychoanalysis. Who wouldn’t choose fast-acting pills over many years on the couch, health insurance companies most of all? Perhaps, after surviving scandal, revision and pop characterization, drugs and money will definitively put old Sigmund to rest.

If psychoanalysis were simply a way of “curing” certain individuals of socially unwanted behavior, then I would have no problem with its disappearance. Similarly, if psychoanalysis were just a way for wealthy individuals to talk to one another about their lackluster existences, it might as well continue on its way to the dustbin of history. And if, God forbid, psychoanalysis has devolved into just another addition to the theory toolkit of academics in the humanities, someone ought to put it out of its misery now.

Excerpt of an article by BENJAMIN Y. FONG at NYT. Continue HERE

Bio · Human-ities · Science · Technology · Vital-Edible-Health

Nuffield Council on Bioethics: Exploring the Impact of ‘Novel Neurotechnologies’

The Nuffield Council on Bioethics is an independent body that examines and reports on ethical issues in biology and medicine. It was established by the Trustees of the Nuffield Foundation in 1991, and since 1994 it has been funded jointly by the Foundation, the Wellcome Trust and the Medical Research Council.

The Council has achieved an international reputation for advising policy makers and stimulating debate in bioethics.

To launch a Nuffield Council on Bioethics consultation, Professor Tom Baldwin, chair of the inquiry, outlines the ethical issues raised by novel neurotechnologies that intervene in the brain. Dr Alena Buyx, Assistant Director of the Council, goes on to describe neurostimulation and neural stem cell therapy in more detail, and Professor Kevin Warwick, a member of the Working Party, discusses some of his work around brain-computer interfaces.

Terms of reference

The Council’s terms of reference require it:

1. To identify and define ethical questions raised by recent advances in biological and medical research in order to respond to, and to anticipate, public concern;

2. To make arrangements for examining and reporting on such questions with a view to promoting public understanding and discussion; this may lead, where needed, to the formulation of new guidelines by the appropriate regulatory or other body;

3. In the light of the outcome of its work, to publish reports; and to make representations, as the Council may judge appropriate.

All text via The Nuffield Council on Bioethics

Human-ities · Science


V.S. RAMACHANDRAN: I’m interested in all aspects of the human mind, including aspects of the mind that have been regarded as ineffable or mysterious. The way I approach these problems is to look at patients who have sustained injury to a small region in the brain, a discipline called Behavioral Neurology or Cognitive Neuroscience these days.

Let me tell you about the problem confronting us. The brain is a 1.5 kilogram mass of jelly, the consistency of tofu, you can hold it in the palm of your hand, yet it can contemplate the vastness of space and time, the meaning of infinity and the meaning of existence. It can ask questions about who am I, where do I come from, questions about love and beauty, aesthetics, and art, and all these questions arising from this lump of jelly. It is truly the greatest of mysteries. The question is how does it come about?

When you look at the structure of the brain it’s made up of neurons. Of course, everybody knows that these days. There are 100 billion of these nerve cells. Each of these cells makes about 1,000 to 10,000 contacts with other neurons. From this information people have calculated that the number of possible brain states, of permutations and combinations of brain activity, exceeds the number of elementary particles in the universe.
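Ramachandran’s comparison is easy to sanity-check with a few lines of arithmetic. The sketch below is a deliberately crude back-of-envelope model (not his actual calculation): it assumes each synaptic contact is a simple on/off switch, which real, graded synapses are not, and it uses the commonly quoted rough figure of about 10^80 elementary particles in the observable universe.

```python
# Back-of-envelope check: do possible brain states outnumber the
# elementary particles in the observable universe? Figures are the
# rough ones quoted in the talk.
import math

neurons = 100e9                  # ~100 billion nerve cells
synapses_per_neuron = 1_000      # lower end of the 1,000-10,000 range
synapses = neurons * synapses_per_neuron   # ~10^14 contacts

# Crudest possible model: each contact is either "active" or "silent",
# giving 2**synapses configurations. The numbers are far too large to
# represent directly, so compare magnitudes in log10 space.
log10_brain_states = synapses * math.log10(2)   # ~3.0e13
log10_particles_in_universe = 80                # common rough estimate

print(f"brain states ~ 10^{log10_brain_states:.3g}")
print(f"particles    ~ 10^{log10_particles_in_universe}")
print(log10_brain_states > log10_particles_in_universe)
```

Even with this minimal model, and taking the low end of the connectivity range, the exponent on the number of brain states is around 30 trillion, against 80 for the particle count, which is the sense in which the permutations "exceed the number of elementary particles in the universe."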

Continue at EDGE

V.S. RAMACHANDRAN, M.D., PH.D., Is Director of the Center for Brain and Cognition and distinguished professor with the Psychology Department and the Neurosciences Program at the University of California, San Diego, and Adjunct Professor of Biology at the Salk Institute. Ramachandran’s early research was on visual perception but he is best known for his work in Neurology. His most recent book is The Tell-Tale Brain.

Human-ities · Public Space · Social/Politics · Theory

Economics and the Brain: How People Really Make Decisions in Turbulent Times

In a 2008 paper on neuroeconomics, Stanford economist George Loewenstein said: “Whereas psychologists tend to view humans as fallible and sometimes even self-destructive, economists tend to view people as efficient maximisers of self-interest who make mistakes only when imperfectly informed about the consequences of their actions.”

This view of humans as completely rational – and the market as eminently efficient – is relatively recent. In 1922, in the Journal of Political Economy, Rexford G. Tugwell said (to paraphrase) that a mind evolved to function best in “the exhilarations and the fatigues of the hunt, the primitive warfare and in the precarious life of nomadism”, had been strangely and quickly transported into a different milieu, without much time to modify the equipment of the old life.

The field of economics has since rejected this more pragmatic (and I would argue, realistic) view of human behavior, in favor of the simpler and neater “rational choice” perspective, which viewed the power of reflection as the only force driving human behavior.

But to paraphrase sociologist Zygmunt Bauman, our currently held views of what is reasonable, sensible and good sense tend to take shape in response to the realities “out there” as seen through the prism of human practice – what humans currently do, know how to do, are trained, groomed and inclined to do.

We compare ourselves to people we know and come into contact with – either through social groups or, lately, with the advent of mass and even fragmented media, people we think are like us.

Excerpt of an article written by Paul Harrison, The Conversation. Continue HERE

Art/Aesthetics · Education · Events · Performativity · Science · Vital-Edible-Health

Individual ecstasies: the revelatory experience conference

On March 23rd London will host a unique conference on the neuroscience, psychiatry and interpretation of revelatory visionary experiences.

Mental health professionals frequently encounter people who report experiences of God or supernatural beings speaking or acting through them to reveal important truths. In some cases it is difficult to know to what extent such experiences are best explained as ‘illness’, or represent experiences which are accepted and valued within a person’s religious or cultural context. Indeed, revelatory experiences form a key part of the formation and development of major world religions through figures such as prophets, visionaries, and yogins, as well as in the religious practice of shamans and others in traditional smaller scale societies. Why are revelatory experiences and related altered states of consciousness so common across cultures and history? What neural and other processes cause them? When should they be thought of as due to mental illness, as opposed to culturally accepted religious experience? And what value should or can be placed upon them? In this one day conference leading scholars from neuroscience, psychiatry, theology and religious studies, history and anthropology gather to present recent findings, and debate with each other and the audience about these fundamental aspects of human experience.

Who should attend: This one day interdisciplinary conference will be useful to academic psychologists, neuroscientists and humanities scholars interested in understanding the possibilities for interdisciplinary understanding of complex human behavior; as well as psychiatrists, clinical psychologists, nurses and any professionals whose work requires them to make sense of the relations between culture, religion, and mental health.

Confirmed Speakers

Dr Quinton Deeley, Senior Lecturer, Institute of Psychiatry, King’s College London, and Honorary Consultant Psychiatrist, SLAM

Professor Stephen Pattison, Professor of Religion, Ethics, and Practice at the University of Birmingham

Dr Mitul Mehta, Senior Lecturer, Centre for Neuroimaging Sciences, Institute of Psychiatry, King’s College London

Dr Eamonn Walsh, Postdoctoral Researcher, Institute of Psychiatry, King’s College London

Professor Chris Rowland, Dean Ireland’s Professor of the Exegesis of Holy Scripture, University of Oxford

Professor Roland Littlewood, Professor of Anthropology and Psychiatry at University College London

Professor David Oakley, University College London

The Very Rev Dr Jane Shaw, Dean of Grace Cathedral, San Francisco

Dr Piers Vitebsky, Head of Anthropology and Russian Arctic Studies, Scott Polar Research Institute, University of Cambridge

Professor Geoffrey Samuel, Religious Studies, University of Cardiff

Click HERE for more info. Image source

Human-ities · Science

Accept Defeat: The Neuroscience of Screwing Up

It all started with the sound of static. In May 1964, two astronomers at Bell Labs, Arno Penzias and Robert Wilson, were using a radio telescope in suburban New Jersey to search the far reaches of space. Their aim was to make a detailed survey of radiation in the Milky Way, which would allow them to map those vast tracts of the universe devoid of bright stars. This meant that Penzias and Wilson needed a receiver that was exquisitely sensitive, able to eavesdrop on all the emptiness. And so they had retrofitted an old radio telescope, installing amplifiers and a calibration system to make the signals coming from space just a little bit louder.

But they made the scope too sensitive. Whenever Penzias and Wilson aimed their dish at the sky, they picked up a persistent background noise, a static that interfered with all of their observations. It was an incredibly annoying technical problem, like listening to a radio station that keeps cutting out.

At first, they assumed the noise was man-made, an emanation from nearby New York City. But when they pointed their telescope straight at Manhattan, the static didn’t increase. Another possibility was that the sound was due to fallout from recent nuclear bomb tests in the upper atmosphere. But that didn’t make sense either, since the level of interference remained constant, even as the fallout dissipated. And then there were the pigeons: A pair of birds were roosting in the narrow part of the receiver, leaving a trail of what they later described as “white dielectric material.” The scientists evicted the pigeons and scrubbed away their mess, but the static remained, as loud as ever.

For the next year, Penzias and Wilson tried to ignore the noise, concentrating on observations that didn’t require cosmic silence or perfect precision. They put aluminum tape over the metal joints, kept the receiver as clean as possible, and hoped that a shift in the weather might clear up the interference. They waited for the seasons to change, and then change again, but the noise always remained, making it impossible to find the faint radio echoes they were looking for. Their telescope was a failure.

Written by Jonah Lehrer, WIRED. Continue HERE

Image: Cristiana Couceiro; Scientist: Getty Images

Education · Human-ities · Science · Theory

What Mind, Brain, and Education (MBE) Can Do for Teaching

Discipline and sub-disciplines in Mind, Brain, and Education Science. Source: Bramwell for Tokuhama-Espinosa

Evidence-Based Solutions for the Classroom

How do we learn best? What is individual human potential? How do we ensure that children live up to their promise as learners? These questions and others have been posed by philosophers as well as neuroscientists, psychologists, and educators for as long as humans have pondered their own existence. Because MBE science moves educators closer to the answers than at any other time in history, it benefits teachers in their efficacy and learners in their ultimate success.

Great teachers have always “sensed” why their methods worked; thanks to brain imaging technology, it is now possible to substantiate many of these hunches with empirical scientific research. For example, good teachers may suspect that if they give their students just a little more time than normal to respond to questions when called upon, they might get better-quality answers. Since 1972 there has been empirical evidence that if teachers give students several seconds to reply to questions posed in class, rather than the normal single second, the probability of a quality reply increases.[1] Information about student response time is shared in some teacher training schools, but not all. Standards in MBE science ensure that information about the brain’s attention span and need for reflection time is included in teacher training, for example.

The basic premise behind the use of standards in MBE science is that fundamental skills, such as reading and math, are extremely complex and require a variety of neural pathways and mental systems to work correctly. MBE science helps teachers understand why there are so many ways that things can go wrong, and it identifies the many ways to maximize the potential of all learners. This type of knowledge keeps educators from flippantly generalizing, “He has a problem with math,” and rather encourages them to decipher the true roots (e.g., number recognition, quantitative processing, formula structures, or some sub-skill in math). MBE science standards make teaching methods and diagnoses more precise. Through MBE, teachers have better diagnostic tools to help them more accurately understand their students’ strengths and weaknesses. These standards also prevent teachers from latching onto unsubstantiated claims and “neuromyths” and give them better tools for judging the quality of the information. Each individual is unique, though human patterns for the development of different skill sets, such as walking and talking, doing math or learning to read, do exist. One of the most satisfying elements of MBE science is having the tools to maximize the potential of each individual as he or she learns new skills.

The above is an excerpt from Mind, Brain, and Education Science: A Comprehensive Guide to the New Brain-Based Teaching (W.W. Norton), a book based on over 4,500 studies and with contributions from the world’s leaders in MBE science. Continue HERE

Human-ities · Performativity · Science · Technology · Vital-Edible-Health

Deep-Brain Stimulation Found to Fix Depression Long-Term

Deep depression that fails to respond to any other form of therapy can be moderated or reversed by stimulation of areas deep inside the brain. Now the first placebo-controlled study of this procedure shows that these responses can be maintained in the long term.

Neurologist Helen Mayberg at Emory University in Atlanta, Georgia, followed ten patients with major depressive disorder and seven with bipolar disorder, or manic depression, after an electrode device was implanted in the subcallosal cingulate white matter of their brains and the area continuously stimulated.

All but one of twelve patients who reached the two-year point in the study had completely shed their depression or had only mild symptoms.

For psychiatrists accustomed to seeing severely depressed patients fail to respond—or fail to maintain a response—to antidepressant or cognitive therapy, these results seem near miraculous.

Text by Alison Abbott and Nature magazine. Continue HERE

Book-Text-Read-Zines · Human-ities · Science · Theory

The Optimism Bias and our Brains may be Hardwired to Look on the Bright Side

The Optimism Bias: Why we’re wired to look on the bright side by Tali Sharot

Tali Sharot: We like to think of ourselves as rational creatures. We watch our backs, weigh the odds, pack an umbrella. But both neuroscience and social science suggest that we are more optimistic than realistic. On average, we expect things to turn out better than they wind up being. People hugely underestimate their chances of getting divorced, losing their job or being diagnosed with cancer; expect their children to be extraordinarily gifted; envision themselves achieving more than their peers; and overestimate their likely life span (sometimes by 20 years or more).

The belief that the future will be much better than the past and present is known as the optimism bias. It abides in every race, region and socioeconomic bracket. Schoolchildren playing when-I-grow-up are rampant optimists, but so are grown-ups: a 2005 study found that adults over 60 are just as likely to see the glass half full as young adults.

You might expect optimism to erode under the tide of news about violent conflicts, high unemployment, tornadoes and floods and all the threats and failures that shape human life. Collectively we can grow pessimistic – about the direction of our country or the ability of our leaders to improve education and reduce crime. But private optimism, about our personal future, remains incredibly resilient. A survey conducted in 2007 found that while 70% thought families in general were less successful than in their parents’ day, 76% of respondents were optimistic about the future of their own family. Continue HERE

Book-Text-Read-Zines · Human-ities · Science

BRAINTRUST: What Neuroscience Tells Us about Morality

BRAINTRUST: What Neuroscience Tells Us about Morality. Patricia S. Churchland. xii + 273 pp. Princeton University Press, 2011.

Robert J. Richards at American Scientist: In Braintrust, Patricia Churchland, a philosopher at the University of California at San Diego, seems intent on advancing a project comparable to Darwin’s through the application of the most recent science, as the subtitle of her book suggests: What Neuroscience Tells Us about Morality. Readers may, however, decide instead to stick with that old-time evolution.

Churchland does not think that moral behavior can be reduced to any special kind of activity, as Darwin believed; rather, in her view, the term “moral” hovers over a variety of social behaviors, behaviors that might attract the same term but vary considerably across different cultures and individuals. Such behaviors, she argues, are not usually governed or motivated by explicit rules but are constituted by habits and emotionally guided decisions. She seeks to understand those habits and emotionally fed values as consequences of our neurobiology. She thus undertakes in several chapters to lay out the terrain of the brain, its regions and functions, and the kinds of hormones important for fertilizing the flowering of social relationships.

Churchland investigates other neurological features that might plausibly be offered as part of the scaffolding of moral behavior. She considers, for example, the possibility that there is an innate and heritable impulse to behave morally (Darwin’s view) and the hypothesis that moral behavior is grounded in mirror neurons, so that we might effortlessly imitate empathetic behaviors. Churchland chips away at these as possible neural structures for moral behavior. For instance, she attempts to undermine the concept of innate behavior generally by requiring a specification of the relevant genes and their relation to brain circuitry—a criterion beyond reach even for highly heritable traits, such as height. Indeed, by that criterion Darwin’s general theory of heritable adaptations, for which he had no reliable genetic foundation, would be but a passing fancy for the delectation of Intelligent Designers. Continue review HERE

Bio · Digital Media · Human-ities · Science · Technology

The End of Free Will

Functional imaging package.

Megan Erickson: The field of neuroscience has evolved so rapidly in the past twenty years that it will pose unprecedented challenges to the legal system in the decades to come, changing the way we understand crime and punishment, says neuro-pioneer Joy Hirsch, director of the Functional Magnetic Resonance Imaging Center at Columbia.

Functional imaging, for instance, has given scientists the ability to identify which specific areas of the brain are active during specific tasks. It’s a development that Hirsch compares to manna from heaven.

“I was at Kettering in 1991, when the blood oxygen level dependent signal – the primary signal of functional imaging – was discovered,” she says. “I had a feeling that this was going to change the course of neuroscience, because if that signal was real then it meant that we would actually be able to observe, physiologically, the function of the brain that we had made inferences about from more or less the black box system of study.”

By 2005, a technique utilizing this knowledge had been adopted by the AMA, resulting in widespread use in research and community hospitals across the country. Over the course of about five years, the way surgeons plan and execute operations was entirely revised.

Now, imaging technology creates a map of the patient’s brain, allowing his or her surgeon to pinpoint the areas most vital to the performance of tasks like memory storage and sight in that individual patient. Before operating, a surgeon knows exactly where to cut and what to avoid.

“It’s [one] example of [an application] that has gone all the way from the bench stage, the place where the science actually happened, to the bed stage, where patients actually benefit from the new procedure,” says Hirsch. “We’ve begun to tap in to the dynamics of the language of the brain as opposed to just understanding specific areas.” Continue HERE