For most of recorded history, human beings situated the mind — and by extension the soul — not within the brain but within the heart. When preparing mummies for the afterlife, for instance, ancient Egyptian priests removed the heart in one piece and preserved it in a ceremonial jar; in contrast, they scraped out the brain through the nostrils with iron hooks, tossed it aside for animals, and filled the empty skull with sawdust or resin. (This wasn’t a snarky commentary on their politicians, either—they considered everyone’s brain useless.) Most Greek thinkers also elevated the heart to the body’s summa. Aristotle pointed out that the heart had thick vessels to shunt messages around, whereas the brain had wispy, effete wires. The heart furthermore sat in the body’s center, appropriate for a commander, while the brain sat in exile up top. The heart developed first in embryos, and it responded in sync with our emotions, pounding faster or slower, while the brain just sort of sat there. Ergo, the heart must house our highest faculties.
Meanwhile, though, some physicians had always had a different perspective on where the mind came from. They’d simply seen too many patients get beaned in the head and lose some higher faculty to think it all a coincidence. Doctors therefore began to promote a brain-centric view of human nature. And despite some heated debates over the centuries—especially about whether the brain had specialized regions or not—by the 1600s most learned men had enthroned the mind within the brain. A few brave scientists even began to search for that anatomical El Dorado: the exact seat of the soul within the brain.
Read full article written by Sam Kean at SALON.
Image above: Eugene Thirion’s “Jeanne d’Arc” (1876)
A few years ago, cognitive scientist Duje Tadin and his colleague Randolph Blake decided to test blindfolds for an experiment they were cooking up.
They wanted an industrial-strength blindfold to make sure volunteers for their work wouldn’t be able to see a thing. “We basically got the best blindfold you can get,” Tadin tells Shots. “It’s made of black plastic, and it should block all light.”
Tadin and Blake pulled one on just to be sure and waved their hands in front of their eyes. They didn’t expect to be able to see, yet both of them felt as if they could make out the shadowy outlines of their arms moving.
Being scientists, they wondered what was behind the spooky phenomenon. “We knew there wasn’t any visual input there,” Tadin says. They figured their minds were instinctively filling in images where there weren’t any.
After conducting several experiments involving computerized eye trackers, they proved themselves right. Between 50 and 75 percent of the participants in their studies showed an eerie ability to “see” their own bodies moving in total darkness. The research, put together by scientists at the University of Rochester and Vanderbilt University, is published in the journal Psychological Science.
How were they so sure? “The only way you can produce smooth eye movements is if you’re following a target,” Tadin tells Shots. When our eyes aren’t tracking something very specific, they tend to jerk around randomly. “If you just try to make your eyes move smoothly, you can’t do it.” The researchers used this knowledge to test whether people could really distinguish their hand movements in the dark.
Text and Image via Neuromorphogenesis
Marian Diamond began her graduate work in 1948 and was the first female student in the department of anatomy at UC Berkeley. The first thing she was asked to do when she got there was sew a cover for a large magnifying machine (?!).
“They didn’t know what to do with me because they weren’t used to having a woman. They thought I was there to get a husband. I was there to learn.”
Such challenges were not uncommon. Years later she requested tissue samples of Albert Einstein’s brain from a pathologist in Missouri. He didn’t trust her.
“He wasn’t sure that I was a scientist. This is one thing that you have to face being a woman. He didn’t think that I should be the one to be looking at Einstein’s brain.”
Marian persisted for three years, calling him once every six months, and eventually received four blocks of the physicist’s brain tissue (each about the size of a sugar cube).
Her research found that Einstein’s brain had twice as many glial cells as the brains of average males — a discovery that caused an international sensation as well as scientific criticism.
What are glial cells? Scientists previously believed that neurons were responsible for thinking and that glial cells were merely support cells in the brain. Researchers now believe that glial cells play a critical role in brain development, learning, memory, aging and disease.
All text and Images via UC Research
Scientists have grown miniature human brains in test tubes, creating a “tool” that will allow them to watch how the organs develop in the womb and, they hope, increase their understanding of neurological and mental problems.
Just a few millimetres across, the “cerebral organoids” are built up of layers of brain cells with defined regions that resemble those seen in immature, embryonic brains.
The scientists say the organoids will be useful for biologists who want to analyse how conditions such as schizophrenia or autism occur in the brain. Though these are usually diagnosed in older people, some of the underlying defects occur during the brain’s early development.
Human brain ‘organoid’ grown from human pluripotent stem cells. This is a cross-section of the entire organoid showing development of different brain regions. All cells are in blue, neural stem cells in red, and neurons in green. Photograph: Madeline A Lancaster.
The organoids are also expected to be useful in the development and testing of drugs. At present this is done using laboratory animals or isolated human cells; the new organoids could allow pharmacologists to test drugs in more human-like settings.
Scientists have previously made models of other human organs in the lab, including eyes, pituitary glands and livers.
In the latest work researchers at the Institute of Molecular Biotechnology in Vienna started with stem cells and grew them into brain cells in a nourishing gel-like matrix that recreated conditions similar to those inside the human womb. After several months the cells had formed spheres measuring about 3-4mm in diameter.
Text by Alok Jha, science correspondent at The Guardian. Continue article THERE
These are stimulating times for anyone interested in questions of animal consciousness. On what seems like a monthly basis, scientific teams announce the results of new experiments, adding to a preponderance of evidence that we’ve been underestimating animal minds, even those of us who have rated them fairly highly. New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hidey-holes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time, neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed, they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” The declaration goes further, concluding that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”
Excerpt from an essay written by John Jeremiah Sullivan at Lapham’s Quarterly. Continue HERE
The Cambridge Declaration on Consciousness in Non-Human Animals
BAD NEWS SELLS. If it bleeds, it leads. No news is good news, and good news is no news.
Those are the classic rules for the evening broadcasts and the morning papers, based partly on data (ratings and circulation) and partly on the gut instincts of producers and editors. Wars, earthquakes, plagues, floods, fires, sick children, murdered spouses — the more suffering and mayhem, the more coverage.
But now that information is being spread and monitored in different ways, researchers are discovering new rules. By scanning people’s brains and tracking their e-mails and online posts, neuroscientists and psychologists have found that good news can spread faster and farther than disasters and sob stories.
“The ‘if it bleeds’ rule works for mass media that just want you to tune in,” says Jonah Berger, a social psychologist at the University of Pennsylvania. “They want your eyeballs and don’t care how you’re feeling. But when you share a story with your friends and peers, you care a lot more how they react. You don’t want them to think of you as a Debbie Downer.”
Excerpt from an article written by JOHN TIERNEY, at the NYT. Continue THERE
Daniel Levitin: Tom was one of those people we all have in our lives — someone to go out to lunch with in a large group, but not someone I ever spent time with one-on-one. We had some classes together in college and even worked in the same cognitive psychology lab for a while. But I didn’t really know him. Even so, when I heard that he had brain cancer that would kill him in four months, it stopped me cold.
I was 19 when I first saw him — in a class taught by a famous neuropsychologist, Karl Pribram. I’d see Tom at the coffee house, the library, and around campus. He seemed perennially enthusiastic, and had an exaggerated way of moving that made him seem unusually focused. I found it uncomfortable to make eye contact with him, not because he seemed threatening, but because his gaze was so intense.
Excerpt from an article written by Daniel Levitin at The Atlantic. Read it HERE