Human-ities · Science

Evidence Rebuts Chomsky’s Theory of Language Learning


The idea that we have brains hardwired with a mental template for learning grammar—famously espoused by Noam Chomsky of the Massachusetts Institute of Technology—has dominated linguistics for almost half a century. Recently, though, cognitive scientists and linguists have abandoned Chomsky’s “universal grammar” theory in droves because of new research examining many different languages—and the way young children learn to understand and speak the tongues of their communities. That work fails to support Chomsky’s assertions.

The research suggests a radically different view, in which learning of a child’s first language does not rely on an innate grammar module. Instead the new research shows that young children use various types of thinking that may not be specific to language at all—such as the ability to classify the world into categories (people or objects, for instance) and to understand the relations among things. These capabilities, coupled with a unique human ability to grasp what others intend to communicate, allow language to happen. The new findings indicate that if researchers truly want to understand how children, and others, learn languages, they need to look outside of Chomsky’s theory for guidance.

Read HERE

Bio · Technology · Vital-Edible-Health

Nanoparticles may harm the brain

A simple change in electric charge may make the difference between someone getting the medicine they need and a trip to the emergency room—at least if the findings of a new study bear out. Researchers investigating the toxicity of particles designed to ferry drugs inside the body have found that carriers with a positive charge on their surface appear to cause damage if they reach the brain.

These particles, called micelles, are one type of a class of materials known as nanoparticles. By varying properties such as charge, composition, and attached surface molecules, researchers can design nanoparticles to deliver medicine to specific body regions and cell types—and even to carry medicine into cells. This ability allows drugs to directly target locations they would otherwise be unable to, such as the heart of tumors. Researchers are also looking at nanoparticles as a way to transport drugs across the blood-brain barrier, a wall of tightly connected cells that keeps most medication out of the brain. Just how safe nanoparticles in the brain are, however, remains unclear.

So Kristina Bram Knudsen, a toxicologist at the National Research Centre for the Working Environment in Copenhagen, and colleagues tested two types of micelles, which were made from different polymers that gave the micelles either a positive or negative surface charge. They injected both versions, empty of drugs, into the brains of rats, and 1 week later they checked for damage. Three out of the five rats injected with the positively charged micelles developed brain lesions. The rats injected with the negatively charged micelles or a saline control solution did not suffer any observable harm from the injections, the team will report in an upcoming issue of Nanotoxicology.

Knudsen speculates that one of the attributes that makes positive micelles and similar nanoparticles such powerful drug delivery systems may also be what is causing the brain damage. Because cells have a negative charge on their outside, they attract positively charged micelles and bring them into the cell. The micelles’ presence in the cell or alteration of the cell’s surface charge, she says, may disrupt the cell’s normal functioning.

Negatively charged nanoparticles can also enter cells, according to other research. However, they do so less readily and must be able to overcome the repulsion between themselves and the cell surface. It is possible that the reason the negatively charged micelles were not found to be toxic was that they did not invade cells to the same extent as the positively charged micelles.

The findings are intriguing, says biomedical engineer Jordan Green of Johns Hopkins University in Baltimore, Maryland. But he cautions that there is no evidence that all positively charged nanoparticles behave this way. Other factors can also play a role in the toxicity of nanoparticles, adds pharmaceutical expert Jian-Qing Gao of Zhejiang University in Hangzhou, China. The size and concentration of the particles, as well as the strain of rat used, could all have influenced the results, he says.

Text and Image via ScienceMag

Human-ities · Science

How the brain creates visions of God

For most of recorded history, human beings situated the mind — and by extension the soul — not within the brain but within the heart. When preparing mummies for the afterlife, for instance, ancient Egyptian priests removed the heart in one piece and preserved it in a ceremonial jar; in contrast, they scraped out the brain through the nostrils with iron hooks, tossed it aside for animals, and filled the empty skull with sawdust or resin. (This wasn’t a snarky commentary on their politicians, either—they considered everyone’s brain useless.) Most Greek thinkers also elevated the heart to the body’s summa. Aristotle pointed out that the heart had thick vessels to shunt messages around, whereas the brain had wispy, effete wires. The heart furthermore sat in the body’s center, appropriate for a commander, while the brain sat in exile up top. The heart developed first in embryos, and it responded in sync with our emotions, pounding faster or slower, while the brain just sort of sat there. Ergo, the heart must house our highest faculties.

Meanwhile, though, some physicians had always had a different perspective on where the mind came from. They’d simply seen too many patients get beaned in the head and lose some higher faculty to think it all a coincidence. Doctors therefore began to promote a brain-centric view of human nature. And despite some heated debates over the centuries—especially about whether the brain had specialized regions or not—by the 1600s most learned men had enthroned the mind within the brain. A few brave scientists even began to search for that anatomical El Dorado: the exact seat of the soul within the brain.

Read full article written by Sam Kean at SALON.
Image above: Eugene Thirion’s “Jeanne d’Arc” (1876)

Bio · Science · Technology

Seeing In The Pitch-Dark Is All In Your Head

A few years ago, cognitive scientist Duje Tadin and his colleague Randolph Blake decided to test blindfolds for an experiment they were cooking up.

They wanted an industrial-strength blindfold to make sure volunteers for their work wouldn’t be able to see a thing. “We basically got the best blindfold you can get,” Tadin tells Shots. “It’s made of black plastic, and it should block all light.”
Tadin and Blake pulled one on just to be sure and waved their hands in front of their eyes. They didn’t expect to be able to see, yet both of them felt as if they could make out the shadowy outlines of their arms moving.

Being scientists, they wondered what was behind the spooky phenomenon. “We knew there wasn’t any visual input there,” Tadin says. They figured their minds were instinctively filling in images where there weren’t any.

After conducting several experiments involving computerized eye trackers, they proved themselves right. Between 50 and 75 percent of the participants in their studies showed an eerie ability to “see” their own bodies moving in total darkness. The research, put together by scientists at the University of Rochester and Vanderbilt University, is published in the journal Psychological Science.

How were they so sure? “The only way you can produce smooth eye movements is if you’re following a target,” Tadin tells Shots. When our eyes aren’t tracking something very specific, they tend to jerk around randomly. “If you just try to make your eyes move smoothly, you can’t do it.” The researchers used this knowledge to test whether people could really distinguish their hand movements in the dark.
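The logic Tadin describes can be sketched as a toy classifier: eyes tracking a real target produce small, continuous frame-to-frame position changes, while untracked eyes make abrupt jumps. The sketch below is purely illustrative; the threshold and sample traces are invented, not taken from the study.

```python
# Toy version of the smoothness test described above: smooth pursuit
# yields small, consistent frame-to-frame position changes, while
# untracked (saccadic) movement yields occasional large jumps.
# The threshold and data are illustrative, not from the study.

def is_smooth_pursuit(positions, max_step=2.0):
    """True if no frame-to-frame step exceeds max_step (in degrees)."""
    steps = [abs(b - a) for a, b in zip(positions, positions[1:])]
    return all(s <= max_step for s in steps)

pursuit = [0.0, 0.5, 1.1, 1.6, 2.2, 2.8]   # steady drift: following a target
saccadic = [0.0, 0.1, 5.0, 5.2, 0.3, 0.4]  # abrupt jumps: no target followed

print(is_smooth_pursuit(pursuit))    # True
print(is_smooth_pursuit(saccadic))   # False
```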

Text and Image via Neuromorphogenesis

Bio · Sculpt/Install · Vital-Edible-Health

Don’t Forget the Brain Is as Complex as All the World’s Digital Data

Twenty years ago, sequencing the human genome was one of the most ambitious science projects ever attempted. Today, compared to the collection of genomes of the microorganisms living in our bodies, the ocean, the soil and elsewhere, each human genome, which easily fits on a DVD, is comparatively simple. Its 3 billion DNA base pairs and about 20,000 genes seem paltry next to the roughly 100 billion bases and millions of genes that make up the microbes found in the human body.
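The scale of that comparison can be made concrete with back-of-the-envelope arithmetic. The sketch below assumes the theoretical minimum of 2 bits per DNA base; real file formats are larger, so these are lower bounds, not actual file sizes.

```python
# Rough storage arithmetic for the genome comparison above.
# Assumes 2 bits per base (A, C, G, T) -- the theoretical minimum;
# real formats such as FASTA store considerably more per base.

def bases_to_bytes(n_bases, bits_per_base=2):
    """Minimum bytes needed to store n_bases of DNA."""
    return n_bases * bits_per_base / 8

human_genome = bases_to_bytes(3e9)    # ~0.75 GB: easily fits on a 4.7 GB DVD
microbial = bases_to_bytes(100e9)     # ~25 GB: already several DVDs' worth

print(f"Human genome: {human_genome / 1e9:.2f} GB")
print(f"Human-associated microbes: {microbial / 1e9:.1f} GB")
```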

And a host of other variables accompanies that microbial DNA, including the age and health status of the microbial host, when and where the sample was collected, and how it was collected and processed. Take the mouth, populated by hundreds of species of microbes, with as many as tens of thousands of organisms living on each tooth. Beyond the challenges of analyzing all of these, scientists need to figure out how to reliably and reproducibly characterize the environment where they collect the data.

“There are the clinical measurements that periodontists use to describe the gum pocket, chemical measurements, the composition of fluid in the pocket, immunological measures,” said David Relman, a physician and microbiologist at Stanford University who studies the human microbiome. “It gets complex really fast.”

Excerpt from an article by Emily Singer at Quanta. Continue THERE

Bio · Science

Einstein’s Brain (…and the neuroscientist who studied it)

Marian Diamond began her graduate work in 1948 and was the first female student in the department of anatomy at UC Berkeley. The first thing she was asked to do when she got there was sew a cover for a large magnifying machine (?!?!?!?!).

“They didn’t know what to do with me because they weren’t used to having a woman. They thought I was there to get a husband. I was there to learn.”

Such challenges were not uncommon. Years later she requested tissue samples of Albert Einstein’s brain from a pathologist in Missouri. He didn’t trust her.

“He wasn’t sure that I was a scientist. This is one thing that you have to face being a woman. He didn’t think that I should be the one to be looking at Einstein’s brain.”

Marian persisted for three years, calling him once every six months, and received four blocks of the physicist’s brain tissue (about the size of a sugar cube).

Her research found that Einstein had twice as many glial cells as normal males — the discovery caused an international sensation as well as scientific criticism.

What are glial cells? Previously, scientists believed that neurons were responsible for thinking and that glial cells were merely support cells in the brain. Researchers now believe that glial cells play a critical role in brain development, learning, memory, aging and disease.

All text and Images via UC Research

Bio · Science · Technology

Neurons Could Outlive the Bodies That Contain Them

Most of your body is younger than you are. The cells on the topmost layer of your skin are around two weeks old, and soon to die. Your oldest red blood cells are around four months old. Your liver’s cells will live for around 10 to 17 months before being replaced. All across your organs, cells are being produced and destroyed. They have an expiry date.

In your brain, it’s a different story. New neurons are made in just two parts of the brain—the hippocampus, involved in memory and navigation, and the olfactory bulb, involved in smell (and even then only until 18 months of age). Aside from that, your neurons are as old as you are and will last you for the rest of your life. They don’t divide, and there’s no turnover.

But do neurons have a maximum lifespan, just like skin, blood or liver cells? Yes, obviously, they die when you die, but what if you kept on living? That’s not a far-fetched question at a time when medical and technological advances promise to prolong our lives well past their usual boundaries. Would we reach a point when our neurons give up before our bodies do?

Image above: Stainless steel sculpture “Neuron” by Roxy Paine. Outside the Museum of Contemporary Art, Sydney.
Excerpt from an article written by Ed Yong at NATGEO. Continue THERE

Bio · Design · Science · Technology · Vital-Edible-Health

Miniature brains grown in test tubes – a new path for neuroscience?

Scientists have grown miniature human brains in test tubes, creating a “tool” that will allow them to watch how the organs develop in the womb and, they hope, increase their understanding of neurological and mental problems.

Just a few millimetres across, the “cerebral organoids” are built up of layers of brain cells with defined regions that resemble those seen in immature, embryonic brains.

The scientists say the organoids will be useful for biologists who want to analyse how conditions such as schizophrenia or autism occur in the brain. Though these are usually diagnosed in older people, some of the underlying defects occur during the brain’s early development.

Human brain ‘organoid’ grown from human pluripotent stem cells. This is a cross-section of the entire organoid showing development of different brain regions. All cells are in blue, neural stem cells in red, and neurons in green. Photograph: Madeline A Lancaster.

The organoids are also expected to be useful in the development and testing of drugs. At present this is done using laboratory animals or isolated human cells; the new organoids could allow pharmacologists to test drugs in more human-like settings.

Scientists have previously made models of other human organs in the lab, including eyes, pituitary glands and livers.

In the latest work researchers at the Institute of Molecular Biotechnology in Vienna started with stem cells and grew them into brain cells in a nourishing gel-like matrix that recreated conditions similar to those inside the human womb. After several months the cells had formed spheres measuring about 3-4mm in diameter.

Text by Alok Jha, science correspondent at The Guardian. Continue article THERE

Digital Media · Sonic/Musical · Technology

How Do Our Brains Process Music? by David Byrne

In an excerpt from his new book, David Byrne explains why sometimes, he prefers hearing nothing:

“I listen to music only at very specific times. When I go out to hear it live, most obviously. When I’m cooking or doing the dishes I put on music, and sometimes other people are present. When I’m jogging or cycling to and from work down New York’s West Side Highway bike path, or if I’m in a rented car on the rare occasions I have to drive somewhere, I listen alone. And when I’m writing and recording music, I listen to what I’m working on. But that’s it.

I find music somewhat intrusive in restaurants or bars. Maybe due to my involvement with it, I feel I have to either listen intently or tune it out. Mostly I tune it out; I often don’t even notice if a Talking Heads song is playing in most public places. Sadly, most music then becomes (for me) an annoying sonic layer that just adds to the background noise.

As music becomes less of a thing—a cylinder, a cassette, a disc—and more ephemeral, perhaps we will start to assign an increasing value to live performances again. After years of hoarding LPs and CDs, I have to admit I’m now getting rid of them. I occasionally pop a CD into a player, but I’ve pretty much completely converted to listening to MP3s either on my computer or, gulp, my phone! For me, music is becoming dematerialized, a state that is more truthful to its nature, I suspect. Technology has brought us full circle.”

Text and Image via the Smithsonian. Continue THERE

Science · Vital-Edible-Health

The Eye, the Brain, and What Happens During the Death Experience

The Scout motto is “be prepared,” but it’s hard to be prepared for death, be it our own or a loved one’s. Too much is unknown about what dying feels like or what, if anything, happens after you die to ever feel truly ready. However, we do know a bit about the process that occurs in the days and hours leading up to a natural death, and knowing what’s going on may be helpful in a loved one’s last moments.

During the dying process, the body’s systems shut down. The dying person has less energy and begins to sleep more and more. The body is conserving the little energy it has, and as a result, needs less nourishment and sustenance. In the days (or sometimes weeks) before death, people eat and drink less. They may lose all interest in food and drink, and you shouldn’t force them to eat. In fact, pushing food or drink on a dying person could cause him or her to choke — at this point, it has become difficult to swallow and the mouth is very dry.

As the person takes in less food and drink, he or she will urinate less frequently and have fewer bowel movements. The person may also experience loss of bladder and bowel control. People who are dying may become confused, agitated or restless, which could be a result of the brain receiving less oxygen. It can be disconcerting and painful to hear a loved one so confused in his or her last days.

The skin will also show the effects of slowing circulation and less oxygen — the extremities, and later, the entire body, may be cool to the touch and may turn blue or light gray. Some skin may exhibit signs of mottling, which is reddish-blue blotchiness. As the person gets closer to death, it will become harder and harder to breathe. Respiration will be noisy and irregular; it will sometimes seem as if the person can’t breathe at all. When there’s fluid in the lungs, it can cause a sound known as the death rattle. It may be possible to alleviate the gurgling and congestion by raising the person’s head. If the dying person is experiencing pain, he or she will usually be given medications to manage it.

When we’re watching someone die, we may have a preconceived notion of how the person should handle death emotionally and spiritually. It’s important to remember that every person experiences dying differently. Some people have the need to say goodbye or to hear from another person before death, some don’t. Some people prefer to partake in religious rites, while others may remain silent until the end and pass away when everyone has left the room. Doctors and other professionals who manage end-of-life care advise loved ones to take their cues from the dying and avoid projecting their own desires or needs onto the person. They also urge loved ones to continue speaking comfortingly to a dying person — hearing may be one of the last things to go.

Clinical death occurs when the person’s heartbeat, breathing and circulation stop. Four to six minutes later, biological death occurs. That’s when brain cells begin to die from lack of oxygen, and resuscitation is impossible.

All Text via How Stuff Works. See Experiencing Death video THERE


Bio · Science · Technology · Vital-Edible-Health

Russian billionaire reveals real-life ‘avatar’ plan – and says he will upload his brain to a hologram and become immortal by 2045

32-year-old Dmitry Itskov believes technology will allow him to live forever in a hologram body. His ‘2045 Initiative’ is described as the next step in evolution, and over 20,000 people have signed up on Facebook to follow its progress, with global conferences planned to explore the technology needed.

‘We are in the process of creating focus groups of experts,’ said Itskov. ‘Along with these teams, we will prepare goal statements and research program schedules.’ The foundation has already planned out its timeline for getting to a fully holographic human, and claims it will be ready to upload a mind into a computer by 2015, a timeline even Itskov admits is ‘optimistic’.

‘The four tracks and their suggested deadlines are optimistic but feasible,’ he said on the foundation’s site.
‘This is our program for the next 35 years, and we will do our best to complete it.’
The ultimate aim is for a hologram body.
‘The fourth development track seems the most futuristic one,’ said Itskov.
‘Its intent is to create a holographic body. Indeed, its creation is going to be the most complicated task, but at the same time could be the most thrilling problem in the whole of human evolution.’

Continue HERE

Bio · Science · Technology

How to Make an Implant that Improves the Brain

The abilities to learn, remember, evaluate, and decide are central to who we are and how we live. Damage to or dysfunction of the brain circuitry that supports these functions can be devastating, leading to Alzheimer’s, schizophrenia, PTSD, or many other disorders. Current treatments, which are drug-based or behavioral, have limited efficacy in treating these problems. There is a pressing need for something more effective.

One promising approach is to build an interactive device to help the brain learn, remember, evaluate, and decide. One might, for example, construct a system that would identify patterns of brain activity tied to particular experiences and then, when called upon, impose those patterns on the brain. Ted Berger, Sam Deadwyler, Robert Hampson, and colleagues have used this approach. They are able to identify and then impose, via electrical stimulation, specific patterns of brain activity that improve a rat’s performance in a memory task. They have also shown that in monkeys stimulation can help the animal perform a task where it must remember a particular item.

Their ability to improve performance is impressive. However, there are fundamental limitations to an approach where the desired neural pattern must be known and then imposed. The animals used in their studies were trained to do a single task for weeks or months and the stimulation was customized to produce the right outcome for that task. This is only feasible for a few well-learned experiences in a predictable and constrained environment.

Text (Loren M. Frank) and Image via MIT Technology Review. Continue HERE

Bio · Science · Technology · Vital-Edible-Health

Researchers Identify The Key to Aging In The Hypothalamus

An exciting new study published in the prestigious journal Nature shows for the first time that manipulation of a brain chemical in a single region influences lifespan.

The researchers at Albert Einstein College of Medicine measured the activity of a molecule called NF-κB in the brains of mice. Specifically, they looked at levels of NF-κB in an area of the brain called the hypothalamus. This evolutionarily old, deep brain region is involved in circadian rhythms, sleep/wake cycles, and hunger and thirst.

NF-κB itself is a protein that controls DNA transcription and is involved in stress and inflammatory responses.

They discovered that NF-κB levels became higher as the mice aged, and the high levels were due to increasing age-related inflammation in the hypothalamus. When they blocked NF-κB activation, the mice lived longer. Increasing NF-κB activity reduced lifespan.

Furthermore, inhibiting NF-κB dramatically reduced evidence of cognitive and motor decline in the animals, suggesting that the molecule stimulates the development of disease.

They were also able to increase the mean and maximum lifespan by 23% and 20% respectively in middle-aged mice by inhibiting IKK-β, an enzyme that activates NF-κB.

It is also reported that NF-κB blocks gonadotropin-releasing hormone (GnRH), and that giving mice GnRH slowed aging.

This research is being hailed as a major breakthrough in aging and could quickly lead to real therapies to prolong human lifespan, which could even simply involve regular administration of GnRH.

It suggests that cumulative stress and inflammation in the body, and in the hypothalamus in particular, signal increased production of NF-κB in the hypothalamus, which then accelerates aging, leading to decline and death. It also suggests that a small but crucial brain region may control aging in the whole body.

The authors conclude:

To summarize, our study using several mouse models demonstrates that the hypothalamus is important for systemic ageing and lifespan control. This hypothalamic role is significantly mediated by IKK-β- and NF-κB-directed hypothalamic innate immunity involving microglia–neuron crosstalk. The underlying basis includes integration between immunity and neuroendocrine of the hypothalamus, and immune inhibition and GnRH restoration in the hypothalamus or the brain represent two potential strategies for combating ageing-related health problems.

Full text HERE. Text and Image via Extreme Longevity

Human-ities · Performativity · Science · Technology

Will we ever communicate telepathically?

In a lab at Harvard Medical School, a man is using his mind to wag a rat’s tail. To send his command, he merely glances at a strobe light flickering on a computer screen, and a set of electrodes stuck to his scalp detects the activity triggered in his brain. A computer processes and relays the electrodes’ signal to an ultrasound machine poised over the rat’s head. The machine delivers a train of low-energy ultrasound pulses into the rat’s brain, stimulating its motor cortex – the area that governs its movements. The pulses are aimed purposely at a rice-grain-sized area that controls the rat’s tail. It starts to wag.

This link-up is the brainchild of Seung-Schik Yoo, and it works more than 94% of the time. Whenever a human looks at the flickering lights, the rat’s tail almost always starts to wag just over a second later. The connection between them is undeniably simple. The volunteer is basically flicking a switch in the rat’s brain between two positions – move tail, and don’t move tail. But it is still an impressive early example of something we will see more of in coming years – a way to connect between two living brains.
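At its core, Yoo’s link-up is a one-bit pipeline: detect a signal in one brain, and if it crosses a threshold, trigger a stimulus in the other. The schematic sketch below is a hypothetical illustration of that control flow; the threshold, values, and function names are invented, not taken from the study.

```python
# Schematic of the one-bit brain-to-brain link described above:
# EEG power at the strobe frequency -> threshold -> ultrasound trigger.
# All values and names here are illustrative placeholders.

SSVEP_THRESHOLD = 3.0  # hypothetical power ratio meaning "looking at strobe"

def decode_intent(eeg_power_at_strobe_freq):
    """One-bit decoder: did the volunteer attend to the flicker?"""
    return eeg_power_at_strobe_freq > SSVEP_THRESHOLD

def brain_to_brain_step(eeg_power, deliver_ultrasound_pulse):
    """If intent is detected, fire the ultrasound stimulator once."""
    if decode_intent(eeg_power):
        deliver_ultrasound_pulse()
        return "tail command sent"
    return "no command"

# Simulated run: a list-appending stub stands in for the hardware.
pulses = []
print(brain_to_brain_step(4.2, lambda: pulses.append("pulse")))  # tail command sent
print(brain_to_brain_step(1.1, lambda: pulses.append("pulse")))  # no command
```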

Science-fiction is full of similar (if more flamboyant) brain-to-brain links. From the Jedi knights of Star Wars to various characters in the X-Men comics, popular culture abounds with telepathic characters that can read minds and transmit their thoughts without any direct physical contact or the use of their senses. There’s no evidence that any of us mere mortals share the same ability, but as Yoo’s study shows, technology is edging us closer in that direction. The question is: how far can we recreate telepathy using electronics? A human wagging a rat’s tail is one thing. Will we ever get to the point where we can share speech or emotions or memories?

Excerpt from an article written by Ed Yong at Phenomena/NatGeo. Continue HERE

Bio · Photographics · Vital-Edible-Health

Malformed – A Collection of Human Brains from the Texas State Mental Hospital

Two years ago Scientific American magazine sent me to the University of Texas at Austin to borrow a human brain. They needed me to photograph a normal, adult, non-dissected brain that the university had obtained by trading a syphilitic lung with another institution. The specimen was waiting for me, but before I left they asked if I’d like to see their collection.

I walked into a storage closet filled with approximately one hundred human brains, none of them normal, taken from patients at the Texas State Mental Hospital. The brains sat in large jars of fluid, each labeled with a date of death or autopsy, a brief description in Latin, and a case number. These case numbers corresponded to microfilm held by the State Hospital detailing medical histories. But somehow, regardless of how amazing and fascinating this collection was, it had been largely untouched and unstudied for nearly three decades.

Driving back to my studio with a brain snugly belted into the passenger seat, I quickly became obsessed with the idea of photographing the collection, preserving the already decaying brains, and corresponding the images to their medical histories. I met with my friend Alex Hannaford, a features journalist, to help me find the collection’s history dating back to the 1950s.

Excerpt from a text by Adam Voorhes. Continue HERE

Bio · Digital Media · Science · Technology · Videos

See-through brains clarify connections

A chemical treatment that turns whole organs transparent offers a big boost to the field of ‘connectomics’ — the push to map the brain’s fiendishly complicated wiring. Scientists could use the technique to view large networks of neurons with unprecedented ease and accuracy. The technology also opens up new research avenues for old brains that were saved from patients and healthy donors.

“This is probably one of the most important advances for doing neuroanatomy in decades,” says Thomas Insel, director of the US National Institute of Mental Health in Bethesda, Maryland, which funded part of the work. Existing technology allows scientists to see neurons and their connections in microscopic detail — but only across tiny slivers of tissue. Researchers must reconstruct three-dimensional data from images of these thin slices. Aligning hundreds or even thousands of these snapshots to map long-range projections of nerve cells is laborious and error-prone, rendering fine-grain analysis of whole brains practically impossible.

The new method instead allows researchers to see directly into optically transparent whole brains or thick blocks of brain tissue. Called CLARITY, it was devised by Karl Deisseroth and his team at Stanford University in California. “You can get right down to the fine structure of the system while not losing the big picture,” says Deisseroth, who adds that his group is in the process of rendering an entire human brain transparent.

Excerpt from an article written by Helen Shen at Nature. Continue THERE

Bio · Human-ities · Projects · Science · Technology

The Human Brain Project

The Human Brain Project’s first goal is to build an integrated system of six ICT-based research platforms, providing neuroscientists, medical researchers and technology developers with access to highly innovative tools and services that can radically accelerate the pace of their research. These will include:

- a Neuroinformatics Platform, that links to other international initiatives, bringing together data and knowledge from neuroscientists around the world and making it available to the scientific community;
- a Brain Simulation Platform, that integrates this information in unifying computer models, making it possible to identify missing data, and allowing in silico experiments, impossible in the lab;
- a High Performance Computing Platform that provides the interactive supercomputing technology neuroscientists need for data-intensive modeling and simulations;
- a Medical Informatics Platform that federates clinical data from around the world, providing researchers with new mathematical tools to search for biological signatures of disease;
- a Neuromorphic Computing Platform that makes it possible to translate brain models into a new class of hardware devices and to test their applications;
- a Neurorobotics Platform, allowing neuroscience and industry researchers to experiment with virtual robots controlled by brain models developed in the project.

The platforms are all based on previous pioneering work by the partners and will be available for internal testing within eighteen months of the start of the project. Within thirty months, the platforms will be open for use by the community, receiving continuous upgrades to their capabilities, for the duration of the project.

The second goal of the project is to trigger and drive a global, collaborative effort that uses the platforms to address fundamental issues in future neuroscience, future medicine and future computing. A significant and steadily growing proportion of the budget will fund research by groups outside the original HBP Consortium, working on themes of their own choosing. Proposals for projects will be solicited through competitive calls for proposals and evaluated by independent peer review.

The end result will be not just a new understanding of the brain but transformational new ICT. As modern computers exploit ever-higher numbers of parallel computing elements, they face a power wall: power consumption rises with the number of processors, potentially to unsustainable levels. By contrast, the brain manages billions of processing units connected via kilometres of fibres and trillions of synapses, while consuming no more power than a light bulb. Understanding how it does this – the way it computes reliably with unreliable elements, the way the different elements of the brain communicate – can provide the key not only to a completely new category of hardware (Neuromorphic Computing Systems) but to a paradigm shift for computing as a whole, moving away from current models of “bit precise” computing towards new techniques that exploit the stochastic behaviour of simple, very fast, low-power computing devices embedded in intensely recursive architectures. The economic and industrial impact of such a shift is potentially enormous.

Text via The Human Brain Project
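The “stochastic behaviour of simple, very fast, low-power computing devices” mentioned above can be illustrated with a toy sketch (my own, not from the HBP text): in classic stochastic computing, a value in [0, 1] is encoded as the density of 1s in a random bitstream, and a single AND gate over two independent streams approximates multiplication, trading bit-precise exactness for extremely simple, noise-tolerant hardware.

```python
import random

def bitstream(p, n, rng):
    """Encode a probability p in [0, 1] as a random bitstream of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def decode(bits):
    """Recover the encoded value as the fraction of 1s in the stream."""
    return sum(bits) / len(bits)

rng = random.Random(0)
n = 100_000
a = bitstream(0.8, n, rng)
b = bitstream(0.5, n, rng)

# One AND gate per bit position approximates multiplication:
# P(a_i AND b_i) = 0.8 * 0.5 = 0.4 for independent streams.
product = [x & y for x, y in zip(a, b)]
print(decode(product))  # close to 0.4, with sampling noise
```

The longer the bitstream, the tighter the estimate, which is the “unreliable elements computing reliably” trade-off the excerpt alludes to.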

Animalia · Bio · Science · Technology

The Power of Swarms Can Help Us Fight Cancer, Understand the Brain, and Predict the Future

Thanks to new observation technologies, powerful software, and statistical methods, the mechanics of collectives are being revealed. Indeed, enough physicists, biologists, and engineers have gotten involved that the science itself seems to be hitting a density-dependent shift. Without obvious leaders or an overarching plan, this collective of the collective-obsessed is finding that the rules that produce majestic cohesion out of local jostling turn up in everything from neurons to human beings. Behavior that seems impossibly complex can have disarmingly simple foundations. And the rules may explain everything from how cancer spreads to how the brain works and how armadas of robot-driven cars might someday navigate highways. The way individuals work together may actually be more important than the way they work alone.

Excerpt from an article written by Ed Yong at WIRED. Continue THERE

Science · Technology · Vital-Edible-Health

Flip of a single molecular switch makes an old brain young


The flip of a single molecular switch helps create the mature neuronal connections that allow the brain to bridge the gap between adolescent impressionability and adult stability. Now Yale School of Medicine researchers have reversed the process, recreating a youthful brain that facilitated both learning and healing in the adult mouse.

Scientists have long known that the young and old brains are very different. Adolescent brains are more malleable or plastic, which allows them to learn languages more quickly than adults and speeds recovery from brain injuries. The comparative rigidity of the adult brain results in part from the function of a single gene that slows the rapid change in synaptic connections between neurons.

Excerpt from a press release by Bill Hathaway at Yale News. Continue HERE

Digital Media · Performativity · Technology

Brain Amplifiers and the Future of Computer Interaction

Amplifiers for the human brain, designed to allow people with paralysis to interact with the world, aren’t the most easily understood technology. So g.tec, the company that makes them, has come up with the following creative marketing strategy: Convince us that we’ll soon be interacting with computers through thought alone.

Here, for example, is a university project in which a student uses his brain to control a Rube-Goldbergian sort of etch-a-sketch, allowing him to write—albeit very crudely and slowly—without picking up a pen. And today at tech fair CeBIT, the company unveiled a new application that allows people, able-bodied and not, to paint pictures without lifting a finger.

Text and Images via Business Insider. Read full article HERE

Earthly/Geo/Astro · Science · Technology · Vital-Edible-Health

Green tea extract interferes with the formation of amyloid plaques in Alzheimer’s disease

Researchers at the University of Michigan have found a new potential benefit of a molecule in green tea: preventing the misfolding of specific proteins in the brain. The aggregation of these proteins, called metal-associated amyloids, is associated with Alzheimer’s disease and other neurodegenerative conditions. A paper published recently in the Proceedings of the National Academy of Sciences explained how U-M Life Sciences Institute faculty member Mi Hee Lim and an interdisciplinary team of researchers used green tea extract to control the generation of metal-associated amyloid-β aggregates associated with Alzheimer’s disease in the lab.

Excerpt from an article at PhysOrg. Continue HERE

Bio · Human-ities · Philosophy · Science · Theory

SPLIT BRAIN, SPLIT VIEWS

1) Given that the brain consists in a mass of connections, whose power depends on the number and complexity of those connections, why is it divided? Or is that just random, and should we give up trying to find a pattern which makes sense in terms of evolutionary advantage? (Animal ethologists have already found that asymmetry is an evolutionary advantage, and some of the reasons why – I take those into account in the book.) 
2) Is it logical or just a prejudice to dismiss the idea that there are significant hemisphere differences? 
3) If it is logical, why? If it is not logical, should we not all be interested in what sort of difference this might be? 
4) If not, why not? If so, what sort of difference would he himself suggest? 
5) Failing any suggestion of his own, why is he opposed to others making suggestions? 
6) Since it is in the nature of a general question that the answer will be general, what sort of criticism is it that an answer that has been offered is general in nature (though highly specific in its unfolding of the many aspects of cerebral function involved, of the implications for the phenomenological world, and in the data that are adduced)? 
7) It is in the nature of generalisations that they are general. It is also almost always the case that there will be exceptions. Does that mean that no generalisations should ever be attempted for fear of being called generalisations or because there are exceptions? 
8) I have never tried to hide the difficulties surrounding generalisations. My book is replete with caveats, qualifications, and admonitions to the reader. Does either KM or Ray Tallis think they have said anything substantial by calling a generalisation ‘sweeping’? What kind of generalisation is not, other than one that is qualified?

Excerpt from a response from Iain McGilchrist to Kenan Malik. Read it HERE

Kenan Malik is an Indian-born English writer, lecturer and broadcaster, trained in neurobiology and the history of science.

Iain McGilchrist is a psychiatrist, doctor, writer, and former Oxford literary scholar. McGilchrist came to prominence after the publication of his book The Master and His Emissary, subtitled The Divided Brain and the Making of the Western World.

Education · Vital-Edible-Health

What Our Brains Can Teach Us

So it goes with the brain. We are the aliens in that landscape, and the brain is an even more complicated cipher. It is composed of 100 billion electrically active cells called neurons, each connected to many thousands of its neighbors. Each neuron relays information in the form of miniature voltage spikes, which are then converted into chemical signals that bridge the gap to other neurons. Most neurons send these signals many times per second; if each signaling event were to make a sound as loud as a pin dropping, the cacophony from a single human head would blow out all the windows. The complexity of such a system bankrupts our language; observing the brain with our current technologies, we mostly detect an enigmatic uproar.

Looking at the brain from a distance isn’t much use, nor is zooming in to a single neuron. A new kind of science is required, one that can track and analyze the activity of billions of neurons simultaneously.

Excerpt from an article written by DAVID EAGLEMAN, NYT. Continue HERE
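The “miniature voltage spikes” Eagleman describes are often modeled with a leaky integrate-and-fire neuron. The toy sketch below (my own illustration, not part of the article) shows the idea: membrane voltage leaks away each time step, accumulates incoming input, and emits a spike and resets when it crosses a threshold.

```python
def lif_spikes(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron.

    Each step, the membrane voltage decays by the leak factor, then
    accumulates the input; crossing the threshold emits a spike (1)
    and resets the voltage to zero.
    """
    v = 0.0
    spikes = []
    for i in inputs:
        v = v * leak + i
        if v >= threshold:
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# Steady sub-threshold input accumulates until it triggers a spike.
print(lif_spikes([0.5, 0.5, 0.5, 0.5]))  # [0, 0, 1, 0]
```

The parameter values here are arbitrary; real neurons are far messier, which is exactly why the article calls for tools that track billions of them at once.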

Animalia · Science

Zombies, Disinfection, and the Jewel Wasp

This gorgeous animal, which measures just under an inch from mandibles to tail, lives across much of Africa and Asia, as well as a few Pacific Islands. Don’t be fooled by its lovely glittering appearance, though. This is a deeply sinister creature. Jewel wasps don’t rear their young in a familiar paper nest. For them, home is the inside of a cockroach.

When the female wasps are ready to lay their eggs, they take to the air and search for roaches. They find them on trees, on the ground, and even in people’s apartments. Since cockroaches don’t want to play host to their young, the wasps have to sneak up on their victims and subdue them–without killing them. So a wasp will sneak up and clamp her mandibles on the roach. As the roach tries to shake her off, the wasp hooks her tail underneath and stings her victim just below the head, temporarily paralyzing the roach’s front legs. Now the roach is easier to handle. The wasp then conducts brain surgery.

The jewel wasp (Ampulex compressa) snakes her stinger up into the cockroach’s brain, using sensors at its tip to feel her way to specific regions, where she then releases cocktails of neurotransmitters. The wasp removes her stinger and walks away to find a crevice that will serve as a suitable burrow. Her first sting wears off, and the roach is now free to run away. Except it doesn’t. It becomes the insect equivalent of a zombie, having lost all will.

All text and Images via The Loom/Carl Zimmer

Bio · Human-ities

Anything But Human

The very idea of an “ought” is foreign to evolutionary theory. It makes no sense for a biologist to say that some particular animal should be more cooperative, much less to claim that an entire species ought to aim for some degree of altruism. If we decide that we should neither “dissolve society” through extreme selfishness, as Wilson puts it, nor become “angelic robots” like ants, we are making an ethical judgment, not a biological one. Likewise, from a biological perspective it has no significance to claim that I should be more generous than I usually am, or that a tyrant ought to be deposed and tried. In short, a purely evolutionary ethics makes ethical discourse meaningless.

Excerpt from an article written by RICHARD POLT, NYT. Continue HERE

Human-ities · Science · Technology

Blueprint for the Brain

How can three pounds of jelly inside our skulls enable us to do everything that makes us human? For centuries, scientists have been fascinated and puzzled by the mysterious workings of the brain. Now, for the first time, they can re-create in the computer the shapes of every one of the billions of nerve cells that make up our brains, the component parts of the intricate neural circuits that allow us to move, see and hear, to feel and to think. Armed with this new tool, scientists are beginning to decipher the secrets of the brain’s architecture, which may one day enable us to build smart technologies that surpass the capabilities of anything we have today.

Text via Science Bytes. Continue HERE

Digital Media · Motion Graphics · Technology · Videos

Female Orgasm in Brodmann Brain Regions

The human brain can be separated into regions based on structure and function – vision, audition, body sensation, and so on – known as Brodmann’s area map.

This animation shows functional magnetic resonance imaging (fMRI) brain data of a participant experiencing an orgasm and the corresponding relationships seen within these different regions, based on oxygen utilization levels in the blood. Twenty snapshots in time of the fMRI data are taken from a seven-minute sequence. Over the course of the seven minutes the participant approaches orgasm, reaches orgasm and then enters a quiet period.

Oxygen utilization levels are displayed on a spectrum from dark red (lowest activity) to yellow/white (highest). As can be observed, an orgasm leads to almost the entire brain illuminating yellow, indicating that most brain systems become active at orgasm.

Text and Image by The Visual MD

Via The Guardian

Human-ities · Science · Theory

Mind bending: Why our memories are not always our own

Without memories, we would be lost. Yet, in an extract from his new book, the psychologist Charles Fernyhough reveals that some of our most precious recollections are perhaps not ours at all.

Adult siblings generally do not face the same pressures as, say, married couples to agree on a story about their pasts. Individuals who have spent a lifetime trying to define themselves in opposition to each other are unlikely to be quite as motivated to settle their memory differences. And the fact is that adult siblings usually do not get as many opportunities as couples do to negotiate their memory disputes.

Excerpt of an extract from ‘Pieces of Light: The New Science of Memory’ by Charles Fernyhough. Read it at The Independent

Human-ities · Science · Theory

What Happens to Consciousness When We Die

Where is the experience of red in your brain? The question was put to me by Deepak Chopra at his Sages and Scientists Symposium in Carlsbad, Calif., on March 3. A posse of presenters argued that the lack of a complete theory by neuroscientists regarding how neural activity translates into conscious experiences (such as redness) means that a physicalist approach is inadequate or wrong. The idea that subjective experience is a result of electrochemical activity remains a hypothesis, Chopra elaborated in an e-mail. It is as much of a speculation as the idea that consciousness is fundamental and that it causes brain activity and creates the properties and objects of the material world.

Excerpt of an article written by Michael Shermer at Scientific American. Continue HERE

Human-ities · Science · Vital-Edible-Health

An Alzheimer’s Researcher Ends Up on the Drug She Helped Invent


Given her relatively young age, Dr. Rae Lyn Burke didn’t think much about her family history of Alzheimer’s disease — a grandmother and an aunt had suffered from it, but they were much older. Ironically, Burke was just in her late 50s when she started having her own symptoms of early onset Alzheimer’s. Even more ironic is that Burke had been one of the key developers of the Alzheimer’s drug bapineuzumab, which she now takes herself to reduce the progression of the disease in her own brain.

“My expertise was in vaccine development,” says Burke, “so when Elan Pharmaceuticals got surprising evidence that a vaccine approach might be of value in treating Alzheimer’s disease they recruited me as a consultant, since this was a new area for them. My particular role was to ask how adjuvants might potentiate the immune response.” In other words, Burke figured out what compounds could be added to bapineuzumab, an antibody vaccine, that might help kick the recipient’s immune system into higher gear.

Excerpt of an article written by Alice G. Walton, The Atlantic. Continue HERE