A new study from psychologists at the University of Chicago and Pompeu Fabra University in Barcelona finds that people using a foreign language take a relatively utilitarian approach to moral dilemmas, making decisions based on assessments of what’s best for the common good. That pattern holds even when the utilitarian choice would produce an emotionally difficult outcome, such as sacrificing one life so others could live.
“This discovery has important consequences for our globalized world, as many individuals make moral judgments in both native and foreign languages,” says Boaz Keysar, Professor of Psychology at UChicago. “The real-world implications could include an immigrant serving as a jury member in a trial, who may approach decision-making differently than a native English speaker.” Lead author Albert Costa, a UPF psychologist, adds that “deliberations at places like the United Nations, the European Union, large international corporations or investment firms can be better explained or made more predictable by this discovery.”
Read full article at Science Daily.
When I was a hormone-addled adolescent in the late 1960s and early ’70s, I would often look up at a poster of Sigmund Freud on my brother’s bedroom wall. The title on the portrait – something like ‘Freud: explorer of the unconscious and discoverer of the meaning of dreams’ – depicted a hero of intellectual freedom and creative thought. When you looked at it closely, the portrait seemed to writhe and come alive. In the drug-fueled style of those decades of ongoing sexual revolution, the artist had depicted the nose as an erect penis, the cheeks as a female behind, and the eyes as female breasts. One side of the face was a voluptuous female whose legs wrapped around the body of a muscular male on the other side of the face and, of course, both heads were thrown back in dramatized ecstasy. I recall some of my brother’s stoned friends gazing at the portrait with bewildered looks on their faces, apparently unsure if the writhing torsos they saw were really there or not.
Right from the start, I saw Freud as a kind of secular saint because he was willing to take an unbiased look at himself through the raw material of his dreams. If he found in those dreams a mass of roiling sexual impulses, so be it. Those impulses had to be accepted, understood and explained within a larger picture of the human mind.
Continue this article at AEON.
It’s a postcard-perfect day on Suomenlinna Island, in Helsinki’s South Harbor, warm for the first week of June. Day trippers mix with Russian, Dutch, and Chinese tourists sporting sun shades and carrying cones of pink ice cream.
“Is this the prison?” asks a 40-something American woman wearing cargo pants and a floral sleeveless blouse.
Linda, my guide and translator, pauses beside me between the posts of an open picket fence. After six years of teaching as a volunteer inside American prisons, I’ve come from the private college where I work to investigate the Scandinavian reputation for humane prisons. It’s the end of my twelfth prison tour, and I consider the semantics of the question: If you can’t tell whether you’re in a prison, can it be a prison? I’ve never considered this in so many words. Yet I find that I know the answer, having felt it inside a prison cell in Denmark: There is no punishment so effective as punishment that nowhere announces the intention to punish. Linda is an intern working on a degree in public policy. Young and thoroughly practical, she smiles and says to the tourists, “Yes, you are here.”
Text (Doran Larson) and Image via The Atlantic. Continue THERE
The idea of summoning the spirits took thrilling hold of the Victorian imagination – and has its adherents now. But the psychology behind spiritualism is more intriguing. As the evenings get darker and the first hint of winter hangs in the air, the western world enters the season of the dead. It begins with Halloween, continues with All Saints’ and All Souls’ days, runs through Bonfire Night – the evening when the English burn effigies of historical terrorists – and ends with Remembrance Day. And through it all, Britain’s mediums enjoy one of their busiest times of the year.
People who claim to contact the spirit world provoke extreme reactions. For some, mediums offer comfort and mystery in a dull world. For others they are fraudsters or unwitting fakes, exploiting the vulnerable and bereaved. But to a small group of psychologists, the rituals of the seance and the medium are opening up insights into the mind, shedding light on the power of suggestion and even questioning the nature of free will.
Humanity has been attempting to commune with the dead since ancient times. As far back as Leviticus, the Old Testament God actively forbade people to seek out mediums. Interest peaked in the 19th century, a time when religion and rationality were clashing like never before. In an era of unprecedented scientific discovery, some churchgoers began to seek evidence for their beliefs.
Excerpt from an article by David Derbyshire at The Guardian. Continue THERE
Two eyes, aligned horizontally, above a nose, above a mouth. These are the basic elements of a face, as your brain knows quite well. Within about 200 milliseconds of seeing a picture, the brain can decide whether it’s a face or some other object. It can detect subtle differences between faces, too — walking around at my family reunion, for example, I see many similar faces, and yet I can easily distinguish Sue from Ann from Pam.
Our fascination with faces exists, to some extent, from the day we’re born. Studies of newborn babies have shown that they prefer to look at face-like pictures. A 1999 study showed, for example, that babies prefer a crude drawing of a lightbulb “head” with squares for its eyes and nose to the same drawing with the nose above the eyes. “I believe the youngest we tested was seven minutes old,” says Cathy Mondloch, professor of psychology at Brock University in Ontario, who worked on that study. “So it’s there right from the get-go.”
Excerpt from an article written by Virginia Hughes at NatGeo. Continue THERE
People who are able to speak two languages usually can switch between them seamlessly, a skill that likely develops a higher level of mental flexibility, researchers say.
“In the past, bilinguals were looked down upon,” says Judith F. Kroll, distinguished professor of psychology, linguistics and women’s studies at Penn State.
“Not only is bilingualism not bad for you, it may be really good. When you’re switching languages all the time it strengthens your mental muscle and your executive function becomes enhanced.”
Fluent bilinguals seem to have both languages active at all times, whether both languages are consciously being used or not—and both languages are active whether either was used only seconds or several days earlier.
Bilinguals rarely say a word in the unintended language, which suggests that they have the ability to control the parallel activity of both languages and ultimately select the intended language without needing to consciously think about it.
For a study published in Frontiers in Psychology, researchers conducted two separate but related experiments. In the first, 27 Spanish-English bilinguals read 512 sentences, written in either Spanish or English—alternating language every two sentences.
Excerpt from an article by Victoria Indivero-Penn State at Futurity. Continue THERE
For half a century, one theory about the way we experience and express emotion has helped shape how we practice psychology, do police work, and even fight terrorism. But what if that theory is wrong?
Forty-six years ago a young San Francisco–based cowboy of a psychologist named Paul Ekman emerged from the jungle with proof of a powerful idea. During the previous couple of years, he had set out trying to prove a theory popularized in the 19th century by Charles Darwin: that people of all ages and races, from all over the world, manifest emotions the same way. Ekman had traveled the globe with photographs that showed faces experiencing six basic emotions—happiness, sadness, fear, disgust, anger, and surprise. Everywhere he went, from Japan to Brazil to the remotest village of Papua New Guinea, he asked subjects to look at those faces and then to identify the emotions they saw on them. To do so, they had to pick from a set list of options presented to them by Ekman. The results were impressive. Everybody, it turned out, even preliterate Fore tribesmen in New Guinea who’d never seen a foreigner before in their lives, matched the same emotions to the same faces. Darwin, it seemed, had been right. Continue at BOSTON MAGAZINE
Work, friendships, exercise, parenting, eating, reading — there just aren’t enough hours in the day. To live fully, many of us carve those extra hours out of our sleep time. Then we pay for it the next day. A thirst for life leads many to pine for a drastic reduction, if not elimination, of the human need for sleep. Little wonder: if there were a widespread disease that similarly deprived people of a third of their conscious lives, the search for a cure would be lavishly funded. It’s the Holy Grail of sleep researchers, and they might be closing in.
As with most human behaviours, it’s hard to tease out our biological need for sleep from the cultural practices that interpret it. The practice of sleeping for eight hours on a soft, raised platform, alone or in pairs, is actually atypical for humans. Many traditional societies sleep more sporadically, and social activity carries on throughout the night. Group members get up when something interesting is going on, and sometimes they fall asleep in the middle of a conversation as a polite way of exiting an argument. Sleeping is universal, but there is glorious diversity in the ways we accomplish it.
Different species also seem to vary widely in their sleeping behaviours. Herbivores sleep far less than carnivores — four hours for an elephant, compared with almost 20 hours for a lion — presumably because it takes them longer to feed themselves, and vigilance is selected for. As omnivores, humans fall between the two sleep orientations. Circadian rhythms, the body’s master clock, allow us to anticipate daily environmental cycles and arrange our organs’ functions along a timeline so that they do not interfere with one another.
Excerpt from an article written by Jessa Gamble at Aeon. Continue HERE
Dr. Marlene Winell is a human development consultant in the San Francisco Area. She is also the daughter of Pentecostal missionaries. This combination has given her work an unusual focus. For the past twenty years she has counseled men and women in recovery from various forms of fundamentalist religion including the Assemblies of God denomination in which she was raised. Winell is the author of Leaving the Fold – A Guide for Former Fundamentalists and Others Leaving their Religion, written during her years of private practice in psychology. Over the years, Winell has provided assistance to clients whose religious experiences were even more damaging than mine. Some of them are people whose psychological symptoms weren’t just exacerbated by their religion, but actually caused by it.
Two years ago, Winell made waves by formally labeling what she calls “Religious Trauma Syndrome” (RTS) and beginning to write and speak on the subject for professional audiences. When the British Association of Behavioral and Cognitive Psychologists published a series of articles on the topic, members of a Christian counseling association protested what they called excessive attention to a “relatively niche topic.” One commenter said, “A religion, faith or book cannot be abuse but the people interpreting can make anything abusive.”
Is toxic religion simply misinterpretation? What is religious trauma? Why does Winell believe religious trauma merits its own diagnostic label?
Excerpt from an interview with Dr. Marlene Winell by Valerie Tarico at IEET. Continue THERE
A recent study showed that White British heterosexual men’s preferences for larger female breasts were significantly associated with a greater tendency to be benevolently sexist, to objectify women, and to be hostile towards women (Viren Swami and Martin J. Tovée, 2013).
Based on self-reports, the researchers selected a sample of 361 males of British White descent who did not identify as gay or bisexual, or who did not disclose their orientation (average age 30, ranging from 18 to 68). These men were asked to rate the attractiveness of photo-realistic 3D models that were rotated on the screen. I copied and pasted from the article the model with the smallest breast size out of five on the left, and the one with the largest on the right, so you get a bit of an idea; in the study they were presented in colour. After rating the models, the participants filled in questionnaires measuring sexist attitudes (the Hostility Towards Women Scale, HTWS; the Attitudes Towards Women Scale, AWS; and the Benevolent Sexism, BS, subscale of the Ambivalent Sexism Inventory, ASI) and objectification of women (an adaptation of the Self-Objectification Scale, SOS).
What they found was that, on average, men rated the medium breast size most attractive, with a distribution skewed towards the larger sizes, which seems unsurprising. The men’s preference for larger breast sizes was significantly and positively correlated with hostility towards women, sexist attitudes towards women, benevolent sexism and objectification of women. They also found that younger men were more likely to rate large breasts as more attractive. Neither education nor relationship status had an effect. Benevolent sexism was the strongest predictor of breast size rating, while objectification of women and hostility towards women were also significant predictors.
Via Feminist Philosophers. Continue THERE
Misogynistic attitudes ‘make males more likely to prefer big breasts’
Adultery causes earthquakes? Sexual repression can cause much worse.
BAD NEWS SELLS. If it bleeds, it leads. No news is good news, and good news is no news.
Those are the classic rules for the evening broadcasts and the morning papers, based partly on data (ratings and circulation) and partly on the gut instincts of producers and editors. Wars, earthquakes, plagues, floods, fires, sick children, murdered spouses — the more suffering and mayhem, the more coverage.
But now that information is being spread and monitored in different ways, researchers are discovering new rules. By scanning people’s brains and tracking their e-mails and online posts, neuroscientists and psychologists have found that good news can spread faster and farther than disasters and sob stories.
“The ‘if it bleeds’ rule works for mass media that just want you to tune in,” says Jonah Berger, a social psychologist at the University of Pennsylvania. “They want your eyeballs and don’t care how you’re feeling. But when you share a story with your friends and peers, you care a lot more how they react. You don’t want them to think of you as a Debbie Downer.”
Excerpt from an article written by JOHN TIERNEY, at the NYT. Continue THERE
The search is on for a couple to train as astronauts, for a privately funded mission to Mars. But wouldn’t any couple squabble if cooped up together for 18 months? Explorer Deborah Shapiro, who spent more than a year with her husband in the Antarctic, provides some marital survival tips.
It never ceases to amaze us, but the most common question Rolf and I got after our winter-over, when we spent 15 months on the Antarctic Peninsula, nine of which were in total solitude, was: Why didn’t you two kill each other?
We found the question odd and even comical at first, because the thought of killing each other had never crossed our minds.
We’d answer glibly that because we relied on each other for survival, murder would be counter-productive.
Excerpt from an article on BBC. Continue HERE
“How-to writers are to other writers as frogs are to mammals,” wrote the critic Dwight Macdonald in a 1954 survey of “Howtoism.” “Their books are not born, they are spawned.”
Macdonald began his story by citing a list of 3,500 instructional books. Today, there are at least 45,000 specimens in print of the optimize-everything cult we now call “self-help,” but few of them look anything like those classic step-by-step “howtos,” which Macdonald and his Establishment brethren handled only with bemused disdain. These days, self-help is unembarrassed, out of the bedside drawer and up on the coffee table, wholly transformed from a disreputable publishing category to a category killer, having remade most of nonfiction in its own inspirational image along the way.
Many of the books on Amazon’s current list of “Best Sellers in Self-Help” would have been unrecognizable to Macdonald: Times business reporter Charles Duhigg’s The Power of Habit, a tour of the latest behavioral science; Paulo Coelho’s novel The Alchemist, a fable about an Andalusian shepherd seeking treasure in Egypt; Susan Cain’s Quiet: The Power of Introverts in a World That Can’t Stop Talking, a journalistic paean to reticence; publisher Will Schwalbe’s memoir The End of Your Life Book Club, about reading with his dying mother; and A Child Called “It,” David Pelzer’s recollections of harrowing and vicious child abuse. And these are just the books publishers identify as self-help; other hits are simply labeled “business” or “psychology” or “religion.” “There isn’t even a category officially called ‘self-help,’ ” says William Shinker, publisher of Gotham Books. Shinker discovered Men Are From Mars, Women Are From Venus and now publishes books on “willpower” and “vulnerability”—“self-help masquerading as ‘big-idea’ books.”
Excerpt from an article written by Boris Kachka at NYMAG. Continue HERE
Image above: “Me”, by Ed Ruscha. (Photo: Paul Ruscha/© Ed Ruscha/Courtesy of Ed Ruscha and Gagosian Gallery (“Me”, 2001))
Narcissism has long gotten a bad rap. Its unseemly reputation dates back at least to ancient Greek mythology, in which the handsome hunter Narcissus (who undoubtedly would be gloating over his present-day fame) discovered his own reflection in a pool of water and fell in love with it. Narcissus was so transfixed by his image that he died staring at it. In 1914 Sigmund Freud likened narcissism to a sexual perversion in which romantic attraction is directed exclusively to the self. Contemporary views are hardly more flattering. Enter the words “narcissists are” into Google, and the four most popular words completing the phrase are “stupid,” “evil,” “bullies” and “selfish.”
In 2008 psychologist Jean M. Twenge of San Diego State University and her colleagues found that narcissism scores have been climbing among U.S. college students for the past few decades. Although the data are controversial, these scholars argue that we are living in an increasingly narcissistic culture.
Some of the opprobrium heaped on narcissists is surely deserved. Yet research paints a more nuanced picture. Although narcissists can be difficult and at times insufferable, they can also make effective leaders and performers. Moreover, because virtually all of us share at least a few narcissistic traits, we may be able to learn something about ourselves from understanding them.
Excerpt from an article written by Scott O. Lilienfeld and Hal Arkowitz, at Scientific American. Continue HERE
Image above: Narcissus by Caravaggio (Galleria Nazionale d’Arte Antica, Rome)
In 1980, the Diagnostic and Statistical Manual of Mental Disorders defined trauma as “a recognizable stressor that would evoke significant symptoms of distress in almost everyone” — universally toxic, like a poison. But it turns out that most trauma victims — even survivors of combat, torture or concentration camps — rebound to live full, normal lives. That has given rise to a more nuanced view of trauma — less a poison than an infectious agent, a challenge that most people overcome but that may defeat those weakened by past traumas, genetics or other factors. Now, a significant body of work suggests that even this view is too narrow — that the environment just after the event, particularly other people’s responses, may be just as crucial as the event itself. The idea was demonstrated vividly in two presentations this fall at the Interdisciplinary Conference on Culture, Mind and Brain at the University of California, Los Angeles. Each described reframing a classic model of traumatic experience — one in lab rats, the other in child soldiers.
Excerpt from an article written by DAVID DOBBS at NYT. Continue HERE
For years they have lived as orphans and outliers, a colony of misfit characters on their own island: the bizarre one and the needy one, the untrusting and the crooked, the grandiose and the cowardly.
Their customs and rituals are as captivating as any tribe’s, and at least as mystifying. Every mental anthropologist who has visited their world seems to walk away with a different story, a new model to explain those strange behaviors.
This weekend the Board of Trustees of the American Psychiatric Association will vote on whether to adopt a new diagnostic system for some of the most serious, and striking, syndromes in medicine: personality disorders.
Personality disorders occupy a troublesome niche in psychiatry. The 10 recognized syndromes are fairly well represented on the self-help shelves of bookstores and include such well-known types as narcissistic, avoidant, dependent and histrionic personality disorders.
Excerpt from an article written by BENEDICT CAREY at NYT. Continue HERE
Long periods of stability allow risks to accumulate until there is a major disaster; volatility means that things do not get too far out of kilter. In the economy, cutting interest rates at the first sign of weakness stores up more trouble for later. In markets, getting rid of speculators means prices are more stable in general, but any fluctuations cause greater panic. In political systems, the stability brought by regimes such as Hosni Mubarak’s in Egypt was artificial; without any effective way for people to express dissent, change leads to collapse.
The principle applies to career choices too. An apparently secure job within a large company disguises a dependency on a single employer and the risk that unemployment will cause a very sudden and steep loss of income. Professions that have more variable earnings, like taxi-driving or prostitution, are less vulnerable to really big shocks. They also use volatility as information: if a cabbie is in a part of town where there are no fares, he heads to a different area.
Excerpt of an article via The Economist. Continue HERE
Lauren was always a top student, but the pressures of her first year studying for a PhD in atmospheric chemistry at a UK university sent her spiraling into depression. At best, she couldn’t focus on academic tasks, feeling as if her brain was “scrambled”; at worst, she couldn’t get out of bed.
She developed a crippling fear of presenting her research. “Doing a PhD is such a personal thing, one that you’ve invested so much time in, that any criticism can feel like a direct reflection of yourself,” says Lauren.
But she did something that many postgraduates do not: she got help. With counseling and medication, Lauren — a pseudonym that she uses on a blog detailing her experience (see go.nature.com/4ta9fo) — is entering the final year of her PhD. Hers is one of more than 50 stories highlighted on the website Students Against Depression, funded by the Charlie Waller Memorial Trust in Thatcham, UK. “The website aims to raise awareness that depression isn’t a personal failing or weakness; it’s a serious condition that requires treatment,” says psychologist Denise Meyer, the website’s project manager.
Text and Image via Nature. Continue article written by Virginia Gewin HERE
Also: The Ones We’ve Lost: The Student Loan Debt Suicides
IN 1919, SIGMUND FREUD devoted a brief essay to “The Uncanny” (das Unheimliche). Pages of dictionary definitions were followed by a long literary analysis of E.T.A. Hoffmann’s fantastic 1816 story “The Sandman,” in which a young medical student is threatened by various doubles of mad scientists and perfidious salesmen of glasses and optical instruments, falls in love with what turns out to be a mechanical doll, goes mad and finally kills himself. Examples of the uncanny, taken from Freud’s own experience as well as from literature and superstition, included getting lost in the woods and always ending up in the same place, déjà vu, missing body parts, dead objects that turn out to be alive, the fear of being buried alive, meeting one’s double, the evil eye, and so on. From all this, Freud concluded that the uncanny is a mild shade of anxiety or unease that arises when the familiar suddenly appears strange. This occurs when something in the familiar experience or object triggers the return of repressed complexes (for example, castration anxiety), or when certain primitive ideas (for example, the belief that inanimate objects are animated) seem to be reconfirmed. “Among instances of frightening things there must be one class in which the frightening element can be shown to be something repressed which recurs,” Freud wrote:
This class of frightening things would then constitute the uncanny; […] if this is indeed the secret nature of the uncanny, we can understand why linguistic usage has extended das Heimliche [“homely”] into its opposite, das Unheimliche; for this uncanny is in reality nothing new or alien, but something which is familiar and old-established in the mind and which has become alienated from it only through the process of repression.
Excerpt from a review by Anneleen Masschelein of The Memory of Place: A Phenomenology of the Uncanny, at LARB. Continue HERE
When people evaluate claims, they often rely on what comedian Stephen Colbert calls “truthiness,” or subjective feelings of truth. In four experiments, we examined the impact of nonprobative information on truthiness. In Experiments 1A and 1B, people saw familiar and unfamiliar celebrity names and, for each, quickly responded “true” or “false” to the (between-subjects) claim “This famous person is alive” or “This famous person is dead.” Within subjects, some of the names appeared with a photo of the celebrity engaged in his or her profession, whereas other names appeared alone. For unfamiliar celebrity names, photos increased the likelihood that the subjects would judge the claim to be true. Moreover, the same photos inflated the subjective truth of both the “alive” and “dead” claims, suggesting that photos did not produce an “alive bias” but rather a “truth bias.” Experiment 2 showed that photos and verbal information similarly inflated truthiness, suggesting that the effect is not peculiar to photographs per se. Experiment 3 demonstrated that nonprobative photos can also enhance the truthiness of general knowledge claims (e.g., Giraffes are the only mammals that cannot jump). These effects add to a growing literature on how nonprobative information can inflate subjective feelings of truth.
Text via the abstract of “Nonprobative photographs (or words) inflate truthiness.”
Photo via Medical Daily
If ignorance is bliss, then optimism must be euphoria. Thanks to a mechanism called the optimism bias, humans are pretty much incapable of applying basic risk statistics to their own lives. We know smoking causes cancer, but we don’t expect it to happen to us. We find a lump on our body and we tell ourselves it’s probably nothing.
Although the term optimism bias was first used in the 1980s, psychologist and Nobel Prize winner Daniel Kahneman was most likely the one who made it part of general vocabulary. In his 2011 book “Thinking, Fast and Slow”, Kahneman notes that “people tend to be overly optimistic about their relative standing on any activity in which they do moderately well.” The optimism bias generates the illusion of control: the idea that we are in control of our lives. Bad things only happen to others.
You can see where this bright outlook on life can cause trouble. Wearing seatbelts? Not necessary. Opening a savings account? Maybe later. Being overly optimistic in life puts us at risk. In addition, people who show cheerful, optimistic personality traits during childhood have a shorter life expectancy than their more serious counterparts. On the other hand, optimists are more psychologically resilient, have stronger immune systems, and live longer on average than their more reality-based opposites. So who’s better off in life: the optimist or the pessimist? And whose reality comes closest to the truth?
Excerpt from an article written by Anouk Vleugels, United Academics. Continue HERE
The Positive Power of Negative Thinking
LAST month, in San Jose, Calif., 21 people were treated for burns after walking barefoot over hot coals as part of an event called Unleash the Power Within, starring the motivational speaker Tony Robbins. If you’re anything like me, a cynical retort might suggest itself: What, exactly, did they expect would happen? In fact, there’s a simple secret to “firewalking”: coal is a poor conductor of heat to surrounding surfaces, including human flesh, so with quick, light steps, you’ll usually be fine.
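The “poor conductor” point can be put in rough numbers with the standard contact-temperature formula for two bodies pressed together: the interface settles near an effusivity-weighted average of the two surface temperatures, where thermal effusivity is e = sqrt(k·ρ·c). A minimal back-of-the-envelope sketch; all material values below are illustrative assumptions, not figures from the article:

```python
import math

def effusivity(k, rho, c):
    """Thermal effusivity e = sqrt(k * rho * c); higher e means the
    material moves heat to and from its surface more aggressively."""
    return math.sqrt(k * rho * c)

# Illustrative, order-of-magnitude property values (assumed, not measured):
e_coal = effusivity(k=0.2, rho=300.0, c=1000.0)    # glowing charcoal: light, porous
e_skin = effusivity(k=0.37, rho=1100.0, c=3400.0)  # skin: mostly water

T_coal = 550.0  # assumed ember surface temperature, deg C
T_skin = 37.0   # body temperature, deg C

# Interface temperature at the moment the two surfaces touch:
T_contact = (e_coal * T_coal + e_skin * T_skin) / (e_coal + e_skin)
print(f"{T_contact:.0f} deg C")
```

Because skin's effusivity dwarfs the charcoal's, the interface under these assumed numbers lands in the low hundreds of degrees rather than anywhere near the ember temperature, which, together with quick, light steps, is why firewalkers usually escape unharmed; it is a sketch of the physics, not a safety analysis.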
But Mr. Robbins and his acolytes have little time for physics. To them, it’s all a matter of mind-set: cultivate the belief that success is guaranteed, and anything is possible. One singed but undeterred participant told The San Jose Mercury News: “I wasn’t at my peak state.” What if all this positivity is part of the problem? What if we’re trying too hard to think positive and might do better to reconsider our relationship to “negative” emotions and situations?
Excerpt from an article written by OLIVER BURKEMAN, NYT. Continue HERE
The Baining—one of the indigenous cultural groups of Papua New Guinea—have the reputation, at least among some researchers, of being the dullest culture on earth. Early in his career, in the 1920s, the famous British anthropologist Gregory Bateson spent 14 months among them, until he finally left in frustration. He called them “unstudiable,” because of their reluctance to say anything interesting about their lives and their failure to exhibit much activity beyond the mundane routines of daily work, and he later wrote that they lived “a drab and colorless existence.” Forty years later, Jeremy Pool, a graduate student in anthropology, spent more than a year living among them in the attempt to develop a doctoral dissertation. He too found almost nothing interesting to say about the Baining, and the experience caused him to leave anthropology and go into computer science. Finally, however, anthropologist Jane Fajans, now at Cornell University, figured out a way to study them.
Excerpt from a text written by Peter Gray, Psychology Today. Continue HERE
Many children (and adults) have heard Aesop’s fable about the crow and the pitcher. A thirsty crow comes across a pitcher partly filled with water but can’t reach the water with his beak. So he keeps dropping pebbles into the pitcher until the water level rises high enough. A new study finds that both young children and members of the crow family are good at solving this problem, but children appear to learn it in a very different way from birds.
Excerpt of an article written by Michael Balter, Science AAAS. Continue HERE
Without memories, we would be lost. Yet, in an extract from his new book, the psychologist Charles Fernyhough reveals that some of our most precious recollections are perhaps not ours at all.
Adult siblings generally do not face the same pressures as, say, married couples to agree on a story about their pasts. Individuals who have spent a lifetime trying to define themselves in opposition to each other are unlikely to be quite as motivated to settle their memory differences. And the fact is that adult siblings usually do not get as many opportunities as couples do to negotiate their memory disputes.
Excerpt of an extract from ‘Pieces of Light: The New Science of Memory’ by Charles Fernyhough. Read it at The Independent
Stanford neuroeconomist Brian Knutson is an expert in the pleasure center of the brain that works in tandem with our financial decisions – the biology behind why we bypass the kitchen coffeemaker to buy the $4 Starbucks coffee every day.
He can hook you up to a brain scanner, take you on a simulated shopping spree and tell by looking at your nucleus accumbens – an area deep inside your brain associated with fight, flight, eating and fornicating – how you process risk and reward, whether you’re a spendthrift or a tightwad.
So when his colleagues saw him putting Tibetan Buddhist monks and nuns into the MRI machine in the basement of the Stanford psychology building, he drew a few double-takes.
Knutson is still interested in the nucleus accumbens, which receives a dopamine hit when a person anticipates something pleasant, like winning at blackjack.
Excerpt of an article written by Meredith May, SFGate. Continue HERE
There’s a jerk inside all of us: we roll our eyes when someone in line has a complicated order, curse at little old ladies who don’t drive fast enough, and sneer at people who are just too happy. Over time, that snark kills our productivity and poisons our relationships. Here’s how to keep your inner asshole in check.
Being occasionally sarcastic, or a little derisive in your head, is one thing; when negativity becomes your default reaction, you have a problem. You may have had a wake-up moment, much like Anna Holmes, founding editor of Jezebel, had when she realized she was sneering at someone for no reason other than that the person was happy. Here’s what she said:
Just rolled my eyes at a woman skipping happily across 42nd Street. Then I realized I’M the asshole.
— Anna Holmes (@AnnaHolmes) May 31, 2012
How about a quick check? Do you:
-Roll your eyes at every “hipster” who, by most accounts, is just a person trying (successfully or not) to dress fashionably?
-Primarily complain about how horrible people/things are on Facebook/Twitter?
-Get angrier every passing moment that you stand in line at the grocery store, or have to wait for your check to arrive at a restaurant?
-Find you’re constantly frustrated with coworkers who don’t “get it?”
-Comment angrily on blogs, videos, and other web sites (usually beginning with “ummm” and ending with “just saying?”)
-Feel like it’s okay to be a complete jerk, as long as you’re “witty” about it?
Excerpt of an article written by Alan Henry, at Lifehacker. Continue HERE
What, if any, evolutionary advantage does intelligence give us?
Actually, less intelligent people are better at doing most things. In the ancestral environment general intelligence was helpful only for solving a handful of evolutionarily novel problems.
Suggested reading: “The Bell Curve: Intelligence and Class Structure in American Life” by Herrnstein, Richard J. and Charles Murray (1994)
You mean our ancestors did not really have to reason?
Evolution equipped humans with solutions for a whole range of problems of survival and reproduction. All they had to do was to behave in the ways in which evolution had designed them to behave—eat food that tastes good, have sex with the most attractive mates. However, for a few evolutionarily novel problems, evolution equipped us with general intelligence so that our ancestors could reason in order to solve them. These evolutionarily novel problems were few and far between. Basically, dealing with any type of major natural disaster that is very infrequent in occurrence would require general intelligence.
Suggested reading: “Evolutionary Psychology and Intelligence Research” by Satoshi Kanazawa, in American Psychologist; 65: 279-289 (2010)
Excerpt of an interview with Satoshi Kanazawa on intelligence. Continue HERE
SATOSHI KANAZAWA is Reader in Management at the London School of Economics and Political Science, and Honorary Research Fellow in the Department of Psychology at Birkbeck College, University of London. He has written over 80 articles across the fields of psychology, sociology, political science, economics, anthropology and biology. One such was his widely reported article “Why Liberals and Atheists Are More Intelligent” (2010). His latest book is called “The Intelligence Paradox: Why the Intelligent Choice Isn’t Always the Smart One” (2012).
Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
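The correct answer falls out of one line of algebra: if the ball costs x, the bat costs x + 1.00, so x + (x + 1.00) = 1.10 and x = 0.05. A minimal check (a sketch, not from the article; names are my own):

```python
# Bat-and-ball problem: bat + ball = $1.10, and the bat costs $1.00 more.
# Work in cents to avoid floating-point rounding.
TOTAL = 110       # combined cost in cents
DIFFERENCE = 100  # how much more the bat costs than the ball

# ball + (ball + DIFFERENCE) = TOTAL  =>  ball = (TOTAL - DIFFERENCE) / 2
ball = (TOTAL - DIFFERENCE) // 2
bat = ball + DIFFERENCE
print(ball, bat)  # 5 105 -> ball is 5 cents, bat is $1.05

# The intuitive answer (10 cents) fails the check:
intuitive_ball = 10
print(intuitive_ball + (intuitive_ball + DIFFERENCE))  # 120, not 110
```

The intuitive "ten cents" answer satisfies the total but ignores the difference constraint, which is exactly the shortcut Kahneman's work describes.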
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
Excerpt of an article written by Jonah Lehrer, at the New Yorker. Continue HERE