Two eyes, aligned horizontally, above a nose, above a mouth. These are the basic elements of a face, as your brain knows quite well. Within about 200 milliseconds of seeing a picture, the brain can decide whether it’s a face or some other object. It can detect subtle differences between faces, too. Walking around at my family reunion, for example, I see many similar faces, yet I can easily distinguish Sue from Ann from Pam.
Our fascination with faces exists, to some extent, from the day we’re born. Studies of newborn babies have shown that they prefer to look at face-like pictures. A 1999 study showed, for example, that babies prefer a crude drawing of a lightbulb-shaped “head” with squares for its eyes and nose to the same drawing with the nose above the eyes. “I believe the youngest we tested was seven minutes old,” says Cathy Mondloch, professor of psychology at Brock University in Ontario, who worked on that study. “So it’s there right from the get-go.”
Excerpt from an article written by Virginia Hughes at NatGeo. Continue HERE
Infinite Realities® is the 3D scanning service provided by Lee Perry-Smith, a leading 3D modelling and scanning specialist based in Suffolk, UK. In simple terms, according to them: “We can scan any human being and replicate them in three dimensions as data held in a computer. Our scanning process picks up every detail of their eyes, face, hair, skin colour, body shape and distinguishing features – everything that makes them who they are.”
A downloadable demo by Infinite-Realities, built in Unity, features high-resolution 3D scans of people in a virtual environment. It’s incredibly realistic and can be viewed through an Oculus Rift headset, though you really need a high-end PC to run it.
Below are two videos of the demo in action:
The demo combines ultra-high-detail 3D scans of real-life models with the Oculus Rift and the Razer Hydra movement controls to create one of the most realistic and uncanny experiences in virtual reality.
Thanks to Yoni Goldstein.
Artist Neil Harbisson is completely colour-blind. Here, he explains how a camera attached to his head allows him to hear colour.
Until I was 11, I didn’t know I could only see in shades of grey. I thought I could see colours but that I was confusing them.
When I was diagnosed with achromatopsia [a rare vision disorder], it was a bit of a shock but at least we knew what was wrong. Doctors said it was impossible to cure.
When I was 16, I decided to study art. I told my tutor I could only see in black and white, and his first reaction was, “What the hell are you doing here then?” I told him I really wanted to understand what colour was.
I was allowed to do the entire art course in greyscale – only using black and white. I did very figurative art, trying to reproduce what I could see so that people could compare what I saw with what they saw. I also learned that throughout history, many people have related colour to sound.
At university I went to a cybernetics lecture by Adam Montandon, a student from Plymouth University, and asked if we could create something so I could see colour. He came up with a simple device made up of a webcam, a computer and a pair of headphones, and wrote software that would translate any colour in front of me into a sound.
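As a rough illustration of how a colour-to-sound translation like this might work – this is a sketch, not Montandon’s actual software – one could map a colour’s hue to an audible frequency. The frequency range and the logarithmic mapping below are illustrative assumptions:

```python
import colorsys
import math

# A minimal sketch, not Montandon's actual software: map a colour's hue
# to an audible tone frequency. The range and mapping are assumptions.
LOW_HZ, HIGH_HZ = 120.0, 1000.0

def colour_to_frequency(r, g, b):
    """Map an RGB colour (0-255 per channel) to a tone frequency in Hz."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Hue lies in [0, 1); spread it logarithmically across the range so
    # equal hue steps correspond to equal musical intervals.
    return LOW_HZ * math.exp(h * math.log(HIGH_HZ / LOW_HZ))

for name, rgb in [("red", (255, 0, 0)), ("green", (0, 255, 0)), ("blue", (0, 0, 255))]:
    print(f"{name:5s} -> {colour_to_frequency(*rgb):6.1f} Hz")
```

Running it prints a tone frequency for each colour; an actual wearable device would synthesize and play the tone continuously as the camera’s view changes.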
Via BBC. Continue article HERE. Thanks to Crystal Henson.
Photo by Suren Manvelyan
As you read these words, try paying attention to something you usually never notice: the movements of your eyes. While you scan these lines of text, glance at that ad over there, or look up from the screen at the room beyond, your eyes are making tiny movements, called saccades, and brief pauses, called fixations. Scientists are discovering that eye movement patterns — where we look, and for how long — reveal important information about how we read, how we learn and even what kind of people we are.
Researchers are able to identify these patterns thanks to the development of eye-tracking technology: video cameras that record every minuscule movement of the eyes. Such equipment, originally developed to study the changes in vision experienced by astronauts in zero-gravity conditions, allows scientists to capture and analyze that always-elusive entity, attention. The way we move our eyes, it turns out, is a reliable indicator of what seizes our interest and of what distracts us. Scientists are now using eye-tracking technology to explore how we learn from text and images, including those viewed onscreen.
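To give a flavour of how tracking software tells the two movement types apart, here is a minimal sketch of velocity-threshold classification (known as I-VT in the eye-tracking literature). The threshold value and the tiny gaze trace are made-up illustrations, not data from any real tracker:

```python
# A minimal sketch of velocity-threshold classification (I-VT), a common
# way eye-tracking software separates fixations from saccades. The
# threshold and the sample gaze data below are illustrative assumptions.
SACCADE_THRESHOLD = 100.0  # degrees of visual angle per second (assumed)

def classify_gaze(samples):
    """samples: list of (time_s, x_deg, y_deg) gaze points.
    Returns 'fixation' or 'saccade' for each consecutive pair."""
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > SACCADE_THRESHOLD else "fixation")
    return labels

# Tiny made-up trace: the eye holds still, jumps, then holds still again.
trace = [(0.00, 1.0, 1.0), (0.01, 1.1, 1.0), (0.02, 6.0, 3.0), (0.03, 6.1, 3.0)]
print(classify_gaze(trace))  # ['fixation', 'saccade', 'fixation']
```

The idea is simply that saccades are fast and fixations are slow, so a single velocity cutoff separates them; production systems add smoothing and merge adjacent fixation samples into longer fixations.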
Written by Annie Murphy Paul for Time Ideas. Continue HERE
Directed by Koichiro Tsujikawa
Music by Cornelius