Infinite Realities® is the 3D scanning service provided by Lee Perry-Smith, a leading 3D modelling and scanning specialist based in Suffolk, UK. In their own words: “We can scan any human being and replicate them in three dimensions as data held in a computer. Our scanning process picks up every detail of their eyes, face, hair, skin colour, body shape and distinguishing features – everything that makes them who they are.”
A downloadable demo by Infinite-Realities, put together in Unity, features high-resolution 3D scans of people in a virtual environment. It is incredibly realistic and can be viewed through an Oculus Rift headset, though you really need a high-end PC to run it.
Below are two videos showcasing the demo:
The demo combines ultra-high-detail 3D scans of real-life models with the Oculus Rift and the Razer Hydra for movement controls, making for one of the most realistic and uncanny experiences in virtual reality.
An impressive demonstration of turning 2D objects in photographs into manipulable 3D objects, using a simple three-point method at key areas. Via kesen.realtimerendering.com
To put a human face on our ancestors, scientists from the Senckenberg Research Institute used sophisticated methods to form 27 model heads based on tiny bone fragments, teeth and skulls collected from across the globe. The heads are on display for the first time together at the Senckenberg Natural History Museum in Frankfurt, Germany. Continue HERE
For the first time in history individual piano notes have been made visible using the CymaScope instrument. The piano notes were painstakingly recorded by Evy King and then fed into the CymaScope one by one and the results recorded in high definition video. Click HERE to see sound.
Shannon Novak, a New Zealand-born fine artist, commissioned us to image 12 piano notes as inspiration for a series of 12 musical canvases. We decided to image the notes in video mode because when we observed the ‘A1’ note we discovered, surprisingly, that the energy envelope changes over time as the string’s harmonics mix in the piano’s wooden bridge. Instead of the envelope being fairly stable, as we had imagined, the harmonics actually cause the CymaGlyphs to be wonderfully dynamic. Our ears can easily detect the changes in the harmonics and the CymaScope now reveals them–probably a first in acoustic physics.
Capturing the dynamics was only possible with HD video but taming the dynamics of the piano’s first strike, followed by the short plateau and long decay phase, was tricky. We achieved the result with the help of a professional audio compressor operating in real time.
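The team does not say which compressor they used, so as a purely illustrative sketch, here is a minimal feed-forward dynamic range compressor in Python with NumPy. The threshold, ratio, and attack/release values are hypothetical; the idea is the same as in their setup: tame the loud first strike while leaving the quiet decay phase untouched.

```python
import numpy as np

def compress(signal, rate, threshold_db=-20.0, ratio=4.0,
             attack_ms=5.0, release_ms=200.0):
    """Feed-forward dynamic range compressor with a peak envelope follower."""
    # Per-sample smoothing coefficients for the envelope follower.
    att = np.exp(-1.0 / (rate * attack_ms / 1000.0))
    rel = np.exp(-1.0 / (rate * release_ms / 1000.0))
    env = 0.0
    out = np.empty_like(signal)
    for i, x in enumerate(signal):
        level = abs(x)
        # Fast attack when the level rises, slow release when it falls.
        coeff = att if level > env else rel
        env = coeff * env + (1.0 - coeff) * level
        level_db = 20.0 * np.log10(max(env, 1e-10))
        # Reduce gain above the threshold, by the given ratio.
        over = max(level_db - threshold_db, 0.0)
        gain_db = -over * (1.0 - 1.0 / ratio)
        out[i] = x * 10.0 ** (gain_db / 20.0)
    return out
```

Signals below the threshold pass through unchanged, so the long decay of a piano note survives while the initial strike is pulled down.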
The Cymascope is an instrument that makes sound or music visible, creating detailed 3D impressions of sound or music vibrations. Here the rapidly expanding sphere is captured in a frozen moment. The interior reveals a beautiful and complex structure representing the rich harmonic nature of violin music.
Oscar Niemeyer’s celebrated work will soon be available in 3D, courtesy of Paddle8, the online art seller, and Visionaire, a multi-format art publication. See it HERE
Univers Revolved is a three-dimensional alphabet. It invites readers to play with their imagination and think beyond the conventions of their familiar reading method.
A team at Manchester Royal Infirmary hospital, England, claim to be the first surgeons to perform keyhole surgery using 3D cameras and monitors — and embarrassingly clunky spectacles. Furthermore, if that wasn’t high-tech enough, the lead surgeon also performed the surgery using a hand-held robotic claw.
Call it an automated photograph station, a seven-camera system, a 3-D model showcase, or a digital reconstruction tool: OrcaM has been described as all of these. Whatever the tag, the “OrcaM” name stands for Orbital Camera System, according to its Germany-based developers NEK GmbH. A video demo made the rounds of web gadget blogs and news sites this week as a camera system to watch.
The OrcaM system involves a large sphere, likened by one viewer to a giant maw, inside which one places the object to be scanned in 3-D. Once the object is in place, the sphere is sealed shut and the seven cameras and lights go to work. The cameras take simultaneous high-definition photos of the object from different angles. To help define the object’s geometry, various combinations of lights illuminate the object differently for every shot, capturing the finest details. After the photos are taken, computer processing of the images creates the 3-D model. Observers say the end result is in highly impressive agreement with the real object.
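NEK GmbH has not published OrcaM’s algorithms, but using many known light directions to recover fine surface detail is the classic idea behind photometric stereo. As a hedged sketch only (a Lambertian reflectance assumption, with all names illustrative), per-pixel surface normals and albedo can be solved by least squares from a stack of images taken under different lights:

```python
import numpy as np

def photometric_stereo(images, lights):
    """Recover per-pixel surface normals and albedo from images taken
    from one viewpoint under different known light directions,
    assuming a Lambertian (matte) surface.

    images: (k, h, w) array of grayscale intensities, one per light.
    lights: (k, 3) array of unit light-direction vectors.
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                        # (k, h*w) stacked pixels
    # Lambertian model: I = lights @ (albedo * normal); solve least squares.
    G, *_ = np.linalg.lstsq(lights, I, rcond=None)   # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-12)
    return normals.reshape(3, h, w), albedo.reshape(h, w)
```

With three or more non-coplanar lights the per-pixel system is solvable, which is one reason a rig like OrcaM fires many distinct lighting combinations per camera position.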
This video demonstrates the OrcaM 3D reconstruction system, developed in the context of a project of the Augmented Vision department of DFKI (http://av.dfki.de).
It shows how the hardware is opened to insert an object to be reconstructed. Currently, objects are limited to a maximum of 80 cm in diameter and a weight of approximately 100 kg.
After the sphere is closed again, the acquisition process is fully automatic, though it can be tuned to account for complicated object geometries. Note that the acquisition process shown has been extremely condensed and only sketches some of the steps needed to acquire the respective information for a single camera position: horizontal and vertical fringe projection, directed illumination with light patches, rotation of the carrier, and so on. After acquisition, the reconstruction of the object is computed fully automatically. A rendered result of the vase can be found at the end of the video. Note first that the rendering uses a real-world high-resolution HDR environment, which is reflected in the vase and introduces a fair amount of blue sky colour into the rendering. Secondly, note that the reconstructed vase is NOT symmetric, in perfect agreement with the original.
Within ‘Somewhere’ we are transported to a time where the boundaries between what is real and what is simulated are blurred. We live online and download places to relax in, parks and shopping malls. We can even interact with our friends as if they were in the same room, via simulated tele-presence. Everyone is connected and immersed in nanorobotic replications of any kind of object or furnishing, downloadable on credit-based systems. Distance and time become as alien as the ‘offline’. The local becomes the global and the global becomes the local. Consumer-based capitalism has changed forever. A truly ‘glocalised’ world. The singularity is near.
The film places us in this vision, observing an average inhabitant within the ever-changing environment of the latest SimuHouse, from a painting to a park and from a telephone call to a shopping mall. That is, until there is a leak in the system and everything malfunctions. The film concludes with the house being forced to reset, giving the character and the viewer a stark reminder that nothing is ‘real’, not even her dog, which re-materialises in front of her.
CREDITS:
Directed By: Paul Nicholls
3D, 2D, Tracking, Post Production, Compositing, Camera Work: Paul Nicholls
Cast: Indre Balestuta, Iffy
Sound Design: Jesse Rope
Narration: Robert Leaf
Greek Vocal Talent: Lia Loanniti
Serbian Vocal Talent: Mina Micevic
Store Voice: Guillaume Nyssens
System Voice: Anita Shim
Music By: Kourosh Dini, Twighlight Archive, Pete Berwick
http://www.factoryfifteen.com/