Digital Media · Performativity · Technology

Brain Amplifiers and the Future of Computer Interaction

Amplifiers for the human brain, designed to allow people with paralysis to interact with the world, aren’t the most easily understood technology. So g.tec, the company that makes them, has come up with the following creative marketing strategy: Convince us that we’ll soon be interacting with computers through thought alone.

Here, for example, is a university project in which a student uses his brain to control a Rube-Goldbergian sort of Etch A Sketch, allowing him to write—albeit very crudely and slowly—without picking up a pen. And today at the tech fair CeBIT, the company unveiled a new application that allows people, able-bodied and not, to paint pictures without lifting a finger.

Text and Images via Business Insider. Read full article HERE

Human-ities · Science · Technology

Blueprint for the Brain

How can three pounds of jelly inside our skulls enable us to do everything that makes us human? For centuries, scientists have been fascinated and puzzled by the mysterious workings of the brain. Now, for the first time, they can re-create in the computer the shapes of every one of the billions of nerve cells that make up our brains, the component parts of the intricate neural circuits that allow us to move, see and hear, to feel and to think. Armed with this new tool, scientists are beginning to decipher the secrets of the brain’s architecture, which may one day enable us to build smart technologies that surpass the capabilities of anything we have today.

Text via Science Bytes. Continue HERE

Human-ities · Social/Politics · Technology · Theory

A Small World After All?: On the paradox of our increasing insularity in the era of globalization

When the Cold War ended, the work of America’s intelligence analysts suddenly became vastly more difficult. In the past, they had known who the nation’s main adversaries were and what bits of information they needed to acquire about them: the number of SS-9 missiles Moscow could deploy, for example, or the number of warheads each missile could carry. The U.S. intelligence community had been in search of secrets—facts that exist but are hidden by one government from another. After the Soviet Union’s collapse, as Bruce Berkowitz and Allan Goodman observe in Best Truth: Intelligence in the Information Age (2002), it found a new role thrust upon it: the untangling of mysteries.

Computer security expert Susan Landau identifies the 1979 Islamic Revolution in Iran as one of the first indicators that the intelligence community needed to shift its focus from secrets to mysteries. On its surface, Iran was a strong, stable ally of the United States, an “island of stability” in the region, according to President Jimmy Carter. The rapid ouster of the shah and a referendum that turned a monarchy into a theocracy led by a formerly exiled religious scholar left governments around the world shocked and baffled.

Excerpt of an article written by Ethan Zuckerman, at The Wilson Quarterly. Continue HERE
Image via

Design · Digital Media · Human-ities · Technology

Hacker Historian George Dyson sits down with Wired’s Kevin Kelly

The two most powerful technologies of the 20th century—the nuclear bomb and the computer—were invented at the same time and by the same group of young people. But while the history of the Manhattan Project has been well told, the origin of the computer is relatively unknown. In his new book, Turing’s Cathedral, historian George Dyson, who grew up among these proto-hackers in Princeton, New Jersey, tells the story of how Alan Turing, John von Neumann, and a small band of other geniuses not only built the computer but foresaw the world it would create. Dyson talked to Wired about the big bang of the digital universe.

Wired: Because your father, Freeman Dyson, worked at the Institute for Advanced Study in Princeton, you grew up around folks who were building one of the first computers. Was that cool?

George Dyson: The institute was a pretty boring place, full of theoreticians writing papers. But in a building far away from everyone else, some engineers were building a computer, one of the first to have a fully electronic random-access memory. For a kid in the 1950s, it was the most exciting thing around. I mean, they called it the MANIAC! The computer building was off-limits to children, but Julian Bigelow, the chief engineer, stored a lot of surplus electronic equipment in a barn, and I grew up playing there and taking things apart.

Wired: Did that experience influence how you thought about computers later?

Dyson: Yes. I tried to get as far away from them as possible.

Wired: Why?

Dyson: Computers were going to take over the world. So I left high school in the 1960s to live on the islands of British Columbia. I worked on boats and built a house 95 feet up in a Douglas fir tree. I wasn’t antitechnology; I loved chain saws and tools and diesel engines. But I wanted to keep my distance from computers.

Wired: What changed your mind?

Continue HERE

Sculpt/Install · Sonic/Musical

Years by Bartholomäus Traubeck

A record player that plays slices of wood. Tree-ring data is translated into music, 2011. Modified turntable, computer, vvvv, camera, acrylic glass, veneer, approx. 90 × 50 × 50 cm.

A tree’s growth rings are analyzed for their strength, thickness and rate of growth. This data serves as the basis for a generative process that outputs piano music. It is mapped to a scale that is in turn defined by the overall appearance of the wood (ranging from dark to light, and from strong to subtle texture). The foundation for the music lies in the defined rule set of the programming and the hardware setup, but the data acquired from each tree interprets that rule set very differently.
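
A minimal sketch of how such a ring-to-note mapping could work, assuming hypothetical measurements (thickness and darkness per ring) and an illustrative pentatonic scale; the piece itself was built with vvvv and camera input, so every name and value below is an assumption for illustration, not Traubeck's implementation:

    # Hypothetical sketch: map tree-ring measurements to piano notes.
    # All names and numbers are illustrative assumptions, not the artist's setup.

    # Each ring is described by (thickness_mm, darkness), darkness in 0.0-1.0.
    rings = [
        (2.4, 0.30), (1.1, 0.55), (3.0, 0.20), (0.8, 0.70), (1.9, 0.45),
    ]

    # The overall appearance of the wood chooses the scale: a lighter slice
    # might get a major pentatonic, a darker one a minor pentatonic.
    MAJOR_PENTATONIC = [60, 62, 64, 67, 69]   # MIDI notes around middle C
    MINOR_PENTATONIC = [57, 60, 62, 64, 67]

    avg_darkness = sum(d for _, d in rings) / len(rings)
    scale = MINOR_PENTATONIC if avg_darkness > 0.5 else MAJOR_PENTATONIC

    def ring_to_note(thickness_mm, darkness, scale):
        """Map one ring to a (midi_note, velocity, duration) triple."""
        # Darker rings pick lower scale degrees; lighter rings pick higher ones.
        degree = int(round((1.0 - darkness) * (len(scale) - 1)))
        note = scale[degree]
        # Thicker rings (faster growth years) play louder and longer.
        velocity = min(127, int(40 + thickness_mm * 25))
        duration = 0.2 + thickness_mm * 0.1   # seconds
        return note, velocity, duration

    for thickness, darkness in rings:
        print(ring_to_note(thickness, darkness, scale))

The point of the sketch is only the structure of the idea: a fixed rule set (the mapping and the scale) that every slice of wood reinterprets through its own measurements.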

Thanks to Land Salzburg, Schmiede, Pro-ject Audio, Karla Spiluttini, Ivo Francx, vvvv, Rohol.

traubeck.com

Thanks to Mark Kuykendall