Science of Cyborgs

Michel Maharbiz answers an audience member’s question during the Q&A session.

A beetle is flying through the air, wings buzzing as it moves forward, and then – suddenly – it falls to the ground. Then the wings start up again and the beetle is back in the air – then, once more, the wings halt and the beetle lands on the floor. It’s almost as though it’s being controlled by a remote, flying and dropping out of the air as if someone were pushing the “Start” and “Stop” buttons over and over again.

But wait – that beetle is being controlled by a remote. How? It’s a cyborg, of course – half insect, half machine. As the beetle’s creator Michel Maharbiz (associate professor with the Department of Electrical Engineering and Computer Science at the University of California, Berkeley) explained to the audience of The Exchange’s latest event, “The Science of Cyborgs,” the beetles (June bugs and other varieties) have been implanted with neural stimulating electrodes in their brains and muscles. Researchers fly the beetle via radio signals sent between a hacked Wii remote and the electrodes.
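For a feel of what “flying the beetle by remote” looks like in software, here is a purely illustrative sketch – not the Berkeley team’s actual code – in which button presses on a remote are mapped to start/stop stimulation commands sent over a radio link. Every class, command name, and timing value below is hypothetical.

```python
# Hypothetical sketch of a remote-control loop for the cyborg beetle demo.
# None of these classes or command names come from the actual Berkeley
# system; they only illustrate the idea of mapping button presses to
# stimulation commands sent over a radio link.

import time


class RadioLink:
    """Stand-in for the wireless link to the beetle's implanted stimulator."""

    def send(self, command: str) -> None:
        # A real system would transmit a packet to the on-board stimulator;
        # here we just log the command.
        print(f"radio -> beetle: {command}")


def flight_demo(radio: RadioLink, presses: list[str]) -> None:
    """Replay a sequence of 'start'/'stop' button presses from the remote."""
    for press in presses:
        if press == "start":
            # Stimulation starts the wingbeat and the beetle takes off.
            radio.send("STIMULATE_FLIGHT_ON")
        elif press == "stop":
            # Halting stimulation lets the wings stop and the beetle land.
            radio.send("STIMULATE_FLIGHT_OFF")
        time.sleep(0.5)  # pause between commands, as in the on/off demo


if __name__ == "__main__":
    flight_demo(RadioLink(), ["start", "stop", "start", "stop"])
```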

It’s an amazing technological advancement – but so are artificial retinas and robotic fish, the subjects of two other presentations by Mark Humayun (Cornelius J. Pings Chair in Biomedical Sciences at the Doheny Eye Institute at the University of Southern California) and Malcolm MacIver (associate professor of mechanical and biomedical engineering at Northwestern University), respectively. “The Science of Cyborgs,” held March 1 at the Directors Guild of America, also featured Jonathan Mostow (director of Terminator 3: Rise of the Machines and Surrogates) and moderators Chuck Bryant and Josh Clark from HowStuffWorks’ Stuff You Should Know podcast.

Before the cyborg beetles, artificial retinas, and robotic fish, Bryant and Clark opened the evening with a look at transhumanism – the interface between humans and machines. What did they want the audience to know? “If you think this is in the future, if you think this is all something in the movies, then we’ve got some news for you, the future is now,” said Bryant.

From left to right: Josh Clark, Jonathan Mostow, Mark Humayun, Michel Maharbiz, Malcolm MacIver, and Chuck Bryant.

In television and film, though, that future seems, well, dark. Mostow offered his two cents on why Hollywood gravitates toward a negative view of technology. “The idea that I think is behind [films with evil technology] is it’s servicing some kind of generalized anxiety we have about how we relate to technology. In the sense that we are now more connected to each other than ever before, and yet with all of this connectedness, there’s less human contact than there’s ever been before,” he said. “Why do we go to the dark place? I think we go to the dark place because these movies allow us to explore our anxieties.”

But for every film with evil cyborgs, there is a real-life example of technology being part of the solution, not the problem. Take, for instance, the U.S. Department of Energy’s Artificial Retina Project. The project (led by Mark Humayun, a member of both the National Academy of Sciences and the Institute of Medicine) aims to restore sight to the blind. “If you look at your five senses, the sensory input through your eye is the greatest. It’s about 200 megabytes through each one of your eyes. So, how are we going to restore sight to the blind?” Humayun said. Previously, patients were dissuaded from experimental operations to restore sight because the operations required implanting devices in the brain. “I started thinking, ‘If the optic nerve from the eye to the brain was still intact, could we jumpstart the blind eye?’ And that’s where this work began,” explained Humayun.

The artificial retina is a system consisting of a retinal implant with a receiver and a camera mounted in eyeglasses. “[The camera] captures the image and then converts it – all in real time – and sends it to the implant in the eye. When this information is sent in, the receiver decodes it … and the information is then sent by these very delicate, Saran wrap–thin electrodes that ‘jumpstart’ the otherwise blind eye,” Humayun explained. The artificial retina does not restore full vision – at the moment it provides only 60 pixels of black-and-white imagery. However, most patients relearned visual cues within two months. “[Patients] tell us they could detect lights on the Christmas tree for the first time. For the first time they could see fireworks on Fourth of July,” said Humayun.
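To get a sense of how coarse a 60-pixel image is, the sketch below – illustrative only, not the device’s actual firmware or electrode layout – reduces a grayscale camera frame to an assumed 6-row by 10-column grid of on/off “electrodes” by block-averaging and thresholding. The grid shape and threshold are assumptions.

```python
# Illustrative only: reduce a grayscale camera frame to a coarse 6x10
# black-and-white grid, mimicking the ~60-pixel resolution quoted for the
# artificial retina. The 6x10 layout and threshold are assumptions, not the
# device's actual electrode geometry or encoding.

import numpy as np


def downsample_to_electrodes(frame: np.ndarray, rows: int = 6, cols: int = 10,
                             threshold: float = 0.5) -> np.ndarray:
    """Average the frame into rows x cols blocks, then binarize each block."""
    h, w = frame.shape
    block_h, block_w = h // rows, w // cols
    # Crop so the frame divides evenly into blocks, then average each block.
    cropped = frame[:block_h * rows, :block_w * cols]
    blocks = cropped.reshape(rows, block_h, cols, block_w).mean(axis=(1, 3))
    # Each "electrode" is either on (bright) or off (dark).
    return (blocks > threshold).astype(np.uint8)


if __name__ == "__main__":
    # Fake 480x640 grayscale frame with values in [0, 1).
    rng = np.random.default_rng(0)
    frame = rng.random((480, 640))
    grid = downsample_to_electrodes(frame)
    print(grid.shape)  # (6, 10) -> 60 "pixels"
```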

Another technology meant to benefit humanity is Malcolm MacIver’s robotic fish. A team of researchers (led by MacIver) developed the robot by studying the black ghost knifefish, which emits a weak electric signal to detect prey. Why study and then make a robot of the knifefish? MacIver showed the audience a video of the fish swimming past its prey. The knifefish automatically reversed its body and attacked the prey. “It’s really interesting to me to sort of highlight what it is that animals can do that we can’t really touch yet with machines. And really, what that’s about is we’re good as engineers at devising systems that can go very fast in a straight line,” he said. “What animals can do very well is moving with extraordinary agility around cluttered spaces. What exactly that requires is sucking in this vast amount of sensory data, doing a quick little bit of processing in a few milliseconds and spitting that out.”

[Video: Black Ghost Knifefish Swimming, from Malcolm MacIver on Vimeo]

The robotic fish, dubbed GhostBot, mimics the knifefish with 32 independently controlled motors, 32 very thin rods with a lycra fin, and an electrosensory system. Built from computer simulations of the knifefish, GhostBot can move like its animal counterpart: forward, backward, vertically, instantly. What’s the immediate application of the GhostBot’s technology? “It’s a system that has exceptional agility. One of the problems we saw with contemporary underwater vehicles … was that current underwater vehicles are about as maneuverable as a submerged bathtub,” explained MacIver. “What we’d like to do is develop this alternative approach which gives you very high agility and some autonomy to that system so that it senses when it’s about to collide. We’d like to give the people who maintain these submerged structures a technology for doing up-close work and avoiding collisions.”

[Video: GhostBot Swimming, from Malcolm MacIver on Vimeo]
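The ribbon-fin arrangement described above lends itself to a simple mental model: each of the 32 rods oscillates with a fixed phase lag from its neighbor, so the fin carries a traveling wave, and the direction of that wave sets whether the robot swims forward or backward. The sketch below illustrates that idea only; the amplitude, frequency, and wave count are assumed values, not GhostBot’s actual control parameters.

```python
# Illustrative sketch of the kind of ribbon-fin control a 32-rod robot like
# GhostBot could use: each rod oscillates sinusoidally with a phase offset
# from its neighbor, so the lycra fin carries a traveling wave. Amplitude,
# frequency, and wave count are assumed values, not GhostBot's parameters.

import math

NUM_RODS = 32          # independently controlled rods along the fin
AMPLITUDE_DEG = 30.0   # peak rod deflection (assumed)
FREQUENCY_HZ = 3.0     # oscillation frequency (assumed)
WAVES_ON_FIN = 2.0     # number of wavelengths along the fin (assumed)


def rod_angles(t: float, direction: int = +1) -> list[float]:
    """Rod deflection angles (degrees) at time t for a traveling wave.

    direction=+1 sends the wave head-to-tail (swim one way);
    direction=-1 reverses the wave (swim the other way).
    """
    angles = []
    for i in range(NUM_RODS):
        phase = direction * 2.0 * math.pi * WAVES_ON_FIN * i / NUM_RODS
        angles.append(AMPLITUDE_DEG * math.sin(2.0 * math.pi * FREQUENCY_HZ * t - phase))
    return angles


if __name__ == "__main__":
    # Print the first few rod angles a tenth of a second into a stroke.
    print([round(a, 1) for a in rod_angles(0.1)[:5]])
```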

Beetles hardwired for remote-controlled flight, artificial retinas that restore sight to the blind, and robotic fish that point the way to better underwater vehicles – there’s never a dull moment at an Exchange event. The audience members flocked to the speakers at the end of the event, wanting just a little more of the amazing ideas presented. As audience member Kath Lingenfelter (writer/producer on House) remarked, “Whenever I receive an invitation from The Exchange, I always say yes. Every event, be it screening, tour, or lecture, is an adventure. I walk away energized and full of ideas. Some people thrill at the prospect of getting Spielberg on the phone, but that’s nothing compared to being able to get Neil deGrasse Tyson or Sean Carroll.”

 


The statements and opinions expressed in this piece are those of the event participants and do not necessarily reflect the views of any organization or agency that provided support for this event or of the National Academies of Sciences, Engineering, and Medicine.