Charles Higgins, an associate professor at the University of Arizona, has built a robot that is guided by the brain and eyes of a moth. Higgins told Computerworld that he basically straps a hawk moth to the robot and then puts electrodes in neurons that deal with sight in the moth's brain. Then the robot responds to what the moth is seeing -- when something approaches the moth, the robot moves out of the way.
Higgins explained that he had been trying to build a computer chip that would do what brains do when processing visual images. He found that a chip that can function nearly like the human brain would cost about $60,000.
"At that price, I thought I was getting lower quality than if I was just accessing the brain of an insect which costs, well, considerably less," he said. "If you have a living system, it has sensory systems that are far beyond what we can build. It's doable, but we're having to push the limits of current technology to do it."
This organically guided, 12-in.-tall robot on wheels may be pushing the technology envelope right now, but it's just the seed of what is coming in terms of combining living tissue with computer components, according to Higgins.
"In future decades, this will be not surprising," he said. "Most computers will have some kind of living component to them. In time, our knowledge of biology will get to a point where if your heart is failing, we won't wait for a donor. We'll just grow you one. We'll be able to do that with brains, too. If I could grow brains, I could really make computing efficient."
While the moth is physically attached to the robot at this point, Higgins said he expects that one day only the brain itself will be needed. "Can we grow a brain that does what we want it to do? Can I grow an eye with a brain connected to it and have it do what I need it to do? Can I engineer an organism and hook it into my artificial system?" he asked. "Yes, I really think this is coming. There are things biology can do so much better. Think of a computer that can be both living and nonliving. We'd be growing tissue that has no more intelligence than a liver or a heart. I don't see ethical issues here."
He does see an ethical line, though. "Our goal is not to hook up primate brains to a robot," said Higgins. "There's the possibility, when you start to tap into brains, for all sorts of evil applications. There are certainly all these ethical issues when you start talking about human and primate brains."
Will future desktops and laptops have organic parts?
Why not, said Higgins. "Computers now are good at chess and Word and Excel, but they're not good at being flexible or interacting with other users," he added. "There may be some way to use biological computing to actually make our computers seem more intelligent."
Right now, Higgins has successfully attached electrodes to a single vision neuron in the moth's brain. (Different neurons handle different functions, such as vision and the sense of smell. Humans have billions of neurons; insects have far fewer, on the order of hundreds of thousands.) Now, Higgins is experimenting with connecting four electrodes to neurons on both sides of the moth's brain, expanding the visual information the robot receives. "That should give me information about things moving on the left and right of the animal, at different speeds and moving up and down," he explained.
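The article doesn't describe how the neural signals drive the robot, but the left/right setup suggests a simple differential-drive scheme. The sketch below is purely illustrative (the function names, gains, and spike-rate inputs are assumptions, not from Higgins' system): it maps spike rates from vision neurons on each side of the brain to wheel speeds, so the robot steers away from whichever side sees an approaching object.

```python
def clamp(x, lo=-1.0, hi=1.0):
    """Keep a wheel command within the motor's valid range."""
    return max(lo, min(hi, x))

def avoidance_command(left_hz, right_hz, gain=0.01, cruise=0.5):
    """Hypothetical mapping from neural activity to wheel speeds.

    left_hz, right_hz: firing rates (spikes/sec) of vision neurons on
    each side of the moth's brain; higher activity is taken to mean
    something is approaching on that side.
    Returns (left_wheel, right_wheel) speeds for a differential drive.
    """
    # More activity on the right -> positive turn -> steer left (away).
    turn = gain * (right_hz - left_hz)
    return clamp(cruise - turn), clamp(cruise + turn)
```

With equal activity on both sides the robot cruises straight; a burst on the left side slows the right wheel and speeds the left, veering the robot rightward, away from the stimulus.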
Higgins is also experimenting with tapping into the moth's muscles and olfactory senses. If he can work with the muscles, for instance, a strapped-down moth trying to move in a certain direction would actually propel the robot.
Source: http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9050258&pageNumber=2