The John B. Pierce Laboratory

Dr. Pieribone awarded multi-million-dollar DARPA contract to explore ways to create systems where the brain transforms digital images into the equivalent of eyesight

Published in YALE NEWS on November 14, 2017
Written by Sonya Collins

Photo caption: A research team led by Vincent Pieribone is exploring ways to create systems where the brain transforms digital images into the equivalent of eyesight. Jason Crawford is investigating genes that may be integral to achieving that result.

The process by which the brain decodes and recognizes a visual image is mostly unknown. Solving its mysteries is the essential first step toward the goal of restoring vision to people who are blind.

An international, multidisciplinary team of researchers led by Vincent A. Pieribone, professor of cellular and molecular physiology and of neuroscience at Yale School of Medicine, has received a four-year, multi-million-dollar contract from the Pentagon’s Defense Advanced Research Projects Agency (DARPA) to take that step.

DARPA’s Neural Engineering System Design (NESD) program aims to develop a portable neural interface system capable of providing precise communication between the brain and the digital world. Such an interface could allow individuals to operate a computer with their minds rather than their fingers, say the researchers. For people who lack sight, it could also bypass the eyes and transmit visual images directly to the brain. The implications for people with sensory disabilities would be immense, they note, and the technology could benefit other areas of society as well.

Pieribone’s is one of six teams based at institutions across the country that DARPA has enlisted to generate the basic knowledge necessary to develop this interface and create non-invasive technology potentially applicable to sensory restoration. Each team will take a different approach to learning how the brain processes sensory information.

“The big problem is our inability to monitor the activity of nerve cells through the entire pathway from hearing or seeing to processing that sound or image,” says Pieribone, who is also director and a fellow of the John B. Pierce Laboratory. His team of neuroscientists, engineers, computational scientists, chemists, a neurosurgeon, and a marine biologist will approach the problem in two ways.

First, they will attempt to modify neurons that process visual stimuli, says Pieribone, “so that every time they fire, they produce a burst of light.” Next, chemical biologist Jason M. Crawford, associate professor of chemistry and of microbial pathogenesis, in collaboration with a marine biologist and a biochemist, will seek to identify the gene best suited to accomplish that result. By studying bioluminescent sharks, jellyfish, and a glowing variety of small, insect-like sea creatures called copepods, they plan to learn how luminescent molecules called luciferins are genetically encoded. Of the various luciferins the team will study, the one that now appears most likely to produce the best result, says Crawford, is coelenterazine, which is built from simple amino acids found in human metabolism.

“Some luciferins are from weird animals, and they use substrates not found in humans,” says Crawford. “We’ve identified candidate enzymes in comb jellies that we think are involved in coelenterazine production.”

When the investigators find a promising combination of genes and luciferin, they will begin observations of neurons into which those components have been introduced.

Meanwhile, engineers will develop a tiny, lens-less imaging system, about one centimeter square and as thick as two sheets of paper, that will fit under the skull and, it is hoped, record the activity of brain cells as the brain processes images. “It’s ambitious,” says Pieribone. “Some of these pieces have been done before, but independently, never in a way that would allow them all to work together in one package.”

For Pieribone, even partial success would be a huge step forward. “If we don’t get to the goal line in four years, but we get pretty close, the whole field is going to be advanced as a result,” he says. If all ultimately works as hoped, Pieribone envisions a commercially available system, probably still decades away, he notes, in which tiny cameras built into eyeglasses would create images that travel to a device implanted in the user, which then would convert them into data that the brain could transform into the equivalent of eyesight.

The researchers are funded by an initial infusion of $6 million. They will continue to receive annual multi-million-dollar installments as long as they reach scheduled milestones.