With 1 quadrillion connections, 100 billion neurons, and 100,000 miles of blood vessels, the human brain may seem incomprehensibly complex. However, scientists are attempting to simplify their analysis of brain activity by examining a considerably less sophisticated subject: the mind of the fruit fly. By using machine vision and machine-learning algorithms, researchers are developing an understanding of how neural activity relates to fruit fly behavior today, and perhaps to human behavior in the future.
The experiment, conducted at the Howard Hughes Medical Institute’s Janelia Research Campus in Ashburn, Virginia, involves analyzing massive amounts of video recordings of fruit flies in action. Manually reviewing and classifying all of this video would take enormous time and effort, so Janelia instead uses machine vision to observe the insects’ movements and machine learning to categorize those activities.
The institute’s machine vision software automatically tracks the fruit flies, estimating their position and pose in every frame of an input video. The software examines actions such as forward or backward movement and orientation of the flies’ wings.
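The idea of estimating a fly's position and pose in each frame can be sketched in a few lines. The snippet below is a simplified illustration, not Janelia's actual tracking software: it assumes each video frame has already been reduced to a binary foreground mask of a single fly, and it recovers the fly's position as the blob centroid and its body orientation as the blob's principal axis.

```python
import numpy as np

def track_fly(mask: np.ndarray):
    """Estimate a fly's position (centroid) and body orientation (degrees)
    from a binary foreground mask of one video frame.

    Hypothetical helper for illustration only -- real trackers also handle
    multiple flies, occlusion, and head/tail disambiguation.
    """
    ys, xs = np.nonzero(mask)           # pixel coordinates of the fly blob
    cx, cy = xs.mean(), ys.mean()       # centroid = position estimate

    # Orientation from the blob's second central moments:
    # the principal axis of the pixel distribution.
    dx, dy = xs - cx, ys - cy
    cov = np.cov(np.stack([dx, dy]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]   # major-axis direction
    angle = np.degrees(np.arctan2(major[1], major[0]))
    return (cx, cy), angle
```

Running this on every frame yields a trajectory of positions and orientations, from which quantities such as forward or backward movement can be derived.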
The algorithm compares these behaviors with definitions developed by biologists. The biologists manually annotate the insects’ behaviors in a small portion of the video, and the machine-learning algorithm searches for the parametric function that can reproduce these annotations automatically, according to Janelia.
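The train-from-annotations idea described above can be illustrated with a toy supervised classifier. The sketch below uses logistic regression as a stand-in (the source does not specify Janelia's actual model): biologists label a small set of frames, and the algorithm fits a parametric function mapping per-frame features, such as speed or wing angle, to those labels so that the rest of the video can be annotated automatically. All function names here are hypothetical.

```python
import numpy as np

def train_classifier(X, y, lr=0.5, steps=2000):
    """Fit logistic-regression weights by gradient descent.
    X: (n_frames, n_features) per-frame features (e.g. speed, wing angle).
    y: (n_frames,) binary labels from manual annotation (1 = behavior seen).
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)          # average gradient step
    return w

def predict(w, X):
    """Apply the learned function to unlabeled frames."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
```

Once trained on the small annotated portion, `predict` can label every frame of the remaining video without further human effort.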
Manually reviewing such video would require a person to visually monitor 20,000 videos of fruit-fly activity, each 16 minutes in length, according to Science Magazine. It would also require a level of attention to detail that would be challenging or impossible for human reviewers.
The algorithm performs this review far faster than a human can. It also detects changes in behavior with greater precision than human review allows, registering variations in walking speed as small as 5 percent.
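To see how a 5 percent variation in walking speed might be quantified, consider this minimal sketch. It assumes per-frame centroid positions from a tracker and a known frame rate (the 30 fps value is an assumption, not from the source); speed is simply mean frame-to-frame displacement scaled to per-second units.

```python
import numpy as np

def walking_speed(positions, fps=30.0):
    """Mean walking speed (pixels/second) from per-frame (x, y) centroids.
    positions: (n_frames, 2) array of tracked positions.
    """
    steps = np.diff(positions, axis=0)             # per-frame displacement
    return np.linalg.norm(steps, axis=1).mean() * fps

def percent_change(baseline, measured):
    """Relative change in speed, as a percentage of the baseline."""
    return 100.0 * (measured - baseline) / baseline
```

A fly moving 1.05 pixels per frame instead of 1.0 would register as exactly a 5 percent speed increase, a difference far too subtle for a human watching thousands of hours of video.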
The institute’s work could lead to a better understanding of how an organism’s behavior relates to particular patterns of neural activity in the brain.
“This is going to be a huge and valuable tool for the community,” said Bing Zhang, a fly neurobiologist at the University of Missouri in Columbia, in comments made to Science Magazine. “I am sure that follow-up studies will show this is a gold mine.”
Janelia said the new machine vision and learning algorithms it is developing are well suited to large-scale neuroscience experiments and will be useful to scientists working on other neuroscience projects. The data analysis techniques emerging from this work will also help answer new kinds of questions about the brain and behavior.
Such future studies and questions may involve using machine vision and learning algorithms to gain a greater understanding of how neurological activity in the human brain relates to individuals’ actions.
Tyler Schulze is vice president, strategy & development at Veritone. He serves as general manager for developer partnerships, cognitive engine ecosystem, and media ingestion for the Veritone platform. Learn more about our platform and join the Veritone developer ecosystem today.