Researchers at the New Jersey Institute of Technology, while testing the “station keeping” behavior of the glass knifefish, have created an augmented reality system that tricks the animal’s electric sensing organs in real time. The fish keeps itself hidden by holding its position inside refuges, and the researchers wanted to understand what kind of autonomous sensing the animal uses to keep itself safe.
“What is most exciting is that this study has allowed us to explore feedback in ways that we have been dreaming about for over 10 years,” said Eric Fortune, associate professor at NJIT. “This is perhaps the first study where augmented reality has been used to probe, in real time, this fundamental process of movement-based active sensing, which nearly all animals use to perceive the environment around them.”
The fish isn’t wearing a headset; instead, the researchers simulated the motion of a refuge waving in the water.
“We’ve known for a long time that these fish will follow the position of their refuge, but more recently we discovered that they generate small movements that reminded us of the tiny movements that are seen in human eyes,” said Fortune. “That led us to devise our augmented reality system and see if we could experimentally perturb the relationship between the sensory and motor systems of these fish without completely unlinking them. Until now, this was very hard to do.”
To create their test, the researchers put a fish inside a tube and synced the motion of the tube to the fish’s own movements. As the fish swam forward and backward, the researchers watched what happened when the fish could see that it was directly affecting the motion of the refuge. When the refuge was coupled to the fish’s movement in this way, they were able to confirm that the fish could tell the experience wasn’t “real” in a natural sense. In short, the fish knew it was in a virtual environment.
“It turns out the fish behave differently when the stimulus is controlled by the individual versus when the stimulus is played back to them,” said Fortune. “This experiment demonstrates that the phenomenon that we are observing is due to feedback the fish receives from its own movement. Essentially, the animal seems to know that it is controlling the sensory world around it.”
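The logic of the experiment can be sketched in a few lines of code. The following is a minimal conceptual illustration, not the researchers’ actual rig or analysis: in the “closed-loop” condition the refuge position is computed from the fish’s own movement on every time step, while in the “replay” condition the very same trajectory is played back with no coupling to what the fish is doing now. The function names, the gain parameter, and the simulated fish trajectory are all illustrative assumptions.

```python
import numpy as np

def closed_loop_refuge(fish_position, gain=1.0):
    """Refuge trajectory coupled to the fish: stimulus = gain * fish movement."""
    return gain * np.asarray(fish_position)

def replay_refuge(recorded_refuge):
    """Open-loop playback: a stored trajectory, independent of the fish's current movement."""
    return np.asarray(recorded_refuge)

# Illustrative example: a fish drifting sinusoidally while station keeping.
t = np.linspace(0, 10, 1000)
fish = 0.5 * np.sin(2 * np.pi * 0.2 * t)   # hypothetical fore-aft position of the fish

coupled = closed_loop_refuge(fish)         # refuge responds to the fish in real time
playback = replay_refuge(coupled)          # identical motion, but no feedback loop
```

The point of the comparison is that the stimulus itself is identical in both conditions; only the feedback relationship differs, which is what lets the experiment isolate the effect of self-generated movement on the fish’s sensing behavior.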
Whether or not the fish can play Job Simulator is still unclear.
“Our hope is that researchers will conduct similar experiments to learn more about vision in humans, which could give us valuable knowledge about our own neurobiology,” said Fortune. “At the same time, because animals continue to be so much better at vision and control of movement than any artificial system that has been devised, we think that engineers could take the data we’ve published and translate that into more powerful feedback control systems.”
Author: John Biggs