Humans take the ability to link multiple senses for granted; for robots, it remains a challenge. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system that could give robots more human-like senses.
The new predictive AI system created at CSAIL enables robots to "feel" an object just by seeing it. That may sound confusing, but the goal is to equip robots with an ability people use every day: looking at an object, surface, or material and anticipating how it will feel once touched. Touch is a vital part of human experience because it tells us what an object feels like, e.g. soft, squishy, or rough.
The reverse also works: the system can take touch-based input and predict what the object would look like. It is a bit like a game of touching things blindfolded and guessing what they might be.
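To make the two directions of prediction concrete, here is a minimal toy sketch of cross-modal mapping. It learns a linear map from hypothetical "visual" feature vectors to "tactile" feature vectors, and an inverse map back, mirroring the vision-to-touch and touch-to-vision directions described above. All names and dimensions are invented for illustration; the actual CSAIL system uses deep generative models trained on paired camera and tactile-sensor data, not a linear fit.

```python
# Toy illustration of cross-modal prediction (hypothetical data, not the
# CSAIL model): fit linear maps between "visual" and "tactile" features.
import numpy as np

rng = np.random.default_rng(0)

# Fake paired data: 100 samples, 8-dim visual features, 4-dim tactile features.
W_true = rng.normal(size=(8, 4))
visual = rng.normal(size=(100, 8))
tactile = visual @ W_true  # pretend touch is a deterministic function of appearance

# Direction 1: vision -> touch (least-squares fit of the mapping).
W_v2t, *_ = np.linalg.lstsq(visual, tactile, rcond=None)

# Direction 2: touch -> vision (the "blindfolded guessing" direction).
# Note: touch has fewer dimensions here, so appearance is only partially
# recoverable -- an honest limitation of any touch-to-vision prediction.
W_t2v, *_ = np.linalg.lstsq(tactile, visual, rcond=None)

pred_touch = visual @ W_v2t
err = np.linalg.norm(pred_touch - tactile) / np.linalg.norm(tactile)
print(f"relative vision->touch prediction error: {err:.2e}")
```

Since the toy tactile data is an exact linear function of the visual data, the vision-to-touch fit recovers it almost perfectly; real sensory data is far noisier, which is why the actual research relies on learned deep models rather than linear regression.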
It may not be immediately obvious how this is useful in robotics. As the CSAIL researchers explain, however, the technique could make robots considerably more capable. They demonstrated the system with a robot arm that anticipates where an object is located without seeing it and then recognizes it using the sense of touch. That would let a robot confirm whether the item it has picked up is the right one, and operate more reliably in low-light environments, without the need for advanced sensors.