Our senses work together synergistically, meaning they collaborate to provide a more comprehensive understanding, especially when individual signals are subtle.
In this process, the combined effect of these biological inputs can be greater than the sum of each sense's individual contribution.
However, robots have traditionally tended to follow a more direct approach, processing information in isolation.
In view of this, researchers at Pennsylvania State University (Penn State) are applying the biological concept of sensory synergy to Artificial Intelligence (AI).
A product of the biological concept of sensory synergy
The result of this research is the first integrated multisensory artificial neuron, which allows machines to combine and process information from different sensors.
With this, there would be an imitation of the human ability to incorporate multiple senses for a more complete understanding of the surrounding environment.
The work, published this month in Nature Communications, marks a significant advance in AI research.
Saptarshi Das, the leader of this initiative, highlights that robots generally base their decisions on the environment in which they are located.
However, their sensors usually operate in isolation, without communicating with each other, which means sensory information is not shared efficiently.
Furthermore, Das raises an important question: is collective decision-making, through a sensor processing unit, the most efficient method?
He makes a comparison with the human brain, in which one sense can influence and complement another, enabling the person to better evaluate a situation.
This process of sensory interconnection in the human brain can result in more informed and adaptive decisions. Therefore, research seeks to apply these biological principles to AI.
The objective is to improve the ability of machines to make more sophisticated and contextual decisions based on integrated sensory information, that is, inspired by human senses.
Currently, in AI, sensors operate independently, sending information to a central unit for decisions, which consumes more energy.
On the other hand, this research proposes that sensors can communicate directly with each other, making the process more efficient, especially when the information is subtle.
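The contrast between the two architectures can be sketched in code. The following is a conceptual illustration, not the researchers' implementation: all signal values, thresholds, and gains are assumptions chosen to show how a weak cue in one channel can push a coupled system over a decision threshold that a centralized sum would miss.

```python
# Conceptual sketch (not the published method): centralized fusion,
# where a hub sums independent sensor readings, versus a coupled
# scheme where one sensor's output modulates the other directly.
# All numeric values below are illustrative assumptions.

def centralized_decision(visual, tactile, threshold=1.0):
    """Hub sums independent readings and compares to a threshold."""
    return (visual + tactile) >= threshold

def coupled_decision(visual, tactile, threshold=1.0, gain=3.0):
    """A weak visual cue amplifies the tactile response,
    mimicking cross-modal reinforcement between sensors."""
    boosted_tactile = tactile * (1 + gain * visual)
    return (visual + boosted_tactile) >= threshold

# Two subtle cues, each individually below the decision threshold:
visual, tactile = 0.3, 0.4

print(centralized_decision(visual, tactile))  # False: weak cues missed
print(coupled_decision(visual, tactile))      # True: cross-modal boost
```

The point of the sketch is that the coupled scheme reacts precisely when both signals are subtle, which is where the source says centralized processing struggles.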
This promises to improve the ability of AI machines to make decisions based on integrated sensory information. To achieve this, the research focused on integrating a tactile and visual sensor.
This made it possible for the output of one sensor to affect the other with the help of visual memory, leading to improvements in navigation as visual memory influenced and aided tactile responses.
The team created a multisensory artificial neuron by connecting a tactile sensor to a phototransistor based on molybdenum disulfide, allowing the effective integration of visual and tactile signals.
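The role of visual memory described above can be modeled with a toy simulation. This is a hypothetical sketch of the behavior, not a model of the actual device: the memory trace, decay rate, and gain are all invented parameters used only to show how a lingering visual signal could boost a later tactile response.

```python
# Illustrative sketch (assumed parameters, not the published device):
# a multisensory neuron whose visual channel leaves a decaying
# "memory" trace that amplifies later tactile responses.

class MultisensoryNeuron:
    def __init__(self, decay=0.5, gain=1.5):
        self.visual_memory = 0.0  # trace left by past light pulses
        self.decay = decay        # fraction of memory kept per step
        self.gain = gain          # how strongly memory boosts touch

    def step(self, light, touch):
        """Process one time step of visual and tactile input."""
        # A light pulse refreshes the memory trace; otherwise it decays.
        self.visual_memory = max(light, self.visual_memory * self.decay)
        # The tactile response is amplified by the lingering trace.
        return touch * (1 + self.gain * self.visual_memory)

primed = MultisensoryNeuron()
primed.step(light=1.0, touch=0.0)          # light pulse, no touch yet
r1 = primed.step(light=0.0, touch=0.5)     # touch shortly after light

unprimed = MultisensoryNeuron()
r2 = unprimed.step(light=0.0, touch=0.5)   # same touch, no prior light

print(r1 > r2)  # True: the visually primed response is larger
```

The same touch produces a stronger output when it follows a light pulse, which mirrors the reported effect of visual memory aiding tactile responses.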
This approach has the potential to improve the ability of machines to process information from different sensors in a more efficient and adaptable way.