Abstract
The study proposes to optimize eye tracking by applying nanotechnology, reducing both the response time and the number of interactions required during displacements, through the implementation of a collaborative omnidirectional system applied to the mobility of wheelchair users. The application uses an artificial vision system with two compact cameras built on 130 nm CMOS sensors and coated with low-refractive-index nanocrystals composed of nanoparticles, which eliminate ghosting and glare effects. The trajectory is then defined by calculating the speed of each Mecanum wheel through algorithms developed in two programming languages: Python, for eye tracking and image processing, and C++, which processes the trajectory and controls the motors by means of a specific algorithm derived from the system's mathematical model, calculating speeds and triggering emergency stops in case of possible collisions with the help of eight 960 nm laser sensors distributed along each direction of movement. The comparison drawn in the study between ultrasonic sensors and nanolasers yields findings that could be established as standards. The fusion of nanotechnology with artificial intelligence programming provides users with greater autonomy and freedom of movement, with the ability to move in all directions while reducing the maneuvering space by at least 30% compared to a conventional wheelchair.
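As a minimal sketch of the kind of computation the abstract describes, the C++ fragment below combines the standard inverse-kinematics equations of a four-Mecanum-wheel platform (rollers at 45°) with a threshold-based emergency stop over eight range readings. The wheel geometry, the 0.40 m stop distance, and all function names are illustrative assumptions, not values or code from the paper.

```cpp
#include <array>
#include <cstdio>

// Inverse kinematics of a four-Mecanum-wheel platform: wheel angular
// speeds [rad/s] from the chassis command (vx, vy, wz). Geometry values
// are placeholders, not taken from the study.
struct MecanumBase {
    double r  = 0.076;  // wheel radius [m] (assumed)
    double lx = 0.25;   // half wheelbase [m] (assumed)
    double ly = 0.30;   // half track width [m] (assumed)

    // Standard Mecanum inverse-kinematics equations:
    //   w_fl = (vx - vy - (lx+ly)*wz) / r
    //   w_fr = (vx + vy + (lx+ly)*wz) / r
    //   w_rl = (vx + vy - (lx+ly)*wz) / r
    //   w_rr = (vx - vy + (lx+ly)*wz) / r
    std::array<double, 4> wheelSpeeds(double vx, double vy, double wz) const {
        const double k = lx + ly;
        return { (vx - vy - k * wz) / r,    // front-left
                 (vx + vy + k * wz) / r,    // front-right
                 (vx + vy - k * wz) / r,    // rear-left
                 (vx - vy + k * wz) / r };  // rear-right
    }
};

// Emergency stop: if any of the eight range sensors reports an obstacle
// closer than the stop distance, command zero speed on every wheel.
// The threshold is a placeholder, not a value from the study.
std::array<double, 4> safeCommand(const MecanumBase& base,
                                  double vx, double vy, double wz,
                                  const std::array<double, 8>& ranges_m) {
    constexpr double kStopDistance = 0.40;  // [m] (assumed)
    for (double d : ranges_m)
        if (d < kStopDistance) return {0.0, 0.0, 0.0, 0.0};
    return base.wheelSpeeds(vx, vy, wz);
}

int main() {
    MecanumBase base;
    // Pure lateral translation at 0.3 m/s with all eight sensors clear.
    std::array<double, 8> clear{1.2, 1.5, 2.0, 1.8, 1.1, 1.6, 2.2, 1.9};
    auto w = safeCommand(base, 0.0, 0.3, 0.0, clear);
    std::printf("wheel speeds [rad/s]: %.2f %.2f %.2f %.2f\n",
                w[0], w[1], w[2], w[3]);
}
```

A fixed-distance stop is the simplest safety behavior consistent with the abstract; a deployed controller would more likely scale the stop distance with the commanded speed along each sensor's direction.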
doi: 10.17756/nwj.2024-s1-040
Citation: Vásquez G, Morales PA, Terán HC, Arteaga O. 2024. Collaborative Omnidirectional Robot with Remote Eye Tracking System to Optimize Mobility. NanoWorld J 10(S1): S223-S228.