REZA ABIRI, a University of Rhode Island engineering professor, recently received a National Science Foundation CAREER Award for developing novel shared-autonomy, multimodal artificial intelligence that achieves dexterous manipulation comparable to invasive systems while using fully noninvasive, human-centered technologies. URI says the $546,848 award will help Abiri develop frameworks that allow individuals with severe motor impairments to be more independent.
What does receiving this award mean to you? I am deeply honored. [It] represents a meaningful milestone in my long-term research and academic journey. As one of the most prestigious awards for early-career researchers, it serves as a strong endorsement of my vision to advance accessible, intelligent assistive technologies through innovative AI algorithms. My research team will expand our current work toward high-performance shared human–AI control systems. Our specific goal is to develop the next generation of assistive AI robotics capable of interpreting and amplifying human intent to support natural, independent interaction in real-world scenarios, such as restoring physical function for people with severe impairments.
What was the impetus behind wanting to develop technology to help people who are severely impaired? My motivation … stems from working closely with individuals who underwent invasive brain surgeries to control external devices. Such invasive procedures are risky, cognitively demanding and often unsustainable over time. This experience inspired me to pursue noninvasive alternatives that could potentially offer comparable performance without the drawbacks. This CAREER award supports the development of AI-driven systems that translate biosignals, such as simple head movements, eye movements or muscle activity, into complex commands for robotic control.
How will the $546,848 award be used over the next five years for this research? A unique and valuable aspect of the NSF CAREER award is its integration of research and education across all levels, from K–12 to higher education. This funding will support the development of real-time, adaptive human–machine interfaces and AI algorithms in my lab, with a focus on combining noninvasive biosignals and environmental context to enable intuitive control of assistive robotic manipulators. Over five years, we will build machine learning models, conduct user studies and collaborate with medical schools and industry to advance real-world applications. The award also supports graduate and undergraduate research involvement, along with K–12 outreach through internships and hackathons.
Could this work potentially be a breakthrough in helping people with physical limitations live as close to a normal life as possible? How impactful could that be? Our research aims to address fundamental challenges in noninvasive assistive technologies to make them more accessible and practical, and it seeks to transform how people with disabilities interact with their environment through intuitive, AI-driven systems. In addition to lab-based research, we are collaborating with medical schools, rehabilitation centers and industry partners in Rhode Island and Massachusetts. Ultimately, our goal is to reduce dependence on caregivers, enhance quality of life and promote inclusive, independent living.