If you have a physical condition that makes it impossible to use your computer’s mouse, how do you check your email? Or what if you’re a physician doing triage and need to share a computer with other doctors without swapping germs? Those are the challenges Professor Dean Mohamedally posed to his colleagues and students at University College London’s (UCL) Department of Computer Science in the early days of the COVID-19 pandemic in 2020.
With help from Intel and other partners, his students developed what would become UCL MotionInput, now in its third generation. The ground-breaking suite of applications lets people use their PC without physically touching a keyboard, mouse, or joypad. Instead, they can use voice commands, facial expressions, or physical gestures captured by their webcam. With uses spanning healthcare, education, industry, and gaming, UCL MotionInput shows us what the future of user experience (UX) could look like.
It started in a classroom
Though its potential uses are broad, UCL MotionInput was first created in response to the global COVID-19 pandemic. It was designed to give all students access to remote learning and to help the United Kingdom’s National Health Service (NHS) with rapid patient triage, with requirements design from Clinical HCI Researcher Sheena Visram and NHS GP Dr. Atia Rafiq.
Two computer science students whose talents helped make MotionInput 3.0 a reality were Sinead Tattan and Carmen Meinson. They got involved when Professor Dean shared his ideas with his classes across several taught degree courses, at both undergraduate and master’s levels. The progress students had already made combining major AI technologies like the Intel® OpenVINO™ toolkit within UCL MotionInput gave Professor Dean confidence that they could find a way to help people use their computers without their hands.
Sinead recalls how Professor Dean tasked students with creating something that would really make a difference in people’s lives: an application that could give those with fine motor skill conditions opportunities that typically abled people take for granted. To build it, Sinead led student teams that eventually combined several machine learning, computer vision, natural language processing, and supporting software technologies, thanks to Carmen’s Intel-optimized software architecture designs.
It was no easy task. Separate teams took on specific functionalities—like the world’s first hybrid facial navigation with simultaneous federated on-device voice commands—while Sinead played a key role as team architect, ensuring they could be combined and accessed as individual software features or a single easy-to-use suite of programs.
Intel U.K. mentors Costas Stylianou and Phillippa Chick were eager to see the students progress in new ways. Features like pinching in the air for “touchless multitouch” and “nose navigation,” which lets users browse online by pointing their nose at regions of the screen, were built and optimized for performance as radically different ways of using everyday PCs and laptops.
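To make the idea concrete, here is a minimal sketch of how an “air pinch” gesture can be detected from a webcam feed using off-the-shelf hand-tracking libraries (MediaPipe and OpenCV). This is an illustrative assumption, not UCL MotionInput’s actual implementation, which combines many more models and optimizations; the 0.05 pinch threshold and the single-hand setup are placeholders chosen for the demo.

    import cv2
    import mediapipe as mp

    # Hypothetical stand-alone demo: detect a thumb-to-index "air pinch" from the webcam.
    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB frames; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            thumb, index = lm[4], lm[8]  # thumb tip and index-finger tip landmarks
            # Distance in normalized image coordinates; 0.05 is an assumed threshold.
            if ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5 < 0.05:
                print("Pinch detected - a real system would map this to a click or touch event.")
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
            break

    cap.release()
    hands.close()

Detecting a single gesture like this is the easy part; mapping many such gestures reliably to system-level mouse, touch, and keyboard events for different users and abilities is where the engineering effort lies.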
Societal benefits
UCL MotionInput could deliver broad societal benefits because it’s economical, easy to use, and makes touchless computing widely available: with just an Intel-based machine and a webcam, it could improve health and independence at population scale. In care homes, seniors who may struggle to grasp objects with their hands could play games like chess online with friends or family, shop online, find assistance, communicate, and exercise. And many people who can’t currently use a computer at all would suddenly have an interface that lets them. From education to safer computing in hospitals, exercise, and entertainment, touchless computing offers a wealth of applications.
The future
UCL MotionInput 3 is a fully customizable solution with combined machine learning and computer vision models that adapt to users’ needs and abilities. It opens touchless computing to public, research, industrial, and commercial applications.
Meanwhile, the UCL MotionInput team continues to find ways to make their software available to more people. Leading technology companies have taken notice, and Professor Dean, his academic and clinical colleagues, and especially his students across their courses welcome any collaboration that can help more people through computing.
They invite everyone to join their community, make requests, test their free software and future commercial products, and of course provide feedback. Humanity stands to benefit.
More wonderful in action
Touchless Computing at UCL
Get the latest updates and software releases for UCL’s MotionInput technology to use touchless computing interactions in your next project.
I Will Always Be Me - The book that banks your voice
A special book that helps people living with motor neuron disease (MND) keep their voices – by reading a story that helps explain what they’re going through.