UCL MotionInput 3 (MI3) Accessibility Apps
The following Windows apps enable you to play your own games, use your software, and browse the web using your face, body and neck movements, arms and fingers, and speech.
This software uses a Windows PC, webcam and microphone to simulate keyboard and mouse input.
1. Facial Navigation App - A hands-free mouse that lets you control your mouse cursor and actions with your head movements and facial gestures, tracked by your webcam.
2. In-Air Multitouch App - A touchless computing mouse in which a webcam tracks your fingers and arms moving in front of the PC, moving the mouse and triggering mouse presses when you pinch in the air. For example, you can pinch windows in the air to move them, or pinch with two hands to stretch windows and images where supported. Tracking in the air simulates the Windows Touchpoints mouse pointer, with the option to track a regular mouse pointer instead.
These apps are part of a suite of several Touchless Computing apps (UCL MotionInput 3 - "MI3") developed by academics and students at University College London. You can find out more about all of these apps at our parent site, www.touchlesscomputing.org.
For users who can use their body and face but not their arms:
Download Facial Navigation v3.2 for Windows 10/11 from the Microsoft Store (UK/USA/Canada)
For users who can use their arms and fingers:
Download In-Air Multitouch v3.2 for Windows 10/11 from the Microsoft Store (UK/USA/Canada)
These apps install on your own computer and are free for personal use on Windows 10 and 11 laptops and PCs. They require a webcam, 4GB of RAM and an x86 (64-bit) processor from 2016 onwards. You need 1GB free on your hard drive for each app that you install.
If you download from the Microsoft Store links above, the apps will appear in your Start Menu after 5-6 minutes of downloading and uncompressing.
If the store links above do not work for you, or you are outside the UK, USA and Canada, then please download directly from UCL on our XIP licensing platform (it is still free for personal use).
Once installed, an icon labelled UCL MotionInput v3 will appear in your Start Menu.
Once you launch the app, in the speech-enabled modes the key phrase "BUTTERFLY" centres the mouse cursor on your nose and stops it from moving.
Saying "Click", "Double Click" or "Right Click" triggers the corresponding mouse press.
Pressing the Escape key in the tracking window closes it when you have finished. A full help document with further features can be found on the instructions tab above.
The project was developed by over 150 UCL academics and staff during the COVID-19 pandemic from summer 2020 onwards. The software is powered by machine learning, natural language processing and computer vision models, all processed on your own computer. The software is not cloud connected, and no user data is transmitted or stored. The software was built in collaboration with Intel Corporation, IBM and Microsoft, with advice and support from the UK's NHS (National Health Service) and several NHS trusts, including Great Ormond Street Hospital for Children in London, UK.
See here for our published news stories (2022) on Intel.com, Forbes, The Register, and our own UCL article, as well as coverage from the British Computer Society and Great Ormond Street Hospital for Children. Our 2021 research paper is published here, and in the Future Healthcare Journal here.
If you would like to give us feedback or testimonials, or request new features, please do send them to us!
Our parent site covers touchless computing for healthcare, gaming, accessibility, fitness and more over at touchlesscomputing.org.