If you’ve ever covered your eyes with a blindfold, you know how hard living would be without one of your five senses. Being blind is tough, and even restoring your sight through the newest eye surgeries would cost a great sum of money.
But what if there were an app – a handy, affordable app – that could turn your iPad into a seeing apparatus?
Shane Wighton of Stuff Made Here has created an app – still a prototype – for the new iPad Pro that turns the device into a seeing tool. Using the iPad Pro’s lidar and a custom mechanism on the iPad case, the app can sense the relative location of objects around you, effectively telling you where to go.
The app’s success is largely owed to the iPad Pro’s lidar. Wighton initially wanted the app to run on a smartphone, but since the newest iPhone doesn’t have lidar and the iPad Pro does, he went with the latter instead.
For those who don’t know, lidar uses light to determine the distance of objects. By shooting beams of light at its surroundings and measuring how long the reflections take to return to the receiver, it can determine the general shape of objects and how far away they are from the source.
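The time-of-flight idea behind lidar boils down to one line of arithmetic. This is a generic illustration, not code from Wighton’s app – the pulse timing is a made-up example:

```python
# Sketch of the time-of-flight math lidar relies on.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Light travels out to the object and back, so halve the round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after roughly 20 nanoseconds hit something ~3 m away.
print(round(distance_from_round_trip(20e-9), 2))  # → 3.0
```

Because light is so fast, the sensor is really measuring nanosecond-scale delays, which is why lidar needs dedicated hardware rather than an ordinary camera.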
Wighton’s app serves two main purposes. First, it converts the lidar information into the most basic form possible. Say there is a chair in front of you: instead of rendering a full chair, the app takes the outline of the object and reads it as a solid block. This simplification is applied to everything the lidar sees, and the result can be displayed from a first-person view, zoomed in, or top-down like a video-game minimap.
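One way to picture that simplification: collapse a grid of depth readings into a single nearest-obstacle value per direction, which is essentially what a top-down minimap shows. This is a minimal sketch under assumed inputs, not the app’s actual processing:

```python
# Hypothetical example: reduce a depth map (rows of distance readings
# in meters) to a coarse top-down view — nearest obstacle per column.

def to_minimap(depth_rows):
    """Keep only the closest reading in each column, so an object
    becomes one solid block instead of a detailed shape."""
    cols = len(depth_rows[0])
    return [min(row[c] for row in depth_rows) for c in range(cols)]

depth = [
    [4.0, 1.2, 5.0],  # readings at head height
    [4.0, 1.1, 5.0],  # chest height — e.g. a chair back
    [3.9, 1.0, 2.5],  # knee height — chair seat and a low box
]
print(to_minimap(depth))  # → [3.9, 1.0, 2.5]
```

Throwing away detail like this is the point: a navigation aid only needs “something is about a meter ahead of you”, not the shape of the chair.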
The second main function of the app feeds directly into the mechanism on the iPad case. Using the simplified lidar information, objects in your vicinity are represented by a set of pins pushing into your hand.
Objects closer to you push the pins harder, while objects farther away push them more softly. With this information, you should be able to build a mental map of where you can and cannot go.
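That closer-means-harder rule is just an inverse mapping from distance to push strength. The linear falloff and the five-meter range below are assumptions for illustration, not values from the video:

```python
# Hypothetical mapping from object distance to pin push strength (0..1).

def pin_strength(distance_m: float, max_range_m: float = 5.0) -> float:
    """Closer objects push harder; anything past the range pushes not at all."""
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - distance_m / max_range_m

print(round(pin_strength(0.5), 2))  # very close → 0.9
print(round(pin_strength(4.0), 2))  # near the edge of range → 0.2
```

A real device might use a nonlinear curve so nearby obstacles feel much more urgent than distant ones, but the principle is the same.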
Making the pin mechanism seems to have been the hardest part of all this. Moving a single motor controls the left-and-right movement, while driving both motors together rotates the inner cylinder.
While it took Wighton a long time to work out the design and the quadratic equations behind it, you can see a sped-up version in the video in about a minute.
Since this is a working prototype, the electronics powering the mechanism aren’t exactly compact: a Bluetooth communication module, a microcontroller, and two gigantic stepper motor drivers. To top it off, the motors on either side of the mechanism are huge. Wighton believes all these parts could be miniaturized and compacted should the device ever go into production.
With the program and the parts all in place, does the app actually work?
You bet it does!
Both Wighton and his wife took several shots at navigating their workshop. At first, the app isn’t as intuitive as you would hope: your mind takes time to adjust to the new interface, since you have lost all sight and must rely on your hands’ sense of touch to determine where to go.
After using it, Wighton described the experience as akin to learning to type on a keyboard. With enough practice and adjustment, he trusts this could be a viable way for those with poor or no eyesight to “see” their surroundings.