There’s no end to the electrical implements of the future you can strap to your shoulder blade – laser cannons, jet packs, robot cats with laser eyes – but what if you could project a multi-touch panel onto any surface? (Wait, didn’t we see this before and before and before?) Probably no one has thought of this, but Hrvoje Benko, in conjunction with Carnegie Mellon and Microsoft Research, is putting a little twist on the idea.
The wearable multi-touch unit you’ll see in the video below straps to your shoulder and uses a depth-sensing camera and a small projector to simulate touch interactions. By sensing your surroundings – or rather, the projection surface – it can auto-adjust the size of the touch-sensitive area.
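That auto-adjustment falls out of simple projector geometry: the projected image grows linearly with throw distance, so keeping a panel at a constant physical size means shrinking its rendered pixel width as the surface moves away. Here’s a minimal sketch of that idea – the resolution and throw numbers are illustrative assumptions, not specs from the actual device:

```python
# Hypothetical sketch: keep a projected panel at a constant physical width.
# Assumed numbers (not from the actual hardware):
PROJECTOR_WIDTH_PX = 848        # projector's horizontal resolution
IMAGE_WIDTH_MM_AT_1M = 600      # full image width at a 1 m throw distance

def panel_width_px(panel_mm, surface_distance_mm):
    """Pixels needed so the projected panel is panel_mm wide on the surface."""
    # Image width scales linearly with distance from the projector.
    image_width_mm = IMAGE_WIDTH_MM_AT_1M * (surface_distance_mm / 1000)
    return round(PROJECTOR_WIDTH_PX * panel_mm / image_width_mm)

# A 10 cm dialpad needs more of the frame on a near surface, less on a far one.
print(panel_width_px(100, 500))    # → 283
print(panel_width_px(100, 1000))   # → 141
```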
Beyond a shoulder-worn system, there is no instrumentation of the user or the environment. Foremost, on such surfaces – without calibration – Wearable Multitouch Interaction provides capabilities similar to those of a mouse or a touchscreen: X and Y locations in 2-D interfaces and whether fingers are “clicked” or hovering, enabling a wide variety of interactions.
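The “clicked or hovering” distinction presumably comes from comparing a fingertip’s depth against the depth of the surface behind it. A toy sketch of that test, assuming a depth frame as a 2-D array of distances in millimeters (the threshold and names are my illustration, not the researchers’ implementation):

```python
# Hypothetical click-vs-hover test from depth data. The surface depth map is
# captured once with no hand in view; a finger is "clicked" when it sits
# within a small gap of the surface. Threshold is an illustrative guess.

CLICK_THRESHOLD_MM = 10  # finger counts as "clicked" within ~1 cm of the surface

def classify_finger(finger_x, finger_y, depth_frame, surface_depth):
    """Return (x, y, state) where state is 'clicked' or 'hovering'."""
    finger_depth = depth_frame[finger_y][finger_x]
    gap = surface_depth[finger_y][finger_x] - finger_depth
    state = "clicked" if gap <= CLICK_THRESHOLD_MM else "hovering"
    return finger_x, finger_y, state

# Toy 2x2 frames: the surface sits at 500 mm everywhere.
surface = [[500, 500], [500, 500]]
frame = [[495, 500], [460, 500]]  # finger at (0,0) is 5 mm off the surface

print(classify_finger(0, 0, frame, surface))   # → (0, 0, 'clicked')
print(classify_finger(0, 1, frame, surface))   # → (0, 1, 'hovering')
```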
Forget the fact that you can’t shake someone vigorously while they’re using this – it only works on 2D surfaces: walls, tables, other walls. So, instead of looking awkward fondling a biscuit you’ve projected a dialpad onto, you’ll have to settle for looking awkward finger-jabbing a tabletop… or someone’s back fat… granted their back fat is flat enough and they’re OK with you projecting things on them.
Can you see this being used for product design and engineering? I see more use for this and similar devices in the manufacturing environment. Sure, touch interaction for models or drawings is a nice thought, but strap this onto the shop guy who receives a pallet of parts or has to go to a bin to grab some hardware. With a little depth-sensing IR action, it could recognize objects, display diagrams and highlight attachment locations step-by-step, all while he discusses the issue with an engineer projected onto the problem spot.