My mouth… is full of soup. It’s also gaping wide open. So, soup is falling all over my desk as I watch the future controlled by our appendages. Appendages that haven’t been detached by our robot friends, but are being used to tell them where we want them to display information… or how to interact with our environment.

It’s another project out of the MIT Media Lab called LuminAR. The idea is to take the computer display off the screen and move it wherever you need it. Better still, the display is projected by a robotic arm.

Don’t worry, it’s taught to do your bidding, programmed to recognize your hand gestures and display interfaces suited to the surrounding environment. So, no chance at all it would recognize something (YOU) as a threat and shoot beams of high-intensity light into your retinas.

Here’s the video…

The Smack on Product Development

Hmmm, the 3D product development applications. This isn’t so far off from the idea of adaptive environments I’ve talked about before when looking at similar projection technology. When you’re working in a 3-dimensional environment, you move and select commands based on certain conditions. These conditions can be recognized programmatically (by the robot that recognizes gestures). The interface (or robot), in the same environment around you, can then adapt to the conditions and build geometry, coordinate constraints, run simulations, begin manufacturing operations, all while you’re iterating the design.
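To make that adaptive-interface idea concrete, here’s a minimal sketch of condition-based command dispatch: a recognizer reports a gesture plus the current modeling context, and the interface picks an action to match. Every name here (the gestures, contexts, and actions) is invented for illustration — nothing below reflects how LuminAR actually works.

```python
def choose_action(gesture, context):
    """Map a recognized gesture and the surrounding modeling
    context to an adaptive-interface action (hypothetical)."""
    rules = {
        ("pinch", "sketch"): "build_geometry",
        ("drag", "assembly"): "coordinate_constraints",
        ("tap", "analysis"): "run_simulation",
        ("swipe", "manufacturing"): "begin_machining_op",
    }
    # Fall back to plain selection when no rule matches the context.
    return rules.get((gesture, context), "select")

# Same gesture, different context, different behavior:
print(choose_action("tap", "analysis"))  # run_simulation
print(choose_action("tap", "kitchen"))   # select
```

The point of the lookup-by-(gesture, context) pair is that the environment itself becomes part of the command, which is the adaptive bit.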

How are you going to use it?

Via Epic Win FTW. THANK YOU Charles Culp!


Josh is founder and editor here, founder at Aimsift Inc., and co-founder of EvD Media. He is involved in engineering, design, visualization, the technology making it happen, and the content developed around it. He is a SolidWorks Certified Professional and excels at falling awkwardly.