Who cares about the Cloud, when the ability to grab our models is out there? Virtually, that is, in a very awkward, ‘why is this sneeze guard in front of me’, ‘dangit, I spilled the ranch dressing on the sunflower seeds’ type of way. Yet, as we fall down the rocky gorge of 3D software technology, with each sharp outcropping shattering another rib as we smile gleefully at the pain and possibility of unlimited data-manipulating potential, the idea of direct interaction with our models becomes more real. All compounded and emotionally secured by the endless iterations of 3D geometry that haunt our dreams throughout the night. Fabulous times. But, which way will it go? Will we touch the 3D data, or will it simply see where we are and adapt accordingly? Or both?
3D Holographic Touch or Kinect for CAD?
Either way, Microsoft is on it, researching what it will be like to move objects around with your hands or with the motion of your hands and face. Steve Clayton of Microsoft:
with Holodesk you can manipulate 3-D, virtual images with your hands. Whilst this is only a research project at this stage, I can envisage future applications in areas such as board gaming, rapid prototype design or perhaps even telepresence, where users would share a single 3D scene viewed from different perspectives…. what sets it apart from the rest is the use of beam-splitters and a graphic processing algorithm, which work together to provide a more life-like experience.
Of course, Steve totally missed that scenes can already be viewed in 3D and that collaboration from different perspectives is already happening, granted, not in a holographic, branded-by-Microsoft form. You’re seeing it with gaming, the applications shown above are certainly possible, and transitioning them to product design and fab is entirely feasible… a more personal, deeper connection with our CAD data. Soak in that comforting thought for a moment. It will come and is, most assuredly, being developed. Which side are you on? Touching your CAD data or moving it around with motion?