While we have seen Kinect-based virtual dressing rooms before, the team at Arbuzz is taking a somewhat different approach to the digital dress-up game. Rather than using flat pictures of clothing superimposed on the subject’s body, their solution uses full 3D models of the garments to accomplish the desired effect. This technique enables them to produce a more true-to-life experience, where the clothing follows the subject around, flowing naturally with the user’s movements.

Like many other Kinect hacks, they use OpenNI and NITE to acquire skeletal data from the sensor. The application itself was written in C# with Microsoft’s XNA game development tools, and uses a custom physics engine to render the simulated cloth in a realistic fashion.
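Arbuzz’s actual C#/XNA code isn’t public, but the general technique behind this kind of simulated cloth is well known: treat the garment as a grid of particles advanced by Verlet integration, enforce rest distances between neighbors, and pin a few particles to tracked skeleton joints so the cloth follows the body. Here is a minimal, language-agnostic sketch of that idea in Python; all names and values are illustrative, not taken from their engine.

```python
import math

# Minimal position-based cloth sketch: a WxH particle grid hangs from a
# pinned top row (which, in a Kinect app, would follow skeleton joints).
W, H = 4, 4            # cloth grid resolution (illustrative)
REST = 0.1             # rest length between neighboring particles
GRAVITY = -9.81        # downward acceleration on the y axis
DT = 1.0 / 30.0        # one simulation step per 30 fps frame

# Each particle stores its current and previous position (x, y);
# Verlet integration derives velocity from the difference.
pos  = [[(x * REST, -y * REST) for x in range(W)] for y in range(H)]
prev = [[p for p in row] for row in pos]

def verlet_step():
    """Advance every unpinned particle one time step."""
    for y in range(H):
        for x in range(W):
            if y == 0:
                continue  # top row is pinned (e.g. to shoulder joints)
            px, py = pos[y][x]
            ox, oy = prev[y][x]
            prev[y][x] = (px, py)
            # new = 2*current - previous + acceleration * dt^2
            pos[y][x] = (2 * px - ox, 2 * py - oy + GRAVITY * DT * DT)

def satisfy_constraints(iterations=5):
    """Relax each neighbor pair back toward its rest distance."""
    for _ in range(iterations):
        for y in range(H):
            for x in range(W):
                for nx, ny in ((x + 1, y), (x, y + 1)):
                    if nx >= W or ny >= H:
                        continue
                    ax, ay = pos[y][x]
                    bx, by = pos[ny][nx]
                    dx, dy = bx - ax, by - ay
                    d = math.hypot(dx, dy) or 1e-9
                    corr = (d - REST) / d * 0.5
                    if y != 0:   # never move pinned particles
                        pos[y][x] = (ax + dx * corr, ay + dy * corr)
                    if ny != 0:
                        pos[ny][nx] = (bx - dx * corr, by - dy * corr)

for _ in range(10):  # simulate a few frames: the cloth sags under gravity
    verlet_step()
    satisfy_constraints()
```

A real garment simulation adds shear and bend constraints, collision against the tracked body, and moves the pinned particles to the joint positions reported by the skeletal tracker each frame, but the integrate-then-relax loop above is the core of it.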

[Lukasz] states that the system is still in its infancy, and will need plenty of work before they are totally happy with the results. From where we’re sitting, the demo video embedded below is quite neat, even if it is a bit rough around the edges. We were especially pleased to see the Xbox’s Kinect sensor put to work in a DIY project, and we are quite interested to see how things look once they put the final touches on it.

[vimeo http://vimeo.com/25933286 w=470]