The DRXYA robot is calibrated to change its colour and position based on the actions of a user scanned with a Kinect sensor. The on-screen Processing visualisation relays this information (a series of commands and coordinates) via serial communication to the Arduino, which in turn activates both the lighting sequences of the addressable NeoPixel LED strips and the movement of the stepper motors in the X and Y directions. When a user comes within 1.5 m of the facade, the colour of the central ring changes from a rainbow display to blue. When the user's hand is raised, the colour of the inner ring changes to an animated purple sequence. During this phase, the user can control the position of the ring in X and Y through the translation of the Processing sketch's Cartesian coordinates to those of the built facade. This vertical CNC device has many implications: responsive facades, interior partition lighting systems, 3D mapping, and visual communications.
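The Processing-to-Arduino handoff described above can be sketched in plain C++. Everything here is an assumption for illustration, not the project's actual code: the command format (`mode,x,y`, e.g. `P,320,240`), the mode letters (`R` rainbow, `B` blue ring, `P` purple ring with XY control), the Processing canvas size (640×480), and the stepper travel (2000 steps per axis) are all invented to show how screen coordinates might be translated to facade coordinates.

```cpp
#include <sstream>
#include <string>

// Hypothetical serial command: "<mode>[,<x>,<y>]", e.g. "P,320,240".
// Assumed modes: 'R' = rainbow (no user), 'B' = user within 1.5 m,
// 'P' = hand raised (purple ring, XY control active).
struct Command {
    char mode;
    long stepsX;   // target stepper position, X axis
    long stepsY;   // target stepper position, Y axis
    bool ok;       // true if the line parsed cleanly
};

// Translate a Processing pixel coordinate (0..pxMax, assumed) onto the
// facade's stepper range (0..maxSteps, assumed), clamping out-of-range input.
long toSteps(long px, long pxMax, long maxSteps) {
    if (px < 0) px = 0;
    if (px > pxMax) px = pxMax;
    return px * maxSteps / pxMax;
}

// Parse one serial line into a Command.
Command parseCommand(const std::string& line) {
    Command c{'?', 0, 0, false};
    std::istringstream in(line);
    std::string mode, xs, ys;
    if (!std::getline(in, mode, ',') || mode.size() != 1) return c;
    c.mode = mode[0];
    if (c.mode == 'P') {  // only the XY-control phase carries coordinates
        if (!std::getline(in, xs, ',') || !std::getline(in, ys)) return c;
        c.stepsX = toSteps(std::stol(xs), 639, 2000);   // 640 px wide canvas
        c.stepsY = toSteps(std::stol(ys), 479, 2000);   // 480 px tall canvas
    }
    c.ok = true;
    return c;
}
```

On the real hardware the parsed mode would select the NeoPixel animation and the step targets would be fed to the stepper drivers; this sketch only shows the coordinate translation step.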

Video! → https://vimeo.com/90327378

and more to follow soon…

This entry was posted in Christoffer Ryan Chua, Luca Gamberini, Ramin Shambayati, Robert Douglas McKaye, Sahil Sharma, Wen Shan Foo.