For the MakeZurich hackathon in early February we explored how to combine the Carunda24 smart strap technology and dizmo. Dizmo is the Interface of Things, a next-generation UI platform designed for the Internet of Things (IoT), while the Carunda24 smart strap provides gesture recognition based on a wearable sensor. The hackathon was a great opportunity to test out new ways of approaching data visualization using the technologies of two Swiss startups.

Gesture Recognition and Things

Dizmo fully embraces the concept of natural interfaces, where “things” can be understood and learned instinctively. It is a flexible platform, typically operated with a mouse or on a touchscreen. With dizmo, a user can drag elements around, combine IoT device control with data streams, and interact intuitively with complex systems.

At Carunda24, we’ve been developing a gesture recognition system based on a simple wristband with an integrated Soft Condensed Matter Sensor, with the vision of using natural hand gestures to interact with robots, drones, and interactive environments (VR/AR). The MakeZurich hackathon provided a good platform to bring the Carunda24 and dizmo technologies together. The Carunda24 smart strap had previously been used for NUI drone control, so it was interesting to now investigate interaction with data visualizations.

With hand gestures enabled by the smart strap, the natural interface becomes even more untethered, especially on a horizontal screen or as a projected interface on a table. Combining the two technologies enables Natural User Interface (NUI) interaction locally, or remotely by leveraging IoT device control.

MakeZurich Hackathon

There were six challenges proposed at the MakeZurich hackathon, and our team decided to participate in the Grün Stadt Zürich challenge. Every tree in the city of Zurich that is managed or taken care of by Grün Stadt Zürich is included in the Baumkataster, an open data set in GeoJSON that includes around 50,000 trees. We visualized those trees on a map of Zurich by extending the existing Map dizmo with a data layer.
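The write-up does not spell out how that data layer was built, so the snippet below is only a minimal sketch: it assumes a Leaflet-style map object (`map`) and a locally hosted copy of the Baumkataster GeoJSON (`trees.geojson`), both of which are illustrative placeholders rather than the actual Map dizmo API.

```javascript
// Minimal sketch: render the Baumkataster trees as a GeoJSON layer.
// Assumptions: a Leaflet map instance is available as `map`, and the
// Baumkataster GeoJSON is reachable at 'trees.geojson' (placeholder names).
fetch('trees.geojson')
  .then((response) => response.json())
  .then((trees) => {
    L.geoJSON(trees, {
      // Draw each tree as a small circle marker instead of the default pin,
      // which keeps ~50,000 points reasonably light to render.
      pointToLayer: (feature, latlng) =>
        L.circleMarker(latlng, { radius: 3, color: 'green' }),
    }).addTo(map);
  });
```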

Next, the smart strap gesture control was integrated using a node.js based script. The script analyzes the wearable sensor data, recognizes gestures, and forwards them to the robotjs package, an npm package that lets JavaScript control the keyboard and mouse. Once a gesture was recognized, it was mapped to a desired keyboard or mouse command: the cursor was placed on the zoom in/out button of the Map dizmo and a mouse click was sent based on the recognized gesture. This was the easiest and quickest way to integrate gestures, which mattered given the fast pace of a hackathon.
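As a rough illustration of that mapping (not the exact hackathon code), a node.js sketch along these lines ties a recognized gesture to a robotjs mouse click; the gesture names, the `smartstrap` event emitter, and the screen coordinates are invented placeholders, while the robotjs calls themselves are real.

```javascript
const { EventEmitter } = require('events');
const robot = require('robotjs');

// Stand-in for the node.js gesture recognizer; in the real setup it would
// emit a 'gesture' event with the name of the recognized hand gesture.
const smartstrap = new EventEmitter();

// Screen coordinates of the Map dizmo's zoom buttons (illustrative values
// that would be adjusted to the actual screen layout).
const ZOOM_IN = { x: 1200, y: 400 };
const ZOOM_OUT = { x: 1200, y: 440 };

smartstrap.on('gesture', (gesture) => {
  if (gesture === 'flick-up') {
    robot.moveMouse(ZOOM_IN.x, ZOOM_IN.y); // place the cursor on "zoom in"
    robot.mouseClick();                     // send a left click
  } else if (gesture === 'flick-down') {
    robot.moveMouse(ZOOM_OUT.x, ZOOM_OUT.y); // place the cursor on "zoom out"
    robot.mouseClick();
  }
});
```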

[Image: Carunda24 smart strap and dizmo at MakeZurich]