
Projects (things I made that are not yet ported into my portfolio)

Augmented reality for ENEL 5.0 TOUR

Augmented reality software for the ENEL 5.0 TOUR, built with Unity3D and custom JavaScript / C# programming. The aim of the interactive installation was to point an iPad at a white sphere and see a virtual world-sphere pop out of the real one, surrounded by infographics about ENEL.

Made with: Unity3D, JavaScript, C#, String (AR library for Unity3D)

[mejsvideo src=”http://www.inimart.com/portfolio/AR_ENELtour.mov” width=”640″ height=”480″]

 

Projection mapping for ENEL 5.0 TOUR

Projection mapping installation for the ENEL 5.0 TOUR. Projection mapping is a projection technology used to turn objects, often irregularly shaped, into a display surface for video projection. In this installation we rendered a set of videos from carefully chosen perspectives and projected them onto a Smart car, two walls and a floor, using three projectors, to achieve the illusion of a 3D scene. The projection calibration was done with Warptool, a custom projection-mapping application written in Objective-C and developed for Quartz Composer integration.

[mejsvideo src=”http://www.inimart.com/portfolio/iviMap.mp4″ width=”370″ height=”270″]

Made with: Quartz Composer, Warptool, Cinema4D.

 

Projection mapping on Palazzo Priori (Perugia)

This is a custom interactive scene made with Unity3D that allows the DJ to modify a virtual light illuminating the Palace, or a wind moving the virtual fabric. The scene is then projected onto the Palace.

[mejsvideo src=”http://www.inimart.com/portfolio/Perugia_mapping.mp4″ width=”340″ height=”270″]

Made with: Quartz Composer, Unity3D.

 

Stretchy Lines – Interactive visual table

We developed a table that reacts to any kind of object placed on it.

The setup includes a Kinect mounted perpendicular to the table. The TUIO server detects objects within a specific range, and a calibrated projector draws random lines around the detected objects.
TUIO sends the objects' coordinates to Quartz Composer over the OSC protocol, and the graphics are rendered with custom Quartz Composer patches.
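As an illustration of this pipeline, here is a minimal Python sketch of the coordinate step: TUIO delivers object positions normalised to the 0..1 range over OSC, and they must be mapped into the calibrated projector's pixel space before lines can be drawn around an object. The resolution and margin values are hypothetical, not the installation's real calibration (which lived in Quartz Composer):

```python
def tuio_to_projector(nx, ny, width=1280, height=800, margin=40):
    """Map a TUIO-normalised coordinate (0..1) into projector pixels.

    `width`, `height` and `margin` are illustrative calibration values,
    not the ones used in the installation.
    """
    px = margin + nx * (width - 2 * margin)
    py = margin + ny * (height - 2 * margin)
    return px, py
```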

Made with: Quartz Composer, JavaScript, Kinect

 

The Brain – tracking a violinist using Kinect and color tracking

We needed to develop a solution for tracking a violinist in real time while playing. The client showed us this reference. That kind of tracking is normally achieved with expensive infrared cameras; since the budget was very low, we needed a cheaper approach. My first prototypes used the Kinect camera for body tracking, but the violin and the particular pose of a playing violinist prevented good tracking, and the resulting data was very noisy. I then decided to adopt a hybrid solution: use the Kinect for torso and head tracking, and track small colored spheres attached to the violinist for the remaining skeleton joints. I developed a custom application in openFrameworks that receives the Kinect tracking data and performs the color tracking. This is the software at work:

[mejsvideo src=”http://www.inimart.com/portfolio/virtual_violin/theBrain_03.mp4″ width=”350″ height=”270″]
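The colour-tracking half of the approach can be sketched in a few lines. This is a simplified Python stand-in for the openFrameworks implementation: it takes a list of (x, y, r, g, b) pixel samples and returns the centroid of the pixels close to a target marker colour; the pixel format and tolerance value are illustrative assumptions:

```python
def track_color(pixels, target, tol=60):
    """Return the centroid (x, y) of pixels near `target` RGB, or None.

    `pixels` is a list of (x, y, r, g, b) samples; `tol` is a Manhattan
    distance threshold in RGB space (an assumed value).
    """
    hits = [(x, y) for (x, y, r, g, b) in pixels
            if abs(r - target[0]) + abs(g - target[1]) + abs(b - target[2]) <= tol]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)
```

A real tracker would also smooth the centroid over time to suppress the frame-to-frame noise mentioned above.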

This is the output the software produced:

[mejsvideo src=”http://www.inimart.com/portfolio/virtual_violin/TheBrain_rendering_finale.mp4″ width=”640″]

And this is the software used in the TV show “The Brain”:

[mejsvideo src=”http://www.inimart.com/portfolio/virtual_violin/theBrain_violino_little.mp4″ width=”430″ height=”230″]

Made with: openFrameworks, Kinect.

 

Experiments (things I made that are still prototypes)

 

EL-Wire audio-reactive dress

Our client wanted something like this. I started to read about Electroluminescent Wire and ways to control it wirelessly. There were a lot of aspects to take into account and solve:

  • How to control the EL wire wirelessly (WiFi / radio).
  • How to make the dress.
  • How to power the whole thing.

I made a prototype using Arduino, some dedicated shields, and an external power supply. Very ugly, but it worked:

El Wire wireless control prototype


 

For the dress, I bought some little iron circles and stitched them onto an old sweatshirt. Not perfect, but it worked:

EL Wire dress prototype


And here is the result of the first tests, and finally a complete test wearing the EL wire dress. In the final video you can see that I made the dress controllable in three ways, using the OSC protocol:

  • Using a fixed timeline (built on an audio track – a fake audio-reactive simulation)
  • Audio-reactive
  • Human-controlled via a simple OSC interface on iOS (an iPhone in the video)

[mejsvideo src=”http://www.inimart.com/portfolio/ELWIRE/EL_Prototype1.1.mp4″ width=”480″ height=”280″]

 

[mejsvideo src=”http://www.inimart.com/portfolio/ELWIRE/EL_prototype_1.1_dress_00.mp4″ width=”340″ height=”560″]
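The core decision of the audio-reactive mode can be sketched as follows, assuming a hypothetical six-channel wire layout: an incoming audio level (already normalised to 0..1, e.g. delivered in an OSC message) is turned into an on/off state per EL-wire channel, which the Arduino would then switch:

```python
def channels_for_level(level, n_channels=6):
    """Audio-reactive mode: light the first k of n EL-wire channels
    in proportion to the input level (0.0..1.0).

    The channel count is an assumption for illustration.
    """
    k = max(0, min(n_channels, round(level * n_channels)))
    return [i < k for i in range(n_channels)]
```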

Made with: ELWire, Arduino, Quartz Composer

 

 

Rotating projection mapping

In this experiment we used Arduino to rotate a cube. Arduino sends the current rotation angle to the Unity scene via the OSC protocol, and the virtual cube in the rendered scene is updated in real time, resulting in a perfect virtual/real cube pairing and achieving the effect of a 360-degree projection mapping.

[mejsvideo src=”http://www.inimart.com/portfolio/Rotate_Projection_00.mp4″ width=”360″ height=”280″]
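The Arduino-to-Unity angle update can be sketched in Python. Assuming, hypothetically, that the cube's rotation is read from a rotary encoder, the tick count must be wrapped into degrees before being sent over OSC to the Unity scene; the encoder resolution here is an illustrative value:

```python
def ticks_to_degrees(ticks, ticks_per_rev=2048):
    """Convert rotary-encoder ticks (as read from Arduino) into a
    rotation angle wrapped to the [0, 360) range.

    `ticks_per_rev` is an assumed encoder resolution.
    """
    return (ticks % ticks_per_rev) * 360.0 / ticks_per_rev
```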

Made with: Arduino, Unity3D.

 

 

Turn a simple LCD or a wall into a multitouch surface using Kinect

The concept is the same as in Stretchy Lines. If we have an LCD, we can turn it into a multitouch monitor without using a projector:

[mejsvideo src=”http://www.inimart.com/portfolio/multitouch_LCD_kinect_00.mp4″ width=”360″ height=”540″]

Or we can turn any wall into a multitouch surface using a projector. In the following video we are using the NecTouch libraries for Kinect calibration:

[mejsvideo src=”http://www.inimart.com/portfolio/necTouch_01.mp4″ width=”360″ height=”270″]

Made with: Quartz Composer, JavaScript, Kinect

 

 

Paint projection: projection mapping + paint software

This is a prototype for an interactive car-show installation. There is a real car in the scene, which is projection-mapped with a texture painted live by the user: if the user paints something on the interactive canvas, the drawing is projected onto the car in real time.

[mejsvideo src=”http://www.inimart.com/portfolio/paintProjection_00.mp4″ width=”450″ height=”280″]

Made with: Unity3D, JavaScript.

 

 

RGBD filmmaking: using a depth camera to enhance RGB videos

RGBD is a filmmaking technique that overlays depth-sensing data maps on HD video from a DSLR camera. It was once an incredible feat of quickly-made 3D models that only a crack team of professional videographers and programmers could pull off. Nowadays, with depth cameras like the Kinect, this kind of effect is very simple to achieve.

This is an example of a scene recorded using only Kinect data, with a custom application made in Cinder:

And these are examples of that scene rendered in Cinema4D using a Python plugin that reads the depth frames and displaces a set of null objects:

[mejsvideo src=”http://www.inimart.com/portfolio/kinect_rendering_test_dance.mp4″ width=”480″ height=”270″]
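The core of such a plugin is a depth-to-displacement mapping. A minimal sketch, with assumed near/far clamp values in millimetres (the Kinect reports depth roughly in that range) and a hypothetical maximum offset:

```python
def depth_to_offset(depth_mm, near=500.0, far=4000.0, max_offset=100.0):
    """Map a raw Kinect depth reading (mm) to a Z displacement for one
    null object: nearer points get larger offsets, out-of-range depths
    are clamped. All parameter values are illustrative assumptions.
    """
    d = max(near, min(far, depth_mm))
    t = (far - d) / (far - near)   # 1.0 at `near`, 0.0 at `far`
    return t * max_offset
```

The plugin would evaluate this once per null object in the grid, one reading per depth-frame sample.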

 

This is another example of RGBD, shot using the Kinect and the RGBDToolkit software:

[mejsvideo src=”http://www.inimart.com/portfolio/RGBD_03.mp4″ width=”470″ height=”270″]

Made with: Cinema4D, Cinder, Kinect, RGBDToolkit

 

 

Arduino Energy lab for ENEL 5.0 TOUR

We developed this interactive lab based on the concept of renewable energy. Visitors blow into a tube to produce wind energy, pull a lever to produce hydroelectric power, and illuminate photosensitive sensors to produce solar energy. The sensors that read the user interactions are connected to an Arduino board, which collects the data and sends it to a Unity3D scene via the OSC protocol.

[mejsvideo src=”http://www.inimart.com/portfolio/Laboratorio_interattivo_ENELtour.mp4″ width=”360″ height=”270″]
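The data-shaping step between the sensors and the Unity scene could look like the following sketch. The gain values and the 10-bit `analogRead` range are assumptions for illustration; in the real installation the values were sent on over OSC:

```python
def energy_from_sensors(blow, lever, light, gains=(0.5, 1.0, 0.3)):
    """Combine raw sensor readings (0-1023, the Arduino analogRead
    range) into normalised wind / hydro / solar energy values.

    The per-source gains are hypothetical tuning constants.
    """
    readings = (blow, lever, light)
    return tuple(min(1.0, r / 1023.0 * g) for r, g in zip(readings, gains))
```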

Made with: Unity3D, Arduino, JavaScript

 

 

Interactive installation proposals based on Kinect

A set of interactive installation proposals that use the Kinect to introduce interaction with the virtual world (free-hand car simulator, platform game, and more).

[mejsvideo src=”http://www.inimart.com/portfolio/installazioni_interattive.mp4″ width=”340″ height=”540″]

 

[mejsvideo src=”http://www.inimart.com/portfolio/kinectProcessingPhisics_00.mp4″ width=”370″ height=”270″]

Made with: Unity3D, Kinect, Processing

 

 

Two-layer projection

In this experiment we use the Kinect to separate the objects in the scene by depth: the scene is divided into two layers, background and foreground. The image projected onto objects in the background layer is different from the one projected onto objects in the foreground layer.

[mejsvideo src=”http://www.inimart.com/portfolio/Kinect2layersTD.mp4″ width=”500″ height=”280″]
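The depth split itself is a simple threshold. A sketch over one row of Kinect depth values (in millimetres, with 0 meaning no reading), using a hypothetical threshold to separate the two layers:

```python
def split_layers(depth_row, threshold_mm=1500):
    """Split one row of Kinect depth readings into foreground and
    background boolean masks. `threshold_mm` is an assumed value;
    a reading of 0 (no data) belongs to neither layer.
    """
    fg = [0 < d <= threshold_mm for d in depth_row]
    bg = [d > threshold_mm for d in depth_row]
    return fg, bg
```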

Made with: TouchDesigner, Kinect


 

 

Kinect / Cinema4D integration

This is an example of how to integrate the Kinect with Cinema4D. We used a plugin from 908lab called KiCapOSC, which sends Kinect skeleton data directly into the Cinema4D viewer via the OSC protocol. Very useful for rigging characters and animating humanoid objects in a virtual scene. This is the plugin in action:

[mejsvideo src=”http://www.inimart.com/portfolio/kinect_cinema4d.m4v” width=”480″ height=”280″]

And this is the final result, obtained simply by attaching spheres and some particle generators to the skeleton joints:

[mejsvideo src=”http://www.inimart.com/portfolio/kinectcinemaLOW.mp4″ width=”450″ height=”270″]

Made with: Cinema4D, Kinect

 

 

Mixing virtual/real objects with physics

A custom Quartz Composer plugin that lets the user draw segmented lines inside the projected scene to mark where the obstacles are in the real world. All the virtual objects that appear in the scene then interact with those constraints.
Made with: Objective-C, Quartz Composer, Box2D (2D physics library)

[mejsvideo src=”http://www.inimart.com/portfolio/box2D_rain.mp4″ width=”300″]

 

[mejsvideo src=”http://www.inimart.com/portfolio/box_2D_postit.mp4″ width=”300″]

 

 

3D frame illusion: head tracking

Using the Kinect we can estimate the position of the user's head and modify the perspective of the projected image to give the illusion of a 3D projection.

Made with: Quartz Composer, Kinect

[mejsvideo src=”http://www.inimart.com/portfolio/headTracking_frameIllusion.mp4″ width=”640″ height=”400″]
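The perspective correction behind this illusion is a standard off-axis (asymmetric) view frustum. A sketch of the maths, with the head position and screen size expressed in the same units relative to the screen centre; this is a simplified form of the usual generalized-perspective-projection derivation, not the installation's actual Quartz Composer patch:

```python
def offaxis_frustum(head, screen_w, screen_h, near=0.1):
    """Asymmetric frustum extents at the near plane for a head at
    (x, y, z) relative to the screen centre, where z is the viewer's
    distance from the screen plane. Units are arbitrary but consistent.
    """
    hx, hy, hz = head
    scale = near / hz                       # project screen edges onto the near plane
    left   = (-screen_w / 2 - hx) * scale
    right  = ( screen_w / 2 - hx) * scale
    bottom = (-screen_h / 2 - hy) * scale
    top    = ( screen_h / 2 - hy) * scale
    return left, right, bottom, top
```

As the tracked head moves sideways, the frustum becomes asymmetric and the rendered scene shifts accordingly, which is what sells the depth illusion.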

 

Tests (things someone else around the world made, and I tested 🙂)

 

Two hand tracking

Two depth cameras can be used to track two hands and move objects around a virtual space, using the 3Gear Systems hand-tracking solution.

[mejsvideo src=”http://www.inimart.com/portfolio/3Gears_kinect_handTracking_00.mp4″ width=”480″ height=”360″]

Made with: 3Gear Systems, Kinect.

 

Faceshift software: face / expression tracking

Testing Faceshift software. Extremely powerful!

[mejsvideo src=”http://www.inimart.com/portfolio/FaceTracking_Luca_00.mp4″ width=”340″ height=”270″]

Made with: Faceshift, Kinect

 

Vuforia Augmented Reality test

Augmented reality test using the Vuforia library and Unity. It is a very robust augmented-reality library!

[mejsvideo src=”http://www.inimart.com/portfolio/Vuforia_AR.mp4″ width=”370″ height=”270″]

Made with: Unity3D, Vuforia (AR library).

 

 

Non-conventional ways of interaction using Makey-makey

Some tests using Makey-Makey.

[mejsvideo src=”http://www.inimart.com/portfolio/MakeyMakey_00.mp4″ width=”370″ height=”270″]

Made with: Makey-Makey, Arduino.

 

 

Virtual whiteboard using a Wii controller

Johnny Chung Lee published a set of projects showing how to use the Wii controller in unconventional ways. This experiment is an implementation of Johnny Lee's “Low-Cost Multi-point Interactive Whiteboards Using the Wiimote”.

[mejsvideo src=”http://www.inimart.com/portfolio/wii_wall_paint_01.mov” width=”340″ height=”270″]

Made with: Wiimote.