Philipp Rockel

This site shows a selection of free works and university projects created while I was studying Information Science and Media Design at the University of Applied Sciences in Augsburg and, as a guest student, Experimental Mediadesign at the Berlin University of the Arts. I finished my diploma project (æroflux) in February 2010.

E-Mail: philipp (at) rockolo (dot) com
love & respect:
Miriam Rockel
Michael Rockel

æroflux – AAAA

æroflux consists of an arthropod (currently a cockroach) placed in a petri dish on top of an aerial vehicle. The insect steers the vehicle through its movements and position, which are determined by photoresistors beneath the petri dish. The aerial vehicle is based on a modified quadrotor, cocooned in a fiberglass hull; collisions are avoided with the help of ultrasonic sensors. The project is the direct opposite of current DARPA-sponsored research on cyborg beetles, where man controls the animal's flight through an implanted microelectronic system: here, the beetle controls the machine's flight.
AAAA stands for
Autonomous Arthropod Aerial Appliance.
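For illustration, here is a hedged sketch of how the photoresistor readings might translate into steering commands. The sensor layout (one resistor per side of the dish), all names, and the mapping itself are assumptions, not the actual æroflux electronics.

```python
def clamp(x, lo=-1.0, hi=1.0):
    """Limit a steering command to the range [-1.0, 1.0]."""
    return max(lo, min(hi, x))

def steer_from_photoresistors(front, back, left, right):
    """Map four light readings (0.0 = fully shaded, 1.0 = full light)
    taken beneath the petri dish to pitch/roll commands in [-1.0, 1.0].
    The insect shades the sensor it stands over, so the vehicle
    steers toward the darker side."""
    pitch = clamp(back - front)   # front shaded -> positive -> fly forward
    roll = clamp(left - right)    # right shaded -> positive -> roll right
    return pitch, roll
```

With this mapping, an insect walking toward the front edge darkens the front sensor and the vehicle pitches forward, following it.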


The FACE INTRUDER is a radical communication tool, potentially useful only in micro-communities. The software provides a way to send each other video messages, but instead of a plain playback, the face and the voice are extracted from the sender's message and then arbitrarily substitute any face visible on the receiver's screen. The origin of the substituted face doesn't matter: it could be an image on the web, a movie, or a video chat. The playback of the face cannot be cancelled, muted, or paused, and other sources of sound such as movies, music, or Skype are muted while the face intrudes on the receiver's screen. The software originated while I was a guest student in the Digital Media class at the Berlin University of the Arts.
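A minimal sketch of the substitution step only, with face detection abstracted away (the original presumably relied on a computer-vision library). Frames here are plain 2D pixel lists and all names are hypothetical.

```python
def resize_nearest(patch, w, h):
    """Nearest-neighbour resize of a 2D pixel list to w x h."""
    ph, pw = len(patch), len(patch[0])
    return [[patch[y * ph // h][x * pw // w] for x in range(w)]
            for y in range(h)]

def intrude(frame, face_rects, sender_face):
    """Paste the sender's face patch over every detected face
    rectangle (x, y, w, h) on the receiver's screen frame."""
    for (x, y, w, h) in face_rects:
        scaled = resize_nearest(sender_face, w, h)
        for dy in range(h):
            for dx in range(w):
                frame[y + dy][x + dx] = scaled[dy][dx]
    return frame
```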

touchin' Cinema

touchin' Cinema lets you navigate quickly through a film's timeline, without a scrub bar and without interrupting the movie. The idea is realized as an overlay on the moving picture, showing the same film, which can be controlled via a multitouch device. Within that overlay, the user fast-forwards by swiping right with one finger, while a swipe to the left rewinds; the speed of swiping controls the speed of winding. By double-tapping the overlay, the original movie jumps to the time chosen in the overlay.

By performing a rotating movement with one finger, individual movie frames can be stepped through.
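The gesture mapping above could be modeled roughly like this. This is a sketch with assumed names, an assumed 25 fps frame rate, and an assumed swipe-to-seconds factor, not the actual implementation.

```python
class OverlayScrubber:
    FPS = 25  # assumed frame rate of the clip

    def __init__(self, duration_s, start_time_s=0.0):
        self.duration = duration_s
        self.main_time = start_time_s     # the undisturbed main movie
        self.overlay_time = start_time_s  # the scrubbable overlay

    def _seek(self, delta_s):
        self.overlay_time = min(self.duration,
                                max(0.0, self.overlay_time + delta_s))

    def swipe(self, velocity):
        """Horizontal swipe: velocity in screen-widths/s, right = positive.
        Faster swipes wind faster (factor 2.0 s per screen-width assumed)."""
        self._seek(velocity * 2.0)

    def rotate(self, steps):
        """One-finger rotation steps through individual frames."""
        self._seek(steps / self.FPS)

    def double_tap(self):
        """Jump the main movie to the time chosen in the overlay."""
        self.main_time = self.overlay_time
```

Keeping `main_time` and `overlay_time` separate is the key design point: the movie keeps playing undisturbed until the user commits with a double tap.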


LUMATIK is a dynamic light installation.
It is intended to be realized and integrated into the west facade of my university in Augsburg. The patterns shown evolve in real time and react to movements in front of the building, transforming the architecture into a digital membrane rather than a plain display. Since we are looking for partners, I created 3D visualizations and a concept book; the book reflects the facade's aspect ratio and uses the reduced architecture itself as the layout grid.
The team consisted of Katrin Eberhard, André Wachter and me, guided by Prof. Robert Rose.



AMBIENT INTERVENTION transfers periphonic sound fields of various acoustic environments from their point of origin into another room. The sound atmosphere isn't meant to be noticeable at first, but we infiltrated it with various fitting sounds that were triggered by the visitors. I teamed up with André Wachter, who built the camera tracking and the 3D microphone we needed to record and replay the atmosphere in the Ambisonics format.

The tracking information was used to move the artificial sounds around the visitors in real time. The installation's sounds were encoded and placed with Pd (Pure Data) and decoded with AmbDec.
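For illustration, the first-order Ambisonics (B-format) encoding behind this kind of panning can be sketched as follows. This uses the standard textbook encoding equations, not the original Pd patch; the function name is an assumption.

```python
import math

def encode_bformat(sample, azimuth, elevation=0.0):
    """Encode one mono sample into first-order B-format (W, X, Y, Z).
    Angles in radians; azimuth 0 = front, counter-clockwise positive.
    Sweeping the azimuth from tracking data moves the source around
    the listener."""
    w = sample / math.sqrt(2.0)                           # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    return w, x, y, z
```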


LUMEN is an installation in which areas of »virtual reality« and »reality« are mixed. Visitors entering the cubic room of LUMEN can explore different cubes that hover in the air. The cubes themselves are invisible, locatable only through their sounds and shadows.

My part of the project was programming the 3D sound that all objects in the room emit, creating 3D animations to visualize the technique behind the installation, and building a website to present our work and find sponsors. All team members are listed on the project site:



wiiSOCKET is a simple multi-threaded socket server in Python that communicates with Wiimotes. It reads the Wiimotes' accelerometers, normalizes the values, and sends them together with the button flags to all registered client sockets. Client sockets can be built in any language that supports sockets; I made one in AS3. I built the program in order to have a 3D input device for presenting the LUMEN installation.

The socket runs fine on Linux and should work on OS X too, but currently it doesn't. Feel free to download, use, or modify it.
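For illustration, here is a minimal sketch of the same idea: a threaded TCP server that accepts clients and broadcasts normalized accelerometer values plus button flags to all of them. The wire format (tab-separated text lines) and the calibration constants are assumptions, not wiiSOCKET's actual protocol.

```python
import socket
import threading

class BroadcastServer:
    def __init__(self, host="127.0.0.1", port=0):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))
        self.sock.listen(5)
        self.port = self.sock.getsockname()[1]
        self.clients = []
        self.lock = threading.Lock()
        # Accept clients in the background; broadcasting stays responsive.
        threading.Thread(target=self._accept_loop, daemon=True).start()

    def _accept_loop(self):
        while True:
            conn, _ = self.sock.accept()
            with self.lock:
                self.clients.append(conn)

    @staticmethod
    def normalize(raw, zero=128, scale=26):
        """Map a raw 8-bit accelerometer value to roughly [-1, 1] g.
        zero/scale are typical calibration guesses, not measured values."""
        return (raw - zero) / scale

    def broadcast(self, accel_raw, buttons):
        """Send one tab-separated update line to every registered client,
        dropping clients whose connection has gone away."""
        line = "\t".join(f"{self.normalize(v):.3f}" for v in accel_raw)
        msg = (line + "\t" + buttons + "\n").encode()
        with self.lock:
            for conn in list(self.clients):
                try:
                    conn.sendall(msg)
                except OSError:
                    self.clients.remove(conn)
                    conn.close()
```

A client in any socket-capable language (AS3 in the original setup) only has to connect and read lines.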

Download wiiSocket


The concept of the videoTimePusher is quite simple: Push a video back in time, solely at the point where you touch it.

If you imagine a video as a stack of frames, you could push a part of the very first frame back and see the underlying parts. I tried to demonstrate this effect with a little Max/MSP/Jitter patch and built a quick mockup in Flash so you can try it here.
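The frame-stack idea can be sketched in a few lines of Python. Lists of lists stand in for real video frames, the touched area is just a set of pixel coordinates, and all names are assumptions.

```python
def push_back(frames, current, region, depth):
    """Render frame index `current`, but inside `region` (a set of
    (x, y) pixels under the touch) show the frame `depth` steps
    earlier instead, as if that part were pushed back in time."""
    earlier = max(0, current - depth)          # don't push past frame 0
    out = [row[:] for row in frames[current]]  # copy the current frame
    for (x, y) in region:
        out[y][x] = frames[earlier][y][x]
    return out
```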

Download videoTimePusher
(Just drag and drop a movie into the program-window)