Summer moved on

Back when I decided to create this blog I promised myself two things: to keep it up to date, and never to make excuses for myself if, at times, that could not be upheld. So I’ll just give a short recap of what’s been going on instead.

The final stretch of my fourth semester project and the subsequent exams were followed directly by a summer-long internship at shiftcontrol studios in Copenhagen.

The project that I worked on is an interactive visualization of robotic history, commissioned by RoboDays 09. The application can be experienced and interacted with in the form of a 4×1 meter multitouch display situated in the Shared Robotics exhibition at Brandts Kunsthall (an art museum in Odense, Denmark), or online at sharedrobotics.com. The website also hosts a dynamic database of robots, which makes it possible for anyone to set up an account and contribute to the visualization by adding new robots.

The short of it is that I had an eventful, interesting and at times even enlightening summer. A more in-depth look at my experiences and how the internship progressed can be found here.

Flocking and boids explored



Spring is in the air quite literally, and by that I’m not referring to the still somewhat cold Nordic weather, but rather the heartwarming display of birds returning from their sensible winter migration. The sight of such flocking behavior has always fascinated me: graceful and stunningly complex shifts in movement, seemingly unorganized, sometimes almost random, yet perfectly aligned and synchronized.

I’ve briefly dipped into code that simulates this behavior in the past, but never taken the time to really dig in and learn the underlying rules and algorithm it builds upon. This time around the sight of these birds of spring has inspired me to learn how to simulate flocking behavior from the ground up. The most obvious place to start quickly struck me as Craig Reynolds’ boids model, which seems to have become the standard for this form of computation. The boids moniker is his generic name for all “simulated flocking creatures”. I started out reading his article “Flocks, Herds, and Schools: A Distributed Behavioral Model”, and would recommend it as an entry point for understanding the underlying principles. It just makes good sense to start at the source sometimes. Daniel Shiffman has a very good walkthrough on his blog labeled “Autonomous Steering Behaviors”. His examples served as a good reference point and inspiration. Another very good theoretical read on the subject, also highlighted by Shiffman, can be found in “The Computational Beauty of Nature” by Gary William Flake (chapter 16), a book I can warmly recommend to anyone interested in describing nature through code.

While the basic theory is simple enough (each boid steers by just three local rules: separation, alignment and cohesion), I didn’t feel quite ready to translate the model into code from theory alone, but I was adamant about not adapting some already working code, as my main goal was to achieve a high level of understanding. It therefore delighted me to find the boids pseudocode walkthrough by Conrad Parker, which provides a fine middle ground. It served as the basis for this initial Processing implementation:
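For anyone reading along, here is a condensed sketch of how those three rules can be wired together in Processing. To be clear, this is an illustrative reconstruction in the spirit of Parker’s pseudo code, not the exact code behind my implementation, and the constants (neighbour distance, speed limit and so on) are just reasonable starting values:

// A condensed take on Parker's three rules: cohesion, separation and alignment.
// Names and constants are illustrative; this is not the exact code from the sketch above.

int flockSize = 60;
Boid[] boids = new Boid[flockSize];

void setup() {
  size(640, 480);
  smooth();
  for (int i = 0; i < flockSize; i++) {
    boids[i] = new Boid(random(width), random(height));
  }
}

void draw() {
  background(255);
  for (int i = 0; i < flockSize; i++) {
    boids[i].update(boids);
    boids[i].display();
  }
}

class Boid {
  PVector position, velocity;

  Boid(float x, float y) {
    position = new PVector(x, y);
    velocity = new PVector(random(-1, 1), random(-1, 1));
  }

  void update(Boid[] flock) {
    PVector v1 = cohesion(flock);    // rule 1: steer towards the centre of mass of the flock
    PVector v2 = separation(flock);  // rule 2: keep a small distance from nearby boids
    PVector v3 = alignment(flock);   // rule 3: match the average velocity of the flock
    velocity.add(v1);
    velocity.add(v2);
    velocity.add(v3);
    velocity.limit(3);               // cap the speed so the flock stays watchable
    position.add(velocity);
    wrapAroundEdges();
  }

  PVector cohesion(Boid[] flock) {
    PVector centre = new PVector(0, 0);
    for (int i = 0; i < flock.length; i++) {
      if (flock[i] != this) centre.add(flock[i].position);
    }
    centre.div(flock.length - 1);
    PVector steer = PVector.sub(centre, position);
    steer.div(100);                  // move 1% of the way towards the centre each frame
    return steer;
  }

  PVector separation(Boid[] flock) {
    PVector push = new PVector(0, 0);
    for (int i = 0; i < flock.length; i++) {
      if (flock[i] != this && PVector.dist(position, flock[i].position) < 20) {
        push.add(PVector.sub(position, flock[i].position));  // push straight away from close neighbours
      }
    }
    return push;
  }

  PVector alignment(Boid[] flock) {
    PVector average = new PVector(0, 0);
    for (int i = 0; i < flock.length; i++) {
      if (flock[i] != this) average.add(flock[i].velocity);
    }
    average.div(flock.length - 1);
    PVector steer = PVector.sub(average, velocity);
    steer.div(8);                    // nudge the velocity 1/8th of the way towards the flock average
    return steer;
  }

  void wrapAroundEdges() {
    if (position.x < 0) position.x += width;
    if (position.x > width) position.x -= width;
    if (position.y < 0) position.y += height;
    if (position.y > height) position.y -= height;
  }

  void display() {
    fill(0);
    noStroke();
    ellipse(position.x, position.y, 6, 6);
  }
}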



Some additional tweaks were inspired by Shiffman’s flocking example, the most notable being the rotation of each boid based on the heading of its velocity vector. At present my flock behaves more like an insect swarm than birds in flight, but my focus was on keeping the implementation simple and straightforward rather than reaching any particular level of realism. I deliberately kept the naming of variables and functions clear and concise, so that my code might be a helpful read for anyone else wanting to get flapping with some flocking.
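For the curious, that rotation tweak boils down to replacing the boid’s display routine with something along these lines (again just a sketch of the idea, using atan2 on the velocity components rather than Shiffman’s exact code):

// Draw the boid as a small triangle pointing in the direction it is moving.
void display() {
  float heading = atan2(velocity.y, velocity.x) + HALF_PI;  // angle of the velocity vector, offset so the tip points forward
  fill(0);
  noStroke();
  pushMatrix();
  translate(position.x, position.y);
  rotate(heading);
  triangle(0, -6, -3, 6, 3, 6);  // a narrow triangle centred on the boid's position
  popMatrix();
}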

OSC, Processing and an iPhone accelerometer

Perhaps a more visually engaging result of my experimentation with the iPhone and Open Sound Control. The starting point is the same as earlier, but this time around it is controlled by the accelerometer:



Link to the image used


If anyone is interested in the Processing source code, let me know in the comments, as I would be happy to share it. I’m also open to doing something along the lines of a basic tutorial explaining how this was done.
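Until that tutorial materializes, the receiving end in Processing boils down to something like the sketch below. I’m assuming an iPhone app that sends the accelerometer as an /accxyz message with three floats; the address pattern and the port are whatever you configure on the phone, so treat them as placeholders:

// Minimal oscP5 sketch that listens for accelerometer data sent from the phone.
// The address pattern "/accxyz" and port 12000 are placeholders; use whatever the iPhone app actually sends.

import oscP5.*;
import netP5.*;

OscP5 oscP5;
float ax, ay, az;  // latest accelerometer reading

void setup() {
  size(400, 400);
  oscP5 = new OscP5(this, 12000);  // listen for incoming OSC messages on UDP port 12000
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/accxyz") && msg.checkTypetag("fff")) {
    ax = msg.get(0).floatValue();
    ay = msg.get(1).floatValue();
    az = msg.get(2).floatValue();
  }
}

void draw() {
  background(0);
  // Map the tilt of the phone (assumed to be roughly -1g to 1g) to the position of a marker as a sanity check.
  float x = map(ax, -1, 1, 0, width);
  float y = map(ay, -1, 1, 0, height);
  fill(255);
  ellipse(x, y, 20, 20);
}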

[source code]

OSC + iPhone = explosion

Some quick and early videos of the experimenting I’ve been doing with Open Sound Control and my iPhone, coupled with a nice dash of Processing. I got inspired by the exploding image example provided by Daniel Shiffman at processing.org, and used it as a starting point for my own full-blown 3D implementation with some very basic sound feedback. Look out for a version harnessing the iPhone’s accelerometer to portray some pretty interesting visuals in the next couple of days. I also hope to give a short write-up on how to get the data from the iPhone through OSC into Processing using the oscP5 library.



Link to the image used



Link to the image used
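I’ll save the details for the write-up, but the core idea behind the 3D version is simple enough to outline here: every Nth pixel of the image becomes a particle that keeps its colour and gets pushed out along a random 3D direction when triggered. A rough sketch of that idea, leaving out the sound feedback, with the image path and constants as placeholders rather than the actual values I used:

// Rough sketch of the "exploding image" idea in 3D: every Nth pixel becomes a coloured
// particle that flies off along a random 3D direction when the mouse is pressed.
// "image.jpg" and the constants are placeholders.

PImage img;
int step = 4;          // sample every 4th pixel so the particle count stays manageable
Particle[] particles;

void setup() {
  size(640, 480, P3D);
  img = loadImage("image.jpg");   // any image dropped into the sketch's data folder
  img.loadPixels();
  int cols = img.width / step;
  int rows = img.height / step;
  particles = new Particle[cols * rows];
  for (int j = 0; j < rows; j++) {
    for (int i = 0; i < cols; i++) {
      int x = i * step;
      int y = j * step;
      particles[j * cols + i] = new Particle(x, y, img.pixels[y * img.width + x]);
    }
  }
}

void draw() {
  background(0);
  translate((width - img.width) / 2, (height - img.height) / 2);  // centre the image
  for (int i = 0; i < particles.length; i++) {
    particles[i].update();
    particles[i].display();
  }
}

void mousePressed() {
  // Trigger the explosion: give every particle a push.
  for (int i = 0; i < particles.length; i++) {
    particles[i].explode();
  }
}

class Particle {
  PVector position, velocity;
  color col;

  Particle(float x, float y, color c) {
    position = new PVector(x, y, 0);
    velocity = new PVector(0, 0, 0);
    col = c;
  }

  void explode() {
    // A small random push in all three dimensions, strongest along the z-axis.
    velocity.add(new PVector(random(-1, 1), random(-1, 1), random(-4, 4)));
  }

  void update() {
    position.add(velocity);
    velocity.mult(0.98);  // a little drag so the explosion settles down
  }

  void display() {
    stroke(col);
    strokeWeight(step);
    point(position.x, position.y, position.z);
  }
}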

New interface for musical expression

So the 4th semester of my bachelor in Medialogy is well underway. The first couple of weeks have been spent deciding which particular direction we as a group would like to take with the semester project. The overall theme this time around is “Interaction Design – Sound and Sensors”, which of course stipulates that we should incorporate both sensory and sound aspects into our project. We have agreed to go in the direction of creating a new interface for musical expression, more specifically a sequencer-like device controlled by proximity sensors. As a supplement and interesting experiment we have decided to go Open Source with both the software and hardware that we end up constructing, as well as to set up a blog to show the progression of our work. I will be back with more details and a link to the blog as soon as we have everything set up, hopefully in the next couple of weeks.

The interaction trap
David Rokeby, Photo: John Reeves




About a week or so ago I had the privilege of attending a guest lecture held by the internationally renowned Canadian artist David Rokeby. He served up a truly delightful, behind-the-scenes look at the creation of some of his own art installations, most of them interactive, and all with a clear technological aspect. In fact Rokeby is seen by many as one of the forefathers of the whole movement of interaction in art. One of his most famous works, “Very Nervous System” from 1982, is about creating musical expression based on body movement, and after seeing and hearing it, I would deem it quite impressive even by today’s standards.

Perhaps it is not surprising, then, that the piece kept his interest through another ten years of refinements. It dawned on me with some amusement that he was doing things with image processing and sound in the mid 80s that still surpass my own humble forays into the field.

Another highlight was “The Giver of Names”: in essence, a computer expressing itself through a unique language, driven and evolved by principles of artificial intelligence. The concept is easier to understand through a video:



What struck me again was the longevity of the idea, and the dedication and time Rokeby is willing and able to put into a single project. “The Giver of Names” started in 1991 and is still ongoing.

Perhaps most interesting, though, were his clearly formulated and insightful thoughts on interaction in art and its present relevance. He admitted that most of his newer work lacks the interaction element, simply because most of the really strong paradigms of interaction have already been done to death. The truly original and clever ideas have become few and far between. Another interesting tidbit that I personally have never really given much thought is how almost all of the interaction that we are presented with is based on the mirroring principle: direct feedback that maps the interaction to the output in mirror-like fashion. When you move in a certain way you expect the interaction to uphold the rule of always responding in the same way. While this of course is a very strong means of interaction, as the subject feels in direct control of the exchange, it can also become a limitation. Most people have come to expect this particular response, which makes it difficult to introduce alternative and less direct forms of interaction. Something to think about outside the artistic realm as well, for sure.

I really enjoyed most of his installations, but my personal favorite was more recent and non-interactive. Sadly I can’t find any imagery or video of it online. It dealt with time and the visualization of time in space. A scene of falling snow was captured by a camera and put through an algorithm that would sharpen each individual moving snowflake, blur all static elements like the background and suddenly slow down time, creating motion trails from the snowflakes that depicted some really amazing and serene organic visuals. In many ways nature itself became the subject of the interaction. I did find a piece of his that relates closely to it, called San Marco Flow:



I recommend that anyone check out the rest of his work, as his takes on the relation between human and machine are inspiring in all the right ways: puzzling, thought-provoking and sometimes just plain pretty.

Hi,

So finally, my first post in my first ever blog.

I have had several websites up through the years, but really none that stuck, in the sense that I never managed to uphold the level of activity and freshness that this medium needs. That will change this time around.

In the about section of the site I have tried to formulate what this blog will contain and the why behind its creation. The sharing of thoughts, ideas, projects and experiences, or put simply, giving back a little something for all the times I have taken advantage of the same from others, is my primary objective. To be honest there is also a secondary one. I figure that creating this outlet will help me focus my efforts, as creating content in the form of writing, projects or small-scale experiments that is interesting enough to present to you will also be good for me. Even if that sometimes simply comes in the form of putting into words why a certain website inspired me or why a book is worthy of my recommendation, I feel strongly that it can be of mutual benefit to both of us. Enough words about the about section already; better get something up here that is actually worth reading.