Project Rudolph. Part 2

I have gone live with what I'd call a prototype, beta site here: when the stream is on, you can control the lights on my tree by submitting a MIDI file.  It uses the same setup described in my post Project Rudolph Part 1, but I've built a web front end to let the whole world wide web control the tree.

It'll be up and running from today until Tuesday evening.  I have a weird schedule for the next few days, so if this interests you and the stream is down, check back at a later time to see if it's up.

How it works:

The front end is powered by a Ruby app, making use of the very cool Sinatra and DataMapper libraries.  I keep track of submitted tracks in an SQLite DB.
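The actual app uses Sinatra routes and a DataMapper model, but the shape of the data is roughly what this stdlib-only sketch shows; the names here are illustrative, not the real ones.

```ruby
# Illustrative sketch of the submission queue the front end maintains.
# The real app stores this in SQLite via DataMapper; here a plain
# in-memory structure stands in so the flow is easy to see.
Track = Struct.new(:name, :midi_data, :played)

class TrackQueue
  def initialize
    @tracks = []
  end

  # Called when a visitor submits a MIDI file through the web form.
  def submit(name, midi_data)
    @tracks << Track.new(name, midi_data, false)
  end

  # The tracks the backend hasn't played against the lights yet.
  def pending
    @tracks.reject(&:played)
  end

  # The backend marks a track played once it has driven the lights with it.
  def mark_played(track)
    track.played = true
  end
end
```

The key design point is simply that submission and playback are decoupled through persistent state, so the web app never has to talk to the lights directly.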

The backend app that I wrote previously periodically makes a request to my web app and retrieves the queued-up songs.  It then loops through these songs and uses each one to drive the lights.  For all you geeks out there, the code will be up shortly on my Mercurial repository:
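The backend's loop boils down to fetch-then-play. This sketch injects the HTTP fetch and the light-driving step as callables so the flow is visible without the hardware; in the real app the fetch would be an HTTP request (e.g. via Net::HTTP) to the web app, and both names here are made up for illustration.

```ruby
# One pass of the backend's polling loop: ask the web app for the
# queued songs, then drive the lights with each one in turn.
# `fetch` and `play` are stand-ins for the real HTTP call and the
# real MIDI-to-lights code. Returns how many songs were played.
def drain_queue(fetch, play)
  songs = fetch.call                      # GET the queued songs
  songs.each { |song| play.call(song) }   # drive the lights, one by one
  songs.size
end

# Usage with stubs:
#   drain_queue(-> { ["a.mid", "b.mid"] }, ->(s) { puts "playing #{s}" })
```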

The streaming is handled by Ustream.


The backend app is single threaded, so the queued songs are only retrieved when no tracks are playing.  To improve on this, I could create a dedicated thread that just retrieves the songs, and another thread that reads the downloaded songs and plays them.
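The two-thread version described above is straightforward with Ruby's built-in Queue: one thread fetches and enqueues songs, another dequeues and plays them, so fetching no longer has to wait for playback to finish. The fetcher and player below are stand-ins for the real HTTP and lights code.

```ruby
# Producer/consumer sketch of the threaded backend. `batches` stands
# in for successive fetches from the web app; `play` stands in for
# driving the lights with one song.
def run_pipeline(batches, play)
  queue = Queue.new   # thread-safe FIFO from Ruby's stdlib

  # Producer: retrieves songs and hands them to the player.
  fetcher = Thread.new do
    batches.each { |batch| batch.each { |song| queue << song } }
    queue << :done    # sentinel: no more songs coming
  end

  # Consumer: plays songs as they arrive, until the sentinel.
  player = Thread.new do
    while (song = queue.pop) != :done
      play.call(song)
    end
  end

  [fetcher, player].each(&:join)
end
```

Queue#pop blocks when the queue is empty, so the player simply sleeps until the fetcher delivers more work.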

I'm also using just the camera and microphone from my MacBook Pro, so if I wanted to do this for real, I should probably get a proper camera and microphone...