As we said earlier, we discovered that we could not get movies (at least not the way we had made them) to play in Processing, because QuickTime was the only video format supported. This was a problem: although one half of the team was using a Mac, where everything worked, the other half was using a PC. On investigation we found that QuickTime was not supported on Linux but was supported on Windows. However, the Bluetooth driver that worked with the Wii Remotes would only work with Microsoft's Bluetooth stack, not the Toshiba stack that the PC had. In the end we re-shot the whole thing as stills, then cut out the images to make them transparent PNGs. This gave us much greater flexibility when importing the series of images into Processing.
We then concentrated on bringing all our image elements into Processing to build up the scene.
The Processing sketch would have four different states, reflecting the four possible threads through the story.
- KITTEN_CHASING_BIRD
- BIRD_IN_CAGE
- KITTEN_CAUGHT_BIRD
- BIRD_SAVED
The setup() method was used to initialize all the variables required during the sketch and also to load all the images and sounds that would be used.
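The four states and the initialization done in setup() might be sketched as follows in plain Java (outside a full Processing sketch). The enum, field, and method names are our illustrative reconstruction, not the actual sketch code, and the real setup() also loaded the images and sounds:

```java
// Sketch of the four story states and the variable initialisation that
// setup() performed. StoryState and the field names are illustrative.
public class StorySketch {
    enum StoryState { KITTEN_CHASING_BIRD, BIRD_IN_CAGE, KITTEN_CAUGHT_BIRD, BIRD_SAVED }

    StoryState state;

    // Stand-in for Processing's setup(): initialise the state variable;
    // in the real sketch this is also where loadImage() and the sound
    // loading happened.
    void setup() {
        state = StoryState.KITTEN_CHASING_BIRD;  // the story always starts here
    }
}
```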
During development it became clear that a number of the "actors" in our story had similar properties and abilities. For example, the bird and the cage were each assigned to a Wii remote and could be moved independently. To reflect this, a class "Actor" was defined that encapsulated the properties common to all of them.
An "Actor" had:
- a position on-screen
- an optional Wii remote assigned to it
An "Actor" was able to:
- detect whether it was near another Actor
- chase another actor
- adjust its position on input from the Wii remote assigned to it
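The properties and abilities above can be sketched as a minimal Java class. The proximity threshold, method names, and the way remote input arrives are assumptions for illustration, not the actual sketch code:

```java
// Minimal sketch of the shared Actor behaviour: a screen position, an
// optional Wii remote, proximity testing, and chasing.
public class Actor {
    float x, y;
    Integer remoteId;              // null when no Wii remote is assigned
    static final float NEAR = 50;  // assumed pixel threshold for "near"

    Actor(float x, float y) { this.x = x; this.y = y; }

    // True when the other actor is within the proximity threshold.
    boolean isNear(Actor other) {
        float dx = other.x - x, dy = other.y - y;
        return Math.sqrt(dx * dx + dy * dy) < NEAR;
    }

    // Step towards another actor at the given speed (chasing).
    void chase(Actor target, float speed) {
        float dx = target.x - x, dy = target.y - y;
        float dist = (float) Math.sqrt(dx * dx + dy * dy);
        if (dist > 0) { x += speed * dx / dist; y += speed * dy / dist; }
    }

    // Adjust position from the assigned remote; dx/dy stand in for
    // whatever the Wii library reports for tilt or pointer movement.
    void onRemoteInput(float dx, float dy) {
        if (remoteId != null) { x += dx; y += dy; }
    }
}
```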
We then identified two variants of these actors: Static and Animated. Static Actors were represented by a single image, whereas Animated Actors were represented by a set of images displayed in sequence. The Animated Actor was our way of making Processing animate without using the QuickTime library, which, as mentioned above, was not supported on all platforms.
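The Animated Actor idea reduces to cycling through a fixed frame set on every redraw. A minimal sketch of that frame cycling, with frames represented by name rather than by Processing's PImage, and all names assumed:

```java
// Sketch of the Animated Actor idea: instead of a movie, cycle through a
// fixed set of frame images each time the sketch draws, wrapping around
// so the animation loops continuously.
public class AnimatedFrames {
    String[] frames;   // one image per frame of the animation
    int current = 0;

    AnimatedFrames(String[] frames) { this.frames = frames; }

    // Called once per draw(): return the frame to show, then advance.
    String nextFrame() {
        String frame = frames[current];
        current = (current + 1) % frames.length;
        return frame;
    }
}
```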
Once we had put these classes in place, the Processing draw() function was simply a matter of detecting whether different Actors were near each other and then altering the state of the sketch to match. For example: if the kitten is near the bird, the game ends; if the bird is near the cage, go to the next stage; if the bird in the cage is near the window, move on to the final scene.
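The proximity checks inside draw() amount to a small state machine. A self-contained sketch of one pass of those checks, with the state names taken from the list above but the threshold, positions, and helper names assumed:

```java
// Sketch of the proximity-driven state machine inside draw(). The
// transition rules follow the story; the distance threshold and the
// way positions are passed in are illustrative.
public class StoryLogic {
    enum State { KITTEN_CHASING_BIRD, BIRD_IN_CAGE, KITTEN_CAUGHT_BIRD, BIRD_SAVED }

    static boolean near(float ax, float ay, float bx, float by) {
        float dx = bx - ax, dy = by - ay;
        return Math.sqrt(dx * dx + dy * dy) < 50;  // assumed threshold
    }

    // One pass of the checks draw() performed each frame.
    static State step(State state,
                      float kittenX, float kittenY,
                      float birdX, float birdY,
                      float cageX, float cageY,
                      float windowX, float windowY) {
        switch (state) {
            case KITTEN_CHASING_BIRD:
                if (near(kittenX, kittenY, birdX, birdY)) return State.KITTEN_CAUGHT_BIRD; // game over
                if (near(birdX, birdY, cageX, cageY))     return State.BIRD_IN_CAGE;       // next stage
                return state;
            case BIRD_IN_CAGE:
                if (near(cageX, cageY, windowX, windowY)) return State.BIRD_SAVED;         // final scene
                return state;
            default:
                return state;  // end states: nothing more to check
        }
    }
}
```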
The final scene, where the bird flies into the sunset, calls on another class, Bird, which stores a position and whether the wings are up or down. There are also various functions for drawing a Bird object. This code came from an earlier prototype in which we had multiple birds flying around the screen; the story for our sketch only called for one bird, so that is what we used.
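The Bird class as described stores just a position and a wing flag; toggling that flag each frame is enough to make the wings flap as the bird drifts away. A minimal sketch, with all field and method names assumed:

```java
// Sketch of the Bird class from the earlier prototype: a position plus
// whether the wings are up or down, toggled each frame so the drawing
// code can alternate between two wing images.
public class Bird {
    float x, y;
    boolean wingsUp = true;

    Bird(float x, float y) { this.x = x; this.y = y; }

    // Advance one frame: drift by (dx, dy) towards the sunset and flap.
    void update(float dx, float dy) {
        x += dx;
        y += dy;
        wingsUp = !wingsUp;  // alternate wing images to animate flight
    }
}
```

The earlier prototype flew many of these around the screen at once; the final sketch simply keeps a single instance.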