Monday 31 May 2010

Experience Design II Show at Penrhyn Road

A feast for the senses, featuring:
Scarface AR by Nick Irons

Painting with Wii by Jaspreet Deusi, Nico Stromnes and Matt Steggles

Jelly Bellyicious by Siobhan Sawyer, Schinel Outerbridge and Angela Boodoo

Set the Bird Free by Jeff Townsend and Jamie Buckley

Thursday 27 May 2010

Technical specification - making it all come together


As we said earlier, we discovered that we could not get movies (at least not the way we made them) to function in Processing, because QuickTime was the only video format supported. This was a problem: although one half of the team was using a Mac, where this would all work, the other half was using a PC. After investigation we found that QuickTime was not supported on Linux but was supported on Windows. However, the Bluetooth driver that worked with the Wii Remotes required Microsoft's Bluetooth stack, not the Toshiba stack installed on the PC. In the end we re-shot the whole thing as stills, then cut out the images to make them transparent PNGs. This gave us greater flexibility when importing the series of images into Processing, for example:







We then concentrated on loading all our image elements into Processing to build up the scene.


The Processing sketch would have four different states, reflecting the four possible threads through the story.

  • KITTEN_CHASING_BIRD
  • BIRD_IN_CAGE
  • KITTEN_CAUGHT_BIRD
  • BIRD_SAVED
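
As a rough illustration (in plain Java rather than a full Processing sketch — the transition helper and field names are hypothetical, only the state names come from the list above), the four states and one transition between them might look like this:

```java
// Illustrative sketch only: the state names mirror the list above, but the
// transition helper is hypothetical, not the actual sketch code.
enum StoryState {
    KITTEN_CHASING_BIRD,  // opening scene: steer the bird, dodge the kitten
    KITTEN_CAUGHT_BIRD,   // the kitten pounced - the screen turns red
    BIRD_IN_CAGE,         // bird safely caged, heading for the open window
    BIRD_SAVED            // final scene: the bird flies into the sunset
}

public class StateDemo {
    static StoryState state = StoryState.KITTEN_CHASING_BIRD;

    // Advance the story when the bird reaches the cage
    static void birdReachedCage() {
        if (state == StoryState.KITTEN_CHASING_BIRD) {
            state = StoryState.BIRD_IN_CAGE;
        }
    }
}
```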

The setup() method was used to initialise all the variables required during the sketch, and also to load all the images and sounds that would be used.
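
A small plain-Java sketch of the frame-loading side of that step (the "birdcage" prefix and the frame count are made up for illustration; in the real sketch each filename would be handed to Processing's loadImage()):

```java
// Hypothetical helper echoing what our setup() did: build the list of
// still-frame filenames to load. The naming scheme is illustrative only.
public class SetupDemo {
    static String[] frameNames(String prefix, int count) {
        String[] names = new String[count];
        for (int i = 0; i < count; i++) {
            names[i] = prefix + i + ".png";  // e.g. birdcage0.png, birdcage1.png...
        }
        return names;
    }
}
```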


During development it became clear that a number of the "actors" in our story had similar properties and abilities. For example, the bird and the cage were each assigned to a Wii Remote and could be moved independently. To reflect this, we defined a class, "Actor", that encapsulated the properties common to all of them.


An "Actor" had:

  • a position on-screen
  • an optional Wii remote assigned to it

An "Actor" was able to:

  • detect whether it was near another Actor
  • chase another actor
  • adjust its position on input from the Wii remote assigned to it
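
A plain-Java approximation of that common Actor behaviour might look like the following (a sketch under assumptions: the real class lived inside Processing and read live Wii Remote data, which is reduced here to plain fields and method arguments):

```java
// Hypothetical reconstruction of the shared "Actor" behaviour described
// above. The method names and the distance maths are illustrative.
public class Actor {
    float x, y;            // position on-screen
    Integer wiimoteId;     // optional Wii Remote assigned to it (null = none)

    Actor(float x, float y) { this.x = x; this.y = y; }

    // Detect whether this Actor is within `range` pixels of another Actor
    boolean isNear(Actor other, float range) {
        float dx = x - other.x, dy = y - other.y;
        return dx * dx + dy * dy < range * range;
    }

    // Chase another Actor: take a small step towards it each frame
    void chase(Actor target, float speed) {
        float dx = target.x - x, dy = target.y - y;
        float d = (float) Math.sqrt(dx * dx + dy * dy);
        if (d > 0) { x += speed * dx / d; y += speed * dy / d; }
    }

    // Adjust position on (stubbed) input from the assigned Wii Remote
    void moveBy(float dx, float dy) { x += dx; y += dy; }
}
```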

We then identified two variants of these actors: Static and Animated. A Static Actor was represented by a single image, whereas an Animated Actor was represented by a continuously cycled set of images. The Animated Actor was our way of making Processing animate without using the QuickTime library that, as mentioned above, was not supported on all platforms.
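
The frame-cycling trick that replaced QuickTime video can be sketched like this (the frames are stand-in strings here; in the sketch itself they were loaded images):

```java
// Hypothetical sketch of an Animated Actor's image cycling: step through
// the transparent PNG frames on every draw() call, wrapping back to the
// first frame so the animation loops.
public class AnimatedActor {
    final String[] frames;  // stand-ins for the loaded image frames
    int current = 0;

    AnimatedActor(String[] frames) { this.frames = frames; }

    // Return the frame to display this tick, then advance to the next one
    String nextFrame() {
        String f = frames[current];
        current = (current + 1) % frames.length;  // wrap around to loop
        return f;
    }
}
```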


Once we had put these classes in place, the Processing draw() function was simply a matter of detecting whether different Actors were near each other and altering the state of the sketch to match. For example: if the kitten is near the bird, the game ends; if the bird is near the cage, go to the next stage; if the bird in the cage is near the window, go on to the final scene.
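
Condensed into plain Java (the method and its boolean arguments are hypothetical — the real code tested Actor positions directly inside draw()), those state changes read roughly as:

```java
// Illustrative condensation of the draw()-time logic: check which Actors
// are near each other and move the story on. States are plain strings here.
public class DrawLogic {
    // Return the next state given the current one and the proximity tests
    static String step(String state, boolean kittenNearBird,
                       boolean birdNearCage, boolean cageNearWindow) {
        if (state.equals("KITTEN_CHASING_BIRD")) {
            if (kittenNearBird) return "KITTEN_CAUGHT_BIRD"; // screen turns red
            if (birdNearCage)   return "BIRD_IN_CAGE";       // next stage
        } else if (state.equals("BIRD_IN_CAGE") && cageNearWindow) {
            return "BIRD_SAVED";                             // final scene
        }
        return state; // nothing happened this frame
    }
}
```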


The final scene, where the bird flies into the sunset, calls on another class, Bird, which stores a position and whether the wings are up or down, along with various functions for drawing a Bird object. This code came from an earlier prototype in which we had multiple birds flying around the screen; however, the story for our sketch only called for one bird, so that is what we used.
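
A minimal sketch of that Bird class, assuming only what the paragraph describes (a position, a wings-up/down flag, and some way of flapping — the flap() method and its arguments are invented for illustration):

```java
// Hypothetical sketch of the prototype's Bird class: a position plus a
// wings-up/down flag that flips on each flap, which is all the final
// fly-into-the-sunset scene needs.
public class Bird {
    float x, y;
    boolean wingsUp = true;

    Bird(float x, float y) { this.x = x; this.y = y; }

    // One animation step: drift towards the sunset and flap the wings
    void flap(float dx, float dy) {
        x += dx;
        y += dy;
        wingsUp = !wingsUp;  // alternate between the two wing poses
    }
}
```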

Narrative - what's the point?


To advance the narrative we thought it would be good to involve another "player", and that is when Jeff struck upon the idea of using Wii Remotes. It was now a question of developing the story to give the players some kind of purpose, a reason to be engaging with the bird cage.


First, we separated the two images – one of the bird, the other of the cage – and gave control of each image to its own Wii Remote. Then Jeff struck upon the genius idea of introducing the villain of the piece – the Killer Kitten! The aim was to get the bird into the cage while avoiding being caught by the kitten, introducing an element of gameplay and requiring some skill. The instructions will be published on a card next to the finished piece, as below:



If you managed to get the bird into the cage before the kitten "pounced" on it, the bird would be safe inside, out of harm's way. (If not, the screen turns red!)


We would then move the scene on to a room where you, as the remote controller, could release the bird. We did this by including an open window in the scene.




The player(s) are then rewarded by seeing the bird fly off into the sunset!



So we have set up a potential conflict, heightened the risk, then built in a resolution.


A problem that led to a solution ...

Well, let's just say the test shoot didn't go as well as planned!

I booked time in the green screen studio in order to film the moving image of the bird in the cage, and saved that as a movie file. We were going to isolate the moving image by screening out the green background, and import our scene as a continuous movie into Processing. The idea was that you could speed up or slow down the spinning motion in accordance with how fast you flicked your mouse.


Then we made several realisations, which all seemed to stem from one another, and the whole concept changed and found more solid ground. The original concept was to create a little "game" that required very little skill, no brain power and would last "just a minute", hence the blog's title. The idea was that you would be able to suspend whichever screen-based task you were engaged in – e-mails, spreadsheets, cost revisions, etc – and just allow your brain to do nothing but stare at a nice image on the screen that you were able to control.

But we soon realised that we could do more with the idea. Here the development process splits in two: the narrative and the technical specification.

We also discovered that we could not get movies (at least not the way we made them) to function in Processing (plus they weren't running at the right frame rate, and the action of the twisting bird/cage was too quick). So we chucked out the green screen movie, re-shot the whole scene as much sharper stills, and cut out the background to isolate the bird/cage images. This gave us greater flexibility and control when importing the images into Processing.


Tuesday 25 May 2010

Final scene pics


Image 1 - with hungry-looking dogs

Image 2 - tantalising glimpse of the outside!

Image 3 - might be the wrong ratio, but like the outside view

Image 4 - a bit strange but very exotic

Image 5 - just right, except that it's a bathroom and I'd have to pay for it!

Wednesday 17 March 2010

Welcome!

I was stuck for inspiration for this module, but then I saw Tim Burton's Sleepy Hollow the other week, and Johnny Depp's character has this optical illusion/plaything which he carries to remind him of his childhood. This is how it works ...



I will be discussing the different ways to create this: whether it is best as a screen-based app, whether it would work on a hand-held device, or whether you could make the image spin using a motion sensor. Such things will become evident through our iterative research, so I think we (that is, myself and Jeff) will have to create a working prototype to put into testing.