Assignment I – Through a Child’s Eyes

An AR experience built with Argon, demonstrating image tracking and 3D augmentation

URL : http://cc.gatech.edu/~msati3/argon/basic2.html

Markers: http://www.cc.gatech.edu/~msati3/argon/Virtual%20Environments%20Assignment%20I.pdf

Youtube: http://youtu.be/dg8zVCJZCOk


Description of the Experience:

I invite you to see the world through the eyes of a child, sharing her bewilderment and fantasy in what seems, to us adults, a drab world. In this environment, I augment a child’s drawing to conjure up the world she dreamt of when she put paint to canvas. A surreal, peaceful place indeed. The augmentation places a synthetic house over the drawing, while the markers control the natural forces of sun and snow; use them to see how the child imagines herself building snowmen as she eagerly awaits Santa’s arrival in the scene she drew. You can also watch the day pass by in her imagination, the shadows lengthening and then shortening again.
Also, since I present a widescreen experience, and the drawing of a house is best viewed when held vertically, I have left in an Easter egg that only comes into play when the drawing is placed vertically and the device is in landscape view.


Initial idea:

My initial idea was essentially the experience that exists now, except that the interaction marker would rotate the augmented house rather than light it up (dropped for lack of time), and there would be an audio accompaniment.


Current Experience:

The current experience consists of:

1) Augmenting a COLLADA model of a house on the image target.
2) Handling transient occlusion by persisting the state of the image target’s augmentations, using a high value of autoHideAfterFrames.
3) Variable-LOD content: from a distance, only the CSSObject is visible, akin to a map marker; on getting nearer, the full COLLADA model comes into view.
4) Displaying a snowman on detection of a trigger frame marker.
5) Lighting the environment appropriately via the interaction marker, which serves as the sun.
6) A particle-system-based snowfall simulation when the snowman is visible and the world orientation of the device is vertical landscape.
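The logic behind items 2 and 3 above can be sketched as follows. This is an illustrative, framework-independent sketch: the function names and the distance threshold are mine, not Argon’s, and only autoHideAfterFrames corresponds to a setting the experience actually uses.

```javascript
// Hypothetical values; the real experience tunes these differently.
const AUTO_HIDE_AFTER_FRAMES = 120; // high value -> augmentation survives brief occlusion
const LOD_SWITCH_DISTANCE = 2.0;    // meters; illustrative cutoff for the LOD switch

// Item 2: keep the augmentation visible for a while after tracking is lost,
// so transient occlusions (a hand passing over the marker) do not hide it.
function makeVisibilityTracker() {
  let framesSinceSeen = Infinity; // never seen yet
  return {
    // Call once per frame; returns whether the augmentation should be shown.
    update(targetDetected) {
      framesSinceSeen = targetDetected ? 0 : framesSinceSeen + 1;
      return framesSinceSeen <= AUTO_HIDE_AFTER_FRAMES;
    },
  };
}

// Item 3: distance-based LOD. Far away -> lightweight CSS "map marker";
// close up -> the full COLLADA model.
function selectLOD(distanceMeters) {
  return distanceMeters > LOD_SWITCH_DISTANCE ? 'css-marker' : 'collada-model';
}
```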


Walkthrough:

Use the latest build of Argon to go to http://cc.gatech.edu/~msati3/argon/basic2.html. First, point the device at the natural image marker; it will be augmented appropriately. Move farther away while keeping the marker in view to see the switch between the CSS content and the 3D content. Next, use the snowman marker to add an additional (surprise, surprise) snowman. It sits beside the house and hides along with it. Now bring the sun marker into view. This will a) increase the overall brightness of the scene and b) act as a spotlight that lights up the regions of the scene you are looking at. Finally, remove the sun marker, bring the snow marker and the image into view, place the image vertically, and hold the device in landscape. This triggers the particle snow, and brings both the walkthrough and the experience to an end. Hope you enjoyed it!
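The Easter-egg trigger at the end of the walkthrough boils down to a three-way condition check. A minimal sketch of that predicate, with illustrative state-field names (these are not Argon APIs):

```javascript
// The particle snow fires only when all three conditions from the
// walkthrough hold at once. Field names are hypothetical.
function shouldTriggerSnow(state) {
  return state.snowmanVisible &&
         state.deviceOrientation === 'landscape' &&
         state.targetOrientation === 'vertical';
}
```

In the real experience the device orientation would come from the platform’s orientation sensors, and the target orientation from the tracked pose of the image marker.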


Artwork Credits: The house model and the picture of the child were found online; the rest is original artwork.


Technical Considerations: A parser within a parser. The work also includes a half-baked declarative Argon augmentation XML parser, which can synthesize the appropriate AR experience from an XML description of it. The framework also lets clients hook into events such as onMarkerVisible, allowing them to create scriptable experiences easily.
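The client-hook mechanism described above can be sketched as a small event registry: clients register callbacks for named events such as onMarkerVisible, and the parser’s runtime fires them as tracking events arrive. This registry is illustrative; it is not the actual parser’s API.

```javascript
// Minimal event-hook registry: maps an event name to a list of callbacks.
function makeHookRegistry() {
  const hooks = {};
  return {
    // Register a client callback for an event such as 'onMarkerVisible'.
    on(event, callback) {
      (hooks[event] = hooks[event] || []).push(callback);
    },
    // Called by the runtime when a tracking event occurs.
    fire(event, payload) {
      (hooks[event] || []).forEach((cb) => cb(payload));
    },
  };
}

// Usage: script the experience without touching the parser internals.
const registry = makeHookRegistry();
registry.on('onMarkerVisible', (marker) => {
  console.log('marker visible:', marker);
});
registry.fire('onMarkerVisible', 'sun'); // prints "marker visible: sun"
```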