AR Graffiti on Sony Vita

Check out the video of the graffiti game for the PS Vita. You would want to do more than this (conceptually) in your project!

http://www.gamerevolution.com/news/tg-ar-game-for-vita-lets-you-do-graffiti-with-risk-of-jail-time-11069

Himango 4D Concert

AR experiences are not confined to personal screens. When augmented reality is used at a larger scale, such as in a stage performance, it can bring an immersive blend of the virtual and the real to an audience. Here’s an example I would like to share because of its immersive experience with multiple viewers. The Himango Concert, performed at the National Theater in South Korea, is an experimental performance that combines music, dance, and augmented reality. In this show, augmented reality (starting at 1:30) is combined with 4D display technology to create a striking mixed-reality visual for the audience. The virtual pattern created by AR reacts to the dancer’s elegant movements in real time, flying around her to create an aurora effect. Augmented reality adds interactivity to the performance, which can give viewers unexpected excitement.
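
For anyone curious how such a reactive pattern might be driven, here is a minimal sketch of a particle trail that follows a tracked dancer position. The actual show’s system is certainly more sophisticated; the tracker, spawn rates, drift speed, and lifetimes below are all invented for illustration.

```python
import random

# Minimal sketch (not the show's actual system) of an AR "aurora" trail that
# reacts to a performer's movement. Assumes some external tracker supplies the
# dancer's 2D stage position each frame; all numeric values are illustrative.

class AuroraTrail:
    def __init__(self, max_particles=200):
        self.particles = []                 # each particle: [x, y, seconds_left]
        self.max_particles = max_particles

    def update(self, dancer_xy, dt):
        # Spawn a few particles around the dancer so the pattern follows her.
        x, y = dancer_xy
        for _ in range(5):
            self.particles.append([x + random.uniform(-0.2, 0.2),
                                   y + random.uniform(-0.2, 0.2),
                                   1.5])
        # Drift the particles upward and expire the old ones.
        for p in self.particles:
            p[1] += 0.3 * dt
            p[2] -= dt
        self.particles = [p for p in self.particles if p[2] > 0][-self.max_particles:]

# Example: feed tracked positions in at ~30 fps and render self.particles each frame.
trail = AuroraTrail()
trail.update((1.0, 0.5), 1 / 30)
```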

AR Experience Report: Articulated Naturality Web

QderoPateo Communications (QPC) introduced their concept of the future as the Articulated Naturality Web (ANW). What I find most compelling about their vision for outdoor AR is the way information is seamlessly integrated into the natural world (when possible), rather than floating in front of it. With the current state of technology, this kind of integration with the natural world is generally feasible only on a smaller scale. However, as outdoor tracking becomes more precise and 3D models of our outdoor environments become more available, this type of augmentation makes real sense for AR experiences that are meant to inform the user.

For instance, one of the biggest advantages of using natural surfaces when informing the user about their environment is the automatic correlation between semantic information and objects in the environment. This creates a more intuitive interface. Since both information about the object and the actual object are perceptually processed as one and the same, the user is freed from the burden of consciously associating one with the other. As seen in the video (2:04), a hotel with room availability portrayed as glowing rooms is a great example of how useful this concept can be.
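
As a thought experiment, here is a small sketch of that glowing-rooms idea: availability data gets attached to the rooms’ geometry rather than drawn as floating labels. The Room class, the glow colors, and the availability feed are all invented for illustration; a real system would drive a renderer’s materials instead of plain Python objects.

```python
# Toy sketch: attach semantic data (room availability) directly to the
# building's room geometry instead of floating labels. Everything here is
# hypothetical; "emissive" stands in for a material property in a scene graph.

AVAILABLE_GLOW = (0.2, 0.9, 0.3)   # green emissive tint for free rooms
NO_GLOW = (0.0, 0.0, 0.0)

class Room:
    def __init__(self, number):
        self.number = number
        self.emissive = NO_GLOW

def annotate_hotel(rooms, availability):
    """Tint each room's surface according to its availability flag."""
    for room in rooms:
        room.emissive = AVAILABLE_GLOW if availability.get(room.number) else NO_GLOW

rooms = [Room(n) for n in (101, 102, 103)]
annotate_hotel(rooms, {101: True, 103: True})
print([(r.number, r.emissive) for r in rooms])
```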

ANW is touted by QPC as “a complete renaissance in the way we approach technology”. Although ANW as portrayed in the video is certainly compelling (after getting past all the squinting from the lack of HMDs), I would argue that rather than being an entirely new way of thinking about technology, it is simply the next logical development for AR. This video inspires me to think carefully about the purposes of a particular interface, and to keep those purposes in mind to make it intuitive. When developing an AR experience that augments the world with many layers of information, making it easy for the user to process can be challenging. In such a scenario, doing as much as possible to merge the semantics with their corresponding objects may be a solution.

VOCALOID Hatsune Miku concert

When looking around for an AR project, I remembered a YouTube video I had watched of a live concert in which an animated character appeared on stage, with screens and other effects making it look like the character was actually there. It was a video of a live concert by Hatsune Miku, a character from the voice synthesizer program VOCALOID. The effect works through a twist on an old illusion called Pepper’s Ghost: the image is projected downward from an overhead projector onto a thin metalized film set at a 45-degree angle. The film reflects the image toward the audience while remaining essentially transparent wherever no image falls on it, which lets the character appear to walk or float around the stage. It also makes the character look 3D even though the image is actually only 2D. What made this so compelling was the idea that we are getting closer to having actual holograms we could watch for entertainment, and it allowed a character whose voice is pretty much the only concrete thing about her to give a live concert for fans to enjoy.
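
Out of curiosity, here is a tiny geometric sketch of why the 45-degree film works: reflecting a point of the overhead image across the film’s plane shows where the audience perceives it, upright behind the film. The coordinates below are made up purely for illustration.

```python
import numpy as np

# Quick geometric sketch of the Pepper's Ghost setup described above.
# Convention (invented for this example): z is up, +y is toward the audience.

def reflect_point(p, plane_point, plane_normal):
    """Mirror point p across the plane through plane_point with plane_normal."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return p - 2 * np.dot(p - plane_point, n) * n

film_point = np.array([0.0, 0.0, 1.0])     # a point on the 45-degree film
film_normal = np.array([0.0, 1.0, 1.0])    # halfway between "up" and "toward audience"

overhead_point = np.array([0.0, 0.0, 3.0]) # a point of the projected image above the film
print(reflect_point(overhead_point, film_point, film_normal))
# -> [ 0. -2.  1.]: the audience perceives that point behind the film, at stage height.
```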

Hatsune Miku Concert

Foxtrax: the Failure of the “Glowpuck”

One early example of AR being used in mainstream sports is the Foxtrax puck developed by the Fox network after it began airing National Hockey League games in 1994. The Foxtrax pucks contained electronics that emitted infrared pulses, which were detected by specialized equipment in the hockey arenas to compute the puck’s coordinates. Computer graphics could then be superimposed over the puck: the “glowpuck”, as it became known colloquially, appeared on one’s TV screen with a blue glow emanating from the puck. Fox implemented this in an effort to make the game of hockey easier to watch for casual or new fans, as focus groups had indicated that new hockey viewers had trouble following the puck. For additional style points, the puck would glow red when detected to be traveling in excess of 70 miles per hour, leaving a comet tail behind it.
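
The overlay rule itself is simple enough to sketch. The version below assumes the tracking hardware supplies the puck’s screen position and speed each frame, and it only returns a description of what to draw; the rendering details are left out.

```python
# Rough sketch of the glowpuck overlay rule described above. Inputs are assumed
# to come from the tracking system; outputs are just dictionaries describing
# the graphics to superimpose this frame.

SPEED_THRESHOLD_MPH = 70  # above this, the red glow and comet tail kick in

def puck_overlay(puck_xy, speed_mph, recent_positions):
    """Describe the graphics to superimpose over the tracked puck this frame."""
    if speed_mph > SPEED_THRESHOLD_MPH:
        return {"glow": {"pos": puck_xy, "color": "red"},
                "trail": {"points": list(recent_positions), "color": "red"}}
    return {"glow": {"pos": puck_xy, "color": "blue"}}

# Example: a hard shot gets the comet tail, a slow pass just glows blue.
print(puck_overlay((412, 230), 82.0, [(400, 228), (406, 229)]))
print(puck_overlay((412, 230), 35.0, []))
```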

While Fox reported success in attracting new viewers after implementing Foxtrax, there was a huge backlash among hockey’s more serious fanbase, who saw the visual additions as pointless distractions. Many claimed the graphics took away from the gritty nature of the game, turning it into something more akin to a video game. The technology became a running joke until ABC bought the NHL broadcast rights in 1998 and immediately returned to normal pucks.

While Foxtrax is often seen as a failure, I find it a compelling example of AR in that it did have some success in teaching newcomers to watch hockey; its failure was forcing this teaching tool upon those who didn’t need it. One can imagine a person trying to get into hockey by using a personal Foxtrax filter, then disabling it once they had acclimated to the speed of the game. Foxtrax can serve as a positive example of designing with a specific problem in mind, but also as a negative example of failing to consider one’s audience.

Touchable Holography

I found this after my first example, but I thought it was actually much more compelling, so I’m adding it. Including other senses besides vision in AR would almost certainly increase immersion and help to focus attention. For instance, an educational application combining vision, sound, and touch might be much more compelling than text, diagrams on a board, physical models, or vision alone. I know one “standard” application of this sort is chemistry (modeling molecules), but I’ve also heard of immersive water-flow simulations and math applications in VR. I would assume those types of applications could be transferred to AR (or a mix of AR/VR).

Studies have shown that attention is a large factor in memory and learning, and immersion or presence can affect attention.  So, presumably, anything which increases attention would improve learning and retention.  I’d be interested in pursuing immersive experiences that help students (especially, but not exclusively, special needs students) to improve the quality and quantity of their education.

 

AR T-shirt

To be honest, I had a hard time finding anything I thought was compelling at all. Most of the experiences I found were either game- or advertising-related, none of which I found interesting in the least. The best example I found was still game- and ad-related (it could be construed as a promo for the magazine), but I guess I thought it was silly for silly’s sake, and that made it less annoying. My impression from what I found is that there were lots of monetization attempts, and little creative or innovative thinking going into AR experiences, at least the ones that rank highly in Google searches or made it onto YouTube. I really thought I’d find more cool stuff…  I wouldn’t use any of the experiences I found in an AR project. I’d try to find something that was actually useful and not just a re-hash of what’s already been done. But I guess that’s hard.

AR Experience Report: Cyborg Vision

This is a video-based augmented reality app example. Looking through the camera of an iPhone or iPad, the user gets a view similar to the one in the movie The Terminator, as the app name implies. It uses face recognition technology: after capturing a person’s face, it compares the captured image with that person’s Facebook profile photo. If the two match, relevant information such as name, age, gender, and place of residence is shown around the person on the screen. But as seen in the video, the recognition process is a bit slow.
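
The matching step could look roughly like the sketch below, assuming some face embedding model (passed in as a function) and a cache of friends’ profile photos already converted to embedding vectors; the distance threshold is an arbitrary illustrative value, not anything from the app.

```python
import numpy as np

# Minimal sketch of the face-matching step described above. The embedding
# model and the profile-photo embeddings are assumed to exist already.

MATCH_THRESHOLD = 0.6  # arbitrary distance cutoff for "same person"

def identify(face_image, profile_embeddings, embed):
    """Return the name whose profile embedding best matches the captured face."""
    query = embed(face_image)
    best_name, best_dist = None, float("inf")
    for name, emb in profile_embeddings.items():
        dist = float(np.linalg.norm(query - emb))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < MATCH_THRESHOLD else None

# Toy usage with fake 3-D "embeddings" so the sketch runs end to end.
profiles = {"Alice": np.array([0.1, 0.9, 0.2]), "Bob": np.array([0.8, 0.1, 0.5])}
print(identify(None, profiles, lambda img: np.array([0.15, 0.85, 0.25])))  # -> Alice
```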

Though the dim red view can be distracting, I think this is still an excellent app in terms of the AR experience. Unlike the seemingly fancy AR games on the market, this is one with the potential to integrate into people’s daily lives. It is the kind of app you would use frequently once more advanced HMDs are developed. It saves us from embarrassing moments like forgetting a new acquaintance’s name and helps us know our friends better. It can even show a friend’s recent activity posted on Facebook. It is also a good example of meeting AR’s huge data requirements by retrieving information directly from commercial social networking sites.

“Follow Me” Augmented Reality Navigation

Route 66 Maps + Navigation is an Android app featuring a unique, real-time augmented reality tool referred to as “Follow Me.”  Follow Me relies on mobile augmented reality technology and on the metaphor of following other vehicles to provide real-time navigational feedback to motorists en route to their destination.  The tool overlays a 3D AR visualization of the navigational route on top of the mobile device’s live camera feed.  A three-dimensional model of a car positioned ahead of the user sits atop the path.  The model car relies on the user’s GPS data to follow the projected route, activating its own turn signals in advance of upcoming turns and exits.  This 3D route plus virtual leader helps the user anticipate the road ahead.
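
A simplified version of the “virtual leader” placement might look like the sketch below, with the route treated as a flat 2D polyline in metres; the lead and signalling distances are invented for illustration, not taken from the app.

```python
import math

# Sketch of the "virtual leader" idea: put the model car a fixed distance ahead
# of the user along the planned route, and start signalling before a turn.
# All distances and coordinates here are illustrative.

LEAD_DISTANCE_M = 30.0      # how far ahead of the user the virtual car drives
SIGNAL_DISTANCE_M = 50.0    # begin signalling this far before the turn

def point_along_route(route, distance):
    """Walk `distance` metres from the start of the route polyline."""
    remaining = distance
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        seg = math.hypot(x2 - x1, y2 - y1)
        if remaining <= seg:
            t = remaining / seg
            return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
        remaining -= seg
    return route[-1]

def should_signal(leader_dist_along_route, turn_dist_along_route):
    """True once the leader car is within signalling range of the next turn."""
    return 0.0 < turn_dist_along_route - leader_dist_along_route < SIGNAL_DISTANCE_M

# Example: 100 m straight, then a right turn.
route = [(0.0, 0.0), (0.0, 100.0), (80.0, 100.0)]
user_dist = 40.0                                              # user's progress along the route
print(point_along_route(route, user_dist + LEAD_DISTANCE_M))  # leader car at (0.0, 70.0)
print(should_signal(user_dist + LEAD_DISTANCE_M, 100.0))      # True: the turn is 30 m ahead
```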

Follow Me is a compelling example of a practical AR experience executed within the confines of current-generation mobile technology, and a welcome complement to the growing number of AR apps focused on non-practical, proof-of-concept tech demos.  Follow Me will serve as a useful model to consider in the design of my own navigation/map-related AR experiences.

AR Experience Report 1: Human Pacman

“Human Pacman” was a mixed reality take on the classic Pac-Man game by the Mixed Reality Lab at the National University of Singapore. The idea is that each player starts off at a physical location and moves along a grid that appears before them through their HMD. The main aim is to collect cookies, which appear in front of the players as augmented objects, while trying to evade ghosts, who are fellow players. A mix of sensors and devices, including but not limited to GPS tracking and HMDs, is used to bring the augmented reality experience to life.
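
The cookie-collection check at the heart of such a game could be as simple as the sketch below, assuming players report GPS latitude/longitude and each virtual cookie is pinned to a fixed coordinate; the pickup radius and the example coordinates are arbitrary choices for illustration.

```python
import math

# Sketch of a Human Pacman-style cookie pickup check based on GPS distance.
# The radius and coordinates are illustrative; a real system would also need
# to handle GPS noise and update each player's HMD view.

PICKUP_RADIUS_M = 3.0
EARTH_RADIUS_M = 6371000.0

def distance_m(a, b):
    """Approximate ground distance in metres between two (lat, lon) points (haversine)."""
    (lat1, lon1), (lat2, lon2) = a, b
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(h))

def collect_cookies(player_pos, cookies):
    """Remove and return every cookie within pickup range of the player."""
    collected = [c for c in cookies if distance_m(player_pos, c) <= PICKUP_RADIUS_M]
    cookies[:] = [c for c in cookies if c not in collected]
    return collected

cookies = [(1.29337, 103.77650), (1.29400, 103.77700)]
print(collect_cookies((1.293372, 103.776502), cookies))  # picks up only the nearby cookie
```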

I like this experience because of its novel use of participants in a game-like environment together with augmented reality. The idea also brings to life a classic game that most people have played before and gives it an augmented, real-life twist, which adds a cool factor. It inspires me to create AR/MR applications that bring to life games people have previously played in an ordinary setting, such as on a computer or as a board game. I am also inspired to think about how this kind of game-mixed-reality example could be done without the need for heavy equipment. Another inspiration this project gives me is how ubiquitous the entire experience is and should be.