Argon Paper: Summary

The authors argue that we have never really had a unified immersive AR application environment, and that this gap needed to be addressed. They draw analogies to the days when 2D desktop window systems and 3D displays were first coming along, and map those lessons onto the AR world. Advances in mobile technology, both software and hardware, provided a good starting point, especially with everything gradually becoming cloud-based. The three goals laid down were: (1) an immersive AR window system, (2) building on existing mobile technology, and (3) creating an ecosystem that supports both easy and sophisticated authoring of applications.

They then discuss previous work on AR and how AR can convey extra information, as in the case of ultrasound imagery, noting that many good ideas and attempts came before the supporting ecosystem had developed.

The architecture of Argon consists of three main parts: how Argon integrates with the web (four approaches are discussed), the internal architecture of Argon (WebKit for rendering, JavaScript APIs for scripting, and the bridge between native and interpreted code), and the KARML markup language (which covers how KML has been enhanced for AR and other display-related markup). KARML also lets authors refine position tracking through Camera and LookAt elements, just as in KML, and allows the user to replace the live video at a location with a panorama.

The last part of the paper discusses various applications. The Server-less AR Mashups example is a simple application that uses Yahoo Pipes to create mashups. The Webservice-based Searches example uses the Twitter webservice and displays the resulting JSON through the Argon Public API, dynamically creating placemarks (a sketch of this idea follows below). An AR greeting card application demonstrates rapid server-based AR development, and further examples cover Region Monitoring, GeoSpot Tracking Override, and blending 2D interfaces.
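To make the webservice-based search pattern concrete, here is a minimal TypeScript sketch of the idea: query a webservice, then turn each geotagged JSON result into an AR placemark through a scripting bridge. The argon object, its addPlacemark() signature, the result shape, and the search URL are all assumptions for illustration; they are not the actual Argon Public API from the paper.

// Sketch only: the argon bridge and addPlacemark() are hypothetical stand-ins.
interface Placemark {
  name: string;
  latitude: number;
  longitude: number;
  balloonHtml: string; // HTML shown when the placemark is selected
}

// Stand-in for the scripting bridge into the AR scene.
declare const argon: { addPlacemark(p: Placemark): void };

async function showSearchResults(query: string): Promise<void> {
  // Fetch geotagged results from some webservice (Twitter in the paper's demo).
  const response = await fetch(`https://example.com/search?q=${encodeURIComponent(query)}`);
  const results: Array<{ text: string; lat: number; lon: number }> = await response.json();

  // Dynamically create one placemark per geotagged result.
  for (const r of results) {
    argon.addPlacemark({
      name: query,
      latitude: r.lat,
      longitude: r.lon,
      balloonHtml: `<p>${r.text}</p>`,
    });
  }
}

The point of the pattern is that the AR content never lives on a dedicated server: a generic webservice response is reshaped into placemarks entirely in client-side script.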
