
Weekly Paper Summaries

Weekly summaries of paper readings

Week 5 Summaries

KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera This paper describes a real-time 3D reconstruction and interaction system called KinectFusion, which is built on the Kinect sensor. The system provides 3D scene reconstruction as well as interaction in real time. 3D models in the scene are reconstructed from the data captured by the Kinect […]

Week 5 Summary

Kinect Fusion Kinect Fusion is a new software backend for the existing Kinect camera produced by Microsoft. At its base level the software is a sophisticated 3D mapping tool. It uses the Kinect system's dual cameras to build a 3D model of anything the camera sees in real time. The backbone of this […]

week 5 summaries [Aurelien Bonnafont]

KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera This article deals with the KinectFusion system, which recreates real 3D objects quickly and at low cost. Compared to other systems, KinectFusion supports both real-time tracking and reconstruction, is faster and more accurate, requires no infrastructure, and allows interaction with objects […]

[Summaries Week 5]: Kinect Fusion; Going Out

KinectFusion: Real-time 3D reconstruction and interaction using a moving depth camera In their paper Kinect Fusion, Izadi et al. describe their implementation of a system that uses a standard Kinect camera to generate a real-time, interactive 3D representation of the live camera feed that is robust to user intervention/interaction. They also explain their extensions to […]

week 5 summary – Hitesh

Kinect Fusion Kinect Fusion is one of the latest technological advancements in AR. It uses depth data from the Kinect sensor to track 3D pose and constructs a 3D representation of the object in real time. It promises to be a cost-effective and seamless augmentation of the real world with 3D physical data. As opposed […]

Week 5 Summaries

Week 5 KinectFusion KinectFusion is an awesome tool for 3D surface reconstruction using the Kinect camera. It works in real-time, reconstructing and storing all the depth information it gets. The depth maps are a little noisy but there are optimizations to overcome those issues. Depth cameras have been around for a while but Kinect made […]

Week 5 Summaries

KinectFusion The Kinect is now a widely used and cost-effective RGB-D sensor, and many researchers are working with it nowadays. KinectFusion is one of the 3D reconstruction works based on the Kinect's depth sensor. As a depth sensor the Kinect's cost is compelling, but its performance is not as compelling compared to other depth cameras, since the Kinect's […]

Week 5 Summaries

Going Out: Robust Model-based Tracking for Outdoor Augmented Reality: The paper presents an augmented reality system that provides real-time overlays on a handheld device. Traditional augmented reality systems rely on GPS for outdoor position measurements, and on magnetic compasses and inertial sensors for orientation. In urban outdoor environments GPS is hindered by buildings and signal reflections. […]

Week 5 Summary

KinectFusion KinectFusion, the next step in the Kinect evolution, provides a tool that uses depth information from the Kinect camera to rapidly construct a model of a room as the camera is moved through it. The Kinect camera uses structured-light techniques to gather a point cloud of data for a scene, […]

Summary Week4

Kinect Fusion The paper presents a novel interactive reconstruction system called the Kinect Fusion. It takes live depth data using a moving Kinect camera and then recreates a 3D model of the scene. They also propose a novel GPU pipeline that allows for accurate camera tracking and surface reconstruction in real time. The authors highlight […]
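
The camera-tracking part of that pipeline boils down to rigidly aligning each incoming depth frame to the model built so far. As a much-simplified sketch (the paper uses point-to-plane ICP on the GPU; this toy version solves the point-to-point problem with known correspondences, and every name in it is ours, not the paper's):

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares R, t with R @ src_i + t ~= dst_i (Kabsch/Procrustes).
    KinectFusion solves a related per-frame alignment, but with
    point-to-plane ICP on the GPU rather than this closed form."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_mean).T @ (dst - dst_mean)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Recover a known 2D rotation and translation from matched points.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([1.0, -2.0])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 3.0]])
dst = src @ R_true.T + t_true
R, t = rigid_align(src, dst)
```

With known correspondences the alignment has this closed-form SVD solution; real ICP alternates between guessing correspondences and solving exactly this subproblem.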

Week 5 Summaries: Kinect Fusion and Going Out

KinectFusion: Realtime 3D Reconstruction and Interaction Using a Moving Depth Camera This paper describes the currently trending technology KinectFusion. KinectFusion creates 3D reconstructions of an indoor scene in real time using just the depth data. However, the depth maps are noisy and contain numerous “holes”, which are dealt with by continuously tracking the 6DOF […]
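
The fusion step that smooths that noise can be pictured as a per-voxel running average. Below is a toy 1-D sketch of the weighted averaging KinectFusion applies to its truncated signed distance function (TSDF) volume — constants and names are illustrative only, and truncation is omitted for brevity:

```python
import numpy as np

def fuse(tsdf, weights, measurement, max_weight=64.0):
    """Blend one new signed-distance measurement into the stored volume:
    a weighted running average, with the weight capped so old data can
    eventually be displaced."""
    fused = (tsdf * weights + measurement) / (weights + 1.0)
    return fused, np.minimum(weights + 1.0, max_weight)

tsdf = np.zeros(5)          # 1-D toy "volume" of five voxels
weights = np.zeros(5)
true_surface = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])  # signed distances
rng = np.random.default_rng(0)
for _ in range(200):        # many noisy frames average toward the truth
    noisy = true_surface + rng.normal(0.0, 0.1, size=5)
    tsdf, weights = fuse(tsdf, weights, noisy)
```

Each voxel converges to the mean of its measurements, which is how per-frame sensor noise and holes get averaged away over time.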

Week 5 KinectFusion

KinectFusion is a very powerful tool that can help users create detailed indoor 3D reconstructions rapidly. Even though the concept of a depth camera is not new in this area, the Kinect made depth sensing popular due to its low cost and real-time performance. This system allows people to use a handheld Kinect camera to move within a […]

Week 5 Summary : KinectFusion and Going Out

KinectFusion: Realtime 3D Reconstruction and Interaction Using a Moving Depth Camera This paper discusses one of the most recent and most talked-about technologies: KinectFusion. It reconstructs a 3D model of an object or environment using the data it receives from the Kinect sensor. The depth data from the Kinect is used to track the 3D […]

Going Out with KinectFusion

KinectFusion: Realtime 3D Reconstruction and Interaction Using a Moving Depth Camera -Shahram Izadi, David Kim, Otmar Hilliges, David Molyneaux, Richard Newcombe, Pushmeet Kohli, Jamie Shotton, Steve Hodges, Dustin Freeman, Andrew Davison, Andrew Fitzgibbon Simply amazing. In this phenomenal paper Izadi and colleagues describe some novel approaches to 3D scene reconstruction and interactions with the scene […]

KinectFusion and Going Out

KinectFusion: Real-Time 3D Reconstruction and Interaction Using a Moving Depth Camera KinectFusion uses Microsoft's Kinect device to create real-time 3D reconstructions of indoor scenes using only the camera's depth data. The Kinect camera generates point clouds from which a mesh is generated. To avoid holes and decrease the noise of […]

[week 5 summaries]

KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera KinectFusion is a system that supports high-quality, geometrically accurate 3D model reconstruction in real time. All it needs is a depth map generated from a Kinect camera. The camera is held by the user and moved with 6 DOF, which provides the data to compose a viewpoint […]

Ruge’s summary of “KinectFusion” and “Going Out”

The KinectFusion paper gave a detailed description of a 3D mapping technology based on the Kinect hardware. It provided use cases, explanations of existing hardware, and how the product would be used in an operational sense. Beyond that, it captured the mathematical and computer-programming fundamentals that are pivotal to both its operation and […]

NavShoe paper summary

Tracking users with systems like GPS has many useful applications. However, such systems suffer from issues like inaccuracy in exact positioning and uncertainty about the exact path the user has followed or will follow. The paper discusses an approach to pedestrian tracking using shoe-mounted inertial sensors. The NavShoe concept performs robust tracking with […]

Summary for week 4

Pedestrian Tracking with Shoe-Mounted Inertial Sensors InterSense developed NavShoe, which can track a user wearing a shoe fitted with a small inertial sensor, in contrast to previous devices that required a specially prepared environment. Thus, the location and the orientation of the user can be known. The usual problem is the horizontal acceleration […]

Week 4 Summary

Pedestrian Tracking Tracking a person is a challenging problem, especially in outdoor environments that have had no previous preparation. This capability would prove very useful for things like search and rescue, as well as tracking emergency-response workers (like firefighters). The NavShoe is a concept attempting to address this problem by using a small motion sensor […]

[Summary Week4] Pedestrian tracking with Shoe Mounted Inertial sensors – Eric Foxlin

In this paper, the author describes NavShoe, a novel shoe-mounted inertial sensor system that is capable of tracking 6-DOF in an un-instrumented environment. The proposed system builds on the prevalent MEMS gyroscopes that report 3-DOF (orientation in space) by also providing highly accurate position estimates. Un-instrumented inertial tracking is made feasible by the identification of alternating stationary and moving […]

Week 4 NavShoe

In this paper, the author introduces a navigation system called NavShoe, developed by InterSense. Currently, orientation tracking is not very challenging because Earth's gravity and geomagnetic field give sensors a reliable reference. However, position tracking is not that simple: it has to rely on radio-navigation aids or some other infrastructure. The appearance of NavShoe […]
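
To illustrate why orientation is the easy part: a static accelerometer measures the gravity vector, from which roll and pitch follow in closed form (a magnetometer then supplies yaw). A minimal sketch, with axis conventions assumed by us rather than taken from the paper:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Roll and pitch (radians) from a static accelerometer reading,
    using the common x-forward, y-right, z-down body-axis convention."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Device lying flat: gravity appears along +z only, so both angles are 0.
roll, pitch = tilt_from_gravity(0.0, 0.0, 9.81)
```

Position has no such absolute reference — it must come from double-integrating acceleration or from external aids, which is exactly the gap NavShoe addresses.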

Week4 Summary

The author starts the paper by stating the need for tracking with shoe-mounted sensors. The main issue with traditional tracking frameworks is that they require an instrumented environment to achieve tracking with reasonable error. To track effectively without prepared instruments, the author came up with an inertial tracking system called NavShoe. […]

Week4_Summary

NavShoe is a position tracking system, small enough to tuck into shoelaces. It is much more accurate than a head-mounted inertial orientation tracker because the higher foot acceleration enables the use of transfer alignment from GPS. The InterSense InertiaCube3 was used to get multi-sensor data from a gyroscope, accelerometer, and magnetometer. Calibration of sensors is […]

Week 4 Summary

The paper describes a navigation system, NavShoe, that tracks the location of a person with a device small enough to be tucked under the shoelaces of a shoe. A system like this would be extremely useful in many different applications, such as emergency response. Two categories of experiments were done to test the […]

Week 4 – NavShoe

The paper highlights the problems faced by modern-day navigation and tracking systems. It then introduces NavShoe, a device that provides accurate tracking as well as orientation feedback. The major hurdle faced by such trackers is maintaining accuracy in both orientation and position. Position tracking is based on inertial sensing, but it is […]

Ruge’s Summary of Foxlin’s Shoe-Mounted Sensors

Pedestrian Tracking with Shoe-Mounted Inertial Sensors This article discusses the procedures and uses for a proprietary location sensor. The sensor is installed in or on a user's shoe, and utilizes foot movement and orientation combined with various other sensors not only to obtain the user's current location, but to track the operator's movement as well. This is […]

Week 4 – Pedestrian Tracking with Shoe-Mounted Inertial Sensors

The author proposes a tracking technology for real-world applications of MR. Computer vision algorithms are being looked at as the ultimate solution to the tracking problem, but they involve a lot of complexity and are not mature enough to succeed alone. The other alternative is inertial sensors. He […]

Week 4 Summary

Pedestrian Tracking with Shoe-Mounted Inertial Sensors The paper describes the NavShoe system, suitable for position tracking based on inertial sensing. Real-world deployment of location sensors previously required an instrumented or marked environment. The NavShoe system overcomes this problem by enabling position tracking without preparing any indoor or outdoor setting. It basically uses a miniature […]

[week 4 summaries]

Pedestrian Tracking with Shoe-Mounted Inertial Sensors There are many scenarios in which navigation that tracks a person's location is useful: some are life-saving (locating firefighters for rescue), some are assistive (personal navigation), and some are for entertainment (AR applications). The author puts forward a pedestrian tracking system, NavShoe […]

Week 4 Summary

Pedestrian Tracking with Shoe-Mounted Inertial Sensors The paper describes NavShoe, a pedestrian navigation system which tracks the location of a person on foot. NavShoe provides real-time location data for individuals in GPS-denied areas without the need for pre-existing infrastructure. The system makes use of a wireless inertial sensor which is small enough to fit into […]

week 4 summary Bo Pang

Pedestrian Tracking with Shoe-Mounted Inertial Sensors This paper introduces a shoe-mounted 3D tracking system called NavShoe, developed by Eric Foxlin and colleagues at InterSense. NavShoe is built on inertial sensing technology. The advantage of this system is that it provides relatively precise position and orientation tracking without any infrastructure installation. The system is also […]

week 4 summary – Hitesh

The paper highlights some of the existing issues with location-based tracking in real-world MR and wearable systems: they require the objects in the environment to be pre-mapped or marked, which keeps them from being extensible and robust. Computer vision tracking poses a lot of challenges and requires complex algorithms and […]

Week 4 Summary

NavShoe is a new kind of tracking technology used to track the motion of a user in the real world. The basic design is a small device with various sensors placed in the shoe of the user. The device is GPS-capable but by no means requires GPS to function. It combines a few different […]

week 4 summary Aurelien Bonnafont

Current tracking systems cannot determine a user's real position without preparing the environment with markers or instruments. The NavShoe device overcomes this problem by relying on inertial sensing. NavShoe corrects velocity error by applying a zero-velocity update in an EKF navigation error corrector whenever a foot is in its stance phase. A […]
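
The stance-phase correction described above can be illustrated with a toy zero-velocity update (ZUPT): while the foot is planted its true velocity is zero, so any velocity the integrator has accumulated is pure drift and can be cancelled. This sketch is ours — Foxlin's EKF corrects a full navigation error state rather than zeroing velocity directly — and the threshold and names are hypothetical:

```python
# Toy 1-D zero-velocity update (ZUPT) for a foot-mounted sensor.
GYRO_STANCE_THRESH = 0.1  # rad/s; foot assumed stationary below this

def integrate_with_zupt(accels, gyro_mags, dt):
    """Integrate acceleration into velocity, cancelling drift at stance."""
    velocity = 0.0
    velocities = []
    for a, w in zip(accels, gyro_mags):
        velocity += a * dt                  # naive integration accumulates drift
        if w < GYRO_STANCE_THRESH:          # stance detected: true velocity is 0
            velocity = 0.0                  # zero-velocity update cancels drift
        velocities.append(velocity)
    return velocities

# A constant accelerometer bias of 0.2 m/s^2 would drift forever,
# but is cancelled the moment the foot plants (last sample).
biased_accels = [0.2] * 10
gyro_swing_then_stance = [1.0] * 9 + [0.0]
v = integrate_with_zupt(biased_accels, gyro_swing_then_stance, dt=0.01)
```

Because a walking foot plants on every stride, drift can only accumulate for a fraction of a second at a time, which is what makes un-instrumented inertial tracking feasible.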

Week 4 Pedestrian Tracking by gr8dhage

Pedestrian Tracking with Shoe-Mounted Inertial Sensors -Eric Foxlin In this paper Foxlin describes a 3D position tracking system not based on GPS but instead using inertial sensors. The system is compact enough to be able to fit on a shoe and provides reasonably accurate tracking for short distances. It can also work indoors. The major […]

Introduction to Argon

Summary of “The Argon AR Web Browser and Standards-based AR Application Environment” by Blair MacIntyre, et al. This paper from ISMAR 2011 is an introduction and explanation of Argon and the related underlying and supporting technologies. Argon is an ongoing project to implement AR exploiting the ubiquity of mobile phones (specifically, at least for now, […]

Week 3 Summary: One context – Multiple applications

The “app” paradigm that took the mobile industry by storm has brought on the concept of micro-applications that each fulfill one purpose. This is great in terms of maintaining focus, but there is very little interoperability between apps. I need to look for a restaurant, use a different app. I need to take a bus till […]

Argon – Paper Summary

After a short discussion of Augmented Reality (AR) and previous work in the field, the paper turns to web-based AR. One of the challenges for AR as a field is its hardware requirements. However, with the development of smartphones carrying hardware such as cameras, GPS, and orientation sensors, coupled with the ubiquity of […]

Week 3 Summary: Argon

In this paper, the authors introduce how they designed the Argon AR Web Browser and what its application environment is like. First of all, the requirement for success is defined: all AR content is presented in one unified AR application environment. Many so-called AR browsers are criticized because they fail to think about […]

week 3 summary on Argon – Hitesh

The paper describes the background study, motivations, and technology directions behind the development of the Argon web browser. It defines the key goals for creating immersive and interactive AR web browsing on mobile devices. I think it involved a lot of research and effort to not just develop an AR browser that supports Augmented […]

Week 3 Summary

Argon Argon is basically an AR design tool analogous to a windowing system for a desktop computer. Argon’s goal is to take advantage of the increasingly important mobile development platforms and their intersection with web capabilities. The three main goals for Argon are: 1. Create a “windowing system” for AR 2. Merge with and build […]

Week 3 Summary-ARGON

The Argon AR Web Browser and Standards-based AR Application Environment This paper describes the Argon AR web browser, the first AR browser for iOS that supports most existing web technologies and standards. The dream of AR is to create an immersive environment in which virtual content is superimposed over the user's view of the world. Dr. Blair […]

The Argon AR Web Browser and Standards-based AR Application Environment

The key focus of the paper is to introduce the Argon web browser as a stepping stone towards standardizing an environment for the development of mobile AR applications. The paper starts by introducing the world of AR and then proceeds to the three goals behind the Argon web browser: 1) creating a supportive AR application environment; 2) to […]

Argon

The paper presents the Argon AR web browser and the standards and technologies that go into designing applications for it. It gives us an introduction to how content can be displayed, as well as to the KARML markup language, which is an extension of KML. There has been related work done in this space by various […]

Argon Paper : Summary

The authors believe that we never really had a unified immersive AR application environment and thus it needed to be addressed. They draw analogies from the days when 2D desktops with 3D displays were coming along and map them to the AR world. The advancement of mobile technology, both software and hardware, was a good starting […]

Summary for week 3

The Argon AR Web Browser and Standards-based AR Application Environment Augmented Reality (AR) was pioneered in 1965 by Ivan Sutherland, but research in this field only began in earnest in the late 1980s. Now this technology should evolve toward AR desktops gathering independent applications in a single environment. That's why Argon, an AR web browser, is […]

Summary week 3

The Argon AR Web Browser and Standards-based AR Application Environment This paper presents the design and implementation of the Argon AR Web Browser, as well as the motivations and goals behind it. It first explains why the authors decided to build such an AR application environment on the mobile web: both mobile computing technology and mobile […]

Week 3 Summary

Argon Paper Argon is an augmented reality browser, and while it isn't the first of its kind, it has been engineered differently from previous attempts. The goal of Argon is the proliferation of AR technology and content. The overarching principle of Argon is to leverage existing web technologies to reduce development and content-generation time. […]

Argon summary week 3 [aurelien bonnafont]

Over the past few years, AR has been moving toward a vision in which one AR environment supports many pieces of AR content. This technology was difficult to implement in the past, but nowadays, with more capable mobile devices, some applications have been developed, though they still present problems. Argon is an application which aims to give high control, independence, […]