Week 12 Discussions

Here are the three papers for this discussion:

A Survey of Design Issues in Spatial Input

http://delivery.acm.org/10.1145/200000/192501/p213-hinckley.pdf?ip=128.61.69.11&acc=ACTIVE%20SERVICE&key=C2716FEBFA981EF16C75A17C38D3955A3C0C269F20590017&CFID=187123085&CFTOKEN=56661494&__acm__=1364965022_4af4fe5334e1e3cbf3c59c7301ae70ee

This paper surveys the main issues that arise with spatial (3D) input and the design solutions that can make users more comfortable and effective with it.


“Put-That-There”: Voice and Gesture at the Graphics Interface

http://delivery.acm.org/10.1145/810000/807503/p262-bolt.pdf?ip=128.61.69.11&acc=ACTIVE%20SERVICE&key=C2716FEBFA981EF16C75A17C38D3955A3C0C269F20590017&CFID=187123085&CFTOKEN=56661494&__acm__=1364980007_bd27c61ae3279b5bdc5e4d50948a722d

I chose this paper because it presents a very natural way to interact with a virtual environment, combining pointing gestures with speech recognition.
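To make the idea concrete, here is a minimal illustrative sketch (not the paper's actual implementation) of how a "Put-That-There" style system might resolve deictic words such as "that" and "there" by pairing each spoken word with the pointing position sampled at the moment it was uttered. The object names, positions, and timestamped inputs below are invented for the example.

```python
def nearest_object(point, objects):
    """Return the name of the object closest to the pointed-at position."""
    return min(objects, key=lambda name: (objects[name][0] - point[0]) ** 2 +
                                          (objects[name][1] - point[1]) ** 2)

def resolve_command(words, pointing, objects):
    """Turn timestamped words plus pointing samples into a move command.

    words:    list of (time, word) pairs from the speech recognizer
    pointing: dict mapping time -> (x, y) position the user is pointing at
    objects:  dict mapping object name -> (x, y) position on the display
    """
    target, destination = None, None
    for t, word in words:
        if word == "that":            # deictic reference to an object
            target = nearest_object(pointing[t], objects)
        elif word == "there":         # deictic reference to a location
            destination = pointing[t]
    return ("move", target, destination)

# Hypothetical input: "put that there", pointing first at the ship, then at (8, 2).
objects = {"ship": (1.0, 2.0), "square": (5.0, 5.0)}
words = [(0.0, "put"), (0.4, "that"), (0.9, "there")]
pointing = {0.0: (3.0, 3.0), 0.4: (1.1, 2.2), 0.9: (8.0, 2.0)}
print(resolve_command(words, pointing, objects))   # ('move', 'ship', (8.0, 2.0))
```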


Charade: Remote Control of Objects Using Free-Hand Gestures

http://delivery.acm.org/10.1145/160000/159562/p28-baudel.pdf?ip=128.61.69.11&acc=ACTIVE%20SERVICE&key=C2716FEBFA981EF16C75A17C38D3955A3C0C269F20590017&CFID=187123085&CFTOKEN=56661494&__acm__=1364980061_e9d36824756c3ea3a520e39a9c9c1e14

This paper is about hand gesture recognition in the context of presentations, and its findings provide useful guidance before building more complex gesture-based systems.
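As a rough illustration of this style of interaction (not the Charade implementation itself), a small free-hand gesture vocabulary can be mapped to presentation commands. The postures, motions, and command names below are assumptions made up for the sketch.

```python
GESTURE_COMMANDS = {
    # (hand posture at the start of the gesture, direction of the hand motion) -> command
    ("open_hand", "right"): "next_slide",
    ("open_hand", "left"):  "previous_slide",
    ("two_fingers", "up"):  "table_of_contents",
    ("fist", "down"):       "end_presentation",
}

def recognize(start_posture, motion):
    """Look up the command for a (posture, motion) pair; unknown gestures are ignored."""
    return GESTURE_COMMANDS.get((start_posture, motion))

# Hypothetical stream of segmented gestures, e.g. from a data glove.
for posture, motion in [("open_hand", "right"), ("fist", "down"), ("open_hand", "up")]:
    command = recognize(posture, motion)
    if command is not None:
        print(command)    # next_slide, end_presentation
```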


Here are five questions related to them:

1 – Would audio feedback be useful for understanding an environment?

2 – How could eye tracking be used to give clues about a virtual world or to interact with it?

3 – What other issues and solutions exist for spatial input?

4 – What other interaction techniques could be used?

5 – How could we overcome the disadvantages of these technologies?
