Wednesday, August 5, 2009

Capstone and Progress

In brief, this blog will document my progress through my capstone project, completed in partial fulfillment of my Master of Science degree in Information Technology.

The Vision
As a child playing Contra on my Sega, I wondered what it would be like to be totally immersed in the game. Would it feel like an episode straight out of the television show Soldier of Fortune? Pondering these thoughts, I went on living life, always asking myself if there was something more to technology than just pretty colors and sound.

What would it be like if Walt Disney's production of Fantasia had never been released? Would we still be experiencing monophonic sound productions? Fantasia has, in my opinion, given rise to the immersion of an audience in multiple channels of sound output, which we now refer to as surround sound. (If you have a pair of headphones handy, take a listen to this 3D holophonic sound immersion.)

By the time I was tackling Contra, stereophonic sound had already been developed, but sound alone was not enough to impress me; I needed more. I had almost forgotten about the whole idea of user immersion in virtual reality until I came across the Logitech G25 steering wheel. It was a dream come true, or so I thought. The wheel featured force feedback unmatched by any of its competition. This was the turning point in my life: I knew this was a field I had to get involved in.

Upon starting school at RIT, I decided to pursue this and arrived at the idea of building a device that provides tactile feedback, with the hope of immersing an audience entirely in a virtual or augmented reality without the need for any specific input devices (a mouse, keyboard, etc.).

Similar Projects
Various related projects are listed under the "Inspirations" tab to the right.

Methodology

The data glove has both a software and a hardware aspect to it. The hardware comprises a glove affixed with tiny pager motors and infrared light-emitting diodes (LEDs). These are connected to an Arduino microcontroller, which is hooked up to a computer.

The data glove's software is written in Processing, an open-source programming language and environment based on Java.
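To give a flavor of what that looks like, here is a minimal, hedged Processing sketch for driving the glove's pager motors through the Arduino over a serial link. The one-byte protocol (0 for off, 1-255 for vibration strength), the 9600 baud rate, and the choice of the first listed serial port are assumptions for illustration, not the project's actual wiring.

```java
import processing.serial.*;

Serial arduino;  // serial link to the Arduino driving the pager motors

void setup() {
  size(200, 200);
  // Assumption: the Arduino is the first serial port, running at 9600 baud.
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  background(0);
}

// Send a vibration level to the glove: 0 = off, 255 = full strength.
// Assumes the Arduino firmware reads one byte and sets the motors via PWM.
void setVibration(int strength) {
  arduino.write(constrain(strength, 0, 255));
}
```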

Apart from the glove, two webcams are involved as part of the hardware. They sit at right angles to each other, facing the user. The cameras track the glove by means of the infrared LEDs. The software uses the LEDs as a pointer and relates them to a position in a 3D virtual world. Two cameras are required to capture the user's position in all three dimensions.
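A sketch of how that tracking could look in Processing is shown below. It assumes the Processing video library's Capture class, that the two webcams are the first two capture devices listed, and that the infrared LED shows up as the brightest pixel in each frame; the front camera supplies X and Y, while the side camera supplies depth.

```java
import processing.video.*;

Capture frontCam, sideCam;
PVector glovePos = new PVector();  // tracked glove position in pixel space

void setup() {
  size(320, 240);
  String[] cams = Capture.list();
  // Assumption: camera 0 faces the user, camera 1 views from the side.
  frontCam = new Capture(this, 320, 240, cams[0]);
  sideCam  = new Capture(this, 320, 240, cams[1]);
  frontCam.start();
  sideCam.start();
}

void draw() {
  if (frontCam.available()) frontCam.read();
  if (sideCam.available())  sideCam.read();

  PVector front = brightestPixel(frontCam);  // x, y from the front view
  PVector side  = brightestPixel(sideCam);   // its x maps to depth
  glovePos.set(front.x, front.y, side.x);

  image(frontCam, 0, 0);
  ellipse(front.x, front.y, 10, 10);         // mark the tracked LED
}

// Return the location of the brightest pixel in a frame,
// on the assumption that the IR LED dominates the image.
PVector brightestPixel(PImage img) {
  img.loadPixels();
  int bestIndex = 0;
  float bestVal = -1;
  for (int i = 0; i < img.pixels.length; i++) {
    float b = brightness(img.pixels[i]);
    if (b > bestVal) { bestVal = b; bestIndex = i; }
  }
  return new PVector(bestIndex % img.width, bestIndex / img.width);
}
```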

The virtual world consists of primitives (cubes and spheres), and the camera in the virtual world moves in relation to the user's hand movement. If the user encounters an obstacle, his or her hand feels a vibration. Users are able to distinguish the size of objects by touch alone and, as a work in progress, will be able to feel a sense of shape as well.
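Putting the pieces together, a hedged sketch of that feedback loop might look like the following. It reuses the glovePos vector and setVibration() helper from the sketches above, assumes a single cube-shaped obstacle, and uses hand-picked mapping ranges; rendering the primitives and moving the virtual camera are left out for brevity.

```java
PVector obstacle = new PVector(0, 0, 0);  // a cube centered at the origin
float cubeSize = 100;                     // assumed obstacle size in world units

void checkCollision() {
  // Map camera-pixel coordinates into assumed virtual-world units.
  float wx = map(glovePos.x, 0, 320, -200, 200);
  float wy = map(glovePos.y, 0, 240, -200, 200);
  float wz = map(glovePos.z, 0, 320, -200, 200);

  // Simple axis-aligned test: is the hand inside the cube?
  boolean inside =
      abs(wx - obstacle.x) < cubeSize / 2 &&
      abs(wy - obstacle.y) < cubeSize / 2 &&
      abs(wz - obstacle.z) < cubeSize / 2;

  // Vibrate on contact, stop otherwise.
  setVibration(inside ? 200 : 0);
}
```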
