01/24/11

 

Using Motion Capture System for Research

[Image: person climbing stairs]

Currently, Shawnee State University's Motion Capture Lab has been used for computer animation, gaming, and real-time interaction. The lab was installed last year, and in that time it has been used for two videos, one game, and five short interactive demos. However, there is one use that is always mentioned while showing off the room but currently has nothing to demonstrate it: research. This project intends to legitimize the Motion Capture Lab as a tool for advanced motion analysis beyond the arts and gaming. It will hopefully interest degree programs beyond SGET and SGDA in using the motion capture technologies, as the room is very underutilized.


This project will consist of three main parts: Capture, Visualization, and Export. Since no complete tool set specific to the PhaseSpace system exists, a framework needs to be made. The current generic capture program, "Master," allows capture in PhaseSpace's own file format and C3D, but it is missing many needed features and is not very user friendly. To look at the data in overview, a secondary tool must be used to read and parse the files before any quantifiable data can be derived from them.
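As a sketch of what the shared framework might pass between the three tools, a single timestamped marker reading could look like the following. The field names here are assumptions for illustration, not the actual PhaseSpace or C3D record layout:

```python
from dataclasses import dataclass

@dataclass
class MarkerSample:
    """One timestamped reading from a single PhaseSpace LED marker.
    Field names are illustrative; the real file formats differ."""
    marker_id: int    # PhaseSpace markers carry unique IDs
    t: float          # capture time in seconds
    x: float          # position in meters
    y: float
    z: float
    confidence: float # tracking quality, 0..1

# A trial is then just a list of MarkerSample, which the capture,
# visualization, and export tools can all consume.
sample = MarkerSample(marker_id=3, t=0.0, x=1.0, y=2.0, z=0.5, confidence=0.9)
```

Keeping one small record type at the center would let the three programs evolve independently while staying compatible.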

Capture will demonstrate the PhaseSpace system's key features: unique IDs per marker and speed. Capturing the data requires three bands of markers: one on the head, one on the torso, and one on the hand. These three rigid bodies will actively track the person while they move and will be fairly easy to wear and set up. To show off the system's ability to capture data in near real time, the goal of the program is for a subject to navigate a hedge maze using sound to denote the boundaries. Sound will be used instead of visuals because it is difficult to deliver live video to a person wirelessly; sound can be produced in many different ways and still convey the information without expensive equipment.
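One simple way the boundary sound could work is to map the subject's distance to the nearest maze wall onto a warning volume: silent in the open, loudest at the wall. The sketch below assumes walls are stored as 2D line segments in meters and that positions arrive as (x, y) pairs; both are illustrative choices, not the system's actual representation:

```python
import math

# Hypothetical maze walls as 2D line segments ((x1, y1), (x2, y2)) in meters.
WALLS = [((0.0, 0.0), (0.0, 5.0)),
         ((0.0, 5.0), (3.0, 5.0))]

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def warning_volume(position, max_dist=1.0):
    """Map distance to the nearest wall onto a 0..1 warning volume:
    0 beyond max_dist, 1 when touching a wall."""
    d = min(point_segment_distance(position, a, b) for a, b in WALLS)
    return max(0.0, 1.0 - d / max_dist)
```

The returned volume could then drive a speaker tone each frame; stereo panning toward the nearest wall would be a natural extension.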

The Visualization program will be developed almost in parallel with the Capture program. It will allow users to see the data live as it is being recorded and, optionally, to compare it to previous runs. The program also serves a dual purpose: the positioning and sound output of the Capture application will be tweaked based on analysis done in the Visualization program, so the two programs will iteratively improve each other through their own requirements. Visualization will include multiple ways of viewing the data, such as velocity, time, and sample averages. It will also be able to adjust the data if there was a bias from the subject's height or an off-center start, keeping the data consistent.
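The height and off-center-start adjustment could be as simple as recentering each trial on its first sample and scaling vertically to a common reference height. This is a minimal sketch under those assumptions (positions as (x, y, z) tuples in meters, z up; the 1.7 m reference is an arbitrary illustrative value):

```python
def normalize_trial(samples, reference_height=1.7):
    """Make trials comparable across subjects and starting positions.
    samples: list of (x, y, z) positions in meters, z up.
    Recenters so the first sample's horizontal position is the origin,
    then scales vertically so the tallest point matches reference_height."""
    x0, y0, _ = samples[0]
    subject_height = max(z for _, _, z in samples)
    scale = reference_height / subject_height if subject_height else 1.0
    return [(x - x0, y - y0, z * scale) for x, y, z in samples]
```

With every trial normalized this way, overlays and sample averages compare like with like instead of mixing subject heights and start offsets.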

The last part is Export. Having a means of seeing the data is a huge step, but anyone who wants to use the system will likely want the data formatted in a way they can pipe into existing software. The Export part will take the finalized data and write it out in popular formats such as Excel.
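A minimal export step could write the finalized samples as a CSV file, which Excel opens directly. The column names and (time, x, y, z) row shape below are assumptions for illustration:

```python
import csv

def export_csv(samples, path):
    """Write (t, x, y, z) tuples as a CSV file Excel can open.
    Column names are illustrative placeholders."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "x_m", "y_m", "z_m"])
        writer.writerows(samples)
```

More structured targets (native .xlsx, or C3D for biomechanics packages) could be added later behind the same function signature.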

The overall goal of this project is to demonstrate that the Motion Capture Lab can be used by degree programs beyond Gaming and 3D Animation. Math, physics, and health majors could benefit from the room, but there is currently nothing to demonstrate its professional usage. Since the tools are being developed and used within a single semester, the focus is on creating the tools rather than on trying to answer a question about how alternative sensory input affects the subject. This should keep the project reasonable for a single person in a semester.