Ogre 3D / PhaseSpace: Project MDI

Motion Driven Interaction

Spring Semester 2011

Developers: Dan Morrison, Sam Bushman

Time Period for Development: Spring Semester 2011


The OGRE 3D/PhaseSpace Project: Project MDI intends to use Shawnee State University's motion capture hardware to create an interesting and fun interactive demo that showcases the PhaseSpace hardware and the SGET program to the public at large. Several key steps must be completed first. A general API for communicating with the motion capture hardware is needed as a foundation on which to build further software. Features of OGRE 3D (the intended 3D renderer) must be added and tested incrementally to ensure that animations and objects render and move correctly in a 3D environment. Finally, presentations must be scripted and tested to confirm that all necessary details are included, that each presentation is technically feasible, and that it is interesting to a wide audience.

Once these steps are complete, the resulting product will have several additional uses, including providing a code base for future student projects, integrating the PhaseSpace hardware with the 3D modeling/animation program Blender, and even game creation. Because the project is developed entirely with free/open-source software, other PhaseSpace customers can use this software and, at the same time, experience the technical expertise of Shawnee State University's students. The advertising, technical, and entertainment benefits of this project make it an attractive and challenging one for existing students, with no further technology purchases required.

Objectives of the project:

  • Build an extensible PhaseSpace wrapper for Ogre3D or general use

  • Real-time movement of an Ogre3D model

  • Real-time interaction with an Ogre3D environment

  • Create a presentation for a public audience

  • Create a professional/portfolio presentation


  • Provide an interesting and interactive alternative use for the motion capture hardware beyond simply recording animations.

  • Provide resources for future adaptation/utilization of the motion capture hardware.

  • Show technical competence in utilizing production-grade hardware in a real-time simulation environment.

  • Promote the SGET program at Shawnee State University.

  • Gain experience using the PhaseSpace motion capture system.
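The first objective, an extensible PhaseSpace wrapper, could take the shape of a small hardware-agnostic interface that the rendering code depends on. The sketch below is an assumption for illustration, not the project's actual API: `MotionTracker`, `MarkerSample`, and `StubTracker` are hypothetical names, and a real implementation would wrap the PhaseSpace client library behind this interface.

```cpp
#include <map>
#include <string>

// One tracked LED marker: id plus position and visibility.
struct MarkerSample {
    int   id;
    float x, y, z;
    bool  visible;   // false while the marker is occluded
};

// Hardware-agnostic tracker interface.  Ogre-side code depends only
// on this class, so the real PhaseSpace client, a file-playback
// stub, or a network proxy can be swapped in behind it.
class MotionTracker {
public:
    virtual ~MotionTracker() {}
    virtual bool connect(const std::string& server) = 0;
    // Fill `out` with the latest frame; return false on failure.
    virtual bool poll(std::map<int, MarkerSample>& out) = 0;
    virtual void disconnect() = 0;
};

// Trivial stub implementation, useful for development away from the
// capture hardware (and for automated tests).
class StubTracker : public MotionTracker {
public:
    bool connect(const std::string&) override { connected_ = true; return true; }
    bool poll(std::map<int, MarkerSample>& out) override {
        if (!connected_) return false;
        out[0] = MarkerSample{0, 0.1f, 1.5f, 0.0f, true};
        return true;
    }
    void disconnect() override { connected_ = false; }
private:
    bool connected_ = false;
};
```

Demo code (the Ken Doll demo, the VR demo) would then be written against `MotionTracker` only, which also keeps the wrapper useful outside Ogre, as the objective asks.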

Milestones for spring semester:

  1. Create humanoid rigids
    1. Create rigids dynamically on the fly
    2. Dump all created rigids to a file
    3. Load rigids from a simple, human-readable file
  2. Ken Doll Demo
    1. Load a file that defines rigids and associates each with a mesh, offset, rotation, and scale
  3. Skeletal mesh demo
  4. Full VR demo using a heads-up display (Fat Shark wireless video glasses have been acquired)
    1. Add a means of attaching markers to the video glasses
    2. Alter the Cityscape demo to use the new rigid layout
  5. Reliability improvements
    1. Create a small buffer of all points to provide backup data during a temporary occlusion
    2. Using a 2-3 frame reference, predict a likely position for the missing marker (at 240 Hz, movement between frames should be small)
    3. Better handling of run-time mode switching
  6. Inverse Kinematics
    1. Using fewer markers and constraints, replicate the skeletal mesh demo
  7. Another evaluation of API
    1. Clean out test/debug variables
    2. Document new features
  8. "Project Godzilla"
    1. Using ODE physics, make an interactive, destructible environment
    2. Have two modes:
      1. Full mode - full-suit demo using a humanoid character
      2. Lite mode - minimal marker layout (headband, belt, and gloves) using a robotic character
        1. Quick to set up; intended for audience participation
    3. Destruction earns a score, making the demo a mini-game
      1. Add tanks and missiles that attack the subject
      2. Add big explosions
  9. Port the mathematics to a general-purpose library to remove the dependency on Ogre Math
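Milestones 1.3 and 2.1 call for loading rigids from a human-readable file that associates each rigid with a mesh, offset, rotation, and scale. As a sketch only (the project's real file format and names are not specified here), a minimal whitespace-separated format could use one line per rigid, `name mesh ox oy oz yaw pitch roll scale`, with `#` for comments:

```cpp
#include <istream>
#include <sstream>
#include <string>
#include <vector>

// One rigid definition: the rigid's name, the mesh attached to it,
// and the offset/rotation/scale used when attaching that mesh.
// All names here (RigidDef, parseRigidLine, loadRigids) are
// hypothetical, chosen for this illustration.
struct RigidDef {
    std::string name;
    std::string mesh;
    float ox, oy, oz;          // offset
    float yaw, pitch, roll;    // rotation in degrees
    float scale;
};

// Parse one whitespace-separated line:
//   name mesh ox oy oz yaw pitch roll scale
// Lines that are empty or start with '#' are skipped.
bool parseRigidLine(const std::string& line, RigidDef& out) {
    if (line.empty() || line[0] == '#') return false;
    std::istringstream in(line);
    return static_cast<bool>(in >> out.name >> out.mesh
                                >> out.ox >> out.oy >> out.oz
                                >> out.yaw >> out.pitch >> out.roll
                                >> out.scale);
}

// Load every rigid definition from a stream (e.g. an open file).
std::vector<RigidDef> loadRigids(std::istream& in) {
    std::vector<RigidDef> defs;
    std::string line;
    RigidDef def;
    while (std::getline(in, line))
        if (parseRigidLine(line, def)) defs.push_back(def);
    return defs;
}
```

A plain-text format like this satisfies both milestone requirements at once: the same lines that `loadRigids` reads are easy to dump back out and easy to edit by hand.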
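The reliability milestone (5.1-5.2) describes buffering a few frames per marker and predicting the position of an occluded marker from that short history. One way to realize it, sketched here under assumed names (`Vec3`, `OcclusionFilter` are not from the project's code), is a three-sample window with first-order linear extrapolation, which is usually adequate for brief occlusions given the small inter-frame motion at 240 Hz:

```cpp
#include <deque>

struct Vec3 { float x, y, z; };

// Keep a short history of recent positions for one marker; when the
// marker drops out for a frame or two, extrapolate linearly from the
// last two good samples: p_next = 2*p_t - p_(t-1).
class OcclusionFilter {
public:
    // Record a confirmed (visible) sample, keeping only the last 3.
    void push(const Vec3& p) {
        history_.push_back(p);
        if (history_.size() > 3) history_.pop_front();
    }
    // Predict the next position from the two most recent samples.
    Vec3 predict() const {
        if (history_.empty()) return Vec3{0.0f, 0.0f, 0.0f};
        if (history_.size() < 2) return history_.back();
        const Vec3& a = history_[history_.size() - 2];
        const Vec3& b = history_.back();
        return Vec3{2*b.x - a.x, 2*b.y - a.y, 2*b.z - a.z};
    }
private:
    std::deque<Vec3> history_;
};
```

While a marker is occluded, the demo would substitute `predict()` for the missing data; once the marker reappears, real samples are pushed again and the prediction resets naturally.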