Phasespace API Phase 1

The following is the proposal Dan Morrison and I wrote for our motion capture research project, Project MDI. The milestones for Phase 1 have been completed, and we are currently building on Phase 1's success in the second phase of Project MDI. For those who are curious, this proposal has been kept for archival purposes. For the latest updates on Project MDI, please visit the Phasespace API Phase 2 link in the Projects tab.

OGRE 3D / Phase Space: Project MDI

Motion Driven Interaction

Phase 1 – Fall Semester 2010

Developers: Dan Morrison, Sam Bushman

Time Period for Development: Fall Semester 2010


The "OGRE 3D/Phase Space Project" intends to utilize Shawnee State University's motion capture hardware to create an interesting and fun interactive demo for showcasing the Phase Space hardware and the SGET program to the public at large. To do this, several key challenges must first be overcome. A general API for communicating with the motion capture hardware is needed as a foundation on which to build software. Various features of OGRE 3D (the intended 3D renderer) must be incrementally added and tested to ensure that animations and objects render and move correctly in a 3D environment. Finally, presentations must be scripted and rehearsed to ensure that all necessary details are included, that the presentation is technically feasible, and that it is interesting to a wide audience.

Once these steps have been completed, the resulting product will also have several alternative uses, including providing a code base for future student projects, allowing for integration of the Phase Space hardware with the 3D modeling/animation program Blender, and even game creation. Because this project is being developed entirely with free/open-source software, other Phase Space customers may utilize it and at the same time experience the technical expertise of the students of Shawnee State University. The advertising, technical, and entertainment benefits of this project make it an attractive and challenging one for existing students without the need for further technology purchases.

Objectives of project:

  • Build an extensible Phase Space wrapper for OGRE 3D or general use
  • Move an OGRE 3D model in real time
  • Interact with an OGRE 3D environment in real time
  • Create a presentation for a public audience
  • Create a professional/portfolio presentation


  • Provide an interesting and interactive alternative use for the motion capture hardware beyond simply recording animations.
  • Provide resources for future adaptation/utilization of the motion capture hardware.
  • Show technical competence in utilizing production-grade hardware in a realtime simulation environment.
  • Promote the SGET program at Shawnee State University.
  • Gain experience using the Phase Space motion capture system.

Milestones for Phase 1:

  1. Build an application capable of dumping Phase Space data: a proof of concept for writing an application capable of accessing the Phase Space hardware.
    1. Application capable of accessing the Phase Space hardware and retrieving current data.
    2. Display data in a raw, uninterpreted format on the screen (text-based dump).
  2. Build an application capable of parsing Phase Space data: a proof of concept for properly understanding and interpreting raw capture data.
    1. Using the previous demo as a base, gather current data from the Phase Space system and parse the raw data.
    2. Store parsed data in logical data structures.
    3. Display stored data in a text-based format that expresses the organization of the data (i.e., a table of data).
  3. Use data to display a 3D representation of the nodes and other data in OGRE 3D: a proof of concept for utilizing parsed capture data in a realtime 3D application.
    1. Using the previous demo as a base, gather data from the Phase Space hardware, then parse and store it.
    2. Initialize a basic OGRE 3D application (3D rendering, window initialization, keyboard polling).
    3. Using the OGRE 3D application, display each node of the capture data as a point in 3D space.
    4. Update the 3D data and continue to display the updated data in the 3D environment.
  4. Design a Generic Phase Space Interaction API: generalize our techniques for interacting with the Phase Space hardware in a reusable way.
    1. Using knowledge gathered from previous milestones, design functions and data structures for an API capable of gathering up-to-date data from the Phase Space hardware and storing it.
    2. The API should perform all necessary error checking/handling on gathered data automatically and inform the user of any problems in a clean and tidy way.
    3. The API should be usable by other C++ applications for interacting with the Phase Space hardware.
    4. Design the API to allow for extension via new headers/libraries for future applications.
    5. Utilize free/open-source libraries where needed (non-viral licenses only).
  5. Make a demo using the API: utilize the new API and evaluate its usefulness.
    1. Utilizing the newly developed API, reproduce the "3D Dot" demo created in the third milestone.
    2. Evaluate the API to determine whether its design fits the project goals, is efficient, and is easy to use.
  6. Make a demo that shows simple orientation, a "fly swatter" demo (later changed to a Pong demo): provide an interactive demo showing our work in a way that is interesting to the general public.
    1. Using OGRE 3D, render a simple 3D environment that provides a backdrop of some sort (perhaps a skybox), a rendered representation of the user's avatar (a simple paddle), and a target that moves in the 3D environment (a spherical "fly").
    2. Update the rotation and position of the user's avatar based on the movement of a motion-captured prop in realtime.
    3. Update the world's camera based on movement of the user's prop.
    4. Using simple distance-based collision checking, award a point and remove the target from the 3D world when the user's avatar collides with it, then spawn a new target at a random location.

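The API milestone asks for automatic error handling and reuse from other C++ applications. One possible shape for that surface, sketched with standard C++ only; the class and method names here are invented for illustration and are not the actual Project MDI API:

```cpp
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Illustrative node record mirroring what the hardware would report.
struct Node { int id; float x, y, z; };

class CaptureSession {
public:
    // A caller-supplied error handler lets the API "inform the user of any
    // problems" (milestone 4.2) without forcing try/catch around every call.
    explicit CaptureSession(std::function<void(const std::string&)> onError)
        : onError_(std::move(onError)) {}

    // In the real API this would poll the Phase Space hardware; here the
    // caller supplies a frame so the sketch stays self-contained.
    bool update(const std::vector<Node>& rawFrame) {
        if (rawFrame.empty()) {
            onError_("empty frame received from capture source");
            return false;
        }
        latest_ = rawFrame;
        return true;
    }

    const std::vector<Node>& nodes() const { return latest_; }

private:
    std::function<void(const std::string&)> onError_;
    std::vector<Node> latest_;
};
```

The callback keeps the error-reporting policy in the hands of the calling application (log, pop a dialog, ignore), which fits the milestone's goal of an API that other C++ programs can adopt without inheriting a fixed reporting style.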
Milestone projections for spring semester will be determined over the winter break pending evaluation by interested parties.
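The distance-based collision check from the final milestone reduces to comparing the distance between two centers against the sum of the objects' radii; comparing squared values avoids a square root on every frame. A minimal sketch in standard C++ (the types and names are illustrative):

```cpp
// Minimal 3D vector for the sketch; a real build would use OGRE's Vector3.
struct Vec3 { float x, y, z; };

// Two spheres (the paddle's bounding sphere and the target) collide when
// the distance between their centers is less than the sum of their radii.
// Squared quantities are compared so no sqrt is needed per test.
bool spheresCollide(const Vec3& a, float ra, const Vec3& b, float rb) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float distSq = dx * dx + dy * dy + dz * dz;
    float radii = ra + rb;
    return distSq < radii * radii;
}
```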

Required Resources:

  • Access to Phase Space labs and resources
  • Access to the Phase Space source code/API
  • Data storage area
  • Web space for tutorials and tech demos
  • Software
    • OGRE 3D
    • Visual Studio 2008 or newer
    • Blender
    • OpenOffice 3.0 or newer
  • Large White board
  • Computer Projector
  • Weekly supply of Little Caesars 😛
  • Standard office supplies

Future applications/ extensions:

  • Create a generic "belt and arm" LED suit for demonstrating realtime demos. Unlike a full body suit, it would be designed for quick application and removal, consisting of a simple belt with LEDs along with a primitive arm sleeve with LEDs attached.
  • Make a demo allowing for interaction with a physics-driven OGRE 3D environment via the Phase Space motion capture system.
  • Allow gathered node data to influence positioning of a rigid body as well as smooth interpolation between gathered node positions in an OGRE 3D environment.
  • Build an application front-end for associating a node (or group of nodes) with an area on a 3D humanoid model in an OGRE 3D environment (i.e., node association with models in MotionBuilder).
  • Write a script (or series of scripts) to allow Blender to fully replace MotionBuilder as a 3D animation capture/cleanup/preview tool.
  • Purchase and integrate a wireless VR-type display for a user currently being motion captured (e.g., VR goggles, an LCD helmet). This would in effect give the user an independent display oriented relative to their current position in the capture area, so they never lose sight of the display.