P30: An open database of synchronized, high precision 3D motion capture data for human gait analysis research and development
Introduction
The number of recordings of human gait made with 3D motion capture (MOCAP) systems grows every day, yet few open databases of multi-view video time series are available [1]. The number of motion and gait capture systems is also increasing, covering both passive video-based systems and active body-worn sensors. In addition, systems are becoming more portable and easier to use, so data are sampled not only in the lab but also in daily life. Typically, each institution has its own system for capturing, storing and analysing data, which often restricts the possibility of comparing and exchanging these data. To help organize and enhance data assimilation, and to support testing of new systems and algorithms, a framework for a simple, expandable web-based database has been developed and populated with a first set of example data.
Section snippets
Research Question
How can a database containing motion capture data be constructed and organized to support research and development of gait analysis?
Methods
To provide a user-friendly database system with easy data input and searching, a relational database with a web interface has been implemented (see Figure below). The web interface makes the data accessible from any desktop platform, requiring only a web browser. To help organize the data, time series of raw data can also be stored directly in the database during capture. Data quality control is implemented to ensure that the data meet the required standards. A first example data set has
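The relational layout described above could be sketched as follows. This is a minimal illustration using SQLite; the table and column names (`subject`, `session`, `frame`, etc.) are assumptions for the sketch, not the actual NTNU schema:

```python
import sqlite3

# Illustrative relational schema for MOCAP gait data.
# Table/column names are assumptions, not the actual database design.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE subject (
    id  INTEGER PRIMARY KEY,
    age INTEGER,
    sex TEXT
);
CREATE TABLE session (
    id           INTEGER PRIMARY KEY,
    subject_id   INTEGER REFERENCES subject(id),
    capture_date TEXT,
    camera_count INTEGER
);
CREATE TABLE frame (
    id           INTEGER PRIMARY KEY,
    session_id   INTEGER REFERENCES session(id),
    camera_index INTEGER,
    timestamp_ms REAL,
    image_path   TEXT   -- raw multi-view image stored or referenced
);
""")

# Insert a subject and a capture session, then query them back.
conn.execute("INSERT INTO subject (age, sex) VALUES (?, ?)", (34, "F"))
conn.execute(
    "INSERT INTO session (subject_id, capture_date, camera_count) VALUES (?, ?, ?)",
    (1, "2010-05-01", 16),
)
row = conn.execute(
    "SELECT s.age, se.camera_count FROM subject s "
    "JOIN session se ON se.subject_id = s.id"
).fetchone()
print(row)  # (34, 16)
```

Keeping raw time series rows (`frame`) in the same database as subject metadata is what enables the quality-control checks and cross-subject searches mentioned above to run as ordinary queries.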
Results
A first version of the database system is available for testing with example data (multi-view synchronized images) from the proprietary camera system at the NTNU Lab. The functionality of the web interface can be adjusted and expanded to meet different user requirements. Currently, the database contains gait data from 20 healthy adults (aged 16-70 years), captured by 16 synchronized digital video cameras. Subjects wear passive reflective markers positioned according to the CMU guide
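One typical retrieval over such a data set is fetching all synchronized camera views at a given instant. The sketch below is hypothetical (the `frame` table, 25 fps timing, and path layout are assumptions for illustration), showing how a nearest-in-time query per camera might look:

```python
import sqlite3

# Hypothetical sketch: given a stored capture session with 16 synchronized
# cameras, fetch each camera's frame closest to a requested time.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE frame ("
    " session_id INTEGER, camera_index INTEGER,"
    " timestamp_ms REAL, image_path TEXT)"
)

# Populate two sample frame times for each of 16 cameras.
for cam in range(16):
    for t in (0.0, 40.0):  # 25 fps -> 40 ms between frames (assumed rate)
        conn.execute(
            "INSERT INTO frame VALUES (1, ?, ?, ?)",
            (cam, t, f"session1/cam{cam:02d}/t{int(t)}.png"),
        )

def views_at(session_id, t_ms):
    """Return one image path per camera, nearest in time to t_ms."""
    rows = conn.execute(
        "SELECT camera_index, image_path FROM frame "
        "WHERE session_id = ? "
        "ORDER BY ABS(timestamp_ms - ?), camera_index",
        (session_id, t_ms),
    ).fetchall()
    best = {}
    for cam, path in rows:
        best.setdefault(cam, path)  # first hit per camera is the nearest
    return [best[c] for c in sorted(best)]

paths = views_at(1, 10.0)
print(len(paths))  # 16
```

Because the cameras are hardware-synchronized, a single timestamp is enough to assemble a complete multi-view set, which is the access pattern a web interface for browsing gait trials would rely on.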
Discussion
Future work includes support for input of data from other MOCAP systems and from new sensor types, e.g. force/pressure sensor mats for gait analysis and wearable sensors such as inertial measurement units (IMUs). Furthermore, possible collaborations with other movement capture laboratories will be explored, as will expansion of the database functionality.
References (2)
- Gkalelis et al., 2009 Conf Vis Media...
- CMU marker placement guide:...