AnDyDataset

This page describes the dataset « AndyData-lab-onePerson », aka AnDyDataset, collected by team LARSEN at INRIA within the scope of the EU H2020 project AnDy.

What is AnDyDataset?

A dataset of human motions during industry-like manual activities, fully labeled according to the ergonomics assessment worksheet EAWS. The dataset contains the data of 13 participants, each performing several repetitions of various series of activities. Data are available in proprietary (sensor-specific), standard motion analysis, and CSV formats.
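Since part of the data is distributed as CSV, a file can be read with standard tools. The sketch below is a minimal, hypothetical example using Python's standard library; the column names (`time`, `segment`, `pos_x`, `pos_y`, `pos_z`) are illustrative assumptions, not the dataset's actual schema — check the files on Zenodo for the real headers.

```python
import csv
import io

# Hypothetical CSV excerpt; the real AnDyDataset files have their own schema.
sample = """time,segment,pos_x,pos_y,pos_z
0.000,Pelvis,0.01,0.02,0.98
0.017,Pelvis,0.01,0.02,0.97
"""

def load_frames(text):
    """Parse a CSV motion file into a list of dicts with typed fields."""
    frames = []
    for row in csv.DictReader(io.StringIO(text)):
        frames.append({
            "time": float(row["time"]),
            "segment": row["segment"],
            "pos": (float(row["pos_x"]),
                    float(row["pos_y"]),
                    float(row["pos_z"])),
        })
    return frames

frames = load_frames(sample)
print(len(frames), frames[0]["segment"])  # → 2 Pelvis
```

For real files, replace the in-memory string with `open(path)` and pass the file object to `csv.DictReader` directly.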

Which sensors?

  • Xsens MVN Link inertial motion capture system → whole-body kinematics from wearable sensors
  • Qualisys optical motion capture system → whole-body kinematics from external sensors
  • Emphasis Telematics sensorized glove → finger flexion, palm and fingertip pressure forces
  • 2 video cameras
  • Annotations of action and posture according to the EAWS postural grid

Which human activities?

  • Manipulating loads
  • Screwing in various postures
  • Walking

Where can I download it?

Zenodo: AndyData-lab-onePerson

Citation

Maurice P., Malaisé A., Amiot C., Paris N., Richard G.J., Rochel O., Ivaldi S. « Human Movement and Ergonomics: an Industry-Oriented Dataset for Collaborative Robotics ». The International Journal of Robotics Research, in press.

@article{maurice2019human,
    title={Human Movement and Ergonomics: an Industry-Oriented Dataset for Collaborative Robotics},
    author={Maurice, Pauline and Malais{\'e}, Adrien and Amiot, Cl{\'e}lie and Paris, Nicolas and Richard, Guy-Junior and Rochel, Olivier and Ivaldi, Serena},
    journal={The International Journal of Robotics Research},
    year={in-press},
    publisher={Sage Publications}
}

Papers that used it

Malaisé A., Maurice P., Colas F., Ivaldi S. « Activity Recognition for Ergonomics Assessment of Industrial Tasks with Automatic Feature Selection ». IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 1132-1139, 2019.

If you want your paper to be listed here, drop us an email.

Acknowledgments

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 731540 (An.Dy). The equipment used to create the dataset was partly funded by the CPER IT2MP of Région Grand-Est, France and INRIA. The authors would like to thank Xsens Technologies and Emphasis Telematics for the loan of their equipment used in the data collection.