DFG-Project “Transfer Learning for Human Activity Recognition in Logistics”

DFG project on activity recognition in logistics

In the age of Industry 4.0, manual activities remain dominant in the logistics sector. Detailed information on the occurrence and duration of human activities is crucial for enhancing warehousing efficiency and thus the entire supply chain. As manual assessment is economically impractical, methods of human activity recognition (HAR) gain relevance. HAR is a classification task that recognizes human movements from time series. It is already used in applications such as smart homes, rehabilitation and health support. HAR based on non-invasive, highly reliable on-body devices is of special relevance, as such devices extend HAR's potential to challenging scenarios. Training a classifier demands a large amount of data, as human movements are highly variable, in particular across the diverse environments of the logistics sector.
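To make the classification task concrete, the following minimal sketch segments a 3-axis accelerometer stream into overlapping fixed-length windows and trains a classical classifier on simple per-window statistics. It is illustrative only, not the project's pipeline; the data, sampling rate, window size and activity labels are synthetic assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 100   # samples per window (e.g. 1 s at an assumed 100 Hz)
STRIDE = 50    # 50 % overlap between consecutive windows

def sliding_windows(signal, labels, window=WINDOW, stride=STRIDE):
    """Cut a (T, channels) sensor stream into fixed-length windows.
    Each window inherits the majority label of its samples."""
    X, y = [], []
    for start in range(0, len(signal) - window + 1, stride):
        X.append(signal[start:start + window])
        y.append(np.bincount(labels[start:start + window]).argmax())
    return np.stack(X), np.array(y)

def features(windows):
    """Simple per-channel statistics as a classical feature baseline."""
    return np.concatenate(
        [windows.mean(axis=1), windows.std(axis=1),
         windows.min(axis=1), windows.max(axis=1)], axis=1)

# Synthetic stand-in for a 3-axis accelerometer stream with 3 activities.
rng = np.random.default_rng(0)
signal = rng.normal(size=(6000, 3))
labels = np.repeat([0, 1, 2], 2000)          # hypothetical activity IDs

X, y = sliding_windows(signal, labels)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features(X), y)                       # train the window classifier
print(clf.predict(features(X[:5])))           # per-window activity predictions
```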

The objective of this project is to develop a method that avoids the tremendous effort of creating and annotating high-quality on-body device data for HAR in logistics. Different logistics scenarios will be replicated in a reference field equipped with a highly accurate optical motion capturing (oMoCap) system. In this constrained environment, oMoCap and on-body device data will be captured synchronously. The combined oMoCap recordings of all scenarios constitute an oMoCap reference dataset. Methods of transfer and zero-shot learning will make it possible to utilize this reference dataset across scenarios. Methods of machine learning, especially deep learning, will be considered for processing the time series and for training a classifier on the oMoCap reference dataset. This classifier will allow for an automated annotation of the synchronized on-body device data. Furthermore, methods for creating additional synthetic data from raw oMoCap data will be considered. The performance of a classifier trained on the automatically annotated and synthetic on-body device data will be assessed by comparing it to manually annotated data from a real warehouse.
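As an illustration of the automated annotation step, the sketch below transfers activity labels from a high-rate oMoCap stream to a synchronized lower-rate on-body sensor stream by matching each sensor sample to the temporally closest oMoCap frame. This is an assumption-laden simplification, not the project's actual method; all sampling rates, timestamps and labels are hypothetical.

```python
import numpy as np

def transfer_labels(mocap_t, mocap_labels, imu_t):
    """Assign each on-body (IMU) sample the label of the temporally
    closest oMoCap frame -- assumes both clocks are already synchronized."""
    idx = np.searchsorted(mocap_t, imu_t)           # nearest-right mocap frame
    idx = np.clip(idx, 1, len(mocap_t) - 1)
    left_closer = (imu_t - mocap_t[idx - 1]) < (mocap_t[idx] - imu_t)
    idx[left_closer] -= 1                           # snap to the nearer frame
    return mocap_labels[idx]

# Hypothetical streams: oMoCap at 200 Hz, on-body sensor at 100 Hz.
mocap_t = np.arange(0, 10, 1 / 200)                 # seconds
mocap_labels = (mocap_t > 5).astype(int)            # e.g. 'walk' then 'pick'
imu_t = np.arange(0, 10, 1 / 100)

imu_labels = transfer_labels(mocap_t, mocap_labels, imu_t)
print(imu_labels[:5], imu_labels[-5:])
```

In the project itself, the oMoCap labels would come from the trained classifier rather than being given, and clock synchronization between the two recording systems is a prerequisite for this kind of label transfer.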

Joint project of the Chair of Materials Handling and Warehousing and the LS XII Pattern Recognition in Embedded Systems Group, TU Dortmund
This is the successor project of “Adaptive, ortsabhängige Aktivitätserkennung und Bewegungsklassifikation zur Analyse des manuellen Kommissionierprozesses” (adaptive, location-dependent activity recognition and movement classification for the analysis of the manual order picking process).

Funded by: Deutsche Forschungsgemeinschaft (DFG)
Run time: November 2019 to October 2022
Publications:

  • Reining, Christopher; Niemann, Friedrich; Moya Rueda, Fernando; Fink, Gernot A.; ten Hompel, Michael
    Human Activity Recognition for Production and Logistics — A Systematic Literature Review
    In: Information 2019, 10(8), 245.
    https://doi.org/10.3390/info10080245
  • Terharen, André; Feldmann, Felix; Reining, Christopher; ten Hompel, Michael
    Integration von Virtual Reality und optischem Motion Capturing in die Planung und Optimierung von Materialflusssystemen
    In: Logistics Journal : Proceedings, vol. 2018, no. 01, Nov. 2018.
    https://doi.org/10.2195/lj_Proc_terharen_de_201811_01
  • Borst, Dominik; Reining, Christopher; ten Hompel, Michael
    Einfluss der Mensch-Maschine-Interaktion auf das Maschinendesign in der Social Networked Industry
    In: Logistics Journal : Proceedings, vol. 2018, no. 01, Nov. 2018.
    https://doi.org/10.2195/lj_Proc_borst_de_201811_01
  • Reining, Christopher; Schlangen, Michelle; Hissmann, Leon; ten Hompel, Michael; Moya Rueda, Fernando; Fink, Gernot A.
    Attribute Representation for Human Activity Recognition of Manual Order Picking Activities
    In: Proceedings of the 5th International Workshop on Sensor-based Activity Recognition and Interaction – iWOAR ’18
    https://doi.org/10.1145/3266157.3266214
  • Reining, Christopher; ten Hompel, Michael; Moya Rueda, Fernando; Fink, Gernot A.
    Towards a Framework for Semi-Automated Annotation of Human Order Picking Activities Using Motion Capturing
    In: Proceedings of the 2018 Federated Conference on Computer Science and Information Systems
    https://doi.org/10.15439/2018F188

Contact: Christopher Reining, M.Sc. & Fernando Moya Rueda, M.Sc.