DFG-Project “Transfer Learning for Human Activity Recognition in Logistics”


Manual activities still dominate the logistics sector in the age of Industry 4.0; order picking is a prominent example. Consequently, detailed information on the occurrence and duration of human activities is crucial to enhance warehousing efficiency, assess ergonomics, and improve the supply chain. As manual assessment is economically inexpedient, methods of human activity recognition (HAR) gain relevance. HAR refers to the classification of human movements from time-series data and has proven useful in smart-home, rehabilitation, and health-support applications. Because human movements are highly variable and diverse, a large amount of data is required to train a classifier. Previous HAR datasets have focused on activities of daily living. To research the possibility of HAR in the logistics sector, however, a HAR dataset containing activities of the logistics environment is required.

The objective of the project was to develop a method that avoids the tremendous effort of creating and annotating high-quality on-body-device data for HAR in logistics. Different logistics scenarios were replicated in a reference field equipped with a highly accurate optical motion capturing (oMoCap) system (see Video 1). In addition to oMoCap, on-body device data were captured synchronously. The oMoCap recordings of all scenarios constitute the oMoCap reference dataset (see Video 2), enabling transfer and zero-shot learning methods. As part of the method, machine learning, specifically deep learning methods, was implemented for processing time series and for training a classifier on the oMoCap reference dataset. This classifier automated the annotation of the synchronized on-body device data. In addition, methods for creating additional synthetic data from raw oMoCap data were considered. The performance of a classifier trained on the automatically annotated and synthetic on-body device data was assessed by comparison with manually annotated data from a real warehouse.
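The core of this label-transfer idea can be sketched as follows: a classifier trained on the oMoCap reference data is applied window by window to the synchronized on-body stream, and its predictions become the automatic annotations. This is a minimal illustration, not the project's implementation; the window size, stride, and the dummy classifier are hypothetical.

```python
import numpy as np

def sliding_windows(data, win=100, stride=50):
    """Split a (T, C) time series into overlapping windows of length `win`."""
    return np.stack([data[s:s + win]
                     for s in range(0, len(data) - win + 1, stride)])

def transfer_labels(omocap_classifier, imu_stream, win=100, stride=50):
    """Annotate synchronized on-body (IMU) windows with the predictions of a
    classifier trained on the oMoCap reference dataset.
    `omocap_classifier` is a hypothetical callable returning one class id
    per window; window size and stride are illustrative, not the project's."""
    windows = sliding_windows(imu_stream, win, stride)
    return [omocap_classifier(w) for w in windows]

# Toy usage: a dummy "classifier" that labels windows by mean signal energy.
rng = np.random.default_rng(0)
stream = rng.normal(size=(400, 6))           # 400 samples, 6 IMU channels
dummy = lambda w: int(np.abs(w).mean() > 0.8)
labels = transfer_labels(dummy, stream)
print(len(labels))  # one automatic annotation per window
```

Because the oMoCap and on-body streams are recorded synchronously, each predicted window label applies to the same time span in both modalities, which is what makes the automatic annotation of the on-body data possible.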

Data collection focused on the natural motion of individuals in a logistics environment. Three different scenarios were considered, selected for their similarity to reality and to order picking processes, as shown in Figures 1-3.

To ensure natural body movement, the subjects were requested to perform the activities without constraints. Consequently, the activity labels are designed to be inclusive of various body movements rather than focusing purely on the activity being performed, as shown in Figure 4. In addition, finer aspects of the body motion are accounted for within the attribute representation, as presented in Figure 4. The classes and attribute labels are expected to be expandable to various logistics scenarios.
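An attribute representation of this kind labels each window with a coarse activity class plus a fixed-order binary vector describing finer aspects of the body motion. The sketch below illustrates the encoding only; the attribute names are hypothetical placeholders, not the project's actual attribute set.

```python
# Illustrative attribute vocabulary (hypothetical names, fixed order).
ATTRIBUTES = ["gait", "torso_rotation", "right_hand", "left_hand",
              "handling_upwards", "handling_centred", "item_carried"]

def encode(active):
    """Encode a set of active attributes as a fixed-order binary vector."""
    return [1 if a in active else 0 for a in ATTRIBUTES]

# A walking window with both hands engaged (e.g. pushing a cart):
vec = encode({"gait", "right_hand", "left_hand"})
print(vec)  # [1, 0, 1, 1, 0, 0, 0]
```

Because new attributes only extend the vocabulary rather than redefine existing classes, such a representation can be expanded to further logistics scenarios without re-annotating old recordings.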

Annotation is a cumbersome process. Therefore, an annotator-friendly annotation tool was developed (see Figure 5). The tool offers a manual annotation mode, a label correction mode, and an automatic annotation mode. Manual annotation, as the name suggests, refers to the first-time annotation of a recording by the annotator. The label correction mode allows existing annotations to be revised. The automatic annotation mode simplifies the process by providing an annotation from a pre-trained classifier that the annotator only needs to correct, speeding up annotation and reducing the annotator's mental exhaustion. Furthermore, the annotation tool provides flexibility in modifying the classes and attributes.

The classifier of interest is the time-series-based Convolutional Neural Network (tCNN), as shown in Figure 6. An extension of the network, referred to as IMU-CNN, has a convolution block associated with each limb of the human body. The limb-specific convolution blocks are expected to extract local features of the input data. Global features are then extracted after concatenating the outputs of the per-limb convolution blocks (Niemann et al. 2020). For classification based on activity classes, the softmax activation function is used; for the attribute representation, the sigmoid activation function is used.
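The forward pass of such a per-limb architecture can be sketched in plain numpy: one 1-D convolution block per limb, concatenation into a global feature vector, and two output heads. All sizes (four limb segments, nine channels each, kernel widths, class and attribute counts) are illustrative assumptions, not the published IMU-CNN configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

def conv1d(x, k):
    """Valid 1-D convolution of a (T, C) signal with a (K, C, F) kernel."""
    T, _ = x.shape
    K = k.shape[0]
    return np.stack([np.tensordot(x[t:t + K], k, axes=([0, 1], [0, 1]))
                     for t in range(T - K + 1)])          # (T-K+1, F)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# One convolution block per limb extracts local features
# (illustrative: 4 limb segments, 9 channels each, 100-sample window).
limbs = [rng.normal(size=(100, 9)) for _ in range(4)]
kernels = [rng.normal(scale=0.1, size=(5, 9, 8)) for _ in range(4)]
local = [np.maximum(conv1d(x, k), 0) for x, k in zip(limbs, kernels)]

# Global features: concatenate per-limb features, then two dense heads.
features = np.concatenate([f.mean(axis=0) for f in local])   # (32,)
W_cls = rng.normal(scale=0.1, size=(features.size, 6))       # 6 classes
W_att = rng.normal(scale=0.1, size=(features.size, 7))       # 7 attributes

class_probs = softmax(features @ W_cls)   # class head: softmax
attr_probs = sigmoid(features @ W_att)    # attribute head: sigmoid
```

The two heads reflect the two label types: softmax yields a distribution over mutually exclusive activity classes, while per-element sigmoids score each binary attribute independently.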

A joint project of the Chair of Materials Handling and Warehousing and the LS XII Pattern Recognition in Embedded Systems Group, TU Dortmund. It is the successor of the project "Adaptive, ortsabhängige Aktivitätserkennung und Bewegungsklassifikation zur Analyse des manuellen Kommissionierprozesses".

Funded by: Deutsche Forschungsgemeinschaft DFG
Run time: November 2019 to October 2022
Publications:

  • Ickstadt, M. Pauly, M. Motta, S. Herbrandt, N. Moroff, F. Niemann, M. Henke, M. ten Hompel
    Lernverfahren der Künstlichen Intelligenz zur Inwertsetzung von Daten: Automatisierte Erkennung und Prognose
    In: Silicon Economy, Springer 2022
    DOI: 10.1007/978-3-662-63956-6_11
  • Niemann, Friedrich; Reining, Christopher; Moya Rueda, Fernando; Avsar, Hülya; Altermann, Erik; Nair, Nilah Ravi; Steffens, Janine Anika; Fink, Gernot A.; ten Hompel, Michael
    Logistic Activity Recognition Challenge (LARa Version 2) – A Motion Capture and Inertial Measurement Dataset
    In: Zenodo 2022. (Version 2) [Data set]
    DOI: 10.5281/zenodo.5761276
  • Niemann, Friedrich; Lüdtke, Stefan; Bartelt, Christian; ten Hompel, Michael
    Context-Aware Human Activity Recognition in Industrial Processes
    In: Sensors 2021
    DOI: 10.3390/s22010134
  • Niemann, Friedrich; Avsar, Hülya; Steffens, Janine Anika; Nair, Nilah Ravi; ten Hompel, Michael
    Context-Aware Activity Recognition in Logistics (CAARL) – An optical marker-based Motion Capture Dataset
    In: Zenodo 2021. (Version 1) [Data set]
    DOI: 10.5281/zenodo.5680951
  • Avsar, Hülya; Niemann, Friedrich; Reining, Christopher; ten Hompel, Michael
    Cyber-physischer Zwilling Framework zur Generierung menschlicher Bewegungsdaten in der Intralogistik
    In: Logistics Journal 2021
    DOI: 10.2195/lj_Proc_avsar_de_202112_01
  • Avsar, Hülya; Altermann, Erik; Reining, Christopher; Moya Rueda, Fernando; Fink, Gernot A.; ten Hompel, Michael
    Benchmarking Annotation Procedures for Multi-channel Time Series HAR Dataset
    In: 2021 IEEE International Conference on Pervasive Computing and Communications Workshops
    DOI: 10.1109/PerComWorkshops51409.2021.9431062
  • Reining, Christopher; Moya Rueda, Fernando; Niemann, Friedrich; Fink, Gernot A.; ten Hompel, Michael
    Annotation Performance for multi-channel time series HAR Dataset in Logistics
    In: 2020 IEEE International Conference on Pervasive Computing and Communications Workshops
    DOI: 10.1109/PerComWorkshops48775.2020.9156170
  • Niemann, Friedrich; Reining, Christopher; Moya Rueda, Fernando; Nair, Nilah Ravi; Steffens, Janine Anika; Fink, Gernot A.; ten Hompel, Michael
    LARa: Creating a Dataset for Human Activity Recognition in Logistics Using Semantic Attributes
    In: Sensors 2020
    DOI: 10.3390/s20154083
  • Niemann, Friedrich; Reining, Christopher; Moya Rueda, Fernando; Altermann, Erik; Nair, Nilah Ravi; Steffens, Janine Anika; Fink, Gernot A.; ten Hompel, Michael
    Logistic Activity Recognition Challenge (LARa) – A Motion Capture and Inertial Measurement Dataset
    In: Zenodo 2020
    DOI: 10.5281/ZENODO.3862782
  • Reining, Christopher; Niemann, Friedrich; Moya Rueda, Fernando; Fink, Gernot A.; ten Hompel, Michael
    Human Activity Recognition for Production and Logistics — A Systematic Literature Review
    In: Information 2019, 10(8), 245.
    DOI: 10.3390/info10080245
  • Terharen, André; Feldmann, Felix; Reining, Christopher; ten Hompel, Michael
    Integration von Virtual Reality und optischem Motion Capturing in die Planung und Optimierung von Materialflusssystemen
    In: Logistics Journal: Proceedings, vol. 2018, no. 01, Nov. 2018.
    DOI: 10.2195/lj_Proc_terharen_de_201811_01
  • Borst, Dominik; Reining, Christopher; ten Hompel, Michael
    Einfluss der Mensch-Maschine-Interaktion auf das Maschinendesign in der Social Networked Industry
    In: Logistics Journal: Proceedings, vol. 2018, no. 01, Nov. 2018.
    DOI: 10.2195/lj_Proc_borst_de_201811_01
  • Reining, Christopher; Schlangen, Michelle; Hissmann, Leon; ten Hompel, Michael; Moya Rueda, Fernando; Fink, Gernot A.
    Attribute Representation for Human Activity Recognition of Manual Order Picking Activities
    In: Proceedings of the 5th International Workshop on Sensor-based Activity Recognition and Interaction – iWOAR ’18
    DOI: 10.1145/3266157.3266214
  • Reining, Christopher; ten Hompel, Michael; Moya Rueda, Fernando; Fink, Gernot A.
    Towards a Framework for Semi-Automated Annotation of Human Order Picking Activities Using Motion Capturing
    In: Proceedings of the 2018 Federated Conference on Computer Science and Information Systems
    DOI: 10.15439/2018F188


Contact: Nilah Ravi Nair, M.Sc.; Friedrich Niemann, M.Sc.; Fernando Moya, M.Sc.