Abstract

Automated activity recognition of heavy construction equipment and human crews can contribute to correct and accurate measurement of a variety of construction and infrastructure project performance indicators. Productivity assessment through work sampling, safety and health monitoring using worker ergonomic analysis, and sustainability measurement through equipment activity cycle monitoring to eliminate ineffective and idle times, thus reducing greenhouse gas (GHG) emissions, are some potential areas that can benefit from the integration of automated activity recognition and analysis techniques. Despite their proven performance and applications in other domains, few construction engineering and management (CEM) studies have so far employed non-vision sensing technologies for recognizing the activities of construction equipment and workers. The variety of sensors in ubiquitous smartphones, together with their evolving computing, networking, and storage capabilities, has created great opportunities for pervasive computing applications. In light of this, this paper describes the latest findings of an ongoing project that aims to design and validate a ubiquitous smartphone-based automated activity recognition framework using built-in accelerometer and gyroscope sensors. Collected data are segmented into overlapping windows from which time- and frequency-domain features are extracted. Since each sensor collects data along three axes (x, y, z), features are extracted from all three axes to achieve independence from device placement and orientation. Finally, the features are used as inputs to supervised machine learning classifiers. The results of the experiments indicate that the trained models can classify construction worker and equipment activities with over 90% overall accuracy.
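
The sketch below illustrates the kind of pipeline the abstract describes: segmenting triaxial accelerometer and gyroscope streams into overlapping windows, extracting per-axis time- and frequency-domain features, and training a supervised classifier. The window length, overlap ratio, feature set, and choice of a random forest classifier are illustrative assumptions, not the configuration reported in the paper, and the data here are synthetic stand-ins for real smartphone recordings.

    # Hypothetical windowing / feature-extraction / classification sketch
    # (window size, overlap, features, and classifier are assumptions).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    WINDOW, STEP = 128, 64  # assumed window length and 50% overlap

    def sliding_windows(signal, window=WINDOW, step=STEP):
        """Segment an (n_samples, n_channels) signal into overlapping windows."""
        return np.array([signal[i:i + window]
                         for i in range(0, len(signal) - window + 1, step)])

    def extract_features(window):
        """Example time- and frequency-domain features computed per axis."""
        feats = []
        for axis in range(window.shape[1]):
            x = window[:, axis]
            spectrum = np.abs(np.fft.rfft(x))
            feats += [x.mean(), x.std(), x.min(), x.max(),   # time domain
                      spectrum.mean(), spectrum.max(),       # frequency domain
                      float(np.argmax(spectrum))]            # dominant frequency bin
        return feats

    # Synthetic stand-in for 6-channel accelerometer + gyroscope data with
    # per-sample activity labels; real inputs would come from the smartphone.
    rng = np.random.default_rng(0)
    signal = rng.normal(size=(10_000, 6))
    labels = rng.integers(0, 3, size=10_000)  # e.g., 3 activity classes

    windows = sliding_windows(signal)
    window_labels = np.array([np.bincount(labels[i:i + WINDOW]).argmax()
                              for i in range(0, len(signal) - WINDOW + 1, STEP)])

    X = np.array([extract_features(w) for w in windows])
    X_train, X_test, y_train, y_test = train_test_split(
        X, window_labels, test_size=0.3, random_state=0)

    clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
    print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

With real labeled sensor data in place of the synthetic arrays, the same structure (windowing, per-axis features, supervised training, held-out evaluation) yields the kind of overall accuracy figure the abstract reports.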

Date Created

June 2015

