Representation Learning for Accelerometer Data using Convolutional Deep Belief Network

Kyoung-Woon On and Byoung-Tak Zhang


  Accelerometers have been widely accepted as useful and practical sensors for wearable devices to measure and assess physical activity. Activity classification using accelerometry-based methodologies has been widely studied. These methods mostly utilize a supervised machine learning approach, which associates an observation (or feature vector) of activity with possible activity states in terms of the probability of the observation. 
  However, such approaches learn relatively shallow, one-layer representations. Learning more complex, higher-level representations is a non-trivial, challenging problem.
The “deep learning” approach tries to learn simple features in the lower layers and more complex features in the higher layers.
  Among deep learning approaches, the deep belief network is a generative probabilistic model composed of one visible (observed) layer and multiple hidden layers. Each hidden-layer unit learns a statistical relationship between the units in the layer below, so higher-layer representations tend to become more complex. Recently, convolutional deep belief networks have been developed to scale the algorithm to high-dimensional data.
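  To make the building block concrete, the following is a minimal sketch (not the authors' implementation) of the hidden-unit inference step of a one-dimensional convolutional restricted Boltzmann machine, the layer type stacked to form a convolutional deep belief network. Filter weights are shared across time, which is what lets the model scale to long accelerometer signals; all shapes and values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def crbm_hidden_probs(v, W, b):
    """P(h=1 | v) for a 1-D convolutional RBM layer.

    v: (T,) visible signal (e.g. one accelerometer axis)
    W: (K, F) K shared filters of width F
    b: (K,) per-filter hidden biases
    Returns a (K, T-F+1) array of hidden-unit activation probabilities.
    """
    K, F = W.shape
    out = np.empty((K, v.size - F + 1))
    for k in range(K):
        # valid cross-correlation of the signal with filter k (weight sharing)
        out[k] = np.correlate(v, W[k], mode="valid") + b[k]
    return sigmoid(out)

T, K, F = 100, 8, 9                      # toy sizes: signal length, filters, width
v = rng.standard_normal(T)               # stand-in for an accelerometer trace
W = 0.1 * rng.standard_normal((K, F))    # in practice learned by contrastive divergence
b = np.zeros(K)
H = crbm_hidden_probs(v, W, b)
print(H.shape)                           # (8, 92)
```

In an actual CDBN the filters would be trained layer by layer with contrastive divergence, and a probabilistic max-pooling step would sit between layers; this sketch shows only the shared-filter inference that gives the model its scalability.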
  In this work, we visualize 30 days of personal daily accelerometer data collected with a wrist-worn device. In addition, we apply convolutional deep belief networks to obtain higher-level representations, and we demonstrate that the feature representations learned from unlabeled accelerometer data perform well on an activity classification task.
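  The unlabeled-features-then-classify pipeline described above can be sketched as follows. Here the convolutional filters are random stand-ins for learned CDBN filters, each window is reduced to a fixed-length vector by max-pooling the filter responses, and a nearest-centroid rule stands in for an off-the-shelf classifier; the two synthetic signals are hypothetical substitutes for labeled activity windows.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def features(window, W):
    """Max-pooled convolutional feature vector: one value per filter."""
    K, F = W.shape
    feats = np.empty(K)
    for k in range(K):
        # pool each filter's response map down to its strongest activation
        feats[k] = sigmoid(np.correlate(window, W[k], mode="valid")).max()
    return feats

W = 0.5 * rng.standard_normal((6, 11))   # 6 random filters of width 11 (not learned)
t = np.arange(200)
walk = np.sin(0.8 * t) + 0.1 * rng.standard_normal(200)   # toy "walking" window
rest = 0.1 * rng.standard_normal(200)                     # toy "resting" window

# One training window per class; its feature vector is the class centroid.
centroids = np.array([features(walk, W), features(rest, W)])

# Classify a new phase-shifted "walking" window by nearest centroid.
test = np.sin(0.8 * t + 0.3) + 0.1 * rng.standard_normal(200)
pred = int(np.argmin(((features(test, W) - centroids) ** 2).sum(axis=1)))
print(pred)
```

The point of the sketch is the division of labor: the convolutional features are computed without labels, and only the final, very simple classifier touches the activity labels.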
  In future work, we will regenerate long-term physical activity patterns in daily accelerometer data using the convolutional deep belief network.