Activity and Context Recognition with Opportunistic Sensor Configurations

Questions about data format

Discuss here aspects related to the challenge dataset format, or link to tools and scripts that assist with data access.
Q: I noticed that across the different datasets from all the subjects (both ADL and DRILL) there are plenty of portions where huge chunks of sensor data (mainly the accelerometers on the upper limbs) are missing, i.e. those columns contain NaNs. What happened there?
A: NaN values correspond to missing samples from the wireless sensors. These are Bluetooth accelerometers that suffered occasional disconnections, with the corresponding chunks of data lost (we had too many wireless devices around).
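For example, a quick way to see how many samples are missing per column is to count the NaNs with NumPy. The fragment below is made up, but it uses the same space-separated format as the challenge .dat files, in which `np.loadtxt` parses the literal string "NaN" as a float NaN:

```python
import io

import numpy as np

# Made-up fragment in the .dat format: space-separated columns,
# with "NaN" marking dropped wireless-accelerometer samples.
raw = io.StringIO(
    "0 1.0 2.0 NaN\n"
    "33 1.1 NaN NaN\n"
    "66 1.2 2.2 3.2\n"
)

data = np.loadtxt(raw)                   # "NaN" tokens become float nan
nan_per_column = np.isnan(data).sum(axis=0)
print(nan_per_column)                    # [0 0 1 2]
```

Channels with a high NaN count are the wireless accelerometers mentioned above.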

Q: Would you please explain the meaning of a zero value in either of the last two columns?  Would it mean that there is no annotation available for the sensor instance?
A: Exactly, a zero in the last two columns means that no label is available for that particular sample (each row is one sample, recorded at 30 Hz). You can consider it a "null class".
Concerning posture/locomotion, zeros appear mainly in transition phases: for example, when the subject was first standing and then sitting down, you will see a label for standing, then a few zeros, and finally a label for sitting. This avoids introducing samples that would not be representative of either activity. Regarding the gesture labels, zeros appear whenever the corresponding gesture is not among the ones listed in the challenge description.
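In practice this means you may want to drop the null-class (label 0) samples before training a classifier. A minimal sketch, with made-up labels and features:

```python
import numpy as np

# Toy label column: 0 = null class (transition or unlisted-gesture samples).
labels = np.array([2, 2, 0, 0, 1, 1, 0, 3])
features = np.arange(len(labels) * 2, dtype=float).reshape(len(labels), 2)

mask = labels != 0                 # keep only samples with a real label
X, y = features[mask], labels[mask]
print(y)                           # [2 2 1 1 3]
```

Whether to discard or keep the null class as its own category depends on the task; for gesture spotting the null class is usually modelled explicitly.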

mj2.pdf (82.92 KB)
posshoe.pdf (1.91 MB)
MTi_and_MTx_User_Manual_and_Technical_Documentation.pdf (2.94 MB)
XM-B_Technical_Documentation.pdf (948.29 KB)

task A sensor set

Which set of sensors can be used in task A?

For task A, as for all the other tasks, all sensors can be used. Of course, you may decide to use different sensor subsets for each task.
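As an illustration, selecting a per-task sensor subset is just column indexing on the data matrix. The column indices below are placeholders, not the actual channel layout of the challenge files:

```python
import numpy as np

# Suppose the data matrix has shape (n_samples, 116): column 0 is the
# timestamp, and (hypothetically) columns 1-3 hold one accelerometer's
# x/y/z channels.
data = np.arange(5 * 116, dtype=float).reshape(5, 116)

subset_cols = [1, 2, 3]            # a per-task sensor subset (illustrative)
X = data[:, subset_cols]
print(X.shape)                     # (5, 3)
```

The actual column-to-sensor mapping is given in the column-name file distributed with the dataset.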

Re-issued dataset

After downloading the re-issued dataset, I noticed that the first row of S1-Drill.dat has only 115 columns instead of the 116 described in the column-name list. Comparing with the other .dat files, I'm guessing that the first timestamp, 0, is missing.

Re: Re-issued dataset

There was a small problem when uploading the new data. It has now been corrected; please download the new version (Aug 12, 2011).
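If you want to verify your local copy, a simple sanity check is to confirm that every row has the expected number of whitespace-separated columns. A sketch, run here on a synthetic stand-in rather than the real file:

```python
import io

EXPECTED_COLS = 116

def bad_rows(fh):
    """Return the 1-based line numbers whose column count is wrong."""
    return [i for i, line in enumerate(fh, 1)
            if len(line.split()) != EXPECTED_COLS]

# Synthetic stand-in for a .dat file: three well-formed rows of 116 zeros.
ok = io.StringIO("\n".join(" ".join(["0"] * EXPECTED_COLS) for _ in range(3)))
print(bad_rows(ok))                # [] -> no malformed rows
```

To check the real file, pass `open("S1-Drill.dat")` instead of the `StringIO` object.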

devices orientation on body

How are the devices placed on the body oriented? That is, what is the reference coordinate system of the three sensors in each device (accelerometer, gyroscope, and magnetometer) with respect to the user's body?

Re: devices orientation on body

Please find in the attachments two files that describe the placement of the IMUs (Xsens MTx) on the body (blue jacket) as well as the shoe sensor (files mj2.pdf and posshoe.pdf, respectively).

The orientation of the Bluetooth acceleration sensors is as follows:
- Z axis pointing outwards from the body (positive acceleration)
- Y axis pointing up along the body
- X axis pointing to the left of the body.
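One way to sanity-check any orientation convention: whatever the axis directions, a sensor at rest measures only gravity, so the magnitude of the acceleration vector should stay near 1 g regardless of posture. The sample values below are invented for illustration:

```python
import math

# Made-up resting sample in the Bluetooth-accelerometer frame, units of g:
# x (left of body), y (up along body), z (outwards from body).
sample = (0.02, 0.98, 0.15)
magnitude = math.sqrt(sum(c * c for c in sample))
print(round(magnitude, 2))         # 0.99
```

Values far from 1 g on a resting segment usually indicate a calibration or scaling issue rather than motion.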

IMU datasheet

Q: Is it possible to have the datasheet of the gyroscopes in the motion jacket? I need to calibrate the values they provide, and your text file only says that they output a raw value multiplied by 1000. I need to convert it to rad/s.

Re: IMU datasheet

The IMUs are Xsens MTx sensors (datasheets available on request). No special processing is applied to the rotation data, except that we multiplied it by the factor indicated in the text file so that the challenge files contain only integers instead of floats. The datasheets can also be found in the attachments of this page.
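To recover the physical reading, divide the stored integers by that factor (1000, going by the question above; check the text file for the exact factor per channel). A minimal sketch with made-up samples:

```python
# Scale factor used to store the readings as integers (per the question
# above; confirm the per-channel factor in the dataset's text file).
SCALE = 1000.0

raw_gyro = [1570, -420, 33]        # made-up integer samples from the file
physical = [v / SCALE for v in raw_gyro]
print(physical)                    # [1.57, -0.42, 0.033]
```

Whether the recovered values are already in rad/s or need further calibration is answered by the MTx datasheet in the attachments.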

NaN values

Q: Could you please suggest how to treat NaN values in the sensors' readings?

A: NaN values

There is an increasing amount of research on ways to deal with missing values, in both machine learning and activity recognition. Among the relevant works we can mention:
* Avdouevski et al. Missing values in user activity classification applications utilizing wireless sensors. ACM MobiWac, 2008.
* M. Saar-Tsechansky and F. Provost. Handling missing values when applying classification models. Journal of Machine Learning Research, 2007.
* Sagha et al. A Probabilistic Approach to Handle Missing Data for Multi-Sensory Activity Recognition. Workshop on Context Awareness and Information Processing in Opportunistic Ubiquitous Systems at UbiComp, 2010.
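As a simple baseline alongside those references (and not the method of any of the papers above), one can linearly interpolate over short NaN gaps in each sensor channel. A sketch with NumPy:

```python
import numpy as np

def interpolate_nans(signal):
    """Fill NaN gaps in a 1-D channel by linear interpolation."""
    signal = np.asarray(signal, dtype=float).copy()
    nans = np.isnan(signal)
    idx = np.arange(len(signal))
    # np.interp evaluates the piecewise-linear function through the
    # non-NaN points at the NaN positions.
    signal[nans] = np.interp(idx[nans], idx[~nans], signal[~nans])
    return signal

print(interpolate_nans([1.0, np.nan, np.nan, 4.0]))  # [1. 2. 3. 4.]
```

Interpolation is only sensible for short gaps; for the long disconnection chunks in this dataset, the probabilistic approaches cited above are more appropriate.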


Do not hesitate to contact the consortium.


We develop opportunistic activity recognition systems: goal-oriented sensor assemblies that spontaneously arise and self-organize to achieve common activity and context recognition. We develop the algorithms and architectures underlying context recognition in opportunistic systems.



© 2017 OPPORTUNITY Consortium
The information on this website does not represent the opinion of the European Community, and is provided "as is". No guarantee or warranty is given that the information is fit for any particular purpose. The user uses the information at its sole risk and liability.