10th of November 2011: Challenge dataset with complete labels released. Follow this link.
12th of August 2011: An issue arose in S1-Drill.dat in the 10th of August update. Please download the new version.
10th of August 2011: The challenge dataset for tasks A, B1 and B2 has been re-issued!
The deadline for submitting your results has been extended to September 20, 2011.
Check also our Q&A forum.
Activity recognition is an exciting ground for the development of robust machine learning techniques, as applications in this field typically require dealing with high-dimensional, multimodal streams of data characterized by a large variability (e.g. due to changes in the user's behavior or as a result of noise). However, unlike other application domains, activity recognition lacks established benchmarking problems. Typically, each research group tests and reports the performance of its algorithms on its own datasets, using experimental setups specially conceived for that specific purpose. For this reason, it is difficult to compare the performance of different methods or to assess how a particular technique will perform if the experimental conditions change (e.g. in case of sensor failure or changes in sensor location). We intend to address this issue by setting up a challenge on activity recognition aimed at providing a common platform that allows the comparison of different machine learning algorithms under the very same conditions. We call for methods tackling key questions in activity recognition such as classification based on multimodal recordings, activity spotting, and robustness to noise.
To this end we provide a publicly available benchmark database of daily activities recorded in a sensor-rich environment (click here for a detailed description) and call for submissions addressing different aspects of the activity recognition problem. The use of a common database will allow a fair and sensible comparison of the different contributions. Moreover, we expect that the possibility of replicating the presented results will contribute to further advances in state-of-the-art methods.
Contributions should address at least one of the tasks described here. Final submissions should include the labels obtained on the test dataset (following the same format as the training data), as well as a detailed description of the proposed method, including the full list of parameters used, in order to allow replicability. Authors are encouraged to also provide accompanying code implementing the proposed method.
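As an illustration, predicted labels could be written out one sample per line in the same whitespace-separated layout as the training files. This is only a minimal sketch: the (timestamp, label) column layout below is an assumption for the example, not the official challenge format, so the actual columns should match those of the training data.

```python
# Hypothetical sketch: write predicted labels in a whitespace-separated
# layout mirroring the training files. The (timestamp, label) column
# layout is an assumption, not the official challenge format.

def write_predictions(path, timestamps, labels):
    """Write one 'timestamp label' pair per line."""
    with open(path, "w") as f:
        for t, lab in zip(timestamps, labels):
            f.write(f"{t} {lab}\n")

# Hypothetical file name and label ids, purely for illustration:
write_predictions("S1-ADL4-predictions.dat", [0.03, 0.06], [101, 101])
```

The point is simply that the submitted file should be parseable by the same reader used for the training data, so that organizers can score it automatically.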
Participation in the challenge has no cost. To register please send an email to: email@example.com
Early feedback: Before the final deadline, participants can submit their results along with a brief description of their methods, and the organizers will then provide feedback on the corresponding performance. This feature will be enabled on July 1st.
Participants may also be interested in submitting their work to the associated workshop at the IEEE Conference on Systems, Man and Cybernetics 2011.
Two prizes will be awarded to the contributions that achieve the highest performance for each task.
An additional jury prize (USD 500.-) will be awarded on the recommendation of the organizers. It is intended to reward the most novel, innovative contribution (even if it doesn't achieve the highest performance).
July 1, 2011: Early feedback enabled
August 31, 2011: SMC workshop 'Last-minute-results' submission deadline
September 20, 2011: Final submission date
October 9, 2011: Final results and conclusions presented at the SMC workshop
Do not hesitate to contact the consortium.
We develop opportunistic activity recognition systems: goal-oriented sensor assemblies that spontaneously arise and self-organize to achieve a common activity- and context-recognition goal. We develop the algorithms and architectures underlying context recognition in such opportunistic systems.
Subscribe to our newsletter for regular project updates.