JoggAR: A Mixed-Modality AR Approach for Technology-Augmented Jogging
JoggAR is a mixed-modality augmented reality (AR) approach for technology-augmented jogging. It combines wearable visual, audio, and sensing technologies to create a persistent AR environment that enriches jogging and other exertion experiences, particularly activities in which the attention the jogger can spare varies over the course of the activity.
A key feature of JoggAR is its audio-first exploration method for 3D virtual spaces, designed to support exertion-focused activities without overloading the visual channel.
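For illustration, the sketch below shows one plausible way an audio-first cue could be computed: a virtual point of interest is rendered as a stereo pan plus a distance-based gain relative to the jogger's position and heading, so it can be located by ear rather than by looking at a display. This is an assumed, simplified example (2D ground plane; the `Beacon` dataclass and `audio_cue` function are hypothetical names), not the JoggAR implementation itself.

```python
import math
from dataclasses import dataclass

# Illustrative sketch only: NOT the JoggAR implementation, just one plausible
# way to render a virtual point of interest as a non-visual (audio) cue.

@dataclass
class Beacon:
    """A virtual point of interest placed in world coordinates (metres)."""
    x: float
    y: float

def audio_cue(jogger_x: float, jogger_y: float, heading_rad: float,
              beacon: Beacon, max_audible_m: float = 50.0):
    """Return (pan, gain) for a beacon relative to the jogger.

    pan  : -1.0 = fully left, +1.0 = fully right
    gain :  0.0 = silent,      1.0 = full volume (closer is louder)
    """
    dx = beacon.x - jogger_x
    dy = beacon.y - jogger_y
    distance = math.hypot(dx, dy)

    # Bearing of the beacon relative to the jogger's facing direction,
    # wrapped to (-pi, pi] so left vs. right is unambiguous.
    bearing = math.atan2(dx, dy) - heading_rad
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))

    pan = math.sin(bearing)                          # left/right placement
    gain = max(0.0, 1.0 - distance / max_audible_m)  # distance attenuation
    return pan, gain

if __name__ == "__main__":
    # Beacon 20 m ahead and slightly to the right of a north-facing jogger.
    pan, gain = audio_cue(0.0, 0.0, heading_rad=0.0, beacon=Beacon(5.0, 20.0))
    print(f"pan={pan:+.2f}, gain={gain:.2f}")
```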
🔬 Project Members
- Chek Tien Tan (UTS Games Studio)
- Floyd Mueller (RMIT Exertion Games Lab)
- Rich Byrne (RMIT Exertion Games Lab)
- Simon Lui (SUTD Audio Research Lab)
🏆 Outputs
- SIGGRAPH Asia 2015 – Mobile Graphics & Interactive Applications: Tan, C. T., Byrne, R., Lui, S., Liu, W., & Mueller, F. (2015). JoggAR: A mixed-modality AR approach for technology-augmented jogging. In SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications (Article 33, 1 page). Association for Computing Machinery. https://doi.org/10.1145/2818427.2818434
- CHI PLAY 2017: Mueller, F. F., Tan, C. T., Byrne, R., & Jones, M. (2017). 13 Game Lenses for Designing Diverse Interactive Jogging Systems. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (pp. 43–56). Association for Computing Machinery. https://doi.org/10.1145/3116595.3116607
💡 Keywords
AR, Jogging, Exertion, Wearables, Audio-First Navigation, Game Design