
Research Programme

PAL prototypes the next generation of smart knowledge media, in an emerging research field where data science meets augmented reality and wearable, pervasive computing.

DATA SCIENCE
Performance Analytics
Efficient real-time performance tracking, analysis, and prediction using natural language processing, statistical analysis, and computer vision.
cRunch infrastructure for computationally intensive learning analytics.
R packages and task view for natural language processing, including the lsa and mpia packages.
Tracking of human performance and learning with the Experience API (xAPI).
Wearable and mobile computer vision using Vuforia.
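The natural language processing work builds on latent semantic analysis (the lsa package). A minimal Python sketch of the underlying technique, a truncated SVD over a term-document matrix; the terms, documents, and counts below are illustrative assumptions, not PAL data:

```python
import numpy as np

# Toy term-document matrix (rows: terms, columns: documents d1..d3).
# d1 and d2 are about the same topic; d3 is about something else.
X = np.array([
    [2.0, 1.0, 0.0],   # "gear"
    [1.0, 2.0, 0.0],   # "torque"
    [0.0, 0.0, 3.0],   # "training"
])

# Latent semantic analysis: truncated SVD keeps the k strongest dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k, :]).T   # documents in the k-dim latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_12 = cosine(doc_vecs[0], doc_vecs[1])   # d1 and d2 share a topic: near 1
sim_13 = cosine(doc_vecs[0], doc_vecs[2])   # d1 and d3 do not: near 0
```

In the latent space, documents that share vocabulary collapse onto the same dimensions, which is what makes semantic similarity measurable even across differing surface wording.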
AUGMENTED REALITY
Job Performance Aids
Software development frameworks and modelling tools for end-user deployment and rapid coding of object- and flow-driven mixed reality applications.
ARgh! framework for augmented reality application development.
Standards work on interactive content, such as IEEE ARLEM on augmented reality learning experience models.
Error proofing to spot, predict, and help recover from malperformance, increasing safety and ergonomics.
Lean training for rapid, immersive competence development.
Embodied learning to reduce dissociation between sensory-motoric cognition and higher-order thinking.
Impact analysis and validation studies.
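Error proofing of the kind described above amounts to comparing an observed action trace against a reference workflow and flagging deviations. A toy Python sketch of that idea, using a hypothetical assembly procedure (step names and the `find_deviations` helper are illustrative assumptions):

```python
from difflib import SequenceMatcher

# Hypothetical reference workflow for an assembly task.
REFERENCE = ["fetch part", "clean surface", "apply adhesive", "mount part", "inspect"]

def find_deviations(observed):
    """Return (kind, steps) pairs where the observed trace departs from REFERENCE."""
    deviations = []
    matcher = SequenceMatcher(a=REFERENCE, b=observed)
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op in ("delete", "replace"):
            deviations.append(("skipped", REFERENCE[i1:i2]))
        if op in ("insert", "replace"):
            deviations.append(("unexpected", observed[j1:j2]))
    return deviations

# Operator skipped "clean surface": the checker can warn before the adhesive cures.
issues = find_deviations(["fetch part", "apply adhesive", "mount part", "inspect"])
```

A real system would feed the observed trace from sensor or computer-vision events rather than strings, but the alignment step is the same.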
WEARABLE, PERVASIVE COMPUTING
Engaging User Experience
Utility is king, but user experience is emperor! We investigate radically new forms of engagement and interaction technology. Passionate technology is the objective.
GhostHands teletutoring for remote guidance with hand sensing and 3D hand-animation models for instruction.
Visual language of iconic glyphs for signifying handling and motion for in situ guidance.
Spacification methodology to upgrade and transform existing places for rich, AR-based user experience.
BLE beacon-based indoor positioning.
StareGaze hands-free navigation and InspectorLaunch range-based activation.
FlipIt marker transition effects.
