We push the envelope in Human-Computer Interaction with wearables and other forms of pervasive computing. New modalities require new aesthetics and ergonomics. For example, hands-free navigation of the physical world using smart glasses calls for new forms of interaction.
Online Educa Berlin saw an expert panel from the H2020-funded WEKIT project present and discuss whether the future of training and learning at the workplace will radically change its form, abandoning books, classrooms, and learning management systems in favour of smart glasses, live guidance, and in situ training.
Fridolin Wild of Oxford Brookes' Performance Augmentation Lab discussed with Ralf Klamma (RWTH Aachen), Paul Lefrere (CCA Research), and Carlo Vizzi (Altec) how Performance Augmentation could change the way we learn.
Both panel and audience agreed that wearables, and in particular smart glasses with augmented reality enabled, will form an integral part of the future of learning at the workplace.
“It depends, however, on content, instructional design, and activity mix, as together they make up the experience”, said Wild. “They determine whether a training will be useful, enjoyable, and memorable, and not just render people less smart, operating like robots.”
Online Educa is a unique, cross-sector event with 2,100+ participants from 93 countries, fostering exchange between the corporate, education, and public service sectors. OEB took place from November 30 to December 2, 2016, in Berlin.
Dr Fridolin Wild will be giving a keynote at the Orphee Rendezvous in Font Romeu, France, to help shape the future research agenda of the French technology-enhanced learning R&D community. Orphee is a network of networks with over 30 partner organisations. The retreat takes place January 31 and February 1, 2017, in the Pyrenees and will bring together experts in the field. Dr Wild will speak about Performance Augmentation.
Here’s the abstract: Augmented Reality (AR) has gained momentum in recent years, branching out beyond mere object superimposition in marketing to more complex use cases. Unlike Virtual Reality, AR enhances regular human perception with additional, artificially generated sensory input, merging the natural and the digital into a combined experience. Such novel technology is obviously relevant to education and training. AR offers particular potential for human performance augmentation: improving the efficiency and effectiveness of learners through extended live guidance. In this talk, Dr Wild will introduce the concept of Performance Augmentation and report on the latest findings from the R&D projects ARPASS, WEKIT, and TCBL.
The user interface for artificial intelligences needs a major overhaul. That’s why innovators like Volume Global, the IBM Watson development partner, turn to robots and augmented reality to explore what the next generation of user interfaces could look like. When visiting Volume Global in Wokingham last week, Nigel Crook, Fridolin Wild, and John Corlett discussed possibilities for future collaboration.
During a recent visit by Pearson’s AR/VR development team, we had a chance to try out the new Microsoft HoloLens. An impressive device: clear and crisp images, fantastic environmental mapping, and already some very interesting apps available. The pictures show Dr Wild and Dr Kamal operating mixed reality applications on the device – seeing the real world with superimposed holograms.
We’re running a workshop for makers of wearable solutions on Friday, September 16th, 2016, at the 11th European Conference on Technology Enhanced Learning (EC-TEL) in Lyon, France. It is dedicated to demonstrating prototypes and sharing experiences with wearable enhanced learning, with respect to both hardware and software.
Wearable technologies – such as smart watches, smart fitness trackers, smart glasses, smart objects, smart earbuds, or smart garments – are beginning to change personal communication, offering new opportunities for learning and interacting. They are likely to change human-computer interaction dramatically, beyond what we imagine today. Wearable Enhanced Learning (WELL) is shaping up to be a revolutionary step in the transition from the desktop age through the mobile age to the age of pervasive computing.
Participants of this session are encouraged to show their projects, demonstrate or discuss how the prototypes were developed, and inspire other participants to contribute ideas for further development and applications in real-life settings. We invite all makers and researchers, including individuals, project groups, and not-for-profit and commercial organisations.
Submit an 800-word summary (up to four A4 pages) via EasyChair by June 1st!
Participants are requested to submit an 800-word summary (approx. four A4 pages) including images and/or links to samples of their own hardware/software design work (e.g. mock-ups, prototypes). Design samples help the reviewers understand the solution and the design approach. Submissions may include work from past, current, or future projects, or may simply present creative ideas independent of any project scheme or research programme.
More information is available on the SIG WELL website.
Dr Fridolin Wild (Senior Research Fellow and Scientific Director of WEKIT) spoke about Wearables Enhanced Knowledge Intensive Training (WEKIT). Augmented reality and wearables are potential game changers, re-inventing the way we perceive and work with information. In the publicly-funded WEKIT project (Horizon 2020), we explore experience capturing with AR and wearables to support, on the one hand, observation of a master performing problem-solving tasks and, on the other, the delivery of augmented real-time guidance to trainees. In this Research Centre talk, Fridolin introduced the department to the project, outlined the future R&D timeline, and paved the way for further collaboration on AR and wearables within the department.
Sometimes we want immediate feedback that a marker has actually been detected – without, however, activating the action connected to it. In this experiment, we tried a set of transition effects. As the example video shows (‘my heart beats for TEL’), they certainly make user interaction more engaging 🙂
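The idea of decoupling detection feedback from action activation can be sketched in a few lines. This is a minimal, hypothetical illustration (not the actual prototype code): the transition effect plays as soon as the marker is seen, while the attached action only fires after the marker has been held in view for a dwell period – the `dwell_seconds` parameter is our own invention for the sketch.

```python
import time


class MarkerFeedback:
    """Separate 'marker seen' feedback from 'action triggered'.

    A transition effect plays as soon as the marker is detected,
    but the attached action only fires after the marker has stayed
    in view for `dwell_seconds` (illustrative parameter).
    """

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds
        self._first_seen = None  # timestamp when the marker appeared

    def update(self, marker_visible, now=None):
        """Call once per frame; returns (show_effect, fire_action)."""
        now = time.monotonic() if now is None else now
        if not marker_visible:
            self._first_seen = None  # marker lost: reset the dwell timer
            return (False, False)
        if self._first_seen is None:
            self._first_seen = now
        fire = (now - self._first_seen) >= self.dwell_seconds
        return (True, fire)
```

For example, with a one-second dwell the effect plays from the first frame the marker is visible, while the action is held back until the marker has been tracked continuously for a full second.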
With: Paul Hogan (Open University)
We’ve developed workplace models, including an inventory of standard actions required for furniture production, textile production, and helicopter maintenance. Part of the vocabulary screening involved developing standard verbs for handling and motion – and their visual representations. With our partners from VTT, the Technical Research Centre of Finland, we’ve created this visual language for guiding users on the job when using smart glasses and portable devices.
Visual feature and fiducial markers can be used for distance-based launch activation, as demonstrated for this Halloween gimmick: Cuthberth 1.0, a ghost now spooking the hallways of the Knowledge Media Institute at the Open University.
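The core of distance-based launch activation is simple: estimate the marker's pose, take the length of its translation vector as the camera-to-marker distance, and fire when the user comes within range. A minimal sketch follows, assuming a pose estimate is already available per frame; the threshold values and the hysteresis band (re-arming only after the user retreats) are illustrative choices of ours, not taken from the original setup.

```python
import math


def marker_distance(tvec):
    """Euclidean distance from the camera to the marker, given the
    translation vector of the estimated marker pose."""
    return math.sqrt(sum(c * c for c in tvec))


class DistanceTrigger:
    """Fire a launch action once the viewer comes within `activate_m`
    of the marker; re-arm only after they retreat past `release_m`.

    The two thresholds form a simple hysteresis band, so the action
    does not flicker when the user hovers near the boundary.
    """

    def __init__(self, activate_m=1.0, release_m=1.5):
        self.activate_m = activate_m
        self.release_m = release_m
        self._armed = True

    def update(self, tvec):
        """Call once per frame; returns True exactly once per approach."""
        d = marker_distance(tvec)
        if self._armed and d <= self.activate_m:
            self._armed = False  # fired: wait for retreat before re-arming
            return True
        if d >= self.release_m:
            self._armed = True
        return False
```

In a setup like Cuthberth 1.0, the ghost effect would launch once when someone walks up to the marker, then stay quiet until they have walked away again.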
With: Paul Hogan (OU), Fridolin Wild, Peter Scott (OU), Chris Valentine (OU)