We push the envelope in Human-Computer Interaction with wearables and other forms of pervasive computing. New modalities require new aesthetics and ergonomics. For example, hands-free navigation through physical reality using smart glasses requires new forms of interaction.
26 participants from five universities across Europe took part in the first School of AR, organised by the AR-for-EU consortium. All classes were live-streamed, with students following from Norway, Russia, the UK, and Germany.
The School of AR provided an introduction to Augmented Reality, with an emphasis on designing and developing Augmented Reality applications. Lectures covered the theoretical background of the newest hardware and software, and students could apply their new knowledge in practical workshops with hands-on access to smart glasses. Guest speakers provided a first-hand update on industrial applications of AR.
The course module comprised a mix of lectures and practical tutorials delivered by 12 teachers from the partner universities:
- Lecture 1 Introduction to AR, Dr Fridolin Wild
- Lecture 2 HCI methodologies, Alla Vovk
- Lecture 3 Perception, Alla Vovk
- Tutorial 1 Modelling AR UI/UX, Alla Vovk
- Lecture 4 Software Engineering, PD Dr Ralf Klamma
- Tutorial 2 New Business Development, PD Dr Ralf Klamma
- Lecture 5 Technology Overview, Dr Fridolin Wild
- Tutorial 3 Markers, Jazz Rasool
- Lecture 6 Geometric Algebra, Dr Carlos Fresnada Portillo
- Lecture 7 Storytelling with AR, Carl Smith, Jim Hensman
- Tutorial 4 3D modelling, Mark Ransley
- Tutorial 5 Gesture interaction, Will Guest
- Tutorial 6 Gaze interaction, Joshua Secretan
- Tutorial 7 Voice interaction, Joshua Secretan
- Tutorial 8 Spatial Understanding, Will Guest
- Tutorial 9 3D scan and animation, Yu Huang
- Lecture 8 Careers in AR, Joanna Jesionkowska
- Lecture 9 Design Inspiration, Dave Hamblin
- Lecture 10 Research Directions, Carl Smith, Dr Fridolin Wild
After a week of intensive classes, students went home with ideas for their projects. They now have nine weeks to design, develop, and evaluate their own Augmented Reality applications.
Wearable technologies – such as smart watches, smart glasses, smart objects, smart earbuds, or smart garments – are just starting to bring immersive user experiences into formal education and learning at the workplace. These devices are body-worn, equipped with sensors, and integrate conveniently into leisure and work-related activities, including the physical movements of their users.
Wearable Enhanced Learning (WELL) is emerging as a new discipline in technology-enhanced learning, in combination with other relevant trends such as the transformation of classrooms, new mobility concepts, multi-modal learning analytics, and cyber-physical systems. Wearable devices play an integral role in the digital transformation of industrial and logistics processes in Industry 4.0 and thus demand new learning and training concepts such as experience capturing, re-enactment, and smart human-computer interaction.
This special track proposal is an offspring of the SIG WELL (http://ea-tel.eu/special-interest-groups/well/) in the context of the European Association for Technology Enhanced Learning (EATEL). It is a follow-up to the sessions we held at iLRN 2015 in Prague and iLRN 2017 in Coimbra.
In the meantime, the SIG has successfully organized a number of similar events at major research conferences and business-oriented fairs such as EC-TEL, I-KNOW, and Online Educa Berlin (OEB). Moreover, the SIG has been involved in securing substantial research funds through the H2020 project WEKIT (www.wekit.eu). The SIG would like to use this opportunity to present itself as a platform for scientific and industrial knowledge exchange; it is supported by EATEL and major EU research projects and networks in the field. Moreover, we will seek to attach an IEEE Standards Association community meeting of the working group on Augmented Reality Learning Experience Models (IEEE ARLEM).
List of Topics
- Industry 4.0 and wearable enhanced learning
- Immersive Learning Analytics for wearable technologies
- Wearable technologies for health and fitness
- Wearable technologies and affective computing
- Technology-Enhanced Learning applications of smart glasses, watches, armbands
- Learning context and activity recognition for wearable enhanced learning
- Body-area learning networks with wearable technologies
- Data collection from wearables
- Feedback from wearables, biofeedback
- Learning designs with wearable technologies
- Learning designs with Augmented Reality
- Ad hoc learning with wearables
- Micro learning with wearables
- Security and privacy for wearable enhanced learning
- Collaborative wearable enhanced learning
- Development methods for wearable enhanced learning
Submitted papers must follow the same guidelines as the main conference submissions. Please visit https://immersivelrn.org/ilrn2019/authors-info/ for guidelines and templates. To submit a paper to this special track, please use the submission system at https://www.easychair.org/conferences/?conf=ilrn2019, log in with an account or register, and select the track “ST6: Wearable Technology Enhanced Learning” to add your submission.
Special Track Chairs
- Ilona Buchem, Beuth University of Applied Sciences Berlin, Germany
- Ralf Klamma, RWTH Aachen University, Germany
- Fridolin Wild, Oxford Brookes University, UK
- Mikhail Fominykh, Norwegian University of Science and Technology, Norway
Tentative Program Committee (t.b.c.)
- Mario Aehnelt, Fraunhofer IGD Rostock, Germany
- Davinia Hernández-Leo, Universitat Pompeu Fabra, Spain
- Carlos Delgado Kloos, UC3M, Spain
- Elisabetta Parodi, Lattanzio Learning Spa, Italy
- Carlo Vizzi, Altec, Italy
- Mar Perez Sangustin, Pontificia Universidad Católica de Chile, Chile
- Isa Jahnke, University of Missouri-Columbia, USA
- Jos Flores, MIT, USA
- Puneet Sharma, Norwegian University of Science and Technology, Norway
- Yishay Mor, Levinsky College of Education, Israel
- Tobias Ley, Tallinn University, Estonia
- Peter Scott, Sydney University of Technology, Australia
- Victor Alvarez, University of Oviedo, Spain
- Agnes Kukulska-Hulme, The Open University, UK
- Carl Smith, Ravensbourne University, UK
- Victoria Pammer-Schindler, Graz University of Technology & Know-Center Graz, Austria
- Christoph Igel, CeLTech, Germany
- Peter Mörtel, Virtual Vehicle, Austria
- Brenda Bannan, George Mason University, USA
- Christine Perey, Perey Consulting, Switzerland
- Kaj Helin, VTT, Finland
- Jana Pejoska, Aalto, Finland
- Jaakko Karjalainen, VTT, Finland
- Joris Klerxx, KU Leuven, Belgium
- Marcus Specht, Open University, Netherlands
- Roland Klemke, Open University, Netherlands
- Will Guest, Oxford Brookes University, UK
For more information, please contact Ralf Klamma (email@example.com)
On entering Audiomotion Studios, visitors encounter a vast space for motion capture (MoCap) with over 160 Vicon cameras mounted on the rigs (see picture). The Audiomotion studio is the largest performance capture stage in Europe. The movements of multiple actors and animals can be recorded simultaneously to produce accurate animations. Behind the MoCap space, there are several rooms with green screens and a motion control crane for filming. PAL’s PhD candidate Yu Huang, Dr Fridolin Wild, and John Twycross went to see Brian Mitchell, the Managing Director, to explore possibilities for further collaboration on volumetric video capture.
Just in time for Halloween, we have finalised work on a major release of ‘WEKIT.one’, our next-generation app for wearable experiences in knowledge-intensive training.
The development of the experience capturing software has been led by members of the Performance Augmentation Lab. It is one of the first tools that allows content to be authored entirely within AR. Using a HoloLens and other wearable sensors, the software guides experts through recording immersive training procedures using all available AR content. Blending 2D and 3D instruction into the workplace creates a far richer and more interactive training experience.
The expert works through the procedure, capturing their actions, thoughts, and guiding instruction step by step. We are able to capture their movement in and around the workplace, their hand positions, and even some additional biophysical signals, such as heart rate variability or galvanic skin resistance. With just the technology at hand, trainees can now visualise the expert, listen to live guidance, and have access to on-demand knowledge about the task at hand.
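To make the idea concrete, a capture like the one described above could bundle the per-step instruction with the recorded movement and biophysical streams. The sketch below is purely illustrative: the class and field names are hypothetical and do not reflect the actual WEKIT.one data model.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical sketch only; not the WEKIT.one schema.

@dataclass
class BioSample:
    timestamp: float               # seconds since start of recording
    heart_rate_variability: float  # e.g. RMSSD, in milliseconds
    skin_resistance: float         # galvanic skin resistance, in kilo-ohms

@dataclass
class CapturedStep:
    step_id: int
    instruction: str  # the expert's guiding instruction for this step
    hand_positions: List[Tuple[float, float, float]] = field(default_factory=list)
    bio_samples: List[BioSample] = field(default_factory=list)

    def mean_hrv(self) -> float:
        """Average HRV over the step, or 0.0 if no samples were recorded."""
        if not self.bio_samples:
            return 0.0
        return sum(s.heart_rate_variability for s in self.bio_samples) / len(self.bio_samples)

# Example: one step with two biophysical samples
step = CapturedStep(step_id=1, instruction="Loosen the inspection panel screws.")
step.bio_samples.append(BioSample(0.0, 42.0, 150.0))
step.bio_samples.append(BioSample(1.0, 48.0, 148.0))
print(step.mean_hrv())  # prints 45.0
```

A structure along these lines lets a re-enactment player replay the expert's instruction and movement step by step while summarising the accompanying biosignals.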
So far, we have seen experts in the fields of aircraft maintenance, radiology, and astronaut training use this software, and in 2019 we aim to establish new collaborations within the university, within Oxford, and abroad – most imminently with the European Space Agency.
Last week at the Augmented World Expo in Munich, we exhibited the WEKIT project, showcasing the breakthrough achievements of our augmented reality and wearables solution in the space industry, aviation, and medicine. On stand #217, we exhibited the different versions of the e-textile garment (and its underlying sensor harness) as well as the WEKIT.one software solution. The director of our lab, Dr Fridolin Wild, gave a keynote presentation in the enterprise track on AR experience capturing and sharing for Training 4.0, explaining the technologies of the project and the findings of the pilot trials reported so far in a series of articles and papers.
The Warwick Business School runs a Knowledge Innovation Network, and PAL’s Dr Fridolin Wild gave the opening keynote at this year’s autumn workshop, speaking about holographic training and other little wonders of Industry 4.0. In the talk, Dr Wild explored how smart glasses and wearable technologies can be used for knowledge-intensive training and as job performance aids, sharing the floor with Jeremy Dalton, head of VR/AR at PwC. The workshop included several case studies and demos, including from Severn Trent Water and Kazendi, as well as a guided tour of the Warwick Manufacturing Group Innovation Labs.
Dr Fridolin Wild, director of PAL, was invited to attend a showcase and networking meeting at the Microsoft HoloLens Lounge in London on May 10, 2018, as one of twelve universities invited. Microsoft shared some details about its Mixed Reality strategy and observations on the importance of academia as an enabler to industry, including an announcement of two new mixed reality apps (Remote Assist and Layout, both now in the store). The universities shared their research, with the demos taking place in the stylish HoloLens Lounge.
While presenting at the Future Tech Now show in London on April 5, Dr Fridolin Wild was able to slip into the new Tesla suit, experiencing the effects of electro-muscular stimulation (EMS) on his own body. The suit embeds muscle-stimulation pads and motion sensors, to be applied in anything from rehabilitation to gaming. “When you activate the six-pack pads, you can literally make people feel a subtle kick in the guts”, says Dr Wild. “You still feel a strange electric tingling on your skin, but once immersed in a simulation or game, this quickly fades into the background”, he continues. With the technology, it is also possible to make people move; see here for an earlier reflection on how people feel about their bodies being remote-controlled. The latest version of the suit also holds sensors for galvanic skin resistance and heat pads, promising new approaches to personalisation and adaptation.
Also exhibited at the show: electronic cocktails, which use the same principle of electrical stimulation, applied to the taste buds on the tongue, to turn soda water combined with fragrances into a virtual cocktail.
Dr. Fridolin Wild gave a TEDx talk on ‘reality as a medium’, speaking about truth, reality, and perception, and how we can hack into perception to actually ‘make’ reality. The talk will be available online soon.
Last week in Tallinn, Alla Vovk presented the paper “Affordances for Capturing and Re-enacting Expert Performance with Wearables” at the 12th European Conference on Technology Enhanced Learning (EC-TEL 2017), written by Will Guest, Fridolin Wild, Alla Vovk, Mikhail Fominykh, Bibeg Limbu, Roland Klemke, Puneet Sharma, Carl H Smith, Jazz Rasool, Soyeb Aswat, Kaj Helin, Daniele Di Mitri, and Jan Schneider. You can watch the presentation in 360° with a Q&A session.
The WEKIT.one prototype is a platform for immersive procedural training with wearable sensors and Augmented Reality. Focusing on the capture and re-enactment of human expertise, this work looks at the unique affordances of suitable hardware and software technologies. The practical challenges of interpreting expertise, selecting suitable sensors for its capture, and specifying the means to describe it and display it to the novice are of central significance here. We link affordances with hardware devices, discussing their alternatives, including the Microsoft HoloLens, Thalmic Labs MYO, Alex Posture sensor, MyndPlay EEG headband, and a heart rate sensor. Following the selection of sensors, we describe integration and communication requirements for the prototype. We close with thoughts on the wider possibilities for implementation and next steps.