
Engaging User Experience with Wearable and Pervasive Computing

We push the envelope in Human Computer Interaction with wearables and other forms of pervasive computing. New modalities require new aesthetics and ergonomics. For example, hands-free navigation through tangible reality using smart glasses demands new forms of interaction.

Special Track on Wearable Technology Enhanced Learning (@iLRN’19)

Fridolin Wild : 27th November 2018 6:57 pm : Augmented Reality, Wearable Computing

Wearable technologies – such as smart watches, smart glasses, smart objects, smart earbuds, or smart garments – are just starting to bring immersive user experiences into formal education and learning at the workplace. These devices are body-worn, equipped with sensors, and integrate conveniently into leisure and work-related activities, including the physical movements of their users.

Wearable Enhanced Learning (WELL) is beginning to emerge as a new discipline in technology-enhanced learning, in combination with other relevant trends such as the transformation of classrooms, new mobility concepts, multi-modal learning analytics, and cyber-physical systems. Wearable devices play an integral role in the digital transformation of industrial and logistics processes in Industry 4.0 and thus demand new learning and training concepts such as experience capturing, re-enactment, and smart human-computer interaction.

This special track proposal is an offspring of the SIG WELL (http://ea-tel.eu/special-interest-groups/well/) in the context of the European Association for Technology Enhanced Learning (EATEL). It follows up on the sessions we held at iLRN 2015 in Prague and iLRN 2017 in Coimbra.

In the meantime, the SIG has successfully organised a number of similar events at major research conferences and business-oriented fairs, including EC-TEL, I-KNOW, and the Online Educa Berlin (OEB). Moreover, the SIG has been involved in securing substantial research funds through the H2020 project WEKIT (www.wekit.eu). The SIG would like to use this opportunity to present itself as a platform for scientific and industrial knowledge exchange; it is supported by EATEL and by major EU research projects and networks in the field. Moreover, we will seek to attach an IEEE Standards Association community meeting of the working group on Augmented Reality Learning Experience Models (IEEE ARLEM).

List of Topics

  • Industry 4.0 and wearable enhanced learning
  • Immersive Learning Analytics for wearable technologies
  • Wearable technologies for health and fitness
  • Wearable technologies and affective computing
  • Technology-Enhanced Learning applications of smart glasses, watches, armbands
  • Learning context and activity recognition for wearable enhanced learning
  • Body-area learning networks with wearable technologies
  • Data collection from wearables
  • Feedback from wearables, biofeedback
  • Learning designs with wearable technologies
  • Learning designs with Augmented Reality
  • Ad hoc learning with wearables
  • Micro learning with wearables
  • Security and privacy for wearable enhanced learning
  • Collaborative wearable enhanced learning
  • Development methods for wearable enhanced learning

Author Info

Submitted papers must follow the same guidelines as main conference submissions. Please visit https://immersivelrn.org/ilrn2019/authors-info/ for guidelines and templates. To submit a paper to this special track, please use the submission system at https://www.easychair.org/conferences/?conf=ilrn2019, log in with an account or register, and select the track “ST6: Wearable Technology Enhanced Learning” to add your submission.

Special Track Chairs

  • Ilona Buchem, Beuth University of Applied Sciences Berlin, Germany
  • Ralf Klamma, RWTH Aachen University, Germany
  • Fridolin Wild, Oxford Brookes University, UK
  • Mikhail Fominykh, Norwegian University of Science and Technology, Norway

Tentative Program Committee (t.b.c.)

  • Mario Aehnelt, Fraunhofer IGD Rostock, Germany
  • Davinia Hernández-Leo, Universitat Pompeu Fabra, Spain
  • Carlos Delgado Kloos, UC3M, Spain
  • Elisabetta Parodi, Lattanzio Learning Spa, Italy
  • Carlo Vizzi, Altec, Italy
  • Mar Pérez-Sanagustín, Pontificia Universidad Católica de Chile, Chile
  • Isa Jahnke, University of Missouri-Columbia, USA
  • Jos Flores, MIT, USA
  • Puneet Sharma, Norwegian University of Science and Technology, Norway
  • Yishay Mor, Levinsky College of Education, Israel
  • Tobias Ley, Tallinn University, Estonia
  • Peter Scott, University of Technology Sydney, Australia
  • Victor Alvarez, University of Oviedo, Spain
  • Agnes Kukulska-Hulme, The Open University, UK
  • Carl Smith, Ravensbourne University, UK
  • Victoria Pammer-Schindler, Graz University of Technology & Know-Center Graz, Austria
  • Christoph Igel, CeLTech, Germany
  • Peter Mörtel, Virtual Vehicle, Austria
  • Brenda Bannan, George Mason University, USA
  • Christine Perey, Perey Consulting, Switzerland
  • Kaj Helin, VTT, Finland
  • Jana Pejoska, Aalto, Finland
  • Jaakko Karjalainen, VTT, Finland
  • Joris Klerkx, KU Leuven, Belgium
  • Marcus Specht, Open University, Netherlands
  • Roland Klemke, Open University, Netherlands
  • Will Guest, Oxford Brookes University, UK

Contact

For more information, please contact Ralf Klamma (klamma@dbis.rwth-aachen.de).


Visit to Audiomotion Studios

Yu Huang : 7th November 2018 5:07 pm : Augmented Reality, Wearable Computing

On entering Audiomotion Studios, visitors encounter a ginormous motion capture (MoCap) space with over 160 Vicon cameras mounted on the rigs (see picture). Audiomotion is the largest performance capture stage in Europe: a large number of actor and animal movements can be recorded simultaneously to produce accurate animations. Behind the MoCap space are several rooms with green screens and a motion control crane for filming. PAL’s PhD candidate Yu Huang, Dr Fridolin Wild, and John Twycross went to see Brian Mitchell, the Managing Director, to explore possibilities for further collaboration on volumetric video capture.


WEKIT.one Halloween Release Candidate

Will Guest : 31st October 2018 8:00 pm : Augmented Reality, Performance Analytics, Wearable Computing

Just in time for Halloween, we have finalised work on a major release of WEKIT.one, our next-generation app for wearable experiences in knowledge-intensive training.

The development of the experience capturing software has been led by members of the Performance Augmentation Lab. It is one of the first such tools to allow content authoring to be done entirely within AR. Using a HoloLens and other wearable sensors, the software guides experts through recording immersive training procedures using all available AR content. Blending 2D and 3D instruction into the workplace creates a far richer and more interactive training experience.
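As an illustration only, such a captured procedure can be modelled as an ordered list of steps, each anchored in the workspace and carrying its AR content. A minimal sketch in Python; the class and field names here are our own illustration, not the actual WEKIT.one data model:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TrainingStep:
    """One step of a captured procedure, anchored in the workplace."""
    instruction: str                     # the expert's guidance for this step
    anchor: Tuple[float, float, float]   # (x, y, z) position in the workspace, metres
    media: List[str] = field(default_factory=list)  # 2D/3D AR content to display

@dataclass
class Procedure:
    title: str
    steps: List[TrainingStep] = field(default_factory=list)

procedure = Procedure("Replace air filter", [
    TrainingStep("Unscrew the housing cover", (0.4, 1.1, 0.2), ["cover.glb"]),
    TrainingStep("Lift out the old filter", (0.4, 1.0, 0.2)),
])
print(len(procedure.steps))  # 2
```

A trainee's playback app would then step through `procedure.steps` in order, placing each step's media at its anchor.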

The expert works through the procedure, capturing their actions, thoughts, and guiding instruction step by step. We are able to capture their movement in and around the workplace, their hand positions, and even some additional biophysical signals, such as heart rate variability or galvanic skin resistance. With just the technology at hand, trainees can now visualise the expert, listen to live guidance, and have access to on-demand knowledge about the task at hand.
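To give a flavour of how such a biophysical signal might be processed: heart rate variability is commonly summarised with RMSSD, the root mean square of successive differences between heartbeat (RR) intervals. A minimal sketch, assuming RR intervals in milliseconds, and not the actual WEKIT.one pipeline:

```python
from math import sqrt

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between RR intervals
    (a common time-domain heart rate variability measure)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return sqrt(sum(d * d for d in diffs) / len(diffs))

# Example RR intervals in milliseconds, e.g. from a chest-strap sensor
print(round(rmssd([812, 790, 805, 830, 795]), 1))  # 25.3
```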

To date, we have seen experts in aircraft maintenance, radiology, and astronaut training use this software in the field; in 2019, we aim to establish new collaborations within the university, within Oxford, and abroad – most imminently with the European Space Agency.


Augmented World Expo

Fridolin Wild : 21st October 2018 4:41 pm : Augmented Reality, Wearable Computing

At the Augmented World Expo in Munich last week, we exhibited the WEKIT project, showcasing the breakthrough achievements of our augmented reality and wearables solution in the space industry, aviation, and medicine. On stand #217, we presented the different versions of the e-textile garment (and its underlying sensor harness) as well as the WEKIT.one software solution. The director of our lab, Dr Fridolin Wild, gave a keynote presentation in the enterprise track on AR experience capturing and sharing for Training 4.0, explaining the technologies of the project and the findings of the pilot trials reported so far in a series of articles and papers.


Experiencing the Future with Immersive Technologies

Fridolin Wild : 11th September 2018 5:13 pm : Augmented Reality, Lecture, User Interface, Wearable Computing

The Warwick Business School runs a Knowledge Innovation Network, and PAL’s Dr Fridolin Wild gave the opening keynote at this year’s autumn workshop, speaking about holographic training and other little wonders of Industry 4.0. In the talk, Dr Wild explored how smart glasses and wearable technologies can be used for knowledge-intensive training and as job performance aids, sharing the floor with Jeremy Dalton, head of VR/AR at PwC. The workshop included several case studies and demos, including from Severn Trent Water and Kazendi, as well as a guided tour of the Warwick Manufacturing Group Innovation Labs.


Microsoft Mixed Reality Academic Showcase & Networking

Fridolin Wild : 10th May 2018 4:48 pm : Augmented Reality, Wearable Computing

Dr Fridolin Wild, director of PAL, attended a showcase and networking meeting at the Microsoft HoloLens Lounge in London on May 10, 2018, as one of twelve invited universities. Microsoft shared some details of its Mixed Reality strategy and observations on the importance of academia as an enabler to industry, including an announcement of two new mixed reality apps (Remote Assist and Layout, both now in the store). The universities presented their research, with demos taking place in the stylish HoloLens Lounge.


Musings from the Future Tech Now Show

Fridolin Wild : 10th April 2018 4:25 pm : Wearable Computing

While presenting at the Future Tech Now Show in London on April 5, Dr Fridolin Wild was able to slip into the new Tesla suit, experiencing the effects of electro-muscular stimulation (EMS) on his own body. The suit embeds muscle-stimulation pads and motion sensors, to be applied in anything from rehabilitation to gaming. “When you activate the six-pack pads, you literally can make people feel a subtle kick in the guts”, said Dr Wild. “You still feel a strange electric tingling on your skin, but once immersed in a simulation or game, this quickly fades into the background”, he continued. The technology can also make people move; see here for an earlier reflection on how people feel about their bodies being remote-controlled. The latest version of the suit also holds sensors for galvanic skin resistance and heat pads, promising new approaches to personalisation and adaptation.

Also exhibited at the show: electronic cocktails, which use the same principle of electrical stimulation – this time of the taste buds on the tongue – to turn soda water, combined with fragrances, into a virtual cocktail.


TEDx talk

Fridolin Wild : 31st October 2017 3:19 pm : Augmented Reality, Wearable Computing

Dr Fridolin Wild gave a TEDx talk on ‘reality as a medium’, speaking about truth, reality, and perception, and how we can hack into perception to actually ‘make’ reality. The talk will be available online soon.


EC-TEL 2017 paper presentation in 360

Alla Vovk : 21st September 2017 10:49 am : Augmented Reality, Performance Analytics, Wearable Computing


Last week in Tallinn, Alla Vovk presented the paper “Affordances for Capturing and Re-enacting Expert Performance with Wearables” at the 12th European Conference on Technology Enhanced Learning (EC-TEL 2017), written by Will Guest, Fridolin Wild, Alla Vovk, Mikhail Fominykh, Bibeg Limbu, Roland Klemke, Puneet Sharma, Carl H Smith, Jazz Rasool, Soyeb Aswat, Kaj Helin, Daniele Di Mitri, and Jan Schneider. You can watch the presentation in 360°, with a Q&A session.

Abstract

The WEKIT.one prototype is a platform for immersive procedural training with wearable sensors and Augmented Reality. Focusing on the capture and re-enactment of human expertise, this work looks at the unique affordances of suitable hardware and software technologies. The practical challenges of interpreting expertise, using suitable sensors for its capture, and specifying the means to describe it and display it to the novice are of central significance here. We link affordances with hardware devices, discussing their alternatives, including the Microsoft Hololens, Thalmic Labs MYO, Alex posture sensor, MyndPlay EEG headband, and a heart rate sensor. Following the selection of sensors, we describe integration and communication requirements for the prototype. We close with thoughts on the wider possibilities for implementation and next steps.



Artists in residence: Deval and Losseau

Fridolin Wild : 20th September 2017 2:25 pm : Augmented Reality, Wearable Computing
Via the Horizon 2020 funded Vertigo project, we will receive artists in residence to work with our WEKIT project. Vertigo aims to catalyse new synergies between artists, cultural institutions, R&D projects in ICT, companies, incubators, and funds. From December 2017 to October 2018, we will host two artists, Yann Deval and Marie-Ghislaine Losseau, to work with us on exploring and investigating the new aesthetics, design, and interaction principles for ‘reality 2.0’ made possible by the advent of smart AR glasses. Yann and Marie-Ghislaine will use AR glasses as a medium of expression, creating a holographic exhibit, ‘ATLAS’.
ATLAS is a work between digital arts and visual arts, in the form of an interactive, scenographic exhibition mixing real and virtual worlds. Situated in an archipelago of poetic islands, spectators are invited to build a city: using a ‘seed launcher’, they grow houses following urbanistic rules, with smart homes adapting to the environment created – cities in the cloud, uprooted cities, cities on stilts, flying cities.
Yann Deval
Interactive designer, motion designer, and musical composer. After studying the history of cinema at La Sorbonne (Paris) and editing and audio-visual post-production in Cannes, he settled in Brussels in 2006, where he developed his activities as a motion designer and VFX artist. He works for the film industry (Mood Indigo by Michel Gondry, The Brand New Testament by Jaco Van Dormael), music videos (Puggy, Sacha Toorop), documentaries for Arte, and TV shows for France Television. He occasionally trains professionals and students in digital creation workshops (School Arts2 Mons, EMMD Motion Design Brussels). Between 2012 and 2017, he co-directed the virtual reality performance IMMERSIO, a mix of live music and digital arts played in a rich set of venues (SAT Montreal, ADAF Athens, SignalOFF Prague, Wisp Festival Leipzig, Bozar and Halles de Schaerbeek Brussels).
Marie-Ghislaine Losseau
Scenographer and visual art designer. She studied scenography at La Cambre (Brussels) and visual arts at ISPG Brussels. Her work centres on scenography, visual installations, and the organisation of workshops with children and adults.
