Workshops Program

Access to workshops: Please follow these instructions to log into the virtual event platform. Once logged in, navigate to the desired workshop session under the Program section of the home page. On the right-hand side of the workshop page, there will be a Join button. This button will become available (clickable) 15 minutes before the session starts. A recording of each workshop will be available within 48 hours after the live session through the workshop’s page on the virtual event platform.

Schedule (all times EDT):

Tuesday, June 29, 11:00-14:00: Localised Surface Haptics: Issues and Solutions
Wednesday, June 30, 9:30-14:00: NeuroHaptics: Touch with the Brain
Friday, July 2, 11:00-14:00: Multimodal Augmentation of Haptic Touch Input
Monday, July 5, 11:00-14:00: Touch Tools: Bringing Social-Sensorial Tools into Digital Touch Design
Monday, July 12, 8:00-11:00: Affective Haptics for Enhanced XR
Tuesday, July 13, 8:00-11:00: Bespoke Digital Tailoring in Haptics
Thursday, July 15, 8:00-14:00: Tactile Representation of Motion and Space

You can also find the detailed program on X-CD, our virtual conference platform.


Localised Surface Haptics: Issues and Solutions

Time: June 29, 11:00-14:00 EDT

Website: https://davidgueorguiev.wixsite.com/whc21workshop

Organizers:

David Gueorguiev, Sorbonne université, CNRS, Institut des Systèmes Intelligents et de Robotique, ISIR, F-75005, Paris, France david.gueorguiev@isir.upmc.fr

Charles Hudin, Université Paris-Saclay, CEA, List, F-91120, Palaiseau, France charles.hudin@cea.fr

Thomas Daunizeau, Sorbonne université, CNRS, Institut des Systèmes Intelligents et de Robotique, ISIR, F-75005, Paris, France thomas.daunizeau@sorbonne-universite.fr

Vincent Hayward, Sorbonne université, CNRS, Institut des Systèmes Intelligents et de Robotique, ISIR, F-75005, Paris, France vincent.hayward@sorbonne-universite.fr

Abstract:

Many common gestures on smartphones and tablets are performed with multiple fingers. The need is even more acute for assistive haptic surfaces for sensory-impaired people, who typically use complex bimanual interactions with tactile screens. A major challenge in the field of surface haptics is therefore the ability to produce localised stimuli, i.e., to independently stimulate multiple skin areas in simultaneous contact with the same surface. This capability is necessary to leverage the extended contact possibilities of multi-finger or whole-hand gestures to communicate more expressive and informative feedback.

Surface haptics devices are increasingly able to render a wealth of sensations such as textures, compliances, bumps, or vibrotactile messages. Although different actuation mechanisms such as vibrotactile stimulation, friction modulation, or net force production are targeted, most approaches ultimately rely on vibrations, whose propagative nature complicates the localisation of haptic features on a surface. Presentations will explore the state of the art and potential solutions to this problem, such as wavefield shaping, wave focusing, or confinement.

In addition to technical challenges, localised surface haptics also face perception issues. Because of the active nature of touch, there is a complex and astonishing interplay between the stimulus and its exploration by the hand that influences our perceptual representation. For example, masking reduces our ability to distinguish which areas of the surface are activated, summation (spatial and temporal) affects the perceived intensity, gating modulates the tactile transmission to the brain, and funnelling yields sensations of phantom sources. All those perceptual effects and illusions are either detrimental or beneficial to the perceived localisation of stimuli and must be accounted for.

Overall, the proposed workshop will thus review technical and cognitive issues related to localising haptic perception on tactile displays and discuss solutions for designing localised surface devices.


NeuroHaptics: Touch with the Brain

Time: June 30, 9:30-14:00 EDT

Website: https://sites.google.com/nyu.edu/neurohaptics21/home

Organizers:

Prof. S. Farokh Atashzar, New York University, USA

Prof. Mohamad Eid, New York University Abu Dhabi, United Arab Emirates

Abstract:

Neurohaptics, bringing new perspectives on the synergy between haptics and neuroscience, has emerged as a field of study that strives to understand the complex neural representations evoked in response to touch stimuli. Advancements in haptic technologies (such as wearable haptics, rehabilitation robotics, and neurorobotics combined with virtual and augmented reality), as well as in brain-scanning technologies, have powered an accelerated surge in neuroscience research focusing on various modalities of haptics. A fundamental challenge in haptic research is the reliance on subjective self-reports or behavioral assessments, which limits the generalizability of the conclusions, increases the uncertainty and variability of the assessments, leaves them susceptible to cognitive, memory, and communication barriers, and makes them sensitive to the timing of answer collection.

A fundamentally different approach to evaluate human haptic experience is to directly measure and encode brain activities while users interact with a haptic device. More recent efforts are focused on studying central neural responses to shed light on the neurophysiology of haptics and to better understand the functionality of the human nervous system related to haptics. In order to measure brain activities, electroencephalography (EEG) and functional MRI (fMRI) have attracted a great deal of interest in recent studies to probe human neural functions during haptics exploration and experiments. This will eventually allow for a better understanding of haptics and subtle underlying mechanisms that could not be detected using subjective methods.

This workshop seeks to present and discuss the most recent novel efforts related to neuroscientific models for the human sense of touch and the combination of haptics and advanced neurorehabilitation robotics technologies. The goal is to bring together diverse leading researchers and young investigators from both haptics and neuroscience in order to explore how to maximize the progress in this multidisciplinary area and inform and further accelerate research in both the haptics and neuroscience communities.

In this workshop, we aim at encouraging an interactive and interdisciplinary dialog between leading researchers, young researchers, and the clinical and industrial sectors. Topics to be covered in this workshop include, but are not limited to:

  • Haptics in AR/VR and Neuro-representation
  • Tactile Communications and Neuroscience
  • Tactile perception and Neuroscience
  • Brain-Computer Interface and Haptics
  • Neurorobotics and haptics
  • Neurorehabilitation and Haptics
  • Haptics-enabled neurorehabilitation robotics
  • Haptics communication and mutual adaptation

Multimodal Augmentation of Haptic Touch Input

Time: July 2, 11:00-14:00 EDT

Website: https://multitouch-itn.eu/multimodal

Organizers:

Dr Frédéric Giraud, University of Lille, frederic.giraud@univ-lille.fr

Pr Monica Gori, Istituto Italiano di Tecnologia, Genova, monica.gori@iit.it

Pr Olivier Collignon, Université Catholique de Louvain, olivier.collignon@uclouvain.be

Pr Radu-Daniel Vatavu, University of Suceava, radu.vatavu@usm.ro

Abstract:

The question of introducing more multimodal haptic feedback into consumer products is becoming crucial today, with the advent of a society increasingly focused on digital solutions. Paradoxically, while devices are now accessible to more people, a segment of the population is excluded from these digital developments: elderly individuals who struggle to use touch screens, and visually- or hearing-impaired individuals. Indeed, computers and other devices provide information to users almost exclusively through visual and auditory feedback. How information conveyed by the different senses integrates into a unified multisensory experience has been studied extensively. The brain integrates these inputs to reinforce information, especially when sensory signals become less reliable. Most studies on tactile-visual and tactile-auditory integration have been carried out in passive conditions, i.e., conditions where the sensory inputs are not generated by active finger interactions with the environment. But recent advances in haptic surfaces now offer the possibility to study cross-modal interaction with regard to body engagement, opening new perspectives for integrating multimodal haptic feedback.

The goal of this workshop is to present the state of the art in the field of multimodal touch devices through a series of talks by experts. We will cover topics tied to Neuroscience, Psychophysics, Human-Computer Interaction, and Engineering.


Touch Tools: Bringing Social-Sensorial Tools into Digital Touch Design

Time: July 5, 11:00-14:00 EDT

Website: https://intouchworldhaptics21.wordpress.com/

Organizers:

Dr Kerstin Leder Mackley, UCL Knowledge Lab, University College London, k.ledermackley@ucl.ac.uk

Prof. Sara Price, UCL Knowledge Lab, University College London, sara.price@ucl.ac.uk 

Prof. Carey Jewitt, UCL Knowledge Lab, University College London, c.jewitt@ucl.ac.uk

Lili Golmohammadi, UCL Knowledge Lab, University College London, lili.golmohammadi.18@ucl.ac.uk 

Abstract:

This workshop aims to engage participants with new ways of thinking about the design of digital touch interaction and communication applications, through a focus on the use and development of social-sensorial ‘touch tools’. In so doing, it aims to bring Social Science perspectives into haptics design, traversing interdisciplinary boundaries and enabling new approaches across Computer Sciences, Engineering, HCI, and Design.

The term ‘digital touch’ has come to comprise a broad group of emerging technologies or interfaces that digitally mediate or deliver touch sensations. Technological advances in wearables, robotics and haptics bring with them novel opportunities for tactile interaction, new tactile communication systems and vocabularies. These developments often focus on the effectiveness of tactile communication and its affective implications. For a variety of technical and practical reasons, research and development in this area have largely occurred in labs and experimental settings. These contexts lend themselves to the use of established social categories and parameters (e.g. taking into account, or making assumptions about, users’ sex, age, relations). This workshop looks beyond psychological, psychophysical and neuroscientific interpretations of ‘social touch’. It seeks to broaden and differently illuminate concepts of the social and sensorial in digital touch development and design in the fields of HCI and Interaction Design. Through an engagement with novel design tools, it aims to open up this space to the diversity of bodily experiences, the complexity of interactional contexts, and the different possible meanings of touch for interaction and communication.

In the workshop, we will explore ways to achieve this through an engagement with existing ‘touch tools’, that is, design tools which seek to inspire, provoke, interrogate, and reflect on the social-sensorial dimensions of touch and digital touch as part of the design process. The tools promote speculative thinking to enable developers, engineers and designers to step in and out of concrete user contexts, interrogating human-to-human and human-to-machine (or other) relations in the process.

Three short provocations by interdisciplinary scholars and industry experts will set the scene for the workshop, outlining the need for touch tools with reference to the current digital touch landscape, existing concepts, prototypes and design processes. These will be followed by an introduction to specific examples of touch tools and an opportunity for participants’ hands-on engagement with these. Specifically, small-group breakout sessions will enable hands-on activities with two (virtual) card-based tools and a set of searchable haptic content tools. Collaborative discussion around ideas and concepts inspired through the workshop activities will aim to consolidate insights and identify future opportunities for touch tools.

The workshop will provide participants with the opportunity to engage in new ways of thinking about digital touch design. It will bring the social and sensory aspects into play; generate insights into ongoing design challenges for digital touch applications; create opportunities for developing cross-disciplinary collaborations around touch tools; and work towards a future-facing agenda for tool development and wider dissemination.


Affective Haptics for Enhanced XR

Time: July 12, 8:00-11:00 EDT

Website: http://www.mouniaziat.com/WHC-Workshop/index.html

Organizers:

Dr. Mounia Ziat, Bentley University, mziat@bentley.edu

Abstract:

Our emotions play an important role in our interaction with the world. In real-life situations, they affect our lives positively or negatively. The replication of our world into virtual environments has been in human minds since as early as 1935, the year Stanley G. Weinbaum published his short story “Pygmalion’s Spectacles”, which described the first head-mounted display (HMD). On one hand, eighty-five years later, we have never been closer to Weinbaum’s vision. Technological progress on XR headsets, from both hardware and software perspectives, has been impressive and cutting edge. On the other hand, despite this technological advance, we are still far from Weinbaum’s ultimate vision of an immersive world that not only includes all five of our senses, but also provides us with qualia begotten of the virtual experience. Pick up a rose, smell its scent, feel the pricks of its thorns or the soft velvet of its petals on your finger. These sensations are what ground us in reality. Now, think about a rose in a virtual reality (VR) world. Despite a high-fidelity virtual rendering, your experience of a virtual rose would be completely different from that of a real one. Touching and smelling the rose would trigger emotions that would be hard to replicate in the virtual world. The experience would also differ in a Mixed Reality (MR) or Augmented Reality (AR) environment, depending on whether the user is interacting with a real rose that is virtually augmented. Different haptic technologies are being used to add tangibility to virtual entities in these Extended Reality (XR) worlds, thereby enhancing users’ immersion. Creating strong emotions remains one of the goals of XR. However, one obstacle that users face in reaching some sort of emotional quality is the lack of tangible interaction. This workshop focuses on the affective and emotional aspects of haptic technologies.
It is not enough to focus only on the reproduction of mechanical stimulation; it is also crucial to understand how haptic technology triggers emotions within XR worlds. Emotion classification remains a contested issue, as multiple models and several measures are available, and they are rarely without flaws. Speakers in this workshop will tackle the difficult task of understanding the complex map of emotions in touch, from basic affective feelings to more complex mental emotional constructions, by detailing their approaches and research to help provide haptic guidelines and considerations for designers about the affective dimensions of effective haptic-XR interaction.


Bespoke Digital Tailoring in Haptics – Theory and Implementation Using 3D Printers

Time: July 13, 8:00-11:00 EDT

Website: https://coi.sfc.keio.ac.jp/whc2021.html

Organizers:

Masashi Nakatani (Keio University) mn2598@sfc.keio.ac.jp 

Masataka Imura (Professor, School of Science and Technology, Kwansei Gakuin University) m.imura@kwansei.ac.jp          

Yoichi Yamazaki (Assistant Professor, School of Science and Technology / Research Center for Kansei Value Creation, Kwansei Gakuin University) y-yamazaki@kwansei.ac.jp   

Abstract:

3D printing technology enables us to fabricate composite shapes. Recent advances in 3D printing of composite structures also make it possible to fabricate materials with inner structures, so-called architected materials. Architected materials exhibit not only mechanical functions such as elasticity, durability, and damping capacity, but also aesthetic functions that make them haptically comfortable. In past research, user-study-based evaluation of tangible products has been the major methodology for studying product comfort. However, it would be more beneficial and time-effective if one could predict product comfort before the product is fabricated. In this panel workshop, we overview recent advances in 3D printing technology that can be applied to haptic science. In digital fabrication, one can easily manipulate the parameters that determine the composite shapes. Once quantitative data on the force-displacement relationships of fabricated materials with different shape parameters are collected, we may predict the force-displacement relationship of a material yet to be produced by 3D printers.

We also provide an idea of how the force-displacement relationship can be connected with the sensible aspects of the product by referencing past research on sensory evaluations. In computer graphics, the Bidirectional Reflectance Distribution Function (BRDF) is widely used to render photorealistic scenes. Similarly, we can define the Force-induced Displacement Distribution Function (FDDF for short) as a function of real variables that describes how displacement is produced by force applied to a tangible object. The FDDF is an input-output relationship analogous to the BRDF, and it can be measured from tangible objects using a calibrated force sensor and a linear actuator. Once the FDDF is measured from a real object, we can estimate the haptic perception of the material (e.g., softness) based on previous studies of softness perception. Once the FDDF of a target object is measured, in theory we can reconstruct a shape with the target FDDF value using 3D printing technology. The simplest case is to reconstitute the force-displacement relationship at the center of a tangible object. We will show several examples of FDDF reconstruction with various 3D printers.
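As a loose illustration of the idea (not the organizers' code), the simplest case above, a single force-displacement curve measured at one point on an object, can be fitted with a polynomial model so that softness-related quantities such as local stiffness (dF/dx) can be estimated and compared across fabricated samples. The sensor setup, material model, and numbers below are all hypothetical:

```python
# Minimal sketch: fit a measured force-displacement curve (a 1-D slice
# of the FDDF described above) and query its local stiffness.
import numpy as np

def fit_fd_curve(displacement_mm, force_n, degree=3):
    """Least-squares polynomial fit of F(x); returns coefficients."""
    return np.polynomial.polynomial.polyfit(displacement_mm, force_n, degree)

def stiffness_at(coeffs, x):
    """Local stiffness dF/dx (N/mm) of the fitted curve at displacement x."""
    dcoeffs = np.polynomial.polynomial.polyder(coeffs)
    return np.polynomial.polynomial.polyval(x, dcoeffs)

# Synthetic samples for a hypothetical, slightly stiffening material:
x = np.linspace(0.0, 2.0, 50)   # indentation depth (mm)
f = 1.5 * x + 0.4 * x**3        # measured force (N)

coeffs = fit_fd_curve(x, f)
k = stiffness_at(coeffs, 0.5)   # stiffness at 0.5 mm indentation
```

In practice, the displacement samples would come from the calibrated force sensor and linear actuator mentioned above, and the fitted curve could then serve as a target for reconstruction with different 3D-printed shape parameters.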

In the later part of the workshop, we also introduce an industrial application of tailor-made, digitally fabricated consumer products (e.g., shoe insoles) that can provide both mechanical and aesthetic functions for long-term use.


Tactile Representation of Motion and Space

Time: July 15, 8:00-14:00 EDT

Website: https://sites.google.com/view/tactile-representation-of-moti

Organizers:

MSc Eng, PhD Student, Gemma Carolina Bettelani, Research Center E. Piaggio, Dept. of Information Engineering, University of Pisa, Pisa, Italy, gemma.bettelani1@gmail.com

MD, PhD Student, Colleen Patricia Ryan, Dept. of Systems Medicine and Centre of Space Bio-Medicine, University of Rome “Tor Vergata”, Department of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy, colleenpatriciar@gmail.com

Assistant Professor, Matteo Bianchi, Research Center E. Piaggio, Dept. Information Engineering, University of Pisa, Pisa, Italy, matteo.bianchi@unipi.it

Assistant Professor, Alessandro Moscatelli, Dept. of Systems Medicine and Centre of Space Bio-Medicine, University of Rome “Tor Vergata”, Department of Neuromotor Physiology, IRCCS Santa Lucia Foundation, Rome, Italy, a.moscatelli@hsantalucia.it

Abstract:

Touch is an intrinsically active sense, in which sensory signals are acquired through purposive movements made to explore the world. Because of the active nature of touch, there is a complex and astonishing interplay between static (e.g., texture, softness) and dynamic (e.g., speed, vibrations) features of stimuli, which affects our perceptual representation of objects and our motor control of the hand. There is a bidirectional flow of information between the motor and the tactile system, which is evident at the hand level. More specifically, it is well known that hand movement is optimized to collect the maximum amount of information about object properties. At the same time, tactile cues play a crucial role in proprioception, motion perception, and motor control, as well as in the execution of manipulation tasks. This workshop aims to highlight the intertwined relationship between the motor and the somatosensory system, discussing future research directions for the investigation of the tactile representation of motion and space, and possible trans-disciplinary applications in advanced human-machine interaction.