Assistant Professor at University of Chicago (starting 2019)
My PhD: Interactive Systems Based on Electrical Muscle Stimulation.
In my research, I create interactive devices that actuate the user's body using computer-controlled electrical muscle stimulation. My devices fall into two main categories: (1) devices that increase immersion in virtual reality by simulating large forces, such as wind, physical impact, or walls and heavy objects; and (2) devices that allow users to access information eyes-free by means of their proprioceptive sense, such as the value of a variable, the use of a tool, or the shape of a plot. These interactive systems move past mobile and wearable computing in that they borrow parts of the user's body to interact with the user, resulting in devices that are not only exceptionally small, but that also implement a novel interaction model, in which devices overlap with the user's body.
I've published two overarching articles in IEEE magazines that provide a deeper discussion of my research and vision for interactive systems based on EMS: the idiosyncrasies of devices that overlap with the human body, a comparison to the state of the art in mechanically actuated devices, implications for the field of computing/HCI, and my vision:
Pedro Lopes and Patrick Baudisch
Magazine article in IEEE Computer, vol. 50, no. 10, 2017.
Pedro Lopes and Patrick Baudisch
Magazine article in IEEE Pervasive Computing, vol. 16, no. 3, 2017.
CHI & UIST full papers on Interactive Systems based on Electrical Muscle Stimulation
Pedro Lopes, Sijing You, Alexandra Ion, and Patrick Baudisch
Full paper at CHI'18, to appear
We present a mobile system that enhances mixed reality experiences, displayed on a Microsoft HoloLens, with force feedback by means of electrical muscle stimulation (EMS). The benefit of our approach is that it adds physical forces while keeping the users’ hands free to interact unencumbered—not only with virtual objects, but also with physical objects, such as props and appliances that are an integral part of both virtual and real worlds. [more soon]
Pedro Lopes, Sijing You, Lung-Pan Cheng, Sebastian Marwecki and Patrick Baudisch
Full paper at CHI'17, p1471-1482 & demo at SIGGRAPH'17
We explored how to add haptics to walls and other heavy objects in virtual reality. Our contribution is that we prevent the user’s hands from penetrating virtual objects by means of electrical muscle stimulation (EMS). As the shown user lifts a virtual cube, our system lets the user feel the weight and resistance of the cube. The heavier the cube and the harder the user presses the cube, the stronger a counterforce the system generates. [more]
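The mapping above (heavier cube plus harder press yields a stronger counterforce) can be pictured as a simple clamped control law. The Python sketch below is my own illustrative reduction; the function names, the linear blend, and all parameter values are assumptions, not the paper's actual calibration:

```python
def ems_counterforce(cube_weight_kg, press_force_n,
                     max_weight_kg=5.0, max_press_n=20.0,
                     max_intensity_ma=30.0):
    """Map cube weight and the user's press force to an EMS intensity.

    Heavier cubes and harder presses yield a stronger counterforce,
    clamped to a safe maximum stimulation intensity. The linear blend
    is illustrative; a real system calibrates per user.
    """
    weight_term = min(cube_weight_kg / max_weight_kg, 1.0)
    press_term = min(press_force_n / max_press_n, 1.0)
    intensity = max_intensity_ma * (0.5 * weight_term + 0.5 * press_term)
    return min(intensity, max_intensity_ma)
```

In practice such a value would drive the stimulation channel for the opposing muscle group, so that the user's own arm generates the resistance.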
Pedro Lopes, Doga Yueksel, François Guimbretière, and Patrick Baudisch
Full paper at UIST'16, p207-217
With muscle-plotter we explore how to create more expressive EMS-based systems. Muscle-plotter achieves this by persisting EMS output, allowing the system to build up a larger whole. More specifically, muscle-plotter spreads out the 1D signal produced by EMS over a 2D surface by steering the user’s wrist, while the user drags their hand across the surface. Rather than repeatedly updating a single value, this renders many values into curves. [more]
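The rendering loop can be pictured as a proportional controller: as the sensed hand position advances in x, the system steers the wrist's y-position toward the value to plot. This Python sketch is an idealized reduction (assuming a wrist that perfectly follows each steering command), not the system's actual control loop:

```python
def plot_with_ems(f, x_samples, wrist_y, gain=0.8):
    """For each sensed hand position x, steer the wrist toward f(x).

    Returns the EMS steering command for each sample (positive = flex,
    negative = extend), proportional to the remaining error. The gain
    value is illustrative.
    """
    commands = []
    y = wrist_y
    for x in x_samples:
        target = f(x)
        error = target - y
        command = gain * error   # proportional steering via EMS
        y += command             # idealized: the wrist follows exactly
        commands.append(command)
    return commands
```

As the error shrinks, so do the steering commands, which is what lets the wrist trace a curve rather than oscillate around it.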
Pedro Lopes, Alexandra Ion, and Patrick Baudisch
Full paper at UIST'15, p11-19 and UIST'15 best talk nomination
We present impacto, a device designed to render the haptic sensation of hitting and being hit in virtual reality. The key idea that allows the small and light impacto device to simulate a strong hit is that it decomposes the stimulus: it renders the tactile aspect of being hit by tapping the skin using a solenoid; it adds impulse to the hit by thrusting the user’s arm backwards using electrical muscle stimulation. The device is self-contained, wireless, and small enough for wearable use. [more]
Pedro Lopes, Patrik Jonell, and Patrick Baudisch
CHI'15 best paper award, top 1%, full paper, p2515-2524
We propose extending the affordance of objects by allowing them to communicate dynamic use, such as (1) motion (e.g., a spray can shakes when touched), (2) multi-step processes (e.g., a spray can sprays only after shaking), and (3) behaviors that change over time (e.g., an empty spray can no longer allows spraying). Rather than enhancing objects directly, however, we implement this concept by enhancing the user. We call this affordance++. By stimulating the user’s arms using electrical muscle stimulation, our prototype allows objects not only to make the user actuate them, but also to make the user perform required movements while merely approaching the object, such as not touching objects that do not “want” to be touched. [more]
Pedro Lopes, Alexandra Ion, Willi Mueller, Daniel Hoffmann, Patrik Jonell, and Patrick Baudisch
Full paper at CHI'15, p939-948 and CHI'15 best talk award
We propose a new way of eyes-free interaction for wearables. It is based on the user’s proprioceptive sense, i.e., rather than seeing, hearing, or feeling an outside stimulus, users feel the pose of their own body. We have implemented a wearable device called Pose-IO that offers input and output based on proprioception. Users communicate with Pose-IO through the pose of their wrists. Users enter information by performing an input gesture by flexing their wrist, which the device senses using a 3-axis accelerometer. Users receive output from Pose-IO by finding their wrist posed in an output gesture, which Pose-IO actuates using electrical muscle stimulation. [more]
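A minimal sketch of this symmetric input/output idea in Python is below; the axis convention, threshold, and electrode-channel names are all my own illustrative assumptions, not Pose-IO's implementation:

```python
def classify_wrist_pose(accel_xyz, flex_threshold=0.5):
    """Classify a wrist pose from a 3-axis accelerometer reading.

    Uses the gravity component along the assumed flexion axis (here the
    y-axis); a real device would calibrate per user and filter the signal.
    """
    _, y, _ = accel_xyz
    if y > flex_threshold:
        return "flexed"
    if y < -flex_threshold:
        return "extended"
    return "neutral"

def actuate_output_pose(pose):
    """Return the (hypothetical) EMS channel that drives the wrist to a pose."""
    channel = {"flexed": "flexor_electrodes", "extended": "extensor_electrodes"}
    return channel.get(pose, None)
```

The point of the sketch is the symmetry: the same pose vocabulary is sensed on input and actuated on output.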
This core concept paved the way for my second line of research, in which interactive systems based on EMS allow users to access information (continued in Muscle Plotter and Affordance++).
The following projects relate to my topic in that they originated directly from my line of research or inspired it:
Lung-Pan Cheng, Patrick Lühne, Pedro Lopes, Christoph Sterz, and Patrick Baudisch
Full paper at CHI'14, p3463-3472
We present haptic turk, a different approach to motion platforms that is light and mobile. The key idea is to replace motors and mechanical components with humans. All haptic turk setups consist of a player who is supported by one or more human-actuators. The player enjoys an interactive experience, such as a flight simulation. The motion in the player’s experience is generated by the actuators who manually lift, tilt, and push the player's limbs or torso. [more]
The idea for this line of research (principal investigator: Lung-Pan Cheng) was born out of discussions on interactive devices based on EMS. In fact, one possible form factor for an EMS device is to apply the stimuli to a surrogate user, who in turn actuates the user. Interestingly, we realized that, instead of using EMS, a set of real-time actuation instructions achieves the same effect.
Anne Roudaut, Andreas Rau, Christoph Sterz, Max Plauth, Pedro Lopes, and Patrick Baudisch
Full paper at CHI'13, p2547-2556
Behind gesture output stands the idea to use spatial gestures not only for input but also for output. Analogous to gesture input, gesture output moves the user’s finger in a gesture, which the user then recognizes. A motion path forming a “5”, for example, may inform the user about five unread messages; a heart-shaped path may serve as a message from a close friend. [more]
This idea that one could use the exact same gestural vocabulary for input and output (which we called symmetric interaction) inspired my work on Proprioceptive Interaction (CHI'15), in which we applied this idea to a device that overlaps with the human body.
Additional CHI & UIST full papers
Alexandra Ion, Robert Kovacs, Oliver Schneider, Pedro Lopes, Patrick Baudisch
Full paper at CHI'18 (to appear)
We enable 3D printed objects to display different textures by transforming their surface. We demonstrate several textured 3D objects, including a shoe sole that transforms from flat to treaded, a textured door handle providing tactile feedback to visually impaired users, and a configurable bicycle grip. [more soon]
This is work I did in collaboration with Alexandra Ion (who is the principal investigator on this topic).
Alexandra Ion, Johannes Frohnhofen, Ludwig Wall, Robert Kovacs, Mirela Alistar, Jack Lindsay, Pedro Lopes, Hsiang-Ting Chen, and Patrick Baudisch
UIST'16 Honorable Mention, top 5%, full paper, p529-539.
So far, metamaterials were understood as materials—we want to think of them as machines. We demonstrate metamaterial objects that perform a mechanical function. Such metamaterial mechanisms consist of a single block of material the cells of which play together in a well-defined way in order to achieve macroscopic movement. Our metamaterial door latch, for example, transforms the rotary movement of its handle into a linear motion of the latch. [more]
This is work I did in collaboration with Alexandra Ion (who is the principal investigator on this topic).
Stefanie Mueller, Pedro Lopes, and Patrick Baudisch
Full paper at UIST'12 p599-606.
Constructable is an interactive drafting table that produces precise physical output in every step. Users interact by drafting directly on the workpiece using a hand-held laser pointer. The system tracks the pointer, beautifies its path, and implements its effect by cutting the workpiece using a fast high-powered laser cutter. [more]
This is work I did in collaboration with Stefanie Mueller (who is the principal investigator on this topic).
CHI & UIST short papers
Pedro Lopes and Patrick Baudisch
IEEE World Haptics, People’s Choice Nomination for Best Demo, Note at CHI'13, p2577-2580
Force feedback devices resist miniaturization, because they require physical motors and mechanics. We propose mobile force feedback by eliminating motors and instead actuating the user’s muscles using electrical stimulation. Without the motors, we obtain substantially smaller and more energy-efficient devices. Our prototype fits on the back of a mobile phone. It actuates users’ forearm muscles via four electrodes, which causes users’ muscles to contract involuntarily, so that they tilt the device sideways. As users resist this motion using their other arm, they perceive force feedback. [more]
This was the core project that initiated my line of work in increasing realism (continued in Impacto and VR walls).
Ricardo Jota, Pedro Lopes, Daniel Wigdor, and Joaquim Jorge.
Note at CHI'14, p1411-1414.
We explore the design space of foot input on vertical surfaces, and propose three distinct interaction modalities: hand, foot tapping, and foot gesturing. We distinguish between feet and hands by feeding the sound of surface contacts (captured by a microphone) to a trained machine-learning classifier. [more]
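The sensing pipeline can be pictured as: record the contact sound, extract a few spectral features, classify. The toy Python below uses a windowed FFT with spectral centroid and spread, plus a k-nearest-neighbor vote; these particular features and this classifier are my own illustrative choices, and the paper's pipeline may differ:

```python
import numpy as np
from collections import Counter

def contact_features(samples, rate=44100):
    """Extract spectral centroid and spread (in Hz) from a contact sound."""
    windowed = samples * np.hanning(len(samples))  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    spectrum = spectrum / (spectrum.sum() + 1e-12)  # normalize to a distribution
    freqs = np.fft.rfftfreq(len(samples), 1.0 / rate)
    centroid = float(np.sum(freqs * spectrum))
    spread = float(np.sqrt(np.sum(((freqs - centroid) ** 2) * spectrum)))
    return np.array([centroid, spread])

def knn_classify(query, training, k=3):
    """training: list of (feature_vector, label) pairs; majority vote of k nearest."""
    nearest = sorted(training, key=lambda fl: np.linalg.norm(fl[0] - query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]
```

The intuition is that footsteps concentrate energy at low frequencies while fingertip taps are brighter, so even crude spectral features separate the two classes.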
This is the second of a line of work (together with Ricardo Jota) using machine learning and acoustic features to distinguish which part of the user's body (e.g., feet vs. hands) contacts an interactive surface (see also Augmenting touch interaction through acoustic sensing).
Art projects from my research on EMS
A side effect of devices interacting directly through the user’s muscles is that they question the traditional model of users controlling devices. This topic stimulated a lot of discussion in the lab.
One of the results of this debate was the creation of a prototypical device that reverses this paradigm and is controlling the user. Our creation, shown below, is a technological parasite that lives off human energy. The device harvests kinetic energy by electrically stimulating participants’ wrists. This causes their wrists to involuntarily turn a crank, which powers the device.
We felt the debate and thus the device were of interest to the general public, but we also felt that its nature of asking a question (rather than giving an answer) made it more suitable for an artistic outlet. We have exhibited this device as an installation at prestigious venues such as the Ars Electronica (Linz, 2017) and Science Gallery (Dublin, 2017).
Pedro Lopes, Alexandra Ion, Robert Kovacs, David Lindlbauer and Patrick Baudisch
Exhibited at Ars Electronica'17, World Economic Forum (San Francisco) and Science Gallery Dublin
Ad infinitum is a parasitical entity that lives off human energy. It lives untethered and off the grid. This parasite reverses the dominant role that mankind has with respect to technologies: it shifts humans from “users” to “used”. Ad infinitum parasitically attaches electrodes to its human visitors and harvests their kinetic energy by electrically persuading them to move their muscles using EMS. [more]
This piece is a critical take on the canonical HCI configuration, in which a human is always in control. Instead, here participants experience how it feels when a machine is in control.
Pedro Lopes (and supporting musicians)
Exhibited at Print Screen Festival (Tel Aviv) and Disruption Network Lab (curated by Tatiana Bazzichelli)
Conductive-ensemble is an art performance in which the musicians are controlled by the audience. On stage, a quartet of musicians is ready to “play”. To interact, audience members open their browser and navigate to the performance’s website. When they tap a musician’s name on their screen, the website sends a stream of electrical muscle stimulation to that musician’s muscles, causing the musician to play “against their own will”. [more]
- Ars Electronica’17, Linz (Ad Infinitum)
- World Economic Forum, San Francisco (Ad Infinitum)
- Science Gallery Dublin (Ad Infinitum)
- Print Screen Festival, Tel Aviv (Conductive Ensemble)
- Disruption Network Lab, SPEKTRUM (Conductive Ensemble)
- Natural History Museum, Bern (Ad Infinitum)
- Laznia Centre for Contemporary Art, Gdansk (Affordance++)
Additional publications in Human-Computer Interaction
- Best Talk award. Interacting with Wearable Computers by means of Functional Electrical Muscle Stimulation. Pedro Lopes and Patrick Baudisch. In Proc. NAT '17 (Neuroadaptive Technology, paper)
- I, the device: observing human aversion from an HCI perspective, Ricardo Jota, Pedro Lopes, and Joaquim Jorge. 2012. In Proc. CHI EA’12, pg. 261-270. (paper)
- Augmenting touch interaction through acoustic sensing, Pedro Lopes, Ricardo Jota, Joaquim Jorge, In Proc. ITS’11 (Interactive Tabletops and Surfaces). (paper, video)
- Combining bimanual manipulation and pen-based input for 3D modelling. Pedro Lopes, Daniel Mendes, Bruno de Araújo, Joaquim Jorge, SBIM’11: Sketch-based Interface and Modelling, in cooperation with ACM SIGGRAPH and EUROGRAPHICS (paper, video)
- Hands-on interactive tabletop LEGO application. Daniel Mendes, Pedro Lopes, and Alfredo Ferreira. In Proc. ACE'11 (Advances in Computer Entertainment Technology) (paper, video)
- Battle of the DJs: an HCI Perspective of Traditional, Virtual, Hybrid and Multitouch DJing. Pedro Lopes, Alfredo Ferreira, João Pereira. In Proc. NIME'11 (New Interfaces for Musical Expression) (paper, video)
- Multitouch interactive DJing surface. Pedro Lopes, Alfredo Ferreira, João Pereira. In Proc. ACE'10 (Advances in Computer Entertainment) (paper, video)
Workshops Taught at Scientific Venues
- Hands-on course on: “Electrical Muscle Stimulation”, at ACM CHI’16 (workshop website)
- Hands-on course on: “Electrical Muscle Stimulation as Haptics”, IEEE World Haptics’15 (workshop website)
Chairing & Organizing
- Program Committee Member for ACM CHI’19
- Program Committee Member for ACM CHI’18
- Program Committee Member for ACM UIST’18
- Demo Chair for ACM MobileHCI’19
- Program Committee Member for IEEE VR’18
- Best Paper Committee for ACM UIST’17
- Program Committee Member for ACM UIST’17
- Student Innovation Contest Chair for ACM UIST’17
- Student Design Challenge Chair for ACM TEI’17
- Program Committee Member for Desform’17
- Student Innovation Contest Chair for ACM UIST’16 (In this contest 18 teams used my EMS hardware)
- Proceedings Chair for ACM UIST’13
If you are looking for more information (e.g., demos at scientific conferences, a list of invited talks, teaching experience, and so forth), please check my CV.