Person: Iriarte Cárdenas, Naroa
Last Name: Iriarte Cárdenas
First Name: Naroa
Department: Estadística, Informática y Matemáticas
ORCID: 0000-0002-7672-5353
UPNA ID: 811550
Search Results: showing 1 - 5 of 5
Publication (Open Access): PiloNape: electrostatic artificial piloerection for affecting emotional experiences (2022)
Authors: Iriarte Cárdenas, Naroa; Marzo Pérez, Asier
Affiliation: Escuela Técnica Superior de Ingeniería Industrial, Informática y de Telecomunicación
Piloerection is a strong affective reaction that occurs in human beings. In this project, we induce artificial piloerection using contactless electrostatics to enhance the affective response of users when they interact with computer systems. First, we design and compare several high-voltage generators in terms of static charge, safety, and frequency response, with different electrodes and grounding strategies. Second, a psychophysics user study revealed which body parts are most sensitive to electrostatic piloerection and which adjectives are associated with them; the wrist and the nape were the most sensitive parts. Finally, we combine a generator with a head-mounted display to add artificial piloerection to virtual experiences of fear. We hope these findings allow designers to use contactless piloerection in affective experiences such as music, short movies, video games, or exhibitions.

Publication (Open Access): LevPet: a magnetic levitating spherical pet with affective reactions (ACM, 2022)
Authors: Sorbet Molina, Josune; Elizondo Martínez, Sonia; Iriarte Cárdenas, Naroa; Ortiz Nicolás, Amalia; Marzo Pérez, Asier
Department: Estadística, Informática y Matemáticas
LevPet combines affective computing and magnetic levitation to create an artificial levitating pet with affective responses and novel ways of moving to express emotions. Our interactive pet can recognise the user's emotional state using computer vision and respond to it with a low-level empathy system based on mirroring behaviour. For example, if you approach it with a happy face, the pet will greet you and move in a nimble way. A repulsive magnetic levitator is attached to a mechanical stage controlled by a computer system. On top of it sits the pet playground, where a house, a ping-pong ball, a xylophone, and other accessories are placed. Two cameras capture the user's face and the objects placed on the playground, so that the pet can interact with them. LevPet is an exploration of how to communicate an internal state with only a levitating sphere; it is a platform for experimentation and an interactive demo that brings together an otherworldly levitating metallic sphere with familiar things like emotions and a playground made of traditional items.

Publication (Open Access): Contactless electrostatic piloerection for haptic sensations (IEEE, 2023)
Authors: Iriarte Cárdenas, Naroa; Ezcurdia Aguirre, Íñigo Fermín; Elizondo Martínez, Sonia; Irisarri Erviti, Josu; Hemmerling, Daria; Ortiz Nicolás, Amalia; Marzo Pérez, Asier
Department: Estadística, Informática y Matemáticas
In this project, we create artificial piloerection using contactless electrostatics to induce tactile sensations without physical contact. First, we design various high-voltage generators and evaluate them in terms of static charge, safety, and frequency response, with different electrodes and grounding strategies. Second, a psychophysics user study revealed which parts of the upper body are most sensitive to electrostatic piloerection and which adjectives are associated with them. Finally, we combine an electrostatic generator that produces artificial piloerection on the nape with a head-mounted display; this device provides an augmented virtual experience related to fear. We hope this work encourages designers to explore contactless piloerection for enhancing experiences such as music, short movies, video games, or exhibitions.

Publication (Open Access): A multi-object grasp technique for placement of objects in virtual reality (MDPI, 2022)
Authors: Fernández Ortega, Unai Javier; Elizondo Martínez, Sonia; Iriarte Cárdenas, Naroa; Morales, Rafael; Ortiz Nicolás, Amalia; Marichal Baráibar, Sebastián Roberto; Ardaiz Villanueva, Óscar; Marzo Pérez, Asier
Affiliation: Institute of Smart Cities - ISC; Universidad Pública de Navarra
Some daily tasks involve grasping multiple objects in one hand and releasing them in a determined order, for example, laying out a surgical table or distributing items on shelves. For training these tasks in virtual reality (VR), there is no technique that allows users to grasp multiple objects in one hand in a realistic way, and it is not known whether such a technique would benefit user experience. Here, we design a multi-object grasp technique that enables users to grasp multiple objects in one hand and release them in a controlled way. We tested an object placement task under three conditions: real life, VR with single-object grasp, and VR with multi-object grasp. Task completion time, distance travelled by the hands, and subjective experience were measured in three scenarios: sitting at a desktop table, standing in front of shelves, and a room-size scenario that required walking. Results show that performance in a real environment is better than in VR for both single-object and multi-object grasping. The single-object technique performs better than the multi-object one, except in the room scenario, where multi-object grasping leads to less distance travelled and lower reported physical demand. For use cases where distances are small (i.e., the desktop scenario), single-object grasp is simpler and easier to understand. For larger scenarios, the multi-object grasp technique is a good option for other application designers to consider.

Publication (Open Access): TOUCHLESS: demonstrations of contactless haptics for affective touch (ACM, 2023)
Authors: Chew, Sean; Dalsgaard, Tor-Salve; Maunsbach, Martin; Bergström, Joanna; Seifi, Hasti; Hornbæk, Kasper; Irisarri Erviti, Josu; Ezcurdia Aguirre, Íñigo Fermín; Iriarte Cárdenas, Naroa; Marzo Pérez, Asier; Frier, William; Georgiou, Orestis; Sheremetieva, Anna; Kwarciak, Kamil; Stroiński, Maciej; Hemmerling, Daria; Maksymenko, Mykola; Cataldo, Antonio; Obrist, Marianna; Haggard, Patrick; Subramanian, Sriram
Department: Estadística, Informática y Matemáticas
This work describes a set of demonstrators of contactless haptic principles. The technologies are based on electrostatic piloerection, chemical compounds, and ultrasound. Applications related to affective touch are also presented, ranging from storytelling to biosignal transfer, accompanied by a simple application for easily editing dynamic tactile patterns. The demonstrators are the result of the Touchless project, an H2020 European collaborative project that brings together three universities and three companies. Because these demonstrators are contactless haptic experiences, they support the come-and-interact paradigm: users can approach the demo booth and directly experience the applications without having to wear devices, making the experience fast and hygienic.