Show simple item record

dc.creator: Villanueva Larre, Arantxa [es_ES]
dc.creator: Cabeza Laguna, Rafael [es_ES]
dc.date.accessioned: 2015-09-30T07:35:27Z
dc.date.available: 2015-09-30T07:35:27Z
dc.date.issued: 2007
dc.identifier.issn: 1687-5176
dc.identifier.uri: https://hdl.handle.net/2454/18331
dc.description.abstract: One of the most confusing aspects that one meets when introducing oneself into gaze tracking technology is the wide variety, in terms of hardware equipment, of available systems that provide solutions to the same matter, that is, determining the point the subject is looking at. The calibration process permits generally adjusting nonintrusive trackers based on quite different hardware and image features to the subject. The negative aspect of this simple procedure is that it permits the system to work properly but at the expense of a lack of control over the intrinsic behavior of the tracker. The objective of the presented article is to overcome this obstacle to explore more deeply the elements of a video-oculographic system, that is, eye, camera, lighting, and so forth, from a purely mathematical and geometrical point of view. The main contribution is to find out the minimum number of hardware elements and image features that are needed to determine the point the subject is looking at. A model has been constructed based on pupil contour and multiple lighting, and successfully tested with real subjects. On the other hand, theoretical aspects of video-oculographic systems have been thoroughly reviewed in order to build a theoretical basis for further studies. [en]
dc.format.mimetype: application/pdf [en]
dc.language.iso: eng [en]
dc.publisher: Hindawi Publishing Corporation [en]
dc.relation.ispartof: EURASIP Journal on Image and Video Processing 2007, 2007:023570 [en]
dc.relation.ispartofseries: Image and video processing for disability [en]
dc.rights: © 2007 Villanueva and Cabeza. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. [en]
dc.rights.uri: http://creativecommons.org/licenses/by/2.0/
dc.subject: Gaze tracking [en]
dc.title: Models for gaze tracking systems [en]
dc.type: info:eu-repo/semantics/article [en]
dc.type: Artículo / Artikulua [es]
dc.contributor.department: Ingeniería Eléctrica y Electrónica [es_ES]
dc.contributor.department: Ingeniaritza Elektrikoa eta Elektronikoa [eu]
dc.rights.accessRights: info:eu-repo/semantics/openAccess [en]
dc.rights.accessRights: Acceso abierto / Sarbide irekia [es]
dc.identifier.doi: 10.1155/2007/23570
dc.relation.publisherversion: https://dx.doi.org/10.1155/2007/23570
dc.type.version: info:eu-repo/semantics/acceptedVersion [en]
dc.type.version: Versión aceptada / Onetsi den bertsioa [es]
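The abstract notes that calibration is the step that lets very different nonintrusive trackers map image features to the subject's point of regard. As a minimal illustration of that step, the sketch below fits a generic second-order polynomial calibration of the kind commonly used in video-oculography: it maps a 2-D image feature (e.g. a pupil-glint vector) to screen coordinates via least squares. This is an assumed, generic technique for illustration only, not the model-based approach the article itself proposes; all names and the synthetic data are hypothetical.

```python
import numpy as np

def poly_basis(features):
    # Second-order polynomial basis: 1, x, y, xy, x^2, y^2
    x, y = features[:, 0], features[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def calibrate(features, targets):
    """Fit polynomial coefficients mapping (N, 2) image features
    to (N, 2) screen coordinates by least squares."""
    coeffs, *_ = np.linalg.lstsq(poly_basis(features), targets, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def predict(coeffs, features):
    return poly_basis(features) @ coeffs

# Synthetic 9-point calibration grid (hypothetical data).
rng = np.random.default_rng(0)
feats = rng.uniform(-1.0, 1.0, size=(9, 2))
# Ground-truth mapping here is affine, so the fit recovers it exactly.
screen = np.column_stack([960 + 400 * feats[:, 0], 540 + 300 * feats[:, 1]])
c = calibrate(feats, screen)
err = np.abs(predict(c, feats) - screen).max()
```

Because the calibration is purely data-driven, it adapts to whatever hardware produced the features, which is exactly the "lack of control over the intrinsic behavior of the tracker" that the article's geometrical model seeks to replace.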



