Models for gaze tracking systems

dc.contributor.author: Villanueva Larre, Arantxa
dc.contributor.author: Cabeza Laguna, Rafael
dc.contributor.department: Ingeniería Eléctrica y Electrónica (es_ES)
dc.contributor.department: Ingeniaritza Elektrikoa eta Elektronikoa (eu)
dc.date.accessioned: 2015-09-30T07:35:27Z
dc.date.available: 2015-09-30T07:35:27Z
dc.date.issued: 2007
dc.description.abstract: One of the most confusing aspects encountered when approaching gaze tracking technology is the wide variety, in terms of hardware equipment, of available systems that address the same problem: determining the point the subject is looking at. The calibration process generally allows nonintrusive trackers based on quite different hardware and image features to be adjusted to the subject. The drawback of this simple procedure is that, although it allows the system to work properly, it comes at the expense of a lack of control over the intrinsic behavior of the tracker. The objective of the presented article is to overcome this obstacle and explore more deeply the elements of a video-oculographic system, that is, eye, camera, lighting, and so forth, from a purely mathematical and geometrical point of view. The main contribution is to determine the minimum number of hardware elements and image features needed to compute the point the subject is looking at. A model based on pupil contour and multiple lighting has been constructed and successfully tested with real subjects. In addition, theoretical aspects of video-oculographic systems have been thoroughly reviewed in order to build a theoretical basis for further studies. (en)
dc.format.mimetype: application/pdf (en)
dc.identifier.doi: 10.1155/2007/23570
dc.identifier.issn: 1687-5176
dc.identifier.uri: https://academica-e.unavarra.es/handle/2454/18331
dc.language.iso: eng (en)
dc.publisher: Hindawi Publishing Corporation (en)
dc.relation.ispartof: EURASIP Journal on Image and Video Processing 2007, 2007:023570 (en)
dc.relation.ispartofseries: Image and video processing for disability (en)
dc.relation.publisherversion: https://dx.doi.org/10.1155/2007/23570
dc.rights: © 2007 Villanueva and Cabeza. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. (en)
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.rights.uri: https://creativecommons.org/licenses/by/2.0/
dc.subject: Gaze tracking (en)
dc.title: Models for gaze tracking systems (en)
dc.type: info:eu-repo/semantics/article
dc.type.version: info:eu-repo/semantics/acceptedVersion
dspace.entity.type: Publication
relation.isAuthorOfPublication: d3bfd5bf-8426-455b-bcc4-897ddb0d4c2e
relation.isAuthorOfPublication: 42fe20f8-5341-4c0e-8686-333ce816adfd
relation.isAuthorOfPublication.latestForDiscovery: d3bfd5bf-8426-455b-bcc4-897ddb0d4c2e

Files

Original bundle
Name: ModelsGaze.pdf
Size: 1.67 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed to upon submission