Friday, July 13, 2012

Eye Tracking Technology

Abstract:


Eye tracking data is collected using either a remote or head-mounted ‘eye tracker’ connected to a computer. While there are many different types of non-intrusive eye trackers, they generally include two common components: a light source and a camera. The light source (usually infrared) is directed toward the eye. The camera tracks the reflection of the light source along with visible ocular features such as the pupil. This data is used to extrapolate the rotation of the eye and ultimately the direction of gaze. Additional information such as blink frequency and changes in pupil diameter is also detected by the eye tracker.


Refer to more resources:
Eye tracking technology PDF 1
Eye tracking technology PDF 2

Eye tracking technology PPT 1
Eye tracking technology PPT 2
Eye tracking technology PPT 3






EyeTech received a lot of attention with the unveiling of two new eye tracking devices for laptops and TVs. EyeTech’s new Quick Access software was also unveiled; it allows consumers to quickly and accurately make selections inside any Windows program.
The future of computer human interaction will be a combination of:
  • Multi-touch
  • Gesture
  • Voice
  • Eye Tracking
  • Brain Wave / EEG
Eye tracking is a field that never ceases to fascinate me. It is truly amazing that one technology can have such a wide range of capabilities and applications. Here are 8 applications of eye tracking that are being used today:
#1 Website Usability Testing- Computers have become a primary source of information; therefore, it is critical that users be able to easily locate and comprehend information on a user interface. Eye tracking is often used by Web designers and Usability Specialists to identify which elements of websites function as intended and which need to be revised.
#2 Advertising and Marketing Research- Another growing application for eye tracking is in the marketing industry. Advertisers are evaluating the effectiveness of their campaigns, using eye tracking to determine if customers are noticing the key elements in a product placement, commercial, or print ad.
#3 Assistive Technology- Both wearable and monitor-mounted eye trackers are being used by disabled individuals for communication and computer control. With most of these products, eye tracking permits eye movement to replace the traditional keyboard and handheld mouse.
#4 Digital and Operational Training Scenarios- Eye tracking is used in different types of simulators, including driving, flight, and even operating room, to track the eye movements of trainees as they perform tasks. Military and law enforcement agencies have also used eye tracking in the field.
#5 Human Behavior- One of the most common applications of eye tracking in research is studying patterns of eye movements and their correlation with different behaviors. There is much to be discovered about how visual behavior relates to cognition and decision-making.
#6 Developmental Psychology- Infants communicate and take in information about their world through their eyes before they can speak. Eye tracking can get an up close look at how babies perceive their surroundings and how visual behavior impacts their development.
#7 Human Factors Research- Eye tracking is often used to monitor and research how people interact with their environment, particularly with respect to equipment and machinery. Human factors research seeks to improve efficiency, operational performance, and safety, as humans engage with their technical and environmental surroundings.
#8 Neuroscience and Diagnostics- It has been discovered that certain oculometrics, only traceable with eye tracking, could be potential indicators of neurological conditions. Research is being conducted to determine if eye tracking may be an accurate tool for identifying signs of Traumatic Brain Injury, autism, and ADD.

Eyetracking can be defined as a technique that is used to record and measure eye movements. The definition is simple enough, but I always get the follow-up questions “how does it record?” and “will it hurt?” First let me say no, it will not hurt; second, you will not go blind; and third, you will not become a mutant–sorry. At User First we use the Tobii T60 model, so I will discuss how this equipment works specifically.

Fantastic Machinery, The Eye

Imagine if you will that you are looking out a window from your home or office onto a city street. As you look outside, your eyes are constantly moving. Some of these movements are conscious. For example, you notice the movement of a dog and glance above it for a glimpse of its owner. But mostly your eyes are moving involuntarily, focusing only on certain areas of the visual field in order to form a picture of the scene for your brain. The human eye is a fantastic piece of machinery, yet it is not capable of absorbing 100% of the visual field in an instant with clarity. We call the area of the eye capable of this focus the foveal area and the brief pauses of our gaze the fixations. The foveal area accounts for only 8% of the visual field at any one time but supplies 50% of the visual data received by our brain. And the movement of the eye controls which regions of the visual field we fixate on and which regions are ignored and left to the poor acuity of our peripheral vision, which is only useful for picking up movement and strong contrast.

So how are Eye Movements Tracked?

Just as the human eye relies on the focus and detection of light to see, so does the most common technique used to track eye movements, called Pupil Centre Corneal Reflection (PCCR). This technique is non-intrusive, and the technology making it possible comes most commonly in two forms: either a specially equipped computer monitor or a head-mounted device. Both options use a light source to illuminate the eye, causing highly visible reflections. The illumination is near infrared and therefore unnoticeable to the user, but it creates reflection patterns on the cornea and pupil of the eye. Two image sensors on either the computer monitor or the head-mounted device capture images of the eyes and the reflection patterns. A computer then uses advanced image processing algorithms and a physiological 3D model of the eye to estimate the position of the eye in space and the point of gaze with high accuracy.
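To make the PCCR idea concrete, here is a minimal Python sketch of the core feature it relies on: the vector between the pupil centre and the corneal glint, mapped to a point on screen. The function names, the 2x3 affine mapping, and the example numbers are hypothetical illustrations, not Tobii's actual algorithm, which fits a full 3D eye model from two image sensors.

import numpy as np

def pupil_glint_vector(pupil_center, glint_center):
    """Vector from corneal glint to pupil centre, in eye-image pixel coordinates.

    The glint stays roughly fixed relative to the camera while the pupil moves
    with the eye, so this difference vector changes with gaze direction and is
    the core PCCR feature.
    """
    return np.asarray(pupil_center, float) - np.asarray(glint_center, float)

def gaze_from_pccr(pupil_center, glint_center, mapping):
    """Map the pupil-glint vector to an on-screen gaze point.

    `mapping` is a 2x3 affine matrix obtained from calibration; a real tracker
    instead estimates the eye's position in space with a physiological model.
    """
    v = pupil_glint_vector(pupil_center, glint_center)
    return mapping @ np.array([v[0], v[1], 1.0])

# Example with a hypothetical calibration matrix and one frame's features:
mapping = np.array([[40.0, 0.0, 960.0],
                    [0.0, 45.0, 540.0]])   # screen pixels per eye-image pixel, plus offset
print(gaze_from_pccr(pupil_center=(312, 240), glint_center=(305, 244), mapping=mapping))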
The location of these gaze points during each fixation, the time spent on each fixation, and the pattern in movement from one gaze point to another are the key pieces of data collected during an eye tracking study.  These data can then be visualized using a gaze plot or a heatmap.
CAPTION: The Gaze Plot visualization shows the movement sequence and position of fixations (dots) and saccades (lines) on the observed image or visual scene.
CAPTION: The Heatmap visualization highlights the areas of the image where the participants fixated. Warm colors indicate areas where the participants either fixated for a long time or at many occasions.
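For readers curious how fixations and heatmaps fall out of the raw gaze stream, here is a small Python sketch of a dispersion-threshold fixation detector and a duration-weighted heatmap. It is a simplified illustration under assumed thresholds and a made-up sample format, not the algorithm Tobii's software actually uses.

import numpy as np

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """Tiny dispersion-threshold (I-DT style) fixation detector.

    `samples` is a list of (t, x, y) gaze points in seconds and screen pixels.
    A fixation is a run of samples whose x and y ranges together stay within
    `max_dispersion` pixels for at least `min_duration` seconds.
    """
    pts = np.array([(x, y) for _, x, y in samples])
    times = np.array([t for t, _, _ in samples])
    fixations, i = [], 0
    while i < len(samples):
        j = i
        while j + 1 < len(samples):
            window = pts[i:j + 2]
            if (window.max(axis=0) - window.min(axis=0)).sum() > max_dispersion:
                break
            j += 1
        if times[j] - times[i] >= min_duration:
            cx, cy = pts[i:j + 1].mean(axis=0)
            fixations.append((times[i], times[j] - times[i], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations  # (onset, duration, centroid x, centroid y)

def heatmap(fixations, screen=(1920, 1080), bins=(64, 36)):
    """Accumulate fixation durations into coarse screen bins for a heatmap."""
    xs = [f[2] for f in fixations]
    ys = [f[3] for f in fixations]
    ws = [f[1] for f in fixations]
    H, _, _ = np.histogram2d(xs, ys, bins=bins,
                             range=[[0, screen[0]], [0, screen[1]]], weights=ws)
    return H

# Example: six jittery samples around one point yield a single ~0.1 s fixation.
gaze = [(0.00, 640, 400), (0.02, 642, 398), (0.04, 645, 401),
        (0.06, 641, 399), (0.08, 644, 402), (0.10, 643, 400)]
print(detect_fixations(gaze))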
Not only can we determine what visual information is consumed and how, but patterns in eye movement tell us more. Emotional responses are evident in eye movement patterns and thus allow us to connect physical behavior of the eye to cognitive behavior in the brain. This is why eyetracking is a strong supplement to traditional qualitative studies. It allows a scientific measure beyond the subjective responses provided by a participant in an interview.
Eyetracking is especially important in the age of mass media. The amount of content, the speed at which it is delivered, and the speed at which a user consumes it mean users spend less and less time fixated on each image.

Here’s how it works

Before we began the demo, Barbara explained the technology. Tobii’s eye control works a bit like the Xbox Kinect (or a reverse Wii), but on a much closer scale. As you sit in front of the laptop, a pair of synced infrared sensors located under the screen scans your eyes. They do this about 30 to 40 times per second, examining the size and angle of your pupil, the glint in each of your eyes, and the distance between you and the laptop. Together, the two sensors create a stereoscopic 3D image of your eye for the computer to examine. Based on the angle and glint of your eye, Tobii’s technology calculates precisely which part of the screen you are looking at. It can even tell when you look away or close your eyes. To save power, the demo unit on hand darkened its screen when we looked away.
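The look-away dimming behaviour is easy to picture as a small polling loop. The sketch below is a hypothetical Python illustration; tracker.latest_gaze() and display.set_brightness() stand in for whatever SDK calls a real Tobii integration would expose, and the thresholds are guesses.

import time

LOOK_AWAY_FRAMES = 45          # about 1.5 s at the ~30 Hz sample rate described above
SAMPLE_PERIOD = 1.0 / 30.0

def run_dimmer(tracker, display):
    """Dim the display after the tracker loses the eyes for a short while."""
    missed = 0
    while True:
        gaze = tracker.latest_gaze()       # assumed to return None when no eyes are detected
        if gaze is None:
            missed += 1
        else:
            missed = 0
            display.set_brightness(1.0)    # restore full brightness when the user looks back
        if missed >= LOOK_AWAY_FRAMES:
            display.set_brightness(0.2)    # dim rather than blank, to be forgiving of dropouts
        time.sleep(SAMPLE_PERIOD)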
After explaining how it works, Barbara calibrated the Lenovo-Tobii eye control PC for her eyes. The calibration process takes a few seconds. Basically, you look at a series of three to nine dots on the screen, which lets the computer know where your eyes are looking. Nintendo has used similar calibration tests on its Wii Motion Plus controllers and Wii Fit balance board software. The calibration is painless and shouldn’t have to be done very often. After calibration, the laptop will be able to save your “eye profile” and know how to calibrate when a familiar user logs on.
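Conceptually, the dot calibration boils down to fitting a mapping from raw eye features to screen coordinates. The sketch below fits a simple 2x3 affine map by least squares over a handful of hypothetical calibration dots; a production tracker like the Tobii unit fits a richer physiological 3D eye model, so treat this purely as an illustration of the idea.

import numpy as np

def fit_calibration(eye_vectors, screen_points):
    """Fit a 2x3 affine map from pupil-glint vectors to screen coordinates.

    `eye_vectors`  : (vx, vy) features recorded while the user looked at each dot.
    `screen_points`: (sx, sy) pixel positions of those dots.
    Least squares over the (typically three to nine) calibration dots.
    """
    V = np.hstack([np.asarray(eye_vectors, float), np.ones((len(eye_vectors), 1))])
    S = np.asarray(screen_points, float)
    coeffs, *_ = np.linalg.lstsq(V, S, rcond=None)   # shape (3, 2)
    return coeffs.T                                  # 2x3, usable with gaze_from_pccr above

# Example with five hypothetical calibration dots on a 1920x1080 screen:
vectors = [(-6, -4), (0, -4), (6, -4), (-6, 4), (6, 4)]
dots    = [(200, 150), (960, 150), (1720, 150), (200, 930), (1720, 930)]
print(fit_calibration(vectors, dots))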


Read more: http://www.digitaltrends.com/computing/death-of-the-mouse-how-eye-tracking-technology-could-save-the-pc/#ixzz20WpgwZzn



Uses for eye control

The first portion of the demo (which you can watch below) simply shows where your eyes are looking on the screen. Looking at your own eyes isn’t particularly fun, but it shows you how fast the system reacts when you move or blink. However, after the intro screen and calibration, we got into some different use scenarios.



Reading: Out of all of the uses for eye control, reading demonstrates its value more than anything else. Everyone has their own technique and style for reading on a laptop or touch device. Personally, I tend to keep my text toward the top of the screen. Sometimes I use the mouse to highlight things I’ve already read and use the direction buttons to scroll down. Tobii’s eye control instantly makes all of these customized reading styles irrelevant. In a way that feels more natural than reading a book, the text automatically scrolls up, down, left, or right for you as your eyes pan around the screen. It’s amazing. Confused about a word? Well, if you stare at one word long enough, its definition will pop up.
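As a rough illustration of how the reading demo's two behaviours, gaze-driven scrolling and dwell-triggered definitions, could be wired together, here is a hypothetical Python handler run once per gaze sample. The view widget methods and the dwell and scroll thresholds are assumptions, not Tobii's implementation.

DWELL_SECONDS = 1.0        # stare this long at one word to trigger its definition
SCROLL_MARGIN = 0.15       # start scrolling when gaze is in the top/bottom 15% of the view

def on_gaze_sample(t, x, y, state, view):
    """Handle one gaze sample for a reading mode.

    `view` is a hypothetical text widget with .height, .scroll(dy),
    .word_at(x, y) and .show_definition(word); `state` is a dict that carries
    the current dwell target between samples.
    """
    # Gaze near the bottom edge scrolls the text up, and vice versa.
    if y > view.height * (1 - SCROLL_MARGIN):
        view.scroll(+2)
    elif y < view.height * SCROLL_MARGIN:
        view.scroll(-2)

    # Dwell detection: the same word under the gaze for DWELL_SECONDS.
    word = view.word_at(x, y)
    if word and word == state.get("word"):
        if t - state["since"] >= DWELL_SECONDS:
            view.show_definition(word)
            state["since"] = t          # re-arm so the popup doesn't retrigger every frame
    else:
        state["word"], state["since"] = word, t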

Playing media: Another demo Barbara showed me was a simple media player. A row of pictures and album covers fills the bottom of the screen. Glancing at one of them highlights the choice, and looking upward plays the music or maximizes the picture so you can get a better look. Done listening or viewing? Simply look at another item in the list. And when you look at the arrows on the left or right for a second or so, the next page of results appears.

Zooming and panning: Eye control doesn’t mean there is no use for the keyboard. In a Google Maps-like demo, you can pan and zoom by looking and pressing/holding a button, which works well. A single button is assigned to zoom and another button is assigned to pan. To zoom, you simply look at what you wish to focus on and push the zoom button. Once zoomed, holding another button and looking left, right, up, or down lets you pan around your zoomed image. There are likely even more intuitive ways to perform complex tasks like this.
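The press-a-button-and-look zoom is essentially "scale the viewport while keeping the gazed point fixed". The sketch below shows that arithmetic for a hypothetical map viewport; the data layout and the per-frame call are assumptions rather than the demo's actual code.

def zoom_at_gaze(viewport, gaze_x, gaze_y, factor=1.25):
    """Zoom the map toward the point the user is looking at.

    `viewport` holds the visible region as (x, y, width, height) in map
    coordinates; `gaze_x` and `gaze_y` are in the 0..1 range across the screen.
    Keeping the gazed map point at the same relative screen position makes the
    zoom feel anchored to where you are looking.
    """
    x, y, w, h = viewport
    # Convert the screen gaze position to map coordinates.
    target_x = x + gaze_x * w
    target_y = y + gaze_y * h
    new_w, new_h = w / factor, h / factor
    # Re-centre so the gazed point keeps its relative position in the view.
    return (target_x - gaze_x * new_w, target_y - gaze_y * new_h, new_w, new_h)

# Holding the zoom key would call this every frame with a small factor, e.g.:
# viewport = zoom_at_gaze(viewport, *tracker.normalized_gaze(), factor=1.02)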

Multitasking: At CES this year, I complimented the BlackBerry PlayBook for its easy swiping method to switch between applications. With a WebOS-like interface (or perhaps WebOS itself) and Tobii’s eye control, multitasking between apps is as natural as looking to the left or right of the screen. Using Windows 7, which isn’t at all optimized for anything other than a mouse, Barbara swapped between windows by looking at them and pressing a button. She also moved a mouse pointer icon around the screen with ease.

Gaming: The last demo we played was a simple Asteroids-like game. Your mission is to protect the Earth from a barrage of doomsday-sized comets and asteroids headed your way. Looking at an asteroid triggers a laser that blows it up. There are a ton of touch-type games like this, which pit your reflexes against the computer, but eye control takes the speed and intensity of these games up a notch. I’m incredibly excited to see what kind of games can be made using the speed of the eye.
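The look-to-shoot mechanic reduces to hit-testing the gaze point against each asteroid, with a little extra tolerance because gaze data jitters even during a steady fixation. Here is a hypothetical Python sketch of that test; the asteroid data layout and the tracker.gaze_point() call are assumptions.

import math

def asteroids_hit_by_gaze(asteroids, gaze_x, gaze_y, tolerance=20.0):
    """Return the asteroids the player is currently looking at.

    `asteroids` is a list of dicts with 'x', 'y', and 'radius' in screen pixels;
    `tolerance` pads the hit radius to absorb eye-tracker jitter.
    """
    hits = []
    for a in asteroids:
        if math.hypot(a["x"] - gaze_x, a["y"] - gaze_y) <= a["radius"] + tolerance:
            hits.append(a)
    return hits

# Each frame the game could fire the laser at everything returned here, e.g.:
# for target in asteroids_hit_by_gaze(active_asteroids, *tracker.gaze_point()):
#     fire_laser_at(target)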
These are only a few of many new ways eye control would let you interact with a PC desktop or laptop. When you start thinking about how this technology could interact with voice recognition, the possibilities seem endless. In a few years, Minority Report may look dated.




It’s like Kinect for PCs


Microsoft has said that it will eventually release Kinect-like motion technology for PCs. Well, I hate to break it to them, but Tobii has done a lot of the hard work. Moving Kinect to a PC would mean shifting the focus from the body to the eyes and face, something Tobii has achieved with remarkable precision.
I’ve written a lot about Microsoft and the many challenges it faces with Windows 8. Currently, the company has a failing Phone platform and no tablet strategy. To forge ahead, Microsoft needs to generate excitement around its bread and butter, which is still traditional, keyboarded PCs. Tobii’s eye control is exactly the kind of innovative interface Microsoft could build a comprehensive new experience around. It could simplify the complexities of the Windows desktop OS while bridging some of the gaps between the PC and mobile touch platforms. If implemented properly, technology like this could reignite some buzz around the laptop market, especially combined with many of the ideas Microsoft has demonstrated on Windows Phone and Xbox Kinect.
Barbara informed me that Microsoft is already a client of Tobii’s for some of its larger research units, which the Redmond giant uses to study the effectiveness of its own application layouts and interfaces by tracking the eye movements of potential customers as they experience a new design. Microsoft, I’m looking at you. If you don’t try something like this, someone else will.
Regardless of who it is, one thing is clear: Tobii needs a strong partner that realizes the potential of its technology.

Eye control is coming, hopefully

Like any good technology, Tobii’s eye control is only as useful as the software developers who write for it. The company is in talks with a number of hardware, software, and platform makers to work toward implementing its technology in PCs as soon as two years from now, but it will be a tough road ahead. The technology still needs to be smaller, use less battery, and cost less. Without the vision of Apple, touch tablet devices stagnated in limbo for a decade. Let’s hope that one of these platform makers recognizes the potential in eye control. The only loser here is the mouse, and I’m sorry, my friend, I love ya, but your days are numbered.


Read more: http://www.digitaltrends.com/computing/death-of-the-mouse-how-eye-tracking-technology-could-save-the-pc/#ixzz20WqOF5TM
