Who isn’t excited about new toys? So I just thought I would kick off the new year with a little blog on one of the new toys that arrived in the Objective Digital labs in 2011 – the Tobii mobile device eyetracker. This shiny new gadget can do what I’ve wanted to try for ages – track the eyes of a person using a mobile device (mobile phones and tablets).
For a lot of our past mobile projects, we had to use an overhead camera to record the screen of the device as a participant used it, but we couldn't tell where they were looking. So many of the findings depended on the observed behaviour and verbal responses of the participants. With the eyetracker, we can get much deeper insights into the use of mobile devices and applications.
Setting up the eyetracker is pretty straightforward. There are a few small pieces to put together but following the instructions in the user manual makes it a simple process. You just have to make sure you use the correct attachments depending on whether you want to test a mobile phone or a tablet. The device holder can be rotated so that the device can be used in portrait or landscape mode. After the setup is done, you hook up the device to one of our eyetracking PCs with Tobii Studio and configure a few settings to optimise the image coming from the device camera.
Calibrating the eyetracker is a bit different from the normal automatic calibration we do with the standard monitor eyetracker setup, in which the participant looks at the monitor and follows a moving red dot with their eyes. With the mobile version, the participant instead needs to look at a numbered calibration pad.
First, you let Tobii Studio know where the screen of the phone sits in the incoming video image. You do this by dragging a rectangle overlay on the image until it fits exactly around the calibration pad.
And because the camera image comes in upside down, you flip the rectangle (drag the top-left corner to the bottom-right, the bottom-left to the top-right, and so on). To calibrate the participant's eyes with the system, they have to look at the numbered points on the calibration pad, starting from 1 and going all the way to 5.
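That corner-swapping is really just a 180-degree rotation of the rectangle's coordinates: in a width x height image, a point (x, y) lands at (width - x, height - y). Here is a rough illustrative sketch of the idea (this is not Tobii Studio's actual code; the function name and top-left coordinate convention are my own):

```python
def flip_rect_180(x1, y1, x2, y2, width, height):
    """Return a rectangle's corners after rotating a width x height
    image by 180 degrees; (x1, y1) is top-left, (x2, y2) bottom-right."""
    nx1, ny1 = width - x2, height - y2  # old bottom-right becomes new top-left
    nx2, ny2 = width - x1, height - y1  # old top-left becomes new bottom-right
    return nx1, ny1, nx2, ny2

# e.g. a rectangle near the top-left of a 640x480 camera image
print(flip_rect_180(100, 50, 300, 400, 640, 480))  # (340, 80, 540, 430)
```

Tobii Studio handles this for you when you drag the corners; the sketch just shows why the opposite corners end up swapped.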
For example, the moderator watches the currently active calibration point on their monitor and asks the participant to look at number 1 on the calibration pad, then number 2, and so on through number 5.
Now we get onto the fun bit: the actual eyetracking. The only small annoyance I found is that the phone cannot be moved after calibration, as it stays fixed in the phone holder. The participant can still grip the phone, holder and all, in any way they choose. I tried it out by playing with a few apps on my phone, and although it was not the usual way to hold a phone, I got used to it pretty quickly.
I asked my colleague Liz to volunteer as my first participant and look for her dream house on an iPhone app. We got some pretty cool eyetracking data:
We have already completed a few iPhone and iPad eyetracking projects with fascinating results, and we will be sharing some of those stories in future blogs. So stay tuned!