Distance from sensor incurs vertical shift (and how to fix)

Postby kevin.cole » 16 Jan 2014, 04:30

I discovered that the Eye Tribe, once calibrated, will encounter issues if the user moves either forward or backward relative to the device.

Moving farther away causes a shift downwards; moving closer causes a shift upwards.

I found I may be able to correct for this issue by manipulating the output coordinates using the iris size. Perhaps an additional calibration step could ask the user to lean back while staring at a point, then lean forward while staring at the same point; that way you could capture this change and adjust for it.
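Something along these lines, as a rough sketch (hypothetical; none of these names exist in the SDK, and the reference size plus pixels-per-size-unit factor would be captured during that extra lean-back/lean-forward step):

Code:
// Hypothetical sketch only. pupilSizeAtSweetSpot and pixelsPerSizeUnit would
// be captured during the extra lean-back/lean-forward calibration step.
double EstimateVerticalShift(double currentPupilSize,
                             double pupilSizeAtSweetSpot,
                             double pixelsPerSizeUnit)
{
    // The pupil/iris images larger when closer to the camera and smaller when
    // farther away, so the size change is a rough (noisy) proxy for Z movement.
    double sizeDelta = currentPupilSize - pupilSizeAtSweetSpot;
    return sizeDelta * pixelsPerSizeUnit; //pixels to add to the reported gaze Y
}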


Another thing that I, as a developer, might find better is getting the raw X/Y of the pupil location in relation to its calibrated zone as a normalized float (1.0 to -1.0). That way we could do our own calibration or maths internally, in case our use case for the device doesn't fit the current calibration method. You could add these values for each eye on the GazeData class.
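For illustration, the mapping I have in mind would be something like this (a hypothetical helper; the zone extremes would come from our own calibration pass):

Code:
// Hypothetical helper, not in the current GazeData API: map a raw pupil
// coordinate onto a normalized -1.0..1.0 float relative to its calibrated zone.
double NormalizePupil(double raw, double zoneMin, double zoneMax)
{
    // Linear rescale: zone centre -> 0.0, zone edges -> -1.0 and +1.0.
    double halfRange = (zoneMax - zoneMin) / 2.0;
    if (halfRange == 0) return 0.0;
    double centre = (zoneMin + zoneMax) / 2.0;
    return (raw - centre) / halfRange;
}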

For example, I would like to sit the device on a podium and work with a dual-monitor setup of rather large screens which may act as one large widescreen.

Edit:
After reviewing the API, it seems the smoothed coordinates already return such a value (1/-1), so ignore that portion =) My initial point came from observing the Eye Tribe UI and a brief look at the samples.

The calibration for distance still stands, though. Do you accept contributions to the Git repo for the CSharpClient? Perhaps once I'm done I can expand on the code a bit?
kevin.cole
 
Posts: 7
Joined: 14 Jan 2014, 03:57

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby Martin » 16 Jan 2014, 14:45

Hi Kevin,

You've made the correct observation; this is a limitation of the initial release.

I'd be glad to try out your compensation routine and potentially merge/branch the Git repo.

Thanks.
Martin
 
Posts: 567
Joined: 29 Oct 2013, 15:20

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby kevin.cole » 21 Jan 2014, 12:11

I managed to add some compensation for the Z-shift. I didn't use calibration over a distance, as it was proving too tedious; the sensor would often report bad data that I wasn't able to filter out easily enough. I switched my approach to using a formula.

ZInitial = headPosition.Z at sweet spot;
ZCurrent = headPosition.Z;
YGaze = gazePoint.Y;
YMouse = cursorPoint.Y;
YDelta = YMouse - YGaze;

YConstant = Math.Sqrt(Math.Abs(YDelta))/(ZCurrent - ZInitial);

Then to compute the delta needed to put it back in the right spot:
YDelta = ((ZCurrent-ZInitial)*YConstant)^2
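A worked example with made-up numbers: say ZInitial = 0.50 at the sweet spot, and after leaning back to ZCurrent = 0.60 the dot is off by |YDelta| = 64 px. Then YConstant = Sqrt(64) / (0.60 - 0.50) = 80. Later, at ZCurrent = 0.55, the predicted correction is ((0.55 - 0.50) * 80)^2 = 4^2 = 16 px, so the correction grows quadratically with distance from the sweet spot.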


If I could get a good value for the head's Z-distance from the sensor, I'm sure it would work smoothly. The code currently has a problem when the user moves closer to the sensor; I think that's because I'm incorrectly judging head distance based on the pupils. It would be nice to know the actual distance somehow; maybe with more calibration points and tweaking?

How it works:

1. The user runs a normal calibration to set the sensor up properly (looking at the circles).
2. The user watches the GazeDot and positions their body so that the dot lines up with where they are looking on the screen.
3. The user clicks a button which calls SetSweetSpot().
4. The user moves farther away, looks at the tip of their cursor, and clicks a button that calls SetDeltas().
5. The user moves closer to the sensor and clicks a button that calls SetDeltas() again (which is supposed to calculate YConstantOnNearZ but has issues right now).
6. You now have a method of roughly correcting the gaze offset.

I was able to keep the corrected gaze location within 100 pixels of the actual gaze location. It's not perfect, but it is a heck of a lot better, and it seems to work nicely within a 2 ft range of the sweet spot. It would no doubt be better still if I could get a realistic head position, or if the sensor had depth sensing built in.

Correcting X when you move side to side will most likely require figuring out the projection from the sensor to the user, and I don't understand the math behind that very well (I was never good at 3D maths :oops: ).

Here's a video showing the result (the video is unlisted): http://www.youtube.com/watch?v=0PvtbHz_arE


Code:
using System;
using System.Drawing;
using System.Runtime.InteropServices;
using TETCSharpClient;
using TETCSharpClient.Data;

namespace NetworkController.Plugin.EyeTribe
{
    public class Correction : IGazeUpdateListener,IDisposable
    {
        public static Correction Instance;

        public double ZInitial;
        public double ZCurrent;
        public double YGaze;
        public double YMouse;
        public double YDelta;
        public double YConstantOnFarZ;
        public double YConstantOnNearZ;


        public Correction()
        {
            Instance = this;
            GazeManager.Instance.AddGazeListener(this);
        }
        public void Dispose()
        {
            GazeManager.Instance.RemoveGazeListener(this);
            Instance = null;
        }

        /// <summary>
        /// Call this when you are looking at a point on the screen and the dot drawn
        /// from the normal Eye Tribe result is exactly on that point.
        /// </summary>
        public void SetSweetSpot()
        {
            ZInitial = ZCurrent;
        }


        /// <summary>
        /// Assuming the user is looking directly at the tip of the mouse cursor, the offsets can be calculated.
        /// The user should be leaned back or forward enough to produce a noticeable error.
        /// </summary>
        public void SetDeltas()
        {
            YDelta = YMouse - YGaze;

            var zShift = ZCurrent - ZInitial;
            if (zShift == 0) return; //No Z movement yet; avoid division by zero

            //Perform calcs for constants
            //YConstant = Sqrt(|YDelta|) / (ZCurrent - ZInitial)
            if (ZCurrent < ZInitial) YConstantOnFarZ = Math.Sqrt(Math.Abs(YDelta)) / zShift;
            else YConstantOnNearZ = Math.Sqrt(Math.Abs(YDelta)) / zShift; //Different constant for near due to bad head-position Z value
        }


        public void OnGazeUpdate(GazeData gazeData)
        {

            var isValid = ((gazeData.State & GazeData.STATE_TRACKING_GAZE) != 0)
                          && ((gazeData.State & GazeData.STATE_TRACKING_PRESENCE) != 0)
                          && ((gazeData.State & GazeData.STATE_TRACKING_EYES) != 0)
                          && ((gazeData.State & GazeData.STATE_TRACKING_FAIL) == 0)
                          && ((gazeData.State & GazeData.STATE_TRACKING_LOST) == 0)
                          && gazeData.SmoothedCoordinates != null
                          && gazeData.SmoothedCoordinates.X != 0
                          && gazeData.SmoothedCoordinates.Y != 0;
            if (!isValid) return;

            var headPosition = gazeData.HeadPosition(); //See other post for code: http://theeyetribe.com/forum/viewtopic.php?f=11&t=35&sid=301d70c38eb44f37495cf997dc8d9b11
            if (headPosition == null) return;

            //The data now looks valid, so we can make use of it.
            var cursorPoint = GetCursorPosition();
            var gazePoint = gazeData.SmoothedCoordinates;

            ZCurrent = headPosition.Z;
            YGaze = gazePoint.Y;
            YMouse = cursorPoint.Y;

        }

        public Point2D CorrectPoint(Point2D point)
        {
            var result = new Point2D(point);
            var y = (ZCurrent - ZInitial)* (ZCurrent<ZInitial ? YConstantOnFarZ : YConstantOnNearZ );
            result.Y = result.Y - (y*y); //Yd = ((Zc-Zi)*Yc)^2
            return result;
        }


        public void OnCalibrationStateChanged(bool isCalibrated) { }

        public void OnScreenIndexChanged(int screenIndex) { }

        #region Windows Mouse API
       
        /// <summary>
        /// Struct representing a point.
        /// </summary>
        [StructLayout(LayoutKind.Sequential)]
        public struct POINT
        {
            public int X;
            public int Y;

            public static implicit operator Point(POINT point)
            {
                return new Point(point.X, point.Y);
            }
        }

        /// <summary>
        /// Retrieves the cursor's position, in screen coordinates.
        /// </summary>
        /// <see>See MSDN documentation for further information.</see>
        [DllImport("user32.dll")]
        public static extern bool GetCursorPos(out POINT lpPoint);

        public static Point GetCursorPosition()
        {
            POINT lpPoint;
            GetCursorPos(out lpPoint); //On failure lpPoint stays (0,0); ignored here for brevity
            return lpPoint;
        }
        #endregion
    }

}
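For reference, wiring it up looks roughly like this (assuming GazeManager.Instance.Activate(...) has already been called elsewhere, as in the SDK samples):

Code:
var correction = new Correction(); //registers itself as a gaze listener

//1. After the normal calibration, with the gaze dot lined up at the sweet spot:
correction.SetSweetSpot();

//2. Leaned back, then leaned forward, looking at the cursor tip each time:
correction.SetDeltas();

//3. For every smoothed gaze point received afterwards:
//   var corrected = correction.CorrectPoint(gazeData.SmoothedCoordinates);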
kevin.cole
 
Posts: 7
Joined: 14 Jan 2014, 03:57

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby bostwickenator » 06 Mar 2014, 05:58

Eye-to-eye distance will be far more accurate for determining your Z.
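For example (a sketch only; by similar triangles the imaged eye spacing scales roughly as 1/Z, and k would be calibrated once at a known distance):

Code:
// Sketch: estimate Z from the distance between the two pupil centres, using
// the per-eye PupilCenterCoordinates from the C# client's GazeData.
double EstimateZ(GazeData gazeData, double k)
{
    var left = gazeData.LeftEye.PupilCenterCoordinates;
    var right = gazeData.RightEye.PupilCenterCoordinates;
    double dx = right.X - left.X;
    double dy = right.Y - left.Y;
    double spacing = Math.Sqrt(dx * dx + dy * dy);
    return spacing > 0 ? k / spacing : 0; //same unit as the calibration distance
}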
bostwickenator
 
Posts: 11
Joined: 02 Mar 2014, 22:04

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby bostwickenator » 27 May 2014, 05:08

Is this implemented in the server yet?
bostwickenator
 
Posts: 11
Joined: 02 Mar 2014, 22:04

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby Martin » 27 May 2014, 13:39

bostwickenator wrote: Is this implemented in the server yet?


On its way. This feature will mark version 1.0. It's not just a simple compensation coefficient; it's a whole new calibration model.
There are a couple of things left to fix, and we won't ship this feature until it works really well. Preliminary results are pretty spectacular.
Martin
 
Posts: 567
Joined: 29 Oct 2013, 15:20

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby aamanieu » 21 Aug 2014, 00:42

Is there a downloadable beta, or even an alpha, version of this miraculous version 1.0 that fixes this major problem?
aamanieu
 
Posts: 10
Joined: 02 Apr 2014, 00:51

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby Martin » 02 Sep 2014, 19:39

Not yet. It's in internal testing and performance optimization.
Martin
 
Posts: 567
Joined: 29 Oct 2013, 15:20

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby joaquin » 05 Feb 2015, 20:06

I think using the pupils as a reference is not such a good idea, because their size might change over time due to habituation to room light or variations in room lighting.
joaquin
 
Posts: 19
Joined: 05 Feb 2015, 18:18

Re: Distance from sensor incurs vertical shift (and how to fix)

Postby JeffKang » 31 Jul 2015, 09:07

Asymmetric Aperture raytrace patent expiry (for Z movement?)

viewtopic.php?f=8&t=520

Asymmetric Aperture eye tracking - Explicit raytracing for gimbal-based gazepoint trackers expired patent - solution for z-axis forwards and backwards head movement?

Could the Asymmetric Aperture method be an adequate interim solution to the vertical shift from z-axis movement as the head moves forwards and backwards?

Patent “Explicit raytracing for gimbal-based gazepoint trackers” (WO 2006108017 A2) seems to have expired, and is now in the public domain.
http://www.google.com/patents/WO2006108017A2?cl=en

Interactive Minds' eye tracking uses the Asymmetric Aperture method:

To achieve high gaze point tracking accuracy, the image processing algorithms in our eye tracking systems explicitly accommodate several common sources of gaze point tracking error.

The accuracy of video eye trackers is typically sensitive to head motion along the camera axis.
As the head moves toward the camera, the predicted gaze point (if uncorrected for range) moves radially away from the camera; as the head moves backward, the predicted gaze point moves radially in toward the camera.

Typically, when a person is about 60 cm from the camera and looking at a point toward the top of the computer screen, head motions of 2.5 cm along the camera Z axis result in predicted gaze point variations of about 1.9 cm.

All our eye trackers use the patented Asymmetric Aperture Method to measure variations in the range between the camera and the cornea of the eye, and they use the range information to minimize gaze point tracking errors resulting from longitudinal head motions.


http://www.interactive-minds.com/eye-tracker

________________________________________

Explicit raytracing for gimbal-based gazepoint trackers (WO 2006108017 A2)

Abstract

One embodiment of the present invention is a method for computing a first gaze axis of an eye in a first coordinate system.
A camera is focused on the eye and moved to maintain the focus on the eye as the eye moves in the first coordinate system.
A first location of the camera in the first coordinate system is measured.
A second location of the eye and a gaze direction of the eye within a second coordinate system are measured.
A second gaze axis within the second coordinate system is computed from the second location and the gaze direction.
The first gaze axis is computed from the second gaze axis and the first location using a first coordinate transformation.


The patent appears to have expired:
May 14, 2008 — 32PN EP: public notification in the EP bulletin as the address of the addressee cannot be established.
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC - FORM 1205A (05.03.2008)


http://www.google.com/patents/WO2006108017A2?cl=en

TLDR: This method seems an awful lot more complex than it is.
Basically, it projects an IR pattern onto the eye that changes as the range changes, so you can measure range by looking at how the pattern appears.
It's pretty simple, doesn't require a ton of math to compute, and gives you a highly precise range measurement.
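In code terms, the range recovery could be as simple as a calibrated lookup (illustrative only; patternMetric stands in for whatever blur-shape measurement the method produces, and the sample arrays would be captured once at known distances):

Code:
// Illustrative only: recover range from a measured pattern metric by linear
// interpolation over calibration samples taken at known distances.
// metricSamples must be sorted ascending and pair one-to-one with rangeSamples.
double RangeFromPattern(double patternMetric, double[] metricSamples, double[] rangeSamples)
{
    for (int i = 1; i < metricSamples.Length; i++)
    {
        if (patternMetric <= metricSamples[i])
        {
            double t = (patternMetric - metricSamples[i - 1])
                     / (metricSamples[i] - metricSamples[i - 1]);
            return rangeSamples[i - 1] + t * (rangeSamples[i] - rangeSamples[i - 1]);
        }
    }
    return rangeSamples[rangeSamples.Length - 1]; //clamp past the last sample
}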
JeffKang
 
Posts: 129
Joined: 15 Feb 2014, 23:59

