The EyeTribe C# SDK simplifies connecting to the server and parsing its data. Although this is a sample implementation, we recommend using the reference clients when possible, as they have been verified to work with the EyeTribe Server. If you prefer not to use the SDK, a plain C# example is provided further down.
To get started, go to the C# SDK on GitHub. From there you can import the library into your project using NuGet, clone the repository and reference it from your own project, or go to the 'releases' section and download the latest DLL.
Note that you can find examples of how to use the EyeTribe C# SDK in the official C# Samples on GitHub. These are a great source of inspiration for anyone new to the EyeTribe Dev Kit.
In the constructor of our application we need to call Activate on the GazeManager. The first parameter indicates the API version; we provide an enum here that lists the versions this specific client build is compatible with. The second parameter specifies that we want data continuously pushed to our application (the alternative is pull, where data is fetched manually by request).
Classes that wish to receive gaze data should implement the IGazeListener interface. This interface contains three methods, of which OnGazeUpdate is of most interest since it delivers the coordinates of the estimated on-screen gaze position, the size of the pupils, the position of the eyes relative to the sensor, etc.
Note that the Eye Tracker Server must be calibrated before it outputs gaze data.
using System.Threading;
using TETCSharpClient;
using TETCSharpClient.Data;

public class GazePoint : IGazeListener
{
    public GazePoint()
    {
        // Connect client
        GazeManager.Instance.Activate(GazeManager.ApiVersion.VERSION_1_0, GazeManager.ClientMode.Push);

        // Register this class for events
        GazeManager.Instance.AddGazeListener(this);

        Thread.Sleep(5000); // simulate app lifespan (e.g. OnClose/Exit event)

        // Disconnect client
        GazeManager.Instance.Deactivate();
    }

    public void OnGazeUpdate(GazeData gazeData)
    {
        double gX = gazeData.SmoothedCoordinates.X;
        double gY = gazeData.SmoothedCoordinates.Y;
        // Move point, do hit-testing, log coordinates etc.
    }
}
In order to fetch the data stream coming out of the Tracker Server we need to do three things: connect a socket to the server, spawn a thread that continuously reads and parses incoming data, and send periodic heartbeat requests to keep the connection alive.
Let's look at how we can accomplish this in C#. This is a simplified version that demonstrates the basic concepts of obtaining data from the server.
First we define three objects.
private TcpClient socket;
private Thread incomingThread;
private System.Timers.Timer timerHeartbeat;
The socket is connected to "localhost" on port 6555 (the default values). Using the same socket we then send a simple connect request that specifies which version of the API we will be using. In order to continuously receive data we spawn a thread that reads from the socket. The JSON data received is parsed into an object called Packet, whose key-value pairs can be parsed further.
Since we want to keep the connection with the server alive we are required to send heartbeat requests at certain intervals. To achieve this we create a timer that ticks every N milliseconds and sends a heartbeat request over the socket.
First, let's create a Connect() method that connects the socket, starts the listener thread, and starts the heartbeat timer.
public bool Connect(string host, int port)
{
    try
    {
        socket = new TcpClient(host, port); // e.g. "localhost", 6555
    }
    catch (Exception ex)
    {
        Console.Out.WriteLine("Error connecting: " + ex.Message);
        return false;
    }

    // Send the obligatory connect request message
    string REQ_CONNECT = "{\"values\":{\"push\":true,\"version\":1},\"category\":\"tracker\",\"request\":\"set\"}";
    Send(REQ_CONNECT);

    // Launch a separate thread to parse incoming data
    incomingThread = new Thread(ListenerLoop);
    incomingThread.Start();

    // Start a timer that sends a heartbeat every 250ms.
    // The minimum interval required by the server can be read out
    // in the response to the initial connect request.
    string REQ_HEARTBEAT = "{\"category\":\"heartbeat\",\"request\":null}";
    timerHeartbeat = new System.Timers.Timer(250);
    timerHeartbeat.Elapsed += delegate { Send(REQ_HEARTBEAT); };
    timerHeartbeat.Start();

    return true;
}
The Send(string message) method is used to send data back to the server. In this example these messages consist of the initial connect request and the subsequent heartbeats.
private void Send(string message)
{
    if (socket != null && socket.Connected)
    {
        StreamWriter writer = new StreamWriter(socket.GetStream());
        writer.WriteLine(message);
        writer.Flush();
    }
}
From the Connect() method we spawn a new thread that reads data from the socket and parses the JSON messages into a Packet object. We use the Newtonsoft Json.NET library to handle the parsing. Once a packet has been parsed we raise an event with the data.
public event EventHandler<ReceivedDataEventArgs> OnData;

private bool isRunning;

private void ListenerLoop()
{
    StreamReader reader = new StreamReader(socket.GetStream());
    isRunning = true;

    while (isRunning)
    {
        string response = string.Empty;
        try
        {
            response = reader.ReadLine();
            JObject jObject = JObject.Parse(response);

            Packet p = new Packet();
            p.RawData = response;
            p.Category = (string)jObject["category"];
            p.Request = (string)jObject["request"];
            p.StatusCode = (string)jObject["statuscode"];

            JToken values = jObject.GetValue("values");
            if (values != null)
            {
                /* We can further parse the key-value pairs from the values here,
                   for example using a switch on the Category and/or Request to
                   create GazeData or CalibrationResult objects and pass these
                   via separate events.

                   To get the estimated gaze coordinate (on-screen pixels):

                   JObject gaze = JObject.Parse(jFrame.SelectToken("avg").ToString());
                   double gazeX = (double)gaze.Property("x").Value;
                   double gazeY = (double)gaze.Property("y").Value;
                */
            }

            // Raise event with the data
            if (OnData != null)
                OnData(this, new ReceivedDataEventArgs(p));
        }
        catch (Exception ex)
        {
            Console.Out.WriteLine("Error while reading response: " + ex.Message);
        }
    }
}
We use a simple container class called Packet to hold the data.
public class Packet
{
    public string Time = DateTime.UtcNow.Ticks.ToString();
    public string Category = string.Empty;
    public string Request = string.Empty;
    public string StatusCode = string.Empty;
    public string Values = string.Empty;
    public string RawData = string.Empty;

    public Packet() { }
}
The Packet is then broadcast to event subscribers by raising a custom event that simply contains the Packet.
public class ReceivedDataEventArgs : EventArgs
{
    private readonly Packet packet;

    public ReceivedDataEventArgs(Packet _packet)
    {
        this.packet = _packet;
    }

    public Packet Packet
    {
        get { return packet; }
    }
}
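To tie the pieces together, the connection and event classes above could be used as in the following sketch. The wrapper class name NativeClient is an assumption for illustration; the Connect() method, OnData event, and Packet container are the ones defined above, and the server must be running for this to do anything.

```csharp
// Hypothetical usage sketch: "NativeClient" is an assumed class that hosts
// the Connect() method and OnData event shown above.
var client = new NativeClient();

client.OnData += (object sender, ReceivedDataEventArgs e) =>
{
    // Frames from the "tracker" category carry gaze data
    if (e.Packet.Category == "tracker")
        Console.Out.WriteLine("Received: " + e.Packet.RawData);
};

if (!client.Connect("localhost", 6555))
    Console.Out.WriteLine("Unable to connect to the EyeTribe Server.");
```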
A trackbox is a small graphical component that illustrates a person's position relative to the sensor. This is useful for seeing whether you are within range of the sensor and whether tracking is fully functional. In most implementations the position of the eyes is drawn on a surface that maps to the physical sensor of the eye tracker device.
The easiest way to display a trackbox in your application is to add the TETControls.dll to your .NET project in Visual Studio. Then you can simply drag and drop the TrackBoxStatus control onto your application surface. A simple sample using WPF XAML:
<Window x:Class="MyApp.MainWindow"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:TrackBox="clr-namespace:TETControls.TrackBox;assembly=TETControls"
    Height="340" Width="310">
    <Grid>
        <Grid x:Name="TrackingStatusGrid">
            <TrackBox:TrackBoxStatus x:Name="trackingStatus"
                Width="300"
                Height="250"
                Margin="4"
                HorizontalAlignment="Center"
                VerticalAlignment="Top" />
        </Grid>
    </Grid>
</Window>
No additional changes to the C# code are required, assuming that the GazeManager has been activated (required once per application).
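For completeness, the corresponding code-behind could look like the following minimal sketch. The namespace TETCSharpClient is assumed to be the SDK namespace; the activation call is the same one used earlier in this document.

```csharp
using System.Windows;
using TETCSharpClient;

namespace MyApp
{
    public partial class MainWindow : Window
    {
        public MainWindow()
        {
            InitializeComponent();

            // Activate the client once per application; the TrackBoxStatus
            // control then picks up tracking state from the shared GazeManager.
            GazeManager.Instance.Activate(GazeManager.ApiVersion.VERSION_1_0, GazeManager.ClientMode.Push);
        }
    }
}
```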
In order to get on-screen coordinates of where someone is looking, a calibration must be performed. This procedure creates a model that maps the user's eyes to the display. A typical calibration routine displays a point that is moved to different locations on the screen and left visible for a second or so while the eye tracker collects samples. Once all points have been sampled, typically nine, the server computes and sets the active calibration, whereupon the output of on-screen x and y coordinates begins.
In this C# example we first need to include the TETWinControls.dll in our Visual Studio project. In the simplest form we use the CalibrationRunner to handle the entire process. This launches a new window that closes on completion.
using TETUserInterface.Calibration;

public class MyApplication
{
    private void ButtonCalibrateClicked()
    {
        CalibrationRunner calRunner = new CalibrationRunner();
        calRunner.OnResult += calRunner_OnResult;
        calRunner.Start();
    }

    private void calRunner_OnResult(object sender, CalibrationRunnerEventArgs e)
    {
        switch (e.Result)
        {
            case CalibrationRunnerResult.Success:
                MessageBox.Show(this, "Calibration success " + e.CalibrationResult.AverageErrorDegree);
                break;
            case CalibrationRunnerResult.Abort:
                MessageBox.Show(this, "The calibration was aborted. Reason: " + e.Message);
                break;
            case CalibrationRunnerResult.Error:
                MessageBox.Show(this, "An error occurred during calibration. Reason: " + e.Message);
                break;
            case CalibrationRunnerResult.Failure:
                MessageBox.Show(this, "Calibration failed. Reason: " + e.Message);
                break;
            case CalibrationRunnerResult.Unknown:
                MessageBox.Show(this, "Calibration exited with unknown state. Reason: " + e.Message);
                break;
        }
    }
}
It's also possible for any class to implement the ICalibrationResultListener interface, which calls back to listeners via the method OnCalibrationChanged(bool isCalibrated, CalibrationResult result). This comes directly over the API (which the CalibrationRunner uses to raise its own event, as seen above). The benefit of using the CalibrationRunner is that it raises additional events, for example when the device was disconnected and the calibration aborted, or when the user aborted the calibration.
public class MyClass : ICalibrationResultListener
{
    public MyClass()
    {
        // If needed, activate client (once per app)
        GazeManager.Instance.Activate(GazeManager.ApiVersion.VERSION_1_0, GazeManager.ClientMode.Push);

        // Register this class for calibration-ready callbacks
        GazeManager.Instance.AddCalibrationStateListener(this);
    }

    public void OnCalibrationChanged(bool isCalibrated, CalibrationResult result)
    {
        Console.Out.WriteLine("AverageAccuracy: " + result.AverageErrorDegree);
    }
}
We can obtain the quality of the calibration from the CalibrationResult object, which contains the following values.
| Name | Key | Type | Description |
| --- | --- | --- | --- |
| Result | result | bool | Overall success |
| AverageErrorDegree | deg | double | Mean accuracy for all points in degrees of visual angle |
| AverageErrorDegreeLeft | degl | double | Left eye average accuracy for all points in degrees of visual angle |
| AverageErrorDegreeRight | degr | double | Right eye average accuracy for all points in degrees of visual angle |
| Calibpoints | | CalibrationPoint[] | Array of values for each point |
To find points in the sequence that were not sampled correctly, we can iterate over the CalibrationPoint objects in the results array. The values of the CalibrationPoint are listed in the table below.
| Name | Key | Type | Description |
| --- | --- | --- | --- |
| State | state | int | STATE_NO_DATA = 0, STATE_RESAMPLE = 1, STATE_OK = 2 |
| Coordinates | cp | Point2D | Center X and Y position of the point in full-screen coordinates (top-left = 0,0) |
| MeanEstimatedCoords | mepix | Point2D | Average estimated gaze position from all samples |
| Accuracy | acd | Accuracy | Plain object that contains three doubles (degrees): Average, Left, Right |
| MeanError | mecp | MeanError | Plain object that contains three doubles (pixels): Average, Left, Right |
| StandardDeviation | asdp | StandardDeviation | Plain object that contains three doubles (pixels): Average, Left, Right |
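The iteration described above could be sketched as follows. The property names mirror the table (Calibpoints, State, Coordinates); the exact casing in the SDK may differ, so treat this as an assumption to verify against the client library.

```csharp
// Hedged sketch: walk the calibration points and collect those the server
// flagged as poorly sampled. Property names follow the tables above.
foreach (CalibrationPoint point in result.Calibpoints)
{
    // STATE_OK = 2; anything else means the point had no or poor data
    if (point.State != 2)
    {
        Console.Out.WriteLine("Poor sample at (" + point.Coordinates.X + ", " + point.Coordinates.Y + ")");
        // If desired, re-run CalibrationPointStart/End for this position
    }
}
```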
Performing a calibration requires following a predefined sequence of steps. First we need to tell the server that a calibration is starting, then we signal the start and stop for each point. Once all points have been sampled the server will return with the calibration results.
You can grab the working code of a complete calibration sample on GitHub.
private void DoCalibration()
{
    GazeManager.Instance.CalibrationStart(9, this);

    foreach (Point point in calibrationPoints)
    {
        ShowPointAtPosition(point.X, point.Y);

        // Let eyes settle on point
        Thread.Sleep(200);

        // Notify server point start
        GazeManager.Instance.CalibrationPointStart(point.X, point.Y);

        // Sampling for 800ms...
        Thread.Sleep(800);

        // Notify server point end, then move to next
        GazeManager.Instance.CalibrationPointEnd();
    }

    CalibrationResult result = GazeManager.Instance.LatestCalibrationResult;
}
Let’s take a look at the JSON messages that are sent during the calibration. First we send a packet telling the server that we want to start a new calibration with nine points.
{ "category": "calibration", "request":"start", "values": { "pointcount":9 } }
For each point we send the following PointStart and PointEnd messages.
{ "category":"calibration", "request":"pointstart", "values": { "x":100, "y":100 } } { "category":"calibration", "request":"pointend" }
Once all points have been sampled we get the CalibrationResult delivered from the server in reply to the last pointend request. This response contains average values for the sequence as well as a list of points with corresponding values for each position. By parsing and iterating over the points we can find positions where we obtained poor samples; this can be useful for re-sampling positions where tracking was poor (perhaps the user was not looking at the screen).
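Such parsing could be sketched with Json.NET as below. Here calibresult is assumed to be the JObject holding the calibration values from the reply, with a calibpoints array whose entries use the keys listed in the tables above (state, cp); verify the exact nesting against the API section.

```csharp
// "calibresult" is an assumed JObject parsed from the server's reply;
// the keys "calibpoints", "state" and "cp" follow the tables above.
JToken points = calibresult["calibpoints"];
if (points != null)
{
    foreach (JToken p in points)
    {
        int state = (int)p["state"];
        if (state != 2) // anything but STATE_OK
        {
            double x = (double)p["cp"]["x"];
            double y = (double)p["cp"]["y"];
            Console.Out.WriteLine("Resample point at (" + x + ", " + y + ")");
        }
    }
}
```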
You can read more about the JSON message representing the calibration result in our API section.