Hi, I'm also interested in this feature.
Ideally, I would like to be able to calibrate one eye while the other is covered, and then repeat the same process for the other eye. Once calibrated, the system would calculate the gaze positions and angles for both eyes independently. This would enable research into eye movement disorders such as strabismus. It would also let users with such conditions use gaze interaction more effectively, because they could rely on the dominant eye alone (the deviating eye would be ignored).
Is there any chance this feature will be released? If it's not high on your priority list (it's an edge case, after all), I was thinking about some alternatives. For instance, I could develop the calibration routine myself, but then I would need access to the head movement variables so I can compensate for head movements; a rough sketch of what I mean is below.
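Just to illustrate the kind of compensation I have in mind, here is a minimal sketch. None of the names or the head-pose convention below come from the actual SDK; they are only my assumptions about what data (head position plus yaw/pitch/roll per sample) would be needed to project each eye's calibrated gaze direction onto the screen independently:

[code]
import numpy as np

def head_rotation(yaw, pitch, roll):
    """Rotation matrix from the head frame to the world frame (Z-Y-X order, radians)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def gaze_on_screen(gaze_dir_head, eye_pos_head, head_pos, yaw, pitch, roll,
                   screen_origin, screen_x, screen_y):
    """
    Intersect one eye's gaze ray with the screen plane, compensating for head pose.
    gaze_dir_head : unit gaze direction in head coordinates (from my own per-eye calibration)
    eye_pos_head  : rough eye offset from the head origin, in head coordinates
    head_pos, yaw, pitch, roll : head position and orientation from the tracker (assumed)
    screen_origin, screen_x, screen_y : a screen corner and its in-plane unit axes (world frame)
    """
    R = head_rotation(yaw, pitch, roll)
    eye_world = head_pos + R @ eye_pos_head   # eye position in world coordinates
    dir_world = R @ gaze_dir_head             # gaze direction in world coordinates
    n = np.cross(screen_x, screen_y)          # screen plane normal
    t = np.dot(screen_origin - eye_world, n) / np.dot(dir_world, n)
    hit = eye_world + t * dir_world           # ray/plane intersection in world coordinates
    # express the hit point in screen coordinates (distance along the two screen axes)
    return np.dot(hit - screen_origin, screen_x), np.dot(hit - screen_origin, screen_y)
[/code]

If head position and orientation were exposed per sample, I could run this separately for each eye and simply ignore the deviating eye's output for interaction.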
I posted a related question about the head movement variables here: viewtopic.php?f=8&t=172
Any info would be appreciated. Thanks!