Dwell Clicker 2 – detecting interface elements, and snapping to them

An assistive technology website reviewed the Eye Tribe tracker:
http://www.spectronics.co.nz/blog/confe ... s-have-it/. They successfully tested the eye tracker with software called Dwell Clicker 2 (https://www.sensorysoftware.com/dwellclicker.html).
It apparently works with a head pointer, a joystick, and now the Eye Tribe tracker. It “allows you to use a mouse or other pointing device without clicking buttons”. It allows you to snap your clicks to targets. “Target snapping is a feature that makes it easier to click on specific elements on the screen. These elements include buttons, menu items and links. Target snapping works by detecting elements near the pointer that you might want to click on, and locking onto the nearest element.”
(There are free and paid versions, and I think the paid version has the snapping feature.)
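To illustrate the target-snapping idea, here is a minimal sketch in Python (my own guess at the mechanism, not Dwell Clicker 2’s actual code). It assumes the program can get a list of clickable element rectangles from somewhere like an accessibility API; the names and the snap radius are made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class Element:
    """A clickable UI element, e.g., gathered via an accessibility API."""
    name: str
    left: float
    top: float
    right: float
    bottom: float

    def distance_to(self, x: float, y: float) -> float:
        """Distance from point (x, y) to this rectangle (0 if inside it)."""
        dx = max(self.left - x, 0, x - self.right)
        dy = max(self.top - y, 0, y - self.bottom)
        return (dx * dx + dy * dy) ** 0.5

def snap_target(elements, x, y, max_snap_distance=40.0):
    """Return the nearest element within the snap radius, or None."""
    best = min(elements, key=lambda e: e.distance_to(x, y), default=None)
    if best is not None and best.distance_to(x, y) <= max_snap_distance:
        return best
    return None
```

In a real implementation the element list would presumably come from something like Windows UI Automation, and the snap radius would be tuned to the pointing device’s accuracy.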
I haven’t tested Dwell Clicker 2 yet, but there are two additional features that would probably help it:
GazeTalk – Resume/accumulation of dwell time

The first feature is found in GazeTalk: resume/accumulation of dwell time.
GazeTalk is a free, predictive text entry system by the nonprofit Gaze Group organization (Eye Tribe was derived from Gaze Group):
http://wiki.cogain.org/index.php/Gazetalk_About.
If any jumpiness from eye tracking causes the fixation on an intended button to be briefly interrupted, you could have the option to recognize a resumption of the dwelling and continue accumulating the dwell time for that particular button. GazeTalk has an “Accumulate dwell time” function to “avoid the re-setting of a button”.
To prevent too many partial timers from building up, perhaps activation of one element could reset all of them, or partial accumulations could slowly decay while their element is not being focused on (a sketch of this idea follows below). Edit: successful activation of one button does in fact reset any partial accumulations.
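Here is a minimal sketch of how such resume/accumulation could work; this is my guess at the mechanism, not GazeTalk’s actual implementation, and the threshold and decay-rate values are hypothetical:

```python
import time

class DwellAccumulator:
    """Accumulates dwell time per target across brief gaze interruptions.

    Hypothetical parameters: `threshold` is the dwell time needed to
    activate a target; `decay_rate` drains a target's partial time while
    the gaze is elsewhere, so stale partials fade instead of lingering.
    """

    def __init__(self, threshold=1.0, decay_rate=0.5):
        self.threshold = threshold
        self.decay_rate = decay_rate
        self.partials = {}          # target id -> accumulated seconds
        self.last_update = time.monotonic()

    def update(self, gazed_target):
        """Call each frame with the target under the gaze (or None).

        Returns the activated target, or None."""
        now = time.monotonic()
        dt = now - self.last_update
        self.last_update = now

        for target in list(self.partials):
            if target == gazed_target:
                self.partials[target] += dt
            else:
                # Decay unfocused partials; drop them once they hit zero.
                self.partials[target] -= self.decay_rate * dt
                if self.partials[target] <= 0:
                    del self.partials[target]

        if gazed_target is not None:
            self.partials.setdefault(gazed_target, 0.0)
            if self.partials[gazed_target] >= self.threshold:
                self.partials.clear()  # activation resets all partials
                return gazed_target
        return None
```

With decay, a briefly interrupted fixation resumes near where it left off, while abandoned partials drain away on their own.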
(I’m still not sure how to transfer text out of GazeTalk to the window in focus. With Dragon NaturallySpeaking, you can dictate into a Dictation Box and then transfer the text. Also, pressing buttons in the Windows on-screen keyboard sends output to the current window in focus.)
GazeMouse – Zoom/magnification

The second feature can be found in GazeMouse: zoom/magnification.
Gaze Group also offers GazeMouse, a free gaze-based interface for simulating mouse clicks using gaze input alone (http://www.gazegroup.org/downloads).
It lets you zoom in a few steps before selecting, which makes smaller interface elements easier to hit.
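As a rough illustration of why repeated zooming helps: if each zoom step magnifies the region around the gaze point by some factor, the tracker’s fixed on-screen error covers a proportionally smaller slice of the UI after each step. The numbers here are hypothetical:

```python
def effective_error(tracker_error_px, zoom_factor, steps):
    """On-screen selection error after `steps` zoom refinements.

    Each zoom step magnifies the region around the gaze point, so the
    same tracker error spans a proportionally smaller part of the UI.
    """
    return tracker_error_px / (zoom_factor ** steps)

# Hypothetical numbers: a 40 px tracker error and 3x zoom per step.
for steps in range(4):
    print(steps, round(effective_error(40.0, 3.0, steps), 1))
# 0 -> 40.0 px, 1 -> 13.3 px, 2 -> 4.4 px, 3 -> 1.5 px
```

So even two zoom steps would bring a coarse tracker down to roughly pixel-level targets, at the cost of extra dwells.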
Conclusion

Each individual application has some very useful features. If an open source project were started that combined some of these features, it could benefit a lot of people.
Also, developers could adopt some of these features in their own programs, even ones that have nothing to do with accessibility, e.g. detecting and snapping to interface elements. Whether you’re dealing with a non-touch/non-eye-tracking interface with smaller elements and buttons, or a touch/eye-tracking interface with larger elements, the ability to detect and snap to the nearest interface element could benefit both types of interfaces.
bkb – open source program to control the computer

Edit: MastaLomaster, a member of this forum, has an open source application (called bkb?) for controlling a computer, which can be found here:
https://github.com/MastaLomaster/bkb
The page says that the program works with the Eye Tribe tracker.
I just found a video demonstration of it here:
https://www.youtube.com/watch?v=O68C4d2SNC8 (the video is labeled in Russian, which is probably why it took so long to find).
bkb, GazeMouse, and PCEye software

It looks like bkb functions similarly to GazeMouse and the PCEye software (https://www.youtube.com/watch?v=6n38nQQOt8U).
With bkb, you dwell on/fixate on widgets in a vertical menu bar docked on the right. There are widgets for single-clicking, double-clicking, etc. There is also a virtual on-screen keyboard that can be brought up.
One thing I’ve noticed with GazeMouse is that after choosing a command, like left click, the command is repeated every time the cursor stops moving (the program assumes the user is fixating). Every movement and subsequent stop of the cursor produces an action until you dwell on/mouse over a “pause” widget in a vertical menu bar (similar to PCEye’s bar). With PCEye’s interface, on the other hand, you go back to a widget in the docked menu bar for each action, and it looks like bkb works the same way.
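To make the difference concrete, here is a toy sketch (my own model, not taken from any of these programs) contrasting the two behaviors: a “repeat” style where a selected action fires on every fixation until paused, and a “one-shot” style where each fixation consumes the selection:

```python
from enum import Enum

class Style(Enum):
    REPEAT = "repeat"      # GazeMouse-style: action repeats each fixation
    ONE_SHOT = "one_shot"  # PCEye/bkb-style: re-select the widget each time

class ClickController:
    """Toy model of the two action-selection behaviors described above.

    select() models dwelling on a menu-bar widget; on_fixation() models
    the cursor stopping on a target. All names here are hypothetical.
    """

    def __init__(self, style: Style):
        self.style = style
        self.armed_action = None

    def select(self, action: str):
        self.armed_action = action

    def pause(self):
        """Dwelling on a 'pause' widget clears the repeating action."""
        self.armed_action = None

    def on_fixation(self):
        action = self.armed_action
        if self.style is Style.ONE_SHOT:
            self.armed_action = None  # must revisit the bar next time
        return action

# Repeat style: one selection, many clicks until paused.
gm = ClickController(Style.REPEAT)
gm.select("left_click")
print(gm.on_fixation(), gm.on_fixation())  # left_click left_click
gm.pause()
print(gm.on_fixation())                    # None

# One-shot style: each fixation consumes the selection.
pc = ClickController(Style.ONE_SHOT)
pc.select("left_click")
print(pc.on_fixation(), pc.on_fixation())  # left_click None
```

The repeat style saves trips to the menu bar for runs of identical clicks, while the one-shot style avoids accidental repeated actions; each trades speed against safety.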
(Interaction with these programs might be faster if they had a movable, floating window, like Paint.net’s floating “Tools” window, to optionally house shortcuts to the widgets for quicker access.)
Anyways, I’m glad that open source accessibility software has been started. bkb looks great!