
bkb is a program to control keyboard/mouse with eyes

PostPosted: 12 Apr 2014, 09:32
by MastaLomaster
bkb is a program to control the keyboard and mouse with your eyes.
It supports The Eye Tribe eye tracker, the Tobii REX eye tracker, and any device that can move a mouse cursor (e.g. an air mouse).

The program and the source code are available for download here:
https://github.com/MastaLomaster/bkb


Installation
Just unzip bkb32c-English.zip to any folder. Make sure that this folder remains the working directory of the program; otherwise the program won't load the messages.bkb and keyboard.bkb files, and you'll get the Russian interface instead of the English one.

(updated)
To run the program you also need the Microsoft Visual C++ Redistributable for Visual Studio 2012 Update 4. It can be downloaded here:

http://www.microsoft.com/en-us/download ... x?id=30679

If it is not installed, you'll get an error message complaining that the file "msvcr110.dll" cannot be found.

Using the program with The Eye Tribe tracker

The "Eye Tribe Server" program must be running. Also you need to calibrate the device with the "Eye Tribe UI" before running the bkb32c.exe

Using the program with the Tobii REX eye tracker

(updated)
You need the "TobiiGazeCore32.dll" file from the Tobii Gaze SDK 4.0 to be copied to the working directory of the program. By the way, I found it in my "C:\Program Files (x86)\Tobii\Tobii EyeX" directory after installing the "Tobii Eye Experience". Alternatively, Gaze SDK 4.0 can be downloaded from: http://developer.tobii.com/downloads/ (registration required). Look for the "TobiiGazeSdk-CApi-4.0.X.XXX-Win32" file, where X-current release numbers.

Before starting the program, open the Windows Control Panel, run the "Tobii EyeX Settings (32 bit)" program, and calibrate the device.

Keyboard click sounds
There is a click sound when you press the keyboard buttons. If you don't like the sound, place a WAV-file with the desired sound into the working directory of the program and name it "click.wav".

Basic work principles
After the program starts and a supported device is selected, a toolbar appears on the right side of the screen. If you use an eye tracker, a transparent window with a cursor is shown; it follows your eye movements. When using an [air]mouse, the regular cursor is used. To select a tool, fixate your eyes on its button.

Take a look at these videos to understand the modes of operation:

http://youtu.be/O68C4d2SNC8

**IMPORTANT**: choose the Swahili language to see the English subtitles. Sorry, I don't know any other way to keep the subtitles switched off by default.

http://youtu.be/rqcN9IZ39_4

Known issues
- no easy way to exit the program; you have to close its windows from the task bar
- doesn't work with fullscreen applications so far
- doesn't work with the Metro-style interface of Windows 8/8.1; you have to use the good old desktop
- drag-and-drop doesn't work in some cases; for example, you cannot move desktop icons on some PCs
- windows get moved and don't work properly after a logout/user switch
- you cannot define timings (keyboard press, fixations, etc.) [fixed]
- impossible to click the mouse while holding a keyboard button pressed (e.g. Ctrl + click) [fixed]
- and many more small things....

Translating to other languages:
You can easily translate the user interface and modify the keyboard: just edit the "messages.bkb" and "keyboard.bkb" files. They are Unicode text files. But (!) the file format and contents may change in the future!

Compiling the source codes
(updated)
For now, you have to use Microsoft Visual Studio 2012 (latest update preferred). This is because the libraries used (from the Tobii Gaze SDK 4.0) are compiled with the same toolset.

In the project properties, enable Unicode support.

The include directories must contain the "include" directory from the Tobii Gaze SDK 4.0. The Gaze SDK 4.0 can be downloaded from: http://developer.tobii.com/downloads/ (registration required). Look for the "TobiiGazeSdk-CApi-4.0.X.XXX-Win32" file, where the X's stand for the current release numbers.

No Tobii Gaze SDK libraries are needed during the compilation/build.

You only need the standard Windows libraries: Ws2_32.lib, winmm.lib, Msimg32.lib.

The project must be linked dynamically to MSVCR110.dll (the /MD runtime option); if you link the runtime statically, it will conflict with the Tobii Gaze SDK libraries used!
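For reference, you can also pull the same libraries in directly from a source file with MSVC's #pragma comment directive. This is only an illustration, not taken from the bkb sources, and the comments are my guesses about what each library is used for:

Code:
// Illustration only: request the standard Windows import libraries from a
// source file instead of the project's linker settings (MSVC-specific).
#pragma comment(lib, "Ws2_32.lib")   // Winsock - presumably the TCP connection to the Eye Tribe server
#pragma comment(lib, "winmm.lib")    // PlaySound/multimedia timers - presumably the click.wav sound
#pragma comment(lib, "Msimg32.lib")  // TransparentBlt/AlphaBlend - presumably the transparent overlay window

// Build with the dynamic C runtime, e.g.:
//   cl /MD /DUNICODE /D_UNICODE /I<GazeSDK>\include <sources...>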

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 14 Apr 2014, 00:37
by MastaLomaster
Just fixed a terrible bug in the drag-and-drop function. It could make your mouse stop clicking, or click the wrong button. Please download the latest build...
It should now be possible to drag and drop not only windows, but desktop icons as well. You can also now select text or items in lists.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 24 May 2014, 15:44
by giandou
Wonderful job, MastaLomaster. I'm trying to join the project to add further functionality, but at the moment I'm reading the code (and translating the comments into English :)). Thank you for sharing.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 06 Jun 2014, 11:28
by JeffKang
Adjustable button sizes

I can’t thank you enough for the program.

So I think that one of the first aims for the program should be to adapt better to cases where eye tracking is very imprecise for an individual.

When I’m using the program, there are times when I inadvertently hit the neighbor of my target.

There might be some people that are getting less accurate tracking with the Eye Tribe tracker.

Your program is also able to work with other inputs. People might be using open-source projects like the ITU GazeTracker software for use with cameras, and they could have rougher tracking.

I’m not sure if you checked it out yet, but there’s Gazespeaker (http://www.gazespeaker.org/viewtopic.php?f=22&t=196), another open source accessibility program in the forums.

If you take their virtual keyboard, and set the positioning mode to automatic, the cells will order and position themselves to fill the available space, and cells will grow in size if other cells are deleted. You can make the cells as large as you need them to be, and that helps to work with any eye tracking inaccuracies and instabilities.

Thanks.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 06 Jun 2014, 20:06
by MastaLomaster
Re: Adjustable button sizes

I'm very pleased you are using the program. So far I've heard of only 3 people in the whole world who are using or trying to use it (and one of them is a handicapped colleague of mine).
You've increased my self-confidence by 25% :)

To be honest, initially I wanted to make the buttons inflate like bubbles when you stare at them. This would make the part of the keyboard you look at bigger. A kind of zoom. And the "bkb" name is [probably] an acronym for "Bubble KeyBoard". But I didn't manage to do it and left it as it is. So the name means nothing now.

As a workaround you can easily make your own keyboard layout with big buttons. Just edit the "keyboard.bkb" file with Notepad (keep a copy of the original file). Here is an example layout with only 4 buttons:

Code:
1 2 2
# The first line must be like the one above: panes lines columns

scancode 0x51 0 0 0 Q 0
scancode 0x57 0 0 0 W 0

scancode 0x45 0 0 0 E 0
scancode 0x52 0 0 0 R 0


The keyboard will look like this:
[image: the resulting 2×2 keyboard with Q, W, E and R buttons]

Just take the provided "keyboard.bkb" file and change it to contain fewer columns or even fewer rows; the remaining buttons will then grow.
By design, the keyboard cannot fill more than... I don't remember... about 45% of the screen height. This is done so you can type in any part of the screen: first you fill the upper part, then, after pushing the "top-down" key, you move the keyboard up and see the lower half of the screen.

Now, regarding your phrase: "There might be some people that are getting less accurate tracking with the Eye Tribe tracker. "

All of us are waiting for the new drivers/software to change the reliability and quality of the tracking...
Since the device is just an IR camera with lights, the processing is done on the computer, and improving the software may dramatically change the device's behaviour.
I will not write anything else... I'm still waiting... 9.0.28, 9.0.29, ... 9.0.35, 9.0.36.... calm down... wait...

By the way, today I've pushed the next build of the program to GitHub. It now supports the EyeX from Tobii as well. But you need the MSVC2012 C runtime libraries instead of the MSVC2010 ones that were used before.

Jeff, you've mentioned chronic tendinitis. Does it affect only your hand movements, or your head movements as well? I'm asking because my program is used very effectively by a person who can move his head but cannot move his hands. Head, not just eyes. I have a publication in Russian about the possibility of using the remote control of a multimedia set-top box, mounted on the head, as an input for my program. This guy NEVER misses the right button, the precision is fantastic, but... you need to be able to move your head...

The publication translated by google looks like this http://translate.google.com/translate?hl=ru&sl=ru&tl=en&u=http%3A%2F%2Fhabrahabr.ru%2Fpost%2F213715%2F.
The original publication is here http://habrahabr.ru/post/213715/.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 07 Jun 2014, 04:10
by JeffKang
----

Full-screen virtual keyboard – adjusting transparency of buttons, and size of labels – trusting auto correct when full-screen keyboard mildly obscures view

>keyboard cannot fill more than... don't remember... about 45% of screen height


There is quite a bit of stuff that I’d like to squeeze into the virtual keyboards. It’s give-and-take, as you have to sacrifice button sizes, or you might need to have more alternate virtual keyboards (requires extra steps).

I don’t think the virtual keyboard of Gazespeaker outputs text to external windows like your software can, but you can make the virtual keyboard the size of the entire screen. That allows the buttons to be really large, but you can’t see through it.

Maybe there could be an option to adjust the transparency of the buttons, and size of labels?

E.g. “Transparent Keyboard” on Android: https://play.google.com/store/apps/deta ... t_keyboard

Memorizing the layout of a virtual keyboard, and not needing solid buttons, and large labels – fewer, larger regions easier to memorize – user becomes accustomed to personal keyboard

A virtual keyboard probably won’t be changed very often, and a user would eventually know full well the locations and details of their personal virtual keyboard buttons. That is, a fading of the buttons wouldn’t affect them as much.

I sometimes use a Nexus 10 as my keyboard by way of VNC. I don’t use the stock Android keyboard. I use Hacker’s Keyboard, which gives you many more keys. It also lets me stretch the keyboard height to 75% of the screen. The buttons are much larger, and depending on the remapping script, groups of buttons activate the same thing, so I sometimes have even larger regions.

If Hacker’s Keyboard were to give me the option of making the keyboard near-transparent so that I could see behind it (see what the VNC sees), I’d still be able to comfortably hit the Hacker’s Keyboard buttons, and regions of buttons. Even though it’s a virtual keyboard, and there’s no tactile feedback, I can still sometimes strike the buttons that I want without looking down at the keyboard, because the regions are large and I’m now very familiar with the layout. This goes even further than a near-transparent and hard-to-see virtual keyboard, as I’m not even looking at the keyboard. That is, Hacker’s Keyboard could be invisible, and I’d still probably be able to sometimes know where to press. Therefore, given a large enough button size, and time to learn an on-screen keyboard like bkb’s, a high transparency shouldn’t bother the user.

Auto correct and auto complete = not needing to pay attention to what you’re outputting

With transparency, you’d be able to be aware of what’s being outputted, even with a full-screen keyboard. However, with all the auto correct and auto complete technology these days, just being aware that something is being outputted could be enough.

E.g. “is my cursor in the Google search text box? It is? Okay.” *Going back to focusing on the virtual keyboard, and typing “aitocottecy”* = Showing results for “autocorrect”. “That’s right. Thanks!”.

I don’t have to focus on what’s being typed, and any possible mistakes, so having my view be slightly obscured (depends on the transparency/translucency setting) by a full-screen keyboard doesn’t matter.

(Although, a customized on-screen keyboard, or parts of it would probably have to be somewhat similar to a QWERTY keyboard layout, as I’m guessing that some auto correction systems are based primarily on that layout.

Then again, auto correction systems and language models are getting more advanced, and more contextual factors are getting taken into account.

http://googleresearch.blogspot.ca/2014/ ... guage.html – “we are releasing scripts that convert a set of public data into a language model consisting of over a billion words, with standardized training and test splits”).

Full-screen = more unintended dwelling – take “pause” button from master list

If it is full-screen, then a button is more likely to be in your view. To get the dwelling to stop, there could be a “pause” button that you could drag from a master list of buttons to put on the virtual keyboard (e.g. Gazespeaker has a master list of cells that you can drag to your grid. When you customize the toolbar of Firefox, you drag from a master list of icons. The six-square “All Apps” icon in Android also takes you to a large list of icons that can be sent to the Home screen).

Head-tracking (face-tracking?)

>I'm asking because my program is very effectively used by the person who can move his head while not able to move hands. Head, not just eyes.


Yeah, my head’s all good. I’ve heard about programs like Camera Mouse, eViacam, and MouseTrap. They’re supposed to be much more stable, but you might have to tilt your head around a lot. I already tend to lean, tilt, and move around a lot; I don’t like sitting still.

Eye-scrolling for everyone!

>So far I heard about 3 people only all over the world who are using or trying to use it (and one of them is a handicapped colleague of mine).


Eye-scrolling! If the eye-scrolling feature can be refined, every single person on the planet could benefit from using this program for its eye-scrolling. That’s one function that’s not specific to accessibility. When you read, your eyes go to the bottom of the window anyway.

(I’m not going to give up on the thought that an eye tracking two-step process for an interface element selection in certain situations could benefit everyone, so I think bkb’s magnification is important. I’m still wondering what could be added to the magnification step e.g. some sort of projection after the magnification?: http://i.imgur.com/3erfG6K.png).

bkb Bubble KeyBoard – Apple OS X dock?

>"bkb" name is [probably] an acronym for a "Bubble KeyBoard"


Ah, that’s what it means ha ha ha.

That makes me think of the magnification that happens on the Apple OS X dock:

http://i.imgur.com/Tqupszq.png?1

When you move your arrow over a docked icon, the icon grows bigger. I don’t have any experience with it, but I don’t think the registration area changes; it’s a visual effect. Vertically, the registration area might grow bigger, but horizontally, it doesn’t, as the docked icons are squeezed together horizontally. An icon would have to shrink after magnification as the cursor moves laterally so that you could get to its neighbor that’s right beside it.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 09 Jun 2014, 11:07
by MastaLomaster
Thank you, Jeff, for a detailed vision of how the program can be improved.

Unfortunately, priority #1 for me is to make the program comfortable for the person it was created for.

So, for now, I will be fixing things like its inability to move the volume slider in YouTube, adding a repeating right click, and dozens of other bugs/features important for this particular person. A transparent keyboard and word prediction are not on this list...

Fortunately, I've published all the source code, and anyone can tailor the program on their own, adding the features they need the most.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 09 Jun 2014, 23:09
by JeffKang
> Fortunately, I've published all the source code, and anyone can tailor the program on their own, adding the features they need the most.


Ideally, I’d get in there myself, but the tricky position is that I actually need programs like these to do heavier inputting, and practice writing code.

Thankfully, there are some other programs (e.g. Alt Controller (map regions to inputs)) to read hands-free (not as input heavy). I’ll stick with reading about programming for now.

I really want to put off learning C++ until I absolutely have to learn it.

Somebody put up a video of a virtual keyboard: https://www.youtube.com/watch?v=JoIMzfIKVDI. The user is supposedly using an eye-tracking platform called Pupil, and the keyboard is based on an open source JavaScript framework.

I think Eye Tribe is going to have a Python API soon (I think there are already some unofficial Python wrappers). I think Tobii is going to release a JavaScript API in the future.

I’d like to shortcut my learning as much as possible :).

Then again, if this program is as important as I say it is, then I should be learning C++ just for it.

No obligation, but I’ll continue to float any ideas in the future.

Thanks.

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 30 Jul 2014, 10:20
by JeffKang
> I will be fixing things like its inability to move the volume slider in YouTube


“Repeat-the-next-command” button on a virtual keyboard

When you click on a YouTube video, you give it focus.

When you press tab, a yellow border/box/rectangle surrounds an interface control, or link of the video player:
[image: the yellow focus outline around a YouTube player control]

The border indicates which interface element has focus (:focus state, or onFocus event?), so you can manipulate it with the keyboard.

You can keep pressing tab to go through each control.

Speech recognition commands have the advantage of easily being able to repeat actions, and say something like “four words right”.

Perhaps bkb could add a “repeat” button on the virtual keyboard.
You could activate “repeat”, then a number (for the number of times to repeat the following action), and then activate another keyboard button, or action, and have that action be repeated for the specified number of times.

For text editing, you could repeat the arrow keys, or control + arrow, to move the caret around.
You could repeat a tab for navigating through, or jumping to interface controls.

Once you tab to the volume control, you could use a “repeat”, followed by a “number-of-times-to-repeat”, and then an “up” arrow key to increase the volume, or a “down” arrow key to decrease the volume.
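Just to sketch the idea (I'm not claiming this is how bkb would do it, and none of these names come from bkb's code), repeating a key press a given number of times with the Win32 SendInput API could look roughly like this:

Code:
#include <windows.h>

// Rough sketch only: press and release a key "count" times.
// "vk" is a Windows virtual-key code, e.g. VK_TAB, VK_UP, VK_DOWN.
void RepeatKey(WORD vk, int count)
{
    for (int i = 0; i < count; ++i)
    {
        INPUT in[2] = {};
        in[0].type = INPUT_KEYBOARD;            // key down
        in[0].ki.wVk = vk;
        in[1].type = INPUT_KEYBOARD;            // key up
        in[1].ki.wVk = vk;
        in[1].ki.dwFlags = KEYEVENTF_KEYUP;
        SendInput(2, in, sizeof(INPUT));
        Sleep(30);                              // small gap so the target application keeps up
    }
}

// e.g. RepeatKey(VK_UP, 5); after tabbing to the YouTube volume control

The "repeat" button on the virtual keyboard would just collect the count and the next activated key, then call something like this.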

Re: bkb is a program to control keyboard/mouse with eyes

PostPosted: 08 Aug 2014, 15:56
by MastaLomaster
Thank you for the suggestion!

So far I've found and implemented another way to do it. I've added submenus for the left- and right-click tools, where you can choose not to use the zoom window, and this helps to change the volume in YouTube. Unfortunately this costs precision, and you often need to click several times before you hit the right pixel in the volume control...
The submenus, however, are used for other purposes with better efficiency. Now you may click while holding the Ctrl, Alt and/or Shift key, as shown below.

[image: the click submenu with Ctrl, Alt and Shift options]
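For those reading the code: a click with a modifier held down can be synthesized roughly like the sketch below. It uses the Win32 SendInput API and is only an illustration, not the exact bkb implementation:

Code:
#include <windows.h>

// Sketch only: left click at the current cursor position while a modifier
// key (e.g. VK_CONTROL, VK_MENU, VK_SHIFT) is held down.
void ModifiedLeftClick(WORD modifierVk)
{
    INPUT in[4] = {};
    in[0].type = INPUT_KEYBOARD;                 // modifier down
    in[0].ki.wVk = modifierVk;
    in[1].type = INPUT_MOUSE;                    // left button down
    in[1].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;
    in[2].type = INPUT_MOUSE;                    // left button up
    in[2].mi.dwFlags = MOUSEEVENTF_LEFTUP;
    in[3].type = INPUT_KEYBOARD;                 // modifier up
    in[3].ki.wVk = modifierVk;
    in[3].ki.dwFlags = KEYEVENTF_KEYUP;
    SendInput(4, in, sizeof(INPUT));
}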