Click2Speak: on-screen keyboard powered by Swiftkey

Applications of eye tracking for accessibility, augmented communication and health care

Post by JeffKang » 05 Jul 2014, 08:05

BBC video interview: Gal Sont, a programmer with ALS, creates Click2Speak, an on-screen keyboard that is powered by Swiftkey

Gal Sont is a programmer who was diagnosed with ALS in 2009.
He created Click2Speak, an on-screen keyboard that is powered by Swiftkey.

http://www.bbc.co.uk/programmes/p021r01n

https://www.youtube.com/watch?v=WWMsPpBRV3A

Features:

Works with all standard Microsoft Windows applications.
Includes Swiftkey’s powerful features, like the award-winning prediction engine and 'Flow'.
Supports more than 60 languages.
Floats over other applications.
Includes advanced visual and audio features.
Auto-spacing and auto-capitalization.
Choose between different layouts and sizing options.
Includes a dwell feature that lets you trigger a mouse click by hovering.
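The dwell feature in the list above works by watching the pointer (or gaze point) and firing a click once it has stayed within a small radius for long enough. The following is a minimal Python sketch of that idea; the class name, radius, and timing parameters are hypothetical and not Click2Speak's actual implementation.

```python
import math
import time


class DwellClicker:
    """Toy dwell-to-click detector: fires once when the pointer stays
    within radius_px of its anchor for dwell_seconds (illustrative only)."""

    def __init__(self, radius_px=15, dwell_seconds=1.0):
        self.radius_px = radius_px
        self.dwell_seconds = dwell_seconds
        self.anchor = None       # (x, y) where the current dwell started
        self.anchor_time = None  # timestamp of the dwell start
        self.fired = False       # ensures one click per completed dwell

    def update(self, x, y, now=None):
        """Feed pointer samples; returns True exactly once per completed dwell."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Pointer moved away: restart the dwell timer at the new position.
            self.anchor, self.anchor_time, self.fired = (x, y), now, False
            return False
        if not self.fired and now - self.anchor_time >= self.dwell_seconds:
            self.fired = True
            return True  # a real keyboard would synthesize a mouse click here
        return False
```

A real implementation would feed this from the eye tracker's gaze stream and, on `True`, inject a click via the OS (e.g. the Win32 `SendInput` API on Windows).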


"After being diagnosed with the disease, I contacted other individuals who suffer from ALS at different stages, and began to learn about the different challenges that I would face as my disease progressed.
I also learned about the tech solutions they used to cope with these challenges.
The most basic challenge was typing, which is done using a virtual on-screen keyboard, a common solution shared not only by individuals affected by ALS, but also by people with a variety of conditions such as brain trauma, MS and spinal cord injuries.
The fully featured, advanced on-screen keyboards again proved relatively expensive (starting at $250), so I decided to develop the ultimate on-screen keyboard on my own.
Through the development process, my own physical condition continued to deteriorate and I reached the point of needing to use these cameras and on screen keyboards myself.
I started with Microsoft’s 'Ease of Access’ keyboard that comes with Windows.
This is an acceptable keyboard and it has a reasonable prediction engine.

For my own development needs I purchased the developer version of TOBII’s eye gaze camera.
This allowed me to code (with my eyes!) additional important features that were lacking in the Microsoft keyboard for eye control such as highlighted keys, virtual keys, auto scroll, right click, drag and much more.

It quickly became apparent that using our 'powered by Swiftkey’ keyboard enabled me to work faster and more accurately.
Friends who used other solutions prior to ours (not necessarily Microsoft’s) were delighted with the results, albeit a small sample size.

This started a new journey that introduced me to Swiftkey’s revolutionary technologies and how to customize them to our specific needs.
I reached a first version of our keyboard and distributed it to friends who also suffer from ALS.
They gave us invaluable feedback through the development process, and they all raved about its time saving capabilities and accuracy and how it makes their lives a little easier.
Even Swiftkey’s 'Flow’ feature translates successfully to this environment: the finger you use with Swiftkey on an Android device is replaced by an eye, head, or leg when using a PC, tablet, or laptop with a camera or other input device and our Swiftkey-powered keyboard installed.

At this point I had my good friend Dan join me in this endeavor as I needed help with detail design, quality assurance, market research, project management, and many other tasks.
We formed 'Click2Speak’, and we plan to make the world a better place! ...”.
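The word prediction mentioned in the features and quote above can be reduced, at its simplest, to ranking candidate completions of a typed prefix by frequency. The sketch below is a toy illustration under that assumption; the class name and corpus are hypothetical, and real engines like Swiftkey's use far more sophisticated language models.

```python
from collections import Counter


class WordPredictor:
    """Toy prefix-based word predictor: suggests the most frequent
    corpus words that start with the typed prefix (illustrative only)."""

    def __init__(self, corpus):
        # Count word frequencies from a training corpus.
        self.freq = Counter(corpus.lower().split())

    def predict(self, prefix, k=3):
        """Return up to k most frequent words starting with `prefix`."""
        prefix = prefix.lower()
        matches = [(w, c) for w, c in self.freq.items() if w.startswith(prefix)]
        matches.sort(key=lambda wc: (-wc[1], wc[0]))  # by frequency, then A-Z
        return [w for w, _ in matches[:k]]
```

For an on-screen keyboard, each suggestion would be shown as a selectable key, saving the user from dwelling on every letter of a long word.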


http://www.click2speak.net/