Eye Tracking keyboard switch project team-up

Who besides me has multiple monitors AND multiple computers, with multiple keyboards and mice?

How many times have I typed on the wrong keyboard? Answer: a lot. People in my Telegram/WhatsApp chats often get command-line entries spilled into their threads (they are used to it by now).

I think the answer is a switch of some kind that appropriately directs the keyboard and mouse to the correct computer, based on gaze direction of the user. That is, what monitor the user is looking at.

I think this can be done with a Raspberry Pi, a decent webcam, some CV software, and some USB voodoo on the Pi. 9 years ago I built a commercial eye tracking product. The speed of a modern RasPi, together with 9 years of additional maturity for open source tools in this domain, should be adequate to get this done. I'm happy to contribute that element. If there's anyone here who knows a lot about USB hacking in Linux, I'd love to team up to build this. If you want to do a Kickstarter or something, that's fine, but I'm not interested in doing the marketing. If you are, great. If you want to make it OSHW, that's fine with me, too.
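
To give a feel for the CV piece, here is a very rough sketch (all assumptions, not the pipeline from my commercial product): OpenCV's stock Haar cascades find the face and eyes from the webcam, and the pupil centroid inside the eye box gives a crude left/centre/right call. A real version needs per-monitor calibration, but it shows roughly what the Pi would have to do per frame.

```python
# Crude gaze-direction sketch: OpenCV Haar cascades + pupil centroid.
# Assumes a webcam on index 0 and the stock cascade files shipped with OpenCV.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_direction(frame):
    """Return 'left', 'right' or 'centre' from pupil offset, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    for (fx, fy, fw, fh) in faces:
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        eyes = eye_cascade.detectMultiScale(face_roi, 1.1, 10)
        for (ex, ey, ew, eh) in eyes:
            eye = face_roi[ey:ey + eh, ex:ex + ew]
            # Dark pixels ~ pupil/iris; take their centroid within the eye box.
            _, thresh = cv2.threshold(eye, 50, 255, cv2.THRESH_BINARY_INV)
            m = cv2.moments(thresh)
            if m["m00"] == 0:
                continue
            offset = (m["m10"] / m["m00"]) / ew - 0.5   # -0.5 .. +0.5
            if offset < -0.12:
                return "left"
            if offset > 0.12:
                return "right"
            return "centre"
    return None

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(gaze_direction(frame))
```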

Not to discourage you from coming up with a hardware solution by suggesting a software one, but have you tried using just one keyboard and mouse with something like the Synergy software that lets you use them on multiple computers at the same time?


Interesting idea. Definitely a good on-ramp application for eye-tracking that isn't VR/AR related.

What are you picturing that connects on the 'client' computer side(s) and receives kbd/mouse? (Wireless) Hardware USB dongles? Tangle of wires from the coordinating RPi/computer switch? Network software?

I'm thinking about a USB cable from each computer going into a raspi, and one USB coming out to a keyboard & mouse.

This isn't really a HW project. It's a SW project on commodity HW. I don't like Synergy; I looked at it. It is a very imperfect solution because it tracks the mouse. It is also slow and craps all over my network. A lot of times I just have terminals open and the only thing that moves is my gaze.
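
For the "USB cable from each computer into the raspi" part, the Pi would have to enumerate as a USB keyboard/mouse to each computer. One caveat: as far as I know, a stock Pi exposes only one device-capable controller (Pi Zero, or the Pi 4's USB-C port), so multiple upstream computers probably means multiple gadget-capable boards or a hardware switch. Here's a minimal sketch of the Linux configfs gadget setup for a single host; the gadget name and IDs are placeholders, and it assumes the dwc2 overlay, libcomposite, and root.

```python
# Sketch: make the Pi enumerate as a USB boot keyboard to one upstream computer
# via the Linux configfs gadget API. Assumes `dtoverlay=dwc2` is enabled,
# `modprobe libcomposite` has been run, and this runs as root.
import os

G = "/sys/kernel/config/usb_gadget/gazekvm"   # hypothetical gadget name

def write(path, value):
    with open(path, "w") as f:
        f.write(value)

os.makedirs(G + "/strings/0x409", exist_ok=True)
os.makedirs(G + "/configs/c.1/strings/0x409", exist_ok=True)
os.makedirs(G + "/functions/hid.usb0", exist_ok=True)

write(G + "/idVendor", "0x1d6b")              # placeholder vendor/product IDs
write(G + "/idProduct", "0x0104")
write(G + "/strings/0x409/manufacturer", "gaze-kvm")
write(G + "/strings/0x409/product", "Gaze-switched keyboard")
write(G + "/configs/c.1/strings/0x409/configuration", "Config 1")
write(G + "/configs/c.1/MaxPower", "250")

# HID function using the standard 63-byte USB boot-keyboard report descriptor.
write(G + "/functions/hid.usb0/protocol", "1")
write(G + "/functions/hid.usb0/subclass", "1")
write(G + "/functions/hid.usb0/report_length", "8")
desc = ("05010906a101050719e029e71500250175019508810295017508810395057501"
        "050819012905910295017503910395067508150025650507190029658100c0")
with open(G + "/functions/hid.usb0/report_desc", "wb") as f:
    f.write(bytes.fromhex(desc))

os.symlink(G + "/functions/hid.usb0", G + "/configs/c.1/hid.usb0")

# Bind to the first available device controller; /dev/hidg0 appears after this.
write(G + "/UDC", os.listdir("/sys/class/udc")[0])
```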


What you're describing sounds mostly like a conventional KVM (Keyboard Video Mouse) switch.

Before the shift to virtualised machines, a server room might have dozens of machines connected to a single display and set of input devices. There were hardwired and IP-based solutions.

It sounds like you just need to add automatic focus detection.

Yes, I'm old enough that I've used KVMs before. The difference here is that it doesn't switch monitors, just keyboard and mouse, and, of course, that it would be automated based on gaze detection.

Sounds interesting. I've got a couple of USB HID projects going on that do require some switching.

So the device would need multiple USB device support and multiple USB host support; then you'd do a compound HID which would funnel the devices to the selected PC.

Does the RPi do multiple USB device/OTG? If so, that'd be a good start.
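
On the funnel side, one possible sketch (an assumption, not something tested on a Pi) is to grab the physical keyboard with python-evdev and translate its events into 8-byte boot-keyboard reports that can then be written to whichever gadget endpoint is active. The device path and the three-entry keymap below are just for illustration; a real one covers the whole keyboard plus modifiers.

```python
# Sketch of the "funnel" layer: grab the physical keyboard with python-evdev
# and translate its events into 8-byte boot-keyboard HID reports.
from evdev import InputDevice, ecodes

# Minimal evdev-keycode -> HID-usage map; a real one covers every key.
KEYMAP = {ecodes.KEY_A: 0x04, ecodes.KEY_B: 0x05, ecodes.KEY_ENTER: 0x28}

kbd = InputDevice("/dev/input/event0")   # hypothetical: the real keyboard
kbd.grab()                               # stop events reaching the Pi itself

pressed = set()
for event in kbd.read_loop():
    if event.type != ecodes.EV_KEY or event.code not in KEYMAP:
        continue
    usage = KEYMAP[event.code]
    if event.value:                      # key down (1) or auto-repeat (2)
        pressed.add(usage)
    else:                                # key up (0)
        pressed.discard(usage)
    keys = sorted(pressed)[:6]           # boot protocol: at most 6 keys
    # Byte 0 = modifiers (ignored in this sketch), byte 1 = reserved.
    report = bytes([0, 0] + keys + [0] * (6 - len(keys)))
    # `report` would be written to /dev/hidgN for whichever host is active.
    print(report.hex())
```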

I love the idea but am a bit dubious about how well the gaze tracking part would work, based on some trials of online demos with my webcam. My laptop sits in front of and below one of my PC monitors, so it would need more than left/right detection.

Re "USB voodoo", it would probably be more straightforward to implement a simple USB switch in hardware and control it with RPi GPIO.
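
Something along these lines would do for the GPIO route, assuming a mux or relay board whose select line picks which computer the shared keyboard/mouse hub is wired to; GPIO17 and the LOW = A / HIGH = B convention are made up for the sketch.

```python
# Sketch of driving a hardware USB switch from the Pi's GPIO.
# Assumes a mux/relay whose select line is wired to (hypothetical) GPIO17.
from gpiozero import DigitalOutputDevice

usb_select = DigitalOutputDevice(17)

def route_to(computer):
    """Point the shared keyboard/mouse at computer 'A' or 'B'."""
    if computer == "A":
        usb_select.off()   # LOW  = computer A
    else:
        usb_select.on()    # HIGH = computer B

# e.g. called from the gaze-tracking loop:
route_to("B")
```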

Re Synergy, how exactly does it work? They have one of those annoying websites that tell you only the touted end result and give zero information about operation - no user documentation, no trial download.

The gaze tracking is what I'm least worried about, since I've done it before. It's just SW and a camera. Worst case, a driver mod to speed up frame acquisition.

The USB side is more unknown to me. Not so much the USB protocol part, but how to make the UX good. Hard switching may cause the OS to wig out when it loses input devices. So I expect the raspi will need to keep sending dummy HID messages to the client computers.

You'd just set up two device connections and then service them both; then, depending on which is active, send the data from the two input HIDs through as a passthrough. It'd never lose the device, you just wouldn't send data to one.
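
A sketch of that routing layer, assuming one /dev/hidgN gadget node per upstream computer and 8-byte boot-keyboard reports: both hosts stay enumerated the whole time, and switching away just sends an all-zero "no keys pressed" report, so the OS never sees a disconnect and nothing gets stuck down.

```python
# Sketch of the "enumerate to everyone, send to one" idea.
# /dev/hidg0 and /dev/hidg1 are assumed gadget nodes, one per computer.
import os

IDLE = bytes(8)                        # boot-keyboard "no keys pressed" report
hosts = [os.open(p, os.O_WRONLY) for p in ("/dev/hidg0", "/dev/hidg1")]
active = 0

def switch_to(index):
    """Change the active host, releasing all keys on the old one."""
    global active
    if index != active:
        os.write(hosts[active], IDLE)  # avoid stuck keys on the old host
        active = index

def forward(report):
    """Send a keyboard report to whichever host currently has focus."""
    os.write(hosts[active], report)
```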

Synergy 1.8.x is the last free version, I think. It can be installed on Linux, Windows and macOS. I used it a while ago.

I paid for 2.0 thinking it would be better, but I was wrong. It connects to a server (most probably to check whether you paid or something), and if it loses the connection you are out of luck, despite the fact that all your devices are on the local network.

I requested a refund.

Just set up Synergy (1.10.2) and am in love. Unless there are issues lurking that will manifest later, I don't think I'd trade this for a gaze-tracking solution.


I have a lot of computers too. I gave up on KVMs because of the bugginess. I try to SSH or RDP if I can. If I need a keyboard, I use one wireless keyboard with an integrated trackpad, and just move the dongle. Wouldn't you need to hook multiple webcams to the Pi too?

Day two and I'm still loving Synergy; it's near perfect, transparent, and seamless. Life-changing! I'm also thinking that gaze detection might be one of those "too much automation" scenarios: I can think of a number of situations in which I would not want my keyboard & mouse switching to another computer just because I looked at its monitor, or out the window next to the monitor, etc. Sometimes manual is better.

One camera should be fine.

There's a calibration stage on first setup, or if you move your monitors. After that, there are ways to do continuous calibration to account for things like changes in lighting conditions.

As for looking out the window, gaze tracking can be remarkably accurate. It will know if you're not looking at a monitor. The precision should be down to about a 1 cm radius around your gaze point.
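
To keep the switching forgiving, the gaze point can be mapped to monitor rectangles taken from the calibration step and only acted on after a short dwell, so a glance out the window or across the desk does nothing. The rectangles, coordinate space, and 0.4 s dwell below are invented for illustration.

```python
# Sketch: turn a calibrated gaze point into a switching decision, with a
# dwell time so a stray glance doesn't yank the keyboard away.
import time

# (xmin, ymin, xmax, ymax) per monitor, in the tracker's gaze coordinate
# space (assumed to come from the one-off calibration step).
MONITORS = {"left-pc": (0, 0, 1920, 1080), "right-pc": (1940, 0, 3860, 1080)}
DWELL_S = 0.4

def monitor_at(gaze):
    """Return the monitor name under the gaze point, or None."""
    if gaze is None:
        return None                      # not looking at any screen
    x, y = gaze
    for name, (x0, y0, x1, y1) in MONITORS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

current, candidate, since = None, None, 0.0

def update(gaze):
    """Return the monitor that should have focus, switching only after dwell."""
    global current, candidate, since
    target = monitor_at(gaze)
    if target in (None, current):
        candidate = None                 # off-screen or no change: do nothing
        return current
    if target != candidate:
        candidate, since = target, time.monotonic()
    elif time.monotonic() - since >= DWELL_S:
        current, candidate = target, None
    return current
```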

So Logitech also has a Synergy clone in its arsenal, but it's hardware-dependent: Logitech Flow. They have mice and keyboards that can pair with multiple computers. You'll just need to use the Flow software to make the transitions.

I'm not sure if it dials home, but I know that it uses multiple Unifying dongles to connect to the computers. I assume this means that it is not like the Synergy software that transmits the key/mouse info via the network or via their servers…

I try to stay away from Logitech software, it's generally horrible. Synergy has been rock solid and completely transparent for me so far - it just works.

Hi, I got this page from Google, and this project seems to exactly match my need: two computers on isolated networks, a messy desk, tired of the KVM. So I came up with a similar thought of switching by gaze with an RPi + cam, but I'm happy to know someone else has a more structured idea. I know it has been 2 years, but how is this project going?

Hi!

Two years ago I had a lot of time on my hands, but not now. That said, the PyGaze project has popped up, and (assuming it works) it should be able to do all the heavy lifting.

http://www.pygaze.org
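
In case it helps whoever picks this up: a minimal sketch of how PyGaze might slot in as the gaze front end. This assumes PyGaze's Display/EyeTracker classes and a supported tracker backend configured via its constants; I haven't run it, so treat the exact calls as assumptions.

```python
# Hypothetical glue: pull gaze samples from PyGaze and hand them to whatever
# decides which computer gets the keyboard/mouse. Backend/tracker selection
# is done via PyGaze's own constants and is not shown here.
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker

disp = Display()
tracker = EyeTracker(disp)

tracker.calibrate()
tracker.start_recording()
try:
    while True:
        gaze = tracker.sample()   # (x, y) gaze position in screen coordinates
        print(gaze)               # feed this into the monitor-selection logic
finally:
    tracker.stop_recording()
    tracker.close()
    disp.close()
```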