Eye Tracking keyboard switch project team-up

Who besides me has multiple monitors AND multiple computers, with multiple keyboards and mice?

How many times have I typed on the wrong keyboard? Answer: a lot. People in my Telegram/WhatsApp chats often get command-line entries spilled into their threads (they’re used to it by now).

I think the answer is a switch of some kind that appropriately directs the keyboard and mouse to the correct computer, based on gaze direction of the user. That is, what monitor the user is looking at.

I think this can be done with a Raspberry Pi, a decent webcam, some CV software, and some USB voodoo on the Pi. 9 years ago I built a commercial eye-tracking product. The speed of a modern RasPi, together with 9 years of additional maturity in the open-source tools for this domain, should be adequate to get this done. I’m happy to contribute that element. If there’s anyone here who knows a lot about USB hacking in Linux, I’d love to team up to build this. If you want to do a Kickstarter or something, that’s fine, but I’m not interested in doing the marketing. If you are, great. If you want to make it OSHW, that’s fine with me, too.

Not to discourage you from coming up with a hardware solution by suggesting a software one, but have you tried using just one keyboard and mouse with something like the Synergy software, which lets you use them on multiple computers at the same time?


Interesting idea. Definitely a good on-ramp application for eye-tracking that isn’t VR/AR related.

What are you picturing that connects on the ‘client’ computer side(s) and receives kbd/mouse? (Wireless) Hardware USB dongles? Tangle of wires from the coordinating RPi/computer switch? Network software?

I’m thinking about a USB cable from each computer going into a RasPi, and one USB cable coming out to a keyboard & mouse.

This isn’t really a HW project; it’s a SW project on commodity HW. I looked at Synergy and don’t like it. It’s a very imperfect solution because it tracks the mouse, and it’s also slow and craps all over my network. A lot of the time I just have terminals open and the only thing that moves is my gaze.


What you’re describing sounds mostly like a conventional KVM (Keyboard Video Mouse) switch.

Before the shift to virtualised machines, a server room might have dozens of machines connected to a single display and input devices. There were hardwired and IP based solutions.

It sounds like you just need to add automatic focus detection.

Yes, I’m old enough that I’ve used KVMs before. The difference here is that it doesn’t switch monitors, just keyboard and mouse, and, of course, that it would be automated based on gaze detection.

sounds interesting. i’ve got a couple of usb hid projects going on that do require some switching.

so the device would need multiple usb device support and multiple usb host support, then you’d do a compound HID which would funnel the devices to the selected pc.

does the rpi do multiple usb device/otg? if so that’d be a good start.
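
The funneling described above ultimately comes down to emitting standard HID reports on whichever gadget endpoint belongs to the selected PC. Here’s a minimal sketch of the report side, assuming boot-protocol keyboard reports (the fixed 8-byte format from the USB HID spec); the keycode table is abbreviated, and the `/dev/hidgN` paths in the comments are how Linux gadget mode typically exposes endpoints, not anything this project has built yet:

```python
# Sketch: building the 8-byte boot-protocol keyboard report the switch
# would forward to the selected PC's gadget endpoint (e.g. /dev/hidg0).
# Keycodes come from the USB HID Usage Tables; only a few are listed here.

KEYCODES = {"a": 0x04, "b": 0x05, "c": 0x06, "enter": 0x28}
MOD_LCTRL = 0x01   # left-ctrl bit in the modifier byte
MOD_LSHIFT = 0x02  # left-shift bit

def keyboard_report(modifiers, keys):
    """Report layout: [modifier byte, reserved, keycode1 .. keycode6]."""
    if len(keys) > 6:
        raise ValueError("boot protocol carries at most 6 simultaneous keys")
    codes = [KEYCODES[k] for k in keys]
    return bytes([modifiers, 0] + codes + [0] * (6 - len(codes)))

# An all-zero report means "no keys pressed" -- i.e. release / idle.
IDLE_REPORT = keyboard_report(0, [])

if __name__ == "__main__":
    print(keyboard_report(MOD_LSHIFT, ["a"]).hex())  # prints 0200040000000000
```

Per keystroke, writing that report to the selected host’s gadget device (followed by the idle report on release) is essentially all the funnel has to do.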

I love the idea but am a bit dubious about how well the gaze tracking part would work, based on some trials of online demos with my webcam. My laptop sits in front of and below one of my PC monitors, so it would need more than left/right detection.

Re “USB voodoo”, it would probably be more straightforward to implement a simple USB switch in hardware and control it with Rpi GPIO.

Re Synergy, how exactly does it work? They have one of those annoying websites that tell you only the touted end result and give zero information about operation - no user documentation, no trial download.

The gaze tracking is what I’m least worried about, since I’ve done it before. It’s just SW and a camera. Worst case, a driver mod to speed up frame acquisition.

The USB side is more of an unknown to me. Not so much the USB protocol part, but how to make the UX good. Hard switching may cause the OS to wig out when it loses input devices, so I expect the RasPi will need to keep sending dummy HID messages to the client computers.

you’d just set up two device connections and then service them both; depending on which is active, you’d send the data from the two hids as a passthru to it. it’d never lose the device, you just wouldn’t send data to one.
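
A rough sketch of that pass-through logic, under the assumption that each PC stays enumerated and the inactive one just keeps receiving idle ("no keys pressed") reports so its OS never sees the keyboard vanish. The host objects here are writable stand-ins; on a real Pi they would be the per-PC gadget endpoints (e.g. `/dev/hidg0`, `/dev/hidg1`):

```python
# Sketch: route real HID reports to the active host, idle reports to the
# rest, so no host ever thinks the device was unplugged.

IDLE = bytes(8)  # boot-keyboard report meaning "nothing pressed"

class HidSwitch:
    def __init__(self, hosts):
        self.hosts = hosts  # list of writable file-like objects, one per PC
        self.active = 0     # index chosen by the gaze tracker

    def select(self, index):
        self.active = index

    def forward(self, report):
        # Real data goes to the active host; everyone else gets "idle".
        for i, host in enumerate(self.hosts):
            host.write(report if i == self.active else IDLE)

if __name__ == "__main__":
    import io
    pc0, pc1 = io.BytesIO(), io.BytesIO()
    sw = HidSwitch([pc0, pc1])
    sw.forward(bytes([0, 0, 0x04, 0, 0, 0, 0, 0]))  # 'a' key -> pc0 only
    sw.select(1)
    sw.forward(bytes([0, 0, 0x05, 0, 0, 0, 0, 0]))  # 'b' key -> pc1 only
```

The mouse side would be the same loop with a second report format; only the `select()` call, driven by gaze, differs from a manual KVM.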

Synergy 1.8.x is the last free version, I think. It can be installed on Linux, Windows, and macOS. I used it a while ago.

I paid for 2.0 thinking it would be better, but I was wrong. It connects to a server (most probably to check if you paid or something), and if it loses the connection you’re out of luck, despite all your devices being on the local network.

I requested a refund.

Just set up Synergy (1.10.2) and am in love. Unless there are issues lurking that will manifest later, I don’t think I’d trade this for a gaze-tracking solution.


I have a lot of computers too. I gave up on KVMs because of the bugginess. I try to SSH or RDP if I can. If I need a keyboard, I use one wireless keyboard with an integrated trackpad and just move the dongle. Wouldn’t you need to hook multiple webcams to the Pi too?

Day Two and I’m still loving Synergy, it’s near perfect, transparent, and seamless. Life-changing! I’m also thinking that gaze detection might be one of those “too much automation” scenarios - I can think of a number of situations in which I would not want my keyboard & mouse switching to another computer just because I looked at its monitor, or out the window next to the monitor, etc. Sometimes manual is better.

One camera should be fine.

There’s a calibration stage on first setup, or if you move your monitors. After that, there are ways to do continuous calibration to account for things like changes in lighting conditions.

As for looking out the window, gaze tracking can be remarkably accurate. It will know if you’re not looking at a monitor. The precision should be down to about 1cm radius of your gaze point.
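
To make the selection step concrete, here is a sketch under assumed numbers: calibration would produce a rectangle per monitor in gaze-coordinate space (the rectangles below are invented), each gaze sample either lands in one rectangle or counts as "looking away", and a short dwell filter is one simple way to keep a stray glance out the window from flipping the switch:

```python
# Sketch: map a calibrated gaze point to a monitor, with a dwell filter.
# (left, top, right, bottom) per monitor in gaze units -- made-up layout.
MONITORS = [(0, 0, 52, 32), (56, 0, 108, 32)]

def monitor_at(x, y):
    """Return the index of the monitor the gaze point falls on, or None."""
    for i, (l, t, r, b) in enumerate(MONITORS):
        if l <= x <= r and t <= y <= b:
            return i
    return None

def dwell_filter(samples, threshold=5):
    """Yield the selected monitor, switching only after `threshold`
    consecutive samples land on the same (new) monitor."""
    current, run, last = None, 0, None
    for x, y in samples:
        m = monitor_at(x, y)
        run = run + 1 if m == last else 1
        last = m
        if m is not None and run >= threshold:
            current = m
        yield current
```

Samples that land between monitors (or off-screen entirely) return `None` and simply leave the current selection alone, which matches the "it will know if you’re not looking at a monitor" behavior without ever stranding the keyboard.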

So Logitech also has a Synergy clone in its arsenal, but it’s hardware-dependent: Logitech Flow. They have mice and keyboards that can pair with multiple computers; you’ll just need to use the Flow software to make the transitions.

I’m not sure if it dials home, but I know that it uses multiple Unifying dongles to connect to the computers. I assume this means it is not like the Synergy software, which transmits the key/mouse info via the network or via their servers…

I try to stay away from Logitech software, it’s generally horrible. Synergy has been rock solid and completely transparent for me so far - it just works.

Hi, I got to this page from Google, and this project seems to exactly match my need: two net-isolated computers, a messy desk, and I’m tired of KVMs. I came up with a similar thought of switching by gaze with an RPi + cam, but I’m happy to know someone else has a more structured idea. I know it has been 2 years, but how did this project go?


Two years ago I had a lot of time on my hands, but not now. That said, the PyGaze project has popped up, and (assuming it works) it should be able to do all the heavy lifting.