Flattening the handWavey learning curve

Released on: 2023-12-28

I’ve just released the biggest, most exciting, update since the first public release of handWavey. There’s a bit to it, but the important ideas are:

  • A few bug fixes.
  • Damage control for unknown/unsolved bugs.
  • Several quality-of-life features for reducing the learning curve, and making it better at recognising what you intend.
  • A new way of clicking that’s fundamentally more intuitive.

You can get the latest release on handWavey’s GitHub releases page. The specific release that this blog post is about is 2023.12.22.0.

What led to this release?

A few weeks ago I came back to handWavey with fresh eyes after taking a break for a while. I wanted to experience the niggles I had become used to, so that I could identify them and figure out solutions to them.

The solutions

Cleaning input for roll and grab

While all other input has been cleaned for quite some time, the roll and grab components of the gestures have lacked data cleaning until now. This meant that you needed to make sharp movements to give handWavey a strong signal. If you got too close to the boundary between two states, it was common to trigger several events. This could result in several clicks, which was likely not what you wanted. Trying to prevent this behaviour led to tensing up the hand, which was a recipe for injury.

There are a few things that I’ve done to achieve this change:

  • Gentle moving means are now provided for all axes. These can be used for decision input and similar tasks.
  • The open/closed state is now calculated after data cleaning. Importantly, the components are calculated together. (Before I got that right, it actually made the grab gesture component worse.)

This can all be tweaked in handCleaner.yml in your configuration directory.
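
To make that concrete, here’s a minimal sketch of the idea, assuming a simple moving mean. The class and names (GrabCleanerSketch, windowSize, grabThreshold) are hypothetical and this is not handWavey’s actual implementation; the point is just that the open/closed decision is made on the cleaned value rather than the raw samples:

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Hypothetical sketch: smooth the raw grab value with a moving mean, then
    // derive the open/closed state from the cleaned value, not the raw samples.
    public class GrabCleanerSketch {
        private final Deque<Double> window = new ArrayDeque<>();
        private final int windowSize;
        private final double grabThreshold; // e.g. roughly 0.5 on a 0..1 grab strength

        public GrabCleanerSketch(int windowSize, double grabThreshold) {
            this.windowSize = windowSize;
            this.grabThreshold = grabThreshold;
        }

        // Add a raw sample and return the gentle moving mean.
        public double clean(double rawGrab) {
            window.addLast(rawGrab);
            if (window.size() > windowSize) window.removeFirst();
            double sum = 0;
            for (double value : window) sum += value;
            return sum / window.size();
        }

        // The open/closed decision is made on the cleaned value only.
        public boolean isClosed(double rawGrab) {
            return clean(rawGrab) > grabThreshold;
        }

        public static void main(String[] args) {
            GrabCleanerSketch cleaner = new GrabCleanerSketch(5, 0.5);
            double[] noisySamples = {0.1, 0.2, 0.9, 0.2, 0.8, 0.9, 0.95, 0.9};
            for (double sample : noisySamples) {
                System.out.println(sample + " -> closed=" + cleaner.isClosed(sample));
            }
        }
    }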

Speed lock

This is one of those changes that newcomers will probably never notice, but that silently makes a night-and-day difference to their experience.

Assumption: you almost never need to mouse down/up while moving the mouse. That is, you mouse down while stopped, drag, and then mouse up. But you don’t change the button state while the mouse is moving.

Yet, when the hand is moving, the LeapMotion controller produces unreliable data about everything except the X, Y, and Z coordinates. This led to unreliable events while moving the hand quickly, which manifested as spurious clicks, or losing grip while dragging.

The solution is really simple:

  1. Detect that the hand is moving faster than a threshold.
  2. Disable gesture changes.
  3. Detect that the hand is no longer moving faster than the threshold.
  4. Re-enable gesture changes.

I needed to play with the threshold value a little, but other than that it was incredibly effective.
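
As a rough illustration of those four steps, the gating can be as simple as the sketch below. The names (SpeedLockSketch, speedLockThreshold) are placeholders, not handWavey’s real configuration keys:

    // Hypothetical sketch of the speed-lock idea: ignore gesture changes
    // while the hand is moving faster than a threshold.
    public class SpeedLockSketch {
        private final double speedLockThreshold; // tuned by experiment

        public SpeedLockSketch(double speedLockThreshold) {
            this.speedLockThreshold = speedLockThreshold;
        }

        // Steps 1 & 3: detect whether the hand is above or below the threshold.
        // Steps 2 & 4: callers simply drop gesture-change events while locked.
        public boolean gestureChangesAllowed(double handSpeed) {
            return handSpeed <= speedLockThreshold;
        }

        public static void main(String[] args) {
            SpeedLockSketch lock = new SpeedLockSketch(300);
            System.out.println(lock.gestureChangesAllowed(50));  // true: slow hand, gestures active
            System.out.println(lock.gestureChangesAllowed(800)); // false: fast hand, gestures locked
        }
    }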

SpeedLock documentation.

Recalibrating roll - hand entry

Until this change, assumptions about what is comfortable for how you move your hand had to be hard-coded into the gestureLayout. I would do what I thought was best, then test it on other people, and many of them would find it completely unnatural. Adding to that, it was hard to be consistent from one hand entry to the next, so even though I had calibrated it for what worked well for me, it was not rare that I’d fumble and create unintended events. This added to the workload while getting the hang of handWavey, and even in day-to-day use.

That configuration has now moved out of the gestureLayout: it is automatically detected once the data has stabilised after you’ve inserted your hand. The effect is that you simply insert your hand, and it is automatically at the optimum position.
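
A minimal sketch of the idea, under the assumption of a simple “wait for N stable samples” rule; all of the names here are hypothetical and this is not the real implementation:

    // Hypothetical sketch: after the hand enters, wait for the roll readings to
    // settle, then take the settled value as the neutral (zero-trim) position.
    public class EntryCalibrationSketch {
        private double lastRoll = 0;
        private int stableSamples = 0;
        private Double neutralRoll = null; // null until calibrated

        private static final int SAMPLES_NEEDED = 10;
        private static final double STABLE_TOLERANCE = 0.05; // a guess, in radians

        public void onHandRemoved() {
            stableSamples = 0;
            neutralRoll = null;
        }

        // Returns the roll relative to the auto-detected neutral position,
        // or 0 while we're still waiting for the data to stabilise.
        public double onRollSample(double roll) {
            if (neutralRoll == null) {
                if (Math.abs(roll - lastRoll) < STABLE_TOLERANCE) {
                    stableSamples++;
                } else {
                    stableSamples = 0;
                }
                lastRoll = roll;
                if (stableSamples >= SAMPLES_NEEDED) {
                    neutralRoll = roll; // this entry's "comfortable" position
                }
                return 0;
            }
            return roll - neutralRoll;
        }
    }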

Auto-calibration documentation.

Recalibrating roll - autoTrim

Something I noticed while watching other people use handWavey was that they all drifted. That is, they might be nailing the segments at one moment, and then when their attention moved to something else, they’d drift too close to the next segment and trigger unwanted events. There is more to this, but it’s easiest to explain via the solution:

autoTrim constantly adjusts the trim towards having the current segment centered on your current hand position. It’s limited in speed, so if you move slowly, it will adjust transparently, and your selection will feel stable. But if you move quickly, it won’t be able to keep up, and the segment will change. That is:

  • A definite movement will trigger an event.
  • A slow drift will be catered for, and will feel stable.

The trick was to set autoTrimMaxChangePerSecond within handCleaner.yml so that it feels both intuitive and not stressful or straining. I’ll continue to tune this as I test it on more people, but for now I think I have a really nice balance.
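
Here’s a rough sketch of the concept; only autoTrimMaxChangePerSecond corresponds to a real setting in handCleaner.yml, everything else is a hypothetical illustration rather than handWavey’s actual code:

    // Hypothetical sketch of autoTrim: nudge the trim towards centering the
    // current segment on the hand, but never faster than a configured rate.
    public class AutoTrimSketch {
        private final double maxChangePerSecond; // autoTrimMaxChangePerSecond in handCleaner.yml
        private double trim = 0;

        public AutoTrimSketch(double maxChangePerSecond) {
            this.maxChangePerSecond = maxChangePerSecond;
        }

        // offsetFromSegmentCenter: how far the trimmed roll is from the center
        // of the currently selected segment. deltaSeconds: time since last update.
        public double update(double offsetFromSegmentCenter, double deltaSeconds) {
            double maxStep = maxChangePerSecond * deltaSeconds;

            // Slow drift: the whole offset fits within the allowed step, so the
            // trim absorbs it and the selection feels stable.
            // Fast movement: the step is clamped, the offset wins, and the
            // segment changes as intended.
            double step = Math.max(-maxStep, Math.min(maxStep, offsetFromSegmentCenter));
            trim += step;
            return trim;
        }
    }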

Auto-calibration documentation.

Damage control

While I’m reducing unwanted events, they do still happen. This can send too many notifications to the sound output. I’ve therefore limited the number of notifications that can be active concurrently. When this limit is reached, the bug noise (a cuckoo clock, by default) will be triggered.
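
Roughly, the idea looks something like this sketch (the class and method names are made up, and this is not handWavey’s actual code):

    import java.util.concurrent.atomic.AtomicInteger;

    // Hypothetical sketch: cap how many notifications can be active at once,
    // and trigger the bug noise instead when the cap is exceeded.
    public class NotificationLimiterSketch {
        private final int maxConcurrent;
        private final AtomicInteger active = new AtomicInteger(0);

        public NotificationLimiterSketch(int maxConcurrent) {
            this.maxConcurrent = maxConcurrent;
        }

        // Called when an event wants to make a noise. Returns true if the normal
        // notification may play; false means the cap was hit and the bug noise
        // should play instead.
        public boolean tryStartNotification() {
            if (active.incrementAndGet() > maxConcurrent) {
                active.decrementAndGet();
                return false; // too many at once: something is misbehaving
            }
            return true;
        }

        // Called when a notification finishes playing.
        public void notificationFinished() {
            active.decrementAndGet();
        }
    }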

Taps

Saving the best, and most exciting, for last…

This is actually for an even newer release, but it totally belongs in this blog post.

A few nights ago I realised that I now have the foundations to reliably detect a tapping motion that would behave the same way as tapping a touch pad. There’s quite a bit happening under the hood to make it intuitive, but it works, and it’s fricken cool:

  • It’s really easy to learn. So a new user can delay learning the gestures until they’ve got the hang of the basics.
  • It reduces how often you need to use the gestures, so you can reduce repetitive movements.
  • It works alongside the gestures. You can use one, or the other, or both.
  • It works with both your primary and secondary hands, and it can use the gestures’ segments and open vs closed states to give you much more power over what you can achieve with it.

One thing it does not yet do is click and drag. My personal preference is to use traditional gestures to solve that, but it could also be done via the gestureLayout by having one event to begin the drag, and another to end it.

And if it doesn’t interest you, it’s easy to disable independently of your chosen gestureLayout.

Where it came from

I originally implemented this functionality as the “action zone”. The idea was to push through the “active zone” to reach the “action zone”, which would trigger an action. In theory this should have worked. But in reality it wasn’t comfortable to use. Instead I used the “action zone” for other functionality (and have more planned for it), and used gestures to trigger clicks.

By contrast, taps use the speed of your hand moving away from you, anywhere within the “active zone”.

How taps work

There’s a fair bit of logic happening under the hood to make it work in an intuitive way, and I still have some room for improvement. But the basic idea is that we want to detect an intentional jousting movement of the hand. There are many similar-looking movements that should not trigger an event. For example:

  • The hand enters into the “active zone”.
  • The hand is part way through another gesture.
  • The hand is pulling out.
  • The hand is moving the cursor.
  • The LeapMotion controller is sending noise that looks like an intentional movement.
  • etc

I’m still rapidly improving the logic. But you can get the gist of it in the current, publicly visible code, in the public Boolean isDoingATap(String zone) function:

    public Boolean isDoingATap(String zone) {
        // Taps are disabled. Don't spend any more time on it.
        if (tapSpeed < 0) return false;

        // Hand is absent.
        if (absent) {
            resetTap();
            return false;
        }

        // We're not in the active zone.
        if (!zone.equals("active")) {
            resetTap();
            return false;
        }

        /*
            There are a couple of related things going on here.

            * We only want a tap when the hand is pushing away from you.
            * We don't want the tap to trigger when entering the active zone. Therefore we need to make sure that we don't arm the taps until the hand has started to retreat after entering the active zone.
        */
        if (isRetracting()) {
            tapArmed = true;
            tapNegativeCount ++;
            tapPositiveCount = 0;
            return false;
        } else {
            tapPositiveCount ++;
        }

        // If the hand is moving, we are busy doing something else.
        if (!isStationary()) {
            resetTap();
            return false;
        }

        // We haven't yet met the conditions to perform a tap. Don't do anything further.
        if (!tapArmed) {
            resetTap();
            return false;
        }

        // If the state is fluctuating, we don't want to trigger multiple events.
        if (tapNegativeCount > -1 && tapNegativeCount < samplesToWaitNegative) {
            return false;
        }

        if (tapPositiveCount < samplesToWaitPositive) {
            return false;
        }

        // Have we met the speed threshold for a tap?
        if (Math.abs(zSpeed) < tapSpeed) {
            return false;
        }

        // Phew! We're ready to perform the tap.
        tapNegativeCount = 0;
        tapPositiveCount = 0;
        return true;
    }

    private void resetTap() {
        tapNegativeCount = 0;
        tapPositiveCount = 0;
        tapArmed = false;
    }

A nod to the original name

I originally wanted to call handWavey “Display Jouster”, after the sort of stabbing motion you do while interacting with it.

I moved away from this for two reasons:

  • Very few of my friends understood the joke, so it was likely to be a widespread problem.
  • The jousting bit actually didn’t work well until now.

So although I didn’t stick with the name, it’s cool to finally live up to it.

Summary

All of this comes together to make handWavey much better at recognising what you intend rather than expecting you to move like a robot.

It was really cool watching my wife try this update for the first time, especially the taps. I had told her about the tap recognition a few days earlier when it was just an idea. Then she just started using it without needing any instruction.

There’s still some learning curve. And there’s still plenty of room for improvement (especially on tap recognition). I’ll continue to work on those niggles. But this is such an exciting place to be right now.

If you’ve been holding off giving handWavey a go, now is an excellent time to get into it.
