The announcement of AssistiveTouch on Apple Watch caused a great deal of excitement among those with motor disabilities, while many others were impressed by the sheer technological achievement.

An analyst believes that we’re only seeing the tip of the iceberg in terms of the uses for this particular feature…

Writing on Above Avalon, analyst Neil Cybart believes that Apple is likely to use the tech as a means of controlling Apple Glasses.

It’s part of a post in which Cybart argues that Apple has a four- to five-year technological lead in wearables, and is more like a decade ahead when you look at the broader picture.

Two months ago, Facebook gave the press a peek at its research into using a smartwatch-like device as an input method for a pair of AR glasses. The research, centered on electromyography, looked to be at a fairly early stage, with many years of work needed before the technology appears in a consumer-facing product. The video was intriguing because it showed research thought to be at the forefront of what is going on in technology R&D today.

Apple then shocked everyone by unveiling AssistiveTouch for Apple Watch. Instead of showing a behind-the-scenes look at an R&D project, Apple unveiled a technology ready for users today. The feature, which relies on a combination of sensors to turn the Apple Watch into a hand/finger gesture reader, was designed for those in need of additional accessibility options.

Of course, the technology can go on to have other use cases over time, such as controlling a pair of smart glasses like the ones Facebook is working on. AssistiveTouch does a good job of showing just how far ahead Apple is on the wearables R&D front.

Both arguments make sense to me. Glasses are a reasonably convenient display device for information and AR experiences, but the Google Glass approach of tapping and swiping on a touchpad was a pretty clunky form of control. To me, it had exactly the same problem Steve Jobs described with touchscreen computers: it seems to make sense at first, but becomes awkward and tiring to use after a while.

When Apple unveiled the iPhone in January 2007, Steve Jobs famously said that the iPhone was “literally five years ahead of any other mobile phone.” He ended up being mostly correct. It took the competition a number of years, and a whole lot of copying, to catch up with what Apple had just unveiled.

With wearables, my suspicion is Apple’s lead is longer than five years. There are three components to Apple’s wearables lead:

Custom silicon/technology/sensors (a four- to five-year lead over the competition, and that is being generous to the competition).

Design-led product development processes that emphasize the user experience (adds three years to Apple’s lead).

A broader ecosystem build-out in terms of a suite of wearables and services (adds two years to Apple’s lead).

Simply leaving our arm relaxed at our side and using pinch and clench gestures with our hand seems like a great way to control Apple Glasses or similar.

Of course, this would require someone to own an Apple Watch as well as Apple Glasses, so it perhaps won’t be the only control option, but it makes sense as a recommended one, and it would boost the Apple ecosystem by making the two devices better together than either alone.
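Apple hasn’t published the internals of AssistiveTouch, but it has said the feature combines the Watch’s built-in motion sensors (gyroscope and accelerometer) with the optical heart rate sensor and on-device machine learning to detect subtle muscle movement and tendon activity. Purely as an illustration of the kind of sensor stream such a gesture reader builds on, here is a minimal, hypothetical Swift sketch that reads device motion on watchOS and hands it to a placeholder classifier; the GestureSketch class, the HandGesture enum, and the crude threshold are my own stand-ins, not Apple’s implementation.

```swift
import CoreMotion

// Hypothetical sketch only: Apple has not documented how AssistiveTouch
// classifies pinch/clench gestures. This just shows the kind of wrist
// motion data (acceleration + rotation) a gesture reader could build on.
enum HandGesture {
    case pinch, clench, idle
}

final class GestureSketch {
    private let motion = CMMotionManager()

    /// Placeholder classifier. A real detector would run a trained model
    /// over a window of samples rather than thresholding a single reading.
    private func classify(_ sample: CMDeviceMotion) -> HandGesture {
        let accel = sample.userAcceleration
        let spin = sample.rotationRate
        let energy = abs(accel.x) + abs(accel.y) + abs(accel.z)
            + abs(spin.x) + abs(spin.y) + abs(spin.z)
        return energy > 1.5 ? .clench : .idle   // illustrative threshold only
    }

    func start(onGesture: @escaping (HandGesture) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0   // sample at 50 Hz
        motion.startDeviceMotionUpdates(to: .main) { sample, _ in
            guard let sample = sample else { return }
            let gesture = self.classify(sample)
            if gesture != .idle { onGesture(gesture) }
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```

In practice, a production detector would run a trained model over sliding windows of samples, and as far as I know the optical sensor signal Apple describes isn’t exposed to third-party code at all, which is why this can only be a sketch.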

The whole piece is well worth reading.

Concept image: iDropNews