Apple Watch 2 Speculation

What is the best way to control the Apple Watch 2 display?

With a full-colour screen, current display and battery technology simply isn't capable of keeping the screen lit for 16–18 hours a day. Could this change in the near future?

Battery densities increase only incrementally, and since the size of the device needs to stay the same or, more than likely, get thinner¹, there is little to no chance the battery is going to get 3–4 times bigger. Going from the current Apple Watch to one that could have the display on all day just isn't going to happen through battery improvements alone. There is a rumour of a 35% larger battery (a 1.28 Wh rating over the current Apple Watch's 0.98 Wh), but adding GPS to Watch 2 will probably eat most of that extra capacity. It doesn't change the fact that the display would still need to turn itself off for most of the day.
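As a back-of-envelope sanity check using only the figures quoted above (and assuming the two watt-hour ratings are directly comparable), even the rumoured bump works out to roughly a 31% gain, nowhere near the 3–4x an always-on display would need:

```swift
import Foundation

// Back-of-envelope check on the rumoured battery bump, using the figures quoted above.
let currentWh = 0.98    // original Apple Watch battery rating
let rumouredWh = 1.28   // rumoured Watch 2 battery rating
let gainPercent = (rumouredWh / currentWh - 1) * 100
print(String(format: "%.0f%% more capacity", gainPercent))
// Prints "31% more capacity": an incremental bump, not a multiple.
```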

So, is there a display technology that will give us the 10x or more improvement needed in power efficiency? Can AMOLEDs be made much more efficient than the current versions? Or are we waiting for micro-LED technology? Apple acquired a company called LuxVue in 2014 that was reportedly working on this tech. I'm not sure anybody outside of Apple would have any idea, but I suspect the power budget to have the screen on all the time is not feasible in the near future.

So what would solve this problem? Better software or new hardware?

The fundamental design flaw of the current Apple Watch is that it doesn't know for certain whether you are looking at it. The current trick of detecting a wrist-raise with the gyroscope is neat and clever, but nowhere near 100% reliable.

Yes, you can train users to rotate their forearms to “activate on raise”, but it's even more annoying when you do this Dick Tracy wrist-swivel move and the screen still doesn't come on! I notice the problem a lot when my hands are already above my shoulders, or when I'm lying on a couch, on my side or back.

And yes, the team could tweak that algorithm to make it more sensitive, but then you're going to get more false positives: the watch is pointing away from you and yet the screen is on, which both wastes battery and shows others what you may not want them to see.
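To make that tradeoff concrete, here's a minimal sketch (in Swift, with CoreMotion) of the kind of heuristic a raise-to-wake feature has to rely on. The thresholds are invented for illustration; Apple's real algorithm is private.

```swift
import CoreMotion

// A toy raise-to-wake heuristic: the screen is probably being looked at if it
// is tilted face-up AND the forearm just swivelled. Thresholds are made up.
final class WristRaiseDetector {
    private let motion = CMMotionManager()

    func start(onRaise: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let d = data else { return }
            // Gravity roughly along -z means the screen is facing up...
            let screenFacingUp = d.gravity.z < -0.75
            // ...and a recent burst of rotation suggests the wrist just turned.
            let justRotated = abs(d.rotationRate.x) > 1.5 || abs(d.rotationRate.y) > 1.5
            if screenFacingUp && justRotated {
                onRaise()
            }
        }
    }
}
```

The whole false-positive/false-negative tension lives in those two magic numbers: loosen them and the screen lights up while your arm swings as you walk; tighten them and the Dick Tracy swivel fails.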

But for the Apple Watch to be a better watch in every way than a traditional mechanical or quartz-powered watch, it fucking NEEDS to be visible whenever I look at it, with 100% reliability. Anything less is a terrible user experience.

I believe detecting WHEN a user is looking at the screen is the most fundamental challenge Apple needs to solve for watchOS to advance as a platform.

So, do we need better software? Or new hardware? Or in typical Apple style, both?

One Possible Solution

The Watch needs some sort of sensor to detect a user looking at it. I wonder if any of the engineers from PrimeSense, the company that created the original Kinect and was acquired by Apple in 2013, ended up on the Watch team?

I would guess the iPhone's face detection feature might already use this technology, and there are rumours of Apple working on 3D depth-mapping technology (the dual lenses on the iPhone 7 Plus seem way more interesting with this in mind, especially for augmented reality applications), but perhaps they've also been working on future Watch technologies?

I wonder whether the rumours of a FaceTime camera on the Watch have more to do with a sensor that could detect eyeballs looking at it than with video chat? I'm no programmer, but it would seem to me that the contrast between the whites of someone's eyes and their iris colour is something that could be reliably detected by a CMOS sensor and intelligent software. And could it be made cheaply enough to include on the $299 version of the product?
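The building blocks already exist in software, in fact: Core Image has shipped a face detector since iOS 5 that reports eye positions and blinks. Here's a sketch of the gating logic using that detector as a stand-in for whatever dedicated low-power sensor Apple might actually build:

```swift
import CoreImage

// Core Image's built-in face detector, tuned for speed over precision.
let detector = CIDetector(ofType: CIDetectorTypeFace,
                          context: nil,
                          options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

// Treat "someone is looking" as: a face with both eyes located and open.
func viewerIsLooking(at frame: CIImage) -> Bool {
    let features = detector?.features(in: frame, options: [CIDetectorEyeBlink: true]) ?? []
    return features.contains { feature in
        guard let face = feature as? CIFaceFeature else { return false }
        return face.hasLeftEyePosition && face.hasRightEyePosition
            && !face.leftEyeClosed && !face.rightEyeClosed
    }
}
```

The hard part isn't really the detection, it's the power budget: running a camera and this kind of analysis continuously is exactly the sort of job that would need a dedicated low-power sensor rather than the main CPU.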

So, the watch needs a sensor and a lens. Where would that go? There doesn't seem to be room for one on the current model, between the edge of the screen and the case. Would an ugly forehead or chin on the Watch get approval from our dear CDO? Would the holy altar of symmetry, which Jony has the entire design team pray to every day in the lab, allow for space above or below the screen for a camera hump? I don't think so.

By Apple's standards of industrial design, this sensor would need to be embedded behind the screen, as a layer underneath the AMOLEDs. I wonder if this patent from 2013 is ready to be used in a shipping product?

http://www.patentlyapple.com/patently-apple/2013/05/apple-invents-a-wild-new-display-that-could-conceal-a-camera-strobe-flash-andor-fingerprint-scanner-until-needed.html

There are already rumours of retina-scanning tech coming to the iPhone in 2017, but maybe the tech destined for the rumoured edge-to-edge AMOLED iPhone of 2018, with no forehead or chin, would be tested out first at the much smaller screen size and lower volumes of the Apple Watch?

Another nice side benefit of the S2 chip constantly scanning for eyeballs looking at the watch is that the opposite becomes possible too. If it could reliably detect not just eyeballs but whose eyeballs it's seeing, it could ensure the screen stays off when anyone other than you is looking. The “most personal device Apple has ever made” could expand to also mean that the UI and content on this device are truly “for your eyes only”.
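A toy sketch of the policy that would enable (the Viewer type and the recognition step behind it are entirely hypothetical, not any real API):

```swift
// Hypothetical: assumes some upstream recogniser classifies each frame's viewer.
enum Viewer { case owner, stranger, nobody }
enum DisplayState { case unlocked, off }

func displayState(for viewer: Viewer) -> DisplayState {
    switch viewer {
    case .owner:
        return .unlocked   // full watch face, notifications, Health details
    case .stranger, .nobody:
        return .off        // "for your eyes only", and it saves battery too
    }
}
```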

As you put more and more of your personal information about your health and daily life into Apple's operating systems, it would be wonderful if privacy could expand to making sure others don't accidentally see a private notification, or details about your health you'd rather not be public knowledge.

Maybe the new micro-LED screen technology is a requirement for this to happen? Most “analysts” seem to think it isn't feasible until 2017. So maybe I'm really predicting what the Apple Watch 3 will be like, but I really hope I'm wrong and we see a dramatically better Apple Watch UX on Sept 7th.

 

1. See Jony Ive's obsession with thinness in every other Apple product.