Sensors - Feeling the Force (Force Touch) in iOS9
Sensors are changing the planet and impacting us personally and professionally. They extend our physical senses into the digital world, and enable our physical touch to digitally control remote systems and machines on land, in the air, at sea and in space. The new Apple operating system iOS 9 adds new capabilities that my friend and Cognizant mobile and digital technical guru, Peter Rogers, has been testing and researching. In this "must read" article he shares how iOS 9 handles touch and sensing. [Geek Alert] This gets technical.
Every time a new games console is released (especially when Nintendo is involved), rumors float around about technological support for textures you can actually feel on your touch screen: basically, the ability to sense different materials through the screen. It is a lovely idea, and the closest we have come yet is probably haptics (https://en.wikipedia.org/wiki/Haptic_technology) and electric shock feedback (https://www.youtube.com/watch?v=MRQAijNKSEs).
Well, we are not quite there yet, but Apple certainly came close with the iPhone 6S announcement of 3D Touch (http://www.apple.com/iphone-6s/3d-touch/). After revolutionising the touch-screen world with multi-touch, it made perfect sense to add a force element to touches, offering different types of touch depending on the applied pressure. In fact, something called Force Touch was already available on the Apple Watch, although it measures your touches less precisely and reacts more slowly to your input. The new 3D Touch, by contrast, instantly measures microscopic changes and feeds them back from the hardware to the software in real time. 3D Touch is highly sensitive and reacts immediately, and it also distinguishes different types (or levels) of press depending on the pressure applied. Apple has included this feature in iOS 9, but the hardware is only available in 6S devices.
“When you press the display, capacitive sensors instantly measure microscopic changes in the distance between the cover glass and the backlight. iOS uses these measurements to provide fast, accurate, and continuous response to finger pressure, which could only happen with deep integration between software and hardware. iPhone 6s also provides you with responsive feedback in the form of subtle taps, letting you know that it’s sensing the pressure you’re applying.” [Apple]
I have already fallen in love with 3D Touch, but we have to remember that it is only available on 3D Touch devices, and the feature may also be turned off by the user. Currently the only devices supporting it are the 6S and 6S Plus, which is surprising given that the new iPad Pro would be perfect for pressure-sensitive art packages. The Apple Human Interface Guidelines state that “When 3D Touch is available, take advantage of its capabilities. When it is not available, provide alternatives such as by employing touch and hold. To ensure that all your users can access your app’s features, branch your code depending on whether 3D Touch is available.” This gives a glimpse of a future whereby most Apps use 3D Touch even if it is simulated on non-3D Touch devices.
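The branching the guidelines describe might be sketched as follows (in current Swift syntax; the class name and fallback behaviour are illustrative assumptions, not Apple's sample code). The key API is the trait collection's forceTouchCapability, which can also change at runtime if the user toggles 3D Touch in Settings:

```swift
import UIKit

class CanvasViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        configureForceTouch()
    }

    // Re-check capability when traits change, e.g. the user
    // disables 3D Touch in Settings > General > Accessibility.
    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        configureForceTouch()
    }

    private func configureForceTouch() {
        if traitCollection.forceTouchCapability == .available {
            // Register 3D Touch driven interactions here.
        } else {
            // Fall back to touch-and-hold, as the HIG suggests.
            let longPress = UILongPressGestureRecognizer(
                target: self, action: #selector(handleLongPress(_:)))
            view.addGestureRecognizer(longPress)
        }
    }

    @objc private func handleLongPress(_ recognizer: UILongPressGestureRecognizer) {
        // Hypothetical fallback behaviour for non-3D Touch devices.
    }
}
```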
As well as being built into some preinstalled applications, 3D Touch can also be used within third-party applications. It enables three new types of capability:
- Pressure sensitive applications, such as art packages
- Peek and pop, to preview content without opening it
- Quick actions, to offer a short cut to different services offered by the same App
The first is realised by two new properties in the UITouch class: ‘force’ and ‘maximumPossibleForce’. These properties allow ‘UIEvent’ events to convey touch pressure information to the App. A typical example is an art package whereby you press harder to get a thicker line.
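A minimal sketch of the art-package case might look like this (the helper function and its default widths are my own illustrative choices). Note that ‘maximumPossibleForce’ reports 0 on hardware without 3D Touch, so the mapping should degrade gracefully:

```swift
import Foundation

// Map a touch's pressure to a stroke width.
// `force` and `maxForce` mirror UITouch.force and
// UITouch.maximumPossibleForce; widths are arbitrary example values.
func strokeWidth(force: Double, maxForce: Double,
                 minWidth: Double = 1, maxWidth: Double = 10) -> Double {
    // Non-3D Touch hardware reports maximumPossibleForce == 0:
    // fall back to a constant-width line.
    guard maxForce > 0 else { return minWidth }
    let normalized = max(0, min(1, force / maxForce))
    return minWidth + (maxWidth - minWidth) * normalized
}

// In a UIView subclass you would then call it per touch, e.g.:
//
//   override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
//       guard let touch = touches.first else { return }
//       let width = strokeWidth(force: Double(touch.force),
//                               maxForce: Double(touch.maximumPossibleForce))
//       // extend the current stroke at touch.location(in: self) with `width`
//   }
```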
The second is true genius in my opinion. The UIViewController class can respond to three phases of applied pressure to offer ‘Peek and Pop’ functionality. When you first apply a little pressure, a visual indication appears (the rest of the content blurs) to show whether a content preview is available. If it is, a little more pressure shows you a preview of the content, called a ‘Peek’. If you release your finger at this stage, the preview is dismissed and you return to the original user interface without having wasted time loading content you did not need. The email client is a perfect use case, as you can imagine. If, however, you swipe upwards on the Peek, you are shown the ‘Peek Quick Actions’, which allow you to perform quick actions associated with the content – these are explained in the Quick Actions section later on. If you apply the final level of pressure, you navigate to the previewed content, which is referred to as a ‘Pop’. The analogy here is of a stack of visual elements that allows you to peek at an element before popping it off the stack.
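The flow above might be wired up like this (written in current Swift syntax; the inbox and message classes are hypothetical). The view controller registers for previewing, returns a controller to Peek, and commits it on Pop; the previewed controller supplies the Quick Actions revealed by the upward swipe:

```swift
import UIKit

class InboxViewController: UITableViewController, UIViewControllerPreviewingDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()
        if traitCollection.forceTouchCapability == .available {
            registerForPreviewing(with: self, sourceView: tableView)
        }
    }

    // Light press: return the view controller to show as a Peek,
    // or nil if no preview is available at that location.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           viewControllerForLocation location: CGPoint) -> UIViewController? {
        guard let indexPath = tableView.indexPathForRow(at: location) else { return nil }
        // Keep the pressed row sharp while the rest blurs.
        previewingContext.sourceRect = tableView.rectForRow(at: indexPath)
        return MessageViewController()   // hypothetical detail screen
    }

    // Deeper press: commit the Pop by presenting the previewed controller.
    func previewingContext(_ previewingContext: UIViewControllerPreviewing,
                           commit viewControllerToCommit: UIViewController) {
        show(viewControllerToCommit, sender: self)
    }
}

class MessageViewController: UIViewController {
    // Swiping up on the Peek surfaces these Peek Quick Actions.
    override var previewActionItems: [UIPreviewActionItem] {
        return [UIPreviewAction(title: "Delete", style: .destructive) { _, _ in
            // hypothetical delete handler
        }]
    }
}
```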
This is where Apple have been really clever in iOS 9 and their rollout of information: we had previously seen the capability to switch between Apps transparently, and it becomes very clear why this is so useful when we see ‘Peek and Pop’. For example, the new Safari View Controller actually uses Safari to do the rendering without launching it. Likewise, the new hot-linking between web browser and Apps is seamless, without any App loading or closing. This enables the Peek preview to show you a preview of a Web URL or Apple Map contained in an email, without having to clumsily swap between applications. This is built into a few of the native applications: email; web links in email; locations in email; and the camera.
The third is probably the most contentious. By pressing on an App icon on a 3D Touch device, you are presented with a menu of options called Quick Actions. These actions allow you to use the App to quickly perform a given service – for example, “Take a Selfie” is supported in the pre-installed Camera App. If you can anticipate between three and five common tasks that your App performs (typically the items within a menu shown on the first screen are good candidates), then you can offer these as Quick Actions, either statically (in your app’s Info.plist file) or dynamically (using UIApplicationShortcutItem). A Quick Action can display a small amount of text and an optional icon.
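The dynamic route might be sketched as follows (current Swift syntax; the action type string and titles are invented for illustration). The app delegate registers the shortcut at launch and handles it when the user picks it from the Home screen icon:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    private let composeType = "com.example.mail.compose"   // hypothetical identifier

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions
                     launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Register a dynamic Quick Action; the static alternative is the
        // UIApplicationShortcutItems array in Info.plist.
        let compose = UIApplicationShortcutItem(
            type: composeType,
            localizedTitle: "New Message",
            localizedSubtitle: "Start a draft",
            icon: UIApplicationShortcutIcon(type: .compose),
            userInfo: nil)
        application.shortcutItems = [compose]
        return true
    }

    // Invoked when the user chooses a Quick Action from the icon menu.
    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        let handled = (shortcutItem.type == composeType)
        // …navigate to the compose screen here…
        completionHandler(handled)
    }
}
```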
The only downside to all of this wonderfulness is how Xcode 7 supports 3D Touch development. Sadly, the Simulator in Xcode 7 does not support 3D Touch, and neither does Interface Builder. That pretty much means you need to test 3D Touch on a physical device. It also adds a whole layer of entropy for automated testing using systems like Calabash.
As wonderful as iOS 9 is, and I truly believe it is wonderful now, the bottom line is that developers are going to face three issues:
- They will need to be doing a lot more on-device testing for 3D Touch and Multi-Tasking
- They will be increasingly going in different directions for iOS and Android development
- They will be increasingly waiting for cutting edge features to be supported in cross-platform solutions
iOS 9 may go down in history as the operating system that finally broke cross platform development and actually differentiated between native Apps and HTML 5.