The Taptic Engine in Apple's iPhone 7 not only made it possible to replace the mechanical home button, it also came with a new API for developers. Through this API it became possible to give the user subtle mechanical feedback about what is happening on screen.
For example, an app can combine user interactions (like tapping a button, moving a slider, or flipping a view) with a light shake of the phone.
Even more sophisticated feedback types are possible: a notification can be combined with the corresponding haptic feedback, so the user can literally feel whether a task was successful or not.
My new open source project makes using this API easier. You can find the project over here at GitHub. As always, I would love to hear feedback from you 🤗.
This blog post describes the usage of the haptic feedback API and explains the approach of my new project. If it makes life easier for you, feel free to use it. Otherwise, it may serve as a reference implementation for you.
To integrate haptic feedback, one must use UIFeedbackGenerator. Three different subclasses are derived from this superclass, one for each type of feedback:
- A single tap-like feedback, provided by UIImpactFeedbackGenerator
- A more complex feedback to indicate that a task has completed (successfully or with an error) or that a warning occurred, provided by UINotificationFeedbackGenerator
- A single tap that implies a change in the user's selection, for example when a switch is used or an image snaps into place, provided by UISelectionFeedbackGenerator
In any case, the Taptic Engine must be transitioned into its active state by calling prepare() on any of the mentioned subclasses. To align the UI interaction or even sounds with the haptic feedback, prepare() should be called about one or two seconds in advance.
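Used directly, the UIKit API looks roughly like this (a minimal sketch; the view controller and the button action are placeholders of my own, not part of any project):

```swift
import UIKit

final class UploadViewController: UIViewController {

    // Keep a strong reference, otherwise the generator is deallocated
    // before the feedback can fire.
    private let feedbackGenerator = UINotificationFeedbackGenerator()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Power up the Taptic Engine ahead of the expected interaction.
        feedbackGenerator.prepare()
    }

    @objc private func uploadFinished(successfully: Bool) {
        // Fire the feedback; for notifications, the type is passed at this point.
        feedbackGenerator.notificationOccurred(successfully ? .success : .error)
    }
}
```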
What FeedbackController does
As it turns out, the API is not as straightforward as it could be. For example, the type of a notification feedback is set at the moment the feedback fires. Then again, the style of an impact feedback must be set when the generator is initialized.
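That asymmetry looks like this (a sketch):

```swift
import UIKit

// Impact: the style is fixed at initialization time...
let impact = UIImpactFeedbackGenerator(style: .heavy)
impact.prepare()
impact.impactOccurred()

// ...while a notification generator takes its type only when firing.
let notification = UINotificationFeedbackGenerator()
notification.prepare()
notification.notificationOccurred(.warning)
```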
In addition to that, the API is only available on an iPhone 7 or newer running iOS 10 or higher. Prior to using the described APIs, one must perform availability checks. As a reference to the UIFeedbackGenerator must be held strongly and feedback calls can occur in many places of your app, those checks need to be implemented quite frequently.
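Without a wrapper, each call site ends up repeating roughly this pattern (a sketch; in real code the generator would be a strongly held property rather than a local):

```swift
import UIKit

if #available(iOS 10.0, *) {
    // Must be held strongly (e.g. as a property) in real code.
    let generator = UISelectionFeedbackGenerator()
    generator.prepare()
    generator.selectionChanged()
}
// On older systems (or devices without a Taptic Engine), no feedback fires.
```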
That is where FeedbackController comes into play. It simplifies the calls needed to perform haptic feedback and eliminates the need for the developer to keep in mind where the types of feedback can be configured.
In addition to that, it comes with easy-to-use extensions of UIViewController. They make it possible to use feedback everywhere in the app with two simple method calls.
However, a call to prepareFeedback(for:) should be made as early as possible so that the Taptic Engine can be powered on. Please see the example project for further details.
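In a view controller that might look like this (a sketch; the view controller, the action, and the exact value passed to the `for:` parameter are assumptions of mine, not taken from the project):

```swift
import UIKit
// import FeedbackController  // when integrated via CocoaPods

final class GalleryViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Power on the Taptic Engine as early as possible.
        // The argument value is assumed; see the project's README for the real API.
        prepareFeedback(for: .impact)
    }

    @objc private func imageSnappedIntoPlace() {
        // Fire the prepared feedback.
        hapticFeedbackImpactOccured()
    }
}
```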
How to use it
If you integrate FeedbackController using CocoaPods, you need to import FeedbackController first.
When you expect that a user interaction will take place in the next couple of seconds, call prepareFeedback(for:). In doing so, the Taptic Engine will be powered on and ready for your feedback. The timing is not as critical as you might have guessed; it turned out that preparing the Taptic Engine in viewDidAppear(_:) is sufficient.
However, the type of feedback is determined by the call to prepareFeedback(for:).
Secondly, the feedback needs to be performed. To do so, just call

- hapticFeedbackImpactOccured() for an impact feedback.
- hapticFeedbackNotificationOccured(with:) for a notification feedback, specifying the type of said notification.
- hapticFeedbackSelectionChanged() for a selection feedback.
When you are done with the feedback, call doneWithHapticFeedback() to allow the Taptic Engine to go back to its idle state.
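Putting it together, a typical lifecycle looks like this (a sketch; the view controller, the save action, and the `for:` argument value are assumptions of mine):

```swift
import UIKit
// import FeedbackController  // when integrated via CocoaPods

final class FormViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        prepareFeedback(for: .notification) // assumed argument value
    }

    @objc private func saveTapped() {
        let saved = save()
        // Let the user feel whether saving succeeded.
        hapticFeedbackNotificationOccured(with: saved ? .success : .error)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        doneWithHapticFeedback() // let the Taptic Engine idle again
    }

    private func save() -> Bool { true } // placeholder
}
```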
You can install it by using CocoaPods or just by copying the source files into your project.
Again, here is the link to the project.