
Open-Source Project: Audiograph

Shortly after iOS 13 was released, I discovered a new accessibility feature in the native Stocks app:
the app offers to use the rotor to select "data comprehension". The user can then choose between various options to get a verbal description of the chart, for example the trend or the minimum and maximum values. One option that stands out is "play audiograph".

If you haven't tried it yourself, here is a short video demonstrating it:

Using sound to describe a chart for accessibility reasons is a great idea, and the effort Apple has put into it suggests that people really benefit from it!
Charts most often describe a time-value relationship that is otherwise really hard to put into words. Why should people with visual impairments be excluded from that?

Unfortunately, there is no public API from Apple that lets developers implement this in their own apps (yet). What's more, there seems to be no open-source library that fills the gap (yet).

This is where my small new side project comes in:

Logo of Audiograph project

The usage is rather simple:

private func playAudiograph() {
    // Pass the points that already back the chart's drawing code.
    // (The exact parameter label is taken from the project's README.)
    audiograph.play(graphContent: points)
}

Audiograph can play the content that is visually displayed by the chart. The developer just needs to invoke the play method, passing the array of CGPoints that is already used to draw the UI.
The only logic left is deciding when to play the Audiograph.

To help with that, Audiograph provides a pre-configured UIAccessibilityCustomAction that the developer can add to the chart view.

accessibilityCustomActions = [audiograph.createCustomAccessibilityAction(for: self)]
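Putting both pieces together, a chart view might wire Audiograph up as sketched below. This is an illustration, not the library's documented setup: the parameterless Audiograph() initializer and the playAudiograph() trigger are assumptions; only play and createCustomAccessibilityAction(for:) appear in the snippets above.

```swift
import UIKit
import Audiograph

final class ChartView: UIView {
    /// The points that are already used to draw the chart.
    var points: [CGPoint] = [] {
        didSet { setNeedsDisplay() }
    }

    // Assumed default initializer; the library may offer configuration options.
    private let audiograph = Audiograph()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Chart"
        // Expose "Play Audiograph" through the rotor's custom-actions menu.
        accessibilityCustomActions = [audiograph.createCustomAccessibilityAction(for: self)]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    /// Could also be triggered manually, e.g. from a button.
    private func playAudiograph() {
        audiograph.play(graphContent: points)
    }
}
```

Marking the view as an accessibility element is what makes the custom action reachable via the rotor when VoiceOver focuses the chart.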

This is what it looks like in action:

You can read more about the project on GitHub.

If your app draws a chart in any way, please give it a try. Charts should not be an experience limited to people without impairments.

If you have any ideas for improvement please contact me 🙂
