Can data sonification help us see data?

What is data sonification? Data sonification is to auditory perception as data visualisation is to visual perception. In other words, sonification allows you to hear data in much the same way that visualisation allows you to see it.

Representing data as sound or music is not the same as representing musical data visually (which I’ve done here). It’s something that has interested me ever since I first learned about it, which is why I was delighted to see this presentation from the Financial Times’ Alan Smith, in which he explores data sonification.

https://www.ft.com/video/93a6c808-e8e6-4e6d-b79c-30974ab589cc

This was fascinating and opened up a whole new way of understanding data. But does it help you *see* data? A first impression might be that no, sonification only allows you to hear data – that’s the whole point. Certainly, you can interpret large and small values, comparisons, trends and outliers in your data through the medium of sound. And perhaps, in much the same way that experience in data literacy helps a reader understand a visualisation, experience and training in interpreting sonifications (data musicality?) will assist the process further.

But “see” doesn’t always mean to use one’s eyes or visual perception. For many years my passion was chess, and I remember that one of my finest victories (almost thirty years ago) was against one of the country’s leading blind players. My queen raked across the board, sweeping the long diagonal and threatening my opponent’s king in an inescapable checkmating manoeuvre. But I digress – afterwards we chatted and analysed the game to see what we could have played differently. “Well done”, said my opponent, referring to my queen manoeuvre. “I didn’t see that.”

My teenage self tried not to snigger – of course he didn’t see it, he’s blind! But blind chess is played on two boards: a regular board for those of us with regulation eyesight, and a smaller, tactile board for the blind player to analyse and think through his or her moves. In the picture below, white and black squares are raised to different heights, pieces are held in place by pegs that fit into holes in each square, and, though you can’t see it here because the player is contemplating the positions and interplay of all his black pieces, the black and white pieces also feel different to the touch.

Picture from Braille Chess Association braillechess.org.uk – player unknown

So you don’t need to use visual senses to actually see.

If you look up definitions of the word “see”, this is what you get (courtesy of Google, before even visiting any dictionary sites). The first meaning is to perceive with the eyes, but the second is to deduce from information, or to understand.

[Screenshot: Google’s dictionary definition of “see”]

When we say “I see”, we’re not referring to the vision in front of us; we’re acknowledging that we have deduced the information presented to us and that we understand. The data visualisation package Tableau has it easy – its mission is to help customers “see and understand” their data, and in this sense the two amount to the same thing. If data sonification can achieve this just as data visualisation can, then we can say that sonification helps us see our data.

But let’s consider “seeing” data in the same way that a specially adapted chess set helps a blind chess player. Can data sonification help us “see” data if we are visually impaired or do not have full visual capabilities?

Now, I’d be the first to admit that I’m never one to miss out on a data visualisation fad! When joy plots became popular, I created a joy plot. When bar chart races were in vogue last month, I created a bar chart race (this might only be of interest to you if you’re in my data viz fantasy football league … but hey, it’s a bar chart race!). So when I watched Alan Smith’s yield curve data sonification, I wanted to create one myself – and here’s what I came up with.

I used the free app twotone.io to create a data sonification from the same dataset I used for my Data Visualisation Society visualisation. The featured image of this post shows the twotone.io interface – crucially, in order to create a sonification, the user first generates a simple visualisation (in this case a series of column charts). With knowledge of the underlying structure and shape of the data (and its visualisation), it’s possible to interpret the rise and fall of each instrument representing the self-declared “data”, “visualisation” and “society” scores of each society member. The twotone.io application also allows for additional arpeggios, which make for a more pleasant listening experience even though these extra notes don’t carry analytical value. Even so, I think it’s possible to get a good feel for the data.
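To make the idea of the mapping concrete, here’s a minimal sketch of value-to-pitch sonification in Python – not how twotone.io works under the hood, and using made-up scores rather than the real society dataset – where each value is mapped linearly onto a pitch range and rendered as a short sine tone:

```python
# A minimal sketch of pitch-mapped sonification, not the twotone.io implementation.
# The scores below are invented; the real Data Visualisation Society responses
# aren't reproduced here.
import array
import math
import wave

values = [3, 7, 5, 9, 2, 8, 6]       # hypothetical "visualisation" scores (1-10)
low_hz, high_hz = 220.0, 880.0       # pitch range: A3 up to A5
sample_rate = 44100
note_seconds = 0.3

def sine_tone(freq_hz, seconds):
    """One sine tone as 16-bit PCM samples."""
    n = int(sample_rate * seconds)
    return [int(32767 * 0.5 * math.sin(2 * math.pi * freq_hz * i / sample_rate))
            for i in range(n)]

vmin, vmax = min(values), max(values)
samples = []
for v in values:
    # Linear map: smallest value -> low_hz, largest value -> high_hz
    freq = low_hz + (v - vmin) / ((vmax - vmin) or 1) * (high_hz - low_hz)
    samples.extend(sine_tone(freq, note_seconds))

with wave.open("sonification.wav", "w") as out:
    out.setnchannels(1)
    out.setsampwidth(2)              # 16-bit samples
    out.setframerate(sample_rate)
    out.writeframes(array.array("h", samples).tobytes())
```

Play the resulting sonification.wav and larger values are simply heard as higher notes – the same principle that lets the rise and fall of each instrument in the twotone.io output track the underlying columns.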

Crucially, to help people understand the sonification, I chose to overlay the sound onto an animated video. Hearing is great, but seeing *and* hearing adds an extra level. I did wonder whether that was an acknowledgement that data sonification by itself is not quite sufficient to help us see our data. It’s hard to sonify data and explain how it is being represented without visualising it in some way to (literally) illustrate the methodology.

I had this post written up as far as the paragraph above a week or so ago, and wasn’t sure whether to complete it, or how to reach a conclusion on the benefits of data sonification. But then everything became clear in the unlikely setting of a family Sunday lunch at the weekend. My wife and I drove our family and my parents out to a country pub, taking two cars because there were too many of us for one. At one point the conversation moved on to how useful parking sensors are these days in helping us park more easily, particularly when reversing, as we compared the offerings of the two family cars.

My own car is ten years old now, and its parking sensor is great. It beeps. As I reverse towards a car or a wall, it beeps faster. As the other side of my car approaches an obstruction, it beeps at a different pitch. It’s data sonification in action – if the beeps get too fast I stop, because I’m about to reverse into the wall. It really helps me see my parking space. We already have data sonification in our lives: the rear parking sensors on our cars.
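As a toy illustration of that mapping – with invented distances and pitches, not readings from any real sensor – the gap between beeps simply shrinks as the distance does, while each side of the car gets its own tone:

```python
# A toy sketch of the parking-sensor mapping, using made-up numbers:
# the closer the obstacle, the shorter the gap between beeps, and each
# side of the car is given its own pitch.
distances_m = [1.5, 1.0, 0.6, 0.3, 0.15]       # hypothetical readings while reversing
side_pitch_hz = {"left": 660, "right": 880}    # a different tone per sensor side

for d in distances_m:
    gap_s = max(0.05, 0.4 * d)                 # beep faster as the distance shrinks
    print(f"{d:.2f} m -> beep at {side_pitch_hz['right']} Hz every {gap_s:.2f} s")
```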

My wife’s car is about two years old – it has no parking sensor, but it does have a rear-facing camera to assist parking. It’s easier to interpret and more accurate than my sensor: the visual representation and the clean lines on the camera output make it clear whether the driver has enough space to reverse towards the object he or she is approaching. This is data visualisation in action. Interestingly, this feature is seen as an improvement and an upgrade on the “beeping” parking sensor.

So, yes, data sonification *does* help us see our data. I’m not sure it is as effective as true data visualisation, but in the absence of visual cues (whether by design, or because of reduced or absent visual input), it has a place, and if done well, it works. And it’s a fun alternative! I may not have found such an example myself yet, but I look forward to learning of cases where sonification is every bit as important a representation of data as visualisation, helping us to see and understand our data.
