A couple of days ago I wrote about sonifications, the sound equivalent of visualizations. Even though they haven’t permeated business culture the way visualizations have, they are all around us, used in a folk way—meaning, designers use sound in their interfaces, but it isn’t systematic or directly supported in tools.
Even without that discipline, sonifications generally do a much better job of making themselves actionable, because actionability is the main reason you use sound in the first place: you need to interrupt the user and get them to do something now. In my last essay on this topic, I gave a list of examples you encounter while driving: car honks, sirens, lane-drift warnings, etc.—they all need immediate attention.
If we look at the visualizations we encounter while driving, actionability is a core driver there too. The gas light, the tire-pressure warning, and traffic lights all translate to simple actions. Even the check engine light wants you to take an action (which is true even if you ignore it).
A lot of the visualizations I see elsewhere aren’t that clear. It’s easy to make charts in most software that holds data, and the easiest thing to do is pick a chart template and populate it. The chart practically makes itself.
The problem is that many visualizations communicate the data itself rather than the action. Instead of random lines wiggling up and down, I want a green or red light*. And just like in my hometown of Queens, if I don’t pay attention to the light, a honk.
* Augmented for color-blindness and accessibility, of course