While reading “Rich Screen Reader Experiences for Accessible Data Visualization” by Zong et al., two items struck me related to data sonification.

First, the literature review and the study’s co-design experience amplified the message that screen reader users desire “an overview,” followed by user exploration as part of “information-seeking goals” (Zong et al.). One aim in our pilot is to “inclusively design and pilot auditory display techniques…to convey meaningful aspects of ocean science data,” and I hear how our Data Nuggets data sonifications, being developed by co-designing auditory display elements (data sonifications, data narratives, etc.), sound like a related focus. Even though our pilot’s aim doesn’t really provide space for the type of tool-building or evaluation of user exploration of data sonifications/visualizations that their article addresses, our co-design process is somewhat focused on a data “overview” aspect.

Second, what struck me was the call for screen readers to be able to “traverse accessible structure to explore data or locate specific points” (ibid: 4). In sound, my mind immediately raced toward sonification “callouts,” or embedded audio markers that users could move between and play back with keyboard shortcuts. Similar to metadata, embedded audio markers can travel with the digital audio file and can be read by many audio applications. Audio markers include both digital audio workstation (DAW) track markers, which help identify song sections and other key elements in a timeline, and embedded audio markers, which often serve as loop points, sampling keys, and anchors as part of audio regions and their respective files on disk.
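As a concrete sketch of how such markers travel with the file: in WAV audio, embedded markers are commonly stored as cue points in an optional `cue ` RIFF chunk, and a few lines of Python can pull out their sample offsets. This is a minimal illustration of that one format (assuming standard RIFF/WAVE cue-point layout), not a full-featured parser:

```python
import struct

def read_wav_cue_points(data: bytes):
    """Return the sample offsets of cue markers embedded in a RIFF/WAVE file.

    Scans the top-level RIFF chunks for the optional 'cue ' chunk, which holds
    a count followed by one 24-byte record per marker; the last field of each
    record is the marker's position in sample frames.
    """
    assert data[:4] == b'RIFF' and data[8:12] == b'WAVE', "not a WAVE file"
    offsets = []
    pos = 12                                  # first sub-chunk after the RIFF header
    while pos + 8 <= len(data):
        chunk_id, size = struct.unpack_from('<4sI', data, pos)
        if chunk_id == b'cue ':
            (count,) = struct.unpack_from('<I', data, pos + 8)
            for i in range(count):
                record = pos + 12 + 24 * i    # each cue-point record is 24 bytes
                # dwSampleOffset is the last 4-byte field of the record
                (sample_offset,) = struct.unpack_from('<I', data, record + 20)
                offsets.append(sample_offset)
        pos += 8 + size + (size & 1)          # RIFF chunks are word-aligned
    return offsets
```

A sonification playback tool could map these offsets to keyboard shortcuts, letting a listener jump directly between “callouts” much as a screen reader jumps between headings.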