Sonification and journalism: how to play data through sounds
DOI: https://doi.org/10.26441/RC22.1-2023-3022
Keywords: sonification, journalism, data, sound, science, visually impaired people, activism, audification, software, emotiveness
Abstract
Sonification is a technique for representing data with sound that has been applied across various disciplines, including journalism, with growing intensity over the last 40 years. Its use with journalistic content is closely linked to practices in other fields, such as the representation of large volumes of scientific data and the interpretation of data sets through sound for visually impaired people. This article offers a theoretical approach to sonification, an overview of its evolution, and the challenges of its use in news reporting. It includes a general introduction to sonification, its main elements and techniques, together with a bibliographical analysis of both academic and professional literature. On this basis, an overview of sonification-based work is presented, first as applied to scientific output and to people with visual impairment; specific examples of journalistic sonifications then follow, along with descriptions of the tools used to produce them. Sonification has proved useful as an alternative representation for discriminating differences in large volumes of data, but much of its output, and many of its tools, remain experimental. The complexity of human sound perception, the difficulty of balancing its emotive and informative value, and the need to train audiences in interpreting this technique mean that it has yet to become a mainstream form of data representation.
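By way of illustration of the basic technique the abstract describes (representing data values through sound), the following minimal Python sketch, which is our own assumption and not code from the article or from any tool it surveys, maps a numeric series to pitches and renders the result to a WAV file using only the standard library:

```python
# Minimal parameter-mapping sonification sketch (illustrative only):
# each data value is mapped linearly to a pitch, and the resulting
# sine tones are written to a WAV file with the standard library.
import math
import struct
import wave

SAMPLE_RATE = 44100   # samples per second
NOTE_SECONDS = 0.25   # duration of each data point's tone

def value_to_frequency(value, lo, hi, f_min=220.0, f_max=880.0):
    """Map a data value in [lo, hi] to a frequency in [f_min, f_max] Hz."""
    if hi == lo:
        return f_min
    t = (value - lo) / (hi - lo)
    return f_min + t * (f_max - f_min)

def sonify(data, path="sonification.wav"):
    """Render one short tone per data value, higher values sounding higher."""
    lo, hi = min(data), max(data)
    frames = bytearray()
    for value in data:
        freq = value_to_frequency(value, lo, hi)
        for n in range(int(SAMPLE_RATE * NOTE_SECONDS)):
            sample = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.5))
    with wave.open(path, "w") as wav:
        wav.setnchannels(1)           # mono
        wav.setsampwidth(2)           # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

# Example with a short, hypothetical data series.
sonify([3, 5, 9, 4, 12, 7, 15])
```

This is only one possible mapping (value to pitch); real sonification work also maps data to duration, loudness, timbre, or spatial position, and journalistic projects typically rely on dedicated tools rather than hand-rolled synthesis.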
License
Copyright (c) 2023 Revista de Comunicación
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.