LiU-ITN-TEK-A--20/050-SE

Interactive Sonification in OpenSpace

Malin Ejdbo
Elias Elmquist

Department of Science and Technology (Institutionen för teknik och naturvetenskap)
Linköping University (Linköpings universitet)
SE-601 74 Norrköping, Sweden

2020-08-28



Interactive Sonification in OpenSpace

The thesis work was carried out in Medieteknik at Tekniska högskolan at Linköpings universitet.

Malin Ejdbo
Elias Elmquist

Norrköping 2020-08-28


Linköpings universitet, SE-581 83 Linköping, +46 13 28 10 00, www.liu.se

Linköping University | Department of Science and Technology
Master's thesis, 30 ECTS | Media Technology

2020 | LiU-ITN-TEK-A--20/050--SE

Interactive Sonification in OpenSpace

Elias Elmquist
Malin Ejdbo

Supervisor: Niklas Rönnberg
Examiner: Camilla Forsell



Copyright

The publishers will keep this document online on the Internet - or its possible replacement - for a period of 25 years starting from the date of publication barring exceptional circumstances. The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his/her own use and to use it unchanged for non-commercial research and educational purposes. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility. According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement. For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its home page: http://www.ep.liu.se/.

© Elias Elmquist, Malin Ejdbo


Abstract

This report presents the work of a master's thesis whose aim was to investigate how sonification can be used in the space visualization software OpenSpace to further convey information about the Solar System. A sonification was implemented using the software SuperCollider and was integrated into OpenSpace using Open Sound Control to send positional data that controls the panning and sound level of the sonification. The graphical user interface of OpenSpace was also extended to make the sonification interactive. Evaluations were conducted both online and in the Dome theater to evaluate how well the sonification conveyed information. The outcome of the evaluations shows promising results, which might suggest that sonification has a future in conveying information about the Solar System.


Acknowledgments

First of all, we would like to thank Niklas Rönnberg for creating this exciting opportunity that would become the subject of this master's thesis. You have been an invaluable asset as a supervisor for sonification, sound theory, evaluation methods and everything in between.

Secondly, we would like to thank Alexander Bock for being our unofficial second supervisor by introducing us to OpenSpace and the Dome theater. Thank you for letting us into the Dome theater during inconvenient hours.

Additional thanks to Lovisa Hassler for further introducing us to OpenSpace and pointing us in the right direction, to Camilla Forsell for her input on evaluation methods, and to everyone who participated in our evaluations.


Contents

Abstract iii

Acknowledgments iv

Contents v

List of Figures vii

List of Tables ix

1 Introduction 1
    1.1 Background 1
    1.2 Motivation 2
    1.3 Aim 2
    1.4 Research Questions 3
    1.5 Delimitations 3

2 Theory 4
    2.1 OpenSpace 4
    2.2 Sonification 6
        2.2.1 Strengths of Sonification 6
        2.2.2 Types of Sonification 6
        2.2.3 Types of Sounds 7
        2.2.4 Sound Parameters 8
    2.3 Related Work 8
        2.3.1 Sonifications with Visual Component 8
        2.3.2 Sonifications in Planetariums 9
    2.4 SuperCollider 9
    2.5 Open Sound Control 10

3 Method and Implementation 11
    3.1 OpenSpace Views 11
        3.1.1 Solar System View 11
        3.1.2 Planetary View 12
        3.1.3 Compare View 13
    3.2 Sonification 14
        3.2.1 Identifying Important Concepts 14
        3.2.2 Implementation of Sonifications 14
        3.2.3 Sound Architecture 18
        3.2.4 Surround in the Dome 19
    3.3 Integration with OpenSpace 20
        3.3.1 Extraction Method 20
        3.3.2 The Data 20
        3.3.3 Distances and Angles 20
        3.3.4 Precision Error 23
        3.3.5 Optimization 23
    3.4 Open Sound Control 23
    3.5 Graphical User Interface 24
    3.6 User Evaluation 25
        3.6.1 Initial Evaluation 25
        3.6.2 Online Surveys 26
        3.6.3 Dome Evaluation 26
        3.6.4 Analyzing the Data 27

4 Evaluation Results and Further Development 28
    4.1 First Survey 28
    4.2 Changes for Second Survey 30
    4.3 Second Survey 30
    4.4 Dome Evaluation 33

5 Discussion 37
    5.1 Evaluation Method 37
    5.2 Results 38
    5.3 Implementation 39
    5.4 Future Work 39

6 Conclusion 41

Bibliography 43


List of Figures

1.1 Screenshot of OpenSpace. The planets in order of distance to the Sun are named Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus and Neptune. All planets have been scaled up for visibility in this picture. 2

2.1 Screenshot of OpenSpace, showing the interface and settings of focus and time. 5

3.1 Screenshot of OpenSpace in the Solar System view, showing the inner planets. 12

3.2 Screenshot of OpenSpace in the planetary view, displaying Earth. 13

3.3 Screenshot of OpenSpace in the compare view, where Earth and Mars are compared. 13

3.4 The pitches of each planet, shown on a piano with 61 keys. 16

3.5 An overview of the structure of the sonification. 18

3.6 Ideal surround sound placement. Note that the LFE component does not emit directional sound and can be placed more freely. 19

3.7 The calculation of the angles for planets relative to the camera. 21

3.8 The black curve depicts how the sound level for the sonification of Earth changes depending on the distance to the planet. The value on the Y-axis is the sound level for the planet and the value on the X-axis is the distance to the planet in kilometers (km). The points where the lines of the same color intersect are points where the curve was desired to be close to. 22

3.12 The GUI in OpenSpace for all three views. 25

4.1 BUZZ results from the first online survey presented as a box plot. The statements have been simplified for clarity. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value. 29

4.2 Result for each sonification for the first survey. The thick horizontal line in the box represents the median value and the cross the average value. 29

4.3 BUZZ results from both online surveys presented as a box plot. The statements have been simplified for clarity. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value. 31

4.4 BUZZ score results from both surveys. The thick horizontal line in the box represents the median value and the cross the average value. 31

4.5 Result of the added BUZZ statements after each sonification for the second survey. The thick horizontal line in the box represents the median value and the cross the average value. 32

4.6 Result for the sonifications in both surveys. The thick horizontal line in the box represents the median value and the cross the average value. 33

4.7 BUZZ results for the second survey and the Dome evaluation. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value. 34


4.8 Total BUZZ score result for all the evaluations. The thick horizontal line in the box represents the median value and the cross the average value. 34

4.9 Result of the three BUZZ statements after each sonification for the Dome evaluation. The thick horizontal line in the box represents the median value and the cross the average value. 35

4.10 Result for all the sonifications in the Dome evaluation. The thick horizontal line in the box represents the median value and the cross the average value. 36


List of Tables

3.1 List of planet properties for the eight planets. Temperature is given in Kelvin (K) and the other values are given as a ratio compared to Earth. 14

3.2 List of planet properties and their respective sonification mappings. 15


1 Introduction

There is no sound in space [16]. That is a fact that has discouraged the use of sound when it comes to representing scientific data in outer space. Instead, scientific visualization is the primary method of conceptualizing the information of space, which is the method used in the visualization software OpenSpace [5].

1.1 Background

Sound relies on vibrations propagating as pressure waves through a transmission medium [16]. On Earth this medium is air, which surrounds us within the atmosphere [7]. In space, however, there is no transmission medium for sound to travel in. The average density of atoms in space is about one atom per cubic centimeter [16], compared to millions of billions of air molecules per cubic centimeter in Earth's atmosphere [7]. Even though sound does not exist in space, sound can still conceptualize data collected from space using sonification.

Instead of visually showing data, sonification is a method of conveying information through non-speech audio [11]. It is a method that has been around for several decades but has only in recent years started to gain more attention. There are certain advantages to using sonification over visualization, since our hearing can be a powerful tool for analyzing data. By using these advantages, as well as immersing the audience in sound, a deeper understanding of astronomy can be acquired.

Sonification has proved itself an important scientific tool in certain scientific discoveries. One example is when the space probe Voyager 2 approached Saturn's rings [15]. When scientists looked at the data it appeared to be just noise. However, by sonifying the data, the scientists could discern a sound resembling a "machine gun". With this discovery the scientists could conclude that the noise was caused by micrometeoroids hitting the hull of Voyager 2. This example shows that sound can perceptualize data in a way that visualization techniques cannot.

The Solar System contains one star and eight planets that orbit around it in elliptical paths at different distances and speeds [7]. The differences between the planets can sometimes be very large; for example, the diameter of the largest planet, Jupiter, is almost 30 times larger than the diameter of the smallest planet, Mercury. One way to explore the Solar System is to use a visualization software such as OpenSpace, see Figure 1.1.


Figure 1.1: Screenshot of OpenSpace. The planets in order of distance to the Sun are named Mercury, Venus, Earth, Mars, Jupiter, Saturn, Uranus and Neptune. All planets have been scaled up for visibility in this picture.

OpenSpace is an open-source interactive data visualization software that uses scientific visualization to portray the known universe in several ways [5]. It is a collaboration between Linköping University, the American Museum of Natural History and NASA Goddard's Community Coordinated Modeling Center. The software supports a wide variety of setups, ranging from personal computers to advanced planetariums with clusters of computers and projectors. It can show the Solar System at an accurate relative scale, while also being able to visualize dynamic simulations such as the magnetic field of Earth and other planets.

OpenSpace does not output any sound, which is a missed opportunity, as sound can be used as a tool to further engage audiences and give new perspectives on the data [11]. This is what will be explored in this thesis by implementing sonification in OpenSpace.

1.2 Motivation

During a planetarium show, the visuals of OpenSpace are usually accompanied by background music while a presenter gives information through speech. The music does evoke a feeling of space and a sense of wonder, but it does not relate to anything shown visually or add any additional information or insight. By offering sonification in a planetarium show, the show could become more immersive and informative, as the sounds can be connected to what is shown on the screen [10]. The immersive feeling can create a sense of wonder for astronomy that is tied to astronomical data, inspiring more people to take an interest in it.

There are a couple of examples where sonification has previously been tested in planetariums as a demonstration for the general public [18, 25]. However, these sonifications were not part of any visualization software that could complement the audio. By adding sonification to a software like OpenSpace, an interactive and immersive audio-visual experience would be created, which in turn could lead to an increased understanding of astronomy for the audience.

1.3 Aim

The aim of this thesis is to investigate in what ways sonification can be used, together with the software OpenSpace, to increase understanding of astronomy, as well as to increase the immersiveness for the audience. Specifically, the sonification will be implemented for use in the Dome theater at Visualization Center C¹ in Norrköping, which includes sound and visuals in 360 degrees around the audience.

An implementation will be made to extract data from OpenSpace and send it to a real-time audio synthesizer where the data will be sonified. Evaluations will be conducted during the development to test how well the sonification conveys the data to the audience. Feedback from the evaluations will be used to improve the sonification further.

¹ Visualization Center C: http://visualiseringscenter.se/en/about-c

1.4 Research Questions

This thesis aims to answer the following research questions:

1. How can sonification give a comprehensible understanding of the Solar System?

2. What Solar System data should be used in the sonification, and what sound parameters should be manipulated to convey the data?

3. How can a sonification be integrated into an interactive data visualization software such as OpenSpace, and how can it make the experience more immersive?

1.5 Delimitations

The addition of sound in OpenSpace will increase the accessibility of the software for visually impaired people. However, the work of this thesis will not focus on replacing the already existing visualization with a sonification. Instead, the goal is to use sonification to complement the visualization of OpenSpace.

The scope will be limited to the Solar System, which includes the Sun, the eight planets and their respective moons. Data from other parts of space will not be part of the sonification.

The sonification is going to be tailored to the setting of the Dome theater with a 7.1 surround sound system. During development, however, stereo speakers and headphones will be used to monitor the audio.


2 Theory

Creating a sonification for OpenSpace required knowledge about how OpenSpace works and how interaction is done with the software. It was also necessary to identify in what ways sonification could be used to complement the experience of OpenSpace. This required knowledge about different types of sonification to find the most suitable way to represent the data. Related work was also studied to find sources of inspiration for creating a sonification.

2.1 OpenSpace

The system behind OpenSpace is designed in such a way that it can easily be expanded. The system architecture is structured into four layers: the OpenSpace core, modules, OpenSpace applications and external applications [5]. This structure makes it possible to extend the functionality to send data to the sonification software.

The coordinate system in OpenSpace has its origin located at the Sun and uses single-precision floating point numbers with meters as the unit. However, this causes single-precision floating point errors, since floating point numbers are not precise enough to describe every point in the large environment of the Solar System with high enough precision [2]. The precision decreases with increased distance to the origin and is caused by the fact that not every number can be represented with single precision. The numbers that cannot be represented are rounded to the closest representable number, which causes values to jump quickly from one value to another. This could cause problems for the parts of the sonification that depend on positions.
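To make the scale of this precision problem concrete, the rounding behavior of single-precision values at Solar System distances can be sketched in a few lines of Python (an illustrative snippet, not OpenSpace code; the helper names are ours):

```python
import math
import struct

def to_float32(x: float) -> float:
    # Round-trip through 4 bytes to get the nearest value that a
    # single-precision float can actually store.
    return struct.unpack("f", struct.pack("f", x))[0]

def float32_spacing(x: float) -> float:
    # Gap between adjacent representable float32 values near x.
    # float32 has a 24-bit significand, so the spacing at magnitude
    # 2**e is 2**(e - 23).
    e = math.floor(math.log2(abs(x)))
    return 2.0 ** (e - 23)

au_m = 149_597_870_700.0      # Earth's distance from the Sun, in meters
print(float32_spacing(au_m))  # → 16384.0: near Earth, stored positions can
                              #   only change in jumps of roughly 16 km
print(to_float32(au_m + 1000.0) == to_float32(au_m))  # → True
```

At Earth's distance from the origin, a camera position stored in single precision can thus only move in steps of roughly 16 km, which is why position-dependent sonification parameters would jump rather than change smoothly.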


Figure 2.1: Screenshot of OpenSpace, showing the interface and settings of focus and time.

Navigation in OpenSpace is focus-oriented, which means that all navigation is relative to an object that is in focus. In order to navigate to a planet, the user must switch focus to the planet of interest by using the graphical user interface (GUI) in OpenSpace. Once the new planet is in focus the user can navigate closer to that planet and explore it instead. The focus-oriented navigation makes it easy to navigate in the environment while also preventing the user from getting lost in an otherwise large and empty environment. The object that is in focus heavily influences what is visually shown on screen and should consequently also influence the sonification.

In OpenSpace it is possible to change how fast time is simulated, which moves the planets more quickly either into the future or the past. This enables the user to appreciate changes that would otherwise be hard to discern by observing them in real time, which includes almost everything temporal happening in space. The sonification should react to this change to create a more dynamic connection with the software.

The GUI of OpenSpace can be used to change the object in focus, change the speed of time and change settings for the individual objects in the scene, see Figure 2.1. It can also be used to change the settings of the individual modules in OpenSpace, which could also be used to control the sonification.

OpenSpace gives a visual representation of the Solar System and communicates facts about the Solar System that can be visualized. The appearances of the planets are shown using real photographs taken from previous space missions [4]. Their positions in the Solar System and their lengths of day and year are visualized by simulating the Solar System. However, OpenSpace does not show all of the information about the planets; this would instead be communicated by a presenter during a planetarium show. In similar software¹ such information could be visible in an information window, but this would not be an immersive way of presenting the information in a planetarium show. Instead, audio in the form of a sonification could be used to complement the visuals of OpenSpace to mediate information that would otherwise not be seen.

¹ NASA's Eyes on the Solar System: https://eyes.jpl.nasa.gov/eyes-on-the-solar-system.html

2.2 Sonification

Sonification is the use of non-speech sound to convey and communicate information. It does this by translating relationships in data into sound, stimulating the human hearing in such a way that the data relationships become comprehensible [11]. Sonification shares the same purpose as visualization, to communicate information, with the biggest difference being that sonification transmits information sonically, to the ears.

2.2.1 Strengths of Sonification

The strengths of sonification are based on how the ears differ from the eyes. It is therefore important to identify specific concepts where sound can perceptualize a concept better than a visual representation can. Compared to our vision, our auditory perception is better at detecting temporal and other continuous changes [20, 11]. This can be useful when representing dynamic data by altering the sound according to it.

Sound can also be positioned in a wider space than visual components. This can be referred to as spherical sound, which describes the multidimensional and multi-directional nature of hearing [23]. While our eyes are only capable of observing things that are in front of us, our ears can detect and focus on changes all around us. This opens the possibility of positioning objects all around the audience, which enables a sense of presence and immersion.
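As a minimal sketch of this idea (our illustration, not the thesis implementation), an azimuth angle around the listener can be mapped to channel gains with a constant-power pan law, which keeps the perceived loudness steady as a source moves:

```python
import math

def constant_power_pan(azimuth_deg: float) -> tuple[float, float]:
    """Map an azimuth angle (-90 = hard left, 0 = center, +90 = hard
    right) to left/right gains whose squared sum is always 1, so the
    perceived loudness stays constant as the source moves."""
    # Rescale the azimuth to a pan angle in [0, pi/2].
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    return math.cos(theta), math.sin(theta)

left, right = constant_power_pan(0.0)        # source straight ahead
assert abs(left - right) < 1e-9              # equal gain in both channels
assert abs(left**2 + right**2 - 1.0) < 1e-9  # total power is preserved
```

The same cosine/sine weighting generalizes to more channels, which is the basic principle behind distributing a source over a surround speaker ring such as the one in the Dome theater.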

Planetariums and other science centers often rely on engaging audiences by offering an immersive and stimulating experience to spark their curiosity. This is described by Mark Ballora [3] as the "wow-factor", where he also states that sonification is an under-utilized element for this purpose, and that it should be used because humans are audiovisual by nature. Ballora also touches on the fact that sound has a power to evoke emotions and memories, more so than its visual counterparts, which would enable the audience to be more emotionally connected to the experience. Presenting both visual and auditory components increases the chance that the audience remembers the information presented, as they can create associations with either of the senses.

With multidimensional data, several sounds can be played at once, as our hearing is capable of perceiving and distinguishing between several sounds simultaneously. This is why music often consists of several instruments playing together. When listening to music, it is also possible to focus on one instrument throughout the song. This relates to the Cocktail Party phenomenon [6], where a person can isolate another person talking among a crowd of other people. There is however a difference between perceiving several sounds and actually absorbing the information. Schuett and Walker [21] stated that an audience can focus on at least three auditory streams at the same time. This comprehension can also be improved by separating the sounds spatially, as shown by Song and Beilharz [22], where a separation of 120 degrees between two audio streams gave increased comprehension while listening to them simultaneously. Balancing the number of audio streams is important, to stimulate the audience enough while not playing so many sounds that confusion results instead.

2.2.2 Types of Sonification

Different types of sonification exist to cater to different situations. For data exploration, which is the focus of this project, both direct and more abstract techniques are available.

Audification is the direct conversion of data into a waveform in the audible domain. A waveform that would normally not be audible can become audible by speeding up or slowing down the waveform, i.e. transposing it. A common application of audification is bringing ultrasound or infrasound, such as the vibrations of earthquakes, into the spectrum of human hearing. One of the more recent and famous examples is the audification of the gravitational waves discovered in 2016 at the laser interferometer facility LIGO [1]. The detection of the gravitational waves resulted in a waveform increasing in frequency from 35 to 250 Hz, which was converted into an audio clip.
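The transposition step can be sketched as a simple resampling (a hedged, illustrative snippet; real audification tools operate on recorded waveforms rather than Python lists, and typically use better interpolation):

```python
import math

def transpose(samples: list[float], factor: float) -> list[float]:
    """Resample a waveform so that playback at the original sample rate
    shifts every frequency by `factor` (2.0 = one octave up, 0.5 = one
    octave down), using linear interpolation between input samples."""
    out = []
    for i in range(int(len(samples) / factor)):
        pos = i * factor              # fractional read position in the input
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)  # linear interpolation
    return out

# A 100 Hz tone sampled at 1 kHz, shifted up two octaves (factor 4),
# becomes a 400 Hz tone at a quarter of the original length.
tone = [math.sin(2 * math.pi * 100 * n / 1000) for n in range(1000)]
shifted = transpose(tone, 4.0)
assert len(shifted) == 250
```

Shifting an infrasonic signal upward in this way is what places earthquake vibrations or the LIGO chirp inside the range of human hearing.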

One of the simpler kinds of sonification is the auditory graph, a sonic representation of the data in a graph. Most commonly, the pitch of a sound is manipulated depending on the value on the Y-axis and played continuously along the X-axis. This type of sonification is the most straightforward way of presenting a graph to visually impaired users. Xsonify [8] is an example of software that can convert data into auditory graphs.

Parameter mapping is the most common type of sonification [15]; it connects the value of the data to a parameter of a synthesized sound, such as pitch, loudness or rhythm. How the data is converted to the synthesized sound determines the quality of the mapping, which can also be affected by choosing an appropriate scaling. If data from a linear domain were mapped directly to pitch, for example, the relation between the data and the sonification might not come across, because human hearing perceives pitch logarithmically with respect to frequency: equal pitch steps correspond to equal frequency ratios.
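As a concrete illustration of such scaling, a normalized data value can be mapped to frequency through exponential interpolation, so that equal steps in the data correspond to equal musical intervals. This is a minimal sketch; the function name and the frequency range are illustrative and not taken from the thesis.

```python
def linear_to_freq(value, v_min, v_max, f_min=110.0, f_max=880.0):
    """Map a linear data value onto a perceptually even pitch scale:
    equal data steps give equal frequency *ratios*, not equal Hz steps."""
    t = (value - v_min) / (v_max - v_min)  # normalize to 0..1
    return f_min * (f_max / f_min) ** t    # exponential interpolation

linear_to_freq(0.0, 0.0, 1.0)  # → 110.0 Hz (bottom of the range)
linear_to_freq(1.0, 0.0, 1.0)  # → 880.0 Hz (top of the range)
```

With a linear mapping, the data midpoint would land at 495 Hz; with this exponential mapping it lands at about 311 Hz, which is the perceptual halfway point (one and a half octaves above the bottom).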

Interactive sonification [12] is another type of sonification, which can be achieved by enabling an interactive element in any of the types of sonification mentioned above. For parameter mapping, this could mean that the user can manipulate the data, which in turn changes the parameters of the acoustic variables.

In the present study, parameter mapping with an interactive element was used, as it best suited the data and the software that the sonification was to be integrated with. By using parameter mapping, the planets can be compared by letting their data affect the acoustic variables of the sonification. OpenSpace is an interactive software, which enables the sonification to follow the focus of the visualization and be played at the same time speed. The GUI can also be used to let the audience navigate through the different sounds.

2.2.3 Types of Sounds

There are different techniques that can be used to create a variety of sounds for a sonification, ranging from simple audio clips to fully synthesized sounds. Using the right type of sound can help the audience understand the data better.

Auditory icons are symbolic audio clips that can be a straightforward way of representing certain types of data [11]. Just like visual icons, auditory icons are often used in the interfaces of computer software, and often refer to real-life sounds to be as intuitive as possible. One prime example is the sound of crumpling a piece of paper to indicate that a document or other file has been deleted. Auditory icons can also take a parametric approach, where the sound changes depending on the data. A size parameter can, for example, determine the pitch of an audio clip of a bouncing ball to convey the perceived size of an object.

For continuous values, synthesized sounds can be used to follow the data closely, as they can easily be manipulated. An initial approach can be to use pure tones, such as sine waves, to create a fundamental sound. Several pure tones can then be combined to create a richer sound, using a technique called additive synthesis; the combined tones contribute harmonics and overtones, which is perceived as a richer sound. Conversely, subtractive synthesis can be used to remove features from an already rich sound. White noise can, for example, be filtered to output only a single tone, which creates yet other types of sounds.
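The idea of additive synthesis can be sketched as a sum of sine partials at integer multiples of a fundamental frequency. The amplitude recipes below follow the classic Fourier-series approximations of a square wave (odd harmonics, 1/n amplitudes) and a sawtooth wave (all harmonics, 1/n amplitudes); the names are illustrative.

```python
import math

def additive_wave(t, f0, harmonics):
    """Sum sine partials at integer multiples of the fundamental f0.
    `harmonics` maps harmonic number -> relative amplitude."""
    return sum(a * math.sin(2 * math.pi * n * f0 * t)
               for n, a in harmonics.items())

# Classic additive recipes (relative amplitudes 1/n):
square_like   = {n: 1 / n for n in range(1, 20, 2)}   # odd harmonics only
sawtooth_like = {n: 1 / n for n in range(1, 20)}      # all harmonics
```

Because the sawtooth recipe contains roughly twice as many partials in the same bandwidth, it sounds noticeably richer and sharper than the square recipe, which matches the timbre discussion later in this chapter.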


2.2.4 Sound Parameters

Sonification using parameter mapping involves using the different parameters of a sound. The study of how humans perceive sound is known as psychoacoustics [13], and this knowledge can be used to manipulate sound parameters to convey information to the audience.

Pitch is the perceived frequency of a sound: a higher frequency is judged as a higher-pitched sound. Human hearing perceives pitch logarithmically with respect to frequency, within a range of about 20 Hz to 20 000 Hz. Pitch can be divided into tones, which can be played together to create chords, and it is one of the most common parameters to manipulate in sonification [9]. Pitch is often mapped to the size of an object [9], where a bigger object corresponds to a lower frequency and a smaller object to a higher frequency. This follows the results of an experiment by Walker [26], where a negative polarity mapping was considered most suitable when conveying the size of an object with pitch.

Loudness is the perceived volume of a sound, which is changed by altering the amplitude of the sound waves. Loudness affects how close an object is perceived to be: by decreasing the loudness, an object can be made to seem farther away [9]. This effect can be amplified by adding room characteristics, such as reverb, to place the object in a room and create a sense of distance.

Tempo is the pace of several consecutive sounds or tones and is often manipulated to convey temporal values [9]. It is measured in beats per minute and often influences the perceived energy of a sound. A fast tempo can be perceived as more energetic, while a slow tempo can instead invoke calmness [9]. If a tempo is too slow, however, the audience will no longer perceive it as a beat, and if it is too fast it will instead create new harmonics.

The timbre of a sound describes its character [13], which is based on how many harmonics and overtones are present and how they relate to the fundamental. Timbre determines how soft or sharp a sound is and makes it possible to identify what the sound originated from. A simple example of timbre is the different harmonic content of a sawtooth wave compared to a square wave. A square wave consists only of odd harmonics, creating a relatively mellow timbre, while a sawtooth wave contains both even and odd harmonics, which results in a sharper timbre.

2.3 Related Work

Sonification is used in many areas and applications, often as a scientific tool to make more sense of data. It can also be used to increase the scientific outreach of a visualization, making it more accessible and understandable.

2.3.1 Sonifications with Visual Component

One of the more common use cases for sonification is to complement an existing visualization. One example is in the area of chemistry, where Rau et al. [19] used sonification to indicate certain events in a molecular simulation. The sonification was useful for observing events in the simulation, even when they were occluded or outside the view. Spatial sound was used to position the events in the simulation using Head-Related Transfer Functions (HRTF) with headphones. The study shows that spatial sound can be particularly useful when an event occurs outside of the visual view.

For sonifications in astronomy, temporal activities are often highlighted to make use of this advantage of human hearing. SYSTEM sounds2 showcases the orbital resonance of the three inner moons of Jupiter, which are locked in a 4:2:1 ratio. This is done by signaling each moon's orbital period with a specific drum sound, where the pitch is determined by the orbital period. A similar method is used in Whitevinyl's Solarbeat application3, where each planet of the Solar System emits a sound after each orbital period.

2SYSTEM sounds sonification of Jupiter's moons: http://www.system-sounds.com/jupiters-moons/

2.3.2 Sonifications in Planetariums

As mentioned in section 1.2, there are a couple of examples where sonification has been used in planetariums. Much inspiration was drawn from these projects, as they had goals and a target platform similar to the present study.

Quinton et al. [18] sonified the planets of the Solar System with a focus on testing and demonstrating the sonification for the end user. A planetarium representative was initially interviewed to grade the importance of each planetary parameter; the parameters of most importance were density, diameter, gravity, length of day, length of year, mean temperature and orbital velocity. These parameters were mapped to pitch, loudness, tempo and timbre, among others. An evaluation was conducted with 12 participants, who were tasked with discerning properties of each planet by listening to the sonification. The evaluation showed promising results, stating that the participants could discern several characteristics of the planets. Quinton et al. stated that the use of surround sound could be especially effective for sonifying the orbits of the planets, as the planets could orbit around the audience. They also suggested that a sonification of the Solar System could act as a scientific tool for building a comparative model against other exo-solar systems. This work had no visual component, but it was stated that the sonification would be enhanced if one were present.

The second planetarium example is the work done by Tomlinson et al. [25], which shares many aspects with the report by Quinton et al. [18]. For example, a similar set of important parameters was arrived at by interviewing astronomy teachers across different class levels, and mappings similar to those of Quinton et al. were used to sonify the data. The sonification was divided into two views: the Solar System view and the planetary view. An evaluation was done in a planetarium using the available quadraphonic speaker system, with images of the planets shown during the demonstration for visual context. The evaluation included a survey of five questions about how interesting, pleasant, helpful and relatable the sonification was. The variety of questions shows that it is not enough for a sonification to increase understanding of something; it also needs to be pleasant to listen to. The survey was later developed into an audio user experience scale called BUZZ [24]. The results showed that their sonification managed to relay information to the audience while still being pleasant and interesting.

An important difference between these works lies in their sound design. Tomlinson et al. [25] can be perceived as having a more concrete sound design, focusing on straightforward mappings, while Quinton et al. [18] take a more abstract and musical approach. Both approaches have their pros and cons: straightforward mappings give a more intuitive and informative experience, while a musical approach gives a more pleasant and immersive experience. The aim of the present study was to land somewhere in between, being concrete enough to be intuitive while still presenting a pleasant soundscape to create immersion.

2.4 SuperCollider

To create and manipulate a sonification depending on external data in real time, an audio synthesizer was needed. SuperCollider4 is a code environment that enables real-time audio synthesis and algorithmic composition, making it well suited for sonification [17]. The SuperCollider environment consists of a server (scsynth) and a client (sclang) [17], where the server contains the tools for real-time audio synthesis, achieved through unit generators (UGens) that generate audio. A UGen can be as simple as generating a sine wave or as complex as modeling the impact of a bouncing ball. A combination of UGens and effects, such as filters, can be composited to create a richer sound. This is all combined in an object called a SynthDef, which works similarly to a programming function: variables can be used within UGens to create a dynamic sound by manipulating its sound parameters in real time, and arguments can be used to access data from outside the SynthDef. This makes it possible to create different versions of a SynthDef, called synths. In this project, each planet had its own synth to reflect the different properties of the planets. The server and client in SuperCollider communicate via OSC, which can also be used to transfer data to SuperCollider from external applications such as OpenSpace.

3Whitevinyls Solarbeat sonification of the Solar System: http://www.whitevinyldesign.com/solarbeat/

4SuperCollider: https://supercollider.github.io/

2.5 Open Sound Control

Open Sound Control5 (OSC) is a communication protocol used by computers, synthesizers and controllers to communicate with each other. It is transport-independent, meaning that it can send information between devices over any transport layer, such as UDP or TCP [27]. OSC can send many arbitrary messages at the same time in the form of bundles. To organize the messages, OSC uses an addressing system where every message is given an address that labels what data the message contains [28]. In this way, the data for each planet of the Solar System can be sent in separate messages with different addresses, reducing the risk of the data being mixed up.
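The wire format behind this addressing scheme is simple: an OSC message is the address string, a type tag string, and the binary arguments, each padded to a 4-byte boundary. The sketch below hand-encodes a float-carrying message per the OSC 1.0 specification; the address `/earth/distance` is a hypothetical example, not the address scheme used in the thesis, and a real project would normally use an OSC library.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments.
    (Illustrative encoder, for understanding the format only.)"""
    packet = osc_pad(address.encode("ascii"))                     # address pattern
    packet += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for value in floats:
        packet += struct.pack(">f", value)                        # big-endian float32
    return packet

msg = osc_message("/earth/distance", 1.0)  # 24 bytes, always a multiple of 4
```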

5Open Sound Control: http://opensoundcontrol.org/


3 Method and Implementation

The sonification was developed by identifying important concepts in the Solar System, which were scaled and mapped to different sound parameters. To integrate the sonification into OpenSpace, data such as distances and angles between the camera and the planets were extracted. This data was processed and converted to a suitable format before being sent to SuperCollider using the Open Sound Control protocol. SuperCollider then used the data to control the sonification depending on the state of OpenSpace. User tests were conducted to evaluate the sonification, and the feedback was used to make improvements. This was done iteratively two times before the final sonification was created.

3.1 OpenSpace Views

The aim of the sonification was to enhance the experience of using OpenSpace and convey more information to the audience through sound. However, it was not possible to present all the information at once, since there are eight different planets to convey. The environment of OpenSpace was therefore divided into three different views to convey information at different levels of detail, and the sonification was developed with these views in mind.

3.1.1 Solar System View

The Solar System view was intended to give an overview of the Solar System, conveying simple information through sound about all the planets at the same time. To do this, the camera was positioned to give a top-down view of the Solar System, see Figure 3.1. This meant that all planets were visible and moved around the Sun in a counter-clockwise orbit. The sounds were then spatially positioned according to the planets' positions on the screen. This is especially fitting in the Dome theater, where the orbits can be fully visible in 360 degrees above the audience, and accurate spatial positioning is possible using a 7.1 surround system.


Figure 3.1: Screenshot of OpenSpace in the Solar System view, showing the inner planets.

3.1.2 Planetary View

The planetary view was intended to be a close-up view of one planet at a time, where more sonifications could be used to convey additional information about the specific planet, see Figure 3.2. Similarly to the Solar System view, moons orbiting a focused planet would create a miniature Solar System view at a certain distance, conveying information about the planet's moons through sonification. The sound level of the planet would also change depending on the distance to the planet, reinforcing the sense that the sounds were coming from the planet itself.


Figure 3.2: Screenshot of OpenSpace in the planetary view, displaying Earth.

3.1.3 Compare View

The compare view was intended to be a hybrid of the views mentioned above, combining the view of the Solar System with the sonifications of the planetary view. The motivation behind this view was that the planetary view does not allow planets to be compared directly, and because the sonification relies heavily on the relations between the planets, such a view was considered useful. The compare view created a way to listen to the sonifications of two planets at the same time. To make it more visible which planets were being compared, the selected planets were highlighted by increasing their size, see Figure 3.3.

Figure 3.3: Screenshot of OpenSpace in the compare view, where Earth and Mars are compared.


3.2 Sonification

The development of the sonification began by identifying important parameters and data of the Solar System to sonify. These parameters then needed to be scaled so that a fair comparison between the planets could be made. The sound design was then developed to create suitable sounds to represent each parameter, and an architecture was built up to manage the sonification easily. Finally, the sonification was prepared for the Dome theater by implementing surround sound and rerouting the channels to the correct speakers.

3.2.1 Identifying Important Concepts

The sonification design process began by studying which planet parameters were suitable to sonify. Informal interviews with OpenSpace developers were conducted to identify important concepts that could be sonified from OpenSpace. Most concepts were however obtained from related work [18, 25], where a planetarium representative and astronomy teachers were interviewed about which aspects of astronomy were the hardest to teach. In general, it was concluded that the differences between the planets of the Solar System were of most interest. Specifically, the properties of most importance were mass, density, diameter, gravity, length of day, length of year and temperature. Some of these properties are listed with their respective values for each planet in Table 3.1, with data from NASA's planetary fact sheets1. Parameters that were already visualized in an informative way in OpenSpace, such as the planets' distance from the Sun and their orbital eccentricity, were given lower priority in the sonification. Once the concepts were identified, the process continued with determining what the different properties would map to in the sonification.

Table 3.1: Planet properties for the eight planets. Temperature is given in Kelvin (K); the other values are given as ratios relative to Earth.

          Diameter   Gravity   Length of Day   Length of Year   Temperature (K)
Mercury   0.383      0.378     175.9           0.241            100 to 700
Venus     0.949      0.907     116.8           0.615            735
Earth     1          1         1               1                185 to 331
Mars      0.532      0.377     1.03            1.88             120 to 293
Jupiter   11.21      2.36      0.414           11.9             163
Saturn    9.45       0.916     0.444           29.4             133
Uranus    4.01       0.889     0.718           83.7             78
Neptune   3.88       1.12      0.671           163.7            73

3.2.2 Implementation of Sonifications

Implementing a sonification using parameter mapping required consideration of mapping, scaling and sound design. The mappings of the sonification parameters to the data properties were experimented with in the early stages of the project. Common mappings that had previously been used in other sonifications [9] were used as a starting point, while new mappings were also experimented with, drawing inspiration from related work [25, 18]. All of this led to the final mappings shown in Table 3.2.

A problem with the parameters of the planets was the large difference between values, since the planets of the Solar System vary greatly across many parameters. It was however important that all planets could be compared within a reasonable scale to appreciate the differences, so scaling was used to make the data easier to compare. Following that, the sound design was developed to make the sonification as intuitive as possible, while also being immersive and pleasant to listen to. All the planets shared the same kind of sound design, but with different parameters to represent their differences.

1NASA's planetary fact sheets: https://solarsystem.nasa.gov/planet-compare/ and https://nssdc.gsfc.nasa.gov/planetary/factsheet/index.html

In the following paragraphs, each planet property is explained with respect to its mapping, scaling and sound design. The sonifications can be listened to here: https://www.youtube.com/watch?v=JfPtZn2fgYs.

Table 3.2: List of planet properties and their respective sonification mappings.

Planet Property             Sonification Mapping
Mass and diameter           Pitch
Type of planet (density)    Type of waveform (timbre)
Length of day               Rate of oscillation
Length of year              Spatial positioning
Gravity                     Bouncing ball
Atmosphere                  Wind depth
Wind speed                  Wind intensity
Temperature                 Density of grating sound

Mass and Diameter

Mass and diameter were conveyed through the pitch of the planet's fundamental sound, such that a bigger planet had a lower pitch than a smaller one, in accordance with the polarity mappings of Walker [26]. To determine the specific pitch of each planet, an interval of frequencies was first decided. The pitch intervals of related works were considered: Quinton et al. [18] used a two-octave range (C2-C4), while Tomlinson et al. [25] used a higher range of approximately six octaves. An intermediate option was chosen, a three-octave range (C2-C5). The lowest octave was dedicated to the gas and ice giants, and the higher octave to the terrestrial planets. This created a gap of almost one octave between the planet types, representing the difference in size between the inner and outer planets.

The mass and the diameter of each planet were considered when placing each planet within its respective octave, and a balance between accuracy and musicality had to be struck. The eight planets can be seen as four pairs of planets of similar size, and the planets of each pair were therefore placed within two semitones of each other. Placing some pairs even closer would make sense scale-wise, considering their similarity in size, but for musicality the two-semitone difference was preferred. The aim of the sonification was not to create an exact representation of the data, but to increase the understanding of it; a sonification without musical elements could be perceived as less appealing and would therefore be less useful. The resulting tones of the planets can be seen in Figure 3.4.
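The equal-tempered pitch placement can be sketched numerically: each semitone multiplies frequency by 2^(1/12), so the C2-C5 range spans a factor of eight in frequency. This is an illustrative helper using MIDI note numbers, not the exact tone assignments of Figure 3.4.

```python
A4 = 440.0  # reference tuning frequency

def midi_to_freq(note: int) -> float:
    """Equal temperament: each semitone step is a factor of 2**(1/12),
    with MIDI note 69 (A4) anchored at 440 Hz."""
    return A4 * 2 ** ((note - 69) / 12)

C2, C5 = 36, 72  # MIDI numbers bounding the three-octave range C2-C5
midi_to_freq(C5) / midi_to_freq(C2)  # → 8.0 (three octaves)
```

Two planets placed two semitones apart thus differ by a frequency ratio of 2^(2/12) ≈ 1.12, a clearly audible but musically consonant interval (a whole tone).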


Figure 3.4: The pitches of each planet, shown on a piano with 61 keys.

The type of waveform used to sonify the size of the planet depended on the type of the planet: a sawtooth wave was used for the terrestrial planets, and a square wave for the gas and ice giants. The main difference between these waveforms is that the sawtooth wave creates more overtones than the square wave. This gave the terrestrial planets a sharper timbre, representing their higher density, while the gas and ice giants obtained a more mellow timbre, representing their lower density. Additionally, a phaser effect was layered on the sound to indicate whether a planet has a global magnetic field.

Length of Day

To convey the length of day of a planet, the analogy stated by Tomlinson et al. [25] was used, where the brightness of daytime is mapped to the rate of an oscillator. This created an increase in sound level following the sunrise at a certain position on the planet, and a decrease following the sunset.

The oscillation was applied to the fundamental sound of the planet by modulating the cutoff frequency of a resonant low-pass filter, where the speed of the modulation depended on the planet's length of day. This created an oscillation of the sound level, but also of the amount of high-frequency content in the sound. A small modulation of the planet's pitch was also added to create a slight Doppler effect, which made the sound more dynamic.

One of the more important values to scale, shared by all the planets, was time. It was important that the sonification was interesting and informative to listen to in real time, while not being so fast that information was lost. A default timescale of 24 hours per second (1 Earth day per second) was chosen to account for this, which is also what was used in the sonification by Tomlinson et al. [25]. The time speed could later be changed by changing the time speed in OpenSpace. For example, with a time speed of 1 day/second, the length of day of Earth is represented by an oscillator with a speed of 1 hertz.
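The relation between time speed and oscillator rate can be sketched as a small helper. The function name is illustrative; the length-of-day ratios are those of Table 3.1.

```python
def day_oscillator_hz(length_of_day_ratio: float,
                      time_speed_days_per_sec: float = 1.0) -> float:
    """Oscillator rate in Hz for one day/night cycle, given the planet's
    length of day as a ratio of Earth's and the simulation time speed."""
    return time_speed_days_per_sec / length_of_day_ratio

day_oscillator_hz(1.0)    # Earth → 1.0 Hz at the default time speed
day_oscillator_hz(175.9)  # Mercury → ~0.0057 Hz (one cycle per ~3 minutes)
```

Speeding up OpenSpace's simulation clock simply scales every planet's oscillator by the same factor, preserving their relative day lengths.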

Length of Year

Length of year was conveyed through the spatial position of the planet. Using the positional data extracted from OpenSpace (see subsection 3.3.3), a directional sound was created that followed the actual position of the planet in the software. A planet with a shorter length of year revolves faster around the audience, and vice versa. With the same time speed as for length of day, 1 day/second, Earth takes 365 seconds to revolve around the audience. If a planet had moons, they were positioned in a similar way in the planetary view. The surround sound implementation is covered in subsection 3.2.4.
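As a sketch of the idea, a planet's angular position around the listener can be derived from elapsed playback time and its orbital period (names and the circular-orbit simplification are illustrative; the thesis uses actual positions from OpenSpace):

```python
import math

def orbit_angle_rad(elapsed_sec: float, length_of_year_days: float,
                    time_speed_days_per_sec: float = 1.0) -> float:
    """Angle around the listener (radians) after `elapsed_sec` of playback,
    assuming a circular orbit traversed at constant speed."""
    days = elapsed_sec * time_speed_days_per_sec
    return (days / length_of_year_days) * 2 * math.pi % (2 * math.pi)

orbit_angle_rad(365.0, 365.0)  # Earth after 365 s → 0.0 (back at the start)
```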


Gravity

The sonification of gravity was inspired by Tomlinson et al. [25], where a bouncing ball was used to convey gravity. This works as a parametric auditory icon: on a planet with low gravity, the ball bounces less frequently and for a longer time. Because gravity is related to the mass of the planet, the pitch of the ball was also higher for a planet with lower gravity. The sound of the ball was created using the built-in UGen TBall, which models the impact of a bouncing ball. The UGen has a gravity parameter, which was set in ratio to Earth for all the planets. Panning was also applied to the ball sound so that the ball would be perceived as bouncing sideways, creating a wider stereo image. The ball was dropped every seventh second for every planet, as this was a fitting time interval for all the ball bounces of each planet to fit within. It also worked as a time indicator: at the default time speed of 1 day/second, seven seconds represent a week on Earth.

To scale the gravity of the planets, the unit used was the gravity of Earth, denoted g. Because this unit is relative to Earth, Earth gets a value of 1, and the other planets become ratios of Earth's gravity.
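The timing of such a parametric auditory icon can be sketched with elementary ballistics: a ball dropped from height h on a planet with surface gravity g (in m/s²) first hits after sqrt(2h/g), and with a restitution coefficient e each subsequent bounce interval shrinks by a factor e. This is an illustrative model of why lower gravity stretches the bounce pattern, not the internals of SuperCollider's TBall.

```python
import math

def bounce_times(g: float, height: float = 1.0,
                 restitution: float = 0.8, n: int = 5) -> list[float]:
    """Impact times (seconds) of the first n bounces of a dropped ball."""
    t = math.sqrt(2 * height / g)                        # first impact
    times = [t]
    interval = 2 * math.sqrt(2 * height / g) * restitution
    for _ in range(n - 1):
        t += interval
        times.append(t)
        interval *= restitution                          # each bounce shorter by e
    return times

# Lower gravity (e.g. Mars at ~0.38 g) spreads the bounces out in time:
bounce_times(9.81)[0] < bounce_times(3.71)[0]  # → True
```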

Atmosphere and Wind

Atmosphere and wind speed were conveyed with a sound resembling wind. The depth of the wind corresponded to the density of the atmosphere, while the intensity of the wind corresponded to the average wind speed of the planet. The wind sound was created by passing noise through a low-pass filter whose cutoff frequency was swept randomly. The type of noise depended on the type of planet: pink noise was used for the inner planets and brown noise for the outer planets. Brown noise has a steeper fall-off in sound level with increasing frequency than pink noise, which created a deeper wind, reflecting the thick atmospheres of the outer planets. Additionally, if a planet does not have a defined atmosphere (like Mercury), the sound was not played.

The density of a planet's atmosphere was determined mainly from the planet's surface pressure. However, this value varied greatly between the planets and does not apply to the outer planets, since they lack a defined surface. Instead of mapping to the actual values, the planets were ranked using these values and their planet type: a planet with higher surface pressure got a deeper wind sound, but not one directly mapped to its values. A similar strategy was used for wind speed, since no definitive data source could be found for the wind speeds of all the planets. It was also decided to use the wind speed of the overall atmosphere, and not necessarily that of the planet's surface. Venus, for example, has winds of up to 100 m/s in its atmosphere, but these decrease to just 3 m/s at the surface.

Temperature

Temperature was conveyed as the density of a grating sound, which could be likened to a frying sound, fire crackling, or the sound of a Geiger counter. A higher temperature resulted in a higher number of impulses, creating more noise. The lowest and highest temperatures of a planet created an interval that was swept through at the speed of the planet's length of day. A low-pass filter was also driven at the speed of the length of day to highlight the day (highest) and night (lowest) temperatures of the planet. The sound was created using the UGen Dust in SuperCollider, which produces random impulses whose density increases with higher temperature.

Kelvin was used as the unit for temperature, mainly to avoid negative values, which would not be suitable to map to the density of the impulses. The inner planets were given a temperature range, since they had the most accurate data, while the mean temperature was used for the outer planets. The temperatures of Uranus and Neptune were also clamped to the lowest temperature of Mercury to create a smaller interval of values between the planets. All the temperature values were then scaled down so that the planets with the lowest temperature received a value of 1. This meant that the coldest planets would output a sound only once per second on average.
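The clamping and scaling described above can be sketched as a small helper, using Mercury's lowest temperature (100 K, from Table 3.1) as the floor. The function name is illustrative, not taken from the thesis implementation.

```python
MERCURY_LOW_K = 100.0  # coldest Mercury temperature (Table 3.1), used as the floor

def impulse_density(temp_k: float) -> float:
    """Clamp a Kelvin temperature to Mercury's lowest value and scale
    so that the floor maps to 1 impulse per second on average."""
    return max(temp_k, MERCURY_LOW_K) / MERCURY_LOW_K

impulse_density(73.0)   # Neptune → 1.0 (clamped to the floor)
impulse_density(735.0)  # Venus   → 7.35 impulses per second
```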

Rings

Rings were sonified by letting pure tones, fluctuating in frequency, play around the audience, where the number of pure tones represented how many ring groups were present. Because ring systems have no recorded mass or unified position, the sonification works more as an indication of the presence of rings. Since only Saturn's rings are visible in OpenSpace, only Saturn was given this sound, even though all the outer planets have rings to some degree.

Solar System View

The Solar System view included the sonifications of mass, length of day and length of year of each planet. The sound design was however simplified to enable playing more planets at the same time. This was done by passing brown noise through a band-pass filter whose center frequency was set according to the size of the planet. The pitch was doubled for all the planets, both to signify that the planets appear smaller on the screen, and to better suit the brown noise. Instead of using a sweeping low-pass filter to convey the length of day, a pulse oscillator was used to simplify the sound. This type of sound was also used for the moons of the planets.

3.2.3 Sound Architecture

To create the sonification, an architecture was built in SuperCollider to make sure that the data for all the planets was processed in the same way but with different parameters. An overview of this structure can be seen in Figure 3.5. The data for each planet went through the same SynthDef, only with different values, creating an environment where the planets could be compared to each other as they originated from the same kind of sound generators. This created several instances of synths that were grouped together according to which planet they belonged to. Grouping the sounds made it easier to control all of the sounds of a specific planet. One way of controlling the sounds was to turn them on and off, which would be controlled by the GUI of OpenSpace (see section 3.5). This was done using a gate function, where a binary signal determined if the sound should be played or not, similar to a key being pressed down or released on a keyboard. An envelope, i.e. the contour of the amplitude, was also used together with the gate function to create a fade effect when starting and stopping the sound, i.e. the attack and release times.
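The gate-plus-envelope behaviour can be sketched outside SuperCollider as a control-rate process. The attack and release times and the control rate below are illustrative assumptions, not the values used in the thesis.

```python
# Sketch of the gate function with an attack/release envelope described
# above, in Python rather than SuperCollider. Times and the control rate
# are assumed values for illustration.

class GatedEnvelope:
    """Linear fade toward 1.0 while the gate is open, toward 0.0 when closed."""

    def __init__(self, attack_s=0.5, release_s=1.0, control_rate=100):
        self.attack_step = 1.0 / (attack_s * control_rate)
        self.release_step = 1.0 / (release_s * control_rate)
        self.level = 0.0

    def tick(self, gate):
        # gate is a binary signal: 1 = sound on (key down), 0 = off (key up)
        if gate:
            self.level = min(1.0, self.level + self.attack_step)
        else:
            self.level = max(0.0, self.level - self.release_step)
        return self.level
```

Multiplying a synth's output by `level` gives the fade-in/fade-out effect when the GUI toggles a sound.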

Figure 3.5: An overview of the structure of the sonification.

At the end of the SuperCollider pipeline, effects that were meant to affect all the sounds in the sonification were implemented. This included a reverb effect, which created a reverberation after every sound. This simulated how a sound would resonate in a room of a certain size, depending on the length of the reverberation time. Having the same reverberation time on every sound created a sense that all the sounds exist in the same space, which makes it easier to relate to them. The reverb was created by duplicating the original sound of all sound channels, delaying the duplicates randomly by a small margin, and lastly applying a reverb filter.

3.2.4 Surround in the Dome

SuperCollider offers native support for surround sound. By default, the first eight output channels are reserved for the eight speakers of a 7.1 surround sound setup. The order of these speakers and their ideal placement is illustrated in Figure 3.6.

Figure 3.6: Ideal surround sound placement. Note that the LFE component does not emit directional sound and can be placed more freely.

Positioning a sound is called panning. For a stereo setup in SuperCollider, a value of 0 means that the sound is only played in the left speaker, a value of 1 plays the sound in the right speaker, and a value between 0 and 1 positions the sound somewhere between the left and right speakers. To extend this concept to a surround setup, the placement of the speakers is expressed as an angle within a circle, also known as the azimuth. However, the panning functions of SuperCollider use the channels in their numerical order shown in Figure 3.6, so the channel order does not increase proportionally to the azimuth angle. To correct this, rerouting had to be done to reorder the channels into their circular order. Positional data from OpenSpace (see subsection 3.3.3) could then be used to create the desired panning.
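Azimuth panning over a ring of speakers can be sketched as an equal-power crossfade between the two nearest speakers in circular order. This is an illustrative stand-in for the panning done in SuperCollider, not the thesis implementation; the gains are returned in the reordered (circular) channel order.

```python
import math

# Sketch of azimuth-based panning over a ring of 7 directional speakers
# (the LFE channel is handled separately). Equal-power crossfading
# between the two nearest speakers is an assumed technique here.

def pan_gains(azimuth, n=7):
    """Return gains in circular speaker order for an azimuth in [-pi, pi].

    The azimuth is mapped onto a position on the speaker ring, and an
    equal-power crossfade is applied between the two nearest speakers.
    """
    pos = (azimuth % (2 * math.pi)) / (2 * math.pi) * n  # position on ring
    lo = int(pos) % n
    hi = (lo + 1) % n
    frac = pos - int(pos)
    gains = [0.0] * n
    gains[lo] = math.cos(frac * math.pi / 2)  # equal-power crossfade
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains
```

Equal-power panning keeps the summed acoustic power constant (the squared gains always sum to 1), so a sound does not dip in loudness while moving between speakers.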

The third channel in SuperCollider represents the low frequency emitter (LFE), or subwoofer. Because this channel only handles low frequencies, which are not perceived as directional, it did not need to be part of the panning function. Instead, this channel received a weighted sum of all channels, which was natively low-pass filtered to suit the frequency range of the subwoofer.

To monitor the surround sound during development, a conventional stereo setup would not be enough. An external sound card was initially used to verify that surround sound was outputted outside of SuperCollider. There were however difficulties with acquiring the equipment for a surround speaker setup. Instead, virtual surround software was used to be able to output all eight channels, albeit only virtually through two headphone speakers. The software that was used was an older version of the headphone virtualization software Razer Surround2, which enabled any stereo headphones to output virtual surround.

3.3 Integration with OpenSpace

To integrate the sonification with OpenSpace, the positional and temporal data of OpenSpace needed to be used in the sonification to match what was shown on screen. How far away a planet was from the camera should affect how loud the sonification would be, and where the planet was on screen should affect what direction the sound would come from. To accomplish this, data needed to be extracted from OpenSpace and converted to be compatible with SuperCollider.

3.3.1 Extraction Method

The structure of OpenSpace allows for different approaches to extracting data from the software, which were explored to find the most suitable one. In the beginning of the project an attempt was made to create an external C++ application that would extract data from OpenSpace. This application would be connected to OpenSpace with a network socket and communicate via JSON messages. However, it was discovered that OpenSpace received the messages but then ignored them. Therefore, it was decided to work more closely with the OpenSpace software and create a new module. This new sonification module started a new thread when it was initialized by the OpenSpace core. The thread was in charge of monitoring the data and sending it directly to SuperCollider when needed. When the module was de-initialized, a message was sent to SuperCollider to stop the sonification and the thread was destroyed.

3.3.2 The Data

It was initially intended that all the data needed for the sonification was going to be extracted from OpenSpace. However, since some data for the planets does not change over time (for example mass, density and gravity), it would be unnecessary to extract it from a dynamic software like OpenSpace. Instead, the static data was acquired from the NASA planetary fact sheets3 and used in SuperCollider as constants.

The dynamic data of interest for the sonification was the positions of the camera, the planets and their moons. These positions could be used to calculate the distances and angles from the camera to the planets. OpenSpace uses a scene graph to keep track of the objects in the scene, and every object is represented in the scene graph by one or more nodes. The positions were acquired by searching the scene graph for the corresponding node and extracting its world position.

3.3.3 Distances and Angles

Surround panning was used in the sonification to spatially position the planets around the audience, as previously mentioned in subsection 3.2.4. This required the angle between the camera and the planets, which was calculated in steps, see Figure 3.7 and Equation 3.1. First the vector from the camera to the planet was calculated, giving $\vec{CP}$. This vector was then projected onto the horizontal camera plane, giving $\vec{CP}_{CH.plane}$. The angle $\theta$ between the projected vector and the forward vector of the camera $\vec{C}_{forward}$ was calculated using the function orientedAngle4 from the library glm, using the camera's up-vector $\vec{C}_{up}$ as the reference axis. The oriented angle was used instead of the absolute angle since the surround system had to be able to distinguish between left and right. This angle was then sent to SuperCollider in radians, ranging from $-\pi$ to $\pi$.

2 Razer virtual 7.1 surround sound: https://www.razer.com/7.1-surround-sound

3 NASA's planetary fact sheets: https://solarsystem.nasa.gov/planet-compare/ and https://nssdc.gsfc.nasa.gov/planetary/factsheet/index.html

Figure 3.7: The calculation of the angles for planets relative to the camera.

$$\begin{aligned}
\vec{CP} &= \vec{P} - \vec{C} \\
\vec{CP}_{CH.plane} &= \vec{CP} - \vec{CP}_{C_{up}} \\
\theta &= \text{glm::orientedAngle}\big(\vec{C}_{forward},\ \vec{CP}_{CH.plane},\ \vec{C}_{up}\big)
\end{aligned} \tag{3.1}$$
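The steps of Equation 3.1 can be sketched in plain Python instead of glm: subtract the component of the camera-to-planet vector along the up-vector, then take the signed angle against the forward vector. The helper names are mine; the up-vector is assumed to be unit length.

```python
import math

# Sketch of Equation 3.1 without glm. Vectors are 3-tuples; the signed
# angle is positive counter-clockwise around the camera's up-vector,
# matching the behaviour described for glm::orientedAngle.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def scale(a, s):
    return tuple(x * s for x in a)

def norm(a):
    return scale(a, 1.0 / math.sqrt(dot(a, a)))

def planet_azimuth(camera, planet, cam_forward, cam_up):
    cp = sub(planet, camera)                          # camera -> planet
    cp_h = sub(cp, scale(cam_up, dot(cp, cam_up)))    # drop the up-component
    a, b = norm(cam_forward), norm(cp_h)
    # signed angle in [-pi, pi] around cam_up
    return math.atan2(dot(cross(a, b), cam_up), dot(a, b))
```

With forward `(0, 0, -1)` and up `(0, 1, 0)`, a planet to the right of the view yields a negative angle, which is what lets the surround system distinguish left from right.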

For other situations the sonification required the angles to be calculated differently, depending on whether the sonification was supposed to be perceived spatially from the camera or from an object inside OpenSpace. One such case is when the angles to the moons are to be perceived from the planet they orbit instead of from the camera. If the angles were perceived from the camera, the audience would perceive the moons as emitting sounds from where they are on the screen. However, if the angles are instead perceived from the planet, the audience would hear the moons orbit around them. Since this was the wanted outcome, the angles were in this case calculated by first calculating the vector from the planet to the moon, $\vec{PM}$. This vector was then projected onto the horizontal camera plane, giving $\vec{PM}_{CH.plane}$. The angle $\varphi$ between the forward vector of the camera $\vec{C}_{forward}$ and the projected vector was then calculated in the same way as previously, see Equation 3.2. This method was also used to calculate the angle from the Sun to the planets with respect to the camera. That case could however be simplified since the origin of the world coordinate system is placed in the Sun.

$$\begin{aligned}
\vec{PM} &= \vec{M} - \vec{P} \\
\vec{PM}_{CH.plane} &= \vec{PM} - \vec{PM}_{C_{up}} \\
\varphi &= \text{glm::orientedAngle}\big(\vec{C}_{forward},\ \vec{PM}_{CH.plane},\ \vec{C}_{up}\big)
\end{aligned} \tag{3.2}$$

The sound level $S_P$ of the sonification for each planet was mapped to the distance between the camera and the planet to create an immersive element. The distance was calculated as the length of the vector between these objects. The rate at which the sound level decreased with distance was determined by Equation 3.3, or, if the planet did not have any moons, by Equation 3.4.

4 glm::orientedAngle documentation: https://glm.g-truc.net/0.9.4/api/a00210.html, source: https://github.com/g-truc/glm/blob/master/glm/gtx/vector_angle.inl

$$k = \frac{0.5 - 1}{d_{moon} - d_{close}}, \qquad S_P = e^{a \cdot k \cdot d} \cdot b \tag{3.3}$$

$$S_P = \frac{3 \cdot I}{d} \tag{3.4}$$

In Equation 3.3 and Equation 3.4 the diameter of the planet is denoted $I$, $d$ is the distance from the camera to the planet, and the values of the coefficients $a$ and $b$ were used to adjust the equation for the different planets. The equation depended on three distances for each planet. $d_{close}$ was the distance at which the camera was considered close to the planet, corresponding to the highest sound level of the planet sonification. $d_{moon}$ was the distance at which the camera was approaching the orbits of the planet's moons, creating a transition in sound level between the planet and its moon system. Lastly, $d_0$ was the distance at which the sound level of the sonification would be almost inaudible because the planet would be too far away. Equation 3.3 created a sound level curve (see Figure 3.8) that was individual for each planet, since the planet's size, number of moons, and the distance from the planet to its moons affected the sound level. Before any sound level was sent to SuperCollider, the values were clamped between 0 and 1 to make sure unreasonably high sound levels would not be sent.

Figure 3.8: The black curve depicts how the sound level for the sonification of Earth changes depending on the distance to the planet. The Y-axis shows the sound level for the planet and the X-axis the distance to the planet in kilometers (km). The points where lines of the same color intersect mark values the curve was desired to pass close to.

It was also desired that the sound level of the moon sonification would be loudest when the planet sound level was half of its maximum, and fade away as the planet sonification became either louder or quieter. This was done to put more focus on the moons when they were more present on screen. Therefore, the sound level $S_M$ of the moon sonification was calculated separately with Equation 3.5, where $S_P$ is the sound level of the planet that the moons orbit.

$$S_M = -4(S_P - 0.5)^2 + 1 \tag{3.5}$$
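Equations 3.3 to 3.5 can be written out directly. The distances and coefficients passed to these functions are illustrative; the thesis tuned $a$ and $b$ per planet.

```python
import math

# Sketch of Equations 3.3-3.5. Any concrete distances and coefficients
# fed into these functions are assumed example values.

def planet_level(d, d_close, d_moon, a, b):
    """Eq. 3.3: exponential fall-off for planets that have moons, clamped to [0, 1]."""
    k = (0.5 - 1.0) / (d_moon - d_close)
    return min(1.0, max(0.0, math.exp(a * k * d) * b))

def planet_level_no_moons(diameter, d):
    """Eq. 3.4: inverse-distance level for moonless planets, clamped to [0, 1]."""
    return min(1.0, max(0.0, 3.0 * diameter / d))

def moon_level(sp):
    """Eq. 3.5: loudest when the planet level is 0.5, fading toward either extreme."""
    return max(0.0, -4.0 * (sp - 0.5) ** 2 + 1.0)
```

The parabola in `moon_level` peaks at 1 when $S_P = 0.5$ and reaches 0 at $S_P = 0$ and $S_P = 1$, giving the cross-fade between planet and moon system described above.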

3.3.4 Precision Error

As mentioned in section 2.1, the single-precision floating point errors in OpenSpace [2] caused the extracted positions to become unstable, which in turn caused the calculated angles to fluctuate as well. This problem became more prominent for the planets further away from the Sun and when time was simulated at a faster pace. It would cause unwanted sound artifacts when SuperCollider applied a new angle that differed substantially from the previous one. To solve this, a lag was used in SuperCollider which interpolated from the old value towards the new one, removing the errors that caused the unwanted sound. However, as mentioned earlier, the angles range from $-\pi$ to $\pi$, where both $-\pi$ and $\pi$ are positioned behind the audience. When the lag function receives the command to go from $-\pi$ to $\pi$, the mathematical solution is to interpolate through 0, which is in front of the audience. This causes the sound to pan around the front to get to the other side instead of crossing the gap directly. This problem was not solved and is further discussed in section 5.3.
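The artifact can be demonstrated with a naive interpolation, standing in for SuperCollider's lag: interpolating between two angles that are angularly close but sit on opposite sides of the $-\pi/\pi$ seam travels the long way through 0.

```python
import math

# Demonstration of the panning artifact described above. A plain linear
# interpolation (a stand-in for the SuperCollider lag) between angles
# near the rear gap sweeps through 0, i.e. across the front of the
# audience, even though the gap behind them is only 0.2 radians wide.

def lag_path(start, target, steps):
    """Naive linear interpolation from start to target, inclusive."""
    return [start + (target - start) * i / steps for i in range(steps + 1)]

path = lag_path(-math.pi + 0.1, math.pi - 0.1, 4)
# The midpoint of this path is 0.0: the sound pans around the front.
```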

3.3.5 Optimization

In games and computer graphics there is a method called level of detail (LOD) that saves rendering time by using the appropriate resolution of the 3D models in a scene depending on the distance to the camera [14]. Similarly, the data from OpenSpace was not sent continuously to SuperCollider, but only when it had changed from its previous value. For example, if the time speed in OpenSpace was real-time and the camera was not moving, the values would only change by a small margin, which made it unnecessary to send the same data again. However, since the environment of OpenSpace is large and some of the planets move faster than others, the distance could change more rapidly for some planets and trigger data to be sent more frequently even when those planets were not in focus or in the frame. To avoid this, two different sensitivities for what was classified as new data were used. Firstly, if the planet was in focus the sensitivity would be higher than if the planet was not in focus. This gave more accurate data for the planet that was of interest, while the less interesting planets would send less data. Secondly, when the planet was in focus, data was sent if the new angle differed by more than 3 degrees or the distance by more than 1000 km from the previous data; if the planet was not in focus, the sensitivity was lowered to 6 degrees and 10000 km.
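The two-sensitivity filtering can be sketched as a single predicate. The thresholds match the values stated in the text; the function name and surrounding bookkeeping are mine.

```python
import math

# Sketch of the change-detection described above: 3 deg / 1000 km when
# the planet is in focus, 6 deg / 10000 km otherwise (values from the
# text). Angles are in radians, distances in kilometers.

def should_send(prev_angle, new_angle, prev_dist_km, new_dist_km, in_focus):
    """Return True if the change is large enough to forward to SuperCollider."""
    angle_limit = math.radians(3.0 if in_focus else 6.0)
    dist_limit = 1000.0 if in_focus else 10000.0
    return (abs(new_angle - prev_angle) > angle_limit
            or abs(new_dist_km - prev_dist_km) > dist_limit)
```

The monitoring thread would call this per planet each update and only serialize an OSC message when it returns True.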

3.4 Open Sound Control

Open Sound Control (OSC) was the protocol used to send the extracted data from OpenSpace to SuperCollider. It was integrated into the sonification module in OpenSpace with the library oscpack5. The data was written to a buffer that was connected to an outbound packet stream. Before the data was written to the stream, an address was defined for each planet, and all data relevant to that planet was sent on that address. The data was written to the stream in a defined order to make it easier for SuperCollider to know which part of the message contained a particular type of data. The data included the distance and angle to the planet, an array with the GUI settings for that planet, and the angles to the major moons if the planet had any. The array of GUI settings was sent as a Binary Large Object (Blob) [27], which behaves as an array with binary content in SuperCollider. For the Solar and Compare views, different addresses were used to separate the information from the planets. For the Solar System view, a Blob with the GUI settings for the sonification was sent to SuperCollider. For the Compare view, two integer numbers were sent to represent which planets were to be compared, together with another Blob of GUI settings stating which sonifications were to be compared for the selected planets. The simulated speed of time in OpenSpace was sent on a separate address, and only when the value changed.

5 oscpack: http://www.rossbencina.com/code/oscpack?q=~rossb/code/oscpack/
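The wire layout of such a message follows the OSC 1.0 specification: a 4-byte-padded address string, a type-tag string starting with a comma, then big-endian arguments, with blobs prefixed by their length. The sketch below uses an example address and arguments; they are not the exact messages sent by the sonification module.

```python
import struct

# Minimal sketch of OSC 1.0 message encoding. The address "/Earth" and
# the arguments (distance, angle, a GUI-settings blob) are illustrative
# examples, not the module's actual message format.

def _pad(b):
    """Pad bytes with NULs to a multiple of 4, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4 if len(b) % 4 else 0)

def osc_string(s):
    return _pad(s.encode("ascii") + b"\x00")

def osc_message(address, args):
    """Encode floats, ints and byte blobs into one OSC message."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)       # 32-bit big-endian float
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)       # 32-bit big-endian int
        elif isinstance(a, bytes):                # e.g. the Blob of GUI settings
            tags += "b"
            payload += struct.pack(">i", len(a)) + _pad(a)
    return osc_string(address) + osc_string(tags) + payload

msg = osc_message("/Earth", [1.5e8, -0.7, b"\x01\x00\x01"])
```

Writing the arguments in a fixed order, as the module did, means the receiving SuperCollider code can index into the message positionally.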

3.5 Graphical User Interface

In order to make the sonification more interactive and useful, the graphical user interface (GUI) in OpenSpace was extended to control the sonification module. The GUI elements used were checkboxes, where the user could turn a sonification parameter on or off, and a drop-down menu, where the user could choose two planets to compare with each other, see Figure 3.12. If any of the GUI elements changed, the new information would be sent to SuperCollider as a Blob describing which parts of the sonification were on or off. This enabled changes that the user made in the GUI to affect the sonification in real-time.

To let the GUI control the different views in OpenSpace, it was organized in three different layers: the planetary, solar and compare layers. Only one of these layers could be active at any given time, as only one view could be used at a time in OpenSpace. The solar or compare layers could be activated by setting the Sun as focus in OpenSpace and interacting with one of the GUI settings for those layers. The planetary layer was activated by selecting one planet as the focus in OpenSpace, and every planet had its own set of properties and settings that the user could turn on or off, see Figure 3.9. The Solar System view contained less information in the sonification of each planet, and therefore its GUI only had a list of checkboxes to turn the sonification of each planet on or off, see Figure 3.10. The compare GUI consisted of two drop-down menus where the user could select two planets to compare their more detailed sonifications. Which sonifications to compare for the selected planets could be selected using the list of checkboxes below the drop-down menus, see Figure 3.11.


Figure 3.9: The planetary view GUI.

Figure 3.10: The Solar System view GUI.

Figure 3.11: The compare view GUI.

Figure 3.12: The GUI in OpenSpace for all three views.

3.6 User Evaluation

Three different types of user evaluations were conducted to evaluate the sonification. Early in the project, a small informal evaluation was conducted to see whether the development was heading in the right direction. Halfway through the project, two online surveys were used to get feedback and improve the sonification further. At the end of the project, a more thorough evaluation was conducted in the Dome theater at Visualiseringscenter C. Since this evaluation was conducted at the end of the project, the feedback received could not be implemented, but was instead suggested as future work, see section 5.4. All evaluations drew inspiration from an audio user experience scale called BUZZ [24], a questionnaire used to evaluate sonifications. The questionnaire contains 11 statements about the functionality, comprehension, and aesthetics of the sonification. Each statement is scored on a Likert scale of 1-7, which is then summed up to a highest possible score of 77. In the evaluations, all the BUZZ statements were included, with some minor changes to make the statements more concrete.

3.6.1 Initial Evaluation

To ensure early in the project that the development of the sonification was heading in the right direction, a small and informal evaluation was conducted with two participants. The participants were asked to listen to the sonification with headphones without any visuals, and the only information the participants were given prior to the test was that the sonification was related to the planets of the Solar System. This was done to test how intuitive the sonification was perceived to be. Questions were asked to the participants after they had listened to different parts of the sonification, including a few selected from the BUZZ questionnaire. The evaluation was conducted in an informal manner where the participants gave their spontaneous opinions about the sonification. The feedback from the evaluation was then taken into consideration in the further development.

3.6.2 Online Surveys

Halfway through the project, two online surveys were conducted. The plan was originally to do most of the evaluations inside the Dome theater, but due to the covid-19 pandemic that plan was changed. Therefore, in order to get feedback about the sonification, two online surveys were published instead. The feedback from the first survey was used to create an improved sonification, which in turn was evaluated in the second survey.

To gain respondents for the surveys, a website called SurveySwap6 was used, which is a website where, in essence, survey creators swap surveys with each other. Both surveys were developed as online forms and contained videos of the inner planets with their respective sonification along with visuals from OpenSpace. The information presented to the respondents about the sonification was what the sounds were supposed to represent, but the actual sound design was not revealed. For example, when the masses of Mercury and Venus were compared, it was revealed that Venus has a larger mass than Mercury. The respondents were asked to watch and listen to the videos and afterwards answer questions about the sonification. Most of the questions touched on whether the sonifications represented the data in a good manner or not. After each sonification there was an optional free-text question where the respondents could write their thoughts about that sonification. The last part of the survey was a BUZZ questionnaire where the respondents were asked to evaluate the sonification as a whole. Demographic questions asked in the survey were how the respondents would rate their knowledge about the Solar System and which country they were from.

The respondents were encouraged to use headphones for the test, since the listening device could affect their experience of the sonification; to monitor this, there was a question where the respondents stated how they listened to the sonifications. The sonifications that were evaluated were those of mass, length of day, gravity, atmosphere, wind speed, temperature, and finally a composition of all sonifications for one planet. Length of year and the Solar System view were not included, since they would require a surround sound setup, which would be too strict a requirement for the survey.

3.6.3 Dome Evaluation

The Dome theater in Visualization Center C was used to conduct the final evaluations. This was beneficial as the Dome was the target platform of the project, and features regarding surround sound could be tested, which was not possible in the online surveys. The test was conducted twice, each time with a small invited focus group.

The evaluation started with the participants signing an informed consent form, which informed them that audio was going to be recorded during one part of the evaluation. The test administrator then proceeded with informing the participants of how the test would be conducted. The participants were then introduced to OpenSpace without the sonification or any audio. This was done to separate the experience of OpenSpace from the sonification so that the evaluation could focus on the sonification. The evaluation then proceeded similarly to the online surveys: each sonification was explained verbally by the test administrator and demonstrated. The participants individually filled in an online form for each sonification where they rated how well they thought the sonification conveyed the data. Features of the sonification that could not be evaluated in the online survey, such as the Solar System view, moons, surround sound and immersion, were also covered in this evaluation. After all the sonifications had been demonstrated, the participants were asked to answer a BUZZ questionnaire about the overall experience. A discussion was then held about their thoughts on the sonifications as well as the experience as a whole.

6 SurveySwap: https://surveyswap.io/

3.6.4 Analyzing the Data

The results of the evaluations were analyzed to find correlations and draw conclusions on how to improve the sonification. Box plots were used to visualize the data of the online surveys, including both the survey questions and the BUZZ questionnaire. This created an overview of the results of the evaluations from which correlations could be found and conclusions drawn. The free-text answers were analyzed by reading the comments, and any concrete ideas or criticism were considered in the further development of the sonification. Concrete comments could be about which aspects of the sonification were good or bad, or any new ideas the respondents might have on how to improve the sonifications.

For every sonification there were a few comparisons of one parameter between two planets; for example, the temperature sonification was compared between Earth and Mercury, and between Mars and Venus. The different comparisons for the same questions were analyzed in order to determine if the sonification was particularly problematic for a certain planet. The results of the comparisons were then combined to create an overall evaluation for the specific sonification. This resulted in an uneven number of data points per sonification, since there were, for example, more cases for the sonifications of mass and length of day and fewer for the sonification of gravity.

Correlation analysis was conducted to check whether the correlation of certain results was significant. This could give an indication of unwanted correlations that could affect the overall result of the evaluation. The significance value (p-value) for the correlation was calculated in Excel as seen in Equation 3.6,

$$\begin{aligned}
r &= \text{correl}(x_1, x_2) \\
t &= \frac{r \cdot \sqrt{n - 2}}{\sqrt{1 - r^2}} \\
p &= \text{t.dist.2t}(t,\ n - 2)
\end{aligned} \tag{3.6}$$

where $n$ is the number of samples, $r$ is Pearson's correlation coefficient, and $x_1$ and $x_2$ are the two variables being analyzed. The functions correl7 and t.dist.2t8 are built-in Excel functions. The correlation is considered significant9 if the resulting p-value is smaller than 0.05. In addition, Pearson's correlation coefficient (r-value) indicates the strength of the correlation10: a correlation with an r-value between 0.3 and 0.5 is considered weak, an r-value between 0.5 and 0.7 moderate, and an r-value greater than 0.7 strong.
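The first two steps of Equation 3.6 can be reproduced outside Excel. The sketch below computes $r$ and the t statistic directly; turning $t$ into a two-tailed p-value requires a Student's t CDF (for example `scipy.stats.t.sf`), which is omitted here to keep the example self-contained. The sample data is invented for illustration.

```python
import math

# Sketch of Equation 3.6: Pearson's r and the t statistic, computed
# directly instead of via Excel's correl and t.dist.2t.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(x, y):
    n = len(x)
    r = pearson_r(x, y)
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r * r)
    return r, t

# Invented example data (n = 5)
r, t = t_statistic([1, 2, 3, 4, 5], [1, 2, 3, 4, 6])
```

The p-value would then be the two-tailed tail probability of $t$ under a t distribution with $n - 2$ degrees of freedom, exactly what `t.dist.2t(t, n - 2)` returns in Excel.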

7 Excel correl function documentation: https://support.microsoft.com/en-us/office/correl-function-995dcef7-0c0a-4bed-a3fb-239d7b68ca92

8 Excel t.dist.2t function documentation: https://support.microsoft.com/en-us/office/t-dist-2t-function-198e9340-e360-4230-bd21-f52f22ff5c28?ui=en-us&rs=en-us&ad=us

9 How to determine significance: https://www.dummies.com/education/math/statistics/what-a-p-value-tells-you-about-statistical-data/

10 How to determine strength of correlation: https://www.dummies.com/education/math/statistics/how-to-interpret-a-correlation-coefficient-r/


4 Evaluation Results and Further Development

A total of 64 respondents took part in the two online surveys, 30 in the first and 34 in the second. The countries of origin of the respondents were mixed and spanned all over the world. For the Dome evaluations there were five participants across two evaluations. The results were analyzed and presented as box plots, where both the range and the general trend can be discerned. A box plot presents 50% of the data inside the box, bounded by the upper and lower quartiles. The thick horizontal line in the box represents the median value and the cross the average value.

4.1 First Survey

30 respondents answered the first online survey via the website SurveySwap. Among these respondents, 80% (24 respondents) used headphones to listen to the sonifications and 20% (6 respondents) used laptop speakers. The respondents rated their knowledge of the Solar System as fairly low, with an average score of 3.2 on a Likert scale from 1 (very low) to 7 (very high). A total of 26 free-text comments were received across the different sonifications. The sonifications received fairly good scores overall, where the statement regarding how interesting the sonifications were received the highest median and average values of 6 and 5.3, respectively. However, the statement regarding how pleasant the sonifications were received the lowest average score of 3.9, see Figure 4.1.


Figure 4.1: BUZZ results from the first online survey presented as a box plot. The statements have been simplified for clarity. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value.

The correlation between the respondents' self-perceived knowledge of the Solar System and how well they understood the sonification, based on the BUZZ questionnaire, was analyzed since a correlation was suspected. Only the BUZZ statements regarding the understanding of the sonification were included in this analysis, not those regarding its aesthetics. The analysis resulted in a probability value (p-value) of 0.042, calculated over the 30 respondents in the first survey. This was a significant correlation, since the p-value was smaller than 0.05. However, the corresponding Pearson's correlation coefficient (r-value) was 0.37, which is considered a weak correlation.

Regarding the results for the sonification of mass, comments mentioned that the respondents would associate a smaller mass with a higher pitch, which was consistent with the chosen pitch mapping. The highest median and average scores of 6 and 5.5 respectively were given to the sonification of length of day (see Figure 4.2), with comments also stating that the length of day was easy to distinguish.

Figure 4.2: Results for each sonification in the first survey. The thick horizontal line in the box represents the median value and the cross the average value.


4.2 Changes for Second Survey

Using the feedback from the first survey, changes were made to the sound design and the survey layout, which were then tested in the second survey. Comments gave a good indication of what the respondents felt needed to be changed, with some even offering concrete advice for improvement.

For the sonification of gravity, comments suggested that the time between the bounces of the ball was too short for the sound to be perceived as a bouncing ball. It was instead perceived as "electronic noise" or a "short phone ringtone", as stated by some of the respondents. To address this problem, the gravity values for all of the planets were decreased, increasing the time between bounces. This was done by the same amount for all planets to keep the relation between them. The order of the planets in the video was also changed so that the planet with the lowest gravity was played first.

For the atmosphere and wind sonification, comments suggested that the faster wind speed of Mars did not come through in the sonification. The wind speed of the sound was therefore increased to create a bigger contrast. Despite these comments, many respondents had given this sonification a decent score in the survey, with a median of 5 and an average of 4.7, see Figure 4.2.

The sonification of temperature received the lowest average score of all the sonifications, with a score of 3.9, see Figure 4.2. Comments stated that the temperature sonification was "unpleasant" and "annoying". Therefore, a more mellow background noise was added to try to make the sound more pleasant. This was done by using the UGen Crackle in SuperCollider.

The correlation analysis for the first survey suggested that there was a significant correlation between the respondents' self-perceived knowledge of the Solar System and how well they understood the sonification. This could indicate that too little information was given about the sonification for all respondents to understand it equally. Therefore, it was decided to present more information about the underlying sound design and parameter mapping to the respondents in the second survey.

The results of the first survey showed that the comments gave more indications of what to improve than the scores given to the questions. To account for this, more specific questions were asked about each type of sonification in the second survey. The question regarding atmosphere and wind speed was divided into two separate questions (one for atmosphere and one for wind speed), and two more questions were added for temperature, asking how the lowest and highest temperatures were perceived. Three chosen BUZZ statements (regarding how interesting, pleasant and understandable the sonifications were) were also added after each sonification. This made it possible to identify whether a particular sonification was problematic, and in what way.

4.3 Second Survey

34 respondents answered the second online survey, where the average self-perceived knowledge of the Solar System among the respondents was 3.6. A total of 12 free-text comments were received across the different sonifications. Among these respondents, 65% (22 respondents) used headphones to listen to the sonifications, 20% (7 respondents) used separate speakers, 9% (3 respondents) used laptop speakers, 3% (1 respondent) used a TV, and 3% (1 respondent) used a phone.

The results improved in favor of the sonifications on all statements compared to the first survey, see Figure 4.3. The statements regarding how confusing and relatable the sonifications were showed the biggest improvement.


Figure 4.3: BUZZ results from both online surveys presented as a box plot. The statements have been simplified for clarity. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value.

Compared to the results of the first survey, the results from the second survey were narrower, despite having more respondents. This can be seen in Figure 4.4, where the total BUZZ scores for both surveys are presented. Despite the improvement on all the individual questions in the BUZZ questionnaire, the total BUZZ score remained more or less the same in the second survey. The first survey had a median of 49.5 and an average score of 49.7, while the second survey had a median of 49 and an average score of 49.1.

Figure 4.4: BUZZ score results from both surveys. The thick horizontal line in the box represents the median value and the cross the average value.

For the second survey, the correlation found in the first survey was no longer significant, with a p-value of 0.083 calculated over the 34 respondents. In addition, the r-value decreased from 0.37 to 0.30, which is considered a weaker correlation compared to the first survey. This could indicate that adding more information to the survey removed the correlation and helped the respondents understand the sonifications better.

In the second survey, three chosen BUZZ statements were asked after each sonification, see Figure 4.5. These questions could not be compared to the first survey, as they were added in an attempt to gain more information in the second survey. It can be seen that atmosphere and wind speed, as well as temperature, received the lowest scores of the sonifications, where temperature was the least pleasant and interesting sonification, and atmosphere and wind were the most confusing. The most pleasant sonification was gravity, and the most interesting were mass and length of day, while both were relatively easy to understand.

Figure 4.5: Results of the added BUZZ statements after each sonification for the second survey. The thick horizontal line in the box represents the median value and the cross the average value.

The results for the mass sonification increased from an average value of 4.2 in the first survey to 5.1 in the second survey, see Figure 4.6. The improved results could indicate that revealing more information helped the respondents to understand the sonification, since no sound design changes were made to it. The sonification of length of day retained an average score of 5.5 in both surveys. For gravity, the sound design changes created a positive outcome, with the average score increasing from 4.5 in the first survey to 5.6 in the second. There was also a positive outcome for the temperature sonification: the average score for the sonification of temperature difference increased from 3.9 in the first survey to 4.8 in the second. The added questions regarding the highest and lowest temperature received average scores of 5.0 and 4.6 respectively in the second survey.

Despite the changes made to the atmosphere and wind speed sonification and the rearrangement of the questions, the results of the second survey did not differ much from the first. In the first survey, the sonification of atmosphere and wind speed had a median score of 5 and an average score of 4.7. In the second survey, the sonification of atmosphere had a median of 5 and an average of 4.7, while the sonification of wind speed had a median of 5 and an average of 4.4.


Figure 4.6: Results for the sonifications in both surveys. The thick horizontal line in the box represents the median value and the cross the average value.

In the second survey, fewer comments were given by the respondents, which could suggest that the added questions made it possible for the respondents to express their opinions without adding comments. No negative comments were made about the sound design of gravity in the second survey. Two comments for atmosphere and wind mentioned that the data was not as well represented as for mass, length of day or gravity. For temperature, one comment mentioned that the sonification was not relatable. However, no comments mentioned anything negative about the sound design, whereas in the first survey several comments touched on this.

In the second survey, a question was added regarding whether the respondent had learned something new from listening to the sonifications. This gave a positive result, where 68% (23 respondents) claimed that they had learned something new from listening to the sonifications. Only 9% (3 respondents) claimed that they had not learned anything new, and 23% (8 respondents) said that they might have learned something new.

4.4 Dome Evaluation

Two Dome evaluations were conducted with five participants in total, and the results were combined into one data set. The average self-perceived knowledge of the Solar System among the participants was 4.8.

The BUZZ results for the Dome evaluation can be seen in Figure 4.7. The statement regarding how interesting the sonifications were received the highest average score of 6.4 and a median of 7. The statement regarding how fun the sonifications were improved the most compared to the online surveys, which also created a bigger contrast to the statement regarding how boring the sonifications were. The statement regarding how difficult the sounds were to understand received the lowest average and median score of 4.


Figure 4.7: BUZZ results for the second survey and the Dome evaluation. Note that for the statements "Difficult", "Boring" and "Confusing" it is positive if the answer has a low value. The thick horizontal line in the box represents the median value and the cross the average value.

The overall BUZZ score increased in the Dome evaluations compared to the online surveys, from an average score of 49.1 (for the second survey) to an average score of 57.4, see Figure 4.8. Similar questions were asked in the online surveys and the Dome evaluation, but the Dome evaluation was held in a more controlled environment. This could indicate that the listening environment has a substantial impact on how well the participants perceive the sonifications.

Figure 4.8: Total BUZZ score results for all the evaluations. The thick horizontal line in the box represents the median value and the cross the average value.

The correlation between the participants' self-perceived knowledge of the Solar System and how well they understood the sonification was not significant, with a p-value of 0.51. In addition, the r-value was 0.39, which is considered a weak correlation. This result is consistent with the second survey in that the participants understood the sonifications better when the underlying sound design was revealed to them before listening.


The results for the three BUZZ statements after each sonification can be seen in Figure 4.9, where the sonification of mass and length of day have been separated and the sonifications of the Solar System and the moons have been added. The sonification of mass received the lowest score overall, with comments stating that it was hard to discern the mass by only listening to the pitch of the sound. During the discussion it was speculated that how well an individual can discern the pitches could depend on their musical background. The most interesting sonification according to the participants was gravity, with comments stating that it was the most unique of the sonifications.

The sonification of the moons received the highest score regarding how pleasant the sound was, with a median of 6 and an average of 5.6, but also received the lowest score on how easy it was to understand, with both a median and an average score of 3. This could be because the moons were demonstrated by observing the orbital resonance of Jupiter's three inner moons. This created more musicality in the sonification, which made it more pleasant, but it could also have made it less understandable since the data was occluded by the musical sound. One of the participants pointed this out and said that they listened to the sounds more passively because they sounded more like music than a sonification conveying data.

Figure 4.9: Results of the three BUZZ statements after each sonification for the Dome evaluation. The thick horizontal line in the box represents the median value and the cross the average value.

The results of how well the participants thought the sonifications conveyed the data can be seen in Figure 4.10. The length of day sonification received the highest average score of 6.6, while the moon sonification received the lowest average score of 3.9. The biggest improvements from the second survey were for the sonifications of length of day and atmosphere, which increased from average scores of 5.5 to 6.6 and 4.7 to 5.8 respectively.


Figure 4.10: Results for all the sonifications in the Dome evaluation. The thick horizontal line in the box represents the median value and the cross the average value.

For all of the evaluations, a composition of all the sonifications was played for one planet. Combining the results of all the surveys, an average score of 4.7 was given to how well the separate sonifications could be distinguished while playing at the same time. Comments from all the evaluations said that it was hard to distinguish the sonifications of gravity, temperature, and atmosphere and wind speed in particular.

During the Dome evaluation, a question was asked regarding how immersed the participants felt before and after listening to the sonifications. This made it possible to see whether the participants felt that their sense of immersion changed with the sonification. On average, the participants gave a score of 4.2 before the sonification and 6.4 with the sonification, an increase of 2.2.

Comments during one of the evaluations mentioned that it was confusing that some of the BUZZ statements were inverted, so that a high score was negative towards the sonification instead of the other way around. For the second evaluation these statements were grouped together at the end to create less confusion. Other comments stated that the length of year was hard to discern, as the participants expected it to be mapped to a specific sound, when it was instead conveyed through the spatial position of the planets. It is supposed to be discerned through the surround sound, by either estimating the speed at which the planet orbits around the audience, or how long it takes to complete one orbit.


5 Discussion

Throughout the project, decisions had to be made that depended on external circumstances as well as time constraints. Initial Dome evaluations were replaced with online surveys, and in order to make progress, a few problems could not be solved and were instead postponed. The motivation behind these decisions is discussed and explained in this chapter, as well as improvements that can be made in future work.

5.1 Evaluation Method

All the evaluations assessed how well the sonification conveyed information, but not necessarily how well the sonification was incorporated into the OpenSpace software. This was not evaluated since it was not the primary aim of this thesis, which was to create a sonification that conveys more information about the Solar System to the audience of OpenSpace in the Dome theater. In that environment the audience does not have direct control of the OpenSpace software, which meant that the GUI would not need to be widely tested, since only a few experienced people would interact with it. Therefore, testing its usability was not a priority.

It was first intended that all evaluations would be conducted in the Dome theater with a small focus group. However, due to the COVID-19 pandemic this could only be done close to the end of the project. Instead, online surveys were conducted during development to receive feedback. One-on-one interviews with a few participants could still have been an option, but the online surveys were preferred since more people could be reached, leading to a more conclusive result. Not all sonifications could be tested in an online environment, however. This included the Solar System view and the surround implementations, as these required a surround setup such as the Dome theater to be experienced; they were instead evaluated later in the final Dome evaluation.

No questions were asked regarding the demographics (age, gender, education) of the respondents other than their country of origin. The demographics of an individual could shape their expectations about the sonifications, which could affect their experience. Demographic questions were not asked because only a general opinion was desired from the surveys. Correlations with demographics could therefore not be identified, nor could the evaluated target group be characterized. This could also make the data less transferable for other researchers to use in the future.


An attempt was made to collect respondents from sources other than SurveySwap for the online surveys. The survey was sent to an astronomy youth group, where people with more experience in astronomy could answer it. However, only two people answered, which was not enough to create a separate data set; their answers were instead combined with those of the respondents from SurveySwap.

The respondents were informed in the surveys that headphones were recommended, and this was monitored by asking what listening device was used. The device a respondent used (apart from headphones) could have affected their experience of the sonification. When some respondents used the speakers on their laptops, for example, the sounds might not have been reproduced in a fair manner. This could have affected their experience of the sonification, which in turn would affect their ratings in the survey. The respondents who answered that they did not use headphones were however still included in the results, as the degree of impact on the experience was not known.

Different amounts of data points were present for the different sonifications. This was because there were different numbers of comparative cases for each sonification; there were, for example, more cases for the mass sonification than for the gravity sonification. This could skew the data, as more information was available for some sonifications than for others.

5.2 Results

The scores for all of the sonifications improved from the first to the second survey. This could depend both on the sound design for some of the sounds being slightly changed, and on more information about the sound design being presented in the second survey. The order in which the planets were played in the survey was also changed in some cases, which could also have affected the results.

In the first online survey, a significant correlation was found between the respondents' self-perceived knowledge of the Solar System and their understanding of the sonification. This was a weak correlation with a low r-value, but was still seen as a flaw of the survey, since the respondents who did not know much about the Solar System did not receive enough information to understand the sonification. In the second online survey, more information about the sonifications was added, revealing their underlying sound design. The correlation was no longer significant in the second survey and the Dome evaluation, which could indicate that the respondents in the second survey and the participants in the Dome evaluation knew enough about the sonification to understand it, even with little knowledge of the Solar System. However, this meant that the survey no longer tested how intuitive the sonifications were, since the answer was already given to the respondents. Instead it tested how understandable and relatable the sonifications were.

In the Dome evaluation, the BUZZ statements fun, boring, interesting and pleasant were the ones that improved the most compared to the online surveys (see Figure 4.7). This could indicate that the listening environment in the Dome theater gave the participants a better experience of the sonifications than headphones did in the online surveys.

The mass sonification received a rather low score across the evaluations. Despite this, it was given positive comments in the online surveys. This made it difficult to discern which part of the sonification was good and which was problematic. Adding the three selected BUZZ statements after each sonification made it possible to see which part was problematic, see Figure 4.5. There it could be seen that the mass sonification was rather interesting and easy to understand, but not very pleasant according to the respondents. This information could not be discerned from the first survey, where these questions were not present, and without them it was not possible to improve the sonification for the second survey. Even though no changes were made to the mass sonification between the surveys, its result increased in the second survey, as can be seen in Figure 4.6. This could be due to the additional information given to the respondents about the sound design. One reason why the mass sonification was rated fairly low could be that the respondents could not appreciate the differences in pitch between the planets, which was mentioned by the participants in the Dome evaluation. Most of the planets differ by two semitones from the closest planet, which can be hard to discern for a person who is not very familiar with pitch. Additionally, only the inner planets were evaluated in the surveys, which did not showcase the bigger pitch differences among all of the planets.

The sonification of atmosphere and wind speed had the weakest connection to its data, as there was no direct mapping between the data and the sound. The sonification did however receive a decent score across the evaluations, especially in the Dome evaluation. The reason for this could be that the lower frequencies of the winds were reproduced better by the 7.1 surround system in the Dome theater, which helped the participants to discern the thickness of the planets' atmospheres.

The sonification of temperature was considered one of the more unpleasant sonifications across all of the evaluations. This was partly fixed between the first and the second survey, but in general the idea behind the sonification was that a planet with a high temperature should feel more chaotic, which could also be interpreted as unpleasant.

5.3 Implementation

A more high-level synthesized sound generator, such as a Digital Audio Workstation (DAW), could have been used to create more complex sounds. Many presets of complex sounds already exist in such software, which would speed up the sound design process. However, the aim of this thesis was not to create the best sounding sonification, but to test the concept.

A possible alternative to OSC for sending the data from OpenSpace would be MIDI1, which is a popular protocol in music production. MIDI is a hardware-dependent protocol, meaning that physical equipment would be needed in order to use it. OSC, on the other hand, works over the network and does not need any additional hardware. OSC is also more flexible regarding what types of data can be sent over the protocol, which makes OSC the preferred protocol for this project.
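To illustrate what travels over the wire, a minimal OSC message with float32 arguments can be packed using only Python's standard library. The address /planet/mars and the values below are hypothetical and do not reflect the actual OSC namespace used in the implementation; a real client would normally use an OSC library rather than hand-packing bytes.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate a string and pad it to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message whose arguments are all float32."""
    tags = "," + "f" * len(args)          # type tag string, e.g. ",ff"
    data = osc_pad(address.encode()) + osc_pad(tags.encode())
    for value in args:
        data += struct.pack(">f", value)  # OSC payloads are big-endian
    return data

# Hypothetical message; SuperCollider's language listens on UDP port 57120
# by default, so the packet could be sent with a plain UDP socket:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(osc_message("/planet/mars", 1.52), ("127.0.0.1", 57120))
packet = osc_message("/planet/mars", 1.52, 0.11)
```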

One disadvantage of the method used to calculate the angles (see subsection 3.3.3) is that it can give unexpected results if the camera points directly at the planet from above or below. In that case ~PMCH.plane in Equation 3.1 becomes zero and the function that calculates the oriented angle fails. This was not considered an important issue, since this case only occurs at two specific points in a relatively large environment.
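The degenerate case can be illustrated with a generic signed-angle computation. This is a sketch under assumed conventions, not the actual code behind Equation 3.1: a vector is projected into a reference plane, and when the projection (the counterpart of ~PMCH.plane) vanishes, no angle is defined.

```python
import math

def project_onto_plane(v, n):
    """Remove the component of v along the unit plane normal n."""
    d = sum(a * b for a, b in zip(v, n))
    return [a - d * b for a, b in zip(v, n)]

def oriented_angle(v, ref, n, eps=1e-9):
    """Signed angle (radians) from ref to the projection of v in the plane
    with unit normal n; ref is assumed to be a unit vector in that plane.
    Returns None in the degenerate case where v is (nearly) parallel to n,
    i.e. when the camera looks straight down onto the planet."""
    p = project_onto_plane(v, n)
    norm = math.sqrt(sum(a * a for a in p))
    if norm < eps:
        return None  # projection vanished: the oriented angle is undefined
    p = [a / norm for a in p]
    cos_a = max(-1.0, min(1.0, sum(a * b for a, b in zip(ref, p))))
    cross = [ref[1] * p[2] - ref[2] * p[1],
             ref[2] * p[0] - ref[0] * p[2],
             ref[0] * p[1] - ref[1] * p[0]]
    sign = 1.0 if sum(a * b for a, b in zip(cross, n)) >= 0.0 else -1.0
    return sign * math.acos(cos_a)
```

Returning a sentinel such as None (or reusing the previous angle) is one simple way to keep the sonification stable through the two degenerate camera positions.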

The problem with single-precision floating point errors (see subsection 3.3.4), which caused the distances and angles to fluctuate, was not entirely fixed. The lag introduced in SuperCollider partly fixed the issue, but instead caused the sound to behave in an undesired manner when it was placed behind the audience. In the Solar System view the sounds were short impulse sounds and were not as affected by this problem. In the Planetary view the audience is supposed to be focused on the planet in front of them, and in that case it does not make sense to place the planet behind the audience, so the problem is avoided. However, for future work this problem should be fixed in order to make the sonification work in more scenarios and for a wider range of sounds.
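The lag used here behaves like a one-pole low-pass smoother. The idea can be sketched as follows (illustrative Python, not the SynthDef code used in the project): each output moves a fraction toward the latest input, damping jitter at the cost of added latency.

```python
def smooth(samples, alpha=0.1):
    """One-pole low-pass smoother, similar in spirit to a lag control:
    each output moves a fraction alpha toward the newest input, damping the
    jitter that single-precision rounding introduces into distances and
    angles. Smaller alpha means more smoothing but more latency."""
    out, y = [], None
    for x in samples:
        y = x if y is None else y + alpha * (x - y)
        out.append(y)
    return out
```

The trade-off described above is visible directly in alpha: a value small enough to hide the rounding jitter also makes the smoothed angle trail the true one, which is what caused the misbehavior for sounds placed behind the audience.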

5.4 Future Work

One disadvantage of the current implementation of the sonification is how SuperCollider is integrated with OpenSpace. Currently, SuperCollider runs as a separate application that must be started manually in parallel with OpenSpace. There is therefore no easy way to distribute the sonification along with OpenSpace without having to install SuperCollider and connect the two applications. This could be fixed either by integrating SuperCollider into OpenSpace in some way or, as a different approach, by using OpenAL2 instead of SuperCollider. OpenSpace uses OpenGL for the graphics, and OpenAL serves as a good sound counterpart to OpenGL.

1MIDI: https://www.midi.org/

The GUI could be further improved with a user evaluation to make it more understandable and usable. This could make it possible to display the sonification in a more interactive environment, such as an exhibition, where more of the sonification parameters could be interactive.

Overall, the general sound design of the sonifications could be improved to create more intuitive and pleasant sounds. As stated in section 2.3, the aim of this thesis was to create a sonification that is both informative and pleasant. According to the results of the evaluations, some sonifications were informative but not as pleasant, and vice versa. To further improve the sound design, a more unified sound experience would be beneficial, where all the sonifications are both informative and pleasant.

2OpenAL: https://openal.org/


6 Conclusion

The aim of this thesis was to investigate how sonification could be used to increase understanding of the Solar System in OpenSpace. A sonification was therefore designed and implemented using SuperCollider, and evaluated by taking inspiration from the audio user experience scale BUZZ. The sonification was integrated into OpenSpace by sending positional data via OSC, and was made interactive by extending the built-in GUI of OpenSpace. Surround sound was implemented and rerouted to work in the Dome theater, where the final evaluations were held. By evaluating the sonification, the research questions posed at the beginning of this study could be answered.

How can sonification give a comprehensible understanding of the Solar System?

By conveying information using both visuals and audio, the advantages of both senses can be utilized. Compared to our eyes, our ears are better at discerning temporal and spatial changes, which makes sonification a more intuitive way of conveying this type of data. Human hearing is also capable of perceiving and distinguishing several sounds simultaneously, which makes it possible to convey more information at the same time. This also immerses the audience more, which can increase curiosity and interest in the subject.

The results of the evaluations show that information about the Solar System can be conveyed while still being presented in a pleasant and informative way. Some sonifications were more successful than others, however, which could depend on the mapping and design of the sounds, or the complexity of the data.

What Solar System data should be used in the sonification, and what sound parameters should be manipulated to convey the data?

Choosing what Solar System data to use and how to convey it depended on the strengths of sonification, as well as on what data was the most relevant. Related work showed that the data used to differentiate between the planets was considered the most relevant: the mass, density, diameter, gravity, length of day, length of year and temperature of the planets. OpenSpace already displays and conveys some of this data, such as the diameter, length of day and length of year. The sonification should nevertheless still convey this data, as the audience will be able to use both their visual and auditory senses to understand it, which increases the chance of them remembering the data, since they can create associations with either or both senses.

Parameter mapping was used to convey the Solar System data by letting the data control the parameters of the sounds. This made it possible to map temporal data, such as the length of day of a planet, to the tempo of a sound, while surround sound could be utilized to convey the length of year of the planets. Common mappings that had previously been used in other sonifications were also employed, such as mapping pitch to the size of an object. The sounds used were purely synthesized from different waveforms and noise, together with an auditory icon of a bouncing ball to convey gravity.
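The parameter mappings described above amount to simple range transformations from a data domain to a sound-parameter domain. The following is a minimal sketch in Python rather than SuperCollider; the frequency range, the tempo formula and its scale factor, and the planet values are example assumptions for illustration, not the mappings actually used in the implementation.

```python
def linmap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def size_to_pitch(diameter_km):
    """Smaller planets get a higher pitch (a common sonification convention).
    The diameter range and frequency range are example values."""
    return linmap(diameter_km, 4879.0, 142984.0, 880.0, 110.0)

def day_length_to_tempo(day_hours):
    """Shorter days give a faster pulse; the 3600 scale factor is an
    arbitrary example time-compression constant."""
    return 3600.0 / day_hours

mercury_hz = size_to_pitch(4879.0)     # smallest planet -> highest pitch (880 Hz)
jupiter_hz = size_to_pitch(142984.0)   # largest planet -> lowest pitch (110 Hz)
earth_bpm = day_length_to_tempo(24.0)  # 150 beats per minute in this example
```

Inverting the output range (880 Hz down to 110 Hz) encodes the convention that larger objects sound lower, matching the pitch-to-size mapping mentioned above.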

How can a sonification be integrated into an interactive data visualization software such as OpenSpace, and how can it make the experience more immersive?

In order to convey a sense that OpenSpace and the sonification were closely connected, the sonification needed to react to what was happening in OpenSpace. This was achieved by sending the data available in OpenSpace via OSC to SuperCollider, which shaped the sonification based on that data. This included playing the sonification for the planet that was in focus and adjusting the tempo of the sounds based on how fast time was simulated in OpenSpace. Using the positional data, the sounds of the planets and moons were placed around the audience where the objects were visible on the screen, using the 7.1 surround system in the Dome theater. The sound level of the sonification was adjusted based on the distance to the planet to strengthen the sense that the sound originated from the planet. To further strengthen the connection between the sonification and OpenSpace, the GUI in OpenSpace was extended with controls for the sonification, enabling full control of the sonification from within the OpenSpace software alone.
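The data flow described above can be sketched as follows: on the visualization side, camera and planet positions are reduced to an azimuth angle and a distance-based gain, which could then be sent to the synthesis engine as an OSC message. This is an illustrative Python sketch under stated assumptions; the address pattern, argument layout, coordinate frame, and inverse-distance gain law are hypothetical, not the actual protocol used in the thesis.

```python
import math

def azimuth(camera_pos, planet_pos):
    """Horizontal angle of the planet relative to the camera, in radians
    (0 = straight ahead along +z in this simplified coordinate frame)."""
    dx = planet_pos[0] - camera_pos[0]
    dz = planet_pos[2] - camera_pos[2]
    return math.atan2(dx, dz)

def distance_gain(distance, ref=1.0):
    """Inverse-distance attenuation, clamped so the gain never exceeds 1.
    The reference distance is an example value."""
    return min(1.0, ref / max(distance, ref))

def osc_message(planet, camera_pos, planet_pos):
    """Bundle a hypothetical OSC address with the angle and gain arguments."""
    d = math.dist(camera_pos, planet_pos)
    return ["/planet/" + planet, azimuth(camera_pos, planet_pos), distance_gain(d)]

msg = osc_message("mars", (0.0, 0.0, 0.0), (3.0, 0.0, 4.0))
# msg[1] is the azimuth; msg[2] is the gain (0.2 at distance 5 here)
```

The synthesis engine would map the azimuth argument onto the 7.1 speaker layout and scale the sound level by the gain, so that the sound appears to come from where the planet is visible on the screen.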

Using the surround system to place the sonification around the audience, and adjusting the sound level based on the distance to the planet, created a sense that the sound originated from that planet; both of these made the experience more immersive. All participants in the Dome evaluation stated that the sonification helped them feel more immersed in their experience, which indicates that the sonification increased the immersion.


Bibliography

[1] BP Abbott, R Abbott, Thomas Abbott, Matthew Abernathy, Fausto Acernese, K Ackley, LIGO Scientific Collaboration, and Virgo Collaboration. “Observation of Gravitational Waves from a Binary Black Hole Merger”. In: Phys. Rev. Lett. 116.6 (Feb. 2016), 061102:1–061102:16. DOI: 10.1103/PhysRevLett.116.061102.

[2] Emil Axelsson, Jonathas Costa, Cláudio Silva, Carter Emmart, Alexander Bock, and Anders Ynnerman. “Dynamic Scene Graph: Enabling Scaling, Positioning, and Navigation in the Universe”. In: Computer Graphics Forum 36.3 (July 2017), pp. 459–468. DOI: 10.1111/cgf.13202.

[3] Mark Ballora. “Sonification, Science and Popular Music: In search of the ‘wow’”. In: Organised Sound 19.1 (2014), pp. 30–40. DOI: 10.1017/S1355771813000381.

[4] A. Bock, E. Axelsson, C. Emmart, M. Kuznetsova, C. Hansen, and A. Ynnerman. “OpenSpace: Changing the Narrative of Public Dissemination in Astronomical Visualization from What to How”. In: IEEE Computer Graphics and Applications 38.3 (May 2018), pp. 44–57. DOI: 10.1109/MCG.2018.032421653.

[5] Alexander Bock, Emil Axelsson, Jonathas Costa, Gene Payne, Micah Acinapura, Vivian Trakinski, Carter Emmart, Claudio Silva, Charles Hansen, and Anders Ynnerman. “OpenSpace: A System for Astrographics”. In: IEEE TVCG 26.1 (Jan. 2020), pp. 633–642. DOI: 10.1109/TVCG.2019.2934259.

[6] Adelbert Bronkhorst. “The Cocktail Party Phenomenon: A Review of Research on Speech Intelligibility in Multiple-Talker Conditions”. In: Acta Acustica united with Acustica 86.1 (Jan. 2000), pp. 117–128.

[7] Eric Chaisson and Steve McMillan. Astronomy Today. 8th ed. Harlow, UK: Pearson, 2015. ISBN: 9780321901675.

[8] Wanda Diaz-Merced, Robert Candey, Nancy Brickhouse, Matthew Schneps, John C. Mannone, Stephen Brewster, and Katrien Kolenberg. “Sonification of Astronomical Data”. In: Proceedings of the International Astronomical Union 7.S285 (Apr. 2012), pp. 133–136. DOI: 10.1017/S1743921312000440.

[9] Gaël Dubus and Roberto Bresin. “A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities”. In: PLoS ONE 8.12 (Dec. 2013), e82491. DOI: 10.1371/journal.pone.0082491.



[10] Inger Ekman. “On the Desire to Not Kill Your Players: Rethinking Sound in Pervasive and Mixed Reality Games”. In: Proceedings of the 8th International Conference on the Foundations of Digital Games. FDG 2013. May 2013, pp. 142–149.

[11] Thomas Hermann, Andy Hunt, and John G. Neuhoff, eds. The Sonification Handbook. 1st ed. Berlin, Germany: Logos Publishing House, 2011. ISBN: 978-3-8325-2819-5. URL: https://sonification.de/handbook/.

[12] Andy Hunt and Thomas Hermann. “The Importance of Interaction in Sonification”. In: Proceedings of the 10th Meeting of the International Conference on Auditory Display. ICAD 2004. Sydney, Australia, Jan. 2004.

[13] Patrik N. Juslin and Petri Laukka. “Expression, Perception, and Induction of Musical Emotions: A Review and a Questionnaire Study of Everyday Listening”. In: Journal of New Music Research 33.3 (2004), pp. 217–238. DOI: 10.1080/0929821042000317813.

[14] D. Luebke, M. Reddy, Jonathan Cohen, Amitabh Varshney, Benjamin Watson, and R. Huebner. Level of Detail for 3D Graphics. San Francisco: Morgan Kaufmann, July 2002.

[15] Paul Lunn and Andrew Hunt. “Listening to the invisible: Sonification as a tool for astronomical discovery”. In: Proceedings of ADS-VIS 2011: Making visible the invisible: art, design and science in data visualisation. Huddersfield, UK, Mar. 2011.

[16] James F. Lynch. “Acoustics and Astronomy”. In: Acoustics Today 13.4 (2017), pp. 27–34.

[17] James McCartney. “SuperCollider, a New Real Time Synthesis Language”. In: Proceedings of the 1996 International Computer Music Conference. ICMC 1996. Hong Kong: Michigan Publishing, Aug. 1996.

[18] Michael Quinton, Iain McGregor, and David Benyon. “Sonifying the Solar System”. In: Proceedings of the 22nd International Conference on Auditory Display. ICAD 2016. Canberra, Australia, July 2016, pp. 28–35. DOI: 10.21785/icad2016.003.

[19] B. Rau, F. Frieß, M. Krone, C. Muller, and T. Ertl. “Enhancing visualization of molecular simulations using sonification”. In: 2015 IEEE 1st International Workshop on Virtual and Augmented Reality for Molecular Science. VARMS@IEEEVR. Arles, France, Mar. 2015, pp. 25–30. DOI: 10.1109/VARMS.2015.7151725.

[20] Niklas Rönnberg. “Musical sonification supports visual discrimination of color intensity”. In: Behaviour & Information Technology 38.10 (2019), pp. 1028–1037. DOI: 10.1080/0144929X.2019.1657952.

[21] Jonathan H. Schuett and Bruce N. Walker. “Measuring Comprehension in Sonification Tasks That Have Multiple Data Streams”. In: Proceedings of the 8th Audio Mostly Conference. Piteå, Sweden, Sept. 2013, 11:1–11:6. DOI: 10.1145/2544114.2544121.

[22] Hong Song and Kirsty Beilharz. “Concurrent Auditory Stream Discrimination in Auditory Graphing”. In: International Journal of Computers and Applications 1.3 (Jan. 2007), pp. 79–87.

[23] Jonathan Sterne. The Sound Studies Reader. 1st ed. New York, NY: Routledge, Aug. 2012. ISBN: 9780415771306.

[24] Brianna J. Tomlinson, Brittany E. Noah, and Bruce N. Walker. “BUZZ: An Auditory Interface User Experience Scale”. In: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. CHI EA ’18. Montreal, QC, Canada: Association for Computing Machinery, 2018, LBW096:1–LBW096:6. DOI: 10.1145/3170427.3188659.

[25] Brianna Tomlinson, R. Winters, Christopher Latina, Smruthi Bhat, Milap Rane, and Bruce Walker. “Solar System Sonification: Exploring Earth and Its Neighbors Through Sound”. In: Proceedings of the 23rd International Conference on Auditory Display. ICAD 2017. Pennsylvania, USA, June 2017, pp. 128–134. DOI: 10.21785/icad2017.027.


[26] Bruce N. Walker. “Magnitude Estimation of Conceptual Data Dimensions for Use in Sonification”. In: Journal of Experimental Psychology: Applied 8.4 (May 2002), pp. 211–221. DOI: 10.1037/1076-898X.8.4.211.

[27] Matthew Wright and Adrian Freed. “Open SoundControl: A New Protocol for Communicating with Sound Synthesizers”. In: Proceedings of the 1997 International Computer Music Conference. ICMC 1997. Thessaloniki, Greece: Michigan Publishing, Sept. 1997.

[28] Matthew Wright, Adrian Freed, and Ali Momeni. “OpenSound Control: State of the Art 2003”. In: Proceedings of the 2003 Conference on New Interfaces for Musical Expression. NIME-03. Montreal, Canada, June 2003, pp. 153–159.
