
The Echobot: An automated system for stimulus presentation in studies of human echolocation


Authors: Carlos Tirado aff001;  Peter Lundén aff001;  Mats E. Nilsson aff001
Author affiliations: Gösta Ekman Laboratory, Department of Psychology, Stockholm University, Stockholm, Sweden aff001;  Research Institute of Sweden, Borås, Sweden aff002
Published in: PLoS ONE 14(10)
Category: Research Article
DOI: https://doi.org/10.1371/journal.pone.0223327

Abstract

Echolocation is the detection and localization of objects by listening to the sounds they reflect. Early studies of human echolocation used real objects that the experimenter positioned manually before each trial. The advantage of this procedure is the use of realistic stimuli; the disadvantage is that manually shifting stimuli between trials is very time-consuming, making it difficult to use psychophysical methods based on the presentation of hundreds of stimuli. The present study tested a new automated system for stimulus presentation, the Echobot, that overcomes this disadvantage. We tested 15 sighted participants with no prior experience of echolocation on their ability to detect the reflection of a loudspeaker-generated click from a 50 cm circular aluminum disk. The results showed that most participants were able to detect the sound reflections. Performance varied considerably, however, with mean individual detection thresholds ranging from 1 to 3.2 m from the disk. Three participants from the loudspeaker experiment were also tested using self-generated vocalizations. One participant performed better and one much worse with vocalizations than with the loudspeaker, illustrating that performance in echolocation experiments using vocalizations measures not only the ability to detect sound reflections but also the ability to produce efficient echolocation signals. Overall, the present experiments show that the Echobot may be a useful tool in research on human echolocation.
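The threshold-based design summarized above ties detection performance to the listener's distance from the reflecting disk, which is the kind of task typically run with an adaptive psychophysical procedure over many trials. As an illustration only, the sketch below implements one common adaptive yes-no procedure (Kaernbach's single-interval adjustment matrix, SIAM, at target performance 0.5) with the distance to the disk as the adaptive variable; the step size, starting distance, trial count, and the present_trial callback are hypothetical and not taken from the study.

```python
import random

# Sketch of a SIAM yes-no staircase for estimating an echo-detection
# threshold. The adaptive variable is the listener-to-disk distance in
# metres (larger distance = weaker reflection = harder trial).
# All numeric values below are hypothetical, not the study's settings.

STEP_M = 0.2             # step size in metres (hypothetical)
START_DISTANCE_M = 1.0   # starting distance (hypothetical)
N_TRIALS = 60            # trials per adaptive track (hypothetical)

# SIAM adjustment matrix for target performance t = 0.5:
# hit -> 1 step harder, miss -> 1 step easier,
# false alarm -> 2 steps easier, correct rejection -> no change.
ADJUST = {
    ("signal", "yes"): +1,   # hit
    ("signal", "no"):  -1,   # miss
    ("noise",  "yes"): -2,   # false alarm
    ("noise",  "no"):   0,   # correct rejection
}

def run_siam(present_trial, n_trials=N_TRIALS):
    """Run one adaptive track and return a threshold estimate.

    present_trial(distance_m, trial_type) must present either a
    reflector trial ("signal") or a no-reflector trial ("noise")
    at the given distance and return the response "yes" or "no".
    """
    distance = START_DISTANCE_M
    track = []
    for _ in range(n_trials):
        trial_type = random.choice(["signal", "noise"])
        response = present_trial(distance, trial_type)
        distance = max(0.0, distance + ADJUST[(trial_type, response)] * STEP_M)
        track.append(distance)
    # Crude threshold estimate: mean distance over the second half of
    # the track (averaging across reversals would be more standard).
    half = track[len(track) // 2:]
    return sum(half) / len(half)
```

In an actual session, the present_trial callback would correspond to the automated system positioning (or withholding) the disk at the requested distance, playing the click, and recording the participant's yes/no response; this kind of automation is precisely what makes running hundreds of adaptive trials feasible.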

Keywords:

Bioacoustics – Echoes – Hearing – Psychophysics – Reflection – Loudspeakers – Echolocation – Vocalization



