Robust detection of event-related potentials in a user-voluntary short-term imagery task

Authors: Min-Ho Lee aff001;  John Williamson aff001;  Young-Jin Kee aff001;  Siamac Fazli aff002;  Seong-Whan Lee aff001
Authors' affiliations: Department of Brain and Cognitive Engineering, Korea University, Seoul, Korea aff001;  Department of Computer Science, Nazarbayev University, Nur-Sultan, Kazakhstan aff002
Published in: PLoS ONE 14(12)
Category: Research Article
doi: 10.1371/journal.pone.0226236


Event-related potentials (ERPs) represent neuronal activity in the brain elicited by external visual or auditory stimulation and are widely used in brain-computer interface (BCI) systems. ERP responses are elicited a few hundred milliseconds after attending to an oddball stimulus; target and non-target stimuli are repeatedly flashed, and the ERP trials are averaged over time to improve decoding accuracy. To shorten this time-consuming process, previous studies have attempted to evoke stronger ERP responses by changing experimental parameters such as color, size, or the use of a face image as the target symbol. Since these exogenous potentials can be evoked naturally by merely looking at a target symbol, a BCI system could generate unintended commands while the subject gazes at one of the symbols in a non-intentional mental state. We approached this problem of unintended command generation by assuming that greater user effort in a short-term imagery task would evoke a more discriminative ERP response. Three tasks were defined: passive attention, counting, and pitch-imagery. Users were instructed to passively attend to a target symbol, to mentally tally the number of target presentations, or to perform the novel task of imagining a high-pitch tone whenever the target symbol was highlighted. The decoding accuracies were 71.4%, 83.5%, and 89.2% for passive attention, counting, and pitch-imagery, respectively, after the fourth averaging procedure. We found stronger deflections in the N500 component corresponding to the level of mental effort (passive attention: -1.094 ±0.88 μV, counting: -2.226 ±0.97 μV, pitch-imagery: -2.883 ±0.74 μV), which strongly influenced the decoding accuracy.
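The trial-averaging step described above can be illustrated with a small simulation: averaging repeated, stimulus-locked trials suppresses zero-mean noise while preserving the evoked deflection, which is why repeated flashes improve decoding accuracy. The waveform shape, noise level, and trial count below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-trial ERP: a fixed deflection buried in independent noise.
n_trials, n_samples = 16, 200
signal = np.sin(np.linspace(0, np.pi, n_samples))            # idealized ERP shape
trials = signal + rng.normal(0, 2.0, (n_trials, n_samples))  # noisy single trials

def snr(avg):
    """Crude SNR estimate: signal power over residual noise power."""
    noise = avg - signal
    return signal.var() / noise.var()

# Averaging k trials reduces the noise variance roughly by a factor of k.
snr_1 = snr(trials[:1].mean(axis=0))   # single trial
snr_16 = snr(trials.mean(axis=0))      # average of 16 trials
assert snr_16 > snr_1
```

The same principle underlies the "fourth averaging procedure" mentioned above: each additional averaged repetition trades time for a cleaner ERP estimate.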
In addition, binary classification between the passive attention and pitch-imagery tasks reached 73.5%, an adequate rate that motivated us to propose a two-stage classification strategy wherein the target symbol is estimated in the first stage and the passive or active mental state is decoded in the second stage. In this study, we found that the ERP response and the decoding accuracy are strongly influenced by the user's voluntary mental task. This could benefit practical ERP systems in two respects. First, user-voluntary tasks can easily be utilized in many different types of BCI systems, and the performance enhancement is less dependent on manipulating the system's external visual stimulus parameters. Second, we propose an ERP system that classifies the brain state as intended or unintended by exploiting the measurable EEG differences between passively gazing and actively performing the pitch-imagery task, thus minimizing unintended commands to the BCI system.
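The two-stage strategy can be sketched as follows. The function name, score conventions, and threshold here are hypothetical illustrations of the idea, not the authors' implementation: stage one picks the symbol with the strongest averaged ERP classifier score, and stage two gates the command on a passive-vs-imagery binary classifier.

```python
# Hypothetical two-stage decoder (illustrative sketch, not the study's code).
def two_stage_decode(symbol_scores, engagement_score, threshold=0.0):
    """symbol_scores: dict mapping each symbol to its averaged ERP classifier
    output (higher = more target-like). engagement_score: output of a
    passive-vs-pitch-imagery binary classifier (positive = active imagery).
    Returns the decoded symbol, or None when the state is judged passive."""
    target = max(symbol_scores, key=symbol_scores.get)   # stage 1: target symbol
    if engagement_score <= threshold:                    # stage 2: intent gate
        return None                                      # reject: passive gaze
    return target

# Usage: 'B' has the strongest ERP response and the engagement classifier
# reports active imagery, so the command is issued.
print(two_stage_decode({'A': -0.4, 'B': 1.2, 'C': 0.1}, engagement_score=0.8))
# prints "B"
```

Gating on the second stage is what suppresses the unintended commands that a purely exogenous ERP speller would emit whenever the user happens to gaze at a symbol.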

Keywords:

Attention – Data acquisition – Electroencephalography – Event-related potentials – Evoked potentials – Man-computer interface – Microwave radiation – Vision



The article appeared in this journal: 2019, Issue 12