
Eyes-closed hybrid brain-computer interface employing frontal brain activation

  • Jaeyoung Shin,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Biomedical Engineering, Hanyang University, Seoul, Korea

  • Klaus-Robert Müller ,

    Contributed equally to this work with: Klaus-Robert Müller, Han-Jeong Hwang

    Roles Conceptualization, Funding acquisition, Supervision, Validation, Writing – original draft, Writing – review & editing

    klaus-robert.mueller@tu-berlin.de (KRM); h2j@kumoh.ac.kr (HJH)

    Affiliations Machine Learning Group, Berlin Institute of Technology (TU Berlin), Berlin, Germany, Department of Brain and Cognitive Engineering, Korea University, Seoul, Korea, Max Planck Institute for Informatics, Stuhlsatzenhausweg, Saarbrücken, Germany

  • Han-Jeong Hwang

    Contributed equally to this work with: Klaus-Robert Müller, Han-Jeong Hwang

    Roles Conceptualization, Funding acquisition, Investigation, Supervision, Validation, Writing – original draft, Writing – review & editing

    klaus-robert.mueller@tu-berlin.de (KRM); h2j@kumoh.ac.kr (HJH)

    Affiliation Department of Medical IT Convergence Engineering, Kumoh National Institute of Technology, Kumi, Korea

Abstract

Brain-computer interfaces (BCIs) have been studied extensively in order to establish a non-muscular communication channel, mainly for patients with impaired motor functions. However, many limitations remain for BCIs in clinical use. In this study, we propose a hybrid BCI that is based only on frontal brain areas and can be operated in an eyes-closed state for end users with impaired motor and declining visual functions. In our experiment, electroencephalography (EEG) and near-infrared spectroscopy (NIRS) were measured simultaneously while 12 participants performed mental arithmetic (MA) or remained relaxed (baseline state: BL). To evaluate the feasibility of the hybrid BCI, we classified MA-related from BL-related brain activation. We then compared the classification accuracies of the two unimodal BCIs (EEG and NIRS) and the hybrid BCI in an offline mode. The classification accuracy of the hybrid BCI (83.9 ± 10.3%) was significantly higher than those of the unimodal EEG-based (77.3 ± 15.9%) and NIRS-based (75.9 ± 6.3%) BCIs. These results confirm that the hybrid approach improves performance even when only frontal brain areas are used. Our study shows that an eyes-closed hybrid BCI based on frontal areas could be applied to neurodegenerative patients who have lost motor functions, including oculomotor functions.

Introduction

Brain-computer interfaces (BCIs) have enabled patients to control external devices directly, without the help of muscular movements [1–5]. Thus, many research groups have explored BCI technology and considerably improved the performance of BCI systems [6–12]. Various BCI paradigms based on electroencephalography (EEG) have been introduced to implement BCIs for physically challenged patients. These paradigms include motor imagery [13–15], P300 [16–18], steady-state visual evoked potential (SSVEP) [19], and others. However, these paradigms have limitations with respect to severely motor-impaired patients, such as those with late-stage amyotrophic lateral sclerosis (ALS). For example, some of these patients cannot generate reliable sensorimotor rhythms for motor-imagery-based BCIs [20–22]. Also, because BCIs based on exogenous paradigms such as conventional visual P300 and SSVEP generally require moderate oculomotor functions, these paradigms cannot be fully exploited by users with the oculomotor dysfunctions often present in late-stage ALS or completely locked-in state (CLIS) patients [23]. To overcome these constraints, previous studies introduced BCIs based on cognitive tasks instead of motor imagery tasks and validated their feasibility with both healthy subjects and ALS patients [20–22]. Also, an eyes-closed (EC) SSVEP-based BCI was recently introduced [19], which validated the feasibility of an SSVEP-based BCI under EC conditions for healthy participants and for an ALS patient with partially impaired oculomotor functions. A more recent study introduced a novel EC BCI paradigm based on visual P300 and demonstrated its effectiveness [23].

In our previous study, we first proposed an EC BCI system using a representative endogenous BCI paradigm, namely, mental arithmetic (MA), to check whether an endogenous BCI paradigm can also be used in an EC condition [24]. In [24], we used near-infrared spectroscopy (NIRS) signals from prefrontal cortex (PFC) areas; NIRS represents a promising alternative to EEG for BCI research because its sensitivity to physiological artifacts (e.g., the electrooculogram (EOG)) is limited. It has been well documented that EEG signals change significantly in an EC state (e.g., the α-rhythm). However, because PFC hemodynamic changes are unaffected by an EC condition [25], the feasibility of the EC NIRS-BCI could be successfully verified. Moreover, because the PFC lies essentially below the hair-free region of the head, we could shorten preparation time and speed up the experiment. In fact, many NIRS-BCI studies have focused on PFC hemodynamic changes, as PFC areas are free from one of the critical drawbacks of a NIRS-based BCI: signal amplitude attenuation caused by dense, long, and dark hair blocking light penetration to the scalp [26–32].

Although we successfully demonstrated the feasibility of an EC NIRS-BCI system, its classification accuracy was relatively low compared to those reported in standard EEG-BCI studies [19, 23]. One possible means of improving classification accuracy is to use a hybrid approach that combines two brain-imaging modalities (e.g., EEG and NIRS [33–36]). It has already been demonstrated that hybrid BCIs can increase the reliability of BCI systems in terms of performance [37, 38]. In particular, the performance of BCI systems can be enhanced by integrating two kinds of BCI systems or by using complementary information from brain activations measured with different modalities. As an example of the former, a hybrid EEG-NIRS BCI using the SSVEP paradigm was studied in which NIRS and EEG signals were used, respectively, to operate a brain switch and to produce the actual BCI command [39]. As an example of the latter, EEG and NIRS data were used simultaneously to decode motor imagery tasks; here, the decoding performance improved considerably compared to that obtained using unimodal data (EEG or NIRS) [33].

Although hybrid EEG-NIRS BCIs have proven useful, they may still be impractical in real clinical scenarios because most hybrid BCI systems place sensors on hair-covered regions of the scalp. For NIRS, dense hair interferes with light penetration to the scalp, which decreases the signal-to-noise ratio. For EEG, recording signals from the hair-covered scalp also creates practical concerns (attaching electrodes and washing the hair after the experiment), particularly for patients. Employing frontal areas, which include hair-free regions, as much as possible is a potential means of reducing this problem. In this case, because of the location of their respective brain sources, non-motor tasks (e.g., MA) are more appropriate than standard BCI paradigms (e.g., motor imagery). However, no study has yet investigated the feasibility of using frontal areas for a hybrid EEG-NIRS BCI approach.

Our previous research aimed to verify the feasibility of the EC NIRS-BCI based on an endogenous BCI paradigm, namely, MA [24]. This study proposes a hybrid EEG-NIRS BCI that utilizes frontal areas, including a hair-free PFC, to improve the BCI performance of our previous EC NIRS-BCI in terms of classification accuracy in an offline mode. To this end, our present study examines the performance of an EC hybrid EEG-NIRS BCI operated by MA that uses only the frontal areas for a convenient system setup.

Materials and methods

Participants

Twelve participants (five males and seven females; age 26.7 ± 3.7 years (mean ± standard deviation)) took part in the experiment, none of whom reported any previous or current mental illness. They were given detailed information about the experiment, and written consent was obtained from each. After completing the experiment, participants were financially reimbursed. The experiment was performed in compliance with the Declaration of Helsinki and was approved by the Ethics Committee of the Institute of Psychology and Ergonomics, Berlin Institute of Technology (approval number: SH_01_20150330).

Instrumentation

A BrainAmp amplifier (Brain Products GmbH, Gilching, Germany) was used to record EEG signals with a linked-mastoid reference (sampling rate: 1000 Hz) from 22 locations on a custom-made elastic cap (EASYCAP GmbH, Herrsching, Germany): AFp1, AFp2, AFF1h, AFF2h, AFF5h, AFF6h, F3, F4, F7, F8, Cz, C3, C4, T7, T8, Pz, P3, P4, P7, P8, OI1, and OI2. The ground electrode was placed on Fz. A NIRScout (NIRx GmbH, Berlin, Germany) was used to record NIRS signals (sampling rate: 12.5 Hz). Five NIR light sources and three detectors were positioned on the PFC; adjacent source-detector pairs formed nine channels near Fp1, Fp2, and Fpz. The inter-optode distance was set to 30 mm, and the NIRS optodes were placed on the same cap as the EEG electrodes. Fig 1 shows the EEG electrodes (blue and white circles) and NIRS channels (red circles); the single gray circle indicates the ground electrode. EOG was recorded with the same BrainAmp amplifier at the same sampling rate as the EEG using two vertical (above and below the left eye) and two horizontal (outer canthus of each eye) electrodes. All signals were recorded simultaneously, and trigger signals were sent to each system via a parallel port using MATLAB for data synchronization. All data used in this study are fully available without restriction at https://doi.org/10.6084/m9.figshare.5900842.v1

Fig 1. Location of EEG electrodes (blue, white, and gray circles) and NIRS channels (red circles).

Gray indicates the ground electrode. Blue and white electrodes represent frontal and parieto-occipital EEG channels, respectively. Note that only the frontal EEG channels were used with NIRS channels in data analysis to investigate the feasibility of the EEG-NIRS EC BCI employing only frontal areas.

https://doi.org/10.1371/journal.pone.0196359.g001

Experimental paradigm

All participants were seated in a comfortable armchair 1.6 m from a 50-inch white screen, and all instructions were displayed via a video projector. Fig 2 shows the experimental paradigm. Each session consisted of a pre-rest period (15 s) with a fixation cross, 20 repetitions of a single trial, and a post-rest period (15 s). In the pre- and post-rest periods, participants rested with their eyes open while looking at a fixation cross displayed in the middle of the screen. A single trial included a visual instruction (2 s) indicating the type of task, a task period (10 s), and a rest period of random length (15 to 17 s). In the instruction period, the type of task (MA or BL) was given randomly. For MA, an arbitrary three-digit number minus a one-digit number between 6 and 9 was given as the initial calculation. For BL, a fixation cross was displayed. Participants were instructed to close their eyes as soon as they recognized the type of task. During the task period, which began with a short beep (250 ms), participants continued performing the given task with their eyes closed: for MA, they repeatedly subtracted the one-digit number from the result of their previous calculation; for BL, they remained relaxed. After another short beep (250 ms), a “STOP” sign was displayed on the screen, the fixation cross reappeared, and participants relaxed with their eyes open while looking at the cross.

Fig 2. Block of the experimental paradigm.

Each session consisted of pre-rest (15 s) staring at a fixation cross, then 20 repetitions of a single trial followed by a post-rest (15 s). With respect to the instruction period, “567–8” and “+” indicate MA and BL, respectively. Note that different combinations of three- and one-digit numbers were used to prevent the participants from becoming accustomed to the problem.

https://doi.org/10.1371/journal.pone.0196359.g002
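For illustration, the session timing described above can be generated programmatically. The following Python sketch builds one randomized 20-trial schedule; the original experiment was controlled from MATLAB, and the function and field names here are hypothetical:

```python
import random

def make_session(n_trials=20, seed=0):
    """Build one session: 15 s pre-rest, n_trials trials
    (2 s instruction + 10 s task + 15-17 s rest), 15 s post-rest."""
    rng = random.Random(seed)
    tasks = ["MA", "BL"] * (n_trials // 2)
    rng.shuffle(tasks)                             # MA/BL order randomized
    schedule, t = [], 15.0                         # clock starts after pre-rest
    for task in tasks:
        if task == "MA":
            # e.g. "567 - 8": arbitrary three-digit minus one-digit (6-9)
            cue = f"{rng.randint(100, 999)} - {rng.randint(6, 9)}"
        else:
            cue = "+"                              # fixation cross for baseline
        schedule.append({"task": task, "cue": cue, "task_onset": t + 2.0})
        t += 2.0 + 10.0 + rng.uniform(15.0, 17.0)  # instruction + task + rest
    return schedule, t + 15.0                      # total length incl. post-rest

schedule, total_s = make_session(seed=1)
```

Each trial cue changes across repetitions, mirroring the paper's use of different number combinations to keep participants from becoming accustomed to the problem.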

Data analysis

Preprocessing

MATLAB R2013b (MathWorks, Natick, MA, USA) was used for data analysis. For data processing and analysis, only the frontal EEG electrodes (AFp1, AFp2, AFF1h, AFF2h, AFF5h, AFF6h, F3, F4, F7, and F8) were used together with the NIRS channels in order to investigate the feasibility of an EC hybrid BCI employing only frontal areas. The EEG signals were downsampled to 200 Hz and band-pass filtered (3rd-order Butterworth filter with a 0.5–50 Hz passband) before EOG rejection. Blind source separation-based EOG rejection was performed using the automatic artifact rejection toolbox in EEGLAB [40, 41]. For NIRS, the modified Beer-Lambert law was applied to convert light-intensity changes to concentration changes of deoxy- and oxyhemoglobin (ΔHbR and ΔHbO) [42]. ΔHbR and ΔHbO were band-pass filtered using a 6th-order zero-phase Butterworth filter with a 0.01–0.2 Hz passband. Fig 3 shows the flow of the data processing and analysis procedure.

Fig 3. Flow of EEG and NIRS data processing and analysis.

EEG data were downsampled by 5. Both EEG and NIRS data were band-pass filtered (BPF). Blind source separation (BSS) was performed to remove ocular artifacts in the EEG data. For both sets of data, feature vectors were independently constructed and shrinkage linear discriminant analysis (sLDA) was used to discriminate between specific task-related brain activations. For the hybrid approach, the classifier outputs of EEG and NIRS data were concatenated, thus creating a separate feature vector. The classifications of EEG, NIRS, and hybrid data were performed separately.

https://doi.org/10.1371/journal.pone.0196359.g003
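The downsampling and filtering steps can be sketched as follows. This is a Python/SciPy sketch, not the original MATLAB code, and the arrays are random placeholders rather than real recordings; `sosfiltfilt` filters forward and backward, matching the zero-phase filtering described for the NIRS data:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, decimate

def bandpass(x, lo, hi, fs, order):
    """Zero-phase Butterworth band-pass filter along the last axis."""
    sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=-1)

# EEG: 1000 Hz -> 200 Hz, then 0.5-50 Hz band-pass (3rd-order Butterworth)
fs_eeg = 1000
eeg = np.random.randn(10, fs_eeg * 10)         # 10 frontal channels x 10 s (placeholder)
eeg_200 = decimate(eeg, 5, axis=-1)            # downsample by factor 5 (anti-aliased)
eeg_filt = bandpass(eeg_200, 0.5, 50.0, fs=200, order=3)

# NIRS: 0.01-0.2 Hz band-pass of Hb concentration changes (6th-order, zero-phase)
fs_nirs = 12.5
dhb = np.random.randn(9, int(fs_nirs * 40))    # 9 channels x 40 s (placeholder)
dhb_filt = bandpass(dhb, 0.01, 0.2, fs=fs_nirs, order=6)
```

The modified Beer-Lambert conversion and BSS-based EOG rejection (EEGLAB) precede these filters in the actual pipeline and are omitted here.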

Feature extraction

We used EEG epochs acquired from the entire task period (i.e., 0–10 s) and NIRS epochs acquired from the last 5 s of the task period (i.e., 10–15 s), based on a preliminary analysis investigating the impact of the analysis time window on classification performance (see S1 Fig). For EEG, prior to spatial filtering, the data were band-pass filtered with a participant-specific passband. For the 0–10 s EEG data, the participant-specific passband was selected using a heuristic band selection method based on signed r²-values (sgn r²), i.e., point-biserial correlation coefficients with sign preserved [43, 44]. A common spatial pattern (CSP) filter was then applied to the filtered EEG data [45–50]. Note that the CSP filter and participant-specific passband were determined based only on the training data within the inner cross-validation loop to avoid over-fitting. Feature vectors were produced by calculating the log variances of the first and last three CSP components, selected by the ratio-of-medians score, which is more robust to outliers than the conventional eigenvalue score [47]. For the NIRS data, considering the hemodynamic delay, the mean value and average slope of ΔHbR and ΔHbO between 10 and 15 s were used to create feature vectors, as the hemodynamic change fully developed during this period [51] and showed the highest discriminative information (see the Results section for details).
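A minimal sketch of the CSP log-variance feature construction is given below, using random placeholder epochs. Note two simplifications relative to the paper: components are selected here by plain eigenvalue ordering rather than the ratio-of-medians score, and the restriction of fitting to inner-cross-validation training folds is omitted:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_a, epochs_b, n_pairs=3):
    """CSP spatial filters from two classes of band-passed EEG epochs.
    epochs_*: arrays of shape (trials, channels, samples).
    Returns the n_pairs first and last generalized eigenvectors."""
    cov = lambda e: np.mean([x @ x.T / np.trace(x @ x.T) for x in e], axis=0)
    ca, cb = cov(epochs_a), cov(epochs_b)
    # Generalized eigenvalue problem: ca w = lambda (ca + cb) w
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)
    w = vecs[:, order]
    return np.hstack([w[:, :n_pairs], w[:, -n_pairs:]])  # (channels, 2*n_pairs)

def csp_logvar(epoch, w):
    """Normalized log-variance features of the CSP-projected epoch."""
    z = w.T @ epoch
    var = z.var(axis=1)
    return np.log(var / var.sum())

rng = np.random.default_rng(0)
train_ma = rng.standard_normal((20, 10, 2000))  # 20 MA trials, 10 ch, 10 s at 200 Hz
train_bl = rng.standard_normal((20, 10, 2000))  # 20 BL trials (placeholder data)
w = csp_filters(train_ma, train_bl)
feat = csp_logvar(train_ma[0], w)               # 6-dimensional feature vector
```

For the NIRS features, the analogue would simply be the per-channel mean and least-squares slope of ΔHbR/ΔHbO over the 10–15 s window.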

Classification

Shrinkage linear discriminant analysis (sLDA) was used as the classifier [48, 52]. Classification performance was estimated by 10 × 5-fold cross-validation, and the same classifier and cross-validation scheme were applied to the EEG and NIRS data. The combined EEG and NIRS data were evaluated with a meta-classification method: the outputs of the individual classifiers (i.e., EEG, HbR, and HbO) were combined into new feature vectors for a meta-classifier [33]. The classification performance of all possible combinations of EEG and the two NIRS chromophores was examined (HbR+HbO, EEG+HbR, EEG+HbO, and EEG+HbR+HbO).
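The meta-classification scheme can be sketched with scikit-learn, whose LDA with `solver="lsqr", shrinkage="auto"` performs Ledoit-Wolf-regularized ("shrinkage") LDA. The feature matrices below are synthetic placeholders, and in the actual analysis the level-1 classifiers would be fit only on training folds of the 10 × 5-fold cross-validation:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def slda():
    # Ledoit-Wolf shrinkage of the covariance estimate ("sLDA")
    return LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 20)                           # BL = 0, MA = 1 (40 trials)
X_eeg = rng.standard_normal((40, 6)) + y[:, None]   # synthetic CSP log-variance features
X_hbr = rng.standard_normal((40, 18)) + y[:, None]  # synthetic mean/slope HbR features
X_hbo = rng.standard_normal((40, 18)) + y[:, None]  # synthetic mean/slope HbO features

# Level 1: one shrinkage-LDA per modality; collect their continuous outputs
outs = []
for X in (X_eeg, X_hbr, X_hbo):
    clf = slda().fit(X, y)                  # in practice: fit on training folds only
    outs.append(clf.decision_function(X))
meta_X = np.column_stack(outs)              # (trials, 3) meta-feature vectors

# Level 2: meta-classifier on the concatenated classifier outputs
meta = slda().fit(meta_X, y)
acc = meta.score(meta_X, y)
```

Dropping columns of `meta_X` yields the other reported combinations (HbR+HbO, EEG+HbR, EEG+HbO).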

Information transfer rate (ITR)

Among the diverse metrics for assessing the performance of a BCI system, the information transfer rate (ITR), in bits per minute, has been commonly used [53]. The ITR can be computed as [7]

ITR = m [log2 N + P log2 P + (1 − P) log2 ((1 − P)/(N − 1))],  (1)

where m is the number of trials per minute, N is the number of task types, and P is the classification accuracy (N = 2 in this study). Because the length of the rest period (15–17 s) was redundantly long for a unimodal EEG-BCI, we set the single-trial length to the length of the task period only (10 s), excluding the rest period, in order to assess the ITR fairly.
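The Wolpaw ITR formula can be implemented directly; with N = 2 and a 10 s trial length, m = 6 trials/min (a Python sketch):

```python
import math

def itr_bits_per_min(p, n_classes=2, trial_len_s=10.0):
    """Wolpaw ITR: m * [log2 N + P log2 P + (1-P) log2((1-P)/(N-1))] in bits/min."""
    m = 60.0 / trial_len_s                 # trials per minute (rest period excluded)
    if p <= 1.0 / n_classes:               # at or below chance level: no information
        return 0.0
    if p >= 1.0:                           # perfect accuracy: log2 N bits per trial
        return m * math.log2(n_classes)
    bits = (math.log2(n_classes) + p * math.log2(p)
            + (1.0 - p) * math.log2((1.0 - p) / (n_classes - 1)))
    return m * bits
```

For example, a group-mean accuracy of 83.9% gives about 2.18 bits/min; note that the mean of per-participant ITRs (as plotted in Fig 8) generally differs from the ITR of the mean accuracy.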

Experimental results

Spectral, temporal, and spatial characteristics

Fig 4 shows the grand-average time-frequency analysis results for EEG: the spectral power changes due to MA and BL, and the difference between them. The spectral power changes were averaged over the 10 frontal channels (AFp1, AFp2, AFF1h, AFF2h, AFF5h, AFF6h, F3, F4, F7, and F8). For reference, mean spectral power changes of five central (Cz, C3, C4, T7, and T8), five parietal (Pz, P3, P4, P7, and P8), and two occipital channels (POO1 and POO2) are also provided. Note that only the 10 frontal electrodes were used for classification. An increase of the natural α-rhythm power with closed eyes was observed over all areas, and the second harmonic of the natural α-rhythm was clearly observed in the occipital area. By contrast, a decrease in power resulting from closing the eyes was clearly observed in the frontal and central areas, except in the α- and low β-bands [54]. In accordance with [55], a task-related decrease in α-rhythm spectral power was also observed across all brain areas, and thus a distinct spectral power difference between MA and BL (MA−BL) was observed (see the third column of Fig 4).

Fig 4. Grand average of time-frequency analysis results for MA, BL, and the difference between MA and BL (MA−BL).

The spectral power changes were averaged over 10 frontal (AFp1, AFp2, AFF1h, AFF2h, AFF5h, AFF6h, F3, F4, F7, and F8), five central (Cz, C3, C4, T7, and T8), five parietal (Pz, P3, P4, P7, and P8), and two occipital channels (POO1 and POO2). The colorbar indicates the spectral power in dB.

https://doi.org/10.1371/journal.pone.0196359.g004

Fig 5 shows the task-related spectral power difference in terms of signed r²-values (sgn r²) for each frequency band in detail. Dark blue indicates higher separability of MA and BL. In the θ- (4–8 Hz) and low β-bands (13–20 Hz), parietal and occipital areas showed higher separability. In addition, higher separability was apparent near frontal areas in the α-band (8–13 Hz). The high β- (20–30 Hz) and γ-bands (30–50 Hz) did not present a strong and meaningful spectral power difference between MA and BL for discrimination. Considering that only frontal areas were used for classification, the highest separability appeared in the α-band, whereas moderate separability was apparent in the θ- and low β-bands. Consistent with the results shown in Figs 4 and 5, the α-band was most often included in the participant-specific passbands for classification, followed by the θ- and low β-bands (see Table 1).

Fig 5. Signed r2-values (sgn r2) in the θ- (4–8 Hz), α- (8–13 Hz), low β- (13–20 Hz), high β- (20–30 Hz), and γ-band (30–50 Hz).

The colorbar indicates the level of sgn r², ranging from −0.04 to 0 dB. Note that a lower value (dark blue) indicates better separability than a higher value (light blue). Considering that only frontal areas were used for data analysis, the highest separability appears in the α-band; moderate separability appears in the θ- and low β-bands.

https://doi.org/10.1371/journal.pone.0196359.g005

Table 1. Individual classification accuracy of EEG, NIRS, and hybrid approaches (HYB).

Note that “std” refers to standard deviation and [fL fH] indicates the most frequently selected participant-specific passbands for CSP filters estimated by the heuristic method. The p-values were corrected by the false discovery rate.

https://doi.org/10.1371/journal.pone.0196359.t001

Fig 6 shows the grand-average scalp maps of the hemodynamic responses. An HbR decrease and HbO increase were observed in the early stage (0–5 s) for both MA and BL. After 5 s, the opposite trend was observed, peaking at 10–15 s. For HbR, the activation was stronger at anterior channels than at posterior ones, after which the amplitude of activation generally decreased. Note that even though MA and BL induced similar spatial patterns of hemodynamic responses in each period, MA generally led to higher brain activation than BL.

Fig 6. Spatial distribution of hemodynamic responses at given time intervals.

(a) Δ[HbR] for MA and BL. (b) Δ[HbO] for MA and BL. The colorbar indicates the amount of change in concentration.

https://doi.org/10.1371/journal.pone.0196359.g006

Performance

The offline classification accuracies of the EEG, NIRS, and hybrid approaches, and their averages, are shown in Table 1. Frontal-channel EEG scored 77.3 ± 15.9%, and NIRS reached a slightly lower classification accuracy (75.9 ± 6.3%), but the difference was not statistically significant (Friedman test, p = 0.018; post hoc: Wilcoxon signed-rank test with false discovery rate (FDR) correction, corrected p = 0.694). The hybrid approach (HYB) yielded a significantly higher average classification accuracy (83.9 ± 10.3%) than either EEG or NIRS. Most participants (10 of 12) showed considerably improved classification accuracies with the hybrid approach (bold numbers in Table 1). The performance improvement in classification accuracy is shown in Fig 7, where blue and red circles indicate individual performances for unimodal EEG and NIRS, respectively, compared to HYB. 83.3% of participants showed classification accuracies improved by HYB (FDR-corrected p < 0.01 for both EEG vs. HYB and NIRS vs. HYB).

Fig 7. Comparison of classification accuracies of unimodal BCI (EEG or NIRS) and the HYB.

Each circle indicates the individual result. Blue and red circles refer to the performance of EEG vs. HYB and NIRS vs. HYB, respectively. Note that the circles above the diagonal line indicate that the individual classification accuracy improved by HYB. Percentage values show the ratio of the number of improved individual classification accuracies by HYB over the total number of participants.

https://doi.org/10.1371/journal.pone.0196359.g007
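The statistical procedure used above (omnibus Friedman test followed by FDR-corrected Wilcoxon signed-rank post hoc tests) can be sketched as follows. This Python/SciPy sketch uses synthetic per-participant accuracies, and the Benjamini-Hochberg helper is a hand-rolled stand-in for the correction used in the paper:

```python
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

def fdr_bh(pvals):
    """Benjamini-Hochberg adjusted p-values."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    adj = p[order] * len(p) / (np.arange(len(p)) + 1)
    adj = np.minimum.accumulate(adj[::-1])[::-1]   # enforce monotonicity
    out = np.empty_like(p)
    out[order] = np.minimum(adj, 1.0)
    return out

rng = np.random.default_rng(2)
eeg = rng.normal(77, 5, 12)          # synthetic per-participant accuracies (%)
nirs = rng.normal(76, 5, 12)
hyb = eeg + rng.normal(7, 2, 12)     # hybrid made higher, for illustration only

stat, p_friedman = friedmanchisquare(eeg, nirs, hyb)   # omnibus test
# Post hoc pairwise tests (interpreted only if the omnibus test is significant)
raw = [wilcoxon(eeg, hyb).pvalue,
       wilcoxon(nirs, hyb).pvalue,
       wilcoxon(eeg, nirs).pvalue]
corrected = fdr_bh(raw)
```

The same procedure applies to both the accuracy and the ITR comparisons reported here.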

Fig 8 shows the average ITRs across all participants. NIRS yielded a lower ITR (1.32 bits/min) than EEG (2.03 bits/min), while HYB showed a further improvement (2.53 bits/min). A significant difference was found between EEG/NIRS and HYB (Friedman test: p = 0.018; post hoc: Wilcoxon signed-rank test with FDR-corrected p < 0.01), whereas no significant difference was found between EEG and NIRS (corrected p = 0.301).

Fig 8.

Information transfer rates (ITRs) computed by using EEG (top), NIRS (middle) and HYB (bottom). Average ITRs (AVG) are shown in the corresponding subfigures. The ITRs of NIRS are obtained using the classification accuracies of HbR+HbO. Dotted lines are theoretical ITRs according to classification accuracy. ‘+’ symbols indicate the individual ITRs.

https://doi.org/10.1371/journal.pone.0196359.g008

Discussion

In our previous EC NIRS-based BCI study [24], we confirmed that the performance of an EC NIRS-based BCI (75.6 ± 7.3%) was comparable to that of an eyes-open (EO) NIRS-based BCI (77.0 ± 9.2%), thus demonstrating for the first time the feasibility of an EC NIRS-based BCI. In the present study, we implemented a hybrid EEG-NIRS BCI using only the frontal brain activations generated by MA and BL in an EC state, in order to improve the performance of the EC NIRS-based BCI. As expected, our offline analysis confirmed that the classification accuracy of the EC NIRS-based BCI implemented here was similar to the previous one (76.4 ± 6.3%), even though the participants differed between the two studies. With the hybrid EEG-NIRS BCI, we achieved significantly improved classification accuracy (83.9 ± 10.3%) compared to that obtained using only NIRS (75.9 ± 6.3%) or EEG (77.3 ± 15.9%). It is noteworthy that this performance improvement was accomplished using only a few frontal EEG electrodes rather than electrodes attached over the entire scalp.

For EEG, the α-rhythm induced by the EC state appeared over the whole scalp (Fig 4). Nevertheless, EEG classification accuracy was not considerably affected by the natural α-rhythm: the decrease in α-power during the MA task was still detectable when the eyes were closed, as shown in Fig 4. Based on this neurophysiological phenomenon, despite the presence of very strong α-rhythms, we were able to achieve reasonable classification accuracy using only the frontal EEG, comparable to the 75.9% classification accuracy of a previous study that used EEG electrodes distributed over the entire scalp [44].

As CLIS patients are generally bed-ridden and artificially ventilated through a tracheostoma, the sensors used to capture brain activity in an experiment must be attached carefully. In fact, a recent EEG-based BCI study performed with ALS patients, in which EEG electrodes were attached around occipital areas to measure SSVEPs, reported the same difficulty [56]. Accordingly, our hybrid BCI employing only frontal brain areas is expected to provide a safer and more convenient means of communicating with paralyzed patients, especially those with impaired oculomotor functions.

In most BCI studies, BCI systems have been designed and assessed in an eyes-open state, while only a few recent studies have verified the feasibility of EC BCI systems [23, 24, 57]. Our results add another piece of evidence for the feasibility of EC BCIs. However, BCIs that are independent of the state of the eyes should ultimately be developed for end users such as late-stage ALS patients, because they are frequently unable to voluntarily control their eyelids (e.g., opening and closing the eyes). Toward such an eye-state-independent BCI, one study investigated the effect of the state of the eyes on BCI performance and showed that closing the eyes during a cognitive task decreases BCI performance compared to the eyes-open condition for amplitude-modulation features, but not for frequency-modulation features [58]. This result can be exploited when developing a hybrid BCI system that is fully independent of the state of the eyes in future studies.

Recently, one BCI study was the first to apply simultaneous recordings of EEG and NIRS to ALS and CLIS patients; however, the modalities were employed independently and not fused for data analysis [59]. In addition, the locations of the EEG electrodes and NIRS optodes differed somewhat from the brain region used in our study, as the fronto-central region was used in [59]. Thus, further studies with patients having neurodegenerative diseases must be conducted to thoroughly address the clinical benefits of hybrid EEG-NIRS BCIs employing only frontal brain areas.

Conclusion

In this study, we proposed an EC hybrid BCI that combines EEG and NIRS in order to improve on the performance of single-modality EC BCIs. Specifically, only frontal brain areas were used to discriminate MA-related brain activation from BL-related activation. We achieved a promising classification accuracy with our hybrid BCI under the EC condition. Our results provide evidence that a hybrid EEG-NIRS BCI can be implemented with only frontal areas and closed eyes, and it may prove useful in future studies on end users with oculomotor dysfunctions.

Supporting information

S1 Fig. Impact of analysis time window on classification performance.

EEG, NIRS, and HYB classification accuracies calculated using various time windows. The tables below the figures denote the time periods of the EEG and NIRS data used for calculating the corresponding classification accuracies. In the upper panel, the EEG time window is fixed and the NIRS time window varies; in the lower panel, the EEG time window varies and the NIRS time window is fixed.

https://doi.org/10.1371/journal.pone.0196359.s001

(DOCX)

Acknowledgments

This work was supported by an Institute for Information & Communications Technology Promotion (IITP) grant funded by the Korea government (No. 2017-0-00451) and by a National Research Foundation of Korea (NRF) grant funded by the Korea government (Ministry of Science, ICT & Future Planning) (No. 2017R1C1B5017909). Correspondence to KRM and HJH.

References

  1. 1. Birbaumer N. Breaking the silence: Brain-computer interfaces (BCI) for communication and motor control. Psychophysiology. 2006;43(6):517–32. pmid:17076808
  2. 2. Ramos-Murguialday A, Broetz D, Rea M, Laeer L, Yilmaz O, Brasil FL, et al. Brain-machine interface in chronic stroke rehabilitation: a controlled study. Ann Neurol. 2013;74(1):100–8. pmid:23494615
  3. 3. Müller-Putz G, Leeb R, Tangermann M, Höhne J, Kübler A, Cincotti F, et al. Towards noninvasive hybrid brain–computer interfaces: framework, practice, clinical application, and beyond. Proc IEEE. 2015;103(6):926–43.
  4. 4. Höhne J, Holz E, Staiger-Salzer P, Muller KR, Kubler A, Tangermann M. Motor imagery for severely motor-impaired patients: evidence for brain-computer interfacing as superior control solution. PLoS ONE. 2014;9(8):e104854. pmid:25162231
  5. Neuper C, Müller GR, Kübler A, Birbaumer N, Pfurtscheller G. Clinical application of an EEG-based brain–computer interface: a case study in a patient with severe motor impairment. Clin Neurophysiol. 2003;114(3):399–409. pmid:12705420
  6. Dornhege G, Millán JR, Hinterberger T, McFarland D, Müller K-R. Toward brain-computer interfacing. Cambridge, MA: MIT Press; 2007.
  7. Wolpaw JR, Birbaumer N, Heetderks WJ, McFarland DJ, Peckham PH, Schalk G, et al. Brain-computer interface technology: a review of the first international BCI meeting. IEEE Trans Rehabil Eng. 2000;8(2):164–73. pmid:10896178
  8. Wolpaw JR, Birbaumer N, McFarland DJ, Pfurtscheller G, Vaughan TM. Brain-computer interfaces for communication and control. Clin Neurophysiol. 2002;113(6):767–91. pmid:12048038
  9. Wolpaw JR, Wolpaw EW. Brain-computer interfaces: principles and practice. New York: Oxford University Press; 2012.
  10. Vidaurre C, Sannelli C, Müller K-R, Blankertz B. Machine-learning-based coadaptive calibration for brain-computer interfaces. Neural Comput. 2011;23(3):791–816. pmid:21162666
  11. Vidaurre C, Sannelli C, Müller K-R, Blankertz B. Co-adaptive calibration to improve BCI efficiency. J Neural Eng. 2011;8(2):025009. pmid:21436515
  12. Vidaurre C, Blankertz B. Towards a cure for BCI illiteracy. Brain Topogr. 2010;23(2):194–8. pmid:19946737
  13. Guger C, Edlinger G, Harkam W, Niedermayer I, Pfurtscheller G. How many people are able to operate an EEG-based brain-computer interface (BCI)? IEEE Trans Neural Syst Rehabil Eng. 2003;11(2):145–7. pmid:12899258
  14. Neuper C, Müller-Putz GR, Scherer R, Pfurtscheller G. Motor imagery and EEG-based control of spelling devices and neuroprostheses. Prog Brain Res. 2006;159:393–409. pmid:17071244
  15. Kübler A, Nijboer F, Mellinger J, Vaughan TM, Pawelzik H, Schalk G, et al. Patients with ALS can use sensorimotor rhythms to operate a brain-computer interface. Neurology. 2005;64(10):1775–7. pmid:15911809
  16. Sellers EW, Donchin E. A P300-based brain-computer interface: initial tests by ALS patients. Clin Neurophysiol. 2006;117(3):538–48. pmid:16461003
  17. Hoffmann U, Vesin J-M, Ebrahimi T, Diserens K. An efficient P300-based brain-computer interface for disabled subjects. J Neurosci Meth. 2008;167(1):115–25.
  18. Kübler A, Furdea A, Halder S, Hammer EM, Nijboer F, Kotchoubey B. A brain-computer interface controlled auditory event-related potential (P300) spelling system for locked-in patients. In: Schiff ND, Laureys S, editors. Disorders of Consciousness. Annals of the New York Academy of Sciences. 1157. Oxford: Blackwell Publishing; 2009. p. 90–100.
  19. Lim J-H, Hwang H-J, Han C-H, Jung K-Y, Im C-H. Classification of binary intentions for individuals with impaired oculomotor function: ‘eyes-closed’ SSVEP-based brain–computer interface (BCI). J Neural Eng. 2013;10(2):026021. pmid:23528484
  20. Vansteensel MJ, Hermes D, Aarnoutse EJ, Bleichner MG, Schalk G, van Rijen PC, et al. Brain-computer interfacing based on cognitive control. Ann Neurol. 2010;67(6):809–16. pmid:20517943
  21. Hohmann MR, Fomina T, Jayaram V, Widmann N, Förster C, Just J, et al. A cognitive brain–computer interface for patients with amyotrophic lateral sclerosis. In: Coyle D, editor. Progress in Brain Research. 228: Elsevier; 2016. p. 221–39. https://doi.org/10.1016/bs.pbr.2016.04.022
  22. Fomina T, Lohmann G, Erb M, Ethofer T, Schölkopf B, Grosse-Wentrup M. Self-regulation of brain rhythms in the precuneus: a novel BCI paradigm for patients with ALS. J Neural Eng. 2016;13(6).
  23. Hwang H-J, Ferreria VY, Ulrich D, Kilic T, Chatziliadis X, Blankertz B, et al. A gaze independent brain-computer interface based on visual stimulation through closed eyelids. Sci Rep. 2015;5(1):15890.
  24. Shin J, Müller K-R, Hwang H-J. Near-infrared spectroscopy (NIRS) based eyes-closed brain-computer interface (BCI) using prefrontal cortex activation due to mental arithmetic. Sci Rep. 2016;6(1):36203.
  25. Goldman RI, Stern JM, Engel J, Cohen MS. Simultaneous EEG and fMRI of the alpha rhythm. Neuroreport. 2002;13(18):2487–92. pmid:12499854
  26. Herrmann MJ, Ehlis AC, Fallgatter AJ. Prefrontal activation through task requirements of emotional induction measured with NIRS. Biol Psychol. 2003;64(3):255–63. pmid:14630406
  27. Power SD, Falk TH, Chau T. Classification of prefrontal activity due to mental arithmetic and music imagery using hidden Markov models and frequency domain near-infrared spectroscopy. J Neural Eng. 2010;7(2):026002.
  28. Power SD, Kushki A, Chau T. Towards a system-paced near-infrared spectroscopy brain–computer interface: differentiating prefrontal activity due to mental arithmetic and mental singing from the no-control state. J Neural Eng. 2011;8(6):066004. pmid:21975364
  29. Power SD, Kushki A, Chau T. Toward a system-paced NIRS-BCI: differentiating prefrontal activity due to mental arithmetic and music imagery from the no-control state. J Neural Eng. 2011;8(6):066004. pmid:21975364
  30. Moghimi S, Kushki A, Power S, Guerguerian AM, Chau T. Automatic detection of a prefrontal cortical response to emotionally rated music using multi-channel near-infrared spectroscopy. J Neural Eng. 2012;9(2):026022. pmid:22419117
  31. Herff C, Heger D, Fortmann O, Hennrich J, Putze F, Schultz T. Mental workload during n-back task—quantified in the prefrontal cortex using fNIRS. Front Hum Neurosci. 2013;7(1):935.
  32. Hong K-S, Naseer N, Kim Y-H. Classification of prefrontal and motor cortex signals for three-class fNIRS-BCI. Neurosci Lett. 2015;587(1):87–92.
  33. Fazli S, Mehnert J, Steinbrink J, Curio G, Villringer A, Müller K-R, et al. Enhanced performance by a hybrid NIRS-EEG brain computer interface. Neuroimage. 2012;59(1):519–29. pmid:21840399
  34. Fazli S, Dähne S, Samek W, Bießmann F, Müller K-R. Learning from more than one data source: data fusion techniques for sensorimotor rhythm-based brain–computer interfaces. Proc IEEE. 2015;103(6):891–906.
  35. Dähne S, Bießmann F, Samek W, Haufe S, Goltz D, Gundlach C, et al. Multivariate machine learning methods for fusing multimodal functional neuroimaging data. Proc IEEE. 2015;103(9):1507–30.
  36. von Lühmann A, Wabnitz H, Sander T, Müller K-R. M3BA: a mobile, modular, multimodal biosignal acquisition architecture for miniaturized EEG-NIRS based hybrid BCI and monitoring. IEEE Trans Biomed Eng. 2017;64(6):1199–210. pmid:28113241
  37. Delorme A, Sejnowski T, Makeig S. Enhanced detection of artifacts in EEG data using higher-order statistics and independent component analysis. Neuroimage. 2007;34(4):1443–9. pmid:17188898
  38. Shin J, von Lühmann A, Kim D-W, Mehnert J, Hwang H-J, Müller K-R. Simultaneous acquisition of EEG and NIRS during cognitive tasks for an open access dataset. Sci Data. 2018, in press. pmid:29437166
  39. Tomita Y, Vialatte FB, Dreyfus G, Mitsukura Y, Bakardjian H, Cichocki A. Bimodal BCI using simultaneously NIRS and EEG. IEEE Trans Biomed Eng. 2014;61(4):1274–84. pmid:24658251
  40. Gomez-Herrero G, De Clercq W, Anwar H, Kara O, Egiazarian K, Van Huffel S, et al., editors. Automatic removal of ocular artifacts in the EEG without an EOG reference channel. 7th Nordic Signal Processing Symposium; 2006 Jun. 7–9; Reykjavik, Iceland.
  41. Tichavsky P, Yeredor A, Nielsen J, editors. A fast approximate joint diagonalization algorithm using a criterion with a block diagonal weight matrix. 2008 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP); 2008 Mar. 31–Apr. 4; Las Vegas, USA.
  42. Kocsis L, Herman P, Eke A. The modified Beer–Lambert law revisited. Phys Med Biol. 2006;51(5):N91. pmid:16481677
  43. Blankertz B, Lemm S, Treder M, Haufe S, Müller K-R. Single-trial analysis and classification of ERP components—a tutorial. Neuroimage. 2011;56(2):814–25. pmid:20600976
  44. Shin J, von Lühmann A, Blankertz B, Kim D-W, Jeong J, Hwang H-J, et al. Open access dataset for EEG+NIRS single-trial classification. IEEE Trans Neural Syst Rehabil Eng. 2017;25(10):1735–45.
  45. Blankertz B, Kawanabe M, Tomioka R, Hohlefeld F, Müller K-R, Nikulin VV, editors. Invariant common spatial patterns: alleviating nonstationarities in brain-computer interfacing. Advances in Neural Information Processing Systems (NIPS); 2007 Dec. 3–6; Vancouver and Whistler, Canada. Cambridge, MA: MIT Press; 2008.
  46. Koles ZJ, Soong ACK. EEG source localization: implementing the spatio-temporal decomposition approach. Electroencephalogr Clin Neurophysiol. 1998;107(5):343–52. pmid:9872437
  47. Blankertz B, Tomioka R, Lemm S, Kawanabe M, Müller K-R. Optimizing spatial filters for robust EEG single-trial analysis. IEEE Signal Process Mag. 2008;25(1):41–56.
  48. BBCI toolbox [Internet]. Available from: https://github.com/bbci/bbci_public/.
  49. Blankertz B, Acqualagna L, Dähne S, Haufe S, Kraft MS, Sturm I, et al. The Berlin brain-computer interface: progress beyond communication and control. Front Neurosci. 2016;10(1):530.
  50. Blankertz B, Tangermann M, Vidaurre C, Fazli S, Sannelli C, Haufe S, et al. The Berlin brain–computer interface: non-medical uses of BCI technology. Front Neurosci. 2010;4(1):00198.
  51. Naseer N, Hong K-S. fNIRS-based brain-computer interfaces: a review. Front Hum Neurosci. 2015;9(1):00003.
  52. Friedman JH. Regularized discriminant analysis. J Am Stat Assoc. 1989;84(405):165–75.
  53. Schudlo LC, Chau T. Towards a ternary NIRS-BCI: single-trial classification of verbal fluency task, Stroop task and unconstrained rest. J Neural Eng. 2015;12(6):066008. pmid:26447770
  54. Freeman WJ, Holmes MD, Burke BC, Vanhatalo S. Spatial spectra of scalp EEG and EMG from awake humans. Clin Neurophysiol. 2003;114(6):1053–68. pmid:12804674
  55. Friedrich EVC, Scherer R, Neuper C. Stability of event-related (de-)synchronization during brain–computer interface-relevant mental tasks. Clin Neurophysiol. 2013;124(1):61–9. pmid:22749465
  56. Hwang H-J, Han C-H, Lim J-H, Kim Y-W, Choi S-I, An K-O, et al. Clinical feasibility of brain-computer interface based on steady-state visual evoked potential in patients with locked-in syndrome: case studies. Psychophysiology. 2017;54(3):444–51. pmid:27914171
  57. Lim J-H, Hwang H-J, Han C-H, Jung K-Y, Im C-H. Classification of binary intentions for individuals with impaired oculomotor function: ‘eyes-closed’ SSVEP-based brain-computer interface (BCI). J Neural Eng. 2013;10(2):039501.
  58. Görner M, Schölkopf B, Grosse-Wentrup M, editors. Closing one’s eyes affects amplitude modulation but not frequency modulation in a cognitive BCI. 7th Graz BCI Conference; 2017 Sep. 18–22; Graz, Austria.
  59. Chaudhary U, Xia B, Silvoni S, Cohen LG, Birbaumer N. Brain–computer interface–based communication in the completely locked-in state. PLoS Biol. 2017;15(1):e1002593. pmid:28141803