Admittedly, most EEG headsets come with their own BCI developer tools that can stream real-time data to a computer: the Emotiv EPOC headset, for example, has its own companion application, and the Muse headband has MuseLab. However, the disadvantage of these SDKs is that they give us very little freedom to combine their data with functions written in a common programming language such as Python or Java, so the data are not very useful for many tasks. At the same time, since EEG technology is currently applied mainly on personal computers, building a BCI between the headband and a microcontroller would be an interesting new approach for many engineering and medical research projects. Although it is not realistic for us to design a microcomputer ourselves, we can imagine using an available single-board computer such as the Raspberry Pi to minimize the size of the whole system. Therefore, we believe it is worthwhile to write our own code to set up the communication between the Muse headband and the computer.
In fact, Zachary Bluestein, a student who is also working on a project under the direction of Prof. ShiNung Ching, plans to use our code in his study, which is designed to evaluate multi-modal perception and integration. His experiment requires the EEG signals to be synchronized with sound sources, so we must set up a universal timer. Zac's experiment design is shown in Appendix A at the end of this report.
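The idea of a universal timer can be sketched as follows. This is only an illustrative Python example, not the actual project code: the function and variable names (`timestamp`, `log_eeg_sample`, `log_sound_event`) are hypothetical, and the EEG samples are placeholder values. The key point is that both the EEG stream and the sound events are tagged with the same monotonic reference clock, so the two streams can be merged on a common time axis afterwards.

```python
import time

# Hypothetical sketch of a shared "universal timer": one monotonic
# reference clock used by both the EEG recording and the sound playback,
# so events from the two streams can be aligned afterwards.
T0 = time.monotonic()  # clock started once, at experiment launch

def timestamp() -> float:
    """Seconds elapsed since the shared experiment start time."""
    return time.monotonic() - T0

eeg_log = []    # (t, sample) pairs from the headband
sound_log = []  # (t, event) pairs from the sound source

def log_eeg_sample(sample):
    eeg_log.append((timestamp(), sample))

def log_sound_event(event):
    sound_log.append((timestamp(), event))

# Example usage with placeholder data:
log_sound_event("tone_onset")
log_eeg_sample([12.5, -3.1, 0.8, 7.2])  # fake channel values

# Later, both streams can be merged on the common time axis.
merged = sorted(eeg_log + sound_log, key=lambda pair: pair[0])
```

Using `time.monotonic()` rather than wall-clock time avoids jumps if the system clock is adjusted during a recording session.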