
Brain-Computer Interfaces (BCIs)
Check out my project that combines neuroengineering, computer science, drones, and virtual reality.
South Korea: BCI-Controlled Drones
Importance: The application of Brain-Computer Interfaces (BCIs) to drone control, as demonstrated here, represents a significant advance with implications for both technology and neuroscience. BCIs bridge the gap between human cognition and external devices, enabling individuals to control technology with their thoughts. This is relevant across many domains, including healthcare, where BCIs hold promise for individuals with severe physical disabilities, facilitating communication and improving quality of life (Lebedev & Nicolelis, 2006). Moreover, the fusion of machine learning algorithms with EEG data underscores the importance of robust training protocols in enabling BCIs to interpret complex neural signals accurately. These developments also extend to robotics, where they make human-robot interaction more intuitive and responsive (Wolpaw et al., 2002). More broadly, such advances exemplify the continuing evolution of neurotechnology toward ever-richer interfaces between the human brain and our increasingly complex technological environment.
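To make the "training protocol" idea above concrete, here is a toy sketch of how a motor-imagery classifier can be trained on labelled EEG trials. Everything here is an assumption for illustration: the data is synthetic (imagery trials get extra 8-12 Hz "mu-band" power), and real pipelines use proper filtering and methods like CSP + LDA rather than this nearest-centroid toy.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 250          # sampling rate in Hz (typical for consumer EEG headsets)
N_CHANNELS = 8    # electrode count (illustrative)
EPOCH_S = 2       # seconds per trial

def synthetic_epoch(label):
    """Fake EEG trial: 'imagery' trials get extra 10 Hz (mu-band) power."""
    t = np.arange(FS * EPOCH_S) / FS
    trial = rng.normal(0, 1, (N_CHANNELS, t.size))
    if label == "imagery":
        trial += 0.8 * np.sin(2 * np.pi * 10 * t)  # simulated mu-rhythm change
    return trial

def mu_bandpower(epoch):
    """Average 8-12 Hz power per channel, via FFT."""
    freqs = np.fft.rfftfreq(epoch.shape[1], 1 / FS)
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2
    band = (freqs >= 8) & (freqs <= 12)
    return psd[:, band].mean(axis=1)

# Collect labelled training trials, as in a cued training session.
features = {lbl: np.array([mu_bandpower(synthetic_epoch(lbl)) for _ in range(40)])
            for lbl in ("rest", "imagery")}
centroids = {lbl: feats.mean(axis=0) for lbl, feats in features.items()}

def classify(epoch):
    """Predict the label whose training centroid is nearest in feature space."""
    feats = mu_bandpower(epoch)
    return min(centroids, key=lambda lbl: np.linalg.norm(feats - centroids[lbl]))

print(classify(synthetic_epoch("imagery")))  # should print "imagery"
```

The design point is the same one real BCI training sessions rely on: cued, labelled repetitions of each mental state give the system a statistical signature for each class, which it can then match against new, unlabelled brain activity.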
Example of BCI Motor Imagery training session:

Presenting my project in South Korea!!

Kinesis: A Motor Imagery BCI Method
Importance: This method is practically useful for potential medical applications. For example, it could ease communication for people with locked-in syndrome, allowing them to write basic messages using only their thoughts. It can also benefit healthy populations: astronauts, for instance, operate under unusual conditions and could benefit from brain-computer interface (BCI) assistance (Rossini et al., 2009). And of course, you could also send a text message to your partner while driving without touching your phone (you can already do that with Siri, but this is just slightly more impressive haha).
Limitations: This exact prototype only lets me communicate basic messages that I predefined. For example, I trained it to detect when I'm imagining myself doing jumping jacks; when it does, it writes: "I want to go to the gym." However, there's a lot of room for improvement. A more sophisticated prototype would map trained thoughts to all of the keystrokes, allowing users to communicate things that are not predefined. In other words, users would be able to form their own sentences just by thinking about the letters. That is still not a very efficient approach, but maybe if we work together we can find a more effective one!
I'm currently developing a brain-controlled virtual reality experience. I will keep updating this page as I make significant progress. In the meantime, you can find my earlier reflections on this project below.
Project Proposal - July 2022
I know it sounds like science fiction, or even pseudoscience. However, check out my project proposal (up to the 5-minute mark). The project has since been approved, and I have purchased the equipment needed to make it possible. I will keep updating this paragraph as I work on the project. For now, enjoy hearing about the idea!
Summary: each time you move your hands, specific regions of your brain activate and send the signals that actually produce the movement. This activity can be measured with electrodes using electroencephalography (EEG), which detects the brain's weak electrical signals on the surface of the scalp. Based on the detected signals, the brain-computer interface interprets the activity and makes predictions: just as you are about to move your hand, the device senses the corresponding brain activity and decodes it. My project is about mapping those neural signals to actions we can implement in software, for example controlling your laptop without touching a keyboard, mouse, or joystick, moving the cursor just with your thoughts. I plan to do this in a virtual environment using a VR headset. I know it sounds like a story out of a science fiction book, but I will be updating this paragraph throughout my project and keep you posted!
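The "predictions to cursor" step described above can be sketched as a simple control loop: each classified EEG epoch produces a label, and the label maps to a cursor displacement. The class names and step size below are assumptions for illustration, not the project's actual mapping:

```python
STEP = 5  # cursor units moved per classified epoch (illustrative)

# Hypothetical imagery classes -> (dx, dy) cursor displacement
DIRECTIONS = {
    "imagine_left":  (-STEP, 0),
    "imagine_right": (STEP, 0),
    "imagine_feet":  (0, -STEP),
    "rest":          (0, 0),   # idle state: cursor holds still
}

def update_cursor(pos, prediction):
    """Apply one classified epoch's displacement; unknown labels do nothing."""
    dx, dy = DIRECTIONS.get(prediction, (0, 0))
    return (pos[0] + dx, pos[1] + dy)

# Feed a stream of predictions, one per EEG epoch.
pos = (0, 0)
for p in ["imagine_right", "imagine_right", "imagine_feet", "rest"]:
    pos = update_cursor(pos, p)
print(pos)  # (10, -5)
```

In a VR setting the same loop would update a 3D pointer or gaze target instead of a 2D position, but the structure (classify an epoch, look up an action, apply it) stays the same.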
Project Gallery



