A pilot investigation into brain-computer interface use during a lucid dream
During lucid moments of a dream, the sleeper is aware of the dream as it is occurring and, as a result, can often perform predetermined actions within it. This provides a unique opportunity for dream research, as lucid dreamers can send real-time signals from sleep to the external world. Historically, such sleep-to-wake communication from a lucid dream has relied on left-right eye movements, which severely limits information transfer. Recent advances in biomedical equipment, specifically brain-computer interfaces, have produced headsets that use neural recordings to translate mental imagery into computer commands. In light of evidence that dreamed and imagined actions recruit similar neural resources, I considered whether the same mental commands that are collected and translated from waking imagery could be performed and detected from within a lucid dream. In this exploratory proof-of-concept study, three participants used an Emotiv EPOC+ headset and its companion software to map a mental motor command (pushing a block) to a resulting computer action (a graphic of the block moving forward). After waking training, participants attempted to induce a lucid dream while wearing the headset and, upon achieving lucidity, to perform the same mental command. For two participants, subjectively reported lucid-dream task completion was corroborated by video footage of the resulting computer graphic. These preliminary results suggest that a wake-trained brain-computer interface can be controlled from sleep, and they point to important directions for future dream communication and research.