I worked with Chris Imbriano during the TechCrunch Disrupt SF 2014 Hackathon. We spent the weekend hacking around with the Muse (a consumer EEG headband), NodeJS, and the Spark Core. By the end of the 24 hours, we were able to control the lights connected to the Spark Cores using only our minds. The best part is that it actually worked on stage:

http://techcrunch.com/video/muse-hack-presents-disrupt-sf-2014-hackathon/518404060/

The Details

Goals

Not having worked with the Muse or NodeJS before, we set achievable goals for the weekend:

  • Learn some NodeJS
  • Get the stream of data from the Muse pushed in realtime to a remote server

We weren’t worried about building anything on top of that stream of data. We’d have been satisfied to just have it available remotely. As it turns out, this only took about 4 hours.
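For the curious: the Muse SDK’s muse-io utility streams the headband’s sensor data as OSC packets over UDP, so “getting the data” starts with a UDP listener. A simplified sketch of that step, not our exact code (the port is arbitrary, and the osc-min npm package is just one of several ways to decode OSC):

```js
// Start muse-io on the client machine first, e.g.:
//   muse-io --osc osc.udp://localhost:5000
// Then listen for its OSC-over-UDP packets in Node.
var dgram = require('dgram');
var osc = require('osc-min'); // npm install osc-min

var socket = dgram.createSocket('udp4');

socket.on('message', function (buf) {
  var msg;
  try {
    msg = osc.fromBuffer(buf); // e.g. { address: '/muse/eeg', args: [...] }
  } catch (e) {
    return; // not valid OSC; skip it
  }
  if (msg.oscType !== 'message') return; // ignore OSC bundles for brevity
  console.log(msg.address, msg.args.map(function (a) { return a.value; }));
});

socket.bind(5000);
```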

Showing Something

Well, this was great and all, but there wasn’t much to show in a presentation. So we spent the next few hours dumping all the data into the browser: EEG readings, accelerometer data, various pre-processed frequency bands, etc. Then we built a chart using Rickshaw. Though the chart lagged a bit (the venue wifi was awful), it was a good visual.
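The browser side was roughly this shape (a trimmed sketch: it assumes socket.io is carrying the data to the page, and the element id, event name, and field name are illustrative):

```js
// Sketch of the live chart: stream readings in over socket.io,
// append them to a fixed-duration Rickshaw series, re-render.
var series = new Rickshaw.Series.FixedDuration(
  [{ name: 'alpha' }],   // one series here; we actually dumped several
  undefined,             // default palette
  { timeInterval: 250, maxDataPoints: 200, timeBase: Date.now() / 1000 }
);

var graph = new Rickshaw.Graph({
  element: document.getElementById('chart'), // illustrative element id
  width: 800,
  height: 200,
  renderer: 'line',
  series: series
});
graph.render();

var socket = io(); // assumes the socket.io client script is on the page
socket.on('muse', function (data) {
  series.addData({ alpha: data.alpha }); // field name illustrative
  graph.render();
});
```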

But was it? Charts of brainwaves didn’t quite show the core of the project, which was brainwave data streamed to the cloud.

So we went on one of our many walks to brainstorm (ha).

IoT Hardware, duh!

While walking around, we stumbled upon Zach Supalla showing someone the light-board (“giant core”) he had created out of 16 lightbulbs and 4 Spark Cores. We thought, “Hey, we can probably control that with our minds.” Zach graciously gave us the API key and let us borrow the board for the presentation.

The idea was to have the lights reflect how calm or excited the mind was: the calmer the mind, the fewer the lights.

We spent the next few hours processing alpha waves, figuring out light patterns and writing the API calls. The result? It sucked. The problems were:

  • The wifi was really bad, so the Spark Cores couldn’t execute instructions in realtime.
  • Brainwave data was comin’ in hot. We were sending way too many instructions to the Spark Cores.

So we tweaked it to send instructions at two-second intervals, and by 4 or 5am the wifi had started cooperating. Removing every other row of lights helped to reduce the load, too. After some calibration, it was actually working reliably. Joy.
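The throttled version boiled down to something like this (a sketch under assumptions: setRows is a stand-in for whatever function the board’s firmware actually exposed, the device ID and access token are placeholders, and the alpha-to-rows mapping is simplified; api.spark.io was the Spark Cloud’s function-call endpoint at the time):

```js
// Sketch: every 2 seconds, map the latest alpha reading to a number
// of lit rows and make one Spark Cloud call, instead of firing a
// request per sample. Uses the `request` npm package.
var request = require('request'); // npm install request

var latestAlpha = 0; // updated continuously by the incoming data handler

// Higher alpha ≈ calmer mind ≈ fewer lights.
function rowsForAlpha(alpha) {
  var calm = Math.min(Math.max(alpha, 0), 1); // clamp to [0, 1]
  return Math.round((1 - calm) * 4);          // 4 is an illustrative board height
}

setInterval(function () {
  request.post('https://api.spark.io/v1/devices/DEVICE_ID/setRows', {
    form: {
      access_token: 'ACCESS_TOKEN',            // the key Zach gave us
      args: String(rowsForAlpha(latestAlpha))  // function argument, as a string
    }
  }, function (err) {
    if (err) console.error('spark call failed:', err);
  });
}, 2000); // one instruction every 2 seconds
```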

With only a minute allotted for the presentation and not having slept all night, I knew I wouldn’t be able to get into any kind of deep meditation on stage, so I calibrated it to be fairly sensitive. I don’t think this guy appreciated that.

During the presentation I sat down on the stage and closed my eyes. My first thought was, “Wow my heart is racing, this will never work.” Then my mind went completely blank like I had been practicing all morning. The light-board responded. The audience clapped. It worked perfectly.

The Prize

We won DigitalOcean’s second place prize for the project, which included:

  • Parrot AR.Drone 2.0 for each participating team member
  • $500 Apple Gift Card for each participating team member
  • DigitalOcean Love Pack filled with sweatshirts, messenger bags and flasks
  • $5,000 in DigitalOcean credit

Woohoo!

The Code

The disclaimer: this code is pretty messy. This was our first go at Node, and funny things happen when you code without sleeping.

We wrote two separate applications, one that runs on the client machine and one that runs on a remote server.

The code that runs on the client machine acts as a UDP server, collecting Muse sensor data, lightly processing it, and pushing it to the remote server.
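In sketch form, the relay looks like this (assuming osc-min for decoding and socket.io-client for the upstream push; the Muse’s alpha_absolute OSC path comes from its docs, while the server URL and event name are placeholders):

```js
// Client-side relay: UDP/OSC in from muse-io, lightly processed,
// then emitted to the remote server.
var dgram = require('dgram');
var osc = require('osc-min');          // npm install osc-min
var io = require('socket.io-client');  // npm install socket.io-client

var upstream = io('http://your-server:8080'); // placeholder URL
var udp = dgram.createSocket('udp4');

udp.on('message', function (buf) {
  var msg;
  try { msg = osc.fromBuffer(buf); } catch (e) { return; }
  if (msg.oscType !== 'message') return;
  if (msg.address === '/muse/elements/alpha_absolute') {
    // "Light processing": average the per-channel alpha values.
    var values = msg.args.map(function (a) { return a.value; });
    var mean = values.reduce(function (s, v) { return s + v; }, 0) / values.length;
    upstream.emit('muse', { alpha: mean, t: Date.now() });
  }
});

udp.bind(5000);
```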

The remote server listens for that data and pushes it to the browser, where it gets charted. It also processes the data and makes the Spark API calls.
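And the server half, in the same sketch style (assuming express and socket.io; the event name matches the relay sketch above, and latestAlpha is what the two-second Spark loop reads):

```js
// Remote server sketch: accept relayed readings, fan them out to
// connected browsers, and stash the latest value for the Spark loop.
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);

var latestAlpha = 0; // read by the 2-second Spark API loop

app.get('/', function (req, res) {
  res.sendFile(__dirname + '/index.html'); // page with the Rickshaw chart
});

io.on('connection', function (socket) {
  socket.on('muse', function (data) {
    latestAlpha = data.alpha;
    socket.broadcast.emit('muse', data); // everyone except the relay itself
  });
});

http.listen(8080, function () {
  console.log('listening on :8080');
});
```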

Client: https://github.com/cimbriano/muse-hack

Server: https://github.com/cimbriano/muse-hack-server