Ronan Byrne’s ultra low-cost brain-computer interface

Over the years, I’ve dabbled a bit in the creation of a so-called brain-computer interface (BCI). These systems take various forms, but the basic idea is to use technology to transfer information from the conscious mind of a human into a computer, without channeling it through muscle movements. Humans communicate information from their conscious (and unconscious) minds into the outside world all the time in the form of speech, body language, writing, typing, touchscreen interaction, etc. However, all of these forms of communication rely on the brain’s ability to control muscles to transmit information into the outside world. A BCI uses technology to observe and interpret brain activity directly. When it works, a BCI allows its user to control technology using his/her thoughts!

This semester, I’m delighted to be supervising Ronan Byrne’s final-year project which explores the design of an ultra low-cost EEG-based BCI. Ronan has designed his own EEG amplifier and custom software. He just posted a couple of great videos showing what he’s achieved so far, so I thought I’d share them here.

In this video, Ronan gives an overview of his system. The communication interface isn’t fully functioning yet, but you can see the real-time processing of Ronan’s EEG while he’s recording the video!

His second video shows a really nice take on what is probably the simplest type of EEG-based BCI: modulation of alpha waves by opening and closing the eyes. Ronan’s interface allows the user to give yes/no answers to questions by increasing or decreasing alpha activity (roughly 8-12 Hz) in the occipital lobes (at the back of the head), which can be achieved by closing or opening the eyes.
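
Ronan’s amplifier and software are his own design, but just to illustrate the kind of signal processing involved, here’s a minimal C sketch of one way such a yes/no decision could be made. It is not Ronan’s code: the 250 Hz sample rate, the Goertzel-based band power estimate and the threshold value are all illustrative assumptions.

//
// alpha_sketch.c - illustrative alpha-band yes/no detector (not Ronan's code)
//
// To build:
//     gcc -Wall -o alpha_sketch alpha_sketch.c -lm
//

#include <stdio.h>
#include <math.h>

// Goertzel power estimate at frequency f (Hz) for N samples taken at fs (Hz)
double goertzel_power(const double *x, int N, double fs, double f)
{
    double coeff = 2.0 * cos(2.0 * M_PI * f / fs);
    double s, s_prev = 0.0, s_prev2 = 0.0;
    int n;
    
    for (n=0 ; n<N ; ++n)
    {
        s = x[n] + coeff * s_prev - s_prev2;
        s_prev2 = s_prev;
        s_prev = s;
    }
    return s_prev*s_prev + s_prev2*s_prev2 - coeff*s_prev*s_prev2;
}

// Return 1 ("yes" - eyes closed, strong alpha) or 0 ("no" - eyes open, weak alpha)
int alpha_decision(const double *eeg, int N, double fs, double threshold)
{
    double f, power = 0.0;
    
    for (f=8.0 ; f<=12.0 ; f+=1.0)  // sum power estimates across the alpha band
        power += goertzel_power(eeg, N, fs, f);
    return power > threshold;       // threshold would need to be tuned per user
}

int main(void)
{
    double fs = 250.0, eeg[250];
    int n, N = 250;
    
    // Fake one-second recording: a 10 Hz sinusoid standing in for strong alpha
    for (n=0 ; n<N ; ++n) eeg[n] = 50.0 * sin(2.0*M_PI*10.0*n/fs);
    
    printf("decision: %s\n", alpha_decision(eeg, N, fs, 1000.0) ? "yes" : "no");
    return 0;
}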


H-bridge control example for Arduino Nano (ATmega328) – two phase-displaced square waves

This example generates two 20 kHz square waves on pins D3 and D11 with a controllable phase difference between them, using Timer/Counter 2 of the ATmega328 – handy for driving an H-bridge. This is the code:

// 
// H-bridge control example for Arduino Nano (ATmega328)
// Written by Ted Burke, 27-4-2018
//
// 20 kHz square wave output on OC2A (pin D11) and OC2B (pin D3)
// The phase difference between OC2A and OC2B can be controlled
// by varying the value of OCR2B between 1 and 49 inclusive.
//
// OCR2B = 49 makes the square waves in opposite phase.
// OCR2B = 1 makes the square waves almost in phase.
//
// Subsequent changes to OCR2B must be done with care because
// missing a compare event will change the phase of OC2B by
// 180 degrees!
//

void setup()
{
  // TC2 (Timer/Counter 2) in CTC mode with 8:1 prescaler
  TCCR2A = (0<<COM2A1)|(1<<COM2A0)|(0<<COM2B1)|(1<<COM2B0)| 0        | 0        |(1<<WGM21)|(0<<WGM20);
  TCCR2B = (0<<FOC2A )|(1<<FOC2B )| 0         | 0         |(0<<WGM22)|(0<<CS22 )|(1<<CS21 )|(0<<CS20 );
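  // With the Nano's 16 MHz clock and the 8:1 prescaler, the timer counts at 2 MHz.
  // In CTC toggle mode the output frequency is 16 MHz / (2 * 8 * (OCR2A + 1)) = 20 kHz.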
  OCR2A = 49; // sets the frequency
  OCR2B = 1; // phase shift between outputs (49 is opposite phase, 1 is almost in phase)

  // Enable timer output pins: OC2A (pin D11) and OC2B (pin D3)
  pinMode(3, OUTPUT);
  pinMode(11, OUTPUT);
}

void loop()
{
  // Do other stuff here if desired
}

Example Output

In the following oscilloscope screenshots, channel 1 (yellow) displays the signal from OC2 (pin D11) and channel 2 (blue) displays the signal from OC2B (pin D3).

This is the output when OCR2B = 49, making the two square waves 180° out of phase:

This is the output when OCR2B = 25, making the two square waves 90° out of phase:

This is the output when OCR2B = 1, making the two square waves 3.6° out of phase (i.e. almost in phase):
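
In other words, each step of OCR2B shifts OC2B by roughly 360° / (2 × (OCR2A + 1)) = 3.6° relative to OC2A, so the usable values from 1 to 49 span almost the full range from in phase to opposite phase.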


An offcut from the Fraktalismus pattern factory

This animation, which I created by accident while trying to do something else, struck me as eye-catching.

This is the complete C code used to generate video frames as individual PNM image files:

//
// offcut.c
// Written by Ted Burke, 1-2-2018
// See http://batchloaf.com
//
// To build:
//     gcc -O3 -Wall -o offcut offcut.c -lm
//
// To run:
//     ./offcut 0 1 400
//
// To combine individual frames into a movie:
//     
//     ffmpeg -framerate 8 -f image2 -s 1920x1080 -i %03d.pnm -framerate 8 -s 1920x1080 -i %03d.pnm -filter_complex "[0:v:0][1:v:0]concat=n=2:v=1[outv];[outv]scale=1920x1080[outv]" -c:v libx264 -preset slow -crf 17 -pix_fmt yuv420p -c:a copy -map [outv] -r 8 offcut.mkv
//

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <complex.h>

#define W 3840
#define H 2160

unsigned char p[H][W][3] = {0};

int main(int argc, char **argv)
{
    int n, N=25, x, y, v, t, t_start, t_step, T;
    complex double z, c;
    double zlim = 10.0;
    char filename[1024];
    FILE *f;
    
    t_start = atoi(argv[1]);
    t_step = atoi(argv[2]);
    T = atoi(argv[3]);
    fprintf(stderr, "t_start:%d t_step:%d T:%d\n", t_start, t_step, T);
    
    for (t=t_start ; t<T ; t+=t_step)
    {
        // Vary parameter
        c = -0.825 -0.1*I - 0.2*cexp(I*t*2.0*M_PI/T);
        
        // Generate pixels
        for (y=0 ; y<H ; ++y) for (x=0 ; x<W ; ++x)
        {
            z = 0.002 * (I*(x-W/2.0) + (y-H/2.0));
            
            // This iterating function makes a good texture - it's unusual!
            for (n=0 ; n<N && cabs(z)<zlim ; ++n) z = 1 / (cpow(z,z+c) + c);
            
            // Colour mapping from angle of final z value to shade of blue
            v = 128.0 + 127.0 * carg(z)/M_PI;
            p[y][x][0] = v>127 ? 2*(v-127) : 0;
            p[y][x][1] = v>127 ? 2*(v-127) : 0;
            p[y][x][2] = v>127 ? 255 : 2*v;
        }
        
        // Write current frame to a PNM image file
        sprintf(filename, "%03d.pnm", t);
        fprintf(stderr, "Writing file %s...", filename);
        f = fopen(filename, "w");
        fprintf(f, "P6\n%d %d\n255\n", W, H);
        fwrite(p, 3, W*H, f);
        fclose(f);
        fprintf(stderr, "done\n");
    }
    
    return 0;
}
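
The three command-line arguments are the first frame number, the step between frames and the total number of frames (which also sets the period over which the parameter c completes one loop), so "./offcut 0 1 400" renders all 400 frames in a single process. Because each frame is computed independently, the same arguments can be used to split the rendering across processes – for example, running "./offcut 0 2 400" and "./offcut 1 2 400" simultaneously generates the even and odd numbered frames in parallel.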

I decided to render the frames at double resolution (3840 x 2160 px) and then scale back down to 1080p resolution during the video conversion. This was the ffmpeg command I used to combine PNM images into a single video at 8 frames per second, with the full sequence repeated twice:

ffmpeg -framerate 8 -f image2 -s 1920x1080 -i %03d.pnm -framerate 8 -s 1920x1080 -i %03d.pnm -filter_complex "[0:v:0][1:v:0]concat=n=2:v=1[outv];[outv]scale=1920x1080[outv]" -c:v libx264 -preset slow -crf 17 -pix_fmt yuv420p -c:a copy -map [outv] -r 8 mugz.mkv

Cafe Terrace at Starry Night


€2 Robots in DIT

In this video, I chat with Kevin Chubb about his final-year project on ultra low-cost swarm robotics, here in DIT’s School of Electrical and Electronic Engineering. Kevin’s designing robots that can be built by hand from less than €2 worth of off-the-shelf parts. He doesn’t quite have a swarm yet, but in the video we get to see one of his tiny robots scuttling to and fro.

I for one welcome our competitively priced new robotic overlords!


A brief introduction to binary numbers…


Can the PIC12F675 drive motors directly from its GPIO pins?

As I mentioned in my previous post, my project student Kevin Chubb is developing tiny ultra low-cost robots using a PIC12F microcontroller. One of the great things about PICs is that they can source and sink relatively high currents through their digital I/O pins. Kevin and I have been hoping that it might be possible to build a robot that powers its actuators directly from the microcontroller pins, so that was a big factor in choosing the PIC12F. Anyway, yesterday I received a package of tiny “coreless” motors in the post from AliExpress, so I’ve just carried out an experiment to see whether they can be powered directly from the GPIO pins of a PIC12F.

I received two types of motors in yesterday’s delivery, but the ones I’m using in this experiment are these ones:

To begin with, I measured the current drawn by one of the motors when it was running unloaded at 3V, which turned out to be approximately 31 mA. Although that’s slightly higher than the rated pin current on the PIC12F (25 mA), I decided it was worth taking a chance and I set up an experiment with two of the motors connected directly to a PIC12F675. Here’s the video:

The complete code used is shown below:

//
// PIC12F675 example: motors on GP4 and GP5
// Written by Ted Burke - 20-4-2017
//
// To compile:
//
//    xc8 --chip=12F675 main.c
//
 
#include <xc.h>
 
#pragma config FOSC=INTRCIO,WDTE=OFF,MCLRE=OFF,BOREN=OFF
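
// FOSC=INTRCIO selects the 4 MHz internal oscillator and leaves GP4 and GP5
// (the oscillator pins) free to be used as i/o. With a 1 MHz instruction rate,
// each _delay(500000) below lasts approximately 0.5 seconds.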
 
void main(void)
{
    TRISIO = 0b11001111; // Make pins GP4 and GP5 outputs
    ANSEL  = 0b00000000; // Disable all analog inputs
     
    while(1)
    {
        GP4 = 1;         // Set pin GP4 high
        GP5 = 0;         // Set pin GP5 low
        _delay(500000);  // 0.5 second delay
        GP4 = 0;         // Set pin GP4 low
        GP5 = 1;         // Set pin GP5 high
        _delay(500000);  // 0.5 second delay
    }
}