Using the PHP command line web server to transfer files between devices on a local network

When you install PHP, you get a simple built-in web server as a bonus. This is very handy for testing web pages you’re writing, but I also sometimes use it as a simple way to transfer files between devices on my local wifi network. In particular, it’s a convenient way of getting files from my laptop onto my phone.

The basic idea is that I run the PHP web server in the folder that contains the files I want to transfer onto my phone, then use the browser on my phone to download the files. Let’s say that the file I’m transferring is called “myfile.txt” and it’s in the directory “/home/xubuntu/Downloads/”. I would type the following commands in a terminal to run the PHP web server in that directory:

cd /home/xubuntu/Downloads
php -S 0.0.0.0:8000

The “0.0.0.0” part is a wildcard address that tells the PHP web server to listen on all of the machine’s network interfaces (e.g. 127.0.0.1 and 192.168.0.16 on my machine). I’ve also specified 8000 as the port number for the web server. To download the file onto my phone, the URL would be:

http://192.168.0.16:8000/myfile.txt

The phone needs to be connected to the same local network as the computer that the web server is running on. I have both connected to my wifi router.
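
If you’re not sure what address your computer has on the local network, one easy way to check on Linux is to run either of the following commands in a terminal (the 192.168.x.x style address is usually the one to use in the URL):

hostname -I
ip addr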

If I have a few files to transfer, or if the filenames are long, it can be useful to make a little PHP script in the directory to generate a simple web page with links to each file. This is what I’m currently using (saved as “index.php” in the same directory as the files):

<!DOCTYPE html>
<html>
<head>
    <style>
        body {font-size:200%;}
    </style>
</head>

<body>
    <br>

    <?php
    // List every file in this directory as a numbered link,
    // skipping this script itself
    $i = 1;
    $g = glob("*");
    foreach ($g as $f)
    {
        if ($f == "index.php") continue;
        echo $i . ". " . "<a href=\"" . $f . "\">" . $f . "</a><br><br>";
        $i = $i + 1;
    }
    ?>
</body>
</html>

Here are the files stored in “/home/xubuntu/Downloads/”:

And here’s how it looks in the phone browser (the address is “192.168.0.16:8000”):

When I’m finished transferring files, I just press Ctrl-C in the terminal to close down the web server.

€5 PPG – photoplethysmogram amplifier / Arduino circuit

The photoplethysmogram (PPG) is a signal that measures changes in blood volume in some part of the body (e.g. the fingertip) by shining light into the skin and detecting the small changes in light absorption that occur as the blood vessels enlarge and contract. One common application is heart rate measurement. When the heart beats, blood vessels around the body swell slightly due to the increased blood pressure. This results in variable light absorption over the course of each cardiac cycle.

This circuit is a €5 PPG system that uses a TCRT5000 reflective infrared sensor, an LM358 opamp and an Arduino Nano.

The following Arduino code samples the analog voltage on pin A7 (the sample rate is approximately 100 Hz) and prints the values via the serial connection. The signal can therefore be plotted using the Arduino development environment’s Serial Plotter tool (located under the Tools menu). Normally, the Serial Plotter dynamically scales the vertical axis to fit the displayed signal. To prevent this (and maintain constant scaling), this program actually outputs two additional dummy signals – one which is always 0 and another which is always 1023. These hold the vertical axis limits at constant values.

//
// Photoplethysmogram (PPG) example
// Written by Ted Burke, 3-4-2019
//

void setup()
{
  pinMode(2, OUTPUT);
  Serial.begin(9600);
}

void loop()
{
  int du;

  // Sample the PPG signal on analog input A7
  du = analogRead(7);

  // Print two constant dummy values to hold the Serial Plotter's
  // vertical axis limits at 0 and 1023, then print the PPG sample
  Serial.print("0 1023 ");
  Serial.println(du);

  // 10 ms delay gives a sample rate of approximately 100 Hz
  delay(10);
}

This is an example PPG signal recorded using the above circuit (I rested my fingertip directly on top of the TCRT5000) and displayed in the Serial Plotter:

Since the PPG is a very low frequency signal, you may wish to reduce the gain of the amplifier at higher frequencies, which will tend to reduce the visible interference and smooth out the signal. This can be achieved by placing a 100nF capacitor in parallel with the 100kΩ resistor. The following signal was recorded with that capacitor in the circuit.
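
As a rough sanity check (assuming the 100kΩ resistor is the feedback resistor of the opamp gain stage, so that the capacitor rolls off the gain), the resulting first-order low-pass response has a cutoff frequency of approximately

fc = 1 / (2πRC) = 1 / (2π × 100kΩ × 100nF) ≈ 16 Hz

which is well above the frequency content of the PPG waveform itself, but low enough to attenuate mains pick-up and other higher-frequency interference.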

Clap detector circuit / AirSpell typing system

This circuit combines a simple audio amplifier (based on an LM358 opamp) with an Arduino Nano to facilitate the detection of clapping sounds or blowing on the microphone.

This is my breadboard circuit:

I actually used a loudspeaker for my microphone, as shown here:

This Arduino program switches an LED on/off on pin D2 when a clap is detected:

//
// Clap on/off - written by Ted Burke
// Last updated 3/4/2019
//

void setup()
{
  pinMode(2, OUTPUT); // LED output
}

float v, avg = 1.66; // avg is a running estimate of the signal's average (DC) level, in volts
int lamp = 0;        // on-off state of LED

void loop()
{
  // Sample analog voltage on A7 and convert to volts
  v = analogRead(7) * 5.0 / 1023.0;

  // Update moving average (really an IIR low-pass filter)
  avg = 0.99 * avg + 0.01 * v;

  // Detect sudden changes above a certain magnitude
  if (abs(v - avg) > 0.2)
  {
    // Toggle lamp (LED on D2)
    lamp = 1 - lamp;
    digitalWrite(2, lamp);
    delay(100);
  }
}

This Arduino program prints a single character over the serial connection whenever a clap is detected:

//
// Clap click - written by Ted Burke - 3/4/2019
//
 
void setup()
{
  pinMode(2, OUTPUT);
  Serial.begin(9600);
}
 
float v, avg = 1.66;
 
void loop()
{
  // Sample analog voltage on A7 and convert to volts
  v = analogRead(7) * 5.0 / 1023.0;
 
  // Update moving average (really an IIR low-pass filter)
  avg = 0.99 * avg + 0.01 * v;
 
  // Detect sudden changes above a certain magnitude
  if (abs(v - avg) > 0.2)
  {
    // Send a character via serial link
    Serial.print("c");
 
    // Flash the LED
    digitalWrite(2, HIGH);
    delay(100);
    digitalWrite(2, LOW);
  }  
}

Finally, the following bash script generates a mouse click whenever a character is received from a device connected on /dev/ttyUSB0 (i.e. the Arduino). In combination with Onboard (a free Linux onscreen keyboard), this allows text to be spelled out by blowing on the microphone in short bursts (or by clapping). Onboard needs to be configured in scanning mode. This may perform better when /dev/ttyUSB0 is configured in “raw” mode (e.g. by running sudo stty -F /dev/ttyUSB0 raw), because the read command can then finish faster, allowing faster repetition.

#!/bin/bash

while true; do
    # Wait until a single character arrives from the Arduino
    read -n 1 </dev/ttyUSB0
    # Generate a left mouse button click
    xdotool click 1
done

Save the above bash script to a file called “clicky” and then run the following commands in the same directory:

chmod 755 clicky
sudo stty -F /dev/ttyUSB0 raw
sudo ./clicky

AirMouse – control mouse pointer in Linux using one switch or by blowing on microphone

Article under construction!

To install xdotool:

sudo apt install xdotool

Bash script (save as “airmouse” and run “chmod 755 airmouse” to make it executable):

#!/bin/bash

# Run these commands first in terminal (as root)
#
#    stty -F /dev/ttyUSB0 raw
#    cat /dev/ttyUSB0 | ./airmouse
#

# Process input characters from arduino (u,d,l,r,c)
while true;
do
    read -n 1 INPUT
    if [ "$INPUT" = "u" ]; then
        xdotool mousemove_relative -- 0 -10
    elif [ "$INPUT" = "d" ]; then
        xdotool mousemove_relative -- 0 10
    elif [ "$INPUT" = "l" ]; then
        xdotool mousemove_relative -- -10 0
    elif [ "$INPUT" = "r" ]; then
        xdotool mousemove_relative -- 10 0
    elif [ "$INPUT" = "c" ]; then
        xdotool click 1
    fi
done
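
To run the script, following the comments at the top of it, these two commands can be run as root (e.g. in a shell opened with sudo -i):

stty -F /dev/ttyUSB0 raw
cat /dev/ttyUSB0 | ./airmouse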

Arduino code (it samples the input signal on analog pin A7 and runs a simple state machine: a short activation sends a mouse click, a sustained one repeatedly sends moves in the current direction, and releasing after a move cycles to the next direction):

//
// AirMouse - Ted Burke - 2/4/2019
//

void setup()
{
  pinMode(2, OUTPUT);

  Serial.begin(9600);

  set_state(1);
}

int state = 1;
unsigned long t;
int direction = 0; // 0:up, 1:down, 2:left, 3:right

void set_state(int n)
{
  state = n;
  t = millis();
}

void loop()
{
  int v;
  unsigned long elapsed;

  delay(50);
  
  v = analogRead(7);
  elapsed = millis() - t; // time elapsed since entering current state

  if (state == 1) // STATE 1: WAIT FOR INPUT
  {
    digitalWrite(2, LOW);

    if (v > 512) set_state(2);
  }
  else if (state == 2) // STATE 2: IS THIS A CLICK OR A MOVE?
  {
    digitalWrite(2, HIGH);

    if (v <= 512) set_state(3);
    if (elapsed >= 200) set_state(4);
  }
  else if (state == 3) // STATE 3: SEND A CLICK
  {
    digitalWrite(2, LOW);
    Serial.print("c");
    delay(250);
    set_state(1);
  }
  else if (state == 4) // STATE 4: MOVE MOUSE
  {
    digitalWrite(2, HIGH);

    if (v <= 512) set_state(5);
    else if (direction == 0) Serial.print("u");
    else if (direction == 1) Serial.print("d");
    else if (direction == 2) Serial.print("l");
    else if (direction == 3) Serial.print("r");
  }
  else if (state == 5) // STATE 5: CHANGE MOUSE DIRECTION
  {
    digitalWrite(2, LOW);
    
    direction = (direction + 1)%4;

    set_state(1);
  }
}

Some RGB fractal doodles

Click on the animation to view full size gif.

This is the code I used to generate the animation:

//
// fraktalismus modulo - written by Ted Burke 15-1-2019
// Compiled and tested on Xubuntu Linux 18.10
//
// To compile:
//     gcc -o fraktalismus fraktalismus.c -lm
//
// To run:
//     ./fraktalismus
//
// To create an animated gif from the frames using ffmpeg:
//     ffmpeg -f image2 -framerate 15 -i "%04d.png" output.gif
//

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <complex.h>

int main()
{
    // Declare variables
    unsigned char *p;             // buffer of rgb pixels for each frame 
    complex double *q;            // stores a frame of complex pixel values
    double qmagn, qarg;           // magnitude and angle of each pixel's complex value
    unsigned char r, g, b;        // colour components of each pixel
    int w=4*1366, h=4*768;        // image width and height
    int n, x, y, t, T;            // t is the frame number
    int xoffset, yoffset;         // shifts centre point of image in complex plane
    complex double z, c, coffset; // z is the iterated complex value
    double zmax;                  // z values wrap around at this magnitude
    double px;                    // width/height of each pixel in complex plane
    FILE *fimage;                 // file pointer for writing pnm files
    char pnm_filename[256];       // stores PNM image filename for each frame
    char png_filename[256];       // stores PNG image filename for each frame
    char command[1024];           // buffer for imagemagick shell command
    
    // Allocate memory for pixels
    p = malloc(w*h*3);
    q = malloc(w*h*sizeof(complex double));
    
    // Generate frames of the animation one by one
    T = 31;
    for (t=0 ; t<T ; ++t)
    {
        // Generate pixel values
        c = -0.6 -0.725*I + 0.025*cexp(I*2.0*t*M_PI/T);
        xoffset = -600*4;
        yoffset = -500*4;
        px = 0.0007/4.0;
        zmax = 1000.0;
        for (y=0 ; y<h ; ++y) for (x=0 ; x<w ; ++x)
        {
            // Iterate complex function
            z = px * ((x-w/2.0+xoffset) + I*(y-h/2.0+yoffset));
            coffset = 0.025 * cexp(I*atan2(y,x));
            for (n=0 ; n<10 ; ++n)
            {
                z = z*z + c + coffset;
                if (cabs(z) > zmax)
                {
                    z = fmod(cabs(z), zmax) * cexp(I*carg(z));
                }
            }
            
            // Find end point of this point's N-step orbit
            q[y*w + x] = z;
        }
        
        for (y=0 ; y<h ; ++y) for (x=0 ; x<w ; ++x)
        {
            // Set RGB components of current pixel
            qmagn = log(cabs(q[y*w + x]));
            qarg = carg(q[y*w + x]);
            r = 255.0 * qmagn * 0.5 * (1.0 + cos(qarg + 0*2.0*M_PI/3.0));
            g = 255.0 * qmagn * 0.5 * (1.0 + cos(qarg + 1*2.0*M_PI/3.0));
            b = 255.0 * qmagn * 0.5 * (1.0 + cos(qarg + 2*2.0*M_PI/3.0));
            r = 255 - r; g = 255 - g; b = 255 - b;
            p[3*(y*w + x) + 0] = r;
            p[3*(y*w + x) + 1] = g;
            p[3*(y*w + x) + 2] = b;
        }
        
        // Write image to PNM file
        sprintf(pnm_filename, "%04d.pnm", t);
        sprintf(png_filename, "%04d.png", t);
        fprintf(stderr, "Writing %s\n", pnm_filename);
        fimage = fopen(pnm_filename, "w");
        fprintf(fimage, "P6\n%d %d\n255\n", w, h);
        fwrite(p, 3, w*h, fimage);
        fclose(fimage);
        
        // Convert image to PNG format and then delete PNM file
        sprintf(command, "convert %s -resize 25%% %s", pnm_filename, png_filename);
        fprintf(stderr, "Executing: %s\n", command);
        system(command);
        sprintf(command, "rm %s", pnm_filename);
        fprintf(stderr, "Executing: %s\n", command);
        system(command);
    }
    
    // Free dynamically allocated memory
    free(p);
    free(q);
}
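
Note that the convert command used above to downscale each frame and produce the PNG file is part of ImageMagick, so that needs to be installed for the program to work as written (on Debian/Ubuntu systems, something like sudo apt install imagemagick).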

The version below is obtained by modifying the iterating function on line 59 of the program (the line z = z*z + c + coffset; inside the inner loop), as follows:

z = (z*z + c + coffset)/(z*z - c + coffset);

Click on the animation to view full size version.

How to display USB webcam as live video on desktop using mplayer

mplayer -tv driver=v4l2:gain=1:width=1280:height=720:device=/dev/video1:fps=10:outfmt=rgb16 tv://
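
Depending on the system, the webcam may appear as /dev/video0 rather than /dev/video1, so the device= option may need to be adjusted. The available video devices can be listed with:

ls /dev/video*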

Ronan Byrne’s ultra low-cost brain-computer interface

Over the years, I’ve dabbled a bit in the creation of a so-called brain-computer interface (BCI). These systems take various forms, but the basic idea is to use technology to transfer information from the conscious mind of a human into a computer, without channeling it through muscle movements. Humans communicate information from their conscious (and unconscious) minds into the outside world all the time in the form of speech, body language, writing, typing, touchscreen interaction, etc. However, all of these forms of communication rely on the brain’s ability to control muscles to transmit information into the outside world. A BCI uses technology to observe and interpret brain activity directly. When it works, a BCI allows its user to control technology using his/her thoughts!

This semester, I’m delighted to be supervising Ronan Byrne’s final-year project which explores the design of an ultra low-cost EEG-based BCI. Ronan has designed his own EEG amplifier and custom software. He just posted a couple of great videos showing what he’s achieved so far, so I thought I’d share them here.

In this video, Ronan gives an overview of his system. The communication interface isn’t fully functioning yet, but you can see the real-time processing of Ronan’s EEG while he’s recording the video!

His second video shows a really nice take on what is probably the simplest type of EEG-based BCI: modulation of alpha waves by opening and closing the eyes. Ronan’s interface allows the user to give yes/no answers to questions by increasing or decreasing alpha activity (roughly 10-12 Hz) in the occipital lobes (at the back of the head), which can be achieved by opening or closing the eyes.
