
Thursday, 15 December 2022

Slack/WhatsApp controlled colour-changing and Morse code Christmas lights

Several years ago I made some Christmas lights that could be controlled via Slack and WhatsApp, and every year since I find that I need to update the interface to these platforms to account for the changes that they make during the year. 

Usability updates

This year whilst I was doing that, I took the opportunity to add a few new features and simplify the interface.

Now it responds to some key words – all, top, bottom, half, alternate, led – and allows the user to specify certain colours by name and any colour by hex code.
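
For illustration, here is a rough sketch of how that keyword and hex-code parsing might look on the server side. The class name, keyword handling and preset colours below are my own illustrative choices, not the project's actual code:

import java.util.Locale;

public class CommandParser {

    // Returns an RGB value for a recognised colour word or a #RRGGBB hex code, or null if neither.
    static Integer parseColour(String token) {
        switch (token.toLowerCase(Locale.ROOT)) {
            case "red":   return 0xFF0000;
            case "green": return 0x00FF00;
            case "blue":  return 0x0000FF;
            case "white": return 0xFFFFFF;
        }
        if (token.matches("#[0-9a-fA-F]{6}")) {
            return Integer.parseInt(token.substring(1), 16);
        }
        return null;
    }

    // Returns the LED indices a keyword refers to, assuming a 100-LED string.
    static int[] parseRange(String keyword, int ledCount) {
        switch (keyword.toLowerCase(Locale.ROOT)) {
            case "all":    return range(0, ledCount);
            case "bottom": return range(0, ledCount / 2);
            case "top":    return range(ledCount / 2, ledCount);
            case "alternate": {
                int[] everyOther = new int[ledCount / 2];
                for (int i = 0; i < everyOther.length; i++) everyOther[i] = i * 2;
                return everyOther;
            }
            default: return new int[0]; // 'half', 'led' etc. would be handled similarly
        }
    }

    private static int[] range(int from, int to) {
        int[] result = new int[to - from];
        for (int i = 0; i < result.length; i++) result[i] = from + i;
        return result;
    }
}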

Additionally I refined the serial interface to speed things up.

Moving to a single-board computer

 
For my home implementation of this, previously I would just run a USB extension lead from my desktop. 


I have now migrated the code to a PCDuino2 single-board computer. This is an SBC based on a 1 GHz Allwinner A10 chip, and it also has Arduino-style pin headers.


The board is discontinued (I bought this one from Maplin years ago during their closing-down sale), and as a result there is little in the way of official repositories or archives for the Ubuntu-variant distribution that it runs. It also looks like there's some sketchiness around the official firmware for the Allwinner chip that powers it.


I found a version of Armbian Linux for this platform and burnt it to a microSD card, which allowed me to get the PCDuino to boot into Armbian.


Unfortunately it looks like there is no real support for installing Armbian to the PCDuino's onboard NAND storage. However, leaving it on the microSD card has some advantages, such as being able to easily image and back up the install.


Then it was simply a case of connecting it to the network and updating it, enabling SSHD for ease of maintenance going forward, and installing required tools for the Christmas lights server (JDK and Arduino).


As the PCDuino is not a typical Arduino development board it needs its own board definition to be used with the Arduino IDE, so for now I decided to continue using an externally connected Arduino.

 

Implementing morse code

 
One of the downsides of the setup I currently have is that the serial data transmission can be disrupted by the update process that transfers data from the Arduino to the lights.


The changes I have made reduce the amount of data that is transmitted via serial and so minimize the problem.


However, there is additional functionality that I wish to add that will require further data to be transmitted – a set of lights that flash like regular Christmas lights, but instead of a standard or random pattern, they blink out Morse code based on a message entered in the control application.


Rather than cram more functionality into the WS2812 lights that I am currently using, I plan to leave them be and use a regular set of plain white fairy lights for this. I have always decorated our Christmas trees with a multi-coloured set and a plain white set, so this fits with what I would typically do anyway.

The lights I used are powered from 3x AA batteries (4.5 V), which can easily be substituted with USB power from a USB hub. To toggle them on and off, I have used an optocoupler driven from one of the PCDuino's GPIO pins.


The 'Arduino-style' pins of the PCDuino board can be reached in the Linux file system via the sysfs GPIO interface – similar to the way I toggled an output pin in the frequency switch project.


Seeing as this makes it quite easy to simply toggle a pin on and off, I should be able to use it for the Morse code without having to depend on the rather outdated and no-longer-supported PCDuino board definitions.

Finding the pin reference

 
A 'gotcha' with the sysfs GPIO interface is that the pin numbers as silk-screened on the board do not necessarily correspond to the pin numbers in the file system.


The first thing to do is figure out what the pin identifier would be. In this instance, I found some config files for the PCDuino on GitHub.


These list the PCDuino's GPIO pins and their corresponding identifiers, summarised below.

GPIO Pin    Pin ID
0           PI19
1           PI18
2           PH07
3           PH06
4           PH08
5           PB02
6           PI03
7           PH09
8           PH10
9           PH05
10          PI10
11          PI12
12          PI13
13          PI11
14          PH11
15          PH12
16          PH13
17          PH14
18          PH15
19          PH16

I decided to use GPIO 8, so the relevant ID is PH10.

The Sunxi wiki demonstrates how to calculate the relevant GPIO number from that ID

(position of letter in alphabet - 1) * 32 + pin number

So with PH10: H is the 8th letter of the alphabet, so

(8-1)*32 + 10 = 234
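
As a quick sanity check, the same calculation in Java (purely illustrative – in practice I just use the number directly):

public class SunxiGpio {

    // Converts a sunxi-style pin ID such as "PH10" into its sysfs GPIO number.
    static int sysfsNumber(String pinId) {
        char port = Character.toUpperCase(pinId.charAt(1)); // e.g. 'H'
        int pin = Integer.parseInt(pinId.substring(2));     // e.g. 10
        return (port - 'A') * 32 + pin;                     // (position of letter - 1) * 32 + pin
    }

    public static void main(String[] args) {
        System.out.println(sysfsNumber("PH10")); // prints 234
    }
}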


So with that number calculated, we can export the pin with

echo 234 > /sys/class/gpio/export

which will make ‘gpio234’ available under /sys/class/gpio.

From there we can make it an output, and change the ownership of its value file so that a regular user can write to it.

echo out > /sys/class/gpio/gpio234/direction

chown -R ant:ant /sys/class/gpio/gpio234/value


Then to switch the lights on,

echo 1 > /sys/class/gpio/gpio234/value

and

echo 0 > /sys/class/gpio/gpio234/value

to turn them off.
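
The Morse blinking itself can then be done from the Java server just by writing to that value file. Below is a minimal sketch of the idea, with my own illustrative timings and a deliberately tiny lookup table – not the project's actual code:

import java.io.FileWriter;
import java.io.IOException;

// Blinks a message in Morse code by toggling the exported GPIO pin's value file.
public class MorseBlinker {

    private static final String VALUE_FILE = "/sys/class/gpio/gpio234/value";
    private static final long UNIT_MS = 200; // length of one 'dit'

    public static void blink(String message) throws IOException, InterruptedException {
        for (char c : message.toLowerCase().toCharArray()) {
            String code = codeFor(c);
            if (code.isEmpty()) { Thread.sleep(7 * UNIT_MS); continue; } // word gap for spaces/unknowns
            for (char symbol : code.toCharArray()) {
                set("1");
                Thread.sleep(symbol == '.' ? UNIT_MS : 3 * UNIT_MS); // dot = 1 unit, dash = 3 units
                set("0");
                Thread.sleep(UNIT_MS); // gap between symbols
            }
            Thread.sleep(2 * UNIT_MS); // remainder of the 3-unit gap between letters
        }
    }

    // A deliberately tiny lookup table - just enough for a demo message.
    private static String codeFor(char c) {
        switch (c) {
            case 'h': return "....";
            case 'i': return "..";
            case 'e': return ".";
            case 's': return "...";
            case 'o': return "---";
            default:  return "";
        }
    }

    private static void set(String value) throws IOException {
        try (FileWriter writer = new FileWriter(VALUE_FILE)) {
            writer.write(value);
        }
    }
}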


A demo video is below, and as usual the source is available on GitHub.

Merry Christmas!

 

 

Friday, 19 March 2021

Sound-activated switch for a set frequency

Clap switches are an old automation gimmick from the 1980s. Basically the circuit hears a noise above a given volume (amplitude), and activates.

This project is an attempt to refine that idea, to create a switch that does the same, but responds only to a given sound (frequency).

This was inspired by watching my partner fail miserably playing South Park: The Stick of Truth.

In the game during battles, there is a timed button press during an attack that increases damage. Timing these button presses was not going well...

A distinctive noise plays at the time the button press is required, so I started thinking about how that noise could be listened for, and the button press triggered automatically.

Band-pass filters

This system relies on the use of a band-pass filter. There are plenty of explanations around about these and how they work, so I won't reinvent the wheel here.

For the purposes of this project, the key point is that if the input has a frequency between the low and high thresholds, it is allowed through. Frequencies outside of that band are rejected.

These can be built in hardware as circuits, but also in software on regular computers.

Fast Fourier Transform (FFT)

This is a well-known algorithm that, in simplistic terms, takes input over time (such as audio), and breaks it down into the frequencies it's composed of.

I'll admit, my understanding of FFTs is similar to the relationship most people have with their household appliances - know how to use it, but can't really explain how it works under the hood. There's plenty of detailed explanation for the more mathematically inclined.

So the basic concept is this:

  • Pipe the audio input into the FFT
  • The FFT converts it into the frequency domain
  • Zero out all the values for the frequencies that fall outside of the range we're 'listening' for.
  • Do an inverse FFT transform, which turns the frequency domain data back into time-domain (i.e. back to real audio). This gives us sound where everything except for the range we're listening for is muted.
  • This can then be passed into a regular 'clap-switch', where we trigger if the volume of the sound is above a given level (a code sketch of this follows below).
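
For illustration, here is a minimal sketch of that pipeline using the FFT from the Apache Commons Math library (which is what the software described below uses). The class, method names and trigger logic are my own simplification, not the project's actual code:

import org.apache.commons.math3.complex.Complex;
import org.apache.commons.math3.transform.DftNormalization;
import org.apache.commons.math3.transform.FastFourierTransformer;
import org.apache.commons.math3.transform.TransformType;

public class BandPassTrigger {

    // Band-pass filters one block of samples and reports whether the remaining
    // signal is loud enough to trigger. The block length must be a power of two.
    public static boolean isTriggered(double[] samples, double sampleRate,
                                      double lowHz, double highHz, double threshold) {
        FastFourierTransformer fft = new FastFourierTransformer(DftNormalization.STANDARD);
        Complex[] spectrum = fft.transform(samples, TransformType.FORWARD);

        // Zero every bin whose frequency falls outside the band we're 'listening' for.
        // Bin i corresponds to i * sampleRate / n Hz (mirrored in the upper half).
        int n = spectrum.length;
        for (int i = 0; i < n; i++) {
            double freq = (i <= n / 2 ? i : n - i) * sampleRate / n;
            if (freq < lowHz || freq > highHz) {
                spectrum[i] = Complex.ZERO;
            }
        }

        // Inverse transform back to the time domain: everything outside the band is now muted.
        Complex[] filtered = fft.transform(spectrum, TransformType.INVERSE);

        // The 'clap switch' part: trigger if the peak amplitude exceeds the threshold.
        double peak = 0;
        for (Complex c : filtered) {
            peak = Math.max(peak, Math.abs(c.getReal()));
        }
        return peak > threshold;
    }
}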

 

Finding the target frequency

This part of the process can be trickier than it initially seems. Sounds are composed of many frequencies, so it is necessary to select a frequency range that is unique to the target part of the audio.

To start with, I extracted the audio from the above video clip using FFmpeg, and opened it up in Audacity. This initially shows the audio waveform.


Select the area containing the sound, and select Tools, Plot Spectrum.

This will show the frequencies that exist within the selection. However, this doesn't give us all the answers. Save the plot (I just took the below screenshot and used that). Then select another part of the audio and repeat the exercise. Then it's basically a case of spot-the-difference, looking for a frequency spike that appears in our target audio but not in the other samples.

A sample spectrum from elsewhere in the audio.
The segment containing the target sound. Circled are frequency spikes not seen in the rest of the audio.

Hardware

To run this switch I'm going to use the Next Thing Co CHIP. This is the same system I used for the TV desk project years ago. Unfortunately these are now discontinued, but there are still many ARM-based SoCs running Linux out there.

Potentially this could be distilled down further onto a smaller microcontroller, although I'd have reservations about how far you could reduce the resources before the processing time introduces enough lag to make it too slow to use.

As well as effectively being a 'proper' computer, the CHIP has general purpose IO pins, like most microcontrollers. This can provide the interface for the output of the 'clap' switch. In this case, for the sake of example I'm just hooking up a simple LED that will blink on detection of the given sound.

Although the CHIP does have microphone pins and the ability to switch its video pin from the jack to be an audio in, for the sake of prototyping I found it much easier to just use a cheap USB adapter which has microphone and headphone sockets.

Power comes from a standard USB phone charger.

Control is done via a serial connection to my PC, using Minicom.


Software

As the CHIP runs a full-blown Linux distribution, there's lots more flexibility in the software that we can use. I ended up using Java and the Apache Commons Math library.

The basic OS was pre-installed; the Java 8 JDK was installed from here.

The Java code listens to the microphone input, and allows the user to load a JSON file containing details of the filter to apply - this made testing and refining the filter easier. The values are the start and end of the frequency range to listen for, and the amplitude threshold that triggers the output (this can take a bit of trial and error depending on how loud the input is, and may need adjusting if the input volume varies).
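
For reference, capturing the microphone input in Java can be done with javax.sound.sampled along these lines. The format and buffer size here are assumptions for the sketch, not necessarily what the project uses:

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.TargetDataLine;

// Captures 16-bit mono audio from the default input device and converts each
// buffer to doubles, ready to hand to the FFT-based filter.
public class MicCapture {

    public static void main(String[] args) throws Exception {
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false); // 44.1 kHz, signed, little-endian
        TargetDataLine line = (TargetDataLine) AudioSystem.getLine(
                new DataLine.Info(TargetDataLine.class, format));
        line.open(format);
        line.start();

        byte[] buffer = new byte[8192]; // 4096 samples per read (a power of two for the FFT)
        while (true) {
            int read = line.read(buffer, 0, buffer.length);
            double[] samples = new double[read / 2];
            for (int i = 0; i < samples.length; i++) {
                // Little-endian 16-bit signed PCM -> range [-1.0, 1.0]
                int lo = buffer[2 * i] & 0xFF;
                int hi = buffer[2 * i + 1]; // sign-extended
                samples[i] = ((hi << 8) | lo) / 32768.0;
            }
            // 'samples' would now be passed to the band-pass filter / trigger logic
        }
    }
}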

It can also be controlled via the command line to either pass the audio through to the headphones as-is, or post-filter - i.e. you hear what the 'clap' circuit would hear.

Code is on GitHub here.

Configuration

Some configuration was required to enable the GPIO pins to be activated on boot. To do this, the necessary commands (below) are wrapped in the bash script 

/etc/init.d/preparegpio.sh:

echo 1023 > /sys/class/gpio/export
echo out > /sys/class/gpio/gpio1023/direction
echo 0 > /sys/class/gpio/gpio1023/value
chown chip:chip /sys/class/gpio/gpio1023/value
 

(Refer to the CHIP docs for what exactly these mean)

This is set to run on boot by adding the below line to /etc/rc.local

sh /etc/init.d/preparegpio.sh

Finally, to blink the LED, the Java code triggers another shell script. I went this route as I intend to develop the Java code into a more general-purpose audio tool, so I didn't want to tie the code too closely to the hardware I'm using for this project.

It also has the benefit that the audio processing doesn't wait for the GPIO operation to complete, thus reducing the lag.

~/triggergpio.sh

echo 1 > /sys/class/gpio/gpio1023/value
sleep .5
echo 0 > /sys/class/gpio/gpio1023/value
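
For illustration, the Java side can fire that script off without blocking the audio loop with something like the following (a sketch only; the actual project code may differ):

import java.io.IOException;

// Fires the GPIO blink script without waiting for it to finish, so the
// audio-processing loop is not delayed by the GPIO writes and the sleep.
public class TriggerGpio {

    public static void trigger() {
        try {
            new ProcessBuilder("sh", System.getProperty("user.home") + "/triggergpio.sh")
                    .inheritIO()
                    .start(); // returns immediately; the script runs in the background
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}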

Testing

While it certainly does respond to the input audio as expected, there is definitely some processing lag, as can be seen as the video progresses.

This isn't entirely unexpected, and could be overcome by throwing more computing power at the processing (the CHIP has a 1 GHz processor), or possibly by further optimisation of the code (or porting it to C or similar).

That might be the subject of a follow up project at a later date, but for now, this demonstrates the idea.


 



 

Monday, 21 December 2020

Slack & WhatsApp controlled Christmas Tree Lights

December 2022 update: I've developed this idea further, giving the project more features, including the ability to blink lights in Morse code. The original post remains below.

With the Covid-19 pandemic still dragging on, it looks like things won't be back to normal in time for Christmas.

At work there was discussion about Christmas parties, team building exercises, and other such things to try to boost morale.

I'm currently working on a project that uses WS2812 RGB LEDs (which'll no doubt wind up published here eventually), and on a whim, wondered if there existed fairy lights that use the same chip.

Turns out, there are. So obviously I had to buy some.

My Christmas tree is positioned so that it can be seen in the background when I'm on Zoom calls with work, so I thought it'd be fun if I could set up some system that would let my colleagues interact with the lights.

Once the lights arrived and I'd tested them, I got to tearing them down.

Wiring

The power supply/controller is a USB unit, which I was powering from a phone charger adapter. The controller also contained an infra-red receiver for the included remote. I thought about potentially working that angle, but the remote had limited options, and I wanted to offer more granular control.

From the controller to the lights there were just 3 leads, which I rationalised must be +5V, Ground, and Data.

I cut the wire, figured out which was which, and connected in an Arduino. It wasn't able to provide enough current to drive the LEDs itself, so I made use of the existing charger - the charger remained connected to +5V and Ground, sharing its ground with the Arduino's ground, and the data line was connected to the Arduino.

The circuit. It works fine like this for short periods, although I later found adding a 470 ohm resistor between Arduino D12 and the first LED increased reliability over time.

 

Arduino code

I ended up using the PololuLedStrip library, and modifying the existing code for that.

Initially I'd hoped to be able to provide full RGB control, but it seemed that the sheer amount of data being sent meant bytes were being lost, or I had to slow the data rate down until it was unusably slow.

Eventually I settled on using 10 preset colours, defined with numbers 0-9, being sent as a 100-character serial string - 1 character (colour) per LED. This seemed to be the best trade off of functionality and reliability.

The code can be found here.

 

The 'server'

This is a simple web server that accepts GET requests on one endpoint, and exists purely as a go-between for the messaging apps and the serial port.

As each LED is individually addressable, I wanted to create a format that would give users control at that level. I settled on messages with the following JSON structure:

[1,"colour"]

or

[[1,2,3], "colour"]

where the numbers are the IDs of the LEDs, and the string is one of the available colours.

This would let them control either an individual LED or a series of them in a single command.
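
For illustration, once the JSON has been parsed, turning a command into the 100-character serial string described in the Arduino section above might look something like this. The preset colour list and class name are illustrative, not the project's actual code:

import java.util.Arrays;
import java.util.List;

// Builds the 100-character serial frame the Arduino expects: one digit (0-9,
// a preset colour) per LED.
public class SerialFrameBuilder {

    private static final int LED_COUNT = 100;
    private static final List<String> PRESETS =
            Arrays.asList("off", "red", "green", "blue", "white", "yellow",
                          "purple", "orange", "pink", "cyan"); // indices 0-9, illustrative only

    private final char[] frame = new char[LED_COUNT];

    public SerialFrameBuilder() {
        Arrays.fill(frame, '0'); // everything off to start with
    }

    // Applies a command such as [[1,2,3],"red"] after it has been parsed from JSON.
    public void apply(int[] ledIds, String colour) {
        int preset = PRESETS.indexOf(colour.toLowerCase());
        if (preset < 0) return; // unknown colour - ignore
        for (int id : ledIds) {
            if (id >= 0 && id < LED_COUNT) {
                frame[id] = Character.forDigit(preset, 10);
            }
        }
    }

    // The 100-character string to write to the serial port.
    public String toSerialString() {
        return new String(frame);
    }
}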

Hooking into Slack and WhatsApp

As we use Slack at work, and already have some previous integration with it, it seemed like the obvious choice.

However, their API functionality is heavily focused on HTTP endpoints, and requires a server to interface with it. That's fine for 'proper' development, but when I'm just messing around at home, I don't really want to be dealing with opening up my home network and all that entails.

Really what I wanted was some client-side plugin functionality. There isn't an official API or interface for one, so I had to improvise.

Using the web-browser interface and Firefox's JavaScript console, I added a MutationObserver to the page, and used the DOM to narrow in on the message elements and get the contents of the latest message in whatever chat/channel the user was in.

The JavaScript for Slack can be found here.

Once a message is retrieved, some basic checks are done to make sure that it is in a usable format and hasn't been acted upon already. It's then passed to a local server via HTTP, where it's processed, converted to the serial format used by the Arduino, and sent onward to it.

After realising how useful MutationObserver is for things like this, it was quite trivial to do the same for WhatsApp.

The finished product


Monday, 22 June 2020

Context-sensitive macro keypad

I've been doing quite a bit of 3D work recently for both 3D printing and VR software development.

The main software that I use for 3D work are OpenSCAD and Blender.

One annoyance I have is that the shortcut keys each uses for standard things such as scale, rotate, move, etc. are all different, which makes switching between applications more awkward than it needs to be.

So my latest electronics project is to create universal shortcut keys - a physical keypad that has single button presses for those functions, but then translates them into the relevant keyboard/mouse presses depending on which application has focus at the time.

The hardware
As we're still under lockdown due to COVID-19, I'm restricted to using only components and tools I already have at home, as with the USB Switch project.

The keypad itself is this mechanical key switch tester.



Unfortunately the eighth switch was lost some time ago. I'll leave the gap there as potential room for expansion in future though.

The controller is an Arduino Pro Micro. I thought about using a different microcontroller, however the real sticking point was my lack of a spare USB to serial adapter.

The Pro Micro helps keep the overall form factor nice and compact, whilst keeping the programming side of things straightforward.

I picked out some LEDs to backlight the keys as there was spacing for them, and the keycaps are translucent. I tried to find all different colours, but my inventory didn't allow for that, so there's a couple of duplications.

The rest of the hardware is just some diodes and resistors, a bit of perfboard and some scavenged wires.

The hardware is wired up like so:
Quite simply, there are two matrices, one for the switches, and one for the LEDs. Because of the missing key, one column in each only has three connections.


The case is just some plastic container that would otherwise have been trash. It wasn't a perfect fit so required some Dremel-based customisation.

The Firmware

The Arduino will stay set up as a USB-to-serial device rather than as a keyboard because there needs to be bi-directional communication - the Arduino also needs to receive input from the application in order to control the lights.

This post was a useful resource when putting together the matrices.

The full source is available in the project on GitHub.

The Software
The software is a Java desktop app that communicates with the Arduino using jSerialComm. It controls which keys are 'set' and indicates this with the LEDs.
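
Opening the port and sending data with jSerialComm looks roughly like this; the device path, baud rate and message format are assumptions for the sketch, not the project's actual protocol:

import com.fazecast.jSerialComm.SerialPort;

// A rough sketch of opening the Arduino's serial port and sending a frame.
public class KeypadSerial {

    public static void main(String[] args) {
        SerialPort port = SerialPort.getCommPort("/dev/ttyACM0"); // assumed device path
        port.setBaudRate(9600);
        if (!port.openPort()) {
            System.err.println("Could not open serial port");
            return;
        }
        byte[] message = "L1:ON\n".getBytes(); // illustrative LED command, not the real protocol
        port.writeBytes(message, message.length);
        port.closePort();
    }
}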

The interfacing with the other applications comes from a thread that makes system calls to the 'xdotool' Linux command and reads in its output, which is the name of the window that currently has focus. There's room here to make the application multi-platform by implementing a similar Windows command in something like AutoHotKey.

If the window matches the defined rule set, the relevant keys are highlighted and a thread monitors serial input. If one of those keys is pressed, Java's built-in Robot class sends the relevant commands to the application.
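
As a rough illustration of those two pieces - polling xdotool for the focused window and sending a key press with Robot - a sketch along these lines (the key mapping is illustrative, not the project's actual rule set):

import java.awt.Robot;
import java.awt.event.KeyEvent;
import java.io.BufferedReader;
import java.io.InputStreamReader;

// Polls xdotool for the focused window's name and, when a macro key arrives
// over serial, would send an application-specific key press via java.awt.Robot.
public class FocusWatcher {

    static String activeWindowName() throws Exception {
        Process p = new ProcessBuilder("xdotool", "getactivewindow", "getwindowname").start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            return r.readLine();
        }
    }

    static void sendScaleShortcut(String windowName, Robot robot) {
        // Illustrative mapping only: Blender uses 'S' for scale; other
        // applications would get their own mapping here.
        if (windowName != null && windowName.contains("Blender")) {
            robot.keyPress(KeyEvent.VK_S);
            robot.keyRelease(KeyEvent.VK_S);
        }
    }

    public static void main(String[] args) throws Exception {
        Robot robot = new Robot();
        while (true) {
            String window = activeWindowName();
            // In the real application, serial input from the keypad decides which
            // shortcut to send; here we just report the focused window.
            System.out.println("Focused: " + window);
            Thread.sleep(1000);
        }
    }
}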

As with the firmware, the source is available on GitHub. It's in a pretty simplistic state at the moment due to the one-week time limit I've placed on these lockdown projects, but there's room to expand and improve in future.