
Dancing LEGO Robots

I decided it was time to see if I could make Mindstorms motors move in sync with music. This project was tough to get working, but worth the effort. I built three little robots, seen in the video above, based on contraptions described in Yoshihito Isogawa's excellent book "The LEGO Mindstorms EV3 Idea Book." Bass sounds (centered at 63 Hz) trigger motion in the flapping robot on the left, midrange sounds (centered at 400 Hz) trigger motion in the pumping-arms robot in the middle, and high-frequency sounds (centered at 2500 Hz) trigger motion in the spinning robot on the right. A mixture of spectral sounds, like music, triggers a variety of dancing motions.

The music in the video is a guitar solo from my 2003 album "Guitar Musing." The song is called "Keep Moving . . . Change Lanes Later." I wanted to use a Rush song ("Tom Sawyer" looks great on this LEGO dancing machine), but that would have been a copyright mess. So Grady's music it is. The song opens with a rhythm groove from a guitar synthesizer, then adds a techno-Taurus synth riff, then adds guitar. The overlaid guitar synths and guitar are done in real time with a Boss Loop Station. But guitar-geek conversation is for another day . . .

Signal processing for the Dancing LEGO Robots is hosted on a Raspberry Pi, along with a BrickPi3 and a GrovePi, both from Dexter Industries. The BrickPi3 is a great new device with a very fast update rate to the Mindstorms motors. It's this high-speed motor control that allows the dancing motion.
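
For reference, a minimal sketch of driving a Mindstorms motor through the brickpi3 Python library looks something like this; the port letter, power level, and timing are placeholders for illustration, not the exact values used in this project.

import time
import brickpi3

# Connect to the BrickPi3 and pulse a motor briefly.
# Port letter and power level are placeholders, not the project's actual values.
BP = brickpi3.BrickPi3()
try:
    BP.set_motor_power(BP.PORT_A, 60)   # power in percent, -100 to 100
    time.sleep(0.5)
    BP.set_motor_power(BP.PORT_A, 0)    # stop
finally:
    BP.reset_all()                      # release motors on exit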

To analyze sound and create a spectrum analysis, I used the Audio Analyzer from DFRobot (dfrobot.com). This analyzer includes a microphone (seen in the video mounted between the arm-pumping robot and the spinning robot). A separate, small circuit board breaks the sound down into analog signals in seven audio frequency bands. I'm only using three of the audio bands in this project, centered at 63 Hz, 400 Hz, and 2500 Hz. The Audio Analyzer requires a strobe sequence of logic signals, shown in the oscilloscope screenshot below as the yellow trace. I implemented the logic strobe using a GPIO pin on the Raspberry Pi. After the falling edge of each strobe, the Audio Analyzer produces a voltage proportional to the audio level in a particular frequency band. This analog output is shown in the oscilloscope trace as the blue signal. The next task was to sample the analog audio output for the frequency bands of interest.
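
Here is a hedged sketch of that strobe sequence in Python, assuming the analyzer's reset and strobe lines are wired to two Raspberry Pi GPIO pins; the BCM pin numbers and delays below are illustrative, not the project's actual wiring. The analog read itself is left as a callback, since that happens on the GrovePi as described next.

import time
import RPi.GPIO as GPIO

RESET_PIN = 23    # assumed BCM pin for the analyzer's reset line
STROBE_PIN = 24   # assumed BCM pin for the analyzer's strobe line

GPIO.setmode(GPIO.BCM)
GPIO.setup(RESET_PIN, GPIO.OUT)
GPIO.setup(STROBE_PIN, GPIO.OUT)

def strobe_cycle(read_band):
    """Step through all seven bands, calling read_band(i) after each strobe falling edge."""
    # A reset pulse returns the analyzer to the first (63 Hz) band
    GPIO.output(RESET_PIN, GPIO.HIGH)
    time.sleep(0.0001)
    GPIO.output(RESET_PIN, GPIO.LOW)

    levels = []
    for band in range(7):
        GPIO.output(STROBE_PIN, GPIO.HIGH)
        time.sleep(0.0001)
        GPIO.output(STROBE_PIN, GPIO.LOW)   # band output appears after the falling edge
        time.sleep(0.0001)                  # let the analog output settle
        levels.append(read_band(band))
    return levels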

This analog sampling was done on one of the GrovePi's analog ports. A Python program reads the analog signal and raises a flag if the audio level goes above a threshold. The flag in turn activates the motor.
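
A simplified sketch of that threshold logic might look like the following, assuming the analyzer has already been strobed to the band of interest; the GrovePi port, threshold value, and motor power are illustrative guesses rather than the project's settings.

import time
import grovepi
import brickpi3

ANALOG_PORT = 0     # GrovePi analog port A0 (assumed)
THRESHOLD = 300     # 10-bit ADC counts; tune to taste
BP = brickpi3.BrickPi3()

try:
    while True:
        level = grovepi.analogRead(ANALOG_PORT)   # 0-1023 reading of the analyzer output
        if level > THRESHOLD:
            BP.set_motor_power(BP.PORT_B, 70)     # flag set: make the robot dance
        else:
            BP.set_motor_power(BP.PORT_B, 0)      # flag clear: hold still
        time.sleep(0.01)
except KeyboardInterrupt:
    BP.reset_all()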

Oscilloscope screenshot of audio analyzer signals.