
LED VEST

My LED fur vest.

It is a lot of fun to take out at night, especially at events like White Night.

In total I have spent about 120 hours constructing the vest, trialling three different battery systems, writing the software, and creating animations for it.

The vest uses 470 WS2812 LEDs on strips of 30 LEDs per meter, with a Teensy 3.1 to drive the lights. The animations are a mix of plasma algorithms and rendered video.

Hackaday was kind enough to feature the vest on their website: http://hackaday.com/2014/04/16/wearable-flames-with-fur-and-led-strips/

To see how far I could go with it, I decided to render Star Wars (Episode IV) and see if the vest could play the whole movie. And it can!

 

VEST CONSTRUCTION

The vest was sewn together by me: I traced a regular vest, cut the pieces, and put it together. (Thanks for the lend of the sewing machine, Mum!)

The inside uses regular jacket lining so the vest can slip around freely and not get caught, which could stretch the electronics and break them.

White fur was attached over the top of the vest for diffusion.

A zipper in the bottom permits access to the electronics.


LEDS

470 WS2812 LEDs on regular non-waterproof strips, 30 per meter, were cut to size and mounted vertically. This keeps bending of the strips to a minimum.

Each strip was sheathed in 12 mm clear heat shrink to protect the LEDs and keep water and playa dust off.

The strips run in one continuous sequence, up-down-up-down, around the entire vest, with the lights at uniform spacing to create a consistent grid.

 

MICROCONTROLLER

After several attempts to generate plasma live on a Teensy 3.0, which were successful but ran at a low 4 fps due to the hefty maths, I opted for precompiled animations. Playback now runs at 31 frames per second, governed back to 25.



ANIMATION METHODS

The trick to getting the entire vest to operate like a canvas, rather than just turning lights on and off, is to use an index of 470 LED objects, each holding the X-Y coordinates of the corresponding LED on the vest.
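As a minimal C++ sketch of such an index (the column length and serpentine build are assumptions for illustration, not the vest's actual dimensions):

#include <stdint.h>

#define LED_COUNT 470

// One entry per LED: where that LED sits on the vest "canvas".
struct LedPoint {
  uint16_t x;   // horizontal pixel coordinate in the source image
  uint16_t y;   // vertical pixel coordinate in the source image
};

LedPoint ledIndex[LED_COUNT];

// Fill the index for a serpentine (up-down-up-down) strip layout.
// ledsPerColumn is an illustrative parameter, not the real figure.
void buildIndex(uint16_t ledsPerColumn) {
  for (uint16_t i = 0; i < LED_COUNT; i++) {
    uint16_t col = i / ledsPerColumn;
    uint16_t pos = i % ledsPerColumn;
    // Alternate columns run in the opposite direction.
    uint16_t row = (col % 2 == 0) ? pos : (ledsPerColumn - 1 - pos);
    ledIndex[i].x = col;
    ledIndex[i].y = row;
  }
}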

This index can then be used to pick the corresponding pixel locations from a video. The animation of fire was sourced as a video; interestingly, animated fire looked better than real fire.

I initially used Processing to create the plasma effect frame by frame (as featured in the first video above). As each frame is rendered in Processing, I grab the color of the pixels from the generated image, based on the X-Y coordinates of all the LED objects in the index.

The data is collected pixel by pixel, and then stored in a file as raw color data, with three bytes per pixel.

However, Processing was very slow to render the plasma, so I wrote a new plasma generator in C#.NET. This one can render thousands of frames per second.

How do we generate Plasma? CLICK HERE TO FIND OUT.
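For a taste of the technique before clicking through: a classic sine-sum plasma assigns each pixel a value built from overlapping sine waves of position and time. A minimal C++ sketch of the general idea (not necessarily the exact formula used for the vest):

#include <math.h>
#include <stdint.h>

// Each pixel's value is a sum of sine waves of position and time,
// scaled to a 0-255 palette index. The wave frequencies here are
// illustrative; tweaking them changes the character of the plasma.
uint8_t plasma(float x, float y, float t) {
  float v = sinf(x * 0.10f + t)
          + sinf(y * 0.13f - t * 0.7f)
          + sinf((x + y) * 0.08f + t * 1.3f)
          + sinf(sqrtf(x * x + y * y) * 0.15f);
  // v lies in [-4, 4]; map it to [0, 255].
  return (uint8_t)((v + 4.0f) * (255.0f / 8.0f));
}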

The moving balls were created with Processing by accelerating circles toward random points on the screen. Once they reach their target point, they accelerate toward a new randomly selected target.
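A rough C++ sketch of that scheme, with illustrative constants rather than anything from the original Processing code:

#include <stdlib.h>

// One ball that accelerates toward a random target and picks a new
// target once it arrives.
struct Ball {
  float x, y;    // position
  float vx, vy;  // velocity
  float tx, ty;  // current target
};

void updateBall(Ball &b, float w, float h) {
  float dx = b.tx - b.x, dy = b.ty - b.y;
  // Close enough to the target? Choose a new random one.
  if (dx * dx + dy * dy < 25.0f) {
    b.tx = (float)(rand() % (int)w);
    b.ty = (float)(rand() % (int)h);
    return;
  }
  // Accelerate toward the target, with a little damping.
  b.vx = b.vx * 0.98f + dx * 0.002f;
  b.vy = b.vy * 0.98f + dy * 0.002f;
  b.x += b.vx;
  b.y += b.vy;
}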

This video shows the ball animation video produced in Processing, and the corresponding render on the vest in synchronized time.

You can read about how I created this animation in BALL TRACKING ANIMATION.

 

PLAYBACK

Data is stored in sequence, red-green-blue, from the first LED to the last in each frame, and then each frame is placed one after the other. The file length in bytes is:

LedCount * 3 * NumberOfFrames

The first three bytes are the red, green and blue colors for the first LED; the fourth, fifth and sixth bytes are the red, green and blue colors for the second LED.

In total there are 1410 bytes per frame (470 × 3).
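The offset arithmetic is worth spelling out, since playback relies on it; a small C++ sketch:

#include <stdint.h>

// File layout: 470 LEDs, 3 bytes (red, green, blue) per LED.
const uint32_t NUM_LEDS        = 470;
const uint32_t BYTES_PER_LED   = 3;
const uint32_t BYTES_PER_FRAME = NUM_LEDS * BYTES_PER_LED;   // 1410

// Byte offset where a given LED's data starts (frame and led zero-based).
uint32_t offsetOf(uint32_t frame, uint32_t led) {
  return frame * BYTES_PER_FRAME + led * BYTES_PER_LED;
}
// offsetOf(1, 0) == 1410, i.e. the second frame begins at the 1411th byte.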

It happens that the RGB colors extracted from a bitmap image are in the same form the WS2811 uses (separate red, green and blue bytes), so you can simply grab the pixel colors at the corresponding X-Y coordinates from each video frame generated in Processing and send them to the corresponding LEDs.

For video mapping, as with frames generated in Processing, I simply process each frame of the video: iterate the LED array, get the pixel color at each LED's X-Y coordinate, and store the colors as raw bytes in a file.
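As a sketch of that extraction step, assuming each frame is available as a raw 24-bit RGB buffer (an assumption for illustration; the real pipeline reads Processing output and video frames) and reusing the LedPoint index from the earlier sketch:

#include <stdio.h>
#include <stdint.h>

// Sample the pixel under each LED's X-Y coordinate and append three
// bytes per LED to the output file.
void writeFrame(FILE *out, const uint8_t *rgb, int width,
                const LedPoint *ledIndex, int ledCount) {
  for (int i = 0; i < ledCount; i++) {
    const uint8_t *px = rgb + (ledIndex[i].y * width + ledIndex[i].x) * 3;
    fputc(px[0], out);   // red
    fputc(px[1], out);   // green
    fputc(px[2], out);   // blue
  }
}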

To play back the precompiled frames, I have a Teensy 3.1 with an SD card attached.

Using the NeoPixel library to manage the strip of LEDs, I get the first three bytes from the file, pack them end to end into a single integer (which is 32 bits, but the NeoPixel library just ignores the 8 most significant bits), and copy the integer into the first location of the strip object. The second three bytes from the file go to the second LED in the strip object… and so on for all 470 lights in that frame.

To play the next frame, just get the next lot of bytes from the file, which starts at the 1411th byte…
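Putting it together, the playback loop might look roughly like the following Arduino-style sketch. The data pin, chip-select pin and file name are placeholders, and delay-based pacing is cruder than the real 31-to-25 fps governor:

#include <SD.h>
#include <Adafruit_NeoPixel.h>

#define LED_PIN    6                  // placeholder data pin
#define SD_CS      10                 // placeholder SD chip-select pin
#define NUM_LEDS   470
#define FRAME_SIZE (NUM_LEDS * 3)     // 1410 bytes per frame
#define FRAME_MS   40                 // roughly 25 frames per second

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);
File anim;
uint8_t frame[FRAME_SIZE];

void setup() {
  SD.begin(SD_CS);
  anim = SD.open("ANIM.BIN");         // placeholder file name
  strip.begin();
}

void loop() {
  // Read one frame's worth of bytes; loop the file when it runs out.
  if (anim.read(frame, FRAME_SIZE) < FRAME_SIZE) {
    anim.seek(0);
    return;
  }
  // Pack each LED's red, green and blue bytes into one 32-bit color.
  for (int i = 0; i < NUM_LEDS; i++) {
    strip.setPixelColor(i, Adafruit_NeoPixel::Color(
        frame[i * 3], frame[i * 3 + 1], frame[i * 3 + 2]));
  }
  strip.show();
  delay(FRAME_MS);                    // crude pacing to ~25 fps
}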

 

 

PARTS LIST:

  • 1 × Teensy 3.1
  • 470 × WS2812B LEDs
  • 1 × Micro SD card reader
  • 1 × Standard dress vest
  • 1 × Faux white fur, 2 meters
  • 6 × Super glue

 

BATTERY SYSTEM:

I have made three battery systems in total, each improving on the previous.

The first was a 6 volt lead-acid battery weighing 1.5 kilos. I chose it because the voltage was usable and I didn't want to use a regulator. I kept the battery in a harness on a belt; after 4 hours my big toes would go numb. Not good.

The second battery array was 25 18650 lithium-ion batteries on a belt intended for shotgun cartridges. It works well as a belt, but I got only 2 hours of run time out of it, which is not great given it should hold 50 amp-hours. Dodgy batteries! And 25 lithium cells wired in parallel is a big fire hazard, so I dismantled it.

The current power pack uses four 5000 mAh 7.4 V LiPo batteries in parallel, heavily fused, with a 15 amp regulator.

Typical current draw is 3 amps, so the regulator won't get hot. This battery system is stable and reliable.

 
