posts tagged with the keyword ‘processing’


Simple Photobooth

It’s become a tradition around here to update my simple photo booth using Processing when a new version of Processing comes out. I’m not sure Processing 3.x is final yet, but I’m using it, and it’s got all sorts of good stuff. (You probably remember Processing PhotoBooth v2 and Processing PhotoBooth, which are both deprecated now, but see them to know what I’m talking about.)

One of the new things in Processing 3 is the fullScreen() function, which gets rid of the whole “figure out the size of the display” issue by just saying “run at full screen”!

There’s also a new thing called settings() which can appear before setup(), but I won’t get into that…

Here’s some code!

// PhotoBoothV3.pde
import processing.video.*;

Capture cam;

void settings() {
  fullScreen();
}

void setup() {
  String[] cameras = Capture.list();
  if (cameras.length == 0) {
    println("There are no cameras available for capture.");
    exit();
  } else {
    cam = new Capture(this, cameras[0]);
    cam.start();
  }
}

void draw() {
  if (cam.available()) {
    cam.read();
    image(cam, 0, 0);
  }
}

void keyPressed() {
  if (key == ' ') {  // space bar: save a snapshot
    saveFrame("photobooth-####.png");
  }
}
And hey, once again you’ve got a simplistic photo booth application. Congratulate yourself by purchasing this lovely button for it. (Or get this “bare” button and build your own damn case.)

Now, I don’t know if the fullScreen() thing has a bug, or if it’s my setup, but here’s what I’m seeing. I typically run my display at 1440×900 using QuickRes, which is a non-standard setup. When I ran the sketch it seemed to display at 1280×720 in the upper-left corner. My guess is that the camera is only capturing 1280×720, so the sketch only fills that amount of the display, no matter what the resolution is. I’ve tested it at higher resolutions and get the same thing. If I set the display to 1280×800 it’s all good.
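That matches the math: a 1280×720 camera frame drawn at 1:1 only covers part of a 1440×900 screen, but exactly fills 1280×800 (minus a sliver of height). One fix would be scaling the frame to fit the display (Processing’s image() takes optional width/height arguments for this). Here’s a quick sketch of the scale calculation in plain Java; the fit() helper is my own, not anything from Processing:

```java
// Compute the largest size a source image can be drawn at while fitting
// inside a screen and keeping its aspect ratio (uniform scale, letterboxed).
public class FitImage {
    static int[] fit(int imgW, int imgH, int screenW, int screenH) {
        double scale = Math.min((double) screenW / imgW,
                                (double) screenH / imgH);
        return new int[] { (int) Math.round(imgW * scale),
                           (int) Math.round(imgH * scale) };
    }

    public static void main(String[] args) {
        int[] r = fit(1280, 720, 1440, 900);
        // a 1280x720 frame scales to 1440x810 on a 1440x900 screen:
        // full width, with a 90-pixel letterbox left over vertically
        System.out.println(r[0] + "x" + r[1]);
    }
}
```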

This is most likely not a bug, but a “thing to be aware of” in the future…

Besides all that, the most exciting thing about Processing lately is that there is finally an official version that runs on the Raspberry Pi! This is super-awesome and has great potential for artists and others who do exhibits and installations. I’ve already got a few ideas in the works. ;)



I recently prototyped a device to read cards (physical cards with printing on them) for a project. I used five SparkFun Digital Line Sensor Breakout Boards attached to a 3D printed mount and wired up to an Arduino.

Card and Sensors

The cards have five blocks at the bottom, which are either black or white, representing 1 or 0. Using ones and zeroes allows us to create a binary encoding scheme, so with five positions we use 1, 2, 4, 8, 16 for the values and can represent any number from 1 to 31.
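The decoding is simple enough to sketch in a few lines of plain Java (the function name and the “position 0 is the 1s place” convention are mine, not from the project):

```java
// Decode a card: each block is 0 (white) or 1 (black), and position i
// contributes 2^i, so five positions are worth 1, 2, 4, 8, 16.
public class CardDecode {
    static int decode(int[] blocks) {
        int value = 0;
        for (int i = 0; i < blocks.length; i++) {
            if (blocks[i] == 1) {
                value += 1 << i;
            }
        }
        return value;
    }

    public static void main(String[] args) {
        // black, white, black, white, black -> 1 + 4 + 16 = 21
        System.out.println(decode(new int[] {1, 0, 1, 0, 1}));
    }
}
```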

Sensor Mount

I started by grabbing the image of the sensors from the SparkFun product page and dropping them into Inkscape (sized appropriately) so I could design the barcode part of the card, and so I could design the mount for the sensors.

Sensor Mount

Once I had a 2D design in Inkscape I exported it as a DXF file and used the linear_extrude command in OpenSCAD to create a 3mm tall plate, and then added another plate. It wasn’t perfect, but it was fast. I started the 3D printer while I got to work soldering…



Sensors all soldered up, mounted to the plate with 3mm screws, and wired to an Arduino via a breadboard. All of this is still prototyping stage. It doesn’t look pretty, but it worked and it was enough to test things out and do a demo.

Cards with Barcodes

Here’s an example of some card templates. Can you determine what number is being passed by reading it in binary? Since we’ve got 5 positions we can have 31 different cards… If you needed 63 cards, you’d need 6 positions (and one more sensor). 127 cards? That would be 7 positions and two more sensors. Any more than that and you might consider the SparkFun Line Follower Array, which has 8 sensors on a single board.
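That pattern is just 2^n − 1 (reserving the all-white card as “no card present”), which you can sanity-check in one line of Java:

```java
// Maximum distinct cards for n sensor positions, treating all-zeros
// as "no card" -- so n positions give 2^n - 1 usable codes.
public class CardCapacity {
    static int maxCards(int sensors) {
        return (1 << sensors) - 1;
    }

    public static void main(String[] args) {
        System.out.println(maxCards(5)); // 31
        System.out.println(maxCards(6)); // 63
        System.out.println(maxCards(7)); // 127
    }
}
```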

Card and Sensors

The total time to create this prototype was just a few hours from starting a design in Inkscape to 3D printing a piece, soldering up and mounting the sensors, and writing the code. (I also wrote a simple Processing application which read the serial output from the Arduino to display the card data on screen.)
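The post doesn’t show the serial format, but if the Arduino printed each card as a line of five ‘0’/‘1’ characters (my assumption, not documented above), the Processing side could turn it into a number with a single radix-2 parse:

```java
// Parse a card line like "10101" into its numeric value.
// ASSUMPTION: the Arduino sends one card per line, most-significant
// position first -- that wire format is a guess for illustration.
public class CardSerialParse {
    static int parseCard(String line) {
        return Integer.parseInt(line.trim(), 2);
    }

    public static void main(String[] args) {
        System.out.println(parseCard("10101")); // prints 21
    }
}
```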



In a recent post I mentioned a silly Processing sketch, and how Vishal and I made the mouse pointer jump around the screen using a Teensy as a USBHID device. This worked fine, but I mainly did it due to lack of time…

The correct way to make the mouse pointer jump around the screen in Processing is below.

import java.awt.*;
import java.awt.event.*;

Robot robot;

void setup() {
  try {
    robot = new Robot();
  }
  catch (AWTException e) {
    println("Robot class not supported by your system!");
    exit();
  }
}

void draw() {
  robot.mouseMove(int(random(1, displayWidth)), int(random(1, displayHeight)));
}
This code does not work in ProcessingJS, but if you’re running Processing sketches in the IDE or as a standalone application, it works great.


X11_ssh_tunnelling by Tene~commonswiki

When I first started using *nix-based operating systems I played with xeyes, which is a “follow the mouse X demo” and a very simple program. (Really, it’s not fancy, but 25 years ago, it was sort of neat.)

A few days before Maker Faire Milwaukee Vishal and I were brainstorming ideas for something done in Processing to show with a projector, and I suggested xeyes because it was silly and simple and we hadn’t slept much.

I figured there was a Processing version out there, and our old pal Alex has one at Tinkerlog. I grabbed it and started hacking. Someone mentioned putting the old Milwaukee Makerspace logo in place and putting the eyes on it. (It may have been me, I honestly don’t remember, again… not much sleep.)

Then Lexie showed up and I ran the demo, and she suggested there should be a fly for the cursor. In my tired state I thought this was a great idea, so I checked on how cursors work in Processing (yes, you can use an image), then found a fly image and added it.


Now we had something that let you move the mouse around and the fly would move and the eyes would follow it. I sent Vishal the code and he had a trackpad he thought about using with it… But then he said it would be cool if it just moved around on its own. I didn’t have time to write the code, so Vishal asked if I had a Teensy on me, and since I always do, I gave it to him. He then wrote code to make the Teensy act as a mouse and randomly move around the screen.

We hacked this all together pretty quickly, and it was fun, and not super-impressive, but we liked it. Oh, I also made a Processing.js version you can try. (It’s an early test version before I added the fly.)


A few days after Maker Faire I got an email from Bryan Cera about running Processing on the Raspberry Pi, which we had been discussing. He got it working, so I finally circled back around to give it a try. Well, it worked, and I got this mms-xeyes thing running as a full-on application.

This is pretty awesome. I mean, the cursor is a little weird, and disappears when you do not move the mouse (but reappears when you do move it) but overall it does work, and I’m pretty pleased with it. I’ve got a few ideas that involve Raspberry Pi computers running Processing sketches, so yeah… overall, this is good.


Banana Logo

I’m posting this because someone asked for it, and I aim to please… Here’s the Banana Pong code. I used code someone else wrote to bootstrap this thing, but there was no comment about who wrote it, and I didn’t make record of where I grabbed it, so… no attribution. Sorry! If it’s your code, let me know.

Have fun playing Banana Pong with your MaKey MaKey!

Note: The first ZIP file is the Processing source code. The second is a Mac OS X application. Since Processing has changed how it exports applications I can’t easily create versions for Windows and Linux like I did for the Apple Piano code. So if you want a Windows or Linux standalone version, you’ll need to grab the code and do it yourself. It should serve as a good starting point.
