
In Advance of Troubleshooting

Teensy

I’m working on a new exhibit that will be using an Arduino (actually, a Teensy++ 2.0) to talk to an application running on a PC via serial data. The Teensy will be sending one byte to control the application’s behavior. This is an upgrade from an older version where the Teensy just sent keystrokes to the application. The nice thing about sending keystrokes was that it was very easy for anyone to troubleshoot, because they could just open Notepad and press some buttons to see if any output was coming through. The bad part was that if a normally closed switch was left open, it would just stream characters to the computer, which could make things hard to troubleshoot for some people.

ATMTester

To deal with the troubleshooting issue (which will eventually come up, as it always does) and make it easy for non-technical people to view a serial data stream, I wrote a simple application in Processing that reads the byte and displays the value, along with the status of each physical control of the exhibit.
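The full ATMTester sketch isn’t shown here, but the core idea is small enough to sketch out. Here’s a minimal version, assuming the Teensy is the first port in the list and that each bit of the byte maps to one physical control (the real exhibit’s mapping may be different):

import processing.serial.*;

Serial port;
int lastValue = -1;

void setup() {
  size(400, 220);
  textSize(14);
  // assumes the Teensy is the first port in the list; the real app asks you to pick one
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  background(0);
  while (port.available() > 0) {
    lastValue = port.read();  // the single status byte, 0-255
  }
  if (lastValue >= 0) {
    text("Byte value: " + lastValue, 20, 30);
    // show each bit as the on/off state of a physical control
    for (int i = 0; i < 8; i++) {
      boolean on = ((lastValue >> i) & 1) == 1;
      text("Control " + i + ": " + (on ? "ON" : "OFF"), 20, 60 + i * 20);
    }
  }
}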

ATMTester

The exhibit should always have the Teensy plugged into COM3 on the PC, but again, once something leaves the shop we never know what strange things might happen. When the application starts up it presents a dialog listing the COM ports and asks you to select the correct one. If you select the wrong one, it will just display nothing. This should be enough to help troubleshoot things via phone or email.

The trickiest part was the code to choose the COM port. (I know, we don’t call them “COM ports” on Mac OS X, and yes, the application works fine on Mac OS X; that’s another thing I love about Processing.) The code for choosing the COM port came from the forum thread “How to let the user select COM (serial) port within a sketch?”.
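The thread has the details, but the basic idea is a Serial.list() call plus a dialog. Here’s a minimal sketch of that approach (the baud rate and prompt text are just placeholders):

import processing.serial.*;
import javax.swing.JOptionPane;

Serial port;

void setup() {
  size(400, 200);
  String[] ports = Serial.list();
  if (ports.length == 0) {
    JOptionPane.showMessageDialog(null, "No serial ports found.");
    exit();
    return;
  }
  // pop up a dialog listing every port and let the user pick one
  String choice = (String) JOptionPane.showInputDialog(null,
    "Select the serial port (should be COM3 on the exhibit PC):",
    "Serial Port", JOptionPane.QUESTION_MESSAGE, null, ports, ports[0]);
  if (choice == null) {
    exit();  // the user hit Cancel
    return;
  }
  port = new Serial(this, choice, 9600);
}

void draw() {
  // read and display the byte here, as in the sketch above
}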

I did have to install Java to get the application to run, but it looks and functions like any other Windows application. Here’s hoping this all works and never has to be used, but is there just in case…


Processing PhotoBooth v3

Simple Photobooth

It’s become a tradition around here to update my simple photo booth using Processing when a new version of Processing comes out. I’m not sure Processing 3.x is final yet, but I’m using it, and it’s got all sorts of good stuff. (You probably remember Processing PhotoBooth v2 and Processing PhotoBooth, which are both deprecated now, but see them to know what I’m talking about.)

One of the new things in Processing 3 is the fullScreen() function, which gets rid of the whole “figure out the size of the display” issue by just saying “run at full screen”!

There’s also a new thing called settings() which can appear before setup(), but I won’t get into that…

Here’s some code!

/**
 * PhotoBoothV3.pde
 */

import processing.video.*;
Capture cam;

void settings() {
  fullScreen();  // new in Processing 3: no more figuring out the display size
}

void setup() {
  colorMode(RGB);
  String[] cameras = Capture.list();
  if (cameras.length == 0) {
    println("No cameras found!");
    exit();
    return;
  } else {
    cam = new Capture(this, cameras[0]);  // use the first camera in the list
    cam.start();
  }
  noSmooth();
  background(0);
}

void draw() {
  if (cam != null && cam.available()) {
    cam.read();
    image(cam, 0, 0);
  }
}

void keyPressed() {
  if (key == ' ') {  // space bar takes a picture
    saveFrame("Picture-######.jpg");
  }
}

And hey, once again you’ve got a simplistic photo booth application. Congratulate yourself by purchasing this lovely button for it. (Or get this “bare” button and build your own damn case.)

Now, I don’t know if the fullScreen() thing has a bug, or if it’s my setup, but here’s what I’m seeing. I typically run my display at 1440×900 using QuickRes, which is a non-standard setup. When I ran the sketch, the camera image only filled a 1280×720 area in the upper-left corner. My guess is that the camera is only capturing at 1280×720, so the image only fills that much of the display, no matter what the resolution is. I’ve tested it at higher resolutions and get the same thing. If I set the display to 1280×800 it’s all good.
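If you just want the picture to fill the screen no matter what the camera is capturing, one workaround is to let image() stretch the frame to the sketch dimensions (at the cost of some scaling):

void draw() {
  if (cam != null && cam.available()) {
    cam.read();
    image(cam, 0, 0, width, height);  // stretch the 1280x720 frame to fill the display
  }
}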

This is most likely not a bug, but a “thing to be aware of” in the future…

Besides all that, the most exciting thing about Processing lately is that there is finally an official version that runs on the Raspberry Pi! This is super-awesome and has great potential for artists and others who do exhibits and installations. I’ve already got a few ideas in the works. ;)


Barcode Binary Card Reader

Sensors

I recently prototyped a device to read cards (physical cards with printing on them) for a project. I used five SparkFun Digital Line Sensor Breakout Boards attached to a 3D printed mount and wired up to an Arduino.

Card and Sensors

The cards have five blocks at the bottom, which are either black or white, representing 1 or 0. Using ones and zeroes gives us a binary encoding scheme: with five positions, worth 1, 2, 4, 8, and 16, we can represent any number from 1 to 31.
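The Arduino sketch isn’t included here, but the decoding math is simple enough to sketch out. Here’s the idea in Processing syntax, with a hard-coded example card standing in for the actual sensor readings:

// five readings from the line sensors, left to right: 1 = black, 0 = white
int[] bits = { 1, 0, 1, 1, 0 };  // a hypothetical example card

int cardNumber = 0;
for (int i = 0; i < bits.length; i++) {
  cardNumber += bits[i] * (1 << i);  // position i is worth 2^i: 1, 2, 4, 8, 16
}
println(cardNumber);  // prints 13 (1 + 4 + 8) for the example above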

Sensor Mount

I started by grabbing the image of the sensors from the SparkFun product page and dropping it into Inkscape (sized appropriately) so I could design both the barcode part of the card and the mount for the sensors.

Sensor Mount

Once I had a 2D design in Inkscape I exported it as a DXF file and used the linear_extrude command in OpenSCAD to create a 3mm tall plate, and then added another plate. It wasn’t perfect, but it was fast. I started the 3D printer while I got to work soldering…

Sensors

Sensors

Sensors all soldered up, mounted to the plate with 3mm screws, and wired to an Arduino via a breadboard. All of this is still at the prototyping stage. It doesn’t look pretty, but it worked and was enough to test things out and do a demo.

Cards with Barcodes

Here’s an example of some card templates. Can you determine what number is being passed by reading it in binary? Since we’ve got five positions we can have 31 different cards… If you needed 63 cards, you would need six positions (and one more sensor). 127 cards? That would be seven positions and two more sensors. Any more than that and you might consider using the SparkFun Line Follower Array, which has eight sensors on a single board.

Card and Sensors

The total time to create this prototype was just a few hours from starting a design in Inkscape to 3D printing a piece, soldering up and mounting the sensors, and writing the code. (I also wrote a simple Processing application which read the serial output from the Arduino to display the card data on screen.)


Mouse Control in Processing

Auto

In a recent post I mentioned a silly Processing sketch, and how Vishal and I made the mouse pointer jump around the screen using a Teensy as a USB HID device. This worked fine, but I mainly did it that way due to lack of time…

The correct way to make the mouse pointer jump around the screen in Processing is below.


import java.awt.*;
import java.awt.event.*;

Robot robot;  // java.awt.Robot can move the system mouse pointer

void setup() {
  fullScreen();  // should be the first line of setup() (or go in settings())
  try {
    robot = new Robot();
  }
  catch (AWTException e) {
    println("Robot class not supported by your system!");
    exit();
  }
  background(255);
}

void draw() {
  // jump the pointer to a random spot on the display, ten times per second
  robot.mouseMove(int(random(1, displayWidth)), int(random(1, displayHeight)));
  delay(100);
}

This code does not work in Processing.js, but if you’re running Processing sketches in the IDE or as a standalone application, it works great.


mms xeyes

X11_ssh_tunnelling by Tene~commonswiki

When I first started using *nix-based operating systems I played with xeyes, which is a “follow the mouse” X demo and a very simple program. (Really, it’s not fancy, but 25 years ago, it was sort of neat.)

A few days before Maker Faire Milwaukee Vishal and I were brainstorming ideas for something done in Processing to show with a projector, and I suggested xeyes because it was silly and simple and we hadn’t slept much.

I figured there was a Processing version out there, and our old pal Alex has one at Tinkerlog. I grabbed it and started hacking. Someone mentioned dropping in the old Milwaukee Makerspace logo and putting the eyes on it. (It may have been me, I honestly don’t remember, again… not much sleep.)

Then Lexie showed up and I ran the demo, and she suggested there should be a fly for the cursor. In my tired state I thought this was a great idea, so I checked how cursors work in Processing (yes, you can use an image), found a fly on OpenClipArt.org, and added it.
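For reference, swapping the cursor for an image in Processing is just a couple of lines. Here’s a minimal sketch, assuming the fly image was saved as fly.png in the sketch’s data folder:

PImage fly;

void setup() {
  size(640, 480);
  fly = loadImage("fly.png");  // Processing recommends a small image, like 16x16 or 32x32
  cursor(fly, 16, 16);         // the x, y here is the cursor's "hot spot" within the image
}

void draw() {
  background(255);
}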

mms-xeyes

Now we had something that let you move the mouse around and the fly would move and the eyes would follow it. I sent Vishal the code and he had a trackpad he thought about using with it… But then he said it would be cool if it just moved around on its own. I didn’t have time to write the code, so Vishal asked if I had a Teensy on me, and since I always do, I gave it to him. He then wrote code to make the Teensy act as a mouse and randomly move around the screen.

We hacked this all together pretty quickly, and it was fun, and not super-impressive, but we liked it. Oh, I also made a Processing.js version you can try. (It’s an early test version before I added the fly.)

mms-xeyes-rpi

A few days after Maker Faire I got an email from Bryan Cera about running Processing on the Raspberry Pi, which we had been discussing. He got it working, so I finally circled back around to give it a try. Well, it worked, and I got this mms-xeyes thing running as a full-on application.

This is pretty awesome. I mean, the cursor is a little weird, and it disappears when you don’t move the mouse (but reappears when you do), but overall it does work, and I’m pretty pleased with it. I’ve got a few ideas that involve Raspberry Pi computers running Processing sketches, so yeah… overall, this is good.