Monthly Archives: December 2008

“crush collision”, a new album from The MD5

No, not really, but wouldn't it be awesome?

This is the team that wrote the paper MD5 considered harmful today: Creating a rogue CA certificate which showed that the security model of HTTPS can be defeated using a combination of attacks on the MD5 hash algorithm plus non-random serial numbers in the popular RapidSSL and FreeSSL certificates.  They used a cluster of 200 PS3s for a weekend of number crunching.


The techno show Crush Collision is on Thursdays at 10pm Eastern on WCBN.

Merge the Ann Arbor Community Television Network with the Ann Arbor District Library’s video services?

Matt Hampel has released a study of Ann Arbor's Community Television Network.  He writes about an organization that was innovative and forward-looking in the 1970s but is now running with little public oversight, with only the most tentative ways of engaging the public through networks other than cable television.

Most notable in the whole discussion is a comparison of CTN's video archiving and online access system with a similar setup at the Ann Arbor District Library.  The library has direct access into the CTN digital network feeds, and uses off-the-shelf software to transcode video for delivery to cable.  Where CTN is hamstrung by a reliance on city IT staff to do technology development – an IT staff that only does necessary maintenance – the AADL has an active IT department that is doing development in support of their mission.

I'm sure that the people at CTN are doing a good job at their core mission of teaching people how to do video production.  The system fails where it fails because there is no corresponding priority on video distribution and access beyond their cable television franchise.  The whole system looks like it would be better off if CTN lopped off the approximately $180,000 per year it spends on City of Ann Arbor IT services and instead merged that effort into the Ann Arbor District Library's existing video efforts.  That would put both innovative production and innovative distribution under the same roof, and give citizens readier access to public production libraries so they can be the media, not just consume it on channel 19.

How to get a copy of the Google Transit data set for your bus system; part 1, AATA

Step 1: ask.

From: "Edward Vielmetti" <edward.vielmetti@gmail.com>
To: aatainfo@theride.org
Date: Tue, 30 Dec 2008 16:20:46 -0500
Subject: data request: schedule data
Cc: editor@annarborchronicle.com

Please provide a copy of the AATA schedule data provided to Google for their
Google Transit application in a digital format as described by

http://code.google.com/transit/spec/transit_feed_specification.html

A favor of a prompt reply is welcomed.

Step 2: wait.

Step 3: wish you didn't have to ask.

Some other systems that are proactive:

Portland TriMet: TransitCast demo

A page full of open access feeds from googletransitdatafeed

Step 4: read up about FOIA.

Our lawyers acquiesced to a Public Records Act request (under the
federal FOIA statute) for the feed, and made the feed site public
(with disclaimer) for our convenience.  That possibility might give
you some attention.
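For reference, the spec linked in the request above describes a feed as nothing exotic: a zip file of plain comma-separated text files (stops.txt, routes.txt, trips.txt, stop_times.txt, and so on).  A toy stops.txt shows how simple the format is; the stop names and coordinates below are made up for illustration, not taken from any real AATA feed.

```shell
# Toy GTFS stops.txt; the stop data here is invented for illustration
cat > stops.txt <<'EOF'
stop_id,stop_name,stop_lat,stop_lon
1,Blake Transit Center,42.2790,-83.7487
2,Central Campus,42.2765,-83.7382
EOF

# Count the stops, skipping the header line
awk 'NR > 1' stops.txt | wc -l
```

Once you have a real feed, the same kind of one-liners work on it directly; there is nothing binary or proprietary to fight with.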

stuck on a postcard / postful / curl problem

FIXED, see the bottom.  12/31

Rather than give up in total frustration, here's what I am trying to implement, using "curl" to drive the system.  This is from the Postful API docs.

1. Upload the document.

To upload a document, submit a POST to:

http://www.postful.com/service/upload

Be sure to include the Content-Type and Content-Length headers and the document itself as the body of the request.

POST /upload HTTP/1.0
Content-Type: application/octet-stream
Content-Length: 301456

... file content here ...

If the upload is successful, you will receive a response like the following:

<?xml version="1.0" encoding="UTF-8"?>
<upload>
<id>290797321.waltershandy.2</id>
</upload>

What is critical in this response is the id assigned by the server to your document, in this example 290797321.waltershandy.2. You will use that id to reference the document in the next step.

I think it should work with

curl -v --user edward.vielmeti@gmail.com:t0ps3kr3t --data-binary "upload=@m-28-shield.png" http://postful.com/service/upload

But it doesn't.  Ideas?  Mac OS, if it matters; here's the curl version:

% curl --version
curl 7.16.3 (powerpc-apple-darwin9.0) libcurl/7.16.3 OpenSSL/0.9.7l zlib/1.2.3
Protocols: tftp ftp telnet dict ldap http file https ftps
Features: GSS-Negotiate IPv6 Largefile NTLM SSL libz 

update: typo fixed

update2: working now!  command line is

curl -k -v --user edward.vielmetti@gmail.com:s4x4s54 --data-binary "@m-28-shield.png" https://postful.com/service/upload

Note three things: the "-k" option; https, not http; and spelling the user name right (duh).

Next task: getting the address side formatted right.
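Meanwhile, with the upload working, pulling the document id back out of the response is a one-liner with sed.  The response below is copied verbatim from the example in the Postful docs above:

```shell
# Save the example upload response from the Postful docs
cat > response.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<upload>
<id>290797321.waltershandy.2</id>
</upload>
EOF

# Extract the document id to use in the next API call
id=$(sed -n 's|.*<id>\(.*\)</id>.*|\1|p' response.xml)
echo "$id"
```

A real XML parser would be more robust, but for a one-element response like this, sed is plenty.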

lego vacuum, patent 6048249: Plastic building block toy cleanup vacuum attachment

If you have too many Legos on the floor, here's the patented design for picking them up:

A toy cleanup vacuum attachment which can be used easily by attaching
to household vacuum hose for the quick and efficient clean up of a
plurality of different size and shape plastic building block toys. The
toy cleanup vacuum attachment comprises: a curved suction channel for
sucking up plastic toy building blocks; a convex plate guide with vent
holes which guides the plastic building block toys within a suction
channel into a drop channel while allowing the passage of vacuum
current and dust; a rectangular drop channel through which plastic
building block toys pass from the force of vacuum momentum and gravity;
a transparent collection container for housing the plastic building
block toys during cleanup and an air-tight bottom, hinged lid for
conducting quick and simple dropping of plastic building block toys
back into a toy box; a spring loaded push/twist thumb button agitator
for dislodging clogged or stuck plastic building block toys; a grip
handle for ease of use and…

See the whole thing on wikipatents.

image processing on the mac with sips, ImageMagick – watching DTE restore power, slowly

As part of watching DTE slowly restore power to the area, I've been pulling down copies of their outage maps.  These are big PDF files, suitable for framing, that encode a bunch of data in them about where the power is out. 

Here are some tools I used to get details out of the PDF into image formats, and (hopefully) eventually back down to a spreadsheet full of numbers.

1.  Fetch the outage report automatically with curl:

DATESTAMP=$(date "+%Y-%m-%d-%H-%M")
curl -s -o outage.$DATESTAMP.pdf http://my.dteenergy.com/map/zipCodeOutageMap.pdf

This grabs the file and date/time stamps it appropriately.  I'm running it in a loop every 15 minutes, which means it grabs the file (updated every 30 minutes) twice; a little redundancy, not too bad.
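The every-15-minutes part is a natural fit for cron; a sketch of a crontab entry, with a hypothetical wrapper-script path (the script would just hold the two lines above):

```shell
# Hypothetical crontab entry: run a wrapper around the fetch above every 15 minutes
*/15 * * * * $HOME/bin/fetch-outage.sh >> $HOME/outage-fetch.log 2>&1
```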

2.  Convert the report from PDF to PNG with sips:

sips -s format png outage.$DATESTAMP.pdf --out outage.$DATESTAMP.png

'sips' (the "scriptable image processing system") is mostly documented as an AppleScript front end, but there is a command line lurking there as well.  I'm collecting bookmarks for sips tutorials at delicious.  There's a lot it can do, since it is mostly a front end for the Mac's Core Graphics library, which is quite powerful, but the documentation is clearly not written with the expectation that anyone will use it directly.

3.  Do fun stuff to a series of images with ImageMagick:

composite -compose difference ~/Desktop/outage.2008-12-28-23-47.png outage.2008-12-29-13-28.png difference-progress.gif

ImageMagick has dozens, maybe hundreds, of transformations that you can use to take a set of images and turn them into something else.  It's powerful and also reasonably well documented, and I won't even try to summarize the ImageMagick command line options list which gives you plenty to play with.

4.  Stare at the results and try to make sense of them.

dte power outage restoration progress

Here's the DTE progress map (middle) with before (left) and after (right) reflecting about 14 hours of restoration efforts.  Click through to Flickr for detail.  The large amounts of black in the middle image represent locations where work progress has not yet produced a visible change in service, though thousands of lines may in fact have been restored.

5.  What I want is a tool that would work on a PNG and would give me the color value of a given pixel, or that would give me a histogram of all of the color values of all of the pixels in an image.  There is a histogram function in ImageMagick but it doesn't quite do the trick at first glance, and I'd be better off posting this than trying to do more myself!
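As it turns out, ImageMagick can do both of these; the incantations just aren't obvious.  A sketch, using a generated solid-color sample image in place of an outage map (the filename and colors are placeholders):

```shell
# Make a tiny solid-red sample image to stand in for an outage map
convert -size 2x2 xc:red sample.png

# Histogram: one line per distinct color, of the form
#   count: (r,g,b) #HEX name
convert sample.png -format %c -depth 8 histogram:info:-

# Color value of a single pixel (here, the pixel at x=0, y=0)
convert sample.png -format '%[pixel:p{0,0}]' info:-
```

For the outage maps, piping the histogram output through sort and awk should get most of the way to that spreadsheet full of numbers.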

The takeaway, if you squint hard enough: 14 hours of effort, and not a lot of immediately visible progress across broad parts of the Livingston, Washtenaw, and Downriver areas of DTE's service restoration.

DTE power outage on 28 December 2008 takes down AT&T, iPhone wireless service

It was a dark and stormy night.  Suddenly, the power went out.

The biggest effect: 210,000 DTE customers all over SE Michigan went dark.  Here’s a piece of the map to give you an idea – click through for the current map.

The best DTE site for updates seems to be mobile.dteenergy.com, as much as there is a “best”.


The biggest side effect was an outage of the AT&T mobile network and, as a result, iPhone service.  Redeye Chicago is doing ongoing coverage of the AT&T outage.  That writer is also on twitter – follow @iptib for details.  The story as it stands is that a power outage in “Bloomfield MI” (Bloomfield Hills?  Bloomfield Township?) took out some key equipment.

News coverage:

Ann Arbor Chronicle: DTE Outage Affects 3000 in Ann Arbor

Detroit Free Press: Winds cut power to 230,000; families try to stay warm

Click On Detroit (WDIV 4): 195,000 DTE Customers Without Power

Associated Press: High wind knocks out power to 413,000 in Michigan

Grand Rapids Press: County by county power outages due to high winds around Michigan