
Sunday, March 15, 2009

Watching the Street (Navigator) / citySCENE

Vague Terrain 13: citySCENE has just launched. As editor Greg J. Smith writes:

This issue of Vague Terrain is founded on two notions - that the city is a stage set for intervention and an engine for representation.
The collection expands out from this premise in multiple directions: carto-mashups, projection-bombing, sound walks, psychogeographic imaging and ubicomp experiments. Early highlights for me included Crisis Fronts' Cognitive Maps and Database Urbanisms, which presents some impressive work on data visualisation and generative models as urban mapping strategies (below: Case Study: Los Angeles). Overall, on a first look, this collection is incredibly rich. It shows that a creative, wired-up, critical urbanism is not just a wistful aspiration of the technorati, but a real practice.


Having said all that, it's a privilege to be a part of this collection. My contribution is Watching the Street (Navigator), a browsable visualisation of a single day of images from the Watching the Street dataset. It tests out the hunch that these time-lapse slit-scans can be used to read real patterns in the urban environment - that they are (or can be) more than just suggestive abstractions. It uses a simple interface to display both a single source frame and a correlated slit-scan visualisation, with image-space and time-space sharing an axis, a bit like a slide rule. Greg Smith called it an "urban viewfinder", which sums up the intention nicely.
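That slide-rule coupling can be sketched as a simple index mapping - here in Python rather than the applet's Processing, with the layout assumed from the numbers given elsewhere in the post (1440 one-minute frames and 480 slit positions per day, composites 720 px wide; the function name is illustrative):

```python
# Assumed navigator layout: 1440 one-minute frames per day,
# 480 slit positions, composites 720 px wide (so one composite
# column spans two minutes of the day).
FRAMES_PER_DAY = 1440
SLIT_POSITIONS = 480
COMPOSITE_WIDTH = 720

def navigator_lookup(minute, slit_x):
    """Map a time of day and a slit position to the correlated views:
    (source frame index, composite index, column within the composite).
    The composite column is the shared time axis."""
    if not (0 <= minute < FRAMES_PER_DAY and 0 <= slit_x < SLIT_POSITIONS):
        raise ValueError("minute or slit position out of range")
    column = minute * COMPOSITE_WIDTH // FRAMES_PER_DAY
    return minute, slit_x, column
```

So pointing at minute 730 (just after midday) on slit 120 selects source frame 730 and column 365 of composite 120.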


Playing with the navigator for a while seems to confirm that hunch. The composites reveal temporal patterns in the environment, but not the spatial context that allows us to identify their causes; the source frames show that spatial context, but not the change over time. Reading the two against each other involves chains and cycles of discovery, analysis and inference. These might be open-ended (spatiotemporal browsing) or more directed. What time do the sandwich-boards go out? How long does the delivery truck stay?

Building the navigator presented some interesting technical challenges: mainly, how to make a web-friendly interface to 1440 source frames (240 x 320) and 480 slit-scan composites (720 x 320). That adds up to about 75MB of JPEGs. Processing 1.0 came to the rescue, with its new built-in dynamic image loader. requestImage() pulls in an image from a given URL, on cue, without bringing the whole applet to a grinding halt, and it provides some basic feedback on the state of that image - whether it's loading, loaded, or unloadable. I also blundered into two other useful lessons: how to use the applet "base" parameter, and how to manage Java's local cache, which kept throwing up earlier versions of the applet during testing.
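The pattern requestImage() gives you - return immediately, load in the background, signal state through the image's width (0 while loading, -1 on failure) - can be sketched in Python with a thread, assuming a pluggable loader function (the `fake_loader` below is a stand-in for a real fetch, not part of the original applet):

```python
import threading

class AsyncImage:
    """Minimal stand-in for Processing's requestImage(): the constructor
    returns at once, loading happens on a background thread, and state is
    flagged via `width` (0 = still loading, -1 = unloadable), mirroring
    the PImage convention."""
    def __init__(self, loader, url):
        self.url = url
        self.width = 0        # 0 while loading, as in Processing
        self.pixels = None
        self._thread = threading.Thread(
            target=self._load, args=(loader,), daemon=True)
        self._thread.start()

    def _load(self, loader):
        try:
            self.pixels, self.width = loader(self.url)
        except Exception:
            self.width = -1   # unloadable, as in Processing

def fake_loader(url):
    # Placeholder for a real HTTP or file fetch: returns (pixels, width).
    if url.endswith(".jpg"):
        return [0] * (240 * 320), 240
    raise IOError("not an image")
```

The applet's draw loop can then poll `width` each frame and draw a "loading" placeholder until it flips positive.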

Having made a lean, mean, browser-friendly version, I'm now thinking of adapting the navigator into a full-screen, offline app, with the whole eight-day dataset, and perhaps some tools for annotation and intra-day comparison. Best of all would be a long term installation; a sort of urban space-time observatory, watching the street but also opening it up to ongoing interpretation. If you'd like it running in your foyer, let me know.


Thursday, November 27, 2008

Watching the Street

wts_out_1112
The recent Dorkbot show seemed to go off nicely - it was great to be part of such a strong show of local work (some documentation). I showed some prints from Limits to Growth, as well as a more experimental process piece, Watching the Street - a (sub)urban remake of Watching the Sky.


Credit to Nathan McGinness for the suggestion: use the same time-lapse / slit-scan technique to image change in an urban environment. Technically, the setup was fairly straightforward. Instead of a digital stills camera I used a webcam (in portrait orientation), and wrote a simple Processing script to save stills at one-minute intervals, while extracting and compiling one-pixel slices into 24-hour composites. The webcam was installed in a window box on the gallery street front, with a view across the road, under a street tree, to one of Manuka's low-rise shopping arcades (above). I also attached a printer to the installed rig, so that a new composite could be produced and pinned to the wall each day. So here are some of the resulting images, and a bit of commentary.
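The core of that capture script - grab a frame, copy a one-pixel-wide column into the growing composite - can be sketched like this (in Python rather than Processing; the fixed `SLICE_X` value and the frame-as-nested-lists format are illustrative assumptions):

```python
SLICE_X = 120  # fixed slice column, set more or less arbitrarily

def append_slice(composite, frame, x=SLICE_X):
    """Copy the one-pixel column at x from a frame (a list of pixel rows)
    onto the right-hand edge of the growing composite, so each capture
    interval adds one column of the 24-hour image."""
    for frame_row, comp_row in zip(frame, composite):
        comp_row.append(frame_row[x])
```

Run once a minute against a fresh webcam frame, this builds the day's composite column by column, alongside saving the full still.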

The image-gathering process got off to a rocky start. After a few hours, the webcam came unstuck from the side of the window-box, and lay forlornly on its side for the next 48 hours (here's what that looks like). I gaffer-taped it back in place just before the opening, and restarted the capture in time to catch some gallery-goers loitering around out the front.

wts_out_1107
wts_out_1108
These two are Friday the 7th and Saturday the 8th of November, the first two full-day composites. Those striped rectangular chunks around mid-frame are cars, parked in the 30-minute loading zone across the road. Some stay for a few minutes, a couple for what looks like an hour. Of course on the Saturday, the loading zone doesn't operate, and there's a single car parked in it from mid-morning to mid-afternoon. The single-pixel vertical shards give an indication of passing car and pedestrian traffic.

wts_out_1109
wts_out_1114
A quiet, sunny Sunday the 9th; the form hinted at on the 8th reveals itself as the shadow of the big plane tree, creeping across the footpath. Then the following Friday the 14th. It's all happening: lots of car and pedestrian traffic, changes in sunlight, and what looks like an afternoon breeze in the foliage as well. The dominant, bluish horizontal stripe in all these images is the neon sign on the shopping centre - which runs all night. The orange rectangle that extends into the evening is the interior light of a shop - which, you'll notice, switches off at slightly different times each night.

So you'll notice that, as in Watching the Sky, I'm persisting in reading these as visualisations of the environment, as well as digital images in themselves. I'm struck by how this simple, indiscriminate process reveals both expected and unexpected patterns, and continues to provoke new questions - despite, or I would argue because of, its openness to multiple material / temporal systems.

In an interesting bit of synchronicity, I was teaching in the UTS Street as Platform masterclass with Dan Hill (more on that soon) while this piece was running. Could a simple visualisation process like this function "informationally", as it were - helping to answer real questions about a very specific slice of urban environment, in near-real time? More interesting for me, could it function in that way without prescribing the question in advance - that is, could it support an open-ended process of exploration and interpretation?

I'm planning to build an interactive version of this piece, to try out these ideas. In these static visualisations there's a huge amount of data missing: I set the slice point more or less arbitrarily, so there are 479 other potentially interesting slices to browse. It would be nice to be able to change the slice point dynamically, as well as navigating through the source images. I notice that Processing 1.0 (yay!) now supports threaded loading of images: could come in handy. Meanwhile, the full set of composite images is up on Flickr.
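Changing the slice point dynamically amounts to re-running the column extraction over the stored source frames at an arbitrary x - a sketch, again in Python with frames as nested pixel lists rather than the Processing original:

```python
def composite_at(frames, x):
    """Build a slit-scan composite on demand from an archive of saved
    frames (each a list of pixel rows): take the column at x from every
    frame, in capture order, so the composite's horizontal axis is time."""
    height = len(frames[0])
    return [[frame[row][x] for frame in frames] for row in range(height)]
```

With all the saved stills on disk, any of the 480 possible slices becomes just another call with a different x.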
