Showing posts with label fabrication. Show all posts

Tuesday, February 07, 2012

Local Colour: Smaller World Network

Back in September I showed a little work called Local Colour at ISEA 2011. This project continues my thinking about generative systems, materiality and fabrication. It's a work in two parts: the first is a group of laser-cut cardboard bowls, made from reclaimed produce boxes - you can see more on Flickr, and read the theoretical back-story in the ISEA paper. Here I want to briefly document the second element, a sort of network diagram realised as a vinyl-cut transfer. The diagram was created using a simple generative system, initially coded in Processing - it's embedded below in Processing.js form (reload the page to generate a new diagram).

Local Colour at ISEA 2011
Network diagrams are one of the most powerful visual tropes in contemporary digital culture. Drawing on the credibility of network science they promise a paradigm that can be used to visualise everything from social networks to transport and biological systems. I love how they oscillate between expansive significance and diagrammatic emptiness. In this work I was curious to play with some of the conventions of small world or scale-free networks. A leading theory about how these networks forms involves preferential attachment: put simply it states that nodes entering a network will prefer to connect to those nodes that already have the most connections. In visualising the resulting networks, graph layout processes (such as force direction) use the connectivity between nodes to reposition the nodes themselves; location is determined by the network topology.



This process takes the standard small-world-network model and changes a few basic things. First, it assigns nodes a fixed position in space. Second, it uses that position to shape the connection process: here, as in the standard model, nodes prefer to connect to those with lots of existing connections. But distance also matters: connecting to a close node is "cheaper" than connecting to a distant one. And nodes have a "budget" - an upper limit on how far their connection can reach. These hacks result in a network which has some small world attributes - "hubs" and "clusters" of high connectivity - but where connectivity is moderated by proximity. Finally, this diagram visualises a change in one parameter of the model, as the distance budget decreases steadily from left to right. It could be a utopian progression towards a relocalised future, or the breakdown or dissolution of the networks we inhabit (networks in which distance remains, for the time being, cheap enough to neglect).

The process running here generates the diagram through a gradual process of optimisation. Beginning with 600 nodes placed randomly (but never too close to any other node), each node is initially assigned a random partner to link to. Then the nodes begin randomly choosing new partners, looking for one with a lower cost - and cost is a function of both distance and connectivity. The Processing source code is here.
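For the curious, here's a minimal sketch of the same idea in Python (the piece itself is in Processing; the node count, cost function and weightings below are my illustrative assumptions, not the original values):

```python
import math
import random

def build_network(n=60, budget=0.4, iters=4000, seed=1):
    """Toy version of the Local Colour process: nodes have fixed positions,
    and links are chosen by a cost that mixes distance and connectivity.
    A hedged sketch, not the original code."""
    rng = random.Random(seed)
    nodes = [(rng.random(), rng.random()) for _ in range(n)]
    link = [rng.randrange(n) for _ in range(n)]   # one random partner each
    for i in range(n):
        if link[i] == i:
            link[i] = (i + 1) % n                 # no self-links

    def degree(j):
        # connectivity = how many nodes currently link to j
        return sum(1 for t in link if t == j)

    def cost(i, j):
        d = math.dist(nodes[i], nodes[j])
        if d > budget:                # outside the distance budget: forbidden
            return float('inf')
        return d / (1 + degree(j))    # closer and better-connected is cheaper

    # gradual optimisation: keep trying random partners, swap if cheaper
    for _ in range(iters):
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j and cost(i, j) < cost(i, link[i]):
            link[i] = j
    return nodes, link
```

Sweeping `budget` from high to low across the picture plane is what produces the left-to-right progression in the diagram.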


Sunday, June 06, 2010

Measuring Cup

Measuring Cup is a little dataform project I've been working on this year. It's currently showing in Inside Out, an exhibition of rapid-prototyped miniatures at Object gallery, Sydney.

This form presents 150 years of Sydney temperature data in a little cup-shaped object about 6cm high. The data comes from the UK Met Office's HadCRUT subset, released earlier this year; for Sydney it contains monthly average temperatures back to 1859.


The structure of the form is pretty straightforward. Each horizontal layer of the form is a single year of data; these layers are stacked chronologically bottom to top - so 1859 is at the base, 2009 at the lip. The profile of each layer is basically a radial line graph of the monthly data for that year. Months are ordered clockwise around a full circle, and the data controls the radius of the form at each month. The result is a sort of squashed ovoid, with a flat spot where winter is (July, here in the South).
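The layer logic can be sketched in a few lines (Python here rather than the original Processing; `base_radius` and `scale` are illustrative values, not the ones used for the actual cup):

```python
import math

def layer_outline(monthly_temps, base_radius=20.0, scale=1.5):
    """One horizontal layer of the cup: a radial line graph of a year's
    monthly temperatures. Months are spaced around a full circle, and the
    data controls the radius at each month."""
    pts = []
    n = len(monthly_temps)                    # normally 12
    for m, t in enumerate(monthly_temps):
        theta = 2 * math.pi * m / n           # month's position on the circle
        r = base_radius + scale * t           # temperature sets the radius
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

Stacking one such outline per year, 1859 at the base and 2009 at the lip, gives the full form; a cold July pulls its point of the outline inward, making the flat spot.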


The data is smoothed using a moving average - each data point is the average of the past five years' data for that month. I did this mainly for aesthetic reasons, because the raw year-to-year variations made the form angular and jittery. While I was reluctant to do anything to the raw values, moving-average smoothing is often applied to this sort of data (though as always the devil is in the detail).
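The smoothing itself is just a trailing moving average over each month's year-by-year series; a sketch (in Python, for illustration):

```python
def smooth(series, window=5):
    """Trailing moving average: each point becomes the mean of the last
    `window` values (fewer at the start, where less history exists).
    Applied per-month to the year-by-year temperature series."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        chunk = series[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```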


The punchline really only works when you hold it in your hand. The cup has a lip - like any good cup, it expands slightly towards the rim. It fits nicely in the hand. But this lip is, of course, the product of the warming trend of recent decades. So there's a moment of haptic tension there, between ergonomic (human centred) pleasure and the evidence of how our human-centredness is playing out for the planet as a whole.


The form was generated using Processing, exported to STL via superCAD, then cleaned up in Meshlab. The render above was done in Blender - it shows the shallow tick marks on the inside surface that mark out 25-year intervals. Overall the process was pretty similar to that for the Weather Bracelet. One interesting difference in this case is that consistently formatted global data is readily available, so it should be relatively easy to make a configurator that will let you print a Cup from your local data.


Wednesday, October 07, 2009

Weather Bracelet - 3D Printed Data-Jewelry

Given my rantings about digital materiality and transduction, fabrication is a fairly obvious topic of interest. I posted earlier about an experiment with laser-cut generative forms and Ponoko - more recently I've been playing with 3d-printing via Shapeways, as well as trying out data-driven (or "transduced") forms. This post covers technical documentation as well as some more abstract reflections on this project - creating a wearable data-object, based on 365 days of local (Canberra) weather data.


Shapeways has good documentation on how to generate models using 3d-modelling software. Here I'll focus more on creating models using code-based approaches, and Processing specifically. The first challenge is simply building a 3d mesh. I began with this code from Marius Watz, which introduces a useful process: first, we create a set of 3d points which define the form; then we draw those points using beginShape() and vertex().

The radial form of the Weather Bracelet model shows how this works. The form is built from a house-shaped slice, repeated once per day, where the shape of each slice is based on that day's temperature data. The width is static, the height of the peak is mapped to the daily maximum, and the height of the shoulder (or "eave") is mapped to the daily minimum. To create the radial form, we simply make one slice per day of data, rotating each slice around a central point. As the diagram below shows, this gets us a ring of slices, but not a 3d-printable form. As in Watz's sketch, I store each of the vertices in the mesh in an array - in this case an array of PVectors, since each PVector conveniently stores x, y and z coordinates. The array has 365 rows (one per day, for each slice) and 5 columns (one for each point in the slice). To make a 3d surface, we just work our way through the array, using beginShape(QUADS) to draw rectangular faces between the corresponding points on adjacent slices.
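The face-building pass over that 365×5 array is easiest to see as index arithmetic. A Python sketch of the approach (the original is Processing; the function and parameter names are mine):

```python
def quad_faces(n_slices=365, pts_per_slice=5, closed=True):
    """Enumerate the rectangular faces of the bracelet-style mesh: one quad
    between each pair of corresponding edges on adjacent slices. Each face
    is four (slice, point) indices into a [slice][point] vertex array."""
    faces = []
    last = n_slices if closed else n_slices - 1
    for s in range(last):
        s2 = (s + 1) % n_slices           # wrap around to close the ring
        for p in range(pts_per_slice):
            p2 = (p + 1) % pts_per_slice  # wrap around the slice profile
            faces.append(((s, p), (s2, p), (s2, p2), (s, p2)))
    return faces
```

In the Processing version, each of these quads becomes four vertex() calls inside beginShape(QUADS).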


To save the geometry, I used Guillaume LaBelle's wonderful SuperCad library to write an .obj file. I then opened this in MeshLab, another excellent open source tool for mesh cleaning and analysis. Because of the way we draw the mesh, it contains lots of duplicate vertex information; in MeshLab we can easily remove duplicate vertices and cut the file size by 50%. MeshLab is also great for showing things like problems with normals - faces that are oriented the wrong way. When generating a mesh with Processing, the order in which vertices are drawn determines which way the face is ... er, facing... according to the right-hand rule. Curl the fingers of your right hand and stick up your thumb: if you order the vertices in the direction your fingers are curling, the face normal will follow the direction of your thumb. Although Processing has a normal() function that is supposed to set the face normal, it doesn't seem to work with exported geometry. Anyhow, the right-hand rule works, though it is guaranteed to make you look like a fool as you contort your arm to debug your mesh-building code.
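The arm-contorting can also be done numerically: the right-hand rule is just the cross product of two edge vectors. A quick sketch for checking winding order in code (Python, for illustration):

```python
def face_normal(a, b, c):
    """Normal of a face from its first three vertices. The winding order
    a -> b -> c determines the direction, per the right-hand rule:
    counter-clockwise vertices (seen from outside) give an outward normal."""
    u = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    v = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    return (u[1] * v[2] - u[2] * v[1],    # cross product u x v
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])
```

If a face renders black or disappears in MeshLab, computing this for its vertices will tell you whether you've wound it backwards.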

The next step in this process was integrating rainfall into the form. I experimented with presenting rainfall day-by-day, but the results were difficult to read; I eventually decided to use negative spaces - holes - to present rainfall aggregated into weeks. Because Shapeways charges by printed volume, this had the added attraction of making the model cheaper to print! The process here was to first generate the holes in Processing as cylindrical forms. Unlike the base mesh, each data point (cylinder) is a separate, simple form: this meant I could take a simpler approach to drawing the geometry. I wrote a function to generate a single cylinder, then used rotate() and scale() transformations to place instances of that cylinder at the appropriate spots. Because I wanted the volume of each cylinder to map to rainfall, the radius of each cylinder is proportional to the square root of the aggregated weekly rainfall. As you can see in the grab below, the base mesh and the cylinders are drawn separately, but overlaid; they were also saved out as separate .obj files. The final step in the process was to bring both cleaned-up .obj files into Blender (more open source goodness) and run a Boolean operation to literally subtract the cylinders from the mesh. This took a while - Blender was completely unresponsive for a good few minutes - but worked flawlessly.
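The square-root mapping is worth spelling out: a cylinder's volume (at fixed depth) scales with the square of its radius, so to make volume track rainfall, the radius must track the square root. A one-liner sketch (the constant k is illustrative, not the value used in the actual model):

```python
import math

def hole_radius(weekly_rain_mm, k=0.5):
    """Radius of a rainfall hole. Cross-sectional area - and so removed
    volume per unit depth - is proportional to rainfall, hence radius is
    proportional to its square root. k is an assumed scaling constant."""
    return k * math.sqrt(weekly_rain_mm)
```

So quadrupling the week's rainfall only doubles the hole's radius, which keeps wet weeks from swallowing the form.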





Finally, after checking the dimensions, exporting an STL file from MeshLab, and uploading to Shapeways, the waiting; then, the printed form. I ordered two prints, one in Shapeways' White, Strong and Flexible material, and the other in Transparent Detail. You can clearly see the difference between the materials in these photos. The very small holes tested the printing process in both materials; in the SWF print the smallest holes are completely closed; in the TD material they are open, but sometimes gummed up with residue from the printing process (which comes out readily enough). Overall I think the TD print is much more successful - I like the detail and the translucency of the material, as well as the cross-hatched "grain" that the printing process generates.






So, a year of weather data, on your wrist - as a proof of concept the object works, but as a wearable and as a data-form it needs some refinement. As a bracelet it's just functional - the sizing is about right, but the sharp corners of the profile are scratchy against the skin. As a data-form, it could do with some simple reference points to make the data more readable - I'm thinking of small tick-marks on the inner edge to indicate months, and perhaps some embossed text indicating the year and location. More post-processing work in Blender, I think.

Another line of development is to do versions with other datasets - and hey, if you'd like one for your city, get in touch. But that also raises some tricky questions of scaling and comparability. The data scaling in this form has been adjusted for this dataset; with another year's data, the same scaling might break the form - rain holes might eat into the temperature peaks, or overlap each other, for example. A single one-size-fits-all scaling would allow comparisons between datasets, but might make for less satisfying individual objects - and, finding that scaling requires more research.


What has been most enjoyable with this project, though, is the immediate reaction the object evokes in people. The significance and scale of the data it embodies seem to give it a sense of value - even preciousness - that has nothing to do with the cost of its production or the human effort involved. The bracelet makes weather data tangible, but also invites an intimate, tactile familiarity. People interpret the form with their fingers, recalling as they do the wet Spring, or that cold snap after the extreme heat of February; it mediates between memory and experience, and between public and private - weather data becomes a sort of shared platform on which the personal is overlaid. The form also shows how the generalising infrastructures of computing and fabrication can be brought back to a highly specific, localised point. This for me is the most exciting aspect of digital fabrication and "mass customisation" - not more choice or user-driven design (which are all fine, but essentially more of the same, in terms of the consumer economy) - but the potential for objects that are intensely and specifically local.


Sunday, May 03, 2009

Transduction, Transmateriality, and Expanded Computing

In common usage a transducer is a device that converts one kind of energy to another. Wikipedia lists a fantastic variety of transducers, mapping out links between thermal, electrical, magnetic, electrochemical, kinetic, optical and acoustic energy. In this form transducers are everywhere: a light bulb transduces electrical energy into visible light (and some heat). A loudspeaker transduces fluctuations in voltage into physical vibrations that we perceive as sound.

In analog media, transduction is overt (put the needle on the record...). But digital media are riddled with it too. Input and output devices all contain transducers: the keyboard transduces motion into voltage; the screen transforms voltage into light; the hard drive mediates between voltage and electromagnetic fields. A printer takes in patterns of voltage and emits patterns of ink on a page. Strictly, transduction refers only to transformations between different energy types; here I want to extend it to talk about all the propagating matter and energy within something like a computer, as well as those between that system and the rest of the world. From this transmaterial perspective a computer is a cluster of linked mechanisms and substrates; a machine for shifting patterns through time and space.


If this sounds unfamiliar, it's only by historical accident. Mechanical computers, where these patterns are physically perceptible, predate electrical (let alone digital) ones, by centuries (above: a replica of Konrad Zuse's Z1, a mechanical computer from 1936. Image by rreis). Materially, our current computers are more or less black box systems. Their transductions come as a sort of preconfigured bundle or network, a set of familiar relations constructed again by mixtures of hard- and software, protocols, standards: generalising frameworks. I press a key, a letter appears; this is all I need to know. Click "OK". No user-serviceable parts inside.

Except that currently, across the media arts and a whole slew of other fields, the computer is undergoing a rich and productive decomposition. It's composting, to borrow a Sterlingism. This goes under all kinds of different names: hardware hacking, device art, homebrew electronics, physical computing. Such practices mount a direct assault on the computer as a material black box, literally and figuratively cracking it open, hooking it up to new inputs and outputs, extending and expanding its connections with the environment. Microcontrollers like the Arduino present us with nothing but a row of bare I/O pins. Finally we can tackle the question of what should go in, and what should come out: of transduction. A whole generation of artists, designers, nerds and tinkerers are taking up soldering irons and doing just that. Below: the Spoke-o-dometer from Rory Hyde and Scott Mitchell's Open Source Urbanism project.


One side-effect of this decomposition of computing is that the ontological status of the digital starts to break down with it. As Kirschenbaum shows brilliantly, the digital is just the analog operating within certain tolerances or thresholds. Thomas Traxler's The Idea of a Tree (below) is a solar-powered system that fabricates objects from epoxy, dye and string, by turning a spindle. Solar energy generates electrical energy, which drives the motor, which draws the string through the dye and onto the spindle: a chain of analog transductions produces an object that manifests specific changes in its local environment. The work is a beautiful demonstration that variability doesn't have to be worked up with generative code: if the system is open to it, it's already there in the flux of the material field.


This is not to dismiss computing, only to recast it: an incredibly dynamic, pliable set of techniques for manipulating the material environment. Paradoxically the very generalities of computing - the abstractions and protocols that insulate it from local, material conditions - make it a powerful tool for transduction, that is, the propagation of specificities. Usman Haque's Pachube is a generalised infrastructure, a set of protocols and standards that rest in turn on wider standards like XML, and which assume a whole stack of functional layers: IP, HTTP, and so on. All in order to propagate material patterns and flows from here to there: this is an architecture of transduction whose utopian aim is to "patch the planet" into a translocal ecology of linked environments.

Digital fabrication is part of the same shift: an expansion and extension of the computer's range of material transductions. Digital pattern, to lasercutter instructions, to physical form. Fabbing shows how material matters. It's unsurprising that a piece of laser-cut ply is aesthetically different to a luminous pattern of pixels; more interesting is the way computation reaches out into the substrate's material properties, and the range of potential applications and domains it opens up. Fabbing has often presented itself with a narrative of materialisation, making the virtual real, translating bits into atoms - Generator.x 2.0 was subtitled "Beyond the Screen." Not so: because of course, the "virtual" never was, and the screen is material too. Fabbing does get us beyond the screen, but only because its processes and materials have different properties, different specificities, and they hook us up to new contexts, as well as new sensations. (Below: Andreas Nicolas Fischer & Benjamin Maus: Reflection - from 5 Days Off: Frozen)


Transduction suggests a way to link practices like physical computing, fabrication, networked environments, and many more. Data visualisation - in the broadest sense, from poetic to functionalist - is about creating customised transductions, sourcing new inputs and/or manifesting new outputs (even if they don't reach "beyond the screen"). We could add tangible interfaces, augmented reality, and locative systems. What does all this amount to? In 1970 Gene Youngblood observed a similar moment, as cinema - the dominant cultural form - diversified into a networked, participatory, interdisciplinary field of practices. He called it expanded cinema. So perhaps we can call this expanded computing: digital media and computation as material flows, turned outwards, transducing anything to anything else.


Friday, December 19, 2008

Fabricated Growth Forms (Processing to Ponoko)

Like many others playing with generative techniques, I'm fascinated by the potential of digital fabrication. Getting beyond the screen and into the world of objects is a significant move for a field that has, until the last few years, reveled in its own immateriality. There's a lot to think about in this material turn, but that's for another post. Here, a quick report on my first experiment with generative fabrication.


I don't have a laser cutter handy at my workplace (though as William Turkel writes there are lots of good reasons why I should) so I decided to check out Ponoko; I wanted to see what was involved in generating, uploading and fabbing a small project. I started with the Processing sketch from Limits to Growth, and tweaked it to turn out much smaller forms (a few hundred nodes, rather than tens of thousands). I used the built-in PDF export, then opened the PDFs in Illustrator. (Illustrator is the only commercial/proprietary software step in this process, so I'd be interested to hear of any alternatives). The forms are drawn as linked line segments of varying stroke widths. Ponoko needs an EPS with only the outside edge of this form, so I used Illustrator to merge it into a composite path, then set the stroke colour and width as instructed (0,0,255 and 0.001mm).
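On the question of alternatives to Illustrator: one code-only option is to write the cutting paths directly as vector graphics. A hedged Python sketch - it emits SVG rather than the EPS Ponoko asked for at the time, and it doesn't do the stroke-to-outline merge, but it shows the hairline blue "cut" stroke convention (RGB 0,0,255 at 0.001mm):

```python
def cutting_svg(segments, width_mm=100, height_mm=100):
    """Write line segments as an SVG sheet using the laser-cutting stroke
    convention described above. `segments` is a list of ((x1,y1),(x2,y2))
    pairs in millimetres. A sketch, not a drop-in Ponoko template."""
    lines = [f'<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{width_mm}mm" height="{height_mm}mm" '
             f'viewBox="0 0 {width_mm} {height_mm}">']
    for (x1, y1), (x2, y2) in segments:
        lines.append(f'<line x1="{x1}" y1="{y1}" x2="{x2}" y2="{y2}" '
                     f'stroke="rgb(0,0,255)" stroke-width="0.001"/>')
    lines.append('</svg>')
    return "\n".join(lines)
```

Open-source tools like Inkscape can then handle the path-merging and EPS conversion steps.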


The upload to Ponoko took a few tries - I was getting some strange errors as their system failed to "see" the cutting paths on the template - but after some swift and cheerful technical assistance it all worked. Pricing was also trial and error; the first design I uploaded was more complex than these, and of course these branching forms pack a long cutting path into a small surface area. I simplified the design, packed four forms onto a sheet, and opted for 4mm ply rather than acrylic. Final cost including (expensive) shipping to Australia was about $A60 (currently around $US40). Not what I'd call cheap, but not prohibitive either. There are intricate discussions of the economics of the business - shipping, exchange rates, local vs global, etc - on the Ponoko forums.

Eighteen days later, they arrived. Novelty counts for a lot here, but still, I'm totally charmed by these objects. A few surprises, but all good: they are smaller and finer than I imagined, and they smell very slightly of charred wood (excellent!). The cut edges are dark with a nice smooth, burnished surface, and the ply surface is clean. The scale and intricacy of the things seems to entice people to touch and handle them. I find them far more satisfying than the (much more detailed) laser prints I made with the same system.


Immediately it's clear how the fabbing process, and the materials, can reach back up through the production chain and influence the design and the generative system. One flaw in the design is a product of how I'm drawing the shapes: there are small rounded "shoulders" at the joints between line segments, caused by the overlap between one rounded line cap and the next segment - this is obvious in the physical forms. Better to draw the segments as tapered rectangles, and avoid the shoulders. Also, the branching topology is structurally risky; how to introduce more joins without breaking the generative model? This interplay, between computational process, manufacturing process, material and form, seems really promising. Ponoko seems to be an excellent, affordable way to try this out, and the built-in fab-on-demand shopfront is great, if you want to sell your wares. But it's still, ironically, working with a mass-production paradigm of one design, n copies. With hooks for a more dynamic, generative front end, it could get really interesting: designers like the wonderful Nervous System are doing this already. More documentation of the growth forms over on Flickr.
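The tapered-rectangle fix is simple geometry: offset each endpoint perpendicular to the segment by that end's half-width, giving a four-cornered quad with no rounded shoulders. A sketch (Python, names mine):

```python
import math

def tapered_quad(p1, p2, w1, w2):
    """Corners of a tapered rectangle replacing a stroked line segment:
    endpoints p1, p2 with half-widths w1, w2, offset perpendicular to the
    segment. Adjacent segments then meet edge-to-edge, avoiding the
    rounded 'shoulders' left by overlapping line caps."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length       # unit perpendicular
    return [(p1[0] + nx * w1, p1[1] + ny * w1),
            (p2[0] + nx * w2, p2[1] + ny * w2),
            (p2[0] - nx * w2, p2[1] - ny * w2),
            (p1[0] - nx * w1, p1[1] - ny * w1)]
```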
