Wednesday, April 23, 2008

There is no Software - Kittler and Evolvable Hardware

I'm slowly developing this notion of transmateriality; in this post, some media theory and a nice example from computer science. In the next I'll try to connect all this with some current art work, and back to the notion of the transmaterial.

Thanks to a prod from my friend Brogan Bunt, I've been reading Friedrich Kittler, a literary and media theorist who has made some striking forays into computational media. In a paper from 1995 he grapples, like Kirschenbaum, with the grounding of computation in matter, distilling this position to a wonderful aphorism: "there is no software." Kittler begins by announcing the end of writing; that texts "do not exist anymore in perceivable time and space" but have been miniaturised to the scale of integrated circuits. This miniaturisation, in which writing escapes the bounds of human perception, is facilitated by Turing's core principle of computing, which sets out minimal conditions for computation and proves its independence from hardware - the ability for any number of different physical machines to implement a universal computer. This principle, Kittler says, "has had the effect of duplicating the implosion of hardware by an explosion of software." "Ordinary language" is overtaken by a new hierarchy of programming languages, layers that reach from the command line down to assembler and the very protocols embedded in the silicon itself.

Kittler plays out this "descent" in which each layer depends on the one below it; the word processor depends on DOS, which in turn rests on the hardwired BIOS. Ultimately "[a]ll code operations, despite their metaphoric faculties such as "call" or "return", come down to absolutely local string manipulations and that is, I am afraid, to signifiers of voltage differences." In other words, although they resemble "ordinary" language, programming languages return us to the imperceptible, inaccessible, too-fast and too-small world of the microprocessor. So, "there is no software at all," except as defined by the "environment of everyday language" that surrounds computation.

In a paranoid turn Kittler analyses the tendency of computer culture to "systematically obscure hardware by software, electronic signifiers by interfaces between formal and everyday languages." In the GUI, but also in high level languages such as C (let alone Java), the physical machine is increasingly concealed from its users, and its programmers, in the name of functionality and "friendliness."

Finally Kittler considers the limits and thresholds of "programmable matter"; he points out that current computing hardware relies on the isolation of discrete elements from each other and thus a limit to connectivity. This contrasts with the "maximal connectivity" of the physical systems - "waves, weather and wars" - outside the computer. The current approach to computing hardware is essentially more of the same: more transistors, more elements, smaller circuits with better isolation. Kittler instead suggests that the only way to "keep up" with the physical complexity of the world is to match it with "nonprogrammable systems" made of "sheer hardware": "a physical device working amidst physical devices and subjected to the same bounded resources." In such devices once again "software as an ever-feasible abstraction would not exist any more."

Interestingly, Kittler ends up close to where my earlier work on art and artificial life (published as Metacreation) came to rest. In considering the desire for emergence in a-life art, I wondered about the constraints imposed by the physical substrate - the "coarse, rigid grammar" of digital electronics. My favourite demonstration of what lies beyond this grammar is the evolvable hardware work of Adrian Thompson, in which circuit designs for programmable chips (field-programmable gate arrays - as in the image above) are evolved using a genetic algorithm and tested in hardware for their performance in a particular task. It's perhaps not surprising that successful circuits were evolved over many thousands of generations; but the fun part is that when analysed, these circuits were completely unlike any human-designed computing machine. In Kittler's words they were "sheer hardware," treating the chip as a "maximally connective" physical substrate rather than an abstracted set of discrete elements. Some chips drew on external influences, such as electromagnetic radiation, to achieve their evolved ends; so the chip is not formally isolated but (to quote Kittler again) "a physical device working amidst physical devices." Not only that, they were often "tuned" to the physical specificities of a single chip, despite the FPGAs being notionally identical - the exact opposite of the hardware-independence of the Turing Machine.


Friday, April 11, 2008

Wanted: Research Students (A Message from my Sponsor)

I've kept my academic day job out of this blog until now; but that's really a false distinction since the work presented here is largely supported by my employer. So with that in mind, a message from my sponsor - and actually, from me too.

I'm looking for research students! My research interests are pretty well represented by this blog, and visualised in the tag cloud: criticism, theory and practice in computational media, data practices, generative art, a-life art, experimental sound and music, digital culture in general. Together with my colleagues Stephen Barrass and Sam Hinton, we span internet history and theory, gaming, sonification, AR, perceptual approaches to HCI, and wearables. With our collective track record and mix of specialisations, we're one of the best groups in the country for this kind of work. What's more, our new Faculty of Design and Creative Practice now combines media arts with architecture, landscape architecture, cultural heritage, industrial design and graphic design, so there's a vast field of crossovers there. All our research programs encourage practice-led research, and thesis forms that combine writing with creative projects.

If this sounds like you, and you're interested in stand-alone Honours, Masters by Research or PhD study, get in touch.

We now return you to our scheduled programming.


Thursday, April 03, 2008

Strange Ontologies

Another piece of work from my time at CEMA; this one a paper, co-authored with Mark Guglielmetti and Troy Innocent. This paper started with some discussions about models in generative systems, and a feeling that certain kinds of models, or rather certain ontologies - formally defined networks of entities and relations - play an important role in defining the generative outcomes of formal systems. Troy and Mark are also very much into gaming (more than me anyway, my peak gaming experience occurred about twenty years ago and involved an Amiga 1000); as we talked it seemed that these generative ontological structures might also be at work in some of the more interesting games and game art projects around. Mark made me sit down and play Portal (below). Then we started discussing social software...

So in this paper we consider both philosophical and computational senses of "ontology", and propose that computational ontologies (or data models) actually implement philosophical ontologies (notions of what "is"). What's more, these ontologies become dynamic, interactive processes; and that's when things get interesting. We focus on "strange ontologies": where default, common-sense or conventional ontological structures are tweaked or hacked, or where emergent phenomena pop out from apparently straightforward structures of being and relation. We draw on examples from social software, gaming (including Portal and Warcraft), art games or game art (including Julian Oliver's Second Person Shooter), and new media / generative art (Guglielmetti's own Laboratories of Thought (below) and Jonathan McCabe's Origami Butterfly Method).
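For a sense of what "a formally defined network of entities and relations" looks like as code, here's an illustrative sketch (my own, not from the paper): a computational ontology as a store of typed relations between entities, then a "strange" tweak in the spirit of Portal, where a hacked relation joins two rooms that the conventional spatial ontology keeps apart. All the entity and relation names are made up for illustration.

```python
class Ontology:
    """A minimal ontology: entities linked by typed (predicate) relations."""

    def __init__(self):
        # (subject, predicate) -> set of related objects
        self.relations = {}

    def relate(self, subject, predicate, obj):
        self.relations.setdefault((subject, predicate), set()).add(obj)

    def query(self, subject, predicate):
        return self.relations.get((subject, predicate), set())

world = Ontology()

# A conventional spatial ontology: rooms adjacent along a corridor.
world.relate("room_a", "adjacent_to", "room_b")
world.relate("room_b", "adjacent_to", "room_c")

# The "strange" move: a portal relation short-circuits spatial structure,
# making two non-adjacent rooms directly reachable from one another.
world.relate("room_a", "portal_to", "room_c")

reachable = world.query("room_a", "adjacent_to") | world.query("room_a", "portal_to")
```

The interesting cases in the paper arise when such a structure is not static but runs as an interactive process, so players or users encounter the tweaked relation as lived behaviour rather than as a diagram.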

The paper has been submitted to an upcoming issue of ACM Computers in Entertainment; for now, grab the pdf and cite it via the permalink for this post. We're seeking feedback on this too - let us know your thoughts.