Noise at the Interface

Andrew Prior
The notion of noise occupies contested territory, in which it is framed as pollution and detritus even as it makes its opposite possible. Noise is always defined in opposition to something else, even if this ‘other’ is not quite clear. I am interested in exploring noise in the context of ‘the interface’, and I draw historically on information theory, which defines noise in opposition to signal.

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design. (Shannon, 1)

In ‘A Mathematical Theory of Communication’ (1948), Claude Shannon outlined a way in which any communication might be encoded mathematically, stored numerically and decoded back into its original form. Information Theory developed out of this work to encompass both the mathematics and the material means of such communication: the electronics, the logistics and so on. Its initial focus was on strengthening signals to improve mass-media systems (telephone networks in particular), but developing a means to deal with information digitally has clearly had an impact far beyond this initial remit. The issue of signal strength is about overcoming noise: reproducing a message despite interference that may intrude within a communications system. Noise is thus fundamental to the concept of Information Theory, and predetermining an appropriate spectrum of possibilities to be communicated (through resolution, bandwidth and encoding) is a necessary stage in defining what is and is not noise. Despite enormous strides forward in technology since Information Theory was at the ‘cutting edge’, its legacy is one of literally millions of interfaces based on its reductive logic. At this scale, the question of what is noise and what is signal, of what constitutes an appropriate spectrum of possibilities to be communicated, and of how signal and noise are differentiated is thrown into stark relief, drastically altering our experience of technology, culture and biopolitics.
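Shannon’s scheme can be gestured at in a few lines of code. The following is a minimal, hypothetical sketch (the codebook, the flip probability and the function names are my own illustration, not Shannon’s): a fixed set of possible messages is agreed in advance, each is encoded as bits, the channel corrupts some of those bits, and the receiver decodes by choosing the nearest codeword. Noise here exists only relative to the predetermined set; a message outside the codebook cannot be sent at all.

    import random

    # The set of possible messages is fixed in advance; anything the channel
    # does to the bits counts as noise only relative to this codebook.
    CODEBOOK = {
        "yes":   "000",
        "no":    "011",
        "maybe": "101",
        "stop":  "110",
    }

    def transmit(bits, flip_probability=0.1):
        """Pass bits through a noisy channel that flips each bit independently."""
        return "".join(
            bit if random.random() > flip_probability else str(1 - int(bit))
            for bit in bits
        )

    def decode(received):
        """Recover the most plausible message: the codeword fewest flips away."""
        def distance(codeword):
            return sum(a != b for a, b in zip(codeword, received))
        return min(CODEBOOK, key=lambda message: distance(CODEBOOK[message]))

    sent = "maybe"
    received_bits = transmit(CODEBOOK[sent])
    print(sent, "->", received_bits, "->", decode(received_bits))

The system, as Shannon requires, is designed to operate for each of the selections it anticipates, and only for those.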

Marx’s notion of ‘real abstraction’ (found in his “Fragment on Machines”) explores the idea that material means can embody ideas, social relations and so on. This can occur in quite direct, non-technological ways, but technology does it overtly. Even today, the operational logic of software does not seem to stray far from Shannon’s assertion that it should be programmed to “operate for each possible selection” from “a set of possible messages”. Interfaces operating within the overall system (at, above and below the level of the Graphical User Interface) set appropriate types and ranges of interaction and input.

Moving briefly beyond the scope of Information Theory to focus on the characteristics of the interface itself, one might invoke the process of ‘encapsulation’ within Object Oriented Programming (OOP). Encapsulation allows objects to hide their internal methods, such that only those methods that need to be accessed from outside the object are ‘public’. In effect, an interface is the public face of a process (or set of processes). One can therefore argue that whilst interfaces offer particular functionality, they also make other kinds of interaction impossible, hiding the workings away within the black box. Interfaces, then, act as filters, blocking out certain messages whilst privileging others for relay.
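A minimal sketch in Python may make this concrete; privacy here is a matter of convention (the leading underscores), and the class and method names are invented purely for illustration:

    class Player:
        """A hypothetical media player whose public face is a single method."""

        def __init__(self, track):
            self._track = track    # internal state, hidden by convention
            self._buffer = []

        def _decode(self):
            # Internal method: part of the workings, not of the interface.
            self._buffer = list(self._track)

        def play(self):
            # The one public method; everything else stays inside the black box.
            self._decode()
            return "playing " + self._track

    player = Player("signal.wav")
    print(player.play())    # the sanctioned interaction
    # player._decode()      # possible in Python, but outside the public face

Only ‘play’ is relayed; every other potential interaction with the object is filtered out of its public face.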

If there is an informational quality to contemporary culture, then it might be not so much because we exchange more information than before, or even because we buy, sell or copy informational commodities, but because cultural processes are taking on the attributes of information – they are increasingly grasped and conceived in terms of their informational dynamics. (Terranova, 7)

Whilst encapsulation is a function of Object Oriented Programming specifically, and of interfaces in general, it is Information Theory that first provided a way for any message to be translated into digital ‘information’. It is as ‘information’ that human relations are most effectively subsumed into technology, and therefore as information that they become subject to the filtering and relay processes of interfaces. Whilst Information Theory is intended to operate at a micro level, encoding the constituent parts of messages (such as letters or waveforms), one might argue that its logic results in a kind of scalar symmetry whereby operations at a macro level of communication (the call centre, for example) mimic its reductionism rather than merely resembling it at the level of metaphor.
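That translation is easy to demonstrate. In the hypothetical sketch below (the particular letter, sample value and pixel are arbitrary), a character of text, an audio sample and a pixel of an image all reduce to the same kind of thing, a sequence of bits:

    # A letter, a 16-bit audio sample and an RGB pixel, all rendered as bits.
    letter = "A".encode("utf-8")
    sample = int(0.5 * 32767).to_bytes(2, "little", signed=True)
    pixel = bytes([255, 128, 0])

    for name, data in [("letter", letter), ("sample", sample), ("pixel", pixel)]:
        bits = "".join(f"{byte:08b}" for byte in data)
        print(name, bits)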

Information and Semantics

Digitality brings to media, and to relations mediated by technology, a leveling effect in which all becomes information (which, in terms of information theory, means bits of information: binary digits, 1s and 0s); as such, all becomes subject to the assumptions that determine digital information. Once media are encoded as information there is little ontological difference between sound, text and visual media, or for that matter software. In 1949, Warren Weaver suggested adapting Shannon’s signal/noise opposition to account for ‘Semantic Noise’, his assumption being that mathematics could overcome not only problems of engineering noise but also those of semantic meaning, allowing technology to faithfully communicate a message without itself modulating the meaning. This notion of a transparent technology overlooks real abstraction and the encapsulating processes of the interface: the way in which the design process must necessarily premeditate the role and flexibility of the realization in order to optimize its performance. To some extent these assumptions produce an interesting power reversal in which the user, as presumed controller of the interface, becomes subject to it:

Software has traditionally been understood to place the user as its subject, and the computational patterns and elements initiated, used and manipulated by the user as the corresponding grammatical objects. (Cramer & Fuller, 151)

The affordances of media as ‘digital information’ mean that, on the one hand, the possibilities for manipulating content multiply drastically, whilst on the other, the encapsulation of the interface easily obscures the restrictions and boundaries of its functionality. The filtering tendency of the interface therefore has a homogenizing, normative effect on our experience of media, reducing the breadth of possible interaction to prescribed choices through a kind of aesthetic quantization. In combination with the propensity toward repetition underwritten by media as information, such homogeneity is magnified manifold. Filtering is not only relevant to the signal that gets through: everything that is not accepted as input is bypassed as unacceptable, creating a kind of modulus situation whereby non-compliant input is automatically recast as noise.
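What such filtering and ‘aesthetic quantization’ might look like at the level of code is sketched below; the permitted range and step size are arbitrary values invented for illustration:

    # A hypothetical input filter: the interface accepts only a prescribed
    # range, quantizes what it accepts, and recasts everything else as noise.
    ALLOWED_RANGE = (0, 100)
    STEP = 10    # the breadth of interaction reduced to eleven permitted values

    def filter_input(value):
        low, high = ALLOWED_RANGE
        if not (low <= value <= high):
            return None                        # non-compliant input, dropped as noise
        return round(value / STEP) * STEP      # compliant input, snapped to the grid

    for raw in [3, 47, 86, -12, 250]:
        print(raw, "->", filter_input(raw))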

Noisy Tactics

One obvious way around this normative effect is to go beyond, or beneath, the interface: beyond it, with actions that break or subvert it (through circuit bending, for example); beneath it, by inventing new interfaces through programming, electronics or physical design, a situation in which the ‘tool becomes the message’ (Cascone).

Interface design involves decisions about what functionality will or will not be provided, acceptable ranges of interaction and so on. Through this process, interfaces become representations of the operational logic of the systems they act upon. Despite the tendency of interfaces to exclude unwanted noise, obsolescence means they soon become the source of new noise, embodying outdated assumptions and giving realized form to fragmented schemata. Nevertheless, whilst broken or obsolete interfaces may represent dislocated knowledge, once repurposed they also offer the possibility of bricolage and revitalized meaning, outflanking homogeneity even in the act of repetition.

The recent interest in glitch art, whilst part of a long tradition concerned with the aesthetic possibilities of chance, failure and openness, is in one sense a response to the suffocating, self-correcting hydra of informational dynamics. Beyond the repetition of information, the normative qualities of the interface are a result of filtering, of quantizing the breadth of possibilities through which interaction can occur. Glitches and failures overcome the seeming inevitability of systemized communication.

Nevertheless, even strategies such as these have the potential to be derailed. Glitch as an aesthetic is always recuperable, monetizable. Whilst this is by no means the fault of aesthetics itself, the role aesthetics has come to play makes this trajectory to some extent inevitable. The claims to progress made under a late-capitalist, postmodern, informational society mean that revolutionary, antagonistic or noisy aesthetics always end up providing new territories for domination and control.

[T]he popularization and cultivation of the avant-garde of mishaps has become predestined and unavoidable […] The procedural essence of glitch art is opposed to conservation; […] to design a glitch means to domesticate it. (Menkman, 6)

Interfaces are arbiters of noise and signal. Their influence when operational is toward standardization; when broken, toward fragmentation and dislocation. Their ubiquity demands attention, scrutiny and challenge. In “Glitch Studies Manifesto”, Rosa Menkman argues that only by focusing on failure as process, rather than as outcome, can such tactics overcome the normative trajectories of culture and economy. Attali’s metaphorical noise is often misread as equivalent to signal noise (interference, distortion, dissonance), yet all of these prove to be vessels for commodification as pliant and accommodating as consonance, harmony, clarity and so on. The trouble with difference is that it implies an eternal connection to the very thing it negates.

Works cited:

Attali, Jacques. Noise: The Political Economy of Music. Minneapolis: University of Minnesota Press, 1985. Print.

Cascone, Kim. “The Aesthetics of Failure: ‘Post-Digital’ Tendencies in Contemporary Computer Music”. In Computer Music Journal 24.4 (Winter 2000). Print.

Cascone, Kim. “The Failure of Aesthetics”. Share Festival. Lecture. <http://vimeo.com/17082963>.

Cramer, Florian, and Matthew Fuller. “Interface”. In Software Studies: A Lexicon. Ed. Matthew Fuller. Cambridge, Mass.: MIT Press, 2008. Print.

Marx, Karl. “Fragment on Machines”. In The Grundrisse: Foundations of the Critique of Political Economy. 1858. <http://thenewobjectivity.com/pdf/marx.pdf>.

Menkman, Rosa. “Glitch Studies Manifesto”. 2010. <http://www.slideshare.net/r00s/glitch-studies-manifesto>.

Shannon, Claude. “A Mathematical Theory of Communication”. In The Bell System Technical Journal. Vol. 27. 1948: 379–423, 623–656. Print.

Terranova, Tiziana. Network Culture: Politics for the Information Age. London: Pluto Press, 2004. Print.

Weaver, Warren. Recent Contributions to The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1949.
