Rendered Inoperable: Uber and the Collapse of Algorithmic Power

Luke Munn, PhD student, Institute for Culture & Society, Western Sydney University

Today the algorithmic moves off the whiteboard and into the world, producing subjectivities, articulating relationships, and shaping behaviours. Yet to achieve its objectives, the algorithmic must draw upon bodies, flows, and materials — matter which is contentious and agents which have their own intentionalities. Efficacy cannot simply be assumed, but must be incessantly negotiated via a set of procedures. What are the operations needed to incorporate subjects and spaces into regimes of algorithmic coordination? Taking the ride-sharing platform Uber as a case study, this article identifies three operations as critical: encapsulation, enlistment and enchantment. When these operations are incomplete, algorithmic traction on a subject slips away, producing an array of undesired and unanticipated effects.

“We exist in the place where atoms and bits come together”, Uber’s former CEO Travis Kalanick once stated. But the ‘real world’ is a much more fraught space. The infiltration of algorithmic systems into the everyday brings lucrative new possibilities, evidenced by the financial success of ‘unicorns’ like Uber and Airbnb, but it also brings new vulnerabilities. The intersection of ‘bits and atoms’ drastically amplifies the negotiations with materiality that any software has to deal with, bringing the agencies of other actors to the fore. Rather than the highly compliant medium of pixels, systems such as Uber must enlist the far more frictional element of people — and their diverse motivations — into algorithmic processes. A new dependence emerges, a reliance on agents that remain somewhat outside their spheres of control. And this dependence is not a one-time deal that can simply be guaranteed once and for all. Instead, it takes the form of an ongoing negotiation that occurs millions of times per day — every single time a Rider requests a ride, Uber must somehow command a Driver to be there.

The algorithm has increasingly suffused laboring bodies, domestic interiors, and urban fabrics. For a platform like Uber this entails new forms of algorithmic governance that usher drivers to particular locations in the city at particular times of the day, and draw out a specific type of performance understood as ‘best practice’. For the ‘always listening’ digital assistant that is Amazon Alexa, this means filling the traditionally private space of the kitchen or living room with an invisible new zone of capture. And within a system like Airbnb, the algorithmic indexing of listings exerts unseen pressures on architectures — rearranging apartments, transforming homes into hotels and subtly reconstituting the wider geographies of the city itself. Alongside these consumer-facing examples are less visible but equally significant intrusions made at the enterprise or governmental levels. These come without focus-grouped product names, but determine teacher rankings, credit scores, loan approvals, parole sentences, and no-fly lists. More and more, the algorithmic permeates the processes and people around us, impinging upon society and culture in highly significant ways. How does the algorithmic invest bodies, enlist subjects, move matter, and coordinate relationships? In short, how does an algorithmic procedure attain and exert power?

The ability to answer this question has been hindered by a particular understanding of the algorithm. To sketch a brief genealogy, the word algorithm is merely an updated form of ‘algorism’, an older term originating from the Latin translation of the ninth-century Arabic mathematician, Al-Khwarizmi. As historian Robert Steele demonstrates, algorism “owes its name to the accident that the first arithmetical treatise translated from the Arabic happened to be one written by Al-Khwarizmi in the early ninth century, ‘de numeris Indorum’, beginning in its Latin form ‘Dixit Algorismi'”, a translation made about 1120 by Adelard of Bath (xiv). Khwarizmi’s text introduced new tools for calculation and the processes for their effective operation: a sleek new set of Hindu-Arabic numerals (1, 2, 3) to replace the unwieldy Roman equivalents (V, VIII, LIX, etc.), a formal set of operations such as multiplication and division, and most importantly a special new integer, the cypher or zero. The introduction of zero, as Steele asserts (xv), enabled the “computer to dispense with the columns of the Abacus”. A new mode of computation emerged that was both more concise and more easily checked.

From there the typical genealogy of the algorithmic moves forward to Babbage’s Analytical Engine in 1834, Ada Lovelace’s program for calculating Bernoulli numbers in 1843, Alonzo Church’s work in symbolic logic throughout the 1930s, Alan Turing’s seminal paper on computation published in 1936, and von Neumann’s stored-program architecture for the EDVAC and his merge-sort algorithm, both from 1945 (Schönhart et al.). A general understanding of the algorithm thus emerges from this lineage. As mathematical historians Crossley and Henry argue (105), “in the 12th century and for a long time thereafter the spelling ‘algorism’, with an ‘s’, meant the rules and procedures for using the nine Hindu-Arabic numerals 1, 2, 3, 4, 5, 6, 7, 8, 9, and the cypher”. Even today, some computer-science papers continue to use ‘algorism’ over ‘algorithm’, and the understanding of the term as a step-by-step process has not drastically shifted. One of the better-known definitions, for instance, comes from mathematician Stephen Kleene in the 1940s, who defined the algorithm as a performable procedure (59).

Following this lineage, the algorithmic today is often conflated with code, with a set of instructions written by a programmer in a particular language. To conduct research into the algorithm and to understand its logics, one must delve into this special set of ciphers. Mark Marino, for example, states that “we can read and explicate code the way we might explicate a work of literature”. Historian Len Shustek writes that “software is a form of literature, written by humans to be read by humans as well as machines” (110). And theorist Alexander Galloway attempts “to read the never-ending stream of computer code as one reads any text” (20). If, the argument goes, the user or researcher could only read back this text, then all would be revealed. But this text is typically proprietary, available only to employees or selected developers. The moment of enlightenment never arrives. Instead the algorithmic becomes the oft-cited black box, an opaque object that cannot be examined or intervened in.

A new starting point is needed. In 1979 Robert Kowalski published a paper titled “Algorithm = Logic + Control”. Despite the title, the paper was not meant to define the term. Kowalski, a computer scientist, was far more interested in efficiencies than etymologies. For Kowalski, ‘logic’ comprised the assumptions and objectives of a program — for example, to find a path; ‘control’ on the other hand, consisted of the strategies and processes employed in order to achieve it — for example, a particular sorting routine. While the goal was always the same, clearly some routines better exploited the properties of integers, the architecture of processors, and the availability of memory, and were thus more efficient. For Kowalski, this cleanly separated approach allowed the programmer to focus on optimization — retaining the logic while refining the speed and accuracy of the control procedures. Yet this notion of ‘control’ strongly foregrounds the algorithm as a performance enacted in the world, a performance both underpinned and impinged upon by heat and light, structures and surfaces, topographies and territories. Despite Kowalski’s practical focus, the paper thus offers a productive theoretical framing — suggesting that the algorithm is not simply an idealized and abstracted formula that exists in a vacuum, but rather a sociotechnical entity that must enlist material actors, make compromises and negotiate for its successes. As the starting point for a critique of algorithmic culture, this in turn suggests a set of problematics around power and governance — how is this force exerted, how is the procedural made operational? Coordinated by a logic of calculation, the control carried out by the algorithm is nevertheless fundamentally material and performative.
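To make Kowalski’s distinction concrete, consider a minimal sketch in Python (my own illustration, not drawn from Kowalski’s paper). The logic stays constant: determine whether a value appears in a collection. Two control strategies realize that same logic with very different efficiency, one scanning every element, the other exploiting the fact that the data is already sorted.

```python
from bisect import bisect_left

# The 'logic': does a target value appear in a collection?
# Two 'control' strategies realize the same logic with different efficiency.

def contains_linear(values, target):
    """Control strategy 1: examine every element in turn (O(n))."""
    for v in values:
        if v == target:
            return True
    return False

def contains_binary(sorted_values, target):
    """Control strategy 2: exploit the fact that the data is sorted (O(log n))."""
    i = bisect_left(sorted_values, target)
    return i < len(sorted_values) and sorted_values[i] == target

data = list(range(0, 1_000_000, 2))    # sorted even numbers
print(contains_linear(data, 999_998))  # True, after roughly 500,000 comparisons
print(contains_binary(data, 999_998))  # True, after roughly 20 comparisons
```

The result is identical in both cases; what differs is how the properties of the data and of the machine are exploited along the way, which is the register at which Kowalski located optimization.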

This more expansive, materialist understanding offers a productive foundation for analyzing contemporary algorithmic systems. Uber, for example, no longer conforms to the traditional framing of source code and software. The scale of its operations is vast, encompassing thousands of cities, dozens of countries, and millions of users. This in turn establishes a diverse set of legal requirements and local stipulations (e.g. Chinese vs US transport legislation). Constantly shifting, these must be integrated into the system as a whole without disrupting services or breaking existing functionality. This is why Uber, like many contemporary platforms, has moved away from monolithic applications with single codebases and is instead composed of microservices.[1] These are small, targeted services that do one thing and do it well — converting currency, logging miles, tracking ads. Each microservice is maintained by a single team, and each can be updated without disrupting other services. Hundreds of these services sit within a wider ecosystem, responding asynchronously to requests as they arrive. From a traditional code studies perspective, this means that there is no source code — no single text responsible for the functionality witnessed in the whole.
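As a rough illustration of what such a single-purpose service might look like, the sketch below implements a toy currency converter in Python using the Flask framework. The route, exchange rates and port are hypothetical; Uber’s actual services and their interfaces are proprietary.

```python
# A minimal, hypothetical microservice: it does one thing (currency conversion)
# and exposes it over HTTP, so it can be deployed, scaled and updated
# independently of every other service in the ecosystem.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Toy exchange-rate table; a real service would query a rates provider.
RATES = {("USD", "NZD"): 1.63, ("USD", "EUR"): 0.92}

@app.route("/convert")
def convert():
    amount = request.args.get("amount", type=float)
    src = request.args.get("from", "USD")
    dst = request.args.get("to", "NZD")
    rate = RATES.get((src, dst))
    if amount is None or rate is None:
        return jsonify(error="unsupported conversion"), 400
    return jsonify(amount=round(amount * rate, 2), currency=dst)

if __name__ == "__main__":
    app.run(port=5001)  # one of hundreds of such services, each behind its own endpoint
```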

Instead, behaviours emerge from the complex interplay between agents — flows of data pass between microservices, matter is spun up in data centers, bodies are looped into queued tasks, capital is shunted between accounts. Algorithmic objects can thus be understood as ecologies. For one, the term registers their internal disparities. Rather than suggesting a smooth, monolithic medium, the term ‘ecologies’ better encompasses this heterogeneous mix of cables and wire, bodies and vehicles, capital and code. The algorithmic glues together these disparate elements and divergent objectives into an effective procedure, but their latent differences remain. Thus we might ask, as Matthew Fuller does, what makes up these ecologies with their “shared rhythms, codes, politics, capacities, predispositions and drives, and how can these be said to mix, to interrelate and to produce patterns, dangers and potentials?” (2). Secondly, the notion of an ecology foregrounds their distributed nature. Rather than a single object, an algorithmic ecology is spatially and temporally dispersed. Take, for instance, the everyday act of a user locating herself using a phone. Even this apparently simple operation encompasses a gesture of the hand, a collection of smartphone circuitry, a network of data centers, a stretch of submarine cabling, a series of geospatial satellites, and so on. As Erich Hörl suggests, this is a “culture of control that is radically distributed and distributive, manifest in computers migrating into the environment, in algorithmic and sensorial environments” (4). Multi-scalar in its operations and messy in its blend of the social, material and technical, the algorithmic ecology is a productive expansion of the singular and typically apolitical algorithm. Reframed in this way, ecologies provide a way of “understanding the various scales and layers through which media are articulated together with politics, capitalism and nature, in which processes of media and technology cannot be detached from subjectivation” (Parikka and Goddard 1).

These elements come together in various ways to carry out activity in the world. A certain “grammar of operations” (Fuller 167) must be performed in order to map subjects and spaces, draw them into a functional sequence, and exhaust their productive potential. To do this, the forces exerted by the ‘merely’ technical operations of the algorithmic — storing, searching, indexing, presenting — must accumulate into meta-operations: encapsulating life, enlisting subjects, remaking space, and enchanting users. In focusing on these performances, we move away from secret codes and software to a set of observable and embodied operations that can be analyzed. But the move from whiteboard to world is also hazardous. To consistently arrive at a particular objective, an algorithmic ecology must successfully coordinate human and non-human forces — matter which can be ambivalent or even antagonistic. Here, data becomes messy, subjects turn contentious, space can be uncooperative. Execution, as Wendy Chun insists, is not simply a “perfunctory affair” (304). Nothing is guaranteed. Instead, any power must be incessantly negotiated. What occurs when these operations are unsuccessful? For the ride-share company Uber, human labor must be smoothly integrated as a component and coordinated into the overall objective of moving passengers from A to B. But often this human element is inadequately understood and internalized.[2] The result, as explored below, is a collapse of algorithmic power — a critical inoperability.

Uber as Algorithmic Failure

Encapsulation

Uber’s worker starts life as a data-object. The object specifies the properties that represent the platform’s so-called Driver-Partner: name, city, rating, current status and so on. Within Uber’s inner world, every Driver-Partner is abstracted into a collection of variables or parameters. The rich life of the subject is thus mapped onto an internal schema, a process I call encapsulation. In computer-science terms, this abstraction forms the information ontology, defining the Objects that can exist, the Properties that can be assigned, the Relationships that are acknowledged, and the Functions that can be executed. This abstraction is highly productive in that it establishes a common schema across a product or platform, defining both a core set of features and a means of cross-indexing fields. By defining Arjun and Mika as Driver objects, for example, both inherit a predefined set of affordances, giving them both the ability to accept Ride Requests. This definition also assigns a common set of properties, allowing any driver to be rated and compared against any other driver.
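What such a Driver object might look like can be sketched schematically in Python. Every field name and method below is hypothetical, since Uber’s internal schema is proprietary, but the sketch illustrates how encapsulation grants every Driver the same affordances and renders any driver comparable with any other.

```python
# A speculative sketch of a Driver data-object; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Driver:
    name: str
    city: str
    rating: float
    current_status: str  # e.g. "online", "on_trip", "offline"

    def accept_ride_request(self, request_id: str) -> bool:
        """An affordance every Driver object inherits, whoever the driver is."""
        return self.current_status == "online"

# Arjun and Mika are encapsulated into an identical schema and so become
# interchangeable and directly comparable:
arjun = Driver("Arjun", "Auckland", 4.85, "online")
mika = Driver("Mika", "Toronto", 4.62, "offline")
print(arjun.rating > mika.rating)          # any driver can be rated against any other
print(arjun.accept_ride_request("r-001"))  # True: the object is available for dispatch
```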

But to abstract is also to ignore. Any internalization of those parameters deemed significant is simultaneously an externalization of those aspects of the subject considered superfluous. This is not nefarious but rather the inevitable result of any design decision, a decision which necessarily foregrounds particular aspects whilst discarding others. Yet in a terrain of algorithmic governance which establishes both the positionality of the laborer and the contours of production, this abstraction becomes highly important. In Seb Franklin’s words, “the question of what is central (and thus captured and modeled) and what is peripheral (and thus discarded) within computationalist modes of social representation takes on a distinctive historical and political significance” (47). In the case of Uber, this model or internal understanding of the Driver reflects the axiomatics of capital, producing a particularly ‘thick’ description of those parameters considered significant for accumulation whilst including a very thin understanding of other aspects of identity: race, religion, gender, culture and class. Whilst the internal informational structures of Uber are, of course, proprietary and therefore locked away from scrutiny, the 500+ variables associated with each Uber Rider were made public due to a recent court case (Spangenberg vs Uber Technologies Inc). A very small selection of these includes:

advertiser_id
billing_user_country_id
cancels_10mins_prior_to_last_cancel
card_bin_banned_users
card_type
deferred_promotion_count
dynamic_fare
firstname
fraud_risk
google_advertising
payment_profile_banned
payment_profile_count
payment_profile_prepaid
payment_profile_uuid
potential_rider_driver_collusion_tags_shared_by_device
rating
request_device_rooted
signup_lat
signup_lng
total_billing_country_id
trip_distance
trip_duration
trip_status
uber_id
user_agent

Whilst any direct translation between Rider and Driver objects would be speculative, the leaked variables list reinforces this lopsided tendency of the algorithmic — piling on parameters in order to build up a highly articulated understanding of earnings performance and product preferences, for example, whilst leaving other components of subjectivity lightly sketched or entirely unaccounted for. The result is a generic driver, interchangeable with any other. Important complexities and contingencies are not encapsulated, leaking out of this strict envelope. As Matthew Fuller attests, “systems grappling with their outside” inevitably produce a likeness, but also a “collapse and spillage” (83). Encapsulation takes place simultaneously with dis-encapsulation, in which significant forms of subjectivity are discarded as excess. So Uber’s understanding of the worker is universal, fungible — a driver is a driver. And this thin understanding recoils on the rideshare company in various ways.

Enlistment

Every time a Rider requests a ride, a driver needs to be there. And drivers not only need to show up, but to perform a professional and timely service. In moving into the world, the algorithmic must consistently draw upon the productive performances of bodies, materials, and flows, an operation I call enlistment. The algorithmic attempts to incorporate and coordinate these performances, and yet this matter cannot simply be coerced. In this particular case, Uber must enlist a worker towards a specific objective. Yet Uber, like other contemporary labor platforms, has gone to great lengths to decouple itself from its suppliers, insisting that its drivers are atomized and autonomous. This dream of commanding labor without taking on the full financial, logistical or ethical responsibilities for labor is a highly seductive vision from the perspective of capital. But it also makes things more difficult. Contractually, for instance, Uber defines its workers as freelance Driver-Partners, foreclosing an array of legal and labor coercions available within the traditional employer/employee agreement. Spatially, Uber has largely jettisoned the brick-and-mortar infrastructure of the traditional corporation. City offices, for example, have been replaced by the company’s so-called Green Light Hubs, in which a handful of hot-desking employees armed with some basic administrative software are expected to provide basic support for all of Uber’s drivers in a major city.[3] Supervisor and supervised do not occupy the same physical space, precluding a set of disciplinary techniques derived from gazes and beratements directed onto bodies. Thus the particular conditions of labor that Uber establishes short-circuit many conventional procedures for asserting power, requiring instead a new set of techniques which are neither corporeal nor contractual in the strict sense, but which must nevertheless be highly effective.

To assert the force necessary for enlistment, Uber deploys a cluster of techniques: timed messaging, gamified missions, citywide campaigns, surge notifications. Promotions, for example, are featured on the home ‘feed’ in the app and take the form of targeted campaigns which typically offer higher wages for driving in a specified place at a set time. While these campaigns conform to classic incentivization schemes, the real-time feedback enabled by the platform shifts them into gamification. For instance, the promotion ‘Drive 18 trips, make $60 extra’, written out as a proposition, appears as a purely financial reward, a performance-based pay boost. However, the campaign is represented as an ongoing challenge, indicated by a green progress bar which notches up instantly after every successful drop-off. The combination of responsive data and real-time messaging thus transforms a dry offer into a gamified mission, harnessing the same kind of level-up logic and micro dopamine hits well understood in the gaming and gambling industries. As one London driver explains, “it’s like being in the bookies. It is very, very addictive” (Knight). Taken together, these techniques attempt to direct drivers into a ‘best practice’ performance conducted in particular places at particular times.
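The feedback loop behind such a mission can be sketched in a few lines of Python. The mechanics below are inferred from the driver-facing interface described above rather than from Uber’s actual implementation; the class and its thresholds are hypothetical.

```python
# A hypothetical sketch of the gamified mission 'Drive 18 trips, make $60 extra':
# each drop-off immediately notches up a progress bar, converting a flat
# financial offer into an ongoing, level-up style challenge.
class Mission:
    def __init__(self, target_trips: int, bonus: float):
        self.target_trips = target_trips
        self.bonus = bonus
        self.completed_trips = 0

    def record_dropoff(self) -> str:
        """Called after every successful drop-off; returns the updated progress bar."""
        self.completed_trips = min(self.completed_trips + 1, self.target_trips)
        filled = round(20 * self.completed_trips / self.target_trips)
        bar = "#" * filled + "-" * (20 - filled)
        if self.completed_trips == self.target_trips:
            return f"[{bar}] Mission complete! ${self.bonus:.2f} bonus earned"
        return f"[{bar}] {self.completed_trips}/{self.target_trips} trips"

mission = Mission(target_trips=18, bonus=60.0)
for _ in range(3):
    print(mission.record_dropoff())  # instant feedback after each drop-off
```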

But enlistment can only operate on the understanding of the Driver that Uber has encapsulated — a universal everyman, a generic caricature. Alex Rosenblat and Tim Hwang, drawing upon extensive ethnographic research into the rideshare company, have argued that the unique performances required of the worker in different cities set up a categorical distinction — they are not the same job (6). And yet from the perspective of data (and the business logic built atop it) the distinctions between drivers in Toronto or Taipei, part-time or full-time, retiree or student are largely elided. As the duo argue, the universal platform mistakenly sees the labor pool as monolithic, a “relatively equivalent mass” (4). Because of this, Uber’s ‘targeted communications’ largely miss their target and instead fall on an abstracted, algorithmically constructed subject that often fails to incorporate the complex and varied motivations unique to each worker. This explains why Uber’s attempts to funnel workers into shift work have been largely ineffective, and why many drivers ignore mechanisms like Surge pricing altogether (Lee et al. 5). A clear gap begins to emerge between the worker and Uber’s understanding of the worker.

Enlistment becomes de-enlistment. Rather than being drawn into the overall objectives of the algorithmic, workers ignore this pull — and in many cases withdraw from the regime entirely. Uber’s own report, commissioned in 2015 in collaboration with Princeton University, found that just under half of all drivers quit the rideshare platform after the first year (Hall and Krueger 16). Indeed, this trend of exiting labor appears to be accelerating. Drawing upon internal information from Uber itself, The Information recently demonstrated that only 6% of drivers remain after the first year (Efrati). The various rationales underlying such desertion en masse are no doubt complex. But this is precisely the point — Uber’s enlistment of the worker is underpinned by an abstracted object which fails to encapsulate a diversity of drivers and their equally diverse desires.

Enchantment

Flung into the world, the algorithmic finds its ability to directly code behaviours and practices limited by various frictions: social, material, legal, ethical, and so on. In a similar fashion, its capacity to capture and understand the performances at work is highly constrained — only so much information can be gleaned from smartphones and sensors. Thus, when faced with the complexities of reality, the limits of technicity rapidly come to the fore. To overcome these limits, algorithmic ecologies often contain variants of enchantment, an operation that seeks to draw out a particular subjectivity which accommodates itself to the algorithmic. Here the technical is supplemented by the psychological. The subject adapts his or her behaviours, collaborating with the algorithmic by playing to its strengths and overlooking its weaknesses. For Alfred Gell, enchantment becomes a form of technology in itself, one which “contributes to securing the acquiescence of individuals in the network of intentionalities in which they are enmeshed” (43).

The enchanted subject works to make her activities legible. Practices must not simply be performed, but performed in a way which is algorithmically recognized. Researcher Tarleton Gillespie calls this type of performance “turning to face the algorithm”. A subjectivity is cultivated that remains sensitive to the values of the algorithmic and attempts to mirror the desired response. Every algorithmic regime contains its own particular logic — certain practices are privileged while others go ignored. To be sure, the ability to understand and mirror back a particular logic provides a set of tangible rewards. For example, Gillespie notes the additional likes, shares and traction that social media content with certain hashtags can gain, a set of metrics directly convertible to cultural or financial capital. In other words, the mastery of an algorithmic grammar plays out as performances of images, codes, and phrases deployed in specific ways to achieve certain ends. Yet to see this behaviour as rote ritual or superficial mimicry would be to miss the extent to which enchantment attempts to draw out an inner reconfiguration, a reconfiguration which must ultimately be initiated and refined by the self. By internalizing the logic involved, performing these logics in ways that are legible, observing the results that follow, and then adjusting the self as necessary, a loop of iterative subjectivation is established. Within algorithmic environments, this iteration engenders a powerful circuit of perpetual self-formation. In doing so, it brings into “congruence the gaze of the other and that gaze which one aims at oneself when one measures one’s everyday actions” (Foucault 221).

When enchantment takes hold, Uber’s drivers also make this turn towards the algorithmic, actively collaborating with its logics and offsetting its blind spots. One example of this is the affective labor undertaken by each driver, the ‘service with a smile’ theorized by Arlie Hochschild in her seminal study. As Hochschild defined it, this was the labor requiring “one to induce or suppress feeling in order to sustain the outward countenance that produces the proper state of mind in others” (7). For Uber, this affective labor — a greeting, a handshake, an offer of water, an atmosphere of hospitality — cannot be hardcoded, not merely because of technical constraints, but because this kind of work must always appear convivial and improvised in order to be effective. In order to feel authentic, affective labor must lie beyond the bounds of automation. As Hochschild theorized, this does indeed require a kind of internal management, a discipline that fosters warmth while suppressing frustration and fatigue, “for otherwise the labor would show in an unseemly way, and the product — passenger contentment — would be damaged” (43). Ratings — which must remain at 4.7 or above — certainly provide both the incentive to undertake this management and a metric measuring the success or failure of this affective labor. Yet the specific form of these ‘above and beyond’ gestures and the kernel of sincerity necessary to instigate them are left undefined. Technicity reaches its limits and an enchanted subjectivity steps into the gap.

If successful, enchantment results in a self-managed accommodation to algorithmic logics. But Uber’s ineffective encapsulation and enlistment instead often disenchant the worker. Disillusioned, drivers work to obfuscate rather than make legible, discovering ‘hacks’ and sharing them on forums. For example, if the driver has declined a Ride Request, he or she will receive a warning message on the Partner homescreen with the attention-grabbing headline of ‘Your Earnings’. These messages are color-coded in orange and accompanied by the conventional cautionary icon of an exclamation mark centred in a triangle. As driver Harry Campbell explains, they are warnings, because “if you miss more than 2 requests, Uber will actually place a driver on ‘time out’ for 2 minutes”. However, one veteran driver on a forum offered an easy workaround to the ‘missed pings’ (declined rides) ban. The solution, as Campbell points out, “is to log off IMMEDIATELY after letting a ping go, then logging right back in. This will clear your missed pings before they can put you in ‘time-out'”.
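The workaround suggests that the missed-ping counter is scoped to the driver’s current session. The Python sketch below simply models the behaviour drivers describe; Uber’s actual internal logic is not public, and every name and threshold here is hypothetical.

```python
# A hypothetical model of the 'missed pings' rule as drivers describe it:
# the counter appears to live only as long as the current session, so logging
# off and back on clears it before a time-out can be triggered.
class DriverSession:
    MAX_MISSED_PINGS = 2      # 'if you miss more than 2 requests...'
    TIMEOUT_SECONDS = 120     # '...a time out for 2 minutes'

    def __init__(self):
        self.missed_pings = 0

    def decline_request(self) -> str:
        self.missed_pings += 1
        if self.missed_pings > self.MAX_MISSED_PINGS:
            return f"time-out for {self.TIMEOUT_SECONDS} seconds"
        return "warning: Your Earnings"

    def log_off_and_on(self) -> None:
        """The workaround: a fresh session starts with a clean counter."""
        self.missed_pings = 0

session = DriverSession()
print(session.decline_request())  # first warning
print(session.decline_request())  # second warning
session.log_off_and_on()          # counter cleared before a third decline
print(session.decline_request())  # back to a first warning, no time-out
```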

Rather than ‘breaking’ the system, techniques such as this are better understood as immanent to it, widening a fundamental gap that already exists, the gap between subjects and their algorithmically understood counterpart. The section on encapsulation demonstrated the inevitable slippages which emerge between the driver and her data representation, between the rich sociocultural realities of the subject and her thinly defined object within an information ontology. In the case of the logoff technique, the gap between subject and Uber’s understanding takes the form of a temporal distinction — a difference between the smooth, cohesive time of the subject and the syncopated temporality of the platform. Far from being glitches or errors, these techniques rely on the very consistency of computation — logically working with its internal (and inevitably partial) understandings.

Today, power is conducted through the prism of the algorithmic. This power is never given or assumed, but must be incessantly performed through a set of operations. These technical operations — instantiating objects and indexing data — must coalesce into meta-operations: creating subjectivities, forming relations, and directing work. In carrying out encapsulation, enlistment and enchantment, algorithmic platforms exert significant force on subjects. Yet the opposite also applies — when this grammar of operations is partial or unsuccessful, traction is not attained and a gap between subject and referent emerges. As each new technique is added, the gap between subject and referent only increases. In this sense, the algorithmic is often constructed, not unlike finance, as “long chains of increasingly speculative instruments that all rest on the alleged stability of that first step” (Sassen 118). Instrumentalizing this discrepancy suggests more intentional and effective interventions in the algorithmic regimes that increasingly shape our everyday.

 

Notes

[1] For an early discussion on the difficulties of scaling Uber and a decision to move to a service-oriented approach, see Haddad, Einas. “Service-Oriented Architecture: Scaling the Uber Engineering Codebase As We Grow.” Uber Engineering Blog, 8 Sept. 2015, https://eng.uber.com/soa/.
For an overview of the benefits of microservices and a case-study of one particular service, see: Reinhold, Emily. “The Opportunities Microservices Provide at Uber Engineering.” Uber Engineering Blog, 20 Apr. 2016, https://eng.uber.com/building-tincup/.
For an example of how other algorithmically driven corporations have adopted microservices, see: Cebula, Melanie. Airbnb, From Monolith to Microservices: How to Scale Your Architecture. https://www.youtube.com/watch?v=N1BWMW9NEQc. FutureStack Conference New York.

[2] The failure of Uber (or any other algorithmic ecology) to successfully internalize and instrumentalize human productivities should not be read as a rehabilitation of some immutable boundary between humanity and technology. Indeed, these categories are highly entangled: on the one hand, as Marcel Mauss reminded us, man’s “first and most natural technical object” is the body, and on the other, the ostensibly technical aspects of algorithmic systems are actually all-too-human: sedimentations of mathematical techniques, scholarly research, capitalist imperatives, business logic, and so on. Nor, as one reviewer pointed out, are these operational frictions limited only to the human, as might be inferred when only reading the present case-study on Uber. Articles elsewhere have focused, for example, on Amazon’s negotiation with the unwanted noisiness of the kitchen space and the undesired latency of geographical distance. Yet, at the same time, this text does want to stress how algorithmic infiltration into the everyday establishes a new set of frictions, and how the frictional human—whilst not exceptional—is a good example of an element with complex historical, psychological and cultural aspects which are abstracted away or ignored when integrated into operational logics. While it is traction, not perfection, that matters to algorithmic power, such thin (mis)understandings end up impinging upon operability itself.

[3] This is certainly the case in Auckland, New Zealand, for instance, where a personal visit to the Green Light Hub in Parnell reveals that a handful of young employees with laptops and a suite of service software underpin Uber’s operations in a city of around 2 million residents and 300,000 Riders, according to Uber’s own advertisements: https://www.uber.com/info/ride-nz/.

 

Works cited

Campbell, Harry. “How Uber Uses Behavior Modification To Control Its Drivers.” The Rideshare Guy Blog and Podcast, 17 Oct (2016). Web. http://therideshareguy.com/how-uber-uses-behavior-modification-to-control-its-drivers/.

Cebula, Melanie. Airbnb, From Monolith to Microservices: How to Scale Your Architecture. FutureStack Conference New York. Web. https://www.youtube.com/watch?v=N1BWMW9NEQc.

Chun, Wendy Hui Kyong. “On ‘Sourcery,’ or Code as Fetish.” Configurations, vol. 16, no. 3 (2008): 299–324. Print.

Crossley, John N., and Alan S. Henry. “Thus Spake Al-Khwārizmī: A Translation of the Text of Cambridge University Library Ms. Ii. Vi. 5.” Historia Mathematica, vol. 17, no. 2 (1990): 103–131. Print.

Efrati, Amir. “How Uber Will Combat Rising Driver Churn.” The Information, 20 Apr (2017). Web. https://www.theinformation.com/articles/how-uber-will-combat-rising-driver-churn.

Foucault, Michel. “Self Writing.” Ethics: Subjectivity and Truth, The New Press, 1997, pp. 207–222. Print.

Franklin, Seb. Control: Digitality as Cultural Logic. Cambridge, Mass.: MIT Press, 2015. Print.

Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. MIT Press, 2005.

Galloway, Alexander R. Protocol: How Control Exists after Decentralization. Cambridge, Mass.: MIT Press, 2004. Print.

Gell, Alfred. “The Technology of Enchantment and the Enchantment of Technology.” Anthropology, Art and Aesthetics, edited by Jeremy Coote and Anthony Shelton, 1992, pp. 40–63. Print.

Gillespie, Tarleton. “The Relevance of Algorithms.” Media Technologies Essays on Communication, Materiality, and Society, edited by Tarleton Gillespie et al., MIT Press, 2014, pp. 167–94. Print.

Haddad, Einas. “Service-Oriented Architecture: Scaling the Uber Engineering Codebase As We Grow.” Uber Engineering Blog, 8 Sept (2015). Web. https://eng.uber.com/soa/.

Hall, Jonathan V., and Alan B. Krueger. “An Analysis of the Labor Market for Uber’s Driver-Partners in the United States.” Industrial Relations Review (2015): 1–27. Print.

Hill, Kashmir. “Uber Doesn’t Want You To See This Document About Its Vast Data Surveillance System.” Gizmodo, 19 May (2017). Web. https://www.gizmodo.com.au/2017/05/uber-doesnt-want-you-to-see-this-document-about-its-vast-data-surveillance-system/.

Hochschild, Arlie Russell. The Managed Heart: Commercialization Of Human Feeling. University of California Press, 2003. Print.

Hörl, Erich. “Introduction to General Ecology: The Ecologization of Thinking.” General Ecology: The New Ecological Paradigm, edited by Erich Hörl and James Burton, Bloomsbury Academic, 2017, pp. 1–74. Print.

Kalanick, Travis. “Celebrating Cities: A New Look and Feel for Uber.” Uber Global, 2 Feb (2016). Web. https://newsroom.uber.com/celebrating-cities-a-new-look-and-feel-for-uber/.

Kleene, Stephen Cole. “Recursive Predicates and Quantifiers.” Transactions of the American Mathematical Society, vol. 53, no. 1 (1943): 41–73. Print.

Knight, Sam. “How Uber Conquered London.” The Guardian, 27 Apr (2016). Web. https://www.theguardian.com/technology/2016/apr/27/how-uber-conquered-london.

Kowalski, Robert. “Algorithm = Logic + Control.” Communications of the ACM, vol. 22, no. 7 (1979): 424–436. Print.

Lee, Min Kyung, et al. “Working With Machines: The Impact of Algorithmic and Data-Driven Management on Human Workers.” Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, ACM (2015): 1603–1612. Web. https://www.cs.cmu.edu/~mklee/materials/Publication/2015-CHI_algorithmic_management.pdf.

Marino, Mark. “Critical Code Studies.” Electronic Book Review, 4 Dec (2006). Web. http://www.electronicbookreview.com/thread/electropoetics/codology.

Parikka, Jussi, and Michael Goddard. “Unnatural Ecologies.” The Fibreculture Journal, no. 17 (2011): 1–5. Print.

Reinhold, Emily. “The Opportunities Microservices Provide at Uber Engineering.” Uber Engineering Blog, 20 Apr (2016). Web. https://eng.uber.com/building-tincup/.

Rosenblat, Alex, and Tim Hwang. Regional Diversity in Autonomy and Work: A Case Study from Uber and Lyft Drivers. 2016. Web. https://datasociety.net/pubs/ia/Rosenblat-Hwang_Regional_Diversity-10-13.pdf.

Sassen, Saskia. Expulsions: Brutality and Complexity in the Global Economy. Harvard University Press, 2014. Print.

Schönhart, Sabrina, et al. “History of Algorithms.” Virtual Exhibitions in Informatics (2003). Web. http://cs-exhibitions.uni-klu.ac.at/index.php?id=193.

Shustek, Len. “What Should We Collect to Preserve the History of Software?” IEEE Annals of the History of Computing, vol. 28, no. 4 (2006): 110–112. Print.

Spangenberg, Samuel Ward. Declaration, Spangenberg v. Uber Technologies, Inc. CGC-16-552156, 5 Oct (2016). Web. https://www.documentcloud.org/documents/3227535-Spangenberg-Declaration.html.

Steele, Robert. The Earliest Arithmetics in English. Oxford University Press, 1922. Print.
