New review of Graham Harman's "Reassembling the Political" on Circling Squares

Philip over at Circling Squares just blogged about his first read of Graham Harman's new book on Latour's political philosophy. Both book and review are worth a look (and, remembering my impressions when reading Prince of Networks, I assume Philip's thoughts on Harman's new book will resonate with me quite well…):

Much like his previous writings on Latour, Harman's new book should not be read as a neutral introduction. Just like Prince of Networks, Harman really ends up talking about himself and his own interests via the medium of Latour's concepts. Harman's clear, accessible and sometimes entertaining prose style, and the excessively extended introductory quotations, should not distract from this point. The final two fifths of the book are far stronger than the earlier part, with interesting and valuable discussions of Zizek and Strauss. However, as mentioned above, it appears to be a book written both in a hurry and in a style that fosters the appearance of being relatively neutral and introductory while in fact being anything but. Once again, Harman completely ignores the more interesting and complex, pluralistic aspects of Latour's work and his unwavering groundedness in problems.


via Circling Squares: Harman's Reassembling the Political—Some first impressions.

Kyle McGee’s ‘Bruno Latour: The Normativity of Networks’

Thanks a lot to Philip for pointing us to Kyle McGee's new book, and especially to the three pieces on the AIME website:

There are three extracts from the book on the AIME website (registration required):
The co-presence of [pol] and [law]
The ontology of lawyer jokes
Legal reasoning as de-stratification

I am so looking forward to checking the book out. Or maybe I'll just wait for a review on Circling Squares: Kyle McGee's 'Bruno Latour: The Normativity of Networks'.

Latour's Cosmocolosse. A project of Gaia Global Circus

I am just listening to the German translation of the radio version of Latour's first play "Cosmocolosse. A project of Gaia Global Circus" (written with Frédérique Ait-Touati & Chloé Latour) that was just released in December. I remember Paul Edwards telling me about this work in progress last summer – and it seems there is no English translation that you can listen to. But the text is available… and reading (to cite Niklas Luhmann's comment on why he had no TV set) is much faster than listening or watching.

Easing Sociology into the Non-Modern World?

In a recent post on "Understanding Society", Daniel Little discussed some recent developments in the philosophy of social science: Analytic Sociology, Critical Realism and Actor-Network Theory. Here is how it goes:

Understanding Society: How do the poles of current PSS interact?

Start with a few resonances between ANT and CR. Both are grounded in a philosophical system (Deleuze, Kant), and they both make use of philosophical arguments to arrive at substantive conclusions. (…) But a point of contrast is pervasive: CR is realist, and ANT is constructionist.

(…) The anti-philosophical bent of AS makes it difficult for AS scholars to read and benefit from the writings of ANT scholars (witness, for example, Hedstrom’s dismissal of Bourdieu). The model of explanation that is presupposed by AS — demonstration of how higher-level entities are given their properties by the intentional actions of individuals — is explicitly rejected by ANT. (…)

Finally, what about the relation between AS and CR? On the issue of causation there is a degree of separation — AS favors causal mechanisms, preferably grounded in the level of individuals, whereas CR favors causal powers at all levels. (…) But here there is perhaps room for a degree of accommodation, if CR scholars can be persuaded of the idea of relative explanatory autonomy advocated elsewhere here.

via Understanding Society: How do the poles of current PSS interact?.

Somehow I felt a bit like I was suddenly in an academic version of an episode of Doctor Who, jumping back in time, stranded in 1992. In a paper that came out in parallel to the well-known "Chicken Debate" (Collins/Yearley 1992; Latour/Callon 1992), but that was buried in an edited volume (McMullin 1992), Latour moved ANT explicitly away from what until then was called the "Social Studies of Science" and towards a framework to deal with the moderns. He wrote:

“A radical is someone who claims that scientific knowledge is entirely constructed ‘out of’ social relations; a progressist is someone who would say that it is ‘partially’ constructed out of social relations but that nature somehow ‘leaks in’ at the end. (…) a reactionary is someone who would claim science becomes really scientific only when it finally sheds any trace of social construction; while a conservative would say that although science escapes from society there are still factors from society that ‘leak in’ and influence its development. In the middle would be the marsh of wishy-washy scholars who add a little bit of nature to a little bit of society and shun the two extremes. (…)” (Latour 1992: 277)

And then he concludes:

“If one goes from left to right then one has to be a social constructivist; if, on the contrary, one goes from right to left, then one has to be a closet realist.  (…) It is fun to play but after twenty years of it we might shift to other games (…)”

Hmmm. Maybe it is not Doctor Who; maybe it is Groundhog Day and we are just waking up again at 6:00 to the voices of Sonny and Cher. Do we really have to have the same debate again, now not in the philosophy of science but in the philosophy of social science? Maybe there is a reason to do that – after all, Phil Connors has to repeat his morning routine over and over again until he starts making himself a better man and finally finds a way to love and happiness. After all, Little does indeed think it is important to add “ANT to the menu for the philosophy of social science, at least as a condiment if not the main course”, and there are others in that field today who share his opinion. Maybe if we don't discuss those issues again and try to work diplomatically with the moderns (in this case: AS and CR…) on what they value most, we are selfish and grumpy fools – like the character that Bill Murray played so beautifully. But it is not a tempting prospect to know that the sound of “I Got You Babe” will be with us for a while. Is there a way out? A shortcut? A “shift to other games”?

New paper on crowdsourcing

An interesting paper on crowdsourcing just came out in the “Computational & Mathematical Organization Theory” journal. “Maximizing benefits from crowdsourced data” by Geoffrey Barbier et al. explores how crowdsourcing can be used for purposes of collective action and problem-solving, for example, in disaster response and by relief organizations.
Here’s the abstract:

Crowds of people can solve some problems faster than individuals or small groups. A crowd can also rapidly generate data about circumstances affecting the crowd itself. This crowdsourced data can be leveraged to benefit the crowd by providing information or solutions faster than traditional means. However, the crowdsourced data can hardly be used directly to yield usable information. Intelligently analyzing and processing crowdsourced information can help prepare data to maximize the usable information, thus returning the benefit to the crowd. This article highlights challenges and investigates opportunities associated with mining crowdsourced data to yield useful information, as well as details how crowdsource information and technologies can be used for response-coordination when needed, and finally suggests related areas for future research.

Besides being a very useful reference piece that surveys the state of crowdsourced data – like where to find it and what to make of it – the paper is also a nice illustration of how social scientists become more and more involved in leveraging “big data” from informational infrastructures and from web activity in general. Crowdsourced data, but also initially far less directed, if not accidental, information flows appear to be increasingly data-mined for a variety of purposes, not least by – oops – us.
Check out the paper here.

Fields and Infrastructure? A comment to Fligstein on orgtheory

As a guest blogger on orgtheory, Neil Fligstein started a series of posts about his and Doug McAdam's new book “A Theory of Fields” (Oxford 2012). It seems to me that to continue the debate on Institutionalism and Infrastructuralism we should take up his analysis that most of what is called NI today – although continuously citing Meyer/Rowan, DiMaggio/Powell and of course the Powell/DiMaggio book – is no longer working out the “old neo-institutional” program but is trying to deal with problems of ongoing activity, constant and gradual change, and overlapping fields. In Fligstein's words:

One critical argument of the “new” new institutionalism is that actors are always jockeying for position in existing fields. They are always trying to better their situation and in doing so, can create change in both their position and the underlying order of the field. This produces two distinct kinds of change, the change whereby a new institutional order comes into existence and the more common situation whereby change is more gradual and continuous.


But this view of the world posits two radically different states, one where we can be agents and make our world and the other where we can do little about it. “A Theory of Fields” undermines this entire line of argument by asserting that actors are always acting and this means they are always struggling. They are in a battle for position and the game is always being played. This means that “A Theory of Fields” is part of a “new” new institutionalism that honors actors, sees purposes, interests, and identities, and allows for stuff to happen all the time.

Although I would reflexively add that what Fligstein seems to do here is itself the activity of an institutional entrepreneur – trying to change the rules of the field or, if that turns out to be impossible, preparing the setup of a new field – I guess the proposed new new institutionalism and the infrastructuralist agenda share some common problematiques: a focus on practice (A Theory of Fields draws heavily on Bourdieu), a focus on constant change, a curiosity about how patterned activity is produced in a world of constant change, and a focus on struggles (although we might be critical of the psychological undertone of the term struggle and rather speak of trials of strength).

Microfoundations, institutions and two ways of studying technologies

Quite some time ago we had a couple of posts on the possible links between STS and Neo-Institutionalism (see here, here, here and here) and about how both camps can be fruitfully matched in their attempts to get a grasp of the black-boxed, taken-for-granted or institutionalized character of modern practices.

One of the basic lines of linkage we identified back then was this: while Neo-Institutionalism is great at pointing out the empirical details and explaining the diffusion and isomorphism of patterns that are taken for granted (institutions), it lacks (following Powell and Colyvas 2008) a perspective on the respective microfoundations. That, on the other hand, is something that (most) STS approaches are quite good at – but they in turn (see for example the underdetermined concept of black-boxing) lack an understanding of how the “functional simplification” (Luhmann 1997) that technology enacts compares to other forms of making something taken for granted: habitualization (in the Bourdieuian sense), embodiment, signification, formalization, institutionalization.

After reading Barley and Tolbert's 1997 paper in Organization Studies on institutionalization and structuration, and after reviewing Barley's research on technologies at workplaces, I wondered (a) if and how the Powell and Colyvas argument about the missing microfoundations has ever been valid in institutional theory, given the amount of thought that Barley and Tolbert invest in designing their concept of scripts and the methodology to analyze them, and (b) why STS approaches to technology do not seem to play a large role in institutional analyses that deal with technologies, and why those institutional approaches to technology, in turn, play no significant role in STS. Any thoughts?

Water as a boundary liquid … ahhh … object

Good news for our friends in water infrastructure research (and for those interested in STS and state theory, of course): Patrick Carroll's “Water and Technoscientific State Formation in California” has just been published as an “online-first” by Social Studies of Science!

This paper argues that water gradually became, over a period of more than half a century, a critical boundary object between science and governance in California. The paper historicizes ‘water’, and argues that a series of discrete problems that involved water, particularly the reclamation of ‘swampland’ in the Sacramento Valley, gradually came to be viewed as a single ‘water problem’ with many facets. My overarching theoretical aim is to rethink the ontology of the technoscientific state through the tools of actor-network theory. I conclude with the following paradox: the more the technoscientific state forms into a complex gathering – or ‘thing’ – of which humans are part, the more it is represented and perceived as a simplified and singular actor set apart from those same humans.

Legit Infrastructure?

An interesting, new-ish scholar to look into is Ben Cashore, Professor, Environmental Governance & Political Science; Director of the Governance, Environment, and Markets Initiative at Yale (GEM) and Director, Program on Forest Policy and Governance.

Currently, I’m reading his co-authored paper on establishing legitimate non-state governance infrastructure, in his case, regarding the voluntary self-regulation for the development of forestlands and the sale of such harvests on the global market.

The paper is very similar to his other work, but it raises two great points worth considering:

1. While there is much ado about “governance without government”, most governance seems to be about government in some way or another; put another way, most non-governmental governance is in fact quite governmental in terms of its origins, functions, and composition. For example, the International Monetary Fund (IMF) was established by legitimate states as a way for states to deal with international issues, and its body is mainly composed of individuals in or with strong ties to existing governments. This is a good hidden-in-plain-sight observation: there is such a thing as governance without government, but the “government” never gets too far from sight.

2. Because non-state mechanisms, such as voluntary self-regulation for the development of forestlands, have little or no power to enforce standards/norms and little authority to penalize non-compliant firms, the legitimacy of these operations becomes paramount to understanding them. Their authority or power, which tends to be limited, is contingent on their legitimacy (note: this is a bit strong-handed, so please read the paper for a more nuanced interpretation).

Now, for the readers of this blog, this raises the issue of legitimate infrastructure. It is relatively rare to see work on infrastructure raise the notion of legitimacy, which is a concept that has many meanings and numerous analytical trajectories in various disciplines. In sociology, the new institutionalism is where I was first exposed to legitimacy arguments. Thinking back to Cashore’s work now, the development of non-state mechanisms such as voluntary self-regulation is influenced by the perceived legitimacy of the mechanism (and its relation to other mechanisms, conceivably), thus, might non-state infrastructural development follow the same underlying dynamic?

A good case, and the role of compatibility in theory selection

All, as promised, here is some early review material about Govind Gopakumar‘s new book on water infrastructure in India, Transforming Urban Water Supply in India.

First, the book’s case selection is quite smart. India has a historical legacy of democratic social involvement of the public (a promise, perhaps, between state and society) during the post-independence period. This context now finds itself, however, beset by global opportunities for growth and supra-state pressures for change. Infrastructure, fittingly, is stuck in the middle, especially when it comes to developing it, expanding it, and restructuring it. India therefore becomes a good case for estimating, per the insights gathered by geographers and others interested in the expansion of neoliberal globalization, whether or not the changes taking place in Indian urban infrastructure follow the trend toward “pervasive depoliticalization of public life” (Gopakumar 2012:4). Or, put another way,

Do global efforts erase existing political underpinnings and re-inscribe a fundamentally new political basis, or does the existing social and political environment continue to influence infrastructures in the face of global pressures? (Gopakumar 2012:5).

The question fits quite nicely with what Jan and I were thinking about as the new infrastructuralism. Govind’s research hints in many places – perhaps intentionally, or perhaps he is just feeling his way toward this or another new idea – at a broader theoretical contribution than is presented in the book. In this way, Govind’s book is somewhat bigger than it appears, both theoretically and literally, as the book is only 124 pages long.

Second, and here is a little more critique and a little less review, the book draws on David Harvey’s work (the Marxist geographer), Feenberg’s insights about technologies being designed to promote the interests of the powerful/influential, and Winner’s old (but still good) insight that technology shapes political life/reality in a way akin to how legislation does. What is sort of odd is that Pinch and Bijker’s piece on the social construction of technology (SCOT) plays a pivotal role in the theoretical build-up, especially the insights about “relevant groups.” Now, SCOT has often been criticized for being apolitical or for ignoring issues related to power and politics (mainly: who is powerful enough to be “relevant,” if we must use the term power). For Govind, SCOT is nice because it allows him to distance himself from explaining infrastructural development and revision as merely moves toward purely technological or economic efficiency. However, this becomes a bridge to Winner, and wasn’t it Winner who wrote, famously, about the third step in SCOT analysis, of “opening the black box and finding it empty” …

What behaviorism might tell us about Post-Foucauldian or "New" Surveillance Studies

I once asked what the “knowledge myth” might mean for STS, but this was really just a ploy to raise the issue of what it would take to generate a post-humanist form of behaviorism in STS.

How might we talk about the behavior of humans and non-humans parsimoniously without getting too bogged down in endless debates of “who/what is cognizant?” and “what role do intentions play?” This well-trodden issue was raised once again during our 4S sessions on the state.

One of the primary issues in behaviorism regarding the knowledge myth is “do you need to know how to do something to do it?” with the important follow-up that “if you do it, then you have shown (a) that you know how to do it, by benefit of doing it, or, if not that, (b) then knowing and doing are not nearly as related as we might otherwise demand in our social science accounts.”

Gary Marx, who wrote a great paper in Surveillance & Society, insists that surveillance in our high-tech age differs in non-trivial ways from traditional Foucauldian imagery of the somewhat distant past – “white hot pincers” during torture, or the grand Panopticon. In particular, Marx’s analysis focuses on “unintended” data collection that is likely to be amassed by automated machines; think data about data, for example, the location of a purchase or the time stamp of a Facebook post (Marx 2002:15). He gives an example that is, at first glance, cool: there was suspicion that a university building was going to fall prey to arson after a Gatorade bottle full of explosive material was found on site. By cross-listing the keycard entry registries and the shipping code on the Gatorade bottle, the culprit was found, and upon being found, s/he confessed. This seems like a straightforward case where data was collected and then used to capture a miscreant, but these data were never collected with the direct and explicit intention of catching criminals or criminal acts. These data were not intended to result in this end (I’d prefer a different term than “unintended,” but that will be another post).

Drawing on these materials liberally is Michalis Lianos (2003:412), in another paper in the same journal about Post-Foucauldian studies, who adopts and develops the idea that diverse technologies at “points of use” (let’s call them) result in data, which then contribute to what he refers to as “unintended control” – control not really intended to promote any values in particular, but that can be used in matters of control after collection.

The technologies so often utilized in Post-Foucauldian analyses are many and diffuse, but they rarely have any intentional politics, according to Lianos. This was quite a surprise, as I had always read these Post-Foucauldians as having an almost unanimous position that armies of little technologies were “out there” doing the dirty work of making neoliberalism a reality.

Back to Lianos: The data these automated machines collect might be used in political ways, big and small, but in a shrewd move, Lianos demands that in studies of control and surveillance we must “break this correspondence between motive and outcome” or, put more exactly, recognize that “the intention to control is not a necessary precondition for effectively producing serious consequences for the sphere of control” (424). And thus we are left with an odd behavioral post-humanist vision of technology where the “intentions” of the designer or user drift from primacy in analysis, and instead we observe what is collected or made, what is done with it, and what this contributes to local levels and beyond.

Bravo, Lianos.

For us in STS, I have always been concerned that Foucauldians place so much emphasis on dispositif and governmentality when their analyses so often hinge on diffuse, micro-level technological use for the purpose of voluntary self-regulation. I am not referring to micro-physics or the art of government either. Instead, we get a nuanced view from Lianos of how, to borrow a beloved phrase from Bruno, the “missing masses” do all the hard work in Post-Foucauldian governmentality studies … although not intentionally.

Jan-Hendrik Passoth’s and my (Nicholas Rowland’s) comments at 4S

Jan and I organized Sessions 201 and 222 back-to-back on the topic of states, state measurement, and state theory. These talks and our comments were presented at the Annual Meeting of the Society for Social Studies of Science (4S) in Cleveland, OH, November 05, 2011.

Session 201: Counting and Measuring

The relationship between science, technology, and governance shapes and is shaped by contemporary states. While STS research has paid much attention to how contemporary modes of governance influence scientific practice and technological innovations, the converse question – the influence of both on governance – is relatively underrepresented.

These sessions, therefore, take up the task of exploring this relationship and its depiction in history and in social and political theory. The first session (Session 201) presents a series of five case studies on the role of conflict, measurement and performativity for the enactment of stateness, drawing on rich empirical projects. The second session (Session 222) focuses on conceptualization and theoretical approaches, dealing mostly with the mechanisms and techniques of creating, maintaining and shifting the multiple ontologies of stateness.

Anat Leibler will show us the traditional science-state relationship, but from a new angle wherein the science of population measurement is embedded in states of conflict – in this case, Israel and the Occupied Territories.

Hector Vera also emphasizes the central role of measurement; in his case, however, it is about the measurement standards adopted by Mexico and the US, studied in a historical-comparative case study approach.

Michael Rodriguez brings together the dual-tasks of counting and countings of populations, but on the level of micro-practices in his work on the role of “partnerships” with Latino communities that are often “undercounted” by traditional census techniques.

Keith Guzik returns our attention to Mexico, where rather than counting techniques or practices, he emphasizes the role of techno-infrastructure in his historical account of national security programs.

Daniel Barber also provides a historical view, but one more fine-grained, drilling deeply into the 1940s US Department of the Interior, where two models of future energy use were evaluated quite openly; however, as we can all see, one of these models has obviously become taken for granted.


Session 222: Theory and Ontology

Patrick Carroll shows us, through a detailed but theoretically oriented case study, how diverse, seemingly unrelated issues of water and water infrastructure became a – read, grouped or combined – political object of state governing.

Hendrik Vollmer describes another transformation which invokes the state; this time, however, through micro-measurement for sake of global comparison and regulation.

Erich Schienke grounds his paper in the fertile fodder of Ecocities in China, which do not yet fully exist (other than in discourse), showing how aggregated environmental indicators will be used, we think/he thinks, to re-position the Chinese state as an ecological civilization in the global theater of political action.

Kelly Moore’s work (she was not in attendance) challenges us to ask “how does the state get into our bodies?”, the answer to which turns out to be a neoliberal story of government intervention into bodies through what she calls the promulgation of “pleasured self-discipline.”


Concluding Comments (once presentations end, and before questions):

All of the papers tackle what we will crudely frame here as the classical concern over the relation between micro processes and macro entities. For example, the micro processes seen in Michael Rodriguez’s work on the day-to-day, on-the-ground counting of the undercounted, or Patrick Carroll’s work on water infrastructure, where many seemingly distinct matters relating to people, land, and water were lashed together and inverted to become one concern over water for some manner of macro entity usually referred to as the state. The relation between micro processes and macro entities is a debate worth studying.

And these presenters do much justice to this enduring debate by bringing much more nuanced interpretations into their analyses, especially of counting practices, and into their theoretical approaches to understanding where the state is and is not, and its multiple purported effects.

We observe empirically, and we all have seen this here today, that there are important similarities too between what we “see” on-the-ground and the conceptual tools we have inherited from our respective disciplines in sociology, history, geography, political science, and the like. The perhaps surprising link we speak of is between (a) the historically-embedded, highly-contingent, ongoing-accomplishments that we observe in our empirical investigations and (b) the conceptual apparatus that we invoke, as scholars.

To our minds, and this is our closing remark, which is perhaps controversial: it is of the utmost importance for scholars to remember that the concepts we make and their appearance and use in our field-sites are linked together. These are not merely opportunities to verify or reject our theories. Instead, they are valuable analytical opportunities to critically and empirically engage them.

Debating whether or not “the state” exists is a waste of our time; rather, it is precisely these ephemeral moments – when, by whom, and how the state is brought into existence or invoked as a partner – that we should direct our analytical and empirical attention to … as we consider this a fertile site for STS’s group contribution to state theory.

First book available from "The MIT Press Infrastructure Series"

This is a great new publishing series for folks interested in infrastructure:

Check it out: Geoffrey C. Bowker and Paul N. Edwards, Associate Series Editors

In recent years, awareness of infrastructures has been building to a remarkable degree in virtually every area. The information infrastructure which subtends the revolutionary new forms of sociability, science, scholarship and business is one example. A second is the state of roads, bridges, dams, and other large, expensive, long-term investments as our national and international infrastructures fall into disrepair. A third is the energy infrastructures, both old (fossil fuels) and new (renewables), that subtend the world economy.

A few centers of important scholarship on infrastructures have emerged, such as large technical systems theory (history of technology), urban infrastructures (urban planning, geography), and information infrastructures (information studies, computer-supported cooperative work). Yet too much of this work has been siloed, focusing only on a particular system or scale, and with few exceptions it has remained sequestered within some of the smaller academic fields. Finally, remarkably little work has been done on the comparative study of infrastructures: taking lessons from one field and modifying them for another.

The first book in the series is “Standards: Recipes for Reality”


Recipes for Reality
Lawrence Busch

Standards are the means by which we construct realities. There are established standards for professional accreditation, the environment, consumer products, animal welfare, the acceptable stress for highway bridges, healthcare, education–for almost everything. We are surrounded by a vast array of standards, many of which we take for granted but each of which has been and continues to be the subject of intense negotiation. In this book, Lawrence Busch investigates standards as “recipes for reality.” Standards, he argues, shape not only the physical world around us but also our social lives and even our selves.  

Busch shows how standards are intimately connected to power–that they often serve to empower some and disempower others. He outlines the history of formal standards and describes how modern science came to be associated with the moral-technical project of standardization of both people and things. He examines the use of standards to differentiate and how this affects our perceptions; he discusses the creation of a global system of audits, certifications, and accreditations; and he considers issues of trust, honesty, and risk. After exploring the troubled coexistence of standards and democracy, Busch suggests guidelines for developing fair, equitable, and effective standards. Taking a uniquely integrated and comprehensive view of the subject, Busch shows how standards for people and things are inextricably linked, how standards are always layered (even if often addressed serially), and how standards are simultaneously technical, social, moral, legal, and ontological devices.

About the Author

Lawrence Busch is University Distinguished Professor in the Center for the Study of Standards in Society in the Department of Sociology at Michigan State University and Professor of Standards and Society in the Centre for Economic and Social Aspects of Genomics at Lancaster University, U.K.


“Lawrence Busch’s book, Standards: Recipes for Reality, illustrates with vivid clarity the ubiquity and importance of these ‘things’ called standards. Rather than present a dry economic text, or a singular discipline’s focus, Busch has proposed a ‘Unified Field Theory’ for standards—a multidisciplinary view of standardization. For anyone interested in standardization from a policy, technical, or social perspective, this volume is absolutely essential.”
Carl Cargill, Principal Scientist of Standards, Adobe Systems; author of Open Systems Standardization: A Business Approach

“With enviable style and impeccable clarity, Busch shines a bright beam into the anonymous, invisible world of standards to reveal how these commonplace instruments order the messy world we live in. This deeply thoughtful work of political sociology is a must-read for anyone concerned with the hidden dynamics of power in contemporary industrial democracies.”
Sheila Jasanoff, Pforzheimer Professor of Science and Technology Studies, Harvard Kennedy School; author of Designs on Nature

“This book demonstrates that Lawrence Busch is not only an outstanding expert and even connoisseur of the subtle nuances of the world of standards that are used to make and unmake the world; he is also a critical analyst of their political and moral significance. Deeply informed by debates in the social sciences, economics, and even analytical philosophy, the book combines a rigorous examination with a great sense of humor in a journey that leads the reader from Harlequin romances to the auditable firm.”
Laurent Thévenot, Professor, École des Hautes Études en Sciences Sociales, Paris

What’s the next great infrastructure study?

One of the common topics of discussion between Jan-Hendrik and me is “What’s the next great infrastructure study?”

Patrick Carroll is writing about bogs and water infrastructure again, only this time in California rather than Ireland.

Anique Hommels is writing about re-building and unbuilding cities.

David Ribes is writing about cyberinfrastructure (in a paper worth reviewing).

A long time ago (to some, at least), Tom Hughes wrote on the electricity infrastructure in Networks of Power.

What are some of the other classic infrastructure studies, and if we review enough of them, what stones remain obviously unturned (and might become the next great infrastructure study)?

The "lighthouse" (re: Coase) in new institutionalism is the museum

Per recent discussions of black-boxing and institutionalization, a paper that Fabio and I wrote seems worth remembering. We wrote it to interrogate the use of “the museum” by new institutionalists of the organizational-analysis bent.

Sociologists that study organizations often analyze the museum from a cultural perspective that emphasizes the norms of the museum industry and the larger society. We review this literature and suggest that sociologists should take into account the technical demands of museums. Drawing on insights from social studies of technology, we argue that museums are better understood as organizations that must accomplish legitimate goals with specific technologies. These technologies impact museums and the broader museum field in at least three ways: they make specific types of art possible and permit individuals and organizations to participate in the art world; they allow actors to insert new practices in museums; and they can stabilize or destabilize museum practices. We illustrate our arguments with examples drawn from the world of contemporary art.

"Black box" and "taken-for-granted"

I recently asked:

When are the processes that bring about a black box the same as those that bring about — in the institutionalist frame — the notion of taken-for-grantedness … and, when are the processes that bring about either of these notions incapable of producing the other?

Upon further reflection, it seems to be an obvious paper, one that might bridge some of the thinking about technology and institutional arrangements. Restated as a set of guiding questions, it would go:

Q1. What circumstances/processes do the concepts of “black box” and “taken-for-granted” both meaningfully capture?

Q2. What circumstances/processes does the concept of “black box” meaningfully capture that the concept “taken-for-granted” cannot?

Q3. What circumstances/processes does the concept of “taken-for-granted” meaningfully capture that the concept “black box” cannot?

It seems like an interesting review piece to see where organizational theorists and STSers have historically overlapped and where they have diverged, with the caveat that each might learn something if orthogonal points of divergence were reconsidered in the respective lines of research.

Timmermans has done it again — this time about failures!

I have always enjoyed reading Stefan Timmermans’s research, and his new piece in STHV is no exception.

The abstract, which is below, is not only a good reversal of an old idea but also solid prose — worth the read.


Sociologists of science have argued that due to the institutional reward system negative research results, such as failed experiments, may harm scientific careers. We know little, however, of how scientists themselves make sense of negative research findings. Drawing from the sociology of work, the author discusses how researchers involved in a double-blind, placebo, controlled randomized clinical trial for methamphetamine dependency informally and formally interpret the emerging research results. Because the drug tested in the trial was not an effective treatment, the staff considered the trial a failure. In spite of the disappointing results, the staff involved in the daily work with research subjects still reframed the trial as meaningful because they were able to treat people for their drug dependency. The authors of the major publication also framed the results as worthwhile by linking their study to a previously published study in a post hoc analysis. The author concludes that negative research findings offer a collective opportunity to define what scientific work is about and that the effects of failed experiments depend on individual biography and institutional context.

Social significance of gap analysis

Although I’m not entirely sure of its implications for infrastructure, gap analysis seems promising as a research site: despite widespread use in the management and implementation of software, it remains an untapped and unappreciated workflow-analysis technique in research.

In general, gap analysis takes three forms, each of which documents the gap between two states: current versus future, expected versus actual, perception versus delivered. The difference between the two states defines the gap, and from such assessments further comparisons, such as benchmarking, become possible (Boxwell 1994).

The first form is a map. Cartographic representations are mainly utilized in lean management to chart the flows of raw materials – including information – currently necessary to make a product or service available to consumers, so that those flows can be assessed for waste. Once areas for improved flow and reduced waste are identified, analysts draw them into a future-state value stream map. The differences between the two states define the gaps, which orient work toward that future condition. The map form of gap analysis was developed at Toyota (Rother and Shook 1999).

The second form is a step chart. Temporality is built into the step chart, which also identifies and compares current practice and the desired future state for the performance of a service or product. Brown and Plenert (2006:319) provide a good example of where a step chart might solve the gap between expected and actual states: “customers may expect to wait only 20 minutes to see their doctor but, in fact, have to wait more than thirty minutes.” Step charts lay out the steps necessary to move from current practice to future practice (Chakrapani 1999).

The third form, which is most appropriate for working around packaged software, is a cross-list. Such analyses are most routinely undertaken in consumer research, wherein gap analysis refers to the:

methodological tabulation of all known requirements of consumers in a particular category of products, together with a cross-listing of all features provided by existing products to satisfy existing requirements. Such a chart shows up any gaps that exist (n.a. 2006).

Once cross-listed in table format, gaps become obvious, and their analysis points to unmet consumer demand that new or poorly marketed products might fulfill. However, prior to the establishment of a cross-list, consumer expectations and experiences must be gathered, for example, by focus-group interviews. Once collected and made to populate a cross-listed table, according to Brown and Plenert (2006:320), “gaps can be simply calculated as the arithmetic difference between the two measurements for each attribute.”
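To make the cross-list form concrete, here is a minimal sketch of the arithmetic Brown and Plenert describe; the attribute names and ratings are invented for illustration, not drawn from any actual study:

```python
# Cross-list gap analysis sketch: for each product attribute, the gap is the
# arithmetic difference between what consumers expect and what they report
# experiencing. Attribute names and scores below are hypothetical.

expected = {"wait_time": 9.0, "friendliness": 8.5, "price": 6.0}
actual = {"wait_time": 5.5, "friendliness": 8.0, "price": 7.0}

# The gap per attribute is simply expected minus actual.
gaps = {attr: expected[attr] - actual[attr] for attr in expected}

# Attributes with the largest positive gap are where delivery falls
# furthest short of expectations.
for attr, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{attr}: gap = {gap:+.1f}")
```

A negative gap (here, price) would indicate delivery exceeding expectations, which is itself informative for the analyst.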

From "forces of production" to "forces of customization"

A new line of research might open up if we read David Noble‘s Forces of Production again and ask “what is the relevance for contemporary packaged software?” Noble, who passed away last December, wrote what is arguably one of the best books in STS’s past about the role of managerial power in determining the direction of technological development, much of which is accomplished by selecting one technology over another to foster toward future development. Additionally, Noble keyed us all in to the idea of the “path not traveled,” wherein we consider “what might have been” had another road been traveled (i.e., had another technology [or no technology] been selected).

In some ways, I think Noble’s work appears old-fashioned to new scholars (despite his excellent empirical material). But maybe not.

We might extend his ideas about managerial power being augmented by selecting one technology over another toward an analysis that predicts managerial power is instead augmented by iteratively selecting the ongoing customized form of a flexible technology (i.e., an ongoing process rather than a nominal, usually binary, decision breakpoint).

Special Issue in Science Studies accepting papers on "Patient 2.0"

As an outcome of Track 026 of the last EASST meeting (in Trento, IT), the organizers have:

been working to edit a special issue on “patient 2.0”. We are pleased to announce we’ve been hosted by Science Studies as guest editors of a forthcoming publication on the theme.

The journal has a long-standing reputation for publishing high-quality articles in the field of Science and Technology Studies since the end of the eighties. Science Studies is an Open Access journal, and we invite you to have a look at their latest issue to better grasp the kind of submissions they welcome.

The call for papers is attached and, as you will notice, it is an evolution of the track’s cfp (the call is also available online). If you have not already published your work elsewhere, we encourage you to submit your paper for evaluation before 31 January 2012. Of course you are free to submit a completely new work as far as it is consistent with the call. All the papers will be anonymously reviewed and evaluated jointly with the editorial board of the journal.

Making good on disasters: Why did Google help Japan?

A story in the New York Times today describes how Google is making headway among the Japanese. In Japan, Google does not have the vast market share that it does in the U.S. or other countries around the world. However, as the story’s title indicates, “Quick Action Helps Google Win Friends in Japan.” The story goes:


Google is using its Street View technology in Kesennuma and elsewhere to make a record of the disaster while tracking reconstruction efforts.

An oddly equipped car made its way last week through the rubble in this tsunami-stricken port city. On the roof: an assembly of nine cameras creating 360-degree panoramic digital images of the disaster zone to archive damage.

It is one of the newest ways that Google, a Web giant worldwide but long a mere runner-up in Japan’s online market, has harnessed its technology to raise its brand and social networking identity in this country.

Google was also quick in the early hours of the disaster to assemble a Person Finder site that helped people learn of the status of friends and relatives affected by the earthquake and tsunami.

It is important to note that Google cannot yet determine whether these efforts have helped it gain ground in the Japanese browser-use market; however, that is far from what interests me.

While writing about the social significance of non-events, I asked the following question:

After studying the Tylenol Poisoning Tragedy (see chapter one of Minding the Machines: Preventing Technological Disaster) and many others, we ask: are there circumstances under which a firm might gain, over the long run, from a carefully handled crisis? Students, especially of the conspiracy-theory bent, go nuts with this one and reformulate my question: are there circumstances under which a firm might gain, over the long run, from a carefully planned and handled crisis?

So, while I have no illusions that Google planned the tsunami in Japan, I wonder if non-local crisis-response research and development might be a way to answer the question above, or to shift the dialogue to such topics as “planned disaster response by for-profit agencies.” It seems as though organizations like Google, with oodles of slack resources and a penchant for expansion, might serve themselves well by expressing “social responsibility” during times of non-local crisis … especially in nations where their product, service, etc. is not the leading brand, type, etc.

Hence, almost sounding like a conspiracy theorist now, is it just a coincidence that Google reached out to Japan?

"Networking the failed state" is an interesting paper

Christian Bueger and Felix Bethke presented this piece at the 51st Annual Conference of the International Studies Association, New Orleans, February 2010. The paper basically recognizes that international relations is being, and should be, studied in less “social” ways, and instead uses some of the topography and geography of ANT to do so. Interesting stuff.

Here’s the abstract:

Abstract: The discipline of International Relations is increasingly studied as a social phenomenon. In contrast to universalist understandings of IR as global knowledge, sociologies of IR have localized and pluralized our understanding of the discipline. IR is understood as a conglomerate of national communities. These national communities are depicted as each having their own culture, organizations and knowledge and are seen as interdependent to each other, while they all circulate around a centre, the North American community of IR scholars. We challenge such communitarian understandings of the discipline. Basing our discussion in Actor-Network Theory (ANT), we argue to conceive of IR as different spatial form, that is a network or rhizome. Such an understanding enables us to study the relations of IR to other entities, to emphasis actor and practices as constituting the discipline and to address issues of power relations, that go beyond inter-community power relations. To make a case which phenomena come into sight from an ANT perspective we study the case of research on Failed States. We disentangle the network, and sketch how IR is enroled as well as transformed.

Games with a purpose – a new role for human web users?

Just coming back from a few days of fieldwork (preparing ethnographic research in the field of semantic software), I could not help but share something I just learned. It fits quite nicely with what I have written before on the masses of non-human actors that populate the web today (crawlers, spiders, bots) and how the interdependencies between “them” and others (like us) change with the implementation of new web technologies.


Semantic technologies are built to process large numbers of unstructured documents and to automatically find (and tag) meaningful entities. And while these frameworks of crawlers, transforming tools and mining algorithms are actually quite good at finding structure in data, they are still (at least initially; they learn quickly) quite bad at assigning meaningful labels to it. They are quick and good at understanding that a text is about something, but they are bad and slow at judging ambiguous terms – they fail at understanding. But a recent trend called “gamification” (which has been around for a while but was until recently used mainly for encouraging users to fill out boring forms) is now a good example of how the configuration of agency changes on the web today. Human users are asked to play games that help annotate and match ambiguous patterns – tagging pictures, texts, music, etc. So it is not machines doing tasks for humans – it is humans working for machines.

For those who want to try working for them, check out the “Games with a Purpose” website. A paper that describes what exactly they do can be found here.
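The core mechanic of many such games can be sketched in a few lines. This is a toy illustration only, not the actual GWAP implementation, and the player inputs are invented: two players tag the same item independently, and only labels they agree on are trusted as annotations.

```python
# Agreement-based labeling, as in ESP-style "games with a purpose":
# independent agreement between two humans is taken as evidence that a
# label is meaningful, which is exactly what mining algorithms are bad at.

def agreed_labels(tags_a, tags_b):
    """Return the labels that both players independently suggested."""
    return set(tags_a) & set(tags_b)

# Two hypothetical players tagging the same picture:
player_a = ["jaguar", "cat", "spots", "fast"]
player_b = ["animal", "jaguar", "cat"]

print(agreed_labels(player_a, player_b))
```

Everything outside the intersection is discarded as noise; the humans, in effect, perform the disambiguation work the machines cannot.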

Dismantling boundaries in science and technology studies

A nice paper by Cornell’s Peter Dear and Harvard’s Sheila Jasanoff about regulatory science:


The boundaries between the history of science and science and technology studies (STS) can be misleadingly drawn, to the detriment of both fields. This essay stresses their commonalities and potential for valuable synergy. The evolution of the two fields has been characterized by lively interchange and boundary crossing, with leading scholars functioning easily on both sides of the past/present divide. Disciplines, it is argued, are best regarded as training grounds for asking particular kinds of questions, using particular clusters of methods. Viewed in this way, history of science and STS are notable for their shared approaches to disciplining. The essay concludes with a concrete example–regulatory science–showing how a topic such as this can be productively studied with methods that contradict any alleged disciplinary divide between historical and contemporary studies of science.

What’s up with ‘social innovation’?

I recently reviewed a book for the International Sociology Review of Books about “social innovation.” The book was Non-technological and non-economic innovations, edited by Steffen Roth, who also authors the first chapter. The book’s contributors, some of whom wrote mainly conceptual pieces and others more empirical works, were mostly European, with the exception of a scholar from India and another from Russia.

The book opens with a hidden-in-plain-sight insight: overwhelmingly, studies of innovation tend to emphasize the story of technology going to the market to spread widely among buyers/adopters. What is missing, they say, and what is also necessary for a more robust theory of innovation, is a deeper understanding of what these studies necessarily omit — that is, non-technological innovations which do not compete in the market (in the traditional sense of the word).

… a somewhat compelling position, as it defines new research and sets an agenda for multiple scholars to advance the field. Still, this may not be something new. After all, social innovations include advanced social networking techniques, trends in outsourcing or downsizing, shifts in organizational form or work processes, advertising and branding techniques, etc. However, that is not really what worries me about this approach.

Here is an excerpt from the forthcoming review, which makes my point:

… The introduction ominously asks: “if innovations also have a social dimension, then is there a social dimension of social innovations, too?” (10) The question reveals a couple of things. First, the adoption of innovations such as advanced networking strategies or intensive outsourcing, for example, are conceptualized as definitively “social” things that spread (82,84). Second, innovations also have a “social” dimension, which might include harnessing symbolic systems in order to evoke a certain set of emotions in advertising or through branding (164). This social dimension might also include promotional events, auctions, or the establishment of auxiliary organizations such as museums or historical societies (246). Third, innovations are embedded in extant social relations, hence, innovations shape and are shaped by the circumstances of their social context (161). If social innovations, which are conceptualized as having one or more social dimensions, shape and are shaped by social contextual factors, within which they are embedded, then it appears there is nothing more social and, therefore, nothing more obviously under the jurisdiction of the social scientist to study than social innovations from this perspective. However, there is a deep theoretical issue to be considered regarding these multiple uses of the term “social,” a term taken to mean, in the context of this book, a thing, a dimension of that thing, and its context. I am thinking foremost about how this book’s raison d’être squares with Bruno Latour’s (2005) recent book Reassembling the social. I cite a forthcoming review of Latour’s book to make my point (Rowland, Passoth, and Kinney):

“Latour’s bottom line: As it happens, much of contemporary sociology is misdirected bunk; (…) Latour’s admittedly self-serving historical portrayal of sociology delivered in this book is perhaps forgivable because, in exchange, we get to see how performativity works among sociologists (rather than just economists). Sociologists give artificial strength to ideas that were only meant to be conceptual. (…) Sociologists are guilty of this sort of performativity, but also something much more grave. The “social” is used at times to explain what binds people together or tears them apart, but sociologists simultaneously demand that the social can also be a backdrop shaping interactions that bring people together or tears them apart. Sociologists get to have their cake and eat it too … ”

With the “social” taking-on so many meanings in this edited book (i.e., a thing, a dimension, and a context), we wonder if scholars of social innovation are also asking to “have their cake and eat it too” much the way Latour suggests sociologists have over the last century.

Panopticism, lateralism, and infrastructure cont.: IT and surveillance, once more

I just got my hands on a new paper by Marion Brivot and Yves Gendron that speaks very directly to the questions of lateral or flat types of power and regulation – and how IT infrastructures may be involved. Here is the abstract:

Beyond panopticism: On the ramifications of surveillance in a contemporary professional setting

This paper provides fieldwork evidence, which solidifies an emerging view in literature, regarding the limitations of the panoptical metaphor in informing meaningfully and productively the analysis of contemporary surveillance and control. Our thesis is that the panopticon metaphor, which conceives of the organization as a bounded enclosure made up of divisible, observable and calculable spaces, is becoming less and less relevant in the age of contemporary surveillance technologies. Through a longitudinal socio-ethnographic study of the ramifications of surveillance ensuing from the implementation of a computerized knowledge management system (KMS) in a Parisian tax/law firm, our analysis points to the proliferation of lateral networks of surveillance having developed in the aftermath of implementation. In this complex and unstable constellation of rhizomatical controls, peers are involved in scrutinizing the validity of one another’s work, irrespective of the office’s hierarchies and official lines of specialization. As a result, games of visibility (exhibitionism), observation (voyeurism) and secrecy (hiding one’s work from the KMS) abound in the office. One of our main conclusions is to emphasize the pertinence of apprehending control and surveillance from angles that take into account the ambiguities, complexities and unpredictability of human institutions, especially in digitalized environments.

Sounds like a keeper. Although I look at this journal regularly, this one escaped my attention until I got the respective e-mail alert yesterday (another infrastructural topic, I guess). Anyway, the full paper can be accessed here.

An ANT Paper in Sociological Theory!

Just a short note: the recent issue of Sociological Theory features a paper not only based on STS thought but one that even has “Actor-Network” in its title. As I am not on the university VPN right now, I cannot download it to review it, but judging from other papers I know from Hiro Saito, it should be a good one.

A major problem with the emerging sociological literature on cosmopolitanism is that it has not adequately theorized mechanisms that mediate the presumed causal relationship between globalization and the development of cosmopolitan orientations. To solve this problem, I draw on Bruno Latour’s actor-network theory (ANT) to theorize the development of three key elements of cosmopolitanism: cultural omnivorousness, ethnic tolerance, and cosmopolitics. ANT illuminates how humans and nonhumans of multiple nationalities develop attachments with one another to create network structures that sustain cosmopolitanism. ANT also helps the sociology of cosmopolitanism become more reflexive and critical of its implicit normative claims.

An Actor-Network Theory of Cosmopolitanism – Saito – 2011 – Sociological Theory – Wiley Online Library

Humans TXT: We Are People, Not Machines.


Do you know who your readers are? I just recently met a reader of our blog from Lancaster at a conference in Berlin, and I was very happy to finally have a face to remember when posting (ok, I of course know Nick’s, Hendrik’s and Antonia’s faces). But guess who the most frequent readers of this site are? Machines! The Google-Bot, the Posterous-Indexer, Feedburner and their pals harvest websites, and it seems they are the most faithful readers of what we write.

As I tried to argue in a German paper on media change and interobjectivity, the specific division of labour between humans and machines is what is at stake in some of the most interesting innovation processes in the field of web technologies. Who should have to do most of the work? A few of you might remember the hard days of the ongoing browser wars: a web designer in those days had to build three or more versions of her site just to please the different web browsers. Or look at the struggle over RSS or, more recently, semantic technologies: who should add all the meta data, who should try to make sense of this mess of interconnected data? Us? Or them?

And now I just stumbled upon a strange idea. It goes like this: if there are files on a website that are for bots only (the “robots.txt” file that asks search engines to please not index a site), why not create an equivalent just for human readers? That is the basic idea behind “humans.txt”. And there are huge stakeholders involved. Google has already jumped on board; this is their file:

Google is built by a large team of engineers, designers, researchers, robots, and others in many different sites across the globe. It is updated continuously, and built with more tools and technologies than we can shake a stick at. If you’d like to help us out, see

Wait, what? Google? After wondering for a while what sense it could make to duplicate the stuff that is already on your “about” page in a text file without layout and eye candy, I suddenly realized. Guess who likes plain text files? Guess who would like to find meta data about a web site always at the same place? Yes. Bots. They will be the most likely readers. So: who do we write for?
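For readers who have never peeked at these files, here is a minimal sketch of the two conventions. The contents are invented for illustration (the section headers follow the humanstxt.org convention), not copied from any real site:

```text
# robots.txt (served at the site root) is read by crawlers; this
# example asks all of them not to index anything:
User-agent: *
Disallow: /

# humans.txt (by convention also at the site root, at /humans.txt) is
# free-form plain text meant for people; a typical file looks like:
/* TEAM */
    Blogger: Jane Doe
    Location: Berlin, Germany

/* SITE */
    Language: English / German
```

Note the irony: both files live at predictable paths and use rigidly plain formats, which is precisely what makes them convenient for machines to parse.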


The role of venture capitalists in infrastructure?

A decent and relatively new blog about being a venture capitalist by Fred Wilson got me thinking about how venture capital firms are a different type of “organizational entity” compared to universities or industrial manufacturing firms, in particular with regard to their role in the development, repair, and replacement of infrastructure. While the large entities with monopolistic control over primary infrastructure (e.g., roads, water systems, etc.) are usually those odd things called “states,” venture capitalists are increasingly playing a role in the development of all sorts of “modern” infrastructures that are every bit as significant (i.e., bio-engineering, IT, etc.). The difference lies, perhaps, in their underlying motivations.

Venture capitalists look for high-potential, high-risk investment opportunities with early-stage startup companies that show signs of future growth and are in need of seed money. I assume/guess that these sorts of companies produce different types of infrastructure compared to monolithic and monopolistic state endeavors. While it is conceptually sloppy to refer to a state as an actor, I must; I assume that states are motivated to invest in precisely the opposite — they want low-risk investment opportunities with foreseeable benefits and prefer to work with established or “proven” companies to get the infrastructure they want.

Now, while venture capitalists and states are both investing in infrastructural innovations (in fact, venture capital investments are sometimes a proxy for or indicator of innovation in a given nation or sector of the economy), are they investing in the same things?

I think not, given their motivations for investment, and some theoretical and empirical comparisons with states would make venture capitalists a potentially exciting topic for STS.

What’s next in the study of management information systems?

After we deliberated a bit about what’s next for STS, I find myself wondering some more about where to take the analysis of management information systems. I have to decide pretty soon whether to commit to an empirical project in this area, and I would of course very much like to avoid focusing on questions which have already been explored at great length by others.
My impression is that when it comes to exploring management information systems, the elementary questions about technology associated with the original program of STS are pretty much in the books. This may be particularly true – or, at least, that’s my impression – with respect to the “big” questions of social construction, the interrelation (if not identity) of social and technological structure, and so on. But then again, I do still need a general theme with which to associate my research initially in order to establish where I would generally like to take it.
As of now, I have primarily been thinking of pushing the envelope with respect to the analysis of regulatory regimes in terms of a still more micro analysis of how regulation is really brought about and sustained in social situations (in which participants mobilize information systems or particular inscriptions provided by these systems). The general idea would be to unpack regulatory regimes laterally into sets of distinct regulatory situations and look at the respective role(s) of management information systems.
I know some of you are pretty well informed about this field, so might I take a little opinion poll on this?
And where do you think the study of management information systems should more generally be headed? Are there, possibly, current trends which you are (or would be) genuinely excited about?

What is (an) Agenc(y’/ie)s Structure?

This weekend I discovered a nice post on Daniel Little’s blog UnderstandingSociety dealing with an old (nearly classic) topic of sociology. Daniel reviewed a new book (2010) by Peter Martin and Alex Dennis that promises to remodel the old problem of structure and agency – I just ordered it to review it myself. From the TOC, the impression I had when I first read the title “HUMAN agency and SOCIAL structure” seems true: there is Habermas, there is Bourdieu, there is Giddens, there is Foucault. A classic collection of protagonists of the 1990s structure/agency debate (the one about conflations, remember?).

Here is a piece of Daniel Little´s review: 

This group of researchers addresses the contrast between agency and structure; but really their goal is to help to dissolve the distinction.  They want to show that “structures” do not exist in any strong sense (including the senses associated with critical realism), and that a proper understanding of “agency” involves both subjective and objective features of the individual’s actions, thoughts, and situation.  Social relationships are densely intertwined with reasons, emotion, commitments, beliefs, and attitudes — the aspects of consciousness that make up agency and action.

Here is a representative statement about social structures:

The collective concepts (such as family, state, organisation, class and so on) — which have often been seen as fundamental to sociological analysis — have often encouraged ‘the temptation to reify collective aspects of human life’ (Jenkins 2002a:4); that is, to treat them as if they were real entities, independent of the human beings who constitute them. (7)

Their affirmative theory of agency — now stripped of the notion that it is a polar opposite to structure — has much in common with the traditions of micro-sociology — Goffman, ethnomethodology, symbolic interactionism, and phenomenological sociology.  The idea here is to emphasize the very concrete ways in which each of these traditions succeeds in identifying the agent, the social actor, as both subjective and objective.  He/she is a subject, in the sense that the agent possesses thoughts, emotions, desires, aversions, allegiances, and the like, which in turn contribute to the actions and lives they live.  But the agent is objective, in the sense that he/she is embedded and developed within a concrete set of social relationships and institutions.

Thus each of these approaches develops in its own way the idea that human social life is carried out through processes of interaction among real people in specific situations, and each seeks to avoid the reification of collective concepts — there are no such ‘things’ as social ‘structures,’ ‘classes’, or indeed ‘societies’, yet terms such as these are indispensable, not only for sociologists but for the purposes of everyday communication. (14)

Social theory in sociology as well as in STS has come a long way since the 1990s – and it seems to me that the temporary solution of the now 20-year-old debates (“make it micro, place structure and agency both into your concept of (human) action”) has been challenged by a diverse set of approaches today. Boltanski and Thevenot, Schatzki, Latour (oh well, and don´t forget their ancestors Dewey and Tarde) all argue (in different ways) that agency is not a quality that (human) actors possess – but an effect of a temporary structuring, so that the basic question is not “How do agents structure their relationships and institutions?” but “How do different ways of structuring collective relations bring about modes of agency?”

Robotic Humanities?

Inspired by Jérôme Denis‘s comments/posts on Latour’s play honoring Michel Callon, I tried to think back to an idea out of Australia that I recently read about, concerning robots and art, which might be of interest to STSers and those of us (like me) with a background and/or interest in art and museums.

So, ever heard of “robotic humanities”? Me neither, at least, not until reading this blog entry on the term’s potential origins with Chris Chesher (Australian Centre for Field Robotics at the University of Sydney).

He writes:

The motivation for mobilising the term ‘Robotic Humanities’ was an invitation to speak at an event ‘Digital Editing, Digital Humanities’, organised by Mark Byron, a colleague in the English Department. Digital Humanities is a relatively new name for an expanded version of quite an old tradition of using digital technologies in literary scholarship. Such work includes literary scholars analysing stylistic patterns algorithmically to discover patterns in the words in a certain author’s work. Others scan in notebooks of great writers, marking up the author’s corrections and annotations to create digital editions. The best of this work finds biographical and creative insights through this process. For example, Margaret Webby presented an analysis of Patrick White’s notebooks to show a direct link between White’s criticisms on seeing Ray Lawler’s play Summer of the Seventeenth Doll (which he describes as banal) and new confronting scenes he wrote for his own play The Ham Funeral.

His slide show also has a few provocative slides, perhaps none more so than the (11th) slide on the “fish-bird” exhibit wherein two robot-wheelchairs “communicate” or “interact” with one another and visitors through controlled movements and the presentation of written materials.


Read about it here in a paper by David Rye, Mari Velonaki, Stefan Williams, and Steven Scheding (all at the ARC Centre of Excellence for Autonomous Systems, Australian Centre for Field Robotics, The University of Sydney). Although the interplay between art and robotics is by no means new, this exhibit struck me.

Anybody know when/where the first art/robotics show took place? Or, for that matter, good materials on the interplay between art and technology OR art and STS?

New articles/ideas to watch: Journal of Information Technology & Politics, paper on Wikipedia

Your very own Nicholas Rowland was just added to the Full Editorial Board for the Journal of Information Technology & Politics.

This young journal aims to:

Mission Statement

The Journal of Information Technology & Politics (JITP) seeks high-quality manuscripts on the challenges and opportunities presented by information technology in politics and government.  The primary objectives of the journal are to:

  • promote a better understanding of how evolving information technologies interact with political and governmental processes and outcomes at many levels
  • encourage the development of governmental and political processes that employ IT in novel and interesting ways, and
  • foster the development of new information technology tools and theories that can capture, analyze, and report on these developments.

They have also recently published an interesting paper about Wikipedia use and how it variously frames NGOs, and now there is a CFP (call for papers) for a forthcoming special issue on the future of computational social science.