Greatest thing to happen to STS since the Bijker/Pinch paper

New interest in the micro-foundations of institutions has got to be one of the best things to happen to STS since the Bijker/Pinch paper…

The new institutionalism in organizational analysis has been a wellspring for research. A quick summary of neo-I that Fabio Rojas and I wrote (in a paper on museums):

The hallmark of the ‘new institutional’ school is the relentless focus on how life inside organizations is regulated by stable social practices that define what is considered legitimate in the broader external environment in which an organization operates (DiMaggio 1987, 1991, DiMaggio and Powell 1991b, Meyer and Rowan 1991, Scott 2000). The influence of institutions on organizational behaviour is supposedly most obvious in organizations like museums – organizations that new institutional scholars label as ‘highly institutional and weakly technical’ (Scott and Meyer 1991: 124). By this, scholars usually mean the following: that the organization’s leadership is highly sensitive to the expectations and standards of its industry; that the organization of work within the bureaucracy depends on broader ideologies and cultural scripts found in modern societies; that managers are likely to copy the practices of other organizations, especially high-status organizations; that professional groups are the arbiters of organizational legitimacy; that rational organizational myths and rules structure work practices; and that the ultimate performance of an organization’s set of tasks does not depend much on tools like assembly lines, computers, and the like (see also DiMaggio and Powell 1991a, 1991b).

The new approach/point of emphasis for neo-I folks is laid out by Walter Powell and Jeannette Colyvas in their 2008 chapter in “the big green book” of organizations and institutions — a copy of the paper is available in draft form at www.orgtheory.net right here.

And so the story goes:

1. Older research is cast as calling for “the need to make the microfoundations of institutional theory more explicit” (p. 276). This is something that institutional theorists have had much success with — positioning papers to create the feeling that the idea is both new and exciting, but also that the call for microfoundations is an old one (that we now need to make good on). The opening lines of D&P’s 1983 paper do a good job of saying “that was then” and “this is now.”

2. The upshot: “much analytical purchase can be gained by developing a micro-level component of institutional analysis” (p. 276), which would link “micro-concepts, e.g. identity, sense making, typifications, frames, and categories with macro-processes of institutionalization, and show how these processes ratchet upwards” (p. 278). The invocation of “hierarchy” or “upward” levels is somewhat disconcerting for those of us set on flatter analysis, but there is likely room to show (and convince) that even the tallest, most stable actors and actions occur locally and laterally on a flat surface of interactions.

3. How can we, in STS, get some purchase on this?

A. Emphasize the interpretations of contextual factors (p. 277) rather than assuming them (as has happened now and again in organizational theory devoted to field-level analysis — these are assumptions that occasionally must be made in order to do the diffusion studies so common in neo-I).

B. Display the ongoing micro-maintenance of apparently stable institutional forms in daily practice AND/OR discover how stable institutional forms in daily practice result in change over time, such that they transform the very forms they are intended (in the behavioralist sense) to prolong.

C. Enliven analysis of actors — old new institutionalism (let’s say) emphasized two types of actors, “cultural dopes” and “heroic ‘change agents.'” The reason was that action was essentially assumed to operate at a level unnecessary to fully capture during large-scale field studies (i.e., managers simply sought legitimacy at all costs, we assumed, and mimicked their peers), OR, in the move to capture the actions of real actors (instead of assuming organizational entitivity), the studies overwhelmingly involved entrepreneurs and celebrated/worshipped their field-altering accomplishments, respectively. The new emphasis (of, let’s say, new new institutionalism) smacks of STS lab studies, where we saw how the mundane facets of scientists’ behaviors in labs resulted in field-altering science. Now, neo-I wants to avoid momentous events, or, at minimum, show how seemingly huge events were a long time in the making and, like all experiments, involved loads of failure, which demands of writers the ability to show how local affairs prompt shifts in conventions (locally or broadly) (p. 277).

Why is this so good for STS? We have already done much of this type of work, and have oodles of folks committed to these axioms for analysis. The only thing we really need now is a bridge between these two camps — while STS could not break into neo-I on the topic of technology, Powell and Colyvas might have just opened the door to a new institutionalism in STS…

 

One Plug to Charge Them All

A friendly fight over standards in the plug market for electric cars appears to be brewing, according to a NYT article this morning.

WITH electric cars and plug-in hybrids at last trickling into the showrooms of mainstream automakers, the dream of going gasoline-free is becoming a reality for many drivers. Cars like the Nissan Leaf and the Chevrolet Volt can cover considerable distances under electric power alone — certainly enough for local errands and even most daily commutes — while enabling their owners to shun gas stations.

The multimedia portion of the article is good stuff — I suspect similar pictures will be featured in an STS article sometime soon…

What does the "knowledge myth" mean for SKAT/STS?

A colleague of mine wrote recently about the “myth of knowledge” in a nice blog post. Perhaps one of the most interesting and controversial (and most [overly] generalized) points was about Aikido:

Because I am a behaviorist-leaning kind of guy, I would additionally point out that when behavior, talking, and thinking come into conflict, behavior wins. In my article trying to connect ecological and social psychology, I used an example out of Aikido, the martial art that prefers not to hurt people unnecessarily. Indulging in horrible generalizations: In the Western cultures – steeped in dualism and the myth of knowledge – we think that ‘knowing’ is about ‘thinking’, but in Eastern cultures this is not so. In Aikido, one of your goals is to blend with your opponent’s movements so you inflict minimal harm. Your goal is not to think about blending, not to be able to explain how to blend, nor to be able to accurately imagine blending; rather, your goal is to actually blend when the time comes. A person ‘knows’ how to blend when they do it without thinking, and regardless of whether they can teach how to blend or explain what they did after the fact. (By the way, that article is part of a 7 article discussion, including my latest addition now available online.)

One of the main points was the link between “knowing” and “doing,” and, from a behaviorist perspective in psychology, this is an interesting position to take on such matters. He provides a number of examples, such as “how can a legless football coach know how to kick a football?”

Knowledge — be it tacit or explicit, fact-searching or its role in training scientists and engineers — plays a central role in SKAT and STS; however, I’m not entirely sure we’ve jumped on the behaviorist bandwagon just yet.

The ending question: what would STS look like without “knowledge” as a crutch during analysis?

Game theory and society, and infrastructures

I recently attended a conference on “Game theory and society” at the ETH in Zürich. It was a very productive conference, with a good mixture of plenary sessions with people like Brian Skyrms and Herbert Gintis and the usual host of more work-in-progress oriented panel sessions. Speakers and attendees had backgrounds in sociology, philosophy, economics, biology, even in physics. If there was a common and unifying interest, it was in modeling elementary forms of cooperation. All the more striking was the nearly complete absence of people from sociological theory. Game theory, it appears, has been largely abandoned by sociological theory, leaving it to colleagues specializing in formal modeling or generally versed in quantitative methods. As it happens, I found this to be quite a pleasant bunch of people to be around.
A couple of questions with respect to our interest in infrastructures have been bugging me since:
– I might start with the issue brought up by Nicholas a couple of posts ago: whether there is a problem in sociological theory with addressing questions of efficiency. After working through some of the contemporary game-theoretical research and comparing it to the state of the art in sociological theory, how could I not agree? Game theory could be one, if not THE, weapon of choice for sociologists discussing questions of efficiency in an analytical manner, and evolutionary approaches have demonstrated that the use of game theory need not be congenial to either rationalistic or economistic reconstructions of efficiency. Evolutionary game theory is particularly good at showing how inefficient equilibria come about and turn out to be stable.
– Closely related are questions of utility, which tend to be treated with a similar kind of disregard by many sociologists. One does not need to adopt a utilitarian perspective to see that analyses of how relationships and structures develop, how artefacts evolve and diffuse, etc. are correlated with (mostly implicit) ideas about utility. We may of course treat such ideas about the utility of contacts, associations, or tools as mere background assumptions of our observations of infrastructures, or we may broadly consider them taken care of by looking at practice pragmatically. Seeing what can be accomplished by taking a more analytic approach to utility, though, I suspect that we can do better than just telling utility stories (either with respect to particular cases or in the exposition of theory).
– Which brings me to the more general question of research orientation. Why is there so little modeling in STS and in the emergent field of studies of infrastructures? Researchers have been investigating broadly and writing quite generously about how complex forms of modeling are utilized in the construction of truths and technological artefacts but have been making little use of these methods and tools themselves. It is surely great to have so many sound STS case studies and ethnographies at our disposal in discussing our theoretical concepts and ideas about infrastructures, but again I suppose we could do much better with a less restrictive choice of methods and approaches. If there is a unilateral bias in favor of qualitative methods, story-telling and small-n studies, systemic problems in aggregating empirical data (if not, in the end, a constant recycling and re-invention of theoretical concepts with little progress in accumulating empirical intelligence) are likely to result.
Should we therefore not try to engage more with formal models of cooperation, social order and infrastructures? In Zürich, I found the doors to be generally open, and that there is a lot to learn in terms of concepts and methods. And I find myself encouraged to look into this in a more sustained manner.
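To make the point about inefficient-but-stable equilibria concrete, here is a minimal sketch of replicator dynamics in a Stag Hunt game. The setup and all payoff numbers are my own illustration, not anything presented in Zürich: hunting stag together pays best, but hunting hare is safe, and unless enough of the population already cooperates, evolution locks everyone into the inefficient hare-hunting equilibrium.

```python
def replicator(x, payoff, steps=2000, dt=0.01):
    """Discrete-time replicator dynamics for the share x of 'stag' players."""
    (ss, sh), (hs, hh) = payoff  # row-player payoffs: (stag vs stag/hare), (hare vs stag/hare)
    for _ in range(steps):
        f_stag = ss * x + sh * (1 - x)      # expected payoff of hunting stag
        f_hare = hs * x + hh * (1 - x)      # expected payoff of hunting hare
        f_avg = x * f_stag + (1 - x) * f_hare
        x += dt * x * (f_stag - f_avg)      # strategies above average grow
    return x

# Stag-stag pays 4; hare pays 3 no matter what; stag against hare pays 0.
STAG_HUNT = ((4, 0), (3, 3))

# Stag-hunting only pays off once more than 3/4 of the population hunts stag,
# so a 50/50 population slides to the stable but inefficient all-hare state.
print(round(replicator(0.5, STAG_HUNT), 3))  # 0.0 -- everyone hunts hare
print(round(replicator(0.9, STAG_HUNT), 3))  # 1.0 -- everyone hunts stag
```

Both outcomes are stable equilibria; the first is simply worse for everyone (payoff 3 instead of 4), which is exactly the kind of result that is hard to narrate convincingly without a model.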

Working around over time

Workarounds are:
1. Any way of tricking a system by using it in a way it was not intended to be used, but that still gives you the desired outcome. This was first raised (according to my research) by Gasser in 1986. The idea is that in some systems you could enter, for example, incorrect data in order to arrive at the desired outcome. The need for odd initial data stems from infelicities in the system, be it software or mechanical.
2. By “jury-rigging” the system, wherein you haphazardly put something together but don’t expect it to work well forever. Sometimes referred to as “makeshift,” it works well enough for now — and this happens in computing all the time; you make a quick, often small, but necessary change in the system. Sometimes called a “kludge,” this is where the “permanence” issue is raised in research — how more or less permanent is a workaround, which is typically assumed to be of limited longevity? Of course, no matter what we make, nothing is permanent. Still, some things last longer than others, and more often than not with packaged software the “slightly more permanent” workarounds (in the form of system customization) are more common than the frequent but short-lived workarounds used in legacy systems [note: this may be a generalization too broad to bear evidence]. Still, this helps us better understand the longevity of workarounds.
3. The literature on workarounds is now split on the idea that they are “freeing” employees from the confines of the system, and increasingly scholars ask if all this “freeing” (in research on ERP) creates its own subset of confines (suggesting that large numbers of expensive customizations to systems require some administrative oversight, which effectively balances the freedom from the previous system against the new need to control those freedoms). This helps us understand the autonomy-producing or -restricting quality of workarounds.
This seems to be the cost of customization: it at once frees you from the confines of the system, but also hurtles the system toward eventual decay (as we have observed with legacy systems), and this is sometimes referred to as “drift.” The more control you exert on the system — in this case, in the form of workarounds — the more brittle it gets and the more it drifts, in principle, from the control of those charged with maintaining it. In this way, workarounds are kind of like using a mulligan in golf; it gives you a better chance in the short term, but in the end it keeps adding +1 to your score until you’ve lost it completely.
However, if one could follow a set of workarounds through the years (and I’ve never seen research like this), explicitly watching them “decay” or “cost,” then the analogy to golf might be observed. When, in the short run, did the workaround get the organization out of a jam? Conversely, when, in the long run, did the workaround cost the organization more than it was worth?
If one could understand the process deeply enough, one could explicitly estimate at which times workarounds “beat the system,” meaning that you might be able to identify mulligans (i.e., workarounds) worth taking (i.e., making) and others which ought to be avoided.
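The mulligan analogy can be sketched as back-of-the-envelope arithmetic. Everything here is an illustrative assumption of mine (no such cost model appears in the workaround literature I cite): a workaround pays off once, up front, but adds a small recurring “drift” cost every period it stays in place, so the same fix can be worth taking or worth avoiding depending only on how long it lives.

```python
def workaround_payoff(immediate_gain, drift_cost_per_period, periods):
    """Net value of one workaround over its lifetime: the one-time gain
    of getting out of the jam, minus the recurring maintenance drag."""
    return immediate_gain - drift_cost_per_period * periods

# A workaround that gets you out of a jam and is retired quickly: worth taking.
print(workaround_payoff(10, 0.5, 6))    # 7.0

# The identical workaround left in place for years: costs more than it was worth.
print(workaround_payoff(10, 0.5, 40))   # -10.0
```

The break-even lifetime is simply immediate_gain / drift_cost_per_period (here, 20 periods), which is the kind of estimate one could try to make empirically if workarounds were ever followed through the years.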

Some shameless self-promotion: On Technology and Society

Nicholas’s public question about whether there is a book on what the old theorists thought about technology offers a tempting opportunity for some “shameless self-promotion” that I was nearly too modest to seize. But in the pre-ASA mode that nearly every sociology blog I read is in at the moment…well, I’ll jump at the chance: I wrote a book similar to the one that Nick requested – only (sorry) in German, and not outlined as a list of old scholars’ thoughts, but as a sociologized conceptual history of explaining the relationship between technology and society.

The usual story is that first there was technological determinism, then social constructivism – a story of a big STS success. But a closer look reveals that the two underlying modes of explanation – technicism and culturalism – have been with us for at least 150 years. This conceptual dichotomy, already established in philosophy and early social theory (Kapp, Marx, Durkheim, Weber), produced during a first crisis of modernity in the early decades of the 20th century a first explicit version of technicism (Veblen, Dessauer) and, from the 1930s on, a first version of culturalism in reaction to it (Spengler, Gilfillan, Mumford). Once stabilized as theoretical artifacts, these modes of explanation deal with the social and technical transformations of modernity by attributing them either to an inherent logic of technological development or to major and minor changes in modern society. This leads to pessimistic versions of technicism (Ellul and Jünger) and a critical version of culturalism (Adorno, Horkheimer, Heidegger) after World War II, and then to an anthropological version of technicism (Freyer, Gehlen, Schelsky) and a rationalist culturalism (Marcuse, Habermas) that accompany the stabilization of organized modernities until the 1960s. As a reaction to a second crisis of modernity, from the 1970s up to today two versions of technicism and a radical relativist culturalism emerged: while new media technology and digital computing prompted a revival of deterministic thought (McLuhan, Postman, Flusser), a large body of empirical work focused on technology assessment was based on modest versions of technicism (Ogburn, Heilbroner, Rapp). The sociology of scientific knowledge (Barnes, Bloor) fostered first a moderate empirical micro-constructivist culturalism (Latour/Woolgar, Knorr-Cetina), then a historical macro-culturalism (Hughes, Constant, Dosi), and finally a radical social constructivist culturalism (Bijker, Pinch, Law).

From the 1960s on, these theoretical and conceptual differences have been additionally stabilized by bringing them into theory-political as well as real-political opposition. In this way the basic conceptual distinction between technology and society has been virtually naturalized; it has not been seriously drawn into question since the 1930s. But from the 1980s on, a number of attempts have been made to wipe the slate clean in social science theories of technology. These new approaches understand both the dynamics and the stability of society and technology as entangled and interrelated phenomena in need of explanation. Actor-Network Theory (Latour, Callon, Law), neo-pragmatist technology studies (Star, Fujimura), and systems theory (Luhmann) are just three of these new approaches. Despite their differences, they teach us to ask and answer questions about the relevance of materiality for the emergence and transformation of the social, about the material and technical mediation of agency and communication, about the importance of artifacts for the formation and change of social institutions and ideas, and about the role of technological developments in transforming modernity. To ask and maybe answer them, the discourse on social science theories of technology will have to be connected to the general discourse on social theory, on theories of society and modernity.

 

Public Question: What did the old theorists think about technology?

A while back I asked, “does anyone know if there is a good paper or book about what Weber thought about technology?” That is an interesting question in light of new STS work. Marx has received some attention from scholars, but here comes the public question:

Is there a book that tackles, one chapter at a time, what the old theorists thought about technology?

This seems like a great edited book or mini-conference, or a mini-conference that turns into a great edited book.

So, next question:

Is there any interest in a book that tackles, one chapter at a time, what the old theorists thought about technology?

the role of reviews in the social sciences

All this discussion of new infrastructuralism and the infrastructural relation between “black-boxing” and “institutionalization” makes me think about the role of reviews in social science.

The question is: is a concept what it is, or does how it has been used constitute what it (now) is?

I think this for the following reason: Latour, in Actor Network Theory and After, goes into something of a litany regarding the ways that ANT has been used and abused over the years since he and Callon (and Woolgar, honestly) thought of it. Of his many points, a meta-point matters for this post: he basically states that as his ideas spread, they increasingly got used in ways he did not expect, and then Latour makes something of a value judgment in suggesting that some research, which appears to be relatively more current as compared to his original works, doesn’t do ANT right. Of course, Latour takes some blame, saying that perhaps the entire moniker, including the A, the hyphen, the N, and the T, was not perfect; still, it seems like an odd point to hear from Latour. About 120-ish pages into Science in Action, as part of the translation model of how things spread (i.e., diffuse, though he considers that a dirty word), Latour insists that spread requires change — that a technology, for example, must change as it enters new hands. This was a counterpoint to the diffusion-of-innovations literature (which he hardly cites) and its supposed assumption that diffusion, as an idea and model, only works so long as we assume the innovation is “constant” over time (meaning that it does not and will not change). Getting to the point: ANT was going to have to change to spread so widely, and the ideas would necessarily be used in ways unintended and perhaps unacceptable to its originators.

Again, then, the question is: is a concept what it is, or does how it has been used constitute what it (now) is?

Latour contributed to the notion of “black-boxing” as much as perhaps any scholar of the last 30-ish years, and given his disappointment with how some of us have used his concepts, does it really matter (i.e., this value judgment)? Or does it matter more for science not to judge how concepts have been used and instead document how they have been used, because the way they have been used is effectively what they are?

Returning full circle, in all this discussion of new infrastructuralism and the infrastructural relation between “black-boxing” and “institutionalization,” what would make the best review paper? Review the terms as if they are not artifacts changing hands, in order to conceivably arrive at some core meaning of these concepts, or review how the terms have been used, on the premise that this will tell us more about the operational meaning of the terms?

Institutionalism and Infrastructuralism – some first thoughts on differences

In a recent post following Nicholas’s thoughts about black-boxing and taken-for-grantedness, and about what that could mean for discussing the benefits of STS and neo-institutional theory, I asked: what are the differences between institutions and infrastructure? Nicholas and I discussed that today for the first time in detail, and we thought it might be worth posting to see if it makes sense.

Neo-institutional theory is – to make a very long story short – based on the question of how many different things (organizations, models, cultural forms) become similar over time. This is the basic problem in DiMaggio/Powell (1983): understanding institutional isomorphism once the impetus of rationalization is taken away. It is the problem that Strang and Meyer work on when studying the institutional conditions of diffusion (1993). Its central focus was – as Powell argued in 2007 – on “the field level, based on the insight that organizations operate amidst both competitive and cooperative exchanges with other organizations.” DiMaggio (1988) and Powell (1991) both noted that this was a bit too smooth and that institutional arguments would need a more detailed perspective on contestation, fragility, and struggles. Nevertheless, the framework provided a fresh new way to understand institutions – so productive that it framed a discipline or two.

Infrastructure studies, on the contrary, focus on how things can appear systematic and highly integrated but are actually implemented in many heterogeneous, historically contingent local processes (Bowker/Star 1996; Star/Ruhleder 1996). In some ways, diffusion becomes less important as implementation takes a more central role. Infrastructures are not built by system makers, but screwed together loosely out of complex arrangements of interfaces, gateways, and workarounds, as Edwards has shown in 2003 and in his fabulous book on climate models (2010). However, there seems to be a tendency to focus on the normalizing and standardizing effects of classification systems implemented in large infrastructural settings – something like the Weberian “iron cage” of infrastructure studies, visible already in Sorting Things Out and very strong in the works of Hanseth and Monteiro (1997; Monteiro 1998).

The link seems obvious, doesn’t it? Neo-institutionalism starts by looking at heterogeneous stuff and finds it similar – too similar perhaps, so that it sometimes misses the complexity of the social world. But it is a great framework for strong explanations. Infrastructure studies look at systems and find them fragile and fragmented inside. But they seem to lack that “big explanatory” power, which leads to giving up the focus on local multiplicity and emphasizing standardization/normalization instead. Could the strengths of both be combined to get a good grasp on the installation of social order under (high) modern conditions?

The new infrastructuralism?

Jan-Hendrik and I were discussing this yesterday: “what would it take to create the new infrastructuralism?” In a way, it would be analogous to the new institutionalism, but with a different (although overlapping) set of topics, etc.

We’ll try to post the beginnings of our ideas soon wherein we will see if such a theory could be meaningfully erected.

Technologies/Black-Boxing and Institutions/Taken-for-granted – A question of levels?

This post started as a comment on Nicholas’s post on the museum but became so long that I decided to make it a new post. The debate on black-boxing and “taken-for-grantedness” (or STS & new institutional theory) tackles some very important points. It reminds me of T. Pinch’s (2008) paper on technologies and institutions. Pinch’s focus is on the problem of “skill,” and he argues that (new) institutional theory – for example in its micro form, as in the works of Fligstein – is basically focusing on only a very small aspect of the ways to make an institution materially stable. Technology, he argues, adds at least a second way because it is black-boxed, not just taken for granted.

The reason I think this is only partially true is that proposing such an argument is only possible by conflating levels of “taken-for-grantedness.” Sociology knows a whole spectrum of ways by which social order becomes taken for granted: from the taken-for-granted stream of everyday routines and interactions that make up Schütz’s life-world to Mauss’s and Bourdieu’s techniques of the body that constitute the habitus, from B&L’s (or also Gehlen’s) processes of institutionalization to Foucault’s epistemé, Polanyi’s tacit knowledge, and Ryle’s “knowing how.” Technology, if we follow this route, could be added to the book of tactics for making practice become taken for granted – through a very distinct process that has been described as “black-boxing,” in which some aspects are packaged and sealed away and others delegated to specialists (for example, for maintenance and repair). It is distinct from at least two other tactics exactly because of the form of this process. Embodying habits and skills, for example, is a process of becoming taken for granted through routine and repetition. Discursive closure is a matter of rhetoric, persuasion, and concealment.

Institutions and infrastructures, I suppose, are strategies of “taken-for-grantedness” on a different level: they are hardly stabilized by just one of the discursive, habitual, or technological tactics just described. An institution can be based neither on skills alone, nor on legitimizing and regulating discourse alone, nor on technology alone. Hey, we know from a long time of STS research that not even technology can rely on technology alone. Institutions and infrastructures are complex installations – hybrids or monsters, if you will. They both rely on a fragile architecture of “taken-for-grantedness” – plug-ins. What is the difference, then?

ASA blogger party and other ways to meet Nicholas Rowland and Jan Passoth

I want to note the announcement of the annual ASA ScatterPlot Blogger Party! Details here. Short story: Sunday, August 21, 4:30pm at the Seahorse Lounge at Caesar’s Palace. I hope to see many of you there!

Otherwise, I am presenting a paper with Jan-Hendrik Passoth on state theory and joining another roundtable about state power (power being a dirty little word). Come say “hi” — I’ll be wearing the functionalism t-shirt and have even made one for Jan!


NOTE: *This message was playfully plagiarized from my mentor and friend, Fabio.*

The "lighthouse" (re: Coase) in new institutionalism is the museum

Per recent discussions of black-boxing and institutionalization, a paper that Fabio and I wrote seems useful to remember. It was a piece we wrote to interrogate the use of “the museum” by new institutionalists of the organizational analysis-bent.

Sociologists that study organizations often analyze the museum from a cultural
perspective that emphasizes the norms of the museum industry and the larger
society. We review this literature and suggest that sociologists should take into
account the technical demands of museums. Drawing on insights from social
studies of technology, we argue that museums are better understood as
organizations that must accomplish legitimate goals with specific technologies.
These technologies impact museums and the broader museum field in at least
three ways: they make specific types of art possible and permit individuals and
organizations to participate in the art world; they allow actors to insert new
practices in museums; and they can stabilize or destabilize museum practices.
We illustrate our arguments with examples drawn from the world of contemporary
art.

The black-boxing of a technology is different than the incremental emergence of “taken-for-grantedness”

Below is an excerpt about black boxing and taken-for-grantedness, which I wrote with Fabio Rojas years ago:

The black-boxing of a technology is different than the incremental emergence of “taken-for-grantedness,” which comes from writers such as Schutz (1967) and Berger and Luckmann (1966). They argue that knowledge in everyday life is taken for granted by individuals as reality, “but [that] not all aspects of reality are equally unproblematic” (p. 24).  They provide an example germane to this discussion:

  • …suppose that I am an automobile mechanic who is highly knowledgeable about all American-made cars. Everything that pertains to the latter is a routine, unproblematic facet of my everyday life. But one day someone appears in the garage and asks me to repair his Volkswagen. I am now compelled to enter the problematic world of foreign-made cars (p. 24).

Technologies, they contend, like those related to car repair, get taken for granted over time and through expertise. But by looking at technologies as black-boxes, scholars can gain a fresh perspective on the institutionalization of technology by emphasizing how stable technologies stabilize human networks, rather than how routinization results in a technology’s disappearance for organizational actors. Returning to Berger and Luckmann’s example, as a black-box, automobiles and the networks of dependence and exchange built-up around them are concealed (or ignored). Further, the patterns of human behavior that make a mechanic’s garage the one place to fix broken cars is missed because of the emphasis on how technologies get “taken-for-granted.”

It seems Berger and Luckmann’s (1966) old work on the social construction of reality might find new use distinguishing black-boxing from institutionalization. Also, please note: if there was one thing that early Latourian thinking and the new institutionalism in organizational analysis were both looking to unlock, it was how something gets sealed up and stabilized over time to the point of being taken for granted as real, true, or rational. Look back at the early pages of Latour’s book on Pasteur — it opens with the image of Rue Pasteur and asks how we got this … a good question, no?

“Black box” and “taken-for-granted”

I recently asked:

When are the processes that bring about a black box the same as those that bring about — in the institutionalist frame — the notion of taken-for-grantedness … and, when are the processes that bring about either of these notions incapable of producing the other?

Seems, upon further reflection, to be an obvious paper, which might bridge some of the thinking about technology and institutional arrangements. Restated as a couple of thesis statements, it would go:

Q1. What circumstances/processes do the concepts of “black box” and “taken-for-granted” both meaningfully capture?

Q2. What circumstances/processes does the concept of “black box” meaningfully capture that the concept “taken-for-granted” cannot?

Q3. What circumstances/processes does the concept of “taken-for-granted” meaningfully capture that the concept “black box” cannot?

Seems like an interesting review piece to see where organizational theorists and STSers have historically overlapped and where they have diverged, with the caveat that each might learn something if orthogonal points of divergence were reconsidered in the respective lines of research.

Timmermans has done it again — this time about failures!

I have always enjoyed reading Stefan Timmermans’s research, and his new piece in STHV is no exception.

The abstract, which is below, is not only a good reversal of an old idea but also solid prose — worth the read.

Abstract

Sociologists of science have argued that due to the institutional reward system negative research results, such as failed experiments, may harm scientific careers. We know little, however, of how scientists themselves make sense of negative research findings. Drawing from the sociology of work, the author discusses how researchers involved in a double-blind, placebo, controlled randomized clinical trial for methamphetamine dependency informally and formally interpret the emerging research results. Because the drug tested in the trial was not an effective treatment, the staff considered the trial a failure. In spite of the disappointing results, the staff involved in the daily work with research subjects still reframed the trial as meaningful because they were able to treat people for their drug dependency. The authors of the major publication also framed the results as worthwhile by linking their study to a previously published study in a post hoc analysis. The author concludes that negative research findings offer a collective opportunity to define what scientific work is about and that the effects of failed experiments depend on individual biography and institutional context.

Belgian STS network kick-off event – Sept 30th, 2011

For those of you in Europe this might be an interesting opportunity to travel, meet great people, and strengthen the international network of STS: Scholars in Belgium are gathering for the first meeting of the Belgian Science, Technology and Society (BSTS) network – a network that started

“… in 2008 as an ad-hoc academic platform, the BSTS network enables STS researchers in Belgium to share with one another their research interests and disciplinary perspectives and to foster collaboration across different fields and locales. The network now extends its hand beyond academia and beyond Belgium to engage an international community consisting of people from research centres, industry, policy making and other professionals with an interest in cross-disciplinary learning and knowledge sharing.”

Here is some more information on the Belgian STS Network.


Evading efficiency arguments is what sociology is good at

Why is sociology so afraid of efficiency arguments?

After re-reading this great old piece …

Oberschall, Anthony, and Eric M. Leifer. 1986. “Efficiency and Social Institutions: Uses and Misuses of Economic Reasoning in Sociology.” Annual Review of Sociology 12:233-253.

… I was reminded that sociology has made something of a history of explicitly avoiding extant arguments regarding efficiency.

Marx, for example, rejected efficiency and emphasized exploitation of labor by the bourgeoisie. Given Marx’s economic theory of value and labor, exploitation was the only way to get more value than was invested by fairly paid labor (e.g., the wage from six hours a day is enough to feed and clothe a family of four for a day; however, without the means of production, workers might work eight hours per day rather than six for the same wage, since they have no bargaining power). Thus, the creation of surplus (i.e., profit). However, a falling rate of profit was expected as capitalists competed with each other in hopes of attracting more and more laborers, which ultimately cut into profit margins. Enter machines. The primary problem for Marx, however, was that machines could bring no real efficiency or profit; machines are incapable of producing profit (or do so only for a short time) because all competitors will soon have them. At this point, each capitalist is back to “square one.” Simultaneously, the price of machines goes up and the price of products goes down. Thus, profit has to fall and efficiency is lost (however, according to contemporary economics, profits fall within the business cycle but not across cycles, showing some flaw in Marx’s thinking). Still, as it happens, “Machinery and improved organization provide … [enhanced efficiency] too, because they increase the productivity of labor” (p. 42, Collins and Makowsky 1998).

Also writing at a time of great scientific and industrial progress, Durkheim, in contradiction to the rationalists, finds “society … a ritual order, a collective conscience founded on the emotional rhythms of human interactions” (p. 102, Collins and Makowsky 1998). Even though specialization (in the form of organic solidarity) holds society together (despite the loss of mechanical solidarity), efficiency seems to play a lowly role in Durkheim’s models of integration.

Weber seems the closest to allowing efficiency some room to breathe. Still, above efficiency was his deep-seated concern with organizational stability. The organization of groups was stabilized through strong personal ties (patrimonialism) or by setting rules (bureaucracy), which follows broadly from Tönnies (Gemeinschaft and Gesellschaft, respectively). Domestic or personalistic organizations, like a family estate wherein close friends and family members made up the bulk of enterprise employees and related services (be they war, trading, tax collecting, etc.), are not easy to control and seemingly inefficient (as compared to, for instance, a bureaucracy). The organization of communications is poor: what starts as a direct order at the top of the chain of command ends up a rumor, a whisper, or nothing at the bottom rungs. Under certain circumstances, innovation is ignored or resisted in favor of tradition — doing as was done the last time, or as far back as can be remembered, for the sake of personal ease and safety from criticism from above. Authority at the top dissipated over time as top assistants grew in power and potentially seceded. The bureaucracy would fix all that by establishing rules and regulations to guide individual behavior even in the absence of authoritative oversight. While bureaucracy can be interpreted as an efficiency argument, Weber’s focus on the cultural underpinnings of groups like the Protestants in shaping historical achievements, along with his works on Judaism, China, India, etc., leads me to believe that culture, rather than efficiency, was the root of his arguments.

There are no doubt many more — certainly the old functionalists like Selznick and Merton (who showed the dysfunctions of bureaucracy) would fit right in…

Social significance of gap analysis

Although I’m not entirely sure of the implications for infrastructure, gap analysis seems promising as a research site. Despite its widespread use in the management and implementation of software, gap analysis remains an untapped and unappreciated workflow-analysis technique in research.

In general, gap analysis takes three forms, which document the gaps between two states: current versus future, expected versus actual, perception versus delivered. The difference between the two states defines the gap, and from such assessments others are possible such as benchmarking (Boxwell 1994).

The first form is a map. Cartographic representations are mainly utilized in lean management to chart the flows of raw materials – including information – currently necessary to make a product or service available to consumers, so that those flows can be assessed for flow and waste. Once areas for improved flow and reduced waste are identified, analysts draw them into a future-state value stream map. The differences between the two states define the gaps, which orient work toward that future condition. The map form was developed at Toyota (Rother and Shook 1999).

The second form is a step chart. Temporality is built into the step chart, which also identifies and compares current practice and a desired future state for the performance of a service or product. Brown and Plenert (2006:319) provide a good example of where a step chart might close the gap between expected and actual states: “customers may expect to wait only 20 minutes to see their doctor but, in fact, have to wait more than thirty minutes.” Step charts lay out the steps necessary to move from current practice to future practice (Chakrapani 1999).

The third form, which is most appropriate for working-around packaged software, is a cross-list. Such analyses are most routinely undertaken in consumer research wherein gap analysis refers to the:

methodological tabulation of all known requirements of consumers in a particular category of products, together with a cross-listing of all features provided by existing products to satisfy existing requirements. Such a chart shows up any gaps that exist (n.a. 2006).

Once cross-listed in table format, gaps make themselves obvious, and their analysis points to unmet consumer demand that new or poorly marketed products might fulfill. However, prior to the establishment of a cross-list, consumer expectations and experiences must be gathered, for example, by focus-group interviews. Once collected and used to populate a cross-listed table, according to Brown and Plenert (2006:320), “gaps can be simply calculated as the arithmetic difference between the two measurements for each attribute.”
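Brown and Plenert’s arithmetic can be sketched in a few lines. The attribute names and scores below are invented for illustration only; a real analysis would populate the two tables from focus-group data.

```python
def gap_analysis(expected, actual):
    """Return {attribute: expected - actual} for attributes present in both tables."""
    return {attr: expected[attr] - actual[attr]
            for attr in expected if attr in actual}

# Hypothetical consumer ratings on a 1-10 scale.
expected = {"wait_time": 8.5, "ease_of_use": 9.0, "support": 7.0}  # requirements
actual   = {"wait_time": 6.0, "ease_of_use": 8.5, "support": 7.5}  # delivered

gaps = gap_analysis(expected, actual)

# Positive gaps flag unmet demand; negative gaps flag over-delivery.
unmet = {attr: g for attr, g in gaps.items() if g > 0}
```

Here `wait_time` shows the largest positive gap (2.5), marking it as the most unmet requirement, while the small negative gap on `support` suggests over-delivery on that attribute.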

The Journal of Science Policy and Governance

http://www.sciencepolicyjournal.org/

Now accepting rolling submissions!
The Journal of Science Policy and Governance is an interdisciplinary journal that seeks high-quality submissions on emerging or continuing policy debates. Current students (undergraduate or graduate) and recent graduates within three years of earning a degree (bachelors, masters, or doctoral) are eligible to submit. We seek to publish articles on a variety of policy areas including: scientific research, engineering, innovation, technology transfer, commercialization, bio-medicine, drug development, energy, the environment, climate change, the application of technology in developing countries, STEM education, and space exploration. Submissions on other topics are also welcome as long as they relate to the theme of science policy and governance. The Journal strives to publish articles in a timely manner to ensure that publications can be considered in the context of current policy debates.

Please see website for submission guidelines. Questions and/or submissions should be sent to jofspg@gmail.com.

Painting infrastructure

I wonder if graffiti could be a “strategic research site” for scholars of art, sociology, and infrastructure. In the New York Times today, a headline reads “Cities Report Surge in Graffiti” (see also these slides).

An upturn in graffiti has renewed debates about whether it signifies alienation in struggling areas or whether its glorification contributes to urban blight.

It raises an issue I have not yet heard discussed: infrastructure might be modified simply by changing its exterior appearance, in this case via graffiti. In a way, this would imply that individuals “tagging” infrastructure serves as an indicator of, for example, its quality, which would then contribute to, if not certify, the current state of affairs. I don’t know of any research that theorizes the influence of infrastructure’s “wrappings” (shall we say).

Winnie the Pooh and infrastructure?


Okay, less about infrastructure, but a question has been brewing and I cannot seem to find a suitable answer. In recent years, Pixar has made millions with the following Disney formula:

Reformulate … [for example, Alvin and the Chipmunks or the Smurfs] in 3-D, give them a skateboard and sunglasses, add some dance moves and inundate children and their nostalgic parents with advertising.

But this seems to fail when it comes to Winnie the Pooh, or there might be a recency effect, where consumers are tired of the same old rehashed 3-D stories. Either way, Pooh is going to hit the silver screen hand-drawn.

So, why might it be that Pooh can’t go 3-D?

From "forces of production" to "forces of customization"

A new line of research might open up if we read David Noble‘s Forces of Production again and ask “what is its relevance for contemporary packaged software?” Noble, who passed away last December, wrote what is arguably one of the best books in STS about the role of managerial power in determining the direction of technological development, much of which is accomplished by selecting one technology over another to foster toward future development. Additionally, Noble keyed us all in to the idea of the “path not traveled,” wherein we consider “what might have been” had another road been traveled (i.e., had another technology [or no technology] been selected).

In some ways, I think Noble’s work appears old-fashioned to new scholars (despite his excellent empirical material). But maybe not.

We might extend his ideas about managerial power being augmented by the selection of one technology over another toward an analysis predicting that managerial power is instead augmented by iteratively selecting the ongoing customized form of a flexible technology (i.e., an ongoing process rather than a nominal, usually binary, decision breakpoint).

rethinking workarounds

At least since the work of Les Gasser (1986:216), the act of working-around (or jury-rigging) and the resulting workaround (or kluge) have been good fodder for Science and Technology Studies (STS). In the transition from building administrative software in-house to purchasing packaged software solutions from private market vendors, the workaround has received renewed attention from scholars. And rightly so. These are pressing matters given the widespread use of packaged software, the near irreversibility of implementation projects once initiated, and the reported high probability of dissatisfaction following installation.

Workarounds are commonly employed to localize, maintain, and extend software programs, especially when it is necessary to coax occasionally suboptimal implementations into functioning properly as the systems age. Still, workarounds have their limitations; they grow brittle over time, but are crucial for freeing users from restrictive or incomplete systems. Research on packaged software, however, challenges the notion that systems are still worked-around. Designers of packaged software anticipate user modifications. Embedded and increasingly inter-organizational actors now determine when a work endeavor is or is not defined as a workaround. Pollock (2005), examining a case of packaged software being implemented in a university setting, shows how the boundary between users and designers is relationally dynamic rather than static. This is clear when, for instance, local users share code with software designers hired by vendors, but also when designers distance their responsibility over specific user problems by categorizing some problems as local (rather than the general concern of numerous implementing organizations) (505, 503). Because packaged software appears to presuppose that adopting organizations will participate in the design process by modifying the software for local use, Pollock seems to have updated Gasser’s (1986) notion of working-around. Pollock calls into question Gasser’s (1986:216) original formulation, specifically, that working-around implies “intentionally using computing in ways for which it was not designed,” given that the tools to work-around are already embedded within packaged software.

Research also suggests that workarounds are not as freeing as previous literature indicates. Modifications do not only free local users from the constraints of technology; they also create tensions inside and between organizations. For example, modifications that are difficult and therefore slow to establish create tension between employees and their supervisors (Pollock 2005:507). Likewise, some modifications generate conflict between support desk operators and local programmers concerning who is responsible for coaxing the packaged software into operation (506). Also examining packaged software in higher education, Kitto and Higgins (2010), through the lens of governmentality, show how a university wrested control over their newly adopted ERP by modifying it. Surprisingly, once modified, the resulting system did not appear to create renewed autonomy for employees. Instead, control over the system simply shifted from the monolithic vendor to a more local supervisor charged with maintaining jurisdiction over the host of new modifications.

In the move from homegrown to packaged software in higher education, traditional interpretations of the workaround seem to be transforming, and with them, I imagine, the precursors of workarounds – the “gaps” in system operations that workarounds necessarily bridge … although there is scant research on where these gaps come from.

Diaspora* as an alternative to google+ and facebook?

Diaspora* is another social networking site (nearly a year old) … just like Facebook, with one important exception. Here is an excerpt from a New York Magazine article:

… as the name suggests, their project was intended less as an imitation of Facebook than as an escape route from it—a path to freedom for those who had come to fear the dark side of the social network. In the years since Facebook launched (and long before Aaron Sorkin decided to take a whack at it), the service had begun to feel unsettling, sinister, less a benign link to friends and more a stealth database, open to all takers.

Diaspora*—if it worked—would do everything Facebook did. But users would own their data.

Will google+ close the coffin on Diaspora*?

Special Issue in Science Studies accepting papers on "Patient 2.0"

As an outcome of Track 026 of the last EASST meeting (in Trento, IT), the organizers have:

been working to edit a special issue on “patient 2.0”. We are pleased to announce we’ve been hosted by Science Studies (http://www.sciencestudies.fi/) as guest editors of a forthcoming publication on the theme.

The journal has a long-standing reputation for publishing high-quality articles in the field of Science and Technology Studies since the end of the eighties. Science Studies is an Open Access journal, and we invite you to have a look at its latest issue to better grasp the kind of submissions it welcomes.

The call for papers is attached and, as you will notice, it is an evolution of the track’s CfP (the call is also available online at http://www.sciencestudies.fi/node/2070). If you have not already published your work elsewhere, we encourage you to submit your paper for evaluation before 31 January 2012. Of course, you are free to submit completely new work as long as it is consistent with the call. All papers will be anonymously reviewed and evaluated jointly with the editorial board of the journal.

Howard Silver, COSSA, and protecting NSF’s SBE

Recently, lobbyist and former chair of the Coalition for National Science Funding Howard Silver commented on a post regarding the potential closure of the US National Science Foundation’s Social, Behavioral, and Economic Sciences directorate.

Silver is part of COSSA (Consortium of Social Science Associations):

The Consortium of Social Science Associations (COSSA) began in the late 1960s as an informal group of social science associations that met to exchange information and discuss common problems. In May 1981, the disciplinary associations, responding to disproportionately large budget cuts proposed by the new Reagan Administration for the social and behavioral sciences at the National Science Foundation (NSF), used the informal COSSA collaboration to establish a Washington-based advocacy effort.

Their website covers a number of interesting topics, but for me the consistent updates on the homepage about the fate of science funding in Washington are probably what I’ll be checking each morning — a good place for updates on a fast-changing subject.

Keep fighting the good fight, Howard!

Making good on disasters: Why did Google help Japan?

A story in the New York Times today describes how Google is making headway among the Japanese. In Japan, Google does not have the vast market share that it does in the U.S. or other countries around the world. However, as the story’s title indicates “Quick Action Helps Google Win Friends in Japan.” The story goes:


Google is using its Street View technology in Kesennuma and elsewhere to make a record of the disaster while tracking reconstruction efforts.

An oddly equipped car made its way last week through the rubble in this tsunami-stricken port city. On the roof: an assembly of nine cameras creating 360-degree panoramic digital images of the disaster zone to archive damage.

It is one of the newest ways that Google, a Web giant worldwide but long a mere runner-up in Japan’s online market, has harnessed its technology to raise its brand and social networking identity in this country.

Google was also quick in the early hours of the disaster to assemble a Person Finder site that helped people learn of the status of friends and relatives affected by the earthquake and tsunami.

It is important to note that Google cannot yet determine whether or not these efforts have helped it to crest the Japanese browser-use market; however, that is far from what interests me.

While writing about the social significance of non-events for orgtheory.net, I asked the following question:

After studying the Tylenol Poisoning Tragedy (see chapter one of Minding the Machines: Preventing Technological Disaster) and many others, we ask: are there circumstances under which a firm might gain, over the long run, from a carefully handled crisis? Students, especially of the conspiracy theory bent, go nuts with this one, and reformulate my question: are there circumstances under which a firm might gain, over the long run, from a carefully planned and handled crisis?

So, while I have no illusions that Google planned the tsunami in Japan, I wonder if non-local crisis-response research and development might be a way to answer the question above, or to shift the dialogue to such topics as “planned disaster response by for-profit agencies.” It seems as though organizations like Google, with oodles of slack resources and a penchant for expansion, might serve themselves well by expressing “social responsibility” during times of non-local crisis … especially in nations where their product, service, etc. is not the leading brand, type, etc.

Hence, almost sounding like a conspiracy theorist now, is it just a coincidence that Google reached out to Japan?

Is there such a thing as a good introductory book to STS?

A couple of introductory STS texts are listed below — what’s missing? What do you use?

Bauchspies, Wenda, Jennifer Croissant, and Sal Restivo (2005). Science, Technology, and Society: A Sociological Approach. Wiley-Blackwell.

Fuller, Steve (1993). Philosophy, Rhetoric, and the End of Knowledge: The Coming of Science and Technology Studies. Madison, WI: University of Wisconsin Press. (2nd edition, with James H. Collier, Lawrence Erlbaum Associates, 2004.)

Kleinman, Daniel (1991). Science and Technology Studies. Wiley-Blackwell.

Sismondo, Sergio (2009). An Introduction to Science and Technology Studies, 2nd edition. Wiley-Blackwell.

Volti, Rudi (2001). Society and Technological Change. New York: Worth.

Is anyone satisfied with Wikipedia’s STS page?

Last semester, while teaching STS 200 “Topics in Science and Technology Studies,” to primarily engineering students at Penn State, I found something peculiar. Students complained — some a little, some a lot — that I was asking them test questions whose answers were not to be found on-line and, in particular, on Wikipedia’s STS page.

Is anyone satisfied with Wikipedia’s STS page? I don’t even see the terms “ICT” or “infrastructure.” What is to be done with this Wikipedia page?

Now, I realize that the “core” of STS is a running problem in the field, as there is no center to speak of or fully shared history. This is obvious in many ways, but there is one that I have routinely found of interest: upper-level undergraduate and lower-level graduate texts which introduce “history and philosophy of science,” “Science, Technology, and Society,” and “Science and Technology Studies.” Such texts rarely cover the same material, whereas sociology introductory texts share a good deal of similar information.

I use Sismondo’s and Volti’s introductory texts, and their books contain concerns not reflected on the Wikipedia page, such as “ghost publishing” or the “political economy of knowledge.”

NSF may be forced to close the Social, Behavioral, and Economic Sciences (this includes STS)

Just heard this:

“…the House Commerce, Justice & Science Committee is considering eliminating or severely cutting back the directorate for Social, Behavioral & Economic Sciences at the National Science Foundation (NSF).”

David Brooks wrote an opinion piece on this topic, “The Unexamined Society,” which details the need for the social sciences and laments the potential loss. Here is an excerpt and closing remark:

People are complicated. We each have multiple selves, which emerge or don’t depending on context. If we’re going to address problems, we need to understand the contexts and how these tendencies emerge or don’t emerge. We need to design policies around that knowledge. Cutting off financing for this sort of research now is like cutting off navigation financing just as Christopher Columbus hit the shoreline of the New World.

E-mails are spreading quickly now and here is one that I got from STSgrad:

From Laurel Smith-Doerr:

Dear Colleagues,

The House Subcommittee on Commerce, Justice & Science (CJS) is considering changing the 2012 appropriation to eliminate the Social, Behavioral & Economic Sciences (SBE) directorate at the NSF, which includes the STS Program. The Consortium of Social Science Associations (COSSA), a coalition to which the ASA belongs supporting Federal funding for the social sciences, is encouraging its members to write to their House Representatives and Senators, urging the House to continue to support the human sciences at NSF. Having had the privilege of serving recently as one of the Program Officers at the NSF in the SBE directorate, I want to endorse COSSA’s request, believing that eliminating SBE would be disastrous for the social sciences in the US and for sociology in particular.

So I encourage you to write to your House Representatives and US Senators, ideally before the CJS Subcommittee meeting on 7 July, or before the full House Appropriations Committee meeting on 13 July, and at least before the floor discussion scheduled for the week of 25 July.

You may want to copy Subcommittee Chair Frank Wolf (R-VA) and Ranking Member Chaka Fattah (D-PA) and perhaps other members of the Subcommittee (http://www.appropriations.house.gov/Subcommittees/Subcommittee/?IssueID=34794), and Appropriations Committee Chair Harold Rogers (R-KY) and Ranking Member Norm Dicks (D-WA) (http://www.appropriations.house.gov). You can find contact information for your representative using the “Write Your Representative” feature at https://writerep.house.gov/writerep/welcome.shtml, and you will find a list of Senators, sortable by state, at http://www.senate.gov/general/contact_information/senators_cfm.cfm.

We all lead busy lives, and if you prefer to send something more or less ready-made, I suggest something along the lines of the letter made available by the previous Assistant Director of SBE (a linguist) at http://www.lsadc.org/info/NSFSBEletter.pdf. You may copy and paste the text from this letter (make sure the formatting has copied appropriately) and, if you have the opportunity, elaborate and tell your representatives something about our field. Furthermore, you might strengthen your argument by pointing to NSF-supported work being conducted at a university in the representative’s area.

Support will be particularly valuable from the Republican party. I wrote to Scott Brown, using the AD’s letter as a starting point. My letter is pasted below (unformatted).

Please feel free to forward this request to colleagues. I have taken parts of it from the linguists, but obviously it is important for representatives to hear from all of the social sciences.

Laurel Smith-Doerr

July 1, 2011
Scott Brown
US Senator
2400 JFK Federal Building
15 New Sudbury St.
Boston, MA 02203

Dear Senator Brown,
I am alarmed to hear that the House Commerce, Justice & Science Committee is considering eliminating or severely cutting back the directorate for Social, Behavioral & Economic Sciences at the National Science Foundation (NSF).

In the US, basic research in the social sciences is funded alongside the natural sciences and engineering, through the same agency. This is unusual from an international perspective and means that the social sciences are done better here, by being more closely integrated with work in the other sciences. Having the full range of basic science funded within one agency has led to more collaborative, interdisciplinary work, with better results on all sides.

One major example of this integration is our study of scientific innovation itself, one of the most important drivers of a strong economy (as acknowledged in the 2007 America COMPETES Act, which was led by the Bush Administration but supported across parties). Somehow basic science conducted at lab benches and engineering projects started in garages produce new knowledge products that spark new industries like biotechnology and information technology, which give the United States a real competitive edge in the global marketplace. This innovation process is not yet well understood but is a central concern across social sciences including sociology, economics, psychology, and science policy studies. The importance of better understanding the innovation process (in order to facilitate it) has generated the new interdisciplinary area called the science of science and innovation policy (SciSIP). This program at NSF is funding research to scientifically understand the innovation process and which policies are more effective at producing beneficial outcomes in science and technology.

NSF is unique in combining experts from the social sciences with experts in natural sciences and engineering. For example, social scientists and chemists in Massachusetts (and other states) have received grants in a collaborative initiative at NSF between SciSIP (in the Social/Behavioral/Economic Sciences directorate) and Chemistry (in the Math/Physical Sciences directorate). An article in this week’s Chemical and Engineering News (‘Measuring Chemistry’s Impact’) announces the initiative and its importance to understanding the chemical sciences. This initiative, ‘Pathways to Innovation in the Chemical Sciences,’ would not have been possible if the social sciences were not part of NSF. More information about this initiative and others in the study of innovation and science policy can be found at the following website: http://www.scienceofsciencepolicy.net/page/about-sosp.

The integration of all the basic sciences at the NSF represents one of the national treasures of the US, which has yielded much competitive advantage. Massachusetts has been at the forefront of this kind of interdisciplinary research, as it has led innovation and science in general.
I urge you to oppose any efforts to weaken that integration, which will be detrimental to our state and our nation.

Sincerely,
Laurel Smith-Doerr
Associate Professor of Sociology
Boston University
Ldoerr@bu.edu

Patrick Carroll’s Good Idea

As many of you know, Jan-Hendrik and I write about seeing the state from an STS perspective. We’ve written a book chapter, published an article, and have a couple more in the publishing process now.

Soon, at the American Sociological Association’s 2011 annual meeting, Jan and I will sit on a panel about Science and the State. Patrick Carroll refereed some of the papers and provided comments; among them was our paper on American eHealth and state theory.

Here is an excerpt from the presentation we’ll give that quotes Patrick’s insightful comments.

We believe that our more general research agenda offers a potentially fruitful route to get beyond the ‘state as actor’ and ‘state as network’ opposition that currently defines these two lines of inquiry. As was suggested by Patrick (Carroll), the solution might go like this:

perhaps the solution partly lies in recognizing that these apparently contradictory realities are not constructed out of identical stuff; indeed perhaps we should be thinking of ontologies of state (in the plural) rather than the ontology of the state.

After much reflection on his proposal, we think he may be right, and I would encourage anyone interested in Patrick’s idea to ask about it during the question-and-answer period so that we might explore it during the discussion.

Any ideas about this notion of “actor state” and “network state” models not “sampling” on the same basic, raw stuff?

Or, I guess, the idea that STSers have taken to believing — per our commitments to materiality — that everything is made up of the same basic stuff?

"Networking the failed state" is an interesting paper

Christian Bueger and Felix Bethke presented this piece at the 51st Annual Conference of the International Studies Association, New Orleans, February 2010. The paper basically recognizes that international relations is being, and should be, studied in less “social” ways, using some of the topography and geography of ANT to do so. Interesting stuff.

Here’s the abstract:

Abstract: The discipline of International Relations is increasingly studied as a social phenomenon. In contrast to universalist understandings of IR as global knowledge, sociologies of IR have localized and pluralized our understanding of the discipline. IR is understood as a conglomerate of national communities. These national communities are depicted as each having their own culture, organizations and knowledge and are seen as interdependent with each other, while they all circulate around a centre, the North American community of IR scholars. We challenge such communitarian understandings of the discipline. Basing our discussion in Actor-Network Theory (ANT), we argue to conceive of IR as a different spatial form, that is, a network or rhizome. Such an understanding enables us to study the relations of IR to other entities, to emphasize actors and practices as constituting the discipline, and to address issues of power relations that go beyond inter-community power relations. To make a case for which phenomena come into sight from an ANT perspective, we study the case of research on Failed States. We disentangle the network and sketch how IR is enrolled as well as transformed.

New journal of potential interest: Journal of Sociology of Science & Technology

Has anyone heard of Mehta Press?

Apparently, Mehta Press has recently launched a number of new journals, among them the Journal of Sociology of Science & Technology, and is looking for editors-in-chief for a large number of journals. A recent press release reads:

Mehta Press is a premier publisher of academic, technical and scientific work, reaching around the globe to collect essential reference material and the latest advances and make them available to researchers, academics, professionals, and students in a variety of accessible formats.

Mehta Press Invites Curriculum Vitae of efficient Scientists for Chief Editor Post in newly launched Social Science & Humanities. Applications are invited for a Editor-in-Chief for following journals

The contribution of the Editor-in-Chief will be crucial to the development of a unique journal style and image. We are confident that the quality and high ethical standards of our journals can offer a unique service to the international community. Together we can promote scientific advancement and excellence for the benefit of all.

If you are interested in our project, please submit a copy of your curriculum vitae, including your  study record and a brief list of your main publications, for our internal evaluation.

Games with a purpose – a new role for human web users?

Just coming back from a few days of fieldwork (preparing ethnographic research in the field of semantic software), I could not help but share something I just learned. It fits quite nicely with what I have written before on the masses of non-human actors that populate the web today (crawlers, spiders, bots) and how the interdependencies between “them” and others (like us) change with the implementation of new web technologies.


Semantic technologies are built to process large numbers of unstructured documents and to automatically find (and tag) meaningful entities. And while these frameworks of crawlers, transforming tools and mining algorithms are actually quite good at finding structure in data, they are still (at least initially; they learn quickly) quite bad at assigning meaningful labels to it. They are quick and good at recognizing that a text is about something, but they are bad and slow at judging ambiguous terms – they fail at understanding. But a recent trend called “gamification” (which has been around for a while but was until recently used mainly for encouraging users to fill out boring forms) is now a good example of how the configuration of agency is changing on the web today. Human users are asked to play games that help annotate and match ambiguous patterns – tagging pictures, texts, music, etc. So it is no longer machines doing tasks for humans – it is humans working for machines.
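The mechanic behind many such games is "output agreement": two players who cannot communicate label the same item, and a label only counts as an annotation when both propose it, since independent agreement is evidence the label is meaningful. A minimal sketch in Python, with all function and variable names invented for illustration (this is not the actual Games with a Purpose implementation):

```python
# Hypothetical sketch of an output-agreement labeling round:
# a label becomes an annotation only if both players propose it,
# excluding "taboo" labels already collected in earlier rounds.

def agreed_labels(player_a, player_b, taboo=()):
    """Return the labels proposed by both players, minus taboo words."""
    return sorted((set(player_a) & set(player_b)) - set(taboo))

labels = agreed_labels(
    ["cat", "animal", "cute"],
    ["animal", "pet", "cat"],
    taboo=["cat"],  # 'cat' was already matched by earlier player pairs
)
# labels == ["animal"]
```

The taboo list is what forces players toward ever more specific labels over time, which is exactly the kind of fine-grained judgment the machines are bad at.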

For those who want to try working for them, check out the “Games with a Purpose” website. A paper that describes exactly what they do can be found here.

ICTA 2011: International Conference on Information and Communication Technologies & Applications

Announcement

International Conference on Information and Communication Technologies & Applications: ICTA 2011 (www.2011conferences.org/icta), to be held on November 29th – December 2nd, 2011 in Orlando, Florida, USA.

Participants in this conference will receive a password to also have access to all the virtual sessions (associated with the face-to-face sessions) of the programs of the conferences organized by IIIS and held in March 2011 (www.iiis2011.org/imcic/Program/html/program-public-wvp.asp?vc=26) and to be held in July 2011 (www.iiis2011.org/wmsci/Program/html/program-public-wvp.asp?vc=1).

Deadlines:
Submissions and invited session proposals: July 12th, 2011
Author notifications: September 1st, 2011
Camera-ready, full papers: September 22nd, 2011

Technical keynote speakers will be selected from early submissions, because this selection requires an additional evaluation according to the quality of the paper (as assessed by its reviewers), the authors' CVs and the paper's topic.

Submissions for *face-to-face* or for *virtual* participation are both accepted. Both kinds of submissions will have the same reviewing process, and the accepted papers will be included in the same proceedings.

All submitted papers/abstracts will go through three reviewing processes: (1) double-blind (at least three reviewers), (2) non-blind, and (3) participative peer reviews. Authors of accepted papers who registered in the conference can have access to the evaluations and possible feedback provided by the reviewers who recommended the acceptance of their papers/abstracts, so they can accordingly improve the final version of their papers.

Pre-conference and post-conference virtual sessions (via electronic forums) will be held for each session included in the conference program.

Registration fees of an effective invited session organizer will be waived according to the policy described on the web page (click on 'Invited Sessions', then on 'Benefits for the Organizers of Invited Sessions').

Authors of the best 30%–50% of the papers presented at the conference (including those virtually presented) will be invited to adapt their papers for publication in the Journal of Systemics, Cybernetics and Informatics (JSCI) or in the Journal of Education, Informatics, and Cybernetics (JEIC).

ICTA 2011 Organizing Committees

Journal on governance issues accepting rolling submissions…

The Journal of Science Policy and Governance

http://www.sciencepolicyjournal.org/

Now accepting rolling submissions!

The Journal of Science Policy and Governance is an interdisciplinary journal that seeks high-quality submissions on emerging or continuing policy debates. Current students (undergraduate or graduate) and recent graduates within three years of earning a degree (bachelors, masters, or doctoral) are eligible to submit. We seek to publish articles on a variety of policy areas including: scientific research, engineering, innovation, technology transfer, commercialization, bio-medicine, drug development, energy, the environment, climate change, the application of technology in developing countries, STEM education, and space exploration. Submissions on other topics are also welcome as long as they relate to the theme of science policy and governance. The Journal strives to publish articles in a timely manner to ensure that publications can be considered in the context of current policy debates.

Please see website for submission guidelines.
Questions and/or submissions should be sent to jofspg@gmail.com.

Good opportunity to be named as a contributing editor in the forthcoming Handbook of STS

From the Technoscience update:

4S Seeks Editors for 4th Handbook of Science and Technology Studies

The Society for Social Studies of Science Publications Committee invites proposals for the fourth edition of The Handbook of Science and Technology Studies.  The Handbook consists of state-of-the-art review articles, along with occasionally more specific articles, that cover the current range of research in science and technology studies.  The 3rd edition was published in 2008.  At this point we are looking for a team of four editors who will enlist authors to write the full range of articles.

In your proposal, provide names and affiliations of editors along with a 1 paragraph biography outlining each editor’s areas of expertise.  Also include proposed section and chapter titles with brief outlines that scope out substantive coverage in each chapter.  Please submit electronic copies of your proposal by 15 October 2011 to Stephen Zehr, Chair of the 4S Publications Committee (szehr@usi.edu).  Proposals will be reviewed by members of the Publications Committee.  Once a team of editors has been selected, the Publications Committee will make suggestions regarding topical omissions, overlap, editors, potential authors and so forth to facilitate the project.

4S News is archived at http://www.4sonline.org/4s_news

Dismantling boundaries in science and technology studies

Nice paper by Cornell’s Peter Dear and Harvard’s Sheila Jasanoff about regulatory science

Abstract

The boundaries between the history of science and science and technology studies (STS) can be misleadingly drawn, to the detriment of both fields. This essay stresses their commonalities and potential for valuable synergy. The evolution of the two fields has been characterized by lively interchange and boundary crossing, with leading scholars functioning easily on both sides of the past/present divide. Disciplines, it is argued, are best regarded as training grounds for asking particular kinds of questions, using particular clusters of methods. Viewed in this way, history of science and STS are notable for their shared approaches to disciplining. The essay concludes with a concrete example – regulatory science – showing how a topic such as this can be productively studied with methods that contradict any alleged disciplinary divide between historical and contemporary studies of science.

The Third International Conference on Social Informatics (SocInfo’11)

We’re reviewing papers now for the Third International Conference on Social Informatics (SocInfo’11).

The organizer, Singapore Management University, hosts the conference on 6–8 October 2011.

The Third International Conference on Social Informatics (SocInfo’11) 6 – 8 October, 2011, Singapore

Social Informatics is an emerging area of informatics that studies how information systems can realize social goals, apply social concepts, and become sources of information relevant for social sciences and for analysis of social phenomena.

The third international Social Informatics conference will attempt to create an interdisciplinary community of researchers interested in the interactions between the information system and society. Information scientists working on ways to analyze and improve information systems from the point of view of realizing social goals are invited to participate.

Background

The International Conference on Social Informatics was first held in Warsaw, Poland in 2009, followed by Laxenburg, Austria in 2010. SocInfo2011 will be held in Singapore, a major hub in the Asia Pacific region well known for its multi-racial and multi-cultural society. Both SocInfo2009 and SocInfo2010 were small meetings that covered mainly the computing perspective of social informatics. This will change at SocInfo2011, which aims to broaden the scope of social informatics while reaching out to diverse researchers worldwide.

The mission of SocInfo2011 is to make the conference a premier venue for both social and computer scientists to exchange the latest research ideas that better integrate scholars from the two disciplines. The conference program will reflect this in the keynote talks, tutorials, workshops and paper sessions addressing emerging topics that attract interdisciplinary research attention.

For the first time, the conference has a strong representation of social science researchers. Both the organizing committee and program committee have diverse disciplinary representation. The committees are actively seeking papers covering a wide spread of topics and approaches. SocInfo2011 is supported by the International Communication Association (ICA) and the Singapore Infocomm Technology Federation (SiTF). SocInfo2011 also welcomes industry participation in the form of demos and poster presentations.

Beyond serious research exchanges, SocInfo2011 will offer a social program for participants to get to know one another better, to visit some places of interest and to appreciate the local delicacies. With a good combination of work and fun, SocInfo2011 hopes to foster collaboration among social informatics researchers as well as to demonstrate the relevance of their research to a wider community.

What’s up with ‘social innovation’?

I recently reviewed a book for the International Sociology Review of Books about “social innovation.” The book was Non-technological and non-economic innovations, edited by Steffen Roth, who also authors the first chapter. The book’s contributors, some of whom wrote mainly conceptual pieces and others more empirical works, were mostly European, with the exception of a scholar from India and another from Russia.

The book opens with a hidden-in-plain-sight insight: overwhelmingly, studies of innovation tend to emphasize the story of technology going to the market to spread widely among buyers/adopters. What is missing, they say, and what is necessary for a more robust theory of innovation, is a deeper understanding of what these studies necessarily omit — that is, non-technological innovations which do not compete in the market (in the traditional sense of the word).

… a somewhat compelling position, as it defines new research and sets an agenda for multiple scholars to advance the field. Still, this may not be something new. After all, social innovations include advanced social networking techniques, trends in outsourcing or downsizing, shifts in organizational form or work processes, advertising and branding techniques, etc. However, this is not really what worries me about this approach.

Here is an excerpt from the forthcoming review, which makes my point:

… The introduction ominously asks: “if innovations also have a social dimension, then is there a social dimension of social innovations, too?” (10) The question reveals a couple of things. First, innovations such as advanced networking strategies or intensive outsourcing are conceptualized as definitively “social” things that spread (82, 84). Second, innovations also have a “social” dimension, which might include harnessing symbolic systems in order to evoke a certain set of emotions in advertising or through branding (164). This social dimension might also include promotional events, auctions, or the establishment of auxiliary organizations such as museums or historical societies (246). Third, innovations are embedded in extant social relations; hence, innovations shape and are shaped by the circumstances of their social context (161). If social innovations, which are conceptualized as having one or more social dimensions, shape and are shaped by social contextual factors, within which they are embedded, then it appears there is nothing more social and, therefore, nothing more obviously under the jurisdiction of the social scientist to study than social innovations from this perspective. However, there is a deep theoretical issue to be considered regarding these multiple uses of the term “social,” a term taken to mean, in the context of this book, a thing, a dimension of that thing, and its context. I am thinking foremost about how this book’s raison d’être squares with Bruno Latour’s (2005) recent book Reassembling the social. I cite a forthcoming review of Latour’s book to make my point (Rowland, Passoth, and Kinney):

“Latour’s bottom line: As it happens, much of contemporary sociology is misdirected bunk; (…) Latour’s admittedly self-serving historical portrayal of sociology delivered in this book is perhaps forgivable because, in exchange, we get to see how performativity works among sociologists (rather than just economists). Sociologists give artificial strength to ideas that were only meant to be conceptual. (…) Sociologists are guilty of this sort of performativity, but also something much more grave. The “social” is used at times to explain what binds people together or tears them apart, but sociologists simultaneously demand that the social can also be a backdrop shaping interactions that bring people together or tears them apart. Sociologists get to have their cake and eat it too … ”

With the “social” taking on so many meanings in this edited book (i.e., a thing, a dimension, and a context), we wonder if scholars of social innovation are also asking to “have their cake and eat it too,” much the way Latour suggests sociologists have over the last century.

Panopticism, lateralism, and infrastructure cont.: IT and surveillance, once more

I just got my hands on a new paper by Marion Brivot and Yves Gendron that speaks very directly to the questions of lateral or flat types of power and regulation – and how IT infrastructures may be involved. Here is the abstract:
Beyond panopticism: On the ramifications of surveillance in a contemporary professional setting

This paper provides fieldwork evidence, which solidifies an emerging view in literature, regarding the limitations of the panoptical metaphor in informing meaningfully and productively the analysis of contemporary surveillance and control. Our thesis is that the panopticon metaphor, which conceives of the organization as a bounded enclosure made up of divisible, observable and calculable spaces, is becoming less and less relevant in the age of contemporary surveillance technologies. Through a longitudinal socio-ethnographic study of the ramifications of surveillance ensuing from the implementation of a computerized knowledge management system (KMS) in a Parisian tax/law firm, our analysis points to the proliferation of lateral networks of surveillance having developed in the aftermath of implementation. In this complex and unstable constellation of rhizomatical controls, peers are involved in scrutinizing the validity of one another’s work, irrespective of the office’s hierarchies and official lines of specialization. As a result, games of visibility (exhibitionism), observation (voyeurism) and secrecy (hiding one’s work from the KMS) abound in the office. One of our main conclusions is to emphasize the pertinence of apprehending control and surveillance from angles that take into account the ambiguities, complexities and unpredictability of human institutions, especially in digitalized environments.

Sounds like a keeper. Although I look at this journal regularly, this one escaped my attention until I got the respective e-mail alert yesterday (another infrastructural topic, I guess). Anyway, the full paper can be accessed here.

How flat exactly is (social) infrastructure?

Still toying with ideas about approaching management information systems from a lateral perspective, I am wondering how ‘flat’ an approach to regulation can/should become. By ‘flatness’ I am referring to the counter-scheme against ‘transcendent’ sociological approaches to the regulation of social life expressed by Latour and others (most beautifully and briefly, I think, Latour has expressed it here). What I am asking myself is how far the analytical levelling – thinking about governing, regulation, power, and so on in a lateral rather than hierarchical-levels-of-order manner – can be radicalized without, if you will, collateral damage to the empirical questions under study, which in the case of management information systems (or ERP) clearly have something to do with power, hierarchies, and so on. In other words, how much can hierarchies be conceptually flattened without ceasing to be hierarchies?
This may be a general question when analyzing infrastructures. Do we need to be careful to translate every supposedly top-down relationship into a sequential ordering of steps, or into a route through a network of nodes? Or, conversely, is there a sense in which we should retain some role for hierarchies and levels of (social, technological, biological etc.) order?

An ANT Paper in Sociological Theory!

Just a short note: The recent issue of “Sociological Theory” features a paper not only based on STS thoughts but one that even has “Actor-Network” in its title. As I am not on the university VPN right now I cannot download it to review it, but judging from other papers I know from Hiro Saito it should be a good one.

A major problem with the emerging sociological literature on cosmopolitanism is that it has not adequately theorized mechanisms that mediate the presumed causal relationship between globalization and the development of cosmopolitan orientations. To solve this problem, I draw on Bruno Latour’s actor-network theory (ANT) to theorize the development of three key elements of cosmopolitanism: cultural omnivorousness, ethnic tolerance, and cosmopolitics. ANT illuminates how humans and nonhumans of multiple nationalities develop attachments with one another to create network structures that sustain cosmopolitanism. ANT also helps the sociology of cosmopolitanism become more reflexive and critical of its implicit normative claims.

An Actor-Network Theory of Cosmopolitanism – Saito – 2011 – Sociological Theory – Wiley Online Library

Humans TXT: We Are People, Not Machines.


Do you know who your readers are? I just recently met a reader of our blog from Lancaster at a conference in Berlin, and I was very happy to finally have a face to remember when posting (ok, I of course know Nick’s, Hendrik’s and Antonia’s faces). But guess who the most frequent readers of this site are? Machines! The Google-Bot, Posterous-Indexer, Feedburner and their pals harvest websites, and it seems they are the most faithful readers of what we write.

As I tried to argue in a German paper on media change and interobjectivity, the specific division of labour between humans and machines is what is at stake in some of the most interesting innovation processes in the field of web technologies. Who should have to do most of the work? A few of you might remember the hard days of the browser wars: a web designer in those days had to build three or more versions of her site just to please the different web browsers. Or look at the struggles over RSS or, more recently, semantic technologies: who should add all the meta data, who should try to make sense of this mess of interconnected data? Us? Or them?

And now I just stumbled upon a strange idea. It goes like this: if there are files on a website that are for bots only (the “robots.txt” file that asks search engines to please not index a site – a funny example of this is the one on youtube.com), why not create an equivalent just for human readers? That is the basic idea behind “humans.txt”. And there are huge stakeholders involved. Google has already jumped on board; this is their file:

Google is built by a large team of engineers, designers, researchers, robots, and others in many different sites across the globe. It is updated continuously, and built with more tools and technologies than we can shake a stick at. If you’d like to help us out, see google.com/jobs.

Wait, what? Google? After wondering for a while what sense it could make to duplicate the stuff that is already on your “about page” in a text file without layout and eye candy, I suddenly realized. Guess who likes plain text files? Guess who would like to find meta data about a web site always in the same place? Yes. Bots. They will be the most likely readers. So: who do we write for?
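For readers who have never peeked at these files, here is a sketch of what the pair might look like side by side. The robots.txt lines follow the standard Robots Exclusion convention; the humans.txt sections loosely follow the layout suggested at humanstxt.org, with all names and fields invented for illustration:

```
# robots.txt – addressed to bots: please stay out of /private/
User-agent: *
Disallow: /private/

# humans.txt – nominally addressed to people (but see above)
/* TEAM */
    Author: Jane Doe
    Location: Berlin, Germany

/* SITE */
    Last update: 2011/07/01
    Language: English / German
    Software: a text editor and patience
```

The irony the post points at is visible in the format itself: both files sit at a predictable URL, in a machine-friendly plain-text layout, which is exactly what a crawler wants.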

 

The role of venture capitalists in infrastructure?

A decent and relatively new blog about being a venture capitalist by Fred Wilson got me thinking about how venture capital firms are a different type of “organizational entity” compared to universities or industrial manufacturing firms, in particular in their role in the development, repair, and replacement of infrastructure. While the large entities with monopolistic control over primary infrastructure (e.g., roads, water systems, etc.) are usually those odd things called “states,” venture capitalists are increasingly playing a role in the development of all sorts of “modern” infrastructures that are every bit as significant (i.e., bio-engineering, IT, etc.). The difference, perhaps, lies in their underlying motivations.

Venture capitalists look for high-potential, high-risk investment opportunities with early-stage startup companies that show signs of future growth and are in need of seed money. I assume/guess that these sorts of companies produce different types of infrastructure compared to monolithic and monopolistic state endeavors. While it is conceptually sloppy to refer to a state as an actor, I must; I assume that states are motivated to invest in precisely the opposite — they want low-risk investment opportunities with foreseeable benefits and prefer to work with established or “proven” companies to get the infrastructure they want.

Now, while venture capitalists and states are both investing in infrastructural innovations (in fact, venture capital investment is sometimes used as a proxy for or indicator of innovation in a given nation or sector of the economy), are they investing in the same things?

I think not, given their differing motivations for investment, and some theoretical and empirical comparisons with states would make venture capitalists a potentially exciting topic for STS.

What’s next in the study of management information systems?

After we deliberated a bit about what’s next for STS, I find myself wondering some more about where to take the analysis of management information systems. I have to decide pretty soon whether to commit to an empirical project in this area, and I would of course very much like to avoid focusing on questions which have already been explored at great length by others.
My impression is that when it comes to exploring management information systems, the elementary questions about technology associated with the original program of STS are pretty much in the books. This may be particularly true – or, at least, that’s my impression – with respect to the “big” questions of social construction, the interrelation (if not identity) of social and technological structure, and so on. But then again, I do still need a general theme with which to associate my research initially in order to establish where I would generally like to take it.
As of now, I have primarily been thinking of pushing the envelope with respect to the analysis of regulatory regimes in terms of a still more micro analysis of how regulation is really brought about and sustained in social situations (in which participants mobilize information systems or particular inscriptions provided by these systems). The general idea would be to unpack regulatory regimes laterally into sets of distinct regulatory situations and look at the respective role(s) of management information systems.
I know some of you are pretty well informed about this field, so may I take a little opinion poll?
And where do you think the study of management information systems should more generally be headed? Are there, possibly, current trends which you are (or would be) genuinely excited about?