
About Nicholas

Professor of Sociology, Environmental Studies, and Science and Technology Studies at Penn State, Nicholas writes about the scientific study of states and the future.

Write tilt-shift

Seeing this interesting tilt-shift video of a city, I was reminded that scale is a salient issue regarding infrastructure, as infrastructural entities often exist at such a massive scale that it is difficult to “humanize” them or make them “knowable” to people.


Similarly:

Regarding numbers: sometimes numbers are so large that their meaning (or reality) is compromised by their sheer size/scale, and they become somehow unknowable.

Regarding art, sometimes a painting or sculpture can be so large that it fails to relate to human viewers.

All of this reminds me of the first time I read de Certeau’s “The Practice of Everyday Life” and, in particular, the section on walking in cities. The experience of walking in a city is quite different from maps of the city or aerial views. Unlocking the scale issue for infrastructure is quite important, especially the emphasis on massiveness, and, as this video indicates (if only through implication and extension), as scholars we may need to find a way to write tilt-shift about infrastructure.

How brains work in imaginary worlds

A colleague of mine, Eric Charles (psychology), recently posted some thoughts about how brains work in the world of Marvel’s X-Men. It occurs to me that the modest role of expertise and cognitive psychology that oftentimes makes it into our classes on STS might be meaningfully enhanced if we teach students a lesson like this about the X-Men.

How memory works might be a fruitful avenue: the role of memory and what it means to “know” something, and what expertise might mean if we consider the brain/body relationship. And, of course, one gets to talk about Wolverine during class in the process.

"There will be an infrastructure."

Check out this recent video from the FuturICT group in which Paul Lukowicz presents his take on the project from the perspective of a computer scientist.
A lot of the issues we have been discussing come up very explicitly in this bit. Particularly interesting, I think, is how emergent structure on the one hand and purposefully built infrastructure on the other are being renegotiated conceptually, and how both finally culminate in the offer of a platform: “there will be an infrastructure” to which all kinds of people may contribute, on which they may run their own projects and build their own apps.

A couple more videos are linked at the site, for example a ten-minute promo of the project.

Specifying infrastructures cont.

I was just revisiting the earlier post about the Wikipedia page on infrastructure, and the sentiments expressed in the comments about the missing social science and STS references on that page, impressive and elaborate as it is. As far as this blog is concerned, the issue of specifying a common understanding of infrastructures has so far turned out to be, I think, one of its implicit continuous commitments, and one that perhaps merits re-addressing explicitly from time to time. So, very briefly, and slowly gearing up for the 4S meeting, some thoughts on where we are at this point.
On the one hand, there are lots of resources and discourses about infrastructures drawing in participants from all types of sources and disciplines. On the other hand, there is STS as a field in social science with some maturity, and with various kinds of theory able to bring infrastructures under the auspices of their concepts and terminologies. From time to time, STS scholars, like other social and political scientists, feel like intervening in public discourse by offering their own types of expertise about particular cases and problems of infrastructures. So far, we have not been satisfied that the conceptual work required for an appropriate understanding of infrastructures has already been done, and that we would merely need to extend the application of otherwise well-known concepts to the exploration of infrastructures. Infrastructures can clearly become “normal” cases of networks, assemblages, socio-technical orders etc., and there is nothing wrong with analyzing them as such. It may, however, also present a danger of locking analyses of infrastructures into foregone conclusions.
Here are a couple of possible lines for discussing specifications of the concept of infrastructures after taking another look at the wiki entry:
– Infrastructures as supporting something (“a society or enterprise”, “an economy” etc.). Clearly, the idea of an assemblage (network etc.) supporting something other than itself is worth noticing. General references to use or purpose are, of course, common when talking about all kinds of artefacts, but to speak of such heterogeneous sets of entities in terms of a general, externally given purpose must be puzzling.
– References to a general public. Political issues and the state are very salient on the wiki page despite its focus on economics and engineering, and despite the fact that the definition of infrastructure is given in a way that takes great care to exclude political questions, e.g. speaking of “facilities necessary”, or “services essential” as if these qualifications were unproblematic.
– The differentiation of hard vs. soft infrastructure – can we utilize this differentiation at all? It rings like hard vs. soft facts/science/knowledge, though the implied reference to deconstruction (or rather, the potential ease of it) may be more material, less epistemic in this case – if the connotation is not a straightforward military one. The hard vs. soft differentiation clearly expresses a concern about stability and vulnerability, but is this concern somehow specific when worrying about infrastructure (rather than about truth)?
– Topographical references abound. Is infrastructure always about some association of artefacts and territories, or perhaps, more generally, about technology and place? Like the references to politics, the references to geography are ubiquitous in the wiki entry although they are not explicitly part of the definition at the top.
Would any of these aspects warrant a respecification of infrastructures in a way that would constitute them as a generic class of research objects? Would we even want to have such a class?

Job offer: Amherst: Science and Technology Policy

And a job offer that sounds interesting:

The Department of Political Science at the University of Massachusetts Amherst (http://polsci.umass.edu/) seeks to fill a full-time tenure-track position at the rank of Assistant Professor in science and technology politics to start in September 2012. The Department welcomes applications from political science, public policy, public administration, as well as from related disciplines. Geographic, methodological and science and technology specializations are open.

In recent years, the Department has nearly doubled in size largely through a Faculty Hiring Initiative. This search continues the department’s efforts to add to the strength of its diverse and growing faculty with scholars whose work addresses broad political questions arising in one or more of the department’s thematic emphases on a) global forces; b) governance and institutions; and c) democracy, participation and citizenship.

The successful candidate will contribute to this trajectory, adding to our current strengths while broadening our reach into new areas. The faculty hire will teach four courses in the Department’s graduate and undergraduate programs. Successful candidates must have the Ph.D. in hand by September 2012. Salary and credit toward tenure will be commensurate with qualifications and experience.

The deadline for applications is October 15, 2011, but acceptance will continue until the position is filled. The department strongly prefers that applicants submit their cover letter, curriculum vitae, and writing samples in electronic form through the Academic Jobs Online website at https://academicjobsonline.org/ajo/jobs/917 and arrange for electronic transmission of three letters of recommendation to the same site. Alternatively, printed versions of the application materials can be sent to Stephen Marvell, Office Manager, Department of Political Science/UMass, 322 Thompson Tower, Amherst, MA 01003-9277. Those who apply online should not also submit paper materials. Inquiries about the position may be directed to technology@polsci.umass.edu.

The University of Massachusetts Amherst is an Affirmative Action/Equal Opportunity employer. It and the Department are strongly committed to increasing the diversity of faculty, students, and curriculum, and encourage applications from women and minorities.

FuturICT – an epistemic infrastructure in the making

An interesting endeavour that, I think, is well worth checking out was recently brought to my attention: the FuturICT project. I will just give you a sample of quotes from the website and you will immediately see that this project relates in more than one way to the topic of this blog:
“FuturICT wants science to catch up with the speed at which new problems and opportunities are arising in our changing world as consequences of globalization, technological, demographic and environmental change, and make a contribution to strengthening our societies’ adaptiveness, resilience, and sustainability.  It will do so by developing new scientific approaches and combining these with the best established methods in areas like multi-scale computer modeling, social supercomputing, large-scale data mining and participatory platforms. (…) The FuturICT Knowledge Accelerator is a previously unseen multidisciplinary international scientific endeavour with focus on techno-socio-economic-environmental systems. (…) Revealing the hidden laws and processes underlying societies probably constitutes the most pressing scientific grand challenge of our century and is equally important for the development of novel robust, trustworthy and adaptive information and communication technologies (ICT), based on socially inspired paradigms. We think that integrating ICT, Complexity Science and the Social Sciences will create a paradigm shift, facilitating a symbiotic co-evolution of ICT and society. Data from our complex globe-spanning ICT system will be leveraged to develop models of techno-socio-economic systems. In turn, insights from these models will inform the development of a new generation of socially adaptive, self-organized ICT systems. (…) The FuturICT flagship proposal intends to unify hundreds of the best scientists in Europe in a 10 year 1 billion EUR program to explore social life on earth and everything it relates to.”
Basically, as it appears to me, FuturICT is a call to arms of sorts for social scientists of all persuasions to do something with the myriad of data our current ICT systems are producing. The aim is to build an epistemic infrastructure, or rather a range of infrastructures, that would put all these data to use. One of the interesting things is that everybody is at this point invited to join in, though the emphasis is clearly on building a large network of institutions, the current state of which you can see here. It very probably will not hurt, though, to leave your name, affiliation and expertise, if only to be updated as things progress. Some of the information provided at the website does sound kind of sci-fi, some of it kind of eerie, but believe me, as I happen to know some of the people involved: these people are very serious – and they are very capable. So, I am very curious what this will grow into.
One thing this made me wonder about, with respect to the exploration of infrastructures in general, was whether we have been giving quantity enough thought. The impetus for the FuturICT initiative is the mass of data already available, and the rationale is that the very fact of having these data not only will support an epistemic infrastructure but also constitutes an outright demand for it. Is this not something which distinguishes infrastructures (e.g. infrastructures for traffic, services or electric power) from other types of networks and socio-technical assemblages: that there is some input or throughput, that it comes in high numbers, and that developer-entrepreneurs try to establish infrastructures as complements or as purpose-giving or profit-generating tools with respect to the throughput?

Innovation in book reviews

A few months ago, Jan-Hendrik and I were discussing the utility of writing book reviews. One concern we had was that book reviews basically do nothing for one’s academic standing, but more than that, in thinking about the book reviews themselves, we were frustrated with them because unlike journal articles, they rarely reference other book reviews for the same book.

So, we wrote a book review that did, to test if there was any value to this. We enlisted a student of mine, Alexander Kinney, and we set to work writing a book review that included other reviews of the same book.

We wrote our editors:

To the editors,

Please see a book review of Latour’s “Reassembling the Social.” While my co-authors realize that the length is somewhat past the desired 1000 words, we hope that you find the document satisfactory. It employs a somewhat unorthodox approach where other book reviews are cited where appropriate so that we can essentially “review what has not yet been sufficiently reviewed by other reviewers.” Additionally, we ask for a small editing consideration for adding a small “box” around a subset of identified text (this mirrors what was done in Latour’s book). I know that this is an unorthodox review, and hope that the innovation is tolerated. Still, we are prepared to make amends if this document does not meet the standards of the journal.

best,
Nicholas J. Rowland

So, it has gone through a couple rounds of editing and is now in the proofs stage (please note that we realize that Latour’s book was written in 2005 [not 2007, as the title currently states]). This new approach to book reviews also requires that one review a book a few years after publication rather than soon after publication, so that the other reviews can be written and, to some extent, responded to.

Here is the document below; feel free to comment on the approach, style, or content:

Teaching STS: Reinvention and Modification

I saw this in a student presentation yesterday about the role of adaptation in the process of diffusion, where we were discussing matters of re-invention and post-hoc modification/workarounds. I was somewhat stunned and the students in the class were mesmerized:

[Image: a missile launcher mounted on the rollbar of a truck]

What you see in the image above is a C5 Russian missile launcher removed from its “aircraft source” and then adapted/modified for use on the rollbar of a jeep/truck. There is also a video, below the image, available at reposter here.

Another of these “DIY” wartime inventions is a hand-held grenade launcher modified for individual use (the source being a slew of them mounted on the bed of a truck).

[Image: a hand-held grenade launcher adapted from a truck-mounted array]

All of these examples, along with the videos, could be used in lessons about diffusion and re-invention, of course. My inclination, however, is to ask the students: how does this make you rethink some of the ideas scholars have about diffusion and re-invention? Certainly, the old, fun idea from STS about “using technologies in ways not originally intended by designers” is a good one here, but beyond that one could begin to rethink what one might call the “quick and easy” story of diffusion that seems to dominate the basic literature. I’m speaking here about the binary “1 for adoption, 0 for non-adoption” interpretation of spread. It becomes useless to think about C5 missile launchers in this way. Bringing up the old work of Akrich (1995, solar cells) and the newer work of De Laet and Mol (2002, hand-pump) leads to a much more nuanced vision of re-invention, modification, and localization, but is even that enough? The role of “necessity” seems obviously right, but analytically weak, as distinguishing “moments of necessity” from conditions of non-necessity is a dead end for research. Taking a Weberian approach and forcing a claim like “because of their geopolitical circumstances and cultural approach to the world around them, common Libyans are relatively more ‘resourceful’ than their governmental/military counterparts” also seems analytically weak. Is this a classic case of “drifting edges of global networks coming together unintentionally and unexpectedly” producing this outcome? As in: Soviet degeneration leads to the global sale of ersatz military resources that almost nobody can maintain and that are therefore cheap, creating the conditions under which the only way to get additional utility out of these machines is to remake their uses. I’m not even sure what one would call that sort of an analysis … “luck theory”? The motivation behind any modification, reinvention or workaround appears to be some combination of the need to localize and/or extend the utility of something (or a portion of something). Trying to determine the motivation beyond mere “necessity” or “resourcefulness” is difficult to do. In this case, survival is an obvious motivating factor; however, extending that to a broader framework seems foolhardy too. So, “where does reinvention come from?” ought to be an enduring question for our students and ourselves in STS…

Please note: reposter.net is a reposting site, so the original material always comes from somewhere else:

Here are the videos, in order and linked to the original posts on alive.in/libya

Teaching STS: Challenging Technological Determinism in Caliente, NV

If you were raised on STS in America, then it is likely that you read about the death of a train town named Caliente, NV. This is:

Death by Dieselization: A Case Study in the Reaction to Technological Change
W. F. Cottrell
American Sociological Review
Vol. 16, No. 3 (Jun., 1951), pp. 358-365
Published by: American Sociological Association
This is not a bad read, and easy for instructors to challenge on the grounds of “technological determinism” on two accounts:
1. The town did not die because of the out-of-control technological advance of locomotives; instead, the government-military complex invested heavily in diesel locomotives as part of mid-century wartime efforts (potentially even linking technological advance with patriotism such that any resistance to the technology was seen as anti-American).


2. The town did not die because of the out-of-control technological advance of locomotives because, like so many towns of this age and this sort, it had a uni-dimensional economy, such that the town was susceptible to any new technology that challenged the source of its economic security.
I like to emphasize the account of advances during the steam era, however, as it is even more telling about this “technological determinism” that seems so easy for students to swallow. Sure, Cottrell shows how Caliente, NV, was undone by the advent and subsequent quickened spread of diesel trains across the American landscape.


However, during advances to the steam train (and I am referring to low-tensile boilers as compared to high-tensile boilers; this will seem somewhat simplistic to train buffs, so please forgive me), it was towns like Caliente, NV, that gained the most! A student and I created a set of PowerPoint slides to explain this: check it out here (note: you’ll have to download it to see the animation; the small white dots are “towns” set every 100 miles from the port town).
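For readers who cannot download the slides, here is a minimal sketch of the logic I take the animation to be showing (the 100-mile town spacing is from the slides; the specific service ranges are illustrative assumptions of mine, not historical figures): when an engine needs servicing every 100 miles, every town along the line gets service business; when boiler advances stretch that range, most towns get skipped.

```python
# Which towns along a 1,000-mile line act as engine-service stops,
# given the engine's service range? (Illustrative numbers only.)
towns = list(range(100, 1001, 100))  # "towns" every 100 miles from the port

def service_stops(service_range):
    """The train runs to the farthest reachable town, is serviced, repeats."""
    stops, position = [], 0
    while position < towns[-1]:
        reachable = [t for t in towns if 0 < t - position <= service_range]
        position = max(reachable)
        stops.append(position)
    return stops

print(service_stops(100))  # low-tensile boilers: every town is a stop
print(service_stops(300))  # high-tensile boilers: most towns are skipped
```

On this toy logic, the towns that happen to sit at the new, longer interval concentrate the service business that the skipped towns lose, which is how a well-placed town could gain from steam-era advances and yet be wiped out once diesel removed the need for such stops altogether.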


As some of you know, I work in Altoona, PA, which was once a heart of the Pennsylvania Railroad. Altoona, to some extent, suffered a death similar to Caliente’s, to use Cottrell’s words.

Call for Papers: Performing ANT – Socio-Material Practices of Organizing, 17-18 February 2012, St. Gallen

Just got this a few days ago and forgot to post it here – now, as I am preparing for three weeks of “off-time” (meaning: a bit of traveling and weeks of being online only once every few days), I had to post it.

Reading it, I thought: what does it mean that workshops that specifically use “ANT” in their title are mostly workshops for younger scholars? Just wondering…

Teaching STS with "A Fistful of Quarters"

One way I teach students the philosophy of science is by using the documentary “The King of Kong: A Fistful of Quarters.”


Storyline

In the early 1980s, legendary Billy Mitchell set a Donkey Kong record that stood for almost 25 years. This documentary follows the assault on the record by Steve Wiebe, an earnest teacher from Washington who took up the game while unemployed. The top scores are monitored by a cadre of players and fans associated with Walter Day, an Iowan who runs Funspot, an annual tournament. Wiebe breaks Mitchell’s record in public at Funspot, and Mitchell promptly mails a controversial video tape of himself setting a new record. So Wiebe travels to Florida hoping Mitchell will face him for the 2007 Guinness World Records. Will the mind-game-playing Mitchell engage; who will end up holding the record? Written by <jhailey@hotmail.com>

The film is full of ideas from the philosophy of science. For example, logical positivists were obsessed with (1) establishing theories only from data and (2) considering what evidence either falsifies or verifies a theory. In the film, Steve Wiebe, the up-and-comer in the world of competitive gaming, sends a score in to Walter Day, the guy who runs the world record center, but the score is ultimately rejected because, while the videotape recording appeared legitimate, the machine he was playing on was questionable. This one is good for the falsificationists too: the score could not be verified because of questions concerning the video game machine he used; however, because there was no concrete evidence of tampering (merely a hunch), the score could not be entirely falsified either. Consensus among a group of experts emerged upon reviewing the evidence of Steve’s claim to have the new highest score on Donkey Kong. This nicely emphasizes the role of experts and how consensus over reality is as important as “reality” itself.

Now, thinking all the way back to Shapin’s work on early laboratories and experiments: Steve is invited to attend an annual competition where he can achieve his highest score “live” so that all the other experts can witness first-hand his skill at Donkey Kong. He does, and the entire community of competitive gamers more or less warms to the newcomer. This is not a bad lesson in the role of social connections and the acceptance of newcomers in science. This is a place to begin discussions of Merton’s norms of science and, in particular, disinterestedness. However, there is much more to say about functionalism. His competitor, Billy Mitchell, the previous record holder and longstanding insider, sends in, at the last possible moment, a videotape of a score that beats the score Steve just accomplished in person. Merton reminds us that what is good for science tends to advance it. In this case, what’s good for Walter Day and competitive gaming also happens to be what’s good for Billy Mitchell. Billy’s sketchy video score is accepted and immediately posted on-line for the world of competitive gamers to see. Additionally, and in violation of the norm of communism, Billy’s tape is not shared with Steve, even though Steve’s original tape, which was rejected, was shared with Billy.

The documentary is also funny in places, and it does a nice job showing how a group of gaming experts arrive at conclusions about the nature of reality through norm following, norm violation, and, importantly, consensus. If you teach STS, check it out; I’ve even got a sheet prepared for students to follow along (write me at njr12@psu.edu if you’d like to see it). Also, if you’re just interested, then check it out too.

One closing remark: those old games like Donkey Kong required a very different skill set as compared to contemporary games like Halo or Neverwinter Nights. It is nice to remind new students that games used to be hard in a much different way.

Personal Health Records and patient- oriented infrastructures

International workshop on Personal Health Record 

Personal Health Records and patient- oriented infrastructures 

Empowering, involving, and enrolling patients through information systems: 

Trento, Faculty of Sociology 

via Verdi, 26 

12-13 December 2011

Deadline for abstract submission: September 30th 2011

Notification to authors: October 15th 2011

Personal Health Record (PHR) has become a popular label for a wide range of patient-controlled information systems aimed at allowing laypeople to access, manage, share and supplement their medical information. Launched in the US at the beginning of the new millennium, PHRs are spreading in Europe (especially in the UK and Scandinavia), where one witnesses an increasing number of experimental systems that vary to suit the local healthcare context. Nevertheless, these technologies appear to be in their infancy, as clearly demonstrated by the low number of PHRs actually implemented in real-life settings compared with the (relatively) high number of trials.

Whilst there is still little evidence that PHRs may affect healthcare, they are regarded by different actors (policymakers, healthcare managers, patients’ associations, doctors) as “holding out great promise” to revolutionize it by reducing medical errors, cutting costs, increasing patient awareness and control over their health, and providing physicians with information in emergency situations – to mention only some of the potential benefits. This new ‘patient role’, proactive and characterized by greater control and responsibility over one’s health, is reinforced by the very existence of an electronic tool, suggesting that these new activities require an information system somehow similar to those used by doctors. The name itself, PHR, recalls the acronyms for the standard healthcare systems – EHR (Electronic Health Record) and EPR (Electronic Patient Record) – and thus affirms that it belongs within the semantic space of professional tools.

PHR systems are becoming the point of convergence among different visions concerning the future of healthcare systems characterized by the (desired) emergence of ‘new patients’ willing to share the burden of care and to reshape their relationships with doctors and institutions. Accordingly, PHRs can be considered an interesting lens through which social informatics researchers can examine the tentative transformation of different dimensions of the healthcare sector.

We believe that the time has come to engage in debate on these technologies, which are increasingly presented by policymakers and healthcare systems managers as the “next big thing” in healthcare. It is necessary to move away from a mere technocentric perspective (like the one sometimes provided by medical informatics) in order to bring the actors, their work/daily practices, and the meanings attached to them, back into play.

The purpose of this workshop is to gather together scholars, practitioners and professionals who reflect and work on PHRs from different perspectives in different countries. Whilst some interesting socially-informed studies have already been presented and published, to our knowledge no attempt has yet been made to create an opportunity for dialogue among them.

We welcome contributions about, but not limited to, the following themes:

· the design of patient-centered IS and their integration with professional ones;

· new forms of computer-mediated doctor-patient or patient-to-patient communication;

· the evolution of healthcare infrastructures and organizations, and the creation of new representations of health/illness;

· new forms of alignment and conflict between self-care practices and institutional treatment;

· the redefinition of responsibilities and roles within the network of patients, doctors, institutions and caregivers;

· the extent to which patients use PHRs to generate data for use in patient-doctor and patient-patient communication;

· the extent to which health professionals make use of patient-generated data from PHRs.

Abstracts (max. 1500 words) should be sent to phr@unitn.it

More information is available at http://events.unitn.it/en/phr2011 or can be obtained by contacting the organizers at phr@unitn.it

We plan to select the best abstracts and presentations and invite their development into full papers to be submitted for a special issue on the topic. Further information will be given during the workshop or before it on the website.

Organizers:

Silvia Gherardi, Faculty of Sociology silvia.gherardi@unitn.it

Enrico Maria Piras, Fondazione Bruno Kessler piras@fbk.eu

Alberto Zanutto, Faculty of Sociology alberto.zanutto@unitn.it


Teaching STS: Controversies

Teaching controversies is a mainstay of STS; if you need a good film to show, check out “Judgment Day: Intelligent Design on Trial”, replete with Steve Fuller weighing in on intelligent design…

Also, I have a handout already made to help students navigate the documentary. Write me if you’d like a copy or if you’ve used this clip in your own courses (send to: njr12 at psu.edu).

Science and Technology Studies: Opening the Black Box

Somatosphere just posted a link to a set of video recordings from the “STS: The Next Twenty Years” conference at Harvard last April. I would have loved to go, but unfortunately poor European scholars only have money to travel abroad when they are participating actively. Luckily, though, the whole conference was live-streamed back then. I was not able to watch all of it, so I am very happy to be able to watch the recordings now. Trevor Pinch’s “provocations” are STS at its rhetorical best – so watch, laugh and think.

Should STS articles have methods sections?

It has come to my attention that a good number of STS case studies contain no methods section, and some contain no mention of method at all (typically utilizing an implicit case study approach). So, I asked today:

Should STS adopt the traditional social-scientific methods/data/analysis sections, or is the implied case study method acceptable, or is this perhaps a critique of science “as usual”?

So, should, for example, SSS or STHV require a methods section?

Sergio Sismondo on black-boxing and taken-for-grantedness

Concern over the relationship between processes of black-boxing and gradual taken-for-grantedness has been expressed a bunch of times on this blog — here, here, and here.

Gearing up to teach STS to mainly engineering students today has me reading Sismondo’s intro text — and in Chapter 11, on the topic of “controversies”, he lays out the terms as follows:

Science and technology produce black-boxes, or facts and artifacts that are taken for granted; in particular, their histories are usually seen as irrelevant after good facts and successful artifacts are established (2010:120).

It is nice that the world of ideas in science is not automatically labeled “taken for granted” (when facts are momentarily settled) and the world of things in engineering is not automatically labeled “black boxed” (when artifacts are momentarily settled), so that the distinction is not reified (i.e., that facts are only taken for granted and that machines are only black boxed).

However, the two terms seem to be synonyms to Sismondo — do you agree with Sergio?

A Thought on Data and an Obituary

This NYT article has been on my reading list for a while (some might have noticed that I accidentally posted it twice before). I wanted to share it first (of course) as an obituary, a bow before one of the last century’s most inspiring teachers of programming and computing. But I also wanted to share it because it points those of us interested in the assemblage of contemporary infrastructure to a figure that STS seems to like to forget after getting rid of the myth of the genius inventor: the programmer.

For years, Mr. McCracken was the Stephen King of how-to programming books. His series on Fortran and Cobol, a computer language designed for use in business, were standards in the field. Mr. McCracken was the author or co-author of 25 books that sold more than 1.6 million copies and were translated into 15 languages.

Well, of course not the individual, creative and inventive programmer – I am sure we would step into the same explanatory traps that were connected with the inventor-myth. But programming – the core activity of building, connecting and maintaining IT infrastructure – is a cultural practice in its own right, a mixture of play, craft and learned or trained skill. And like any practice, it gains stability and cultural significance through the network of activities and things surrounding it: trainings, courses, guidelines, how-to books, textbooks, journals and so on. Maybe it is time that we spent some thought on how this particular practice was shaped – an idea that struck me after reading this:

In the early days, computer professionals typically fell into one of two camps — scientists or craftsmen. The scientists sought breakthroughs in hardware and software research, and pursued ambitious long-range goals, like artificial intelligence. The craftsmen wanted to use computers to work more efficiently in corporations and government agencies. (…) But his books are not like the how-to computer books of more recent years written for consumers. His have been used as programming textbooks in universities around the world and as reference bibles by practicing professionals.

Greatest thing to happen to STS since the Bijker/Pinch paper

New interest in the micro-foundations of institutions has got to be one of the best things to happen to STS since the Bijker/Pinch paper…

The new institutionalism in organizational analysis has been a wellspring for research. A quick summary of neo-I that Fabio Rojas and I wrote (in a paper on museums):

The hallmark of the ‘new institutional’ school is the relentless focus on how life inside organizations is regulated by stable social practices that define what is considered legitimate in the broader external environment in which an organization operates (DiMaggio 1987, 1991, DiMaggio and Powell 1991b, Meyer and Rowan 1991, Scott 2000). The influence of institutions on organizational behaviour is supposedly most obvious in organizations like museums – organizations that new institutional scholars label as ‘highly institutional and weakly technical’ (Scott and Meyer 1991: 124). By this, scholars usually mean the following: that the organization’s leadership is highly sensitive to the expectations and standards of its industry; that the organization of work within the bureaucracy depends on broader ideologies and cultural scripts found in modern societies; that managers are likely to copy the practices of other organizations, especially high-status organizations; that professional groups are the arbiters of organizational legitimacy; that rational organizational myths and rules structure work practices; and that the ultimate performance of an organization’s set of tasks does not depend much on tools like assembly lines, computers, and the like (see also DiMaggio and Powell 1991a, 1991b).

The new approach/point of emphasis for neo-I folks is laid out by Walter Powell and Jeannette Colyvas in their 2008 chapter in “the big green book” of organizations and institutions — a copy of the paper is available in draft form at www.orgtheory.net right here.

And so the story goes:

1. Older research is cast as calling for “the need to make the microfoundations of institutional theory more explicit” (p.276). This is something that institutional theorists have had much success with — positioning papers to create the feeling that this idea is both new and exciting and that the call for microfoundations is an old one (that we now need to make good on). The opening lines of D&P’s 1983 paper do a good job of saying “that was then” and “this is now.”

2. The upshot: “much analytical purchase can be gained by developing a micro-level component of institutional analysis” (p.276), which would link “micro-concepts, e.g. identity, sense making, typifications, frames, and categories with macro-processes of institutionalization, and show how these processes ratchet upwards” (p.278). The invocation of “hierarchy” or “upward” levels is somewhat disconcerting for those of us set on flatter analysis, but there is likely room to show (and convince) that even the tallest, most stable actors and actions occur locally and laterally on a flat surface of interactions.

3. How can we, in STS, get some purchase on this?

A. Emphasize the interpretation of contextual factors (p.277) rather than assuming them (as has happened now and again in organizational theory devoted to field-level analysis — these are assumptions that occasionally must be made in order to do the diffusion studies so common in neo-I).

B. Display the on-going micro-maintenances of apparently stable institutional forms in daily practice AND/OR discover how stable institutional forms in daily practice result in change over time such that they transform the forms they are intended (in the behavioralist sense) to prolong.

C. Enliven the analysis of actors — old new institutionalism (let’s say) emphasized two types of actors, “cultural dopes” and “heroic ‘change agents’.” The reason: either action was essentially assumed to operate at a level unnecessary to fully capture during large-scale field studies (i.e., managers simply sought legitimacy at all costs, we assumed, and mimicked their peers), or, in the move to capture the actions of real actors (instead of assuming organizational entitivity), the studies overwhelmingly involved entrepreneurs and celebrated/worshipped their field-altering accomplishments. The new emphasis (of, let’s say, new new institutionalism) sort of smacks of STS lab studies, where we saw how the mundane facets of scientists’ behaviors in labs resulted in field-altering science. Now, neo-I wants to avoid momentous events, or, at minimum, show how seemingly huge events were a long time in the making and, like all experiments, involved loads of failure — which demands of writers the ability to show how local affairs prompt shifts in conventions (locally or broadly) (p.277).

Why is this so good for STS? We have already done much of this type of work, and have oodles of folks committed to these axioms for analysis. The only thing we really need now is a bridge between these two camps — while STS could not break into neo-I on the topic of technology, Powell and Colyvas might have just opened the door to a new institutionalism in STS…


One Plug to Charge Them All

A friendly fight over standards in the plug market for electric cars appears to be brewing, according to a NYT article this morning.

WITH electric cars and plug-in hybrids at last trickling into the showrooms of mainstream automakers, the dream of going gasoline-free is becoming a reality for many drivers. Cars like the Nissan Leaf and the Chevrolet Volt can cover considerable distances under electric power alone — certainly enough for local errands and even most daily commutes — while enabling their owners to shun gas stations.

The multimedia portion of the article is good stuff — I suspect similar pictures will be featured in an STS article sometime soon…

What does the "knowledge myth" mean for SKAT/STS?

A colleague of mine recently wrote about the “myth of knowledge” in a nice blog post. Perhaps one of the most interesting and controversial (and most [overly] generalized) points was about Aikido:

Because I am a behaviorist-leaning kind of guy, I would additionally point out that when behavior, talking, and thinking come into conflict, behavior wins. In my article trying to connect ecological and social psychology, I used an example out of Aikido, the martial art that prefers not to hurt people unnecessarily. Indulging in horrible generalizations: In the Western cultures – steeped in dualism and the myth of knowledge – we think that ‘knowing’ is about ‘thinking’, but in Eastern cultures this is not so. In Aikido, one of your goals is to blend with your opponent’s movements so you inflict minimal harm. Your goal is not to think about blending, not be able to explain how to blend, nor to be able to accurately imagine blending, rather your goal is to actually blend when the time comes. A person ‘knows’ how to blend when they do it without thinking, and regardless of whether they can teach how to blend or explain what they did after the fact. (By the way, that article is part of a 7 article discussion, including my latest addition now available online.)

One of the main points was the link between “knowing” and “doing,” and from a behaviorist perspective in psychology, this is an interesting position to take on such matters. He provides a number of examples, such as “how can a legless football coach know how to kick a football?”

Knowledge — be it tacit or explicit, fact-searching or its role in training scientists and engineers — plays a central role in SKAT and STS; however, I’m not entirely sure we’ve jumped on the behaviorist bandwagon just yet.

The ending question: what would STS look like without “knowledge” as a crutch during analysis?

Game theory and society, and infrastructures

I recently attended a conference on “Game theory and society” at the ETH in Zürich. It was a very productive conference with a good mixture of plenary sessions with people like Brian Skyrms and Herbert Gintis and the usual host of more work-in-progress oriented panel sessions. Speakers and attendants had backgrounds in sociology, philosophy, economics, biology, even in physics. If there was a common and unifying interest, this interest was in modeling elementary forms of cooperation. All the more striking was the nearly complete absence of people from sociological theory. Game theory, it appears, has been largely abandoned by sociological theory, leaving it to colleagues specializing in formal modeling or generally versed in quantitative methods. It happens that I found this to be quite a pleasant bunch of people to be around.
A couple of questions with respect to our interest in infrastructures have been bugging me since:
– I might start with the issue brought up by Nicholas a couple of posts ago: whether there is a problem in sociological theory with addressing questions of efficiency. After working through some of the contemporary game-theoretical research and comparing it to the state of the art in sociological theory, how could I not agree? Game theory could be one, if not THE weapon of choice for sociologists discussing questions of efficiency in an analytical manner, and evolutionary approaches have demonstrated that the use of game theory need not be congenial to either rationalistic or economistic reconstructions of efficiency. Evolutionary game theory is particularly good at showing how inefficient equilibria come about and turn out to be stable (see the sketch after this list).
– Closely related are questions of utility, which tend to be treated with a similar kind of disregard by many sociologists. One does not need to adopt a utilitarian perspective to see that analyses of how relationships and structures develop, how artefacts evolve and diffuse, etc. are correlated with (mostly implicit) ideas about utility. We may of course treat such ideas about the utility of contacts, associations, or tools as mere background assumptions of our observations of infrastructures, or we may broadly consider them as taken care of by looking at practice pragmatically. Seeing what can be accomplished by taking a more analytic approach to utility, I suspect though that we can do better than just telling utility stories (either with respect to particular cases or in the exposition of theory).
– Which brings me to the more general question of research orientation. Why is there so little modeling in STS and in the emergent field of studies of infrastructures? Researchers have been investigating broadly and writing quite generously about how complex forms of modeling are utilized in the construction of truths and technological artefacts but have been making little use of these methods and tools themselves. It is surely great to have so many sound STS case studies and ethnographies at our disposal in discussing our theoretical concepts and ideas about infrastructures, but again I suppose we could do much better with a less restrictive choice of methods and approaches. If there is a unilateral bias in favor of qualitative methods, story-telling and small-n studies, systemic problems in aggregating empirical data (if not, in the end, a constant recycling and re-invention of theoretical concepts with little progress in accumulating empirical intelligence) are likely to result.
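To make the claim about stable inefficiency concrete, here is a minimal sketch (the payoffs and numbers are illustrative choices of mine, not anything presented in Zürich) of replicator dynamics for a Stag Hunt: unless enough of the population already hunts stag, the dynamics settle on the safe but payoff-inferior hare equilibrium.

```python
# Replicator dynamics for a 2x2 Stag Hunt (illustrative payoffs).
# Strategy 0 = Stag (efficient but risky), 1 = Hare (safe but inferior).
payoff = [[4, 0],   # Stag meeting (Stag, Hare)
          [3, 3]]   # Hare meeting (Stag, Hare)

def step(x, dt=0.01):
    """One Euler step of the replicator equation; x = share of stag hunters."""
    f_stag = payoff[0][0] * x + payoff[0][1] * (1 - x)
    f_hare = payoff[1][0] * x + payoff[1][1] * (1 - x)
    f_mean = x * f_stag + (1 - x) * f_hare
    return x + dt * x * (f_stag - f_mean)

for x0 in (0.7, 0.8):  # two initial shares of stag hunters
    x = x0
    for _ in range(100_000):
        x = step(x)
    print(f"start at {x0:.1f} -> converges to {x:.2f}")
# start at 0.7 -> converges to 0.00 (all Hare: stable but inefficient)
# start at 0.8 -> converges to 1.00 (all Stag: the efficient equilibrium)
```

Everyone would be better off at all-Stag, yet with these payoffs every population starting below a 75% share of stag hunters flows to all-Hare: exactly the kind of stable inefficiency referred to above.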
Should we therefore not try to engage more with formal models of cooperation, social order and infrastructures? In Zürich, I found the doors to be generally open, and that there is a lot to learn in terms of concepts and methods. And I find myself encouraged to look into this in a more sustained manner.

Working around over time

Workarounds are:
1. Any way of tricking a system by using it in a way it was not intended to be used, but that still gives you the desired outcome. This was first (according to my research) raised by Gasser in 1986. The idea being that in some systems you could enter, for example, incorrect data in order to arrive at the desired outcome; the need for odd initial data stems from infelicities in the system, be it software or mechanical (see the sketch after this list).
2. “Jury-rigging” the system, wherein you haphazardly put something together but don’t expect it to work well forever. Sometimes referred to as “make-shift,” it works well enough for now — and this happens in computing all the time: you make a quick, often small, but necessary change in the system. Sometimes called a “kludge,” this is where the “permanence” issue is raised in research — how more or less permanent is a workaround, typically assumed to be of limited longevity? Of course, no matter what we make, nothing is permanent. Still, some things last longer than others, and more often than not with packaged software the “slightly-more-permanent” workarounds (in the form of system customization) are more common than the frequent but short-lived workarounds used in legacy systems [note: this may be a generalization too wide to bear evidence]. Still, this helps us better understand the longevity of workarounds.
3. The literature on workarounds is now split on the idea that they are “freeing” employees from the confines of the system, and increasingly scholars ask whether all this “freeing” (in research on ERP) creates its own subset of confines (suggesting that large numbers of expensive customizations to systems require some administrative oversight, which effectively balances the freedom from the previous system with the new need to control those freedoms). This helps us understand the autonomy-producing or -restricting quality of workarounds.
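As a toy illustration of the first type of workaround (a hedged sketch: the legacy_total routine and its drop-the-last-item infelicity are invented for the example, not taken from Gasser), the trick is to feed the system deliberately “incorrect” input so that its flawed processing yields the desired outcome:

```python
def legacy_total(order_lines):
    """Imaginary legacy routine with a known infelicity:
    it mistakenly drops the last line item from the sum."""
    return sum(order_lines[:-1])

def total_with_workaround(order_lines):
    """The workaround: pad the input with a dummy zero line so the
    real last item survives the legacy routine's flaw."""
    return legacy_total(order_lines + [0])

print(legacy_total([10, 20, 5]))           # 30 -- wrong total
print(total_with_workaround([10, 20, 5]))  # 35 -- the desired outcome
```

The input is, strictly speaking, incorrect data (there is no zero-value order line), yet it is exactly what the infelicitous system needs in order to produce the right answer.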
This seems to be the cost of customization: it at once frees you from the confines of the system, but also hurtles the system toward eventual decay (as we have observed with legacy systems), and this is sometimes referred to as “drift.” The more control you exert on the system — in this case, in the form of workarounds — the more brittle it gets, and the more it drifts, in principle, from the control of those charged with maintaining it. In this way, workarounds are kind of like using a mulligan in golf: it gives you a better chance in the short term, but in the end it keeps adding +1 to your score until you’ve lost completely.
However, if one could follow a set of workarounds through the years (and I’ve never seen research like this), explicitly watching them “decay” or “cost,” then the analogy to golf might be observed. When, in the short run, did the workaround get the organization out of a jam? Conversely, when, in the long run, did the workaround cost the organization more than it was worth?
If one could understand the process deeply enough, one could explicitly estimate at which times workarounds “beat the system,” meaning that one might be able to identify mulligans (i.e., workarounds) worth taking (i.e., making) and others which ought to be avoided.
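A back-of-the-envelope sketch of that estimate (all numbers hypothetical, with “drift” collapsed into a flat yearly maintenance penalty) might look like this:

```python
def net_value(saving, drift_cost_per_year, years):
    """Net value of a workaround: the immediate saving it buys, minus
    the cumulative 'mulligan' penalty it adds as the system drifts."""
    return saving - drift_cost_per_year * years

for years in (1, 3, 10):
    print(years, net_value(saving=5000, drift_cost_per_year=800, years=years))
# 1 -> 4200, 3 -> 2600, 10 -> -3000: a mulligan worth taking in the short
# run becomes a net loss if the workaround lives long enough
```

Real data would of course complicate both terms, but even this crude form makes the research question operational: estimate the two parameters per workaround and sort the mulligans worth taking from those that ought to be avoided.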

Some shameless self-promotion: On Technology and Society

Nicholas’s public question if there is a book on what the old theorists thought about technology offers a tempting opportunity for some “shameless self-promotion” that I was nearly too modest to seize. But in the pre-ASA mode that nearly every sociology blog I read is in at the moment…well, I´ll jump at the chance: I wrote a book similar to the one that Nick requested – only (sorry) in german and not outlined as a list of old scholars thoughts, but as a sociologized concetual history of explaining the relationship between technology and society.

The usual story is that there was first technological determinism, then social constructivism – a story of a big STS success. But a closer look reveals that the two underlying modes of explanation – technicism and culturalism – have been with us for at least 150 years. This conceptual dichotomy, already established in philosophy and early social theory (Kapp, Marx, Durkheim, Weber), produced, during a first crisis of modernity in the early decades of the 20th century, a first explicit version of technicism (Veblen, Dessauer) and, from the 1930s on, a first version of culturalism as a reaction to it (Spengler, Gilfillan, Mumford). Once stabilized as theoretical artifacts, these modes of explanation deal with the social and technical transformations of modernity by attributing them either to an inherent logic of technological development or to major and minor changes in modern society. This leads to pessimistic versions of technicism (Ellul, Jünger) and a critical version of culturalism (Adorno, Horkheimer, Heidegger) after World War II, then an anthropological version of technicism (Freyer, Gehlen, Schelsky) and a rationalist culturalism (Marcuse, Habermas) that accompany the stabilization of organized modernities until the 1960s. As a reaction to a second crisis of modernity, from the 1970s up to today, two versions of technicism and a radical relativist culturalism emerged: while new media technology and digital computing prompted a revival of deterministic thought (McLuhan, Postman, Flusser), a large body of empirical work focused on technology assessment was based on modest versions of technicism (Ogburn, Heilbroner, Rapp). The sociology of scientific knowledge (Barnes, Bloor) fostered first a moderate empirical micro-constructivist culturalism (Latour/Woolgar, Knorr-Cetina), then a historical macro-culturalism (Hughes, Constant, Dosi) and finally a radical social constructivist culturalism (Bijker, Pinch, Law).

From the 1960s on, these theoretical and conceptual differences have been additionally stabilized by bringing them into theory-political as well as real-political opposition. In this way the basic conceptual distinction between technology and society has been virtually naturalized; it has not been seriously called into question since the 1930s. But from the 1980s on, a number of attempts have been made to wipe the slate clean in social science theories of technology. These new approaches understand both the dynamics and the stability of society and technology as entangled and interrelated phenomena in need of explanation. Actor-Network Theory (Latour, Callon, Law), neo-pragmatist technology studies (Star, Fujimura) and systems theory (Luhmann) are just three of these new approaches. Despite their differences, they teach us to ask and answer questions about the relevance of materiality for the emergence and transformation of the social, about the material and technical mediation of agency and communication, about the importance of artifacts for the formation and change of social institutions and ideas, and about the role of technological developments in transforming modernity. To ask and maybe answer them, the discourse on social science theories of technology will have to be connected to the general discourse on social theory, on theories of society and modernity.


Public Question: What did the old theorists think about technology?

A while back I asked “does anyone know if there is a good paper or book about what Weber thought about technology?” which is an interesting question in light of new STS work. Marx has been paid some attention by scholars, but here comes the public question:

Is there a book that tackles, one chapter at a time, what the old theorists thought about technology?

This seems like a great edited book or mini-conference or mini-conference that turns into a great edited book.

So, next question:

Is there any interest in a book that tackles, one chapter at a time, what the old theorists thought about technology?

the role of reviews in the social sciences

All this discussion of new infrastructuralism and the infrastructural relation between “black-boxing” and “institutionalization” makes me think about the role of reviews in social science.

The question is: is a concept what it is, or does how it has been used constitute what it (now) is?

I think this for the following reason: Latour, in Actor Network Theory and After, goes into something of a litany regarding the ways that ANT has been used and abused over the years since he and Callon (and Woolgar, honestly) thought of it. Of his many points, a meta-point matters for this post: he basically states that as his ideas spread, they increasingly got used in ways he did not expect, and then Latour makes something of a value judgment in suggesting that some research, which appears to be relatively more current as compared to his original works, doesn’t do ANT right. Of course, Latour takes some blame in saying that perhaps the entire moniker (the A, the hyphen, the N, and the T) was not perfect, but it still seems like an odd point to hear from Latour. About 120-ish pages into Science in Action, as part of the translation model of how things spread (i.e., diffuse, though he considers this a dirty word), Latour demands that spread requires change — that a technology, for example, must change as it enters into new hands. This was a counterpoint to the diffusion of innovations literature (which he hardly cites) and its supposed assumption that diffusion, as an idea and model, only works so long as we assume the innovation is “constant” over time (meaning that it does not and will not change). Getting to the point: ANT was going to have to change to spread so widely, and the ideas would necessarily be used in ways unintended and perhaps unacceptable to its originators.

Again, then, the question is: is a concept what it is, or does how it has been used constitute what it (now) is?

Latour contributed to the notion of “black-boxing” as much as perhaps any scholar of the last 30-ish years; given his disappointment with how some of us have used his concepts, does this value judgment really matter? Or does it matter more for science not to judge how concepts have been used and instead to document how they have been used, because the way they have been used is effectively what they are?

Returning full circle: in all this discussion of new infrastructuralism and the infrastructural relation between “black-boxing” and “institutionalization”, what would make the best review paper? Reviewing the terms as if they are not artifacts changing hands, in order to conceivably arrive at some core meaning of these concepts, or reviewing how the terms have been used, on the premise that this will tell us more about their operational meaning?

Institutionalism and Infrastructuralism – some first thoughts on differences

In a recent post following Nicholas’s thoughts about black-boxing and taken-for-grantedness, and about what that could mean for discussing the respective benefits of STS and neo-institutional theory, I asked: what is the difference between institutions and infrastructures? Nicholas and I discussed this today for the first time in detail, and we thought it might be worth posting to see if it makes sense.

Neo-institutional theory is – to make a very long story short – based on the question of how so many different things (organizations, models, cultural forms) become similar over time. This is the basic problem in DiMaggio/Powell (1983): understanding institutional isomorphism once the impetus of rationalization is taken away. It is the problem that Strang and Meyer worked on when studying the institutional conditions of diffusion (1993). Its central focus was – as Powell argued in 2007 – on “the field level, based on the insight that organizations operate amidst both competitive and cooperative exchanges with other organizations.” DiMaggio (1988) and Powell (1991) both noted that this was a bit too smooth and that institutional arguments would need a more detailed perspective on contestation, fragility and struggles. Nevertheless the framework provided a fresh and new way to understand institutions – so productive that it framed a discipline or two.

Infrastructure studies, on the contrary, focused on how things can appear systematic and highly integrated while actually being implemented in many heterogeneous, historically contingent local processes (Bowker and Star 1996; Star and Ruhleder 1996). In some ways, diffusion becomes less important as implementation takes a more central role. Infrastructures are not built by system builders, but loosely screwed together through complex arrangements of interfaces, gateways, and workarounds, as Edwards showed in 2003 and in his fabulous book on climate models (2010). However, there seems to be a tendency to focus on the normalizing and standardizing effects of classification systems implemented in large infrastructural settings – something like the Weberian “iron cage” of infrastructure studies, visible already in Sorting Things Out and very strong in the work of Hanseth and Monteiro (1997; Monteiro 1998).

The link seems obvious, doesn’t it? Neo-institutionalism starts by looking at heterogeneous stuff and finds it similar – too similar, perhaps, so that it sometimes misses the complexity of the social world. But it is a great framework for strong explanations. Infrastructure studies look at systems and find them fragile and fragmented inside. But they seem to lack that “big explanatory” power, which leads to giving up the focus on local multiplicity and emphasizing standardization/normalization instead. Could the strengths of both be combined to get a good grasp of the installation of social order under (high) modern conditions?

The new infrastructuralism?

Jan-Hendrik and I were discussing this yesterday: “what would it take to create the new infrastructuralism?” In a way, it would be analogous to the new institutionalism, but with a different (although overlapping) set of topics, etc.

We’ll try to post the beginnings of our ideas soon, wherein we will see whether such a theory could be meaningfully erected.

Technologies/Black-Boxing and Institutions/Taken-for-granted – A question of levels?

This post started as a comment to Nicholas’s post on the museum but became so long that I decided to make it a new post. The debate on black-boxing and “taken-for-grantedness” (or STS & new institutional theory) tackles some very important points. It reminds me of T. Pinch’s (2008) paper on technologies and institutions. Pinch’s focus is on the problem of “skill,” and he argues that (new) institutional theory – for example, in its micro form in the works of Fligstein – focuses on only a very small aspect of the ways an institution is made materially stable. Technology, he argues, adds at least a second way, because it is black-boxed, not just taken for granted.

The reason I think this is only partially true is that such an argument is only possible by conflating levels of “taken-for-grantedness.” Sociology knows a whole spectrum of ways to make social order become taken for granted: from the taken-for-granted stream of everyday routines and interactions that make up Schütz’s life world, to Mauss’s and Bourdieu’s techniques of the body that constitute the habitus, from Berger and Luckmann’s (or also Gehlen’s) processes of institutionalization to Foucault’s epistemé, Polanyi’s tacit knowledge, and Ryle’s “knowing how.” Technology, if we follow this route, could be added to the book of tactics for making practice become taken for granted – through a very distinct process that has been described as “black-boxing,” in which some aspects are packaged and sealed away while others are delegated to specialists (for example, for maintenance and repair). It is distinct from at least two other tactics precisely because of the form of this process. Embodying habits and skills, for example, is a process of becoming taken for granted through routine and repetition. Discursive closure is a matter of rhetoric, persuasion, and concealment.

Institutions and infrastructures, I suppose, are strategies of “taken-for-grantedness” on a different level: they are hardly stabilized by just one of the discursive, habitual, or technological tactics just described. An institution can be based neither on skills alone, nor on legitimizing and regulating discourse alone, nor on technology alone. Hey, we know from a long tradition of STS research that not even technology can rely on technology alone. Institutions and infrastructures are complex installations – hybrids or monsters, if you will. They both rely on a fragile architecture of “taken-for-grantedness” – plug-ins. What is the difference, then?

ASA blogger party and other ways to meet Nicholas Rowland and Jan Passoth

I want to note the announcement of the annual ASA ScatterPlot Blogger Party! Details here. Short story: Sunday, August 21, 4:30pm at the Seahorse Lounge at Caesar’s Palace. I hope to see many of you there!

Otherwise, I’ll be presenting a paper with Jan-Hendrik Passoth on state theory and joining another roundtable about state power (power being a dirty little word). Come say “hi” — I’ll be wearing the functionalism t-shirt and have even made one for Jan!


NOTE: *This message was playfully plagiarized from my mentor and friend, Fabio.*

The "lighthouse" (re: Coase) in new institutionalism is the museum

Per recent discussions of black-boxing and institutionalization, a paper that Fabio and I wrote seems worth remembering. It was a piece we wrote to interrogate the use of “the museum” by new institutionalists of the organizational-analysis bent. The abstract:

Sociologists that study organizations often analyze the museum from a cultural perspective that emphasizes the norms of the museum industry and the larger society. We review this literature and suggest that sociologists should take into account the technical demands of museums. Drawing on insights from social studies of technology, we argue that museums are better understood as organizations that must accomplish legitimate goals with specific technologies. These technologies impact museums and the broader museum field in at least three ways: they make specific types of art possible and permit individuals and organizations to participate in the art world; they allow actors to insert new practices in museums; and they can stabilize or destabilize museum practices. We illustrate our arguments with examples drawn from the world of contemporary art.

The black-boxing of a technology is different than the incremental emergence of “taken-for-grantedness”

Below is an excerpt about black boxing and taken-for-grantedness, which I wrote with Fabio Rojas years ago:

The black-boxing of a technology is different than the incremental emergence of “taken-for-grantedness,” which comes from writers such as Schutz (1967) and Berger and Luckmann (1966). They argue that knowledge in everyday life is taken for granted by individuals as reality, “but [that] not all aspects of reality are equally unproblematic” (p. 24).  They provide an example germane to this discussion:

  • …suppose that I am an automobile mechanic who is highly knowledgeable about all American-made cars. Everything that pertains to the latter is a routine, unproblematic facet of my everyday life. But one day someone appears in the garage and asks me to repair his Volkswagen. I am now compelled to enter the problematic world of foreign-made cars (p. 24).

Technologies, they contend, like those related to car repair, get taken for granted over time and through expertise. But by looking at technologies as black boxes, scholars can gain a fresh perspective on the institutionalization of technology by emphasizing how stable technologies stabilize human networks, rather than how routinization results in a technology’s disappearance for organizational actors. Returning to Berger and Luckmann’s example: as a black box, automobiles and the networks of dependence and exchange built up around them are concealed (or ignored). Further, the patterns of human behavior that make a mechanic’s garage the one place to fix broken cars are missed because of the emphasis on how technologies get “taken-for-granted.”

Seems Berger and Luckmann’s (1966) old work on the social construction of reality might find new use in distinguishing black-boxing from institutionalization. Also, please note: if there was one thing that early Latourian thinking and the new institutionalism in organizational analysis were both looking to unlock, it was how something gets sealed up and stabilized over time to the point of being taken for granted as real, true, or rational. Look back at the early pages of Latour’s book on Pasteur — it opens with the image of Rue Pasteur and asks how we got this … a good question, no?

"Black box" and "taken-for-granted"

I recently asked:

When are the processes that bring about a black box the same as those that bring about — in the institutionalist frame — the notion of taken-for-grantedness … and, when are the processes that bring about either of these notions incapable of producing the other?

Upon further reflection, this seems to be an obvious paper, one that might bridge some of the thinking about technology and institutional arrangements. Restated as a few thesis questions, it would go:

Q1. What circumstances/processes do the concepts of “black box” and “taken-for-granted” both meaningfully capture?

Q2. What circumstances/processes does the concept of “black box” meaningfully capture that the concept “taken-for-granted” cannot?

Q3. What circumstances/processes does the concept of “taken-for-granted” meaningfully capture that the concept “black box” cannot?

Seems like an interesting review piece to see where organizational theorists and STSers have historically overlapped and where they have diverged, with the caveat that each might learn something if orthogonal points of divergence were reconsidered in the respective lines of research.

Timmermans has done it again — this time about failures!

I have always enjoyed reading Stefan Timmermans’s research, and his new piece in STHV is no exception.

The abstract, which is below, is not only a good reversal of an old idea, but also solid prose — worth the read.

Abstract

Sociologists of science have argued that due to the institutional reward system negative research results, such as failed experiments, may harm scientific careers. We know little, however, of how scientists themselves make sense of negative research findings. Drawing from the sociology of work, the author discusses how researchers involved in a double-blind, placebo, controlled randomized clinical trial for methamphetamine dependency informally and formally interpret the emerging research results. Because the drug tested in the trial was not an effective treatment, the staff considered the trial a failure. In spite of the disappointing results, the staff involved in the daily work with research subjects still reframed the trial as meaningful because they were able to treat people for their drug dependency. The authors of the major publication also framed the results as worthwhile by linking their study to a previously published study in a post hoc analysis. The author concludes that negative research findings offer a collective opportunity to define what scientific work is about and that the effects of failed experiments depend on individual biography and institutional context.

Belgian STS network kick-off event * Sept 30th, 2011

For those of you in Europe, this might be an interesting opportunity to travel, meet great people, and strengthen the international network of STS: scholars in Belgium are gathering for the first meeting of the Belgian Science, Technology and Society (BSTS) network – a network that started

“… in 2008 as an ad-hoc academic platform, the BSTS network enables STS researchers in Belgium to share with one another their research interests and disciplinary perspectives and to foster collaboration across different fields and locales. The network now extends its hand beyond academia and beyond Belgium to engage an international community consisting of people from research centres, industry, policy making and other professionals with an interest in cross-disciplinary learning and knowledge sharing.”

Here is some more information on the Belgian STS Network.


Evading efficiency arguments is what sociology is good at

Why is sociology so afraid of efficiency arguments?

After re-reading this great old piece …

Oberschall, Anthony, and Eric M. Leifer. 1986. “Efficiency and Social Institutions: Uses and Misuses of Economic Reasoning in Sociology.” Annual Review of Sociology 12:233-253.

… I was reminded that sociology has made something of a history of explicitly avoiding extant arguments regarding efficiency.

Marx, for example, rejected efficiency and emphasized the exploitation of labor by the bourgeoisie. Given Marx’s economic theory of value and labor, exploitation was the only way to get more value than was invested in fairly paid labor (e.g., the wage from six hours of work a day is enough to feed and clothe a family of four for a day; however, lacking the means of production and thus bargaining power, workers might work eight hours per day rather than six for the same wage). Thus, the creation of surplus (i.e., profit). However, a falling rate of profit was expected as capitalists competed with each other in hopes of attracting more and more laborers, which ultimately cut into profit margins. Enter machines. The primary problem for Marx, however, was that machines could bring no real efficiency or profit; machines are incapable of producing profit (or can do so only for a short time) because all competitors will soon have them. At this point, each capitalist is back to “square one.” Simultaneously, the price of machines goes up and the price of products goes down. Thus, profit has to fall and efficiency is lost (although, according to contemporary economics, profits fall within the business cycle but not across cycles, showing some flaw in Marx’s thinking). Still, as it happens, “Machinery and improved organization provide … [enhanced efficiency] too, because they increase the productivity of labor” (p. 42, Collins and Makowsky 1998).
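To restate the parenthetical wage example in standard Marxian terms (a worked illustration of the arithmetic already in the passage, not notation from Marx or from Collins and Makowsky): if the daily wage equals the value produced in six hours of “necessary labor” but the working day runs eight hours, then surplus labor is s = 8 − 6 = 2 hours, and the rate of surplus value is s / 6 = 2/6 ≈ 33%. Exploitation, on this account, is simply a positive s extracted under a formally fair wage.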

Also writing at a time of great scientific and industrial progress, Durkheim, in contradiction to the rationalists, finds “society … a ritual order, a collective conscience founded on the emotional rhythms of human interactions” (p. 102, Collins and Makowsky 1998). Even though specialization (in the form of organic solidarity) holds society together (despite the loss of mechanical solidarity), efficiency seems to play a lowly role in Durkheim’s models of integration.

Weber comes closest to allowing efficiency some room to breathe. Still, above efficiency stood his deep-seated concern with organizational stability. Groups were organized and stabilized either through strong personal ties (patrimonialism) or by setting rules (bureaucracy), which follows broadly from Tönnies (Gemeinschaft and Gesellschaft, respectively). Domestic or personalistic organizations, like the family estate, were those wherein close friends and family members made up the bulk of enterprise employees and related services (be they war, trading, tax collecting, etc.). Of course, personalistic forms of organization are not easy to control and are seemingly inefficient (as compared to, for instance, a bureaucracy). The organization of communications is poor—what starts as a direct order at the top of the chain of command ends up a rumor, a whisper, or nothing at the bottom rungs. Under certain circumstances, innovation is ignored or resisted in favor of tradition—doing as was done last time, or as far back as can be remembered, for the sake of personal ease and safety from criticism from above. Authority at the top dissipated over time as top assistants grew in power and potentially seceded. The bureaucracy would fix all that by establishing rules and regulations to guide individual behavior even in the absence of authoritative oversight. While bureaucracy can be interpreted as an efficiency argument, Weber’s focus on the cultural underpinnings of groups like the Protestants in shaping historical achievements, along with his works on Judaism, China, India, etc., leads me to believe that culture, rather than efficiency, was at the root of his arguments.

There are no doubt many more — certainly the old functionalists like Selznick and Merton (who showed the dysfunctions of bureaucracy) would fit right in…

Social significance of gap analysis

Although I’m not entirely sure of the implications for infrastructure, gap analysis seems promising as a research site — and yet, despite widespread use in the management and implementation of software, gap analysis remains an untapped and underappreciated workflow-analysis technique in research.

In general, gap analysis takes three forms, each of which documents the gap between two states: current versus future, expected versus actual, perceived versus delivered. The difference between the two states defines the gap, and from such assessments further analyses, such as benchmarking, become possible (Boxwell 1994).

The first form is a map. Cartographic representations are mainly utilized in lean management to chart the flows of raw materials – including information – currently necessary to make a product or service available to consumers, so that those flows can be assessed for efficiency and waste. Once areas for improved flow and reduced waste are identified, analysts draw them into a future-state value stream map. The differences between the two states define the gaps, which orient work toward that future condition. This map form of gap analysis was developed at Toyota (Rother and Shook 1999).

The second form is a step chart. Temporality is built into the step chart, which also identifies and compares current practice and a desired future state for the performance of a service or product. Brown and Plenert (2006:319) provide a good example of where a step chart might address the gap between expected and actual states: “customers may expect to wait only 20 minutes to see their doctor but, in fact, have to wait more than thirty minutes.” Step charts lay out the steps necessary to move from current practice to future practice (Chakrapani 1999).

The third form, which is most appropriate for working around packaged software, is a cross-list. Such analyses are most routinely undertaken in consumer research, wherein gap analysis refers to the:

methodological tabulation of all known requirements of consumers in a particular category of products, together with a cross-listing of all features provided by existing products to satisfy existing requirements. Such a chart shows up any gaps that exist (n.a. 2006).

Once cross-listed in table format, gaps make themselves obvious, and their analysis points to unmet consumer demand that new or poorly marketed products might fulfill. However, prior to the establishment of a cross-list, consumer expectations and experiences must be gathered, for example, through focus-group interviews. Once collected and made to populate a cross-listed table, according to Brown and Plenert (2006:320), “gaps can be simply calculated as the arithmetic difference between the two measurements for each attribute.”
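For readers who want to see the mechanics, here is a minimal sketch of that cross-list arithmetic in Python. The attribute names and scores are hypothetical illustrations (not data from Brown and Plenert or any study); the only technique shown is the per-attribute difference between expected and experienced measurements:

```python
# Cross-list gap analysis: per-attribute arithmetic difference between
# what consumers expect and what they report experiencing.
# All attribute names and scores below are hypothetical illustrations.

expected = {"wait_time": 9.0, "staff_courtesy": 8.5, "billing_clarity": 7.5}
experienced = {"wait_time": 5.5, "staff_courtesy": 8.0, "billing_clarity": 6.0}

# A positive gap means expectations exceed experience, i.e., unmet demand.
gaps = {attr: expected[attr] - experienced[attr] for attr in expected}

# Rank attributes by gap size so the largest unmet demands surface first.
for attr, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{attr}: gap = {gap:+.1f}")
```

Ranking the gaps is just one convenience; the substantive point is that, once the table exists, the “analysis” is nothing more than subtraction down each row.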

The Journal of Science Policy and Governance

http://www.sciencepolicyjournal.org/

Now accepting rolling submissions!
The Journal of Science Policy and Governance is an interdisciplinary journal that seeks high-quality submissions on emerging or continuing policy debates. Current students (undergraduate or graduate) and recent graduates within three years of earning a degree (bachelors, masters, or doctoral) are eligible to submit. We seek to publish articles on a variety of policy areas including: scientific research, engineering, innovation, technology transfer, commercialization, bio-medicine, drug development, energy, the environment, climate change, the application of technology in developing countries, STEM education, and space exploration. Submissions on other topics are also welcome as long as they relate to the theme of science policy and governance. The Journal strives to publish articles in a timely manner to ensure that publications can be considered in the context of current policy debates.

Please see website for submission guidelines. Questions and/or submissions should be sent to jofspg@gmail.com.

Painting infrastructure

I wonder if graffiti could be a “strategic research site” for scholars of art, sociology, and infrastructure. In the New York Times today, a story line reads “Cities Report Surge in Graffiti” (see also these slides).

An upturn in graffiti has renewed debates about whether it signifies alienation in struggling areas or whether its glorification contributes to urban blight.

It raises an issue I have not yet heard discussed: that infrastructure might be modified simply by changing its exterior appearance, in this case, via graffiti. In a way, this would imply individuals “tagging” infrastructure as an indicator of, for example, its quality, which would then contribute to, if not certify, the current state of affairs. I don’t know of any research that theorizes the influence on infrastructure of its “wrappings” (shall we say).

Winnie the Pooh and infrastructure?


Okay, this is less about infrastructure, but a question has been brewing and I cannot seem to find a suitable answer. In recent years, Pixar has made millions with the following Disney formula:

Reformulate … [for example, Alvin and the Chipmunks or the Smurfs] in 3-D, give them a skateboard and sunglasses, add some dance moves, and inundate children and their nostalgic parents with advertising.

But this seems to fail when it comes to Winnie the Pooh, or there might be a fatigue effect wherein consumers are tired of being fed the same old rehashed 3-D stories. Either way, Pooh is going to hit the silver screen hand-drawn.

So, why might it be that Pooh can’t go 3-D?

From "forces of production" to "forces of customization"

A new line of research might open up if we read David Noble‘s Forces of Production again and ask: what is its relevance for contemporary packaged software? Noble, who passed away last December, wrote what is arguably one of the best books in STS’s past on the role of managerial power in determining the direction of technological development, much of which is accomplished by selecting one technology over another to foster toward future development. Additionally, Noble keyed us all into the idea of the “path not traveled,” wherein we consider “what might have been” had another road been traveled (i.e., had another technology [or no technology] been selected).

In some ways, I think Noble’s work appears old-fashioned to new scholars (despite his excellent empirical material). But maybe not.

Perhaps we can extend his ideas about managerial power being augmented by selecting one technology over another toward an analysis that predicts managerial power is instead augmented by iteratively selecting the ongoing, customized form of a flexible technology (i.e., customization as an ongoing process rather than a nominal, usually binary, decision breakpoint).

rethinking workarounds

At least since the work of Les Gasser (1986:216), the act of working around (or jury-rigging) and the resulting workaround (or kluge) have been good fodder for Science and Technology Studies (STS). In the transition from building administrative software in-house to purchasing packaged software solutions from private market vendors, the workaround has received renewed attention from scholars. And rightly so. These are pressing matters given the widespread use of packaged software, the near irreversibility of implementation projects once initiated, and the reported high probability of dissatisfaction following installation.

Workarounds are commonly employed to localize, maintain, and extend software programs, especially when it is necessary to coax occasionally suboptimal implementations into functioning properly as the systems age. Still, workarounds have their limitations; they grow brittle over time, but remain crucial for freeing users from restrictive or incomplete systems. Research on packaged software, however, challenges the notion that systems are still worked around. Designers of packaged software anticipate user modifications. Embedded and increasingly inter-organizational actors now determine when a work endeavor is or is not defined as a workaround. Pollock (2005), examining a case of packaged software being implemented in a university setting, shows how the boundary between users and designers is relationally dynamic rather than static. This is clear when, for instance, local users share code with software designers hired by vendors, but also when designers distance themselves from responsibility for specific user problems by categorizing some problems as local (rather than the general concern of numerous implementing organizations) (505, 503). Because packaged software appears to presuppose that adopting organizations will participate in the design process by modifying the software for local use, Pollock seems to have updated Gasser’s (1986) notion of working around. Pollock calls into question Gasser’s (1986:216) original formulation, specifically, that working around implies “intentionally using computing in ways for which it was not designed,” given that the tools to work around are already embedded within packaged software.

Research also suggests that workarounds are not as freeing as previous literature indicates. Modifications do not only free local users from the constraints of technology; they also create tensions inside and between organizations. For example, modifications that are difficult, and therefore slow to establish, create tension between employees and their supervisors (Pollock 2005:507). Likewise, some modifications generate conflict between support desk operators and local programmers concerning who is responsible for coaxing the packaged software into operation (506). Also examining packaged software in higher education, Kitto and Higgins (2010), through the lens of governmentality, show how a university wrested control over its newly adopted ERP by modifying it. Surprisingly, once modified, the resulting system did not appear to create renewed autonomy for employees. Instead, control over the system simply shifted from the monolithic vendor to a more local supervisor charged with maintaining jurisdiction over the host of new modifications.

In the move from homegrown to packaged software in higher education, traditional interpretations of the workaround seem to be transforming, and with them, I imagine, the precursors of workarounds – the “gaps” in system operations that workarounds necessarily bridge … although there is scant research on where these workarounds come from.

Diaspora* as an alternative to google+ and facebook?

Diaspora* is another social networking site (nearly a year old) … just like facebook, with one important exception. Here is an excerpt from a New York Magazine article:

… as the name suggests, their project was intended less as an imitation of Facebook than as an escape route from it—a path to freedom for those who had come to fear the dark side of the social network. In the years since Facebook launched (and long before Aaron Sorkin decided to take a whack at it), the service had begun to feel unsettling, sinister, less a benign link to friends and more a stealth database, open to all takers.

Diaspora*—if it worked—would do everything Facebook did. But users would own their data.

Will google+ close the coffin on Diaspora*?

Special Issue in Science Studies accepting papers on "Patient 2.0"

As an outcome of Track 026 of the last EASST meeting (in Trento, IT), the organizers have:

been working to edit a special issue on “patient 2.0”. We are pleased to announce we’ve been hosted by Science Studies (http://www.sciencestudies.fi/) as guest editors of a forthcoming publication on the theme.

The journal has had a long-standing reputation for publishing high-quality articles in the field of Science and Technology Studies since the end of the eighties. Science Studies is an Open Access journal, and we invite you to have a look at its latest issue to better grasp the kind of submissions it welcomes.

The call for papers is attached and, as you will notice, it is an evolution of the track’s CfP (the call is also available online at http://www.sciencestudies.fi/node/2070). If you have not already published your work elsewhere, we encourage you to submit your paper for evaluation before 31 January 2012. Of course, you are free to submit a completely new work as long as it is consistent with the call. All papers will be anonymously reviewed and evaluated jointly with the editorial board of the journal.