Alright, as promised, I will provide you with a little report on my experiences with and impressions of this year’s 4S conference in Sydney, Australia. The conference’s overall theme was “TRANSnational” STS, something that was well reflected both by the conference’s content and by the ‘performance’ of its participants, many of whom travelled very far to reach the conference venue. The 4S took place, once again, [Read more…]
Before I post a short report on the (spoiler so far: great) 4S conference 2018 in Sydney, Australia, I want to share a thought that was inspired by a very compelling discussion I had at the conference with Nicholas Rowland and Barbara Bok, but that is also somehow a product of the conference as a whole and its vibrant discourses. [Read more…]
Form, Matter, and Pragmatics
The ascent of computer simulations in scientific modelling coincided not only with the rise of the digital computer as such but also with the advent of data visualisation technologies capable of representing the dynamics of a “target system”. In accordance with Mary Hesse’s distinction between formal and material analogies in scientific modelling, computer simulations appear to assume a peculiar dual nature: First, simulations typically are computational realisations of underlying formal models of their target systems, and as such help to determine their empirical correctness. Second, simulations typically comprise an aspect of material modelling, so as to make relevant properties of the target system perceivable. How do these two aspects of computer simulations interact, both on epistemological and pragmatic levels?
Being a relatively recent method of scientific inquiry, computer simulations are an even more recent topic of social and philosophical inquiries into science and technology. There are arguments for their genuinely “motley” epistemological character, for their characteristic “epistemic opacity”, and for their fundamentally dynamic nature, but also for treating them as experiments ‘in silico’. However, these debates only take limited account of simulational practice and its epistemic import. Conversely, from an STS perspective, computer simulations can be understood as complex networks of computer software, hardware, and infrastructures as well as of institutions, communities and practices. However, it remains to be demonstrated how these networks and practices affect the modes of knowledge thus produced. This project aims at an integration of STS and philosophical perspectives in order to address the epistemological and pragmatic aspects of computer simulations as an integrated whole.
The two-part working hypothesis to be tested in the present inquiry is this: First, one model might be realised in a variety of simulational fashions. The formal model bears the primary responsibility for representing the target system, whereas the computational core and the empirical rendering of the simulation are underdetermined by that formal model. Second, the criteria that join the formal and material aspects together are essentially pragmatic. Simulations are chosen in accordance with what shall be communicated about the target system within a community of researchers. In typical contemporary research settings, that community is heterogeneous in disciplinary, topical and technological terms. A computer simulation will be vindicated if and when it is successful on both levels. Underdetermination on the first level is resolved on the second, and problem-oriented, pragmatic empirical adequacy outdoes more foundational epistemological criteria of success.
Doing Data, Computational Culture and the Machinery of Governing
“State” and “statistics” are connected not only etymologically. When “political arithmetic” was introduced in England around 1676 and “Statistik” in Prussia around 1749, each was a science of the state. With demographics, public health and crime statistics, questions of regulation became deeply linked to technoscientific reasoning. Since then, data has been the blood that circulates through the modern body politic: the development, implementation, and tinkering with methods and practices of data gathering, storing and processing turned governing into a computational endeavor.
The last decade, however, has seen the rise of data science, which moved data-driven applications to the core of a new scientific field. Today, data comes from many sources and in messy forms; it is widely distributed, both physically and legally, and used for various purposes by various agencies. But new actors and new forms of expertise do not merely challenge well-adjusted procedures of regulation and control. What is at stake are the implicit, explicit and implemented categories and models that govern who and what is governed, how to know and access who and what is governed, and who or what should and can be the agent and the object of governing. Given the interwoven history of the state and statistics, the turn to data science has the potential to reassemble the machinery of governing.
“Upgrade” is the tentative title of my current book project, which focuses on the link between these new and transformed ways of doing data, the new role of computing (and of those who do computing, namely mathematicians, IT engineers and data scientists), and prototypical – or beta-stage – developments toward a new machinery of governing. The project investigates how the changes in methods and practices of data gathering, storage and processing that are linked to the rise of data science are rearranging the models and the daily work of governing; it will also examine whether and how, in reverse, the methods and practices of data science are changed. If the infrastructure of governing counts on and with the rational practices of statistics, what happens when those change?
The meaning, chances and risks of social media
The explosive growth of social media in society in general, and in science in particular, raises expectations of radical change in traditional social structures and institutions. Optimistic assessments assume that society and science become more democratic, more efficient, and more in touch with the people. Pessimistic evaluations point to a loss of quality and to disinformation of the public. Sober analyses of the influence of social media on the field of science communication are rare. To allow a well-balanced evaluation of the current situation, the working group involves representatives of the key stakeholders in the field of science communication: researchers from different disciplines, (science) journalists, scientific PR staff, and social media users. The project is organized in collaboration with Acatech, the Leopoldina, and the Union of the German Academies of Sciences and Humanities.
A central question of the project is how specific characteristics of social media influence the field of science communication. Accordingly, the opportunities of the new technologies (e.g. extended reach, expanded participation, interactivity) as well as their risks (insufficient quality control, disinformation, mainstreaming, fragmentation) are identified and assessed.
For this purpose, the working group has interviewed different stakeholders, established its own project blog, and organized a public workshop. Furthermore, the team will publish a position paper outlining the important research questions and perspectives, giving an overview of the current state of research, and offering recommendations on how to use social media in ways that enhance its opportunities and reduce its risks for science communication.
Case Studies of a Blog Portal
Blogs represent a new environment for sociotechnical communication and create new conditions for the communication of scientific topics. In contrast to the formal communication systems of science, anybody and any content can – at first glance – take part. In other words, the interplay between the technically produced openness of science blogs on the one hand and the social underdetermination of scientific blogging practices on the other makes it more difficult to recognize (proper) science. In these new media ecologies, the difference between science and non-science therefore becomes more virulent. The lack of useful demarcation criteria between science and non-science in science blogs gives rise to recurrent talk of “real science” and to the discrimination of various forms of “bogus science”. Consequently, the scientific blogosphere is rife with debates and controversies about pseudoscience, alternative medicine, esotericism, intelligent design, religion, and so on.
What are the typical patterns of this scientific boundary work in science blogs, and what are its main functions? Which social groups come together to fight “bad science”, and what communicative strategies are used to marginalize or exclude persons, institutions, or topics from the realm of “real science”?
To answer these questions, I use reconstructive methods to explore this scientific boundary work in three case studies. Apart from the many practical self-reports on – or, in contrast, sceptical reflections about – science communication, reconstructive social research on new forms of science communication is rare. Building on the case studies, the project also aims to develop theoretical explanations of the boundary work in science blogs.
New forms of communicating science have grown enormously over the last ten years in the context of Web 2.0. The classical boundary-work approach is one way to analyze these new forms of science communication from the perspective of science and technology studies. Although science has always been accompanied by (historically changing) specific patterns of boundary work, the project follows the idea that, in some respects, the boundary work in science blogs is a new phenomenon, grounded in a changing communication ecology based on the digitalization of society in general and of scientific communication in particular.