Alright, as promised, here is a little report on my experiences with and impressions of this year’s 4S conference in Sydney, Australia. The conference’s overall theme was “TRANSnational STS”, something that was well reflected both in the conference’s content and in the ‘performance’ of its participants, many of whom traveled very far to reach the conference venue. The 4S took place, once again, [Read more…]
Before I post a short report on the (spoiler so far: great) 4S conference 2018 in Sydney, Australia, I want to share a thought that was inspired by a very compelling discussion I had at the conference with Nicholas Rowland and Barbara Bok, but that is also, in a way, a product of the whole conference and its vibrant discourses. [Read more…]
Form, Matter, and Pragmatics
The ascent of computer simulations in scientific modelling coincided not only with the rise of the digital computer as such but also with the advent of data visualisation technologies capable of representing the dynamics of a “target system”. In light of Mary Hesse’s distinction between formal and material analogies in scientific modelling, computer simulations appear to assume a peculiar dual nature: first, simulations typically are computational realisations of underlying formal models of their target systems, and as such help to determine those models’ empirical correctness. Second, simulations typically comprise an aspect of material modelling, so as to make relevant properties of the target system perceivable. How do these two aspects of computer simulations interact, on both the epistemological and the pragmatic level?
Being a relatively recent method of scientific inquiry, computer simulations are an even more recent topic of social and philosophical inquiries into science and technology. There are arguments for their genuinely “motley” epistemological character, for their characteristic “epistemic opacity”, and for their fundamentally dynamic nature, but also for treating them as experiments ‘in silico’. However, these debates take only limited account of simulational practice and its epistemic import. Conversely, from an STS perspective, computer simulations can be understood as complex networks of computer software, hardware, and infrastructures, as well as of institutions, communities, and practices. However, it remains to be demonstrated how these networks and practices affect the modes of knowledge thus produced. This project aims at an integration of STS and philosophical perspectives in order to address the epistemological and pragmatic aspects of computer simulations as an integrated whole.
The two-part working hypothesis to be tested in the present inquiry is this: First, one model might be realised in a variety of simulational fashions. The formal model bears the primary responsibility for representing the target system, whereas the computational core and the empirical rendering of the simulation are underdetermined by that formal model. Second, the criteria that join the formal and material aspects together are essentially pragmatic. Simulations are chosen in accordance with what is to be communicated about the target system within a community of researchers. In typical contemporary research settings, that community is heterogeneous in disciplinary, topical, and technological terms. A computer simulation will be vindicated if and when it is successful on both levels: underdetermination at the first level is resolved at the second, and problem-oriented, pragmatic empirical adequacy outweighs more foundational epistemological criteria of success.
How Process Management Systems Reconfigure Sociotechnical Assemblages
When Frederick W. Taylor published “The Principles of Scientific Management” in 1911, his key goal was to optimize work processes so that they consumed as little time as possible. His principle held that inefficiencies should be made visible through practices of measurement. To control the performance of labor, the standardization of efficient work processes went hand in hand with the standardization of collecting organizational information, capturing in numbers and metrics the work processes that laborers performed. The key operation – measuring – was used, on the one hand, to construct ideal processes (goals to be reached) and, on the other hand, to check whether the workers reached those goals.
Today, process management systems are used in a great number of organizations to realize the standardization of workflows as well as to control their efficiency. Their key procedure is still numeric registering. While Taylor’s systematic approach involved a combination of distributed practices, Salesforce is a process management platform that connects these practices digitally. In order to render organizational structures and courses of action available for algorithmic governance, the standard software package has to be fitted to the specific organization. This fitting process stands at the center of the research project: Which organizational and algorithmic changes have to be made in order to “make Salesforce work”?
The process of implementation will be regarded as a mutual effort of adaptation. It is necessary to look at the modifications that must be made at the level of software as well as at the organizational level.
In this research project, a media-archaeological approach will be used in order to shed light on both the technical structure of Salesforce and the epistemological implications that lie in its code. This approach will be complemented by ethnographic fieldwork following the process of implementing Salesforce, interviews with different actors, and document analysis of best-practice papers and software version histories.
Regimes of Data Processing and the Struggle for Privacy
Using digital information to organize (social) life seems to be the preferred modus operandi of contemporary societies. The operational process of digitization and digitalization materializes through practices of generating, accumulating, and processing – everything and everyone is transformed into data. However, this process of datafication and its underlying sociotechnical arrangements – such as big data, algorithms, the internet of things, wearables, cloud infrastructures, prediction logics, etc. – clearly generate sociopolitical challenges directly bound to the collection and utilization of data. Frequently, datafication is implicitly or explicitly envisioned as a form of techno-social progress; its power to capture and connect, transform and visualize is perceived as an opportunity to enhance the status quo. Contrary to this, datafication is also conceptualized as a phenomenon containing unpredictable risks for social life. In this regard it is often seen as a supporting mode to quantify, track, predict, monitor, analyze, and eventually restrict the agency of subjects and populations alike. Obviously, the cultural-technological phenomenon in question is not a category that can be conceptualized as neutral but has to be seen as inherently political: manifold interests and perspectives are inscribed into it.
The latter is a narrative that is claimed and addressed by a movement of data-critical actors. Driven by visions of security, democracy, privacy, secrecy, transparency, and empowerment, this avant-garde constantly works on possibilities of criticizing, negotiating, and stabilizing current data practices.
With my dissertation project I will bring these actors and their tinkering with answers to the current uncertainties into focus. I will develop a definition of modern data critics who aim at reconfiguring datafication’s underlying processes, infrastructures, and technologies. I therefore examine the questions data-critical actors raise, their visions, and their social as well as technology-based answers in the form of expressed concerns and created sociotechnical artifacts. These practices and artifacts (e.g. applications) implicitly carry inscriptions of problematizations as well as specific ways of addressing a problem and producing one (seeming) solution for it. Some of the questions that guide my research are: How are data-related problematizations and perspectives translated into sociotechnical answers? How do these answers contribute to the (in)stability of our current information society?
On Creating and Transforming Problems for ‘Optimizing’ Solutions in Software Culture
While it seems quite plausible that people come up with clever ideas and solutions for certain problems when those problems are actually there, in front of the person dealing with them, it is much harder to imagine the process of coming up with, finding, or creating a new idea seemingly out of nothing. The problem becomes a mystery when we shift the scenario from the handyman who repairs plug sockets with rubber rings to the alleged ‘genius’ who, sitting in the green, is startled by a game-changing idea; or, in between those extremes, engineers figuring out what to do next in their offices, far away from their workshops.
So, here is my research question: How are problems created? Furthermore: How does one create the problem? How does one transform it into a problem with a solution? And how does one then proceed to the performative construction of a concrete solution?
To address these questions, I will engage with scientific literature from different disciplines, not limited to the humanities, and examine the pop-cultural and everyday discourse of creativity and technology development. Since I do not want merely to contribute to the investigation of this discourse, but to understand how those mysterious acts of technological creativity are done, I will conduct ethnographic fieldwork at hackathons, which strongly emphasize the role of creativity. Furthermore, I am looking for contrasting cases, such as other projects in creativity research and cases of technological inventiveness in techno-scientific settings.
After all, creativity is an important issue, not only with regard to its discursive role but also to its actual genesis and process. The digital culture in which this investigation of technological creativity and development takes place is not only a specific framework for its technological processes, but a habitat that keeps automating more and more trivial tasks, and proving more and more tasks to be trivial. Insofar as creativity is on the rise, we cannot possibly know enough about it. Whether that knowledge could help to automate or enhance creativity itself, or merely enable us to do better the things that are left for us to do, in any case we have the chance, and the priority, to get to know the capacities we can gain through the world we live in. Eventually, this might correspond with an already ongoing transformation, or ‘hackathonization’, of post-heroic, intuitive process management and the independent belief in the value of simply creating new things/needs/questions/ideas/etc.