Networks have risen to prominence as intellectual technologies and graphical representations, not only in science, but also in journalism, activism, policy, and online visual cultures. Inspired by approaches that take trouble as an occasion to (re)consider and reflect on otherwise implicit knowledge practices, in this article we explore how problems with network practices can be taken as invitations to attend to the diverse settings and situations in which network graphs and maps are created and used in society. In doing so, we draw on cases from our research, engagement, and teaching activities involving making networks, making sense of networks, making networks public, and making network tools. As a contribution to “critical data practice,” we conclude with some approaches for slowing down and caring for network practices and their associated troubles, in order to elicit a richer picture of what is involved in making networks work and to reconsider their role in collective forms of inquiry.
“For decades, social researchers have argued that there is much to be learned when things go wrong.¹ In this essay, we explore what can be learned about algorithms when things do not go as anticipated, and propose the concept of algorithm trouble to capture how everyday encounters with artificial intelligence might manifest, at interfaces with users, as unexpected, failing, or wrong events. The word trouble designates a problem, but also a state of confusion and distress. We see algorithm troubles as failures, computer errors, “bugs,” but also as unsettling events that may elicit, or even provoke, other perspectives on what it means to live with algorithms — including through different ways in which these troubles are experienced, as sources of suffering, injustice, humour, or aesthetic experimentation (Meunier et al., 2019). In mapping how problems are produced, the expression algorithm trouble calls attention to what is involved in algorithms beyond computational processes. It carries an affective charge that calls upon the necessity to care about relations with technology, and not only to fix them (Bellacasa, 2017).”