Abstract
Graphical abstract
1. Introduction
2. The sociology of Niklas Luhmann
3. Fractal geometry
4. Communication and fractals
5. Concluding remarks
Acknowledgements
References
Abstract
Social theory faces new challenges as society changes. The question is not only whether social theory can keep up with, and account for, social transformations, but also whether it can avail itself of social changes (in this case, the current dominance of digital media) in order to reinvent itself. The most attractive features of modern digital resources, such as Big Data, lie in their tools of analysis. Yet it may well be that the most promising contribution to social theory resides in the epistemological foundations backing these developments and in the conceptual tools they offer for rephrasing epistemological issues. In this sense, the function debuggers perform with regard to their target programs could shed new light not only on the process of knowledge formation, but also on the process of theory improvement and updating. The present contribution intends to show how theory-debugging might work by taking the sociology of Niklas Luhmann as a target program to be debugged by fractal geometry, with the goal of delivering an enhanced version of systems theory. It concludes by arguing for the plausibility of describing communication as a natural fractal that can be modelled by some kind of fractal set, and for the claim that communication media are responsible for the fractal structure of communication throughout sociocultural evolution.
1. Introduction
Algorithms were once an abstract and obscure idea developed by philosophers of mathematics; few people knew about them or what the word actually meant. Nowadays it is hard to think of someone who has not heard the word at least once, not to mention the fact that, unlike in the past, algorithms are doing things for us all the time. A data revolution has shaken the world, changing in unexpected ways how we interact with other human beings and with this new ecology of artificial forms of intelligence (an interface some call the Global Brain (Heylighen and Lenartowicz, 2017)), how we do business, how we teach and how we learn, how we read the news, how we search for information; in short, everything we know, the things we do (Mayer-Schonberger and Cukier, 2013) and even the pace of life (Wajcman, 2008). And science, of course, is no exception. Doing science in an information society brings many questions to the fore. To what extent are Big Data, Data Science, AI and Deep Learning, among others, changing the way science constructs knowledge, i.e. how it puts forward and tests hypotheses, constructs theories, deals with the problem of the nature of knowledge and the foundation of knowledge-claiming statements, creates concepts and objects of knowledge, and so on (Berthon et al., 2000; Boyd and Crawford, 2012; Kitchin, 2014; Lash, 2002)? Can algorithms make the scientific method obsolete (Anderson, 2008)? Can algorithms substitute for humans in the process of abduction by suggesting to the researcher an array of emergent patterns in his or her database? Are the new statistical tools of Data Science reinforcing and radicalizing empiricism?
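To make the idea of algorithmic pattern suggestion slightly more concrete, the following minimal sketch (an illustration added here, not drawn from the cited literature) shows how an off-the-shelf clustering routine can propose candidate groupings in a dataset that a researcher would still have to interpret abductively. The data, the variable names and the choice of k-means are all hypothetical.

    # Minimal, hypothetical illustration of algorithmic "pattern suggestion":
    # an unsupervised clustering routine proposes groupings in survey-like data,
    # which the researcher must still interpret and explain.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(seed=42)

    # Synthetic stand-in for a social-science dataset:
    # 300 respondents described by two standardized indicators.
    respondents = np.vstack([
        rng.normal(loc=(-1.5, 0.0), scale=0.4, size=(100, 2)),
        rng.normal(loc=(1.0, 1.0), scale=0.4, size=(100, 2)),
        rng.normal(loc=(0.5, -1.5), scale=0.4, size=(100, 2)),
    ])

    # The algorithm "suggests" three emergent groups; it does not explain them.
    model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(respondents)

    for label in np.unique(model.labels_):
        members = respondents[model.labels_ == label]
        print(f"suggested pattern {label}: {len(members)} cases, "
              f"centroid = {members.mean(axis=0).round(2)}")

Whether such machine-generated suggestions amount to abduction, or merely to its statistical precondition, is precisely the kind of question raised above.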