
Filtering the Future

Harvest now, decrypt later

Written by Sjoerd Bakker
March 17, 2021

In the coming decade, quantum computers will likely break current modes of encryption. This is not necessarily a problem for future communication and data storage, as cryptography can be made (practically) quantum-proof, but it will retroactively expose data we store and send today. Intelligence agencies and hackers are already harvesting encrypted data, in the hope that quantum computing will help them uncover valuable information from it in the future.
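The principle can be illustrated with a deliberately weakened toy sketch (hypothetical keys and numbers, not an example from any real attack): a ciphertext harvested today is useless to the attacker until the underlying hard problem, here factoring, falls to a future attack.

```python
# Toy illustration of "harvest now, decrypt later". The RSA key below is
# deliberately tiny so the "future attack" runs instantly; real moduli are
# thousands of bits and remain safe until large quantum computers exist.

# --- Today: a sender encrypts with the public key (n, e) ---
p, q = 1009, 1013               # secret primes (unknown to the attacker)
n, e = p * q, 65537             # public key
message = 42
ciphertext = pow(message, e, n) # the attacker harvests and stores this

# --- Years later: the attacker gains the power to factor n ---
def factor(n):
    # stand-in for a future quantum attack (e.g. Shor's algorithm);
    # trial division is only feasible here because n is tiny
    f = 2
    while n % f:
        f += 1
    return f, n // f

p2, q2 = factor(n)
phi = (p2 - 1) * (q2 - 1)
d = pow(e, -1, phi)             # recover the private exponent
recovered = pow(ciphertext, d, n)
print(recovered)                # 42: the harvested data is exposed
```

Nothing about the ciphertext itself changes over time; only the attacker's capabilities do, which is why data with long-term value needs quantum-resistant protection now.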

Given that quantum computers will first be available to institutional users, governmental agencies will be among the first to decrypt previously harvested data. They will be looking for sensitive or strategic data that could hurt or weaken adversaries. In the long run, as the technology becomes available to a broader group of users, non-state actors may take an interest in decrypting data that they can use to blackmail their victims.

While this probably won’t affect most ordinary people, as few of us are of particular interest to attackers, high-value targets may have to start worrying about the consequences of having their data exposed.

Burning questions:

  • How much information is truly sensitive after being stored for many years?
  • To what extent does the responsibility to prevent such malign uses lie with developers of quantum computers (and algorithms) and future providers of quantum cloud computing?
  • What would society look like if all of our online data were decrypted and laid bare for everyone to see?

Today’s class of uncontrollable technology

How can we understand the rising complexity and uncontrollability of technologies? Here, we explore and compare the cases of synthetic biology and artificial intelligence, two disruptive technologies that produce outcomes that are not fully controllable or predictable and whose impact on society will only grow in the coming decades. Furthermore, these disruptive technologies will challenge basic aspects of human self-understanding, including our notion of autonomy.

Our observations

  • “The world is getting more and more complex.” Although it is something of a truism, the expression is widely used in many contexts today. As we try to get a grip on the everyday changes we witness, the expression needs more specification. In our research, we explore uncertainties in the geopolitical, socio-cultural and technological realms and how they influence and reinforce each other; for instance, how the internet is shaping global power dynamics.
  • When looking at the rising complexity of new technologies in particular, the challenges and fears often relate to the feeling of losing control over our own technological inventions and the consequences this would have for our society. Science fiction frequently tells stories of technological innovations getting out of hand (Frankenstein), computers controlling us (The Matrix), or human-made viruses threatening the entire world population (The Walking Dead). Here, we explore the rising complexity of technology, and more specifically the uncontrollability and unpredictability of today’s technology, by introducing philosopher Jan Schmidt’s concept of “late-modern technology”. Instead of trying to explain the uncontrollability and unpredictability of individual technologies, the concept helps us to see them as part of a wider class of technologies sharing the same characteristics, such as the seemingly unrelated innovations in AI and in synthetic biology (the scientific domain that involves redesigning organisms for specific uses by engineering them to have new abilities, such as cell factories).
  • According to the classic-modern view of technology, uncontrollable and unpredictable outcomes of technology are undesirable. Man gains control over his environment by making use of technology. Constructability and controllability, including a clear input-output relation, are key in this regard and technology was traditionally equated with and defined by stability. Think of cars that are made in a production line.
  • By contrast, late-modern technologies are a class of technology in which this idea of stability is abandoned. Late-modern technologies confront us with our ideas about autonomy and control over our own inventions. Autonomy can be regarded as the most celebrated outcome of the Enlightenment and makes up the foundation of moral philosophy that is still dominant in today’s moral theory.
  • An entire class of “autonomous” technologies is in the making or has already been deployed, from autonomous vehicles to autonomous weapons. These increasingly guide our behavior at a time when human autonomy is already challenged by the distraction and information overload of our digital age. As we have described before, technological decisionism confronts us with the fact that our decisions will increasingly be supported, if not steered, by artificial intelligence. As non-living or non-human things increasingly participate in and actively shape our environment, we can no longer ascribe autonomy to humans alone, as is acknowledged in the theory of new materialism.

Connecting the dots

When thinking or talking about technology, we often use words that describe its mechanical characteristics. In books and movies, technology is typically depicted as machines or robots, and this machine image is widely present in our language as well. The machine metonym is closely connected to an ontological assumption: a machine is assembled by humans, built up from parts into a whole, and has controllable and predictable characteristics. This is the classic-modern view of technology.

However, when turning to present cases of technological advances such as synthetic biology, this becomes problematic. Even if the goal was to create synthetic organisms as controllable and predictable entities, a living organism, whether “natural” or a product of human intervention, by definition evolves and interacts with other organisms and the environment in multiple ways. These characteristics do not fit the part-whole view and make organisms less controllable and predictable than machines. This interaction of technology with other technological or living systems creates complexity. In addition, organisms reproduce and grow, something the machine metonym does not imply either. As a result, using machine metonyms might blind us to the implications of creating new life forms, such as synthetic organisms, as happens in synthetic biology. In the case of artificial intelligence, similar problems arise when using the machine metonym. AI, and more specifically machine learning, confronts us with a case of technology that shows more autonomy than the machine metonym suggests. So, what are these cases of technological innovation showing us? How are they different from technologies that better suit our more mechanistic and predictable view of technology?

Already in 1985, philosopher Hans Jonas envisioned a historically new technoscientific era when technologies would show different characteristics than the previous class of technologies, such as a certain degree of autonomy and limited predictability. In current philosophy of technology debates, scholars differentiate between modern technology, or classic-modern technology, and late-modern technology. We can understand synthetic biology and AI as cases of the latter. Late-modern technologies differ from classic-modern technologies in two fundamental ways.

First, they show self-organization, autonomous behavior or agency-like properties. In the case of AI, an autonomous system goes beyond the behavior programmed into the initial algorithm: as it learns by itself from data and its environment, its behavior transgresses the initial objectives and conditions set by its creators (i.e. human engineers and computer scientists) and therefore becomes less predictable. Similarly, an organism created by means of synthetic biology starts to interact with and “learn” from its environment in a way that makes its behavior hard to predict. In both cases, the technology autonomously interacts with an open-ended and uncertain context, the real-world environment, and is thus less predictable than technological systems that merely react to human input and are otherwise passive. In that sense, such technologies are sometimes regarded as “black boxes”, as insight into their input-output processes is difficult to acquire.
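A minimal, hypothetical sketch (not an example from the literature discussed here) makes this point concrete: even for the simplest learning algorithm, the decision rule is never written down by the programmer but emerges from the data the system is exposed to. The same code, fed a different environment, behaves differently.

```python
# A single perceptron: the programmer writes only the learning procedure;
# the actual decision rule (the weights) is produced by the training data.

def train_perceptron(samples, epochs=20):
    # samples: list of ((x1, x2), label) with label in {0, 1}
    w1 = w2 = b = 0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = y - pred          # -1, 0 or +1
            w1 += err * x1          # nudge the rule toward the data
            w2 += err * x2
            b += err
    return w1, w2, b

# Identical code, two different "environments" (datasets):
and_rule = train_perceptron([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)])
or_rule  = train_perceptron([((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)])
print(and_rule != or_rule)  # True: two different rules, neither written by hand
```

Scaled up from three learned numbers to billions, this is why the behavior of modern machine learning systems cannot simply be read off from their source code.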

Second, late-modern technology no longer appears to us in its modern guise; rather, its technological traces are disappearing. Culturally established borders and modern dichotomies such as “natural” vs “artificial” are becoming blurred. For instance, a synthetic cell has an artificial pathway but shows no traces of technology: it cannot easily be distinguished from “natural” cells. Similarly, the thinking of AI can sometimes hardly be separated from human thinking or decision-making. In 2018, Google gave a demo of its voice assistant calling a hairdresser to make an appointment and shocked the audience when the hairdresser did not notice that she was not talking to a human. Indeed, this novel kind of technology appears human or natural to us. This is what is called the naturalization of technology. However, moral debates about these sorts of technology, such as the debate about the acceptance of GMOs, are often still framed in modern terms, with a strict distinction between us humans, the technology we use, and the natural environment.

Late-modern technology is thus difficult to predict and control, and difficult to separate from the context and environment of its application; it can be said to “have a life of its own”. The fact that human beings surround themselves with ever more technologies that are less controllable and show autonomous features inevitably gives us the sense that we face greater technological complexity, that we are losing control over our technology, and that our notion of autonomy, which we regard as a fundamental human trait, is being challenged. Late-modern technologies such as AI could even undermine our autonomy, as their ubiquitous deployment could steer our behavior both implicitly and explicitly. As is often the case with new technological developments, late-modern technologies force us to define and reframe values and views that used to be implicit and unchallenged.

Implications

  • Seeing advances in AI and synthetic biology as part of a wider class of technologies is also helpful in discussing the challenges for both. For instance, in both areas, a centralization of knowledge can lead to negative consequences for society, e.g. that not everyone can benefit from or even be involved in their creation. In AI and synthetic biology, there are efforts to organize knowledge and IP in open-source governance structures, such as the OpenAI initiative and open-source initiatives for (GMO) seeds.

  • The rise of artificial intelligence or technological decisionism might teach us something about our human thinking. Similarly, synthetically created organisms might tell us something about living organisms. In a sense, late-modern technology can give us insights into fundamental concepts.

Cell factories

Will our use of microbes enable a bio-based future? It is increasingly possible to use and tweak living organisms to produce food, fuel, drugs and materials. Here, we explore cell factories, or engineered microorganisms, to illustrate the ontological and ethical challenges that we will face in light of the rising numbers of hybrids created by advances in biotechnology.

Our observations

  • Cell factories are single-celled microorganisms, or microbes, whose metabolism is synthetically optimized to produce more energy or different substances. In other words, microbes are viewed as production facilities that are engineered with biotechnology to produce for human usage. Examples include chemicals, food ingredients, biofuels, drugs, detergents, paper and textiles. Whereas modern industries manufacture products on the basis of fossil fuels, these cell factories are the building blocks of a bio-based industry.
  • Advances in biotechnology to engineer microbes and create cell factories are proceeding at full speed. The question is whether and when these cell factories will be able to produce at industrial scale and at competitive cost, so as to accelerate a bio-based industry.
  • One of the major promises of cell factories is the production of food ingredients, such as lab-grown protein (meat, fish, milk, eggs), lauric acid (to replace palm oil) and carbohydrates (to replace flour). In the report ‘Rethinking Food and Agriculture 2020-2030’, the authors argue that microorganisms programmed to produce food, or cellular agriculture, will disrupt agriculture as we know it within the next ten years. They calculate that proteins produced by cellular agriculture will be five times cheaper than existing animal proteins by 2030 and ten times cheaper by 2035. Furthermore, they believe these proteins will also be more nutritious and healthier.
  • The driver behind this is the rapid advance of precision fermentation. Fermentation farms, the vessels that facilitate the production of these programmed microorganisms, are production systems that are potentially more energy- and resource-efficient, more stable and more sustainable than industrial animal agriculture. Industrial animal agriculture has, in fact, reached its limits in terms of scale and efficiency, while worldwide demand for protein is only rising. This technological development could make the distinction between plant- and meat-based diets irrelevant, as food would come neither from animals nor from plants, but from unicellular life.
  • Solar Foods, whose first commercial factory will be running this year, is one example of a party working in this field. But Big Food and chemical giants (e.g. DuPont) are also investing heavily in this area.
  • In the past, advances in biotechnology have often raised fears over unforeseeable risks: are we creating little Frankenstein monsters when engineering cells, living organisms that we won’t be able to fully control? We cannot entirely foresee the consequences of industrial biotechnology using cells as factories.

Connecting the dots

Animals and plants play a major role in our society by providing us with food and materials. For a long time, we have kept animals to produce meat, milk, eggs, leather and wool, and have grown plants to produce grains, vegetables, fruits and fibers. We have become incredibly adept at optimizing these animals and plants, breeding them in such a way that they comply with our wishes. Indeed, all animals and plants we see on farms today are the result of a long chain of human interventions. The domestication of these life forms is considered a revolution in the history of humankind. Thousands of years ago, when we started to keep and breed animals and plants to optimize them according to our demands, the way we co-existed with them also drastically influenced our own lives. It meant that humans were able to quit their nomadic, hunter-gatherer lifestyles and settle in one place. The agricultural revolution allowed humans to collect more food per unit area, and the overall population thus multiplied exponentially.
With the advances in synthetic biology, we might witness what we could call the second domestication of life forms in history. This might again radically alter how we interact with other life forms. This time, however, the focus will not be on visible life forms, such as cows, pigs, sheep, chickens or plants, but on invisible ones: microorganisms, or microbes. Through strides made in the field of synthetic biology and the insights gained in molecular biology, microbes can now be engineered and optimized to fulfill certain tasks, such as producing certain substances. By reading and writing the genome of microbes, or cells, it is now possible to create so-called cell factories. They are a promising way to replace conventional modes of production, as they can be tweaked to produce the specific chemicals, food ingredients, biofuels, drugs, detergents, paper, textiles and other materials we need, provided this can be done on a large scale and with a minimum of input. Because there are good reasons to believe this will be possible within the next ten years, the question arises: will this domestication of microbes change our relation to other life forms?

First of all, it will raise the question of how we should view and treat these new life forms. In industrial livestock farming, animals have not exactly been treated as life forms of intrinsic value, raising animal welfare problems. On huge farms, animals often live and die on a production line, in a sense bred to be production units. This industrial handling of living organisms has long been questioned. It has alienated us from our living world. The current corona pandemic has been labeled a “One Health issue”, meaning it is seen as an integral health problem for humans, animals and ecosystems. We are increasingly aware that fixed categories of “human” and “animal” do not always make sense and that we are not an isolated species; our wellbeing is determined by our relationships with and dependencies on other species. We look at our living world more holistically, rather than as consisting of separate categories. But if we want to treat other life forms rightfully, where do we draw the line? The claim can be made that microbes have less intrinsic value than macrobes, but since all macrobes are built up from microbes (or individual cells), there is no clear line to be drawn. Indeed, the fact that we are more focused on life forms that are visible to us has led to a macrobist bias in the philosophy of biology. But if we take microbes to have the same value as macrobes, should we grant them microbial rights? Already in 1977, this scenario was explored in a sci-fi story by Joe Patrouch, showing the consequences of full microbial rights, such as a ban on household bleaches because they kill microbes. But today, legislation for microbial life is no longer sci-fi. The Swiss Federal Ethics Committee on Non-Human Biotechnology has declared that all living beings, including microbes, have minimal value in themselves, implying that all life forms, however small, will have “rights” to some extent.
The fact that we are increasingly interfering intentionally in microbial life forms with synthetic biology leads us to the second challenge: how do we see these altered life forms, or hybrids? These are times of ever-increasing numbers of hybrids that blur the lines between natural and artificial. Cell factories show the characteristics of life forms, such as metabolism, but are artificially engineered. Indeed, cell factories can be seen within a broader category of late-modern technology that increasingly shows signs of autonomy and agency, like AI. These technologies seem to have a “life of their own”. Yet, there is no clear moral framework for the hybrids to come.
The rapid advances in cell factories lay bare the challenges that we’ll have to respond to in the coming years, in order to decide what a bio-based future will look like.

Implications

  • The rapid advance of the commercialization of cell factories will stir up debate on the moral status of smaller life forms and hybrids. This will again create fears about biotechnologies similar to those surrounding genetically modified crops.

  • Cell factories might have important second-order effects on society. First, cell factories would decentralize production facilities, as they can be produced in vessels anywhere. For instance, fermentation farms can be located in or close to towns and cities. And second, cell factories might help to reduce the focus on chemicals we have in our daily practices – fertilizers, synthetic textiles, carbon-intensive materials and substances – and incite the turn to more microbe-based products.

Our collective brain

What happened?

The corona crisis is showing us how alike we are in our thoughts. In early March, on the same day, virtually all of us came to the conclusion that there actually was a crisis and that we should hoard toilet paper and hand sanitizer. A few weeks ago, we also apparently all concluded that it was OK to go out in droves, without any announcement from authorities. Meanwhile, during the lockdown, we all spontaneously decided to work out, ride our bikes and go rollerblading outside. None of these phenomena, trends, hypes or crazes is unique or new, but this crisis does confront us with these forms of collective thinking more directly than ever.

What does this mean?

There are new rules of play and we have to learn to deal with them. We’re exploring the boundaries between what’s allowed and what isn’t and are keeping a close eye on each other in the process. It’s no wonder, then, that we’re learning from each other, mimicking each other, or simply arriving at more or less the same idea. Furthermore, many of these ideas are of course put into our heads by companies and their marketing channels. At the same time, this is also a morally charged period, making us extra aware of our own behavior and that of others: who is breaking the rules and who is overdoing it? On the beach, therefore, it’s not just crowded, it’s dangerously crowded.

What’s next?

This heightened awareness of our collective behavior can evoke different responses. We can accept the situation and possibly even derive a feeling of solidarity from it. After all, we’re all in the same boat, going through the same struggle. On the other hand, it’s conceivable that we’ll look for activities and products that still do make us feel original and authentic. This could lead to increased demand for more personalized products and services and, as soon as it’s allowed again, even more exotic holiday destinations.