The real destiny of the machine [is] to merge itself with natural organisms.
– Jack Burnham, Beyond Modern Sculpture, 1968
In 1928, R. Buckminster Fuller presented the design for his Dymaxion House to the American Institute of Architects (AIA) in St. Louis. Fuller’s proposal revealed a fully industrialized, aluminum and polymer-based housing unit, highly efficient in its utilization of natural resources and easily deployable in almost any environment. In response to Fuller’s presentation, the president of the AIA published an article titled “Against all Standardization,” in which he passionately attacked the effects of standardization on architecture, and emphatically rejected the very kinds of systems that Fuller had proposed.
Of the various potential obstacles that confronted Fuller’s project (including the fact that it was developed at the dawn of the Great Depression), the most insurmountable was that it jeopardized the professional province of electricians, plumbers, construction workers and architects, as well as the numerous, powerful industries that supported each and all of them. In its methods of production and its efficiency of construction and function, it revealed the essential irrelevance and redundancy of the professions that it threatened with imminent obsolescence. As a result, these professions rallied against it as a means of professional self-preservation.
Yet, this story does not begin with Fuller, nor does it end in 1928. In fact, it has been so frequently repeated throughout the history of architecture that today the profession might just be beyond salvation. In many respects, architectural production is not much beyond the technological norm that the Dymaxion House endangered almost a century ago, even as technological development has been advancing at an exponential rate. As a result, most people live, work and temporarily inhabit incredibly inefficient edifices, far below a standard of living commensurate with the degree of technological progress that has occurred in the intervening four score plus years since Fuller’s fateful presentation in St. Louis.
The effect of this professional and intellectual intransigence is that the building industry currently accounts for almost half of global energy expenditure. Most of this energy goes to lifetime maintenance; and the majority of contemporary efforts to mitigate the extreme waste of building sustenance are as flaccid as they are antiquated. Low-water toilets, fluorescent bulbs, passive heating, cooling and lighting systems, and solar panels (most far more costly than their less efficient kin, and some simply elements of good design) can only do so much. Furthermore, all have inherent limitations, while some might prove as toxic in the long term as they might be immediately effective.
Moreover, the architectural profession has shown extreme prejudice with respect to the application of emergent technological paradigms to issues of architectural production. Thus, parametric design and Building Information Modeling (BIM) take precedence over what Fuller called the technologies of “doing more with less.” Both of these technologies (however they might be framed by their proponents) sustain the architectural status quo, addressing neither issues of lifetime maintenance, nor those of material culture or construction in any substantive way. However, this does not diminish the prospective potential of much more powerful technological paradigms, or of their potential benefit for what the architect Rem Koolhaas once called, the various “voluntary prisoners of architecture.”
Nano-Robotic Architecture – A Paradigm Shift
In their article, “Can we grow buildings? Concepts and requirements for automated nano- to meter-scale building,” Rebolj, Fischer, Endy, et al. describe the potential application of nanotechnology to building construction. They envision a hybrid nanorobotic, nanomaterial process in which a light-filtering, BIM-controlled armature directs nanorobots in the construction of carbon nanotube (CNT) structures, much in the way that a contemporary 3-D printer functions, except at significantly larger scales, and with a significantly more intelligent material. Since the process relies on capturing carbon from CO2 in the immediate environment, thus releasing oxygen back into the atmosphere, the authors claim that it will reduce carbon dioxide levels, thereby both mitigating global warming and helping to replenish the protective ozone layer that human chlorofluorocarbon emissions have depleted.
This article is the culmination of a science that finds its roots as far back as John von Neumann’s presentation of “The General and Logical Theory of Automata” at the Hixon Symposium of 1948. Given within the context of cybernetics, von Neumann’s presentation articulated the possibility of designing and developing a universal Turing machine capable of self-replication. Von Neumann hypothesized that, given a robot capable of producing any component, and of reading and implementing any blueprint, it was logically feasible to design self-reproducing machines. Von Neumann’s description of such robots (and of their material milieu) was firmly rooted in the highly circumscribed (pre-DNA) discourse of molecular biology of the time, because living organisms were the only “mechanisms” as yet capable of self-replication.
If nanotechnology derives its theory of self-replication and construction from von Neumann’s “Logical Theory of Automata,” then it derives its extreme “microminiaturization” from a short lecture given by the Nobel Prize-winning physicist Richard Feynman, titled “There’s Plenty of Room at the Bottom” and delivered in 1959. In this lecture, Feynman describes the feasibility of etching the entire contents of the world’s libraries onto a 100 by 100 matrix of pins. Feynman’s fundamental unit of replication was 100 atoms, and, like von Neumann, he cites the relative efficiency of biological systems at replicating much more complex data structures. He also notes that, at the quantum level, macroscopic laws pertaining to Newtonian (human-scaled) physics no longer apply, and that the effects of gravity would thus be replaced by those of van der Waals attraction, a force that the authors of “Can we grow buildings?” cite for its high structural performance as embodied in CNT products.
Feynman also acknowledges that quantum conditions “represent completely new opportunities for design.” He continues:
Atoms on a small scale behave like nothing on a large scale, for they satisfy the laws of quantum mechanics. So, as we go down and fiddle around with the atoms down there, we are working with different laws, and we can expect to do different things. We can manufacture in different ways. We can use, not just circuits, but some systems involving quantized energy levels, or the interactions of quantized spins, etc.
This shift in “manufacture” is represented by a transition from the metallurgical and chemical sciences, which define much of contemporary material experimentation, to nano-scale production based on an almost atom-by-atom construction of materials from elements commonly found in the environment. Thus, nanoscience is capable of establishing the material vocabulary of buildings at the scale of atomic structure. In the case of CNT, this structure is, interestingly, based on the geometry of the carbon molecule C60, or “Buckminsterfullerene,” so named because of its formal similarity to Fuller’s geodesic dome. Fullerenes are among the most stable molecular structures of carbon, resulting in incredibly strong materials with high bending moduli.
However, since nanotechnological production mimics natural processes, the kind of standardization exhibited in contemporary building material production no longer represents a limitation for architectural design. As Ted Sargent writes in “Nanotechnology: Design in the Quantum Vernacular,” “Nature builds all matter – including life – from the nanoscale; it is restricted to the same atomic building blocks as are we, yet creates infinitely variegated materials and motifs.” The equivalency drawn by all of the authors, but made directly and succinctly in Sargent’s writings, between biological and nanotechnological systems opens the field of architectural design, production and performance far beyond that of simple construction, which accounts for only a small portion of the carbon emissions and energy expenditure of the built environment.
Why use such technologically advanced methods of production to re-instantiate an architectural prerogative that has already so abysmally managed the potential contributions of technology to architecture? Why “grow” buildings that embody high structural performance, but which retain contemporary electrical, water and HVAC infrastructures, now simply embedded in the design through the BIM program that determines the overall aspect of the building? Furthermore, given the synergistic possibilities between nanotechnological production and systems of biomimicry (possibilities inherent in the discourse of nanotechnology itself), how can architectural performance be based in a new organizational paradigm in ways that reduce lifetime energy expenditure?
Early twentieth century developmental, molecular and evolutionary biologists emphasized the synergistic whole of organismic organization, arguing that such organization resulted in the capacities of organic life not only for self-replication, but also for self-regulation, and the maintenance of extremely low entropy states over time. Cyberneticists, like Ross Ashby, Grey Walter, John von Neumann and Warren McCulloch all argued that the high computational performance of the brain was a direct result of its size and the complexity of its organization. With the ability to design and construct buildings from the same essential elements that comprise organic life, and to do so in ways similar to those exhibited in DNA encoding and protein development, isn’t it then possible to design truly “intelligent,” responsive and adaptive buildings that are highly sensitive to their ecological milieu?
Thus, instead of designing nanotechnological edifices into which traditional HVAC, plumbing and electrical systems are placed, it becomes possible to design buildings that breathe through their skins like frogs, drawing fresh air from the environment and filtering stale air out. The process of air filtration itself could be directed toward complex metabolic processes, like those exhibited in plants, by means of which CO2 is transformed into sugars that can be catalyzed, not for growth, but for sustainable energy sourcing. Lighting can become a process of bioluminescence, not unlike that exhibited by lightning bugs or deep-sea aquatic life, and plumbing can be reinterpreted as a capillary system, in which water is drawn from the local environment, from rain and air humidity, and filtered through the organs of the building itself, while waste becomes an integral process of energy exchange, instead of something that is sent elsewhere for energy-intensive processing.
Temperature regulation can mimic the homeostatic processes exhibited by living organisms – a combination of metabolic, circulatory and skin responsiveness to temperature fluctuation. Furthermore, like living organisms, intelligent buildings could contract and expand, based on use requirements and energy demands; self-organizing for self-sustenance. This could all be regulated through an extensive “neural” network, distributed throughout the building itself in the form of nanoscaled conductive micro-tubules, and potentially interfaced through a central neural system by means of which inhabitants could control the development and performance of their buildings.
While this might seem more a vision culled from the annals of science fiction than a feasible manifestation of near-future technological capabilities, as Christopher Langton, the progenitor of Artificial Life, has argued, “[Life] is a property of form, not matter, a result of the organization of matter rather than something that inheres in the matter itself.” Since nanotechnology is specifically focused on the organization of matter at the nanoscale, and since, as both Sargent and Langton argue, its fundamental elements are the same as those used in organic life, the future development of nano-based technologies for application in architecture is more a matter of orientation than anything else; it is determined by the models followed more than the technologies used. On the one hand, there is the heavily mechanistic model proposed by contemporary nanotechnology proponents, and derived almost directly from von Neumann’s self-replicating robots; on the other hand, there is the organically inflected model hinted at throughout much nanotechnology literature, but usually under-emphasized at larger scales.
By focusing more on the performance of biological organisms, instead of that of technological systems, it is possible to produce robotic life, android-buildings whose functions mimic those of biological life to the point at which any differentiation between living and non-living systems becomes more a matter of semantics than anything else. These bio-technological complexes would exhibit similar degrees of system integration and functional coupling, creating synergies that result in levels of efficiency far beyond those of contemporary environmental servicing systems, with their insistence on functional separation and system isolation. However, once instantiated, such structures would raise a series of serious questions with respect to how we engage the nano-bio-technological milieu.
Beyond a certain level of complexity, would we consider such structures “living,” perhaps even semi-autonomous organisms in their own right (thus rendering demolition a process equivalent to murder)? Would we re-conceptualize our relationship with the built environment in terms of symbiosis instead of those of exploitation and manipulation? Dependent, in their way, on their ecological milieu, would such intelligent buildings see humanity as a threat to self-sustenance, and thus turn on their inventors, as in so many science fiction tales of machine ascendancy, or would such layering of ecological zones result in greater respect for each level on behalf of human inhabitants?
These are questions that have already been raised within the discourses of Artificial Intelligence and Artificial Life, with respect to the possibilities of future machine sentience, if not to architecture in particular; and they are at least as old as Philip K. Dick’s 1968 novel, Do Androids Dream of Electric Sheep?, the speculative arc of which originally placed the possibilities of machine self-awareness in 1992, and later in 2021, a date quickly approaching. In many respects, they are legal questions, no more, nor less, pertinent than the legal restrictions that define contemporary architectural production in terms of planning, fire and structural codes, as well as environmental servicing requirements. They represent anxious limits, relevant, but also reductive; based in contemporary models of known contingencies, and susceptible to unknown variables. Yet, if nanotechnology functions at what is essentially a genetic level, is it not possible that, instead of demolition, and thus death, one could simply reprogram buildings to evolve to match emergent needs, and thus eliminate the paradigm of obsolescence propounded by the Futurists a century ago?
Conclusion: Utopia or Oblivion
When Fuller presented his Dymaxion House to the AIA in 1928, many of the industries involved in the mechanical servicing of buildings were still in their infancy; today, they are in their dotage. In the intervening four score and four years, they have also become almost insurmountably entrenched, and, as with other industries (such as automobile manufacture and the petroleum-based economies of the twentieth century), overcoming the inertia of vested interests has become exponentially harder. Yet newer technologies, with even more radical potentials for the processes of architectural production and for lifetime maintenance, are emerging.
Since the paradigm shift implied by these technologies moves architectural discourse away from systems of representation and towards those of praxis, it threatens to affect every aspect of the building industry. As a result, the adoption of new architectural paradigms based on emergent technologies will inevitably meet the same kind of professional intransigence that Fuller encountered, and is threatened by the same evolutionary recalcitrance that has sustained status quo conditions in the building industry for almost a century.
However, technological paradigms like nanotechnology and biomimesis do represent radical new possibilities for the future of architecture, and are consonant with an increasing ethical imperative to reduce the environmental impact of human habitation and the exploitation of the planet’s natural resources. It is unlikely that this change will occur in professions and disciplines that have a moneyed interest in maintaining current production and performance parameters for architectural design. This means that the impetus for change must be shifted either to paradigms emerging in the computer, material, biological and engineering sciences, or to a new generation of architects willing to challenge the foundation of their own profession.
The transformative prerogative of emergent (and convergent) technological and ethical imperatives implies the adoption of new conceptual, representational and physical tools for design, construction and maintenance. It is thus as much an architectural transformation as it is an industrial one, and this might be its inherent strength. As we witness the wholesale collapse of the old industrial paradigm, and the faltering of traditional economies premised upon this model; as we seek to extract every last ounce of natural resources left in shale and sand to resuscitate the lumbering corpse of antiquated institutions and as a temporary salve to contemporary energy requirements; such essential restructuring represents not only a necessity, but also an opportunity to reinvent ourselves as a species, to reinvent our economies, our industries, our habitat and our relationship to the Earth upon which we all ultimately rely.
 Katz, Barry M., “1927, Bucky’s Annus Mirabilis,” in New Views on R. Buckminster Fuller, Hsiao-Yun Chu and Roberto G. Trujillo, eds., Stanford University Press, Stanford, Calif., 2009, p. 32.
 Rebolj, Danijel, Martin Fischer, Drew Endy, Thomas Moore and Andrej Sorgo, “Can we grow buildings? Concepts and requirements for automated nano- to meter-scale building,” Advanced Engineering Informatics, vol. 25, 2011, pp. 390–398.
 von Neumann, John, “The General and Logical Theory of Automata,” in The Collected Works of John von Neumann, Volume V, A. H. Taub, ed., Pergamon Press, New York, 1961, pp. 288–326.
 Feynman, Richard, “There’s Plenty of Room at the Bottom,” Caltech Engineering and Science, vol. 23, no. 5, Feb. 1960, pp. 22–36.
 Sargent, Ted, “Nanotechnology: Design in the Quantum Vernacular,” in Design and the Elastic Mind, Paola Antonelli, ed., MoMA, New York, 2008, p. 81.
 Langton, Christopher, “Artificial Life,” in Artificial Life, Christopher Langton, ed., Addison-Wesley, Reading, Mass., 1989, p. 41.
Also relevant to this discussion is a quote from earlier in this volume, also by Langton:
The key concept in [Artificial Life] is emergent behavior. Natural life emerges out of the organized interactions of a great number of nonliving molecules, with no global controller responsible for the behavior of every part. Rather, every part is a behavior itself, and life is the behavior that emerges from out of all the local interactions among individual behaviors. It is this bottom-up, distributed, local determination of behavior that AL employs in its primary methodological approach to the generation of lifelike behavior.
Langton, 2–3. Nano- to meter-scale design is based on a similar bottom-up approach, and is thus amenable to the same kind of highly integrated behavioral coupling.
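Langton’s principle of bottom-up, locally determined behavior can be made concrete with a minimal computational sketch (an illustration of my own, not drawn from the cited texts): an elementary cellular automaton, one of the canonical Artificial Life models, in which each cell updates according only to its immediate neighbors, with no global controller, yet a complex global pattern emerges from the sum of local interactions.

```python
# Minimal sketch of Langton's bottom-up principle (illustrative only):
# an elementary cellular automaton (Wolfram's Rule 110). Each cell's
# next state depends solely on itself and its two nearest neighbors --
# purely local rules, no global controller -- yet complex global
# structure emerges from repeated local interactions.

def step(cells, rule=110):
    """Apply one synchronous update of an elementary cellular automaton."""
    n = len(cells)
    nxt = []
    for i in range(n):
        # Each cell reads only its local neighborhood (wrapping at edges).
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right
        # The rule number's bits encode the output for each of the
        # eight possible three-cell neighborhoods.
        nxt.append((rule >> pattern) & 1)
    return nxt

def run(width=64, steps=20):
    """Evolve from a single live cell; return the full history."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = step(cells)
        history.append(cells)
    return history

if __name__ == "__main__":
    # Print the evolving pattern: global order with no global plan.
    for row in run():
        print("".join("#" if c else "." for c in row))
```

No cell “knows” the overall pattern; as in Langton’s description, the lifelike whole is the aggregate behavior of many locally governed parts, which is precisely the organizational logic that nano- to meter-scale construction would exploit.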