A Critical Reflection on Genetic Algorithms

As I became more familiar with the discourse of emergence, I developed a growing interest in the evolution of form in the natural realm and its application to architecture through computer algorithms. Many living forms aroused my curiosity about how they came to evolve, and prompted me to learn about their history and the reasons behind their formation. This paper is a critical reflection on the techniques by which computational models and genetic algorithms reproduce organic forms and processes. I examine the concept of adaptation in natural systems and review some of the scientific methods that have been devised to realize artificial adaptive systems, pinpointing the advantages and limitations of each.

Genetic algorithms (GAs) were initially developed within the artificial intelligence branch of computer science. A genetic algorithm is a strategy by which optimized solutions to a problem are generated using techniques borrowed from natural evolution: inheritance, mutation, selection and crossover (recombination). Genetic algorithms have proven very useful in providing satisfactory results for a wide variety of problems across different fields of study. In architecture, they play an increasingly important role in the design of complex adaptive systems, owing to the prevalence of digital tools. The computer is now the most widely used design tool in architecture, and algorithms have become a major medium for architects. Likewise, the computational processes by which projects are realized involve a considerable amount of scripting, much of which mimics natural forms and the behavior of organisms.
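As a concrete illustration of these operators, below is a minimal sketch, in Python, of a genetic algorithm that evolves a population of bit-strings towards a toy objective. The genome length, population size, mutation rate and fitness function are arbitrary assumptions made for the example, not taken from any particular architectural tool.

    import random

    GENOME_LENGTH = 32
    POPULATION_SIZE = 50
    MUTATION_RATE = 0.01
    GENERATIONS = 100

    def fitness(genome):
        # Performance measure: count of 1-bits (higher is fitter).
        return sum(genome)

    def select(population):
        # Selection: tournament of two; the fitter individual becomes a parent.
        a, b = random.sample(population, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(parent_a, parent_b):
        # Recombination (crossover): splice the two parents at a random point.
        point = random.randint(1, GENOME_LENGTH - 1)
        return parent_a[:point] + parent_b[point:]

    def mutate(genome):
        # Mutation: flip each bit with a small probability.
        return [1 - bit if random.random() < MUTATION_RATE else bit for bit in genome]

    population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION_SIZE)]

    for _ in range(GENERATIONS):
        # Inheritance: every child is built from the genes of two selected parents.
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POPULATION_SIZE)]

    print(fitness(max(population, key=fitness)))   # approaches GENOME_LENGTH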

Genetic algorithms are relatively recent, dating back to the early 1970s. Research on the topic was largely initiated by John Henry Holland, an American scientist and professor at the University of Michigan, Ann Arbor, and since then it has rapidly expanded beyond the realm of computer science to encompass a number of other fields. Holland is widely regarded as the father of genetic algorithms, and his publications and articles on the subject have made him a pioneer of complex adaptive systems. In his 1975 book entitled “Adaptation in Natural and Artificial Systems”, he demonstrates the model’s universality and applicability to fields of study as varied as economics, psychology, computer science and game theory. Holland argues that the concept of adaptation involves a progressive alteration of structure controlled by a basic set of modifiers, or operators. Different fields of study have different structures as well as different operators: in genetics, the structures are the chromosomes and the operators include mutation and recombination, whereas in economic planning the structures are the mixes of goods and the operators are production activities. Holland also describes how adaptation requires different performance measures depending on the field: in genetics the main criterion is fitness, whereas in economic planning it is utility. This parallelism shows that adaptation is a property that can apply to any kind of system, whether organic (natural, biological) or inorganic (artificial, synthetic).
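One way to read Holland’s parallel is as a shared interface: whatever the field, an adaptive system needs a set of structures, a set of operators that modify them, and a performance measure. The sketch below is my own paraphrase of that idea in Python, not Holland’s notation; the names are purely illustrative.

    from abc import ABC, abstractmethod

    class AdaptiveSystem(ABC):
        # Holland's three ingredients, restated as a generic interface.

        @abstractmethod
        def structures(self):
            # The objects being adapted: chromosomes, mixes of goods, game positions...
            ...

        @abstractmethod
        def operators(self, structure):
            # The modifiers that generate new structures: mutation and recombination
            # in genetics, production activities in economic planning...
            ...

        @abstractmethod
        def performance(self, structure):
            # The measure guiding adaptation: fitness in genetics, utility in economics.
            ...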

Artificial systems that behave adaptively towards data have the potential to recognize patterns and reshuffle them in order to provide optimized (maximized or minimized) results, in a way similar to crossover and the reassembly of DNA during reproduction. However, I believe that adaptation in artificial systems still does not match the spontaneity and flexibility that exist in living organisms and biological entities. The capacity of animals to evolve in response to environmental conditions and changes is, I think, far more developed and far more complex than that of man-made systems and computer algorithms. The biological process of adaptation is achieved by rearranging the genetic material of living organisms so that they can confront and survive particular environments. Unlike computer algorithms, living organisms have a robust reproductive plan involving a multitude of structures. Computer algorithms are still being developed, and their application in architectural design has yet to be fully explored.

There are many reasons why adaptive algorithms seem so limited in comparison with biological systems; the following are the ones I consider primary. Unlike DNA, which is built from four nucleotide bases (A, T, C and G) common to all known living organisms, computer algorithms do not share the same constituents, even though coding shared information at the roots of different algorithms is inherent to a complex adaptive system. Additionally, in a typical algorithm (script, computer program, etc.), the information is written a priori, before the algorithm is exposed to a problem, and in most cases it is designed to respond to specific types of data. Confronted with an unknown type, the algorithm fails to collect the information it needs and is consequently unable to generate the required answers. Such an algorithm is good at tackling a single problem; however, it is likely to fail as problems become differentiated, and it may require constant revision to respond to different scenarios. Instead of repetitively tweaking the rules of an algorithm, I think it is much easier to make the information carried within it capable of producing intuitive responses in the first place.

Most algorithms work by trial and error, selecting the one optimal answer after looping through all the permutations and eliminating those furthest from the desired performance measure. A better and possibly faster way of finding the optimal solution is to interpolate it, that is, to estimate its location without necessarily revisiting every permutation of the script (as the Grasshopper evolutionary solver “Galapagos” does). The benefit of such a technique is that it greatly reduces computation time. In complex scripts featuring long and repetitive patterns (e.g. recursive functions), small improvements can yield considerable gains in run time and overall efficiency, because they remove small time increments that would otherwise accumulate. All in all, an efficient algorithm should be able to estimate results, deduce answers and respond to data in a minimum of time.
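The contrast between exhaustive enumeration and estimation can be made concrete. The following hypothetical sketch, written in Python rather than in Grasshopper (Galapagos itself works differently in detail), compares looping over every permutation of a small discrete design space with a hill-climbing estimate that only samples neighbours of the current best candidate and therefore evaluates far fewer options.

    import itertools
    import random

    VALUES = list(range(5))   # each of the 10 parameters takes a value from 0 to 4
    N_PARAMS = 10

    def performance(candidate):
        # Arbitrary objective for the example: the optimum is all parameters equal to 3.
        return -sum((v - 3) ** 2 for v in candidate)

    def brute_force():
        # Exhaustive search: visits all 5**10 = 9,765,625 permutations.
        return max(itertools.product(VALUES, repeat=N_PARAMS), key=performance)

    def hill_climb(steps=500):
        # Estimation: step towards better neighbours of the current best candidate,
        # evaluating only a few hundred options instead of millions.
        best = [random.choice(VALUES) for _ in range(N_PARAMS)]
        for _ in range(steps):
            neighbour = list(best)
            neighbour[random.randrange(N_PARAMS)] = random.choice(VALUES)
            if performance(neighbour) >= performance(best):
                best = neighbour
        return best

    print(performance(hill_climb()))   # usually reaches the optimum, 0, almost instantly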

Algorithms also need to record history. For that, they must acquire the ability to recognize patterns from previous uses and operations, and to respond accordingly. In a way similar to classical and operant conditioning in psychology (as in Pavlov’s experiment on salivation in dogs), a program can be “conditioned” to respond in certain ways to data. One way of achieving this is to deduce from the number of occurrences of events what to do, when and how. Another is to couple the operational script, the one in charge of a task, with a learning one. This is particularly beneficial when the fitness criterion is constantly changing or when the algorithm cannot cope with the non-linearity of the experiment well enough to provide reasonable answers. Recorded history contributes to the performance of an algorithm, raises its level of immunity and increases its degree of self-sufficiency.
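A very simple way of giving a script such a recorded history is to wrap the operational routine in a layer that counts how often each stimulus has been met and which response it received, and lets those counts bias future answers. The sketch below is a hypothetical illustration of this coupling, assuming a toy “load” stimulus; it does not refer to any existing plugin.

    from collections import Counter, defaultdict

    class ConditionedSolver:
        # Couples the operational script (in charge of the task) with a learning layer
        # that records which response each stimulus has received in previous runs.

        def __init__(self, operational_routine, threshold=3):
            self.solve = operational_routine
            self.history = defaultdict(Counter)   # stimulus -> frequency of past responses
            self.threshold = threshold            # occurrences needed before history is trusted

        def respond(self, stimulus):
            seen = self.history[stimulus]
            if sum(seen.values()) >= self.threshold:
                # Conditioned response: reuse the answer that has occurred most often.
                return seen.most_common(1)[0][0]
            response = self.solve(stimulus)
            seen[response] += 1
            return response

    # Usage with a toy operational routine standing in for a real design script:
    solver = ConditionedSolver(lambda load: "thicken" if load > 10 else "keep")
    for load in (12, 12, 12, 12):
        print(solver.respond(load))   # the fourth call is answered from recorded history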

The concept of recorded history links the notion of “learning” in artificial systems back to biological examples such as the development of adaptive immunity through vaccination. The benefit of vaccination is that it stimulates an immune response that is more rapid than the one produced by natural infection: by deliberately exposing the body to a weakened form of a disease, it becomes capable of “memorizing” how to confront it. Transposed into computation, if the recorded history were to list all the available “learned” methods, a script or program would stand a better chance of making rapid and intuitive responses to a problem.
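In computational terms, this “memorized” response is close to memoization: once a problem has been solved, its answer is stored so that the next encounter is answered immediately rather than recomputed. A minimal sketch, assuming a toy load case standing in for a real problem:

    from functools import lru_cache
    import time

    @lru_cache(maxsize=None)
    def structural_response(load_case):
        # Stands in for an expensive evaluation, e.g. running an external solver.
        time.sleep(1)                             # the cost paid on the first encounter
        return "thicken" if load_case > 10 else "keep"

    print(structural_response(12))    # slow: the "vaccination" that builds the memory
    print(structural_response(12))    # instant: answered from the recorded history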

Furthermore, artificial systems vary in complexity, and the history they need to record depends on it. Chess, for example, is a simple artificial system in which a few rules about how the pieces move dictate the composition of the chessboard at any point in the game; optimizing the next move does not require recording much of the game’s history. Economics, on the other hand, is a far more complex artificial system, one that involves many structures and criteria, so the history needed to evaluate a new move is much longer. In addition, the amount of input data required by each of these two examples differs: in chess, the seed for a new board composition is the moving of a single piece, whereas in economics the seed is a whole set of moves (changes in supply, demand, and so on), and that makes a large difference in the history to be recorded.

In conclusion, computers can reproduce some of the processes found in nature only if their algorithms are adaptive to data. Artificial intelligence in programs emphasizes the capacity to generate automated responses to specific conditions, which enables the creation of computational evolutionary models directly inspired by biology’s wonderful and countless examples. In architecture, the use of complex adaptive algorithms is opening new horizons, especially for systems that adapt to climatic, programmatic and user-related factors.




Architectural Association
Emergent Technologies & Design
Critical Reflection

2009
© 2023 Lemire Abdel Halim Chehab. All Rights Reserved.