Article

Entropy and its Application to Urban Systems

1 Laboratory for Urban Complexity and Sustainability, University of Nottingham, Nottingham NG7 2RD, UK
2 School of Physics and Astronomy, University of Nottingham, Nottingham NG7 2RD, UK
3 School of Architecture, University of Sheffield, Sheffield S10 2TN, UK
* Author to whom correspondence should be addressed.
Entropy 2019, 21(1), 56; https://doi.org/10.3390/e21010056
Submission received: 3 December 2018 / Revised: 8 January 2019 / Accepted: 9 January 2019 / Published: 12 January 2019
(This article belongs to the Special Issue Entropy and Scale-Dependence in Urban Modelling)

Abstract

Since its conception over 150 years ago, entropy has enlightened and confused scholars and students alike, from its origins in physics and beyond. More recently, it has been considered within the urban context in a rather eclectic range of applications. The entropy maximization approach, as applied by Alan Wilson and others from the 1960s, contrasts with considerations from the 1990s of the city as a thermodynamic dissipative system, in the tradition of Ilya Prigogine. By reviewing the relevant mathematical theory, we draw the distinction among three interrelated definitions of entropy: the thermodynamic, the figurative, and the information statistical. The applications of these definitions to urban systems within the literature are explored, and the conflation of the thermodynamic and figurative interpretations is disentangled. We close this paper with an outlook on future uses of entropy in urban systems analysis.

1. Introduction

Oxford Dictionaries defines entropy in three categories: (1) physical: “a thermodynamic quantity representing the unavailability of a system’s thermal energy for conversion into mechanical work, often interpreted as the degree of disorder or randomness in the system”; (2) figurative: “lack of order or predictability, gradual decline into disorder”; (3) information statistical: “a logarithmic measure of the rate of transfer of information in a particular message or language” [1]. The first definition comes from physics, and it may be argued to be equivalent to a special case of the third, information statistical, definition, applied at the microscopic level [2,3,4,5]. It is the second definition that proves to be a source of much confusion since this figurative sense is often conflated with the more strictly defined physical and statistical senses.
Entropy first entered the corpus of thermodynamics in the 1850s with Rudolf Clausius’s formulation of the Second Law of Thermodynamics, as a measure of the quality of heat energy in relation to temperature, and a characterization of irreversibility. Since then, the thermodynamic formulation has been shown to be equivalent to the molecular statistical formulations of entropy by Ludwig Boltzmann and Josiah Willard Gibbs. In the 1940s, Claude Shannon introduced a statistical measure of “information content” which, due to its obvious similarities to the Boltzmann and Gibbs formulations, was also named entropy. The principle of maximum entropy, first formulated by E.T. Jaynes, states that this statistical measure, when maximized subject to any known constraints, leads to the most likely distribution [2,6]. This statistical formulation, when applied at the microscopic level, is equivalent to the Boltzmann and Gibbs formulations. However, it can be applied more generally, especially in cases where details are either unknown or ill defined, such as in urban systems or cities.
Cities are complex and have often been viewed through the lens of systems in an attempt to better understand the numerous complex processes and structures that form a greater whole [7,8]. Significant attention is given to a plethora of interrelated urban issues such as segregation, pollution, and access to infrastructure and services, spanning all levels of complexity. The emergent non-equilibrium structure of the urban system depends on the interactions of its actors, whether the structure includes institutions, firms, households, or aggregations of these. The urban subsystems across which these actors interact may be classified in terms of the temporal scale over which they evolve, from slow, such as land use or transportation networks, to fast or immediate, such as employment or movement of people [9,10]. While there can exist local steady states, the system is dynamically evolving and we cannot predict an exact future state, but we may be able to produce models for likely configurations. With the prominence in recent years of ‘big data’ and increasing access to previously uncollected geographic data, new avenues exist for the calibration and testing of such urban models, as well as techniques to process and interpret this abundance of data [11]. It is in these areas that entropy has generated interest, in all its forms, be it as a statistical tool for analyzing these large urban datasets and performing inference where data is missing or incomplete, or purposed toward analyzing and understanding the irreversibility of complex urban processes. This latter focus is motivated by the recognition that cities dominate global resource use, with correspondingly adverse environmental impacts. The United Nations identifies “sustainable cities and communities” as one of its 17 Sustainable Development Goals [12].
The physical definition of entropy typically yields an interpretation of the city as a thermodynamic system, identified within the literature particularly with regard to postulated links between entropy, the irreversibility of the second law of thermodynamics, and a notion of ‘sustainability’ [13,14,15,16,17,18]. This approach has its roots in both the work on non-equilibrium thermodynamics of Ilya Prigogine, and that of the economist Nicolas Georgescu-Roegen, who considered entropy in relation to the economic process [19,20]. It is often, however, presented qualitatively, and can border on the second, more figurative, definition of entropy. The third, information statistical, definition has been applied widely to urban systems, for example, in Alan Wilson’s employment of entropy maximization in the modeling of transport routing networks [21].
In this paper, we review how these interpretations of entropy have been applied to urban systems. In so doing, we clarify and contextualize the differences among these three interpretations by outlining their scope of applicability. Section 2 presents the theoretical foundations of entropy in both its information statistical and physical interpretations, highlighting the equivalence of the two. We present an overview of the theory in as concise a manner as possible, with an aim to cover the main bases for understanding the entropy concept. Section 3 reviews the applications of these statistical and thermodynamic interpretations, respectively, in the context of urban systems. Lastly, in Section 4 and Section 5, we discuss the interrelations between the three entropy definitions, and offer an outlook on appropriate future uses of entropy in urban systems analysis.

2. Information and Entropy

Quantifying information content is relative and context-specific, depending on the amount of information that is possessed by the observer. Shannon’s 1948 treatise proposes a measure of information content based upon the transmission of a message from one person to another [22]. In this scenario, the message, $X$, is conceived as a string of random variables, which can take on a number of states $x_i$, with a probability of occurrence given by $p_i$. Without observation, the receiver has no information, and perceives maximum uncertainty. The information provided by the observation of the state $x_i$ is a measure of how much uncertainty it resolves for the receiver. The lower the probability of $x_i$, the more information its observation provides. For example, if $X$ is the sum of two unbiased dice rolls, the low probability observation $X = 2$ implies two throws of 1, whereas $X = 7$ is more ambiguous, and has a higher corresponding probability. In this sense, $X = 2$ gives more information about the individual dice values than the high probability observation $X = 7$. We can call the most complete description of the system, which represents the results of each individual roll, a microstate. In contrast, the aggregate description of the sum of die values, $X$, represents a macrostate and may correspond to multiple microstates.
Axiomatically, Shannon showed that we can derive a measure for the information content of the form:
$$H = \sum_i p_i \log\!\left(\frac{1}{p_i}\right) \qquad (1)$$
where the sum is over all possible observations, and the base of the logarithm sets the units of information, e.g., base 2 gives rise to the unit of ‘bits’. This formulation follows intuitively from the discussion above: an observation that occurs with certainty, i.e., $p_i = 1$, contains no information, $H(p_i) = 0$. Additionally, the information obtained from two independent observations is additive while the joint probability is multiplicative: $H(p_1 p_2) = H(p_1) + H(p_2)$. The logarithm, which is the only mathematical function that converts multiplication into addition, is the natural candidate for mapping probability $p$ into information $H$. Shannon defines $H$ as the entropy of the set of observation probabilities $p_i$. This quantity is described by Shannon as “a measure of how much ‘choice’ is involved in the selection of the event or of how uncertain we are of the outcome” [22].
It can be interpreted as the unpredictability of the observed state, or the expected amount of information obtained from a single observation.
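To make the dice example concrete, the following minimal Python sketch (our illustration, not part of the original treatment) enumerates the 36 microstates of two unbiased dice, computes the surprisal of the macrostates $X = 2$ and $X = 7$, and evaluates Equation (1) in bits:

```python
import math
from collections import Counter

# Enumerate the 36 equally likely microstates (die1, die2) and group them
# into macrostates X = die1 + die2.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
p = {x: n / 36 for x, n in counts.items()}

def surprisal(prob):
    """Information content of observing an outcome with this probability, in bits."""
    return math.log2(1 / prob)

print(f"I(X=2) = {surprisal(p[2]):.3f} bits")  # rare outcome: ~5.17 bits
print(f"I(X=7) = {surprisal(p[7]):.3f} bits")  # common outcome: ~2.58 bits

# Shannon entropy (Equation (1)): the expected surprisal over all macrostates.
H = sum(prob * surprisal(prob) for prob in p.values())
print(f"H(X)   = {H:.3f} bits")                # ~3.27 bits
```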

2.1. Entropy Maximization

As presented by Cesario (1975), a rational gambler will bet on an outcome that has, to their knowledge, the highest probability of occurring [23]. For our dice example, this most probable outcome is the macrostate $X = 7$, as it has the highest number of microstates, i.e., six different ways of occurring, assuming unbiased dice where each microstate has equal probability. This is equivalent to choosing the macrostate with the maximum entropy. The principle of maximum entropy states that the probability distribution with the largest entropy, subject to known constraints, gives the best representation of our current state of knowledge of the system. To the gambler, this represents the probability distribution that is most likely to be ‘true’. If the gambler were to gain additional information, e.g., that one of the dice is biased, the perceived probability distribution and the most probable macrostate could change. This new distribution may be determined by maximizing Equation (1) subject to any constraints given by this new information. If perfect information, i.e., a complete probability distribution providing the probabilities of each microstate, is given, the new most likely macrostate would be easily determined. If this information is imperfect though, how does the gambler determine the best way to place a bet?
If no information about the probability distribution is known, the entropy function is maximized when all outcomes have the same probability, i.e., all $p_i$ are equal to $1/n$. Otherwise, the gambler must use the probability distribution that maximizes the entropy subject to known constraints based upon all information that is available. This can be done formally by introducing Lagrangian multipliers, e.g., $\lambda$ and $\mu$ for the two constraints of $\sum_i p_i = 1$ and a fixed expectation value of $X$, which leads to the general probability distribution $p_i = e^{-\lambda - \mu x_i}$. This form of probability can be used to determine the normalization constant, which leads to the partition function. This method can be generalized for any number of constraints, and is outlined in detail by Jaynes [2].
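As a worked illustration of this procedure, the sketch below uses Jaynes’ well-known ‘Brandeis dice’ example (our choice of example, not drawn from the urban literature): a six-sided die is known only to have mean face value 4.5, and the maximum entropy distribution $p_i \propto e^{-\mu x_i}$ is found by solving numerically for the multiplier $\mu$:

```python
import numpy as np
from scipy.optimize import brentq

# A die is known only to have mean face value 4.5 (an unbiased die would
# give 3.5). Maximum entropy subject to normalisation and this expectation
# constraint gives p_i = exp(-mu * x_i) / Z.
x = np.arange(1, 7)
target_mean = 4.5

def mean_given_mu(mu):
    w = np.exp(-mu * x)
    return (x * w).sum() / w.sum()

# Solve for the Lagrange multiplier mu that satisfies the mean constraint.
mu = brentq(lambda m: mean_given_mu(m) - target_mean, -5.0, 5.0)
w = np.exp(-mu * x)
p = w / w.sum()

print("mu   =", round(mu, 4))
print("p    =", p.round(4))               # skewed towards the high faces
print("mean =", round((x * p).sum(), 4))  # recovers 4.5
```

The resulting distribution is maximally noncommittal: it favours high faces just enough to satisfy the constraint, and assumes nothing else.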
The principle of maximum entropy is profound, in that it uniquely determines the probability distribution that is maximally noncommittal to the missing information. It assumes no more than what is given, and provides a method for making inferences based upon the possession of partial information. The application of entropy maximization is understandably broad, ranging from statistical mechanics to image processing, genomic analysis, and any domain where data and information are assessed [24,25,26]. Jaynes concludes that statistical mechanics need not be regarded as a physical theory (such as classical thermodynamics), dependent for its validity on additional assumptions of mechanics or the principle of equal a priori probability (PEAPP), which stipulates that all accessible microstates are equivalent after all known constraints are considered. In other words, the method of statistical inference and maximum entropy is generally valid, regardless of the details of the system to which it is applied.

2.2. The Second Law and Thermodynamic Applications

In classical thermodynamics, Clausius states the second law of thermodynamics simply as “heat cannot spontaneously flow from a colder body to a hotter body”, which gives rise to the notion that heat at higher temperatures is more useful and possesses a higher quality. He noted that, while the heat exchanged reversibly, $\int_A^B dQ_{rev}$, depends on the path taken from the initial state $A$ to the final state $B$, division by the absolute temperature $T$ produces an integral $\int_A^B \frac{dQ_{rev}}{T}$, which is independent of the path and, therefore, corresponds to a function of state [27]. Clausius named this state variable entropy, $S$, defined through its differential, $dS = \frac{dQ_{rev}}{T}$, so that $S_B - S_A = \int_A^B \frac{dQ_{rev}}{T}$.
The statistical mechanics formulation of entropy was hypothesized by Boltzmann through his famous equation $S = k_B \log W$, where $k_B$ is the Boltzmann constant and $W$ is the number of microstates accessible to the system. The base of the logarithm is not critical and often taken to be $e$. The microstates are specific states of the system, fixing all information (quantum or otherwise) for each individual atom or particle. For physical systems consisting typically of $10^{23}$ particles, $W$ is of the order of $2^{10^{23}}$, an unimaginably large number. Hence, it is generally not possible to analyse each of those microstates without the assumption that all accessible states are equivalent (PEAPP).
Boltzmann’s formula was later generalized by Gibbs for systems where microstates are occupied with probability $p_i$, with a sum over all available microstates:
$$S = k_B \sum_i p_i \log\!\left(\frac{1}{p_i}\right) \qquad (2)$$
For isolated systems, i.e., those with no means to exchange matter or energy with their surroundings, the assumption of PEAPP leads to $p_i = 1/W$. Thus, the Boltzmann and Gibbs definitions of statistical entropy can be easily shown to be equivalent. For open systems (e.g., exchanging energy with the environment at temperature $T$), PEAPP gives way to the Boltzmann distribution, where a microstate with energy $E_i$ is occupied with probability $p_i$:
$$p_i = \frac{1}{Z}\, e^{-E_i/k_B T}, \qquad Z = \sum_i e^{-E_i/k_B T} \qquad (3)$$
For a reversible process in which no work is exchanged and all microstate energies $E_i$ are preserved, the first law of thermodynamics dictates that the heat exchanged is the change in internal energy: $dQ_{rev} = dE$, where $E = \sum_i p_i E_i$ is the total energy of the system. The Gibbs entropy definition can be differentiated to give (noting $\sum_i p_i = 1$ and $\sum_i dp_i = 0$):
$$dS = k_B \sum_i dp_i \log\!\left(\frac{1}{p_i}\right) \qquad (4)$$
Combining with the Boltzmann distribution, we have (again noting $\sum_i dp_i = 0$):
$$dS = k_B \sum_i dp_i \left[\frac{E_i}{k_B T} + \log Z\right] = \frac{\sum_i dp_i\, E_i}{T} = \frac{dE}{T} \qquad (5)$$
Thus, the statistical entropy is equivalent to the classical definition due to Clausius. This equivalence is general, despite the proof relying on a reversible process without work exchange. This is due to the fact that entropy is a function of state and, therefore, does not depend on the specific process involved.
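This equivalence is easy to verify numerically. The sketch below (our illustration, in units where $k_B = 1$ and with an arbitrary hypothetical four-level spectrum) compares a finite difference of the Gibbs entropy, Equation (2), against the Clausius prediction $dS = dE/T$ for a quasistatic temperature change:

```python
import numpy as np

kB = 1.0  # work in units where k_B = 1

# An arbitrary fixed spectrum of microstate energies E_i (hypothetical).
E_levels = np.array([0.0, 1.0, 2.5, 4.0])

def boltzmann(T):
    w = np.exp(-E_levels / (kB * T))
    return w / w.sum()

def gibbs_entropy(T):
    p = boltzmann(T)
    return -kB * (p * np.log(p)).sum()

def internal_energy(T):
    return (boltzmann(T) * E_levels).sum()

# Quasistatic heating with no work done: dQ_rev = dE, so Clausius predicts
# dS = dE / T. Compare central finite differences of S and E.
T, dT = 2.0, 1e-5
dS = gibbs_entropy(T + dT) - gibbs_entropy(T - dT)
dE = internal_energy(T + dT) - internal_energy(T - dT)
print("dS     =", dS)
print("dE / T =", dE / T)   # agrees to numerical precision
```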
The Gibbs entropy is equivalent to Shannon’s definition, save for the Boltzmann constant, and classical thermodynamics may be viewed as an early example of the principle of entropy maximization. Ben-Naim argues that the units of $\mathrm{J\,K^{-1}}$ are but a “historical accident,” and that, if a new absolute temperature were defined as $\bar{T} = k_B T$, the Gibbs and Shannon formulations would become equivalent [28] (pp. 204–205). Since the microstates are too numerous to assess individually, PEAPP represents the best interpretation of the physical system, and entropy is then maximized. Jaynes (1957) argues that we may view statistical mechanics as a special case of the more general procedures of inference derived from Shannon’s formulation of entropy [2]. In terms of information, we may, thus, think of Gibbs’ formulation as providing the amount of information needed to define the microstate of the system, given its macroscopic properties. Extensions of this equivalence in terms of quantum information and statistics may be found in References [4,5,29].
Given Boltzmann’s equation, $S = k_B \log W$, the maximization of entropy leads to the maximum number of microstates $W$, which corresponds to the largest probability. Statistically, an isolated system will always evolve towards the most probable macrostate, which corresponds to the largest number of microstates and maximum entropy. Thus, the second law can be interpreted statistically: when heat flows from hot to cold, the number of accessible microstates increases, such that the outcome “heat flows from hot to cold” is overwhelmingly more likely than “heat flows from cold to hot.” The second law is also equivalently stated as the “entropy of an isolated system always increases.” Jaynes formulates this in terms of information as “although our information as to the state of a system may be lost in a variety of ways, the only way in which it can be gained is by carrying out further measurements” [6].
Since the 1920s, following Eddington, it has been recognized that the second law of thermodynamics holds a special position in science, in that it offers (at least for isolated systems) the arrow of time. No other law in science explicitly distinguishes the past from the future. The second law is, therefore, often cited to imply a universal decline into disorder and chaos, so that entropy, itself an indicator of thermodynamic irreversibility, becomes naturally viewed as the indicator of this degradation. It should be noted that, in the thermodynamic context, the second law is firmly based on the statistics of large numbers of microstates (of the order of $2^{10^{23}}$). This level of statistical significance is often lost in more general entropy applications. For example, we do not expect the second law to apply to social order, where possible states are both limited in number and open to interpretation.

3. Applications of Entropy to Urban Systems

Entropy, in all three definitions, has been applied to urban systems for a range of processes and phenomena. Without being exhaustive, we review and highlight some representative applications in this area. We split this discussion into applications of the information statistical and physical thermodynamic interpretations of entropy, respectively, focusing particularly on the method of entropy maximization and the characterization of a dissipative urban system as key examples of these two interpretations. The figurative definition is discussed in relation to its conflation with the thermodynamic definition, since it receives little explicit application.

3.1. Information Statistical Entropy

There exist numerous applications of the information statistical definition of entropy to urban systems. A prominent family of these exploits the property that entropy is maximal when probabilities are evenly distributed, and zero when concentrated in a single location, and can thus be used as a measure of spatial concentration or dispersion [30]. Such an interpretation can yield numerous indices measuring phenomena in urban systems such as ethnic diversity [31], urban sprawl [32,33], segregation [34], diversity of urban land use [35], and the geographic distribution of species to infer biodiversity [36]. These applications are often coupled with GIS and remote sensing techniques to analyse real geographic data [37,38,39,40], as well as instance matching of similar points of interest through geo-location data [41]. Medvedkov too suggests a comparable entropy method in an attempt to find order in the spatial distribution of settlements by comparing random and clustered distributions [42]. An early review of such applications is presented in Reference [35], along with a discussion of when Shannon’s information measure may be appropriately replaced with the alternative information measures of Brillouin and Good. These applications are often based on quite strong assumptions, due largely to the difficulty in gathering the data needed to capture the distributions in question.
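As an illustration of such an index, the sketch below (our construction; the land-use shares are hypothetical) computes a Shannon diversity measure normalized by its maximum, $\log n$, so that 0 indicates full concentration in one category and 1 indicates an even spread:

```python
import numpy as np

def diversity_index(shares):
    """Normalized Shannon entropy of category shares: 0 = full concentration
    in one category, 1 = perfectly even spread across all n categories."""
    p = np.asarray(shares, dtype=float)
    p = p / p.sum()
    nonzero = p[p > 0]                 # treat 0 log 0 as 0
    H = -(nonzero * np.log(nonzero)).sum()
    return H / np.log(len(p))          # divide by the maximum, log n

# Hypothetical land-use shares (residential, commercial, industrial, green):
print(diversity_index([0.25, 0.25, 0.25, 0.25]))  # 1.0: maximally mixed
print(diversity_index([0.97, 0.01, 0.01, 0.01]))  # ~0.12: near single-use
```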
Batty develops an information statistic termed ‘spatial entropy’ as a discretized formulation of Shannon’s continuous representation of information entropy, explicitly including the coordinate system through the use of the class interval [43,44,45]. The inclusion of the spatial interval size in Batty’s formulation holds implications for geographical analysis, and allows for a comparison of the effects of partitioning the spatial system, e.g., the explicit inclusion of zone size in entropy maximization models, and analysis of trends in the spread of probability distribution across an urban system.
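A minimal sketch of this idea follows, using the form $H = -\sum_i p_i \log(p_i/\Delta x_i)$, with $\Delta x_i$ the zone sizes; the shares and areas are hypothetical, and this should be read as our paraphrase of Batty’s formulation rather than a reference implementation:

```python
import numpy as np

def spatial_entropy(p, zone_areas):
    """Spatial entropy H = -sum_i p_i log(p_i / A_i), which reduces to
    Shannon's form when all zone areas A_i equal 1."""
    p = np.asarray(p, dtype=float)
    A = np.asarray(zone_areas, dtype=float)
    return -(p * np.log(p / A)).sum()

# Hypothetical population shares over three zones of unequal area.
p = np.array([0.5, 0.3, 0.2])
A = np.array([2.0, 1.0, 1.0])   # the first zone is twice as large

print(spatial_entropy(p, A))            # area-aware measure
print(spatial_entropy(p, np.ones(3)))   # ordinary Shannon entropy
```

The explicit zone areas make the measure sensitive to how the spatial system is partitioned, which is precisely the comparison Batty’s formulation is designed to enable.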
These methods can, of course, also be applied to spatial scales beyond the urban, facilitating analysis at regional or national scales in, e.g., ecology, or in economics to deal with quantities such as income inequality in a given population [46,47].
Wilson, through a series of works from the 1960s onwards, popularizes the application of entropy maximization to the urban region [21,48,49,50]. He casts the system of urban transport flows as what Weaver (1948) refers to as one of “disorganized complexity”: a system described by a large number of variables, with a large number of elements that interact only weakly [50,51]. In the context of modeling the pattern of transport flows within an urban system, the problem of finding “the most probable state” is initially posed. Dividing the system into zones between which travel occurs, a matrix $T_{ij}$ can be constructed, detailing all individual transport flows from zones $i$ to $j$, describing the ‘state’ of the system. Wilson (1967) argues that a good estimate of $T_{ij}$ may be made by applying three constraints: fixing the total number of workers living in a given origin zone $i$, fixing the total number of jobs in a given destination zone $j$, and fixing the total ‘generalised cost’, or impedance, associated with travel to work [48]. By assuming that each microstate of $T_{ij}$ is equally probable, we, therefore, want to find the $T_{ij}$ with the largest number of microstates $W(T_{ij})$ giving rise to it. This may be achieved by maximising $W(T_{ij})$ subject to the three imposed constraints, although Wilson chooses to equivalently maximize $\log W$, allowing Stirling’s approximation to simplify the maths.
This function is then maximized subject to the given constraints using a Lagrangian multiplier approach, which reveals an exponential expression for the most probable $T_{ij}$ matrix. The resultant expression reduces to the function that had been utilised in earlier ‘gravity’ models, providing an independent theoretical derivation, and superseding these expressions without the need for arbitrary tuning constants [52,53]. The approach builds on the entropy maximization method developed by Jaynes and detailed above, down to the analogous search for the most likely macrostate, as emphasized by Wilson in References [21,49,54]. He interprets the entropy, $\log W$, as a measure of the system uncertainty, which ought to be maximized to give the most likely scenario. All knowledge of the system is considered as constraints of the maximization and is incorporated via the Lagrangian multipliers.
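For concreteness, the sketch below implements a doubly constrained model of this kind, $T_{ij} = A_i O_i B_j D_j e^{-\beta c_{ij}}$, with the balancing factors $A_i$, $B_j$ found by simple iteration; the zone totals, costs, and $\beta$ are hypothetical, and the iterative balancing scheme is the standard numerical approach rather than anything specific to Wilson’s own presentation:

```python
import numpy as np

# Doubly constrained model T_ij = A_i O_i B_j D_j exp(-beta * c_ij):
# the balancing factors A_i, B_j enforce the origin and destination totals,
# and beta is the Lagrange multiplier on the generalised cost constraint.
O = np.array([400.0, 600.0])         # workers per origin zone (hypothetical)
D = np.array([300.0, 500.0, 200.0])  # jobs per destination zone (hypothetical)
c = np.array([[1.0, 2.0, 3.0],       # generalised travel costs c_ij
              [2.5, 1.0, 2.0]])
beta = 0.5

f = np.exp(-beta * c)                # deterrence function
A = np.ones(len(O))
B = np.ones(len(D))
for _ in range(100):                 # iterate balancing factors to convergence
    A = 1.0 / (f @ (B * D))
    B = 1.0 / ((A * O) @ f)

T = (A * O)[:, None] * (B * D)[None, :] * f
print(T.round(1))
print(T.sum(axis=1))                 # row sums reproduce O
print(T.sum(axis=0))                 # column sums reproduce D
```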
This derivation led Wilson to propose a family of ‘spatial interaction’ models based upon this method, but applying different constraints. These include a retail model where the destination is fixed and the flow to this constrained location becomes the subject of analysis, and various considerations of disaggregation of the trips among various archetypal groups [21,55]. This entropy maximization approach to urban transport flows has been picked up, expanded, and adapted by numerous authors over the years. Such considerations include a continuous rather than discrete spatial representation [56], and simultaneous minimization of the generalized cost function in order to optimize network topology [57]. The incorporation of prior information from external data, such as traffic counts, has been demonstrated by References [58,59,60]. In this case, not all trips are assumed equally likely, since this ‘known’ information about the $T_{ij}$ matrix is built in as a constraint. Such methods provide ways of improving model accuracy based upon real world data. Griffith & Jones (1980) investigate the relation of distance decay to the spatial structure associated with the origins and destinations [61], and Mattsson describes an approach for maximizing ‘welfare’ in the allocation of housing [62]. More comprehensive reviews of various adaptations and applications of this model are presented in References [57,63], as well as in Wilson’s 2010 reflections on the technique, where it is claimed that such methods are routinely used by international companies wishing to optimize the location of new retail sites [50].

3.2. Thermodynamic Entropy

Applications of thermodynamic principles to the urban system are diverse. A review of some of these is presented by Filchakova et al. [14]. For example, in engineering thermodynamics, exergy is defined as the maximum amount of work that may be obtained from a system by bringing it into equilibrium with its environment [64]. Historically, exergy analysis has been used to improve the thermodynamic efficiency of various industrial processes by identifying and minimizing exergy destruction. This has led to applications to larger systems assessing energy efficiency at national, regional, and urban scales [65,66,67,68,69]. For example, Nielsen and Jørgensen (2015) develop an exergy accounting framework, mapping the locations of large exergy consumption across six societal sectors for a small region, which allowed for the identification of key areas of attention for a proposed ‘sustainable energy’ transition plan [68]. It can be shown that the exergy destruction of a process, $\Psi_D$, is related to its entropy production, $S_G$, through the environmental reference temperature, $T_0$: $\Psi_D = T_0 S_G$ [70]. Thus, similar approaches considering entropy as a geographically applied indicator for targeting and improving upon energy inefficiencies are presented in References [71,72,73]. These methods remain superior to traditional energy accounting methods, as they capture the ‘quality’ of energy, that is, its capacity to perform work [74,75].
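As a toy illustration of the relation $\Psi_D = T_0 S_G$ (the numbers are ours, not drawn from any of the cited studies), consider heat leaking irreversibly from a hot body to a cold one:

```python
# Entropy generation and exergy destruction for a simple irreversible
# process: heat Q leaking from a hot body to a cold one.
Q = 1000.0      # heat transferred, J
T_hot = 500.0   # K
T_cold = 300.0  # K
T_0 = 288.0     # environmental reference temperature, K

# Entropy generated: the cold body gains more entropy than the hot one loses.
S_G = Q / T_cold - Q / T_hot          # J/K
# Gouy-Stodola relation: work potential destroyed by the irreversibility.
Psi_D = T_0 * S_G                     # J

print(f"S_G   = {S_G:.2f} J/K")       # ~1.33 J/K generated
print(f"Psi_D = {Psi_D:.1f} J")       # ~384 J of exergy destroyed
```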
One approach to modern thermodynamics, which departs somewhat from the familiar statistical mechanics interpretation, is an extension of classical thermodynamics outside of equilibrium, and has received attention in its purported applicability to urban systems. The vast majority of systems observed in nature are open, out of equilibrium, and undergoing irreversible processes. The urban system itself is one such example. Classical thermodynamics, concerned only with the initial and final states of systems in thermodynamic equilibrium, fails to include a theory of irreversible processes. A thermodynamic study of real systems requires a more general approach, something developed from the beginning of the 20th century by theorists such as Onsager and Prigogine. However, this area of thermodynamics is still an active work in progress and lacks an established corpus [27]. Prigogine, in his study of the phenomena of self-organization, forwards the notion of ‘dissipative systems’ to refer to complex open structures that maintain their functioning through the constant dissipation of thermodynamic entropy [20]. This usually involves consideration of the entropy balance, representing the total entropy change of the system as the sum of its internal entropy production and the entropy exchange due to fluxes of matter and energy across the system’s boundary:
$$\frac{dS}{dt} = \frac{d_i S}{dt} + \frac{d_e S}{dt} \qquad (6)$$
The second law tells us that the internal entropy production $d_i S/dt \geq 0$, but the total entropy change of the system is permitted to be positive or negative, as $d_e S/dt = d_e S^{(\mathrm{in})}/dt - d_e S^{(\mathrm{out})}/dt$ may take on either sign.
The city as a dissipative system, in relation to a notion of sustainability, is considered by Rees & Wackernagel (1997) [13]. They describe cities as “entropic black holes”, which draw in large amounts of energy and matter, and “export the resultant entropy (waste and disorder)” to maintain their “highly-ordered dissipative structure.” It is argued that the ordered dissipative structure of the city is maintained at the expense of “increasing entropy or disorder in the environment”, citing trends of natural capital depletion, pollution emissions, and other adverse anthropogenic ecological consequences. Marchettini et al. (2006) extend this thread of work, but caution against an “entropic euthanasia” in which the urban system has fully degraded all potentials through unprecedented growth in energy and matter inflows, leading to “maximum disorder and maximum entropy” analogous to an ‘urban heat death’ [76]. Similar arguments are also explored in Reference [77].
Filchakova et al. (2007) consider similar arguments in their wider review of thermodynamic concepts applied to the city, exploring its casting as an open system, and the analogy of the city as a living organism [14]. Following the arguments put forward by Marchettini et al., they appear to strive toward a more quantitative thermodynamic representation to “represent urban metabolism in an operational way”. A coupling of the ecosystem theory literature with that on urban metabolism is suggested, but no concrete steps are taken.
Fistola (2011, 2012) draws on authors such as Lovelock [78] and Rifkin [79], who both link entropy and the second law to notions of sustainability and humanity’s impact on global resources and ecosystems. A qualitative description of excessive levels of pollution is presented as a manifestation of “high entropy”, alongside a loss of social capital due to increasing individualization, which is linked to an entropic trend of “structural decay” [80,81]. A strategy is articulated which includes the assessment of “urban entropy”. This is developed in Reference [18], where a list of “entropy indicators” is produced, including air quality, unemployment rate, waste production, and flooding risk, which are then mapped spatially. A similar approach using ‘entropy indicators’ is also taken by Pelorosso et al. in Reference [82]. These studies, while touching on the thermodynamic definition of entropy, employ the second, figurative, definition of entropy as a broader metaphorical gauge for a lack of ‘order’.
There are two broad arguments here, the first being that to maintain ‘order’, $dS/dt \leq 0$, the urban system must export entropy to its surroundings, which, in turn, acts to degrade this external environment. The second, not necessarily incompatible with the first, argues that, in some cases, the entropy of the urban system itself grows, $dS/dt > 0$, which threatens its functional integrity. Potentially, both arguments are based upon mixed premises, conflating the thermodynamic and figurative definitions through the association of irreversibility with ‘degradation’. Despite Filchakova et al.’s call for more quantitative conceptualizations of these representations, such quantification remains largely absent. Where indicators have been presented, their link to ‘entropy’ remains qualitative, containing none of the thermodynamic theory outlined above or in more classical considerations, despite continued interest within the literature. Why is this still the case?
We argue that this derives primarily from conflation between the first two definitions of entropy detailed in our introduction: the thermodynamic sense and the figurative sense. This can be seen in the language that tends to be used, centered on notions of ‘disorder’ and ‘degradation’ beyond a simple thermodynamic context. As has been pointed out by authors in the past, anything described as a source of ‘disorder’, such as pollution, cannot simply be coded as ‘entropic’ or acting to increase entropy in the thermodynamic sense [73,83,84]. Entropy is not a direct measure of ‘utility’ or the lack thereof. For example, the increase in entropy due to the release of an amount of hydrogen cyanide into the environment is roughly equivalent to that from releasing the same amount of carbon dioxide, despite the potentially catastrophic effects of the former. Thus, entropy, at least in its thermodynamic sense, is not a sufficient measure of the negative externalities caused by human activity, and the association of the sign of $dS/dt$ with these is problematic.
Furthermore, as has been pointed out by numerous authors, in conflating definitions 1 and 2, it is common to overlook the fact that the second law, cast in terms of inevitable decay to equilibrium or ‘heat death’, only applies to isolated systems that cannot exchange matter or energy with their environment [73,84,85]. Both the city and Earth itself are open systems, and thus a fear of ‘entropic accumulation’, in the physical sense, is unfounded, since entropy produced is merely radiated out of the system as waste heat. As shown by, e.g., Weiss (1994), the natural entropy production of the Earth system through the dissipation of solar energy greatly dwarfs any entropy produced by anthropic processes [86]. Of course, on an urban scale, this anthropic entropy production is arguably less negligible. The problems caused by urban heat islands are well documented, but these are much better understood in terms of other thermodynamic concepts such as heat, and we must understand the limits of entropy conceptualization where other such concepts take over.

4. Discussion

In reviewing various applications of ‘entropy’ to urban systems, we have shown that the distinction between the three definitions laid out in the Oxford Dictionaries is not so clear within the literature. This can be seen primarily in conflation between the physical definition and the figurative one. This has arguably been persistent since these definitions were first popularised, and the association of thermodynamic entropy with disorder was first put forward by Boltzmann and Helmholtz in the 19th century [87]. The analogy is often put forth with examples of a deck of cards becoming less ordered when shuffled, or a child’s room becoming messier over time. This offers an intuitive understanding of the abstract quantity of entropy that is appealing, but also demonstrably misleading. A full review of the merits and problems of the disorder metaphor is presented in Reference [88]. Overreliance on this metaphor leads to qualitative perceptions of ‘disorder’ being presented as evidence of entropy, as seen in the discussions on dissipative systems above. This figurative use is widespread, but becomes problematic when one attempts to ‘operationalize’ it. It is for this reason, perhaps, that actual implementations of measuring thermodynamic entropy in this way, as noted by Filchakova et al., appear to be almost absent from the literature.
It seems that many authors are not aware of the clear distinctions between the first and second definitions. This is particularly apparent in the writings, and subsequent criticisms, of the economist Georgescu-Roegen, whose initial work relating entropy and the economic process underwent significant revisions over the course of his career [89,90,91]. He later reflects on how, delving into the field as an interdisciplinary pioneer, he was inspired by the writings of Max Planck on entropy and “matter dissipation”, leading to his proposed ‘fourth law’, the postulate that complete recycling of matter is impossible [92]. This body of work was heavily criticized by authors from the natural and social sciences alike, and it is now widely regarded as overreaching in its application of thermodynamics, in part due to the inapplicability of thermodynamic entropy to material flows [91,93]. Despite this, Georgescu-Roegen’s challenge to the economic orthodoxy from an environmental standpoint is seen as a defining moment in the founding of the field of ecological economics [94,95,96]. This presents a good example of the wider confusion that pervades applications of thermodynamic entropy, and of how the distinction between its figurative use as an analogy and its proposed operationalization as a well-defined physical quantity is not always clear.
Clear parallels can be seen here regarding the work conceptualizing urban dissipative systems, particularly with regard to seeking wider applications of concepts traditionally limited to physics. These conceptualizations tend to rely on the analogy of the urban system to an organism or complex ecosystem, which itself predates considerations of thermodynamics in this respect, e.g., discussions of urban metabolism in Reference [97]. In a similar vein, reflections on biological systems in relation to entropy and the second law have received notable attention, including Schrödinger’s 1944 ‘What is Life?’ [98], as well as considerations of entropy throughput as an indicator of ecosystem integrity or adaptive capacity [99,100,101]. These links provide the motivation to explore the possibilities of extending similar analyses to urban systems, particularly in relation to issues of sustainability and resilience. It seems, however, that in translation across disciplines, and amid the ambiguity of application vs. analogy, it becomes common to lose much of the thermodynamic theory in favor of more figurative understandings of entropy.
This holds some implications for any application of the first, thermodynamic, definition of entropy to urban systems. First, one should be careful that conflation is not being made with the second, figurative, definition. A general rule of thumb is that the first definition is measurable, at least in theory, whereas the figurative definition is not. To be of any use as an indicator, one must be able to measure thermodynamic entropy in thermodynamic units of $\mathrm{J\,K^{-1}}$. Without a mathematical formulation, it remains a metaphor, and one which we argue fails to be useful, sowing confusion rather than bringing clarity.
On the other hand, some authors present explicitly analogous mathematical constructs to thermodynamic and statistical mechanical principles to bring insights into other fields, including within ecology and urban analysis [101,102,103]. This has the potential to provide novel modeling techniques and insights, as the urban gravity-analogous models did in the mid-20th century. Caution should be taken, however, regarding the limits to which such analogies can apply. In fundamental thermodynamics, concepts like energy and the second law are well defined, and microscopic fluctuations are suppressed by the statistics of very large numbers. This is often not the case when seeking analogous thermodynamic ‘laws’ that govern social and ecological interactions, where fluctuations can determine overall system behaviour at the macroscopic scale.
What does this all mean for applications of the third, information statistical, definition? Generally, this sense seems to remain distinct from the other two in the literature, although a small number of applications exist that attempt to combine entropy maximization with the language of dissipative systems [104,105]. As shown above, this third definition remains well-established in the literature considering urban system models, both as a measure of geographic dispersion, and for its use in entropy maximization techniques. We have argued that the first definition can be interpreted as a special case of the third, a view widely acknowledged in the literature. Despite this, these applications remain largely separate in consideration at the urban level. This can be largely seen as a matter of scale. The thermodynamic interpretation is concerned with specific microstates, the occupation of which is set by the Boltzmann distribution via the energy levels of the microstates. PEAPP is often implicitly assumed. In this context, the statistics are drawn from the incredibly large number of microstates (of the order of $2^{10^{23}}$), so that energy and temperature play central roles. In contrast, information statistical applications generally deal with macroscopic configurations (such as transport routing), whose combinations are significantly fewer in number. With systems often out of equilibrium, the equivalents of energy and temperature are less clear.
Conflation of the third statistical definition with the second figurative definition is less prevalent, and the analogy with disorder is less relevant [106]. As well as being a misleading and subjective metaphor, the concept of disorder lacks the axiomatic properties, such as additivity, from which Shannon derived his expression for $H$. Nevertheless, Ayeni (1976) cautions against an analogous confusion of the technical meaning of ‘information’ with its everyday understanding [7]. One should note that information entropy, in its standard form, lacks any notion of ‘information quality’, cf. the discussion above of how thermodynamic entropy is blind to notions of ‘utility’. Although interpretation here is often influenced by the choice of variables, this has implications for cases where population similarity cannot be assumed, and methods incorporating population weighting must be employed [107]. As emphasized by, e.g., Walsh and Webber (1977), one must remember that entropy as a measure of information is not an objective measure of the system, but is dependent on the observer’s ‘model’ of the system and the experiments they choose to perform on it. In this case, one must be clear in the definition of entropy in terms of the system variables chosen, and avoid conflation between prior information about the system before experimentation and information obtained from an experiment [35].
While, in some circles, the entropy maximization approach has been fully embraced, others remain critical, and it may still be argued to form a fringe within human geography [50]. Central to much of this scepticism are the underlying epistemological assumptions: not only that human actors at some level of statistical aggregation may be considered to operate rationally and deterministically like molecules, but also that a complex social system such as a city may be adequately described by a finite series of quantitative parameters [23,108,109]. This philosophical debate is beyond our scope, and, so long as all modeling assumptions are laid bare, we feel, largely moot.

5. Conclusions

The three notions of entropy have given rise to much general confusion, which overshadows its applications in urban systems. Disentangling these separate notions enables the concept of entropy to be utilised in a wide range of worthwhile applications. The first, thermodynamic, definition of entropy applies to specific microstates, and the assumption of PEAPP enables it to be connected to physical parameters such as energy and exergy. In this sense, entropy could supplement and enhance traditional energy/exergy analysis, such as in diagnosing energy efficiency within the urban system [73]. It could also offer analogous use of thermodynamic identities for modeling urban phenomena. The second, figurative, definition of entropy should not feature in quantitative analysis, lest the first definition slide into this more metaphorical interpretation of entropy. The third, information statistical, definition of entropy is the most general of the three; it yields numerous applications to urban systems, ranging from spatial clustering analysis of various phenomena to performing statistical inference for incomplete datasets. At the microscopic scale of atoms, the statistical entropy coincides with the thermodynamic entropy, and the statistics of large numbers leads to the classical laws of thermodynamics. As a universal measure, information statistical entropy is applicable at all scales, and it should therefore perhaps feature more extensively in urban studies than it currently does. In part, this underuse may be due to the confusion surrounding its varying usages. By disentangling these definitions, we believe entropy offers huge potential for urban systems analysis in the future.

Author Contributions

Conceptualization, B.P. and Y.M.; Writing-Original Draft Preparation, B.P.; Writing-Review & Editing, B.P., Y.M., and D.R.; Funding Acquisition, D.R. and Y.M.

Funding

This work was supported by the Engineering and Physical Sciences Research Council (grant number 1643433) and the Leverhulme Trust research program grant ‘Sustaining urban habitats: an interdisciplinary approach’ (RP2013-SL-015).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Oxford Dictionaries. Available online: https://en.oxforddictionaries.com/definition/entropy (accessed on 26 October 2018).
  2. Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. 1957, 106, 620–630. [Google Scholar] [CrossRef]
  3. Curado, E.M.F.; Tsallis, C. Generalized statistical mechanics: Connection with thermodynamics. J. Phys. A. Math. Gen. 1992, 25, 1019. [Google Scholar] [CrossRef]
  4. Weilenmann, M.; Kraemer, L.; Faist, P.; Renner, R. Axiomatic Relation between Thermodynamic and Information-Theoretic Entropies. Phys. Rev. Lett. 2016, 117, 1–6. [Google Scholar] [CrossRef] [PubMed]
  5. Parrondo, J.M.R.; Horowitz, J.M.; Sagawa, T. Thermodynamics of information. Nat. Phys. 2015, 11, 131–139. [Google Scholar] [CrossRef]
  6. Jaynes, E.T. Information Theory and Statistical Mechanics. II. Phys. Rev. 1957, 108, 171–190. [Google Scholar] [CrossRef]
  7. Ayeni, M.A.O. The city system and the use of entropy in urban analysis. Urban Ecol. 1976, 2, 33–53. [Google Scholar] [CrossRef]
  8. Batty, M. Cities as Complex Systems: Scaling, Interactions, Networks, Dynamics and Urban Morphologies. In The Encyclopedia of Complexity & System Science; Springer: Berlin, Germany, 2008; ISBN 9780749215453. [Google Scholar]
  9. Wegener, M. Operational Urban Models State of the Art. J. Am. Plan. Assoc. 1994, 60, 17–29. [Google Scholar] [CrossRef]
  10. Robinson, D. Computer Modelling for Sustainable Urban Design; Routledge: London, UK, 2011. [Google Scholar]
  11. Kitchin, R. The real-time city? Big data and smart urbanism. GeoJournal 2014, 79, 1–14. [Google Scholar] [CrossRef]
  12. UN. Transforming Our World: The 2030 Agenda for Sustainable Development; Resolution Adopted by the General Assembly on 25 September 2015 (A/RES/70/1); United Nations: New York, NY, USA, 2015; ISBN 9780874216561. [Google Scholar]
  13. Rees, W.; Wackernagel, M. Urban Ecological Footprints: Why Cities Cannot be Sustainable—And Why They are a Key to Sustainability. In Urban Ecology; Springer: Boston, MA, USA, 1997; pp. 537–555. ISBN 9780387734118. [Google Scholar]
  14. Filchakova, N.; Robinson, D.; Scartezzini, J.-L. Quo vadis thermodynamics and the city: A critical review of applications of thermodynamic methods to urban systems. Int. J. Ecodynamics 2007, 2, 222–230. [Google Scholar] [CrossRef]
  15. Bristow, D.; Kennedy, C. Why Do Cities Grow? Insights from Nonequilibrium Thermodynamics at the Urban and Global Scales. J. Ind. Ecol. 2015, 19, 211–221. [Google Scholar] [CrossRef]
  16. Pulselli, R.M.; Ciampalini, F.; Galli, A.; Pulselli, F.M. Non Equilibrium Thermodynamics and the City: A New Approach to Urban Studies. Ann. Chim. 2006, 96, 543–552. [Google Scholar] [CrossRef] [PubMed]
  17. Balocco, C.; Grazzini, G. Sustainability and information in urban system analysis. Energy Policy 2006, 34, 2905–2914. [Google Scholar] [CrossRef]
  18. Fistola, R.; La Rocca, R.A. The Sustainable City and the Smart City: Measuring urban entropy first. Trans. Ecol. Environ. 2014, 191, 537–548. [Google Scholar]
  19. Georgescu-Roegen, N. The Entropy Law and the Economic Process; Harvard University Press: Cambridge, MA, USA, 1971. [Google Scholar]
  20. Nicolis, G.; Prigogine, I. Self-Organization in Nonequilibrium Systems: From Dissipative Structures to Order Through Fluctuations; A Wiley-Interscience Publication; Wiley: New York, NY, USA, 1977. [Google Scholar]
  21. Wilson, A.G. Entropy in Urban and Regional Modelling; Pion: London, UK, 1970. [Google Scholar]
  22. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  23. Cesario, F.J. A Primer on Entropy Modeling. J. Am. Plan. Assoc. 1975, 41, 40–48. [Google Scholar] [CrossRef]
  24. De Martino, A.; De Martino, D. An introduction to the maximum entropy approach and its application to inference problems in biology. Heliyon 2018, 4. [Google Scholar] [CrossRef] [PubMed]
  25. Lesne, A. Shannon entropy: A rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics. Math. Struct. Comput. Sci. 2014, 24. [Google Scholar] [CrossRef]
  26. Gull, S.F.; Skilling, J. Maximum entropy method in image processing. IEE Proc. F 1984, 131, 646–659. [Google Scholar] [CrossRef]
  27. Kondepudi, D.K. Introduction to Modern Thermodynamics; Wiley: Chichester, UK, 2008. [Google Scholar]
  28. Ben-Naim, A. Entropy Demystified: The Second Law Reduced to Plain Common Sense; World Scientific: Singapore, 2008. [Google Scholar]
  29. Goold, J.; Huber, M.; Riera, A.; Del Rio, L.; Skrzypczyk, P. The role of quantum information in thermodynamics - A topical review. J. Phys. A Math. Theor. 2016, 49. [Google Scholar] [CrossRef]
  30. Chapman, G.P. The Application of Information Theory to the Analysis of Population Distributions in Space. Econ. Geogr. 1970, 46, 317–331. [Google Scholar] [CrossRef]
  31. Allen, J.P.; Turner, E. The Most Ethnically Diverse Urban Places in the United States. Urban Geogr. 1989, 10, 523–539. [Google Scholar] [CrossRef]
  32. Yeh, A.G.-O.; Li, X. Measurement and monitoring of urban sprawl in a rapidly growing region using entropy. Photogramm. Eng. Remote Sensing 2001, 67, 83–90. [Google Scholar]
  33. Cabral, P.; Augusto, G.; Tewolde, M.; Araya, Y. Entropy in Urban Systems. Entropy 2013, 15, 5223–5236. [Google Scholar] [CrossRef] [Green Version]
  34. Mora, R.; Ruiz-Castillo, J. Entropy-based segregation indices. Sociol. Methodol. 2011, 41, 159–194. [Google Scholar] [CrossRef]
  35. Walsh, J.A.; Webber, M.J. Information theory: Some concepts and measures. Environ. Plan. A 1977, 9, 395–417. [Google Scholar] [CrossRef]
  36. Phillips, S.J.; Anderson, R.P.; Schapire, R.E. Maximum entropy modeling of species geographic distributions. Ecol. Model. 2006, 190, 231–259. [Google Scholar] [CrossRef]
  37. Rahman, A.; Aggarwal, S.P.; Netzband, M.; Fazal, S. Monitoring Urban Sprawl Using Remote Sensing and GIS Techniques of a Fast Growing Urban Centre, India. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 56–64. [Google Scholar] [CrossRef]
  38. Pourghasemi, H.R.; Pradhan, B.; Gokceoglu, C. Remote Sensing Data Derived Parameters and its Use in Landslide Susceptibility Assessment Using Shannon’s Entropy and GIS. Appl. Mech. Mater. 2012, 225, 486–491. [Google Scholar] [CrossRef]
  39. Jat, M.K.; Garg, P.K.; Khare, D. Monitoring and modelling of urban sprawl using remote sensing and GIS techniques. Int. J. Appl. Earth Obs. Geoinf. 2008, 10, 26–43. [Google Scholar] [CrossRef]
  40. Sudhira, H.S.; Ramachandra, T.V.; Jagadish, K.S. Urban sprawl: Metrics, dynamics and modelling using GIS. Int. J. Appl. Earth Obs. Geoinf. 2004, 5, 29–39. [Google Scholar] [CrossRef]
  41. Li, L.; Xing, X.; Xia, H.; Huang, X. Entropy-Weighted instance matching between different sourcing points of interest. Entropy 2016, 18, 45. [Google Scholar] [CrossRef]
  42. Medvedkov, Y.V. The Concept of Entropy in Settlement Pattern Analysis. Pap. Reg. Sci. 1967, 18, 165–168. [Google Scholar] [CrossRef]
  43. Batty, M. Entropy in Spatial Aggregation. Geogr. Anal. 1976, 8, 1–21. [Google Scholar] [CrossRef]
  44. Batty, M.; Morphet, R.; Masucci, P.; Stanilov, K. Entropy, complexity, and spatial information. J. Geogr. Syst. 2014, 16, 363–385. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Batty, M. Spatial Entropy. Geogr. Anal. 1974, 6, 1–31. [Google Scholar] [CrossRef]
  46. Theil, H. Economics and Information Theory; North-Holland Pub. Co.: Amsterdam, The Netherlands, 1967. [Google Scholar]
  47. Rao, R. Diversity and Dissimilarity. Theor. Popul. Biol. 1982, 21, 24–43. [Google Scholar] [CrossRef]
  48. Wilson, A.G. A statistical theory of spatial distribution models. Transp. Res. 1967, 1, 253–269. [Google Scholar] [CrossRef] [Green Version]
  49. Wilson, A.G. The Use of Entropy Maximising Models in the Theory of Trip Distribution, Mode Split and Route Split. J. Transp. Econ. Policy 1969, 3, 108–126. [Google Scholar]
  50. Wilson, A. Entropy in urban and regional modelling: Retrospect and prospect. Geogr. Anal. 2010, 42, 364–394. [Google Scholar] [CrossRef]
  51. Weaver, W. Science and Complexity. Am. Sci. 1948, 36, 536–544. [Google Scholar]
  52. Batty, M. Reilly’s challenge: New laws of retail gravitation which define systems of central places. Environ. Plan. A 1978, 10, 185–219. [Google Scholar] [CrossRef]
  53. Senior, M.L. From gravity modelling entropy maximizing: A pedagogic guide. Prog. Hum. Geogr. 1979, 3, 175–210. [Google Scholar] [CrossRef]
  54. Wilson, A.G. The Use of the Concept of Entropy in System Modelling. Oper. Res. Q. 1970, 21, 247–265. [Google Scholar] [CrossRef]
  55. Wilson, A.G. Further developments of entropy maximising transport models. Transp. Plan. Technol. 1973, 1, 183–193. [Google Scholar] [CrossRef]
  56. Angel, S.; Hyman, G.M. Urban Fields: A Geometry of Movement for Regional Science; Pion: London, UK, 1976. [Google Scholar]
  57. Kapur, J.N. Entropy maximization models in regional and urban planning. Int. J. Math. Educ. Sci. Technol. 1982, 13, 693–714. [Google Scholar] [CrossRef]
  58. Snickars, F.; Weibull, J.W. A Minimum Information Principle: Theory and Practice. Reg. Sci. Urban Econ. 1977, 7, 137–168. [Google Scholar] [CrossRef]
  59. Van Zuylen, H.J.; Willumsen, L.G. The Most Likely Trip Matrix Estimated from Traffic Counts. Transp. Res. Part B 1980, 14, 281–293. [Google Scholar] [CrossRef]
  60. Cascetta, E.; Nguyen, S. A unified framework for estimating or updating origin/destination matrices from traffic counts. Transp. Res. Part B 1988, 22, 437–455. [Google Scholar] [CrossRef]
  61. Griffith, D.A.; Jones, K.G. Explorations into the relationship between spatial structure and spatial interaction. Environ. Plan. A 1980, 12, 187–201. [Google Scholar] [CrossRef]
  62. Mattsson, L.-G. Equivalence Between Welfare and Entropy Approaches to Residential Location. Reg. Sci. Urban Econ. 1984, 14, 147–173. [Google Scholar] [CrossRef]
  63. Roy, J.R.; Lesse, P.F. On appropriate microstate descriptions in entropy modelling. Transp. Res. Part B 1981, 15, 85–96. [Google Scholar] [CrossRef]
  64. Dincer, I.; Cengel, Y.A. Energy, Entropy and Exergy Concepts and Their Roles in Thermal Engineering Ibrahim. Entropy 2001, 3, 116–149. [Google Scholar] [CrossRef]
65. Hammond, G.P.; Stapleton, A.J. Exergy analysis of the United Kingdom energy system. Proc. Inst. Mech. Eng. Part A J. Power Energy 2001, 215, 141–162.
66. Balocco, C.; Papeschi, S.; Grazzini, G.; Basosi, R. Using exergy to analyze the sustainability of an urban area. Ecol. Econ. 2004, 48, 231–244.
67. Balocco, C.; Grazzini, G. Thermodynamic parameters for energy sustainability of urban areas. Sol. Energy 2000, 69, 351–356.
68. Nielsen, S.N.; Jørgensen, S.E. Sustainability analysis of a society based on exergy studies—A case study of the island of Samsø (Denmark). J. Clean. Prod. 2015, 96, 12–29.
69. Kalinci, Y.; Dincer, I.; Hepbasli, A. Energy and exergy analyses of a hybrid hydrogen energy system: A case study for Bozcaada. Int. J. Hydrogen Energy 2017, 42, 2492–2503.
70. Pal, R. Demystification of the Gouy-Stodola theorem of thermodynamics for closed systems. Int. J. Mech. Eng. Educ. 2017, 45, 142–153.
71. Bejan, A. Entropy generation minimization: The new thermodynamics of finite-size devices and finite-time processes. J. Appl. Phys. 1996, 79, 1191–1218.
72. Bejan, A. Entropy generation minimization, exergy analysis, and the constructal law. Arab. J. Sci. Eng. 2013, 38, 329–340.
73. Purvis, B.; Mao, Y.; Robinson, D. Thermodynamic Entropy as an Indicator for Urban Sustainability? Procedia Eng. 2017, 198, 802–812.
74. Sciubba, E.; Wall, G. A Brief Commented History of Exergy from the Beginnings to 2004. Int. J. Thermodyn. 2007, 10, 1–26.
75. Kotas, T.J. The Exergy Method of Thermal Plant Analysis; Butterworths: Tiptree, Essex, UK, 1985.
76. Marchettini, N.; Pulselli, F.M.; Tiezzi, E. Entropy and the city. WIT Trans. Ecol. Environ. 2006, 93, 263–272.
77. Rees, W.E. Cities as Dissipative Structures: Global Change and the Vulnerability of Urban Civilization. In Sustainability Science: The Emerging Paradigm and the Urban Environment; Springer: New York, NY, USA, 2012; pp. 247–291.
78. Lovelock, J. Gaia, a New Look at Life on Earth; Oxford University Press: Oxford, UK, 1979.
79. Rifkin, J.; Howard, T. Entropy: A New World View; Viking Press: New York, NY, USA, 1980.
80. Fistola, R. The unsustainable city. Urban entropy and social capital: The needing of a new urban planning. Procedia Eng. 2011, 21, 976–984.
81. Fistola, R. Urban entropy vs sustainability: A new town planning perspective. Sustain. City 2012, 155, 195–204.
82. Pelorosso, R.; Gobattoni, F.; Leone, A. The low-entropy city: A thermodynamic approach to reconnect urban systems with nature. Landsc. Urban Plan. 2017, 168, 22–30.
83. Kovalev, A.V. Misuse of thermodynamic entropy in economics. Energy 2016, 100, 129–136.
84. Gillett, S.L. Entropy and its misuse, I. Energy, free and otherwise. Ecol. Econ. 2006, 56, 58–70.
85. Schwartzman, D. The Limits to Entropy: The Continuing Misuse of Thermodynamics in Environmental and Marxist Theory. Sci. Soc. 2008, 72, 43–62.
86. Weiss, W. The balance of entropy on earth. Contin. Mech. Thermodyn. 1994, 8, 37–51.
87. Lambert, F.L. Disorder—A Cracked Crutch for Supporting Entropy Discussions. J. Chem. Educ. 2002, 79, 187–192.
88. Haglund, J. Good Use of a “Bad” Metaphor: Entropy as Disorder. Sci. Educ. 2017, 26, 205–214.
89. Georgescu-Roegen, N. The Entropy Law and The Economic Process in Retrospect. East. Econ. J. 1986, 12, 3–25.
90. Kåberger, T.; Månsson, B. Entropy and economic processes—Physics perspectives. Ecol. Econ. 2001, 36, 165–179.
91. Glucina, M.D.; Mayumi, K. Connecting thermodynamics and economics: Well-lit roads and burned bridges. Ann. N. Y. Acad. Sci. 2010, 1185, 11–29.
92. Georgescu-Roegen, N. Nicholas Georgescu-Roegen about Himself. In Eminent Economists: Their Life Philosophies; Szenberg, M., Ed.; Cambridge University Press: Cambridge, UK, 1992; pp. 128–159.
93. Cojanu, V. Georgescu-Roegen’s entropic model: A methodological appraisal. Int. J. Soc. Econ. 2009, 36, 274–286.
94. Røpke, I. The early history of modern ecological economics. Ecol. Econ. 2004, 50, 293–314.
95. Martinez-Alier, J. Ecological economics. In International Encyclopedia of the Social & Behavioral Sciences, 2nd ed.; Wright, J.D., Ed.; Elsevier: Amsterdam, The Netherlands, 2015; pp. 851–864.
96. Levallois, C. Can de-growth be considered a policy option? A historical note on Nicholas Georgescu-Roegen and the Club of Rome. Ecol. Econ. 2010, 69, 2271–2278.
97. Wolman, A. The Metabolism of Cities. Sci. Am. 1965, 213, 178–190.
98. Schrödinger, E. What Is Life? Cambridge University Press: London, UK, 1944.
99. Schneider, E.D.; Kay, J.J. Complexity and thermodynamics. Towards a new ecology. Futures 1994, 26, 626–647.
100. Müller, F. State-of-the-art in ecosystem theory. Ecol. Model. 1997, 100, 135–161.
101. Jorgensen, S.E.; Svirezhev, Y.M. Towards a Thermodynamic Theory for Ecological Systems; Elsevier: Oxford, UK, 2004.
102. Hernando, A.; Plastino, A. Thermodynamics of urban population flows. Phys. Rev. E Stat. Nonlinear Soft Matter Phys. 2012, 86.
103. Wilson, A. The “Thermodynamics” of the City: Evolution and Complexity Science in Urban Modelling. In Complexity and Spatial Networks: In Search of Simplicity; Reggiani, A., Nijkamp, P., Eds.; Springer: Heidelberg, Germany, 2009; pp. 11–32.
104. Feng, H.; Chen, X.; Heck, P.; Miao, H. An entropy-perspective study on the sustainable development potential of tourism destination ecosystem in Dunhuang, China. Sustainability 2014, 6, 8980–9006.
105. Zhang, Y.; Yang, Z.; Li, W. Analyses of urban ecosystem based on information entropy. Ecol. Model. 2006, 197, 1–12.
106. Ben-Naim, A. A Farewell to Entropy: Statistical Thermodynamics Based on Information; World Scientific: New Jersey, NJ, USA, 2008.
107. Guiaşu, R.C.; Guiaşu, S. Conditional and Weighted Measures of Ecological Diversity. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2003, 11, 283–300.
108. Wilson, A.G. Some new forms of spatial interaction model: A review. Transp. Res. 1975, 9, 167–179.
109. Clarke, G.P.; Wilson, A. International Encyclopedia of Human Geography; Elsevier: Amsterdam, The Netherlands, 2009; pp. 260–261.
