Chapter 8 Spatial entropy, information and complexity
Entropy is one of the most elusive concepts in the physics of complex systems, largely because its meaning turns on the difference between a set of possible events and their actual occurrence. Only when events occur can entropy be formally interpreted, and it is this difference that provides the observer with information about the state of the system. We begin by exploring this conundrum as it was first posed by Shannon (1948), arguing that there are two approaches to the definition of entropy in spatial systems: first, the measurement of entropy in any extensive spatial system, which pertains to the number of events and the shape of the probability distribution describing them; and second, the use of the entropy-maximizing method to derive consistent models, which shows how constraints on the form of the system can be incorporated into our most probable models. We first explore the measurement of entropy and then link this to entropy maximizing. We show how exponential and power laws are derived that relate to population densities in cities and to city-size distributions based on rank-size laws. We then extend our definition of entropy to information as defined by Kullback (1959), and this leads to the definition of a spatial entropy that contains the basic elements for a measure of complexity.
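To make these measures concrete, the sketch below computes the Shannon entropy of a discrete distribution, a spatial entropy in which probabilities are weighted by zone area (a common formulation following the Kullback information idea the abstract refers to), and the exponential distribution that entropy maximizing yields under a fixed mean-cost constraint. The five-zone populations, areas and the β parameter are illustrative assumptions for this sketch, not values from the chapter.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i * ln(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

def spatial_entropy(p, areas):
    """Spatial entropy H_s = -sum_i p_i * ln(p_i / A_i), where A_i is the
    area of zone i; dividing by zone size turns probabilities into densities,
    making the measure comparable across different zoning systems."""
    p = np.asarray(p, dtype=float)
    areas = np.asarray(areas, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask] / areas[mask]))

def max_entropy_exponential(costs, beta):
    """Most probable distribution from maximizing H subject to a fixed mean
    cost sum_i p_i * c_i: the exponential form p_i proportional to exp(-beta * c_i)."""
    w = np.exp(-beta * np.asarray(costs, dtype=float))
    return w / w.sum()

if __name__ == "__main__":
    # Hypothetical five-zone city: populations and zone areas (assumed values).
    pop = np.array([500.0, 300.0, 100.0, 60.0, 40.0])
    areas = np.array([2.0, 1.5, 1.0, 1.0, 0.5])     # e.g. km^2
    p = pop / pop.sum()

    print(f"Shannon entropy H  = {shannon_entropy(p):.4f} nats")
    print(f"Uniform maximum    = {np.log(p.size):.4f} nats")
    print(f"Spatial entropy Hs = {spatial_entropy(p, areas):.4f}")
    # Entropy-maximizing exponential over travel costs 1..5 with beta = 0.5:
    print("max-entropy p_i:",
          np.round(max_entropy_exponential(np.arange(1, 6), 0.5), 3))
```

Note that Shannon entropy depends only on the shape of the probability distribution, whereas the spatial entropy also responds to the zoning system through the areas A_i; it is this sensitivity to spatial aggregation that motivates its use as a building block for complexity measures.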
