Different Senses of Entropy – Implications for Education

Jesper Haglund, Fredrik Jeppsson and Helge Strömdahl

 

Swedish National Graduate School in Science and Technology Education, Linköping University, S-601 74 Norrköping, Sweden

 

Abstract: A challenge in the teaching of entropy is that the word has several different senses, which may pose an obstacle to communication. This study identifies five distinct senses of the word ‘entropy’, using the Principled Polysemy approach from the field of linguistics. A semantic network of how the senses are related is developed, using text excerpts from dictionaries, textbooks and text corpora. Educational challenges, such as the existence of several formal senses of entropy and the intermediary position of entropy as disorder along the formal/non-formal scale, are presented using a two-Dimensional Semiotic/semantic Analysing Schema (2-D SAS).

 

 

  1. Introduction

Entropy and the second law of thermodynamics are linked to our understanding of the direction of spontaneous processes. This results in a sense of the direction of time, providing time’s arrow. It is often easy to recognise a film running backwards, since the process shows violations of the second law of thermodynamics. For instance, the process of a glass of milk being spilled out over the floor is highly unlikely to be reversed. Introducing the concept of entropy is a way to operationalise such physical phenomena. However, it is both an idealised and an abstract concept, and it can be difficult to grasp its connection to our perceptions and experiences in the everyday world. This is illustrated by the fact that the concept and the word were invented as late as the 19th century and are still rather unusual in most contexts, in comparison with, for instance, the word ‘time’.

 

In a study on semantics in introductory physics teaching, Williams [1] notes that as opposed to mathematics, physics papers are “dominated by words that expand, explain, or qualify the information provided in the figures, graphs, and equations.” However, this dependence on language is not unproblematic:

 

Physics is often called an “exact science,” and for good reason. At our best, we are precise in our measurements, equations, and claims. We do not seem to be at our best, however, when we write and talk about physics to introductory students. Language usage which presents few problems when used among ourselves because of shared assumptions is potentially misleading or uninformative when used with the uninitiated.

 

Williams goes on to demonstrate that the same word can have different established meanings in different science communities. For example, a helium atom may or may not be regarded as a ‘molecule’ from either a chemist’s or a physicist’s point of view. The main challenge for mutual understanding is when words have different meanings in formal scientific language and non-formal everyday language. For example, by using dictionary entries of central physics words such as ‘energy’, ‘equilibrium’ and ‘particle’, Williams has shown that there is room for several different interpretations and, therefore, misunderstanding.

 

Baierlein [2] suggests that there are generally two approaches to teaching introductory thermodynamics and, therefore, the concept of entropy. On the one hand, a historical and macroscopic approach can be adopted that focuses on properties of cyclic processes and, in line with Clausius, introduces entropy as a state function in relation to these processes. This provides good opportunities for problem solving in engineering, but a limited understanding of the nature of entropy. On the other hand, a microscopic approach involves the introduction of a system-particle model and microstates, using Boltzmann’s statistical approach. Here, the challenge is how to proceed from the abstract physical model to macroscopic applications. For instance, Reif [3] argues in favour of a microscopic approach and emphasises the need to understand the underlying mechanisms of physical phenomena. In addition, he points out the difficulty students have in building visualisable mental models with a macroscopic approach. He further shows how macroscopic properties can be derived from this atomic starting point in order to provide complementary perspectives. A series of studies conducted by the Physics Education Group at the University of Washington has focused on the conceptual understanding and teaching of thermodynamics. As part of this work, Cochran and Heron [4] have found that students have difficulties applying the second law of thermodynamics when assessing the feasibility of heat engine cycles. Overall, as opposed to Reif, the Physics Education Group questions the introduction of thermodynamic quantities through microscopic models, and claims that such concepts first have to be firmly understood in macroscopic contexts, such as the use of bicycle pumps [5].

 

In the teaching of thermodynamics, disorder metaphors for entropy, such as explaining that a messy room has high entropy, are used extensively. However, there is debate about whether the use of such metaphors may actually do more harm than good. For instance, Lambert [6] argues that disorder is a ‘cracked crutch’ that cannot be relied upon for teaching entropy, and has argued for the removal of this metaphoric use from science textbooks, replacing it with the idea of entropy as dispersal of energy. In addition to thermodynamics, the word entropy has been introduced as a formal concept in information theory. Apart from natural science domains, entropy has also been adopted in the arts and in the social sciences as a metaphorical description of the state of society.

 

One overall intention of our research initiative is to investigate and present how linguistic methods can be employed to analyse the meaning of words that are used in science, science education and non-scientific settings. Hence, the purpose of the present study is to explore the language that is used in connection with the word entropy in different domains, and its educational implications. Specifically, the aim is to discern distinct senses of entropy in a systematic manner and explore their logical and historical relationships through the application of the Principled Polysemy approach [7, 8]. In the field of linguistics, ‘meaning’ denotes the subjective interpretation of a word by a person with regard to a specific situation or sentence. ‘Sense’ denotes a more stable, general interpretation of a word, which typically can be found in dictionaries as a separate entry. In semantics, polysemy indicates that one word has two or more interrelated senses. For example, the word ‘paper’ can refer to the material made out of wood pulp, an individual sheet of paper, blank or printed paper, a physical periodical, or its electronic counterpart. All of these senses are related logically and historically. Based on this analysis, challenges for teaching and learning scientific senses of entropy are discussed using a two-dimensional semiotic/semantic analysing schema [9, 10]. The research questions of the study are:

 

  • What are the distinct senses of the word entropy and how are these senses of entropy related logically and historically?
  • What are the educational implications of the answer to the question above regarding teaching and learning the scientific senses of entropy?

 

  2. Methods

2.1. Data Collection

The empirical data used in the present study were text excerpts relating to entropy from different sources. The intention was to identify a broad variety of language use rather than to perform an exhaustive investigation of one literature source, e.g., science textbooks. Dictionary information, generally from dictionary.reference.com, was used as a starting point to identify the different senses of entropy in science and non-science language. In the cases where new senses of entropy have been introduced in science and elsewhere, original sources have been consulted. Science textbooks and historical accounts of thermodynamics have been used to locate representative examples of how the subject is presented in educational settings.

 

Gries and Divjak [11] propose the use of text corpora when performing polysemy analyses, since they “provide data from natural settings rather than ‘armchair’ judgements or responses that potentially reflect experimentally-induced biases”. They claim that corpora are particularly suited to the Principled Polysemy approach because the criteria used to discern different senses are related to predictions of language usage. In the present study, text corpora in Swedish collected by Gothenburg University since 1975 were used to analyse texts that were produced without an explicit educational intent [12]. The text corpora include newspapers, fiction, popular science, parliamentary debate and legislation. Excerpts from this source have been translated into English by the authors.

 

Through this approach, all occurrences in defined settings were identified. This can be compared with Internet search engines, which provide a large number of occurrences, but sorted in a way not controlled by the user. Twenty occurrences of the word ‘entropi’ (entropy) in the text corpora were analysed.

 

2.2. Data Analysis

The analysis was performed using two theoretical frameworks: Principled Polysemy for discerning different senses and the 2-D SAS for educational analysis and discussion.

 

2.2.1. Principled Polysemy

In the development of cognitive lexical semantics, Lakoff [13] argues that word meanings can be modelled in a way similar to that of our representation of concepts. Here, radial categories are formed where a central sense of a word radiates out to more peripheral senses. Such an extension of senses can be achieved through cognitive mechanisms, such as metaphoric transfer from a concrete, embodied perception to more abstract interpretations. In the analysis of the preposition ‘over’, Lakoff discerns a large number of senses, of which the ‘above’ sense was identified as the most central. Lakoff argues that one way of formulating new senses of a word is through image schema transformation. For example, assume that a ‘path’ image schema is used in the following example that focuses on a process: “John walked over the hill”, related to the central ‘above’ sense [14]. By shifting to an endpoint focus, a ‘goal’ image schema is adopted, which allows for the following to be formulated: “John lives over the hill” [14], which, in turn, renders a new sense of the word ‘over’ close to ‘beyond’. Another way for new senses to emerge is through metaphorical extension. For instance, in “[s]he has a strange power over me”, the use of the word ‘over’ has transferred from the original ‘above’ sense (related to vertical position) to an abstract ‘control’ sense. Although Lakoff’s approach has been very influential, it has been criticised for its fine granularity (a large number of senses), which relies on subjective judgement of what should count as distinct senses, as opposed to context-dependent nuances of meaning [14].

 

Evans and Tyler [7] present Principled Polysemy as a systematic approach to distinguish between different senses of polysemous words and establish the prototypical sense by continuing the work related to the ‘over’ example. This approach has also been adapted to other word classes, most significantly in the analysis of the noun ‘time’, where the following criteria are used to identify distinct senses [8]:

 

  • The meaning criterion: Featuring a different meaning, not apparent in any other senses.
  • The concept elaboration criterion: Featuring unique or highly distinct patterns of language use across contexts. Patterns can relate to modifications of words by an adjective, e.g., “a short time” or typical verb phrases, e.g., “The time sped by”.
  • The grammatical criterion: Featuring unique grammatical constructions, e.g., distinguishing between ‘time’ as a count noun, mass noun and proper noun.

The meaning criterion is the most obvious for discerning different senses, but it leaves room for subjective interpretation of whether or not the meanings in two contexts should count as distinct, stable senses. Evans proposes that the meaning criterion and at least one of the two other criteria have to be satisfied to identify a distinct sense. Varieties that have different meanings in different contexts of language use, but do not satisfy the other criteria, may be regarded as sub-senses within one sense. In the present study, we assume that one way of meeting the meaning criterion is to associate two different senses of entropy with different types of referents.

In the Principled Polysemy approach, a Sanctioning Sense is identified, from which the other existing senses can be derived. This Sanctioning Sense is the central node of a semantic network, representing the logical relation of the senses. In the analysis of ‘time’, the Sanctioning Sense is identified through the following criteria that were also used in the present study [8]:

  • Earliest attested meaning (originating sense).
  • Predominance in the semantic network, in the sense of type-frequency.
  • Predictability regarding other senses.
  • Lived human experience, i.e., experiences at the phenomenological level.

 

These criteria give guidance as to the identification of the Sanctioning Sense, but there is no general hierarchical relationship between them, and not all of them have to be met. In particular, the earliest attested meaning does not necessarily have to be the Sanctioning Sense. Furthermore, the relative importance of the criteria has to be argued for in the analysis of a particular word (‘entropy’ in this case).

 

2.2.2. Two-Dimensional Semiotic/Semantic Analysing Schema

In the tradition of conceptual change research in science education (cf. [15]), the purpose of learning processes was originally for novices to replace existing conceptions of natural phenomena with the correct scientific understanding. As the research field has evolved over the last 30 years, this way of reasoning has been modified and refined. For instance, there is now an awareness that perceptually based non-formal interpretations are not replaced, but are kept and still used in appropriate everyday contexts, in parallel with the scientific understanding. In addition, the existence of several theories or models, all scientifically useful for the same phenomenon or scientific concept, has to be acknowledged.

 

Strömdahl [10] developed the two-Dimensional Semiotic/semantic Analysing Schema (2-D SAS) in order to analyse the meaning of words used in the natural sciences, which may be used as a tool within the conceptual change field (See Figure 1 below). A particular word (e.g., ‘entropy’) is analysed in two dimensions. The vertical axis depicts various meanings (polysemy) of the word and the horizontal axis depicts the semiotic elements, i.e. word, concept and referent. In the model, referents are the (typically non-linguistic) entities in the perceived world that language describes. Concepts are the mental or cognitive representations. Words are phonological and/or graphic symbols. Applying this schema, four types of meanings can be discerned: a set of non-formal senses, a scientific qualitative sense, a sense of physical quantity, and a sense of an operational empirical measurement or derivation (quantification of the physical quantity) that are all modelled using corresponding referents (Figure 1). The icon on the upper right hand side highlights the different senses and semiotic elements focused upon when presenting different educational implications in the following text.

 

In classical semantics, the referent is established by clear-cut sufficient and necessary conditions that define a sense of a word, similar to the idea of classification in science. Alternatively, according to Putnam [16] a word can be given an ostensive definition, i.e. the referent is pointed out as a standard or ‘dubbed’, as in “this is gold”, referring to a specific gold nugget. The issue of reference has often focused on natural kinds, e.g., ‘gold’ or ‘water’. However, there is another class of more abstract, theoretical terms, such as ‘force’, ‘mass’ and ‘acceleration’. Andersen and Nersessian [17] claim that such theoretical concepts cannot be pointed out in isolation, but rather as interacting participants in complex structures, understood in relation to each other by a theory or a law, in this case Newton’s second law. Hence, the discernment of these concepts and their referents is difficult, not only for novices being introduced to the field, but also in principle.

 

Following the reasoning of Andersen [18], the perspective of the present study is that referents are identified in the phenomenal world-as-perceived, not in the world-in-itself. Particularly, the referent of a formal, scientific concept is regarded as an aspect of an idealised and prototypical model of natural phenomena. As an example, the word ‘temperature’ has a formal definition in classical statistical mechanics, where the referent is an aspect of moving particles in a box. This qualitative modelling is complemented by the mathematical relation to other physical quantities. The empirical quantification of ‘temperature’ requires the definition of a unit (1 K, 1°C, etc.) and suitable measurement equipment such as a thermometer. On the other hand, the non-formal everyday life conception of ‘temperature’ refers to lived experiences, such as the physical sensation of a hot cup of coffee or a warm summer day [9].

 

  3. Principled Polysemy Analysis of Entropy

In the application of the Principled Polysemy approach to the data described above, five distinct senses of entropy and their interrelationships were identified:

 

  • Thermodynamic Sense
  • Statistical Sense
  • Disorder Sense
  • Information Sense
  • Homogeneity Sense

 

After the initial Thermodynamic Sense, the meaning, concept elaboration and grammatical criteria are used to argue for the introduction of additional senses.

 

3.1. Thermodynamic Sense

The term ‘entropy’ (originally ‘Entropie’ in German) was coined by Rudolf Clausius in 1865 to refer to a physical quantity S that was interpreted as ‘transformational content’ in the area of thermodynamics. The designation was inspired by the Greek word ‘trope’, meaning transformation; Clausius formed the new word ‘Entropie’ to be reminiscent of the closely related word ‘energy’. The concept was introduced by Clausius based on his investigation of heat engines following the results of Carnot, and generalised in a series of papers published in 1850–1865 [19]. He found that the process of ‘transmission transformation’ of heat from a body of high temperature to a body of lower temperature was linked to, or driving, the ‘conversion transformation’ of heat into work. He also realised that this process could be reversed for ideal heat engines. Based on the axiom “heat cannot of itself pass from a cooler to a warmer body”, in effect his formulation of the second law of thermodynamics, Clausius showed that there is a state function that relates the amount of heat involved in the two types of transformation of a cyclical and reversible process. It is this state function that he termed entropy. The finding was later generalised for irreversible processes into the result dS ≥ δQ/T, i.e., the change of entropy is equal to (for reversible processes) or larger than (for irreversible processes) an infinitesimal amount of heat added divided by the absolute temperature. He ended the series of papers with the striking conclusion: “Die Energie der Welt bleibt konstant; die Entropie strebt einem Maximum zu” (the energy of the world is constant; the entropy tends to a maximum).
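
To make the sign convention concrete, consider heat flowing irreversibly between two bodies of fixed temperature. The following minimal sketch is our illustration (not from Clausius or the cited sources), computing each body's entropy change as Q/T:

```python
# Entropy change when heat Q flows irreversibly from a hot body at
# T_hot to a cold body at T_cold (temperatures in kelvin).
# Illustrates the Clausius inequality: the total entropy change is positive.

def entropy_change_heat_transfer(Q, T_hot, T_cold):
    """Total entropy change (J/K) of two bodies exchanging heat Q (J)."""
    dS_hot = -Q / T_hot   # the hot body loses heat at T_hot
    dS_cold = Q / T_cold  # the cold body gains heat at T_cold
    return dS_hot + dS_cold

# 1000 J passing from a 500 K body to a 300 K body:
dS = entropy_change_heat_transfer(1000.0, 500.0, 300.0)
print(f"Total entropy change: {dS:.3f} J/K")  # ~ +1.333 J/K > 0
```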

 

In 1909, Carathéodory presented an axiomatic approach to thermodynamics based on definitions of equilibrium and states and two axioms, without having to introduce imaginary heat engines or heat flows. The second law of thermodynamics was formulated as an axiom in an innovative fashion: “In the neighborhood of the initial state of a system there exist states not accessible from the original state along any adiabatic path.” In this axiomatic system, quantities such as heat, temperature and entropy were derived and analysed through the use of Pfaffian differential equations [20, 21].

 

In the field of engineering thermodynamics, the macroscopic interpretation of entropy has been extended to include time-dependent nonequilibrium systems, where matter, energy and, correspondingly, entropy flow through the system, as in the case of steady-state conditions. In this setting, Tolman and Fine [22] argue for the inclusion of terms representing the irreversible production of entropy. This enables the formulation of the second law of thermodynamics as an equality and the introduction of the efficiency equation, which relates the net work done by a system to factors such as the energy and entropy transferred through the system, the temperatures of the system and its surroundings, and the introduced irreversible entropy production.

 

In the Thermodynamic Sense, entropy is regarded in a deterministic manner, and the microscopic nature of matter or the inner structure of a system is not considered. In the physical interpretation of his results, Clausius recognised the importance of the arrangement of molecules and the tendency towards increased disaggregation of a body as an effect of added heat, but he did not elaborate this interpretation in the formalism that he used. The tendency of entropy to increase is stated de facto, without further explanation of the mechanism. In addition, there are no tools for assigning an absolute value to the entropy of an isolated system, only the change during interaction with the environment. In the Thermodynamic Sense, the change of entropy typically describes the process of an ideal heat engine, similar to those introduced by Carnot from an engineering point of view; accordingly, most textbooks have not adopted an axiomatic approach such as Carathéodory’s. From this perspective, the idealised prototypical model is the heat engine and its interaction with the environment. The referent of entropy is abstract and difficult to pinpoint explicitly, but it does relate to the tendency of energy to dissipate and to the connection between the conversion of heat to work and the transfer of heat from a body of high temperature to a body of lower temperature.

 

As an example of the Thermodynamic Sense, Davies [23] provides the following two descriptions in a popular account:

 

When a physical process occurs, such as a piston-and-cylinder cycle in a steam engine, it is possible to compute how much entropy is produced as a result.

In a closed system the total entropy cannot go down. Nor will it go on rising without limit.

There will be a state of maximum entropy or maximum disorder, which is referred to as thermodynamic equilibrium; once the system has reached that state it is stuck there.

 

In the previous Newtonian mechanistic understanding of the world as a clock, objects move and collide. This model could equally well be run backwards in time without breaking natural laws, such as energy and momentum being kept constant. However, the introduction of entropy provides an arrow of time that accompanies an explanation of irreversibility.

 

The on-line dictionary used in the study offers the following description of the Thermodynamic Sense of entropy [24]: a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not available for work during a thermodynamic process. A closed system evolves toward a state of maximum entropy.

 

This entry relates to the Thermodynamic Sense and conveys the premise that entropy is a state function that strives toward a maximum value, albeit expressed in non-formal language. The formulation “a measure of the energy that is not available for work during a thermodynamic process” is not uncommon in dictionaries and is reminiscent of the quality of energy. Nevertheless, a problem is that it treats entropy as a kind of energy, ignoring the temperature, which may lead to unit errors. Besides, it assumes an engineering perspective that focuses on work rather than the more relevant concept of heat. The description is reminiscent of exergy, which, by contrast, is related to the amount of energy that is available for work. This example shows the difficulty of making a non-formal, yet physically correct, description of entropy using a macroscopic perspective, in line with Reif [3]. (Note that in the interpretations by Davies and the dictionary above, a ‘closed system’ is one that cannot exchange energy with its surroundings; in standard terminology such a system is called isolated, whereas a closed system may exchange energy, though not matter.)

 

Apart from the Thermodynamic Sense, dictionary.reference.com [24] gives the following sense of entropy in the domain of cosmology: a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature (heat death).

 

This sense is basically the universal tendency of entropy to reach a maximum, placed in a cosmological context. Since the uniformity of temperature is mentioned, the distribution of energy is indirectly encompassed. The ‘heat death’ vision (already invoked by Lord Kelvin) provides a connection to a negative and dystopic interpretation. It shows the ultimate consequence of the second law of thermodynamics: the final state of the universe, devoid of all structure and drive for change. In spite of its fundamental importance in describing all spontaneous change, productive or destructive, this powerful image associates entropy with destructive connotations. We argue that the ‘heat death’ interpretation qualifies as a sub-sense within the Thermodynamic Sense, since it adds the dystopic connection without a distinctly different type of referent.

 

3.2. Statistical Sense

While the macroscopic theory of thermodynamics succeeded in identifying that there is a quantity called entropy, which tends to increase in spontaneous processes, it failed to explain the underlying mechanism for why the entropy increases. For a student encountering the field of thermodynamics, it is difficult to get a grasp of what entropy really is using a macroscopic approach alone. A more fundamental understanding of the concept requires statistical mechanics, a field that was developed in the decades after Clausius’s work and to which Maxwell, Boltzmann and Gibbs were the most prominent contributors. They introduced the statistical behaviour of individual particles into the analysis of systems, starting with kinetic gas theory, a bold move at a time when the existence of atoms was far from generally agreed upon.

 

In particular, Boltzmann discovered that the entropy of an isolated system is linked to the corresponding number of microstates, W, of its particles, through the formula S = kB ln W, where kB ≈ 1.38 × 10⁻²³ J/K is Boltzmann’s constant. A microstate in classical statistical mechanics can be seen as a small volume element in a phase space of 6N dimensions that represents the locations and momenta of all N particles in a system. This is built on the postulate of equal a priori probability, i.e., that the system has the same probability of being in any of the possible microstates with the same energy, and the ergodic hypothesis, i.e., that time averages of variables in a system are equal to the average over a large number of identical systems (an ensemble). As a consequence, the system tends towards the macrostate with the highest number of possible microstates and, hence, towards maximum entropy. It is a truism of fundamental significance as a strategy of nature: a system will most likely be in the state that is most probable; blind chance drives change. In this way, in comparison with the Thermodynamic Sense, the Statistical Sense enables a deeper understanding of the underlying mechanisms of thermodynamic processes.
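
A toy sketch (ours, not part of the analysed sources) may clarify how counting microstates connects to the tendency towards maximum entropy. For N two-state particles, the macrostate "n particles in state 1" comprises W = C(N, n) microstates, so W, and thus S = kB ln W, peaks at n = N/2:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k_B ln W for an isolated system with W microstates."""
    return k_B * math.log(W)

# Toy system: N two-state particles; the macrostate 'n particles in
# state 1' corresponds to W = C(N, n) microstates.
N = 100
for n in (0, 25, 50):
    W = math.comb(N, n)
    print(f"n = {n:3d}: W = {W:.3e}, S = {boltzmann_entropy(W):.3e} J/K")
# The macrostate with the most microstates (n = N/2) has maximal entropy,
# so random exchange of energy and position drives the system towards it.
```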

 

Gibbs presented the following expression for entropy: S = -kB Σ pi ln pi, where pi is the probability that a system will be in microstate i. It is a more general expression than the one presented by Boltzmann and can be applied to a wider range of systems in equilibrium. For instance, it allows for the possibility of exchanging heat and particles with the surrounding environment, corresponding to Gibbs’s canonical and grand canonical ensembles. Gibbs considers a large number of identical systems, an ensemble. He also applies the ergodic hypothesis, that the long-time average of a macroscopic variable of a system is equal to the ensemble average. Lebowitz [25] points out that a consequence of the use of statistical ensembles is that the Gibbs entropy is problematic for understanding the time evolution of time-dependent nonequilibrium thermodynamic systems. (Cf. Sklar [21] for an in-depth discussion of the differences between Boltzmann’s and Gibbs’s interpretations of entropy.)
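
A short numerical sketch (ours) shows the relation between the two expressions: for W equally probable microstates, the Gibbs formula reduces to Boltzmann's S = kB ln W, and any non-uniform distribution over the same microstates yields a lower entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i ln p_i); terms with p_i = 0 contribute nothing."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# For W equally probable microstates (p_i = 1/W), the Gibbs expression
# reduces to Boltzmann's S = k_B ln W:
W = 1000
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))  # equals k_B ln 1000
print(k_B * math.log(W))       # same value

# A non-uniform distribution over the same microstates gives lower entropy:
biased = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(gibbs_entropy(biased) < gibbs_entropy(uniform))  # True
```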

 

The field of statistical mechanics was originally developed using classical physics, although presaging some qualities of quantum mechanics. For instance, the quantisation of the phase space was used as a calculational tool without physical rationale. Subsequently, the field has been given a thorough quantum mechanical interpretation, where the number of microstates at a certain total energy of a system is explained as the degeneracy of the eigenstates of the Schrödinger equation. In many applications, similar results can be derived using classical or quantum approaches where the classical statistical mechanics can be regarded as an approximation in the classical limit. In other cases that involve very low temperatures or consider the behaviour of photons, only the quantum approach yields results that are in line with empirical findings.

 

3.2.1. Statistical Sense—Meaning Criterion

The Statistical Sense of entropy assumes a probabilistic perspective that takes the likelihood of different microstates into account. For the Statistical Sense of entropy, the prototypical idealised model is a system of particles in motion. For polyatomic molecules, the motion can be translational, rotational or vibrational. Here, the referent relates to the number of microstates that correspond to one given macrostate. This is expressed most clearly in the expression S = kB ln W for isolated systems, where entropy is related to the number of microstates that correspond to a macrostate, linked to the postulate of equal a priori probability. In contrast to the Statistical Sense, the thermodynamic approach relates to a system exchanging heat and work with the surroundings, but cannot infer anything about the inner structure of the system.

 

Due to multiplication with kB (measured in J/K), the Statistical Sense of entropy is explicitly linked to the physical quantities of energy and temperature, and hence, refers to the same physical quantity as that communicated in the Thermodynamic Sense. In this way, it could be argued that the Thermodynamic Sense should not be separated from the Statistical Sense. However, we would claim that the use of different types of referents with the introduction of microstates makes it qualify as a stable different meaning.

 

Even though different ensembles in statistical mechanics are analysed with different mathematical techniques and yield seemingly different results, the expression S = -kB Σ pi ln pi holds true for all of them. In addition, the general approach of using particles in a system is shared, providing similar types of referents. Therefore, we argue that the meaning of entropy is not distinctly different between these ensembles and that they relate to the same sense.

 

Quantum mechanics offered a refined understanding of microstates and made the explanation of a larger set of physical phenomena possible, e.g., behaviour of systems at low temperature. However, many of the tools developed in classical statistical mechanics for calculating macroscopic quantities can be utilised in the quantum mechanical approach as well. From our point of view, quantum mechanics offers a new referent for microstates, but not necessarily for macroscopic properties such as entropy. We suggest that entropy is still related to the number of microstates, as is the case in the Statistical Sense. In this way, the quantum meaning of entropy qualifies as a sub-sense within the Statistical Sense.

 

3.2.2. Statistical Sense—Concept Elaboration Criterion

The tendency of the entropy of an isolated system to increase is explained as tending towards a macrostate that corresponds to the maximum number of microstates. This is a neutral consequence of random motion and the exchange of energy between particles. The negative and resigned rhetoric of the ‘heat death’ vision is difficult to adopt in the Statistical Sense where much of the mystique has been removed. In addition, the final equilibrium state would correspond to a maximum number of microstates that the universe fluctuates between from the statistical perspective, a view far from stagnant death!

 

While the Thermodynamic Sense focuses on changes of entropy, the Statistical Sense provides an interpretation of absolute values of the entropy of a system. For instance, it is possible to calculate the entropy of a noble gas through the Sackur-Tetrode formula, which gives an explicit expression for the entropy of an isolated system (provided certain conditions for molecular motion are satisfied) in terms of macroscopic quantities, e.g., the volume V and the number of particles N, and constants from classical and quantum statistical mechanics: Boltzmann’s constant, kB, and Planck’s constant, h.
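
As a hedged illustration (the numerical set-up is ours, not an example from the analysed sources), the Sackur-Tetrode formula in the common form S = N kB [ln(V/(N λ³)) + 5/2], with λ the thermal de Broglie wavelength, can be evaluated for argon at room conditions:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
h = 6.62607015e-34   # Planck's constant, J s
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def sackur_tetrode_molar(m, T, P):
    """Molar entropy (J/(mol K)) of an ideal monatomic gas."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength, m
    v_per_particle = k_B * T / P                    # V/N from the ideal gas law
    return N_A * k_B * (math.log(v_per_particle / lam**3) + 2.5)

# Argon (atomic mass 39.948 u) at 298.15 K and 1 bar:
m_Ar = 39.948 * 1.66053907e-27  # kg
print(f"{sackur_tetrode_molar(m_Ar, 298.15, 1.0e5):.1f} J/(mol K)")
# ~154.8, close to the tabulated standard molar entropy of argon.
```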

 

3.2.3. Statistical Sense—Grammatical Criterion

No unique grammar patterns have been identified for the Statistical Sense.

 

3.3. Disorder Sense

In the Disorder Sense, entropy is divided into a non-science domain and a science domain, which can each be regarded as different sub-senses. The science domain is illustrated by the following examples:

 

Entropy provides a quantity measure of disorder [26].

 

Disorder is designated by a quantity called entropy, which is denoted S [27].

 

In this sense, entropy is seen as a sign of disorder and brings to mind a “messy nursery”, for example. The Disorder Sense is often used purposely in the teaching of introductory thermodynamics as an analogical tool for describing entropy. For instance, in a popular account of entropy, Atkins [28] compares the consequence of heating a system at different temperatures to sneezing in a church versus in a busy street.

 

Viard [29] has studied the understanding of entropy among fifth-year physics university students. In a pilot study, he asked ten students, “What is entropy for you?” and “What do you imagine when you think of entropy?” [29]. Nine of the students referred to entropy as disorder or a measure of a system’s disorder. Although only a small sample, this gives an indication of how well established this sense of the word is in students’ conceptual frameworks. He also showed that the association may often lead students to generate erroneous conclusions when solving thermodynamics tasks. For example, students suggested that the entropy would increase during the reversible and adiabatic expansion of a gas, probably because they took only the increased spatial disorder of the larger volume into account. Viard found similar reasoning patterns among other groups of third-year university students.

 

In the non-science domain, the Disorder Sense does not refer to physics, but is rather a subjective interpretation of tendencies in society or culture. For example:

 

[The entropy struck after four windows]. In effect, it all starts with a vision to cheat the entropy of decomposition. Get the mouldering windows to be like new again. But the journey is full of surprising events [12].

In this manner, the term entropy has been introduced in other domains as a metaphor for general chaos or disorder. This general use of entropy as disorder has been elaborated in “semi-formal” ways in several academic fields, such as economics, where analogies have been made to the formalism of thermodynamics (see for example Saslow [30]), or, more freely, in psychology or sociology. A parallel can be made to evolutionary theory, which has been metaphorically transferred onto the domain of society in the form of social Darwinism.

 

3.3.1. Disorder Sense—Meaning Criterion

The Disorder Sense uses a system model consisting of at least two levels to establish the referent. The system is constituted by a set of parts that are each in a different degree of disorder relative to each other. This is similar to the Statistical Sense, but distinct from the Thermodynamic Sense, where a system is described by a set of state functions, e.g., volume, internal energy or pressure, and at the macroscopic level only.

 

Unlike the Statistical Sense, the Disorder Sense typically does not prompt a probabilistic approach, but uses a snapshot of a situation which, analogically speaking, represents one single microstate. Disorder is related to visually salient spatial configurations and ‘messiness’, and does not take the energy distribution into account. A typical textbook example is given below:

 

The entropy is a measure of the disorder in a system. If I empty a box of Lego pieces on the floor, there is disorder among the Lego pieces. They are randomly scattered over the floor. The entropy of the Lego pieces is higher when they are scattered than when they are arranged in the box [31].

 

3.3.2. Disorder Sense—Concept Elaboration Criterion

Entropy in the Disorder Sense has negative connotations in common with the ‘heat death’ sub-sense of the Thermodynamic Sense. The word ‘disorder’ prompts a subjective and emotionally charged interpretation that is not part of the Statistical Sense.

 

The Disorder Sense often appears in everyday contexts and uses common objects that we are familiar with and which we can easily imagine in thought experiments. This is in stark contrast to the Statistical Sense that deals with phenomena in the microscopic world that are not directly accessible through our physical senses. In an educational setting, the disorder metaphor makes the manipulation of individual microscopic particles more concrete by comparison with everyday physical items.

 

We would claim that the “semi-formal” adoption of the word entropy in the social sciences, economics or psychology has not reached a mature level, with definitions agreed upon in the respective science communities. Instead, such usage should be regarded as an elaboration of the Disorder Sense (or alternatively of the Thermodynamic or Statistical Senses, when their formalisms are explicitly used) in new contexts.

 

3.3.3. Disorder Sense—Grammatical Criterion

No unique grammar patterns have been identified for the Disorder Sense.

 

3.4. Information Sense

The term entropy was introduced into information theory by Claude Shannon in 1948 for the quantity H = -K Σ pi ln pi, as a measure of the information produced in the stochastic process of forming a message [32]. The adoption of the word entropy was directly inspired by statistical mechanics, as von Neumann advised Shannon:

 

Call it entropy. It is already in use under that name and besides, it will give you a great edge in debates because nobody knows what entropy is anyway [33].

 

The entropy concept in information theory has found a variety of applications, such as establishing limits for data compression, the analysis of natural language and distribution of wealth. Lambert [34], however, claims that it was unfortunate that the term entropy was introduced into information theory. He argues that it gives rise to many misunderstandings, particularly as entropy is not related to energy in this domain.

 

3.4.1. Information Sense—Meaning Criterion

In the realm of information theory, entropy can be interpreted as the average amount or rate of information produced when forming a message, element by element. The idealised prototypical model of the Information Sense shares with the Statistical Sense the relationship between a macroscopic system (the entire message) and its elements (bits of information), as well as the mathematical formalism of the expression S = -k Σ pi ln pi. However, in the Information Sense, the constant k is not related to energy and temperature. It therefore has a different unit and interpretation as a physical quantity.

 

In the Information Sense, the referent relates to the information needed to produce or interpret some type of message, using its known or unknown elements (digits, letters, words, etc.). This informational domain is distinct from the physics and chemistry domain shared in thermodynamics and in classical and quantum statistical mechanics, which renders a different meaning. An example of the Information Sense of entropy is:

 

If a source can produce only one particular message its entropy is zero, and no channel is required. For example, a computing machine set up to calculate the successive digits of π produces a definite sequence with no chance element [32].
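
The quoted limiting case is easy to check numerically. The sketch below (ours; Shannon's constant K merely fixes the unit, here bits via the base-2 logarithm) estimates a source's entropy from symbol frequencies:

```python
import math
from collections import Counter

def shannon_entropy(message, base=2):
    """H = -sum(p_i log p_i) over symbol frequencies; bits for base 2."""
    counts = Counter(message)
    total = len(message)
    h = -sum((c / total) * math.log(c / total, base)
             for c in counts.values())
    return h + 0.0  # normalise -0.0 to 0.0

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: only one possible symbol
print(shannon_entropy("abababab"))  # 1.0 bit per symbol
print(shannon_entropy("abcdabcd"))  # 2.0 bits per symbol
```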

 

3.4.2. Information Sense—Concept Elaboration Criterion

The Information Sense of entropy relates to the analysis, creation, coding and decoding of a message. Compared to the Statistical Sense, in the Information Sense there is typically a need for a contextual interpretation of the message, first by the ‘sender’ and then by the ‘receiver’. An agreement is needed on the granularity, the data format and the medium used. In addition, in the Information Sense, the focus is on the individual message or one particular configuration of items, as opposed to in the Statistical Sense, where all microstates sharing certain characteristics are considered. Entropy can be used as a tool to predict the next element in a stochastic process by changing the conditional probabilities as the message evolves:

 

[W]e describe how we have applied maximum entropy modeling to predict the French translation of an English word in context… A maximum entropy model that incorporates this constraint will predict the translations of ‘in’ in a manner consistent with whether or not the following word is ‘several’ [35].

 

This treatment of entropy conjures up aspects of time series simulations in the Statistical Sense, but still requires an additional subjective interpretation of the message.

 

3.4.3. Information Sense—Grammatical Criterion

No unique grammar patterns have been identified for the Information Sense.

 

3.5. Homogeneity Sense

In the Homogeneity Sense, the quality of entropy itself is connected to homogeneity, rather than referring to quantities such as ‘high’ or ‘increasing’ entropy. It is typical for the Homogeneity Sense to appear in non-science domains, including art and literature. Two examples from a Swedish context clarify such usage:

[A] three ton bitumen cube that is cubic at first, but after a while slowly settles into a flat lump of asphalt. It creates a powerful image of how the form of matter is smoothed out into entropy. Also my body is a kind of hydrocarbon lump without a return ticket to the origin [12].

He was interested in people, both children and adults, and received a warm response, an entropy, still remaining even though he himself has passed away. [NN] was engaged and well-informed in many areas [12].

In the first excerpt, entropy is regarded as the final, homogenous state of a dynamic process of artwork. In the second example, taken from a poetic obituary, entropy is seen as the remaining quality of a missed friend.

 

3.5.1. Homogeneity Sense—Meaning Criterion

The Homogeneity Sense refers to a system without an inner structure, whereas the Disorder Sense implies a system built up by a configuration of parts. The lack of any inner structure is shared with the Thermodynamic Sense, but the Homogeneity Sense is like the Disorder Sense in that it is distinctly subjective, immeasurable and spatially oriented.

 

3.5.2. Homogeneity Sense—Concept Elaboration Criterion

Entropy in the Homogeneity Sense has radically departed from the scientific field and is not used in science teaching or explanations.

 

3.5.3. Homogeneity Sense—Grammatical Criterion

Rodewald [36] proposes the use of ‘increasing homogeneity’ for the qualitative introduction of entropy in thermodynamics teaching. He also introduces a homogeneity function, which reaches the maximum value of 1 for a totally homogenous system. However, our interpretation of the Homogeneity Sense of entropy is that it is a quality that an object can either ‘have’ or ‘not have’. It is not a continuous function that reaches the value of 1, but should be viewed as a discrete, all-or-nothing variable. Here, it is possible to point out that there is entropy that represents a structureless state. This can be compared with the other senses, including the subjective Disorder Sense, where a system or object has, for example, low or increasing entropy: a quantity measure.

 

3.6. Sanctioning Sense and the Semantic Network of Entropy

As described above, in Lakoff’s and Evans’s approaches to polysemy, a word is represented by a radial category of senses that is derived from a logically central Sanctioning Sense [14]. We now turn to proposing how the five described senses of entropy could be related.

 

3.6.1. Earliest Attested Meaning

In the case of entropy, the origin of the word is well documented, as it was coined and presented by Clausius in a scientific paper in 1865, related to the Thermodynamic Sense. This is an unusually clear example of how a word has been “dubbed” ostensively in line with Putnam [16].

 

The introductions in statistical mechanics and information theory are also historically attested in a straightforward manner, relating to the work of Boltzmann in the 1870s and Shannon’s article in 1948. In a letter to the editor of the American Journal of Physics, Baierlein and Gearhart [37] have traced the disorder metaphor of entropy as far back as Helmholtz, who characterised entropy as ‘Unordnung’ in 1882. However, the history of the Homogeneity Sense remains unclear. It should also be noted that the Homogeneity Sense has been encountered only in a Swedish context, and may not be used in English.

 

3.6.2. Predominance in the Semantic Network

One characteristic of the Statistical Sense of entropy is that it uses a description of a system through the relationship between its elements. This characteristic is shared with the Information Sense and the Disorder Sense, but not with the Thermodynamic Sense and the Homogeneity Sense, which do not directly imply an internal structure. Unlike the four other Senses, the Homogeneity Sense is most frequently used in non-science domains. The Disorder Sense holds an intermediary position between science and non-science fields.

 

The productive character of the Statistical Sense of entropy is seen most clearly in the inspiration of the Information Sense. In addition, Saslow [30] shows that it could serve a similar purpose in the analogy between thermodynamics and economics. Furthermore, in the ground-breaking proof of the Poincaré conjecture, one of the seven Millennium Prize Problems in mathematics, Perelman [38] used an analogy to statistical mechanics and the entropy concept. Thereby, insight in physics inspired the advancement of pure mathematics in a highly innovative fashion.

 

3.6.3. Predictability Regarding Other Senses

The macroscopic thermodynamic properties of entropy can be derived mathematically from statistical mechanics, where the underlying mechanisms and a broader range of phenomena can also be explained. For instance, Baierlein [2] presents a method for deriving the formula ΔS = q/T from an example concerning the isothermal expansion of a gas, using the reasoning that the multiplicity (number of microstates) is proportional to V^N (where V is the volume and N is the number of particles). However, it is not possible to derive statistical mechanics from a macroscopic approach. Therefore, the Thermodynamic Sense can logically be derived from the Statistical Sense.
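
A numerical sketch (ours, following the outline of Baierlein's argument rather than reproducing it) checks that the statistical result, ΔS = N kB ln(V2/V1) from W ∝ V^N, agrees with the thermodynamic ΔS = Q/T for the isothermal expansion of an ideal gas:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
N = 6.02214076e23   # one mole of particles
T = 300.0           # K
V1, V2 = 1.0, 2.0   # doubling the volume (only the ratio matters)

# Statistical route: W is proportional to V^N, so
# Delta S = k_B ln(W2/W1) = N k_B ln(V2/V1).
dS_statistical = N * k_B * math.log(V2 / V1)

# Thermodynamic route: the heat absorbed in a reversible isothermal
# ideal-gas expansion is Q = N k_B T ln(V2/V1), so Delta S = Q/T.
Q = N * k_B * T * math.log(V2 / V1)
dS_thermodynamic = Q / T

print(f"{dS_statistical:.3f} J/K")    # ~5.763 J/K (= R ln 2)
print(f"{dS_thermodynamic:.3f} J/K")  # same value
```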

 

The Gibbs formulation S = -k Σ pi ln pi is very general and can be applied to several types of thermodynamic systems, regarded as different sub-senses within the Statistical Sense. Due to the similarity of the formulae used, the term ‘entropy’ was adopted in information theory by changing the value and unit of the constant k, which in turn gave rise to the Information Sense. Joslyn [39] even claims that statistical entropy is a completely syntactic, content-free concept, thereby lacking an inherent semantics. The mathematical formalism can be applied to any context of interest, extending from the original field of thermodynamics. Apart from the mere visual similarity of the formulae, Jaynes [40] showed that information theory provides a valid alternative approach to deriving statistical mechanics in physics, which does not require the equal a priori postulate used by Boltzmann. With the maximum entropy approach, the quantity H = -K Σ pi ln pi is maximised, subject to the constraint of the knowledge of measured macroscopic quantities, which are handled by the method of Lagrange multipliers. Here, “[e]ntropy as a concept may be regarded as a measure of our degree of ignorance as to the state of a system /…/ [W]henever the available information is sufficient to justify strong opinions, maximum-entropy inference gives sharp probability distributions indicating the favored alternative.” In this way, the arrow between the Statistical Sense and the Information Sense in Figure 2 below can go in both directions.
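
As a concrete instance of maximum-entropy inference in this spirit, consider the classic 'Brandeis dice' problem discussed by Jaynes: find the least-biased probabilities for the faces of a die given only a mean outcome of 4.5. This sketch (ours; the numerical scheme is a plain bisection, not Jaynes's own) exploits the exponential form of the solution:

```python
import math

# Maximum-entropy distribution p_1..p_6 for a die with mean 4.5 instead
# of the fair 3.5. The solution has the form p_i ~ exp(-lam * i); the
# Lagrange multiplier lam is found by bisection on the mean constraint.

faces = range(1, 7)
target_mean = 4.5

def mean_for(lam):
    weights = [math.exp(-lam * i) for i in faces]
    Z = sum(weights)
    return sum(i * w for i, w in zip(faces, weights)) / Z

lo, hi = -5.0, 5.0  # mean_for(lam) decreases in lam; this brackets the root
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)

weights = [math.exp(-lam * i) for i in faces]
Z = sum(weights)
p = [w / Z for w in weights]
H = -sum(pi * math.log(pi) for pi in p)
print([round(pi, 4) for pi in p])  # probabilities tilt towards high faces
print(f"mean = {mean_for(lam):.3f}, H = {H:.3f} nats")
```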

 

A metaphoric derivation of the Disorder Sense, in both non-scientific and scientific domains, from the Statistical Sense is supported by the following arguments. As seen above, the two senses share the structure of a system with a set of constituent parts that are present in some level of disarray. In this way, they have similar or identical referents. In the science education field, an analogy is made between a thermodynamic system and, for example, a messy room, a mapping that focuses on the spatial configuration of the parts. In the analogy, microstates correspond to possible configurations of a set of toys and clothes. There are only a few microstates characterised by order, which would correspond to a tidy room. This metaphorical transfer would not have been possible from the Thermodynamic Sense to the Disorder Sense, since the former does not use a system/part model; this is supported by the fact that, historically, the disorder metaphor emerged only after the introduction of the Statistical Sense. However, the negative connotation of disorder is closely linked to the ‘heat death’ sub-sense of the Thermodynamic Sense, as opposed to the more neutral Statistical Sense. Although not forming a radial category, the Disorder Sense has thus fused characteristics from these two different senses: different characteristics have been inherited from more than one other sense when forming a new one.

 

The relationships between the different senses can be seen as an example of ‘family resemblance’, as introduced by Wittgenstein [41].

 

It can be argued that the Homogeneity Sense is an extension of the Disorder Sense through the process of image schema transformation. Here, a shift is made from a process focus to an endpoint focus, as in the above-mentioned ‘over’ example. While the Disorder Sense focuses on the process of increasing disorder during spontaneous change, the Homogeneity Sense refers to the end state of equilibrium. As an example:

 

Grab the handle and pull the drawer back and forth. Eventually, everything will be mixed, like a smooth gruel, you see, no concentrations anywhere but everything is equally thick. Entropy she called it. The state of the desk drawer is, however, unlike the Earth’s, reversible [12].

 

Here, although using the analogy of the level of order in a drawer, reminiscent of the typical context of the Disorder Sense, the interpretation of the particular word entropy aligns one’s mind with the Homogeneity Sense: the gruel has entropy and a lack of inner structure. In addition, the Homogeneity Sense has ‘borrowed’ characteristics from the ‘heat death’ interpretation of the Thermodynamic Sense such as the final, static state. The Homogeneity Sense has also been found in technological contexts:

 

Still, he claims that ‘growth is the natural state’! He seems to suggest that it is only a matter of technology to avoid entropy. Well, of course you could say that whether an industry is polluting or not can be decided by seeing if the smoke goes out of or into the chimney [12].

 

Human technology will certainly not be totally free from entropy, but the degree of entropy can vary significantly between different technologies [12].

 

In the two examples above, entropy is used in a technological description that is related to the transformation of energy, a typical context for the Thermodynamic Sense. However, the expressions “free from entropy” and “avoid entropy” share clear connotations with the Homogeneity Sense. It is a matter of whether entropy exists or not; entropy is something undesired that should be avoided or maybe caught by a filter. We regard this as an unfortunate use of the word entropy in a technological setting.

 

3.6.4. Lived Human Experience

The Thermodynamic Sense is closest to the macroscopic phenomena that are perceived (e.g., the functioning of heat engines and the direction of time). However, Carnot’s and Clausius’s analyses of ideal thermodynamic processes were scientifically powerful abstractions, but at the same time they created a gap to our perception of the world around us. One unorthodox way to retain this experiential connection is proposed by the Karlsruhe Physics Course group. Here, Falk [42] puts forward the idea that entropy should be seen as the everyday conception of heat, and treated as the central substance-like quantity of thermodynamics. Norwich [43] gives another account of how the physical quantity entropy is connected to human sensations. Above threshold values for human perception, he finds linear relations between the taste of saltiness and the molar entropy of salt solutions, and between the perceived loudness of a sound and the molar entropy of a gas.

 

The Statistical Sense of entropy is founded on modelling at the microscopic scale, which is not directly available to human perception. However, as Reif [3] argues, the microscopic model of a system and its parts can be understood through the use of analogy to more concrete domains, such as the purposeful use of the Disorder Sense in educational settings.

 

3.6.5. Conclusion on Sanctioning Sense

Based on the preceding arguments and in spite of the fact that the Thermodynamic Sense is the earliest historically attested sense, we propose that the Statistical Sense has the strongest claims for being the Sanctioning Sense. This contention is based mainly on its historical precedence in relation to the Information Sense and its predictive character in relation to the other existing senses. The resulting semantic network is shown in Figure 2.

 

  4. Classification of Senses in 2-D SAS

As mentioned above, the different senses of entropy can be classified as scientifically formal, non-formal or, in the case of the Disorder Sense, as cutting across these categories.

 

This ambiguity of the nature of a sense suggests a continuum along the vertical axis of different senses. Entropy cannot be measured directly, but can only be derived through calculation from other quantities. This is shown by the blank boxes in the icon on the right, where the empirical sense is not taken into account.

 

  5. Educational Implications

As mentioned above, Baierlein [2] has recognised that there is an educational challenge associated with the introduction of fundamental thermodynamics concepts such as entropy during teaching in terms of adopting a microscopic or a macroscopic approach. Eventually, an aim of teaching ought to be to merge these perspectives into an integrated view on the physical quantities involved. As clarified through use of the 2-D SAS approach, one of the challenges with regard to teaching entropy is that the two main approaches use different referents in the modelling of the same physical quantity (see Figure 4 below).

 

The use of a particular referent in teaching sets the agenda for how the concept is perceived by the students. For instance, if the Statistical Sense of entropy is intended to be learnt by the students, an appropriate referent should be used, in this case typically an aspect of a model of particles in motion, which depicts natural phenomena. This does not guarantee successful instruction, but it is a requirement. This becomes even more important for terms that are used in non-formal settings prior to instruction or that have a distinctly dominant Sanctioning Sense. For instance, if the teacher does not explicitly point out that the thermodynamic interpretation of heat refers to the exchange of energy between two systems, the students may stick to the powerful, subjective sensation of hotness.

 

As Reif [3] points out, it is difficult to visualise thermodynamic properties with a macroscopic approach, but easier with a microscopic one. We suggest that the metaphoric transfer to the Disorder Sense is a good example of that type of visualisation. In this respect, the Disorder Sense that is often used in educational settings assumes a position somewhere between a non-formal, everyday-life concept and a sanctioned, formal construct. Analogies may provide a link to a more concrete domain, which can be made familiar through lived experience. There appears to be a paradox in that, despite its practical engineering origin, the Thermodynamic Sense of entropy is difficult to visualise! This may well be a consequence of the idealisations used in the modelling of heat engines, ideas that are far from real car engines.

 

One major drawback of the use of the disorder metaphor in the teaching of entropy is the tendency to give a ‘snap-shot’ view of only one microstate. The centrality of the dynamic fluctuation between all possible microstates is difficult to bring across with this approach. In addition, it tends to focus exclusively on spatial configuration. However, these are not insurmountable obstacles. Apart from the ‘messy room’, a more elaborate analogy is given by Atkins in a popular account [28]:

 

The analogy I like to use to show the connection [between the interpretations of entropy in macroscopic thermodynamics and statistical mechanics] is that of sneezing in a busy street or in a quiet library. A sneeze is like a disorderly input of energy, very much like energy transferred as heat. It should be easy to accept that the bigger the sneeze, the greater the disorder introduced in the street or in the library. That is the fundamental reason why the ‘energy supplied as heat’ appears in the numerator of Clausius’s expression, for the greater the energy supplied as heat, the greater the increase in disorder and therefore the greater the increase in entropy. The presence of the temperature in the denominator fits with this analogy too, with its implication that for a given supply of heat, the entropy increases more if the temperature is low than if it is high. A cool object, in which there is little thermal motion, corresponds to a quiet library. A sudden sneeze will introduce a lot of disturbance, corresponding to a big rise in entropy. A hot object, in which there is a lot of thermal motion already present, corresponds to a busy street. Now a sneeze of the same size as in the library has relatively little effect, and the increase in entropy is small.

 

In this way, Atkins provides the analogy ‘adding heat to a thermodynamic system is like sneezing’ as an educational application of the Disorder Sense of entropy. The sound volume or level of activity in the surrounding environment corresponds to the temperature of the system receiving heat, and the “size” of the sneeze is mapped to the amount of heat transferred. This particular analogy is preferable to the messy room, since it conveys less of a ‘snap-shot’ view and is not associated only with visual configuration.
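
The contrast between a single configuration and the multitude of microstates behind it can also be made concrete computationally. The following minimal sketch (our own illustration, with arbitrary toy numbers; it anticipates Boltzmann's expression S = k_B ln W, discussed further below) counts the microstates of a simple particles-in-a-box model:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(n_microstates: int) -> float:
    """S = k_B ln W for a macrostate comprising W equally probable microstates."""
    return K_B * math.log(n_microstates)

# N distinguishable particles, each in the left or right half of a box; the
# macrostate 'n particles in the left half' comprises C(N, n) microstates.
N = 20
for n_left in (0, 10):
    W = math.comb(N, n_left)
    print(f"n_left = {n_left:2d}: W = {W:6d}, S = {boltzmann_entropy(W):.2e} J/K")

# The 'perfectly ordered' macrostate (all particles on one side) is a single
# microstate (W = 1, S = 0), whereas the evenly spread macrostate comprises
# 184756 microstates. Entropy counts accessible configurations; it is not a
# property of any one snapshot.
```

Even this toy model conveys what the ‘messy room’ lacks: the system wanders over all W configurations, rather than sitting in the one depicted.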

 

Brissaud [44] proposes the use of ‘information’ and ‘freedom’ for describing the meaning of entropy. In line with Jaynes [40], he argues that ‘information’ focuses on the state of systems: from outside a system, entropy represents the ‘lack of information’ about the system, but from inside the system, entropy represents ‘information’ itself. Entropy as a measure of ‘freedom’, the freedom of choice of the next microstate, adds a complementary, more dynamic aspect. In his view, ‘disorder’ is better suited as a metaphor for temperature than for entropy.
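
The ‘lack of information’ reading can be made quantitative with Shannon's measure [32]. In this minimal sketch (our own illustration, with hypothetical distributions), an observer outside the system is missing more information the more evenly the probability is spread over the microstates:

```python
import math

def shannon_entropy(probs):
    """Missing information in bits: H = -sum(p_i log2 p_i), taking 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Microstate known with certainty: no information is missing.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits
# Four equally likely microstates: two yes/no questions are needed.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```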

 

The Statistical Sense of entropy can be modelled in several different mathematical formalisms that correspond to different referents of a system containing particles in motion (Figure 5).

 

Apart from these different ensembles/systems, classical versus quantum treatments add to the complexity of available formalisms. In statistical mechanics, the specific approaches for deriving macroscopic quantities, such as entropy, differ considerably between the formalisms. For instance, in the microcanonical ensemble, entropy is directly linked to the number of microstates: S = k_B ln W. In contrast, in the canonical ensemble, the link to macroscopic quantities is provided via the canonical partition function Q and the Helmholtz free energy: A = −k_B T ln Q. The fact that all these disparate ways of modelling can be reshaped into the expression S = −k_B Σ_i p_i ln p_i may come as a surprise when first encountered, but nevertheless provides an elegant way of demonstrating that all these approaches relate to the same qualitative Statistical Sense of entropy.
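
That the different routes agree can be checked numerically. The following minimal sketch (our own illustration, for a hypothetical two-level system with arbitrary parameter values) computes the entropy once via the canonical route, A = −k_B T ln Q and S = (U − A)/T, and once directly from S = −k_B Σ_i p_i ln p_i:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def canonical_entropy(energies, T):
    """Entropy via the partition function Q: A = -k_B T ln Q, S = (U - A) / T."""
    weights = [math.exp(-E / (K_B * T)) for E in energies]
    Q = sum(weights)
    p = [w / Q for w in weights]                     # Boltzmann probabilities
    U = sum(pi * Ei for pi, Ei in zip(p, energies))  # internal energy
    A = -K_B * T * math.log(Q)                       # Helmholtz free energy
    return (U - A) / T, p

def gibbs_entropy(p):
    """Entropy directly from the probabilities: S = -k_B sum(p_i ln p_i)."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

energies = [0.0, 1.0e-21]  # hypothetical two-level system, level spacing in J
S_canonical, p = canonical_entropy(energies, T=300.0)
print(S_canonical, gibbs_entropy(p))  # both routes yield the same value
```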

 

Another challenge in teaching and learning about entropy is that it cannot be measured directly. It has to be derived by calculation, using measurements of other physical quantities in the macroscopic domain, such as pressure, volume and temperature, together with empirical constants (e.g., k_B and h). This somewhat weak connection to a measurable, empirical domain is particularly relevant to the Statistical Sense (represented by the dotted line in Figure 6 below): it is simply not possible to size up a 6N-dimensional phase space. From this perspective, in line with Kautz et al. [5], household equipment such as bicycle pumps or car engines offers an opportunity to get a ‘feeling’ for what numerical values of the quantity may represent.
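
As an illustration of this derived character, the following minimal sketch (our own illustration; it assumes a monatomic ideal gas and the corresponding textbook relation, not anything specific to the studies cited) calculates an entropy change from measurable quantities alone:

```python
import math

R = 8.314  # molar gas constant in J/(mol K)

def entropy_change_ideal_gas(n, T1, T2, V1, V2, Cv=1.5 * R):
    """Delta S = n Cv ln(T2/T1) + n R ln(V2/V1) for n moles of an ideal gas
    (Cv = 3R/2 for a monatomic gas)."""
    return n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

# A bicycle-pump-like compression of 1 mol of a monatomic gas: volume halved,
# temperature rising from 293 K to 320 K (hypothetical values for illustration).
dS = entropy_change_ideal_gas(n=1.0, T1=293.0, T2=320.0, V1=1.0e-3, V2=0.5e-3)
print(f"Delta S = {dS:.2f} J/K")  # calculated from T and V, never read off a meter
```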

 

 

6. Discussion

One reason for applying a linguistic approach to the field of science education is that it makes it possible to identify and compare different scientific and non-scientific uses of a word. The relationships between the senses can then be described without necessarily regarding the non-scientific uses as misconceptions. The Principled Polysemy approach [8] was applied in this paper to a historically relatively new and unusual word, used predominantly in science settings, in contrast with previous studies concerned with ‘over’ and ‘time’. As a consequence, the resulting semantic network has a relatively low degree of complexity.

 

The use of text corpora for polysemy analysis has provided authentic language samples, which are ambiguous and serve to highlight the dynamic relationships between different senses. In addition, the Homogeneity Sense, of either having entropy or not, was first identified in the text corpora. However, in contrast with Gries and Divjak's [11] statistical approach, the number of samples was small and the analysis was performed qualitatively. Furthermore, the use of occurrences of the word ‘entropi’ in Swedish text corpora may have given a different result than its use in English would have, particularly in non-science domains.

 

The 2-D SAS approach [9, 10] was used to classify the identified senses of entropy and analyse their educational implications. As an extension of the original 2-D SAS approach, where only one formal referent was linked to a qualitative and a quantitative sense, this work has identified several formal referents through application of the Principled Polysemy approach. In addition, the Disorder Sense was found to be used in both formal and non-formal settings, which suggested a continuous scale on the vertical axis, previously not present in the schema.

 

It may be difficult to generalise the results from the analysis of entropy to other words used in science, but the approach itself can be applied to other words relevant within science. For instance, the word ‘time’, as studied by Evans [8] and accounted for above, could be further elaborated with regard to different scientifically formal senses. In addition, our classification of the different senses of entropy could serve as an instrument for empirical studies, providing a platform for developing teaching sequences and analysing students’ problem-solving strategies.

 

As Williams [1] points out, physics is perceived as an “exact science” and assumes an unequivocal use of concept definitions. In contrast, in line with his results, this study has shown that central terms used in science may correspond to different yet related meanings in different domains, both within and outside the science community. In addition, entropy is an abstract, theoretical term. This makes analogical approaches in teaching a viable option. However, Lambert [45] claims:

 

Entropy is not disorder. Entropy is not a measure of disorder or chaos. Entropy is not a driving force. Energy’s diffusion, dissipation, or dispersion in a final state compared to an initial state is the driving force in chemistry. Entropy is the index of that dispersal within a system and between the system and its surroundings.

 

Here, Lambert argues that entropy has a single and unambiguous definition and interpretation. In this way, it would be possible to put forward necessary and sufficient conditions as a referent for the concept. From this point of view, the use of analogies, which by definition are not perfect mappings anyhow, may introduce errors and imperfections and should therefore be avoided. We believe that this may well hold within an established science community, but probably not across different science domains, and particularly not for novice students. In the realm of thermal phenomena, following the reasoning of Andersen and Nersessian [17], there are several interrelated theoretical concepts that can only be described by a theory, such as classical thermodynamics or statistical mechanics. For example, imagine heating a gas so that it expands. This may involve the combination of increased temperature, internal energy, pressure, entropy and average particle velocity. Here, the phenomenon is described as a variation in a set of abstract properties that are interlinked with each other, where a change in one is likely to have an impact on the others. When pointing at this complex and claiming that “this is entropy”, it is difficult to discern this particular concept from the other aspects of the same phenomenon. We claim that Lambert’s idea of simply presenting things ‘as they are’ is a far too restrictive perspective on how to introduce new concepts in educational settings. Any educational tool should be considered, as long as its limitations are recognised and communicated to the students. For example, the ‘messy room’ analogy has the strong advantage of presenting a system and the relationships between its components. Drawbacks, such as the fact that only configurational aspects are accounted for and that it provides a rather ‘snap-shot’ image, should be communicated explicitly to the students and contrasted with formal thermodynamic systems.

 

Acknowledgements

 

We thank our colleagues within the Swedish National Graduate School in Science and Technology Education and two anonymous reviewers for useful comments and feedback. In addition, we thank Richard Hirsch for support in the field of linguistics, Roland Kjellander for his reading from a thermodynamics point of view and Konrad Schönborn for his thorough comments on disposition and phrasing.

 

References

 

  1. Williams, H.T. Semantics in teaching introductory physics. Am. J. Phys. 1999, 67, 670–680.
  2. Baierlein, R. Entropy and the second law: A pedagogical alternative. Am. J. Phys. 1994, 62, 15–26.
  3. Reif, F. Thermal physics in the introductory physics course: Why and how to teach it from a unified atomic perspective. Am. J. Phys. 1999, 67, 1051–1062.
  4. Cochran, M.J.; Heron, P.R.L. Development and assessment of research-based tutorials on heat engines and the second law of thermodynamics. Am. J. Phys. 2006, 74, 734–741.
  5. Kautz, C.H.; Heron, P.R.L.; Shaffer, P.S.; McDermott, L.C. Student understanding of the ideal gas law, Part II: A microscopic perspective. Am. J. Phys. 2005, 73, 1064–1071.
  6. Lambert, F.L. Disorder—A cracked crutch for supporting entropy discussions. J. Chem. Educ. 2002, 79, 187–192.
  7. Tyler, A.; Evans, V. The Semantics of English Prepositions: Spatial Scenes, Embodied Meaning and Cognition; Cambridge University Press: Cambridge, UK, 2003.
  8. Evans, V. The meaning of time: polysemy, the lexicon and conceptual structure. J. Ling. 2005, 41, 33–75.
  9. Strömdahl, H. Discernment of referents—An essential aspect of conceptual change. In NARST Annual International Conference (In NARST 2009 CD Proceedings), Garden Grove, CA, USA, April 17–21, 2009.
  10. Strömdahl, H. The challenge of polysemy: On discerning critical elements, relationships and shifts in attaining scientific terms. submitted.
  11. Gries, S.T.; Divjak, D. Behavioral profiles: A corpus-based approach to cognitive semantic analysis. In New Directions in Cognitive Linguistics; Evans, V., Pourcel, S., Eds.; John Benjamins: Amsterdam, The Netherlands, 2009; pp. 57–75.
  12. Språkbanken homepage. Available online: http://spraakbanken.gu.se/ (accessed on May 18, 2009).
  13. Lakoff, G. Women, Fire, and Dangerous Things: What Categories Reveal about the Mind; University of Chicago Press: Chicago, IL, USA, 1987.
  14. Evans, V.; Green, M. Cognitive Linguistics: An Introduction; Edinburgh University Press: Edinburgh, UK, 2006.
  15. Posner, G.J.; Strike, K.A.; Hewson, P.W.; Gertzog, W.A. Accommodation of a scientific conception: Toward a theory of conceptual change. Sci. Educ. 1982, 66, 211–227.
  16. Putnam, H. Meaning and reference. J. Phil. 1973, 70, 699–711.
  17. Andersen, H.; Nersessian, N. Nomic concepts, frames and conceptual change. Philos. Sci. 2000, 67, 224–241.
  18. Andersen, H. Reference and resemblance. Philos. Sci. 2001, 68, 50–61.
  19. Clausius, R. The Mechanical Theory of Heat, with its Applications to the Steam-Engine and to the Physical Properties of Bodies; John van Voorst: London, UK, 1867.
  20. Pogliani, L.; Berberan-Santos, M.N. Constantin Carathéodory and the axiomatic thermodynamics. J. Math. Chem. 2000, 28, 313–324.
  21. Sklar, L. Physics and Change: Philosophical Issues in the Foundations of Statistical Mechanics; Cambridge University Press: Cambridge, UK, 1993.
  22. Tolman, R.C.; Fine, P.C. On the irreversible production of entropy. Rev. Mod. Phys. 1948, 20, 51–77.
  23. Davies, P. The 5th Miracle. The Search for the Origin and Meaning of Life; Simon & Schuster Paperbacks: New York, NY, USA, 1999.
  24. Dictionary.com. Available online: http://dictionary.reference.com/ (accessed on May 18, 2009).
  25. Lebowitz, J.L. Statistical mechanics: A selective review of two central issues. Rev. Mod. Phys. 1999, 71, 346–357.
  26. Young, H.D.; Freedman, R.A.; Sears, F.W. Sears and Zemansky’s University Physics: With Modern Physics, 11th edition; Pearson Education: San Francisco, CA, USA, 2003.
  27. Henriksson, A. Kemi. Kurs A., 1st edition; Gleerup: Malmö, Sweden, 2001.
  28. Atkins, P. Galileo’s Finger. The Ten Great Ideas of Science; Oxford University Press: Oxford, UK, 2003.
  29. Viard, J. Using the history of science to teach thermodynamics at the university level: The case of the concept of entropy. In Eighth International History, Philosophy, Sociology & Science Teaching Conference, Leeds, UK, July 15–18, 2005.
  30. Saslow, W.M. An economic analogy to thermodynamics. Am. J. Phys. 1999, 67, 1239–1247.
  31. Ekstig, B. Naturen, Naturvetenskapen och Lärandet; Studentlitteratur: Lund, Sweden, 2002.
  32. Shannon, C.E. A mathematical theory of communication. Mob. Comput. Commun. Rev. 2001, 5, 3–55.
  33. Müller, I. A History of Thermodynamics. The Doctrine of Energy and Entropy; Springer: Berlin, Germany, 2007.
  34. Lambert, F.L. Entropy is simple—If we avoid the briar patches! Available online: http://www.entropysimple.com/content.htm (accessed on November 28, 2008).
  35. Berger, A.L.; Pietra, V.J.D.; Pietra, S.A.D. A maximum entropy approach to natural language processing. Comput. Linguist. 1996, 22, 39–71.
  36. Rodewald, B. Entropy and homogeneity. Am. J. Phys. 1990, 58, 164–168.
  37. Baierlein, R.; Gearhart, C.A. The disorder metaphor (Letters to the editor). Am. J. Phys. 2003, 71, 103.
  38. Perelman, G. The entropy formula for the Ricci flow and its geometric applications. Available online: http://arxiv.org/abs/math/0211159v1 (accessed on August 20, 2009).
  39. Joslyn, C. On the semantics of entropy measures of emergent phenomena. Cybernetics Syst. 1990, 22, 631–640.
  40. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
  41. Wittgenstein, L. Philosophical Investigations, 3rd edition; Prentice Hall: London, UK, 1999.
  42. Falk, G. Entropy, a resurrection of the caloric—a look at the history of thermodynamics. Eur. J. Phys. 1985, 6, 108–115.
  43. Norwich, K.H. Physical entropy and the senses. Acta Biotheor. 2005, 53, 167–180.
  44. Brissaud, J.-B. The meanings of entropy. Entropy 2005, 7, 68–96.
  45. Lambert, F.L. Configurational entropy revisited. J. Chem. Educ. 2007, 84, 1548–1550.

 

© 2010 by the authors; licensee Molecular Diversity Preservation International, Basel, Switzerland.

 

This article is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).