Informatics Carnot Machine


Authors: Oded Kafri

Varicom Communications, Tel Aviv 68165, Israel.

Abstract

Based on Planck's blackbody equation it is argued that a single-mode light pulse with a large number of photons carries one entropy unit, while an empty radiation mode carries no entropy. Under this assumption, the entropy carried by a coded sequence of light pulses is simply the Gibbs mixing entropy, which is identical to the logical Shannon information. This approach is supported by a demonstration that information transmission and amplification by a sequence of light pulses in an optical fiber constitute a classic Carnot machine comprising two isothermals and two adiabats. It is therefore concluded that, under certain conditions, entropy is information.

A sequence of light pulses transmitted through an optical fiber is widely used in communication [1]. A random sequence of identical light pulses, representing "1", and vacancies, representing "0", is the physical entity of a transmitted binary file. The length $L$ of the pulse sequence (the number of pulses and vacancies) and the randomness of the distribution of the pulses determine the amount of Shannon information being transmitted [2]. In this communication a thermodynamic analysis of the transmission of a random sequence of light pulses (a file) is presented. It is shown that the Shannon information is entropy, and that the amplification process performed in an optical fiber is a Carnot cycle with the Carnot efficiency.

To calculate the entropy of the sequence of pulses, it is first necessary to calculate the entropy of a single, single-mode coherent pulse. It is assumed that the $i$-th pulse in a sequence has $n_i$ photons of energy $h\nu$ in a single mode. Since the photons are indistinguishable, the pulse is coherent. The temperature of the pulse is taken to be equal to that of a blackbody that emits $n_i$ photons into a single mode of frequency $\nu$ [3].
Since a blackbody is in equilibrium with its radiation, a temperature can be calculated. (Appropriate spatial and spectral filters may remove the other radiation modes.) In this case

$n_i = \dfrac{1}{e^{h\nu/k_B T_i} - 1}$.  (1)

The temperature of the coherent pulse obtained from Eq. (1) is

$T_i = \dfrac{h\nu}{k_B \ln(1 + 1/n_i)}$.  (2)

The total energy of the pulse is $q_i = n_i h\nu$. Therefore the entropy that a single pulse carries away from the blackbody is $S_i = q_i/T_i$, or

$S_i = k_B n_i \ln(1 + 1/n_i)$.  (3)

Since $\lim_{n_i \to \infty} S_i = k_B$, the entropy of a coherent pulse in the classical limit is identical to that of a classic harmonic oscillator (for which $q_i = k_B T_i$) and is not a function of its energy. Similarly, a mode without energy carries no entropy, as $\lim_{n_i \to 0} S_i = 0$.

The total entropy of a sequence of pulses, some with a large number of photons (each having entropy $k_B$) and some with no photons (empty pulses having no entropy), is the Gibbs mixing entropy of the sequence, namely $S = -k_B \sum_{j=1}^{\Omega} p_j \ln p_j$, where $\Omega$ is the number of configurations of the pulses and $p_j$ is the probability of the $j$-th configuration. It is seen that in the classical limit (large $n_i$) the entropy is equivalent to the Gibbs entropy of mixing, and hence to the Shannon information.

For example, to calculate the entropy of a random sequence of light pulses of length $L$, it is necessary to consider the fact that each pulse has a probability ½ of being "one" and a probability ½ of being "zero". Therefore, the mixing entropy per pulse is $S_i = -k_B(\tfrac{1}{2}\ln\tfrac{1}{2} + \tfrac{1}{2}\ln\tfrac{1}{2}) = k_B \ln 2$. To find the total entropy of the sequence we can sum the entropies of all the pulses (because entropy is extensive), namely $S = \sum_{i=1}^{L} S_i = k_B L \ln 2$. The Shannon information $I$ is defined as $I = -\sum_{j=1}^{\Omega} p_j \ln p_j$ [2]. If the probabilities of all the configurations are equal to $1/\Omega$, then $I = \ln \Omega$.
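As a numerical illustration of Eqs. (2) and (3), the following Python sketch evaluates the pulse temperature and entropy; the constants are CODATA values, while the example frequency and photon numbers are assumed for illustration only. It shows the two limits discussed above: the per-pulse entropy approaches $k_B$ for large $n_i$ and vanishes for an empty mode.

```python
import math

# CODATA physical constants (SI units)
K_B = 1.380649e-23   # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s

def pulse_temperature(n: float, nu: float) -> float:
    """Eq. (2): temperature of a single-mode pulse of n photons at frequency nu."""
    return H * nu / (K_B * math.log(1.0 + 1.0 / n))

def pulse_entropy(n: float) -> float:
    """Eq. (3): S_i = k_B n ln(1 + 1/n); tends to k_B as n -> inf, to 0 as n -> 0."""
    if n == 0.0:
        return 0.0
    return K_B * n * math.log(1.0 + 1.0 / n)

nu = 1.934e14  # assumed example: a 1550 nm telecom carrier frequency, Hz
for n in (0.1, 1.0, 1e3, 1e6):
    print(f"n = {n:>9}: T = {pulse_temperature(n, nu):.3e} K, "
          f"S/k_B = {pulse_entropy(n) / K_B:.6f}")
```

Note that the entropy column saturates at 1 (in units of $k_B$) regardless of the pulse energy, while the temperature keeps growing with $n$, in line with the text.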
The number of configurations of a binary file of length $L$ is $2^L$; therefore the maximum amount of information in the bits of a file of $L$ pulses is $L \ln 2$ nats (1 bit = $\ln 2$ nat). It is seen that thermodynamics and information theory yield the same result. When the sequence of bits is not random, the amount of information in the sequence is smaller. Therefore, in general, we obtain the Clausius inequality,

$S \ge k_B I$.  (4)

One can generalize this analysis and calculate the energy and the temperature of the whole sequence of pulses. This can be done easily for a random sequence. When $n_i$ is large, the temperature $T_i$ of a coherent pulse is $n_i h\nu / k_B = q_i / k_B$, where $q_i$ is the energy of the pulse. Namely, in contradistinction to the entropy, $T_i$ is a function of $q_i$. If we assume that all the energetic pulses have an equal energy $q$, the total energy of the sequence is $Q = \sum_{i=1}^{L} q_i = \tfrac{1}{2} q L$. The entropy of the sequence is $S = k_B L \ln 2$ in nats, or $k_B L$ in bits. The file temperature is $T = Q/S = q/2k_B$ (with the entropy counted in bits). This means that the average bit energy $q/2$ is equal to $k_B T$, which is the same relation as for a harmonic oscillator. It is worth noting that a random sequence of pulses is non-coherent radiation; nevertheless, it retains the thermodynamic properties of a harmonic oscillator.

Now it is shown that this formalism complies with the second law of thermodynamics [4]. Consider a long optical fiber in which a file comprising a sequence of light pulses, having a temperature $T_H$ and a pulse energy $q_H$, travels along the fiber. The pulse energy is attenuated due to the loss in the fiber; therefore, the energy and the temperature of the pulses are reduced to $T_C$. Nevertheless, the amount of information (the entropy) remains intact. This process of cooling at constant entropy is, thermodynamically speaking, an adiabatic expansion.
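The counting argument above can be checked numerically. The sketch below (the file length $L$ is an assumed example value) evaluates the per-pulse mixing entropy in nats and confirms that a biased, non-random file carries less than $\ln 2$ nat per pulse, consistent with inequality (4).

```python
import math

def shannon_info_per_bit(p_one: float) -> float:
    """Shannon information per pulse in nats: -sum p ln p over {p_one, 1 - p_one}."""
    info = 0.0
    for p in (p_one, 1.0 - p_one):
        if p > 0.0:
            info -= p * math.log(p)
    return info

L = 1024  # assumed file length in pulses

# Random file: p = 1/2 per pulse, so I = L ln 2 nats and S = k_B L ln 2.
print(shannon_info_per_bit(0.5) * L)
# Biased (non-random) file: fewer nats per pulse, so I < L ln 2.
print(shannon_info_per_bit(0.9) * L)
```

The first value is the maximum $L \ln 2$; the second is strictly smaller, which is the information-theoretic side of the Clausius inequality $S \ge k_B I$.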
When the pulse energy is reduced, the file requires amplification. To amplify the sequence of pulses, the amplifier first has to read the file. The reading process is an energy transfer to the amplifier at constant bit energy; in this process the amplifier increases its energy at a constant temperature. Thermodynamically speaking, this is an isothermal compression. In the next stage the file is amplified back to $T_H$. This stage is an adiabatic compression, in which we invest work to increase the energy of the pulses without increasing their information (entropy). Finally, in the last stage the amplifier writes (emits) the light pulses into the fiber. In this stage the amplifier reduces its energy at a constant temperature $T_H$, and the cycle starts again.

Fig. 1. A Carnot cycle for file amplification.

Amplifiers are necessary to overcome the energy loss along a fiber. Each cycle of amplification is a Carnot cycle of two isothermals and two adiabats. Hereafter it is shown that this cycle has the efficiency of a Carnot machine. Before entering the amplifier, the sequence of light pulses has a relatively low temperature $T_C$, and its entropy is $Q/T_C$. After the amplification it has a higher temperature $T_H$. If $Q$ is unchanged, the entropy is smaller. Therefore, the entropy balance $\Delta S = Q/T_H - Q/T_C < 0$ is negative, which is a violation of the second law. The physical reason for the entropy reduction is that with a given amount of energy $Q$, one can write more low-energy bits than high-energy bits. To conserve the entropy (a reversible operation), we have to keep $\Delta S = 0$. That means that we have to add more energy to $Q$. For a reversible operation $Q_H/T_H = Q_C/T_C$. Designating $Q_C = Q$ and $Q_H = Q + W$, we obtain $W = Q_H (1 - T_C/T_H)$. In the irreversible case $Q_H/T_H > Q_C/T_C$; thus, in general,

$\eta \equiv \dfrac{W}{Q_H} \le 1 - \dfrac{T_C}{T_H}$.  (5)

Eq. (5) is the Carnot efficiency.
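The energy balance of a reversible amplification cycle can be sketched in a few lines; in the fragment below the temperatures and the read-in energy are assumed example values, not taken from the paper.

```python
def carnot_work(q_c: float, t_c: float, t_h: float) -> float:
    """Reversible cycle: Q_H / T_H = Q_C / T_C.
    Given the heat Q_C read in at T_C, return the work W = Q_H - Q_C
    that must be invested to re-heat the file to T_H at constant entropy."""
    q_h = q_c * t_h / t_c
    return q_h - q_c

T_C, T_H = 200.0, 300.0   # assumed pulse temperatures before/after amplification, K
Q_C = 1.0                 # assumed read-in energy (arbitrary units)

W = carnot_work(Q_C, T_C, T_H)
Q_H = Q_C + W
eta = W / Q_H             # efficiency of the cycle, Eq. (5)
print(eta, 1.0 - T_C / T_H)   # the two values coincide for the reversible cycle
```

For the reversible cycle the computed efficiency $W/Q_H$ reproduces $1 - T_C/T_H$ exactly; any irreversibility only lowers it, as inequality (5) states.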
Summary and discussion: The amount of entropy removed from a blackbody by a single radiation mode in the classical limit is $k_B$. If the radiation mode is empty, it removes no entropy. The entropy removed from the blackbody is assumed to be the entropy of the pulses. Therefore, it is argued that in the classical limit an energetic pulse carries entropy $k_B$ and a vacancy carries zero entropy. When this assumption is used to calculate the entropy of a sequence of pulses, the obtained entropy of the sequence is the Gibbs mixing entropy, which is identical to the logical Shannon information. The plausibility of this formalism is demonstrated by presenting an informatics Carnot cycle that yields the Carnot efficiency for an ideal amplifier cycle in an optical fiber.

Temperature and thermal equilibrium are concepts used to describe random systems in equilibrium. In random systems energy is exchanged between particles by collisions; there is no energy exchange between photons. Nevertheless, the quenched randomness of the energetic bits and the zero bits behaves, according to the present formalism, as in equilibrium, namely a state in which it is possible to calculate a unique temperature.

It was shown previously that laser operation [5,6] and laser-cooling processes [7], which involve the production or use of coherent light, yield the Carnot efficiency and therefore comply with the second law of thermodynamics. In these processes the light was considered as work, since the light radiation was assumed to be coherent. A coherent light beam has a single radiation mode [8] and therefore carries a negligible amount of entropy. In the present study $L/2$ pulses, distributed randomly in $L$ modes, carry entropy that is shown to be the Shannon information. The pulse sequence is not coherent, as it is random. The lower the coherence, the higher the amount of information that can be carried by the sequence.
This communication suggests that the Shannon information can affect the efficiency of a Carnot machine. Only when $n_i$ is large is the entropy independent of the energy and the temperature. In this limit the entropy becomes a pure measure of the quenched randomness, exactly like the logical Shannon information. This is a vital condition in informatics, as the entropy should remain intact under energy attenuation. When $n_i$ is small, the entropy $S = S(Q)$ is smaller than the logical information. The entropy deficiency $k_B I - S(Q)$ is a loss of the logical information. The Carnot efficiency of an amplifier can be tested experimentally. Calorimetric experiments of this kind require careful photon counting; nevertheless, they are possible with contemporary technology. This study suggests that the second law is applicable in the classical limit to informatics.

Acknowledgements: I thank Y. B. Band, R. D. Levine and Y. Kafri for many useful discussions.

References

1. J. C. Palais, "Fiber Optics Communications", McGraw-Hill Professional (1998).
2. C. E. Shannon, "A Mathematical Theory of Communication", University of Illinois Press, Evanston, Ill. (1949).
3. N. Gershenfeld, "The Physics of Information Technology", Cambridge University Press, p. 167 (2000).
4. E. T. Jaynes, "The Evolution of Carnot's Principle", in Maximum-Entropy and Bayesian Methods in Science and Engineering, 1, G. J. Erickson and C. R. Smith (eds.), Kluwer, Dordrecht, p. 267 (1988).
5. J. E. Geusic, E. O. Schulz-DuBois & H. E. D. Scovil, Phys. Rev. 156, 343 (1967).
6. R. D. Levine & O. Kafri, Chemical Physics Letters 27, 175-179 (1974).
7. O. Kafri & R. D. Levine, Optics Communications 12, 118-122 (1974).
8. O. Kafri & I. Glatt, "The Physics of Moiré Metrology", J. Wiley & Sons, p. 43 (1990).
