According to one (almost undoubtedly untrue) story, when grappling with the terminology to use in his paper, Shannon asked the legendary mathematician and physicist John von Neumann¹ 'What should I call this thing?'. Von Neumann reputedly responded: 'Say that information reduces "entropy." It is a good, solid physics word. And more importantly, no one knows what entropy really is, so in a debate, you will always have the advantage.'

Any research on entropy will leave the researcher in stupefaction over the possible definitions of a law that lies at the heart of humanity's understanding of the universe. Entropy² lies at the core of the Second Law of Thermodynamics,³ and few technological advances would have taken place without understanding its fundamental consequences. Entropy should benefit from an uncontested, universally accepted definition across all logical formulations. However, as von Neumann supposedly stated, this has proven impossible, and this crucial law is defined differently depending on the system to which it is being applied. It has even entered the realm of buzzwords, where one can hear the term 'entropy' spoken with abandon and without any essential meaning.

'Entropy always increases until it reaches a maximum value. The total entropy of a system will always increase until it reaches its maximum possible value; it will never decrease on its own unless an outside agent works to decrease it.'⁴ This law holds that there will always be an expenditure of energy in any system in the universe, and this expended energy cannot be reused. The entropy produced by all natural and biological systems will continue to expand until some outside source interferes with this trajectory. For example, many systems release heat as part of their working process. 'This so-called heat loss is measured by a quantity called entropy. Entropy is a measure of the energy that cannot be converted into additional work.'⁵ The image of the fire above is an example of entropy: the products of fire are composed mostly of gases such as carbon dioxide and water vapor, so the entropy of the system increases during most combustion reactions. The energy decreases while the entropy increases, and the entropy cannot be reused or converted. In short, it is lost to the entire system.

In information theory, Shannon recognized the order-disorder quandary. To address the misuse and misunderstanding of the term, Professor Arieh Ben-Naim has suggested abandoning the word 'entropy' altogether and replacing it with 'missing information.' He also suggests entropy can be classified into two major categories: 'one is based on the interpretation of entropy in terms of the extent of disorder in a system; the second involves the interpretation of entropy in terms of the missing information on the system.'⁶
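The two interpretations Ben-Naim contrasts can be made concrete with the standard textbook formulas. The sketch below is illustrative only and is not drawn from the sources cited above: Boltzmann's statistical entropy quantifies 'disorder' by counting accessible microstates, Shannon's entropy quantifies 'missing information' about which outcome actually occurs, and for equally likely microstates the two expressions coincide up to a constant factor.

```latex
% Illustrative sketch only; standard textbook forms, not taken from the cited references.
% S: Boltzmann's statistical entropy ("disorder"); k_B is Boltzmann's constant,
%    W the number of accessible microstates.
% H: Shannon's entropy ("missing information"), in bits; p_i is the probability of outcome i.
\begin{align*}
  S &= k_{B} \ln W \\
  H &= -\sum_{i} p_{i} \log_{2} p_{i} \\
  \intertext{For $W$ equally likely microstates ($p_{i} = 1/W$) the two agree up to a constant:}
  H &= \log_{2} W, \qquad S = (k_{B} \ln 2)\, H .
\end{align*}
```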