Report On Shannon Entropy

The idea of entropy was presented by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". Wikipedia characterizes entropy as "a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits, and a 'message' means a specific realization of the random variable. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable."
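
Concretely, for a discrete random variable X taking values x with probabilities p(x), the Shannon entropy is H(X) = -Σ p(x) log2 p(x), measured in bits when the logarithm is base 2. A minimal sketch in Python (the function name and the example figures are illustrative, not taken from Shannon [50]):

    import math

    def shannon_entropy(probs):
        """Shannon entropy, in bits, of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain and carries 1 bit per toss;
    # a biased coin is more predictable and carries less.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    print(shannon_entropy([0.9, 0.1]))  # ~0.469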

Historically, numerous notions of entropy have been proposed. The etymology of the word goes back to Clausius who, in 1865, coined the term from the Greek tropē, meaning change or transformation, with the prefix en- to recall the (in his work) indissociable connection to the idea of energy (Jaynes [43]). A statistical notion of entropy was introduced by Shannon in the theory of communication and transmission of information [50]. It is formally similar to the Boltzmann entropy associated with the statistical description of the microscopic configurations of many-body systems, and with how these account for their macroscopic behavior. Establishing the connections between statistical entropy, statistical mechanics, and thermodynamic entropy was initiated by Jaynes [43]. From an initially quite different perspective, a notion of entropy rate was developed in dynamical systems theory and symbolic sequence analysis. The problem of compression is sometimes grounded in information theory and Shannon entropy, while in other cases it is grounded in algorithmic complexity. Given this diversity of uses and concepts, we may ask whether the use of the term entropy has any unified meaning. Is there really something connecting this diversity, or is the use of the same term in so many senses simply misleading? A short historical account of the different notions of entropy was given by Jaynes thirty years earlier [43]. Here I propose a more detailed review of the connections between the different notions of entropy, not from a historical perspective but as they appear today, highlighting bridges between probability, information theory, dynamical systems theory, and statistical physics. I will develop my argument in light of mathematical results related to Shannon entropy, operational flexibility, and demand change. These results offer a strong qualitative and quantitative guide to the correct use and interpretation of these concepts. In particular, they provide a rationale, as well as several caveats, for the maximum entropy principle.

In this study, a method to quantify operational flexibility is developed. The range of volumes that can be produced for each product mix, which is the essence of operational flexibility, can be calculated using information on operating costs together with the technical data of the manufacturing system and of the product to be produced.

Rudolf Clausius, a German physicist, formulated the second law of thermodynamics in 1865 by stating that heat flows spontaneously from hot bodies to cool ones, never the reverse. He conjectured that matter must have a previously unrecognized property, which he called entropy, and showed that total entropy always increases in any natural process. This observation led him to state the second law as: the entropy of the universe tends to a maximum. Empirically, Clausius defined entropy, S, in differential form:

dS = dQ / T

where dS is the change in entropy in a closed system due to a physical process in which a quantity of heat, dQ, flows at absolute temperature T, streaming spontaneously from a higher to a lower temperature. This definition, however, fails to provide much understanding as to how the concept can be used concretely.
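
As a worked illustration (the numbers here are chosen for concreteness, not taken from Clausius): if 1000 J of heat flows from a reservoir at 500 K to one at 300 K, the hot reservoir loses 1000/500 = 2 J/K of entropy while the cold reservoir gains 1000/300 ≈ 3.33 J/K, so the total entropy change is about +1.33 J/K. The gain always exceeds the loss because the same dQ is divided by a smaller T, which is why spontaneous heat flow drives total entropy toward a maximum.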

From the perspective of statistical mechanics, entropy is viewed in terms of the probabilities of the various events that may occur within the set of all possible events. By observing the behavior of large numbers of particles, statistical mechanics has succeeded in giving equations for the calculation of entropy, as well as justification for equating entropy with a degree of disorder.
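
The best known of these equations is Boltzmann's relation, stated here for concreteness (the report itself does not quote it): S = k_B ln W, where W is the number of microscopic configurations consistent with the observed macroscopic state and k_B is Boltzmann's constant. A macrostate that can be realized in many ways, i.e., a more disordered one, therefore has higher entropy, which is the justification for equating entropy with disorder.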

Shannon [50] looked at information as a function of the a priori probability of a given state or outcome among the universe of physically possible states. He considered entropy as equivalent to uncertainty. In this way, information theory parallels the second law of thermodynamics as expressed by Clausius, in claiming that uncertainty in the world always tends to increase. Indeed, as our perception of the world becomes increasingly complex, the number of phenomena about which we are uncertain increases, and the uncertainty about each phenomenon also increases. To decrease this uncertainty, one collects an ever increasing amount of information, as described by Kapur [45].
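
The equivalence of information with removed uncertainty can be made concrete with a small sketch (a hypothetical example, not drawn from Kapur [45]):

    from math import log2

    # Identifying one of 8 equally likely possibilities: log2(8) = 3 bits of uncertainty.
    before = log2(8)
    # A yes/no answer that rules out half of them leaves 4 candidates: log2(4) = 2 bits.
    after = log2(4)
    print(before - after)  # 1.0 bit of information gained from the answer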

A system facing uncertainty uses flexibility as an adaptive response to cope with change. The flexibility in the action of the system depends on the decision alternatives or choices available, and on the freedom with which the various choices can be made (R. Caprihan [11]). A greater number of choices leads to more uncertainty of outcomes and, hence, increased flexibility. This inference has been the main driver for different researchers to apply entropy as a measure of flexibility.
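
A short sketch (illustrative only, not drawn from Caprihan [11]) of how entropy tracks both drivers of flexibility named above, the number of choices and the freedom with which they are used:

    from math import log2

    def entropy(probs):
        """Shannon entropy in bits."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # More equally likely choices -> more entropy.
    print(entropy([1/2] * 2))  # 1.0
    print(entropy([1/4] * 4))  # 2.0

    # The same four choices used unevenly -> less entropy than 2.0.
    print(entropy([0.7, 0.1, 0.1, 0.1]))  # ~1.357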

Jaynes [43] demonstrated consistency between the definition of entropy in statistical mechanics and the definition in information theory. He showed that the measure of uncertainty defined by Shannon could be taken as a primitive one and used to derive state probabilities. Jaynes also introduced a formal entropy-maximization principle, which Tribus [44] subsequently used to demonstrate that all the laws of classical thermodynamics could be derived from the uncertainty measure. In summary, information is equivalent to the removal of uncertainty, and uncertainty and entropy are essentially identical, not mere analogs.
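
In Jaynes's formulation, stated here for concreteness, one chooses the state probabilities p_i that maximize H = -Σ p_i ln p_i subject to whatever constraints the available information imposes. With no constraint beyond Σ p_i = 1, the maximization yields the uniform distribution p_i = 1/n; with a fixed mean energy Σ p_i E_i = U, it yields the Boltzmann distribution p_i ∝ exp(-βE_i), which is how the laws of classical thermodynamics reappear from the uncertainty measure.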

Originally, entropy was used to measure the amount of flexibility associated with probabilistic events. As events increase in number and become closer in probability of occurrence, the flexibility inherent in the situation increases and, consequently, entropy increases. This observation of the nature of entropy leads to the idea of applying entropy to the relative demand for items, defined as the ratio of the time spent processing an item to the total processing time of all items. Thus, the more relative demands that exist, the higher the entropy, meaning that processes capable of handling more items are more flexible. Likewise, as relative demands become closer in value, i.e., as the workload is spread more evenly, the process is more flexible and the entropy is higher.

To distinguish the proposed flexibility measure from previous work, the measure is designed with the following properties. It is a dynamic measure that can be used along a time continuum to monitor the performance of a system in terms of its flexibility. It is a general measure, so that it can be applied to any process.

To test entropy as a measure, it is applied to a process, where a process is defined as an activity or set of activities that takes a set of inputs and transforms them into a set of outputs. A single activity or machine, a group of machines, or a whole plant can then be defined as a process. The model of a job shop is used to test entropy as a measure, since the whole shop or any part of it acts as a process. Each activity within a process consists of setup and production. A process is considered flexible when it can handle many jobs and/or produces equal amounts within each job, indicating an equal ability to handle different items. The more items a job shop can produce, the more flexibility it exhibits for handling change.

In order to measure flexibility, entropy is used to measure the relative demand for the outputs or items produced by a process. The relative demand for an item is defined as the amount of time an entity dedicates to handling a task divided by the total handling time for all tasks assigned to that entity. As is demonstrated later in a mathematical model, this measures the amount of change (variation in demand) to which an entity is subjected. Entropy measures both the number of relative demands or tasks an entity can service and the distribution of these relative demands.
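
A minimal sketch of how such a measure could be computed, assuming only the definition of relative demand given above (the function names are illustrative; the report's own mathematical model appears later):

    from math import log2

    def relative_demands(processing_times):
        """Relative demand of each item: its processing time over the total."""
        total = sum(processing_times)
        return [t / total for t in processing_times]

    def flexibility(processing_times):
        """Entropy (in bits) of the relative demands, used as a flexibility measure."""
        return -sum(p * log2(p) for p in relative_demands(processing_times) if p > 0)

    # A shop splitting its time evenly over four jobs is maximally flexible ...
    print(flexibility([10, 10, 10, 10]))  # 2.0 bits
    # ... while one dominated by a single job is far less so.
    print(flexibility([37, 1, 1, 1]))     # ~0.50 bits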
