2016 PUPC (Princeton University Physics Competition) Online Exam and Answers Free Download - Entropy and Statistical Mechanics

Past PUPC (Princeton University Physics Competition) exams and answers, free to download

Online competition section

An exclusive first release from Hanlin Academy, striving to publish the most complete materials as quickly as possible. For you, a thousand times over.

2016 PUPC Princeton University Physics Competition

Entropy and Statistical Mechanics, Online Part: full exam free to download

The Online competition paper consists of 4 main problems, each divided into a different number of sub-questions.

The download link for the full version is at the end of this article.

Exam preview:

The subject matter of this document is statistical mechanics: the study of how macroscopic behavior emerges from microscopic interactions in systems with many parts.

Our goal is to provide a unified introduction to statistical mechanics using the concept of entropy. The concept of entropy pervades statistical mechanics and other scientific subjects, and a precise definition is useful in its own right. While many students may have heard the word entropy before, it is rarely explained in full detail or with mathematical rigor, leaving students confused about many of its implications. Moreover, when students learn thermodynamic laws, the laws that describe the macroscopic results of statistical mechanics, such as “change in internal energy = heat flow in + work done on a system”, the concepts of internal energy, heat, and work are all left at the mercy of a student’s vague, intuitive understanding.
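For reference, the verbal statement quoted above is the first law of thermodynamics. In the sign convention the quote itself uses, with heat flowing into the system and work done on the system both counted as positive, it reads:

\[
\Delta U = Q_{\text{in}} + W_{\text{on}},
\]

where \(U\) is the internal energy, \(Q_{\text{in}}\) the heat absorbed by the system, and \(W_{\text{on}}\) the work done on it.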

In this document, we will see that formulating statistical mechanics with a focus on entropy provides a more unified and symmetric understanding of many of the laws of thermodynamics. Indeed, some laws that appear confusing and potentially unrelated at first glance can all be seen to follow from the same treatment of entropy in statistical mechanics.

Entropy as information

At this point in your life, you may have heard the word entropy, but chances are, it was given a vague, non-committal definition. The goal of this section is to introduce a more explicit concept of entropy from an abstract standpoint before considering its experimental and observational signatures.

1.1 Quantifying the amount of information in the answer to a question

Entropy, while useful in physics, also has applications in computer science and information theory. This section will explore the concept of information entropy as an abstract object.

We will first consider entropy not as a concept related to heat in objects, but as a purely axiomatic quantification of what we mean by the information we receive when we hear the answer to a question. For a concrete example, imagine that someone flips a coin and does not reveal which side landed face up, and we ask, “What was the outcome of the coin flip: heads or tails?” When they tell us the answer, how much “information” do we gain by learning the outcome? In other words, we are faced with determining how much information is received when we hear the answer to a question whose outcomes follow a probability distribution. This question was first posed by Claude Shannon in 1948.
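As a hint of where Shannon's question leads (the formula below is his standard definition of self-information, not something stated in this preview, so treat it as a sketch rather than the exam's own derivation): an outcome of probability \(p\) carries \(-\log_2 p\) bits of information. A minimal Python illustration:

```python
import math

def self_information(p: float) -> float:
    """Bits of information gained on learning an outcome of probability p."""
    return -math.log2(p)

# A fair coin flip: each outcome has probability 1/2, so hearing the
# result conveys exactly one bit.
print(self_information(0.5))   # 1.0

# A biased coin with P(heads) = 0.99: the expected answer is barely
# informative, while the surprising one carries much more.
print(self_information(0.99))  # ~0.014 bits
print(self_information(0.01))  # ~6.64 bits
```

Note that the rarer the outcome, the more information its announcement carries, matching the intuition that surprising news is more informative.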

After pondering this question for a long time, you might come up with a few criteria that any reasonable measure must obey, such as:

  1. If the question has two answers that are not dependent on each other, then the measure of information contained in answering both questions should be the same as the sum of the information gained in learning the answer to each one individually.

     For example, if we have two independent coin flip experiments, the information gained in hearing the outcome of one coin flip should be the same as the information gained in hearing the outcome of the other, so that the total information gained is the sum of the individual amounts of information gained. (A worked version of this additivity property appears just after this list.)

     How does this concept generalize to questions which have interdependent answers? Two such questions might be “am I wearing gloves?” and “am I wearing a sweater?”. The exact statement of this criterion is more complicated, and we will not ask you to consider it here.
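To see how a logarithmic measure satisfies this additivity criterion, recall the self-information \(I(p) = -\log_2 p\) sketched above (again, the standard Shannon definition rather than anything derived in this preview): independent answers have a joint probability that factors into a product, and the logarithm turns that product into a sum.

\[
I(p_A\,p_B) = -\log_2(p_A\,p_B) = -\log_2 p_A - \log_2 p_B = I(p_A) + I(p_B).
\]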

The download link for the full exam is available after you register and log in.

The file is in PDF format.

Downloading on a computer is recommended.

 

2016 PUPC Princeton University Physics Competition: full answers, free to download

Please stay tuned; the answers will be updated shortly.

 

For bundled downloads, please contact our assistant to learn more.

Hanlin students can download bundled materials from the whole site for free, with exclusive access to a high-speed download channel.