Reading Quiz
Question 1:
What are some of the reasons that the multiplicity of an
ideal gas is more complicated than the multiplicity of an Einstein
solid?
Answer:
The multiplicity of an Einstein solid depends only on the
energy and the number of atoms (or oscillators, if you prefer). The
multiplicity of an ideal gas depends on energy and number of
particles, but also volume. It's also complicated by the fact that
interacting gases can often expand and contract and even exchange
particles along with exchanging energy.
- The Einstein solid's multiplicity only depends on the number of oscillators and energy. The ideal gas's multiplicity depends on its volume, total energy, and number of particles, and when there is more than one gas they can exchange molecules and energy. A lot more factors.
- The multiplicity of an ideal gas not only depends on the number of particles and total energy available, but also its volume.
- The volume of an ideal gas can change, which changes the multiplicity. Also, there are infinitely many momentum vectors that can correspond to a given energy and hence a seemingly infinite number of microstates with a given energy.
- In an Einstein solid we assume molecules are arranged in a lattice, which allows us to assume f=6, but an ideal gas molecule isn't in a lattice and therefore has many more degrees of freedom. Also, Einstein solids are assumed to be incompressible, and ideal gases are not. Compressing reduces the number of degrees of freedom.
- The multiplicity of an ideal gas depends on its volume, total energy, and number of particles, and gases can expand, contract, and exchange molecules.
- First of all, there are more particles interacting, and the multiplicity depends on the volume as well as energy and # of particles. Furthermore, exchanges can occur between two interacting gases of other forms than energy, such as molecules or volumes.
- Ideal gases are more complicated because their multiplicity depends on their volume as well as the amount of energy.
- Multiplicity for an ideal gas is dependent on volume and molecule exchange.
- The ideal gas multiplicity depends on volume, energy, number of particles and interactions between gases that result in energy exchange. In contrast, the volume of an Einstein solid is very close to constant, and interacting solids are not likely to exchange molecules.
- Multiplicity depends on volume as well as total energy and number of particles.
- First, it is more complicated because the multiplicity of an ideal gas depends on its volume, along with its total energy and number of particles. Also, when two gases interact they can expand or contract or even exchange molecules.
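To make the contrast concrete, here is a short Python sketch (the function name is my own, not from the reading) of the Einstein-solid multiplicity Ω(N, q) = (q + N − 1)! / (q! (N − 1)!). Note that it takes only N and q as inputs; an ideal-gas multiplicity would also need the volume.

```python
from math import comb

def einstein_multiplicity(N, q):
    """Number of ways to distribute q units of energy among N oscillators:
    Omega(N, q) = C(q + N - 1, q)."""
    return comb(q + N - 1, q)

# Three oscillators sharing three units of energy:
print(einstein_multiplicity(3, 3))  # 10
```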
Question 2:
What is the surface area of a 4-dimensional hypersphere?
Answer:
Using eq. (2.39), I find that with d = 4, the area is
2*π^2*r^3.
- Using equation 2.39 with d=4. (2*Pi^2)*r^3
- The surface area of a 4-D hypersphere is ((2*pi^2)/(2))*r^3.
- A = 2*pi^2/1!*r^3 A = 2*pi^2*r^3 for a hypersphere of radius r.
- (2* Pi^2)* r^3
- 2*(pi^2)*(r^3)
- 2*pi^2*r^3 using eq. 2.39, surprisingly simple to me.
- 2*pi^2*r^3/(1!)
- d = 4 ==> surface area = 2 pi ^ 2 / (2 - 1)! * r ^ 3 = 2 pi^2 * r^3
- 2(pi)^2*r^3
- (pi^2)*r^3
- 2*Pi^2 * r^3
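Eq. (2.39) works for any dimension d, so it is easy to check the d = 4 case numerically. Below is a sketch (the function name is my own; the (d/2 − 1)! factor is written with the gamma function so odd dimensions work too):

```python
from math import pi, gamma

def hypersphere_area(d, r):
    """'Surface area' of a d-dimensional hypersphere of radius r:
    A_d(r) = 2 * pi**(d/2) / Gamma(d/2) * r**(d-1),
    where Gamma(d/2) plays the role of (d/2 - 1)! in eq. (2.39)."""
    return 2 * pi ** (d / 2) / gamma(d / 2) * r ** (d - 1)

print(hypersphere_area(2, 1.0))  # circumference of a unit circle: 2*pi
print(hypersphere_area(3, 1.0))  # surface of an ordinary unit sphere: 4*pi
print(hypersphere_area(4, 1.0))  # 2*pi**2, as in the answer above
```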
Question 3:
What is the definition of entropy? How do you
currently think of entropy, conceptually?
Answer:
The definition of entropy is given by eq. (2.45): S = k*lnΩ.
- "Entry is just the logarithm of the number of ways of arranging things in the system" I don't really like thinking of entropy as "order" especially after this section. It seems to be more reasonable to say that entropy measures how likely a certian state is. The higher the entropy the more likely a state is.
- The definition of entropy is the log of the number of ways of arranging things in a system. I think of entropy as the amount of stuff in a system that is no longer able to have reactions with other stuff because it's in its most likely state.
- Entropy is defined as S = k * ln(Omega) and is a measure of how probable a state is.
- Entropy is the logarithm of the multiplicity of a system, but I think of it more as the amount of disorder in a system.
- Entropy is proportional to the logarithm of the multiplicity of a macrostate. The amount of entropy a system has is related to the probability of it being in the state it is in. I like to think of entropy as having something to do with the natural evolution of the universe. It's almost as if there is some kind of invisible 'force' pushing the universe toward a state of higher entropy, and that's what defines the way that things change. You can fight this trend, but it's only a matter of time before you must succumb to its persuasion and follow the inevitable path toward high probability, in which your ultimate fate is determined by the mindless laws of physics, and not whatever beautiful destiny you have so conveniently picked out for yourself. All of your free will and individuality will be wrenched violently from your grasp as you become a chaotic drone whose only purpose is to blend in with the other drones without causing a fuss. But then, applying a concept like entropy to the state of the human mind is a rather tricky thing to do.
- Entropy: S = k ln(Omega) (where Omega = multiplicity). That is, it is a measure of the number of possible microstates of an object. I think of entropy more as the idea of "disorder" in a system, although it's fast changing to the more precise mathematical definition.
- According to the book, Entropy is the logarithm of the number of ways of arranging things in a system. I currently think of entropy as a measurement of how close a system is to thermal equilibrium.
- entropy - a mathematical construction created to easily quantify multiplicities. I think of entropy as a measure of order of a system. If entropy is large, the multiplicity is large, thus we are in a likely state. A likely state is one near equilibrium, which means randomness, and no order.
- S = k ln(omega). I usually think of entropy in terms of motion or freedom of motion. For example, in a chemical synthesis in which A and B form C, the product C has less entropy because there are half as many molecules moving around. The reverse reaction has more entropy because more movement is allowed.
- The log of the number of ways of arranging things in the system. I think entropy is just the concept of systems moving into more and more likely states, and that you can't go back.
- The definition of entropy given in this reading is the logarithm of the number of ways of arranging things in the system. I still currently think of entropy as the level of organization of a substance, which tends to move toward disorder.
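Eq. (2.45) is simple enough to evaluate directly. Here is a minimal Python sketch (function and variable names are my own) that computes S = k ln Ω for the small Einstein solid used as an example above, where Ω = 10:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, in J/K

def entropy(multiplicity):
    """Entropy from eq. (2.45): S = k * ln(Omega)."""
    return k_B * log(multiplicity)

# A tiny Einstein solid: 3 oscillators sharing 3 units of energy,
# so Omega = C(5, 3) = 10. Doubling Omega adds only k*ln(2),
# which is why entropy grows so slowly with multiplicity.
omega = comb(3 + 3 - 1, 3)
print(entropy(omega) / k_B)  # S/k = ln(10), about 2.30
```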
Question 4:
What material from the reading or previous classes would you
like me to go over in more detail?
Answer:
Your responses below.
- I have nothing at this time.
- I don't understand Schroeder's section "Multiplicity of a Monatomic Ideal Gas".
- Just how important is the derivation for the high peak multiplicity result compared to the qualitative results?
- None
- I am having some trouble with the approximations we did in class on Friday.
- The homework was a lot harder than any other homework.
- These concepts seem to be pretty straight-forward.