Topics covered: Applications: chemical and phase equilibria
Instructor/speaker: Moungi Bawendi, Keith Nelson
Lecture 29: Applications: chemical and phase equilibria
The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation, or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.
PROFESSOR: So, the first thing that we did is look at a simple polymer model not too different from polymer models that we've seen before. That is, models for molecular configurations. The difference in the case that we treated last time, though, was that instead of having just a handful of possible configurations with their designated energies, we had essentially an infinite sequence of levels, all with evenly spaced energies. And we went through that and saw what the thermodynamics worked out to be. And also what the high and low temperature limits of the thermodynamics turned out to be. And the interesting consequence of that is that that particular choice of configurations and energies associated with them maps exactly onto the vibrational levels of an ordinary molecule. So, of course, when we do the statistical mechanics in the end, everything depends on the partition function. Right? So we're always writing some molecular partition function. And we've got a sum over our energies for all the states. And so the only input information we need to know is, what are the levels and what are their energies.
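Just to have the standard definitions in front of us (restated here for reference, in the same form we have been using), the molecular partition function and the probability of a molecule being in state i are

$$q = \sum_i e^{-\varepsilon_i / kT}, \qquad p_i = \frac{e^{-\varepsilon_i / kT}}{q}.$$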
So, although we started yesterday with a model of polymer conformations, once we decided what the energies were, which was this sequence of energies starting at zero and going up with E0, and 2E0, and 3E0, and so on, that's all that mattered as far as the statistical mechanics was concerned. We formulated this by imagining different conformations for a polymer, but in fact these are the vibrational energies of any molecule too. Of any vibrational mode. So the results are very generally applicable. And so we saw what the thermodynamics worked out to be. And what the limiting cases were.
And a couple of them, that I'll just review: one that's very important in lots of settings is the low temperature limit, when you're down at basically zero kelvin. That's where you've got kT way down here. So there's not nearly enough thermal energy for anything to get out of even the lowest level. And so in that case, everything is in the ground state. And then the heat capacity, which is often a very simple thing to measure: you change the temperature, and you see that there's no change in the total vibrational energy. In other words, the limiting low temperature heat capacity is zero.
So we saw that for Cv for vibrations. And the limit as T goes to zero was zero. And that's because we were in this limit, where everything is in the ground state. Which is to say that u, the vibrational energy, in that limit is zero. Unlike the case we treated earlier, where we had a limited number of levels, because we treated a simple four-unit polymer with only four possible conformations, in this case we have essentially an infinite number. So even in the high temperature limit, there's not a maximum total energy. If the temperature keeps going up, you'll get more and more energy, because you can keep populating higher and higher levels. But the way that happens stops changing, because the levels are all evenly spaced. So if the temperature, instead of being down here, is somewhere up here, now for sure if we change the temperature, the energy will continue to change. But what we've found is that in the high temperature limit, we had the equipartition of energy result. And that is, u vibrational in the high temperature limit, for N molecules, was just NkT. 1/2 kT for each kinetic energy degree of freedom. 1/2 kT for each potential energy degree of freedom. And since vibrations have each of those, potential and kinetic energy, it's kT for each molecule, for each vibrational degree of freedom.
That's an incredibly simple, useful thing. That means, if I've got a molecule in the gas phase and it's in the high temperature limit for translation, I know each translational degree of freedom will contribute 1/2 kT for each molecule. Or 3/2 kT, in all three dimensions. So I know the translational energy of a mole of molecules in the gas phase at room temperature without doing anything at all. It's 3/2 NkT. Rotations, if it's a diatomic molecule, will be two different degrees of freedom, for rotation in two orthogonal planes. And I'll have 1/2 kT for each, or 1/2 NkT for N molecules, per rotational degree of freedom.
For vibrations, if I'm in the high temperature limit, then it'll be kT for each vibrational mode. For molecules, the vibrational frequencies are usually high, so that you're not in the high temperature limit. But for materials where you have vibrations, acoustic vibrations, those are low frequency. You can be in the high temperature limit. And easily are. So the result for the vibrational energy in the high temperature limit was NkT. And so that means that the heat capacity in that same limit at high temperature was just Nk, the derivative with respect to temperature. Very simple thing to measure, again. So that's what we saw last time in the case of both the conformational model that we treated and the vibrational energies of molecules onto which that same model maps.
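To put that bookkeeping in one place, here is the high temperature tally for a diatomic gas (a worked example, with the caveat just mentioned that the vibrational piece only counts if the vibration really is in its high temperature limit, which it usually is not at room temperature):

$$\frac{U}{N} \approx \underbrace{\tfrac{3}{2}kT}_{\text{translation}} + \underbrace{kT}_{\text{rotation, two axes}} + \underbrace{kT}_{\text{vibration, KE + PE}} = \tfrac{7}{2}kT, \qquad C_V \approx \tfrac{7}{2}Nk.$$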
Now what I want to do is look a little further. And let me actually write the partition functions too. Also an important result was that the partition function itself, q vibrational, in the high temperature limit, was just kT over E0. Also a very simple result. E0 is h nu 0 for a vibrational mode with frequency nu 0. And of course, given the partition function you can calculate everything. So in the high temperature limit we can easily calculate things. In the low temperature limit, it's just one, right? There's only one accessible state in the low temperature limit. Everything is in the ground state when it's cold enough. So the sum over states just gives us exactly one term of any reasonable size. That is, the term for the lowest energy. For everything else, the energy is much bigger than kT, so this is a vanishingly small number.
So we just write q vibrational, in the low temperature limit, as one. OK, that was the case for vibration. And remember, these low temperature limiting cases, that's a common case for nearly any degree of freedom, as long as it's quantized. At some point, you'll get this situation, where the temperature is low compared to even the lowest excited level. Of course, what'll happen in the high temperature limit might vary, depending on the structure of the energy levels. In the case of vibrations, it's like this, and it'll turn out rotations do the same thing. But it's not always as simple as this.
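As a quick numerical check of those two limits, here is a minimal sketch (not part of the lecture; the level spacing E0 below is an arbitrary illustrative value):

```python
import numpy as np

k = 1.380649e-23   # Boltzmann constant, J/K
E0 = 4.0e-21       # level spacing, J (arbitrary illustrative value)

def q_vib(T):
    """Partition function for evenly spaced levels 0, E0, 2E0, ... (a geometric series)."""
    return 1.0 / (1.0 - np.exp(-E0 / (k * T)))

for T in (1.0, 1.0e5):                  # a very low and a very high temperature
    print(T, q_vib(T), k * T / E0)      # q_vib -> 1 at low T, and -> kT/E0 at high T
```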
OK, now what I want to do is just go through the next step of what we can treat, given the statistical mechanics that we know. Which is chemical equilibrium. Why can't we just calculate equilibrium constants based on what we've seen so far? If we can calculate the partition functions for molecules, and they undergo reactions and they're in equilibrium, we should be able to calculate the equilibrium constants. From first principles, just based on the statistical mechanics we've seen so far. So let's try to do that.
So it just means that, again, just as always, if we know all the energy levels and what all the possible states are, there's no reason we shouldn't be able to set our sights on a calculation of that sort.
So let's just sketch out what the levels are going to look like for simple chemical events. So I just want to draw an energy diagram, here's energy. And, like we often do with chemical equilibria, I'm going to set the zero of energy at the separated atoms, and I'm imagining I've got reactants and products. So let's make this our products. Then over here we'll have our reactants and make the energies different. Alright. So there's some amount of binding energy, right? There's a bond dissociation energy going from the lowest available level in each molecule to the dissociation limit where we pulled the atoms apart. So now I'm going to draw vibrational energy levels inside the molecule. Let's imagine, it wouldn't need to be this, but let's imagine it's just diatomic molecules. So there's one vibrational mode in each. Just the stretching mode. And we've already seen the levels are evenly spaced. So there's going to be a bunch of evenly spaced levels. Actually, they stop being quite evenly spaced once this stops being a simple harmonic oscillator, but for all the low-lying levels it's pretty close to that.
So those are the available levels. And there's a dissociation energy. So minus E, and I'll call this, for the products, Ep. And it's just minus D0 for the products. Right, in terminology that we've seen before. And it's the same thing for the reactants. And really, if this were more than a diatomic molecule, maybe there would be a bunch of vibrational modes. But it wouldn't matter. This would just represent all the vibrational energies. There's some lowest state available.
So now we've got the dissociation energy for the reactants. So those are the energies it takes to separate the molecule. But those aren't the only states available to molecules. Of course, they could have extra vibrational energy. They're not always in their ground states. And we're going to calculate partition functions. In general we would sum over all the available states. And then just calculate the probability of being in the state.
One way to look at this: if there's chemical equilibrium between the species, it means the molecules can interconvert. What that really means is that any of the molecules has access to any of these states. Of course, we like to group all these states over here, because they correspond to a particular chemical structure. And we like to group these states over here, because they correspond to this different chemical structure. From a statistical mechanics point of view, it's just states and levels. And we could just calculate the probability of any molecule being in any one of the states. But it's useful to keep them separated like this. So let's do that. And let's look at the difference here, the difference in dissociation energy, delta D0. Let's not put a double arrow; let's define the sign of it, this way, going from products to reactants.
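In symbols, and using the sign convention that is consistent with how this quantity shows up at the end of the calculation (product dissociation energies minus reactant dissociation energies):

$$\Delta D_0 \equiv \sum_{\text{products}} D_0 \;-\; \sum_{\text{reactants}} D_0.$$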
OK, now let's just look at how equilibrium should work. So now, let's just take a generic reaction: little a, our stoichiometric number, times A, plus b B, goes to c C plus d D. And we know that delta G0 is minus RT log of Kp. And that delta G0 is just the free energy of the products minus the free energy of the reactants. So it's c times G C0, plus d G D0, minus a G A0, minus b G B0.
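Written out, the two relations being used here are

$$\Delta G^\circ = -RT \ln K_p, \qquad \Delta G^\circ = c\,G_C^\circ + d\,G_D^\circ - a\,G_A^\circ - b\,G_B^\circ.$$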
So we need to know the free energy of each one of the species. And now we know how to calculate that from first principles, through statistical mechanics. So we know that G is A plus pV. A is minus kT log of capital Q. And I'm going to assume that we're in the gas phase. And it's an ideal gas, so I'm going to replace pV by NkT. And what's Q? Well, we know what Q is. It's little q to the N power, translational, over N factorial. For atoms, this is all it would be. But we have molecules. So we have other, internal degrees of freedom besides translation. So let me just label those internal, q internal. And that's also to the N power. And q internal, if you say, what are those degrees of freedom, well, it's the electronic energy. It's the vibrational energy. It's the rotational energy. And we'll deal with those shortly, but let's just separate it this way for now. Just so we can keep track of where the N factorial belongs.
And now we're going to use Stirling's approximation for the N factorial. So our log of Q, which we need up here, is just N log of q trans q internal minus log of N factorial. And that's just equal to N log of q trans q internal, minus N log N, plus N. And of course, this I'm going to put down here momentarily. And now our expression for G is up there. So it's minus kT log of Q plus NkT. So these factors of N are going to cancel. When I take minus kT log of Q, that's going to give minus NkT over here, and that's going to cancel the plus NkT here. So all I'm going to have left is this term and this term, which I can combine. So it's just minus NkT log of q translational q internal over N. And this product is really just the same as q, so it's just minus NkT log of little q over N. I didn't need to separate that into this product. I only wanted to do it to be clear where we were getting this factor of N factorial from.
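Collecting those steps in one place:

$$G = A + pV = -kT\ln Q + NkT, \qquad Q = \frac{q_{\mathrm{trans}}^N\,q_{\mathrm{int}}^N}{N!} = \frac{q^N}{N!},$$

$$\ln Q = N\ln q - N\ln N + N \quad\Rightarrow\quad G = -NkT\ln q + NkT\ln N - NkT + NkT = -NkT\ln\frac{q}{N}.$$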
So now we have an expression for G. And if we know G for all the species involved in a chemical reaction, we should be able to calculate the chemical equilibrium. So let's do it. So now to do it, let's look a little more closely at what these internal degrees of freedom are. Because that's where the important details are going to come in. Obviously the equilibrium is going to depend on the energetics. How different are the bonding energies or the dissociation energies of the molecules involved. And also, how different are the other molecular energy levels. The vibrations, rotations, and so forth.
So, q internal is the product of rotational, vibrational, and electronic partition functions. Remember how we saw that if you can write the energy as a sum of energies, then the partition functions are multiplied. Because, of course, if this is the sum of a rotational plus vibrational plus electronic energy, then of course I can just separate out these things. These are in the exponent. I can write it as a product. Same with translations. I already have separated that.
So now let's look at how these things behave. Well, for the electronic case, there's really only one electronic state of interest in general. And that's the lowest state. In cases you're extremely familiar with, if it were the hydrogen atom, it would be down in the 1s orbital. And it would take a huge amount of energy to get up into the 2s or 2p orbital. At ordinary temperatures you never have that. All the atoms would be in the ground state. Now, for most molecules, it doesn't take as much energy as for the hydrogen atom. If you have benzene, for example, the ground electronic state, the lowest electronic state, is quite far below the first excited state. Not as much as the hydrogen atom going from 1s to 2s or 2p, but still by much more than ordinary thermal energies at room temperature.
And what that means is, again, only the ground state term is going to matter. Now, sometimes we would define this as the zero of energy. But since we're doing chemical equilibria, the zero of energy is up here. So that epsilon, that energy term, is this amount: minus the dissociation energy. So we're going to have a positive number in the exponent. So q electronic is just e to the D0 over kT. So that's easy enough.
We also know what the vibrational partition function is. We saw it last time. It's this thing we could write in a simple form: one over one minus e to the minus E vib over kT, where E vib is h nu 0 and nu 0 is the vibrational frequency. Again, in the case of many molecules, the vibrational energy is pretty high compared to kT. So even this often simplifies to just one. In other words, in many cases basically all of the molecules are in the ground vibrational state at room temperature. Certainly if you look at molecules with high frequencies. If you look at the hydrogen molecule, H2, the vibrational frequency is about 4,000 wave numbers. Remember I mentioned last time, kT at room temperature, T is 300 kelvin, corresponds to about 200 wave numbers. That's a factor of 20 lower. Much less than 4,000 wave numbers. In other words, kT is much lower than the vibrational energy. So, again, everything would be in the lowest level. And this just simplifies to one. In many cases, it's reasonably close to one, for high vibrational frequencies nu 0.
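As a numerical illustration of that point, here is a minimal sketch using the round numbers quoted above (roughly 4,000 wave numbers for the H2 stretch and a 300 K temperature; only the order of magnitude matters):

```python
import numpy as np

h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e10     # speed of light, cm/s (to work in wave numbers)
k = 1.380649e-23      # Boltzmann constant, J/K

T = 300.0             # room temperature, K
nu_cm = 4000.0        # H2 stretch, roughly, in wave numbers

kT_cm = k * T / (h * c)                       # kT in wave numbers, about 208 cm^-1
q_vib = 1.0 / (1.0 - np.exp(-nu_cm / kT_cm))  # vibrational partition function

print(kT_cm)   # ~2.1e2
print(q_vib)   # ~1.000000005 -- essentially every molecule is in the ground vibrational state
```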
Now, we haven't talked about rotation. But you probably have just an intuitive feeling that at ordinary temperatures, if I do this, if I wave my hand in the air, molecules that I happen to intersect are going to start spinning faster. In other words, ordinary thermal energies do populate some number of rotational levels of molecules. Molecules probably aren't going to start getting squished together and vibrate harder when I do something like this. So molecules might generally still be in their ground vibrational levels; thermal energy isn't enough to excite them in many cases. But rotation, for sure. They're not all in the lowest level.
Turns out that, in fact, the energy separation between rotational levels is very small compared to kT at room temperature. So just like we saw for the high temperature limit for vibrations, it turns out that for q rotation in the high temperature limit, we have the same situation where we have kT over, now it's a rotational energy. And it's typically on the order of about one to ten wave numbers for small to medium sized molecules.
Remember, too, last time when we talked about the vibrational levels and said, well, if kT is this big, it's basically telling you roughly how many levels you have thermal access to. Because we saw that result for vibrations. It's not so different for rotation. And it turns out that at ordinary temperatures, you might have access to a few tens to a few hundreds of rotational levels, depending on whether they're closer to one or ten wave numbers or a little higher. Again, kT at room temperature is 200 wave numbers. So, that means that, remember, q is a unitless number. You're counting the states, weighted by the Boltzmann factor.
And for rotations, it's a number that might be on the order of 100 or so. An order of magnitude estimate, that's all. And of course we've seen the translational partition function is on the order of 10 to the 30th, right? An enormous number. In other words, in the simple lattice model that we've used to describe translation, just breaking up the available volume into little pieces and counting, well, OK, you have something on the order of 10 to the 30th possible locations. And if you treat the translations properly, quantum mechanically, the magnitude of the number is actually similar.
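Here is a rough order-of-magnitude check of both numbers (a sketch, not from the lecture; the 2 wave number rotational constant, the N2-like mass, and the 1 liter volume are all illustrative choices):

```python
import numpy as np

h = 6.62607015e-34     # J*s
c = 2.99792458e10      # cm/s
k = 1.380649e-23       # J/K
amu = 1.66053907e-27   # kg

T = 300.0
kT = k * T

# Rotational partition function, high temperature estimate: q_rot ~ kT / (rotational spacing)
B_cm = 2.0                       # typical small-molecule rotational constant, cm^-1 (illustrative)
print(kT / (h * c * B_cm))       # ~1e2: on the order of a hundred accessible rotational levels

# Translational partition function: q_trans = V / Lambda^3, Lambda = thermal de Broglie wavelength
m = 28.0 * amu                   # an N2-like mass (illustrative)
V = 1.0e-3                       # 1 liter, in m^3
Lam = h / np.sqrt(2.0 * np.pi * m * kT)
print(V / Lam**3)                # ~1e29 to 1e30, the enormous number quoted above
```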
Now I just want to go through a very simple example. Just treating a particular generic reaction. And look at what the equilibrium constant is. Working it through, given what we've seen so far. So I just want to simplify it by having all the stoichiometric coefficients equal to one. It's not a great complication, if we don't do that. But it's a little bit simpler. So let's take a molecule that's A-B plus C-D. And now let's break bonds and reform them. So we get A-C plus B-D, right? So we're going to do something simple. And since I've got all the stoichiometric coefficients equal to one, we've got the same number of molecules as reactants and products, that'll make things a little bit easier. Just in the sense that if we look at the contribution of the translational energies at room temperature, they're going to be the same. For the reactants and the products. Any of the molecules is going to have a translational partition function on the order of 10 to the 30th, at room temperature and an ordinary volume. And nothing's going to change from reactants to products.
And for vibrations, let's assume, as is often the case, that the vibrational frequencies are fairly high. So all the partition functions for the vibrations are equal to one. Like we've got written up there. For all four of the molecules. So then nothing is going to change in the vibrations. Going from reactants to products.
Finally, the rotations. So, for the rotations our partition function is something on the order of, I think I meant to write 100, not ten here. As our order of magnitude. That's what it's going to be. It's a number on that order. And if we assume that the masses of the atoms involved are comparable, then we can cheat a little bit and say that's also, those numbers are also, going to be comparable for the reactants and the products. We don't have to do that. We could put it in, and rather easily calculate it. But I'm going to make life simple in this way and just work through how to carry through the calculation. And I think it'll be clear how to put in different partition functions for those quantities if they are different from each other.
OK, so let's try it. Then, our delta G0, let's go back over here. Here's delta G, let me just put the relationship up. So we've got an expression for G. Which is, let's start from back there. It's minus, for each one of the substances, Ni kT log of qi over Ni. I'm just going to put this in molar terms. So it's minus little ni RT log of qi over capital Ni. And I only want to do that because I want to express this as a free energy per mole for substance i. So it's just minus RT log of qi over Ni.
That's G per mole. And that's Gi0 per mole, plus RT log of pi over p0, where p0 is an atmosphere, basically. One bar. So that means Gi0 is minus RT times, and I'm going to take this minus this and combine them, so it's log of qi over Ni, that's this part, and now I've got log of pi over p0, and I'm just going to use the ideal gas law. So I'm going to use pV is NkT. So then I've got Ni kT over p0 times the volume.
So that's Gi0 for each one of the substances. The Ni's cancel, and I've got minus RT log of qi kT over p0 times V. Now, qi is just the total molecular partition function. It's this product of q trans times q rotational times q vibrational times q electronic. And now I need delta G. So delta G0 is just minus RT times a log, where I'm going to take this for each one of the substances, for the products, minus the reactants, and I'm going to combine the log terms. This is the same, and I'm going to have this in every term.
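Putting those last few steps together for each species i (per mole):

$$G_i = -RT\ln\frac{q_i}{N_i} = G_i^\circ + RT\ln\frac{p_i}{p^\circ}, \qquad p_i V = N_i kT \quad\Rightarrow\quad G_i^\circ = -RT\ln\!\left(\frac{q_i\,kT}{p^\circ V}\right).$$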
So what's going to happen? Well, all this stuff is going to cancel. Now, that doesn't necessarily happen. That happens because the stoichiometric numbers are the same. Otherwise I'd have to take them to the power of the stoichiometric number. But in this case they're just going to simply cancel. So what I'm going to have left is the ratio of my partition functions: the partition function for molecule AC times the partition function for molecule BD, over the partition function for molecule AB times the partition function for molecule CD. Products over reactants. Pretty simple, because I know how to calculate all this stuff. And again I'd suggested some simplifying assumptions for what those partition functions are. But it wouldn't be hard to calculate each one of them if I just had all the relevant energy levels.
To make this simple, we're going to assume that the rotational energies are the same for all the molecules. They wouldn't need to be, and it wouldn't be hard to plug in different values. We're going to assume we're in the low temperature limit for vibrations. So q vibrational for each one of these things is equal to one. The translational contributions are equal for all of them. So the only thing that's different is the electronic contribution. And that's different because, of course, you have different electronic energies. The binding energies for the products and the reactants aren't in general going to be equal. And in most cases, it is the energetics that dominate the equilibrium constants. That is, it's the electronic energies that usually dominate. Of course, the other things do matter. But it's not unusual for the electronic energies to dominate.
So here's delta G0. Of course, this is minus RT log of Kp. So there's our expression for our equilibrium constant. So it's just this product. So let's just put in the partition functions for the electronic part. It's e to the D0 over kT for molecule AC, times e to the D0 for molecule BD over kT, divided by e to the D0 for AB over kT, times e to the D0 for molecule CD over kT. So the whole thing is just e to the quantity D0 AC plus D0 BD minus D0 AB minus D0 CD, over kT. Or in other words, it's e to the delta D0 over kT.
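Here is a minimal numerical sketch of that final expression (the dissociation energies below are hypothetical values chosen only to show the arithmetic; they are not data for real molecules):

```python
import numpy as np

k = 1.380649e-23                         # Boltzmann constant, J/K
hc = 6.62607015e-34 * 2.99792458e10      # J*cm, converts wave numbers to joules
T = 300.0

# Hypothetical bond dissociation energies, in wave numbers (illustrative only)
D0 = {"AB": 30000.0, "CD": 28000.0, "AC": 31000.0, "BD": 27500.0}

delta_D0 = (D0["AC"] + D0["BD"]) - (D0["AB"] + D0["CD"])   # products minus reactants
Kp = np.exp(delta_D0 * hc / (k * T))

print(delta_D0)   # +500 cm^-1: the products are more strongly bound overall
print(Kp)         # about 11 for these numbers, so the equilibrium favors the products
```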
That's the whole story. So it's a super-simple calculation to execute. And for lots of molecules, we know the dissociation energies. These are measured, determined, either thermochemically or spectroscopically for a great many molecules. So in fact, we have the information we need to do that calculation. And if we want to worry about the vibrational and rotational levels, typically we have that information, too, spectroscopically. So we know what the relevant energy levels are to do the whole calculation.
So what it means is, we can calculate equilibrium constants for ordinary chemical reactions just from first principles. Knowing the energy levels of the available states of the molecules. Any questions? In your notes I've included another example. It's a little bit of an extra example. You can go through it if you'd like. It turns out to have a kind of attractive closed form solution. Which is the case of a chemical reaction like an isomerization inside a crystalline solid. So you can imagine you've got a crystal of some molecule, or many crystals, and thermally, over time, they might decompose. They might have reaction processes that can take place. So I've imagined you've got unimolecular reactions that can occur. And in that case you start with a pure crystal of one species. And you end up with a mixture of some number of the original species, and some number of new species. And there's an equilibrium constant that'll say where that should be.
So again, it'll depend on the energies. On the dissociation energies of the different species. And there's also a kind of mixing term, because effectively now you've got the different species located at different places in the crystal. Those are distinguishable states, in the solid. And you need to count those. But the one other example I want to work through is a little bit different from chemical equilibria. It's phase equilibria. Why shouldn't we be able to calculate phase diagrams? So in the earlier part of the course, you went through and saw the macroscopic thermodynamic treatment of equilibrium constants and chemical equilibria. So of course you saw the whole delta G0 is minus RT log Kp and so forth. And you've seen now we can actually calculate all that from first principles. What about phase equilibria? What you saw before, we just drew phase diagrams. And they had boiling points or lines of boiling. Or in other words, liquid-solid and liquid-gas and solid-gas equilibria. And what we did is, given where those lines fell, you could calculate things. But where do the lines fall? What's the boiling point or the sublimation temperature of some material at a particular pressure? Well, we didn't offer any prescription for calculating that. You had to take that from measurement, and then given that, you could use the Clausius-Clapeyron equation and so forth and look at the way things behaved if you moved along a line in a phase diagram. But there was no prescription offered to calculate where exactly that line would be on the phase diagram.
But again, if you know all the energies of the possible states, in the solid, in the liquid and the gas, statistical mechanics shows us that we can calculate the equilibrium between those. Which is to say we know we can calculate where those lines belong. So I want to just go through maybe one of a couple of relatively simple cases. I want to go through the case of a solid-solid equilibrium. Let's imagine we have two solid phases and an equilibrium between them. And this problem was on the problem set. How many of you did it? Everybody did it. Did you all get to the end of it? Who got to the end of it? Some of you got to the end of it. If you did, congratulations, it's not so trivial to work on for the first time. So I'll just briefly go through that problem. And show how it works.
So, it's similar, of course, to the case that we're doing now. To the case we just did, of chemical equilibria. The difference, though, is that it's cooperative. The solid, at least in the way I formulated that problem, you either have the crystal in one phase, phase alpha, or in phase beta. And there's nothing in between. There are lots of systems like that. Not just crystals; different forms of DNA act like that, where you have supercoiled DNA and regular DNA. And you actually can have some of the DNA in each. But it's actually very cooperative. So there's a huge tendency to either be all in one, or be all in the other. Or at least very, very large pieces of it like that. Lots of other systems act like that. In other words, unlike the case where we're thinking about chemical equilibrium among molecules in the gas phase, where these two molecules over here crash into each other and they react, and nothing else cares. The rest of the whole mole of molecules does whatever it was doing. And a little later some other pair of molecules happen to crash into each other and maybe react or maybe don't.
In other words, there's no cooperativity at all. Whatever happens to some pair of reactants and products, everything else is completely independent of it. That's not the case when you have, in many cases, a phase transition. You can see it by eye, often. If you do something like supercool water. This is a kind of fun experiment. If you take dry ice, and you've got a little bit of water, or a little pot of water, and you put some dry ice into it, you can actually get the water to be below the freezing temperature and it won't freeze yet. Then you drop a crystal in there and boom, it all freezes. Very cooperatively. Lots of phase transitions behave like that.
So let's just see how it works. In this case, it's really similar to what we just did. We can treat it, here's phase alpha. Here's phase beta. And the crystal, the interactions between molecules or the atoms in the crystal are different in the two phases. So effectively, the binding energy is different. In other words, if you say, now let's think of the energy it would take to evaporate all the atoms or molecules and let them loose in the gas phase. That's the analog of dissociation for the molecule. You're pulling everything apart from the crystal and separating them all. So that's what we'll call these energies. This'll be minus E alpha. And this will be minus E beta. And then there are vibrational energies. So the lattice has a bunch of vibrational energies. We can assume they're evenly spaced. And they're going to be different. They're not the same in the two different crystalline forms. That's typically the case. You have a phase transition from one crystal to another. The lattice vibrational frequencies aren't the same any more. Speed of sound is different. Things are different. And that's all. That's the only thing that's different. So these are binding energies per atom.
So now, let's just try to calculate Q. And what we are supposed to get out of this is, what's the phase transition temperature? That's the thing that we didn't have any prescription for calculating before. We just said, here it is. Here's the boiling point. Here's the melting point, or whatever the phase transition was. Or if we drew a phase diagram, here's the line. But now we're going to try to calculate where that is. Or what those temperatures are. So, Q for either phase is just e to the E over kT, just like we saw before for the dissociation energies of molecules, to the Nth power. No N factorial any more. There's no translation. Times the vibrational part. This is the electronic partition function, and then there's the vibrational part. And that's one over one minus e to the minus h nu E over kT. For the lattice I'm assuming it's just one frequency, nu E, so this energy is h nu E. And of course, it's different for the alpha and beta phases. And I'm going to assume we're in the high temperature limit. Which is often the case for lattice vibrations. So it's e to the little E over kT, for our electronic part, to the Nth power. And we've seen what the high temperature limit of the vibrational part is. It's kT over h nu E. And that's to the 3N power. I should have written this here.
In other words, if you say, how many vibrations are there in the lattice, well, if there are N atoms, each atom in the gas phase would have three degrees of freedom, translational degrees of freedom. In the crystal, it can't freely translate. But those degrees of freedom are still there. The atoms can all move. So now, those are the lattice vibrations. When the atoms try to move, they vibrate against each other. So how many different modes are there? They may be degenerate. They may all have the same energy. But those modes are all still there. Actually, they're the acoustic modes of the crystal. And there are 3N of them. One for each translational degree of freedom that each atom had.
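So, written out, the partition function of either phase, with N atoms, a binding energy E per atom, a single lattice frequency nu E, and the high temperature limit for the vibrations, is

$$Q = \left(e^{E/kT}\right)^{N}\left(\frac{1}{1 - e^{-h\nu_E/kT}}\right)^{3N} \;\approx\; \left(e^{E/kT}\right)^{N}\left(\frac{kT}{h\nu_E}\right)^{3N}.$$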
OK, so that's it. A is minus kT log of Q, so it's just minus NE, the kT cancels in the first part, minus 3NkT log of kT over h nu E for the second part. And then we can calculate the chemical potential. It's just dA/dN at constant T and V. So the N's go away: it's just minus little E, minus 3kT log of kT over h nu E. Well, basically we just finished. At the phase transition temperature, the chemical potentials of the two phases have to be equal. And that's all there is to it. So, at, we'll call it Tc, the phase transition temperature, I guess I called it T1 here, mu alpha is equal to mu beta. So this stuff for alpha is equal to this for beta. That's it. So what does it say? It says E beta minus E alpha, these terms, equals 3kT1 log of nu E beta over nu E alpha. Solve for T1. That's it. It's E beta minus E alpha over 3k log of nu E beta over nu E alpha. There is our phase transition temperature. That's the whole story. So if we know the electronic energy that binds the crystal together, usually something that can be measured easily, so it's known for many materials, and if we know the lattice vibrational frequency, also something that's pretty routinely measured, we're done. So again, from a very simple first principles approach, we can calculate that phase transition temperature. We could do it for a solid-gas equilibrium too, if we said, OK, let's think of sublimation of the solid. And now we know how to calculate the chemical potential in the gas phase. We can have those be equal. That's actually worked through in your notes. Again, it's an extra problem. If you would like to take a look at it, it's a straightforward calculation. The point is, we can calculate those phase equilibria. Liquids, I will say, are harder. Just because it's harder to define and know all the energies available. Doable, approximately. But the point is, we can actually now locate all those lines that we sort of arbitrarily drew on the phase diagrams. And make sense of the temperatures, and the pressures, for that matter, where they occur. Any questions?
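And here is a closing numerical sketch of that last formula (the binding energy difference and the two lattice frequencies below are made-up values, chosen only to illustrate how the calculation goes):

```python
import numpy as np

k = 1.380649e-23    # Boltzmann constant, J/K

# Hypothetical inputs, illustrative only
dE = 1.0e-21        # E_beta - E_alpha, binding energy difference per atom, J
nu_alpha = 2.0e12   # lattice frequency of phase alpha, Hz
nu_beta = 3.0e12    # lattice frequency of phase beta, Hz

# mu_alpha = mu_beta  =>  T1 = (E_beta - E_alpha) / (3 k ln(nu_beta / nu_alpha))
T1 = dE / (3.0 * k * np.log(nu_beta / nu_alpha))
print(T1)           # roughly 60 K for these made-up numbers
```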