It is good to know how fast different functions grow. Professor Strang puts them in order from slow to fast:
logarithm of x, powers of x, exponentials of x, x factorial, x to the x power. What is even faster?
And it is good to know how graphs can show the key numbers in the growth rate of a function.
A LOG-LOG graph plots log y against log x. If y = A x^n, then log y = log A + n log x: a LINE WITH SLOPE n.
A SEMILOG graph plots log y against x. If y = A 10^(cx), then log y = log A + cx: a LINE WITH SLOPE c.
You will never see y = 0 on these graphs because log 0 is minus infinity. But n and c jump out clearly.
Professor Strang's Calculus textbook (1st edition, 1991) is freely available online.
Subtitles are provided through the generous assistance of Jimmy Ren.
Growth Rate and Log Graphs
Related Resources
Lecture summary and Practice problems (PDF)
PROFESSOR: OK. Hi. I thought I'd give a short lecture about how logarithms are actually used. So a little bit practical. And also, it naturally comes in, how quickly do functions grow? Which functions grow faster than others?
And I made a list of a bunch of functions that we see all the time. Linear growth. Just, the function goes up along the straight line. Proportional to x, linear could have been a c times x, still linear. Here that's called polynomial growth, like some power of x.
Here is faster growth. We introduced e to the x, and I'll take this chance to bring in 2 to the x and 10 to the x. Especially 10 to the x, because that'll lead us to logarithms to base 10, and those are handy in practice. So that's exponential growth.
And here are some that grow faster still. x factorial, n factorial grows really fast. And n to the nth or x to the xth is a function that grows still faster. And of course, we could cook up a function that grew faster than that. x to the x to the x power would really just take off. And we could find functions that grow more slowly.
But let's just take these and let x be 1000. Just to have a kind of realistic idea of how these compare when x is 1000. OK. So I'm skipping the c. So x will be 1000 -- 10 cubed. Let me just write it as 10 cubed. And because these are big numbers, I'm going to write them as powers of 10.
OK. so how about 1000 squared? 10 cubed squared will be 10 to the sixth. 1000 cubed, we're up to 10 to the ninth. And onwards. Like, this is where the economists are working. The national debt is in this range.
OK. Now fortunately, it's not in this range. 2 to the thousandth power. And if I want to be able to compare it, I'll write that approximately as 10 to-- well, if it's 2 to the thousandth power, it'll be 10 to a smaller power. And 300 is pretty close for 2 to the thousandth. Then e to the thousandth, that's going to be bigger, because e is 2.7 et cetera. This is more like 10 to the-- I think this is right-- about 434, maybe. And 10 to the thousandth-- well, I can write that right in. 10 to the thousandth when x is 1000. OK. So that's the one that is exactly right.
And also, I could write in 1000 to the thousandth power. What power of 10 will this be? 10 to the what? 1000 to the thousandth power, I think, is 10 to the three thousandth. Why do I think that? Because 1000 itself is 10 times 10 times 10. Three of them, right? And then we do that 1000 times, so we have a string of 3000 10s multiplying each other. And that's what 10 to the three thousandth is.
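Here is a minimal sketch, assuming Python and only its standard math module, that checks these exponents without ever forming the huge numbers themselves, using log10 of b^x = x times log10 of b:

```python
import math

# log10 of 2^1000, e^1000, and 1000^1000, computed without ever
# building the huge numbers: log10(b**x) = x * log10(b).
print(1000 * math.log10(2))        # about 301, so 2^1000 is near 10^300
print(1000 * math.log10(math.e))   # about 434, so e^1000 is near 10^434
print(1000 * math.log10(1000))     # exactly 3000, so 1000^1000 = 10^3000
```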
And you might wonder about a thousand factorial. Let me make the rough estimate. A big number's factorial, order of magnitude, doesn't grow as fast as x to the x, because the factorial is x times x minus 1 times x minus 2: 1000 times 999 times 998. So we're not repeating 1000 every time.
And the difference-- it turns out that this number divided by this number, x to the x over e to the x, is the right general picture for the factorial. So that would be, if I divide 10 to the 3000 by 10 to this power, what do I do? In a division, I do a subtraction of exponents, because I have that many fewer 10s multiplying each other. So I think it would be 3000, but I don't want the full 3000, because I take away the e to the thousandth -- 434 of them. So that's about 2566 -- close enough, anyway.
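As a rough check, again assuming Python, the exact log10 of 1000 factorial can be computed as a sum of logs and compared with the x^x over e^x estimate from the lecture:

```python
import math

# log10(1000!) computed exactly as a sum of logs, compared with the
# rough x^x / e^x estimate.
exact = sum(math.log10(k) for k in range(1, 1001))
estimate = 1000 * math.log10(1000) - 1000 * math.log10(math.e)
print(exact)     # about 2567.6, so 1000! is near 10^2568
print(estimate)  # about 2565.7, close to the 2566 quoted above
```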
OK. Giant numbers. Giant numbers. And of course you saw that I didn't write it out with 1 and 3000, or whatever, zeros. Hopeless. OK.
In other words, it's the exponent that gives me something I can really work with. And the exponent is the logarithm. That's what logarithms are. They are the exponents. And when they're the exponent with a 10, I call 10 the base. And I'm speaking about logarithms to the base 10. Can I just copy those numbers again? And then I want to write their logarithms. Because it's the logarithms that kind of remain reasonable-looking numbers but tell you very nicely what's growing fast.
So let me write out again. 10 cubed, 10 sixth, 10 to the ninth is polynomial growth starting with the first power. Then I'll write down 10 to the three hundredth, approximately. 10 to the 434, I think, is about right. And then 10 to the 1000. And then I had 10 to the 2566 as something, roughly 1000 factorial, and then 10 to the 3000.
OK. I just copied those numbers again. And now I plan to take their logarithms. I can see what's happening with logarithms. The logarithm of 10 to the ninth is-- if the base is 10-- the logarithm of 10 to the ninth is the nine. This has logarithm 6. This has logarithm 3.
So you see-- well. If we took the logarithm of the national debt, it wouldn't look too serious. It would just be up around 9 moving toward 10. But what I'm using it for here is to get some reasonable way to see-- 300. Of course, that's big. For a logarithm, that's a very big number. 434, 1000. These are climbing up. 2566 and 3000. OK.
So these are the logs. Just to repeat. If I wanted to order this list of functions by how fast they grow, where would log x appear in my list of functions? It would be way at the left end. Slower than x. Much slower than x. Log x grows very slowly, as we see here.
And then if you wanted one that really grew slowly, it would be log of log x. That creeps along. Eventually gets to-- passes any number. But x has to be enormous.
And one more little comment before I begin to use some things graphically. Because that's the other part of this talk, is log-- the graphs. Using logarithms in graphs. A little point. You might ask, what about functions that decay? What would be the corresponding functions here that decay? Let me write them here.
Decay. By that I mean, headed for 0 instead of headed for infinity. Well, 1 over x, 1 over x squared, 1 over x cubed. Those functions go to 0 faster and faster.
Now, what about these? The next list would be 1 over-- I'm dividing, but 1 over 2 to the x. 1 over e to the x. Can I write that in a better way? e to the minus x. 1 over 10 to the x. Those are going to 0 like crazy. And of course, if I keep going, even worse. So like, x to the minus x power would be really small.
So my point is just that we have a scale here that not only gives us a handle on how to deal with things that are growing very fast, but also things that are going to 0 very fast. The other end: the negative logarithms. The logarithms of these things, when x is 1000, would be minus 3, minus 6, minus 9 and so on. Good.
All right. So that suggests the idea. Now I want to introduce the idea of a log scale. So I'm just going to think of a usual straight line, on which we usually mark out 0, 1, 2, 3, minus 1, minus 2. But on this log scale, the center point, the 0, I'm really graphing the logarithm of x instead of x. That's the point. That in this log scale, what I'm picturing along here will be-- this number will be 10 to the 0 power, which is 1. The next one will be 10. The next one will be 100. The next one will be 1000.
So you see, within this picture-- on a graph that we could draw and look at on a printed page-- we can get big numbers by going from the ordinary 1, 2, 3 scale to the log scale, which puts these points in this order.
And let me put in some of the other ones. Now, what point goes there? 1/10. Every time I go that far, I'm multiplying by 10. When I go this way, I'm dividing by 10. So there, this is the number 1/10, which is the same as 10 to the minus 1 power, right? Here is one hundredth. Here is one thousandth. And so on. So this log scale is able to deal with very small numbers and very large numbers in a reasonable way.
And everybody sees the point here that really, what it is is the logarithms. So this is 0. This is 1, 2, 3, and so on. Minus 1, minus 2, minus 3. If I'm graphing, really, these are the logarithms of x. And I'm doing logs to base 10 again, because that gives us nice numbers.
OK. By the way, what's that number? What's that number, halfway between there and there? It's not halfway between 1 and 10 in the ordinary sense, which is whatever, 5 and a half. No way. Halfway between here is-- you know what it will be? It'll be square root of 10. 10 to the 1/2 power. The half is here. The log is a half, so the number is the square root of 10. That's about 3, a little more than 3. And what would be here, would be 10 to the minus 1/2. 1 over square root of 10. So you see that picture.
Oh, I have another question, before I use the scales. What if I like the powers of 2 better? In many cases, we might prefer powers of 2. Well, if I plotted the numbers-- I'm looking at this log scale. And suppose I plot the numbers 1, 2, 4, 8, whatever. 16. What could you tell me about those? Well, I know where 1 is. It's right there. That's a 1. Well, two would be a little further over. Then 4, then 8 would come before 10, and 16 would come after 10.
I pointed there, but 16 would not come there. 16 would be a lot closer, I think, in here. What's the deal with 1, 2, 4, 8, 16 on this log scale? They would be equally spaced. Of course, the spacing would be smaller than the 10 spacing. Every time I multiply by 2, I go the same distance. After I'd done it about ten times -- multiplied by 2 ten times -- that's 2 to the tenth power, which is close to 1000. So 10 powers of 2 would bring me pretty near there. Anyway.
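A small sketch, again assuming only Python's math module, shows where a few numbers land on the log scale; the position is just log10 of the number:

```python
import math

# Positions on the log scale: powers of 10 land at the integers ..., -1, 0, 1, 2, ...
for x in [0.01, 0.1, 1, 10, 100, 1000]:
    print(x, math.log10(x))

# Halfway between 1 and 10 on the log scale is sqrt(10), not 5.5.
print(math.sqrt(10), math.log10(math.sqrt(10)))  # about 3.16, at position 0.5

# Powers of 2 are equally spaced too, with a smaller spacing (log10(2) = 0.301),
# and ten steps of 2 almost equal three steps of 10, since 2^10 = 1024.
for x in [1, 2, 4, 8, 16]:
    print(x, math.log10(x))
```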
And here's one more question. Where is 0? If my value that I wanted to plot happened to be 0, where is it on this graph? It's not there. You can't plot 0 on a log scale. It's way down at the-- you know, it's at the minus infinity end of the graph. Infinity is up there at that end, and 0 is down here. OK. Good.
So can we use that log scale? How do we use that log scale? Let me give you an idea for what use that log scale might be. Practical use.
Suppose I know, or have reason to believe, that my function might be of the form y is something times x to the nth. I have some quantity y, the output when the input is x. But I don't know these -- that number a. So I've done an experiment. And I would like to know what is a, and especially, what is n? I would like to know how the growth is progressing. And I'm just taking a simple growth law here.
OK. I would graph it. I'd get a bunch of points, I put them on a graph, and I look at the graph. Now if I just graph these things, if I just graph that y, here is x and here's y, suppose n is 1.5. Suppose my growth rate, and this is very possible, is x to the 1.5. And a is some number-- who knows. Could even be 1. Suppose a was 1. So then I'm graphing y as x to the 1.5. What does that look like? Well, it looks like that.
The problem is that if the real growth -- the real relation -- see, I would have a few points that might be close to that curve. But if I'm looking at that curve, I frankly could not tell a 1.5 from a 1.6 growth rate. The truth is, I couldn't tell it from 2. I couldn't tell what the actual growth rate is from my graph, which has a little error, so I'm not too sure. And the point is, x to the 1.5 and x to the 2 would look about the same -- if I sketch the graph, it would look like that.
But go to the log scale. Go to a log-log graph. So I'm going to take logs of both sides and plot that. I take the log of my output y, and now this is a product of that times that. What's the rule for logarithms? For a product, add the logarithms. So this would be log a plus log of x to the nth. But now what's the log of x to the nth? Beautiful again. This is x times x times x, n times -- at least if n is an integer. Think of it as x multiplied by itself n times. When I take the logarithm, I add n times. Log of x to the nth is n log x.
Now that, let me graph that now. This is now a log picture. So I'm graphing log y against log x, which was the whole point of my log scale, to think of doing this.
And what kind of a curve will I see from this equation on this graph paper? A straight line. That is some constant plus some slope -- n will be the slope -- times the X. It's like capital Y is capital A plus n times capital X, or something. But it's better for me to write log, so we remember what it is.
So on this paper, suppose-- I did the example x to the 1.5. OK. So in this example, a is 1 and n is 1.5. So what would my points look like here?
Now remember, I should really allow negative logarithms. Because this is the point, right? This is x equals 1 here. The log is 0, but the number is 1. Ha, OK. So when the log is 0, you see, it's going to be a straight line.
And actually, when I took a to be 1, its logarithm will be 0. The line would go right through there. It would have a slope of 1 and 1/2. My points will be really close to a line. If I go out a distance 1, then I go up a distance 1.5. Right? Up 1.5 when I go across by 1 on the log picture. It could be down here; my numbers could be smaller or larger. A straight line. I can get out a ruler and estimate the slope far more accurately than I could here with a lot more software.
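Here is a minimal sketch of that idea, assuming Python with numpy and made-up sample values: points generated from y = x^1.5, with the growth rate n read off the slope of log y against log x.

```python
import numpy as np

# Points generated from y = x^1.5 (a = 1, n = 1.5); the slope on the
# log-log scale recovers n, and the intercept recovers a.
x = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])
y = x ** 1.5

slope, intercept = np.polyfit(np.log10(x), np.log10(y), 1)
print(slope)            # 1.5, the growth rate n
print(10 ** intercept)  # 1.0, the constant a
```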
OK. So that's an important, very important instance in which we wonder what the rate of growth is, and the graph shows it to us. But just make a little point that I've put some points here, like near a line, and that raises another graph question of very great importance. Suppose you have some experiments that put points close to a line, but not right on a line. You want to fit a line close to them. You want to fit the best line to the experimental points.
How do you fit a straight line? That's an important thing. And let me save that for a future chance, because I want to tell you about it. The best, the standard way is what's called the least squares. So least squares is a very important application. And the best line, it turns out, is a calculus problem.
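When the points are not exactly on a line, that least-squares line can be found numerically. A sketch, again assuming numpy, with made-up data y = 2 x^1.5 plus a few percent of noise:

```python
import numpy as np

# Noisy made-up measurements of y = 2 * x^1.5. np.polyfit returns the
# least-squares straight line through the log-log points.
rng = np.random.default_rng(0)
x = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
y = 2.0 * x ** 1.5 * (1 + 0.03 * rng.standard_normal(x.size))

n_est, logA_est = np.polyfit(np.log10(x), np.log10(y), 1)
print(n_est)           # close to 1.5
print(10 ** logA_est)  # close to 2
```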
So for the moment let's pretend they're right on the line. Its slope, which we easily find, tells us this number.
May I mention one other behavior? So, another possibility. If y is not growing polynomially, but suppose y is growing exponentially -- I'll just put it here, because it's not going to be a big deal. y is some -- call it b -- times e to the cx. So that's a different type of growth. That's the big point of today's lecture, to say: this is a quite different growth.
But it would be equally hard -- or even harder -- to find this growth rate c from an ordinary graph. The graph would take off even faster than this one. You couldn't see what's happening. The good idea is, take logarithms. But what do we want to do? We'll take the logarithm of y. Log y, as before, will be the log of b plus the log of e to the cx. Oh, maybe I should have made this 10 to the cx, just to make it all-- instead of the e, I could use the 10. Whatever. Because I've been talking about logarithms to the base 10, so let me use the powers of 10 here.
What's the logarithm of 10 to the cx? When the base is 10, the logarithm is the exponent: c times x. So what am I seeing in this equation? It's an equation where, once I've taken logarithms, my big numbers become reasonable. And also, very small numbers become reasonable. And I get a straight line again. But it's not on this log-log paper. The logarithm of y -- the vertical axis -- is still a log scale. But you see it's ordinary x there now. So I don't use log x for this one. Just ordinary x. It's semilog paper. Logarithm in the vertical direction, ordinary in the x direction. OK. Good.
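A semilog sketch of the same fitting idea, assuming numpy, with made-up data y = 5 times 10^(0.2 x): plotting log10(y) against plain x gives a line whose slope is c and whose intercept is log10(b).

```python
import numpy as np

# Data generated from y = 5 * 10^(0.2 x). On semilog axes (log10(y)
# against plain x) the slope is c and the intercept is log10(b).
x = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
y = 5.0 * 10.0 ** (0.2 * x)

c_est, logb_est = np.polyfit(x, np.log10(y), 1)
print(c_est)           # 0.2, the exponential rate c
print(10 ** logb_est)  # 5.0, the constant b
```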
Now I just want to add one sort of example. Because it's quite important and also quite practical. May I tell you about-- Let me ask you the question, and see if you get an idea. Because this is like basic to calculus. Let me talk about-- this e will stand for error. Error e. And what error am I talking about? I'm talking about the error as the difference between the derivative-- I have some function f of x. And there's its derivative. And I compare that with delta f over delta x.
So what do I know? I know that this is a function of delta x. I'm comparing the instant slope with the average slope over a distance delta x. So it's not 0, right? This one is a finite movement: delta x produces a finite movement delta f. As delta x goes to 0, that does approach this.
So here's my question. My question is, this is approximately some constant times delta x to some power n. And my question is, what is n? How close? What's a rough estimate of how near delta f over delta x is to the actual derivative? OK.
So I have to tell you what I meant by delta f over delta x. I meant what you also meant, f at x plus delta x minus f at x divided by delta x. In other words, that's the familiar delta f. Moving forward from x, I would call that a forward difference, a forward delta f. Because I'm starting at x, and I think of delta x as moving me a little bit forward. So I get the delta f, I divide by the delta x, and that's what this thing means.
And do you know what n is? Let me connect it to my pictures. If I tried to graph this, I'd have a graph. You know. Here's my delta x and here's my e. This difference, as delta x goes to 0, goes to 0. You know, if delta x is small, e is small. If I divide delta x by 10, e divides by something.
I don't even know if you see it on the camera. The graph has gone into a-- well, a black hole, or a chalk hole, or a white hole, or something. It's just completely invisible. I can't see the slope of this thing.
But if I did it on log log paper, I'd see it clearly. And the answer would be 1. The error, the difference between derivative and average slope, goes like delta x to the first power. And then we can see later where that 1 comes from, and we can see where that a is. It's all in Taylor series.
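As a sketch of that experiment, assuming numpy and taking f(x) = sin x at x = 1 purely for illustration, the slope fitted on a log-log scale of error against delta x comes out near 1:

```python
import numpy as np

# Error of the forward difference (f(x+dx) - f(x)) / dx for f = sin at x = 1.
# Fitting log(error) against log(dx) gives a slope near 1: error ~ A * dx.
f, dfdx = np.sin, np.cos
x0 = 1.0
dx = np.array([1e-1, 1e-2, 1e-3, 1e-4])

err = np.abs((f(x0 + dx) - f(x0)) / dx - dfdx(x0))
slope, _ = np.polyfit(np.log10(dx), np.log10(err), 1)
print(slope)  # about 1: first-order accuracy
```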
But here's my practical point. There is a much better delta f than this one. A much better delta f over delta x. An average slope that's much more accurate, and that in calculation I would always use. And the trouble with this one is, it's lopsided. It's one-sided. I only went forward. Or if delta x is negative, I'm only going backwards. And it turns out that the average of forward and backward is like a centered difference.
So let me tell you a centered difference. f at x plus delta x. So look a little forward, but take the difference from looking a little backward. That would be my change in f. But now what do I divide by to get a reasonable slope? Well, this is the change in f going from minus delta x -- delta x to the left of the point -- to delta x to the right of the point. The real movement there in the x-axis was a movement of two delta xs.
So I would call this a center difference. Can I write that word "centered" down?
And if I use that, which is a lot smarter if I practically want to get pictures, then what happens? So if I use this one now, instead of choosing that lopsided, simple, familiar, but not that great difference, the answer is: n changes to 2. n is 2 for this one. The accuracy is way, way better for centered differences.
And the point about the log graphs is, if I plot those points, I would see that slope of 2 in the log-log graph. Again, in an ordinary graph it would become invisible as delta x got small. But on a log scale, I'd see it perfectly.
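The same sketch with the centered difference (f(x + delta x) - f(x - delta x)) / (2 delta x), again for the illustrative choice f(x) = sin x at x = 1, now shows a log-log slope near 2:

```python
import numpy as np

# Error of the centered difference (f(x+dx) - f(x-dx)) / (2 dx) for f = sin at x = 1.
# The fitted log-log slope is now near 2: error ~ A * dx^2.
f, dfdx = np.sin, np.cos
x0 = 1.0
dx = np.array([1e-1, 1e-2, 1e-3, 1e-4])

err = np.abs((f(x0 + dx) - f(x0 - dx)) / (2 * dx) - dfdx(x0))
slope, _ = np.polyfit(np.log10(dx), np.log10(err), 1)
print(slope)  # about 2: second-order accuracy
```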
OK. Some practical uses of logarithms. Now that we no longer use slide rules, this is what we do. Thanks.
NARRATOR: This has been a production of MIT OpenCourseWare and Gilbert Strang. Funding for this video was provided by the Lord Foundation. To help OCW continue to provide free and open access to MIT courses, please make a donation at ocw.mit.edu/donate.