What is the Meaning of "Average"?

Date: 03/28/2007 at 16:24:13
From: Danny
Subject: What is the meaning of average

Can you please give a detailed description of average and its meaning?
I'm not looking for a definition like "average is a certain # divided
by a certain total #." I don't quite understand the real meaning of
average.

Date: 03/28/2007 at 23:07:43
From: Doctor Peterson
Subject: Re: What is the meaning of average

Hi, Danny.

There are several different meanings of "average". The most general is
a "measure of central tendency", meaning any statistic that in some
sense represents a typical value from a data set. The mean, median,
and mode are often identified as "averages" in this sense.

The word "average" is also used (especially at elementary levels) to
refer specifically to the mean, which is the kind of average you
mentioned: add the numbers and divide by how many there are. This kind
of average has a specific meaning: it is the number you could use in
place of each of the values and still have the same sum.

For example, I'll illustrate the idea by making several piles of, say,
beans. Suppose I make 5 piles, containing 4, 10, 9, 6, and 11 beans
respectively. If I wanted to redistribute them into five piles each of
which had the same number, I would gather them all together, count
them (4+10+9+6+11 = 40), and then divide them evenly into 5 piles of 8
(40 / 5 = 8). Thus, the average is the number I get when I distribute
a sum evenly; it smooths out the variations in the numbers. Here is an
explanation of this kind of average, using a different example:

  What Does Average Mean?
  http://mathforum.org/library/drmath/view/52809.html

The idea of a mean can be applied to other situations where addition
is not the relevant operation. The mean we just talked about is the
"arithmetic mean", meaning that it is based on addition. There is also
a "geometric mean", based on multiplication, which is the number you
could replace everything with and keep the same product, rather than
the same sum. For example, the geometric mean of 12 and 27 is the
square root of 324, which is 18. This is because the product of 12 and
27 is 324, and the product of 18 and 18 is also 324; we could replace
both numbers with 18 and the product would be the same. There are
other kinds of mean based on other ways of combining numbers, chosen
according to how the numbers involved "want" to be combined. See these
pages:

  Arithmetic vs. Geometric Mean
  http://mathforum.org/library/drmath/view/52804.html

  Applications of Arithmetic, Geometric, Harmonic, and Quadratic Means
  http://mathforum.org/library/drmath/view/69480.html

See also:

  Average
  http://mathforum.org/library/drmath/view/57613.html

If you have any further questions, feel free to write back.

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/
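To make the two kinds of mean concrete, here is a minimal Python
sketch of the examples above; the pile sizes and the pair 12 and 27
come straight from the reply, and everything else is just
illustration:

  # Arithmetic mean: the number that could replace every value
  # and keep the same SUM.
  piles = [4, 10, 9, 6, 11]
  arithmetic_mean = sum(piles) / len(piles)
  print(arithmetic_mean)            # 8.0 -- 40 beans split evenly into 5 piles

  # Geometric mean: the number that could replace every value
  # and keep the same PRODUCT.
  product = 12 * 27                 # 324
  geometric_mean = product ** 0.5   # square root, since there are two numbers
  print(geometric_mean)             # 18.0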
Date: 04/11/2007 at 12:00:23
From: Danny
Subject: What is the meaning of average

Hello Doctor,

Thanks for the helpful response. In your letter, you mentioned that
average refers to central tendency. Let me give my interpretation of
what central tendency means. Please correct me if I am wrong.

For example, if our data shows that it rains 10 times over 100 days,
then it means that the sky "tends" to rain 10 times per 100 days. 10
divided by 100 gives a frequency value of 0.1, which means that it
rains 0.1 times per day on average. This average refers to how
frequently it rains. For example, if it rained 11 times (more
frequently than 10), then you would get 11/100, which is a bigger
value than 10/100. Thus, 11/100 is more frequent than 10/100. Is this
interpretation of central tendency correct?

I also think central tendency is the average value that tends to be
close to MOST of the various values in the data. For instance, if my
data set is (4,6,1,3,0,5,3,4) the central tendency is 3.25, which is a
value that tends towards 3 and 4. There are two 3 values and two 4
values in the data, which make up most of the data set. So are both of
those valid ways of thinking about central tendency?

Date: 04/11/2007 at 22:43:28
From: Doctor Peterson
Subject: Re: What is the meaning of average

Hi, Danny.

What you are saying in both cases is a reasonable example of the mean,
and fits with my description of average rainfall, though I used inches
of rain per day rather than the number of rainfalls. But central
tendency is intended to be a much broader term. It's meant to be
vague, because it covers not only means but also the median, the
midrange, and even the mode. Its meaning is "any statistic that tends
to fall in the middle of a set of numbers"; anything that gives a
sense of what the "usual" or "typical" value is, in some sense, can be
called a measure of central tendency.

The *median* is, literally, the number in the middle--put the numbers
in order, and take the middle number in the list, or the average of
the two middle numbers if necessary. So that's clearly a "central
tendency".

The *midrange* is the exact middle of the range--the average, in fact,
of the highest and lowest numbers. So that, too, has to lie in the
middle, though it doesn't take into account how the rest of the
numbers are distributed.

The *mode* is the most common value, if there is one; it really
doesn't have to be "in the middle", or even to exist, but it certainly
fits the idea of "typical".

The (arithmetic) *mean*, like all the others, has to lie within the
range of the numbers, and it represents the "center of gravity" of all
the numbers.

So each of these fits the meaning of "measure of central tendency",
each in a different way. Taking your set of numbers as an example,
here are the values of the various measures of central tendency. Your
numbers are

  4,6,1,3,0,5,3,4

which when sorted in increasing order are

  0,1,3,3,4,4,5,6

  midrange: (0+6)/2 = 3
  median:   (3+4)/2 = 3.5
  mode:     both 3 and 4
  mean:     (0+1+3+3+4+4+5+6)/8 = 3.25

All of these are "middle" numbers, and for many real data sets they
will be close together. The geometric mean in this case is 0; it
doesn't work well when zero is allowed!

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/
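For reference, here is a short Python sketch that reproduces all four
measures from the reply above, using the statistics module from the
standard library (multimode needs Python 3.8 or later):

  import statistics

  data = sorted([4, 6, 1, 3, 0, 5, 3, 4])   # [0, 1, 3, 3, 4, 4, 5, 6]

  midrange = (min(data) + max(data)) / 2    # (0 + 6) / 2 = 3.0
  median   = statistics.median(data)        # (3 + 4) / 2 = 3.5
  modes    = statistics.multimode(data)     # [3, 4] -- two equally common values
  mean     = statistics.mean(data)          # 26 / 8 = 3.25

  print(midrange, median, modes, mean)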
Date: 04/12/2007 at 17:35:33
From: Danny
Subject: What is the meaning of average

Hello doctor, thanks for your insights. I now have a better idea of
average. Here is one more question, about probability.

Let's say that I was sick 40 times out of 1000 days. So based on this
information, the probability of me getting sick on a random day is
40/1000. 40/1000 says that, in 1000 days, I was sick 40 times, and
that is how frequently I was sick. Simplifying this 40 to 1000 ratio,
we get a 1 to 25 ratio. That is, on average, I was sick 1 time per 25
days (or 0.04 times per day). So the probability of me getting sick on
a random day is 0.04, based on how frequently I was sick.

This leads me to conclude that the probability of anything is based on
past data, and we can make good predictions of future events because
of the law of continuity, meaning that things in the universe always
follow a pattern. If we lived in a universe without continuity, then
knowledge of probability would be useless.

So if I was sick 40 times out of 1000 days in the past, then the
probability of me getting sick on a random day is the average value of
40/1000 = 0.04. I'd like to point out that the average 0.04 doesn't
have a real physical meaning, because it says that I was sick on
average 0.04 times per day (0.04 times? that makes no sense). I think
0.04 is just a number that corresponds to or represents 40/1000 (40
times per 1000 days is meaningful).

Please verify what I have written and correct my errors if there are
any. Thank you so much.

Date: 04/12/2007 at 23:12:39
From: Doctor Peterson
Subject: Re: What is the meaning of average

Hi, Danny.

>the probability of me getting sick on a random day is 40/1000.

This is really a whole different question, at least on the surface;
but I can see the connection between averages and probability, and
perhaps you really had probability in mind from the start.

What you're talking about here is called empirical probability: just a
description of what actually happened, which can't say anything about
why, or what could happen another time. It's simply a ratio: how does
the number of occurrences of sickness compare to the number of days
under consideration? Out of those 1000 days, 40 of them were sick
days; so "on the average" 40 out of 1000, or 4 out of 100, or 1 out of
25 were sick days. If they were evenly distributed--the same idea as a
mean--then every 25th day would have been a sick day.
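In code, an empirical probability is nothing more than that ratio of
counts; a tiny Python sketch using the numbers from the question:

  sick_days  = 40
  total_days = 1000

  empirical_p = sick_days / total_days   # 0.04 = 4/100 = 1/25
  print(empirical_p)
  print("about 1 sick day in every", total_days // sick_days, "days")  # 1 in 25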
>This leads me to conclude that the probability of anything is based
>on the past data, and we can make good predictions of future events
>because of the law of continuity, meaning that things in the universe
>always follow a pattern...probability is established on the basis of
>continuity in the universe.

Now you've made some big jumps! Not ALL of probability is just about
past data; that's just empirical probability. And we can't always
extrapolate from past events to the future. Sometimes that works,
sometimes it doesn't. In part, it's the job of statistics to look at
the data you've got and determine how valid it is to expect the same
probabilities to continue--how good a sample you have.

But even beyond that, whether we can assume that patterns will
continue depends on other knowledge entirely, such as science. If we
find a mechanism that explains a pattern, we have much better grounds
for expecting it to continue than if we don't. To make a broad
statement that "things in the universe ALWAYS follow a pattern" is to
indulge in philosophy, not math. In probability, we go the other way:
we make an ASSUMPTION that things will continue as they are, in order
to be able to apply probability to predicting anything; we leave it up
to scientists (or sometimes philosophers) to decide whether that is a
valid assumption.

The scientist will most likely do some experiments to see if the
predictions based on his theory work out; if so, he has some evidence
that it is valid, and he can continue to make predictions. If not,
then he tries another theory! He certainly would not say that
probability forces him to believe that things work a certain way. And
perhaps that's what you mean to say: probability applies to a
situation beyond the data we have only if there is consistency in the
causes underlying the phenomena.

>So if I was sick 40 times out of 1000 days in the past, then the
>probability of me getting sick on a random day is the average
>value: 40/1000 = 0.04.

Again, the empirical probability in itself says nothing about whether
you will continue being sick at the same rate. It only says that IF
you continue at the same rate, then you can expect to be sick 1 day
out of every 25, on the average over a long period of time. This, in
part, is an expression of what is called the Law of Large Numbers:
that IF there is an underlying pattern such that on each day (in your
example) there is a 4% chance of being sick, then OVER A SUFFICIENTLY
LONG period of time, you can EXPECT to be sick on 4% of all days. So
you're right that the probability says nothing about any particular
day, and to express it as if it meant you would get sick 1/25 of a day
each day is silly. You should say that you get sick on 1/25 of ALL
days, IF in fact you do!

The difference between this and the general idea of averages is that
an average can apply to any collection of numbers, not just to the
frequency of an occurrence. We can talk about the average speed of a
car; regardless of how its speed has varied along a route, we can use
the total distance traveled and the total time it took to determine
the average speed, which is the speed it might have been going
throughout the entire trip in order to cover the same total distance
in the same total time. There is nothing probabilistic about this; but
like probability, we are taking something that may vary "randomly" and
condensing all its variations into a single number. The average speed
does not mean that at every moment the car was going that fast, and
the probability does not mean that out of every 25 days you are sick
on one of them, or, worse, that on every day you are sick for 1/25 of
the time. Averages and probability both ignore unevenness and look
only at the big picture.

And that makes your question a very good one. I've been noticing the
connections between probability and averages in several areas lately,
and it's good to have a chance to think more about it.

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/
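The Law of Large Numbers is easy to watch in a small simulation. This
Python sketch ASSUMES an underlying 4% chance of sickness on each
independent day--which, as the reply stresses, is exactly the
assumption the data alone cannot justify--and the seed is fixed only
so the run is reproducible:

  import random

  random.seed(1)   # fixed seed, just for reproducibility
  p = 0.04         # assumed underlying daily chance of being sick

  for n in (100, 1000, 100000):
      sick = sum(random.random() < p for _ in range(n))
      print(n, sick / n)   # the observed frequency settles near 0.04 as n grows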
Date: 04/13/2007 at 14:37:16
From: Danny
Subject: What is the meaning of average

Hello doctor,

Thanks for your help and patience. I have one last question. It seems
like sometimes averages have no meaning.

For instance, in a class of 10 students, 2 got 100 on a test and 8 got
0. The test average is 200/10 = 20. So on average every person got a
20 on the test. If I am correct in thinking that an average value is
an estimate of the various values in the same data set (like you said,
an average is like a center of gravity in the data set, so all the
numbers in the data set should lean towards the average), then the
average 20 is closer to the REAL scores of the 8 students who got 0
than to the REAL score of the 2 people who got 100. This average gives
a vague idea of how badly most people did, but it has "hidden" the two
perfect scores. The average may tell us that most of the people must
have done badly for the average to come out so low. However, we can't
know that some people did perfectly just by looking at the average.
This leads me to believe that the average taken in this case shows
that MOST people did badly. On the contrary, it does not give an
overall picture of how EVERYBODY did.

Let me give another example. In a class of 10 students, 3 students got
70 on a test, 5 students got 80, and 2 got 60. If we take the average
here, it comes out to be 73 points per student. Now this average is a
good estimate of how EVERYBODY did, because it is close to the scores
of 70, 80, and 60. In the previous example, the average 20 is just too
far away from 100 to tell us anything about the students who got 100.
In this case, the average 73 gives a better idea of how EVERYBODY did.

So this shows that average values sometimes do give an overall or
general picture of how EVERYBODY did. But some other times, it only
shows how MOST people did. Without looking at the actual data, you
can't be sure what means what. So averages are vague in meaning...I
think. Is what I said correct?

Date: 04/13/2007 at 20:51:49
From: Doctor Peterson
Subject: Re: What is the meaning of average

Hi, Danny.

Several of the pages on our site that discuss mean, median, and mode
talk about why you would choose one rather than another. Each has its
uses, and what you're saying is that for some purposes the mean is not
the appropriate "measure of central tendency". That doesn't mean that
it is meaningless, or that it is never a valid concept; only that it
doesn't tell you what you'd like to know in this situation.

The mean is the "center of gravity"; and there are many objects
(speaking physically, now) whose center of gravity is not within the
object. The center of gravity does NOT mean "where most of the atoms
in the object are". That doesn't mean the center of gravity is
meaningless; it's what determines how the object will balance. But
sometimes balance isn't what you're interested in!

In the case of scores on a test, the median is usually considered the
most reasonable measure; in your example, the median would be zero,
showing that over half (in fact, more than that) scored zero. So if
you choose your statistic carefully, it will tell you what you want to
know.

Another classic example of this is median income. If in your town 999
people earned $1,000 a year, and one man earned $9,000,000 a year, the
average (mean) income would be $9,999 a year, even though NOBODY made
that amount. The median income gives a much better picture, if you
want to know how the "average" person is doing; but that entirely
misses the fact that there is one person who is rich. No matter what
"average" you use, you'll be leaving someone out.
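The income example is easy to check directly; a brief Python sketch:

  import statistics

  incomes = [1000] * 999 + [9000000]

  print(statistics.mean(incomes))     # 9999 -- an amount nobody actually earns
  print(statistics.median(incomes))   # 1000 -- what the typical person earns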
Another example is the rainfall I like to use to illustrate the idea
of the mean. If the average rainfall is 1 inch a day, say, it might
actually have been dry as a bone for 99 days, and then there was a 100
inch flood on the last day. The average accurately reflects the TOTAL
amount of rain over the 100 days, but that isn't all it takes to
decide what plants can survive there.

Again, the whole idea of an average is to try to boil down a lot of
information into one number. That necessarily means that you have to
lose some information. (That's why people don't want to be treated as
mere numbers; they are more complex than that. Even a set of numbers
doesn't like to be replaced by a single number!) I think I've said all
along that averages are meant to be "vague" in the sense that they
deliberately ignore all the details. As I showed above, you're
actually being too generous in saying that the average shows how
"most" people did; it may show how NONE of them did.

Your last few statements are exactly right: the average is not enough
to tell what's really happening. In trying to use any kind of average
to say how EVERYBODY did, you are misusing the whole concept. Unless
the numbers are all close together, there's no way for any number to
tell you how they all did. It's ridiculous to expect that!

But there are other statistics that can come to your aid--not
averages, but "measures of dispersion"--that tell you how FAR APART
the numbers are. The most famous of these is the standard deviation,
which is actually the square root of the mean squared deviation from
the mean (not that I expect you to make any sense of that). This
number tells you how accurately you can expect the mean to tell you
anything about most of the population. So if you had not only the mean
but also the standard deviation, you would have enough information to
decide not only what the "middle" is, but also how far "most" of them
are from that middle.

Even then, however, you would be missing a lot of details, because
you've boiled a whole class down to two numbers, which isn't a lot
better than one. If you really want to get a sense of how the class
(as a group of individuals) is doing, you'll make a graph of some
kind. When I give a test, I do exactly that: my spreadsheet calculates
the mean, the median, and the standard deviation, and I also make a
graph of the distribution of grades so I can see where the "outliers"
are--the individuals who are entirely missed by the simple statistics,
and whose existence has to be recognized.

Incidentally, I've sometimes noticed in teaching, as a result of these
statistics, that I can't "teach to the middle" of the class, because
there is no middle. Sometimes I find a bimodal distribution, which
means that I have a lot of F's and a lot of B's, and no one in between
where the median and the mean both lie. (The last word there is an
interesting, and very appropriate, pun!) So I have to ignore the
statistics and teach the students.

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/
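As a sketch of how a measure of dispersion flags the difference
between the two classes discussed earlier (this uses the population
standard deviation, statistics.pstdev; statistics.stdev, which divides
by n-1 instead, would give slightly different numbers):

  import statistics

  spread_out = [100, 100, 0, 0, 0, 0, 0, 0, 0, 0]        # 2 perfect scores, 8 zeros
  bunched_up = [70, 70, 70, 80, 80, 80, 80, 80, 60, 60]  # 3 at 70, 5 at 80, 2 at 60

  for scores in (spread_out, bunched_up):
      m  = statistics.mean(scores)
      sd = statistics.pstdev(scores)   # root of the mean squared deviation
      print("mean =", m, " standard deviation =", round(sd, 1))

  # The huge deviation (40.0) warns that the mean of 20 describes almost
  # nobody; the small one (7.8) says the mean of 73 is genuinely typical.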
Date: 04/16/2007 at 20:24:40
From: Danny
Subject: What is the meaning of average

Hi doctor, I have a better idea now, and have learned a lot. Thanks!