As Dilip stated, the return (CAGR) will not be the same as the average. The simple mathematical explanation: (R+x)(R-x) yields R^2 - x^2 over the two periods, not the R^2 you would get from the average return. The higher the variance, the worse the return over time. The issue may matter less in the accumulation phase than when a retiree is withdrawing funds: I'd prefer the first 10 years of my retirement to produce extraordinary returns followed by a below-average decade rather than the reverse.
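That "variance drag" is easy to see numerically. A minimal sketch (the 8% rate and 5% swing are illustrative, not from any real portfolio):

```python
# Two return sequences with the same arithmetic average factor R:
# one steady (R, R) and one volatile (R+x, R-x), compounded over two periods.
R = 1.08   # average annual growth factor (8% per year)
x = 0.05   # volatility: the volatile sequence alternates R+x and R-x

steady = R * R                 # R^2
volatile = (R + x) * (R - x)   # R^2 - x^2, always smaller when x != 0

print(round(steady, 4))    # 1.1664
print(round(volatile, 4))  # 1.1639
```

The volatile sequence ends lower even though the average growth factor per period is identical.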

You do not *have* to worry about the variance of the return if you do not want to, or if you choose not to. But the average long-term growth rate is known only after the fact, not before. Right now, all you can say is that you *expect* a portfolio to grow at, say, an *average* rate of 8% per annum for the next 10 years. Whether your expectations are achieved will not be known till 2022. The word *average* is important: it implies that the growth rate during each of the ten years might be smaller, larger, or the same as 8%, and some idea of how much the annual growth rate might vary from the nominal 8% during the ten years is of interest to some people. One measure used to describe this variation is the variance (or the standard deviation, which is the square root of the variance).

Here are some questions for you to ponder.

If you have to choose between two investments, both projecting an average growth rate of 8% per annum over 10 years, but one is more volatile, so that the actual growth rate in any particular year might be anywhere from 2% to 14%, while the other is more stable, with actual growth rates ranging from 7.5% to 8.5%, which would you prefer?

You invest in something extremely risky for a two-year period (no backsies), and the investment has a 50% loss the first year and a 50% gain the next year. Is the average gain (-50+50)/2% = 0%? Will you get all your money back at the end of the two-year period?
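For the second question, the compounding is quick to check in code (a hypothetical $100 stake):

```python
# Lose 50% in year one, then gain 50% in year two.
start = 100.0
after_year_1 = start * (1 - 0.50)         # 50.0
after_year_2 = after_year_1 * (1 + 0.50)  # 75.0 -- not your original 100
```

The arithmetic average of the two annual returns is 0%, yet you end the period with only 75% of your money.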

I was gonna post this as a comment to Dilip's answer but decided it was worth expanding on, even though some of this has already been mentioned.

If an investment goes down 50% one year, and up 50% the next, even though on average your rate of return is 0%, you're *down 25%*.

The math:

```
$100,000 - 50% = $50,000
$50,000  + 50% = $75,000  <-- ouch!
```

Some more math:

```
Both portfolios start at $100,000. Both have an average annual rate of return of 8%.

Portfolio A - High Variance:
Year 1: $100,000 down 50% = $50,000
Year 2: $50,000  up 66%   = $83,000  <-- Yay! Average annual gain of 8%! Ca-ching!
        Er... except you're down 17% on your original investment.

Portfolio B - Low Variance:
Year 1: $100,000 + 6%  = $106,000
Year 2: $106,000 + 10% = $116,600  <-- This also has an average annual gain of 8%,
        but instead of being down 17%, you're up 16.6% on your investment!

Which portfolio would you prefer to invest in?
```
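A sketch of that compounding in code. (Note that after a 50% loss, the second year needs a 66% gain for the two-year arithmetic average to come out at 8%: (-50 + 66)/2 = 8.)

```python
def compound(balance, annual_returns):
    """Apply a sequence of annual returns (0.06 means +6%) to a balance."""
    for r in annual_returns:
        balance *= 1 + r
    return balance

# Portfolio A, high variance: -50% then +66% (arithmetic average 8%)
a = compound(100_000, [-0.50, 0.66])
# Portfolio B, low variance: +6% then +10% (arithmetic average 8%)
b = compound(100_000, [0.06, 0.10])
print(a, b)  # roughly 83,000 vs 116,600
```

Same average return, wildly different ending balances; the only difference is the variance.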

Now, it's true that once the investment is done, you've sold it, and you have the cash in hand, it no longer matters what the variance was over the time you held it. It's all hindsight then, and there's nothing left to do except brag at neighborhood parties.

**Expected variance is useful to figure out if you should invest to begin with.** For any given level of expected return, you want the lowest variance. The lower the variance, the more reliable that expected rate of return is.

So yes, you should worry about variance of return.