

A Test of Football Outsiders' Projections Over the Years

Here's a brief look at the accuracy of Football Outsiders' forecast for the season.


Last week, Dave posted a great conversation with Football Outsiders, getting some interesting insight on the Falcons. FO has some great analytical tools and intriguing data with which we can evaluate performance. They use that information each year to forecast team records, and this year they have Atlanta pegged at 7.3 wins. Given the on-paper performance in 2013, I think that's fair. Still, my non-analytical gut wants to believe a realistic forecast would come closer to 9.

Regardless of how I feel, I thought I'd put their projections to the test. Reading through the comments on Dave's post, I noticed a divide between those who credit FO's analytics and those who question them. After all, FO did project the Falcons' downfall last year - not to the extent it actually happened, but projecting injuries is a fool's errand. So I arrived at the question driving this whole analysis: do FO's projections deserve some credit?

Track Record

After compiling the projections for 2008 through 2013, I compared FO's projected mean wins against each team's eventual regular season record. Here's the chart:


As you can see, the general trend of the projections is slightly positive. The distribution of results is heavily scattered, though, which implies a weak predictor. The r-squared value (.177) describes the extent of that weakness - essentially, FO's projections explain only about 17.7% of the variance in eventual results. Given all of the variables that can affect performance, that value is not horrible. But I have to evaluate the projection by its end result, and in that regard, the forecast doesn't inspire much confidence.
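For anyone who wants to check this kind of number themselves, here's a minimal sketch of the r-squared calculation described above. The projection/result pairs are made up for illustration - they are not FO's actual data.

```python
# Coefficient of determination (r-squared) for projected vs. actual wins.
# Sample data below is hypothetical, NOT Football Outsiders' real projections.

def r_squared(x, y):
    """r-squared of a simple linear fit of y on x (squared correlation)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return (cov * cov) / (var_x * var_y)

projected = [7.3, 9.1, 6.5, 10.2, 8.0]  # hypothetical mean-win projections
actual = [9, 8, 4, 11, 7]               # hypothetical regular-season wins

print(round(r_squared(projected, actual), 3))
```

Run that over all teams and seasons in the sample and you get the .177 figure quoted above.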

I'll try not to bore you with more detail, but here are some statistics. The average miss is 2.4 games. The standard deviation of the sample is 2.9 games. That translates to a pretty large margin of error (as seen in the distribution). One way to interpret this is to relate it to the Falcons: if I want to be 95% confident in a range for the 2014 performance, I can only say that Atlanta will win between 1.5 and 13.1 games. Sorry to those of you who think we will go undefeated. In all seriousness, that's a huge gap, and it essentially leads me to believe the projections are statistically irrelevant.
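The arithmetic behind that range is just the projection plus or minus two standard deviations, which is the usual rough rule for a 95% interval:

```python
# Rough 95% interval: projection +/- 2 standard deviations.
projection = 7.3  # FO's 2014 mean-win projection for Atlanta
std_dev = 2.9     # sample standard deviation of projection misses

low = projection - 2 * std_dev
high = projection + 2 * std_dev
print(f"95% range: {low:.1f} to {high:.1f} wins")  # → 95% range: 1.5 to 13.1 wins
```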

Taking It One Step Further

As a simple exercise, I got curious about how much the previous season's record alone tells us. See the table below:


This really caught my attention. Using only previous records, I get an r-squared value of .093. In other words, last season's record alone explains 9.3% of the variance in performance. Now put that alongside FO's forecasts: if one simple variable explains 9.3% of the variance, and FO's projections explain only 17.7%, then their analytics contribute just an additional 8.4 percentage points.
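That incremental figure is just the back-of-envelope subtraction the paragraph describes, spelled out:

```python
# Back-of-envelope incremental value of FO's analytics over a naive baseline.
fo_r2 = 0.177            # variance explained by FO's projections
prev_record_r2 = 0.093   # variance explained by last season's record alone

incremental = fo_r2 - prev_record_r2
print(f"FO's added explanatory power: {incremental:.1%}")  # → FO's added explanatory power: 8.4%
```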

I have nothing against what they do - it's a complicated task. But the accuracy just isn't there for me to place much weight on what they suggest will be the fate of our football team.