An Inadvertent Test of Home-Field Advantage in the NFL

Back when I was a graduate student, we did our econometrics assignments on punch cards and submitted the “jobs” to a computer desk. We were allowed three attempts for each assignment.

I worked very diligently to make sure I did each assignment correctly on the first attempt. That left me with two more computer runs for each assignment that I could use for my own purposes.

I used those extra runs to try to predict the outcomes of NFL games based only on the points scored by and against each of the opposing teams in previous games that season. I carefully coded points for and points against each team and ran simple linear regressions. Nothing worked.

But I did get a statistically significant value of 7.0 for the intercept term. That result made no sense to me at all; I had expected the intercept to be roughly zero. It took me several hours (when I probably should have been studying econometrics) to figure out that the way I had coded the data meant that the intercept, or constant, term was a proxy for the home team.

The seven points I was measuring were home-field advantage, and that result was pretty consistent with what others were finding or guessing back then.
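A minimal sketch of that coding choice, with made-up data standing in for the original punch-card files: when every game's margin is expressed as home score minus visitor score, the constant term of an ordinary least squares regression picks up the average home edge. The variable names and the simulated seven-point advantage below are illustrative assumptions, not the original data or model.

```python
import numpy as np

# Hypothetical illustration: code every game from the home team's perspective,
# so the dependent variable is (home score - visitor score). The intercept of
# a simple linear regression then acts as a proxy for home-field advantage.

rng = np.random.default_rng(0)
n_games = 256

# Made-up predictors: differences in how the two teams scored and allowed
# points in their earlier games that season.
scoring_diff = rng.normal(0, 7, n_games)  # home avg points for minus visitor avg points for
defense_diff = rng.normal(0, 7, n_games)  # visitor avg points against minus home avg points against

# Simulated outcome: a built-in 7-point home edge plus noise.
margin = 7.0 + 0.5 * scoring_diff + 0.3 * defense_diff + rng.normal(0, 10, n_games)

# Ordinary least squares with an explicit constant term.
X = np.column_stack([np.ones(n_games), scoring_diff, defense_diff])
coef, *_ = np.linalg.lstsq(X, margin, rcond=None)

print(f"intercept (home-field advantage): {coef[0]:.2f}")
print(f"slope on scoring differential:    {coef[1]:.2f}")
print(f"slope on defensive differential:  {coef[2]:.2f}")
```

With the data oriented this way, the intercept has nowhere else to go: it absorbs whatever advantage is common to all home teams, which is exactly what the 7.0 in my old runs turned out to be.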

I was reminded of this incident by Brian Goff’s piece last week at The Sports Economist, where a commenter mentions home-field advantage. But what happened during the first week of the 2006 season, when the visitors won so many games? Did the home-field advantage hold, with the visitors simply being that much better? Or has the size of the home-field advantage changed over these past xxx years?

Author: John Palmer
