FanPost

Yards per Play is only half the story

We talk a lot about yards per play here at TN, but I wanted to chime in to explain why that measure of an offense tells only half the story. Just as important as the expected/average yards per play is the standard deviation of those yards. None of these ideas are original; the guys at Smart Football, Sabermetric Research, and Advanced NFL Stats have covered this quite a bit. To start, let's conduct a simple thought experiment:

Imagine a team that has only one play: a 50-yard bomb. The play succeeds about 1/5 of the time. That means this offense averages 10 yards per play, which is a huge amount, but the sample standard deviation is a whopping 22.3 yards. Because of that huge standard deviation, there is very little chance this team ever reaches the end zone unless it is given short field position (the likelihood of stringing two successful bombs together to get you in is 0.2 × 0.2 = 4%).

Now imagine another team that has only one play, one averaging just 3 yards with a 1-yard standard deviation. Assuming it goes for it on fourth down, four plays average 12 yards against the 10 needed, with very little variance, so this offense would convert the large majority of its sets of downs and almost never give the ball back. As long as your defense gave you just one stop per game, you would go undefeated.

Looking at these two offenses through the limited lens of yards per play alone, one looks far superior to the other: 10 ypp vs. 3 ypp. What's missing is the standard deviation comparison of roughly 22 to 1. Now imagine this scenario:

Offensive Plays:

Series - 1

Play 1: 0 yards

Play 2: 0 yards

Play 3: 0 yards

Punt

Series – 2

Play 1: 99 yard bomb for a TD

Series – 3

Play 1: 0 yards

Play 2: 0 yards

Play 3: 0 yards

Punt

Series – 4

Play 1: 0 yards

Play 2: 0 yards

Play 3: 0 yards

Punt

This is not a very good offense, yet it is averaging 9.9 yards per play. What that average hides is a standard deviation of 31.3 yards.
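For anyone who wants to check the arithmetic, here's a quick sketch in Python. (The figure above is the sample standard deviation, i.e. dividing by n − 1, which is what a spreadsheet's STDEV function gives you.)

```python
from statistics import mean, stdev

# The ten plays from the drive chart above: nine 0-yard plays
# and one 99-yard touchdown bomb.
plays = [0, 0, 0, 99, 0, 0, 0, 0, 0, 0]

avg = mean(plays)   # 99 / 10 = 9.9 yards per play
sd = stdev(plays)   # sample standard deviation, about 31.3 yards

print(f"yards per play: {avg:.1f}")
print(f"standard deviation: {sd:.1f}")
```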

Standard deviation captures the risk that a play will not achieve its expected/average yards. So what is the best way to tell whether your offense is getting the most yards for the risk it is taking? Here we borrow a finance concept: the Sharpe ratio, which tells you how much reward you are getting for each unit of risk you are taking. (Formally it is the excess return over a risk-free rate divided by the standard deviation; for our purposes the risk-free rate is zero yards, so it is simply average yards per play divided by the standard deviation.) Let's consider another (more realistic) sample set of plays for two teams:

Team A play results in yards: 20, -1, 0, 10, 15, -10, 4, 2, 18, 12.

Team B play results in yards: 7, 3, 9, 2, 2, 5, 6, 6, 4, 3.

Team A has an average of 7.0 yards per play and a sample standard deviation of about 9.6. That makes their Sharpe ratio roughly 0.7. Team B has an average of 4.7 yards per play and a standard deviation of about 2.3, which makes Team B's Sharpe ratio roughly 2.0. Team B's offense is achieving much more for the level of risk it is taking, despite Team A's higher yards-per-play average.
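The comparison above can be reproduced in a few lines of Python (again treating the risk-free rate as zero, so the per-play Sharpe ratio is just average yards divided by the sample standard deviation):

```python
from statistics import mean, stdev

def sharpe(plays):
    # Average yards per play divided by the sample standard
    # deviation, with the "risk-free rate" taken as 0 yards.
    return mean(plays) / stdev(plays)

team_a = [20, -1, 0, 10, 15, -10, 4, 2, 18, 12]
team_b = [7, 3, 9, 2, 2, 5, 6, 6, 4, 3]

for name, plays in [("Team A", team_a), ("Team B", team_b)]:
    print(f"{name}: {mean(plays):.1f} ypp, "
          f"sd {stdev(plays):.1f}, Sharpe {sharpe(plays):.2f}")
```

Team A's bigger average comes with nearly three times the Sharpe penalty in risk, which is exactly the point: the ratio, not the raw average, tells you who is getting more yards per unit of risk.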

I just point this out because I've never seen anyone mention standard deviation when talking about the yards per play of either FSU or its opponents, yet it belongs in the conversation: it is an important measure of whether an offense is underachieving or not.

Fanposts are a section for the fans and do NOT reflect the views of Tomahawk Nation.