QB Interception Rates

  Posted 6/7 by Chase Stuart, Exclusive for Footballguys.com

If you're new to fantasy football, I doubt you would believe it, but people have been writing fantasy football advice columns for a long time. Way back in 2000, Doug Drinen wrote this article about which statistics are likely to be repeated each year, and which ones are not.

"The correlation [coefficient] is fairly weak for interceptions. If your league counts them, don't let a high INT total from the previous year scare you off too much." What Doug was saying was that after a bunch of research, he found that the number of interceptions a QB throws every year is pretty random; what a guy did last year does not often reflect what he'll do this year.

It's true that most people don't think too much about INT-prone QBs when they're drafting their QB, but if you do projections, you need to project INTs. This article is designed to help those who create projections fine-tune them in one area: interceptions.

I looked at all QBs since 1990 who met the following three criteria: 1) in season N, they had at least 250 pass attempts; 2) in season N+1, they had at least 100 pass attempts; 3) they played for the same team in seasons N and N+1. There were 365 QBs who fit the bill.
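
If you want to replicate the sample, the selection step might look something like the sketch below. The qb_seasons.csv file and its column names (player, team, year, att, ints) are my assumptions, not a real dataset.

```python
import pandas as pd

# Hypothetical table: one row per QB-season, with columns
# player, team, year, att (pass attempts) and ints (interceptions).
qb = pd.read_csv("qb_seasons.csv")

# Pair each season N with the same player's season N+1.
nxt = qb.add_suffix("_next")
pairs = qb.merge(nxt, left_on="player", right_on="player_next")
pairs = pairs[pairs["year_next"] == pairs["year"] + 1]

# The three criteria: 250+ attempts in year N, 100+ attempts
# in year N+1, and the same team in both seasons.
pairs = pairs[(pairs["att"] >= 250)
              & (pairs["att_next"] >= 100)
              & (pairs["team"] == pairs["team_next"])]

# INT rate as defined below: 100 * INT / attempts.
pairs["rate"] = 100 * pairs["ints"] / pairs["att"]
pairs["rate_next"] = 100 * pairs["ints_next"] / pairs["att_next"]
```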

Believe it or not, there was essentially zero relationship between interception rate in one year and interception rate the following year. I ran two different multiple regression analyses; the first used INT rate and league INT rate in Year N to predict INT rate in Year N+1. The R^2 of the best-fit equation was 0.01, which is very low. The R^2 metric is the square of the correlation coefficient, which here was just a hair under 0.10; that indicates that the two variables are barely correlated at all. In English, what did I just say? Knowing a QB's INT rate and the league INT rate will get you about 10% of the way towards predicting that QB's INT rate the following year.

I also performed a regression where the inputs were the league INT rate and the difference between each quarterback's rate and the league average, and the output was the difference between his rate and the league average the following year. Same result -- an R^2 of 0.01 and practically no correlation between the variables.
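
In code, both regressions might look roughly like this, continuing with the hypothetical pairs table from above and assuming per-year league INT rates have already been joined in as lg_rate and lg_rate_next:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Regression 1: QB INT rate and league rate in year N -> rate in N+1.
X1 = pairs[["rate", "lg_rate"]].to_numpy()
y1 = pairs["rate_next"].to_numpy()
fit1 = LinearRegression().fit(X1, y1)
print(fit1.score(X1, y1))      # R^2, ~0.01 per the article

# Regression 2: league rate plus the QB's gap from league average
# in year N -> his gap from league average in year N+1.
gap = pairs["rate"] - pairs["lg_rate"]
X2 = np.column_stack([pairs["lg_rate"], gap])
y2 = (pairs["rate_next"] - pairs["lg_rate_next"]).to_numpy()
fit2 = LinearRegression().fit(X2, y2)
print(fit2.score(X2, y2))      # again ~0.01
```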

Extreme examples make this apparent. I checked the worst 20 INT rates of this period, measured by the difference between the quarterback's INT rate (defined as 100*INT/attempts) and the league average. This group includes seasons like 1996 Dave Brown, 1992 John Elway and 2006 Ben Roethlisberger. On average, these 20 QBs had a 4.89 INT rate while the league average that year was 3.25, making them 1.65 INTs per 100 attempts worse than average. The following year, these QBs were at 3.33 INT/100 attempts when the league average was 3.22. In other words, after posting the very worst interception rates in the sample, they were barely worse than average the next year.

On the flip side, the 20 best QBs at avoiding interceptions averaged 1.45 INT/100 ATT when their league average was 3.31, for a difference of 1.87 fewer interceptions per 100 passes than average. The next year? These same QBs, on the same teams, averaged 3.10 INT/100 ATT when the league average was 3.28; that means that for every 552 passes they threw, they had just one fewer interception than average. For all practical purposes, there is no correlation between INT rates from season to season.
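
That 552 figure is just the rate gap inverted: a QB whose INT rate sits some delta below average saves one interception every 100/delta attempts. A quick check with the rounded rates quoted above (the article's 552 presumably comes from unrounded values):

```python
# One fewer INT than average every 100/delta attempts, where delta
# is the gap in INT rate (INTs per 100 attempts).
lg_rate, group_rate = 3.28, 3.10
delta = lg_rate - group_rate   # 0.18 fewer INTs per 100 attempts
print(100 / delta)             # ~556 attempts per interception saved
```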

Brett Favre has often been a punching bag for INT jokes, and in his 16 seasons in Green Bay he threw 286 interceptions. However, if a league-average QB had thrown exactly as many attempts as Favre threw in those seasons, he would have thrown 283 interceptions. Favre may have thrown a bunch of INTs for an elite QB, but relative to his number of attempts, he didn't throw any more than an average QB would have.
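
That 283 is just league-average expectation applied to Favre's workload, season by season. A sketch, with made-up (attempts, league rate) pairs standing in for his actual 16 seasons:

```python
# A league-average QB's expected INTs over the same workload:
# sum over seasons of attempts * league INT rate / 100. These
# (attempts, league rate) pairs are placeholders, not Favre's
# actual season-by-season numbers.
seasons = [(570, 3.3), (540, 3.1), (590, 3.2)]
expected = sum(att * rate / 100 for att, rate in seasons)
print(round(expected, 1))  # with the real 16 seasons, this lands at 283
```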

QBs who throw a lot of passes will have a lot of interceptions. There's no denying that. But based on the numbers I've just examined, I see no reason to project certain QBs (except, say, rookie QBs) to have significantly more INTs than average, or to project other QBs significantly below average.

But wait, you say. Peyton Manning posted a better-than-average INT rate for six straight years. Tom Brady has never had a worse-than-average INT rate. Those are good points, so let's investigate further.

I checked all QBs with over 250 passes in one year, over 200 passes the next season, and over 100 in the third season, all while never changing teams. 93 QBs met those criteria and posted better-than-average INT rates in each of the first two seasons. On average, the group threw 0.81 fewer INTs than average in Year N and 0.83 fewer INTs than average in Year N+1; in the third year (Year N+2), the group threw 0.40 fewer INTs than average. In other words, they retained about half of their non-turnover-prone ability.

I also checked QBs meeting the same criteria but who were worse than average for two straight years. Obviously the sample size was smaller -- only 35 QBs met those cutoffs. They threw 0.72 more INTs than average in Year N, 0.68 more INTs than average in Year N+1, and then just 0.26 more INTs than average in Year N+2. So they retained some of their recklessness, if you will, but not all that much of it.

Last check -- how about three straight years? 50 QBs had 250+, 200+ and 200+ attempts with better-than-average INT rates for the same team in three straight years, and threw at least 99 passes in the fourth season. How did they do? In Year N, they were 0.84 INT below average; in N+1, 0.83 INT below average; in N+2, 0.82 INT below average. In the fourth season? Just 0.47 INT below average.

Only 11 QBs were worse than average in three straight years and met the above criteria; they threw 0.57, 0.67 and then 0.94 more INTs than average in Years N, N+1 and N+2 respectively. In Year N+3, after three years of reckless throwing, these eleven QBs on average threw just 0.11 more INTs than average.
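
All four of those checks follow the same pattern. Here's a rough sketch of the first one (better than average two years running), again against the hypothetical qb_seasons table, now assuming a precomputed gap column (QB INT rate minus league rate; negative means fewer picks than average):

```python
# Each row plays the role of year N+1; shift(1)/shift(-1) look at
# the same player's previous and next seasons.
qb = qb.sort_values(["player", "year"]).reset_index(drop=True)
g = qb.groupby("player")
prev_gap, next_gap = g["gap"].shift(1), g["gap"].shift(-1)
prev_att, next_att = g["att"].shift(1), g["att"].shift(-1)
same_team = qb["team"].eq(g["team"].shift(1)) & qb["team"].eq(g["team"].shift(-1))
consecutive = qb["year"].eq(g["year"].shift(1) + 1) & qb["year"].eq(g["year"].shift(-1) - 1)

# Better than average in both year N and year N+1, with the article's
# attempt cutoffs (250+, 200+, 100+); year N+2 is the test.
mask = (same_team & consecutive
        & (prev_att >= 250) & (qb["att"] >= 200) & (next_att >= 100)
        & (prev_gap < 0) & (qb["gap"] < 0))
print(prev_gap[mask].mean(),       # year N:   ~ -0.81 in the article
      qb.loc[mask, "gap"].mean(),  # year N+1: ~ -0.83
      next_gap[mask].mean())       # year N+2: ~ -0.40, half the edge
```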

Outside of that last look, the sample sizes in these groups are pretty good. What general conclusions would I draw?

Outside of perhaps very raw or very old QBs (no data provided here, just a gut feel), I would project the following (a rough code sketch of these rules follows the list):

  1. For QBs who have consistently thrown fewer INTs per attempt than league average, I'd project about halfway between league average and the QB's recent average. For example, the past three years, Manning has thrown about 0.87 fewer INT per 100 passes than average. Brady, in 2005-2007, was about 1.01 INT below average. I would project Manning to throw about 0.5 INT fewer than average and Brady 0.6 INT below average. Projecting any fewer interceptions than that would be inappropriate.
  2. For QBs who had a great INT rate last year (say, -1.00 fewer INT than average) and no history of being able to consistently do that, I'd project the league average INT rate.
  3. For QBs who had a terrible INT rate last year (say, +1.00 more than average) and no history of being consistently bad, I'd project the league average.
  4. For QBs who had a better-than-average (or worse-than-average) INT rate last season and some evidence to suggest it was not a fluke, I would project them to be just slightly better (or worse) than league average this year. Anything more than 0.25 INT fewer (or more) than average seems wrong to me.
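
Here is the sketch promised above: the four rules as one small function. The halfway split and the 0.25 cap are the article's numbers; the function name, signature, and my test for what counts as "consistent" are my own assumptions.

```python
def project_int_rate(lg_avg, recent_gaps):
    """Sketch of the four projection rules above.

    lg_avg: projected league INT rate (INTs per 100 attempts).
    recent_gaps: (QB rate - league rate) for his recent seasons,
    most recent last; negative means fewer picks than average.
    """
    if not recent_gaps:
        return lg_avg                       # no track record: average
    last = recent_gaps[-1]
    consistent = len(recent_gaps) >= 2 and all(
        gap * last > 0 for gap in recent_gaps)  # same side every year
    if not consistent:
        return lg_avg                       # rules 2 and 3: one-year blips
    edge = sum(recent_gaps) / len(recent_gaps) / 2  # keep half the edge
    if len(recent_gaps) >= 3:
        return lg_avg + edge                # rule 1: long, consistent run
    return lg_avg + max(-0.25, min(0.25, edge))    # rule 4: cap at 0.25
```

For example, with a 3.2 league rate and three straight Manning-like seasons, project_int_rate(3.2, [-0.9, -0.8, -0.9]) returns about 2.77, i.e., roughly 0.43 INT per 100 attempts better than average.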

Really short conclusion: Just about every QB should be projected to have an average INT rate in 2008, unless they've been really good or really bad for multiple seasons, or are really young or really old.

Finally, there's another way of looking at interception rates. Any individual interception is a rare event; most QBs get picked off only two to four times per 100 passes. Therefore, we would expect some long streaks with no interceptions, and we would also expect some QBs to have several interceptions randomly bunched together, ruining their interception rate. I used a random number generator to create 25 quarterbacks, each of whom throws 500 passes, with each QB having a 3 percent chance of an INT on any one pass. That is to say, if each QB threw 500 million passes, they would all throw right around 15 million interceptions. But on only 500 attempts, how many INTs did those 25 QBs throw? (A code sketch of this simulation appears after the list.)

  • 9
  • 11
  • 12
  • 12
  • 13
  • 13
  • 14
  • 14
  • 14
  • 14
  • 14
  • 15
  • 15
  • 15
  • 15
  • 16
  • 16
  • 16
  • 16
  • 17
  • 17
  • 19
  • 20
  • 21
  • 22
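
If you want to rerun the experiment, the whole simulation is a few lines; the seed and the exact output are arbitrary, the spread is the point:

```python
import random

random.seed(7)  # any seed; the shape of the spread is what matters

# 25 QBs, 500 attempts each, a 3 percent chance of a pick per throw.
ints = sorted(sum(random.random() < 0.03 for _ in range(500))
              for _ in range(25))
print(ints)  # centered near 15, but routinely running from ~9 to ~22
```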

One QB, by luck of course, threw just 9 INTs out of 500 passes (1.8 INT/100 attempts) even though morally he should have thrown 15 interceptions. Three QBs threw 20 or more INTs, even though they too were morally just 15 INT guys. In fact, the distribution of our 25 QBs looks pretty similar to the distribution in any given NFL season. That is to say, given 25 QBs who throw 500 passes, even if all QBs had the exact same INT rate, we would expect a couple of QBs to have really low INT rates on just 500 attempts and some QBs to have really high ones. With so few attempts and each event being rare, these are the results we should expect.

In my opinion, that's pretty much what happens. The vast majority of QBs (close to three-fourths of the league, perhaps) have roughly the same INT rate. In any given year, some will be lucky and won't throw many interceptions; conversely, some will be unlucky and will throw a bunch. But the next season, there's no reason to project those guys to be anything other than average when it comes to throwing picks.