The Continuing Rise in Strikeout Rate

In yesterday’s post on the early home run trend of 2013, I noted that strikeout rate was up again, continuing a long-running climb. At this point, the continuing rise of strikeout rate isn’t a new story, and most of you are probably aware that Major League Baseball is essentially setting a new record high for league average K% each season.

One of the main theories for the ever-increasing strikeout rate is the simultaneous increase in pitcher velocity. It used to be that Randy Johnson was a freak because he could touch 100 mph with his fastball, but now it seems like every team in baseball has a guy who can hit that mark. We don’t have velocity data for most of baseball’s history, but we do have PITCHf/x velocity data since 2007 and BIS velocity data going back to 2002. While the two sources differ somewhat in how they classify pitch types, both support the idea of rising velocity.

The BIS data has the average fastball going from 89.9 mph in 2002 to a peak of 91.6 mph last year, a nearly 2 mph rise in average fastball speed over the last 11 years. We know that velocity and strikeout rate are highly correlated, so a league-wide rise in pitch speed would explain why strikeout rate keeps going up and up.
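As a quick illustration of that relationship, here’s a minimal sketch of how one might check the velocity/K% correlation across pitcher seasons. The file name and the "avg_fb_velo" and "k_pct" columns are hypothetical stand-ins for whatever leaderboard export you have on hand, not an actual FanGraphs data feed:

```python
import pandas as pd

# Hypothetical export: one row per qualified pitcher season, with that
# pitcher's average fastball velocity and strikeout rate.
seasons = pd.read_csv("pitcher_seasons.csv")

# Pearson correlation between average fastball velocity and K%.
r = seasons["avg_fb_velo"].corr(seasons["k_pct"])
print(f"velocity / K% correlation: r = {r:.2f}")
```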

However, we also know that average velocity generally starts lower in the cold month of April and tends to rise as the season goes on and the summer gets warmer. So, since we were already digging into monthly trends versus full-season numbers, I thought it might be worth looking at whether strikeout rate follows a similar path. If velocity is driving the rise in strikeout rate, and velocity is at its lowest point in April, then strikeout rate should also be at its lowest point in April.

To test whether that’s true, I got our pal Jeff “Not The Pitcher, Player Linker” Zimmerman to send me monthly strikeout data going back to the start of the 2002 season. Here are the average K% totals for each month of the season from 2002 to 2012:

Month K%
April 17.5%
May 17.2%
June 17.3%
July 17.5%
August 17.6%
September 18.3%
Overall 17.6%

During the first five months of the season, there’s no real K% trend. April’s K% has actually been slightly higher than May’s or June’s, and the league gets back to April rates by the middle of the summer. There is, however, a huge spike in September. The most obvious explanation for the September strikeout spike is the expansion of rosters: more available players lets managers play matchups more aggressively with their relievers, and the influx of minor league bats may be driving up the strikeout rate for hitters as well. But, putting that structural change aside, there doesn’t appear to be a significant difference in strikeout rate between the beginning of the season, when average velocity is at its lowest, and later in the summer, when pitchers are throwing harder.
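For anyone who wants to reproduce this kind of monthly split, here’s a rough sketch of the calculation. The file and column names are hypothetical, assuming a table with one row per plate appearance:

```python
import pandas as pd

# Hypothetical plate-appearance-level table: a "game_date" column and a
# boolean "is_strikeout" column, one row per PA.
pa = pd.read_csv("plate_appearances_2002_2012.csv", parse_dates=["game_date"])
pa["month"] = pa["game_date"].dt.month_name()

# K% by month: the share of plate appearances ending in a strikeout.
monthly_k = pa.groupby("month")["is_strikeout"].mean().mul(100).round(1)
print(monthly_k)
```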

Perhaps a chart will help here. Here is the overall rise in strikeout rate from 2002 to 2013, by month:

[Chart: league K% by month, 2002-2013]

The September spikes stand out, but there’s another thing the chart shows that I hope you’ve noticed: the drastic increase in K% since 2008. From 2002 to 2007, K% hovered just below 17% and trended up at a slow pace, but then it took off in 2008, jumping to 17.5%, then 18.0%, then 18.5%, and so on.

Now, here we stand in April of 2013, and despite an average fastball velocity equal to what the league posted overall in 2010, we’re on track for the second-highest K% month on record, behind only last September. So, what happened in 2008 that continues to this day, and that doesn’t seem to be driven primarily by the seasonal rise in velocity?

Well, 2008 was the year after MLB Advanced Media and Sportvision installed PITCHf/x cameras in major league stadiums, and it was the first year that umpires could potentially have been acting upon a directive from the league based on that data. MLB had previously been using the QuesTec system to evaluate umpires, but that system was only installed in about a third of MLB ballparks, according to this 2009 article from the New York Times. In that article, former head of umpires Mike Port described how the “Zone Evaluation” system would be an improvement:

“It’s an upgrade from where we were,” Port said in a telephone interview. “The umpires, they don’t want to miss a pitch any more than a batter wants to strike out. Where the Z.E. system will give us a lot of help is more data to help identify any trends: ‘The last three plate jobs, you missed seven pitches that were down and in. Here’s how one of the supervisors can help you adjust your head angle or your stance to have a better chance of getting those pitches.’”

Over the last 30 years, the strikeout rate in MLB has gone from 14.0% to the 20.0% it stands at today. It took 24 years to move from 14% to 17%, but only six years to move from 17% to 20%; that’s roughly 0.13 percentage points per year versus 0.5 points per year, a fourfold acceleration. Those six years correspond perfectly to the PITCHf/x era.

And, while correlation is clearly not the same as causation, there is other data suggesting that the rising strikeout rate could very well be attributable to changes in umpiring. James Gentile wrote a piece at Beyond the Box Score a few weeks back showing that called strikes are increasing at a much faster rate than swinging strikes. Borrowing an important image from that piece:

[Chart: called and swinging strikes per plate appearance, via Beyond the Box Score]

Swinging strikes are up too, but called strikes have risen at a faster rate, and of course the two are somewhat linked; more called strikes mean more pitcher’s counts, and in pitcher’s counts batters are more likely to chase pitches out of the zone than they would have been had the calls gone the other way.

As Gentile notes, the fact that called strikes are up does not necessarily mean that umpires are simply calling a larger strike zone; pitchers could be trusting their newfound velocity and nibbling less, or perhaps they’re just getting better at hitting their spots. This data doesn’t prove that the strike zone is expanding.

But, again, we have PITCHf/x data since 2007, and if pitchers were pounding the zone more often, driving the increase in called strikes, we should be able to find it in the data. Since 2007, here’s the rate of pitches that PITCHf/x has labeled as in the strike zone:

Season PA Zone%
2007 188623 49.3%
2008 187631 49.9%
2009 187079 50.1%
2010 185553 50.1%
2011 185245 49.8%
2012 184179 49.2%
2013 9828 48.4%

PITCHf/x isn’t perfect, and there is a margin of error on these numbers, but I don’t see a significant rise in pitches in the strike zone there. Again, this isn’t conclusive: the extra called strikes could be coming earlier in counts, leading pitchers to bury more pitches out of the strike zone later in counts, with the two effects offsetting each other in the aggregate data. This just isn’t a granular enough study to conclude that the strike zone has gotten bigger.
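For those inclined to dig deeper, here’s a minimal sketch of that more granular check, splitting zone rate by count instead of aggregating over the whole season. The file and column names are hypothetical, and the +/- 0.83 foot half-width is just a common working approximation of the rulebook zone, not necessarily the definition any particular PITCHf/x tool uses:

```python
import pandas as pd

# Hypothetical pitch-level table: plate-crossing coordinates in feet
# ("px" horizontal, "pz" vertical), the batter's measured zone top and
# bottom ("sz_top", "sz_bot"), and the count before the pitch.
pitches = pd.read_csv("pitches.csv")

# Approximate zone width: the 17-inch plate plus roughly a ball radius
# on each edge works out to about +/- 0.83 feet from the plate's center.
HALF_WIDTH = 0.83

pitches["in_zone"] = (
    pitches["px"].abs().le(HALF_WIDTH)
    & pitches["pz"].between(pitches["sz_bot"], pitches["sz_top"])
)

# Zone% split by count, so early-count and late-count effects can't
# offset each other inside a single season-level number.
zone_by_count = (
    pitches.groupby(["balls", "strikes"])["in_zone"].mean().mul(100).round(1)
)
print(zone_by_count)
```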

But, I think the data is at least pointing in that direction. The drastic uptick in strikeouts since PITCHf/x cameras were installed seems like too big a coincidence to ignore, especially since the called strike is on the rise more than the swinging strike. If the main driver of the increase in strikeout rate were velocity, or simply more hard-to-hit pitches, we’d expect to see K% trend upward as velocity climbed throughout the season. And if hitters were more willing to trade strikeouts for a better chance at a home run, that trade-off should be more pronounced in the summer, when the ball actually travels better and rewards that philosophy more often.

If the main driver of strikeout rate is not the players, though, then we wouldn’t expect the environmental effects of temperature and velocity to show up in strikeout rate over the course of the season. And, looking at the monthly data, we don’t really see those effects. That in itself isn’t enough to prove that the umpires, and not the players, are driving strikeout rate up, but the timing of the strikeout leap suggests to me that PITCHf/x data is having a larger impact on the game than perhaps anyone anticipated.

Dave is the Managing Editor of FanGraphs.

39 Comments
LionoftheSenate
11 years ago

Nice work.

Remember when tons of people were foolishly crying about QuesTec being in ballparks? Typical: lots of people rip baseball for every move it makes. The people who ripped QuesTec look as silly as ever.

eye-roll
11 years ago

One of those who protested most spectacularly was Curt Schilling, yet this suggests he should have been more helped than hurt by it overall.

Bill
11 years ago

Schilling’s complaint was that umpires called games differently in QuesTec parks than they did elsewhere; he was upset at the lack of consistency. With well-calibrated machines in every park, this is no longer an issue. If you read the SI article linked in eye-roll’s post, you’ll see that the QuesTec system was flawed; it sounds like it relied on a human operator. The people who ripped QuesTec weren’t, for the most part, ripping the idea; they were ripping the implementation. They helped bring about the change to the far superior PITCHf/x systems now installed. PITCHf/x doesn’t make the QuesTec rippers look silly; it makes them look right.