Yesterday, Brian Kenny and I spent a few minutes talking about relief pitchers on Clubhouse Confidential, and specifically, about the differences in the role of a middle reliever versus a closer. Both Kenny and I believe that the idea of a “closer mentality” is mostly a myth, but we do spend some time talking about why some guys aren’t cut out for the traditional closer role. If you want to watch the segment, I’ve embedded it after the jump, and will expand on one of the things I said on the show below that.
When Kenny asked me what the best way to gain value from a relief ace was, I pointed out that I preferred the method of bullpen usage that was in place before the rise of the save as a statistic of importance. Obviously, the structure of the bullpen has changed a lot over the last 30 years, and the adoption of specialist relievers and one-inning stints have led to larger pitching staffs and far more frequent pitching changes. Managers are being more aggressive than ever in exploiting platoon advantages and trying to limit the amount of innings their relievers work in order to increase effectiveness when they do pitch.
Back before the creation of the modern bullpen, it wasn’t at all unusual to see a reliever throw 100+ innings in a season. In fact, Bob Stanley’s 1982 season is perhaps one of the most interesting years a pitcher has had in quite a while – he appeared in 48 games, went 12-7, racked up 14 saves, and threw 168 1/3 innings in the process. He didn’t start a single game the whole year, but he finished fourth on the team in total innings pitched and threw just seven fewer innings than Mike Torrez, who started 31 games for the Red Sox that season. For comparison, Stanley faced 694 batters in ’82, while Jeff Samardzija led all Major League relievers in 2011 with 380 batters faced.
The change in bullpen usage is the biggest difference in the sport now compared to 30 years ago. For reference, here’s the average number of batters faced per relief appearance for each year since 1982:
The downward trend is so strong that the average is lower than the previous year’s in almost every pair of consecutive seasons during the timeframe. And despite the fact that modern bullpen roles have been well established for quite a while, the dwindling rate of batters faced per appearance shows no signs of slowing down. While the drop from 1982-1991 was the most extreme, each of the last two decades has seen the league shed an additional half a batter per reliever appearance, and given that we’ve now seen teams carry 13 pitchers at times, there seems to be no end in sight to this trend.
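The metric behind that trend – average batters faced per relief appearance, by season – is straightforward to compute from game logs. Here’s a minimal sketch, assuming a hypothetical list of (season, batters_faced) records rather than any real data source:

```python
from collections import defaultdict

def batters_faced_per_appearance(appearances):
    """Average batters faced per relief appearance, grouped by season.

    `appearances` is a list of (season, batters_faced) tuples, one per
    relief appearance -- a stand-in for real game-log data.
    """
    totals = defaultdict(lambda: [0, 0])  # season -> [batters, appearances]
    for season, bf in appearances:
        totals[season][0] += bf
        totals[season][1] += 1
    return {season: round(b / g, 2) for season, (b, g) in sorted(totals.items())}

# Toy data illustrating the shape of the trend: longer outings in 1982,
# one-inning stints in 2011. The numbers here are invented.
sample = [(1982, 14), (1982, 10), (1982, 9), (2011, 4), (2011, 5), (2011, 3)]
print(batters_faced_per_appearance(sample))  # {1982: 11.0, 2011: 4.0}
```

Run over actual relief-appearance logs, this is exactly the per-year average the chart plots.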
Teams have transitioned away from a few pitchers carrying large loads (the 1982 Red Sox got 1,453 innings by using just 14 different pitchers, while the 2011 Red Sox got 1,457 innings from 27 pitchers) into a model where a lot of pitchers carry significantly smaller loads. In 1982, there were 64 relievers who faced 300 batters or more – in 2011, there were 25. And yet, the share of innings pitched by relievers as a whole hasn’t changed all that much, increasing from 30.6% in 1982 to 32.6% last year. The change in bullpen management has been much more about redistributing relief innings from few arms to many, rather than asking bullpens to carry a larger share of the load, even as bullpens take up a larger portion of the roster.
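The redistribution is easy to quantify using only the figures quoted above (the relief-share percentages come straight from the text, not from this calculation):

```python
# Innings per pitcher used: a few arms carrying big loads in 1982
# vs. many arms carrying small ones in 2011.
red_sox_1982 = 1453 / 14   # ~103.8 innings per pitcher used
red_sox_2011 = 1457 / 27   # ~54.0 innings per pitcher used
print(round(red_sox_1982, 1), round(red_sox_2011, 1))  # 103.8 54.0

# Meanwhile, the league-wide relief share of innings barely moved.
share_1982, share_2011 = 0.306, 0.326
print(f"relief share change: {(share_2011 - share_1982) * 100:.1f} points")
```

Nearly identical total innings, spread across almost twice as many pitchers – that’s the whole story of the modern bullpen in two lines of arithmetic.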
Since teams have dedicated more roster spots to relief pitchers in order to facilitate more situational match-ups and minimize the wear and tear on their best relievers, they’d clearly be expecting some return on that investment. To justify the extra roster spots and the redistribution of innings, they’d need to see a performance improvement that makes the change pay off. And really, with pitchers facing fewer batters, you’d expect them to throw harder and exploit platoon advantages for better results overall. The trade-off should be more quality for less quantity.
But, looking at the numbers, we don’t really see much evidence that the modern bullpen has helped relievers perform better at all.
Over the last thirty years, walk rates by relievers are essentially unchanged. They went up a bit when the home run barrage took over in the late 1990s, but have come back down as home runs have become less common. The ratio of walks to home runs has been pretty steady and consistent over the last thirty years, and there’s certainly no evidence that the modern-day bullpen has helped pitchers avoid the base on balls.
On the other hand, strikeout rate has skyrocketed, increasing by 40% since 1982. This would seem to support the idea that relievers can be more effective in shorter stints, and that playing the match-ups can help prevent run scoring. However, there’s a problem with that theory – the strikeout rate of starting pitchers has gone up 41% during the same time frame. While strikeout rate has been rising at the same time that the modern bullpen has been evolving, this seems to be a case where correlation is not causation. If starters are seeing the same rise in strikeout rate, that points to a more fundamental shift among hitters – more sluggers swinging for the fences, and growing organizational acceptance of the strikeout as just another out – rather than a specific benefit relievers are getting from their new roles.
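To make the correlation-vs-causation point concrete: a 40% rise for relievers against a 41% rise for starters means the two groups moved almost in lockstep. A quick sketch – the 1982 baseline strikeout rates below are illustrative placeholders; only the percent increases come from the text:

```python
def pct_increase(old, new):
    """Percent change from old to new."""
    return (new - old) / old * 100

# Hypothetical 1982 baselines (per-batter K rates); the ~40% and ~41%
# increases are the figures cited in the article.
reliever_k_1982, reliever_k_2011 = 0.135, 0.189   # 0.135 * 1.40
starter_k_1982, starter_k_2011 = 0.130, 0.1833    # 0.130 * 1.41
print(round(pct_increase(reliever_k_1982, reliever_k_2011)))  # 40
print(round(pct_increase(starter_k_1982, starter_k_2011)))    # 41
```

A one-point gap over thirty years leaves essentially nothing for the modern bullpen structure to claim credit for.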
Likewise, it doesn’t appear that relievers are generating much of a benefit when hitters do put the bat on the ball. Home run rates have risen at a pace similar to what starting pitchers have experienced, and likewise, batting average on balls in play has increased significantly over the years. While relievers do post slightly lower BABIPs than starters (theorized to be the result of being able to throw harder in shorter stints), this was true even when relief pitchers were throwing multiple innings and carrying much heavier workloads.
In fact, if you look at the sum of the components (in the table above, that’s ERA- and FIP-), there’s just no evidence that bullpens are preventing runs at a better rate now than they were before the current roster construction norms came along. Any improvements in quality of performance by the elite relievers have been offset by the fact that more innings are now being given to inferior arms, so the trade-off has essentially resulted in a change of no real benefit.
We’re told that defined roles are supposed to make a reliever’s job easier by giving him a usage pattern he can adapt to. This makes sense from an intuitive standpoint, but the results don’t really show much of an effect. Teams have essentially taken two roster spots away from position players and handed them to the bullpen without seeing a tangible improvement in performance from their relievers overall.
The ROI on the modern bullpen just isn’t there. At some point, someone is going to get back to using relievers the way they were used 30 years ago. The current paradigm takes up too many roster spots and simply shifts innings from your best arms to your worst ones. There’s a time and place for playing the match-ups, but if you have a guy who can get batters out from both sides of the plate, there’s no reason he should only be used for an inning at a time. Instead of ridiculing the Braves for how they used Jonny Venters and Craig Kimbrel, perhaps we should be applauding them for refusing to give in to a trend of roster usage that just hasn’t provided any real sustained benefit.
Dave is the Managing Editor of FanGraphs.