It’s time for interleague play again. Even more so than the interminable disputes about which “style of play” is aesthetically superior, or the complaints about the fairness of the presence or absence of the DH in away games, perhaps the most contentious debate among fans (contentious despite the overwhelming evidence on one side) is whether interleague play proves that the American League has been significantly stronger than the National League for at least a decade, no matter what this fine representative of the Best Fans in Baseball believes:
The American League’s domination of interleague play for an extended period of time is good evidence for its superiority, whatever the causes of that superiority might be. However, some will point to individual players as independent demonstrations. For example, Matt Holliday was a great hitter with the Rockies through 2008. He started the 2009 season in Oakland and “struggled” relative to what he’d done before. Some people attributed that simply to his being a product of Coors Field (sigh), but when he was traded to St. Louis, he started raking at almost the same level as before. It must be the league, right?
Or how about Pat Burrell, who came off a number of successful seasons in Philadelphia, signed with Tampa Bay, then bombed so badly for a season-and-a-half the Rays let him go for nothing in 2010. He then signed with San Francisco and tore the cover off the ball to help the Giants on their way to a World Series Championship.
Naturally, it is silly to argue from individual cases to a league-wide issue. However, I wondered if taking all the cases like Holliday’s and Burrell’s and putting them together might show us something about the relative strength of leagues, both now and in the past.
I simply took all the players who, like Burrell in 2010 and Holliday in 2009, had hit in both leagues during a given season (due to a trade, being released and then picked up, etc.) and compared their wRC+ (helpfully park-adjusted and set so that the league average is always 100) in each league during that season. We want a “group picture,” so I took all the players who did so (excluding pitchers) and got the wRC+ for the group in each league, with each player weighted by his plate appearances in that league. If the group hit better in one league, that might indicate that league is weaker. There are limits to this approach (more on those below), but at least by looking at same-season performance we mostly don’t have to worry about taking player aging into account. I did this for every season from 1960 through 2010. Here is a graph of the results (click to enlarge):
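The comparison described above can be sketched in a few lines of code. This is only an illustration, not the author’s actual query: the data structures and wRC+ values below are hypothetical, and real numbers would come from a stats database such as FanGraphs.

```python
# Sketch of the two-league hitter comparison. Each player-season maps a
# league to a list of (PA, wRC+) stints; all names and numbers here are
# hypothetical stand-ins for real data.

def weighted_wrc_plus(stints):
    """PA-weighted wRC+ over a list of (pa, wrc_plus) stints."""
    total_pa = sum(pa for pa, _ in stints)
    return sum(pa * wrc for pa, wrc in stints) / total_pa

def two_league_comparison(player_seasons):
    """Pool the stints of players who batted in BOTH leagues in a season,
    and return the group's PA-weighted wRC+ in (AL, NL)."""
    al_stints, nl_stints = [], []
    for season in player_seasons:
        if season["AL"] and season["NL"]:   # keep only two-league hitters
            al_stints.extend(season["AL"])
            nl_stints.extend(season["NL"])
    return weighted_wrc_plus(al_stints), weighted_wrc_plus(nl_stints)

# Hypothetical example: one Holliday-like midseason trade, one other
# two-league hitter, and an NL-only player who gets excluded.
players = [
    {"AL": [(346, 122)], "NL": [(270, 175)]},
    {"AL": [(200, 90)],  "NL": [(100, 95)]},
    {"AL": [],           "NL": [(500, 110)]},
]
al_group, nl_group = two_league_comparison(players)
```

Here the group would come out well above average in the NL and closer to average in the AL, which under the logic above would (very tentatively) point toward the NL pitching being the softer environment that season.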
One thing to remember when reading this graph: if the reading is “higher” for one league, that might be taken to mean that league is weaker. In recent years, hitters who hit in both leagues during a single season have hit better in the National League than in the American League, although in 2010 they hit almost exactly the same (87 wRC+ in the AL, 89 wRC+ in the NL). The earlier history is also interesting because there was no interleague play in that period. I originally had a number of charts with ratios, moving averages, and the like, but found them too cluttered.
A more simplistic, but hopefully still illustrative, approach to this history is to take bigger chunks of time. I took the same pool of players, but grouped them into pseudo-decades: the 60s, 70s, 80s, 90s, and 2000s. (I say “pseudo-decades” because I counted, e.g., 1960-1969 as a “decade,” even though decades really start in year “1,” not “0,” and because 2010 gets grouped in with the 2000s. This is an admittedly crude approach meant only to give an overview; I don’t remember why I did it this way, and I was too lazy to go back and change it, but I don’t think it hurts the overall perspective we’re trying to get.) Once again, click to enlarge:
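The pseudo-decade bucketing can be sketched as follows. Again, this is a hypothetical reconstruction, not the author’s actual code, and the sample rows are made up.

```python
# Group player-season rows into the pseudo-decades used above
# (1960-1969 counted as the "60s", etc., with 2010 folded into the 2000s),
# then compute the PA-weighted wRC+ per league within each bucket.
from collections import defaultdict

def pseudo_decade(year):
    """Map a season to its pseudo-decade label, e.g. 1975 -> '70s'.
    2000-2010 all land in the '2000s' bucket, as in the text."""
    if year >= 2000:
        return "2000s"
    return f"{(year // 10) * 10 - 1900}s"

def group_by_decade(rows):
    """rows: (year, league, pa, wrc_plus) tuples for two-league hitters.
    Returns {decade: {league: PA-weighted wRC+}}."""
    acc = defaultdict(lambda: defaultdict(lambda: [0.0, 0]))  # [pa*wrc sum, pa sum]
    for year, league, pa, wrc in rows:
        bucket = acc[pseudo_decade(year)][league]
        bucket[0] += pa * wrc
        bucket[1] += pa
    return {dec: {lg: s / n for lg, (s, n) in leagues.items()}
            for dec, leagues in acc.items()}

# Hypothetical sample rows.
rows = [
    (1965, "AL", 100, 90), (1965, "NL", 100, 110),
    (2005, "AL", 50, 93),  (2010, "AL", 50, 87),
]
by_decade = group_by_decade(rows)
```

Note that weighting by plate appearances within the whole bucket (rather than averaging yearly averages) keeps a thin season from counting as much as a deep one.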
Both graphs correspond pretty well with other accounts of relative league strength I’ve read. For the past couple of decades (at least since expansion), hitters who played in both leagues during a season have generally had an easier time of it in the National League. They did in the 1980s overall, too, although not by as much, and not at the beginning of that decade. The National League was clearly better in the 1970s. I was a bit surprised by the 1960s, since I’ve read that the National League was stronger then (in part due to the lingering effects of the more rapid racial integration of the Senior Circuit). I don’t know what to say about that, but I wanted to point it out.
There are limits as to what this actually tells us. I didn’t do the same thing for pitchers because that is much more complicated (two examples: a) separating performances as starters and relievers for players who did one with one team and then switched when traded; b) redoing all the stats to exclude pitching against pitchers in the National League). What I’ve done here doesn’t tell us whether the difference was because the hitters in the stronger league were better or the pitchers in the weaker league were worse. Moreover, by itself it doesn’t prove one league was better over one season or multiple seasons. Looking at the data, you can see the relatively small sample of players who hit in both leagues, particularly prior to the 1990s. For use in, e.g., a projection system, one would also need to include regression and enlarge the sample by using multiple years (which means taking aging into account), which brings up some of the issues raised by MGL.
Having said all of that, I do think this at least demonstrates that there is evidence other than interleague play for the American League’s current (and perhaps fading?) superiority… and that it hasn’t always been the case.
[One last note: I started in 1960 because some of the wRC+ [wOBA] components such as caught stealings and intentional walks (which are excluded from wOBA calculations) weren’t tracked until the mid-1950s, and I wanted to use the “whole decades” approach.]
Matt Klaassen reads and writes obituaries in the Greater Toronto Area. If you can't get enough of him, follow him on Twitter.