Baseball has seen many changes in the past 100 years. Some changes are significant enough, retrospectively, to define an era. There was the Deadball Era from roughly 1901 to 1919, characterized by an emphasis on pitching, defense, and a low run-scoring environment. The Liveball Era began in 1920, ushered in by Babe Ruth, cleaner baseballs that were easier for batters to see, and rule changes like banning the spitball. When the offense started to overpower the game, more changes were made to temper that environment, like the introduction of the ground-rule double in 1931. Before that, a ball that bounced on the field and over the fence was considered a home run.
There’s Jackie Robinson’s major-league debut in 1947, and the following years when African-Americans were finally permitted to play in the majors. There’s expansion, the lowering of the pitcher’s mound, the introduction of the designated hitter in the American League, free agency, more expansion, newer ballparks, PEDs, testing for PEDs, and an ever-expanding strike zone — all marking the beginning of other, overlapping eras. And then there’s the sabermetrics revolution — using advanced statistical modeling and analysis to construct rosters, manage bullpens, and deploy extreme defensive shifts.
All of these changes in baseball, and yet, for the last 100 years, the offensive hierarchy among defensive positions has remained pretty much the same. First basemen, right fielders, and left fielders produce more offense than the average player; catchers, second basemen, and shortstops produce less. It was that way in 1914, in 2014, and in nearly every season in between. Clean ball, dirty ball, higher mound, lower mound, PEDs, no PEDs — whatever the conditions in the game and on the field, first basemen, left fielders, and right fielders have dominated on offense.
Let’s examine this relationship between offensive skill and defensive position more closely — both the historical averages and the outlier seasons.
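The comparison described above — average offense by defensive position, measured against the league average — can be sketched in a few lines. The data below is a made-up toy sample, not real statistics; in practice one would load full season batting lines (for example, from the Lahman database) with each player-season tagged by primary defensive position.

```python
# Minimal sketch: average OPS by position relative to league average.
# The player-season data here is illustrative, not real.
from collections import defaultdict

# (position, on-base-plus-slugging) pairs for a handful of player-seasons.
player_seasons = [
    ("1B", 0.850), ("1B", 0.820), ("RF", 0.810), ("RF", 0.790),
    ("LF", 0.800), ("LF", 0.780), ("C", 0.680), ("C", 0.700),
    ("2B", 0.690), ("2B", 0.710), ("SS", 0.660), ("SS", 0.700),
]

# League-average OPS across every player-season in the sample.
league_avg = sum(ops for _, ops in player_seasons) / len(player_seasons)

# Group OPS values by position, then express each position's mean
# relative to the league average (positive = above-average offense).
by_position = defaultdict(list)
for pos, ops in player_seasons:
    by_position[pos].append(ops)

offense_vs_avg = {
    pos: sum(vals) / len(vals) - league_avg
    for pos, vals in by_position.items()
}

# Print positions from most to least offense relative to league average.
for pos, diff in sorted(offense_vs_avg.items(), key=lambda kv: -kv[1]):
    print(f"{pos}: {diff:+.3f}")
```

With real historical data, the same grouping would show the hierarchy the text describes: first base and the corner outfield spots above the line, catcher and the middle infield below it, season after season.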