Exactly two months ago, I posted my first in-season BIP-based park factor update. BIP-based, you say? Basically, I’ve taken every batted ball hit in every park, applied major-league-average production for its exit speed/launch angle bucket, incorporated run values, and scaled the resulting projected production to a league average of 100. It’s now time for midseason update #2, as of the All-Star break.
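For readers who want to see the mechanics, here’s a minimal sketch of that bucketing idea. This is an illustrative assumption, not my actual implementation: the bucket granularity, field layout, and run values are all placeholders.

```python
from collections import defaultdict

def bucket(ev, la, ev_step=5.0, la_step=10.0):
    """Map a batted ball to an exit-speed/launch-angle bucket (steps are assumed)."""
    return (int(ev // ev_step), int(la // la_step))

def park_factors(bip_rows, league_rv):
    """bip_rows: (park, exit_velo, launch_angle, actual_run_value) tuples.
    league_rv: MLB-average run value per bucket.
    Factor = actual runs / bucket-projected runs, scaled to a league mean of 100."""
    actual = defaultdict(float)
    projected = defaultdict(float)
    for park, ev, la, rv in bip_rows:
        actual[park] += rv
        projected[park] += league_rv.get(bucket(ev, la), 0.0)
    raw = {p: actual[p] / projected[p] for p in actual if projected[p] > 0}
    mean = sum(raw.values()) / len(raw)
    return {p: 100.0 * r / mean for p, r in raw.items()}
```

The key point the sketch captures: the projection is built from league-wide production on comparably struck balls, so a park only moves off 100 when its actual results diverge from what those batted balls "should" have produced.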
Today, we’ll take a macro, big-picture look at the midseason data, comparing the results to our first 2017 update and to 2016 full-season data. Later this week, we’ll get into the specifics of individual parks, drilling down to the pieces of data that show whether a park is hitter- or pitcher-friendly.
First, let’s go macro-macro, and see how hitters have performed on the major BIP types through the break, through our first 2017 update (May 20), and for the entire 2016 season:
From 2016 to our first 2017 measuring point, there wasn’t much difference in production on all balls in play; .328 AVG-.536 SLG in 2016, .324 AVG-.535 SLG through May 20. You will notice the almost total disappearance of the “null” group, which constituted 14% of all batted balls last season. As I stated in my June 1 piece, the exit speed/launch angle for balls that did not generate a Statcast reading is now being estimated. For details on the process, check out Daren Willman’s Baseball Savant website.
There has been a fairly significant change in production on batted balls between May 20 and July 9. This is largely due to warming temperatures throughout that time period. Through May 20, hitters were batting .313 AVG-.863 SLG on fly balls. By the break, that had risen sharply to .327 AVG-.907 SLG. In that interim period between the two dates, hitters flexed their muscles to the tune of .340 AVG-.949 SLG on fly balls.
Interestingly enough, hitters actually hit the ball with a bit less authority in that interim 5/20-7/9 period than they did earlier in the season. As the weather warms, it takes a little less juice to get the ball over the wall.
Production on liners also ramped up from .650 AVG-.869 SLG through 5/20 to .664 AVG-.890 SLG between 5/20-7/9, for a cumulative .657 AVG-.880 SLG through the All-Star break. It wasn’t an increase in homers driving this spike; it was a rise in doubles. A combination of faster infields and outfields and perhaps gradually fatigued fielders would seem to be driving this trend.
The faster infield/fatigued fielder concepts would also be supported by the increase in grounder production from .222 AVG-.240 SLG through 5/20 to .225 AVG-.244 SLG during the interim period leading up to the break. Overall, hitters produced at a .224 AVG-.242 SLG clip on the ground in the first half.
On all BIP types, hitters batted .324 AVG-.535 SLG through 5/20, .336 AVG-.567 SLG between 5/20-7/9, and .330 AVG-.552 SLG for the entire first half.
This sets the stage for the first half park factors:
Color-coding is used above to note significant divergence from league average. Red cells indicate values more than two full standard deviations above league average. Orange cells are more than one standard deviation above, yellow cells more than one-half standard deviation above, blue cells more than one-half standard deviation below, and black cells more than one standard deviation below league average. Ran out of colors at that point. Variation of more than two full standard deviations below league average will be addressed as necessary in the text below.
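That binning scheme is simple enough to express directly. Here’s a hypothetical helper mirroring the thresholds described above; the function name and the use of the population standard deviation are my assumptions.

```python
from statistics import mean, pstdev

def color_code(factors):
    """factors: dict of park -> park factor. Returns park -> color label,
    binned by standard deviations from the league mean."""
    vals = list(factors.values())
    mu, sd = mean(vals), pstdev(vals)
    def label(z):
        if z > 2.0:
            return "red"      # more than two STD above average
        if z > 1.0:
            return "orange"   # more than one STD above
        if z > 0.5:
            return "yellow"   # more than one-half STD above
        if z < -1.0:
            return "black"    # more than one STD below
        if z < -0.5:
            return "blue"     # more than one-half STD below
        return "none"         # within half a STD of average
    return {p: label((v - mu) / sd) for p, v in factors.items()}
```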
This table lists the overall single-season park factors for each club going back to 2013, with partial-year data through July 9 and May 20 included for 2017. One might opine that single-season factors don’t tell you much and might be too volatile. Well, the average year-to-year correlation coefficient for the 2013-16 annual sets of data above is 0.59, indicating a strong correlation. That occurred despite significant modifications of multiple venues over that time frame.
The correlation between the 7/9/2017 and 2016 park factors is a strong 0.62, up from 0.50 as of 5/20. Take Atlanta and their stadium change out of the mix, and the correlation jumps to 0.65.
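For those who want to replicate this sort of stability check, the year-to-year figures quoted here are just Pearson correlations between two seasons’ park factors, aligned by park. A bare-bones version, with no real data attached:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

In practice you’d sort both seasons’ factor dictionaries by park before passing the values in, and drop any park (like Atlanta’s) that changed venues between the two seasons.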
Let’s look at some long- and short-term trends uncovered in the above table. The Coors Field effect is alive, well, and perennial. There is only one red cell outside of the Rockies’ row of data. All of their cells are red. Their full-season park factors sit in a narrow band between 124.5 and 127.8; thus far in 2017, they are breaking out even further to the upside.
The other teams with typically hitter-friendly parks over the last few years are Boston, Cincinnati, Milwaukee and the New York Yankees. Of that group, the Reds and Brewers are playing in even more hitter-friendly home conditions than ever this season, while the Red Sox and Yankees have seen fairly significant slips in their home park factors in 2017. We’ll talk quite a bit about Fenway later this week.
You might say, Aaron Judge! Homers! Well, if park factors are being calculated correctly, Aaron Judge shouldn’t influence them at all. When he gets all of one, it’s a homer in Yellowstone. This method treats 105+ mph flies and 110+ mph liners differently than their more weakly hit counterparts. Traditional calculation methods do not.
Then there are the perennially pitcher-friendly ballparks. The Angels, Marlins, Athletics, Mariners, Giants and Cardinals have consistently played their home games in pitcher-friendly conditions over recent seasons. It must be said that after some reconfiguration, Marlins Park and Safeco Field aren’t nearly the fly ball graveyards they once were; they’ve been creeping toward the run-neutral group of late.
San Francisco is now the home of the single most pitcher-friendly park in the game. 2017 will mark the third time in the last four years that AT&T Park will have a factor over a full standard deviation lower than MLB average. Oakland is going for three years in a row with that distinction.
Most unpredictable/misunderstood park award goes to Petco Park. Everyone assumes it’s a pitchers’ park… but, noooooo. It was extremely hitter-friendly in both 2013 and 2015 and is on track to be so again this season. In 2014, it was extremely pitcher-friendly, and in 2016 it was neutral. It’s all about the marine layer in San Diego; that and relatively hard field conditions, which consistently allow higher than projected production on both liners and grounders.
Take a moment to look at the difference between the 7/9 and 5/20 columns in the table above. You’ll note that the teams at both extremes have regressed a bit toward the middle as the season has progressed, as you might expect.
In addition, you’ll see that the park factors for a few clubs in geographic regions with more variable temperatures shot upward. Baltimore, Kansas City and Texas would be three prime examples.
Lastly, as a supplement, here are the single-season fly-ball park factors for each club from 2013 through 2016 and as of May 20th and July 9th of this year. Again, there’s a strong correlation: an average of 0.64 for 2013-16 (0.66 excluding Atlanta) and 0.77 from 2016 to the 2017 data as of the All-Star break: