Last night’s broadcast of the Cardinals–Nationals game debuted live, in-game Statcast-enhanced graphics and replays. Statcast is the next-generation player-tracking technology that combines optical and radar measurements, promising new ways to quantify previously unmeasurable aspects of baseball. The game was billed as historic, and here at FanGraphs we even held a special edition of the After Dark Live chat to cover the momentous occasion.
If you were expecting something earth-shattering from Statcast, you were probably disappointed by the slow start once the game began. If you were unable to watch the broadcast, no need to worry: all the important replays were posted on Major League Baseball’s site, and I’m about to review and critique the different elements of the Statcast presentation.
First, a general observation before analyzing specific images and gifs from the game: MLB Network appeared to treat this as a normal broadcast, using Statcast to augment the presentation, not define it. Roughly 90% of the broadcast consisted of traditional camera angles, graphics, replays, and other standard broadcast elements. When Statcast was used, it produced enhanced replays and player-positioning graphics. There were no graphical overlays on live-game action aside from a few pre-pitch positioning graphics; ESPN currently has more detailed live-action pitch tracking with its K-Zone overlay.
The Statcast enhancements can be broken down into two general categories: live player tracking and replay play tracking. Live player tracking can highlight where a player is, who he is, and how far he is from a certain point, like a base. Replay play tracking entails more detailed metrics and includes ball tracking. All of these additional graphics came from a limited number of high-angle cameras, not the full arsenal of broadcast cameras; the high cameras were most likely the only ones calibrated to display the Statcast graphics correctly.
The most constant Statcast enhancement to the broadcast was Shift Trax, a graphic showing the positioning of each defensive player. It’s similar to a feature baseball video games have had for at least a decade. From what I could tell, it didn’t update during game action to show the players’ movement.
From the same high-angle camera, MLBN was able to highlight the players on the field with graphics. In this case, it was just the identification of the batter and the pitcher.
Throughout the game, the broadcast would cut to a shot of a baserunner with a lead and denote how large the lead was.
The most interesting aspect of the lead display is that the distance is measured along the line between second and third base, not directly between the runner and second base. This was most likely done to show the lead the runner has taken toward third rather than away from second. It isn’t completely accurate, though: since the runner stands off the baseline, he actually has more than 76 feet (90 − 14) left to reach third. It might be ideal to show both the distance from the bag and the distance along the baseline, but that risks data overload, where we are given too much data to derive any meaning from the presentation in any appreciable amount of time. This is pre-pitch coverage, not a math problem.
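To make the geometry concrete, here’s a minimal Python sketch, using hypothetical field coordinates and a made-up runner position, contrasting the lead measured along the baseline with the straight-line distance from the bag:

```python
import math

# Hypothetical coordinates in feet: second base at the origin,
# third base 90 ft away along the x-axis.
second = (0.0, 0.0)
third = (90.0, 0.0)

# Hypothetical runner position: 14 ft toward third along the baseline,
# but standing 4 ft behind the line.
runner = (14.0, 4.0)

# Lead as the broadcast displayed it: the projection onto the
# second-to-third line.
lead_along_baseline = runner[0] - second[0]  # 14.0 ft

# Straight-line distance from the bag, the alternative measurement.
lead_from_bag = math.dist(runner, second)

# Remaining distance if the runner cuts straight to third; slightly
# more than 90 - 14 = 76 ft because he is off the line.
distance_to_third = math.dist(runner, third)

print(f"lead along baseline: {lead_along_baseline:.1f} ft")
print(f"lead from the bag:   {lead_from_bag:.1f} ft")
print(f"path to third:       {distance_to_third:.1f} ft")
```

With the runner four feet off the line, the two lead measurements differ by about half a foot, and the direct path to third comes out slightly longer than 76 feet.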
Speaking of data overload, we had a good discussion about this Danny Espinosa bunt play in our Statcast chat.
As with all Statcast replays that clocked a runner going to first, everyone said they wanted to see the batter’s time from contact to first and, ideally, a time for the defensive play as well. The player’s instantaneous and maximum speeds are a nice engineering feat, but those metrics don’t tell the story of the play as well as the time from contact to reaching first base.
While having different measurements would be ideal, the player-trail graphics are a great visualization of the movement on the play. The faster the player runs, the redder the trail becomes. You can see how Espinosa accelerated toward first, and when Kolten Wong hit his maximum speed before recording the out.
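As a rough illustration of how a speed-colored trail could work, here’s a sketch assuming a simple linear yellow-to-red ramp and a made-up 30 ft/s top speed; the broadcast’s actual color mapping isn’t published, so all of this is an assumption:

```python
def trail_color(speed_fps, max_speed_fps=30.0):
    """Map a runner's speed to an RGB color for one trail segment.

    Assumes a linear ramp from yellow (slow) to red (fast); the
    30 ft/s ceiling is a made-up reference, not a Statcast value.
    """
    t = max(0.0, min(1.0, speed_fps / max_speed_fps))  # normalize to [0, 1]
    # Fade the green channel out as speed rises: yellow -> red.
    return (255, round(255 * (1 - t)), 0)

# A runner accelerating toward first base (speeds in ft/s):
for speed in (10.0, 20.0, 30.0):
    print(speed, trail_color(speed))
```

Each trail segment would be drawn in the color returned for the speed measured over that slice of the run, which is why Espinosa’s trail turns progressively redder as he accelerates.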
The next two gifs show a fantastic diving catch by Jon Jay and Yunel Escobar’s walk-off home run. Once again, we have a wide, high-angle camera shot with detailed information about Jay’s route distance and running speed and Escobar’s batted-ball trajectory; this is really cool information. The only criticism I have here is that the green trajectory doesn’t match the ball on camera, and it appears to display a longer flight than actually happened. This is most likely a camera-calibration issue, or the trajectory is showing a calculated, projected path rather than a tracked one.
Throughout the broadcast there were a handful of Statcast-enhanced replays that paired play tracking with a selection of summary stats describing the play. For the most part these appeared to be a preselected set of data for different types of plays. I would conjecture that as this technology matures, metrics more pertinent to each particular play’s story will be displayed. For example, Jon Jay’s diving catch in the bottom of the ninth had a first-step time of 0.3 seconds; that’s an important piece of information for understanding how Jay was able to get to the ball and prevent runs from scoring.
Finally, I think the enhanced pitching metrics will be the first new stats out of the Statcast data to be widely adopted. With TrackMan radar readings instead of PITCHf/x’s high-speed cameras, you can measure the spin rate of a particular pitch, in RPM, which can tell us more about the effectiveness of different pitch types in a way that isn’t obvious from watching on TV. Likewise, the radar-based system can measure perceived velocity, calculated using the pitcher’s individual extension to ‘normalize’ the pitch relative to an as-yet-unpublished league-average extension. Pitchers with shorter extension will have a lower perceived velocity than pitchers with longer extension throwing at the same actual velocity. These may be upgrades over PITCHf/x rather than the revolution the fielding data could be, but they are more actionable pieces of information out of the gate.
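MLB hasn’t published the exact perceived-velocity formula, but one plausible construction equalizes batter reaction time: find the speed a pitch released at league-average extension would need to cover its (longer or shorter) flight distance in the same time. Here’s a sketch under that assumption, with a made-up 6.0-foot league-average extension:

```python
RUBBER_TO_PLATE = 60.5        # feet, rubber to home plate
LEAGUE_AVG_EXTENSION = 6.0    # feet; hypothetical league average

def perceived_velocity(actual_mph, extension_ft):
    """Speed a league-average-extension pitch would need to give the
    batter the same reaction time as this pitch (assumed formula)."""
    flight = RUBBER_TO_PLATE - extension_ft            # actual flight distance
    avg_flight = RUBBER_TO_PLATE - LEAGUE_AVG_EXTENSION
    return actual_mph * avg_flight / flight

# The same 93 mph radar reading at different extensions:
print(round(perceived_velocity(93.0, 7.0), 1))  # long extension: plays faster
print(round(perceived_velocity(93.0, 5.0), 1))  # short extension: plays slower
```

The direction of the effect matches the article: releasing a foot closer to the plate shortens the flight distance, so the same radar velocity “plays” about a mile per hour faster.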
While I spent a lot of this post discussing ways to improve Statcast, I really enjoyed the broadcast last night. I think this technology will greatly enhance MLB broadcasts, and the data it produces will provide teams and (hopefully) public analysts a treasure trove of metrics and new insights. During the chat, many people asked whether the data would be made public. Each game’s raw data requires a lot of storage space, so distribution logistics alone will probably prohibit MLB from releasing it to the public; add to that the fact that teams don’t want to lose any competitive advantage the data might provide. However, I think MLB will distribute aggregated data on its website or attach some of it to the current Gameday application, as it currently does with batted-ball data.