Is Big Blue Calling a Better Strike Zone?
Everyone knows by now that run-scoring has taken a big hit in recent years. Clubs crossed home plate an average of 4.65 times per game in 2008, but that declined to 4.61 in 2009, 4.38 in 2010 and just 4.28 this past year, marking the lowest level of offense seen since 1992. Many theories have been put out there as to why teams are scoring less -- a wave of top-flight young pitching and the post-steroid era among them -- but a big reason might be that Big Blue is being less generous to batters on pitches taken in the strike zone.
In 2008, umpires called a strike 74.5% of the time that a hitter took a pitch thrown within the strike zone. By 2011, that called strike rate on pitches taken in the zone was more than six percent higher than in '08:
[Table: Called strike rate on pitches taken in the zone, by year, 2008-2011]
Those extra strikes have a massive effect over the course of a full season. The difference between 2008's called strike rate on in-zone pitches taken and 2011's rate amounted to an extra 5,732 called strikes on hitters. The difference between a called ball and a called strike is worth about 0.15 runs. Multiply that by 5,732 and you get a decrease of nearly 860 runs compared to 2008's rate. That's 0.35 runs per game, or nearly the entire difference between 2008 and 2011 run scoring.
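The back-of-the-envelope math above can be sketched in a few lines, using the article's own figures (5,732 extra strikes, 0.15 runs per ball-to-strike swap, and a 2,430-game MLB season):

```python
# Back-of-the-envelope check of the run impact described above.
# All input values come from the article's figures.
extra_strikes = 5_732    # extra called strikes vs. 2008's in-zone rate
run_value = 0.15         # run difference between a called ball and a called strike
games = 30 * 162 // 2    # 2,430 MLB games in a full season

run_decrease = extra_strikes * run_value
per_game = run_decrease / games

print(f"{run_decrease:.0f} runs")    # -> 860 runs
print(f"{per_game:.2f} runs/game")   # -> 0.35 runs/game
# Compare: league scoring fell from 4.65 (2008) to 4.28 (2011) runs per game.
```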
So, where are those extra called strikes on pitches taken in the zone? Most of them are at the knees. The called strike rate on in-zone pitches taken in the lower portion of the zone rose from 53.2% in 2008 to 62.9% in 2011, an 18% relative increase.
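To keep the 18% figure straight (it is a relative change, not a percentage-point change), here is a quick check using the two rates quoted above:

```python
# Relative vs. absolute change in the low-zone called strike rate,
# using the 2008 and 2011 rates quoted in the article.
rate_2008 = 0.532
rate_2011 = 0.629

abs_change = rate_2011 - rate_2008                  # in percentage points
rel_change = (rate_2011 - rate_2008) / rate_2008    # relative change

print(f"{abs_change * 100:.1f} points")   # -> 9.7 points
print(f"{rel_change * 100:.0f}%")         # -> 18%
```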
Any number of things could be responsible for run scoring falling to levels last seen when flannel and torn jeans were all the rage and brick cell phones were chic, but a more accurate Big Blue has also played an important role. Hitters beware: pitchers are getting their due more often on should-be strikes.