It’s early, but this year has already taken twists we’ve never quite seen before. Here are three trends to watch.
It is far too early to say the Red Sox are really in trouble, or Dan Vogelbach is really a star. It is not, on the other hand, too early to say we’re really going to see yet another league record for strikeouts, and home runs are probably really returning to the juiced-ball levels of 2017.
In between, though, are a number of ways baseball has been broadly different this year than it ever has been before. Sometimes these early-season shifts turn out to be flukes, or quirks of the calendar; sometimes, they bear out as real, substantive changes in how the game is played. We’re roughly 10 percent of the way into the 2019 season, and we’ll look at three of those early leaguewide anomalies to see how they fit into the historical trendlines; how April specifically affects them; and whether the best explanation is permanent change or something flukier.
1. Triples have disappeared
There have been 0.14 triples hit per team per game this season, which would be the lowest in history.
Where it fits in historical trends: The long-term trend — since the 1920s — for triples has been down, but the dips have been slow and irregular. In the 1920s, the average team tripled about once every two games; in the 1930s and 1940s, it dropped to once every three; in the 1950s through the early 1980s, it was about once every four; and then over the next three decades, it was once every five. Over the past two decades, triples have occasionally dropped further still, and the three lowest triple rates in history came in 2013 (0.16 per game), 2017 (0.16 again) and 2018 (0.17). If this season holds at 0.14 triples per game, it would be the lowest rate in history, about 150 triples fewer leaguewide than were hit last year.
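That “about 150” figure is a simple back-of-the-envelope projection from the per-game rates. A quick sketch, assuming 30 teams and a 162-game schedule (both standard, but assumed here rather than stated in the piece):

```python
# Back-of-the-envelope check on the "about 150 fewer triples" figure.
# Rates are triples per team per game, from the text above.
TEAMS = 30
GAMES_PER_TEAM = 162   # assumed full-season schedule

rate_2018 = 0.17       # last year's rate, the previous low
rate_2019 = 0.14       # this year's pace

team_games = TEAMS * GAMES_PER_TEAM            # 4,860 team-games leaguewide
missing = (rate_2018 - rate_2019) * team_games
print(round(missing))                          # prints 146, i.e. "about 150"
```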
But is it just April? It doesn’t appear to be. April has actually been the most typical month of the season for triples. And early-April triple rates have historically matched late-April rates. So while one might speculate that runners would be more cautious on cold/wet days, that doesn’t seem to affect anything broadly. Early April is a good indication for the rest of the season.
But is this really just about home runs or strikeouts? This is important, because almost every story these days is really just about home runs or strikeouts. Triples might be down because players are hitting fewer triples. But they might be down because players are hitting fewer balls, generally, meaning this could be just one of the many outcomes lost to extra third strikes.
So instead of looking at triples per game, we can look at triples per ball in play, and we can look at triples per double. As noted, triples have fluctuated a bit — and declined a bit — since the turn of the century, but that has been almost entirely caused by strikeouts. The rate of triples per ball in play has been incredibly steady, at about 0.7 percent. This could be the first year ever to drop below 0.6 percent.
Another way to look at it is examining the ratio of triples to doubles, which are the closest cousin to triples. There have been eight triples for every 100 doubles hit this year, which would also be an all-time low.
So triples are convincingly down this year, and that can be explained only partly by the two big changes in home run and strikeout rates. Indeed, the decline runs counter to another recent trend, of more playing time going to young position players. (Triples peak in a player’s early 20s.)
So it’s real? If you can think of a way players get to third base and stop there — stealing third, or sacrificing a teammate from second to third, or going from first to third on a single, or hitting a danged triple — then it has probably become less frequent in the past few years. Rob Mains has recently written about this at Baseball Prospectus, with two hypotheses: Sabermetrics has shown that the risk of advancing from second to third on a contested play (or in exchange for an out) isn’t worth it, by run-expectancy models; and fear of injuries has made runners more cautious, more station-to-station.
There’s another good reason, too: Strikeouts and home runs! In a high-strikeout environment, teams are less likely to get the runner home from third base on a sacrifice fly or groundout. And in a high-homer era, there’s less disadvantage to stopping at second, since a home run scores all baserunners equally. So, while the drop in triples isn’t explicitly about strikeouts and home runs, it sort of is.
With all that said, we’re talking about maybe only 15 missing triples so far this season, compared with last year’s rates. Triples are going to continue to be down relative to when you were in elementary school, but it’s too early to say they’ll continue to be down relative to last year.
2. Pitchers are wilder than ever
There have been 3.48 walks per game, 0.43 batters hit by pitches per game, and 0.39 wild pitches per game. The latter two would be the highest rates since 1900; the walks would be the highest since 2000.
Perhaps most significantly, all this wildness has led to (and/or resulted from) longer plate appearances, which have led to longer games: Despite pace-of-game reforms, the average nine-inning game in 2019 is on track to match the slowest in history, at 3 hours, 5 minutes — five minutes longer than last year.
Where it fits in historical trends:
- Hit batsmen had been fairly steady for about a century before ticking up in the mid-1990s and spiking in the mid-2000s. But that spike retreated until last year, when the league set a modern high at 0.4 HBP per game.
- Wild pitches were fairly steady until a jump in the 1960s, and were then fairly steady for decades after that, but have inched continuously up this decade.
- Walks peaked at the height of the steroid era but dropped a lot this decade (about 3 per game from 2011 to 2016), until the juiced ball came along. The 2017 and 2018 rates were high (3.26 and 3.23, respectively), but this year is much higher still.
But April? Yes, April is the busiest month for wildness: HBPs are generally 3 percent higher than the rest of the season, wild pitches 4 percent higher, and walks 6 percent higher. The first two weeks of the season are wilder still, and these per-game rates have already quieted down some since last week.
If we were to adjust each of these wildness measures down by 6 percent, then we’d still be on track for the new highs in HBPs (since 1900) and walks (since 2000), but the jumps would be less extreme and the chances of regression to normal levels would be more compelling.
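To make that adjustment concrete, here is a rough sketch using the stat-specific April premiums from the previous paragraph (a slightly finer cut than a flat 6 percent) and the benchmarks quoted in this section, last year’s 0.40 HBP record and 2017’s 3.26 walks. This is an illustrative estimate, not a projection:

```python
# Deflate each 2019 per-game rate by its historical April premium,
# then compare against the benchmarks quoted in the text.
april_premium = {"hbp": 1.03, "wild_pitch": 1.04, "walks": 1.06}
rate_2019 = {"hbp": 0.43, "wild_pitch": 0.39, "walks": 3.48}

adjusted = {stat: rate_2019[stat] / april_premium[stat] for stat in rate_2019}

print(round(adjusted["hbp"], 3))    # 0.417, still above last year's 0.40 record
print(round(adjusted["walks"], 3))  # 3.283, still above 2017's 3.26
```

Even after shaving off the usual April inflation, in other words, the HBP and walk rates would remain historically high, just less dramatically so.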
But is it really just about home runs or strikeouts? Yeah. Even stipulating that there’s a good chance all these wildness stats will regress to something in line with 2018 and 2017, there are clear ways pitchers have changed their approach even more this year, to add strikeouts and to avoid home runs. Fastballs, as a share of all pitches, keep falling:

- 2010: 63.5 percent of all pitches
- 2018: 60.8 percent
- 2019: 58.7 percent

And breaking balls keep rising:

- 2010: 27.6 percent of all pitches
- 2018: 29.1 percent
- 2019: 30.1 percent
As Baseball Prospectus’ Matt Trueblood has been documenting, full counts are way up this year, with 15.4 percent of plate appearances now reaching a full count. Pitches per plate appearance, at 3.95, are way up from last year’s 3.90, which was itself a record. More pitches mean more opportunities for hit batsmen and wild pitches, of course. Beyond that, though, pitchers are showing clear intent here: avoiding contact and working cautiously to get to the deep counts where a strikeout pitcher’s advantage grows. This isn’t really that pitchers are wild, but that they’re choosing a wilder way of pitching.
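A jump from 3.90 to 3.95 pitches per plate appearance sounds small, so here is a rough sense of scale. The figure of roughly 76 plate appearances per game (both teams combined) is an assumed typical value, not a number from the piece:

```python
# How many extra pitches per game does the pitches-per-PA jump imply?
# PA_PER_GAME is an assumed typical figure (both teams), not from the text.
PA_PER_GAME = 76

extra_per_pa = 3.95 - 3.90                    # this year vs. last year's record
extra_per_game = extra_per_pa * PA_PER_GAME

print(round(extra_per_game, 1))               # roughly 3.8 extra pitches per game
```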
So it’s real? Basically. Although, as with triples, it’s easier to say that baseball has definitely changed in the past few years than it has definitely changed this month. I’m extremely confident we’ll see a new record for pitches per plate appearance this year. The rest will likely follow.
3. Relievers are bad
Relievers’ ERA (4.44) is, collectively, higher than starters’ ERA (4.33). Relievers have also allowed exactly the same OPS as starters.
Where it fits in historical trends: Through the 1950s, relievers were pretty much always worse than starters, and even into the mid-1970s it wasn’t uncommon for relievers to collectively allow more offense than starters did. But that was a different era, when relievers were really more like backups — the second string — than they were a part of everyday baseball strategy. Since the relief era really took off in the late 1980s, relievers have always pitched considerably better than starters, not because they’ve been better pitchers but because they’ve been quality pitchers used in shorter stints and, usually, with the platoon advantage in their favor. (In other words: Starters are better, but relievers have the easier job.)
If this holds, it would be the first time since 1988 that relievers have allowed a higher ERA than starters did, or as high an OPS. Indeed, from 1988 through 2017, there was only one season when relievers’ ERA wasn’t at least 5 percent lower than starters’.
But last year, relievers’ ERA was just 3 percent lower than starters’, and their OPS allowed was just 1 percent lower. This year, relievers’ ERA is 3 percent higher than starters’.
You’d have to go back to 1954 to find a year when starters were that much better than relievers.
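The percentage gaps in this section come from simple ERA ratios. For instance, using the collective ERAs quoted above (4.44 for relievers, 4.33 for starters):

```python
# Percent by which relievers' collective ERA exceeds starters', from the text.
relievers_era = 4.44
starters_era = 4.33

gap_pct = (relievers_era / starters_era - 1) * 100

print(round(gap_pct))   # prints 3: relievers about 3 percent worse
```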
But April? The gap between starters and relievers is smallest in April (and September), but smallest doesn’t mean small. Since 1988, relievers’ ERA has been about 6 percent better than starters’ in March and April. The gap grows — to about 8 percent — in May through August, but even early in the season it is a real aberration to have starters outpitching relievers.
That said, weird things do happen in short stretches, and a reversal like this for a month isn’t unprecedented. Starters outpitched relievers in April 2009, before things went back to normal the rest of the season; the same thing happened in 1997 and in 1994.
Home runs and strikeouts? Not exactly, but sort of. The crucial detail here, if this reversal turns out to be real and persistent, is that relievers are throwing more innings than they ever have before, which dilutes the collective reliever pool. There have been 4.43 pitchers per game this year, which would be a record; relievers have thrown 41 percent of all innings, which would also be a record. There were 285 pitchers who appeared in relief during teams’ first 12 games this season. There were 261 in the first dozen games of 2017 — and it’s safe to say those extra 24 are dragging down, not lifting up, the collective reliever stats. There were 232 relievers in the first 12 days of 2010. Dozens of pitchers have thrown relief innings this year who would probably have been in the minors most any other year.
Furthermore, we can’t really say relievers are worse except in relation to starters, and starters have been the statistical beneficiaries of modern pitcher usage. Fewer and fewer starters are asked to face batters a third time in the game, to throw a pitch while exhausted, or to pace themselves to get through eight or nine innings. Each season, starters are able to pitch more like relievers — at full tilt, basically. A big reason teams prefer to use their starters this way — five innings and then out — is they like their starters to pitch for strikeouts, and they don’t like having tired starters in the game giving up home runs.
So it’s real? It’s probably not all the way real. There hasn’t been a substantial increase in relief innings since last season, or in relievers used since last season, so it’s hard to explain why there’d be such a substantial change in results since last season. The best bet is that relievers will settle in at a level much like last year’s, when they allowed 3 percent fewer earned runs than starters. But that was a big change, an anomalous season matched only once in the previous 40 years. We wondered, at the end of the 2017 season, why five-inning starters weren’t closing the gap on relievers, statistically. Then they did, a lot. This April is providing strong evidence that last year, at least, was real.
Author: Sam Miller