October 14, 2014: Geographical Depression

I'm fighting a bit of depression right now, almost certainly due to the fact that I still live here and I still hate it. Much of my day revolves around fantasizing about getting out.

49 thoughts on “October 14, 2014: Geographical Depression”

    1. I didn't watch Twin Peaks when it was broadcast; it was probably on after my bedtime.
      In college, I rented the pilot (I think, it definitely wasn't Fire Walk With Me) and didn't see the big deal. Was the Laura Palmer mystery solved in the first two hours or so? Did things develop from there?

      1. I haven't made it past the first episode either, but I think the entire series was supposed to be about her? Then when it was cancelled in season 2 they wrapped it up quickly.

        1. The only show I can recall that went nowhere faster than Twin Peaks was probably John from Cincinnati. I liked them both.

      2. LOST has drawn the most comparisons to Twin Peaks -- the pilot set up the initial plot lines and mysteries, and then the show ran with them. And like LOST, it had the zen-like quality of being more about the ride than the destination. The show was very quirky (Lynch, duh) but suffered when he wasn't directing and when the network tried to fiddle with things (and, like LOST, fans were clamoring for answers). I thought it was a bit soap-opera-y at times, and not all the characters grew on me, but it was fun while it lasted. If you do watch the series, Miguel Ferrer is delicious, and there's a young David Duchovny in drag. Even Lynch as an "actor" was a gas.

    1. Glad to see this analysis, since it was something I picked up on this year, and it really bothered me.

      1. Interesting piece, but I'd much rather see heat maps and changes in the odds of a pitch in a grid being called a strike, rather than the maps just showing the grids where 50%+1 of the calls were strikes.

        The Jeff Sullivan piece at fangraphs (linked in the HBT piece) is particularly compelling. See the Zone of Interest graph, which shows a pretty linear increase in the percent of pitches in his low zone being called strikes over the 2009-2014 period. The graph also shows the trends for the "five best catchers" and "five worst catchers" to try to capture framing effects. It shows that the five worst catchers still have lower called-strike rates in that zone than at any time in the period for the five best catchers, but that the bottom five are now getting strikes called in the low zone at rates equal to or better than the average back in 2008-09.

        1. also, the academic piece cited looks to be really interesting.

          This paper examines the role of changes in monitoring, technological innovation, performance standards, and collective bargaining as they relate to performance improvements among Major League Baseball umpires from 1988 through 2013. I find structural changes in performance concurrent with known bargaining struggles, and substantial improvements in performance after implementation of incentive pay and new technological monitoring and training. Not only do umpires improve performance in expected ways, but the variability in umpire performance has also decreased substantially. These changes have reduced offensive output often attributed to a crackdown on performance enhancing drug use in MLB.

          Emphasis added.

    2. Does the game really need more offense?

      I'm happy the 1994-2007 run-scoring environment of (a ridiculous average of 4.84 runs/game) has been supplanted by something more resembling the run-scoring environment that existed from 1969-1992. The author's invocation of "the Pitch f/x era" is telling - he might as well just say "the era in which technology has finally enabled MLB to evaluate umpires' conformity with the rule book strike zone." (And yes, MLB likely played a not-insignificant role in creating the conditions of the 1994-2007 run-scoring environment. I'm glad those conditions have been corrected.)

      1. Yes, but only because, although the runs/game match an earlier era, the means to that environment are different.

        1. Building off that: the average runs per game in 1969-1992 was 4.21, with a median of 4.24. 2013 averaged 4.17, so pretty close; 2014 averaged 4.07, which would tie 1969 and be above only four seasons of your 24-year span. My issue is that I don't think it's going to end here.

            1. Hm, never used the variance functions in Excel. Assuming VAR.P (or VAR.S; very similar result) is what I want, 1969-1992 had a variance of .043 and 1994-2007 had a variance of .024.

              1. so you have a spreadsheet where the records are individual games? kewl.

                Also, interesting.

                So, assuming the distribution were normal (which it's not quite), mean ± 1.96 sd gives about 3.80-4.62 for 1969-92 and about 4.52-5.15 for 1994-2007. Strictly speaking, that's the band that would contain ~95% of individual seasons, not a confidence interval for the mean (and of course, you have population data, so in reality the means are exact). And interesting that the lower-scoring era is the noisier period.
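                The arithmetic above is easy to reproduce; here's a minimal sketch using only the means and variances quoted in this thread (the z = 1.96 multiplier and the two-decimal rounding are my assumptions):

```python
import math

def two_sigma_interval(mean, variance, z=1.96):
    """Band of roughly mean +/- 1.96 sd: under a normal assumption it
    would contain ~95% of individual seasonal runs/game values."""
    sd = math.sqrt(variance)
    return (mean - z * sd, mean + z * sd)

# Figures quoted upthread: 1969-92 mean 4.21, variance .043;
# 1994-2007 mean 4.84, variance .024.
lo, hi = two_sigma_interval(4.21, 0.043)
print(f"1969-92:   {lo:.2f}-{hi:.2f}")   # 3.80-4.62
lo, hi = two_sigma_interval(4.84, 0.024)
print(f"1994-2007: {lo:.2f}-{hi:.2f}")   # 4.54-5.14
```

                (With z = 1.96 the 1994-2007 band comes out 4.54-5.14, a touch tighter than the 4.52-5.15 quoted; z = 2 gets it to 4.53-5.15.)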

          1. I don't think we have enough evidence to know whether it will end here. We can operate on the assumption that run-scoring will continue trending downward, pushed in part by the reestablishment of the rule book strike zone. However, since umpires are most certainly receiving feedback from MLB on the zone, we shouldn't foreclose the possibility that that guidance will be adjusted year to year.

      2. For me it has significantly less to do with the amount of offense and much, much more to do with my enjoyment of watching the game. It isn't fun to see batters called out on pitches that are far too low, far too often. It isn't fun to watch batters take pitches up in what used to be the zone when they're being selective. It isn't fun to feel like the ump's zone is outcome-determinative, or much closer to that than it ever was before.

        I also wonder a lot about how the low strike zone interacts with defensive shifts. It seems a low zone increases ground balls and limits what a batter can do with a ball to the outfield. Joe Mauer is a prime example here, I think. At least in the first half of the season I saw him get a low pitch called for strike three a bunch of times, and saw that low, outside corner expanded to the point where it made him a less effective hitter (indeed, I'm kind of shocked that there weren't more low outside strikes for lefties... I'd be surprised if that isn't where it grows next).

        I suppose, in that type of situation, yes, I prefer more offense. When it's guys who know the zone and can handle a bat well who are primarily affected, I'm bothered. The expanded strike zone changes the skill we're witnessing: it doesn't mean the pitchers are more skilled, it requires less of them, and it removes the skill advantage of good hitters.

        Mashing is down too, likely as a result of fewer steroids, and I'm good with that. But batting averages have been dropping too, and that bothers me. A league with a .260 average feels a lot more balanced than one with a .250 average.

        1. Oh, those were from the ump's perspective. See, the low and away is called too much on Mauer!

          1. What I want is for the umpires to call balls and strikes the way the rule book defines them. I can't tell, from that article, whether umpires are calling pitches below the rule-book strike zone strikes, or if they're simply calling strikes on pitches that the rule book defines as strikes, but that they weren't calling strikes before.

            If they're not calling balls and strikes according to the rule book, that's a problem. If they are, then it's only a problem if you think pitches in the bottom of the strike zone should not be strikes, in which case the solution is to change the rule.

            1. I want umpires to be internally consistent and cross-sectionally consistent, i.e., for umpires to be interchangeable and to call balls and strikes the same from inning to inning and game to game. If all the umps agree on the same definition of balls and strikes, the rulebook doesn't matter very much. Yes, it would be best to have a "fair" definition (not too tight, not too loose). But in the absence of extremes in interpretation, consistency is the more important issue. Players will adapt.

              1. The players will adapt, but the rule book matters. Would we be happy if they called balls hit a foot outside the foul line fair, as long as they were consistent about it? Would we be happy if they called runners who beat the throw to first by a step out, as long as they were consistent about it? To me, those things make as much sense as saying the umpires can call balls and strikes however they want, as long as they're consistent about it. Consistently wrong is still wrong. There are reasons we have rules. The umpires aren't supposed to make up their own.

            2. in which case the solution is to change the rule.

              Might be, if those are strikes by the rule. And, of course, they've changed the rule several times. The last time, they even lowered it, from the top of the knee to the bottom. But based on my observation, I'd say things below that level are being called strikes.

      3. Yeah, I'm not sure about the box in the graphs (is it an estimated average strike zone by the rules?), but the new zone seems to conform better to it.

        1. The box in the images is shown purely for a frame of reference.

          The frame is 24 inches across, so it's wider than the strike zone. However, I suppose you could argue the strike zone is another three inches beyond the width of home plate itself, so then I think the box might be the rule book width and the top and bottom averaged across all plate appearances.

    3. I wouldn't mind a lower run-scoring environment if it actually meant starting pitchers were going deeper into games, and thus fewer pitching changes. But it doesn't. Over the last 11 years, the average innings pitched per start in the AL has remained almost exactly constant at 5.9. It was 5.9 in 2014, and it was 5.9 in 2004, when the average runs allowed per game was 4.99. From 2004-14, the best IP/GS was 6.1. Managers just don't trust their starters to get out of jams past the sixth inning, no matter how well they've pitched. I think the biggest difference over the last decade is so many teams going to power arms in the bullpens and relying on them in tie or close games in the last three innings. If you're a reliever and you don't strike out at least one batter per inning, you're below average.

      1. There were 65 "qualified" relievers in the Majors this year (of 142) who averaged 9.0 K/9 or better, 15 who averaged 8.5-9.0, and another 10 who averaged 8.0-8.5 K/9. The MLB average was 8.46 K/9 for relievers.

        So socal is exaggerating, but not by a whole lot.

        1. I knew it was close. Basically, when I see a reliever's numbers and he doesn't have as many Ks as IPs, I figure he's got below-average stuff. If it's close, maybe average. It's just a quick and easy way to figure that stuff.
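          The rule of thumb is really just a unit conversion: a strikeout per inning is a 9.0 K/9, so "as many Ks as IPs" puts a reliever a bit above the 8.46 league average quoted above. A throwaway sketch (the stat lines below are made up for illustration, not real players):

```python
def k_per_9(strikeouts, innings):
    """Strikeouts per nine innings pitched."""
    return 9.0 * strikeouts / innings

MLB_RELIEVER_AVG_2014 = 8.46  # league figure quoted in the thread

# Hypothetical reliever lines: (name, K, IP).
for name, k, ip in [("A", 70, 65.0), ("B", 55, 60.0)]:
    rate = k_per_9(k, ip)
    verdict = "clears 9.0 K/9" if k >= ip else "below 9.0 K/9"
    print(f"{name}: {rate:.2f} K/9 ({verdict})")
```

          One wrinkle if you try this on real stat lines: published IP totals count thirds of an inning as .1 and .2, so 65.1 IP means 65 + 1/3 innings and needs converting before you divide.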

  1. Out of nowhere, the Milkmaid was promoted today. A manager put in his two weeks just a few days ago, and she has already been offered the job. His departure accomplished what fifteen hours of interviews couldn't. I know this is the job they wanted for her, but we never knew it would be available this early.

      1. Right.
        Details, man. You start us out with this...
        "I'm fighting a bit of depression right now, almost certainly due to the fact that I still live here and I still hate it. Much of my day revolves around fantasizing about getting out."
