Minor league pitchers, average on balls in play
When I ran the numbers on a few thousand minor leaguers to calculate MLE factors a few years ago, I found that pitchers do indeed allow a higher batting average on balls in play as they move up a level.
The effect is small for AAA, just .003, partly because half the AAA pitchers come from the Pacific Coast League, and that's a great hitter's league. I looked at moving from A to A+, A+ to AA, and so on, and chained the results. I got:
In other words, a pitcher in low A who allows a .300 BABIP would give up a .349 in the big leagues.
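The chaining arithmetic can be sketched as follows. The per-level factors below are illustrative placeholders, not the actual values from the study; they are chosen only so that their product roughly reproduces the .300-to-.349 low-A-to-MLB example above.

```python
# Sketch of chaining per-level BABIP factors into one overall MLE factor.
# The individual factors are made-up illustrative values, chosen so the
# product roughly matches the .300 -> .349 low-A-to-MLB example.
level_factors = {
    "A to A+": 1.0385,
    "A+ to AA": 1.0385,
    "AA to AAA": 1.0385,
    "AAA to MLB": 1.0385,
}

overall = 1.0
for jump, factor in level_factors.items():
    overall *= factor

low_a_babip = 0.300
mle_babip = low_a_babip * overall
print(round(mle_babip, 3))  # roughly .349
```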
This is all 100%, completely WRONG.
My mistake was twofold:
1) I did not regress enough to the mean. I was regressing 50% at 500 balls in play, when it should be more like 1700-1800.
2) I did not look both ways for selective sampling. My theory was that if a pitcher was pitching well in AAA, he was probably lucky in his BABIP, so you regress that to the mean, while his performance at the next level is an unbiased sample. The problem is that players get sent in both directions: a guy who is hit-unlucky will get demoted, so you need to correct both the upper-level and the lower-level numbers by regressing to the mean.
My guess is that once I make these two corrections, there will be little difference in expected BABIP between a minor league and a major league pitcher.
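Correction (1) is the standard shrinkage formula: weight the observed BABIP by the number of balls in play, and the league mean by the regression constant (the sample size at which you regress exactly 50%). A minimal sketch with illustrative numbers:

```python
def regress_to_mean(observed, n, league_avg, k):
    """Shrink an observed rate toward the league average.

    k is the regression constant: the sample size at which the
    estimate is shrunk exactly halfway to the league mean.
    """
    return (observed * n + league_avg * k) / (n + k)

# A pitcher with a .330 observed BABIP on 500 balls in play,
# league average .300:
too_little = regress_to_mean(0.330, 500, 0.300, k=500)   # regresses 50%
corrected = regress_to_mean(0.330, 500, 0.300, k=1750)   # regresses ~78%
print(round(too_little, 3), round(corrected, 3))  # 0.315 0.307
```

With the larger regression constant, almost all of a 500-ball-in-play BABIP excess is treated as luck rather than skill.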
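Correction (2), the two-way selection effect, can be illustrated with a small simulation: pitchers whose observed BABIP looks bad get demoted, but their true talent is on average better than what they showed, so the demoted group's raw numbers overstate how hit-prone they really are. This is a sketch under assumed talent and sample-size numbers, not the study's actual data:

```python
import random

random.seed(42)

LEAGUE_MEAN = 0.300
TALENT_SD = 0.010   # assumed spread of true BABIP talent
N_BIP = 500         # balls in play observed per pitcher

# Simulate pitchers: draw a true talent, then an observed BABIP.
pitchers = []
for _ in range(4000):
    talent = random.gauss(LEAGUE_MEAN, TALENT_SD)
    hits = sum(random.random() < talent for _ in range(N_BIP))
    pitchers.append((talent, hits / N_BIP))

# Demote anyone whose *observed* BABIP looks bad.
demoted = [(t, o) for t, o in pitchers if o > 0.320]

mean_true = sum(t for t, _ in demoted) / len(demoted)
mean_obs = sum(o for _, o in demoted) / len(demoted)

# The demoted group's true talent sits much closer to the league mean
# than their observed BABIP does: most of the excess was bad luck.
print(round(mean_true, 3), round(mean_obs, 3))
```

The same logic applies in the other direction to promotions, which is why both the upper-level and lower-level samples need the regression correction.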
What got me looking at this was this article by David Gassko from last year. If backup catchers and utility infielders can get hitters to hit at about a league-average BABIP, then why should I expect any professional pitcher to do any worse?