Sigh. I really wish I still had access to the engineering departments at OSU, but it's winter break and I'm certain I can't get in there to use the computers with the modeling software I'd need to get definitive results on what I was talking about.
I will be the first to admit that I don't even know that I'm correct, but here again is the reason I think that in the scenario I mentioned (where the speed is correct on a breaking putt on an intended line that is known to work), the margin for directional error is greater than that of a flat putt.
Firstly, here is how I picture the continuum of directional misses on the breaking putt:

(1) If you miss left of (below) the intended line, I think it's intuitively obvious that the margin for error is smaller than on a flat putt, because the putt will move away from the center of the hole. For an arbitrary example, if you miss low by 1% of face angle on a flat putt, it will catch the left edge of the hole and go in, but on the breaking putt you can only miss by 0.5%. I will grant that.

(2) If you miss higher on the line, this next range of misses catches the high side of the hole.

(3) The next section of the continuum starts at holing out on the high side and moves back through results to the center of the hole (picture a parabola going higher but ending shorter).

(4) If we hit the putt even higher offline, there is a range of misses that catch the low side of the hole, until we finally start to see outright misses.
I really think that, compared to a flat putt, the total sum of angles covered by "high misses" (2), (3), and (4) is greater than the margin that "low misses" (1) give up, resulting in an overall greater margin of error for direction.
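Since I can't get at the real software, here's the closest thing I could cobble together: a little Python sketch anyone can run. To be clear, this is a toy model, not a proper green simulation. It assumes a flat tilted plane, constant rolling friction, a constant sideways slope pull, and a crude "the ball drops if it rolls over the cup slowly enough" capture rule, and every number in it (friction, ball speed, slope pull, capture speed) is a made-up assumption I picked to look plausible. It sweeps start lines and prints the total width of the band that holes out, flat versus breaking:

```python
import math

G = 9.81                  # gravity, m/s^2
MU = 0.10                 # assumed rolling-resistance coefficient (quickish green)
HOLE_X, HOLE_Y = 0.0, 3.0 # cup 3 m straight up the y-axis from the ball
HOLE_R = 0.054            # regulation cup radius, m
CAPTURE_V = 1.3           # assumed max speed at which the cup still takes the ball

def holed(angle_deg, speed, side_acc, dt=0.002):
    """Simulate one putt; True if it drops.

    angle_deg : start line, 0 = straight at the cup, positive = toward the high side
    speed     : ball speed off the face, m/s
    side_acc  : constant down-slope pull toward the low (negative-x) side, m/s^2
    """
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.sin(a), speed * math.cos(a)
    for _ in range(int(30 / dt)):        # hard cap at ~30 s of rolling
        v = math.hypot(vx, vy)
        if v < 0.02:                     # ball has effectively stopped: no good
            return False
        # rolling friction opposes the motion; the slope pulls toward -x
        vx += (-MU * G * vx / v - side_acc) * dt
        vy += (-MU * G * vy / v) * dt
        x += vx * dt
        y += vy * dt
        if math.hypot(x - HOLE_X, y - HOLE_Y) < HOLE_R and v < CAPTURE_V:
            return True
    return False

def window_deg(side_acc, speed=2.6, lo=-5.0, hi=15.0, step=0.05):
    """Total width (in degrees) of the band of start lines that hole out."""
    n = int(round((hi - lo) / step))
    return step * sum(holed(lo + i * step, speed, side_acc) for i in range(n))

print("flat putt make window:     %.2f deg" % window_deg(0.0))
print("breaking putt make window: %.2f deg" % window_deg(0.4))
```

Whichever way the two numbers come out, at least it's something concrete to argue about instead of hand-waving.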
This is my idea, and I don't know if it's right or wrong, but please read on, because I'm starting to think it might not even matter whether the range of successful left/right misses is the same as a flat putt's. That comparison takes zero account of the fact that the golfer now knows he can lose it a little to the high side and still be okay (thus guarding against a low miss, as opposed to a straight putt, which has a smaller absolute margin for error in either direction).
I know twizen said "you gain margin of error high side but lose it all on the low side". I wish I had a way to prove this one way or the other.
Perhaps it really is the same as the earlier fading/drawing tee-shot example about hitting fairways. Maybe the total margin for error even IS the same, but in practice, consciously knowing to avoid one side allows for a greater number of successes. I understand that in a vacuum the left/right error should be an even distribution, but on breaking putts, where you know you can miss to the high side (whether you know it consciously or it's just unconscious intuition from years of practical experience), I would think the distribution of misses would be skewed toward the high side, resulting in more made putts.
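And that skew idea is actually testable with the same toy model. This sketch reuses holed() from the code above (same made-up numbers, all assumptions): it scatters the golfer's start line normally around a series of aim points across the make band and prints the holed percentage for each, so you can see whether biasing the aim toward the high side actually buys more made putts:

```python
import random
random.seed(1)

SIGMA = 1.5      # assumed std dev of the golfer's start-line error, degrees
SIDE_ACC = 0.4   # same assumed slope pull as the sketch above
SPEED = 2.6      # same assumed ball speed off the face

def make_rate(aim_deg, n=2000):
    """Fraction of putts holed when start lines scatter normally around aim_deg."""
    return sum(holed(random.gauss(aim_deg, SIGMA), SPEED, SIDE_ACC)
               for _ in range(n)) / n

# Locate the band of start lines that drop, then sweep the aim across it to see
# where a golfer with normally scattered misses holes the most putts.
made = [i * 0.05 for i in range(-100, 300) if holed(i * 0.05, SPEED, SIDE_ACC)]
lo, hi = made[0], made[-1]
for k in range(5):
    aim = lo + k * (hi - lo) / 4
    print("aim %5.2f deg -> %4.1f%% holed" % (aim, 100 * make_rate(aim)))
```

If the best aim sits well up the band rather than at its low edge, that's consistent with the intuition that favoring the high side pays.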
NXT, what do you think? Obviously there is a point where the golfer will see diminishing returns.
p.s. Ship, don't leave