Or: Why I don't trust or care about that last digit.
Spoiler alert: About 3/4 of an inch per second.
So, as a science and statistics nerd, I'm obsessed with the uncertainty involved in sports measurements, and how that uncertainty affects the actual game.
For example, it has been shown that while batting average, ERA, and home runs are highly variable statistics that are poor predictors of future success, other statistics are significantly better at predicting it. And that's not a surprise, given the factors that control whether a ball travels slightly farther than the reach of 9 men in stirrups.
So, anyway, I wanted to compare 40 times against one another and consider at what point it could actually matter that one player is faster than another.
So, (using some questionable assumptions) I calculated the speed of a player running a 40 yard dash in the 4.2 to 4.5 second range, which works out to roughly 8.89 to 9.52 yards per second (26.7 to 28.6 feet per second)*.
*This is where I bitch about America using the Imperial System. Seriously, it's not the 19th century.
Now, assuming that speed, for each 0.01 seconds a player shaves off his 40 time, he gains approximately 3/4 of an inch for every second he runs.
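That 3/4 of an inch comes straight out of the dash arithmetic. Here's a quick sketch of the calculation (assuming constant speed over the whole 40, which real sprinters obviously don't have):

```python
def speed_in_per_s(dash_seconds, yards=40):
    """Average speed, in inches per second, implied by a 40 yard dash time."""
    return yards * 36 / dash_seconds

# How much extra ground per second does 0.01 s off a 4.50 buy you?
gap_per_second = speed_in_per_s(4.49) - speed_in_per_s(4.50)
print(f"{gap_per_second:.2f} inches per second")  # roughly 0.71
```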
Ignoring the fact that the reaction time of a 40 doesn't represent how you would normally react on a play, and the fact that a normal human's reaction time can vary by up to 0.2 seconds per attempt, let's imagine that 0.01 could actually be a reliable measurement.
The median play in the NFL lasts just a few seconds. Let's assume a play is active for 5 seconds. In those 5 seconds, a player running at full speed would be less than 4 inches ahead of a player that is 0.01 seconds slower.
But that is total BS as well. No player is going to run for 5 seconds in a straight line from one spot to another. Maybe it would be fairer to look at a CB running before a ball is thrown. The median release time of an NFL QB is Philip Rivers' 2.72 seconds. In that amount of time, a CB that is 0.01 seconds faster would be about 2 inches ahead. Just to give you an idea, the football is about 6.6 inches across. So, even at a 0.03 second difference, you're talking about less than the width of the ball in separation.
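The same arithmetic as a sketch (again assuming constant full speed for the whole window; the 2.72 s release time and the 6.6 inch ball width are the figures quoted above):

```python
def separation(dash_a, dash_b, run_seconds, yards=40):
    """Horizontal gap in inches opened up between two players, with 40 times
    dash_a and dash_b, both sprinting flat out for run_seconds."""
    inches = yards * 36
    return (inches / dash_a - inches / dash_b) * run_seconds

# A CB who is 0.01 s faster, over a 2.72 s release window
print(round(separation(4.49, 4.50, 2.72), 1))  # about 1.9 inches

# Even a 0.03 s difference stays under the ~6.6 inch width of the ball
print(round(separation(4.47, 4.50, 2.72), 1))  # about 5.8 inches
```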
For example, Darrelle Revis (5'11") ran a 4.38 second 40 and was drafted in the first round. Richard Sherman (6'3") ran a "slow" 4.54, which was one of the reasons he fell to the 5th round. That 0.16 second gap works out to about 12 inches per second of sprinting, or roughly 32 inches between Revis and Sherman over a 2.72 second sprint.
Now, we have to factor in that a ball does not come in at an angle where horizontal separation matters as much as vertical reach. At an approach angle of 30 degrees, horizontal separation is worth about 58% of its value in vertical height (tan 30° ≈ 0.58). So those 32 inches of horizontal separation become the equivalent of about 18 inches of height. Subtract the 4 inches of height Sherman has on Revis, and suddenly the players are effectively separated by about 14 inches, or roughly two football widths.
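Putting the Revis/Sherman comparison into code (constant-speed assumption again; the tan(30°) conversion and the 4 inch height difference come from the combine measurements above):

```python
import math

INCHES = 40 * 36  # 40 yards expressed in inches

def speed(dash_seconds):
    """Average speed in inches per second for a given 40 time."""
    return INCHES / dash_seconds

# Revis (4.38) vs. Sherman (4.54) over a 2.72 s sprint
horizontal_gap = (speed(4.38) - speed(4.54)) * 2.72

# At a 30 degree ball approach angle, horizontal separation converts to
# vertical-equivalent inches at tan(30 degrees), about 0.58
vertical_equiv = horizontal_gap * math.tan(math.radians(30))

# Sherman gives back his 4 inch height advantage
net_revis_edge = vertical_equiv - 4

print(round(horizontal_gap), round(vertical_equiv), round(net_revis_edge))
# prints 32 18 14
```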
So, does that 0.01 (or 0.16) seconds matter?
That's actually the wrong question, because the question should always be asked as "How does that 0.01 seconds compare to the other measurables: height, arm length, shuttle time, vertical jump, etc.?"
It's probably reasonable to value 0.01 seconds at somewhere around 2-6 inches for a player in the open field on a normal play.
So, you can kind of start to see where size can offset speed in defensive players, where a small difference in speed could make a difference on a play, and where it wouldn't matter in the slightest.
Just my initial thoughts. I'd love to hear your comments, insults, and questions.
Hope you enjoyed it!