The history of drinking during endurance exercise is an interesting one, and it serves as a wonderful lesson for two reasons. First, it demonstrates a concept I’ve discussed at length before, the hype cycle, in which a particular concept or method goes through a cycle of overemphasis, then underemphasis, before eventually settling into its rightful place. This cycle can be seen almost anywhere, but in training you’ve seen it with things like “core” training, mileage, and interval training.
Secondly, the history of hydration demonstrates that we tend to overemphasize what we can measure and to initially ascribe more meaning to it than it deserves. As you’ll soon see, there was nothing wrong with the scientific measurements taken throughout the study of hydration; the problem has been the interpretation of those measurements.
Going back to the early days of marathon running, it was thought that consuming fluids during long races like the marathon was unnecessary and even detrimental. Why? Because when runners were studied, it was found that at the end of the race the winners or top finishers had lost the most body weight. The logic went: the best runners lost the most water weight, therefore losing fluids was necessary to maximize performance, and hydration should be avoided. The top runners were the most dehydrated, so dehydration must be good! This line of thinking is used often, even to this day (i.e., the Kenyans do X, so X should be done…). It should be a cautionary tale against doing something just because the fastest guys do it.
So early in the history of hydration we have a policy of no drinking. What happens next?
With the rise of mass-participation running, increased awareness of illnesses associated with dehydration, and the ability to measure hydration status quickly and easily, we overreacted. The norm went from drinking nothing during exercise to trying to replace all of your fluid losses by drinking water or sports drinks. The common advice of weighing yourself before and after exercise to calculate hydration needs reached mantra status with coaches, nutritionists, trainers, and everyday exercisers.
According to a nice summary by Mündel (BJSM, 2011), one reason for this overreaction was the design of the studies, which measured the effect of drinking on tests at fixed intensities. These essentially measured how long you could go, not how fast you could go over a fixed distance, which is what we do in the real world.
As mentioned above, the other reason is that heat exhaustion and similar illnesses became more prevalent with the rise of mass participation. The thinking was simple: extreme dehydration caused some problems and helped contribute to heat exhaustion, therefore if we eliminated dehydration, heat exhaustion and similar illnesses would be eliminated too. The problem with this thinking is similar to the “no drinking” logic. Just because a lot of dehydration is bad doesn’t mean we need to eliminate all of it. It’s only bad once it reaches a dangerous point outside the norm. And it’s hard to reach that point unless you force yourself not to consume any fluids at all (which is what was occurring in the previous period); until then, you are fine.
You see this “all or none” thinking in a myriad of places. Some obvious examples from history: free radicals, carbohydrates, fat, lactate, etc. Just because a lot is bad doesn’t mean a little is.
Finding the Happy Medium
With this overreaction came a new problem, hyponatremia, caused essentially by overhydration. Thankfully, we seem to have corrected our earlier mistakes of way too much or way too little and found a nice balance.
Currently, we’ve reached a sort of happy medium. Research consistently demonstrates that losing water and dehydrating by a couple of percent is fine when running. In fact, losing 1-2% of body weight during a long performance may be the sweet spot in terms of maximizing performance. Not surprisingly, research by Marino et al. (British Journal of Sports Medicine, 2011) shows that the body goes through several neuromuscular adjustments to maintain core body temperature despite fluid losses. Noakes and others have consistently demonstrated that drinking to thirst does the job. You won’t replace all your fluid losses like the previous recommendations had you do; instead, you’ll drink just enough during exercise to keep you from reaching the critical level where dehydration affects performance. Of course, the problem is that people have been so inundated with recommendations on drinking water during exercise (i.e., those who carry a water fuel belt on a 30-minute run…) that many have forgotten how to drink to thirst and need to reawaken that ability.
Somewhat ironically, recent research by Tim Noakes (http://running.competitor.com/2010/12/news/new-study-finds-drinking-less-running-faster_19567) showed once again, just like the early studies, that top runners seemed to drink less and lose more body weight than slower runners. It’s almost like we’ve come full circle. The difference is that this time the human interpretation was different. Noakes didn’t say that because the top runners lost the most weight, dehydration should be desirable. Instead he concluded that drinking to thirst, or just enough, is what is needed. We seem to have reached that happy medium indeed. It just took us about a century.
Of course, the word still needs to be spread that a little dehydration is fine. Even in the scientific community there are still those who hold on to the idea that we should replace ALL fluid losses during exercise. Unfortunately, it will be years before the knowledge that we went too far in our recommendations is accepted and spreads to everyone.
What’s the takeaway lesson to be learned from the history of hydration?
-Don’t overreact. Recognize the emphasis/hype cycle.
-Watch out for human error in interpretation.
-Be careful with “all or none” thinking.