ARTHRITIS — Searching for THE TRUTH — Searching for THE CURE
Maladaptation was a concept whose time had arrived, but not without a great deal of complexity and confusion. Dr. Philpott observed these differences while another researcher, Dr. Roger Williams, was independently studying the biochemical uniqueness possessed by each of us. Williams explained his findings in his book BIOCHEMICAL INDIVIDUALITY.
Both Philpott and Williams saw that heredity played some part in deciding who would develop multiple food and chemical sensitivities, but there were no definite connections. Ultimately they concluded that these patients most likely had inherited a genetic ‘defect’ which led to their uncommon biochemistry. They hoped that ‘orthomolecular chemistry’ might hold the solution to this defective biochemistry.
Orthomolecular literally means finding the ‘proper molecule’. Philpott, Williams and their forward-thinking peers sought to find the right vitamin, amino acid, mineral, enzyme or other biochemical supplement. This would hopefully bridge the relative deficiencies created by the inborn defective biochemistry found in their patients.
Large dosages of a broad spectrum of these supplements were often prescribed in hopes of covering all the bases. This strategy seemed prudent since many patients would make significant improvements on these regimens. In fact some diseases, such as pellagra, were actually cured with megadoses of the ‘proper’ molecule. In this case vitamin B-3 (niacin) taken at a dose of sixty times what was considered nutritionally adequate for normal individuals was required to get results.
At the same time a number of patients found little or no advantage when supplementing their diets with large doses of multi-vitamins and other nutrients. In fact many did worse. Their doctors were perplexed since they assumed any excesses of these nutritional supplements would simply be excreted from the body. It was decided that the probable culprit was a sensitivity to the binders holding the tablets together or the food source used to manufacture the supplements rather than the nutrients themselves.
Allopathic medicine has long misrepresented and misused the idea of biochemical individuality, much like a politician flip-flopping on an important issue. When physicians like Dr. Philpott made a momentous discovery, these medical nay-sayers would claim there was no scientific proof. When proof beyond anecdotal evidence was provided, they would retreat from their initial statements by suggesting “it might work for some people”. They would continue by saying “but everyone is different” to diminish its importance.
They implied that the patients who could be helped by these new therapies were medical oddities, representing a minuscule fringe population. Therefore the new discovery would have little, if any, widespread use. If it were effective for the majority of patients, they and their peers would, of course, already know about it. This subtle implication is used to this day. These doctors would prefer to let their patients suffer rather than simply admit ignorance.
We’ve already discussed how allopathic medicine uses the double-blind, placebo-controlled crossover study as its gold standard for research. The same medical experts who use the concept of biochemical individuality to nullify the importance of medical discoveries quickly hop to the other side of the fence when they want to prove their own point. All the talk about biochemical individuality disappears. Suddenly humans are assumed to be monolithic creatures (like so many laboratory mice), all for the sake of proving the efficacy of a drug. They contend that if there are differences, they are minor and can be adjusted for by ‘matching’ the test subjects, approximating things like age and race.
It’s absurd to call any kind of research currently performed on human beings science, in the face of what we know about biochemical individuality. What makes it even more preposterous is the way this so-called research is conducted at ‘arm’s length’, giving absolutely no weight to the environmental situation of the individual. If researchers really want to find answers they are going to have to dig a bit deeper. In the process they will have to get their hands dirty by understanding the true identities and innate complexities of their test subjects.
George W. Watson, M.D. did just that. Dr. Watson was also a psychiatrist and was curious to see whether biochemical metabolic differences could be used to predict mental symptoms. There was a great deal of interest in human metabolic differences after doctors reviewed and tried to make sense of the results of what is now called the Minnesota Semi-Starvation Studies.
During World War II Quaker pacifist volunteers working with the U.S. Army Air Corps and the University of Minnesota tried to determine the effect of semi-starvation on human behavior. Specifically they were trying to recreate a dietary scenario similar to what allied pilots would face if they were downed behind enemy lines. These pilots would be forced to forage for food and would rapidly become calorically deprived.
Prior to the study there was the assumption that the subjects’ cognitive abilities as well as their overall physical demeanor would rapidly deteriorate. Much to the surprise of the researchers, the men split into two very different behavioral groups. The majority fared unusually well. In fact they reported feelings of calmness, serenity, peacefulness, sometimes euphoria, as well as heightened mental clarity.
By contrast the remainder of the subjects rapidly became mentally disoriented and showed symptoms that perfectly mimicked what psychiatrists would diagnose as neurosis or psychosis. Other symptoms included body aches and pains, gastric upset, chronic fatigue, insomnia, depression, panic attacks and paranoia.
After the study, substantial follow-up work was done by Dr. Seymour Kety. He found that mentally ill patients were receiving the same number of calories as their healthy counterparts. He tried to find out whether any metabolic characteristic, specifically how fast blood sugar was burned, could predict a behavior such as schizophrenia. When tested, some of the schizophrenics burned blood sugar very fast, while others burned it much, much more slowly. He then made a surprising and rather obvious error when analyzing the results. Kety lumped all the schizophrenics together and took their average burn rate. The lumped average of all the schizophrenics was very close to that of the healthy control individuals. Since the averages were the same he surmised that, statistically speaking, there were no metabolic differences between the two groups.
Kety did note that schizophrenic behavior was not singularly related to blood sugar metabolism being too fast, nor to its being too slow. He found it could be either. Kety hypothesized that the differences he observed were due to some random genetic defect tied to schizophrenia. What he failed to note was that schizophrenic behavior was singularly tied to an EXTREME metabolism (one that burned glucose either too fast or too slow). A bell curve would show the healthy individuals all clustered close to the center, while his schizophrenic patients occupied the areas at the extreme right and left of the curve.
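Kety’s averaging mistake is easy to see with made-up numbers. The sketch below uses purely illustrative ‘burn rates’, invented for this example and not drawn from his data, to show how two groups with identical averages can have completely different distributions:

```python
# Illustrative glucose "burn rates" (arbitrary units) -- invented numbers,
# not Kety's data. The healthy group clusters near the middle; the
# schizophrenic group occupies only the extremes.
healthy = [9, 10, 10, 11, 10, 9, 11, 10]
schizophrenic = [4, 3, 4, 17, 16, 3, 17, 16]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# The means are identical, which is all Kety compared...
print(mean(healthy), mean(schizophrenic))          # 10.0 10.0

# ...but the spread exposes the bimodality the average conceals.
print(variance(healthy), variance(schizophrenic))  # 0.5 42.5
```

Comparing only the means makes the two groups look identical; comparing the spread (or simply plotting a histogram) reveals the two extreme clusters that Watson later identified.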
Watson examined Kety’s work and ultimately corrected what Kety had overlooked. The bi-modality seen in the Minnesota Semi-Starvation Studies was statistically significant. In fact Dr. Watson found that subjects fell into three different groups in terms of the rate at which they burned (or metabolized) blood sugar. Like Kety, Dr. Watson found that his patients with mental symptoms were EXTREME in that they burned blood sugar very much faster or very much slower than the healthy subjects.
The real discovery came when Dr. Watson considered that the diet might be the problem rather than the patient. The problem could stem from the fact that only one diet was being used. If this diet was somehow inappropriate, or mismatched for the disparate and special metabolic needs of his ‘extreme’ patients, they would be placed under a tremendous physiological stress.
Through exhaustive and painstaking work he found if the diets were altered to match the metabolic needs of his extreme patients that their symptoms would resolve. He reiterated Kety’s findings that individuals could have identical symptoms but have radically different metabolisms.
As you recall, this is exactly what Dr. Hans Selye found in his research on animal physiology. Selye saw that symptoms were totally independent of the type of physiological stress placed on the system. Dr. Watson had just discovered the most prominent physiological stressor for humans — eating a diet that was mis-matched to their inherited metabolic needs. In so doing he proved that there was no one diet that could possibly be appropriate for everyone.
Researchers always seek to minimize the number of variables in their studies. Test mice were all fed from the same bag of feed, so researchers incorrectly assumed that the nature of the food supplied to human subjects was equally unimportant.
Maybe Dr. Kety did understand the statistical findings of his metabolic research. Perhaps he didn’t want to be the first to open this veritable Pandora’s Box for researchers. He must have realized the scorn that would no doubt have been forthcoming from his peers. The implications were enormous. All the previous research on human biochemistry would have to be re-evaluated, redone or discarded. Future research would have to take metabolic differences into account. A major, key variable had been ignored.
Every so often there is a bit of research that can never be repeated. This was the case with the work done by Dr. Weston A. Price. Like a comet that passes only once through our solar system, Price had a singular opportunity to observe what really transpired as our recent ancestors became ‘civilized’. Dr. Price would forgo the convenience of working in a clinical laboratory. Instead he spent 20 years traveling and living under the most primitive of conditions to make simple, firsthand observations.
Dr. Price was a practicing dentist during the early 1900s. He was perplexed by an obvious question that so many of his associates took for granted. Why did people living under primitive conditions have excellent teeth?
At the time missionaries and traders were pushing into the last few isolated parts of the world. These places had no dentists, and Price would find there was little need for them. Some of the groups he studied had an incidence of dental caries as low as 0.16%, or one-sixth of one percent. In one area 2,464 teeth were examined in people of all ages, and only 4 cavities were found. This sounds impossible considering that there was no fluoride in the water and no tartar-control toothpastes. What gave these people immunity from tooth decay?
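The arithmetic behind that caries figure checks out, as a quick sketch shows:

```python
# Price's field numbers: 4 decayed teeth found among 2,464 teeth examined
cavities = 4
teeth_examined = 2464

rate_percent = cavities / teeth_examined * 100
print(f"{rate_percent:.2f}%")   # 0.16% -- roughly one-sixth of one percent
```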
Even more compelling was the fact that time and time again Price found natives from the same racial stock, only miles away, who had rampant dental disease. He looked at different peoples living in a wide array of regions and climates. He analyzed Swiss isolated by mountainous alpine terrain as well as Gaelics living near the almost unnavigable seas of the Outer Hebrides.
He observed the Indians of the Americas, including those in the far north and the great northern plains. Further south he lived with Florida Seminoles as well as the Colombian and Peruvian Indians in the jungles and mountains of South America. Inhabitants of Fiji and Hawaii, the Maori of New Zealand and the Aborigines of Australia were also included in his work.
On average the rate of cavities jumped from nil to over 25% very soon after the natives began changing what they ate. Price documented a specific case where natives labored on an English plantation. All of the workers were essentially free of cavities except for one, the cook. He routinely sampled the foods he prepared while the other workers continued with the same unchanged diet they had eaten for hundreds of years.
Tuberculosis, which at the time was uncontrolled among civilized populations, began manifesting itself in the same native groups shortly after dietary changes were made. Tuberculosis is infectious in nature and can be transmitted by human contact. However there must also be a susceptible host. Isolated groups had never seen this malady until civilization arrived; indeed, natives referred to it as the ‘white plague’. Once again some natives would develop the disease while others seemed immune.
Physicians thought the primary cause of these outbreaks was the living conditions of the natives. They believed the smut from peat-fire smoke that hung over the thatch-topped homes contributed to disease in these Gaelic towns. The remedy of building new homes with more sanitary conditions had no impact. Natives who had changed their diets to new imported foods were not only developing cavities, they were now succumbing to tuberculosis.
Dr. Price noticed arthritis and other degenerative rheumatoid conditions developing in newly ‘civilized’ groups. In one instance he observed 10 native Indians in a reservation home lying side by side, completely bedridden with crippling arthritis. Only miles away were Indians who chose not to alter their lifestyle or diet. They considered the white man dirty, along with anything he came in contact with, even food. They did have regular contact with whites, but it was limited to trading, usually furs in exchange for hunting ammunition. Arthritis of any kind was unheard of in these groups.
Dr. Price supplemented his research on living subjects by examining the teeth and bones of deceased natives. Twenty thousand skeletons were examined.
It was a relatively simple task to distinguish the remains. Christian missionaries who converted natives demanded that corpses be buried in a prone or flat position. Prior to that, natives would bury their dead in a sitting or flexed position. Some used mounds above the ground while others buried the bodies on top of each other, only to exhume them after a couple of years. The bones were then stored in a sacred building so the burial ground could be reused. None of the bones that Dr. Price examined had visible signs of degenerative arthritis. He observed this phenomenon among white European as well as Indian groups.
Even more troubling was what Price observed next. Up until then he had only noticed the rather immediate effects of these dietary changes: cavity formation and the onset of new chronic and infectious diseases. Prior to this time he had been struck by the perfection of the structure of the natives’ oral cavities. The teeth were not only free of caries, but the arch they formed was perfectly spaced and aligned. Price called them “two rows of pearls”. The next generation would not be so fortunate.
The offspring of parents who had changed their diets were plagued by an onset of structural problems. The first and most obvious was that the rows of teeth were now crooked and crowded. Up until this time Price had never observed an impacted molar; now they were commonplace. Those crooked teeth also held more cavities. The rate went from nil to 25% in the first generation. The second generation saw decay in an average of 45% of their teeth. Other groups were more severely affected. In some tribes the rate jumped as high as 78%.
There were other structural changes. The new offspring showed more deformities: cleft palates, club feet, etc. A subtler structural change was the narrowing and deviation of the nasal cavities. These children could no longer get adequate air through the nose and became perpetual mouth breathers. Meanwhile the incidence of infectious disease and chronic rheumatoid conditions among the young increased. Children were now routinely afflicted with dental disease, tuberculosis and arthritis.
Even more subtle was the reproductive efficiency of these children as they reached puberty. Speaking simply, it became increasingly difficult for them to conceive and successfully carry the pregnancy to term.
While Dr. Price was making anecdotal observations of dietary changes on humans another researcher was conducting controlled experiments on animals. Between the years of 1932 and 1942, Dr. Francis Marion Pottenger, Jr., conducted a feeding experiment to determine the effects of subtle dietary changes on cats. His cat study was completely independent of Dr. Price’s work.
The research was prompted when Pottenger noticed that his laboratory cats seemed divided in their ability to survive adrenalectomies (surgery to remove the adrenal glands). At the time there were no chemical procedures available for standardizing biological extracts (in this case, adrenal hormones). Manufacturers used animals to determine their potency. Cats cannot survive without their adrenal glands so a properly calibrated dose of extract was needed daily to support their lives. This in turn would indicate the potency of that batch of hormonal extract.
If a cat didn’t survive the initial surgery it couldn’t be used later in standardizing the potency of hormone extracts. Dr. Pottenger sought to maximize the pre-operative health of his laboratory animals by feeding them a diet of market grade raw milk, cod liver oil and cooked meat scraps (gathered from a local sanitarium).
These scraps included the liver, tripe, sweetbreads, brains, heart and muscle. This diet was considered to be rich in all the important nutritive substances by the experts of the day, and the surgical technique used for the adrenalectomies was the most exacting known. Even with all these precautions many of the cats still did not survive surgery. He then began noticing that these cats showed a decrease in their reproductive capacity. Many of the kittens born in the laboratory exhibited skeletal deformities and organ malfunctions.
As his neighbors in Monrovia donated an increasing number of cats to his laboratory, the demand for cooked meat scraps exceeded supply from the sanitarium. Pottenger placed an order at the local meat packing plant for raw meat scraps, again including the viscera, muscle and bone. These raw meat scraps were fed to a segregated group of cats each day and within a few months they appeared in better health than the animals being fed cooked meat scraps. Their kittens were more vigorous, and most interestingly, their operative mortality decreased markedly.
The contrast in the apparent health of the cats fed raw meat and those fed cooked meat was so startling, it compelled Dr. Pottenger to undertake a controlled experiment. What he had observed by chance, he wanted to repeat by design. He wanted to find answers to such questions as: Why did the cats eating raw meat survive their operations more readily than those eating cooked meat? Why did the kittens of the raw meat fed cats appear more vigorous? Why did a diet based on cooked meat scraps apparently fail to provide the necessary nutritional elements for good health? He felt the findings of a controlled feeding experiment might yield new insights into optimal human nutrition.
Dr. Pottenger devised a series of experiments in which one group of cats was fed a diet of two-thirds raw meat, one-third milk and cod-liver oil. The second group was fed a diet of two-thirds cooked meat, one-third milk and cod-liver oil. Nine hundred cats were studied over a ten-year period.
The cats receiving raw meat reproduced normally, had few abortions, nursed their young well, had very good behavior patterns and exhibited a high resistance to infections and parasites. The cats receiving the cooked meat reproduced poorly. There was an abortion rate of 25% in the first generation, increasing to 70% in the second generation. Many of the pregnant cats died in labor and the mortality rate of the kittens was high. The cats that survived long enough to mature were irritable and difficult to handle.
Skin lesions and allergies were frequent and became progressively worse from generation to generation in the cats eating cooked meat. Structural changes became evident. The oral arches narrowed, and bone deformities (osteomyelitis), cardiac lesions, thyroid disease, nephritis, hepatitis, arthritis, and many other conditions familiar to human beings all became common. The kittens of the third generation were so degenerated that none of them survived the first six months of life, thereby terminating the strain.
Pottenger wanted to see if there was a way to reverse this trend. He returned the cats in the first and second generation cooked meat group to a raw meat diet. There was immediate improvement but it still took three to four generations before the offspring regained the good health of their counterparts nourished with raw meat throughout.
The implications of the work done by these early researchers are daunting. The indication is that simple dietary changes would result in an inevitable increase in all types of chronic conditions. In fact these ‘maladaptive’ problems would tend to increase with every generation.
More and more couples would experience problems simply conceiving. Adequate sperm and eggs might be available but normal fertilization still would not take place. Males might show inexplicable drops in sperm count. Weight gain, sleeping disorders, chronic fatigue, skin problems, and allergies of all types would be on the increase. There would also be an increase in osteo-arthritis and rheumatoid conditions as well as other so called auto-immune diseases. Unfortunately these are the same trends that we have witnessed in increasing number in humanity’s recent generations. The incidence and prevalence of these trends are accelerating as we approach the next millennium.
It’s obvious that cats are not the same as you and me. However, pharmaceutical companies know that they are similar to us in so many respects that they are a preferred animal for testing new drugs.
Chemists know that heating meat causes the protein component to change into a slightly different chemical structure. This chemical change is termed ‘de-naturing’. A common example of protein de-naturing occurs every time we scramble or fry an egg. Apparently this subtle change has a huge impact on the food’s ability to remain a perfect food for the maintenance of health in cats.
The revelation from the previous chapter is not that we should start eating raw meat to cure arthritis. The work of Price and Pottenger simply and eloquently ‘separated the forest from the trees’ — in this case by isolating downstream symptoms from the initiating cause.
For the first time we have a lucid view of the origin of all arthritis — maladaptation to our environment. Maladaptation still might seem like a fuzzy, overly general concept but I cannot think of one more accurate in describing what is really going on. When our surroundings change faster than we can, maladaptation results. A process is set in motion leading to arthritis of all varieties as a consequence of this seminal event.
Nature has designed our bodies to have the capacity to accommodate as well as adapt. Accommodations occur over the near term while adapting is a long term process. Accommodating implies that the body has made a modification but is carrying a definite burden and paying a physiological price for it the entire time it is subjected to the new change.
Dr. Selye found that the body could in effect ‘pump itself up’ to remain healthy during periods when forced to make these accommodations. However our physiology cannot maintain this over the long term. Like being forced to carry around a heavy weight, sooner or later it must exact a degenerative toll. Minor changes might be borne for months, even years before the body’s ability to accommodate would peter out. Major changes would hasten a physiological collapse. Both ultimately lead to chronic degenerative disease states and in many cases, arthritis.
Most misconceptions about the adaptive process stem from considering a time frame inappropriately short for true adaptation to take place. Seemingly increased tolerance to things such as tobacco, caffeine, alcohol, fast food, pollution, etc., should not be considered as tapping into our long-term adaptive capabilities. Nothing can ‘evolve’ during one lifetime!
In this case all that is being experienced is a short term physiological accommodation. It’s also wishful thinking to believe that humans can successfully adapt to new substances after only a few generations. If this were the case Dr. Price would have observed a reversal in the incidence of cavities with successive generations. He noticed the opposite phenomenon.
One way to determine whether you have successfully adapted rather than accommodated to a change is by withdrawing the change for at least 5 days. You can do this by avoiding or insulating yourself from the substance in question. After this period reintroduce it. If you are ‘maladapted’ to the substance you will experience an ‘alarm’ reaction signaled by a return or exacerbation of symptoms. This is exactly how we test for food allergies. It can however be done to test for any substance or change in conditions.
Now we are armed with a very good test for maladaptation — at least for those that are already displaying symptoms. If you already have arthritis you are displaying symptoms! The problem that most arthritics face is that they might be taking so many symptom suppressing drugs that it dampens the alarm reaction to the point where it is difficult to identify.
Most would have to look back several years, to their childhood, to recall a totally symptom-free state. Even then you might recall things like headaches, frequent infections or allergies to common airborne allergens. All of these were really signs of maladaptation that managed to ‘peek through’ when your physiology was carrying an excessive burden of onerous things. So many people experience symptoms like these in their daily lives that they aren’t considered out of the ordinary, even by our doctors.
Dr. Price witnessed the opposite firsthand. There was little need for dentists, allergists or doctors of any kind because the isolated people he observed didn’t have any prominent symptoms. Like us they were occasionally exposed to the insult of infectious agents or periods of extreme heat or cold, but their bodies were able to compensate. By being naturally well adapted to their foods and their external environment they had access to something that we don’t: a greater capacity to accommodate. They possessed a huge reserve of it, and in turn maintained good health.
Confusion also stems from looking too far back in time. Human adaptation to mild to moderate changes (like diet) probably required a minimum of a few thousand years. Some 40,000 to 50,000 years ago mankind was in its infancy and much more homogeneous when compared to the physiologically diverse counterparts that prevailed (and were exceptionally well adapted to every part of their environment) some 2,000 years ago. If we try to evaluate the circumstances to which mankind was best suited 45,000 years ago we will get a false read. The important adaptive changes achieved by our more recent ancestors will be shortchanged.
We shouldn’t put too much weight on how most ‘civilized’ cultures lived and ate during the past 2,000 years either. The rapid nature of change during this period did not leave adequate time for proper adaptation. Our focus should lie just before this time. Here reside the clues that are most precious in determining our physiological identity today. If we find them we will possess the ability to make changes that will move us from a maladapted state back to a healthy one.
When Dr. Price made his observations, humans had already spread to almost every inhabitable area of the world. They had become accustomed to vastly different climates, terrains, and of course, food sources. In his book NUTRITION AND PHYSICAL DEGENERATION Price delivers an extraordinary look into the enormous diversity of these diets. What is striking is how several groups relied almost exclusively on animal sources for their nutrition while others existed primarily on vegetation. These dietary scenarios fly in the face of what is generally accepted as good nutritional practice today.
The large populations of people living in colder climates had short growing seasons. Many of these areas were almost totally devoid of vegetation. An excellent example is the some 20,000 Gaelics who lived in the Outer Hebrides (today part of Scotland). The violent seas that surrounded them kept these people almost completely isolated until the early 1900s.
The soil in this region was almost devoid of lime, to the point where little would grow. The landscape was bare and rugged. There were no trees; the soil simply wouldn’t support their growth. The inland area was primarily composed of peat bogs, which were used to thatch roofs and fuel fires. There were no dairy products of any kind as the land was inadequate for grazing cattle. Herds of sheep were able to survive and became a primary staple. There was also an abundance of crabs, oysters, clams, lobster and cod harvested from the sea. Vegetables were very limited even during the summer months. Very small amounts of oats and barley could be cultivated. There were no fruits of any kind.
This dietary predicament might raise some immediate questions for nutritionists. First of all how did they get enough calcium? Wouldn’t their bones turn to mush without milk products? More importantly, where did they obtain vitamin C?
The average person isn’t aware that humans are one of the very few animals that cannot manufacture their own vitamin C. This is why it is so crucial that we obtain this nutrient from what we eat. You might have heard the term ‘limeys’ as a reference to British sailors. They were given this name because they carried a supply of limes on board during long voyages to avoid diseases like scurvy, which is caused by severe vitamin C deficiency. There was little vegetation, and of course citrus fruits were unavailable to the isolated Hebrides Gaelics. How did they obtain enough vitamin C to avoid scurvy?
Happily, the animals they consumed were capable of producing large amounts of vitamin C. This vitamin was stored primarily in organ tissues such as the liver and adrenals. Steaks and other muscle meats are most valued in today’s culture. The opposite was true for the primitive groups that depended on animals as their major source of nutrition.
After being killed the organs (kidneys, liver, brain, adrenals) were most accessible and consumed first. In fact the muscle meat that was difficult to butcher would often be left to the dogs. Fish eggs and their organ meats were also prized. A favorite delicacy of the Hebrides Gaelics was baked cod head stuffed with liver and a small amount of oats.
One commonality of all the isolated groups Price observed was that they didn’t waste food. In this case all edible parts of the animal were consumed. Sheep bone marrow (dense in calcium and other minerals) was routinely used as a thickener in their rich stews.
Meanwhile Price observed isolated cultures living primarily off fruits and other vegetation. Not surprisingly these people were located in more temperate climates with extended growing seasons. Corn, beans, squash, sweet potatoes, yucca, poi (taro root), coconut, paw paws, pumpkin, bananas as well as other tropical fruits were staples. Fish and animals completed their diets but were consumed much less frequently and in smaller amounts.
As you might imagine, cultures utilizing animals as their main source of sustenance also consumed huge amounts of fat. Medical research today is obsessed with fat, and it should be. For most people a high-fat diet means obesity, high blood pressure and cardiovascular disease. The paradox is why others can consume even more fat and have none of these maladies.
Huge research dollars have already been spent unsuccessfully trying to understand why Eskimos living under arctic conditions can consume large amounts of fat-laden whale blubber without succumbing to vascular disease and obesity. We have a pretty good idea of the health disaster that might await if the average person switched over to an Eskimo diet. What would happen if an Eskimo were suddenly switched over to the low-fat, low-protein, high-complex-carbohydrate diet currently being touted as the health ‘standard’ for all of us? Even worse, what would happen if that same Eskimo began eating in a vegetarian manner? Would the consequences of this dietary change be equally deleterious? Would they present themselves differently in the way they affected overall health?