Saturday, May 16, 2009

The Coronary Heart Disease Epidemic: Possible Culprits Part I

In the last post, I reviewed two studies suggesting that heart attacks were rare in the U.K. until the 1920s-1930s. In this post, I'll discuss some of the diet and lifestyle factors that preceded and were associated with the coronary heart disease epidemic in the U.K. and U.S. I've cherry-picked factors that I believe could have played a causal role. Many things changed during that time period, and I don't want to give the impression that I have "the answer". I'm simply presenting ideas for thought and discussion.

First on the list: sugar. Here's a graph of refined sugar consumption in the U.K. from 1815 to 1955, from the book The Saccharine Disease, by Dr. T. L. Cleave. Sugar consumption increased dramatically in the U.K. over this period, reaching near-modern levels by the turn of the century and continuing to increase after that, except during the wars:

Here's a graph of total sweetener consumption in the U.S. from 1909 to 2005 (source: USDA food supply database). Between 1909 and 1922, sweetener consumption increased by 40%:

If we assume a 10- to 20-year lag period, sugar is well placed to play a role in the CHD epidemic. Sugar is easy to pick on. Diets high in refined sugar tend to promote obesity due to overeating. Excess sugar causes a number of detrimental changes in animal models and human subjects that are partially dependent on the development of obesity, including fatty liver, the metabolic syndrome, and small, oxidized low-density lipoprotein (LDL) particles. Small and oxidized LDL associate strongly with cardiovascular disease risk and may be involved in causing it. These effects seem to be partly attributable to the fructose portion of sugar, which makes up 50% of table sugar (sucrose), about 50% of most naturally sweet foods, and 55% of the most common form of high-fructose corn syrup (HFCS). This may explain why starches, which break down into glucose (another type of sugar), don't have the same negative effects as table sugar and HFCS.
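
Since those fructose fractions do all the work in this argument, here's a back-of-the-envelope sketch of the arithmetic. The 100 g/day intake figure is hypothetical, purely for illustration:

```python
# Fructose arithmetic from the fractions quoted above.
# The 100 g/day intake figure is hypothetical, for illustration only.

FRUCTOSE_FRACTION = {
    "sucrose": 0.50,   # table sugar: half glucose, half fructose
    "hfcs_55": 0.55,   # most common form of high-fructose corn syrup
    "starch":  0.00,   # starch digests to glucose, not fructose
}

def fructose_grams(sweetener_grams, kind):
    """Grams of fructose contributed by a given sweetener intake."""
    return sweetener_grams * FRUCTOSE_FRACTION[kind]

for kind in FRUCTOSE_FRACTION:
    print(kind, round(fructose_grams(100, kind), 1))
# sucrose 50.0, hfcs_55 55.0, starch 0.0
```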

Hydrogenated fat is the next suspect. I don't have any graphs to present, because to my knowledge no one has systematically tracked hydrogenated fat consumption in the U.S. or U.K. However, it was first marketed in the U.S. by Procter & Gamble under the brand name Crisco in 1911. Crisco stood for "crystallized cottonseed oil"; it was made by taking an industrial waste oil (from cotton seeds) and chemically treating it with high temperature, a nickel catalyst and hydrogen gas (see this post for more information). Hydrogenated fats for human consumption hit markets in the U.K. around 1920. Here's what Dr. Robert Finlayson had to say about margarine in his paper "Ischaemic Heart Disease, Aortic Aneurysms, and Atherosclerosis in the City of London, 1868-1982":
...between 1909-13 and 1924-28, margarine consumption showed the highest percentage increase, whilst that of eggs only increased slightly and that of butter remained unchanged. Between 1928 and 1934, margarine consumption fell by one-third, while butter consumption increased by 57 percent: an increase that coincided with a fall of 48 percent in its price. Subsequently, margarine sales have burgeoned, and if one is correct in stating that the coronary heart disease epidemic started in the second decade of this century, then the concept of hydrogenated margarines as an important aetiological factor, so strongly advocated by Martin, may merit more consideration than hitherto.
Partially hydrogenated oils contain trans fat, which is truly new to the human diet, with the exception of small amounts found in ruminant fats, including butter. But for the most part, natural trans fats are not the same as industrial trans fats, and some of them, such as conjugated linoleic acid (CLA), may actually be beneficial. To my knowledge, no one has discovered health benefits of industrial trans fats. To the contrary, compared to butter, they shrink LDL particle size. They also inhibit enzymes the body uses to make a diverse class of signaling compounds known as eicosanoids. Trans fat consumption associates very strongly with the risk of heart attack in observational studies. This is ironic, because hydrogenated fats were originally marketed as a healthier alternative to animal fats. The Center for Science in the Public Interest shamed McDonald's into switching the beef tallow in their deep fryers for hydrogenated vegetable fats in the 1990s. In 2009, even the staunchest opponents of animal fats have to admit that they're healthier than hydrogenated fat.
The rise of cigarettes probably contributed massively to the CHD epidemic. They were introduced just after the turn of the century in the U.S. and U.K., and rapidly became fashionable (source):
If you look at the second to last graph from the previous post, you can see that there's a striking correspondence between cigarette consumption and CHD deaths in the U.K. In fact, if you moved the line representing cigarette consumption to the right by about 20 years, it would overlap almost perfectly with CHD deaths. The risk of heart attack is so strongly associated with smoking in observational studies that even I believe it probably represents a causal relationship. There's no doubt in my mind that smoking cigarettes contributes to the risk of heart attack and various other health problems.
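
The eyeball method (slide one curve about 20 years to the right and watch it land on the other) can be put on a quantitative footing. Here's a minimal sketch of how one might do it; both series below are made-up placeholders, not the actual U.K. data:

```python
# Try each candidate lag and keep the one that maximizes the Pearson
# correlation between cigarette consumption and CHD deaths.
# Placeholder series only: a logistic uptake curve and a 20-year echo.
import numpy as np

years = np.arange(1900, 1960)
cigarettes = 100 / (1 + np.exp(-(years - 1925) / 5.0))  # placeholder uptake
chd_deaths = np.roll(cigarettes, 20)                     # placeholder: 20-yr echo
chd_deaths[:20] = 0.0

def best_lag(x, y, max_lag=40):
    """Lag (in years) of y behind x with the highest Pearson correlation."""
    correlations = {
        lag: np.corrcoef(x[:-lag], y[lag:])[0, 1]
        for lag in range(1, max_lag + 1)
    }
    return max(correlations, key=correlations.get)

print(best_lag(cigarettes, chd_deaths))  # 20, by construction here
```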

Smoking is a powerful factor, but it doesn't explain everything. How is it that the Kitavans of Papua New Guinea, more than 3/4 of whom smoke cigarettes, have an undetectable incidence of heart attack and stroke? Why do the French and the Japanese, who smoke like chimneys (at least until recently), have the two lowest heart attack death rates of all the affluent nations? There's clearly another factor involved that trumps cigarette smoke. 

Tuesday, May 12, 2009

The Coronary Heart Disease Epidemic

Few people alive today are old enough to remember the beginning of the coronary heart disease (CHD) epidemic in the 1920s and 1930s, when physicians in the U.S. and U.K. began sounding alarm bells that an uncommon disease was rapidly becoming the leading cause of death. By the 1950s, their predictions had come true. A decade later, a new generation of physicians replaced their predecessors and began to doubt that heart attacks had ever been uncommon. Gradually, the idea that the disease was once uncommon faded from the public consciousness, and heart attacks were seen as an eternal plague of humankind, avoided only by dying of something else first.

According to U.S. National Vital Statistics records beginning in 1900, CHD was rarely given as the cause of death by physicians until after 1930. The following graph is from The Great Cholesterol Con, by Anthony Colpo.


The relevant line for CHD deaths begins in the lower left-hand part of the graph. Other types of heart disease, such as heart failure due to cardiomyopathy, were fairly common and well recognized at the time. These data are highly susceptible to bias, because they depend on the physician's perception of the cause of death and are not adjusted for the age distribution of the population. In other words, if a diagnosis of CHD wasn't "popular" in 1920, its prevalence could have been underestimated. The invention of new technologies such as the electrocardiogram also facilitated diagnosis. Changes in diagnostic criteria affected the data as well; you can see them as discontinuities in 1948, 1968 and 1979. For these reasons, the trend above isn't a serious challenge to the idea that CHD has always been a common cause of death in humans who reach a certain age.
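
For readers unfamiliar with age adjustment, here's a minimal sketch of direct age standardization, with hypothetical numbers; the point is simply that raw death counts from different decades can't be compared unless each age group's rate is weighted by a fixed standard population:

```python
# Direct age standardization: weight each age band's death rate
# (per 100,000) by a fixed standard population. All numbers below
# are hypothetical, chosen only to illustrate the calculation.

def age_standardized_rate(rates, standard_pop):
    """Deaths per 100,000, weighted by a fixed standard population."""
    total = sum(standard_pop)
    return sum(r * p for r, p in zip(rates, standard_pop)) / total

# Hypothetical CHD death rates per 100,000 for age bands 35-49, 50-64, 65+.
rates_1920 = [5, 40, 150]
rates_1950 = [20, 180, 700]
standard   = [50_000, 35_000, 15_000]  # fixed standard population

print(age_standardized_rate(rates_1920, standard))  # 39.0
print(age_standardized_rate(rates_1950, standard))  # 178.0
```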

This idea was weakened in 1951 with the publication of a paper in the Lancet medical journal titled "Recent History of Coronary Disease", by Dr. Jerry N. Morris. Dr. Morris sifted through the autopsy records of London Hospital and recorded the frequency of coronary thrombosis (blockage of a coronary artery by a blood clot) and myocardial infarction (MI; death of heart muscle due to loss of its oxygen supply) over the period 1907-1949. MI is the technical term for a heart attack, and it can be caused by coronary thrombosis. Europe has a long history of autopsy study, and London Hospital had a long-standing policy of routine autopsies, during which staff kept detailed records of the state of the heart and coronary arteries. Here's what he found:

The dashed line is the relevant one. This is a massive increase in the frequency of CHD death, one that cannot be explained by changes in average lifespan. Although average lifespan increased considerably over that period, most of the increase was due to reduced infant mortality. The graph only includes autopsies performed on people 35-70 years old, and life expectancy at age 35 changed by less than 10 years over the same period. The other possible source of bias is in the diagnosis: physicians may have been less likely to search for signs of MI when the diagnosis was not "popular". Morris addresses this in the paper:
The first possibility, of course, is that the increase is not real but merely reflects better post-mortem diagnosis. This is an unlikely explanation. There is abundant evidence throughout the forty years that the department was fully aware of the relation of infarction to thrombosis, of myocardial fibrosis to gradual occlusion, and of the topical pathology of ostial stenosis and infarction from embolism, as indeed were many pathologists last century... But what makes figures like these important is that, unlike other series of this kind, they are based on the routine examination at necropsy of the myocardium and of the coronary arteries over the whole period. Moreover Prof. H. M. Turnbull, director of the department, was making a special case of atheroma and arterial disease in general during 1907-1914 (Turnbull 1915). The possibility that cases were overlooked is therefore small, and the earlier material is as likely to be reliable as the later.
Dr. Morris's study was followed by a similar one published in 1985 in the journal Medical History, titled "Ischaemic Heart Disease, Aortic Aneurysms, and Atherosclerosis in the City of London, 1868-1982", conducted by Dr. Robert Finlayson. This study, in my opinion, is the coup de grace. Finlayson systematically scrutinized autopsy reports from St. Bartholomew's hospital, which had conducted routine and detailed cardiac autopsies since 1868, and applied modern diagnostic criteria to the records. He also compared the records from St. Bartholomew's to those from the city mortuary. Here's what he found:

The solid line is MI mortality. Striking, isn't it? The other lines are tobacco and cigarette consumption. These data are not age-adjusted, but if you look at the raw data tables provided in the paper, some of which are grouped by age, it's clear that average lifespan doesn't explain much of the change. Heart attacks are largely an occurrence of the last 80 years.

What caused the epidemic? Both Drs. Morris and Finlayson also collected data on the prevalence of atherosclerosis (plaques in the arteries) over the same time period. Dr. Morris concluded that the prevalence of severe atherosclerosis had decreased by about 50% (although mild atherosclerosis such as fatty streaks had increased), while Dr. Finlayson found that it had remained approximately the same:


He found the same trend in females. This casts doubt on the idea that coronary atherosclerosis is sufficient in and of itself to cause heart attacks, although modern studies have found a strong association between advanced atherosclerosis and the risk of heart attack on an individual level. Heart attacks are caused by several factors, one of which is atherosclerosis.  

What changes in diet and lifestyle were associated with the explosion of MI in the U.K. and U.S. after 1920? Dr. Finlayson has given us a hint in the graph above: cigarette consumption increased dramatically over the same time period, and closely paralleled MI mortality. Smoking cigarettes is very strongly associated with heart attacks in observational studies. Animal studies also support the theory. While I believe cigarettes are an important factor, I do not believe they are the only cause of the MI epidemic. Dr. Finlayson touched on a few other factors in the text of the paper, and of course I have my own two cents to add. I'll discuss that next time.

Thursday, May 7, 2009

Dihydro-Vitamin K1

Step right up ladies and gents; I have a new miracle vitamin for you. Totally unknown to our ignorant pre-industrial ancestors, it's called dihydro-vitamin K1. It's formed during the oil hydrogenation process, so the richest sources are hydrogenated fats like margarine, shortening and commercial deep fry oil. Some of its benefits may include:
Dihydro-vitamin K1 accounts for roughly 30% of the vitamin K intake of American children, and a substantial portion of adult intake as well. Over 99 percent of Americans have it in their diet. Research on dihydro-vitamin K1 is in its infancy at this point, so no one has a very solid idea of its effects on the body beyond some preliminary and disturbing suggestions from animal experiments and brief human trials.

This could be another mechanism by which industrially processed vegetable oils degrade health. It's also another example of why it's not a good idea to chemically alter food. We don't understand food, or our bodies, well enough to know the long-term consequences of foods that have been recently introduced to the human diet. I believe these foods should be avoided on principle.

Monday, May 4, 2009

Pastured Eggs

Eggs are an exceptionally nutritious food. That's not surprising, considering they contain everything necessary to build a chick! But all eggs are not created equal. Anyone who has seen the tall, orange yolk, viscous white, and tough shell of a true pastured egg knows they're profoundly different. So does anyone who's tasted one. This has been vigorously denied by the American Egg Board and the Egg Nutrition Council, which primarily represent conventional egg farmers and assert that eggs from giant smelly barns are nutritionally equal to their pastured counterparts.

In 2007, the magazine Mother Earth News decided to test that claim. They obtained pastured eggs from 14 farms around the U.S., tested them for a number of nutrients, and compared the results to the figures listed in the USDA Nutrient Database for conventional eggs. Here are the results per 100 grams for conventional eggs, the average of all the pastured eggs, and eggs from Skagit River Ranch, which sells at my farmer's market:

Nutrient (per 100 g)    Conventional    Pastured avg    Skagit River Ranch
Vitamin A               487 IU          792 IU          1013 IU
Vitamin D               34 IU           136-204 IU      not determined
Vitamin E               0.97 mg         3.73 mg         4.02 mg
Beta-carotene           10 mcg          79 mcg          100 mcg
Omega-3 fatty acids     0.22 g          0.66 g          0.74 g

Looks like the American Egg Board and the Egg Nutrition Council have some egg on their faces...
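
To put the table in plainer terms, here's a tiny Python sketch that computes the fold-differences (pastured average over the USDA conventional figure; vitamin D is left out because the pastured value was reported as a range):

```python
# Fold-differences computed from the table above: pastured average
# divided by the USDA figure for conventional eggs.

nutrients = {  # name: (conventional, pastured average), per 100 g
    "vitamin A (IU)":      (487, 792),
    "vitamin E (mg)":      (0.97, 3.73),
    "beta-carotene (mcg)": (10, 79),
    "omega-3 (g)":         (0.22, 0.66),
}

for name, (conventional, pastured) in nutrients.items():
    print(f"{name}: {pastured / conventional:.1f}x the conventional value")
# vitamin A: 1.6x, vitamin E: 3.8x, beta-carotene: 7.9x, omega-3: 3.0x
```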

Eggs also contain vitamin K2, with the amount varying substantially according to the hen's diet. Guess where the A, D, K2, beta-carotene and omega-3 fatty acids are? In the yolk of course. Throwing the yolk away turns this powerhouse into a bland, nutritionally unimpressive food.

It's important to note that "free range" supermarket eggs are nutritionally similar to conventional eggs. The reason pastured eggs are so nutritious is that the chickens get to supplement their diets with abundant fresh plants and insects. Having little doors on the side of a giant smelly barn just doesn't replicate that.

Saturday, May 2, 2009

Iodine

Iodine is an essential trace mineral. It's required for the synthesis of the thyroid hormones T4 and T3 (the active form). The amount of thyroid hormone in circulation, and the body's sensitivity to it, strongly influences metabolic rate. Iodine deficiency can lead to weight gain and low energy. In more severe cases, it can produce goiter, an enlargement of the thyroid gland.

Iodine deficiency is also the most common cause of preventable mental retardation worldwide. Iodine is required for the development of the nervous system, and also concentrates in a number of other tissues including the eyes, the salivary glands and the mammary glands.

There's a trend in the alternative health community toward using unrefined sea salt rather than refined iodized salt. Personally, I use unrefined sea salt on principle, although I'm not convinced refined iodized salt is a problem. But the switch removes the main source of iodine in most people's diets, creating the potential for deficiency in some areas. Most notably, the soil in the midwestern United States is poor in iodine, and deficiency was common there before the introduction of iodized salt.

The natural solution? Sea vegetables. They're rich in iodine, other trace minerals, and flavor. I like to add a 2-inch strip of kombu to my beans. Kombu is a type of kelp. It adds minerals, and is commonly thought to speed the cooking and improve the digestion of beans and grains.

Dulse is a type of sea vegetable that's traditionally North American. It has a salty, savory flavor and a delicate texture. It's great in soups or by itself as a snack.

And then there's wakame, which is delicious in miso soup. Iodine is volatile, so freshness matters; store sea vegetables in a sealed container. It may be possible to overdo iodine, so it's best to eat sea vegetables regularly but in moderation, like the Japanese.

Seafood, including fish and shellfish, is rich in iodine, especially if fish heads are used to make soup stock. Dairy is a decent source in areas that have sufficient iodine in the soil.

Cod liver oil is another good source of iodine, or at least it was before the advent of modern refining techniques. I don't know if refined cod liver oil contains iodine. I suspect that fermented cod liver oil is still a good source of iodine because it isn't refined.


 
