Obesity-related cancers rising, threatening gains in U.S. cancer rates

The rates of 12 obesity-related cancers rose by 7 percent from 2005 to 2014, an increase that is threatening to reverse progress in reducing the rate of cancer in the United States, U.S. health officials said on Tuesday.

According to the U.S. Centers for Disease Control and Prevention, more than 630,000 people in the United States were diagnosed with a cancer linked with being overweight or obese in 2014.

Obesity-related cancers accounted for about 40 percent of all cancers diagnosed in the United States in 2014. Although the overall rate of new cancer diagnoses has fallen since the 1990s, rates of obesity-related cancers have been rising.

“Today’s report shows in some cancers we’re going in the wrong direction,” Dr. Anne Schuchat of the CDC said on a conference call with reporters.

According to the International Agency for Research on Cancer, 13 cancers are associated with overweight and obesity. They include meningioma, multiple myeloma, adenocarcinoma of the esophagus, and cancers of the thyroid, postmenopausal breast, gallbladder, stomach, liver, pancreas, kidney, ovaries, uterus and colon and rectum (colorectal).

In 2013-2014, about two out of three U.S. adults were considered overweight or obese. CDC researchers used the U.S. cancer statistics database to see how obesity was affecting cancer rates. Although rates rose in 12 of these cancers from 2005 to 2014, colorectal cancer rates fell by 23 percent, helped by increases in screening, which prevents new cases by finding growths before they turn into cancer.

Cancers not associated with overweight and obesity fell by 13 percent.

Not surprisingly, about half of Americans are not aware of this link. The findings suggest that U.S. healthcare providers need to make clear to patients the link between obesity and cancer, and encourage patients to achieve a healthy weight.

These trends are concerning. There are many good reasons to strive for a healthy weight, and now you can add cancer to the list. However, the science linking cancer to obesity is still evolving, and it is not yet clear whether losing weight will help individuals either prevent cancer or ameliorate it once a cancer has taken root.

What is clear is that obesity can raise an individual’s risk of cancer, and that risk is likely to be reduced by maintaining a healthy weight. But in any event, why take that risk?






A new report shows that what you eat really can be preventive

More than 130,000 men and women are told they have colon or rectal cancer every year, making it the third most commonly diagnosed cancer, according to the American Cancer Society.

But a new report from the American Institute for Cancer Research and the World Cancer Research Fund provides new evidence that the right eating and exercise plan can really help lower your risk of developing the disease.

In the report, researchers analyzed 99 studies with data on 29 million people.

“The findings are clear that diet and lifestyle play a major role,” says lead author Edward L. Giovannucci, M.D., Sc.D., professor of nutrition and epidemiology at the Harvard T.H. Chan School of Public Health. “Despite its prevalence, colorectal cancer is a highly preventable disease.”

Foods To Eat More Of

It’s long been suspected that eating more whole grains will reduce your risk of colon cancer, but this is the first time that it has been confirmed.

“Until recently, there had not been many studies that directly examined whole grain intake and subsequent colorectal cancer risk in large populations,” says Giovannucci. “But now we have enough research to say the link has strong evidence.”

In fact, eating about three servings of whole grains a day can lower colorectal cancer risk by 17 percent. (One serving is equal to 1 cup of ready-to-eat cereal, a slice of bread, or ½ cup of cooked rice or pasta.)

Why do whole grains help?

“Fiber is one of the keys to prevention of colon cancer,” says Michael A. Valente, M.D., a colorectal surgeon at the Cleveland Clinic, who was not involved in the AICR/WCRF report. “But we suspect that it’s really the thousands of nutrients, minerals, and other natural chemical compounds present in foods that are high in fiber—such as whole grains and fruits and vegetables—that are helping to prevent cancer, not just the fiber itself.”

Many of these compounds have what the report called “plausible anti-carcinogenic properties,” which is why, in addition to eating more whole grains, it’s smart to increase consumption of fiber-rich fruits and vegetables as well.

Foods To Cut Back On

The researchers found that eating a lot of red meat (such as beef and pork) and processed meat (such as bacon, cold cuts, and sausage) was potentially harmful.

Every 1.8 ounces a day of processed meat increased risk by as much as 16 percent, while eating more than about 17½ ounces of red meat a week was labeled a “probable cause” of colorectal cancer.

One theory as to why these meats increase colon cancer risk is that they have high levels of iron derived from blood, which has been shown to promote the growth of colorectal tumors.

The connection between alcohol and colorectal cancer was also “convincing,” according to the report, and was especially strong for those who drink more than 30 grams per day (the equivalent of about two glasses of wine, two cocktails, or two beers).

“If you do consume alcohol, keep your intake moderate,” recommends Giovannucci.

Other Steps You Can Take

Getting more whole grains and veggies, and less meat, may have another risk-reducing benefit: helping you maintain a healthy weight. According to the report, there is strong, convincing evidence that people who are overweight are more likely to develop colon cancer.

All types of physical activity—not just formal exercise—were protective, too, with the most active people having about 20 percent lower risk of colon (but not rectal) cancer than the least active.

The report did not cover screening for colon cancer, but it’s a preventive move that deserves mention, as detailed below. Colorectal cancer usually develops over 10 to 15 years without causing symptoms. Most cases start as noncancerous polyps in the lining of the large intestine or the rectum. Detecting and removing polyps prevents them from developing into cancer.

You should have a colonoscopy every five to 10 years starting at age 50.

And if you have a close relative who had colorectal cancer, you should be even more vigilant about changing your lifestyle and getting regular screenings.

“Having a first-degree relative (mother, father, sibling) with the disease increases your risk by nearly 100 percent compared to the average person,” says N. Jewel Samadder, M.D., a gastroenterologist at the Mayo Clinic and expert with the American Gastroenterological Association.

If that’s you, experts recommend that, in addition to improving your diet, weight, and activity level, you start getting colonoscopies at age 40.

Can a Daily Aspirin Add to the Prevention of Colon Cancer?

New guidelines suggest aspirin can prevent some forms of cancer, but taking one isn’t a good idea for everyone

A recent analysis by the U.S. Preventive Services Task Force suggests that aspirin might lower your risk of certain cancers, especially colon cancer if it’s taken long term. But you shouldn’t take aspirin for cancer prevention alone. That’s because the drug also poses risks—in particular the risk of dangerous bleeding in the stomach and brain—that may outweigh its possible protective effect against cancer.

But if you and your doctor decide that taking a daily, low-dose aspirin (81 mg, or a “baby aspirin”) is a good way to reduce your risk of heart disease, then think of a reduced risk of colon cancer as a bonus.

Considerable research going back decades shows that taking low-dose aspirin can help prevent heart attacks and ischemic strokes (the kind caused by blood clots) in people at high risk for cardiovascular disease. Now, researchers at the Preventive Services Task Force, an independent, volunteer panel of experts in prevention and evidence-based medicine, have looked back at those studies to see what effect aspirin might have had on the risk of cancer. Combined data from three large studies involving 47,464 people suggested that, compared with people who didn’t take aspirin, those who did reduced their risk of colon cancer by about 40 percent, but only 10 to 19 years after they started taking the drug.

Uncertain Benefits vs. Known Harms

Encouraging results make a compelling case for ongoing, high-quality research looking at various cancers, but current evidence doesn’t support taking aspirin solely to prevent colon cancer. The evidence to date has to be interpreted cautiously, because it comes largely from a small set of older trials on cardiovascular disease prevention that were not set up to study the effect of aspirin on cancer. As a result, one cannot issue a blanket recommendation for the use of aspirin specifically for prevention of any cancers.


Been putting off that colonoscopy? A new review evaluates the other screening options.

Colonoscopy has long been touted as the gold standard for colon cancer screening, recommended for all adults starting at age 50. With colon cancer expected to kill more than 49,000 Americans this year, getting a colonoscopy is currently the best way to reduce your risk. But many people avoid colonoscopy because it involves an unpleasant 12-hour prep that includes drinking copious amounts of laxative and making many trips to the bathroom, followed by the procedure itself, which is costly and typically requires anesthesia.

Instead, consumers may want to opt for one of the two at-home colon-cancer screening tests available by prescription.

A new review published in JAMA by David Lieberman, M.D., professor of medicine and chief of gastroenterology at Oregon Health and Science University in Portland, concludes that home tests may be a decent first-step screener—although patients still need a colonoscopy if the kits find a worrisome result. What’s more, the kits can miss polyps, including precancerous growths that a doctor can spot and remove at the time of the colonoscopy.

AT-HOME KITS

FIT Test (Fecal Immunochemical Test)

The second most commonly prescribed colon-cancer screening test in the U.S. after colonoscopies, FIT tests have been in use for about 10 years.

The FIT test requires sending a single small fecal sample to a lab, which is then tested for blood. It’s a test that should be repeated annually, unlike colonoscopy, which is typically required just once every 10 years.

A person may have a cancer that isn’t bleeding at the time of the test, but that same tumor may bleed and be detected when the person is retested the following year. Research shows that this type of test detects cancer with 79 percent accuracy. But about 5 percent of tests deliver “false positive” results, which means those patients have to go for follow-up colonoscopies.

Multitarget Stool DNA Test

This test goes by the brand name Cologuard. It requires shipping an entire bowel movement to the lab. In addition to testing for blood, Cologuard looks for DNA from cancer cells scraped from the intestinal wall by feces as it passes through.

Studies have shown that this type of test detects cancer with 92 percent accuracy. However, 14 percent of tests deliver false positive results—far higher than the FIT test.

Another concern with this test is its sheer newness. Because Cologuard only received Food and Drug Administration approval in 2014, there are no studies showing that people who choose this screening method avoid dying of colon cancer over the long term.

Studies also have not yet established the appropriate interval between testing, though the U.S. Preventive Services Task Force (USPSTF), an independent panel of health experts that advises the government, recommends repeating the test every one or three years.


If you have a personal or family history of colon cancer, then regular colonoscopies clearly are the best choice. For the rest of us, the best approach is less clear, but until more research data are available, colonoscopy (sadly) is still the safest route.

Sorry to disappoint!



A recent large study assessed the correlation between food intake and cardiovascular disease and deaths in regions including the Middle East, South America, Africa, and south Asia. It brought to light a link between increased fruit, vegetable, and legume consumption and a lower risk of cardiovascular and total mortality. Maximum benefits for total mortality were derived at three to four servings of any of these components per day (equivalent to 12 to 17 ounces per day).

This study evaluated 135,335 individuals aged 35 to 70 years without cardiovascular disease. Enrollees were selected from 613 communities in 18 low-income, middle-income, and high-income countries in seven geographical regions: North America and Europe, South America, the Middle East, south Asia, China, southeast Asia, and Africa. Diets were assessed with country-specific food frequency questionnaires at baseline. The data included demographic factors, socioeconomic status (education, income, and employment), lifestyle (smoking, physical activity, and alcohol intake), health history and medication use, and family history of cardiovascular disease.

The main outcomes were cardiovascular diseases of all types, cardiovascular mortality, non-cardiovascular mortality, and total mortality. They assessed the correlations between fruit, vegetable, and legume consumption with risk of cardiovascular disease events and mortality.

The study covered 10 years beginning in 2003 and concluded at the end of March 2017. Combined fruit, vegetable, and legume intake averaged 3.9 servings per day. During a median 7.4 years of follow-up, the following events were reported: 4784 major cardiovascular disease events, 1649 cardiovascular deaths, and 5796 total deaths.

Higher total fruit, vegetable, and legume intake displayed an inverse correlation with major cardiovascular diseases and total mortality in the models adjusted for age, sex, and random effects. The overall hazard ratio for total mortality was lowest at three to four servings per day compared with the groups consuming substantially less. Interestingly, there was no additional decrease in hazards with higher consumption.

Fruit intake was related to lower risk of cardiovascular, non-cardiovascular, and total mortality.

Legume intake was also inversely linked with non-cardiovascular death and total mortality.

For vegetables, raw vegetable intake strongly correlated with a lower risk of total mortality. In contrast, cooked vegetable intake exhibited a modest benefit against mortality.


This study adds further support for what we have been recommending for many years. In general, dietary patterns in the U.S. have been trending in this direction, but for many, obesity seems to be nullifying much of the potential benefit. I was a bit surprised by the suggestion that raw vegetables might be better than their cooked counterparts, but any firm conclusions must await further study.

Although the study did not detail the causes for the reductions in non-cardiovascular mortality, from what we already know, it is likely that various cancers were prevented as well.





Unnecessary medical care is common in the United States, and a fear of malpractice seems to be an important driver for ordering unneeded tests and treatments, a new survey found.  Other factors include patient demand and doctors’ desire to boost profits, the researchers said.

“Unnecessary medical care is a leading driver of the higher health insurance premiums affecting every American,” said study senior author Dr. Martin Makary, professor of surgery and health policy at Johns Hopkins University School of Medicine in Baltimore.

Unneeded medical care accounts for a large chunk of wasted health care resources and costs in the United States and leads to about $210 billion in extra spending each year, according to the National Academy of Medicine.

The researchers surveyed more than 2,000 U.S. doctors in a wide variety of specialties and found that most believed 15 to 30 percent of medical care is not needed, including 22 percent of prescription medications, 25 percent of medical tests, 11 percent of procedures and 21 percent of overall medical care.

Leading reasons cited by the doctors for overuse of medical resources were fear of malpractice (85 percent), patient pressure/request (59 percent), difficulty accessing prior medical records (38 percent), and profit (17 percent).

Specialists and doctors with at least 10 years of experience after residency were more likely to believe that doctors perform unnecessary procedures when they stand to profit, according to the study.

“Interestingly, but not surprisingly, physicians implicated their colleagues [more so than themselves] in providing wasteful care. This highlights the need to objectively measure and report wasteful practices on a provider or practice level so that individual providers can see where they might improve,” said study co-author Dr. Daniel Brotman, a professor of medicine at Hopkins.

The respondents said the best ways to reduce unneeded care include training medical residents on appropriateness criteria for care (55 percent), easy access to outside health records (52 percent), and more evidence-based practice guidelines (51.5 percent).

“Most doctors do the right thing and always try to, however, today ‘too much medical care’ has become an endemic problem in some areas of medicine. A new physician-led focus on appropriateness is a promising homegrown strategy to address the problem,” Makary said in a university news release.


These data suggest some appropriate responses for individual patients: First, when any test, such as an X-ray of the back, neck, or elsewhere, is recommended, one should question its need. Can the test be done at a later date, and how much will any given result affect the resulting treatment? Many suspected ailments will subside with time and render any testing unnecessary. Will the management be altered regardless of the test’s outcome? If not, maybe the test can be avoided altogether. Moreover, don’t insist on a given test or procedure if the physician believes it is unnecessary or can wait until later—even if you have insurance that will cover much of the cost. Before any prescription is filled, ask whether cheaper alternatives—especially generics—are just as effective. Even more important, perhaps a given prescription can be avoided altogether without any consequences to health.

These are some of the tips that can empower patients to participate in reducing the overall costs of healthcare nationwide.



Eating almonds on a regular basis may help boost levels of the good (HDL) cholesterol while simultaneously improving the way it removes cholesterol from the body. In a recent study, researchers compared the levels and function of high-density lipoprotein (HDL) cholesterol in people who ate almonds every day with levels in the same people when they ate a muffin instead. While participants were on the almond diet, their HDL levels and functionality improved. The study, published in the Journal of Nutrition, builds on previous research on the effects of almonds in cholesterol-lowering diets. The researchers wanted to see whether almonds could not only increase HDL levels but also improve the function of this component, which works by gathering cholesterol from tissues, such as the arteries, and helping to transport it out of the body.

HDL is very small when it is released into circulation, and it acts like a garbage bag that slowly gets bigger and more spherical as it gathers cholesterol from cells and tissues before depositing it in the liver to be broken down.

Depending on how much cholesterol it has collected, HDL is categorized into various subpopulations, which range from the very small to the larger, more mature forms. The researchers hoped that eating almonds would result in more of the larger particles, which would signal improved HDL function.

In a controlled-feeding study, 48 men and women with elevated LDL cholesterol participated in two six-week diet periods. In both, their diets were identical except for the daily snack. On the almond diet, participants received 43 grams — about a handful — of almonds a day. During the control period, they received a banana muffin instead. The researchers found that, compared with the control diet, the almond diet increased HDL particles to their largest size and most mature stage — by 19 percent. There were more of the larger particles in response to consuming the almonds, which would indicate that the smaller particles were doing what they’re supposed to do: going to tissues, pulling out cholesterol, getting bigger, and taking that cholesterol to the liver for removal from the body. An increase in this particular HDL subpopulation is meaningful, because these particles have been shown to decrease overall risk of cardiovascular disease.

If people incorporate almonds into their diet, they should expect multiple benefits, including ones that can improve heart health. Obviously, they’re not a cure-all, but when eaten in moderation — and especially when eaten instead of a food of lower nutritional value — they’re a great addition to an already healthy diet. Other nuts may provide similar benefits, but they have not been studied in this fashion. Nevertheless, they may provide other benefits as well, such as in cancer prevention, as we present below.

A recent study showed that nut and peanut butter consumption can reduce the risk of esophageal and gastric cancer. Previous studies had suggested that nut consumption is associated with decreased risk of colorectal, endometrial, lung, and pancreatic cancers. Polyphenols, fiber, vitamins, and minerals in nuts may confer this observed protective effect. Until now, no prospective study had evaluated the effect of nut consumption on esophageal and gastric cancers. The objective was to evaluate the associations between nut and peanut butter consumption and the risk of esophageal and gastric cancers and their different subtypes. The study used data from the NIH-AARP Diet and Health Study, which enrolled 566,407 persons who were 50–71 years old at baseline (1995–1996). The median follow-up time was 15.5 years. Intakes of nuts and peanut butter were assessed with a validated food-frequency questionnaire. Statistical models estimated risks for esophageal and gastric cancers. Compared with those who did not consume nuts or peanut butter (the lowest category of consumption), participants in the highest category of nut consumption had a lower risk of developing the most common type of stomach cancer. The same association was also seen for peanut butter consumption.

This information adds to what we already know about nuts in general. Almost all nuts are good sources of caloric energy, primarily from unsaturated fats (oils), which also makes them useful for lowering cholesterol. Moreover, the essential amino acids contained in nuts are vital for constructing protein, i.e., the building blocks of our muscles and other tissues. Although no single type of nut supplies, in itself, a complete source of these amino acids, consuming a variety of nuts will provide a complete complement of the various necessary (essential) components. Other nutritional elements provided by nuts include folic acid, vitamin E, potassium, magnesium, and calcium. Especially noteworthy is their uniformly low sodium content, a highly desirable feature (provided that no salt is added). They also contain polyphenols, bioactive constituents that seem to confer benefits to heart health beyond those of other dietary components.

During the past 20 years, mounting evidence has indicated that consuming nuts of all kinds (including peanuts and peanut butter) at least twice weekly provides substantial protection from cardiovascular disease and overall death rates as compared with consuming them only rarely or not at all. These desirable results, as with the almonds noted above, appear to come primarily from the rearranging of cholesterol components. And despite a substantial caloric content, nuts have less tendency to promote obesity, probably because of their prominent satiating effect. For unknown reasons, nuts also appear to help prevent diabetes, another contributor to cardiovascular disease. Research has also indicated that, if the “Mediterranean” diet, which is itself healthy, is supplemented by extra mixed nuts (one ounce daily) and extra-virgin olive oil (one quart total per week), substantial additional reductions in cardiovascular disease and stroke can be accomplished.

The bottom line? Forget the junk food and opt for any kind of nuts, whether with meals or as free-standing snacks!





How can you keep snacking from derailing your healthy eating program, not to mention weight control? Try these tips.

Don’t skip meals. Skipping meals may seem like a good way to cut calories, but in fact this just makes you so hungry later in the day that you’re vulnerable to devouring mega-portions of snack food in order to supply your body with easily digested sugars.

Keep junk food out of the house. There’s a lot of truth to the old joke about the “see-food diet” — you see food and you eat it. The opposite is also true. If you don’t have junk food lying around, the sight of it won’t tempt you, so don’t even bring it home. After all, you can’t eat what isn’t there. Or, if someone in your household tends to have chips or other unhealthful snacks, put them out of sight.

Snack mindfully. Have you ever watched a show on television with a bag of chips or pint of ice cream in hand, only to find that it was all gone before you knew it? This type of mindless eating can pack on a lot of unwanted calories. The solution is simple. Try not to snack while doing something else like surfing the Web, watching TV, or working at your desk. Instead, stop what you’re doing for a few minutes and pay attention to your snack. Savoring a piece of fine chocolate can be more satisfying than gobbling down a whole chocolate bar.

Remember, you can take it with you. Think ahead and carry a small bag of healthful snacks in your purse or the glove compartment of your car. If you have a healthy snack handy — preferably, one you really like — you won’t turn in desperation to the calorie-laden cookies at the coffee counter or the candy. My preference is a wide variety of nuts—peanuts, pistachios, etc. It’s best to consume them in their salt-free form. Popcorn is also a viable option, but again without salt or other high caloric additives. Additional ideas are provided in a previous post: http://www.mortontavel.com/2015/04/05/

Zero in on hunger. Before you snack, ask yourself, “Am I truly hungry?” Many of us mistake emotions, such as stress and fatigue, for hunger. If the answer is yes (your stomach feels hollow, your head is achy), make sure you’re not confusing hunger with thirst. Drink an 8-ounce glass of water, then wait 10 to 15 minutes. If you’re still hungry, have a healthful snack.

Know your cravings. If you want a snack, but you’re not hungry, attack cravings from a psychological level. Ask yourself how you’re feeling. Lonely? Bored? Stressed? Then, ask yourself the bigger question: Will food fix this problem? The answer is always no. Eating a cookie, for example, won’t address a problem at work that you’re worried about. Go for a walk around the block, do a few stretches, put on some music, or choose another simple activity that might distract you or boost your mood. Then if you still want the food, fine. Ask yourself what food you really want. Then eat only a small amount, and make it good. If you’re craving chocolate, for example, eat one small square and savor it. It’s important that you snack on what you’re craving rather than deny the craving. Eating around a craving may only cause you to eat more because the craving isn’t satisfied.

Hopefully, these tips might make life a bit more pleasant and free of that undesirable excess weight!


Health benefits of physical activity: A recent update explaining how much and what kind is most effective.

A recent review clearly explains the health benefits of physical activity and exercise; virtually everyone can benefit from becoming more physically active. Most international guidelines recommend a goal of 150 min/week of moderate-to-vigorous intensity physical activity. Many agencies have translated these recommendations to indicate that this volume of activity is the minimum required for health benefits. However, recent evidence has challenged this threshold-centered messaging as it may not be evidence-based and may create an unnecessary barrier to those who might benefit greatly from simply becoming more active. This review summarizes recent information that has examined the relationship between physical activity and health status.

The Findings

Assessment of a large body of data (based largely on epidemiological studies of large cohorts) has demonstrated a dose–response relationship between physical activity and premature mortality, as well as the primary and secondary prevention of several chronic medical conditions. The relationships between physical activity and health outcomes are generally curvilinear, such that marked health benefits are observed with relatively minor volumes of physical activity. These findings challenge current threshold-based messaging related to physical activity and health. They emphasize that clinically relevant health benefits can be accrued by simply becoming more physically active. This information is summarized nicely in the following video: http://links.lww.com/HCO/A42.

Conclusion: There’s hope for us all!



The American Heart Association (AHA) has issued a new “Presidential Advisory” on dietary fats and cardiovascular disease to “set the record straight” on the dangers of saturated fats.

The statement, issued June 15, continues to strongly recommend replacing saturated fats with poly- and monounsaturated vegetable oils to help prevent heart disease.

The statement also recommends that the shift from saturated to unsaturated fats should occur simultaneously in an overall healthful dietary pattern, such as DASH (Dietary Approaches to Stop Hypertension) (http://www.mortontavel.com/2013/05/02/) or the Mediterranean diets, and that “good carbohydrates,” such as whole grains and whole fruits, are other appropriate foods to substitute for saturated fats.

“We want to set the record straight on why well-conducted scientific research overwhelmingly supports limiting saturated fat in the diet,” stated lead author, Frank Sacks, MD, professor of cardiovascular disease prevention at the Harvard T.H. Chan School of Public Health, Boston, Massachusetts.

Outside nutritional experts were fully supportive of the advisory, describing it as “an outstanding paper,” “exacting in its review of the evidence,” and “full of common sense.”

The AHA leadership decided that they needed to put out a new advisory on diet — particularly fats — because various commentators on nutrition had suggested that saturated fat is innocuous; these claims were widely covered in the media but were not scientifically based.

Unfortunately, a great deal of attention has been paid to controversial new studies that are not scientifically rigorous. A growing trend of media articles focusing on small studies suggests that some saturated fats are “good for you.”  Some people suggest that eating butter and full-fat milk is beneficial. And coconut oil is a fad right now — but it is actually a saturated fat, which raises your LDL [low-density lipoprotein], so the AHA wanted to look at the issue again.

The AHA president issued an advisory identifying this as a key issue that needed attention. Conclusions were based on a careful scientific review, organized systematically and involving experts from a wide range of fields who looked closely at the literature. The recommendations were then thoroughly vetted through multiple levels of peer review and scientific advisory committees across the entire AHA. This statement focused on which fats we should be eating, and it concluded very strongly that we should all eat less saturated fat, replacing it with polyunsaturated and monounsaturated fats. This recommendation was supported by multiple scientific studies demonstrating that lowering the intake of saturated fat and replacing it with polyunsaturated vegetable oil reduces cardiovascular disease by approximately 30%. In addition, several studies found that coconut oil, which is predominantly saturated fat but widely touted as healthy, raised LDL cholesterol in the same way as the other saturated fats found in butter, meat, and palm oil.

Their message is that “polyunsaturated fats are the best fats to eat. They are found mainly in vegetable oils such as soybean oil, peanut oil, and corn oil. Monounsaturated fats, found in sunflower oil, olive oil, nuts, and avocado, are also okay — much better than saturated fats, and may be as healthy as polyunsaturated fats.”

The last few years have seen an increase in knowledge about the benefits of polyunsaturated fats. They are associated with a reduction in total mortality, with no compensatory increase in death from other causes; they are also associated with a reduction in insulin resistance, helpful in combating diabetes.

The AHA document is full of common sense. It is not controversial, but “saturated fat” is scientific code for animal fat. The statement is telling us to eat less meat without actually using those words, possibly to avoid alienating livestock farmers. There is now a weight of evidence that plant foods, which are very low in saturated fat, are beneficial. It has been shown time and time again that these foods can reduce heart disease.

Thus here is a clear statement that should allow us all to react sensibly!



From a recent analysis of a large amount of data, researchers discovered an increased risk of death among users of proton pump inhibitors (PPIs), the acid-suppressing drugs that include the popular brands Nexium, Protonix, Prevacid, Prilosec, Kapidex, and others. When researchers compared those agents with the other class of acid reducers, the H2 blockers (Zantac, Pepcid, Tagamet, Axid, and others), those taking PPIs for one to two years were found to have a 50 percent increased risk of dying over a five-year period. People may have the idea that PPIs are very safe because they are readily available, but there are real risks to taking these drugs, particularly for long periods of time.

Previous published studies had linked PPIs to kidney disease, and other researchers have shown an association with other lesser health problems.  Although prior research had disclosed that each of these side effects carried a small risk of death, this study suggested that together they may affect the mortality rate of PPI users.

To find out, the researchers sifted through millions of de-identified veterans’ medical records in a database maintained by the U.S. Department of Veterans Affairs. They identified 275,933 people who had been prescribed a PPI and 73,355 people prescribed an H2 blocker between October 2006 and September 2008, and noted how many died and when over the following five years. The database did not include information on cause of death.

They found a 25 percent increased risk of death in the PPI group compared with the H2 blocker group. The researchers calculate that, for every 500 people taking PPIs for a year, there is one extra death that would not have otherwise occurred. Given the millions of people who take PPIs regularly, this could translate into thousands of excess deaths every year.
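The arithmetic behind that “one extra death per 500 users” figure can be checked with a quick back-of-the-envelope calculation. The baseline annual death rate of 0.8 percent used below is an illustrative assumption, not a figure reported in the study:

```python
# Back-of-the-envelope check of the PPI excess-death figure.
# Assumption (not from the study): baseline annual death rate of 0.8%.
baseline_annual_death_rate = 0.008
relative_risk = 1.25  # the 25% increased risk reported for PPI vs. H2 blocker users

# Absolute risk increase: extra deaths per person per year of PPI use
absolute_risk_increase = baseline_annual_death_rate * (relative_risk - 1)

# Number needed to harm: people treated for one year per one extra death
number_needed_to_harm = 1 / absolute_risk_increase
print(round(number_needed_to_harm))  # prints 500, matching the reported figure

# Scaled to a hypothetical 15 million long-term PPI users:
excess_deaths = 15_000_000 * absolute_risk_increase
print(round(excess_deaths))  # prints 30000, i.e. thousands of excess deaths yearly
```

Under these assumed inputs, the reported figures are internally consistent: a 25 percent relative increase on a modest baseline death rate works out to roughly one extra death per 500 person-years of use.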

The researchers also calculated the risk of death in people who were prescribed PPIs or H2 blockers despite not having the gastrointestinal conditions for which the drugs are recommended. Here, the researchers found that people who took PPIs had a 24 percent increased risk of death compared with people taking H2 blockers.

Further, the risk rose steadily the longer people used the drugs. Within 30 days, the risk of death in the PPI and H2 blocker groups was not significantly different, but among people taking the drugs for one to two years, the risk to PPI users was nearly 50 percent higher than that of H2 blocker users.

The researchers concluded that PPIs are often prescribed for a good medical reason, but then doctors never stop them, and patients just keep getting refill after refill. There need to be periodic reassessments of whether people still need to be on these drugs.

As compared with the H2 blocker group, people in the PPI group were older and also somewhat sicker, with higher rates of diabetes, hypertension and cardiovascular disease. But these differences could not fully account for the increased risk of death.

From this information, I would conclude that for most conditions helped by the relief of excessive gastric acidity, one should stick with H2 blockers whenever possible; if one must take a drug of the PPI group, try to limit its use to less than one month at a time.



At its core, all chiropractic is based on an unscientific theory of human disease: that all or most disease results from faulty alignment of the vertebrae. If chiropractic manipulation appears to solve one’s back pain, the pain probably wasn’t medically significant to begin with. One of the problems with chiropractic treatment is that evidence for its effectiveness is entirely anecdotal, because it is nearly impossible to analyze chiropractic with double-blind, placebo-controlled studies. If such studies could be done, they would likely show either that chiropractic is no better than a placebo or that it offers no measurable advantage over a massage. The practice’s founder, D.D. Palmer, created it based on the flawed notion that the root of all human illness lies in so-called “misalignments” of the spine (as opposed to things like germs, viruses, and genetic anomalies). Palmer sold his method of “adjustments” to correct these misalignments as a way to “naturally” cure people of their problems; he even went so far as to make the dubious claim of curing deafness the first time he ever laid hands on a patient.

But at present, chiropractors are hot: According to the Bureau of Labor Statistics, they rake in about $81,210 per year, and their ranks are expected to grow 17 percent in the next few years.

And it’s primarily because humans have terrible backs. We just haven’t evolved to keep up with the physical stress of gravity on a straight back, combined with desk jobs that have us crunched over computers for hours on end. A full 80 percent of Americans will deal with back pain at some point in their lives; one in five people reported back pain in the last year alone. About one-third of those folks saw a chiropractor or other alternative practitioner to deal with their back pain in 2016.

But does chiropractic work? The industry is, by definition, an alternative to evidence-based medicine, and aspects of it can be pretty worrying. At the same time, some experts say this treatment has a place.

Even in 2017, chiropractors are full of odd ideas, with many patients reporting being routinely hurt and misled. The Mayo Clinic warns that chiropractic adjustments can cause herniated discs or make already herniated discs worse. Chiropractic patients also often suffer compressed nerves, even strokes. A colleague of mine, a neurosurgeon, informed me that he has personally observed patients who have been rendered quadriplegic (paralyzed from the neck down) following chiropractic manipulation of the neck. The former chiropractor and skeptic Samuel Homola writes that many chiropractors engage in aggressive and scammy behavior to separate patients from their cash.

One of the most disturbing complaints I hear comes from chiropractic patients who have paid thousands of dollars in advance for a course of treatment lasting several months — after succumbing to a high-pressure sales pitch involving scare tactics. These patients have usually opted to discontinue treatment because symptoms have either worsened or disappeared. Most have signed a contract, however, that does not allow a refund, even if the treatment regimen was not completed. Some have used a chiropractic “health care credit card” to borrow the advance payment from a loan company, leaving the patient legally bound to repay the loan.

Edzard Ernst, an expert in pain and its treatments, who has studied the effectiveness of chiropractic medicine, has written columns suggesting that chiropractors often do more harm than good. “You will lose some cash,” he tells a questioner via email about what a typical patient might expect from seeing a chiropractor. “You might get some relief in the case of back pain, but not for other conditions … In the worst case, you might be in a wheelchair for the rest of your life.”

But as those numbers above show, chiropractors are doing great financially and patients are flocking to them. There’s about one of them for every two dentists in this country. Wander around any American city or suburb and you’ll likely spot their offices sandwiched between the local FedEx and Panda Express. They’ve even found their way into hospitals, where they work alongside regular doctors and nurses. And here’s where the big BUT comes into play: Some of those doctors actually like it.

Stuart Kahn is a doctor and professor of rehabilitation and physical medicine at Mount Sinai Hospital in New York City. He treats patients with debilitating lower back pain all day. If your back goes out, he’s the guy you want to see. And, every once in a while, he sends patients to chiropractors, stating, “The best thing you can do is diagnose what the cause of the back pain is, and then you can try to treat it.” And that’s something only a doctor is qualified to do. When a patient walks into his office with back pain, Kahn’s first task is to rule out cancers, infections, fractures, and other disorders that require specialized treatment. He can also prescribe anti-inflammatory medicines that reduce swelling and pressure on the spinal cord, saving patients from further pain and damage. But most of Kahn’s patients fall into two baskets: Either they will live with chronic back pain for the rest of their lives, or they have some acute problem, like a slipped disc, that needs to be dealt with. For the latter group, he works on building a treatment regimen that can lessen their daily pain and improve their range of motion and quality of life. The most important part of that regimen is usually physical therapy. After a period of work, he said, “they’re more flexible, their core is stronger, they have better posture at work, they try to cut out the exercises that trigger the episodes.”

But the former group consists of a narrow subset of patients who require management not only of pain but of accompanying stress and emotion, and this is the group he thinks might be helped by chiropractic management. In these cases, improvement is likely attributable largely to the so-called “placebo effect,” i.e., the emotional lift that can make a useless “treatment” actually suppress physical pain. As I have described in detail¥, the placebo effect can be quite powerful, especially when accompanied by an attentive and sympathetic therapist combined with physical contact (“laying on of hands”). No one is better positioned to fulfill these criteria than a chiropractor. Thus there do seem to be some people whom chiropractors can help, but a truly physical benefit is questionable. In any case, however, those adjustments should be part of a course of treatment recommended by a medical expert, not the bloke hawking $10,000 neck twists next to Denny’s.

CONCLUSION: If your back hurts, see a doctor (MD type), and then let him/her decide whether you should consider a chiropractor. My preference, however, is to stick with a licensed physical therapist and forgo chiropractic entirely.



¥ Tavel ME. Snake Oil Is Alive and Well: The Clash Between Myths and Reality (Reflections of a Physician). Chandler, AZ: Brighton Press; 2012.

Tavel ME. The Placebo Effect: The Good, the Bad, and the Ugly. The American Journal of Medicine. 2014;127(6):484–488.




There are those who believe that health care lends itself to the usual market forces, meaning that competition will bring about the best products at the lowest prices. For instance, comparative quality ratings and pricing for items such as autos and vacuum cleaners allow us to obtain the best products at the lowest possible costs.

But do these principles apply to health care? Clearly not, for several reasons. For example, if we suddenly become ill and need an ambulance, we summon the nearest local provider at its prevailing charges (which can be substantial), and we are then taken to a nearby medical facility and usually charged exorbitant rates for emergency and/or hospital services. If you are lucky enough to possess decent insurance, you will be billed according to whatever has—or has not—been agreed to with the ambulance service and hospital as reasonable compensation, and you will usually be required to ante up for any co-pays or deductible amounts in your contract. All prices are entirely out of your control and obviously not subject to free-market forces. And so it goes through the entire spectrum of medical care, which includes drugs and devices, doctors’ fees, and numerous tests and additional services. At the end of this process, you are apt to receive an incomprehensibly large bill that is untethered from reality or market forces, and even if you aren’t responsible for most of the payment, the money must come from somewhere, for your insurance carrier is not a philanthropic organization.

So, how much are we contributing as a nation to these healthcare expenditures? The bills total approximately $3 trillion annually, or about 17% of our overall economy. Of that total, hospital bills account for 40-50%; tests and ancillary services, 20-30%; doctors, 20%; drugs and devices, 15%; and nursing homes, 5%. These amounts are generally twice the total expenditures of other western countries, which generally range from 6-11% of their respective economies. Yet our outcomes seem no better than those of the aforementioned nations; some experts even argue that our illness and mortality rates are worse. Complicating factors, such as obesity and poor lifestyle choices, may account for some of the poor outcomes in this nation. Nevertheless, at best, our healthcare system provides no clear advantage over those of other nations.

So how do we explain all this? We are overpaying for virtually all components of our health care system, and we are doing so because there are few if any restraints on the charges. For instance, Obamacare put no controls on the pricing of drugs or clinical care. Pharmaceutical companies’ charges are not only unrestrained, but these companies can often “game” the system to overcharge for older, generic drugs. The insurance carriers were granted unrestricted leeway in setting premiums and deductibles in exchange for allowing policies that provide maternity and preventive care and that mandate coverage for patients with preexisting conditions. Hospitals can pad their bills through the use of opaque charges that include all sorts of add-ons and “facility” fees, making them all but impossible to decipher. Fortunately, Medicare serves as a partial restraint on many of these excesses: For instance, it applies the so-called Diagnosis Related Group (DRG) system to bundle and restrain allowable hospital charges for given diseases and/or treatments. Although this system does restrain charges somewhat, medical purveyors often use other means to circumvent these limits, and hospitals can still bill private insurers at higher rates, depending on prior agreements. Also, the coding of procedures and even of physicians’ services has become a science of gaming to extract the highest possible tariffs. In all cases, those who are uninsured—and are least likely to afford them—receive the highest bills.

In order to understand these large expenditures, we can learn from other countries’ experiences. Although there are several contrasting systems, they all employ governmental price controls coupled with universal participation. In Germany and France, for instance, all individuals must be insured. Most people purchase state-sponsored insurance, with premiums based upon one’s income. Private insurance is allowed and may supplant the base insurance for the few who can afford deluxe services. In Canada and Australia, a single payer system is used, analogous to Medicare for all, making private insurance unnecessary. In the United Kingdom and Denmark, an extensive health care structure includes a single payer system with state ownership of hospitals and medical infrastructure. Notably, all these systems couple price controls for services with the requirement for participation by the entire population, a factor that spreads the costs widely and is sufficient to cover all those with “pre-existing” conditions.

The U.S. could adopt any of these methods, but a single payer system (a “public option,” or Medicare for all) would seem the most direct and cost-effective. Administrative costs for Medicare average about 2-3%, in comparison with about 20-30% for most private insurers. Even under the ACA (“Obamacare”) mandate limiting private insurers’ administrative costs to 15% of total outlays (the other 85% being devoted to health care), this is still a significant amount. Moreover, private companies can encourage larger medical bills, thus increasing the overall size of their pool while passing the costs on to those who are insured. This means that the 15% could be substantially greater as a portion of a larger pie, allowing CEOs and other directors to receive millions in compensation.
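To see what those administrative percentages mean in dollars, here is a rough sketch using the $3 trillion total spending figure cited above; applying each rate to the whole pot is a simplifying assumption for illustration, since in reality spending is split among many payers:

```python
# Rough illustration of administrative overhead at different rates,
# using the ~$3 trillion total U.S. health spending figure cited above.
total_spending = 3_000_000_000_000  # ~$3 trillion per year

medicare_admin_rate = 0.03  # roughly 2-3% for Medicare
private_admin_cap = 0.15    # ACA cap on private insurers' non-care share

# Hypothetical: what overhead would be if all spending flowed through each system
medicare_overhead = total_spending * medicare_admin_rate
private_overhead = total_spending * private_admin_cap

print(f"Medicare-style overhead: ${medicare_overhead / 1e9:.0f} billion")
print(f"Private-insurer overhead at the 15% cap: ${private_overhead / 1e9:.0f} billion")
print(f"Difference: ${(private_overhead - medicare_overhead) / 1e9:.0f} billion per year")
```

Even under this crude assumption, the gap between roughly $90 billion and $450 billion in yearly overhead conveys the scale of potential administrative savings.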

Expanded Medicare would not preclude the addition of supplemental private insurance, as we now have in combination with its basic coverage. An overall plan must be empowered to limit prices for all drugs, procedures, and hospital bills, which would control the entire cost structure of the medical system, allowing us to emulate costs of other western countries. Nationwide pharmaceutical prices must be subjected to negotiated limits as placed by Medicare or related agencies.  A single payer system would also simplify record keeping and unify documents, reducing time required by physicians and office personnel.

Obviously this is but a start, and other issues must be addressed that are too numerous to cover here.

In all cases, rational solutions must contain two vital components: 1) mandatory participation by the entire population, and 2) careful and rational control of all expenditures.

Any program lacking these two vital components will be, at best, too costly, or, at worst, socially unacceptable or disastrous.




The Mounting Evidence Against Diet Sodas

Studies suggest possible links between low-calorie beverages and health risks, though more research is needed

Many people think of diet sodas as healthy, low-calorie alternatives to sugary drinks. Yet a small but growing body of evidence suggests that diet sodas may have health downsides and may not even provide the benefits some people turn to them for, such as weight loss. Excess sugar intake is a problem in Western society because it contributes to obesity, diabetes, and other conditions. We know that diet beverages are becoming more popular, but we don’t have a lot of research into the effects of diet beverages on different aspects of health.

According to a 2016 study published in the Journal of the Academy of Nutrition and Dietetics, nearly half of adults and a quarter of children in the U.S. consume artificial sweeteners—and the majority do so on a daily basis. Diet drinks make up the bulk of the intake. So here is what we know so far about diet sodas and their role in health, and what you can do to make smart beverage choices in the meantime.

Not So Heart Smart?

The strongest evidence so far links regular diet soda intake with cardiovascular conditions, such as stroke and heart attack, as well as type 2 diabetes and obesity (which are also risk factors for cardiovascular disease). For example, a recent study of about 4,400 people age 45 and older found that those who drank one or more diet sodas every day were three times more likely to have a stroke than those who didn’t. This study, however, had several limitations and didn’t prove that diet sodas themselves caused people to have strokes. Although it could be that people who drink diet sodas are in poorer health than people who don’t, these findings do jibe with previous research, and thus strike a note of caution. For example, three large studies published between 2007 and 2009 found that people who drank diet sodas regularly were more likely to develop type 2 diabetes and had between 30 and 55 percent higher risk of metabolic syndrome (a constellation of health problems that can increase the risk of type 2 diabetes, heart disease, and stroke) than those who didn’t. Research from 2012 further bolstered these results: one study of about 2,600 people linked daily diet soda consumption to about a 45 percent higher risk of heart attack, stroke, and early death.

A Cautious Interpretation

The studies linking diet sodas and cardiovascular risk are intriguing, but they still need to be repeated in more rigorous settings. For example, all of these studies relied on participants self-reporting their dietary habits, which can introduce error because people don’t always remember what they ate. Additionally, those who drink diet sodas may already be at increased risk of conditions such as diabetes or obesity because they are unhealthy to begin with. For example, someone who is overweight may have switched from regular soda to diet soda to help control an already burgeoning waistline.

And not every study has shown that diet sodas negatively affect health. For example, in 2012 researchers from the Harvard School of Public Health analyzed the drinking habits of almost 43,000 men and found that those who drank sugary drinks had a higher risk of coronary heart disease, but those who drank diet sodas did not.

Another reason scientists hesitate to say definitively that diet sodas are bad for your health is that they aren’t sure how they increase disease risk. It’s possible that artificial sweeteners may damage blood vessels—possibly explaining their link to diseases such as diabetes and stroke. It’s also possible that the artificial sweeteners commonly used in diet sodas may “trick” the brain into craving rich, high-calorie foods, leading to weight gain. They may also cause changes in hormone levels or gut bacteria, both of which play a role in weight and insulin management. For example, a study published in the journal Nature in 2014 found that artificial sweeteners altered intestinal bacteria in people and mice, increasing their risk of sugar intolerance, a condition often preceding diabetes. However, these various ideas warrant larger, more rigorous studies.

What to Do

In general, your best bet is to avoid regular and diet sodas altogether. They offer little nutritional benefit, and in some cases, diet sodas may even cause headaches. For example, shortly after the artificial sweetener aspartame came onto the market in the early 1980s, one of the biggest complaints the Food and Drug Administration received about the sweetener concerned headaches. No scientific studies have proved that aspartame or diet sodas in general cause headaches, but a review of evidence published in The Clinical Journal of Pain in 2009 suggested that large amounts of the sweetener—such as the amount in five or more diet sodas—could trigger headaches or make them worse in people who are already susceptible to migraines.

In the end, an occasional soda—with sugar or artificial sweeteners—is probably fine. But your best bet is to stick with water, plain or sparkling, as much as possible. If you find unflavored water boring, add a splash of bitters with a slice of lemon or lime. Unsweetened tea is also a great choice.