I’m writing this newsletter while on a flight to New York for a Save the Children board meeting. (And I just had a glass of wine, which seems more potent at 39,000 feet!) So I am tempted (and will indeed succumb) to announce a new initiative by the LA Associates of Save the Children. Two other board members from LA and I have organized a group of women and their daughters in an effort to raise money to build several schools in Africa. The first of these schools will provide primary education in a rural area in Mozambique. A school for over 300 students ages 5 to 13 equipped with desks, books and trained teachers can be built for $140,000. (Not that much considering the $230 million it took to build a high school in LA.) There have been some amazing studies published in the medical journal Lancet that demonstrate the fact that if girls in the developing world are given a primary education, the subsequent child mortality decreases by 50%! That global number currently stands at 8 million children who die every year… half from preventable causes. (It’s a number that I can’t even begin to contemplate as I look at my children and new grandchildren!) Having now “gone public” with our school building effort, if any of my readers or patients want to help, just email me or call the office… Now, onto the medical subject I promised in the title:

There has been an estimated 70% decrease in new use of hormone therapy since the Women’s Health Initiative was reported in 2002. And as we gather more information about the ongoing results of this study, there is a definitive and much publicized concern about the increase over time in the risk of breast cancer in women who were on the combination of Premarin and the synthetic progestin Provera. Many physicians now prefer prescribing other forms of estrogen, such as estradiol (especially transdermal, i.e., delivered via skin absorption), together with progesterone, both similar to what the ovaries produced during our reproductive years. But even then, we are concerned about long-term use. I don’t question the amazing diminishment of symptoms that estrogen therapy renders, especially in the early years of menopause. But at some point, I remind my patients that if they want to continue long-term use, they have to consider whether they are doing so for quality of life (and abeyance of symptoms) or for medical reasons. I have to point out the risks… Well, this time, I also want to point out a study that indicates a risk in stopping hormone therapy.

A recent observational study of postmenopausal women aged 60 or older was conducted using data from 11 Kaiser Permanente medical centers here in Southern California. The researchers assessed the risk of fracture for women who stopped taking hormone therapy compared with those who continued. Between July 2002 and December 2008, hormone therapy use in this population decreased from 85% to 18%. The survey took into account age and race before comparisons were made. Women who had not used hormone therapy in the previous year had a 55% increased risk of hip fracture; moreover, the risk of hip fracture increased significantly with 2 or more years of HT cessation.

The study was limited in that the researchers did not have information about previous fractures in this population. But they did compare mean bone mineral density (BMD) and found that it was directly associated with cumulative years of HT use. (In other words, the longer the women were on HT, the better their bone densities.)

More information is needed before most physicians will warn all patients not to stop their HT for fear of fracture. But this study does give reason for a medical discussion.

Bottom line: If you do stop hormone therapy, you should be followed with appropriate assessment of your bone status. If your risk is deemed high (age over 65, low or decreasing scores on DEXA bone density scans, a significant family history of hip fracture, a previous personal fracture, and/or the use of certain medications, especially steroids), then you may need to consider osteoporosis medications such as bisphosphonates. As usual, I will end with the phrase “discuss this with your doctor.”

Voilà, the genome is tabulated! Researchers are discovering the genes and the genetic variants that contribute to disease. The implications are huge. Knowledge of the genome allows scientists and physicians to better understand the risk for a disease, how and why it develops, and to search for potential therapies and cures. So if we know all this, why not test our own genes so we can be told… “Yes (or no), we are at risk for heart disease, diabetes, osteoarthritis, cancer or aging!” Direct-to-consumer genomewide profiling is out there… just go online and you will find many companies willing to test your genetic risk for 20 to 40 common diseases. Here’s a list of some of those diseases that I downloaded on my first Google search:

  • Lupus
  • Graves’ disease
  • Celiac disease
  • Multiple sclerosis
  • Psoriasis

Cardiovascular Conditions
  • Aneurysm
  • Atrial fibrillation
  • Heart disease
  • Peripheral arterial disease
  • Venous thromboembolism

Aging
  • Macular degeneration
  • Alzheimer’s disease
  • Osteoarthritis
  • Rheumatoid arthritis

General Health
  • Obesity
  • Migraine
  • Type 1 diabetes
  • Type 2 diabetes

Cancers
  • Bladder cancer
  • Breast cancer
  • Colorectal cancer
  • Gastric cancer
  • Lung cancer
  • Prostate cancer
  • Skin cancer

These include many of our future health worries (with exceptions such as neurological diseases, mental health disorders, accidents, and sudden death). The tests examine approximately 500,000 bases (and their variants) in a person’s DNA, usually obtained by swabbing the inside of the mouth (not exactly spit, but the title seemed pretty catchy). The tests are available for $400 to $2,000, and consultation with a health provider is not a prerequisite. The $500,000 question (a dollar for each DNA base tested) is whether this information will have a positive impact on your lifestyle choices, and whether it will encourage you to get more tests and health screens that could diagnose a high-risk disease. The proponents of this type of genetic testing say yes. The naysayers say that it simply results in anxiety, and an increased use of unnecessary and expensive screening and medical procedures.

Not an easy choice. So I was delighted to see an article that weighed in on the subject in the New England Journal of Medicine’s February 10th issue. It was written by genomic researchers at Scripps in La Jolla, California. They recruited subjects from health and technology companies who wished to pursue this type of testing, and gave them a discounted rate. (Wow, your genes on sale!) They initially enrolled 3,639 participants, but only 2,037 completed the trial. (I guess when it came down to it, not everyone cared to know their potential health risks.) The results were graded, and the recipients were given a lifetime risk for each of 22 conditions. These were then compared with the average lifetime risk for each condition. To make it even clearer, those tested were given a color-coded risk chart. Orange indicated either an overall lifetime risk of more than 25%, or a risk that was more than 20% above average. Gray indicated a low risk. (Colorless was good.) Four to six months after receiving the genetic test results, the subjects were surveyed for any changes in symptoms of anxiety, intake of dietary fat, changes in exercise, and whether they had test-related stress or increased their use of health screening tests. They were also asked if they intended to undergo any of 13 health screening tests with greater frequency than before they received the results, whether they had spoken to a genetic counselor about their results, and whether they had shared their results with a physician.

The results were somewhat surprising: There were no significant differences in the level of anxiety, dietary intake of fat (I guess they still had their steaks), or exercise behavior between baseline and follow-up in the group as a whole. Moreover, there were no associations between risk seen on the reports (i.e., an “orange risk”) and changes in anxiety or behaviors. However, there was an association between measures of risk and the screening tests the subjects intended to complete with greater frequency in the future. (This goes under… “Yes, I know I am at risk; when I have a chance, I’ll get tested.”) It will be interesting to see how many actually do.

Only 10.4% discussed their results with a board-certified genetic counselor (and counselors were available free of charge for the study). Twenty-six percent reported sharing their results with their physician. Those who did reported no increased anxiety, but they did proceed to have a lower fat intake and exercised more. (At least that!)

The authors go on to point out that, “There is evidence that different genomewide testing companies and laboratories produce discrepant risk estimates,” some indicating increased risk and others indicating decreased risk (as compared with average risk) for the same condition in the same person. So, the validity of the tests may not be worth the worry (or lack thereof).

Bottom line: This study supports the hypothesis that the results of direct-to-consumer genomic risk testing do not affect short-term health-related behavior. It is probably more helpful to go over your family’s medical history, as well as your past and present health-affecting behaviors, with your physician. Together you can then decide on behavioral changes and appropriate screening tests that you will (I hope) implement. Future studies and follow-up of the consequences of genetic testing in the general population may change my advice… let’s see what happens in the years to come.


I just knew that a substance that causes such culinary happiness would, for once, be good for us! (And I am tired of giving bad news or “don’t-do-this” advice in this newsletter.) So, I’ll take the opportunity to smile as I report thus:

Researchers in Australia have found that women older than 70 who eat (or ate) chocolate at least once a week were 35% less likely to die from heart disease over a 10-year period and were 60% less likely to be hospitalized or die from heart failure. (A serving of chocolate was considered 1 cup of hot cocoa.) In an article published in the Archives of Internal Medicine, over 1,200 women over 70 were followed for a decade. They were asked to report on how much, and how often, they consumed chocolate. About half the women stated that they consumed less than a serving a week. Of these, 90 were hospitalized or died from heart disease during those 10 years, compared with 65 women who ate chocolate more frequently. And to add to this, 35 infrequent chocolate eaters developed heart failure, whereas “only” 18 of the women who ate chocolate once a week or more did. I immediately thought… well, if once a week was good, would daily be better? The researchers found, however, that those who ate chocolate just once a week did just as well in this study as the women who ate it daily. (At least more didn’t hurt.)

Now I know there are a lot of issues that were not controlled for in this study. First, a baseline comparison of heart health in these 70-year-old women at the beginning of the study was not entirely accounted for. Second, how many developed diabetes? But never mind…

These data do make me want to give myself a chocolate treat… telling myself that my chocolate pleasure receptors need stimulation, I deserve it, and yes, the flavonoids in it (mostly in dark chocolate, which is what I love) will be good for my heart. And the older I get, perhaps the more my heart will be grateful!

I know that the conclusions of a recent article published in the New England Journal of Medicine may not appeal to many anti-choice advocates. But I rather doubt that even S.P. (you know who I mean) could figure out how to use statistics to render the following Danish study invalid. So here it goes: The National Centre for Register-Based Research in Denmark, together with researchers at the University of Copenhagen, conducted a study that linked information from the Danish Civil Registration System and the Danish Psychiatric Central Register. (It turns out that everyone in this small country is medically accounted for.) They looked at data for girls over 15 and women with no record of mental disorders between 1995 and 2007 who had a first-trimester induced abortion or a first childbirth during that period.

This included a total of 954,702 girls and women. (It should be stated that abortion became legal in Denmark in 1973, and as a result any woman 18 years or older can have a termination of pregnancy within the first 12 weeks of pregnancy; if she is younger, permission from a legal guardian or parent is necessary.) The girls and women in the study population were followed individually from 9 months before a first-time abortion or birth of a live infant through 12 months after the event, or until a psychiatric contact occurred for the first time. (In other words, medical care in Denmark is extraordinarily well documented and allowed researchers to determine if the women had psychiatric evaluations and diagnoses of mental disorders well before, as well as after, they either had an abortion or delivered.)

This was not a retrospective study where a woman is asked if she has had a termination and if she developed psychiatric problems subsequent to, or as a result of, the termination. That type of request for recall would be negatively biased and certainly not controlled. In this study, the women were their own controls, since their history was known before they were pregnant. Moreover, a comparable group of women who went on to deliver was used as a comparison. Now let me end my analysis of their analysis and simply give you the results…

A total of 84,620 girls and women had a first-time first-trimester abortion during the period from 1995 to 2007. And 1% (868) had a first psychiatric contact (inpatient or outpatient) during the 9 months before the abortion, whereas 1.5% (1,277) did so within the first 12 months after the abortion. When this is calculated in what we call incidence rates, mental health issues were clinically diagnosed at the rate of 14.5 per 1000 person-years before abortion and 15.2 per 1000 person-years after abortion. Statistically, the risk of psychiatric contact did not differ before and after abortion.

When they looked at the girls and women who had live births during the same time period, they found that a total of 280,930 gave birth to their first-born child. Of these girls and women, 0.3% (790) had a first-time psychiatric contact within the 9 months preceding delivery, and 0.7% (1,916) from 0 to 12 months postpartum. This translates to an incidence rate of 3.9 per 1000 person-years before childbirth and 6.7 per 1000 person-years after childbirth.
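For readers who like to see where these “per 1000 person-years” figures come from, here is a rough back-of-the-envelope sketch. Note that the study’s actual denominators reflect each woman’s individual follow-up time (with censoring at first psychiatric contact), which we don’t have; approximating with full follow-up for everyone (9 months = 0.75 years before, 12 months = 1 year after) lands close to, but not exactly on, the published rates.

```python
def incidence_per_1000_py(events, people, years_followed):
    """Events divided by total person-years, scaled to 1000 person-years.

    Assumes every person contributes the full follow-up period, which is
    a simplification of how the study actually counted person-time.
    """
    person_years = people * years_followed
    return 1000 * events / person_years

# Abortion group (84,620 women): 868 contacts in the 9 months before,
# 1,277 in the 12 months after.
before = incidence_per_1000_py(868, 84620, 0.75)   # ~13.7 (article reports 14.5)
after = incidence_per_1000_py(1277, 84620, 1.0)    # ~15.1 (article reports 15.2)

# Childbirth group (280,930 women): 790 contacts before, 1,916 after.
birth_before = incidence_per_1000_py(790, 280930, 0.75)  # ~3.7 (article reports 3.9)
birth_after = incidence_per_1000_py(1916, 280930, 1.0)   # ~6.8 (article reports 6.7)

print(round(before, 1), round(after, 1), round(birth_before, 1), round(birth_after, 1))
```

The small gaps between these approximations and the published rates are exactly what censoring produces: women who had a psychiatric contact stopped contributing person-time at that point, shrinking the denominator and nudging the true rates up or down.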

The researchers could therefore go on to make the statement that “the risk of psychiatric contact did not differ significantly before and after abortion but the risk after childbirth was significantly greater than the risk before childbirth.”

It’s pretty obvious that the girls and women who did undergo termination had a higher propensity toward mental health disorders before their pregnancy than those who went on to have a live birth. (Perhaps this affected their use of contraception and/or social or medical reasons that led them to have an abortion.) The higher number of psychiatric consults and need for diagnoses and care among the postpartum women probably reflect the enormity of delivering and caring for a first-born, as well as the hormonal changes that add to postpartum depression.

Bottom line: This large Danish study has demonstrated that first-trimester induced abortion does not increase risk of mental disorders.
