
The Shocking Truth About Skin Cancer: What You’re Not Being Told About the Sun

Now, it all makes sense.


This article originally appeared on The Forgotten Side of Medicine and was republished with permission.

Guest post by A Midwestern Doctor

Story at a Glance:

• Skin cancers are by far the most commonly diagnosed cancer in the United States, so to prevent them, the public is constantly told to avoid the sun. However, while the relatively benign skin cancers are caused by sun exposure, the ones responsible for most skin cancer deaths are due to a lack of sunlight.

• This is unfortunate because sunlight is arguably the most important nutrient for the human body, as avoiding it doubles one’s rate of dying and significantly increases one’s risk of cancer.

• A strong case can be made that this dynamic was the result of the dermatology profession (with the help of a top PR firm) rebranding itself as a band of skin cancer fighters, something which allowed it to become one of the highest paying medical specialties in existence. Unfortunately, despite the billions that are put into fighting it each year, there has been no substantial change in the number of skin cancer deaths.

• In this article, we will also discuss the dangers of the conventional skin cancer treatments, the most effective ways for treating and preventing skin cancer, and some of the best strategies for having a healthy and nourishing relationship with the sun.

Note: in February’s open thread, I presented some potential articles, and since this topic was one of the most requested, I have spent the last month working on it.

Ever since I was a little child, something seemed off about the fact that everyone would get hysterical about how I needed to avoid sunlight and always wear sunscreen whenever we had an outdoor activity—so to the best of my ability, I just didn’t comply. As I got older, I started to notice that beyond the sun feeling really good, anytime I was in the sun, the veins under my skin that were exposed to it would dilate, which I took as a sign the body craved sunlight and wanted to draw it into the circulation. Later still, I learned that a pioneering researcher found significant alterations would occur in the health of people who wore glasses that blocked specific light spectrums (e.g., most glass blocks UV light) from entering the eyes (the most transparent part of the body), and that these people could be treated by giving them specialized glasses which did not block that spectrum from entering.

Note: all the above touches upon one of my favorite therapeutic modalities—ultraviolet blood irradiation, which will be the focus of an upcoming article.

Later, when I became a medical student (at which point I was familiar with the myriad benefits of sunlight), I was struck by how neurotic dermatologists were about avoiding sunlight. For instance, in addition to hearing every patient I saw there be lectured about the importance of avoiding sunlight, I learned through my classmates of dermatologists in the northern latitudes (which get so little sunlight that people suffer from seasonal affective disorder) effectively requiring their students to wear sunscreen and clothing covering most of their body while indoors. At this point my perspective on the issue changed to “this crusade against the sun is definitely coming from the dermatologists” and “what on earth is wrong with these people?” A few years ago, I learned the final piece of the puzzle through Robert Yoho MD and his book Butchered by Healthcare.

The Monopolization of Medicine

Throughout my life, I’ve noticed three curious patterns in the medical industry:

• They will promote healthy activities people are unlikely to do (e.g., exercising or smoking cessation).

• They will promote clearly unhealthy activities industries make money from (e.g., eating processed foods or taking a myriad of unsafe and ineffective pharmaceuticals).

• They will attack clearly beneficial activities that are easy to do (e.g., sunlight exposure, eating eggs, consuming raw dairy, or eating butter).

As best I can gather, much of this is rooted in the scandalous history of the American Medical Association, which began in 1899 when George H. Simmons, MD took possession of the floundering organization (MDs were going out of business because their treatments were barbaric and didn’t work). He, in turn, started a program to give products the AMA seal of approval in return for the manufacturers disclosing their ingredients and agreeing to advertise in a lot of AMA publications (they were not, however, required to prove their product was safe or effective). This maneuver was successful, and in just ten years, it increased the AMA’s advertising revenues 5-fold and its physician membership 9-fold.

At the same time this happened, the AMA moved to monopolize the medical industry by doing things such as establishing a general medical education council (which essentially said their method of practicing medicine was the only credible way to practice medicine), which allowed them to then become the national accrediting body for medical schools. This in turn allowed them to end the teaching of many of the competing models of medicine, such as homeopathy, chiropractic, naturopathy, and to a lesser extent, osteopathy—as states would often not give licenses to graduates of schools with a poor AMA rating.

Likewise, Simmons (along with his successor, Fishbein, who reigned from 1924 to 1950) established a “Propaganda Department” in 1913 to attack all unconventional medical treatments and anyone (MD or not) who practiced them. Fishbein was very good at what he did and could often organize massive media campaigns against anything he elected to deem “quackery” that were heard by millions of Americans (at a time when the country was much smaller).

After Simmons and Fishbein created this monopoly, they were quick to leverage it. This included blackmailing pharmaceutical companies into advertising with them, demanding that the rights to a variety of healing treatments be sold to the AMA, and sending the FDA or FTC after anyone who refused to sell out (which in at least one case was proven in court, since one of Fishbein’s “compatriots” thought what he was doing was wrong and testified against him). Because of this, many remarkable medical innovations were successfully erased from history (part of my life’s work, and much of what I use in practice, are essentially the therapies Simmons and Fishbein largely succeeded in wiping off the Earth).

Note: to illustrate that this is not just ancient history, consider how viciously and ludicrously the AMA attacked the use of ivermectin to treat COVID (as it was the biggest competitor to the COVID cartel). Likewise, one of the paradigm-changing moments for Pierre Kory (which he discusses with Russell Brand here) was that after he testified to the Senate about ivermectin, he was put into a state of shock by the onslaught of media and medical journal campaigns from every direction trying to tank ivermectin and destroy his and his colleagues’ reputations (e.g., they got fired and had papers which had already passed peer review retracted). Two weeks into it, he got an email from Professor William B. Grant (a vitamin D expert) that said “Dear Dr. Kory, what they’re doing to ivermectin they’ve been doing to vitamin D for decades” and included a 2017 paper detailing the exact playbook industry uses again and again to bury inconvenient science.

Before long, Big Tobacco became the AMA’s biggest client, which led to countless ads like this one being published by the AMA which persisted until Fishbein was forced out (at which point he became a highly paid lobbyist for the tobacco industry):

Note: because of how nasty they were, they often got people to dig into their past, at which point it was discovered how unscrupulous and sociopathic both Simmons and Fishbein were. Unfortunately, while I know from first-hand experience this was the case (e.g., a friend of mine knew Fishbein’s secretary and she stated that Fishbein was a truly horrible person she regularly saw carry out despicable actions and I likewise knew people who knew the revolutionary healers Fishbein targeted), I was never able to confirm many of the abhorrent allegations against Simmons because the book they all cite as a reference did not provide its sources, while the other books which provide different but congruent allegations are poorly sourced.

The Benefits of Sunlight

One of the oldest “proven” therapies in medicine was having people bathe in sunlight (e.g., it was one of the few things that actually had success in treating the 1918 influenza; prior to antibiotics, it was one of the most effective treatments for tuberculosis; and it was also widely used for a variety of other diseases). In turn, since it is safe, effective, and freely available, it stands to reason that unscrupulous individuals who wanted to monopolize the practice of medicine would want to cut off the public’s access to it.

Note: the success of sunbathing was the original inspiration for ultraviolet blood irradiation.

Because of how successful the war against sunlight has been, many people are unaware of its benefits. For example:

1. Sunlight is critical for mental health. This is most well appreciated with depression (e.g., seasonal affective disorder), but in reality the effects are far more broad-reaching (e.g., unnatural light exposure destroys your circadian rhythm).

Note: I really got this point during my medical internship, where, after a long period of night shifts under fluorescent lights, I noticed I was becoming clinically depressed (which has never otherwise happened to me, and led to a co-resident I was close to offering to prescribe antidepressants). I decided to do an experiment (I do this a lot—e.g., I try never to recommend treatments to patients that I haven’t already tried on myself) and stuck with it for a few more days, then went home and bathed under a full-spectrum bulb, at which point I almost instantly felt better. I feel my story is particularly important for healthcare workers, since many people in the system are forced to spend long periods of their day under artificial light, and their mental health (e.g., empathy) suffers greatly from it. For example, consider this study of Chinese operating room nurses, which found their mental health was significantly worse than the general population’s and that this decline was correlated to their lack of sunlight exposure.

2. A large epidemiological study found women with higher solar UVB exposure had only half the incidence of breast cancer as those with lower solar exposure and that men with higher residential solar exposure had only half the incidence of fatal prostate cancer.
Note: a 50% reduction in either of these cancers greatly exceeds what any of the approaches we use to treat or prevent them have accomplished.

3. A 20 year prospective study evaluated 29,518 women in Southern Sweden where average women from each age bracket with no significant health issues were randomly selected, essentially making it one of the best possible epidemiologic studies that could be done. It found that women who were sun avoidant compared to those who had regular exposure to sunlight were:

• Overall, 60% more likely to die: roughly 50% more likely to die than the moderate exposure group and roughly 130% more likely to die than the group with high sun exposure.

Note: to be clear, there are very few interventions in medicine that do anything close to this.

• The largest gain was seen in the risk of dying from heart disease, the second largest in the risk of death from all causes besides heart disease and cancer (“other”), and the third largest in deaths from cancer.

Note: the investigators concluded the smaller benefit in reduced cancer deaths was in part an artifact of the subjects living longer and hence succumbing to a type of cancer that would have only affected them later in life.

• The largest benefit was seen in smokers, to the point non-smokers who avoided the sun had the same risk of dying as smokers who got sunlight.

Note: I believe this and the cardiovascular benefits are in large part due to sunlight catalyzing the synthesis of nitric oxide (which is essential for healthy blood vessels) and sulfates (which coat cells like the endothelium and, in conjunction with infrared light or sunlight, create the liquid crystalline water that is essential for the protection and function of the cardiovascular system).

So given all of this, I would say that you need a really good justification to avoid sun exposure.

Skin Cancer

According to the American Academy of Dermatology:

Skin cancer is the most common cancer in the United States. Current estimates are that one in five Americans will develop skin cancer in their lifetime. It is estimated that approximately 9,500 people in the U.S. are diagnosed with skin cancer every day.

Basal cell and squamous cell carcinomas, the two most common forms of skin cancer, are highly treatable if detected early and treated properly.

Because exposure to UV light is the most preventable risk factor for all skin cancers, the American Academy of Dermatology encourages everyone to stay out of indoor tanning beds and protect their skin outdoors by seeking shade, wearing protective clothing — including a long-sleeved shirt, pants, a wide-brimmed hat and sunglasses with UV protection — and applying a broad-spectrum, water-resistant sunscreen with an SPF of 30 or higher to all skin not covered by clothing.

Likewise according to the Skin Cancer Foundation:

More than 2 people die of skin cancer in the U.S. every hour.

That sounds pretty scary. Let’s now break down exactly what that means.

Note: fortunately, there is much more awareness of the vast benefits of vitamin D now (which comes from sunlight exposure). However, since many of the sun’s benefits come from things besides creating vitamin D, the current position dermatology is beginning to pivot to (that you can substitute “unsafe” sunlight exposure with vitamin D) is not advice I can at all support.

Basal Cell Carcinoma

By far the most common type of skin cancer is basal cell carcinoma (comprising 80% of all skin cancers), which for reference looks like this:

The exact incidence of BCC varies greatly, ranging from 14 to 10,000 cases per million persons, and within the United States, it is generally believed that around 2.64 million people get one per year (with around 4.32 million total cancers occurring since some people get more than one). The three primary risk factors for BCC are excessive sun exposure, fair skin (which makes you more susceptible to excessive sunlight penetrating your skin), and a family history of skin cancer. Because of this, the widely varying incidence of BCC is largely due to how much sunlight exposure people have, and typically you find it in areas with frequent sunlight exposure (e.g., the face).

The important thing to understand about BCC is that because it almost never metastasizes, it is not very dangerous. Most sources say it has a 0% fatality rate. Instead, it’s normally evaluated by how likely it is to recur once it’s removed (which ranges from 65% to 95%, depending on the source).

Note: I feel one of the biggest shortcomings of the excision-based approach to skin cancer is that it does not address the underlying causes of cancer; it can frequently lead to skin cancers recurring and more and more skin needing to be cut off (which becomes problematic as more of it is removed). This in turn is particularly dangerous when a potentially deadly one recurs.

Squamous Cell Carcinoma

The second most common type of skin cancer, cutaneous Squamous Cell Carcinoma (SCC) looks as follows:

Since it is also caused by sunlight, its incidence varies greatly, ranging from 260 to 4970 per million person-years, with an estimated 1.8 million cases occurring each year in the United States. Previously, BCC was thought to occur around 4 times as often as SCC, but now that gap has closed to it only being twice as common. Unlike BCC, SCC can be dangerous, as it does metastasize. In turn, if it is removed prior to metastasizing, it has a 99% survival rate, but if removed after metastasis, this drops to 56%. As SCC is typically caught before this happens (in 1-2 years, 3-9% of them will metastasize), the average survival rate for this cancer is around 95%, and around 2000 people (although some estimates go as high as 8000) are thought to die from SCC each year in the United States.
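The roughly 95% average survival figure can be sanity-checked with a quick weighted average of the numbers above (a rough sketch; the published estimates may weight things differently):

```python
# Rough weighted survival for cutaneous SCC, using the figures cited above:
# ~99% survival when removed before metastasis, ~56% after,
# and roughly 3-9% of cases metastasizing before removal.
surv_local = 0.99
surv_metastatic = 0.56

for metastasis_rate in (0.03, 0.09):
    overall = (1 - metastasis_rate) * surv_local + metastasis_rate * surv_metastatic
    print(f"metastasis rate {metastasis_rate:.0%}: overall survival ~ {overall:.1%}")
```

This lands in the 95–98% range, consistent with the roughly 95% average survival quoted above.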

Note: since BCC and SCC are unlikely to kill people, unlike the other skin cancers, doctors are not required to report them, and there is hence no centralized database tabulating how many of them occur. As a result, the BCC and SCC numbers are largely estimates.

Melanoma

Melanoma is estimated to occur at a rate of 218 cases per million persons in the United States each year (with the risk varying by ethnicity). However, despite comprising only 1% of all skin cancer diagnoses, melanoma is responsible for most of the deaths from skin cancer. Since survival is greatly improved by early detection, many guides exist online to help one recognize the common signs of a potential melanoma:

The five-year survival rate for melanoma depends upon how far it has spread at the time of diagnosis (ranging from 99% to 35% and averaging out to 94%), which again makes it important to identify correctly—but likewise, some cases are aggressive and metastasize quickly (so they often don’t get caught in time), and those variants have a 15–22.5% survival rate. In total, this works out to a bit over 8,000 deaths each year in the United States.
Note: these aggressive melanoma variants likely distort the overall survival statistics for the cancer.

What’s critically important to understand about melanoma is that while it’s widely considered to be linked to sunlight exposure—it’s not. For example:

• A study of 528 patients with melanoma found those who had solar elastosis (a common change in the skin that follows excessive sun exposure) were 60% less likely to die from melanoma.

• 87% of all SCC cases occur in regions of the body that have significant sunlight exposure, such as the face (which comprises 6.2% of the body’s total surface area), while 82.5% of BCCs occur in those regions. Conversely, only 22% of melanomas occur in these regions. This indicates that SCC and BCC are linked to sun exposure but melanoma is not, which is congruent with the fact that melanomas are constantly found in areas that get almost no sunlight exposure.

• Outdoor workers get 3–10 times the annual UV dose that indoor workers get, yet they have lower incidences of cutaneous malignant melanoma and an odds ratio (risk) that is half that of their indoor colleagues.

• A 1997 meta-analysis of the available literature found workers with significant occupational sunlight exposure were 14% less likely to get melanoma.

• Existing research has found that using sunscreen either has no effect on the rates of malignant melanoma or increases them, which makes it quite frustrating that governments around the world always parrot the advice to wear more of it, especially whenever melanoma rates are rising (in other words, exactly what we also see with the COVID-19 vaccine drives).
Note: a case can be made that the chemicals in sunscreen cause skin cancer, and likewise some evidence exists for this with certain cosmetic products on the market.

• A (now forgotten) 1982 study of 274 women found that fluorescent light exposure at work caused a 2.1 times increase in their risk of developing malignant melanoma, with this risk increasing with more fluorescent light exposure, whether due to the exposure at their job (1.8X with moderate-exposure jobs, 2.6X with high-exposure jobs) or the time spent working at it (i.e., 2.4X more likely for 1-9 years of work, 2.8X for 10-19 years, and 4.1X for over 20 years).
Note: there is some evidence these lights also affect animals (e.g., this study showed they dramatically dropped milk production).

• There has been a significant increase in melanoma in many areas, something which argues against sunlight being the primary cause, as sunlight exposure has not significantly changed in the last few decades. For instance, consider this data from Norway’s cancer registry on malignant melanoma:

Note: the evidence linking sunlight exposure to an increased risk of developing melanoma is more conflicting, as some data points show a small reduction, while others show a small increase (e.g., this study found sunlight exposure caused the risk of melanoma to increase by approximately 20%). However, while a small increase in melanoma incidence is sometimes seen, the opposite occurs with respect to the size of melanomas (e.g., one study found those on the trunk were over twice as large as those on the arms) and how likely they are to kill someone (which is what actually matters).

Rare Skin Cancers

This section is not important to read; I’m primarily including it as a reference to support the primary point and for completeness.

• Merkel cell carcinoma: 7 cases per million person-years in the USA; 52-78% survival rate; possible link to sunlight.

• Kaposi sarcoma: 3 to 6 cases per million people in the USA; 41-81% survival rate; primarily due to immune suppression (e.g., AIDS, organ transplantation, possibly COVID vaccines); a possible small link to sunlight.

• Cutaneous T-cell lymphoma: 6.4 to 8.55 cases per million people in the USA; 39.4-67.4% survival rate; primarily due to immune suppression and specific infections (there are also numerous noteworthy cases of it happening after COVID vaccination—including one of the participants in Moderna’s clinical trial).

• Dermatofibrosarcoma protuberans: 0.8 to 4.5 cases per million persons per year; 99.1% survival rate; risk factors not known.

• Microcystic adnexal carcinoma: 0.52 cases per million people per year; 88.1-98.1% survival rate; linked to previous radiation therapy, immunosuppressing medications, and sunlight exposure.

• Acral lentiginous melanoma: 1.8 cases per million person-years; 67.5-80.3% survival rate; risk factors unknown but generally agreed not to be linked to sunlight exposure.

• Sebaceous carcinoma: 2.43 cases per million person-years; 50-78% survival rate; linked to immune-suppressing drugs, radiation therapy, and an existing genetic defect.

• Extramammary Paget disease: 0.4-0.7 cases per million people; 81.6-91.8% survival rate; not linked to sunlight exposure.

Note: the final rare skin cancer, undifferentiated pleomorphic sarcoma occurs at a rate of 30 cases per million people each year, but in many cases, it does not show up in the skin, so it’s harder to get an exact statistic for it.

The Great Dermatology Scam

If you consider the previous section, the following should be fairly clear:

• By far the most common “skin cancer” is not dangerous.

• The “skin cancers” you actually need to worry about are a fairly small portion of the existing skin cancers.

• Sunlight exposure does not cause dangerous cancers (except for SCC, which is nowhere near as dangerous as the others).

In essence, there’s no way to justify “banning sunlight” to “prevent skin cancer,” as the “benefit” from this prescription is vastly outweighed by its harm. However, a very clever linguistic trick bypasses this contradiction—a single label, “skin cancer,” is used for everything, which then selectively adopts the lethality of melanoma, the frequency of BCC, and the sensitivity to sunlight that BCC and SCC have.

This has always really infuriated me, so I’ve given a lot of thought to why they do this.

Note: Dr. Malcolm Kendrick helps provide some perspective on how this game is played throughout the medical industry by sharing a story from Michael Baum MD:

Each year I play a game with the senior postgraduate students at a course for specialists in cancer run by the Royal College of Surgeons of England. I tell them that there are two potentially effective screening tools for prostate cancer, one which will reduce their chances of dying from the disease by between 20 and 30 per cent, while the other will save one life after 10,000 person-years of screening. As a consumer or as a public health official, which one would you buy into? They all vote for the first; yet the two programmes are the same, they were just packaged differently. To continue marketing screening in terms of relative risk reduction in breast cancer mortality is disingenuous in the extreme.
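Baum’s game is easy to reproduce with arithmetic. The sketch below assumes a purely illustrative baseline of 4 disease deaths per 10,000 person-years, chosen so that the two framings describe the same program:

```python
# Two framings of one hypothetical screening program.
# Assumed baseline (illustrative only): 4 deaths per 10,000 person-years unscreened.
baseline_deaths = 4.0              # deaths per 10,000 person-years without screening
relative_risk_reduction = 0.25     # "cuts your chance of dying by 25%"

deaths_screened = baseline_deaths * (1 - relative_risk_reduction)
lives_saved = baseline_deaths - deaths_screened

print(f"Relative framing: {relative_risk_reduction:.0%} reduction in disease mortality")
print(f"Absolute framing: {lives_saved:.0f} life saved per 10,000 person-years")
```

A 25% relative risk reduction and one life saved per 10,000 person-years are then the same fact, packaged differently.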

However, I must emphasize that some skin cancers (e.g., many melanomas) require immediate removal. My point here is to encourage you not to avoid dermatologists entirely, but to consider seeking a second opinion from another dermatologist if you are unsure about what has been suggested to you, as there are many excellent and ethical dermatologists practicing in the field as well.

The Most Desired Specialty

Much of the medical education process consists of providing medical students with carrots (incentivizing rewards) they can obtain if they work incredibly hard, are highly compliant, and demonstrate an above-average degree of aptitude. This in turn motivates premeds to work very hard in college (e.g., giving up their social life), then medical students to keep working very hard (even though they “made it” and already got into medical school), and then often medical residents to keep on doing that (so they can get into a prestigious fellowship). One of the key incentives here is getting into a prestigious specialty, as those typically command more respect and pay more.

Dermatology is commonly seen as the most desired specialty as it:

• Has a relatively short post-medical school training period (only four years).

• Has a relatively relaxed work-life balance (e.g., you only work normal hours during the weekdays and can take days off).

• Rarely involves high-acuity or challenging patients, so the stress in this field is very low.

• Is one of the highest paying specialties. The average starting dermatologist’s salary is $400,000 a year, although many, such as Mohs surgeons, often make at least $600,000 (and often far more). By comparison, general practitioners typically make around $220,000 annually.

Note: with an average base salary of around $700,000 a year, neurological surgery is typically the highest paying specialty. While this is a lot, I think it’s “fair,” as beyond the specialty being extremely challenging and nerve-wracking (e.g., many brain surgeries are 3-8 hours long, and many even longer ones also happen—throughout all of which the surgeon has to be extremely precise or risk a catastrophe and a large lawsuit), you have to spend 7 years after medical school training to become a neurological surgeon, and even longer (an additional 1-2 years) to specialize in certain aspects of neurological surgery.

It’s remarkable the dermatology profession was able to pull this off, and as a result, the field tends to attract the most competitive students who really want the incredible lifestyle and salary a private dermatology practice can offer (even though during the application process, everyone typically claims they want to be an academic researcher, since that’s what gets you in). Likewise, one of the smartest doctors I know (who had a good heart and the capacity to improve medicine) ultimately went into dermatology for those reasons, and as a result, an incredible amount of potential was wasted (similarly, I believe a major reason innovative research has been so slow in dermatology is that its compensation model attracts gifted physicians who are not interested in research).

Note: one of the most challenging things in dermatology is that accurately diagnosing all the different skin lesions that exist requires a fair bit of intelligence and training (so most doctors outside the specialty can’t do it). As many challenging skin lesions do exist, it’s important that doctors who can do this exist (although it’s quite possible that in the years to come, AI diagnostic technologies will address some of this).

The Transformation of Dermatology

Not too long ago, dermatology was one of the least desired professions, as much of what dermatologists did was essentially just dealing with acne and pimples in the era before Accutane (which, unlike most pharmaceuticals, actually works—but is unfortunately incredibly toxic and has permanently disabled a few people I knew quite well).

A relatively unknown blog by dermatologist David J. Elpern, MD at last explained what happened:

Over the past 40 years, I have witnessed these changes in my specialty and am dismayed by the reluctance of my colleagues to address them. This trend began in the early 1980s when the Academy of Dermatology (AAD) assessed its members over 2 million dollars to hire a prominent New York advertising agency to raise the public’s appreciation of our specialty. The mad men recommended “educating” the public to the fact that dermatologists are skin cancer experts, not just pimple poppers; and so the free National Skin Cancer Screening Day was established.

These screenings serve to inflate the public’s health anxiety about skin cancer and led to the performance of vast amounts of expensive low-value procedures for skin cancer and actinic keratosis (AKs). At the same time, pathologists were expanding their definitions of what a melanoma is, leading to “diagnostic drift” that misleadingly increased the incidence of melanoma while the mortality has remained at 1980 levels. Concomitantly, non-melanoma skin cancers are being over-treated by armies of micrographic surgeons who often treat innocuous skin cancers with unnecessarily aggressive, lucrative surgeries.

A 2021 journal article provides additional context to Dr. Elpern’s remarks:

Skin cancer screenings began at the community level in the 1970s. The first nationwide public skin cancer screening program was started by the American Academy of Dermatology in 1985 after the rising incidence and mortality rate of malignant melanoma gained increasing attention in the early 1980s. In the early years of the program, President Ronald Reagan signed proclamations creating the “National Skin Cancer Prevention and Detection Week,” and the “Older Americans Melanoma/Skin Cancer Detection and Prevention Week,” and the total body skin examination became the gold standard for skin cancer screening.

Note: this article also shares that the American government has long been extremely doubtful of the value of these screenings, an obstacle the dermatology field has had to continually lobby to surmount.

In short, as has happened many times in America, a remarkably sophisticated public relations campaign was launched to transform society for the benefit of an industry.

Note: in a recent article about the widespread psychological changes vaccination has created in American society (which, as the surveys within it show, the majority of readers also observed), I discussed the widespread push to impose “gender-diversity” upon our children, and as many commenters noted, this sudden shift could only be explained by a massive PR operation, which, as it turns out, is being funded by a group of activist billionaires who wish to transform society.

I am relatively certain a few of the core components of this campaign were:

• A recognition that skin cancers are by far the easiest cancer to diagnose (since you just have to see them).

• Demonizing the sun, as doing so allowed dermatologists to cast themselves as heroes and to stir up as much anxiety as possible about the sun (especially since the psychological investment of constantly putting on sunscreen would make people more likely to go to their dermatologist).

• Creating a massive sales funnel by performing a massive number of routine full-body skin exams (on otherwise healthy individuals) and hence having a huge pool of potential cancers to biopsy or excise (remove).

Note: in addition to feeding this sales funnel, dermatologists are paid to use liquid nitrogen to freeze off each “precancerous” lesion found during an appointment (which takes seconds to do and adds around 100 dollars to the reimbursement for the visit). Sadly, while this is often pitched as removing a precancer, research shows most of these lesions disappear on their own (55% within one year, 70% by 5 years) and very few become an SCC (squamous cell carcinoma; 0.6% in one year, and 2.57% in 4 years), which makes it hard to justify the cost of this procedure, especially given that it is not without side effects.

• Piggybacking onto the fear the medical industry has marketed around cancer, which justified charging a lot of money for something questionable done to “prevent cancer,” and ensured every patient would go along with it the second they heard the dreaded “c” word.

Note: in a recent article, I described how this principle was exploited with Lupron, a poor prostate cancer drug with a range of severe side effects, which was then pivoted into an incredibly lucrative drug used for a variety of unproven purposes (e.g., assisting a gender transition).

Specifically, their ultimate play was to justify charging a lot of money to surgically remove skin cancers, often being paid more for this than a surgeon receives for a standard procedure (which to some extent is justified, as the surgery also requires a pathologic examination midway through). For reference, the procedure works as follows:

The essential purpose of Mohs surgery is to allow a much smaller incision (i.e., not cutting away any more than you have to), which can make a big difference for a patient, since large holes in the face can be devastating. This is accomplished in part by pausing the surgery midway through and examining what was cut out under a microscope to determine whether all of the cancer was extracted and nothing more needs to be cut away (whereas in conventional surgery a larger margin is used to be on the safe side).

In turn, the “trick” to Mohs surgery is that since one is doing both surgery and pathology in the same visit, the doctor can bill for a variety of different things, which quite quickly adds up. To illustrate, consider this guide to billing for them and this Medicare summary of what is currently appropriate to bill for them. Per my understanding, this is a bit different from the more lucrative billing that existed about a decade ago, and it parallels recent policy changes in the insurance industry whereby dermatologists in most areas can only be reimbursed for Mohs surgeries if they have completed an additional 1–2 year fellowship in it (which has understandably made it a very competitive fellowship to gain admission into).

Note: the going rate for a Mohs surgery varies widely, but it is typically at least a few thousand dollars. Unlike other surgeries, most of that money goes to the dermatologist, since there is no hospital providing an OR, OR supplies, OR staff, and recovery services that takes a significant share of the reimbursement.

You thus might be able to guess what happened:

The rate of use of Mohs surgery among Medicare beneficiaries in the United States grew 700 percent between 1992 and 2009 [causing it to occupy the number one spot on Medicare’s list of “potentially misvalued” CPT codes], though there was little evidence to suggest in many cases that Mohs was superior to cheaper treatment options, which include scraping, snipping, or even applications of a cream to create a chemical burn. The big difference between these more pedestrian treatments and Mohs is the price tag: hundreds of dollars versus more than $10,000 or even $20,000 for Mohs.

For most benign skin tumors, “the decision to utilize Mohs Micrographic Surgery is likely to reflect the economic advantage to the provider rather than a substantial clinical advantage for the patient,” wrote Dr. Robert Stern, a Harvard dermatologist, noting that in 2012 America spent more than an estimated $2 billion on Mohs surgery, with wide variations in its use: even for sensitive locations like the face and the hands, it was used 53 percent of the time in Minnesota versus only 12 percent in New Mexico. Dr. Stern estimated that nearly 2 percent of all Medicare recipients had Mohs in that year.

Note: Dr. Stern shared with Elisabeth Rosenthal that he was on a 2012 panel convened by the professional dermatology societies to evaluate when it was actually appropriate to use a Mohs surgery (due to Medicare’s concerns over it being overused). Due to the procedural structure of the meeting, the panel ultimately voted to approve 83% of the possible indications for a Mohs surgery, leading to this (in Stern’s words): “A lot of us were surprised to see that many things that were quite controversial going in now looked positive and unanimous. How did that happen? It made us really uncomfortable…This was not a medical issue; it was a trade issue.”

To show what that change in the guidelines translated to:

A total of 10,726 dermatologists were identified in the database, representing 1.2% of all health care professionals and 3% of total Medicare payments ($3.04 billion of approximately $100 billion) [whereas dermatologists comprise slightly less than 1% of the physicians in the country]. Median payment per dermatologist was $171,397. Mean reimbursement for E/M was $77.59 per unit, whereas Mohs received a mean per-procedure reimbursement of $457.33 per unit. Among dermatologists, 98.9% received an E/M [general visit] payment and 19.9% received Mohs-related payments. Total payment to dermatologists was highest for E/M ($756 million), followed by Mohs ($550 million) and destruction of premalignant lesions [cryosurgery] ($516 million) [and then followed by $289 million for biopsies]. Compared with lower-billing dermatologists, top-billing dermatologists received a higher proportion of payments from Mohs and flaps/grafts and a lower proportion from E/M. The top 15.9% of dermatologists received more than half of total payments.

Note: this was for 2013 and it has likely risen since then (I couldn’t find a more recent study other than an article noting that in 2015, 5.9 million skin biopsies were performed on Medicare Part B recipients—a 55% increase from a decade earlier). Additionally, keep in mind that Medicare typically accounts for around 40% of dermatologists’ total patient volume and about 30% of their total practice revenue (although I’ve seen revenue estimates range from 30-60%), so this is just a fraction of what they actually make.

As you might guess, before long this opportunity also attracted the attention of more unscrupulous parties seeking to cash in on the bonanza. This in turn led the New York Times to investigate the industry, where they discovered:

• Private equity firms from Wall Street had entered the market and were buying out dermatology practices and staffing them with nurses and PAs (who were much cheaper to hire than doctors), despite advertising to the public that patients would see a doctor. This was unfortunate, since these pseudo-dermatologists biopsied over twice as many suspected skin cancers as dermatologists did, and in the course of ten years had gone from performing almost none of the biopsies billed to Medicare to over 15% of them. Likewise, they frequently missed actual cancers or misdiagnosed lesions any dermatologist could tell were not cancer (something which has also happened to people I know who’ve gone to these types of clinics), to the point that the 2017 NYT article was able to share an example of this happening.

Note: many existing dermatologists have complained about this practice in their academic journals (as they feel it is ruining the profession), but that has not stopped it (given how much money is on the line). Many of them now face the situation that if they try to open their own practice in a major city, they are stuck competing with a private-equity-owned clinic staffed by numerous (well advertised) midlevel practitioners. This isn’t necessarily a bad thing, however, as it forces those dermatologists to set up shop in areas that have fewer physicians (a shortage that is one of the biggest challenges the American medical system faces in almost every specialty).

• There has been a big push (e.g., by greedy doctors) to expand this franchise into nursing homes. For example, the Times covered a mobile practice in Michigan that sends clinicians to 72 nursing homes, where they performed thousands of cryosurgeries along with many steroid injections and minor surgeries. The investigation found that 75% of those patients had Alzheimer’s disease and that most of their skin lesions were inconsequential. Furthermore, the Times cited a healthcare analytics firm that looked at 17,820 procedures performed on patients over age 65 in the last year of life and found that skin biopsies and the freezing of precancerous lesions were performed frequently, often weeks before death.

• This was quite unfortunate as:

Dr. Linos added that physicians underestimate the side effects of skin cancer procedures. Complications such as poor wound healing, bleeding and infection are common in the months following treatment, especially among older patients with multiple other problems. About 27 percent report problems, her research has found.

Note: while dermatologic surgeries typically turn out quite well cosmetically when performed by a competent dermatologist on a patient who isn’t too far into old age, we often find that the scars from these surgeries can cause chronic issues (e.g., pain) that require either prolotherapy or neural therapy to correct. This is most common with facial surgeries, which could be due either to the inherent sensitivity of that highly innervated region or to the skin not tolerating being stretched and then sutured (which is what Mohs typically requires).

As I conclude this section, I will share that one of the things that always bothered me about some of the dermatology practices I shadowed was how “salesy” they felt: the same scripts would be repeated again and again to move patients through the skin cancer sales funnel, and simultaneously, the dermatologists would be very particular about having everything and everyone look as nice as possible (alongside displaying numerous advertisements meant to cater to their female patients’ physical insecurities).

Changes in Skin Cancer

Given how much money is being spent to end skin cancer, one would expect some results. Unfortunately, as with many other aspects of the cancer industry, that’s not what has happened. Instead, again and again, we see a tendency for more (previously benign) cancers to be diagnosed, but, for the most part, no significant change in the death rate.

The best proof of this came from this study, which looked at what type of malignant melanoma was actually being biopsied and found that almost all of the increase in “skin cancer” consisted of stage 1 melanomas, which rarely create problems:

This study, in turn, illustrates exactly what our war on skin cancer has accomplished:

Finally, since many suspected the COVID vaccines might lead to an increase in melanoma (or other skin cancers), and I could not find the relevant statistics online, I decided to create them by compiling all the available annual reports from the American Cancer Society into a few graphs:

Managing Skin Cancer and Sunlight

The primary purpose of this piece was to empower each of you, as I just think it’s really not cool that dermatologists abuse the fear patients have of cancer to push this lucrative business model along.

Nonetheless, I recognize that this article also raises a few obvious questions such as:

• Are there less invasive alternatives to skin cancer surgeries?

• What is the best way to safely interact with the sun (e.g., what are the safest sunscreen approaches)?

• What actually causes skin cancer and how can you prevent it?

In the final part of the article I will share our current thoughts on each of those questions.

Subscribe to A Midwestern Doctor’s Substack to read the rest.

Copyright 2024 The Forgotten Side of Medicine
