New Diagnostic Test for the Early Detection of Lung Cancer

I was invited to discuss a new diagnostic test for the early detection of lung cancer by Gerri Willis of the Fox Business Network’s The Willis Report.
An Italian clinical study presented at the September 2014 European Respiratory Society meeting described 82 patients with abnormal chest x-rays. Patients breathed into a machine that measured the temperature of the exhaled air. Forty of the patients ultimately proved to have cancer and 42 did not, as confirmed by subsequent biopsy. The investigators found a correlation between the temperature of the exhaled breath and the presence of lung cancer. They also found that long-term smokers had higher breath temperatures, as did those with higher-stage disease.

For a variety of reasons, a test as simple as breath temperature seems unlikely to be highly specific. After all, the temperature of the exhaled breath could reflect infection, inflammation, or even activity level, as vigorous exercise can raise the body’s core temperature. Nonetheless, the fact that there is any correlation at all is of interest.
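To make the sensitivity/specificity distinction concrete, here is a minimal Python sketch built on a hypothetical 2×2 confusion table. The group totals match the study (40 cancers, 42 non-cancers), but the true-positive and false-positive splits are invented for illustration, since the abstract does not report them.

```python
# Hypothetical 2x2 table for a breath-temperature cutoff.
# Row totals match the study (40 cancers, 42 non-cancers);
# the cell-level splits are invented for illustration only.
TP, FN = 34, 6    # cancers above / below the temperature cutoff
TN, FP = 32, 10   # non-cancers below / above the cutoff

def sensitivity(tp, fn):
    """Fraction of true cancers the test correctly flags."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of cancer-free patients the test correctly clears."""
    return tn / (tn + fp)

print(f"sensitivity = {sensitivity(TP, FN):.2f}")   # 0.85
print(f"specificity = {specificity(TN, FP):.2f}")   # 0.76
```

The concern raised above is that fever, inflammation, or recent exercise would raise breath temperature in cancer-free patients, inflating FP and driving specificity down even if sensitivity held up.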

What might underlie these findings? Accepting the shortfalls of this small study, it is an interesting point of discussion. First, cancer is a hypermetabolic state. Cancers consume increased quantities of glucose, proteins, and lipids; PET scans measure these phenomena every day. Second, cancer is associated with hypervascularity. Up-regulation of VEGF could cause hyperemia (increased capillary blood flow) in the airways of lung cancer patients, contributing to the finding. Finally, cancer is, in and of itself, an inflammatory state. Inflammation reflects increased metabolic activity that could manifest as a whole-body change in basal temperature.

Lung cancer is the leading cause of cancer death in the US, accounting for 27% of all cancer deaths. Despite more than 224,000 new diagnoses and 160,000 deaths each year, the five-year survival for lung cancer, today at 17%, has not changed in several decades. Nonetheless, patients whose disease is detected early (Stage I) have a greater than 50% five-year survival.

We know from the National Lung Screening Trial, whose results were first reported in 2010, that early detection by CT scan can reduce mortality from this disease by 20%. In the cancer literature, that is huge. The problem is that screening CTs are comparatively expensive and inconvenient, expose patients to radiation, and are themselves fraught with false positives and false negatives. Furthermore, it is estimated that broad application of spiral CTs could cost over $9 billion a year. Thus, simple, non-invasive screening techniques are sorely needed.

The use of exhaled breath to diagnose cancers has been in development for decades. Recently, investigators from the Cleveland Clinic and others from Israel have reported good results with a microchip that measures the concentration of volatile organic compounds in the breath and provides a colorimetric score. In studies of several hundred patients, areas under the receiver operating characteristic curve (ROC, a technique that gauges the sensitivity and specificity of a test) in the range of 0.85 (1.0 is perfect) are quite favorable. Although these techniques have not yet gained broad application, they are extremely interesting from the standpoint of what it is they are actually measuring.
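For readers unfamiliar with the metric, the area under the ROC curve equals the probability that a randomly chosen patient with cancer scores higher on the test than a randomly chosen patient without it. A minimal, dependency-free Python sketch (the scores below are made up for illustration, not data from the cited studies):

```python
def roc_auc(labels, scores):
    """Area under the ROC curve, computed as the probability that a
    randomly chosen positive case outscores a randomly chosen negative
    case (ties count half) -- the Mann-Whitney U formulation."""
    pos = [s for lab, s in zip(labels, scores) if lab == 1]
    neg = [s for lab, s in zip(labels, scores) if lab == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = cancer, 0 = no cancer; scores are hypothetical
# breath-test readings.
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(roc_auc(labels, scores))  # 0.75; 0.5 is chance, 1.0 is perfect
```

By this yardstick, an AUC around 0.85 means the breath test would rank a cancer patient above a cancer-free patient roughly 85% of the time.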

For decades, the principal focus of scientific exploration in cancer has been genomic. Investigators at Boston University and others at MD Anderson in Texas have used the genomic and methylation status of oro- and naso-pharyngeal swabs to identify the earliest hallmarks of malignant transformation. By contrast, the breath tests described above measure phenomena that fall more in the realm of metabolomics. After all, these are measures of cellular biochemical reactions and identify the transformed state at a metabolic level.

Though still in its infancy, metabolomics reflects the most appealing of all cancer analyses. Examining cancer for what it is, rather than how it came to be, uses biochemistry, enzymology and quantitative analyses. These profile the tumor at the level of cellular function. Like the platforms that I utilize (EVA-PCD), these metabolic analyses examine the tumor phenotype.

I applaud these Italian investigators for using a functional approach to cancer biology. This is a highly productive direction and fertile ground for future research. Will breath temperature measurement prove sensitive and specific enough to diagnose cancer at early stage? It is much too early to say, but at least for now, I wouldn’t hold my breath.

Expert Advice – Another Wrinkle

Few dictates of modern medicine could be considered more sacrosanct than the prohibition of excess salt in our daily diets. For more than five decades, every medical student has had the principle of dietary salt reduction drummed into his or her head. Salt was the bane of human health, the poison that created hypertension, congestive heart failure, stroke, and renal failure, and contributed to the deaths of untold millions in western society. At least so it seemed.

Three articles in the August 14, 2014 issue of the New England Journal of Medicine raise serious questions about the validity of that heretofore established principle of medical therapeutics.

Two of the articles utilized urinary sodium and potassium excretion as a surrogate for dietary intake to examine impact on blood pressure, mortality and cardiovascular events overall. A third article applied a Bayesian epidemiologic modeling technique to assess the impact of sodium intake on cardiovascular mortality.

The first two articles were unequivocal. Low sodium intake, that is, below 1.5 to 2 grams per day, was associated with an increase in mortality. High sodium intake, that is, greater than 6 grams per day, was also associated with an increase in mortality; but the middle ground, that which reflects the usual sodium intake in most western cultures, did not pose a risk. Thus, the sodium intake associated with the western diet was safe. What is troubling, however, is the fact that very low sodium diets, those promulgated by the most established authorities in the field, are in fact hazardous to our health.

It seems that every day we are confronted with a new finding that refutes an established dogma of modern medicine. I have previously written blogs on the intake of whole milk or consumption of nuts, both of which were eschewed by the medical community for decades before being resurrected as healthy foodstuffs in the new millennium. One by one these pillars of western medicine have fallen by the wayside. To this collection, we must now add the low-salt diet.

Thomas Kuhn, in his 1962 book The Structure of Scientific Revolutions, observed that an established paradigm falls only when a new one arises that can replace it. Perhaps these large meta-analyses will serve that purpose for sodium intake and health. One can only wonder which other medical sacred cows should now be subjected to these types of inquiries.

As a researcher in the field of human tumor biology and purveyor of the EVA-PCD platform for the prediction of chemotherapy drug response and oncologic discovery, I am intrigued, and encouraged, by the scientific community’s growing ability to reconsider its most established principles as new data force a re-examination of long-held beliefs. It may only be a matter of time before more members of the oncologic community re-examine the vast data supporting the predictive validity of these ex vivo analyses and come to embrace these important human tumor phenotypic platforms. At least we can hope so.

Toward A 100% Response Rate in Human Cancer

Oncologists confront numerous hurdles as they attempt to apply the new cancer prognostic and predictive tests. Among them are the complexities of gene arrays that introduce practicing physicians to an entirely new lexicon of terms like “splice variant,” “gene rearrangement,” “amplification,” and “SNP.”

Although these phrases may roll off the tongue of the average molecular biologist (mostly PhDs), they are foreign and opaque to the average oncologist (mostly MDs). To address this communication shortfall, laboratory service providers supply written addenda (some quite verbose) to clarify and illuminate the material. Some institutions have taken to convening “molecular tumor boards,” where physicians most adept at genomics serve as “translators.” Increasingly, organizations like ASCO offer symposia on modern gene science to the rank and file, a sort of Cancer Genomics for Dummies. If we continue down this path, oncologists may soon know more but understand less than any other medical sub-specialists.

However well intended these educational efforts may be, none of them is prepared to address the more fundamental question: How well do genomic profiles actually predict response? This broader issue lays bare our tendency to confuse data with results and big data with big results. To wit, we must remember that our DNA, originally provided to each of us in the form of a single cell (the fertilized ovum), carries all of the genetic information that makes us, us. From the hair follicles on our heads to the acid-secreting cells in our stomachs, every cell in our body carries exactly the same genetic data, neatly scripted onto our nuclear hard drives.
What makes this all work, however, isn’t the DNA on the hard drive, but instead the software that judiciously extracts exactly what it needs, exactly when it needs it. It’s this next level of complexity that makes us who we are. While it is true that you can’t grow hair or secrete stomach acid without the requisite DNA, simply having that DNA does not mean you will grow hair or make acid. Our growing reliance upon informatics has created a “forest for the trees” scenario, focusing our gaze upon nearby details at the expense of larger trends and insights.

What is desperately needed is a better approximation of the next level of complexity. In biology that moves us from the genotype (informatics) to the phenotype (function). To achieve this, our group now regularly combines genomic, transcriptomic or proteomic information with functional analyses. This enables us to interrogate whether the presence or absence of a gene, transcript or protein will actually confer that behavior or response at the system level.

I firmly believe that the future of cancer therapeutics will combine genomic, transcriptomic and/or proteomic analyses with functional (phenotypic) analyses.

Recent experiences come to mind. A charming patient in her 50s underwent a genomic analysis that identified a PI3K mutation. She sought an opinion. We conducted an EVA-PCD assay on biopsied tissue that confirmed sensitivity to drugs targeting the PI3K pathway. Armed with this information, we administered Everolimus at a fraction of the normal dose. The response was prompt and dramatic, with resolution of liver function abnormalities, normalization of her performance status, and a quick return to normal activities. A related case occurred in a young man with metastatic colorectal cancer. He had received conventional chemotherapies, but at approximately two years out his disease again began to progress.

A biopsy revealed that despite prior exposure to Cetuximab (the antibody against EGFR), there was persistent activity for the small-molecule inhibitor Erlotinib. Consistent with prior work that we had reported years earlier, we combined Cetuximab with Erlotinib, and the patient responded immediately.

Each of these patients reflects the intelligent application of available technologies. Rather than treat individuals based on the presence of a target, we can now treat based on the presence of a response. The identification of targets and confirmation of response has the potential to achieve ever higher levels of clinical benefit. It may ultimately be possible to find effective treatments for every patient if we employ multi-dimensional analyses that incorporate the results of both genomic and phenotypic platforms.