Findings

Hot spots

Yale research points to a way to measure someone's past sun exposure—and cancer risk.

Anuragrana18/Wikimedia Commons


You’re in the dermatologist’s office. You want to know your risk of developing skin cancer, the most common form of cancer in the US, the kind that one in five Americans will develop in their lifetime. The doctor asks: were you often sunburned as a kid? You cast your thoughts back—beach vacations, summer hikes, swimming holes. I don’t know, you think. Kind of?

“That’s the situation in the twenty-first century,” says Douglas Brash, a senior research scientist in therapeutic radiology and dermatology. “Which is why there is a push to get data” indicating one’s total past sun exposure—“not someone trying to remember whether they got a sunburn when they were three.”

New research published in the Proceedings of the National Academy of Sciences provides a potentially huge step toward this goal. With seven colleagues, Brash studied the genomes of human skin cells that had been exposed to UV radiation. He expected to find long stretches of the genome that were slightly more susceptible to UV damage than others; instead, he found many specific sites that were up to 170-fold more susceptible. (Susceptibility was measured by the extent of a particular type of DNA damage, known as a cyclobutane pyrimidine dimer.)

These “hyperhotspots” could ultimately serve as a measure of how much sun exposure a person has had. Brash is now at work designing a diagnostic tool for analyzing minute skin samples. Dermatologists could use the tool to identify patients who have sun damage and should be checked for early-stage cancers.
