Wednesday, June 29, 2011

The Sun Is the Best Optometrist

WHY is nearsightedness so common in the modern world? In the early 1970s, 25 percent of Americans were nearsighted; three decades later, the rate had risen to 42 percent, and similar increases have occurred around the world.

There is significant evidence that the trait is inherited, so you might wonder why our myopic ancestors weren’t just removed from the gene pool long ago, when they blundered into a hungry lion or off a cliff. But although genes do influence our fates, they are not the only factors at play.

In this case, the rapid increase in nearsightedness appears to be due to a characteristic of modern life: more and more time spent indoors under artificial lights.

Our genes were originally selected to succeed in a very different world from the one we live in today. Humans’ brains and eyes originated long ago, when we spent most of our waking hours in the sun. The process of development takes advantage of such reliable features of the environment, which then may become necessary for normal growth.

Researchers suspect that bright outdoor light helps children’s developing eyes maintain the correct distance between the lens and the retina — which keeps vision in focus. Dim indoor lighting doesn’t seem to provide the same kind of feedback. As a result, when children spend too many hours inside, their eyes fail to grow correctly and the distance between the lens and retina becomes too long, causing faraway objects to look blurry.

One study published in 2008 in the Archives of Ophthalmology compared 6- and 7-year-old children of Chinese ethnicity living in Sydney, Australia, with those living in Singapore. The rate of nearsightedness in Singapore (29 percent) was nearly nine times higher than in Sydney. The rates of nearsightedness among the parents of the two groups of children were similar, but the children in Sydney spent on average nearly 14 hours per week outside, compared with just three hours per week in Singapore.

Similarly, a 2007 study by scholars at Ohio State University found that, among American children with two myopic parents, those who spent at least two hours per day outdoors were only about a quarter as likely to be nearsighted as those who spent less than one hour per day outside.

In short, the biological mechanism that kept our vision naturally sharp for thousands of sunny years has, under new environmental conditions, driven visual development off course. This capacity for previously well-adapted genes to be flummoxed by the modern world can account for many apparent imperfections. Brain wiring that effortlessly recognizes faces, animals and other symmetrical objects can be thrown off by letters and numbers, leading to reading difficulties. A restless nature was once helpful to people who needed to find food sources in the wild, but in today’s classrooms, it’s often classified as attention deficit hyperactivity disorder. When brains that are adapted for face-to-face social interactions instead encounter a world of e-mail and Twitter — well, recent headlines show what can happen.

Luckily, there is a simple way to lower the risk of nearsightedness, and today, the summer solstice — the longest day of the year — is the perfect time to begin embracing it: get children to spend more time outside.

Parents concerned about their children’s spending time playing instead of studying may be relieved to know that the common belief that “near work” — reading or computer use — leads to nearsightedness is incorrect. Among children who spend the same amount of time outside, the amount of near work has no correlation with nearsightedness. Hours spent indoors looking at a screen or book simply mean less time spent outside, which is what really matters.

This leads us to a recommendation that may satisfy tiger and soccer moms alike: if your child is going to stick his nose in a book this summer, get him to do it outdoors.
