The mistimed death clock: How much time do I have left?

Someone has set up a macabre “death clock”, a web site where individuals can enter a few personal statistics — birthdate, sex, smoking status, and general level of optimism — and it will calculate a “personal date of death”, together with an ominous clock ticking down the seconds remaining in your life. (For Americans, ethnic group is a hugely significant predictor, but I’m not surprised that they leave this out. Ditto for family income.) It’s supposed to be a sharp dose of reality, I suppose, except that it’s nonsense.

Not because no one knows the day or the hour, though that is true, but because the author has built into the calculator a common but elementary misconception about life expectancy, namely, that we lose a year of expected remaining life for every year that we live. Thus, when I enter my data the clock tells me that I am expected to die on August 6 2042. If I move my birthdate back* by 10 years — making myself 10 years older — my date of death moves back by the same amount, to August 6 2032. If I tell it I was born in 1936 it tells me that my time has already run out, which is obviously absurd.

In fact, every year that you live you lose 1 year, but you gain back a share of the remainder equal to the probability that you might have died. Thus, a 46-year-old US man has an expected remaining lifespan of 33.21 years. He has probability 0.00365 of dying in the next year; if he makes it through that year and reaches his 47th birthday, his expected remaining lifespan is (33.21 − 1) + 0.00365 × 32.21 = 32.33 years.** So he has lost only 0.88 years off his remaining lifespan. In this way, it is actually possible to have a longer expected remaining lifespan at an older age than at a younger one, if the mortality rate is high enough. Thus, if we go back to 1933 mortality rates, the expected lifespan at birth was 59.2 years. But a 1-year-old, having made it through the 6.5% infant mortality, had 62.3 years remaining on average.
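To make the bookkeeping concrete, here is a minimal sketch of the calculation, using only the figures quoted above (the exact form of the update is the one given in the second footnote below):

```python
# Life-expectancy bookkeeping for the 46-year-old US man described above.
# The two published figures quoted in the text:
e46 = 33.21    # expected remaining years at age 46
q46 = 0.00365  # probability of dying between ages 46 and 47

# Approximate update: lose the year you lived, but gain back a share of
# the remainder equal to the chance you might have died.
e47_approx = (e46 - 1) + q46 * (e46 - 1)

# Exact update (see the second footnote).
e47_exact = (e46 - 1) / (1 - q46)

print(f"approximate: {e47_approx:.2f} years")  # 32.33
print(f"exact:       {e47_exact:.2f} years")   # 32.33
print(f"years lost:  {e46 - e47_exact:.2f}")   # 0.88
```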

This is another way of expressing the well-known but still often insufficiently appreciated impact of infant mortality on life expectancy. The life expectancy at birth for US males is 76.4 years. But that obviously doesn’t mean that everyone keels over 5 months into their 77th year. 60% of newborn males are expected to live past this age, and a 77-year-old man has 10 remaining years on average.

Of course, these are all what demographers call “period” life expectancies, based on the mortality rates experienced in the current year, and pretending that these mortality rates will continue into the future. Based on the experience of the past two centuries we expect mortality rates to continue to fall, in which case the true average lifespans for people currently alive — the “cohort life expectancies” — will exceed these period calculations; but there is no way to know. If an asteroid hits the earth tomorrow and wipes out all life, this period calculation will be rendered nugatory (but there will be no one left to point that out. Hah!). The true average lifespan of the infants born this year will not be known until well into the 22nd century. Or, if Aubrey de Grey is right, not until the 32nd century.

* Or is it moving my birthdate forward by 10 years when I make it 10 years earlier? Reasonable people disagree on this point! And there’s interesting research on the habits of mind that lead one to choose the metaphor of the stationary self with time streaming past me, or the self moving like a river through a background of time.

** Actually, it’s (33.21 − 1)/(1 − 0.00365) = 32.33.

Moral panic panic: How much ridicule are the lives of 4500 children a year worth?

As though it needed to defend its title as the world’s leading provider of smug, The New Republic has published a piece by NY Times religion reporter Mark Oppenheimer (MO hereafter) about how irrational everyone is. This disturbs him, because when he was growing up, when all was right with the world, “It was taken for granted in my house… that only right-wingers were mad enough to oppose scientifically tested public-health measures.” He describes what he calls “The New Puritanism”, starting from opposition to water fluoridation in Portland (which doesn’t look like an archetypically puritanical cause to the untrained eye), and moving on to Kids Today:

At a birthday party for a three-year-old, I was hit with the realization that most of the parents around me were in the grip of moral panic, the kind of fear of contamination dramatized so well in The Crucible. One mother was trying to keep her daughter from eating a cupcake, because of all the sugar in cupcakes. Another was trying to limit her son to one juice box, because of all the sugar in juice. A father was panicking because there was no place, in this outdoor barn-like space at some nature center or farm or wildlife preserve, where his daughter could wash her hands before eating. And while I did not hear any parent fretting about the organic status of the veggie dip, I became certain there were such whispers all around me.

Now, this could be dismissed as a dreary attempt to channel PJ O’Rourke, or some comparable swaggering humourist, with a cookie-cutter tall tale, but it’s stuffed with all kinds of weird. He hallucinates “whispers all around” about the organic status of the veggie dip, and yet he insists it is the others whose mental stability is in doubt. With that in mind, one might suspect that the father was not “panicking”, but was simply asking where his daughter could wash her hands before eating, which was certainly the custom when I was a child, though perhaps not in Oppenheimer’s antediluvian childhood.

He cites The Crucible, presumably both as a touchstone of left-wing right-thinking and as a marker of his own cultural sophistication, but has clearly never read or seen it. While “witchcraft” is often taken as a metonym for fear of moral contamination, Miller’s play dramatizes the political manipulation of mob psychology.

But putting aside MO’s paranoid-pretentious MO, I am fascinated by his comments:

When I was a child, birthday parties involved cake, ice cream, and Chuck E. Cheese pizza, or pizza-like substance; and trips to the grandparents’ house involved root-beer floats and late-night viewings of Benny Hill with my grandfather, who liked the T&A humor. I never washed my hands before I ate. And I turned out splendidly.

So, we started with fluoridation of water, which is a “scientifically tested public-health measure” that only a crazy person could oppose, but washing hands before eating — at a “barn-like space” where, presumably, it is not absurd to suppose the children may have been exposed to animal feces — is the kind of over-the-top fear of moral contamination (not just bacterial contamination) that invites mockery.

Now, MO’s aforementioned paranoid delusions may cause one to question his splendid self-appraisal, but he is certainly not alone in trumpeting the formulation “When I was a child we all did X, and we all turned out alright,” where X is some dangerous or unedifying activity that educated middle-class parents today try to limit or eliminate. An extreme version is this text that got forwarded to me a few years back:

To Those of Us Born 1930 – 1979

First, we survived being born to mothers who smoked and/or drank while they were pregnant. They took aspirin, ate blue cheese dressing, tuna from a can and didn’t get tested for diabetes. Then after that trauma, we were put to sleep on our tummies in baby cribs covered with bright colored lead-base paints. We had no childproof lids on medicine bottles, locks on doors or cabinets and when we rode our bikes, we had baseball caps not helmets on our heads. As infants & children, we would ride in cars with no car seats, no booster seats, no seat belts, no air bags, bald tires and sometimes no brakes. Riding in the back of a pick-up truck on a warm day was always a special treat. We drank water from the garden hose and not from a bottle. We shared one soft drink with four friends, from one bottle and no one actually died from this. We ate cupcakes made with Lard, white bread, real butter and bacon. We drank FLAV-OR-AID made with real white sugar…. We fell out of trees, got cut, broke bones and teeth and there were no lawsuits from these accidents. We would get spankings with wooden spoons, switches, ping pong paddles, or just a bare hand and no one would call child services to report abuse…

You might want to share this with others who have had the luck to grow up as kids, before the lawyers and the government regulated so much of our lives for our own good. While you are at it, forward it to your kids so they will know how brave and lucky their parents were. Kind of makes you want to run through the house with scissors, doesn’t it?

The implication is that the kids are all softies and the parents are anxious killjoys. I heard a stand-up comedian a few years back complaining about bicycle helmets: “When I was a kid we all fell off our bikes. We didn’t fall on our heads. If we did, no one died. Have kids’ heads gotten softer?”

Except, of course, that it’s not true that no one died. This is a good example of how people deal with small risks: some are treated as zero, others are exaggerated. And part of the phenomenon (though I’ve never seen anyone analyse this process in detail) is that people fixate on whatever the current largest risks are, and often succeed in pushing them down. At that point, a new danger pops up that was always there, but masked by a larger risk, and so psychologically zeroed out. Thus, when I was growing up, in the 1970s, public health officials weren’t very concerned with children’s head injuries from bicycle accidents, because there were far more of them from automobile accidents in the absence of seat belts, not to mention all the poisonings from medications without child-resistant packaging. If the risk of dying some other way looms large enough, the risk of dying in a bicycle accident is, psychologically, rounded down to zero.

To put some numbers on it: in the US, in 1998, about 6500 children under the age of 15 died in accidents. In 1981 (the earliest year whose statistics I have easily available at the moment) the number was 9000. In that time, the population under 15 increased from 49 million to 60 million. In other words, if society had held onto its habits of eschewing bicycle helmets, leaving the medications out, riding in the back of a pickup truck, and all the rest, we’d have more than 4500 extra dead children a year. How awesome would that be?
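For anyone who wants to check that arithmetic, here is the back-of-the-envelope calculation, using only the figures quoted above:

```python
# Counterfactual: how many children under 15 would have died in accidents
# in 1998 if the 1981 accident rate had persisted? Figures as quoted above.
deaths_1981, pop_1981 = 9_000, 49e6
deaths_1998, pop_1998 = 6_500, 60e6

# Deaths expected in 1998 at the 1981 rate, scaled to the 1998 population.
counterfactual_1998 = deaths_1981 / pop_1981 * pop_1998  # about 11,000

excess = counterfactual_1998 - deaths_1998
print(f"extra deaths per year: {excess:,.0f}")  # about 4,500
```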

That’s not to say that all concerns about health and nutrition and environment are reasonable — or that, even if they are reasonable, the actions one would take to prevent or mitigate harm would not impose considerable costs, perhaps even costs that outweigh the benefits. But instead of mockery and “I turned out alright” populism, we need to be clear on what the benefits are: 4500 fewer children being buried every year. And that’s ignoring the costs of nonlethal sickness and injury, the extra miscarriages and stillbirths, and the long-term damage to lungs and other organs that we now know was caused by all those smoking and drinking parents.

Update: The comedian I was thinking of was a woman, but here’s another comedian making fun of bicycle helmets for emasculating our children; in this version, he’s not asking why heads got softer, but why the pavement is harder. Same joke.

Stephen Wolfram’s longitudinal fables

There are lots of interesting plots in Stephen Wolfram’s analysis of Facebook data, but what jumps out at me is the way he feels compelled to turn his cross-sectional data — information about people’s interests, structure of friendship networks, relationship status, etc., as a function of age — into a longitudinal story. For example, he describes this plot

[Figure: relationship status vs. age]

by saying “The rate of getting married starts going up in the early 20s[…] and decreases again in the late 30s, with about 70% of people by then being married.” Now, this is more or less a true statement, but it’s not really what is being illustrated here. (And it’s not just the weird anomaly, which he comments on but doesn’t try to explain, of the 10% or so of Facebook 13-year-olds who describe themselves as married.) What we see is a snapshot in time — a temporal cross section, in the jargon — rather than a description of how the same people (a cohort, as demographers would put it) move through life. To see how misleading this cross-sectional picture can be if you try to read it as a longitudinal story of individuals moving through life, think first about the right-hand side of the graph. It is broadly true, according to census data, that about 80% of this age group are married or widowed. But it is also true that 95% were once married. In fact, if they had had Facebook when they were 25 years old, their Stephen Wolfram would have found that most of them (about 75%) were already married by that age. (In fact, about 5% of the women and 3% of the men were already in a second marriage by age 25.)

So, the expansion of the “married” segment of the population as we go from left to right reflects in part the typical development of a human life, but it reflects as well the fact that we are moving back in time, to when people were simply more likely to marry. And the absence of a “divorced” category masks the fact that while the ranks of the married expand with age, individuals move in and out of that category as they progress through their lives.
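A toy calculation makes the point. All the numbers here are invented, and the ramp model of marriage is a deliberate cartoon; the only purpose is to show that when each age in a snapshot belongs to a different cohort, the snapshot curve is not the life story of any one cohort:

```python
# Toy illustration (all numbers invented) of why a single-year snapshot
# of "% married vs. age" is not the trajectory of any one cohort.

def share_married(age, lifetime_share):
    """Crude model: marriage rises linearly between ages 20 and 40,
    then plateaus at the cohort's lifetime share."""
    ramp = min(max((age - 20) / 20, 0.0), 1.0)
    return lifetime_share * ramp

# Hypothetical lifetime propensities: the older cohort marries more.
OLDER, YOUNGER = 0.95, 0.70

for age in (25, 45, 65):
    # In the snapshot, each age belongs to a different cohort.
    snapshot = share_married(age, OLDER if age >= 45 else YOUNGER)
    # What today's young cohort will itself look like at that age.
    own_path = share_married(age, YOUNGER)
    print(f"age {age}: snapshot {snapshot:.0%}, "
          f"young cohort's own path {own_path:.0%}")
```

The snapshot and the young cohort's own trajectory agree at age 25 and then diverge, which is exactly the gap between Wolfram's curve and any individual's life course.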

Of course, the same caveat applies to the stories that Wolfram tells about his (quite fascinating) analyses of the structure of friend networks by age, and of the topics that people of different ages refer to in Facebook posts. While it is surely true that the surge in discussion of school and university centred at age 18 reflects life-phase-determined variation in interests, the extreme drop with age in the salience of social media as a topic is likely to reflect a generational difference, and the steep increase in prominence of politics with age may be generational as well. (I wonder, too, whether the remarkably unchanging salience of “books” might reflect a balance: a tendency to become less involved with books with age cancelling out a generational shift away from interest in books.)
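The “books” conjecture is easy to state in age-period-cohort terms: if interest falls with age at the same rate per year that earlier-born cohorts exceed later ones, a single-year snapshot comes out flat. A schematic check, with invented slopes:

```python
# Invented age-period-cohort sketch of the "books" cancellation: interest
# declines by d per year of age, but each earlier birth year starts d
# higher, so a snapshot taken in year T is flat across all ages.
T, d = 2013, 0.5

def lifecycle(age):        # involvement falls as one ages
    return 80 - d * age

def cohort(birth_year):    # earlier-born cohorts care more about books
    return d * (T - birth_year)

for age in (15, 35, 55):
    print(age, lifecycle(age) + cohort(T - age))  # same value at every age
```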

Are you demographic?

As a sometime demographer myself, I am fascinated by the prominence of “demographics” as an explanatory concept in the recent presidential election, now already slipping away into hazy memory. Recent political journalism would barely stand without this conceptual crutch, as here and here and here. A bit more nuance here. Some pushback from the NY Times here.

The crassest expression of this concept came in an article yesterday by the (formerly?) respected conservative journalist Michael Barone, explaining why his confident prediction that Mitt Romney would win the election by a large margin had failed. Recall that several days before the election, despite the contrary evidence of what tens of thousands of voters were actually telling pollsters, he predicted 315 electoral votes for Romney, saying “Fundamentals usually prevail in American elections. That’s bad news for Barack Obama.” In retrospect, he says,

I was wrong because the outcome of the election was not determined, as I thought it would be, by fundamentals…. I think fundamentals were trumped by mechanics and, to a lesser extent, by demographics.
