
Philip Cohen at 50, having been 14 in 1981

This is a sociological reflection about life history. It’s about me because I’m the person I know best, and I have permission to reveal details of my life.

I was born in August 1967, making me 50 years old this month. But life experience is better thought of in cohort terms. Where was I and what was I doing, with whom, at different ages and stages of development? Today I’m thinking of these intersections of biography and history in terms of technology, music, and health.

Tech

We had a TV in my household growing up; it just didn’t have a remote control, or cable service, or color. We had two phones; they just shared one line and were connected by wires. (After I moved out my parents got an answering machine.) When my mother, a neurobiologist, was working on her dissertation (completed when I was 10) in the study my parents shared, she used a programmable calculator and graph paper to plot the results of her experiments with pencil. My father, a topologist, drew his figures with colored pencils (I can’t describe the sound of his pencils drawing across the hollow wooden door he used for a desktop, but I can still hear it, along with the buzz of his fluorescent lamp). A couple of my friends had personal computers by the time I started high school, in 1981 (one TRS-80 and one Apple II), but I brought a portable electric typewriter to college in 1988. I first got a cell phone in graduate school, after I was married.

The first portable electronic device I had (besides a flashlight) was a Sony Walkman, in about 1983, when I was 16. At the time nothing mattered to me more than music. Music consumed a large part of my imagination and formed the scaffolding of most socializing. The logistics of finding out about, finding, buying, copying, and listening to music played an outsized role in my daily life. From about 1980 to 1984, most of the money I made at my bagel store job went to stereo equipment, concerts, records, blank tapes for making copies, and eventually drums (as well as video games). I subscribed to magazines (Rolling Stone, Modern Drummer), hitchhiked across town to visit the record store, pooled money with friends to buy blank tapes, spent hours copying records and labeling tapes with my friends, and made road trips to concerts across upstate New York (clockwise from Ithaca: Geneva, Buffalo, Rochester, Syracuse, Saratoga, Binghamton, New York City, Elmira).

As I was writing this, I thought, “I haven’t listened to Long Distance Voyager in ages,” tapped it into Apple Music on my phone, and started streaming it on my Sonos player in a matter of seconds. That doesn’t impress you at all – but the sensory memories it evokes are shockingly vivid (like an acid flashback, honestly), and having the power to summon them so easily is awesome, in the old sense of that word.

Some of us worked at the Cornell student radio station (I eventually spent a while in the news department), whose album-oriented rock playlist heavily influenced the categories and relative status of the music we listened to. The radio station also determined what music stayed in the rotation – what eventually became known by the then-nonexistent term classic rock – and what would be allowed to slip away; it was history written in real time.

It’s like 1967, in 1981

You could think of the birth cohort of 1967 as the people who entered the world at the time of “race riots,” the Vietnam and Six Day wars, the Summer of Love, the 25th Amendment (you’re welcome!), Monterey Pop, Sgt. Pepper’s, and Loving v. Virginia. Or you could flip through Wikipedia’s list of celebrities born in 1967 to see how impressive (and good looking) we became, people like Benicio del Toro, Kurt Cobain, Paul Giamatti, Nicole Kidman, Pamela Anderson, Will Ferrell, Vin Diesel, Philip Seymour Hoffman, Matt LeBlanc, Michael Johnson, Liev Schreiber, Julia Roberts, Jimmy Kimmel, Mark Ruffalo, and Jamie Foxx.

But maybe it makes more sense to think of us as the people who were 14 when John Lennon made his great commercial comeback – with an album no one took seriously – only after being murdered. The experiences at age 14, in 1981, define me more than what was happening at the moment of my birth. Those 1981 hits from album-oriented rock mean more to me than the Doors’ debut in 1967. My sense of the world changing in that year was acute – because it was 1981, or because I was 14? In music, old artists like the Moody Blues and the Rolling Stones released albums that seemed like complete departures, and more solo albums – by people like Stevie Nicks and Phil Collins – felt like stakes through the heart of history itself (I liked them, actually, but they were also impostors).

One moment that felt at the time like a historical turning point was the weekend of September 19, 1981. My family went to Washington for the Solidarity Day rally, at which a quarter million people demonstrated against President Reagan and for organized labor, a protest fueled by the new president’s firing of the PATCO air traffic controllers the previous month (and inspired by the Solidarity union in Poland, too). Besides hating Reagan, we also feared a nuclear war that would end humanity – I mean really feared it, real nightmare fear.

[Image: handwritten radio news copy]

A piece of radio news copy I wrote and read at WVBR, probably 1983. The slashes are where I’m going to take a breath. “Local AQX” is the name of the tape cartridge with the sound bite (“actuality”) from Alfred Kahn, and “OQ:worse” means that’s the last word coming out of the clip.

On the same day as Solidarity, while we were in D.C., Simon and Garfunkel played their Concert in Central Park. They were all of 40 (literally my mother’s age), tired old people with a glorious past (I’m sure I ignored the rave reviews). As I look back on these events – Reagan, the Cold War, sell-out music – in the context of what I thought of as my emerging adulthood, they seemed to herald a dark future, in which loss of freedom and individuality, the rise of the machines, and runaway capitalism were reflected in the decline of rock music. (I am now embarrassed to admit that I even hated disco for a while, maybe even while I listened 20 times, dumbstruck, to an Earth, Wind & Fire album I checked out of the library.)

I don’t want to overdramatize 1981; I was basically fine. I came out with a penchant for Camus, a taste for art rock, and leftism, which were hardly catastrophic traits. Still, those events, and their timing, probably left a mark of cynicism, sometimes nihilism, that I carry today.

[Photo]

About 1984, with Daniel Besman (who later died) in Ithaca. Photo by Linda Galgani.

Data aside

Maybe one reason 1981 felt like a musical watershed to me is that it really was one – maybe pop music just got worse in the 1980s than it had been in the 1970s. To test (I mean prove, really) that hypothesis, I fielded a little survey (more like a game) that asked people to rate the same artists in both decades. I chose 58 artists by flipping through album charts from 1975-1984 and finding those that charted in both decades, then added some suggestions from early respondents. The task required scoring each artist twice, from 1 (terrible) to 5 (great), once for each period, and some people found it difficult; to keep it from being too onerous, I set the survey to serve each person just 10 artists at random (a couple of people did it more than once). The participants – found on Facebook, Twitter, and Reddit – were three-quarters men, three-quarters over 40, and three-quarters White and US-born. The average artist was rated 11 times in each period (range 5 to 19). (Feel free to play along or share this link; I’ll update it if more come in.)

The results look very bad for the 1980s. The average change was a drop of .59, and only three acts showed noticeable improvement: Pat Benatar, Michael Jackson, and Prince (and maybe Talking Heads and the lowly Bryan Adams). Here is the full set:
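For the curious, here is roughly how that tabulation works. This is a minimal sketch, assuming the survey responses are exported as a long file with one row per rating and illustrative column names (artist, period, rating), not the actual survey output.

```python
# Minimal sketch of the ratings tabulation. Assumes a long-format export of the
# survey with one row per rating and illustrative column names:
#   artist, period ("1970s" or "1980s"), rating (1 = terrible ... 5 = great).
import pandas as pd

responses = pd.read_csv("ratings.csv")  # hypothetical export file

# Average rating for each artist in each decade, one row per artist.
means = (responses
         .groupby(["artist", "period"])["rating"]
         .mean()
         .unstack("period"))

means["change"] = means["1980s"] - means["1970s"]

print(means["change"].mean())        # overall average change (about -0.59 in my data)
print(means["change"].nlargest(5))   # the handful of artists who improved
```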

Technology and survival

I don’t think I would have, at age 14, given much weight to the idea that my life would repeatedly be saved by medical technology, but now that seems like business as usual, to me anyway. I guess as long as there’s been technology there have been people who owe their lives to it (and of course we’re more likely to hear from them than from those who didn’t make it). But the details are cohort-specific. These days we’re a diverse club of privileged people, our conditions, or their remnants, often hidden like pebbles wedged under the balls of our aging feet, gnawing reminders of our bodily precarity.

Family lore says I was born with a bad case of jaundice, probably something like Rh incompatibility, and needed a blood transfusion. I don’t know what would have happened without it, but I’m probably better off now for that intervention.

Sometime in my late teens I reported to a doctor that I had periodic episodes of racing heartbeat. After a brief exam I was sent home with no tests, but advised to keep an eye on it; maybe mitral valve prolapse, he said. I usually controlled it by holding my breath and exhaling slowly. We found out later, in 2001 – after several hours in the emergency room at about 200 very irregular beats per minute – that it was actually a potentially much more serious condition called Wolff-Parkinson-White syndrome. It’s easily diagnosed nowadays: software can identify the tell-tale “delta wave” on the ECG, and the condition is listed right there in the test report.

[Figure: ECG strip showing the Wolff-Parkinson-White “delta wave”]

Two lucky things combined: (a) I wasn’t diagnosed properly in the 1980s (which might have led to open-heart surgery or a lifetime of unpleasant medication), and (b) I didn’t drop dead before it was finally diagnosed in 2001. They fixed it with a low-risk radiofrequency ablation, just running a few wires up through my arteries to my heart, where they lit up to burn off the errant nerve ending, all done while I was almost awake, watching the action on an x-ray image and – I believed, anyway – feeling the warmth spread through my chest as the doctor typed commands into his keyboard.

Diverticulitis is also pretty easily diagnosed nowadays, once they fire up the CT scanner, and usually successfully treated by antibiotics, though sometimes you have to remove some of your colon. Just one of those things people don’t die from as much anymore (though it’s also more common than it used to be, maybe just because we don’t die from other things as much). I didn’t feel much like surviving when it was happening, but I suppose I might have made it even without the antibiotics. Who knows?

More interesting was the case of follicular lymphoma I discovered at age 40 (I wrote about it here). There is a reasonable chance I’d still be alive today if we had never biopsied the swollen lymph node in my thigh, but that’s hard to say, too. Median survival from diagnosis is supposed to be 10 years, but I had a good case (a rare stage I), and with all the great new treatments coming online the confidence in that estimate is fuzzy. Anyway, since the cancer was never identified anywhere else in my body, the treatment was just removing the lymph node and a little radiation (18 visits to the radiation place, a couple of tattoos for aiming the beams, all in the summer with no work days off). We have no way (with current technology) to tell if I still “have” it or whether it will come “back,” so I can’t yet say technology saved my life from this one (though if I’m lucky enough to die from something else — and only then — feel free to call me a cancer “survivor”).

It turns out that all this life saving also bequeaths a profound uncertainty, which leaves one with an uneasy feeling and a craving for antianxiety medication. I guess you have to learn to love the uncertainty, or die trying. That’s why I cherish this piece of a note from my oncologist, written as he sent me out of the office with instructions never to return: “Your chance for cure is reasonable. ‘Pretest probability’ is low.”

[Image: excerpt from the oncologist’s note]

From my oncologist’s farewell note.

Time travel

It’s hard to imagine what I would have thought if someone told my 14-year-old self this story: One day you will, during a Skype call from a hotel room in Hangzhou, where you are vacationing with your wife and two daughters from China, decide to sue President Donald Trump for blocking you on Twitter. On the other hand, I don’t know if it’s possible to know today what it was really like to be me at age 14.

In the classic time travel knot, a visitor from the future changes the future by going back and changing the past. The cool thing about mucking around with your narrative like I’m doing in this essay (as Walidah Imarisha has said) is that by altering our perception of the past, we do change the future. So time travel is real. Just like it’s funny to think of my 14-year-old self having thoughts about the past, I’m sure my 14-year-old self would have laughed at the idea that my 50-year-old self would think about the future. But I do!


Births to 40-year-olds are less common but a greater share than in 1960

Never before have such a high proportion of all births been to women over 40 — they are now 2.8% of all births in the US. And yet a 40-year-old woman today is one-third less likely to have a baby than she was in 1947.

From 1960 to 1980, birth rates to women over 40* fell, as the Baby Boom ended and people were having fewer children by stopping earlier. Since 1980 birth rates to women over 40 have almost tripled as people started “starting” their families at later ages, but they’re still lower than they were back when total fertility was much higher.

[Figure: birth rates to women over 40, and percent of all births to women over 40, 1940-2016]

Sources: Birth rates 1940-1969, 1970-2010, 2011, 2012-2013, 2014-2015, 2016; Percent of births 1960-1980, 1980-2008.

Put another way, a child born to a mother over 40 before 1965 was very likely the youngest of several (or many) siblings. Today they are probably the younger of two or an only child. A crude way to show this is to use the Current Population Survey to look at how many children are present in the households of women ages 40-49 who have a child age 0 (the survey doesn’t record births as events, but the presence of a child age 0 is pretty close). Here is that trend:

[Figure: number of children present in households of women ages 40-49 with a child age 0, by year]

In the 1970s about 60 percent of children age 0 had three or more siblings present, and only 1 in 20 was an only child. Now more than a quarter are the only child present and another 30 percent have only one sibling present. (Note this doesn’t count siblings who no longer live in the household, and I don’t know how that might have changed over the years.)
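Here is a rough sketch of that tabulation, assuming an IPUMS-style CPS person extract. The variable names (YEAR, SEX, AGE, NCHILD, YNGCH) follow IPUMS conventions, but treat the details as illustrative rather than the exact code behind the figure.

```python
# Rough sketch of the CPS tabulation: women ages 40-49 with an own child age 0
# in the household, counting how many other children are present. Variable
# names follow IPUMS CPS conventions but are illustrative; check the codebook
# before relying on the codes used here.
import pandas as pd

cps = pd.read_csv("cps_extract.csv")  # hypothetical IPUMS extract

moms = cps[(cps["SEX"] == 2) &                 # women (2 = female in IPUMS coding)
           (cps["AGE"].between(40, 49)) &      # ages 40-49
           (cps["YNGCH"] == 0)]                # youngest own child present is age 0

# Other children present alongside the infant ("siblings present")
sibs_present = (moms["NCHILD"] - 1).clip(lower=0)

# Share of these infants who are the only child present, by survey year
only_child_share = (sibs_present == 0).groupby(moms["YEAR"]).mean()
print(only_child_share.tail())
```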

This updates an old post that focused on the health consequences of births to older parents. The point from that post remains: there are fewer children (per woman) being born to 40-plus mothers today than there were in the past; it just looks like there are more because they’re a larger share of all children.

* Note that in demographic terms, “over 40” means older than “exact age” 40, so it includes people from the moment they turn 40.


The U.S. government asked 2 million Americans one simple question, and their answers will shock you

What is your age?

[SKIP TO THE END for a mystery-partly-solved addendum]

Normally when we teach demography we use population pyramids, which show how much of a population is found at each age. They’re great tools for visualizing population distributions and discussing projections of growth and decline. For example, consider this contrast between Niger and Japan, about as different as we get on earth these days (from this cool site):

[Figure: population pyramids for Japan and Niger]

It’s pretty easy to see the potential for population growth versus decline in these patterns. Finding good pyramids these days is easy, but it’s still good to make some yourself to get a feel for how they work.
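If you want to try it, a toy version takes only a few lines. This sketch uses made-up numbers, not real data for any country; the trick is just plotting men as negative counts so their bars run left while women’s run right.

```python
# A homemade population pyramid with made-up numbers: horizontal bars, with
# males plotted as negative counts (to the left) and females positive (right).
import matplotlib.pyplot as plt

age_groups = ["0-9", "10-19", "20-29", "30-39", "40-49",
              "50-59", "60-69", "70-79", "80+"]
males   = [24, 22, 21, 20, 19, 20, 15, 9, 5]    # millions, illustrative only
females = [23, 21, 21, 20, 20, 21, 17, 11, 8]

fig, ax = plt.subplots()
ax.barh(age_groups, [-m for m in males], label="Male")
ax.barh(age_groups, females, label="Female")
ax.set_xlabel("Population (millions)")
ax.set_ylabel("Age group")
ax.set_title("Toy population pyramid")
ax.legend()
plt.show()
```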

So, thinking I might make a video lesson to follow up my blockbuster total fertility rate performance, I gathered some data from the U.S., using the 2013 American Community Survey (ACS) from IPUMS.org. I started with 10-year bins and the total population (not broken out by sex), which looks like this:

[Figure: U.S. population in 10-year age bins, 2013 ACS]

There’s the late Baby Boom, still bulging out at ages 50-59 (born 1954-1963), and their kids, ages 20-29. So far so good. But why not show something more precise? Here’s the same data, by single years of age:

[Figure: U.S. population by single years of age, 2013 ACS]

That’s more fine-grained. Not as much as if you had data by months or days of birth, but still. Except, wait: is that just sample noise causing that ragged edge between 20 and about 70? The ACS sample is a few million people, with tens of thousands of people at each age (up to age 75, at least), so you wouldn’t expect too much of that. No, it’s definitely age heaping, the tendency of people to skew their age reporting according to some collective cognitive scheme. The most common form is piling up on the ages ending with 0 and 5, but it could be anything. For example, some people might want to be 18, a socially significant milestone in this country. Here’s the same data, with suspect ages highlighted — 0’s and 5’s from 20 to 80, and 18:

[Figure: single years of age, with ages ending in 0 or 5 (and age 18) highlighted]

You might think age heaping results from some old people not remembering how old they are. In the old days rounding off was more common at older ages. In 1900, for example, the most implausible number of people was found at age 60 — 1.6 times as many as you’d get by averaging the number of people at ages 59 and 61. Is that still the case? Here it is again, but with the red/green highlights just showing the difference between the number of people reported and the number you’d get by averaging the numbers just above and below:

[Figure: single years of age, highlighting differences from the average of adjacent ages]

Proportionately, the 70-year-olds are most suspicious, at 10.8% more than you’d expect. But 40 is next, at 9.2%. And that green line shows extra 18-year-olds at 8.6% more than expected.
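The “more than you’d expect” comparison is nothing fancy: just the count at each age divided by the average of the counts at the two neighboring ages. A little sketch, with made-up numbers:

```python
# "Suspicion" check used above: compare the count at each age with the average
# of its two neighbors. A ratio well above 1 suggests heaping at that age.
def heaping_ratio(counts, age):
    expected = (counts[age - 1] + counts[age + 1]) / 2
    return counts[age] / expected

# Made-up numbers: 1.6 means 60% more people than the neighbors imply,
# like the age-60 pile-up in 1900 described above.
toy_counts = {59: 500_000, 60: 880_000, 61: 600_000}
print(heaping_ratio(toy_counts, 60))  # 1.6
```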

Unfortunately, it’s pretty hard to correct. Interestingly, the American Community Survey apparently asks for both an age and a birth date:

[Image: the ACS age and date of birth question]

If you’re the kind of person who rounds off to 70, or promotes yourself to 18, it might not be worth the trouble to actually enter a fake birth date. I’m sure the Census Bureau does something with that, like correct obvious errors, but I don’t think they attempt to correct age-heaping in the ACS (the birth dates aren’t on the public use files). Anyway, we can see a little of the social process by looking at different groups of people.

Up till now I’ve been using the full public use data, with population weights, and including those people who left age blank or entered something implausible enough that the Census Bureau gave them an age (an “allocated” value, in survey parlance). For this I just used the unweighted counts of people whose answers were accepted “as written” (or typed, or spoken over the phone, depending on how it was administered to them). Here are the patterns for people who didn’t finish high school versus those with a bachelor’s degree or higher, highlighting the 5’s and 0’s:

[Figure: single-year age distributions by education, with 0’s and 5’s highlighted]

Clearly, the age heaping is more common among those with less education. Whether it’s really people forgetting their age, rounding up or down for aspirational reasons, or having trouble with the survey administration, I don’t know.

Is this bad? As much as we all hate inaccuracy, this isn’t so bad. Fortunately, demographers have methods for assessing the damage caused by humans and their survey-taking foibles. In this case we can use Whipple’s index. This measure (defined in this handy United Nations slideshow) takes the number of people whose alleged ages end in 0 or 5, multiplies it by 5, and compares it to the total population; normally people use ages 23 to 62 (inclusive), for an even 40 years. The ratio of people reporting ages 25, 30, 35, 40, 45, 50, 55, and 60 to one-fifth of the population ages 23-62, times 100, is your Whipple’s index. A score of 100 is perfect, and a score of 500 means everyone’s heaped. The U.N. considers scores under 105 to be “very accurate data.” The 2013 ACS, using the public use file and the weights, gives me a score of 104.3. (Those unweighted distributions by education yield scores of 104.0 for high school dropouts and 101.7 for college graduates.) In contrast, the Decennial Census in 2010 had a score of just 101.5 by my calculation (using table QT-P2 from Summary File 1). With the size of the ACS, this difference shouldn’t have to do with sampling variation. Rather, it’s something about the administration of the survey.
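For anyone who wants to compute it, here is a minimal sketch of Whipple’s index from a file of single-year ages. The age variable is AGEP in the Census PUMS release and AGE in IPUMS extracts; the file name is hypothetical, and adding the survey weights would take a small extension.

```python
# Minimal sketch of Whipple's index: five times the share of people ages 23-62
# who report an age ending in 0 or 5, scaled so 100 = no heaping, 500 = all heaped.
import pandas as pd

def whipple(ages: pd.Series) -> float:
    window = ages[(ages >= 23) & (ages <= 62)]   # the conventional 40-year window
    heaped_share = (window % 5 == 0).mean()      # ages 25, 30, ..., 60
    return 5 * heaped_share * 100

# Example (hypothetical file and variable name; AGEP in PUMS, AGE in IPUMS):
# acs = pd.read_csv("acs_2013_persons.csv")
# print(whipple(acs["AGEP"]))
```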

Why don’t they just tell us how old they really are? There must be a reason.

Two asides:

  • The age 18 pattern is interesting — I don’t find any research on desirable young-adult ages skewing sample surveys.
  • This is all very different from birth timing issues, such as the Chinese affinity for births in dragon years (every twelfth year: 1976, 1988…). I don’t see anything in the U.S. pattern that fits fluctuations in birth rates.

Mystery-partly-solved addendum

I focused on education above, but another explanation was staring me in the face. I said “it’s something about the administration of the survey,” but didn’t think to check the form of the survey people took. The public use files for the ACS include an indicator of whether the household respondent took the survey through the mail (28%), on the web (39%), through a bureaucrat at the institution where they live (group quarters; 5%), or in an interview with a Census worker (28%). This last method, which is either a computer-assisted telephone interview (CATI) or computer-assisted personal interview (CAPI), is used when people don’t respond to the mailed survey.

It turns out that the entire Whipple problem in the 2013 ACS is due to the CATI/CAPI interviews. The age distributions for all of the other three methods have Whipple index scores below 100, while the CATI/CAPI folks clock in at a whopping 108.3. Here is that distribution, again using unweighted cases:

[Figure: single-year age distribution for CATI/CAPI respondents]

There they are, your Whipple participants. Who are they, and why does this happen? Here is the Bureau’s description of the survey data collection:

The data collection operation for housing units (HUs) consists of four modes: Internet, mail, telephone, and personal visit. For most HUs, the first phase includes a mailed request to respond via Internet, followed later by an option to complete a paper questionnaire and return it by mail. If no response is received by mail or Internet, the Census Bureau follows up with computer assisted telephone interviewing (CATI) when a telephone number is available. If the Census Bureau is unable to reach an occupant using CATI, or if the household refuses to participate, the address may be selected for computer-assisted personal interviewing (CAPI).

So the CATI/CAPI people are those who were either difficult to reach or were uncooperative when contacted. This group, incidentally, has low average education, as 63% have high school education or less (compared with 55% of the total) — which may explain the association with education. Maybe they have less accurate recall, or maybe they are less cooperative, which makes sense if they didn’t want to do the survey in the first place (which they are legally mandated — i.e., coerced — to do). So when their date of birth and age conflict, and the Census worker tries to elicit a correction, maybe all hell breaks loose in the interview and they can’t work it out. Or maybe the CATI/CAPI households have more people who don’t know each other’s exact ages (one person answers for the household). I don’t know. But this narrows it down considerably.
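The by-mode comparison itself is just the same index computed within each response mode. A sketch, with the file and mode variable name treated as illustrative; check the codebook for the actual coding.

```python
# Sketch of the by-mode comparison: Whipple's index within each response mode.
# Column names (AGE, RESPMODE) are illustrative; adjust them to your extract.
import pandas as pd

def whipple(ages: pd.Series) -> float:
    window = ages[(ages >= 23) & (ages <= 62)]
    return 5 * (window % 5 == 0).mean() * 100

acs = pd.read_csv("acs_2013_persons.csv")              # hypothetical extract
print(acs.groupby("RESPMODE")["AGE"].apply(whipple))   # one score per mode
```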
