Draft: Open letter to the Pew Research Center on generation labels

This post has been updated with the final signing statement and a link to the form. Thanks for sharing!

I have objected to the use of “generation” divisions and names for years (here’s the tag). Then, the other day, I saw this introduction to an episode of Meet the Press Reports, which epitomized a lot of the gibberishy nature of generationspeak (sorry about the quality).

OK, it’s ridiculous political punditry — “So as their trust in institutions wanes, will they eventually coalesce behind a single party, or will they be the ones to simply transform our political system forever?” — but it’s also generations gobbledygook. And part of what struck me was this: “millennials are now the largest generation, they have officially overtaken the Baby Boom.” Well-educated people think these things are real things, official things. We have to get off this train.

If you know the generations discourse, you know a lot of it emanates from the Pew Research Center. They do a lot of excellent research — and make a lot of that research substantially worse by cramming it into the “generations” framework that they, more than anyone else, have popularized — have made “official.”

After seeing that clip, I put this on Twitter, and was delighted by the positive response:

So I wrote a draft of an open letter to Pew, incorporating some of the comments from Twitter. But then I decided the letter was too long. To be more effective, maybe it should be more concise and less ranty. So here’s the long version, which has more background information and examples, followed by a signing version, with a link to the form to sign it. Please feel free to sign if you are a demographer or other social scientist, and share the link to the form (or this post) in your networks.

Maybe if we got a lot of signatories to this, or something like it, they would take heed.


Preamble by me

Pew’s generation labels — which are widely adopted by many other individuals and institutions — encourage unhelpful social science communication, driving people toward broad generalizations, stereotyping, clickbait, sweeping character judgment, and echo chamber thinking. When people assign names to generations, they encourage anointing them with a character, and then imposing qualities onto whole populations without basis, or on the basis of crude stereotyping. This fuels a constant stream of myth-making and myth-busting, with circular debates about whether one generation or another fits better or worse with its various associated stereotypes. In the absence of research about whether the generation labels are useful either scientifically or in communicating science, we are left with a lot of headlines drawing a lot of clicks, to the detriment of public understanding.

Cohort analysis and the life course perspective are important tools for studying and communicating social science. We should study the shadow, or reflection, of life events across people’s lives at a cultural level, not just an individual level. In fact, the Pew Research Center’s surveys and publications make great contributions to that end. But the vast majority of popular survey research and reporting in the “generations” vein uses data analyzed by age, cross-sectionally, with generational labels applied after the fact — it’s not cohort research at all. We shouldn’t discourage cohort and life course thinking; rather, we should improve it.

Pew’s own research provides a clear basis for scrapping the “generations.” “Most Millennials Resist the ‘Millennial’ Label” was the title of a report Pew published in 2015. This is when they should have stopped — based on their own science — but instead they plowed ahead as if the “generations” were social facts that the public merely failed to understand.

This figure shows that the majority of Americans cannot correctly identify the generational label Pew has applied to them.

The concept of “generations” as applied by Pew (and many others) defies the basic reality of generations as they relate to reproductive life cycles. Pew’s “generations” are so short (now 16 years) that they bear no resemblance to reproductive generations. In 2019 the median age of a woman giving birth in the U.S. was 29. As a result, many multigenerational families include no members of some generations on Pew’s chart. For example, the scheme asks siblings (like the tennis-champion Williams sisters, born one year apart) to identify as members of separate generations.

Perhaps due to their ubiquitous use, and Pew’s reputation as a trustworthy arbiter of social knowledge, many people think these “generations” are official facts. Chuck Todd reported on NBC News just this month, “Millennials are now the largest generation, they have officially overtaken the Baby Boom.” (NPR had already declared Millennials the largest generation seven years earlier, using a more expansive definition.) Pew has perhaps inadvertently encouraged these ill-informed perspectives, as when, for example, Richard Fry wrote for Pew, “Millennials have surpassed Baby Boomers as the nation’s largest living adult generation, according to population estimates from the U.S. Census Bureau” — despite the fact that the Census Bureau report referenced by the article made no mention of generations. Note that Chuck Todd’s meaningless graphic, which doesn’t even include ages, is also falsely attributed to the U.S. Census Bureau.

Generations are a beguiling and appealing vehicle for explaining social change, but one that is more often misleading than informative. The U.S. Army Research Institute commissioned a consensus study report from the National Academies, titled, Are Generational Categories Meaningful Distinctions for Workforce Management? The group of prominent social scientists concluded: “while dividing the workforce into generations may have appeal, doing so is not strongly supported by science and is not useful for workforce management. …many of the stereotypes about generations result from imprecise use of the terminology in the popular literature and recent research, and thus cannot adequately inform workforce management decisions.”

As one of many potential examples of such appealing, but ultimately misleading, uses of the “Millennial” generation label, consider a 2016 article by Paul Taylor, a former executive vice president of the Pew Research Center. He promised he would go beyond “clichés” to offer “observations” about Millennials — before describing them as “liberal lions…who might not roar,” “downwardly mobile,” “unlaunched,” “unmarried,” “gender role benders,” “upbeat,” “pre-Copernican,” and as an “unaffiliated, anti-hierarchical, distrustful” generation who nevertheless “get along well with their parents, respect their elders, and work well with colleagues” while being “open to different lifestyles, tolerant of different races, and first adopters of new technologies.” And their “idealism… may save the planet.”

In 2018 Pew announced that it would henceforth draw a line between “Millennials” and “Generation Z” at the year 1996. And yet they offered no substantive reason, just that “it became clear to us that it was time to determine a cutoff point between Millennials and the next generation [in] order to keep the Millennial generation analytically meaningful, and to begin looking at what might be unique about the next cohort.” In asserting that “their boundaries are not arbitrary,” the Pew announcement noted that they were assigning the same length to the Millennial Generation as they did to Generation X — both 16 years, a length that bears no relationship to reproductive generations, nor to the Baby Boom cohort, which is generally considered to be 19 years (1946-1964).

The essay that followed this announcement attempted to draw distinctions between Millennials and Generation Z, but it could not delineate a clear division, because none can be drawn. For example, it mentioned that “most Millennials came of age and entered the workforce facing the height of an economic recession,” but in 2009, the trough year for that recession, Millennials by Pew’s definition ranged from age 13 to 29. The other events mentioned — the 9/11 terrorist attacks, the election of Barack Obama, the launch of the iPhone, and the advent of social media — similarly find Millennials at a range of ages too wide to be automatically unifying in terms of experience. Why is being between 12 and 28 at the time of Obama’s election more meaningful a cohort experience than being, say, 18 to 34? No answer to this is provided, because Pew has determined the cohort categories before the logical scientific questions can be asked.

Consider a few other hypothetical examples. In the future, we might hypothesize that those who were in K-12 school during the pandemic-afflicted 2020-2021 academic year constitute a meaningful cohort. That 13-year cohort was born between 2003 and 2015, which does not correspond to one of Pew’s predetermined “generations.” For some purposes, an even narrower range might be more appropriate, such as those who graduated high school in 2020-2021 alone. Under the Pew generational regime, too many researchers, marketers, journalists, and members of the general public will look at major events like these through a pre-formed prism that distorts their ability to pursue or understand the way cohort life course experiences affect social experience.

Unlike the other “generations” in Pew’s map, the Baby Boom corresponds to a unique demographic event, painstakingly, empirically demonstrated to have begun in July 1946 and ended in mid-1964. And being part of that group has turned out to be a meaningful experience for many people — one that in fact helped give rise to the popular understanding of birth cohorts as a concept. But it does not follow that any arbitrarily grouped set of birth dates would produce a sense of identity, especially one that can be named and described on the basis of its birth years alone. It is an accident of history that the Baby Boom lasted 18 years — as far as we know having nothing to do with the length of a reproductive generation, but perhaps leading subsequent analysts to use the term “generation” to describe both Baby Boomers and subsequent cohorts.

The good researchers at Pew are in a tough spot (as are others who rely on their categories). The generations concept is tremendously appealing and hugely popular. But where does it end? Are we going to keep arbitrarily dividing the population into generations and giving them names — after “Z”? On what scientific basis would the practice continue? One might be tempted to address these problems by formalizing the process, with a conference and a dramatic launch, to make it even more “official.” But there is no scientific rationale for dividing the population arbitrarily into cohorts of any particular length for purposes of analyzing social trends, and to fix their membership a priori. Pew would do a lot more to enhance its reputation, and contribute to the public good, by publicly pulling the plug on this project.


Open letter to the Pew Research Center on generation labels

Sign the letter here.

We are demographers and other social scientists, writing to urge the Pew Research Center to stop using its generation labels (currently: Silent, Baby Boom, X, Millennial, Z). We appreciate Pew’s surveys and other research, and urge them to bring this work into better alignment with scientific principles of social research.

  1. Pew’s “generations” cause confusion.

The groups Pew calls Silent, Baby Boom, X, Millennial, and Z are birth cohorts determined by year of birth, which are not related to reproductive generations. There is further confusion because their arbitrary lengths (18, 19, 16, 16, and 16 years, respectively) have grown shorter as the age difference between parents and their children has lengthened.

  2. The division between “generations” is arbitrary and has no scientific basis.

With the exception of the Baby Boom, which was a discrete demographic event, the other “generations” have been declared and named on an ad hoc basis without empirical or theoretical justification. Pew’s own research conclusively shows that the majority of Americans cannot identify the “generations” to which Pew claims they belong. Cohorts should be delineated by “empty” periods (such as individual years, equal numbers of years, or decades) unless research on a particular topic suggests more meaningful breakdowns.

  3. Naming “generations” and fixing their birth dates promotes pseudoscience, undermines public understanding, and impedes social science research.

The “generation” names encourage assigning them a distinct character, and then imposing qualities on diverse populations without basis, resulting in the current widespread problem of crude stereotyping. This fuels a stream of circular debates about whether the various “generations” fit their associated stereotypes, which does not advance public understanding.

  4. The popular “generations” and their labels undermine important cohort and life course research.

Cohort analysis and the life course perspective are important tools for studying and communicating social science. But the vast majority of popular survey research and reporting on the “generations” uses cross-sectional data, and is not cohort research at all. Predetermined cohort categories also impede scientific discovery by artificially imposing the categories used in research rather than encouraging researchers to make well-justified decisions for data analysis and description. We don’t want to discourage cohort and life course thinking; we want to improve it.

  5. The “generations” are widely misunderstood to be “official” categories and identities.

Pew’s reputation as a trustworthy social research institution has helped fuel the false belief that the “generations” definitions and labels are social facts and official statistics. Many other individuals and organizations use Pew’s definitions in order to fit within the paradigm, compounding the problem and digging us deeper into this hole with each passing day.

  6. The “generations” scheme has become a parody and should end.

With the identification of “Generation Z,” Pew has apparently reached the end of the alphabet. Will this continue forever, with arbitrarily defined, stereotypically labeled, “generation” names sequentially added to the list? Demographic and social analysis is too important to be subjected to such a fate. No one likes to be wrong, and admitting it is difficult. We sympathize. But the sooner Pew stops digging this hole, the easier it will be to escape. A public course correction from Pew would send an important signal and help steer research and popular discourse around demographic and social issues toward greater understanding. It would also greatly enhance Pew’s reputation in the research community. We urge Pew to end this as gracefully as possible — now.

As consumers of Pew Research Center research, and experts who work in related fields ourselves, we urge the Pew Research Center to do the right thing and help put an end to the use of arbitrary and misleading “generation” labels and names.

Philip Cohen at 50, having been 14 in 1981

This is a sociological reflection about life history. It’s about me because I’m the person I know best, and I have permission to reveal details of my life.

I was born in August 1967, making me 50 years old this month. But life experience is better thought of in cohort terms. Where was I and what was I doing, with whom, at different ages and stages of development? Today I’m thinking of these intersections of biography and history in terms of technology, music, and health.

Tech

We had a TV in my household growing up, it just didn’t have a remote control or cable service, or color. We had two phones, they just shared one line and were connected by wires. (After I moved out my parents got an answering machine.) When my mother, a neurobiologist, was working on her dissertation (completed when I was 10) in the study my parents shared, she used a programmable calculator and graph paper to plot the results of her experiments with pencil. My father, a topologist, drew his figures with colored pencils (I can’t describe the sound of his pencils drawing across the hollow wooden door he used for a desktop, but I can still hear it, along with the buzz of his fluorescent lamp). A couple of my friends had personal computers by the time I started high school, in 1981 (one TRS-80 and one Apple II), but I brought a portable electric typewriter to college in 1988. I first got a cell phone in graduate school, after I was married.

The first portable electronic device I had (besides a flashlight) was a Sony Walkman, in about 1983, when I was 16. At the time nothing mattered to me more than music. Music consumed a large part of my imagination and formed the scaffolding of most socializing. The logistics of finding out about, finding, buying, copying, and listening to music played an outsized role in my daily life. From about 1980 to 1984, most of the money I made at my bagel store job went to stereo equipment, concerts, records, blank tapes for making copies, and eventually drums (as well as video games). I subscribed to magazines (Rolling Stone, Modern Drummer), hitchhiked across town to visit the record store, pooled money with friends to buy blank tapes, spent hours copying records and labeling tapes with my friends, and made road trips to concerts across upstate New York (clockwise from Ithaca: Geneva, Buffalo, Rochester, Syracuse, Saratoga, Binghamton, New York City, Elmira).

As I’m writing this, I thought, “I haven’t listened to Long Distance Voyager in ages,” tapped it into Apple Music on my phone, and started streaming it on my Sonos player in a matter of seconds, which doesn’t impress you at all – but the sensory memories it invokes are shockingly vivid (like an acid flashback, honestly) – and having the power to evoke that so easily is awesome, in the old sense of that word.

Some of us worked at the Cornell student radio station (I eventually spent a while in the news department), whose album-oriented rock playlist heavily influenced the categories and relative status of the music we listened to. The radio station also determined what music stayed in the rotation – what eventually became known by the then-nonexistent term classic rock – and what would be allowed to slip away; it was history written in real time.

It’s like 1967, in 1981

You could think of the birth cohort of 1967 as the people who entered the world at the time of “race riots,” the Vietnam and Six Day wars, the Summer of Love, the 25th Amendment (you’re welcome!), Monterey Pop, Sgt. Pepper’s, and Loving v. Virginia. Or you could flip through Wikipedia’s list of celebrities born in 1967 to see how impressive (and good looking) we became, people like Benicio del Toro, Kurt Cobain, Paul Giamatti, Nicole Kidman, Pamela Anderson, Will Ferrell, Vin Diesel, Philip Seymour Hoffman, Matt LeBlanc, Michael Johnson, Liev Schreiber, Julia Roberts, Jimmy Kimmel, Mark Ruffalo, and Jamie Foxx.

But maybe it makes more sense to think of us as the people who were 14 when John Lennon made his great commercial comeback, with an album no one took seriously – only after being murdered. The experiences at age 14, in 1981, define me more than what was happening at the moment of my birth. Those 1981 hits from album-oriented rock mean more to me than the Doors’ debut in 1967. My sense of the world changing in that year was acute – because it was 1981, or because I was 14? In music, old artists like the Moody Blues and the Rolling Stones released albums that seemed like complete departures, and more solo albums – by people like Stevie Nicks and Phil Collins – felt like stakes through the heart of history itself (I liked them, actually, but they were also impostors).

One moment that felt at the time like a historical turning point was the weekend of September 19, 1981. My family went to Washington for the Solidarity Day rally, at which a quarter million people demonstrated against President Reagan and for organized labor, a protest fueled by the new president’s firing of the PATCO air traffic controllers the previous month (and inspired by the Solidarity union in Poland, too). Besides hating Reagan, we also feared a nuclear war that would end humanity – I mean really feared it, real nightmare fear.

A piece of radio news copy I wrote and read at WVBR, probably 1983. The slashes are where I’m going to take a breath. “Local AQX” is the name of the tape cartridge with the sound bite (“actuality”) from Alfred Kahn, and “OQ:worse” means that’s the last word coming out of the clip.
On the same day as Solidarity, while we were in D.C., was Simon and Garfunkel’s Concert in Central Park. They were all of 40 (literally my mother’s age), tired old people with a glorious past (I’m sure I ignored the rave reviews). As I look back on these events – Reagan, the Cold War, sell-out music – in the context of what I thought of as my emerging adulthood, they seemed to herald a dark future, in which loss of freedom and individuality, the rise of the machines, and runaway capitalism were reflected in the decline of rock music. (I am now embarrassed to admit that I even hated disco for a while, maybe even while I listened 20 times, dumbstruck, to an Earth, Wind, and Fire album I checked out of the library.)

I don’t want to overdramatize the drama of 1981; I was basically fine. I came out with a penchant for Camus, a taste for art rock, and leftism, which were hardly catastrophic traits. Still, those events, and their timing, probably left a mark of cynicism, sometimes nihilism, which I carry today.

About 1984, with Daniel Besman (who later died) in Ithaca. Photo by Linda Galgani.
Data aside

Maybe one reason 1981 felt like a musical watershed to me is that it really was, because pop music just got worse in the 1980s compared to the 1970s. To test (I mean prove, really) that hypothesis, I fielded a little survey (more like a game) that asked people to rate the same artists in both decades. I chose 58 artists by flipping through album charts from 1975-1984 and finding those that charted in both decades; then I added some suggestions from early respondents. The task required scoring each act twice, from 1 (terrible) to 5 (great), once for each period, and some people found it difficult, so to keep it from being too onerous I set the survey to serve each person just 10 artists at random (a couple of people did it more than once). The participants were 3/4 men, 3/4 over 40, and 3/4 White and US-born, recruited on Facebook, Twitter, and Reddit. The average artist was rated 11 times in each period (range 5 to 19). (Feel free to play along or share this link; I’ll update it if more come in.)
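If you want to run a similar tally yourself, here is a minimal sketch of the calculation. It assumes a hypothetical ratings.csv export with columns respondent, artist, period, and rating; the file and column names are illustrative, not the survey’s actual output or analysis code.

```python
# Minimal sketch of the decade-comparison tally described above.
# Assumes a hypothetical "ratings.csv" with columns:
#   respondent, artist, period ("1970s" or "1980s"), rating (1-5).
# Illustrative only; not the survey's actual analysis code.
import pandas as pd

ratings = pd.read_csv("ratings.csv")

# Mean rating for each artist in each period.
means = (
    ratings.groupby(["artist", "period"])["rating"]
    .mean()
    .unstack("period")  # one column per period: "1970s", "1980s"
)

# Decade-to-decade change; negative values mean worse ratings in the 1980s.
means["change"] = means["1980s"] - means["1970s"]

print(means.sort_values("change").round(2))
print("Average change:", round(means["change"].mean(), 2))
```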

The results look very bad for the 1980s. The average change was a drop of .59, and only three acts showed noticeable improvement: Pat Benatar, Michael Jackson, and Prince (and maybe Talking Heads and the lowly Bryan Adams). Here is the full set (click to enlarge):

Technology and survival

I don’t think I would have, at age 14, given much weight to the idea that my life would repeatedly be saved by medical technology, but now that seems like business as usual, to me anyway. I guess as long as there’s been technology there have been people who owe their lives to it (and of course we’re more likely to hear from them than from those who didn’t make it). But the details are cohort-specific. These days we’re a diverse club of privileged people, our conditions, or their remnants, often hidden like pebbles wedged under the balls of our aging feet, gnawing reminders of our bodily precarity.

Family lore says I was born with a bad case of jaundice, probably something like Rh incompatibility, and needed a blood transfusion. I don’t know what would have happened without it, but I’m probably better off now for that intervention.

Sometime in my late teens I reported to a doctor that I had periodic episodes of racing heartbeat. After a brief exam I was sent home with no tests, but advised to keep an eye on it; maybe mitral valve prolapse, he said. I usually controlled it by holding my breath and exhaling slowly. We found out later, in 2001 – after several hours in the emergency room at about 200 very irregular beats per minute – that it was actually a potentially much more serious condition called Wolff-Parkinson-White syndrome. The condition is easily diagnosed nowadays, as software can identify the tell-tale “delta wave” on the ECG, and it is listed right there in the test report.


Two lucky things combined: (a) I wasn’t diagnosed properly in the 1980s (which might have led to open-heart surgery or a lifetime of unpleasant medication), and (b) I didn’t drop dead before it was finally diagnosed in 2001. They fixed it with a low-risk radiofrequency ablation, just running a few wires up through my arteries to my heart, where they lit up to burn off the errant nerve ending, all done while I was almost awake, watching the action on an x-ray image and – I believed, anyway – feeling the warmth spread through my chest as the doctor typed commands into his keyboard.

Diverticulitis is also pretty easily diagnosed nowadays, once they fire up the CT scanner, and usually successfully treated by antibiotics, though sometimes you have to remove some of your colon. Just one of those things people don’t die from as much anymore (though it’s also more common than it used to be, maybe just because we don’t die from other things as much). I didn’t feel much like surviving when it was happening, but I suppose I might have made it even without the antibiotics. Who knows?

More interesting was the case of follicular lymphoma I discovered at age 40 (I wrote about it here). There is a reasonable chance I’d still be alive today if we had never biopsied the swollen lymph node in my thigh, but that’s hard to say, too. Median survival from diagnosis is supposed to be 10 years, but I had a good case (a rare stage I), and with all the great new treatments coming online the confidence in that estimate is fuzzy. Anyway, since the cancer was never identified anywhere else in my body, the treatment was just removing the lymph node and a little radiation (18 visits to the radiation place, a couple of tattoos for aiming the beams, all in the summer with no work days off). We have no way (with current technology) to tell if I still “have” it or whether it will come “back,” so I can’t yet say technology saved my life from this one (though if I’m lucky enough to die from something else — and only then — feel free to call me a cancer “survivor”).

It turns out that all this life saving also bequeaths a profound uncertainty, which leaves one with an uneasy feeling and a craving for antianxiety medication. I guess you have to learn to love the uncertainty, or die trying. That’s why I cherish this piece of a note from my oncologist, written as he sent me out of the office with instructions never to return: “Your chance for cure is reasonable. ‘Pretest probability’ is low.”

From my oncologist’s farewell note.
Time travel

It’s hard to imagine what I would have thought if someone told my 14-year-old self this story: One day you will, during a Skype call from a hotel room in Hangzhou, where you are vacationing with your wife and two daughters from China, decide to sue President Donald Trump for blocking you on Twitter. On the other hand, I don’t know if it’s possible to know today what it was really like to be me at age 14.

In the classic time travel knot, a visitor from the future changes the future by going back and changing the past. The cool thing about mucking around with your narrative like I’m doing in this essay (as Walidah Imarisha has said) is that, by altering our perception of the past, we do change the future. So time travel is real. Just as it’s funny to think of my 14-year-old self having thoughts about the past, I’m sure my 14-year-old self would have laughed at the idea that my 50-year-old self would think about the future. But I do!