The Creation of the 25-Year-Old Child


Throughout much of history you reached adulthood somewhere between your mid-teens and 21. Societies made these determinations based not on precise metrics or exact assessments of adultness, but on rough estimates of when most people had presumably attained other milestones. Had you reached puberty? Were you, if male, physically developed enough to die fighting for your king?

There were sometimes loopholes for royalty and aristocrats because what country doesn’t benefit from an inbred child monarch from time to time? And Rome being Rome also factored in an estimate of when you were probably capable of understanding whether you were acting within the law. 

But, for the most part, if you were at an age where you and most of your peers had gotten through puberty and were physically developed enough for battle, in most places you had reached the age of majority. Congratulations!

In the United States, guesstimates of a good age of majority, when formally codified, have generally been set at either 18 or 21. Eighteen probably makes a little more sense. You’ve been through puberty. You’re done with compulsory education. You’re free from your parents. You should have enough sense to know whether you are acting within the law. You’re physically capable of giving your life for your country if its leaders get in a pissing contest with Russia or its esteemed defense contractors need to move product. What more is there to consider?

Ever so briefly, the United States sort of recognized this. At the start of the Vietnam War, 18 was old enough for you to be drafted, but not quite old enough to choose the people drafting you or enjoy a beer before you shipped out. Ergo lawmakers at the federal level, acknowledging this apparent logical inconsistency, lowered the voting age to 18 in 1971. Some states similarly lowered their drinking age, until it was, for all practical purposes, raised to 21 at the federal level – albeit in something of a constitutional workaround via some technicalities related to funding highways.

More recently though, it has become rather fashionable for the sophisticated people to shake their heads at this backwards notion that young adults in their late teens or early 20s be permitted to engage in activities usually reserved for grownups. The smart people know 21 is too young to make serious choices about how you wish to live. The educated people understand 18 isn’t old enough for you to behave responsibly without one of the big people looking over your shoulder. The rental car companies have known this for years: they only rent to those 25 and older.

When Wisconsin was considering lowering its minimum legal drinking age to 19 in 2017, a 20-year-old UW-Madison student who spent her summer casually drinking in Ireland argued against the proposed change with an air of civic duty, on the grounds that 20-year-olds like her were too immature to have a glass of wine at dinner, as she regularly did whilst abroad.

In 2019 President Donald Trump signed legislation barring anyone under 21 from buying cigarettes. 

Following mass shootings in the United States, there are routinely calls to raise the minimum age to purchase any sort of firearm to at least 21. 

In 2020 a 22-year-old woman wrote to a Slate advice column noting how numerous feminist friends of hers believed women and gay men under 25 were too young to consent to sex.

When testifying before the Tennessee House in February 2023 in support of a bill that would limit so-called “gender-affirming care” for minors, Daily Wire journalist and documentary filmmaker Matt Walsh was asked whether one was mature enough to consent to such procedures at 16; in response, he implied one might not be capable of doing so till 25.

Presidential candidate Vivek Ramaswamy recently proposed the voting age be raised to 25 for those who have not served in the military or passed a civics test. 

Psychologists and education commentators have suggested it is unrealistic and unfair for professors to expect traditional-aged college students to be able to manage long-term deadlines on account of they’re just not old enough yet.  

The rationale for much of this – with the apparent exception of Ramaswamy’s call to raise the voting age, which seems more about revitalizing perceptions of civic duty and the act of voting – generally comes down to an appeal to common sense with a dash of science. If you are in your late teens or early to mid-20s you are obviously immature, irresponsible, and not capable of the sound judgment of adults. 

The latest brain science backs this up. Therefore, it would be in the best interest of you and the rest of society if we treated you as a child just a little bit longer until your brain finishes ripening.

A lot of science, though, and perhaps some common sense, is lost in this argument. For a more comprehensive understanding of the science bit, one first needs to back up to about the mid-20th century. Prior to the neurofication of all human thought and behavior, which arrived somewhere in the aughts with the widespread use of neuroimaging devices, especially fMRI, developmental psychologists tended to work within a more theoretical and observational paradigm when dividing people’s lives, from birth through old age, into different developmental periods.

Erik Erikson, writing primarily in the 1950s and 1960s, was probably the most influential of them. He theorized that childhood ended around the onset of puberty, at which point adolescence began and lasted until the onset of young adulthood in the late teens. Young adulthood then lasted till about 40.

Such divisions weren’t entirely new, but Erikson’s were probably the most enduring, going largely unchallenged until roughly 2000 when Jeffrey Arnett, a professor of psychology at Clark University, proposed a new developmental phase, at least for those in Western industrialized societies. Arnett called it “emerging adulthood.” He placed it between adolescence and young adulthood.

Arnett’s rationale was that when Erikson conceptualized his developmental phases in the mid-20th century, the lives of individuals in their late teens and 20s were much different from what they were at the dawn of the new millennium. In Erikson’s day, people began work earlier. Most didn’t go to college. By 20 they had found a steady job. By 23 or so they were married. About a year later they had their first child.

In the late 1990s, however, young people in their late teens and early to mid-20s, instead of settling into adult roles, were entering a period of “semi-autonomy” in which they “take on some of the responsibilities of independent living but leave others to their parents, college authorities, or other adults.”

During this period they often pursue additional education and live lives characterized by exploration and frequent change while existing in a quasi-adult state. Physically they are adults. They are considered adults with some restrictions in the eyes of the law. Yet, they don’t feel like adults. They don’t feel responsible for their own lives. They don’t feel like they make their own independent decisions. Plus, they often lack financial independence. For many, this does not change until sometime in their mid-to-late 20s. 

In response to all this, Arnett suggested that, at least for those in industrialized societies, young adulthood may not actually begin until 25. Later, due to continued delays in taking on the responsibilities of steady work, marriage, and children, Arnett would move the start of young adulthood to 29.

Coinciding with Arnett’s attempt to make emerging adulthood a thing, neuroimaging devices were increasingly used to find neural correlates for everything from religious belief to love to emotional pain to reactions to unflattering information about personally favored political figures. Some researchers looked at how the brain changes over the human lifespan. Some examined how one’s performance on complex tasks and decision-making changes as one matures from child to adult, and how it may differ within different age groups based on contextual factors.

With time, many commentators and policymakers began proposing that oversimplified findings from these studies inform law and policy with a particular focus on how the brains and cognitive abilities of adolescents and those in young or emerging adulthood continue to change into roughly the mid-20s. 

People started arguing that since the brain isn’t fully mature till the mid-20s, one is not an adult until 25. They started acting as if permitting 18-, 21-, or even 23-year-olds to take responsibility for their own lives or make decisions independently is as absurd as handing a 12-year-old a bottle of scotch, a handgun, and a box of condoms before sending him off to run a bank.

Sometimes this comes off as a cynical appeal to science as a means to indirectly restrict activities individual commentators or policymakers would probably rather just ban altogether. Other times it seems more like what overeducated proponents of the safetyist nanny-state would perceive as a well-intentioned, honest attempt to help the less-informed hoi polloi stay safe by following The Science. In both cases though, it also reveals, at best, a naive understanding of the science they claim to be following.

Honest researchers have long acknowledged that scientifically, the implications of notions of emerging adulthood and findings that the brain may continue to develop past standard legal demarcations of adulthood are unclear. Many even seem uncomfortable establishing a fixed definition of what constitutes a true adult brain or how to measure it. Some appear to object to framing discussions in terms of defining a true adult brain or establishing an exact point at which an adolescent brain has completed its metamorphosis into an adult one. When one examines some of the actual neurodevelopmental research on the topic it becomes obvious why.

When examining questions pertaining to neurodevelopment, researchers don’t really have a clear single metric for neurodevelopment or neuroadulthood. Instead, they have many options to choose from, and these generally don’t align perfectly with one another. Thus, for research purposes, scientists will select an operational measure and look to see at what age changes in that measure plateau.

But again, for any given study, researchers must decide what measure to use: structural changes, the amount of gray matter, the amount of white matter, connectivity, the availability of particular neurotransmitters, metabolic efficiency, etc. They also must choose what part of the brain to focus on. Depending on the choices researchers make in a given study, they may find neuroadulthood is attained as early as 15 or as late as never.

Increasingly though, many are homing in on the prefrontal cortex. In some ways this sort of makes sense. This is the part of the brain associated with many higher or executive functions and reasoning capabilities, after all. A related approach is to focus on psychological components of cognitive ability that can be measured without a neuroimaging device, then try to match up performance on the cognitive measure with some neurodevelopmental one because the pretty pictures of an fMRI convey the authority of science better than a bar graph showing reaction times on a complex cognitive task that would take 20 minutes to explain. 

Yet when implementing either approach to divine the age of neuro- or cognitive adulthood, researchers still seem to end up floating imperfect guesstimates, ranging from the mid-20s to the 30s to never, that do little more than continue to complicate what was once a fairly simple matter.

This does not mean the research isn’t interesting or worthwhile, but it should make one think twice before deferring to it when arguing for restricting the rights of putative adults. 

Moreover, even if the science here were a little less fuzzy and we had a more precise age for the maturation of the prefrontal cortex and could definitively correlate it with performance on a relevant cognitive task, a lot would still be lost both scientifically and practically.

First, by at least partially tying legal adult activities to one or more scientific metrics, one establishes a seemingly risky precedent, opening the door to adulthood being something forever in flux. Today we might seek to reclassify 18-21-year-olds as children because their brains are not as mature as a 25-year-old’s. 

Tomorrow we may reclassify 22-24-year-olds as minors because their brains are more similar to those of 21-year-olds than those of 35-year-olds. A generation from now, we may end up with the same conversation about 35-year-olds. Potentially, this could go on forever.

Second, if we go this route of reclassifying young adults as not quite real adults responsible for their lives and the choices they make, why shouldn’t we finalize the process and keep them under parental care or state control until they are 21, if not 25 or whatever other age, rewriting the remaining laws about tobacco, alcohol, guns, the age of consent, and a plethora of other opportunities for bad choices, and adjusting societal expectations for this age group accordingly?

Drinking and smoking would be prohibited for these twenty-something-year-old minors. Romantic relationships between proper adults and those under whatever the new cutoff is would be treated as statutory rape. College could be made mandatory. But professors would have to be careful not to make the coursework too hard because, in this view, 18 or even 20 is simply not old enough for a child to do adult-level schoolwork.

Lastly though, this whole endeavor to find a neurodevelopmental or cognitive measure for the precise age at which one becomes sufficiently adultlike, and to shape policy around that measure, would seem to discount that the neurodevelopmental and cognitive features being measured may themselves be forever in flux for a variety of sociocultural and environmental reasons. It also ignores that most societies throughout human history have gotten along just fine without knowing the exact moment the prefrontal cortex reaches maximum adultness.

Once more, Arnett noted in 2000 that the young adults of that era were different from those of the mid-20th century, taking on the responsibilities of steady work, marriage, and children later than their earlier counterparts. He also noted that it is well established that marriage and parenthood tend to hasten feelings of adulthood and decrease risky behaviors better than practically any other human experience.

Similarly, Jean Twenge, developmental psychologist and author of iGen, has pointed out that it is not just 18-25-year-olds who seem caught in a state of arrested development, but adolescents as well. Since 2000, adolescents have shown declines in working, driving, dating, drinking alcohol, having sex, and even just going out without their parents. A high school senior of the 2010s went out less than an eighth grader of the 1990s and dated as much as a tenth grader from that decade. Additionally, since the 1990s, maintaining one’s virginity through high school has become the norm.

Taken together with the work of Arnett, this would seem to indicate that our society and culture have developed in a way where everyone gets held back by roughly the duration of a developmental stage, at least until they’re 30.

The reasons for this are complex and not fully understood. The economic realities of the past 20-plus years, and a higher education system in which young adults take on massive loans for what often proves to be a largely symbolic credential, have made financial independence from one’s parents ever more elusive for many young adults.

Twenge also has suggested the delay in partaking in adult activities by adolescents may be something of a natural symptom of an affluent society free from harsh conditions and large-scale childhood mortality: when families can afford to have few children and expect them to survive to adulthood, parents invest more resources, including attention and protection, in the limited number of children they have rather than send them out into the streets with little more instruction than to be home before dark without upsetting the neighbors. 

Our overly safetyist culture, in which doing so has become illegal in some locales, probably also plays a role, as does an education system that has shifted responsibility for grades from the students expected to earn good ones to the teachers expected to ensure no one receives bad ones, as does a system of higher education in which universities are expected, as described by Jonathan Haidt and Greg Lukianoff in The Coddling of the American Mind, to maintain the psychological safety of students, protect them from scary ideas that might offend them, and mediate trivial disagreements as if their campuses were populated by first-graders.

Although we can’t know for certain, maybe if we had fMRIs in the era of Erikson or even the 1990s we’d see brains back then reached some metric of adultness earlier than those of kids today. 

Of course young people have always done dumb things and made stupid decisions. Just watch any teen movie that takes place in the 1950s. Everyone apparently got into drag races with greaser kids and preppy bullies – even when trying to stop an alien blob from destroying Earth. 

Perhaps by turning to science to tell us the exact age at which someone no longer must be protected from making their own decisions, we are further exacerbating a vicious cycle in which our society has already trapped its youth.

By attempting to cocoon both adolescents and young adults to protect them from bad choices, responsibility, and real-world consequences for their decisions until they reach a scientifically defined age at which they can enter the world fully mature and unsupervised, we will in fact be protracting their immaturity and delaying their development into the responsible adults we are waiting for them to become.



Published under a Creative Commons Attribution 4.0 International License
For reprints, please set the canonical link back to the original Brownstone Institute Article and Author.

Author

  • Daniel Nuccio

    Daniel Nuccio holds master's degrees in both psychology and biology. Currently, he is pursuing a PhD in biology at Northern Illinois University studying host-microbe relationships. He is also a regular contributor to The College Fix where he writes about COVID, mental health, and other topics.
