Recently, child psychologists in the United Kingdom received a new recommendation: that their treatment population should not be birth to eighteen, but rather birth to twenty-five.
The reason behind this change is our advancing knowledge of human brain development. The brain, particularly the functions involved in self-regulation, decision-making, and risk-taking, does not become “fully adult” until age 25 or later.
This has brought a storm of criticism. Some question the effect of the new guidelines on young adults, and whether they will prolong adolescence even further. Others think we are coddling young people who, for most of history, would have been considered adults.
As anyone who works in the field of adolescent health knows, this research and these recommendations are not new, and our health system has been slowly incorporating these ideas into health care.
In the United States, the age of majority, or the age at which one is considered an adult (except in terms of alcohol), is eighteen. It’s important to remember that eighteen is not a magic number. Anyone who has known (or been) a teen turning eighteen knows that instant maturity and responsibility do not set in after the last day of being seventeen is through.
Eighteen wasn’t the age of majority throughout American history, either. In fact, for most of American history, the age of majority was twenty-one. The age was lowered for numerous reasons, including allowing earlier marriages, industrialization and the rise of secondary education, and allowing younger men into the military (women came later).
I wanted to touch on a few concepts and arguments that have arisen from the new research on when our brains become “adult”:
- It’s all averages. We’ve all known teens who are mature beyond their years, and we’ve all known adults who seem to have the self-regulation abilities of a distracted 14-year-old. When scientists discuss brain development, they are describing an average across the population.
- This is unlikely to create major policy changes. The effort and restructuring required to extend the age of majority would affect almost every aspect of our country: commercial, military, educational, and legal, to name a few. Considering the nation’s willingness to grant teens the power to consent in certain medical situations (which I support), to try teens under eighteen as adults in the criminal justice system (which I don’t), and to look askance at any young adult who is receiving help from parents instead of “pulling themselves up by their bootstraps”, we’re not about to embark on a huge national shift.
- Some people find the concept demeaning. I have known teens who take on employment and parenthood (and do well at both) long before eighteen, who manage families when their parents are unwilling or unable, and who show that they understand complex moral and legal dilemmas. This change is coming from neuroscientific research, and is not meant to imply that young adults are incapable, immature, or incompetent. It might, however, explain why young adults are more likely than older adults to take certain risks, hold certain priorities, or think in certain ways.
- You will see more and more of this concept in medical, behavioral health, and educational fields. If you walk into Adolescent Medicine at Seattle Children’s, you will see a sign that says “Adolescent and Young Adult Medicine”. Many professionals who work with adolescents are starting to examine the unique needs of young adults, and some are focusing solely on them. Your average 13-year-old does not think like your average 23-year-old, but both are at an age with distinct developmental needs.
As someone who, looking back, was an impressively immature young adult, I find this research intuitively makes sense. However, I’ve spoken to other adults who were responsible, sensible, and mature by their early twenties, and who find themselves instinctively opposed to it. What was your young adult experience, and what do you think about these new guidelines?