
Monday, November 15, 2010

Quick thought on social and neuroscience...

Some of the recent studies showing continued brain growth in teenage brains (a "lite" report here) are being used to push for an "extended childhood" for teens.  During the teenage years, the prefrontal cortex--the area of the brain concerned with "rational" decision making and courses of action--first grows and then prunes connections and myelinates particular areas.  (Myelination increases the speed at which a message can travel along a nerve, essentially "locking in" a particular message by reducing the time to "consider" alternatives.)

In several anthropology papers and courses, I've been shown patterns where young men (at least) may have greater strength and aptitude, but peak hunting productivity doesn't arrive until upwards of age 35.  This implies a context-specific, experiential learning process is at work in this socially important activity.  The fact that it's a relatively smooth curve from the onset of puberty to adulthood, and that adult culture is relatively monolithic (excuse the pun) in most cases, says something as well.

In some behavioral genetics studies of traits like religiosity, the heritability of the trait is strongly affected by age.  Older subjects often show an increase in heritability, which wouldn't make sense if it were a simple additive genetic trait.  If the trait requires a (social) context-specific "burn-in" period, however, and is tied into a set of working behavior-governing institutions, that pattern would make sense.
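For intuition about why that's surprising under a simple additive model: twin studies often estimate heritability with Falconer's formula, h² = 2(r_MZ − r_DZ).  Here's a minimal Python sketch of how an age split would show up in that arithmetic; the correlation values are hypothetical, invented purely for illustration, not taken from any actual study.

```python
# Falconer's formula estimates heritability from the gap between
# identical (MZ) and fraternal (DZ) twin correlations on a trait:
#   h^2 = 2 * (r_MZ - r_DZ)

def falconer_heritability(r_mz, r_dz):
    """Estimate heritability from MZ and DZ twin correlations."""
    return 2 * (r_mz - r_dz)

# Hypothetical twin correlations for religiosity at two ages
# (numbers invented for illustration only).
h2_teens  = falconer_heritability(r_mz=0.55, r_dz=0.45)  # -> 0.20
h2_adults = falconer_heritability(r_mz=0.62, r_dz=0.40)  # -> 0.44

print(f"heritability, teens:  {h2_teens:.2f}")
print(f"heritability, adults: {h2_adults:.2f}")

# Under a simple additive genetic model, h^2 should be roughly stable
# across ages.  An increase with age fits a trait that needs a
# context-specific "burn-in" period before genetic differences show.
```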

So...

My hypothesis is that the "developmental" period during puberty isn't part of an extended childhood; it's the intended developmental period for adulthood--essentially, an apprenticeship for learning the rules and institutions of the social context in which one will have to function as an adult.  By removing the opportunity to develop within the adult society they will later be part of, we leave teens less prepared, or adapted to substitute, invented variants of the "adult world."

This would explain a number of things, not least the divorce rate and the periodic creation of teenage subcultures.  Essentially, by "protecting" teens from the adult world, we ensure they learn and lock in (that's where the myelination comes in) rules appropriate only to the limited situation they live through as teens, and they then have difficulty adapting to the "responsible" adult world--a lot like Peter Pan.

How could you prove such a hypothesis?

The ideal study would be ethically challenging in most places, because it would require treating every subject past puberty as an adult.  Alternatively, one could run a cross-cultural study of societies where adulthood is conferred at puberty, but the confounds would be ridiculous, since most of those societies also have a simpler overall culture.  If there were reason to believe the different parts of this developmental process weren't tightly integrated, one could instead pick activities that are less regulated in this culture and create two experimental groups, with "childhood" enforced in one and adult activities permitted or encouraged in the other.  Something like hunting or finances might be of interest.
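If anyone were to actually run that two-group design, the first practical question is how many subjects it would take.  Here's a minimal sketch using statsmodels' power calculator, assuming a two-sample comparison on some later measure of adult functioning; the effect size is a guess, since no prior estimate exists for this hypothesis.

```python
# Rough sample-size estimate for the two-group design above: one group
# with "childhood" enforced, one permitted/encouraged adult activities,
# compared later on some outcome measure of adult functioning.
from statsmodels.stats.power import TTestIndPower

# Assume a medium effect (Cohen's d = 0.5), 5% significance, 80% power.
# The effect size is an assumption, not an empirical value.
n_per_group = TTestIndPower().solve_power(
    effect_size=0.5, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"participants needed per group: {n_per_group:.0f}")  # roughly 64
```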

Just an idea.
