

Why Parents May Feel Obligated to Medicate Children for ADHD

Medicating kids used to be a tough sell. What changed?

Key points

  • The widespread medication of children has been criticized as an improper form of social control and intolerance of children's differences.
  • ADHD and medications are no longer conceived in terms of bad behaviors or control but in relation to self and deficient self-management.
  • The new conception did not arise from any new discovery but from a changed understanding of what it is to be “normal.”
  • When the promise of medication for children is to realize a better, more successful version of themselves, it is hard for parents to argue.
Source: Photo by RODNAE Productions

Parents, when surveyed, consistently report strong reservations about medicating kids for attention-deficit/hyperactivity disorder (ADHD). Yet rates of diagnosis and treatment steadily rise. How can both be true? In an earlier post, I referred to this curious correlation as the paradox of ADHD.

In seeking to resolve the paradox, I made three points. One, as the scope of ADHD has expanded over time to include more and milder symptoms, a growing number of parents are left wondering if their struggling child has a medical problem and feeling an obligation to seek medical advice. Two, professionals, and even lay people, promote a “responsibility” that presupposes a readiness on the part of parents to orient themselves by “scientific” knowledge and expert guidance. And three, since the experts strongly support medication treatment (with or without other interventions), even skeptical and initially resistant parents are likely to find themselves accepting the diagnosis and filling the prescription.

These reflections, based on several types of evidence, raise a deeper question I want to pursue here, a question that pertains to doctors as well as parents. Put simply, how does giving a child or adolescent a diagnosis and a powerful drug, especially in non-severe cases, come to be seen as the responsible thing to do?

Medicating kids used to be a tough sell. In their 1975 book, The Myth of the Hyperactive Child, Peter Schrag and Diane Divoky argued that a small number of children have a condition sufficiently serious to warrant medication. However, “most do not; they are being drugged…to make them more manageable.” In a historical study of popular magazine articles, including Time and Forbes, from 1968 to 2006, I found a similar concern. Many authors, while acknowledging some genuine cases of a medical condition, worried that the growth in stimulant medication use was being driven by a social need to “establish docility” and compel a narrow “conformity” at home and school. They regarded this overuse as unethical and irresponsible because, at least in the first instance, those who benefited from the drug were parents and teachers, not the children themselves.

Many ordinary citizens, as we found in our surveys and focus groups, also see the widespread medication of children in terms of improper social control and an intolerance of children’s natural differences. Administering a drug is “not necessarily good for the kid,” said one focus group participant, “but it’s good for the people around the kid, the adults around the kid. And I think that’s dangerous.” Even doctors, judging from quotes in the magazine articles, sometimes shared this worry. Yet, in concrete cases, the “good for the adults” framing disappears. A diagnosis and medication become the responsible thing because they are deemed “good for the kid.” How does this happen?

The short answer is that the child’s problem, however initially understood, gets redefined. The path that leads to the doctor’s office normally begins with troublesome behaviors, often first remarked upon by teachers but also observed by parents. Poor emotional control, distractibility, disorganization, overactivity, quarrelsomeness, and underperformance are among the bad behaviors—the symptoms—commonly reported. All affect relationships and success at school, and frequently at home and with peers as well. The doctor is called because something needs to be done for the sake of everyone involved. And the potential of medication is just that: to curb and control unusual and wearisome conduct, a benefit to all parties, including, though not necessarily willingly, the child.

Expressed in these behavioral terms, medicating kids sounds a lot like enforced conformity to social and institutional standards, the very thing that people often consider wrong and irresponsible. I doubt that skeptical parents, if this were all they knew or heard, would be inclined on that basis to put their son or daughter on an amphetamine like Adderall. To deem such a course of action responsible, parents must be convinced that the child’s welfare is foremost, that taking a medication is good for the child.

This is where the “scientific” perspective on ADHD that parents are encouraged to adopt is decisive. It used to be the case that psychiatric concerns with childhood hyperactivity and distractibility were defined in terms of maladaptive behaviors. Drug ads that appeared in medical journals in the 1960s, for instance, described the effects of medication as “reducing” behavioral symptoms and promoting “good adjustment.” There was some reference to improving school performance and interpersonal relations, but, as in a classic Ritalin ad, this possibility was not a direct result of the drug. The advertised drug effect was to reduce disruptive behavior, which, in turn, might make “the child more responsive” to “non-pharmacological” interventions, such as psychotherapy. These other interventions would help with school and enhance relations. Not many kids were being medicated in those days.

In our age of mass diagnosis and prescriptions, the conception of ADHD is quite different. In the widely influential Executive Functions (EF) theory, the disorder is not defined in terms of specific behaviors or even cognitive deficits in attention. “It’s not really attention that’s the problem,” according to prominent EF theorist Russell Barkley.1 The notion of attention deficits was prominent in the 1970s and 1980s. That idea led to the name “attention-deficit disorder” and to the theory that, by somehow normalizing brain processes, “stimulant medication helps reduce these deficits,” to quote Virginia Douglas, a pioneer of this perspective.2

But now ADHD is conceived in relation to self, a “self,” says Barkley, that, compared to others, is inadequately “controlling, regulating, or executing behavior.” The sufferer has a deficient capacity to self-manage, be goal-oriented, and project a future. Not particular behaviors but the very organization of behavior is at stake. Symptoms might include any contextually problematic behavior that suggests a lack of individual motivation, self-control, or self-interest regarding future goals or consequences. And the stimulants—still the main drug of choice—are defined as directly producing a better self, one with self-discipline and drive.

Consumer ads promoting drugs for ADHD perfectly illustrate the “self” theory of ADHD. In an Intuniv ad, for instance, the better self is depicted by a little boy who, on medication, can remove the monster costume that was all anyone could see. In a Concerta ad, a mother tells us that now that her son is on medication, she can “see Jason, not his ADHD.” The touted benefits suggest new motivation and self-application to demanding tasks—“helps improve academic productivity,” “more chores done at home”—which lead to better social relations: “already done with my homework dad,” “friends that ask him to join the group,” “makes teacher proud.” The drugs empower by helping a child to “reveal his potential,” produce “schoolwork that matches his intelligence,” and get “on the path to success”; one Adderall ad even promises that the drug “adds new meaning to [his] life.” By happy coincidence, what is good for the young person also brings peace to mom, dad, and teachers. A win-win all around.

No new scientific discovery or pharmacological innovation brought about this changed interpretation. What it reflects, instead, is a logic of self that has become a dominant organizing principle in our society. So much of the structure that previously organized life and growing up—traditions, social roles, rites of passage, moral codes, everyday rituals—has receded or disappeared. Now, choices in every area of life must be made about what to do and whom to be. Though typically thought of as liberating, having to choose everything is more difficult and demanding. As a body of social research shows, despite talk of our society as “permissive,” expanded choice means that more self-steering is required from the individual, not less.3 And all this with far fewer signposts by which to steer.

Now each child has a responsibility, increasing with age, to be the sort of person who can “manage herself,” find her own motivation, and handle herself in a wide range of social situations. The sort of person who can “make” herself into the child she somehow knows her parents want her to be. The sort who can stand out, live up to her potential, and “meet the challenges the future is throwing at her.”

All this self-making and regulating and motivating might sound daunting for a teenager. But, according to Barkley, it is perfectly natural. The properly functioning adolescent is an internally motivated and strategic self-optimizer. She knows what she wants and she knows how to get it. With little in the way of incentives or inducements, she navigates the uncertain and ever-changing environment, evaluating and choosing those courses of action that contribute to achieving success in every area of life. Distractions, immediate gratification, and boring and unrewarding tasks do not deter her.

The properly functioning child is undeterred because she is under the control of a properly functioning brain. The two functionalities—of self and brain—match. This equivalence of norms, made in fact by analogy, is what makes the theory plausible and seemingly scientific. Executive functions (neurocognitive processes), Barkley claims, underlie the powers of self-control and future-directed behavior. They guide individuals, according to a review article, in making “optimal” decisions. In any situation, EFs keep “information about possible choices in working memory,” integrate “this knowledge with information about the current context” and so “identify the optimal action for the situation.”4 If an adolescent’s neural circuitry is firing normally, she prioritizes and carries through with her own best (natural) interests.

Those who operate differently, then, very likely have a brain dysfunction. The way to find out, as has always been the case, is to try the medication. When ADHD is suspected, a visit to the doctor is likely to result in a diagnosis and a recommendation to begin a course of drug therapy. If the drug “works,” if it improves behavior or increases productivity, the evidence for the brain pathology, now conceived in terms of abnormal EF execution, is presumed to have been supplied. As people so often said in interviews and focus groups, they knew the problem was “real” (biological) because they could see the difference.5

The responsibility to try medication logically follows. Confronted with confident assertions of a biological deficit, parents, whatever their initial reservations, are likely to see their obligation as trying to put it right. Their own challenges with the child’s behavior and performance, or their ideas about providing a more suitable environment, are likely to appear quite secondary by comparison. Although the functional person in EF theory only does what is socially appropriate and only strives for what is socially approved, there is no mention of social norms or conformity. The promise of treatment is on a higher plane: your son or daughter realizing a better, more successful version of themselves. It’s hard to argue with that.

References

1. Unless otherwise noted, quotes are from Russell Barkley, Taking Charge of ADHD: The Complete, Authoritative Guide for Parents, 3rd ed. New York: Guilford, 2000.

2. Virginia I. Douglas, “This Week’s Citation Classic.” Current Contents 44 (October 29, 1984): 16.

3. See, for example, Alain Ehrenberg, The Weariness of the Self: Diagnosing the History of Depression in the Contemporary Age. Montreal: McGill-Queen’s University Press, 2010; and Andreas Reckwitz, The Society of Singularities. Medford, MA: Polity, 2020.

4. Erik G. Willcutt, et al. “Validity of the Executive Function Theory of Attention-Deficit/Hyperactivity Disorder: A Meta-Analytic Review.” Biological Psychiatry 57 (2005): 1336–1346.

5. See my Chemically Imbalanced: Everyday Suffering, Medication, and Our Troubled Quest for Self-Mastery. Chicago: University of Chicago Press, 2020.
