As an undergraduate at the University of Arizona, Kristy vanMarle knew she wanted to go to grad school for psychology, but wasn't sure what lab to join. Then, she saw a flyer: Did you know that babies can count?
"I thought, No way. Babies probably can't count, and they certainly don't count the way that we do," she says. But the seed was planted, and vanMarle started down her path of study.
What's been the focus of your most recent research?
Being literate with numbers and math is becoming increasingly important in modern society — perhaps even more important than literacy, which was the focus of a lot of educational initiatives for so many years.
We know now that numeracy at the end of high school is a really strong and important predictor of an individual's economic and occupational success. We also know from many, many different studies — including those conducted by my MU colleague, David Geary — that kids who start school behind their peers in math tend to stay behind. And the gap widens over the course of their schooling.
Our project is trying to get at what early predictors we can uncover that will tell us who might be at risk for being behind their peers when they enter kindergarten. We're taking what we know and going back a couple steps to see if we can identify kids at risk in the hopes of creating some interventions that can catch them up before school entry and put them on a much more positive path.
Your research points out that parents aren't engaging their kids in number-learning nearly enough at home. What should parents be doing?
There are any number of opportunities (no pun intended) to point out numbers to your toddler. When you hand them two crackers, you can place them on the table and count them ("one, two!" "two crackers!") as they watch. That simple interaction reinforces two of the most important rules of counting — one-to-one correspondence (labeling each item exactly once, maybe pointing as you do) and cardinality (in this case, repeating the last number to signify it stands for the total number in the set). Parents can also engage children by asking them to judge the ordinality of numbers: "I have two crackers and you have three! Who has more, you or me?"
Cooking is another common activity where children can get exposed to amounts and the relationships between amounts.
I think everyday situations present parents with lots of opportunities to help children learn the meanings of numbers and the relationships between the numbers.
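For readers who like to see ideas as code, here is a tiny sketch (my own illustration, not anything from the interview) of the two counting rules vanMarle describes: one-to-one correspondence means every item gets exactly one label, and cardinality means the last label doubles as the size of the whole set.

```python
def count_out_loud(items):
    """Count a set of items the way a toddler learning to count would:
    label each item exactly once (one-to-one correspondence), then
    repeat the last label as the total (cardinality)."""
    last_label = 0
    for last_label, item in enumerate(items, start=1):
        print(f"{last_label}! ({item})")  # one label per item, exactly once
    print(f"{last_label} crackers!")      # the last label stands for the whole set
    return last_label

count_out_loud(["cracker", "cracker"])  # prints "1! ... 2! ... 2 crackers!"
```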
Friday, August 11, 2017
Made for Math
Via NPR, research suggests that we're all born with math abilities, which we can hone as we grow (see the interview with Kristy vanMarle excerpted above).
Thursday, March 23, 2017
Age Peaks
IFLScience just posted this infographic showing the ages at which you peak at various skills, experiences, and characteristics:
I'm experiencing both fascination and a slight existential crisis. Thankfully, today is my Friday.
Wednesday, December 28, 2016
On Star Wars, Leia, and Coming of Age Stories
Not so long ago, in a city not so far away, I watched Star Wars: A New Hope for the first time. There were many viewings after that, and the Star Wars trilogy became part of my regular rotation of movies. Yesterday, after finding out the sad news that Carrie Fisher had passed away, I had many hours in our car ride home from the holidays to think about Star Wars and what it meant to me. Because though I went on to see many movies with Carrie Fisher, that was the first time I encountered her and her badass character - a character I could look up to, who was strong, intelligent, beautiful, and brave. For a little girl who at many times wished she could do "boy" things and felt stifled by many (though not all) of the "girly" things, her character meant a lot to me.
I think Star Wars meant a lot to many people I know, just as so many coming of age stories do. We were all Luke, wishing for bigger and better things, wanting to fight for something, to be brave and heroic. And we all secretly wished we could find out we were special somehow, with some power or ability that made us different from the rest - that gave us a purpose. The story of Luke developing his Force abilities appealed to us in the same way as (to name a more contemporary movie franchise) Harry Potter's magic abilities. I think that's one reason these stories are so enduring. And Luke's story is probably most meaningful to people who were themselves coming of age when they first encountered it - whether as children, teenagers, or young adults, trying to find themselves in the world.
The thing about Star Wars is, there's something in there for everyone, and on repeat viewings I see and experience new things. I have a personal "theory" that while coming of age stories appeal most to the young, the themes that become important to us in adulthood are stories about redemption. While there are many characters in Star Wars with redemptive goals, the best example is of course Darth Vader. At first, Luke wants to fight and defeat him. As he learns the truth about who Vader is, his goal changes to wanting to save him. And in some ways, Vader wants to be saved, but even then, believes it's too late for that. It's unlikely any of us have done anything as awful as what Vader did, but I think most adults have regrets - things they feel bad about and want to atone for in some way. Many of us think we've done too many bad things or gone too far to ever come back from it. But those stories of redemption resonate with us, and help show us that it's never too late to change for the better. It's those messages that appeal to me on repeat viewings of Star Wars now.
What hasn't totally changed over the years is my opinion of Leia. I loved her feistiness, her strength, her willingness to sacrifice for the greater good, and of course, that she had no problem grabbing the gun and basically rescuing herself when needed. I aspired to be like her, and it was wonderful to have a character who "looked like me" that I could look up to. I went on to adore many movies, shows, books, and so on featuring strong female characters. And Leia is also probably why my approach to flirting frequently involves teasing someone; I haven't called anyone a scruffy-looking nerf herder or a scoundrel though (not yet anyway), but you get the idea.
I probably sound like kind of a nerd for saying that when I heard about Carrie Fisher's heart attack, I said a little prayer that she would be okay. No, I don't know her. Yes, I know Leia isn't her only character. And yes in response to a blog post I saw circulating on Facebook (that I won't link to here but really hope the three ghosts of Christmas pay that blogger a visit soon), I realize that she was pretty hard on her body over her 60 years. But I really wanted her to be okay. One of her characters meant a lot to me - as a child and still today - and so she meant a lot to me. Over the years, as I learned more about her, I saw she really was someone I could look up to. Yes, she made her mistakes, if you want to call mental illness and addiction a "mistake," and like so many of us, she was working toward redemption, toward self-improvement. She was a writer who used her words as a way to heal and grow, and also to share her story and connect to others. To show all of us that we could come back from wherever we found ourselves, that our past is not who we are now. And she encouraged us to be proud of how far we had come.
And damn, did she have a way with words:
Tuesday, November 15, 2016
There Are No Old Dogs, Just New Tricks
Throughout my adult life, I've run into many people who talk about wanting to learn some new skill: a new language, some topic of study they didn't get to in college, a new instrument, etc. When I've encouraged them to go for it, I often hear the old adage, "You can't teach an old dog new tricks." As a social psychologist with a great deal of training in behaviorism, I know that, for many things, that just isn't true. It might be more difficult to learn some of these new skills in adulthood, but it's certainly not impossible. In fact, your brain is still developing even into early adulthood. And the amazing concept of brain plasticity (the brain changing itself) means that a variety of cognitive changes, including learning, can continue even past that stage.
A new study in Psychological Science examined training in different skills, comparing individuals from ages 11 to 33, and found that some skills are better learned in adulthood. They included three types of training: numerosity-discrimination (specifically in this study, "the ability to rapidly approximate and compare the number of items within two different sets of colored dots presented on a gray background"), relational reasoning (identifying abstract relationships between items, sort of like "one of these things is not like the other"), and face perception (identifying when two faces presented consecutively on a screen are of the same person or different people). They also measured verbal working memory with a backward digit-span test, which involved presentation of a series of digits that participants had to recall later in reverse order.
Participants completed no more than 1 training session per day over 20 days, through an online training platform. They were told to use an internet-enabled device other than a smartphone (so a computer or tablet). The tasks were adaptive, so that performance on the previous task determined the difficulty of the next task. To compare this training program to one you're probably familiar with, what they created is very similar to Lumosity.
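The excerpt doesn't say exactly how the tasks adapted, so treat the following as a hypothetical sketch of the general idea: a simple one-up, one-down staircase that makes the next trial harder after a correct answer and easier after an error, keeping the participant near the edge of their ability.

```python
import random

def run_adaptive_session(n_trials=40, start_level=5, min_level=1, max_level=20):
    """Toy one-up/one-down staircase (illustrative only, not the authors'
    actual algorithm): difficulty rises after a correct response and falls
    after an error, so the task hovers where accuracy is roughly 50%."""
    level = start_level
    history = []
    for _ in range(n_trials):
        # Stand-in for a real trial: higher levels are harder to answer correctly.
        p_correct = max(0.05, 1.0 - level / (max_level + 5))
        correct = random.random() < p_correct
        history.append((level, correct))
        level = min(max_level, level + 1) if correct else max(min_level, level - 1)
    return history

session = run_adaptive_session()
print("Difficulty reached by the end of the session:", session[-1][0])
```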
They found that training improved performance on all three tasks (looking at the group overall, controlling for the number of training sessions the participants actually completed). But when they included age as a variable, they found some key differences. The improvements they saw in numerosity-discrimination were due mainly to the results of the adult age group; when adults were excluded, and they only looked at adolescents, the effects became non-significant. The same was also true in relational-reasoning performance. Though there was a trend toward more improvements among adults on face-perception, these differences were not significant. You can take a look at accuracy by skill, testing session, and age group below (asterisks indicate a significant difference):
Another key finding was that there was no evidence of transfer effects - that is, receiving ample training in one task had no impact on performance in a different task. This supports something psychologists have long argued, much to the chagrin of companies that create cognitive ability training programs (ahem, Lumosity): training in a cognitive skill improves your ability in that skill specifically, and doesn't cause any generalized improvement. That's not to say doing puzzles is bad for you - it's great, but it's not going to suddenly improve your overall cognitive abilities.
But the key finding from this study is not only that "old dogs" can learn new tricks, but that for some tricks, older really is better.
EDIT: I accidentally omitted the link to the study abstract. But the good news is, I discovered it's available full-text for free! Link is now included.
Sunday, October 30, 2016
Facing Their Fears
In Harry Potter and the Prisoner of Azkaban, the students learned about boggarts, magical creatures that take on the form of whatever their targets fear the most:
While this could have been an absolutely terrifying lesson, as each student is asked to step up and face the boggart, the students spent much of the time laughing. During the lesson, Professor Lupin taught them a charm that would take away the power of the boggart (or rather, take away the power the boggart had over them), by forcing it to take on a ridiculous form (whatever the spellcaster imagines).
There may be something cathartic about facing one's fears, perhaps by taking a good long look at them and realizing they are actually ridiculous. That may have been the motivation for photographer Joshua Hoffine, who photographs reenactments of his children's nightmares:
Hoffine, based in Kansas City, Mo., and a self-proclaimed “Horror Photographer,” is interested in the psychology of fear. In his project “After Dark, My Sweet,” Hoffine’s surreal and staged images render these fears visible with the “visual grammar of a child.” Through elaborate sets, costumes, makeup and fog machines, Hoffine’s children act out these terrifying scenes in front of his camera.

It's easy to let things spiral out of control in one's mind, but talking to others about it helps keep you grounded; others may be better at seeing when something is irrational. In setting up and taking the pictures, Hoffine is able to talk to his children about their nightmares, which may help take some of the power away.
And the pictures, which will be featured in a forthcoming book, Horror Photography, are absolutely gorgeous, if a bit disturbing:
Wednesday, October 12, 2016
Don't Tell Momma What You Know
Not long ago, a post went viral that shouted to the masses "New Research Establishes That Intelligence Is Inherited From The Mother." Hopefully everyone called their mom and thanked her for this incredible gift, not because it's true (it's not), but because it's always good to call your mom and thank her for something.
Not the point of the post. The point is that, no, you don't inherit your intelligence from your mom. At least not completely. In a recent post for Forbes, science writer Emily Willingham ripped this post to shreds:
But more than that, she offers a far more complex view of intelligence, reminding her audience that only about half of what we call intelligence comes from genes:
I saw many of my friends, including fellow scientists, share this hot mess of a post. Why? Willingham has one potential explanation:
In my case, I know my mom is responsible for a lot of my intelligence, not just from genetics, but from offering an enriching world. She would teach me new words and ask me to try to use them in a sentence that day. I was probably the only first-grader who knew "tumultuous." She taught me the difference between independent and dependent variables when I entered a science fair in 7th grade. And she instilled in me the curiosity that continues to influence my efforts today. She's also the reason I write.
Regardless of what genes were passed on (or not), there's a lot we can do for children.
A garbled post from a website called Second Nexus has gone viral in my feeds (and possibly yours), likely because of its eye-catching headline claim that “New Research Establishes That Intelligence Is Inherited From The Mother.” The piece is bylined “Editorial Staff,” presumably because everyone was too embarrassed to put a real name on it.

Ouch. She goes on to offer some education on genetics, including that 1) women "tend to" have two X chromosomes (but not always), 2) one of those X's had to come from the father, and 3) having two X's doesn't double your odds for receiving some trait, because cells might shut down most of one X, reducing its influence.
While maybe half of our intelligence as we currently define and measure it is inherited, that proportion is in turn fractured into many many genetic variants scattered across our genomes. These variants operate together in various ways to form what we view as intelligence. And each of those fragments of heredity that contributes is itself subject to a host of environmental factors, both in its immediate molecular world and inputs to the whole organism, that will influence function. And that influence continues after birth as an ongoing mutual interplay of gene variants and environment. It’s layer upon layer upon layer of interacting pieces. So no. Not just your mother. Not just the X chromosome. Not even just genes.

Willingham then traces back the sources the original post used to make its claims, including Psychology Spot (a blog she calls "a dumpster fire of poor information about genetics and embryonic development"), and Cosmopolitan.
Those headlines. It’s an irresistible invitation to humblebragging, whether you have a mother and think you’re a genius or you are a mother and think your children are geniuses or you’re feeling feminist and want to stake a claim that women bring the smarts to this world. That’s a pretty solid built-in audience ready to click…and share.

This is not to say mothers aren't important for their children's development and intelligence. About half of our intelligence comes from genes, and short of genetic engineering, there's not a lot we can do about that. But the other half seems to be environmentally influenced. Mothers (and fathers, and really anyone interacting with children) have a lot more control here. For instance, some research shows that offering growing infants lots of stimulation can result in more developed brains - the environment can actually influence the physical. Taking children out to see the world, and experience new things, does enrich them and boost their intelligence. (Note: this is real stimulation, not on a screen - though some would argue television has deleterious effects on children, the most we can say from research is that it has no effect. And don't get me started on the Baby Einstein videos...)
Friday, September 9, 2016
On Psychometrics, Cognitive Ability, and Achievement in the STEM Fields
Yesterday, I stumbled across an article that on the surface (based on the title, at least) appears to be an article for parents wanting to encourage genius in their children, but is in actuality a nice history of psychometrics, the measurement of cognitive ability, and evolution of gifted programs. The article focuses on the research of psychometrician Julian Stanley, who conducted the Study of Mathematically Precocious Youth (SMPY):
As the longest-running current longitudinal survey of intellectually talented children, SMPY has for 45 years tracked the careers and accomplishments of some 5,000 individuals, many of whom have gone on to become high-achieving scientists. The study's ever-growing data set has generated more than 400 papers and several books, and provided key insights into how to spot and develop talent in science, technology, engineering, mathematics (STEM) and beyond.

The study was inspired in part by Lewis Terman's famous study of genetics and genius. What made Terman's study so interesting is that it limited inclusion to people with IQs in the genius range, yet found fairly low levels of success in the sample, and it missed, by only a few IQ points, two future Nobel Prize laureates. This resulted in the conclusion that IQ, once you get to a certain level, does not necessarily correlate with greater success and achievement, a conclusion Malcolm Gladwell echoed in his book Outliers.
But Stanley saw some problems with these results, the most important of which was that Terman was using overall IQ score, which is a measure of general intelligence, to attempt to identify people gifted in science and math, which are specific intelligences. So Stanley skipped using cognitive ability tests and opted to use the quantitative portion of the SAT. Later on, they used an even more specific indicator of mathematical ability - spatial reasoning:
A 2013 analysis found a correlation between the number of patents and peer-refereed publications that people had produced and their earlier scores on SATs and spatial-ability tests. The SAT tests jointly accounted for about 11% of the variance; spatial ability accounted for an additional 7.6%.

Evidence from Stanley's work has resulted in changes in how gifted individuals are identified and educated, including the notion of skipping grades or allowing students to complete materials at their own pace and/or take advanced coursework while remaining in the same grade level (something I did during my time in school). What it comes down to is the importance of proper measurement - knowing what to measure to understand and predict certain outcomes.
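That "additional 7.6%" is the language of incremental variance: fit a regression with the SAT scores, then add spatial ability and see how much R-squared grows. Here's a rough sketch with simulated data (the variable names and effect sizes are invented for illustration; the actual SMPY analysis is far more involved):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated predictors and outcome (illustrative numbers, not SMPY data).
sat_math = rng.normal(size=n)
sat_verbal = 0.5 * sat_math + rng.normal(size=n)
spatial = 0.3 * sat_math + rng.normal(size=n)
publications = 0.30 * sat_math + 0.15 * sat_verbal + 0.25 * spatial + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_sat = r_squared(np.column_stack([sat_math, sat_verbal]), publications)
r2_full = r_squared(np.column_stack([sat_math, sat_verbal, spatial]), publications)
print(f"SAT alone:        R^2 = {r2_sat:.3f}")
print(f"Added by spatial: {r2_full - r2_sat:.3f}")
```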
Wednesday, September 7, 2016
The Bully Who Cried Wolf
This morning on Facebook, I came across an article about bullying. Or more specifically, it was about differentiating bullying from rudeness and meanness. As a researcher, I love operational definitions and digging into different but related concepts to come to a better understanding of human interaction. Specifically, the author Signe Whitson (social worker and author of Friendship & Other Weapons: Group Activities to Help Young Girls Cope with Bullying) defined the terms as follows:
"Rude = Inadvertently saying or doing something that hurts someone else.
Mean = Purposefully saying or doing something to hurt someone once (or maybe twice).
Bullying = Intentionally aggressive behavior, repeated over time, that involves an imbalance of power."

The point Whitson was trying to make is that, while all of these behaviors are problematic and need correction, they shouldn't all be called bullying. Her reason for this is that overuse of the word takes away its effectiveness, especially if people begin hearing ad nauseam about cases that aren't bullying:

"I have already begun to see that gratuitous references to bullying are creating a bit of a 'little boy who cried wolf' phenomena. In other words, if kids and parents improperly classify rudeness and mean behavior as bullying — whether to simply make conversation or to bring attention to their short-term discomfort — we all run the risk of becoming so sick and tired of hearing the word that this actual life-and-death issue among young people loses its urgency as quickly as it rose to prominence."

What I've learned as a psychologist - and have repeated to friends, especially those who are or are soon becoming parents - is that it takes a lot to damage a person permanently. People are pretty resilient. But repeated injuries, whether physical or psychological, can have lasting effects. We need time to heal and rebuild, but if a person keeps being bullied without having the opportunity to grow and move on, they become stuck, merely reacting to it. They don't have time to prepare and build resources that would help shield them from it. Even worse, they may come to see being bullied as part of who they are. So I think Whitson's terminology is very important to keep in mind.
At the same time, rudeness and meanness can grow into bullying, if left unchecked. So I think it is still important to intervene before things get out of hand. To be fair, Whitson does say all of these behaviors need correction, so she isn't downplaying their importance. But we do need to strike a balance between responding to bullying, and intervening to keep other behaviors from turning into bullying.
Friday, May 27, 2016
Childhood Abuse, Reliability, and Measurement Error: More from Sara's Week in Psychological Science
This morning, I attended a great session on methodological issues in studying trauma. One of the presentations was about studying childhood abuse. Many measures of childhood abuse are done retrospectively, among college students/adults, asking about past experiences. Researchers have noted a variety of issues with this approach, including the issue that reports of childhood abuse seem to vary over time - that is, a person may report that they were abused as a child at one timepoint, but not at another.
This leads some to conclude that reports of childhood abuse may be influenced by current levels of distress - people may misremember or misreport childhood abuse depending on how distressed they are feeling as adults. We know that current experiences can color past experiences, which we see in any kind of memory research; memory is highly malleable. However, any time we measure something in people, we also have to worry about measurement error. Poorly worded questions, respondent fatigue, and other factors may affect how well the measure "works."
The great thing about structural equation modeling (SEM) is that it separates measurement error out, so we can get a purer read of how much particular constructs relate to each other. Some people have used this as a mark against SEM, saying that it shows a "perfect world" relationship that we would rarely see in practice. But in many cases, SEM is a great technique to use as a way to answer questions about reliability due to measurement error versus systematic sources of variation.
The researchers measured past experiences of childhood abuse at the same time as they measured symptoms of distress, using a post-traumatic stress disorder (PTSD) checklist. Two weeks later, they measured these two variables again. They built a model where time 1 PTSD predicted time 2 PTSD, time 1 reports of childhood abuse predicted time 2 reports, and time 2 PTSD predicted time 2 reports. What they found was that time 2 PTSD predicted less than 2% of the variance in time 2 reports of childhood abuse. Time 1 abuse reports were highly predictive of time 2 abuse reports, and once measurement error was factored out, they found reliability in abuse reports above 0.80 (which in the measurement world is considered excellent reliability).
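To see why separating measurement error matters, here's a toy simulation (my own, not the presenters' model): the "true" tendency to report abuse is perfectly stable across the two timepoints, but each measurement adds noise, so the raw test-retest correlation looks only moderate until you correct for the unreliability that the noise creates - which is, in spirit, what the SEM approach does by modeling the error explicitly.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# True, stable tendency to report childhood abuse (identical at both timepoints).
true_score = rng.normal(size=n)

# Observed reports = true score + independent measurement error at each timepoint.
error_sd = 0.6
report_t1 = true_score + rng.normal(scale=error_sd, size=n)
report_t2 = true_score + rng.normal(scale=error_sd, size=n)

observed_r = np.corrcoef(report_t1, report_t2)[0, 1]

# Reliability of each measurement: true variance / (true + error variance).
# Known here because we simulated the data; SEM estimates it from the model.
reliability = 1.0 / (1.0 + error_sd**2)

# Classic correction for attenuation: r_true = r_obs / sqrt(rel_t1 * rel_t2),
# which equals r_obs / reliability when both timepoints are equally reliable.
corrected_r = observed_r / reliability

print(f"Observed test-retest correlation: {observed_r:.2f}")
print(f"Correlation corrected for measurement error: {corrected_r:.2f}")
```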
What this means is that, when it looks like people are saying different things at different times, measurement error is a much more likely culprit than how a person is currently feeling. Obviously, the study was done over a short timeline (2 weeks), so results may be different if that time period is longer. This was also a college sample, and reports of abuse, especially physical and sexual abuse, were low. But this study gives some guidance for studying reports of childhood abuse in other samples, and highlights a time when separating measurement error from systematic variation (i.e., actual differences in reports of childhood abuse) is the optimal approach.
Wednesday, May 11, 2016
More About Self Theories: It's Not About Views on Success, But Views on Failure
I blogged recently about self theories, which refer to whether one believes intelligence is innate (fixed) or learned (incremental). As a result of this work, many psychologists have cautioned parents about the type of praise they give their children when they succeed. "You're so smart" implies that intelligence is innate, while "You worked really hard" implies that intelligence/ability can be learned.
However, I learned about a study yesterday that says it isn't parents' views on success but their views on failure that impact their children's self theories. The researchers conducted four studies, two with parents only and two with parents and their children. Across all four studies, they found that parents' self theories did not predict children's self theories, but that instead, parents' views on failure predicted children's self theories:
What this means is, if we want to avoid having our children develop a fixed view of intelligence, we have to change our own view of intelligence - not just in what we say. How do we do that? Unsurprisingly, Carol Dweck, who developed the concept of self theories, has designed an intervention called Mindset, though it is aimed at students. But perhaps another way is to adapt an approach used in cognitive therapy and similar interventions: self-monitoring and mindfulness.
In these approaches, the individual monitors his or her thoughts for instances of negative self-talk. They are not trying to suppress those thoughts - because we know that can backfire. Instead, when they encounter those thoughts, they practice acceptance of any mistakes they made that led to that self-talk and may even come up with rebuttals to those negative thoughts. You can read a little more about mindfulness and negative self talk here. In fact, as you'll see in the article, there are many common threads between mindfulness and incremental views of intelligence. For instance, the article states:
Our findings indeed show that parents who believe failure is a debilitating experience have children who believe they cannot develop their intelligence. The findings further suggest that this is because these parents react to their children’s failures by focusing more on their children’s ability or performance than on their learning. Taken together, our findings seem to have identified a parental belief that translates into concerns and behaviors that are visible to children and that, in turn, shape children’s own beliefs.

It is easier - although maybe not easy - to be watchful about the kind of compliments one gives to children, but when your child has failed at something and he/she is upset, it's more difficult to think rationally about the right response. In those situations, we tend to just react. The problem is, if that reaction is devastation that your child has failed, it sends a message that your child can do nothing to learn and develop his/her abilities. As a result, children may disengage and instead do something "safer" - something at which they know they will succeed. They may miss the opportunity to grow and challenge themselves - challenges which have many cognitive and emotional benefits.
People with good self-esteem see mistakes and failures as opportunities to learn about themselves. They take a "beginner's mind" approach - putting aside the judgements and conclusions from past behaviour and actions and, instead, thinking about what they've learned from these experiences.

And for a humorous approach to defeating negative self-talk, you could try this:
Wednesday, May 4, 2016
On Feelings (Woah-oh-oh, Feelings)
According to Paul Ekman, there are 7 basic human emotions that exist across cultures: happiness, sadness, surprise, contempt, anger, disgust, and fear. (It should be noted that Ekman's original research only identified 6 emotions; contempt was added later.) Ekman's work, based on research and observation in many different cultures, remains highly influential, and he has served as consultant for shows and movies, including the series Lie to Me, based on his work in lie detection, and the Pixar movie, Inside Out. Emotions, and their outward expression, serve an important evolutionary function, and also provide evidence in support of evolution, because humans and other primates share many emotional expressions.
These basic emotions are the ingredients for more complex emotions: happiness and contempt, for instance, combine to form smugness. And we all know what happens when you combine fear and anger:
Emotions not only give us cues when interacting with other people, allowing us to alter our behavior in response to their emotions, they also are an important guide in our own decision-making. Previously, we thought choice was determined by our perceptions about value; that is, we assign values to different outcomes, and pick the outcome (or course of action that would lead to that outcome) that has the most value to us. Value is considered a proxy for our emotions. However, no one has really tested that assumption directly.
Until now, that is. A recent study in Psychological Science, which you can read for free here, examined feelings and choice. From participants' reports of how they felt about gains and losses, the researchers developed a "feeling function" that allowed them to relate feelings to value, and ultimately to choice. If feelings are a proxy for value, we would expect a perfect or near perfect relationship between the two.
Based on past research, the authors expected losses to have more of an influence than wins; people are loss averse, and will rate a loss of money as more impactful than a win of the same magnitude. We also show diminishing sensitivity; a gain of $10 isn't twice as valuable to us as a gain of $5. The gambling task involved a series of shapes that participants had to choose between. Trials were randomized into one of three types: mixed (one shape had a 50% chance of a gain and 50% chance of a loss, and the other was a sure option of 0), gain only (one shape had a 50% chance of a gain and 50% chance of 0, or a sure, smaller gain), and loss only (one shape had a 50% chance of a high loss and 50% chance of 0, or a sure, smaller loss).
Participants also completed two feeling tasks: one about expected feelings - how they thought they would feel about winning or losing various amounts, and one about experienced feelings - how they actually felt after winning or losing various amounts. They used the expected feelings data to develop the feeling function, and tested it on the experienced feelings data. They found that feelings better predicted choice than values. They also found the diminishing sensitivity relationship, where a loss or gain of $10 has less than twice the impact of a loss or gain of $5. Interestingly, they didn't find evidence for loss aversion; losses and gains of the same magnitude had the same impact.
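Diminishing sensitivity is easy to see if you write the feeling function as a power curve, the way prospect-theory-style value functions are usually written. This is a toy version with a made-up exponent, not the parameters estimated in the paper:

```python
def feeling(amount, exponent=0.65, loss_weight=1.0):
    """Toy power-law 'feeling function': how good or bad a gain or loss feels.
    exponent < 1 produces diminishing sensitivity; loss_weight > 1 would add
    loss aversion (the study found little evidence for it, so it's 1 here)."""
    if amount >= 0:
        return amount ** exponent
    return -loss_weight * (-amount) ** exponent

for dollars in (5, 10):
    print(f"Feeling of winning ${dollars}: {feeling(dollars):.2f}")

# Diminishing sensitivity: $10 feels less than twice as good as $5.
print("Ratio:", feeling(10) / feeling(5))  # ~1.57, not 2
```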
A lot of people refer to decisions made with the aid of emotions as irrational. But emotions allow us to make quick decisions, as well as to decide between things that are generally equivalent. It's not always necessary to systematically consider all options. Sometimes, it's good to just trust your instinct.
But in all seriousness, May the Fourth be with you!
Tuesday, April 19, 2016
P is for Parasocial Relationships
Human beings are social creatures; we seek out relationships with other people in a variety of capacities and to fulfill a variety of important needs. In fact, we are so hard-wired to build relationships with others, that we may even feel connected - in a social sense - to people we have never met, often people we encounter through the media. We call this phenomenon "parasocial relationships."
This behavior begins very early on in life. As children, we learn about social norms and how we should behave by watching others, including through television, movies, and video games. Because children have such active imaginations, and often do not yet know the difference between fantasy and reality, they may come to believe the characters they watch and even interact with are real. As we grow older, we (hopefully) learn that the characters aren't real...
But the feelings and connections continue to influence us and shape how we interact with others. Even as adults, we continue to feel connections to characters and media personalities, even when we recognize that those connections aren't real. You could argue that fandom, having favorite characters, and so on, are all extensions of our tendency to form parasocial relationships.
The concept of parasocial relationships plays an important part in a theory of media known as uses and gratifications (U&G) theory - essentially, people have different motivations for using media, and will select media that fulfills their needs. In this theory, rather than being passive recipients of media information, consumers play an active role in seeking out and integrating media into their lives. However, though U&G theory is relatively new (since the 1940s), the concept of parasocial relationships has been around much longer, and could encompass feelings of connectivity with story characters, or even gods and spirits.
While we all show this tendency, some people are more likely to form parasocial relationships - or rather, more likely to form strong parasocial relationships - than others. People who do not have many opportunities for regular social interaction, for instance, tend to compensate for this deficit with parasocial relationships. I actually had the opportunity to witness this firsthand several years ago. My mom is visually impaired, and since I was in school, and my dad and brother worked full-time, she spent a lot of time at home with the dog and the TV. I introduced her to my all-time favorite show, Buffy the Vampire Slayer, and got her hooked.
So hooked, in fact, that I noticed she started talking about the characters - especially Willow, her favorite character - as though they were real people.
At first, I was a bit concerned, until I started thinking back to the concept of parasocial relationships. I realized that what she was doing was actually quite normal, maybe even healthy. And though somewhat more intense, her connection to the characters was not altogether different from mine - considering that show can make me laugh or cry, regardless of how many times I've seen a particular episode, I likely also feel some strong social connections to the characters of Buffy.
And though I've focused the post so far on characters, we can also form parasocial relationships with real people, like celebrities. For instance, I know a lot about some of my favorite authors - I've read about them, even met a few of them, and can talk about them almost as though I know them. While at the logical level I know I don't actually KNOW them, it's completely normal to still feel a social connection to them.
Saturday, March 26, 2016
What is Social Psychology: A Taste of What's to Come in April
I'm so excited for the April A-Z Blog Challenge, and am thrilled to get comments from new visitors excited about the topic. A couple of you mentioned that you haven't heard the term "social psychology" before (and trust me, you're not alone - probably half of the people who hear my degree is in social psychology are unfamiliar with the term), so I thought now would be a good time to give a brief overview, and let you know some things to expect in April.
Social psychology is a subfield of psychology. In fact, there are many subfields; just to name a few: developmental psychology, cognitive psychology, behavioral psychology, and clinical psychology. All of these subfields are part of the science of how people think, feel, and behave (the overall definition of psychology), but they go about understanding and studying those concepts in different ways.
Social psychology focuses on how people think, feel, and behave in social settings and examines social factors (like other people, groups of people, even societal norms) that impact how people think, feel, and behave. That is, even when you are by yourself, social forces like the media or the groups to which you belong continue to influence you.
Many people ask me how social psychology differs from sociology (an excellent question!). Sociology is about studying very large groups - like race, culture, or country of origin - and the overall behaviors of those large groups. Social psychology, on the other hand, is about individuals and their behavior, and in some cases, small groups, like juries; in fact, jury decision-making is a big topic in social psychology. Sociology and social psychology also use different theories and sometimes different methods to study concepts.
So what are a few key topics in social psychology? As I mentioned above, jury decision-making - and the related topic of how small groups make decisions - is a big one. I also mentioned the media as an influence on behavior, which is one of my favorite areas to study in social psychology. In fact, some social psychologists study whether the media encourages aggression or violence. If you've taken introductory psychology, you may remember watching a video about the "Bobo doll" study, a study by Albert Bandura examining whether watching violent media made children more likely to punch a Bobo doll.
(Image caption: Honestly, I'd punch anything that looks like a clown. Clowns are terrifying.)
If you didn't take introductory psychology, or don't remember this particular contribution, you're in luck! I'll be talking more about this study and Bandura's work in April!
Another famous social psychological study is sometimes called the "Milgram shock experiment." On the surface, Stanley Milgram appeared to be running a study of learning, in which a teacher shocked a pupil for wrong answers. In actuality, the study was about obedience to authority: the teacher (the actual participant) was encouraged by the experimenter to shock the pupil (an actor, who was not receiving real shocks) at increasing intensity. If this sounds like a brutal and unethical experiment, it was - though it wasn't deemed unethical until after the fact. Milgram conducted the study to understand why seemingly good people do horrible things at the behest of a leader (such as what happened during the Holocaust). Once again, more on that study in April!
In both of these cases, someone else is influencing the individual directly: the violent media or the experimenter demanding more shocks. But what about topics where the influence isn't as direct or obvious? For instance, what about the ways in which we associate ourselves with others, to form relationships or attachments? These are also prime topics for social psychologists. One example is the attachments we form with people we haven't met. We have favorite celebrities who we feel as if we "know." Or, if you're like me, you've become so attached to a character from a book or TV show that you feel genuine emotions about what happens to him or her. We call these "parasocial relationships." This is why we, for example, cry when a character we know and love dies. More on that in April!
Hopefully this post has answered some of your questions about my favorite field. If not, feel free to ask questions in the comments below! I absolutely love getting to tell people more about what I do. And be sure to check back in April!
Friday, August 7, 2015
When the King Makes Budget Cuts, the Arts are the First to Go: Pippin at the Cadillac Palace Theatre
Last night, I went to a spectacular production of Pippin at the Cadillac Palace Theatre in Chicago. Pippin is the eldest son of Charlemagne - though these characters are based on real people, Charlemagne and Pepin, very little of the plot is historically accurate. After a troupe of players introduces itself (Magic to Do), the show begins as Pippin finishes university and returns home to find his place in the world (Corner of the Sky).
Pippin does not seem to have a clear idea of what he wants to do with his life, beyond the sense that he is "extraordinary" and wants a fulfilling life. He tries on various "selves" over the course of the show: warrior, courtesy of some sibling rivalry with his half-brother Lewis (his time as a warrior ends after a discussion with a headless corpse); lover, after a discussion with his grandmother; revolutionary and then King, after some coaxing by the Lead Player; artist (until the arts budget is cut); and religious man (where he is "touched," but not by an angel). With each identity, Pippin drifts along without ever fully committing - in fact, even selecting these identities is not always his idea. The only decision he seems to make on his own is to run away and try something new, all the while insisting that he is "extraordinary" - so how could he possibly lead a simple, ordinary life? When nothing seems to work, Pippin's existential crisis leaves him in utter despair.
The show echoes some of the struggles we all go through in determining our identity and role in the world (see a previous post about this topic). According to Erikson's stages of development, this struggle - identity versus role confusion - occurs during adolescence, roughly ages 13-19. Presumably Pippin is older than this by a few years, but it's quite likely that this stage extends a little later for people who attend university before selecting a career and life goal. The Lead Player operates as the little voice inside Pippin's head, telling him he is made for great things and should never settle. The Lead Player even attempts to sabotage Pippin's relationship with a woman who describes herself as "ordinary." But Pippin seems to find some fulfillment and meaning when he meets someone who needs him, and this helps guide him to his destiny.
As I mentioned, the production was spectacular. The story took place inside a circus tent, with the players doing complicated acrobatics, dangling from hoops and trapeze, and, in one scene, balancing on four stacked metal tubes. Pippin's stepmother, Fastrada, had two costume changes in one scene that each took only a few seconds. The actors all had a great time, interacting with the audience and ad-libbing. The show breaks the fourth wall very often, especially in Act 2, so the interactions with the audience - such as Charlemagne playfully chiding the audience for applauding Fastrada ("Don't applaud; you'll only encourage her... And don't applaud that, either.") - fit well with the tone of the show.
In one scene, Lewis was supposed to leap through a hoop the Lead Player held over her head, and he just barely missed. Staying completely in character, the Lead Player said - as the orchestra kept playing - "Nope, we're doing that one again." Lewis returned to his starting position, the orchestra transitioned back to that point in the music seamlessly, and Lewis made the leap flawlessly to thunderous applause. As the Lead Player continued on with the show, she briefly paused and said to the audience, "You're welcome."
Under the surface, though, Pippin deals with a much deeper, even somber theme about finding fulfillment and leading a good life - whether it is extraordinary or not - as well as the danger of letting perfect be the enemy of good. The light-hearted tone of the rest of the show lets Pippin (and the audience) get so caught up in the fun that we almost miss it when the action takes a darker turn.
I wish all of you could see this show. Sadly, the production closes on Sunday. But if you have the opportunity between now and then - yes, I know, not a lot of time - definitely check it out!
~Sara
Friday, November 4, 2011
On Publishing, Perishing, and Post-Docing: A Reaction to Diederik Stapel's Confession
One reason I started this blog was as an outlet for my writing. I've always loved writing, and often considered it for a career (in those fleeting moments when I was really proud of something I had written and thought, "Yeah, I can do this forever"). I was constantly penning short stories, creating characters, and writing notes to my friends in which they played a prominent role (or sometimes were the authors of the notes themselves). I've written many plays: one-acts, two-acts, even the outline of a three-act modern tragedy that I still think of going back to - my Citizen Kane or Death of a Salesman (yes, I know I'm being overly dramatic: as a former theatre person, I have a flair for drama, and as a psychology person, I'm painfully self-aware of that and all my other traits).
Of course, I changed my major in the middle of my first semester at college, from theatre to psychology, not realizing that, if I thought getting published as a fiction writer was tough, it was nothing compared to getting published as a psychology researcher. Publish or perish is the expression in my field, and it is accurate. Getting the best jobs, getting research funding, it all depends on having a strong publication record. And with more people earning higher degrees now, there's even more competition. This is one reason the number of PhDs going into post-doc positions has also increased recently; grad school alone is no longer enough to prepare most people for the most attractive research and academic positions.
My number one goal in my post-doc is to publish as much as I possibly can. I even submitted a paper today. But I can't rest on my laurels, because I've got 5 other papers in various stages of preparation. Though my most recent reviews may still sting (and I'm not alone - there's actually a group on Facebook devoted to Reviewer 2, often the most matter-of-fact and even rude of the group), I can't let them traumatize me for too long, because there are more studies to perform, more data to analyze, more papers to write.
That's why, when I read an article in the New York Times about a prominent psychology researcher who admitted that he massaged data, made up findings, and even wrote up studies that were never actually performed - and published it all in prominent journals - I was a bit annoyed. Am I bitter that while I was dealing with snide reviewers insulting my intelligence, research methods knowledge, and mother, this guy was fabricating data, falsifying methodology, and just plain making whole studies up (and getting rewarded for it, however unwittingly on the journals' part)? In a word: yes. But no matter how tough the publishing world was, doing what this guy did was never even an option for me. It's not that I thought this sort of thing doesn't happen; we all know it does, just as we know there are students who hire people to take the SATs or write their theses for them.
I know I'm not the only one who can say that this wouldn't be one of my answers to the difficulty of publishing in this field, and it's not because of a lack of creativity. Whenever we write research proposals, we already have to write the introduction/background and methodology sections; we sometimes even have to write an expected results section. Make that "expected" part disappear, add some statistics, illustrative quotes, whatever, then finish with a discussion/conclusion and voila! Made-up study. And if you're in a field or at an institution where it's normal for someone to conduct and write up a study all by him- or herself, who will ever find out?
Well, apparently someone did, because this guy was caught and confessed, and the whole thing was written up in the New York Times. You can perhaps understand his motivation, and there are surely countless other researchers who have done the same thing and never got caught. And if you're a bit sly about it, your chances of getting caught will likely go down further. So what makes the people who would never do such a thing different?
Anyone who has taken an introductory philosophy class - or who has seen the movie Election - can tell you the difference between morals and ethics. For those who fall into neither of those groups: morals are notions about what is right and what is wrong. Ethics often refers to the moral code of a particular group, and is sometimes used to describe what is considered right and wrong at someone's job or within a certain field. That is, if we say a study was conducted ethically, we mean generally that it was performed in a way that minimizes unnecessary harm, but more specifically, we mean that an overseeing body examined it and decided it abided by the rules established by some even higher-up overseeing body. Psychological ethics clearly state that falsifying data is wrong; it's unambiguous. Stapel can't plead ignorance here.
(Image caption: Sorry, my moral compass appears to be broken today. I'll have to get back to you tomorrow.)
But not everyone avoids doing something because it's wrong. People are at different stages in their moral development; for some, the possibility of getting caught is the deterrent. One of the most well-known theorists on moral reasoning is Kohlberg, who (beginning with his doctoral work at the University of Chicago) developed a taxonomy of six developmental stages. The first two stages apply to children: in stage 1, people are motivated by seeking pleasure and avoiding punishment, and determine morality by what an action gets them in return. Similarly, stage 2 individuals are driven by self-interest, acting in ways that further their own goals and needs; these people behave morally toward others when it also benefits them.
As we move into adolescence and adulthood, we also move into stages 3 and 4. In stage 3, people begin fulfilling societal roles, and behave in ways that conform to others' expectations; it seems the motivating principle here is they, as in "what would they think?" In stage 4, morality is based on legality. Finally, some lucky few move to stages 5 and 6, which Kohlberg considered the highest levels of morality. These individuals are no longer motivated by pleasing others, what is legal/illegal, or even self-interest; instead, they develop universal principles of what is right and wrong, and seek to enforce those principles, even if it means breaking the law or sacrificing their own needs.
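For anyone who likes seeing the whole taxonomy laid out in one place, here's a minimal sketch in Python. The stage numbers and level names follow Kohlberg's scheme; the one-line motivation summaries are just my own shorthand for the paragraphs above, not Kohlberg's wording.

```python
# A minimal sketch of Kohlberg's six stages as a simple lookup table.
# Level names follow Kohlberg; the one-line summaries are my own shorthand.
KOHLBERG_STAGES = {
    1: ("Pre-conventional", "Avoid punishment; morality is what an action gets you"),
    2: ("Pre-conventional", "Self-interest; treat others well when it also benefits you"),
    3: ("Conventional", "Conform to others' expectations ('what would they think?')"),
    4: ("Conventional", "Morality is based on legality and maintaining social order"),
    5: ("Post-conventional", "Social contract; laws exist to serve agreed-upon principles"),
    6: ("Post-conventional", "Universal ethical principles, even at personal or legal cost"),
}

def describe_stage(stage: int) -> str:
    """Return a readable one-line summary for a given stage number."""
    level, motivation = KOHLBERG_STAGES[stage]
    return f"Stage {stage} ({level} level): {motivation}"

if __name__ == "__main__":
    for s in KOHLBERG_STAGES:
        print(describe_stage(s))
```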
But perhaps what it really comes down to is why one became a scientist at all. I like to think I went into this field because I was good at it, but then there are other things I'm good at (perhaps things I'm even better at than this), some that I could have potentially built a career around. I find the field to be challenging, but once again, there are other interesting and challenging fields I could have pursued. As cheesy as it sounds, I really want to make the world a better place and I see my field as one approach to reaching that goal. I'm sure Diederik Stapel had similar reasons for going into this field. Somewhere along the way, that motivation got lost, or at least overpowered by the drive to publish (or perish).
How can we keep people from getting to this point? How can we reward scientific integrity, even if it means fewer publications and a less attractive CV? And most importantly, how can we verify a researcher's findings are valid?
Thoughtfully yours,
Sara