Just a quick post to revisit a topic I've covered before. One of my past blog posts was about two articles, one covering a controversial t-shirt marketed to young girls and the other discussing a study of men's and women's spatial ability in two cultures.
I just read an article about women in science that also provides some support for the idea that women are just as capable as men if given the right environment in which to thrive. It's interesting, though, that the professor in charge of the lab discussed in the article worries that cultivating an all-women lab (even if by accident) may be as negative as the "old boys" labs of the past.
Some research has found that single-gender classrooms are actually beneficial for both male and female students. Of course, at what point should integration happen (because it will have to happen eventually, unless you plan on the workplace also being divided)? And are there any long-term negative consequences associated with single-gender classes? Does it make things difficult when a student finally encounters a member of the opposite gender in an academic setting? Or do these students, because of the lack of gender variability in their classrooms, never learn the notion that gender might be related to academic skills?
It seems, though, that Professor Harbron has every reason to be concerned. After all, even though you could argue, "Male students just don't seem to be interested in joining her lab, so why should we force them?", that argument has been used for a while to rationalize doing nothing to deal with many female students' lack of interest in the STEM fields.
I'm all about encouraging people to "follow your dreams," but at some point, we have to recognize the powerful outside forces that can influence those dreams. As someone who discovered a love of math later in life, I wish I had had someone to help me with my struggles and push me to keep trying. In fact, it seems to me that the best way to encourage students to follow their dreams is to get them to try everything and hold off on deciding what they want to do as a career until it is absolutely necessary for them to decide.
Yes, I know that sounds kind of counter-intuitive, but hear me out. In many other countries, students are tested early on to discover their aptitudes. At some point, educators determine what Joe Student is good at and begin training Joe in that discipline. Sure, our system is not as structured as that. Even if Joe is good at a certain thing, Joe can choose to go into another discipline all on his own. Still, once Joe has decided what he likes, we direct him toward activities and classes that will get him to his goal. And if we think Joe is making a bad choice, we may try to direct him toward something else. But if, instead, we give Joe a taste of all his options without influencing him toward one field over another, and keep doing that until it's time for him to decide, who knows?
You may be saying, "We already do that." But do we? If Jane Student expresses an interest in math, do we encourage her with the same vigor as we do Joe? Do we place equal value on all the different options students have, or do we make casual statements that direct students to one option over another (by suggesting one option is better than others)? If we can get rid of preconceived notions about who is suited for a certain field (and who is not), we can create an environment where students thrive. Perhaps Professor Harbron is right that her lab is no more ideal a set-up than the all-male labs from before. But by examining her lab, and other educational environments, maybe we can discover the best approach.
Thoughtfully yours,
Sara
Saturday, October 22, 2011
Sunday, October 9, 2011
The Need to Personalize: Why Consumer Data is More Important Now than Ever Before
As a long-time researcher, I answer many questions with, "What do the data say?" I consider myself to be a very empirical person, so having data to support my views or my approach to life is very important to me. Even in parts of my life where no data are available, I continue asking questions. And, like most people, I constantly observe other people and draw inferences from their behavior. So when I read about some of the cutting-edge work the Obama campaign is doing with supporter data, I wondered, "Why aren't more politicians doing this?" and, more importantly, "Why aren't more people doing this?"
I'll be the first to say that too much personalization is not necessarily a good thing. For one, I'm really not a fan of design-your-own-major programs and would be happy to go into the "whys" of that sometime. But when it comes to marketing or informing people about causes they can join, personalization is an excellent idea. In fact, it's the logical continuation of what market researchers have been doing for years.
When a company creates a new product, designs a new ad campaign, etc., they want to get consumer perspectives. They do this through things like focus groups, where a group of similar people are brought in to try out a new product or view a new ad and discuss their thoughts as a group (I also frequently use focus groups in my research - you can get a lot of really useful information and they're fun to do!), and survey research, where people may (as an example) hear a new product slogan and rate it on a scale.
Market researchers also often collect demographic information from their participants, things like age, gender, race and ethnicity, and education, to see how these factors relate to responses to the product. This gives some basic information on who is likely to buy your product, and what approaches those groups respond to. A company that wants to appeal to many demographic groups may develop more than one ad campaign and put certain ads in places likely to be seen by certain groups. If you want to see some basic personalized marketing in action, grab a couple of magazines, say a fashion magazine and a sports magazine. Take a look at the ads in each - the ads in the fashion magazine may be for different products than the ads in the sports magazine. Not only that, you may notice that even ads for a product that appears in both magazines differ in color scheme, layout, and copy. You'll probably even notice different people featured in the ads.
The same is true for advertising during certain shows. Market researchers know what kinds of things their target demographic likes to watch on television and will buy ad space during that time.
Of course, I call this "basic" because it's not really personalized to one specific person; it's aimed at a specific group who have some feature or features in common. But advances in technology have made it even easier to gather information about a specific person, and in some cases, deliver that personalized advertising directly to that one individual. Google has been doing this for years. Facebook is also doing more and more of this targeted marketing. Using data from people's search terms or status updates, specific ads are selected from a database and displayed to that person.
Why is this personalization such a good idea? People respond to (that is, process) personally relevant things more quickly. Research shows that people, for instance, show stronger preferences for the letters in their own names, probably because these letters are more familiar and therefore more fluent (easier to process - discussed in a previous blog post). When we're feeling especially miserly with our cognitive resources and are operating on auto-pilot, such highly personalized information can still "get in" and be noticed and processed by the brain.
Personally relevant information also appeals to our sense-of-self, our identity (also discussed in a previous blog post). We view the world through the lens of our personal experiences and details; some people may be better at considering another person's viewpoint than others, but we can never be separate from the self, so our view of the world is always going to be self-centered (and I use that term in a descriptive, rather than judgmental, sense).
Even in theories developed in fields like psychology, we recognize that the perspective of the theorist receives a lot of weight (this is why following the scientific method - developing testable and falsifiable hypotheses and gathering empirical data - is so important; it's also one reason why theories are built from many studies on a subject, hopefully performed by more than one individual, and no single study is the end-all, be-all on the topic).
I remember reading a quote in college by a great psychologist about Freud; after much searching, I could not uncover the quote or its source, but the person (I want to say Gordon Allport, who met Freud briefly when he was 22 and became disillusioned with psychoanalysis as a result) essentially asked of Freud's theory, "What kind of little boy would want to be so possessive of his mother?" That is, he suggested that Freud's theory, specifically its emphasis on the Oedipus complex, was more about Freud himself than about normal human development.
These days, individuals are doing things with computers that were once only reserved for the nerdiest of super-nerds sitting in front of a computer the size of a small house.
[Image: And if we leave it running all night, we could have our checkbook balanced by tomorrow morning! Who has the punch cards?]
The people who are embracing today's technology to personalize content are the ones who will be sticking around as the market becomes more and more saturated. That's why I would argue that this kind of personalization is so important - there are almost 6.8 billion people on this planet, nearly all of whom will make their livelihood off of selling something (whether that something is concrete, like a product, or abstract, like a skill). As the population continues to grow, getting even a minuscule percentage of that total to see what you have to offer and show an interest in buying it is going to take some serious data-crunching ability.
If you sell a product, any kind of product, you should be collecting data on your customers and using their information to make important decisions. And if you've got any kind of computational and statistical know-how, work those skills, because you are going to be sorely needed as the market continues moving in this direction.
True, some people are just born to sell things - they can convince you that you need this product, that your very life depends on having this thing. We can't all be Steve Jobs, walking into a room and presenting the product with such flair and resonance that we all suddenly wonder how our lives were ever complete before the iPod. (Years from now, when people look at the products Jobs was responsible for and think, "Oh, a personal music player, how quaint," they're still going to be watching and dissecting his presentations to find that magic formula. If only it were that easy.)
And perhaps part of Jobs's genius was that he could do some of this data-crunching in his head, examining how people were responding to his presentation in real time and making subtle shifts to bring the message home and get people on board. Few people possess that ability. But for the rest of us, we can perhaps get by with a little help from our data.
Tuesday, October 4, 2011
Whatever They Offer You, Don't Feed the Trolls
You may have noticed that I talk a lot about the media on this blog. The media is one of my main research interests - one that I've retained despite respecializing as part of my post-doc. I find it fascinating. Media information is everywhere and, as people become more and more connected through the Internet and mobile devices, its influence is only likely to grow. Though media research has been conducted since the early 20th century, one area that has really taken off is research on Internet media. This is not only because the Internet is becoming people's main source of information, but also because the Internet is inherently different from other forms of media and is constantly in flux.
Even with reality television taking off like it has, it's not easy to get onto television. Movies, music, and similar forms of media are not as easy to get into either. You need things like talent, good looks, and connections (unless you're Shia LaBeouf - in that case, your celebrity is inexplicable). The Internet, however, is one big free-for-all. Anyone can get online and create a webpage, post a music video on YouTube, start a blog ;). Thanks to the Web 2.0 movement, the tools are available to allow anyone, regardless of technical know-how, to get his or her message out there. Because of the ease with which individuals can add content to the ever-growing World Wide Web, the Internet is constantly changing. New information is becoming available, and new tools are being created to allow individuals even more control over content.
Obviously, there are many aspects of the Internet that are worthy of further exploration, but today, I'd like to write about an Internet phenomenon that has been around probably as long as the Internet itself, and is only becoming worse thanks to Web 2.0: trolling.
[Image: They may look cute and cuddly, but it's best to ignore them.]
Last month, BBC News featured a story about trolling and a few cases in which people were arrested and jailed for it. In these cases, the trolling was truly over-the-top: for example, a young man posted thoughtless remarks on a memorial website for a young woman who had been killed. Still, websites are cracking down on trolling in a variety of ways, such as by requiring comments to be approved before they appear on the site.
Some argue that simply requiring people to register should be sufficient, because people are no longer anonymous.
The argument is that people troll because they are "deindividuated" in a place like the Internet: they can shed their usual identity and adopt a new persona, which research suggests can lead to bullying and outbursts of violence. This is the phenomenon behind "mob mentality", where people in a large crowd can begin engaging in antisocial behavior, such as vandalism, physical assault, etc. So take away the opportunity to hide one's identity, and problem solved, right?
Yes, I tend to ask questions right before I'm about to argue the opposite. I'm totally predictable. :)
Let's be honest, other than my friends who read this blog (and perhaps my entire readership is made up of my friends, but hey, that makes you guys great friends :D), do you honestly know me? I'm not anonymous here; you can find out my name, my general location, my education and background. I drop autobiographical statements here and there. Still, your entire experience with me is online. I could be totally different here than I am in real life.
So what's to stop me from adopting an entirely new personality for my online interactions, one that differs markedly from the "real" me? And what's to stop me from adopting the persona of a thoughtless jerk who trolls message boards trying to get a rise out of people? (This is just hypothetical, BTW; I have never, say, implied a singer sucked on the comment board of their YouTube video… or anything like that.) Honestly, even on a site like Facebook, where it's super-easy to figure out who I am (and getting easier each day), I'm more apt to call someone out on something I would never utter in person.
I suppose if someone says something truly awful, requiring registration would make it easy to track them down for disciplinary (and even legal) action. But just as with other forms of bullying and harassment, there is always a gray area where the behavior, though repugnant, is not punishable. Even online behavior that led to a person's death went unpunished, for fear that punishing it would set a precedent making the creation of false online identities illegal. And as that case showed us, even requiring a person to register doesn't guarantee they are who they say they are. And requiring comments to be approved before they appear leaves too much room for bias; couldn't the moderator simply accept the comments with which he or she agrees and reject the rest?
Perhaps the issue, then, is not deindividuation, but distance from the target. Stanley Milgram, who conducted one of the most unethical (aka: most awesome) social psychology experiments ever, found that people were more likely to follow the experimenter's instructions to shock another participant when they were farther away from the person getting shocked. On the other hand, if people were in the same room as the participant being shocked, they were much less likely to follow the experimenter's orders.
If the issue really is distance from the target, then we'll always have this issue in Internet-mediated communication. In fact, as people spend more and more time communicating with people via the Internet, the problem is only likely to worsen. Can we ever get away from the trolls? Other than "not feeding them" - and seriously, DON'T FEED THE TROLLS - what can we do to prevent trolling?
Thoughtfully yours,
Sara
Tuesday, September 27, 2011
When Thinking Feels Hard: New Layouts, New Features, and New Thought Patterns
It feels like the Internet landscape is changing. Recently, Facebook unveiled its new look (as well as some additional features that have prompted more than a few users to express concerns about privacy). As with any new Facebook roll-out, people are complaining. The news feed has been replaced with Top Stories and Not So Top Stories (okay, that's not the actual term, but that's what it comes down to). On the side, users not only have the chat list that has been available for some time, but also a Twitter-style feed (dubbed the ADD bar) giving real-time updates from friends, friends of friends, and the occasional random friend of a friend of a friend.
As people have pointed out, the people who complain about a Facebook update are likely the same people who complained about the update before that, and before that, perhaps suggesting that some people like to complain. But just like Dr. Gregory House thinks everyone lies, I say, "Everyone complains", at some time or another. I think there's more to these layout changes than predisposition to complain. That's right, ladies and gents, I'm talking about the situational influences - notice a theme here? :)
You may also have noticed the look of this blog has changed. 'Tis the season. It seems like as a child, I would always want to reinvent myself in the Fall. Perhaps the same is true for websites. But how might these changes influence our perceptions?
A few years ago, I had the pleasure of listening to researcher Norbert Schwarz give a talk at my grad school alma mater, Loyola University Chicago. If you've never checked out his research, you definitely should; visit his homepage here. Not only is he incredibly friendly and funny, his research, while definitely theory-driven, is incredibly applicable to a variety of social situations. Or maybe it's that he's really good at taking theories and applying them to a variety of situations; either way, I would love to be able to do that. Theory is not my strong point.
As people have pointed out, the people who complain about a Facebook update are likely the same people who complained about the update before that, and the one before that, perhaps suggesting that some people just like to complain. But just as Dr. Gregory House thinks everyone lies, I say everyone complains, at some time or another. I think there's more to these layout changes than a predisposition to complain. That's right, ladies and gents, I'm talking about the situational influences - notice a theme here? :)
You may also have noticed the look of this blog has changed. 'Tis the season. As a child, I always seemed to want to reinvent myself in the fall; perhaps the same is true for websites. But how might these changes influence our perceptions?
A few years ago, I had the pleasure of hearing researcher Norbert Schwarz give a talk at my grad school alma mater, Loyola University Chicago. If you've never checked out his research, you definitely should; visit his homepage here. Not only is he incredibly friendly and funny, but his research, while definitely theory-driven, is also highly applicable to a variety of social situations. Or maybe it's that he's really good at taking theories and applying them to a variety of situations; either way, I would love to be able to do that. Theory is not my strong point.
One area Schwarz has studied a great deal is metacognitive experiences, basically thinking about thinking, and how we use cues from our thinking to influence the way we think. Wow, that made so much more sense in my head. Okay, how about a concrete example? Let's say I show you an ad for a car, then ask you to come up with a list of 10 reasons why you should buy that car.
Go ahead and get started on your list; I can wait.
Unless you know a lot about the car or I offered a really great option to consider (Batmobile anyone?), you probably had a lot of trouble coming up with a list of 10 items. You might use that cue, "Wow, thinking of 10 items was really hard" to tell you something about whether you really want to buy the car. That is, because thinking felt difficult, you took that as a cue to mean the thing you were considering was not that good. Schwarz refers to this perceived ease/difficulty as "processing fluency".
Schwarz has shown that processing fluency can be manipulated in many ways, such as by using a difficult-to-read font or by asking participants to remember a large number of very specific personal events (such as 12 times you behaved assertively). Familiarity works too; more familiar things are easier to process.
Now obviously, we don't always need thinking to feel easy. Sometimes we encounter things to which we want to devote our full cognitive effort. But as I mentioned in a previous blog post, we're cognitive misers: we're choosy about how we spend our cognitive resources. If we're asked to learn a new software package for work, for example, we might be willing to devote the effort (there are a lot of other variables operating, but this is just a for-instance). Facebook, on the other hand, is a leisure-time activity, and many people who aren't high need-for-cognition folks would rather have fun without thinking too hard.
But people continue to use Facebook, and though some users have likely split recently, Facebook currently has 750 million members (according to Google population data, the Earth's population is currently 6,775,235,700, so that means about 1 of every 9 people uses Facebook). Perhaps processing fluency is not the only issue at work here; the very nature of the social networking site is, well, it's social. Your friends are there, and in some cases, it might be your only opportunity for interaction. That might make some people unlikely to leave (of course, since Google+ is now open to the public, the landscape may continue to shift).
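That 1-in-9 figure checks out, by the way. Here's a quick back-of-envelope check (a minimal sketch using only the membership and population numbers quoted above):

```python
# Sanity-check the "about 1 in 9" claim using the numbers quoted in the post:
# 750 million Facebook members, world population of 6,775,235,700.
facebook_members = 750_000_000
world_population = 6_775_235_700

people_per_member = world_population / facebook_members  # roughly 9.03
print(f"About 1 in {round(people_per_member)} people uses Facebook")
```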
For those who left Facebook, I'd love to hear your reasons (in comments below), even if you left long before the recent update. For those who stuck around, don't worry; eventually you'll get used to the new look and thinking won't feel so difficult... just in time for the next update.
Thoughtfully yours,
Sara
Friday, September 23, 2011
Pachelbel and Queen: The Psychology of Music Cravings
Cravings: everyone gets them at one time or another. Food cravings make sense (well, some of them). Your body often craves what it needs more of, though apparently some cravings for one thing (e.g., sugar) are actually a sign your body is lacking something else (e.g., magnesium). And while your taste buds may be fooled (Mmm, thanks for that Diet Coke, sweet nectar of life), your body is not (Artificial sweetener? Nice try, I still want sugar), which some research suggests is why drinking diet soda doesn't actually help if you're trying to lose weight.
But today, I experienced another craving, one I've had before but never really considered until now: I found myself craving certain music. As I was listening to one album (Rachmaninoff's All-Night Vigil, admittedly one I've been listening to a lot recently because my choir will be performing it), I found myself craving (for lack of a better word) another work, Carmina Burana, which differs in style, instrumental support, and subject matter: while the Rachmaninoff is a series of a cappella songs based on chants from the Eastern Orthodox tradition (read: quite religious), Carmina Burana is a celebration of sex, drugs, and rock n' roll (well, perhaps not rock n' roll, but the other two most definitely), and of course, fortune. One of the most well-known movements from Carmina Burana is "O Fortuna", a piece you've definitely heard; it's often used in advertisements, even though the lyrics "O fortune, like the moon, you are changeable…" are perhaps not well-suited for the car, movie, and football game ads I constantly see it used in. But people keep using it because it sounds epic, like you're talking about something really important and awesome. Well, it does if you don't understand Latin. But I digress.
Read a list of popular culture uses of O Fortuna.
Obviously, when you crave food, it fulfills a bodily need; even if the food you crave isn't all that good for you (And really, are any food cravings for things that are actually healthy? I don't know about you, but I rarely crave carrots.), it still provides calories your body needs to run all of its systems. In fact, pretty much any craving you can think of fulfills a survival need of some kind. But what need does music fulfill? Apparently, I'm not the only one who has considered this question. There are textbooks, articles, and even whole scholarly journals devoted entirely to the psychology of music. Researchers have examined how music preferences relate to personality, the factors that explain why someone likes a particular piece of music, and how music affects mood, to name a few.
Even researchers outside of music psychology recognize the power of music to influence your mood. One of the most popular manipulations for studies on the effects of mood is to have participants listen to a piece of music known to elicit certain feelings: happiness, sadness, etc. But research directly on music is far rarer (though becoming more common thanks to all these great publication outlets). Peter Rentfrow and Sam Gosling, two personality researchers who also frequently examine music in their work, noted in this article that of the 11,000 articles published in top social and personality psychology journals between 1965 and 2002, music was an index or subject term in only 7. The few studies on these topics find that music is related to personality (such traits as sensation-seeking - also implicated in enjoyment of horror movies - as well as extraversion, warmth, conservatism, and psychoticism), social identity (something I've also blogged about before), and physiological arousal (though Rentfrow and Gosling's brief review of this subject still touches a lot on personality).
Preference, of course, refers to what types of music you enjoy. Personally, I enjoy many different styles of music, everything from orchestral and choral music (what many refer to as "classical") to piano-driven pop to classic rock to blues (my background while writing most of this blog post was the music of Stevie Ray Vaughan). But at certain times, I may prefer to listen to one style of music, or one particular artist, or even one particular song. What determines that minute-to-minute preference? One study (Ter Bogt, Mulder, Raaijmakers, & Gabhainn, 2011, abstract here) recently published in Psychology of Music may offer some explanation. The researchers categorized their participants by self-rated importance of and involvement with music: high-involved, medium-involved, and low-involved listeners. High-involved listeners were the eclectic types - they liked a broad range of music styles, experienced a great deal of positive affect while listening, and reported that music served a variety of purposes in their lives: dealing with stress, constructing their identity, relating to others, and enhancing their mood. The other two groups showed narrower music preferences and considered music a less integral part of their lives.
Relatedly, Saarikallio (2011 - abstract here) argued that music was important for emotional self-regulation, and performed interviews with people from a variety of age groups and levels of involvement with music, finding that reasons for listening to music included mood maintenance, distraction, solace, revival, and psyching up; reasons were quite consistent across age groups.
Though these studies don't tackle the topic directly, they suggest that people may select music from their internal list that they think will serve whatever purpose they're addressing (such as coping with stress or maintaining a happy mood). High-involved listeners have a larger internal list, and also get a great deal out of listening to music, so they probably do it regularly and may have certain songs in mind for particular needs. For example, I worked for a year in downtown Chicago, commuting from the north side. Though I had the El (that's the Elevated Train for non-Chicagoans) and didn't have to deal with Chicago drivers (shudder), the El was often crowded early in the morning and people were none too friendly. (I spent part of this time commuting with a broken arm and did anyone offer me a seat? No, but that's a blog post for another day.) My solution for all the negativity: sublime choral music, especially Bruckner's Mass in e minor; just out-of-this-world gorgeous. Made my commute so much more bearable.
There are countless other examples of studies examining the relationship between traits and music preference, whether those traits use personality theory terms (extraversion, openness to experience, etc.) or ability terms (musical abilities, etc.). Over the course of the literature I've read, I've had my ego stroked (you like lots of types of music and use music for emotional needs because you're a good musician - yay!) but also knocked down a bit (people who use music in those ways tend to have lower IQs - aw, man). But what about this notion of need? What need does music fulfill and can that explain the issues of musical cravings?
So who do I turn to when I want to examine human needs? Maslow, of course! Why didn't I think of it sooner? (See aforementioned finding about IQ.) Most people who have taken introductory psychology know Maslow as the guy who created the hierarchy of needs.
Look familiar? According to Maslow, human needs fit into one of five levels of the pyramid. Needs at the bottom of the pyramid are most important - without them, we don't really consider the needs higher up on the pyramid, because we're busy trying to fulfill those basic needs. So in order of importance, our needs are Physiological, Safety, Love/Belonging, Esteem, and Self-Actualization.
Arguably, music would be one of the Self-Actualization needs, falling perhaps under the sub-need of creativity. How, then, can we explain the ubiquity of music in so many human cultures, even cultures that struggle to meet needs at the base of the pyramid? Of course, even though music is present in nearly all human cultures, there are certainly individual differences in the importance of music (as the studies above showed). Perhaps for some people, then, music falls on a lower, more integral part of the pyramid. For instance, Ter Bogt et al. (above) found that the high-involved music-lovers considered music to be an important part of their identity (both individually and socially). These individuals might place music, then, with Esteem or even Love/Belonging. In fact, music might fall in more than one place, and be used in fulfilling a variety of levels of needs; that might even explain why certain music is more appealing at certain times.
I'm sure I'm not the only one who experiences these musical cravings. Even so, despite services that introduce you to new music based on other music preferences, there doesn't seem to be anyone trying to measure and use these minute-to-minute variations to make a buck. Perhaps because when you get right down to it, no one completely understands it beyond knowing it exists.
Thoughtfully yours,
Sara
Wednesday, September 21, 2011
Keep Telling Yourself It's Only a Movie: The Psychology of Horror Movie Enjoyment, Part 2
As I said in my last post, I love horror movies. I set out to explore some reasons why I (and many other people) might love movies that others might find disturbing. Little did I know, this search would turn up so much information, in addition to my personal notes on the subject, that 1) my first blog post was quite long and 2) I still had to cut it off and add the ever-annoying "To Be Continued". Who knew that something so trivial could bring up so much relevant information in the scientific community? Well, I did, but that was what we call a rhetorical question, dear reader.
So what is it that sets horror movie lovers apart from others? Last time, I wrote about the sensation-seeking personality, which includes love of horror movies in a long list of traits held by people who seek out thrills in all the wrong places and suffer from an insatiable case of neophilia (and if you thought, "Ew, they like dead people?!" dear reader, read that word again). As I said, though, I hesitate to accept something so clinical. As a social psychologist, I try to look for situational explanations as well. And as a recovering radical behaviorist (hi, my name is Sara, and I'm a Skinner-holic), I try to think of how learning and patterns of reinforcement & punishment might have shaped a behavior.
Love of horror movies, for instance, seems to be correlated with gender: men are more likely to enjoy them than women. Of course, you could make the case that there are innate, biological differences between men and women that make them respond differently to these images, but behavior shaping and reinforcement could also explain some of these differences. As children grow up, they go through what is called gender socialization: they learn how to be little boys and little girls.
One way this happens is through selective reinforcement. We reinforce (through our verbal and behavioral responses) when little girls play with dolls and little boys play with trucks; we may not necessarily reinforce when little boys play with dolls and little girls play with trucks. We often reinforce when little boys play rough, but sometimes even punish little girls when they play rough.
Gender stereotypes can become so internalized that we even get the kids to do the dirty work for us.
Even parents who insist they don't want to reinforce gender stereotypes with their kids may inadvertently reinforce gendered behaviors: a parent may not "have a problem" with Jr. playing with dolls, yet may say nothing and not join in during that kind of play, while responding positively and joining in when Jr. plays with a truck.
As a behaviorist (oops, I mean recovering behaviorist), I love observing these kinds of interactions; people often fail to realize what kinds of behaviors they're rewarding.
So you could argue gender socialization influences movie choice. When children pick out a movie to watch at the theater, at home, etc., parents always have the option of saying, "No, not that one. How about something else?" Do that often enough, and with certain movie selections, and over time children learn what sorts of movies they should be watching (and what they should avoid – or at least, what movies they should be watching when the parents are around; people constantly test to see what they can "get away with").
The problem with explaining a behavior with reinforcement and punishment is that the definition is circular, and it's difficult to get to the root cause. What is a reinforcer? Something that reinforces, that is, makes a behavior more likely to occur again. The thing is defined by the effect it has on behavior, and if it doesn't have that effect on behavior, it was never that thing to begin with (do you see why I say recovering behaviorist? I love this subfield of psychology, but it definitely gives me a headache). You can never get away from individual differences here, because something that is reinforcing for one person may be neutral or punishing for another. Individual differences are fine, of course (I love including some of these variables in my studies), but where do they come from? Biology, perhaps? Very early experiences? Start thinking too much about this, and Watson's insistence that "I can shape anyone to be anything I want, mwa ha ha" and Skinner's "innate schminnate" attitude start to unravel, and the only way to repair it is to weave it with – gasp – cognitive psychology, evolutionary psychology, and psychodynamics. Oh, the horror.
Of course, timing is key when it comes to reinforcement and punishment. We know that for a reinforcer to be truly reinforcing, and a punisher to be truly punishing, it should follow quickly after the target behavior; this is one of the basic tenets of behaviorism. To take this one step further, something may only work if it is delivered at a certain time. Perhaps one has to be in a certain state of mind for a horror movie to be enjoyable, and if you see a movie during that proper window, you're more likely to seek it out again. Your mind adapts to enjoy these images (there is evidence that experiences actually change your brain physically, and that early stimulation may have long-term implications for brain development and, therefore, things like intelligence), and you become, over time, a horror movie fiend.
If you first view a movie during one of those off times, your response may be disgust, a response that strengthens over time. This is similar to something I’m exploring for another blog post (coming soon!) about cravings. This argument still depends very much on inner states, like mood, but considering that other research has established things like mood influence our decision-making and processing of information, it stands to reason that mood could influence whether something is reinforcing or not. And also considering we know that physiological states influence whether something is reinforcing or not (a cookie isn’t very rewarding if you’re not at all hungry), it doesn’t seem like a huge leap to weave these two lines of thinking together.
Hard to believe I once spouted Skinner constantly – Skinner, the man who said, “So far as I’m concerned, cognitive science is the creationism of psychology. It is an effort to reinstate that inner initiating or originating creative self or mind that, in a scientific analysis, simply does not exist.” Ah, Skinner, you were a brilliant man, but just because you can’t take something out and dissect it, or put it in an operant chamber and train it to press a bar, doesn’t mean it isn’t real or important. Not to mention, with advances in technology and scientific methods, something that was once unobservable, and therefore untestable, unfalsifiable, and unscientific, often becomes the subject of routine scientific study.
If we keep moving forward through the 20th-century history of behaviorism, we come to Albert Bandura and his work on vicarious learning, imitation, and modeling. We learn a lot by watching others and choose role models to imitate (something Thorndike began studying decades before Bandura, though he found little to no evidence in his studies of animals and concluded such learning did not exist). It's possible, then, that love (or hatred) of horror movies develops in part from who we aspire to be and from observing the responses of others. This could even explain why behaviors not in line with gender stereotypes get shaped and reinforced; a little girl may learn to behave in a "boyish" way by observing and imitating little boys (perhaps one reason that little girls with lots of brothers become "tomboys" themselves – I use the quotes because I personally hate these terms, but sometimes there is no better way to explain something).
To use a personal analogy, I grew up with a brother and mostly male cousins, so perhaps my early models were mostly male. And though I did engage in some “girly” play behaviors, I would also play with action figures and train sets with my brother, and we would often watch TV together: lots of He-Man, Justice League, and even WWF. (I was traumatized when I learned pro wrestling was fake. Hulk Hogan, I still remember that sobbing letter I wrote to you after you “broke your back”. Hope you had a nice vac-ay, you liar!)
Once again, we come back to the original dilemma, and to play devil’s advocate, you may be asking, “Okay, but where did the behavior originally come from? If we’re imitating, where does the imitated behavior come from? What makes parents want to reinforce gendered behavior? Which came first, the chicken or the egg? And you really wrote a get-well letter to Hulk Hogan? Dork.” Yes, yes, I know. I suppose we can never truly get away from the cognitive, personality, and clinical arguments. But depending on them entirely seems as misguided as depending entirely on contingencies of reinforcement as the sole explanation for behavior (sorry Skinner). Taken together, I think we… well, I think we perhaps created more questions than we started with, but isn’t that what good science is all about? What do you think?
Thoughtfully yours,
Sara
So what is it that sets horror movie lovers apart from others? Last time, I wrote about the sensation-seeking personality, which includes love of horror movies in a long list of traits held by people who seek out thrills in all the wrong places and suffer from an insatiable case of neophilia (and if you thought, "Ew, they like dead people?!" dear reader, read that word again). As I said, though, I hesitate to accept something so clinical. As a social psychologist, I try to look for situational explanations as well. And as a recovering radical behaviorist (hi, my name is Sara, and I'm a Skinner-holic), I try to think of how learning and patterns of reinforcement & punishment might have shaped a behavior.
Love of horror movies, for instance, seems to be correlated with gender. Men are more likely to enjoy them than women. Of course, you could make the case that there are some innate, biological differences between men and women that make them respond differently to these images, but behavior shaping and reinforcement could also explain some of these differences. When children start growing up, they begin going through what is called gender socialization: they learn how to be little boys and little girls.
One way this happens is through selective reinforcement. We reinforce (through our verbal and behavioral responses) when little girls play with dolls and little boys play with trucks; we may not necessarily reinforce when little boys play with dolls and little girls play with trucks. We often reinforce when little boys play rough, but sometimes even punish little girls when they play rough.
Gender stereotypes can become so internalized that we even get the kids to do the dirty work for us.
Even parents who insist they don't want to reinforce gender stereotypes with their kids may inadvertently reinforce gendered behaviors; a parent may not "have a problem" with Jr. playing with dolls, but they may not say anything or join him during that kind of play, while responding positively and joining in when Jr. plays with a truck.
As a behaviorist (oops, I mean recovering behaviorist), I love observing these kinds of interactions; people often fail to realize what kinds of behaviors they're rewarding.
So you could argue gender socialization influences movie choice. When children pick out a movie to watch at the movie theater, at home, etc., parents always have the option of saying, "No, not that one. How about something else?" Do that often enough with certain movie selections, and over time children learn what sorts of movies they should be watching (and what they should avoid – or at least, what movies they should be watching when the parents are around; people constantly test to see what they can “get away with”).
The problem with explaining a behavior with reinforcement and punishment is that the definition is circular, and it's difficult to get to the root cause. What is a reinforcer? Something that reinforces, that is, makes a behavior more likely to occur again. The thing is defined by the effect it has on behavior, and if it doesn't have that effect on behavior, it was never that thing to begin with (do you see why I say recovering behaviorist? - I love this subfield of psychology but it definitely gives me a headache). You are never able to get away from individual differences here, because something that may be reinforcing for one person may be neutral or punishing for another. Individual differences are fine, of course (I love including some of these variables in my studies), but where do they come from? Biology, perhaps? Very early experiences? Start thinking too much about this, and Watson's insistence that, "I can shape anyone to be anything I want, mwa ha ha" and Skinner's "Innate schminnate" attitude start to unravel, and the only way to repair it is to weave it with – gasp – cognitive psychology, evolutionary psychology, and psychodynamics. Oh, the horror.
Of course, timing is key when it comes to reinforcement and punishment. We know that for a reinforcer to be truly reinforcing, and a punisher to be truly punishing, it should happen quickly after the target behavior; this is one of the basic tenets of behaviorism. To take this one step further, something may only work if it is delivered at a certain time. Perhaps one has to be in a certain state of mind for a horror movie to be enjoyable, and if you see a movie during that proper window, you’re more likely to seek it out again. Your mind is shaped to enjoy these images (there is evidence that experiences actually change your brain physically, and that early stimulation may have long-term implications for brain development and, therefore, things like intelligence), and you become, over time, a horror movie fiend.
If you first view a movie during one of those off times, your response may be disgust, a response that strengthens over time. This is similar to something I’m exploring for another blog post (coming soon!) about cravings. This argument still depends very much on inner states, like mood, but considering that other research has established things like mood influence our decision-making and processing of information, it stands to reason that mood could influence whether something is reinforcing or not. And also considering we know that physiological states influence whether something is reinforcing or not (a cookie isn’t very rewarding if you’re not at all hungry), it doesn’t seem like a huge leap to weave these two lines of thinking together.
Hard to believe I once spouted Skinner constantly – Skinner, the man who said, “So far as I’m concerned, cognitive science is the creationism of psychology. It is an effort to reinstate that inner initiating or originating creative self or mind that, in a scientific analysis, simply does not exist.” Ah, Skinner, you were a brilliant man, but just because you can’t take something out and dissect it, or put it in an operant chamber and train it to press a bar, doesn’t mean it isn’t real or important. Not to mention, with advances in technology and scientific methods, something that was once unobservable, and therefore untestable, unfalsifiable, and unscientific, often becomes the subject of routine scientific study.
If we keep moving forward through the 20th-century history of behaviorism, we come to Albert Bandura and his work on vicarious learning, imitation, and modeling. We learn a lot by watching others and choose role models to imitate (something Thorndike began studying more than half a century before Bandura, but he found little to no evidence in his studies of animals and concluded such learning did not exist). It’s possible, then, that love (or hatred) of horror movies develops in part from who we aspire to be and from observing the responses of others. This could even explain why behaviors not in line with gender stereotypes get shaped and reinforced; a little girl may learn to behave in a “boyish” way by observing and imitating little boys (perhaps one reason that little girls with lots of brothers become “tomboys” themselves – I use the quotes because I personally hate these terms, but sometimes there is no better way to explain something).
To use a personal analogy, I grew up with a brother and mostly male cousins, so perhaps my early models were mostly male. And though I did engage in some “girly” play behaviors, I would also play with action figures and train sets with my brother, and we would often watch TV together: lots of He-Man, Justice League, and even WWF. (I was traumatized when I learned pro wrestling was fake. Hulk Hogan, I still remember that sobbing letter I wrote to you after you “broke your back”. Hope you had a nice vac-ay, you liar!)
Once again, we come back to the original dilemma, and to play devil’s advocate, you may be asking, “Okay, but where did the behavior originally come from? If we’re imitating, where does the imitated behavior come from? What makes parents want to reinforce gendered behavior? Which came first, the chicken or the egg? And you really wrote a get-well letter to Hulk Hogan? Dork.” Yes, yes, I know. I suppose we can never truly get away from the cognitive, personality, and clinical arguments. But depending on them entirely seems as misguided as depending entirely on contingencies of reinforcement as the sole explanation for behavior (sorry Skinner). Taken together, I think we… well, I think we perhaps created more questions than we started with, but isn’t that what good science is all about? What do you think?
Thoughtfully yours,
Sara
Friday, September 16, 2011
Keep Telling Yourself It's Only a Movie: The Psychology of Horror Movie Enjoyment, Part 1
The weather is turning colder and fall is just around the corner. There are many things to love about fall. As much as people love summer activities, the heat begins to wear on many of us; the cooler days of fall are usually a welcome change by summer's end. Fall clothes are also some of my favorites; say goodbye to shorts and flip-flops and hello to scarves, sweaters, jackets - there's just something cozy about the layers, the neutral colors. Speaking of colors, fall leaves... need I say more? Yes, fall is wonderful for many reasons, including one more: Halloween.
Why Halloween? For one: It's the one time of year that it is acceptable to watch horror movies (I watch them pretty much year-round, but this is the one time that it's totally acceptable to talk about these movies and invite people over to watch them with you). I love horror movies. Despite (or perhaps because of) being slightly traumatized watching a scene from Children of the Corn when I was 4 or so years old, a scene I remember quite vividly, I grew up to be a horror movie fiend. When my family made its visit to the local video store, I'd want to check out movies from the A Nightmare on Elm Street series (I've seen them all, even, shudder, Freddy vs. Jason; Part 3: Dream Warriors is my favorite for blending scary with funny – generally on purpose). On rainy days when we couldn't play outside, I'd turn off all the lights, block out the windows, and watch The Exorcist in the dark. Alone, because no one in my family wanted to join me. And it wasn't just movies: I was a voracious reader, and though I would read pretty much anything from the fiction or nonfiction sections of the library, Stephen King was (and still is) one of my favorite authors.
Many believe that people who love scary movies don't find them scary. I don't know about others, but I find them terrifying. I'll sometimes have trouble sleeping after a really good one, like the first time I saw Poltergeist, and, especially when I'm alone, I'll find myself imagining all kinds of creepy things and explanations for weird noises. I probably find them just as scary as people who refuse to watch them (and I know a lot of people who fall into this camp, including my family, especially my mom).
But still, I love them. I own many, one of the first things people notice when looking through my movie collection. I talk about them to anyone who will listen; yes, I'm that person who still goes on about that awesome scene in that one movie that came out the year I was born but didn't see until I was 10 (aka: The Thing). When someone talks about zombies, I feel we have to clarify, "Are these slow-moving Night of the Living Dead zombies? Or crazy fast 28 Days Later zombies? Or somewhere in the middle Walking Dead zombies", because it’s an important distinction. And yes, I’m also that obnoxious person who thrives on horror movie trivia: did you know that A Nightmare on Elm Street (movie 1) took place in Springwood, California, but suddenly in later movies, they reveal they’re in Springwood, Ohio? That’s movie magic for you; they moved an entire city across the country.
So what explains my love of movies that would leave my mom insisting she sleep with the lights on surrounded by crosses, garlic, and silver? I started doing some research on this, and it seems there are many psychologists and communication experts who have sought to explain this very thing. There's actually a lot here, and a lot of commentary that I think is necessary, so this is the first of two parts on this topic.
One explanation is the notion of "sensation-seeking". Some people, for instance, are high sensation seekers. According to Jonathan Roberti, clinical psychologist and expert in sensation-seeking (read a review of his here), these individuals thrive on experiences that leave them emotionally and physically high. Not only would high sensation seekers be more interested in seeking out thrills and other emotional highs through the media (they apparently enjoy horror movies), they also are at increased risk for other "thrilling" activities, like drug use, excessive gambling, and casual sex. (Hmm, this isn't really sounding like me, but let's continue exploring.) Furthermore, high sensation seekers thrive on novelty, always seeking out new experiences, and are willing - and perhaps, prefer - to take risks to achieve these thrills; they're likely to select careers that allow them to take risks and experience new things - forensic identification, aka profiling, is one career they tend to express an interest in - and are prone to boredom.
Some research suggests that people have different brain responses to stimuli, and that some people, called "high stimulation seekers" (sounds like high sensation seekers to me, but different terms for different folks) experience activation in areas of the brain associated with reinforcement and arousal when exposed to intense stimuli, whereas others experience activation in emotional areas of the brain. While the researchers did not measure these brain reactions in response to horror movies, it may be that my brain responses are what differentiate me from my mom. While these images are reinforcing and psychologically arousing to horror movie lovers, they're upsetting to others.
But I hesitate to leave it at that. It seems that this explanation is more clinical, referring to enjoyment of horror movies as part of an additional diagnosis or, at the very least, a personality type. I hesitate to accept this explanation alone, because it clinicalizes (if that is in fact a word - and if not, it should be) something that could be considered normal. Additionally, though this research suggests there is a correlation between enjoyment of horror movies and these other behaviors, the one thing that makes horror movie viewing different from these other things is the notion of risk. There are risks of bodily injury or death involved in these behaviors, even something as commonplace as riding a roller coaster (though the odds are very, very low). The only risk involved with watching horror movies is that you might get a little freaked out, which could be quite traumatizing for people who dislike horror movies but probably not people who seek them out (either way, remember the advice given to viewers of Last House on the Left: "to keep from fainting, keep telling yourself 'it’s only a movie'"). As Roberti points out, however, not all of their behaviors are related to risk; they also enjoy trying new things in general, in terms of art, music, and sports; they score high on the personality dimension, "Openness to Experience".
Still, there are many good reasons to avoid clinical explanations when other explanations are just as likely, if not better. One early example of the dangers of over-clinicalizing is a study by Rosenhan, aptly titled "On Being Sane in Insane Places". In this study, Rosenhan and seven colleagues got themselves committed to different mental hospitals by each meeting with psychiatrists and informing them they were hearing voices saying the words "Empty", "Hollow", and "Thud". All but one were diagnosed with schizophrenia (the final patient was diagnosed with bipolar disorder, or “manic depression” as it was called then) and committed. After reporting to an inpatient psychiatric ward, these "pseudopatients" stopped faking any symptoms and began acting as they normally would to see how long it would take for facility staff to notice. Their stays ranged from 7 to 52 days, with an average of 19 days.
So, they had time to kill, and thought, "Let's have some fun". What's a social psychologist's idea of fun? Doing research. While they were inpatient, they had an interesting opportunity to observe the various happenings: patients' behaviors, providers' behaviors, and most importantly, providers' assessments of patients' behaviors. They found that once a patient had been labeled with a clinical diagnosis, his or her behavior was often interpreted in line with that diagnosis, even when situational explanations were perhaps better. For example, patients spent a lot of time hanging out in the cafeteria waiting for meal-time, which providers attributed to things like "oral fixation", but which the researchers thought was more likely because there's not much to do in a mental hospital, and eating is one regular activity that breaks up the monotony. Even the researchers' note-taking behavior was attributed to their diagnoses. [Interesting side note: while the providers never caught on that the pseudopatients were not actually mentally ill, 35 of the 118 other patients caught on rather quickly, and would ask the researchers things like, "Why are you here? You're not crazy."]
As a social psychologist, I think we should at least consider situational explanations for these phenomena. That's for next time. To be continued...
Thoughtfully yours,
Sara