I didn't think, when I started this blog, that I would even bother responding to celebrity quotes. True, I could probably blog forever and a day about the things celebrities utter in interviews, on their Twitter page, etc. - in fact, there are many successful blogs devoted to just that topic. In a recent interview, however, Mila Kunis talked about weight loss. Since weight management is one of my areas of research, I felt I needed to respond -- plus, I was looking for a good reason to talk about weight management research on here.
Essentially, Mila said that people who are “trying to lose weight” and are unsuccessful are simply not trying hard enough. What prompted her to reach this conclusion is the fact that she was able to lose 20 pounds for her role in Black Swan, a substantial amount, considering she normally is very thin. Of course, what Mila said is problematic for a few reasons.
Even when an individual is successful at losing weight through a program, weight regain in the time following the program is very common; most people gain back two-thirds of the weight within a year, and nearly all of it within five years. Why? Because sudden and drastic changes are difficult to maintain. That’s one reason you’ll find that, for many people, losing weight, especially with “fad diets”, is easy but maintaining weight loss is difficult. The approaches celebrities often take to lose weight for a role definitely work over the short term, but they are rarely sustainable. Look at celebrities who did not lose weight for a role, but who did so because their weight was unhealthy – for example, Oprah Winfrey (whose weight has often been the target of comedians) lost 67 pounds on a liquid diet and unveiled her new look on her show while pulling a wagon of fat… only to regain much of that weight later. In fact, within a week of going off the diet, she had gained 10 pounds. Such low-calorie diets cannot be maintained for very long, and there’s a good reason for that: even in cases where a doctor has prescribed a very low calorie diet (an approach taken only for patients who are severely obese), the patient has to (or is supposed to) be under close medical supervision.
Whatever changes you make to lose weight, whether it is diet, exercise, or some combination of both, they have to be changes you’re willing to maintain over the long-term, or your chances of regaining the weight are high.
My predominant concern when I see celebrities losing large amounts of weight, and talking about how easy it is and how “anyone can do this”, is that it creates unrealistic expectations. In fact, a lot of research shows that people entering weight loss programs already come in with highly unrealistic expectations.
Furthermore, telling people they’re going about weight loss in the “wrong way” and to “try harder” doesn’t instruct them on how to lose weight effectively and in a healthy way. This is probably one reason that media coverage of celebrity weight loss and the constant messages about what people are “supposed to look like” can lead to disordered eating and other maladaptive behaviors. Figuring out how to lose weight is pretty intuitive – cut down on calorie intake and/or increase physical activity – but the approaches one needs to take to lose weight healthily are definitely not intuitive. Even if someone decides to do some research into losing weight, there are many sources of information, some teaching really unhealthy approaches. A lot of people don’t even realize what disordered eating means, thinking that, as long as they aren’t starving themselves or forcing themselves to vomit after eating, their behaviors (like “fasting” after large meals or exercising to the point of exhaustion) are perfectly normal and even healthy.
I’m certainly not attacking celebrities. I know that Mila probably intended her “no, you can” message to boost people’s confidence in themselves and their ability to reach their goals (a concept psychologists call “self-efficacy”). It’s definitely a noble goal, because research suggests that people starting weight management programs often have low self-efficacy.
Even so, boosting confidence may lead people to try something to lose weight, but not necessarily the right thing, so increasing self-efficacy needs to be done in concert with teaching healthy weight management approaches. This is one of the many reasons that people trying to lose weight on their own are often unsuccessful.
Just once, rather than hearing a celebrity go on and on about how much he loves to eat fast food or how she is able to keep thin simply by “playing with her kids”, I would love to hear a celebrity say, “You know what, keeping thin is hard work! Here are all the things I do…” Okay, not as great a sound-bite, I know (and arguably not the celebrity's responsibility), but it might help to balance out some of the other celebrity sound-bites that I fear do more harm than good.
Of course, celebrities are not the only ones sending the wrong, or at least incomplete, messages. Proposed policies to outlaw Happy Meals or add taxes to “junk food” are just as bad as simply saying, “What you’re doing is wrong” – they don’t teach what people should be doing instead. Rather than punishing people for making the “wrong” choices, we need to incentivize them to make the “right” choices.
Thoughtfully yours,
Sara
Thursday, August 11, 2011
Citizen Journalists and Credibility
Five years ago this month, CNN started an initiative called iReport, where regular people could generate news content. Though visitors to CNN.com can access these iReports via the iReport page, CNN also occasionally places iReport stories on its main homepage. In a recent story covering what the iReport news team consider to be the five stories that defined their "year one", many commenters attacked the initiative, saying iReport stories were "non news-worthy" and that iReporters "lack the credibility and the training to hold such an important place on the website". Others posted support for iReports, saying that these "iReporters" put more effort into fact-checking and proofreading than regular CNN reporters. This notion of citizen-journalists is definitely very interesting, and could be the starting point for many interesting debates. For another day, I suppose. What I'd really like to delve into today is this idea of credibility: what determines whether someone is viewed as credible, and what biases may influence that judgment?
Human beings are regularly called upon to process large amounts of information from the world around us. This information is perceived by our five senses and has to be interpreted by our brain in order for us to make decisions and navigate our environment. We're very good at processing this information. We're so good, in fact, that people are always looking for supernatural explanations for why we're so good at it. I know a lot of people who believe in ESP, but honestly, I think people are just very good at picking up on cues in the environment (and some people are better and faster than others), and that because of their skill and speed, it appears they reacted before whatever they were reacting to even happened. Once again, another post, another day.
That's not to say that biases can't be introduced into our processing. We often see what we want to see and hear what we want to hear. It's true that our current experiences are often colored by our past experiences, which I think is just another example of how awesome our brain is; not only do we process what's directly in front of us, we are processing past experiences in tandem and drawing connections between the two. This means, however, that we can make mistakes.
Want to learn more about some of these "mistakes"? Check out this list of cognitive biases.
Of course, just because we're capable of systematic processing doesn't mean we will always be raring to think things through at that level. On the contrary, thinking through everything at this level would quickly overwhelm us. Therefore, we've developed mental shortcuts that help us navigate our environment while saving our cognitive resources for the things that really matter; Susan Fiske and Shelley Taylor captured this tendency by calling us "cognitive misers". It is important to note, however, that some people really enjoy thinking and engage in it much more than the average person; we would say they have a "high need for cognition". Even these people will occasionally use mental shortcuts; they just do so less frequently than the average person.
One mental shortcut is called a heuristic. Heuristics are quick rules of thumb that allow someone to draw a conclusion or solve a problem quickly and without a great deal of thought. Heuristics are not always wrong. For example, if you see someone standing on a busy street carrying a map, you can probably safely assume they are a tourist. True, you could cycle through all the possible reasons a person would be carrying a map, but the cognitive miser in you will probably just think, "Tourist", and move on (or, if you're feeling altruistic, offer to help said tourist find his or her next destination). The problem is when we use these shortcuts in situations where more systematic thought is necessary.
For example, imagine you've been called for jury duty. You're presented with lots of competing evidence, and you have to use that information to determine whether the defendant is guilty or not guilty. Now would probably be a bad time to use a heuristic, but some research evidence suggests that people do use heuristics in these situations, especially when the evidence is confusing or points equally to guilt or innocence. One heuristic is referred to as "what is beautiful is good"; it is the belief that, if a defendant is physically attractive, he or she must be innocent.
Another well-known heuristic is the "availability heuristic". When we believe something is true - such as, "Women are bad at telling jokes" - we can easily think of many examples that support our conclusion, but often have difficulty remembering examples that refute it - such as times when women showed excellent comedic skills or times when men showed poor ones. Because so many supporting examples are readily available, we become even more convinced that our initial conclusion is correct.
The heuristic that is probably operating here deals with source credibility. When someone is presenting information, and especially, trying to convince you of something, you could 1) listen to her arguments and systematically think through them to determine if her conclusions are valid, or 2) decide whether you agree with her based on her education and/or title. Research shows that arguments presented by someone with, for example, a PhD are judged to be more sound than the same arguments presented by, say, a high school student. True, when studies manipulate how good the arguments really are, people are generally able to differentiate good arguments from bad arguments, but even then, the PhD often still has an advantage over Joe H.S. Student.
I don't often read the iReport stories, so I don't really have any conclusions about whether there is any truth to claims of inaccuracy. It stands to reason, however, that some of these commenters are reacting to job title rather than content. An iReporter could be anybody. CNN does provide information on their vetting process - how they determine whether an iReport is accurate and worthy of being called news - but someone who is applying a heuristic to determine whether to believe this particular story may not be very motivated to read about the vetting process and determine whether that raises their estimation of an iReporter's credibility.
That’s not to say that job title is the only marker of credibility people use. For instance, Elizabeth Smart, whose kidnapping case received widespread media attention, is now a contributor to ABC News on stories involving kidnappings. ABC News clearly feels this experience qualifies her to talk about kidnapping in general (and many viewers likely do, as well). Of course, her hiring did prompt the Daily Beast to ask, “Other than fame – as the victim of a horrifying crime – what exactly are her qualifications?” True, the qualifications they would like to see, such as a degree in psychology or years of experience studying kidnapping victims, could be considered a metric of knowledge – we assume someone with such training has a lot of knowledge on the topic – but this is not always the case; once again, this is just a heuristic at work.
Heuristics are not likely to go away. Despite their flaws, these shortcuts are necessary. Imagine if every decision you made (paper or plastic, boxers or briefs, soup or salad) involved the same careful thought you currently reserve for the important decisions (city or suburbs, car or SUV, Cubs or White Sox). So what is the best way to deal with this dilemma of credibility? Other than what CNN currently does – provide a document detailing the vetting process – what could they do to set your mind at ease?
Thoughtfully yours,
Sara
Monday, August 8, 2011
PC versus Mac, the Debate of Our Age
It's been a while since I last updated - sorry about that! In addition to a trip out of the country and starting a new job (hooray for both!), I recently bought a new laptop - a shiny new MacBook Air. I love how thin and lightweight this baby is (two of my most important characteristics in choosing a laptop), not to mention fast! Startup takes about 10 seconds. In my free time, I've been playing around with it, getting the hang of Mac OS X. As a longtime Windows user, and occasional Linux user, there are still some things I have to learn. I've definitely made some silly, "Windows-user" style mistakes.
Before I bought the MacBook Air, I posed a question to my friends on Facebook about what my next laptop should be. When I mentioned that I was leaning toward a MacBook, this discussion turned into a debate among my friends about which is better, Windows or Mac. This debate has gone on for a while, perpetuated by the "I'm a PC. I'm a Mac." advertising, where operating system is more than a simple preference; it is an identity. Still, though other ad campaigns involve these forced choice paradigms (Coke or Pepsi, for example), few create such strong allegiances and even denigration of people who prefer the other option. Why might this be the case?
Identity is very important. This probably goes without saying - so why am I saying it? Because identity is not only very important, it is an essential part of our lives. Erik Erikson, a developmental psychologist and psychoanalyst famous for his stages of psychosocial development, argued that the formation of a consistent, integrated identity is a necessary step in becoming an adult. It's no coincidence that the "coming of age" story, the tale of the journey of self discovery, is so popular in literature, film, and a variety of other media. It's a journey we all take, and struggle with on the way, so it's a story with which we can all identify. Additionally, according to identity researcher Seth J. Schwartz, failing to develop a clear identity is associated with depression, anxiety, and aggression, and even risky behavior, like drug use. Our identity is like our guide; so many of our decisions are influenced by "who we are". There are few experiences more excruciating than identity confusion.
Our social identity - who we are in relation to others and our group membership - is also important, in terms of "who we are" and our self-esteem. Many identify with a particular race or ethnicity, culture, gender, even baseball team (to my fellow Chicagoans, Cubs or Sox?). The best known researcher and theorist on these effects is Henri Tajfel. His research shows us that it doesn't take much for us to begin dividing people into groups, and that we hold a lot of beliefs, many completely irrational, about the group to which we belong, our "ingroup", and the other group, the "outgroup". We ascribe certain characteristics to ourselves, based on our identified groups, and ascribe certain characteristics (often called stereotypes) to the other groups as well. Discovering that another person is similar to us in some way can lead us to give that person preferential treatment, called "ingroup favoritism". Furthermore, we can create "ingroups" and "outgroups" based on seemingly unimportant details, like preference for a particular painting. Studies in social identity often use what is called the "minimal group paradigm" - these studies examine how much (or rather, how little) information about others is needed for us to begin categorizing them as like us, our "ingroup", or different, our "outgroup". We show these tendencies even when we've been randomly assigned to a group (and even if we KNOW we've been assigned completely at random).
For a comedic take on this dilemma, see this XKCD comic.
Why, then, don't we see angry interactions between Coke drinkers and Pepsi drinkers? What is so different about computer preference? Arguably, computers are more integrated into our daily lives than our choice of beverage. I don't know about you, but I spend most of my day seated in front of a computer, writing, conducting literature searches for manuscripts and grant applications, designing online surveys, and analyzing data. Even my leisure time is spent in front of a computer or similar device (like a personal music player or ebook reader - notice my avoidance of brand/product names :D). Furthermore, computer advertisements, which focus on this issue of identity, are a reflection of the design of the product itself. The "Mac" from the commercials spends a lot of time on "creating things" - photos, videos, music - and many of the programs that come standard on a Mac are also focused on "creating things". Additionally, the "Mac" guy is technically savvy in the sense that he is comfortable using technology, but more in the "I don't care HOW it works, just THAT it works" way. That is, he wants technology that is intuitive and completely user friendly.
The "PC"-guy is portrayed as more professional, with a business and productivity orientation. It's true that Windows machines really shine in this application. I've worked in many large organizations, all of them using networks of Windows computers. Could you have a large organization with all Mac users? Probably. But I've yet to see it myself. While recent Windows advertising has focused on the "creative" side of Windows, this operating system and its accompanying software is still designed with this application in mind.
Of course, the landscape is changing. In the past, having a Mac meant limits in compatible software, and this was the case for many of the programs I use as part of my job until very recently. Now that software companies are increasing availability of their popular titles for both Windows and Mac, people will no longer have to feel married to a particular operating system. Does that mean this identity division will go away? I think not. If anything, this might make advertising more likely to highlight the different identities. If two products are very similar, how can you get people to commit to one over the other? One way is by appealing to their sense of self.
So which am I? In my opinion, each has its own benefits and drawbacks. I have no issue with continuing to use my Windows machine at work. I have to admit, though, every time I start up my work computer, and wait... and wait... and wait for the login screen before I wait... and wait... and wait for the desktop to load, I think of my MacBook Air and how much work I could have accomplished in that time.
Thoughtfully yours,
Sara