Tuesday, May 31, 2016

Blast from the Past: Facebook "On This Day"

If you use Facebook, and if you've been on Facebook for more than a year, you probably have gotten more than a few Facebook "On this Day" memories. Most of them - for me, anyway - are just random things I posted, like a cute picture of puppies or a joke I made. Occasionally, they are something more meaningful.

On this day, 5 years ago, I was sending out the announcement for my dissertation defense. These defenses are considered public, so people are required to post them a certain number of days in advance. Usually, they are only posted in the academic department where the defense will take place, and sometimes they are publicized to the larger university.

I joked with friends about also posting the announcement on Facebook, though I didn't want random friends to feel compelled to show up. Instead, I decided to have a little fun with it. So I posted this:

In case you're wondering, this is the format in which defense announcements are written. For comparison, this is the real thing:

JUNE 8, 2011; 10:00 AM; COFFEY HALL 116

COMMITTEE CHAIR NAME, DEGREE, COMMITTEE CHAIR (Redacted to protect the innocent)

Spoiler alert: I passed that dissertation defense, so on June 8 this year, it will be exactly 5 years since the first time I was called "Dr."

Monday, May 30, 2016

Follow-Up on Peer Review

I've blogged many times about the peer review process (here and here, especially) - one of the first steps in publishing scientific research in scholarly journals. While the purpose of peer review is, in part, to improve the manuscript, there can certainly be a "too many cooks" component:

Sunday, May 29, 2016

Meta-Analysis and Reproducibility: More from Sara's Week in Psychological Science

Yesterday, I attended a symposium on meta-analysis, which is a set of methods and statistical techniques for combining the results of multiple studies. The purpose of a meta-analysis is to estimate the true effect of some phenomenon, and also to understand what characteristics of samples, studies, and so on impact study results. I did one of my grad school candidacy exams on meta-analysis, and I have a meta-analysis of my own that I've been working on and trying to get published.

During yesterday's session, one of the presenters talked about an Open Science Framework project, in which the researchers are trying to reproduce findings from multiple meta-analyses in psychology. The issue is that, as is true with many psychological studies, people often have difficulty reproducing findings from past meta-analyses, in part because of the strong impact of subjective decisions when conducting a meta-analysis. That is, decisions like what types of studies to include, how to divide up data within studies, and the specific analysis technique can be highly subjective and left up to the discretion of the researcher.
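To make the subjectivity concrete, here is a minimal sketch of fixed-effect, inverse-variance pooling - the most basic meta-analytic estimator - with made-up effect sizes (not from any of the meta-analyses in the project). Dropping a single study, exactly the kind of inclusion decision left to the researcher's discretion, visibly shifts the pooled estimate.

```python
# Minimal fixed-effect meta-analysis via inverse-variance weighting.
# Effect sizes and standard errors below are invented for illustration.

def pooled_effect(effects, std_errors):
    """Inverse-variance-weighted mean effect and its standard error."""
    weights = [1 / se**2 for se in std_errors]
    total_w = sum(weights)
    mean = sum(w * e for w, e in zip(weights, effects)) / total_w
    return mean, (1 / total_w) ** 0.5

effects = [0.30, 0.45, 0.10, 0.55]   # e.g., standardized mean differences
ses     = [0.10, 0.15, 0.08, 0.20]   # their standard errors

full, _ = pooled_effect(effects, ses)
# An "inclusion decision": exclude the last study from the analysis.
subset, _ = pooled_effect(effects[:3], ses[:3])
print(round(full, 3), round(subset, 3))  # the two pooled estimates differ
```

Two researchers making different (defensible) inclusion choices end up reporting different pooled effects from the same literature - which is exactly why reproducing a meta-analysis requires documenting those decisions.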

This project will involve meta-analyzing 2000 studies. Obviously, they are going to need help, and if you're so inclined, you can volunteer to be part of the project.

Friday, May 27, 2016

Childhood Abuse, Reliability, and Measurement Error: More from Sara's Week in Psychological Science

This morning, I attended a great session on methodological issues in studying trauma. One of the presentations was about studying childhood abuse. Many measures of childhood abuse are administered retrospectively, asking college students or other adults about past experiences. Researchers have noted a variety of issues with this approach, including that reports of childhood abuse seem to vary over time - that is, a person may report having been abused as a child at one timepoint, but not at another.

This leads some to conclude that reports of childhood abuse may be influenced by current levels of distress - people may misremember or misreport childhood abuse depending on how distressed they are feeling as adults. We know that current experiences can color past experiences, which we see in any kind of memory research; memory is highly malleable. However, any time we measure something in people, we also have to worry about measurement error. Poorly worded questions, respondent fatigue, and other factors may affect how well the measure "works."

The great thing about structural equation modeling (SEM) is that it separates out measurement error, so we can get a purer read on how much particular constructs relate to each other. Some people have used this as a mark against SEM, arguing that it shows a "perfect world" relationship we would rarely see in practice. But in many cases, SEM is a great technique for answering questions about how much unreliability is due to measurement error versus systematic sources of variation.

The researchers measured past experiences of childhood abuse at the same time as they measured symptoms of distress, using a post-traumatic stress disorder (PTSD) checklist. Two weeks later, they measured both variables again. They built a model in which time 1 PTSD predicted time 2 PTSD, time 1 reports of childhood abuse predicted time 2 reports, and time 2 PTSD predicted time 2 reports. What they found was that time 2 PTSD predicted less than 2% of the variance in time 2 reports of childhood abuse. Time 1 abuse reports were highly predictive of time 2 abuse reports, and once measurement error was factored out, reliability of abuse reports was above 0.80 (which in the measurement world is considered excellent reliability).
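The logic of separating measurement error from "true" stability can be illustrated outside of SEM with Spearman's classic correction for attenuation. This is a sketch with invented numbers, not the authors' actual analysis: an observed test-retest correlation understates the true stability of abuse reports whenever the measures are imperfectly reliable.

```python
# Spearman's correction for attenuation: the observed correlation between
# two measures equals the true correlation shrunk by the square root of
# the product of their reliabilities. Numbers below are invented.

def disattenuate(r_observed, rel_x, rel_y):
    """Estimate the true correlation, correcting for measurement error."""
    return r_observed / (rel_x * rel_y) ** 0.5

# Suppose time-1 and time-2 abuse reports correlate .72 as observed,
# and each measure has reliability .80.
r_true = disattenuate(0.72, 0.80, 0.80)
print(round(r_true, 2))  # 0.9
```

The same noisy instrument that caps the observed correlation at .72 is, once its error is accounted for, consistent with a true stability of .90 - the same kind of conclusion the SEM approach supports.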

What this means is that, when it looks like people are saying different things at different times, measurement error is a much more likely culprit than how a person is currently feeling. Obviously, the study was done over a short timeline (2 weeks), so results may be different if that time period is longer. This was also a college sample, and reports of abuse, especially physical and sexual abuse, were low. But this study gives some guidance for studying reports of childhood abuse in other samples, and highlights a time when separating measurement error from systematic variation (i.e., actual differences in reports of childhood abuse) is the optimal approach.

Thursday, May 26, 2016

The Trouble with Workshops

As I blogged yesterday, I’m attending the Association for Psychological Science annual meeting, which is being held in Chicago this year. I’ve spent this afternoon in a workshop, on how to conduct structural equation modeling (one of my favorite statistical techniques) using an R package called lavaan.

The workshop lasted almost 4 hours and was taught by Yves Rosseel, who developed lavaan. Unfortunately, as I’ve discovered with many workshops, most of the session was spent giving a refresher course on structural equation modeling. We didn’t even get to R or lavaan until almost 1.5 hours into the workshop. This also meant we had little time to do hands-on work using the package; the practice session ended up running concurrently with the break, so we had to choose which we wanted. To be fair, he gave us the slides we would need to do the practice session, so we could do that after the session.

While I found the information interesting, and learned a thing or two, I wanted to spend more time digging into the program and how it could be used to test various structural equation models. This is especially because I’ve used lavaan before, and had a lot of success with it – it’s very user friendly compared to other R packages I’ve used – but wanted to learn to do more and go beyond the simple models I ran before.

The problem with workshops, of course, is that there are rarely prerequisites. So it's difficult to assume everyone will come into the workshop with, say, a strong working knowledge of the underlying statistical technique, or some basic experience with the program.

But not impossible. Workshop descriptions could inform people what knowledge they are expected to have. Slides sent ahead of time (as these were) could contain background information slides, and attendees could be asked to review a portion of the slides on their own time prior to attending the workshop.

Even a simple survey among people signing up for the workshop could be used to assess how much people already know about a topic. (Obviously, asking people what they know can be problematic, but at the very least, you could find out where most of the people will be in terms of exposure to the subject matter, and shape the workshop around the majority group.)

That being said, I enjoyed the workshop and would recommend it to others. He’s an excellent teacher and very funny. My favorite comment from the session was his description of the mystical land of Asymptotia, a magical place where all the data are normally distributed.

Probably a neighbor of Lake Wobegon
I just wish they had allowed Professor Rosseel to teach 2 workshops, one about structural equation modeling alone and another about using lavaan.

Wednesday, May 25, 2016

Sara's Week in Psychological Science

The Association for Psychological Science meeting starts tomorrow!

I'm looking forward to hearing about some of the current psychological science research. I blogged previously about some of the workshops being offered. I finally settled on the Structural Equation Modeling with Lavaan workshop. Lavaan is a package for the R statistical language. I've used it before and found it to be relatively user friendly; the manual is also very helpful. However, there were a few things I couldn't figure out how to do, so the workshop will at least be an opportunity to ask some questions.

I hope to post some updates while I'm attending the conference. Stay tuned!

Tuesday, May 24, 2016

Patient Data and Myriad Genetics

A few days ago, the ACLU filed a complaint on behalf of 4 individuals who had genetic testing performed, asking Myriad Genetics to release the individuals' data. Though Myriad did respond after the complaint had been filed, it didn't provide everything the individuals wanted and argued they have a right to:
Genetic testing labs, including Myriad, normally provide clients with information on gene variants that are known to increase the risk of disease (pathogenic), likely to be pathogenic, or of uncertain significance. But nearly everyone also has variants (often called polymorphisms) that are deemed benign—and companies typically don’t send clients information about those variants. But as researchers pool data from thousands of cancer gene tests, some of those benign variants may be reclassified as dangerous, or may become informative in ways that hadn’t been anticipated. That is one reason some of the patients involved in the complaint wanted Myriad to provide all of their genetic data, and not just the information on the pathogenic variants.

In the complaint, ACLU and the plaintiffs, supported by researchers in cancer genetics, say that Myriad ran afoul of a new regulation promulgated by HHS this past January: that individuals have a right to receive “the full gene variant information generated by the test.”
The issue is that, although the new HHS policy allows individuals to request their full genetic information, Myriad only shares the pathogenic data - and actually re-sent the same results to the 4 individuals after the complaint.

What's strange about the whole thing is that this new data transparency policy was released rather quietly, via a post on the HHS blog - an odd venue for announcing policy.

Monday, May 23, 2016

Movie Quotes and Social Interactions

It probably won't surprise you that I reference movies often in my conversation, including quoting lines from my favorite movies. For me, pop culture references are one of the ways I interact with and relate to people, by quoting something people recognize. And of course, it's always a pleasant surprise to quote one of my favorite movies with someone new, and learn they also love that movie. I'd wager some of my friendships were started with a Big Lebowski quote or two.

I also have a quote board on my door at work. It started as an activity during October - I decorate my door for Halloween, and include a marker board for horror movie quotes. A coworker told me she really enjoyed the October quotes and asked if I would mind keeping the board up all the time. I haven't been as good about updating it recently - it used to be every day, but now it's more like every other day - but I try to include a mix of easy and difficult quotes, to make it inclusive.

I also do a lot of reading, and I love it when some of my favorite books are adapted into movies and TV shows. I may not always like the finished result, but I love seeing the story come to life. Though I'm sure some would disagree that movie adaptations are a good thing:

Sunday, May 22, 2016

A Beautiful Day Keeps the Blogger Away

It's been a gorgeous weekend in Chicagoland. I'd been meaning to sit down and write a post to keep my "at least once daily" record going, but haven't sat still long enough to write one. Hopefully I'll have something thought-provoking to say tomorrow. In the meantime, here's how I've been spending much of today:

Saturday, May 21, 2016

"Keeps the Doctor Away" or Why Medical Errors Are a Leading Cause of Death

The American Journal of Managed Care shared a press release on new research finding that medical error is the third leading cause of death in the United States:
Then they used hospital admission rates from 2013 and extrapolated that based on 35,416,020 hospitalizations, there were 251,454 deaths from medical error, which translates to 9.5% of all deaths each year in the United States.

Comparing their estimate to the CDC’s list of the most common causes of death in the United States, the authors calculated that medical error is the third most common cause of death, surpassing respiratory disease—the CDC’s currently listed third leading cause of death.
This is a shocking statistic, although one I've heard before. In fact, based on previous discussions in my job, medical errors have been the third leading cause of death for some time now.
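As a quick sanity check on the press release's arithmetic (my own rough calculation, not from the article): if 251,454 deaths represent 9.5% of all U.S. deaths, the implied annual total is about 2.65 million, which lines up with the CDC's count of roughly 2.6 million deaths for 2013.

```python
# Back out the total annual U.S. deaths implied by the press release's
# own figures (251,454 deaths from medical error = 9.5% of all deaths).
deaths_from_error = 251_454
share = 0.095

implied_total = deaths_from_error / share
print(round(implied_total))  # roughly 2.65 million
```

The internal consistency doesn't validate the extrapolation itself - critics have questioned how the 251,454 figure was derived - but the percentage does follow from the numbers given.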

To be clear, though, I should first state what medical errors are. The article linked above sums it up as follows:
Medical error is defined as an unintended act either of omission or commission or one that does not achieve its intended outcome; the failure of a planned action to be completed as intended (an error of execution); the use of a wrong plan to achieve an aim (an error of planning); or a deviation from the process of care that may or may not cause harm to the patient.
This is, of course, a rather long way of defining medical error. Essentially, what it comes down to is not giving the patient the care he or she needs to get better. That could mean misdiagnosis, resulting in a care plan that would be right if the diagnosis were correct. Or it could mean prescribing something that the patient can't do for some reason - like putting a strong onus on the patient to self-manage a condition that doesn't work with the patient's time, ability/knowledge, or finances. Or it could mean missing something entirely - thinking a patient is well when he or she is actually very sick.

The problem comes down to differing expertise. The doctor is an expert in the field of medicine. The patient is the expert about him or herself. Unfortunately, if there is not enough overlap in knowledge, the patient won't know what information to share and the doctor won't know what to ask about.

A coworker of mine recently published a book with a colleague on research he has been conducting for many years now.

The basic premise of the research is on contextualized care; this refers to care that is guided by the patient's context, everything happening outside of the patient. Many physicians still focus on what happens within the patient. To use an example from the book, a patient may show up in the physician's office with sudden and drastic weight loss, leading the physician to assess for things like cancer. But if the patient doesn't say that he lost his job and can't afford to eat, the physician may be looking at the wrong thing. This is referred to as a "contextual error."

The research began with unannounced standardized patients - actors trained to portray a specific type of patient, who attend an office visit as though they were a regular patient. These patients wore hidden recorders to capture what happened during the appointment. The researchers found that even when the patient dropped hints as to why he had lost so much weight (e.g., that he was homeless), he was often sent for tests to find a biomedical cause (e.g., a full cancer workup). In fact, in this first study, when the underlying explanation for the patient's complaint was contextual, physicians made errors 78% of the time.

They later moved to using real patients, who agreed to serve as "secret shoppers." And they've been using this line of study to provide feedback to physicians and other medical staff on how to improve care. If you're at all interested in the topic, I highly recommend this book.

Thursday, May 19, 2016

The Eyes Have It

I talk a lot on this blog about the human brain, and what various parts are responsible for in terms of behavior and sensation/perception. But I don't really delve much into other body parts. However, I learned Monday about a third set of receptors in the eye that I started looking into.

You may have learned in your biology and/or psychology class about the receptors in the retina - the rods (detecting light and dark) and cones (detecting color). But Monday I learned there is a third set of receptors: the intrinsically photosensitive retinal ganglion cells. Like rods, these receptors react to light, but because they are photosensitive in their own right, they can respond even when responses of the rods and cones are blocked (or when rods and cones are missing, as in some rare eye diseases). Their response is also much slower, and relates to light conditions over the long term. But they have some really important roles:
  • Circadian rhythms - They help to regulate our day/night cycles, by sending signals to the hypothalamus. Our bodies use light to regulate when we should be awake and when we should be asleep.
  • Melatonin regulation - Relatedly, they contribute to our body's regulation of melatonin, which also relates to metabolism and sleep-wake cycles.
  • Pupil response - They also play a role in our pupil response to light. These cells are why people who are blind may still show a pupil response to light and dark.
These receptors help explain why people who are blind can experience sleep disturbance. If these receptors aren't functioning properly, a person's sleep-wake cycle could be affected.

The phenomenon behind intrinsically photosensitive retinal ganglion cells was discovered accidentally in the 1920s, when geneticist Clyde E. Keeler bred mice without rods and cones (yes, they were in fact blind mice) but found they still responded to light. However, little happened in this line of research until the 1990s, and these specific cells weren't identified until the early 2000s.

These cells respond mostly to blue light. It's probably no coincidence, then, that light therapy to treat sleep disturbance (as well as seasonal affective disorder) often uses blue light.

Wednesday, May 18, 2016

Science Fiction Becomes Science Fact

If you've ever read The Hitchhiker's Guide to the Galaxy, you probably remember the Babel fish - the organism that would live in one's ear and translate any language.

If you ever wished this little fishy were real, you're in luck:

New Research on Familiar versus Unknown

I blogged a little while ago about when we want familiar places and when we want the thrill of the unknown. And the other day, I blogged about why we (I) love mashups, which represent a balance between familiar and new. In both cases, I talked about cognitive resources as a determinant of when we prefer one over the other. Yesterday, I received my weekly newsletter from the Association for Psychological Science (This Week in Psychological Science, or TWiPS), which shares links to early views of articles from forthcoming issues. One of the articles, by Baror and Bar, was about just this topic.

The researchers define seeking out the familiar versus the less familiar as exploitation and exploration, respectively. They begin with the concept of associations, using a free-association task: one person says a word, and the other says whatever comes to mind.

They examined word association among participants who were and were not under cognitive load. They manipulated cognitive load in different ways (across four experiments): digit-span task (memorize a certain number of digits), alphabetizing task (put letters from the word-association words in alphabetical order), and attending to features of the target words (color, shape). Overall, they found that responses were more diverse among participants under low cognitive load than participants under high cognitive load. That is, responses among people under high cognitive load were strongly associated with the target word.
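One simple way to quantify how "diverse" a set of free-association responses is - a sketch of the general idea with invented responses, not the authors' actual metric - is the proportion of distinct responses across participants:

```python
# Response diversity as the fraction of distinct responses given to a
# target word (1.0 = every participant gave a different response).
# Example responses are invented for illustration.

def response_diversity(responses):
    """Fraction of responses that are distinct."""
    return len(set(responses)) / len(responses)

# Hypothetical responses to the target word "moon":
low_load  = ["cheese", "crater", "tide", "howl", "rocket", "eclipse"]
high_load = ["sun", "sun", "star", "sun", "star", "sun"]  # stock associates

print(response_diversity(low_load), round(response_diversity(high_load), 2))
```

Under this kind of measure, the pattern the authors report would show up as higher diversity scores in the low-load condition, where responses range beyond the most common associates.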

So what overall conclusion can be drawn from these findings? As the authors state in their discussion:
Our findings support the notion that exploration is the default mode: The brain has a basic tendency to go beyond the nearest associations and activate unique ones instead when resources are available.
This seems to fly in the face of the "cognitive miser" claim - that people prefer to save their cognitive resources and go through life on autopilot, unless something happens that requires them to think. Instead, it offers some support for something I said in the post linked above: we're cognitive neophiliacs - we're drawn to the new and will use our cognitive resources to understand that new thing. This can include that "outside the box" thinking that Baror and Bar observed in their study.

Tuesday, May 17, 2016

A Guided Tour of the Human Brain

I talk a lot about the brain on this blog. I've posted many times about different parts of the brain, but I've never really given an overview. Recently, someone shared a great video that does just that.

Monday, May 16, 2016

Beyond the Mere Exposure Effect: Pointing Out Stereotypical Comments

Scene: A noisy bar, out with some boisterous people. I've been told that I have a softer voice and people sometimes have difficulty hearing me. I'm aware of this, but since I don't like feeling as though I'm shouting, in those situations, I prefer to just listen. I can feel a little left out in those situations, but I guess I prefer that over shouting. When someone pointed out that I hadn't said much that evening, I tried responding, doing the only thing I could do to be heard - raising my voice. And that's when I received a response that bordered on sexist.

At that point, I checked out.

Research shows that most people are aware of stereotypes about certain groups. Whether they believe them or not, those stereotypes still linger below the surface. And they can come bubbling up, even among people who would deny holding any stereotypical attitudes. The mere exposure effect can only get you so far in correcting stereotypical attitudes. Sometimes, you need a more active intervention.

I think what bothered me most about the situation was my own response. I disengaged, rather than saying something. But I was reminded today of research a friend from grad school performed for her master's thesis, which was published in the Journal of Experimental Social Psychology. What they found was that pointing out sexism actually increased positive evaluations of the speaker and reduced use of sexist language in future encounters. And pointing out sexism was as simple as, "Don't you think that's kind of sexist?" It doesn't have to be drawn out or judgmental. Just a simple comment in an interpersonal interaction could be enough.

Sunday, May 15, 2016

Why We Love Mashups

I'm sure I'm not alone in saying I love mashups - combinations of existing source materials arranged in new ways. Song mashups usually take two or more different songs and arrange them together. A song mashup differs from a medley in that a medley moves through multiple songs one after another, while a mashup overlays or mixes them together. Mashups are especially interesting when they take two very different songs and make them work together. For instance:

There are other types of mashups as well. Books like Pride and Prejudice and Zombies and Abraham Lincoln: Vampire Hunter take existing materials (Pride and Prejudice or Abraham Lincoln's biography) and combine them with a different genre (in this case, horror). There are even movie mashups:

So why are mashups so appealing? One theory is that they do things that are unexpected - they are uncanny, to borrow Freud's term. This could be part of it, but at the same time, a really well-done mashup gets my attention and can sustain it. That is, I can listen to a really good mashup many times without it losing its effect, while humor - on multiple viewings - loses some of its effect.

Another possibility has to do with how your brain reacts when encountering the familiar versus the unfamiliar. Last year, I blogged about travel, cognition, and the balance between feeling comfortable in the familiar and the stimulation of the unfamiliar. Mashups, especially when you are familiar with the original source material, represent a middle option between familiar and unfamiliar. The sections containing chunks of a single song feel familiar, but as you settle into cognitive autopilot, the song shifts, activating your attention.

There are many great mashups out there on YouTube to check out. I've even been trying my hand at a few recently (and I may even post some here when I finally finish them).

Saturday, May 14, 2016

The Lost Blog Post Topic and State-Dependent Memory

I posted the other day about luck and serendipity. Last night, after staying out a little too long and having a little too much fun with some friends, I thought of the perfect blog post topic for today. Sadly, I didn't write it down, and this morning, I have no idea what that topic was. I had a notebook and pen with me. I had my phone where I could leave a note for myself. I had options, I just didn't use them because as usual, I thought I would remember.

After I finished searching my brain for some semblance of the topic and coming up with nothing, I remembered a psychological concept known as state-dependent memory or state-dependent learning. I laughed as I realized that this had happened to me... then decided I'd stumbled upon the perfect blog topic.

State-dependent memories or learning are memories or information that can be recalled when people are in the same internal state as they were when they learned it. This concept has frequently been studied when people are in states of consciousness brought on by alcohol. However, it could also apply to other kinds of internal states, like moods.

I remember learning this concept in college, and my classmates and I joked about getting drunk to study, then getting drunk to take the test. Of course, I should point out that state-dependent learning isn't magic that will dramatically improve your ability to retain information. It may make it a little easier to recall that information on command, but you're still limited by how good your memory usually is. Motivation also comes into play - you have to be motivated to learn or commit the information to memory.

Here's how you can apply this information to your life: if you discover you're having difficulty recalling something you learned, you can enhance your recall by recreating the conditions under which you learned it. So this is more a tool at your disposal after the fact, as opposed to a strategy going in.

What this means is, I could probably get that blog topic idea back if I recreated the condition from last night. No thanks. I'll just stick with coffee and serendipity.

Friday, May 13, 2016

More on Head Songs and Earworms

In a past post, I talked about head songs (songs I wake up with in my head) and earworms. Since I'm listening to music much of the time, it makes sense that I also have some song stuck in my head.

I try to post my head songs on Facebook, though I sometimes forget. But I'd love to examine my past head songs and see if there's any kind of pattern (I'm a researcher, after all, and lover of data). I mean, where do they come from?

Today's head song I can explain. It's a track I listened to just yesterday, and also one of my favorite songs off of Sia's new album:

As is true with most of my head songs, it was the full track, with accompaniment, backup vocals, etc.

On the other hand, as I was walking in to work this morning, I got an inexplicable earworm - Baby Got Back by Sir Mix-a-Lot. And in true earworm style, it was just a snippet of the lyrics repeated: "Oh baby, I wanna get with ya, and take your picture."

Rather than post that song, which I know you've all heard, how about this great version, created from snippets of dialogue from movies:

Thursday, May 12, 2016

More on Parody (A Thinly Veiled Excuse to Post the "Deadpool Honest Trailer")

I've blogged before about parody. And I've blogged many more times about movies. I saw (and loved) Deadpool, because not only is Deadpool awesome, but the movie parodies many of the superhero movie tropes. So what could be more meta than a Deadpool Honest Trailer, in which Deadpool appears and makes fun of himself and his own movie:

So to recap, it's a parody of a parody. And since Deadpool also pokes fun at Honest Trailers in the process, it's a parody of a parody of a parody.

My head might just be about to explode.

Luck, Serendipity, and Locus of Control

My friend over at The Daily Parker shared this story with me a couple of days ago: Why Luck Matters More Than You Might Think. In the article, the author talks about people who see themselves as self-made, even though luck (and by luck, I mean random happenings that are beneficial, as opposed to some systematic force like karma) is a strong influence - perhaps even stronger than individual action:
In the process, I have discovered that chance plays a far larger role in life outcomes than most people realize. And yet, the luckiest among us appear especially unlikely to appreciate our good fortune. According to the Pew Research Center, people in higher income brackets are much more likely than those with lower incomes to say that individuals get rich primarily because they work hard. Other surveys bear this out: Wealthy people overwhelmingly attribute their own success to hard work rather than to factors like luck or being in the right place at the right time.

That’s troubling, because a growing body of evidence suggests that seeing ourselves as self-made—rather than as talented, hardworking, and lucky—leads us to be less generous and public-spirited. It may even make the lucky less likely to support the conditions (such as high-quality public infrastructure and education) that made their own success possible.
Some of the greatest scientific achievements happened through luck - or serendipity, the more frequently applied term, which refers to fortunate accidents. In the field of psychology, one great example is Pavlov's concept of classical conditioning. Pavlov was a physiologist studying digestion. In his famous study, he was examining salivation - the first step in digestion - in dogs, when he noticed something interesting. The dogs actually began to salivate before they had food in their mouths.

This prompted him to make observations, and he found that the dogs would begin salivating in response to cues that meant food was on its way: the sound of the lab door opening, the sight of white lab coats, and so on. In his famous experiment, he decided to test whether he could associate a neutral stimulus (the sound of a metronome - not a bell, as some state) with food (called the unconditioned stimulus - something that elicits a response without having to be learned), to elicit salivation (called the unconditioned response). His study worked, and at the sound of the metronome (the conditioned stimulus), the dogs began salivating (now called the conditioned response).

Social psychologists ascribe a strong influence over our behavior to external factors - put another way, luck determines a lot of what happens to you. Some social psychologists believe in these outside influences so strongly that they deny the existence of personality altogether, and instead explain all behavior as externally influenced. Not all social psychologists take that radical view, and of course, different subfields of psychology have different degrees of belief in luck versus internally motivated behavior.

Of course, psychologists aren't the only ones with systematic views on the roles of luck versus internal motivation; people hold them in general. This concept is referred to (by psychologists) as locus of control. A person who believes they are responsible for their own behavior and destiny (self-made) would have a strong internal locus of control. On the other hand, a person who believes everything that happens to them is the result of luck and external factors would have a strong external locus of control. This belief falls on a continuum, so you can fall anywhere between those two extremes. BTW, locus of control is also sometimes known as "attributional style." Once again, what we're talking about here is attribution - what caused something to happen.

Curious to know more about your own locus of control? You can take a measure of locus of control here.

Wednesday, May 11, 2016

More About Self Theories: It's Not About Views on Success, But Views on Failure

I blogged recently about self theories, which refer to whether one believes intelligence is innate (fixed) or learned (incremental). As a result of this work, many psychologists have cautioned parents about the type of praise they give their children when they succeed. "You're so smart" implies that intelligence is innate, while "You worked really hard" implies that intelligence/ability can be learned.

However, I learned about a study yesterday suggesting that it isn't parents' views on success, but their views on failure, that impact their children's self theories. The researchers conducted 4 studies, 2 with parents only and 2 with parents and their children. Across all four studies, they found that parents' self theories did not predict children's self theories, but that instead, parents' views on failure predicted children's self theories:
Our findings indeed show that parents who believe failure is a debilitating experience have children who believe they cannot develop their intelligence. The findings further suggest that this is because these parents react to their children’s failures by focusing more on their children’s ability or performance than on their learning. Taken together, our findings seem to have identified a parental belief that translates into concerns and behaviors that are visible to children and that, in turn, shape children’s own beliefs.
It is easier - although maybe not easy - to be watchful about the kind of compliments one gives to children, but when your child has failed at something and he/she is upset, it's more difficult to think rationally about the right response. In those situations, we tend to just react. The problem is, if that reaction is devastation that your child has failed, it sends a message that your child can do nothing to learn and develop his/her abilities. As a result, children may disengage and instead do something "safer" - something at which they know they will succeed. They may miss the opportunity to grow and challenge themselves - challenges which have many cognitive and emotional benefits.

What this means is, if we want to avoid having our children develop a fixed view of intelligence, we have to change our own views of failure and intelligence - not just what we say. How do we do that? Unsurprisingly, Carol Dweck, who developed the concept of self theories, has designed an intervention called Mindset, though it is aimed at students. But perhaps another way is to adapt an approach used in cognitive therapy and similar interventions: self-monitoring and mindfulness.

In these approaches, the individual monitors his or her thoughts for instances of negative self-talk. They are not trying to suppress those thoughts - because we know that can backfire. Instead, when they encounter those thoughts, they practice acceptance of any mistakes they made that led to that self-talk and may even come up with rebuttals to those negative thoughts. You can read a little more about mindfulness and negative self talk here. In fact, as you'll see in the article, there are many common threads between mindfulness and incremental views of intelligence. For instance, the article states:
People with good self-esteem see mistakes and failures as opportunities to learn about themselves. They take a "beginner's mind" approach - putting aside the judgements and conclusions from past behaviour and actions and, instead, thinking about what they've learned from these experiences.
And for a humorous approach to defeating negative self-talk, you could try this:

Tuesday, May 10, 2016

Your Brain on Stress

Speaking of brain activity and responses, a friend shared this video with me - how stress affects your brain:

This video gives a nice overview of many important brain systems and what they do, while talking about the effect of stress. The video also talks briefly about epigenetics - the ways in which the environment can trigger certain genes to be expressed. This means that, even if you have a genetic predisposition to stress and anxiety, a nurturing environment can keep those genes from being expressed.

An important extension of this concept is the biopsychosocial model, which states that biology, psychology, and social environment combine to determine health across one's lifespan.

Experience can change the brain, though your brain becomes less plastic (changeable) as you age. This is why a small child may recover from a head injury that would be fatal to an adult. The brain is able to rewire itself, especially prior to the age of 6. And while the brain changes discussed in the video are real, they represent a worst-case scenario of stress response. If you experience normal amounts of stress or only occasional instances of high stress, you'll probably be fine. But if you experience high chronic stress, you'll want to do something to cope with that - whether it be talk therapy, lifestyle changes to minimize stress, and/or medications for anxiety.

Your Brain on Sarcasm

Sarcasm is a big part of my sense of humor. My mom comments on how young I learned sarcasm. In all honesty, I probably wasn't incredibly young - the typical child is able to understand sarcasm at 6 years old.

There's a lot of research on sarcasm; on the positive side is research finding that engaging in sarcasm increases creativity, while on the negative side, research finds that sarcasm can damage relationships (though I think that's mainly true if the other person isn't sarcastic).

But what really interests me is how sarcasm is understood and the brain activity associated with sarcasm. An article in the Smithsonian Magazine highlights a host of research on sarcasm and brain activity, stating that the brain has to work harder to understand sarcasm, but that additional processing is actually good for you. An important component of understanding sarcasm is developing a "theory of mind" - the ability to understand how other people think.

Additionally, many parts of the brain are involved in comprehending sarcasm. The temporal lobe is involved in understanding prosody - larger units of speech - which involves not just the words in a phrase, but the intention, emotion, and presence of irony. The right hemisphere is especially involved in understanding speech in which the actual meaning is the opposite of the literal meaning. (Sound familiar?)

A study published in 2005 also examined sarcasm in the brain, and found activity in the frontal lobe, particularly the prefrontal cortex - the so-called executive center of the brain, involved in decision-making - and linked the right hemisphere activity discussed above with the activity in the frontal lobe.
In sum, Shamay-Tsoory and colleagues propose a neural network for processing sarcastic utterances:
1. The left hemisphere language cortices interpret the literal meaning of the utterance;
2. The frontal lobes and right hemisphere process the intentional, social and emotional context, identifying the contradiction between the literal meaning and the social/emotional context;
3. The right ventromedial prefrontal cortex integrates the literal meaning with the social/emotional knowledge of the situation and previous situations, helping the listener determine the true meaning.
So basically, the language center on the left side of the brain examines the words, the right hemisphere and frontal lobe determine the meaning, and the right side of the prefrontal cortex combines these two pieces of information.

Well, isn't that special?

Monday, May 9, 2016

John Oliver on Science in the Media

I've blogged about media representation of science before - but for the tl;dr, here's John Oliver's take:

P-hacking, which he discusses in the story, is definitely a thing. My grad school statistics professor called it "fishing." Basically, it's what happens when you run multiple statistical analyses on your data, looking for something significant. My dissertation director joked about doing this (not publishing) with some data on Alcatraz inmates; the only significant relationship they found was that Alcatraz inmates were significantly more likely to be Capricorns. She then looked at me very seriously, and asked, "You're not a Capricorn, right?"

Yes, I am.

Statistical results are probabilistic; we look for results that have a low chance of happening if no real relationship exists. We usually set that value at 5%. What that means is, if I run 20 tests, one of those will probably be significant by chance alone. That's less of a concern if I'm testing pre-existing (a priori) hypotheses based on past research and/or theory, but even then, I should apply a correction to account for the number of tests I'm running.
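To see how quickly those chance findings pile up, here's a quick simulation (a sketch in Python; the specific numbers are my illustration, not from John Oliver's story). Under the null hypothesis - no real relationship - a p-value is equally likely to land anywhere between 0 and 1:

```python
import random

random.seed(1)

# Under the null hypothesis (no real relationship), a p-value is
# uniformly distributed between 0 and 1, so each test has a 5%
# chance of landing below .05 purely by chance.
ALPHA = 0.05
N_TESTS = 20
N_SIMULATIONS = 10_000

runs_with_false_positive = 0
for _ in range(N_SIMULATIONS):
    p_values = [random.random() for _ in range(N_TESTS)]
    if any(p < ALPHA for p in p_values):  # at least one "significant" test
        runs_with_false_positive += 1

family_wise_rate = runs_with_false_positive / N_SIMULATIONS
print(f"Chance of at least one false positive in {N_TESTS} tests: "
      f"{family_wise_rate:.2f}")  # theory: 1 - 0.95**20, about 0.64

# One simple correction (Bonferroni): divide alpha by the number of tests
corrected_alpha = ALPHA / N_TESTS  # 0.0025
```

So a researcher who runs 20 unplanned tests has roughly a two-in-three chance of "finding" something that isn't there - which is exactly why corrections like Bonferroni exist.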

The problem with p-hacking is that, not only does it involve running many tests, it also usually involves only reporting the significant results. So a reader would have no idea that a person ran potentially dozens of tests based on reading the article. Unfortunately, this is one of the negative consequences of the "publish or perish" mentality. Scientists feel so much pressure to come up with results that they'll do things they know are questionable in order to meet their publication quotas for tenure and/or funding. And that problem compounds when journals reject articles that replicate past studies. As John Oliver says in the story, "There's no Nobel prize for fact-checking."

On Selfies

In 2013, the word "selfie" was added to the Oxford English Dictionary, a sign that selfies - a photo taken of oneself - were becoming so common as to enter our vernacular. Scrolling through Facebook, you'll see them everywhere. Even my parents know the meaning of the word selfie, and nearly all of my friends take them to varying degrees. I try, but I'll be honest - I'm awful at taking selfies.

Source: Moderately Confused
But what does taking selfies say about us? If you Google "psychology of selfies," you'll find tons of articles about the correlation between taking selfies and narcissism. On the other hand, people like me, who have to take tons of selfies before finding one we actually like, may have Body Dysmorphic Disorder. Wonderful.</sarcasm>

But are selfies really a sign of mental disorder? First, studies examining the relationship between selfie-taking and negative traits have found small to modest correlations. So there's a relationship there, but there are likely many other things that explain selfie-taking more strongly. Second, one could argue that selfies are simply a new approach to a rather old artistic approach: the self portrait.

Self portraits - in painting form - became common during the Early Renaissance, in part because of more widespread availability of inexpensive mirrors. Artists viewed self portraits as a form of advertisement - not only to see what the artist looks like but to show skill. The invention of photography made self portraiture more approachable, but still mainly for artists.

Until now. Now that people have ready access to cameras, and now that they are small enough to fit in one hand, it's understandable that even non-artists would want to join in on this medium.

Not only are they easier to take, selfies are a way to see yourself the way other people see you. Obviously, this could be a good thing or a bad thing, depending on whether you like what you see.

This approach also allows you to have photos of yourself in situations where you usually want a photo taken, such as when you're visiting a new place, without having to stop random people and ask them to take a picture. Or for situations where you may have wanted a photo taken, such as after getting ready for a night out, but haven't had anyone around to take one for you. So selfies could just be a reflection of independence - wanting to do things oneself - combined with the ease of taking such photos due to improvements of technology.

Even if there is a relationship between selfies and characteristics like narcissism, as selfies become more common (to the point that nearly everyone is taking them), that relationship will become smaller and smaller. In fact, one could argue that once just about anything enters the mainstream, it no longer correlates with personality traits.

What are your thoughts on selfies? Let us know in the comments!

Sunday, May 8, 2016

Awesomeness, Regressing to the Mean, and the Sports Illustrated Effect

I spent a lot of this past week volunteering for my choir. I've received many thanks and compliments for this, to which I replied, "We'll see how long it takes before I return to my baseline level of awesome." In statistics, this is known as regression to the mean.

In order to understand this concept, we have to start with our good friend, the normal distribution.

Let's talk first about how to read this guy. This is what's known as a frequency distribution. Along the X-axis are scores. The Y-axis is frequency, how often those scores occur. Because this is a symmetrical distribution, the very middle is the mean or average. But the normal distribution also has two additional features. The middle is also the median, the place that divides the distribution into 50% on each side. And it's also the mode, the most frequent score.

As you can see, the scores in the middle are very common. The scores on the far ends are rare. If we think in terms of probability - and the normal distribution is actually a probability distribution, giving the likelihood of various scores - middle scores are likely and extreme scores are unlikely. As a result, if I give a group of people a test once, I'll see many middle scores and few extreme scores. If I gave that same test again, the extreme scores would probably move (regress) toward the middle (mean).

So let's say the test is of ability. There are many variables that affect those scores besides knowledge: level of fatigue, environmental conditions (like lighting and temperature), and even mood, to name a few. Those variables would impact some of the scores, and we would see some extreme scores. We would expect those additional variables to, well, vary, so if we gave the ability test again, we would expect them to be different, and the extreme scores to become less extreme. Any time you specifically select people because their scores are extremely high or extremely low, you're going to see those scores become less extreme on average even if you do nothing to the group. True, some people have genuinely extreme scores, but for many, you may just be capitalizing on chance.
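You can watch this happen in a quick simulation (a sketch in Python; the score scale and noise levels are made-up numbers for illustration). Each simulated score is a stable true ability plus test-day noise, and we retest only the people whose first score was extreme:

```python
import random

random.seed(7)

# Each person has a stable "true" ability plus test-day noise
# (fatigue, lighting, mood...), and the noise changes between sittings.
N_PEOPLE = 100_000
TRUE_SD, NOISE_SD = 10, 5

true_ability = [random.gauss(100, TRUE_SD) for _ in range(N_PEOPLE)]
test1 = [t + random.gauss(0, NOISE_SD) for t in true_ability]
test2 = [t + random.gauss(0, NOISE_SD) for t in true_ability]

# Select the people whose *first* score was extreme (top 5%)
cutoff = sorted(test1)[int(0.95 * N_PEOPLE)]
extreme = [i for i, score in enumerate(test1) if score >= cutoff]

mean1 = sum(test1[i] for i in extreme) / len(extreme)
mean2 = sum(test2[i] for i in extreme) / len(extreme)
print(f"Extreme group, first test:  {mean1:.1f}")
print(f"Same group, second test:    {mean2:.1f}")
```

The extreme group's retest average drops back toward 100 even though nothing was done to them - their true abilities didn't change, only the luck did.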

To use a real-life example, let's think about a different kind of ability: athletics. Basketball players may suddenly be "on fire" during a key game. Baseball players may suddenly start hitting home run after home run. A golfer may suddenly hit a hole in one. They may be signs of high ability, but they may also be, at least in part, luck.

And for my friends who don't like sports, how about a Star Wars reference? Because we all know what happened to Obi-Wan shortly after he said this...

Those extreme examples get a lot of attention, and may lead that player to be put on the cover of Sports Illustrated. Many have noticed that after being on the cover, the player suddenly doesn't perform as well, known as the Sports Illustrated effect, or even the Sports Illustrated curse.

What we're probably seeing isn't a curse resulting from being on the cover of Sports Illustrated. More likely, it's simply regression to the mean. This is especially likely if a player was good but not fantastic before, and suddenly started performing really well. Or, you know, maybe steroids. But I digress. The point is, extreme scores tend to become less extreme over time.

So my current levels of awesome are likely to dissipate. But that's okay!

And by the way, if you'd still like to check out Apollo Chorus of Chicago's Broadway concert, you have one last chance today!

Saturday, May 7, 2016

Help Kickstart the National Museum of Psychology

I'm working on a new blog post for Monday, so for today, I just wanted to share an opportunity to help establish a National Museum of Psychology via this Kickstarter campaign:
With your help, we will open a newly renovated 6,000 square foot museum in the Spring of 2017. This museum will be the only one of its kind in the nation, dedicated to exploring psychology’s past, present, and future.

The museum space is nearly completed and the exhibits have been planned. What’s needed is your help to turn those plans into a reality. Your support will be used to create and install temporary and permanent museum exhibits that showcase many of the most important documents, media, and artifacts from the history of psychology.
I've already made my donation. Hope you can help out as well!

Friday, May 6, 2016

The Language of Music

I come from a very musical family. Both of my parents were in choirs in school, my brother writes and performs music, and music was almost always playing at home or in the car. So it's unsurprising that I developed a love of music very early in life - my mom would argue from infancy - and kept making music continuously up to now, on the piano or my flute, or with my voice. Even after I changed my major from theatre to psychology, I kept performing in my college choir.

Most of my friends are also musicians, or musically inclined, and I married a fellow musician. So music is, and probably always will be, an important part of my life. As a psychologist, though, I'm always fascinated by the experience of music and the brain activity involved with listening to or producing music. This is especially true because I notice that my husband's and my experiences of music seem to differ, and I think it comes down to brain activity.

Just to note some of the social differences between us: my husband minored in music, plays piano and organ incredibly well, and is at ease with many styles, including jazz, classical (choral and orchestral), showtunes, and church music/hymnody. My exposure growing up was mainly to classic rock and showtunes, my (minimal) vocal training is in musical theatre, and I played the flute in my school band (so lots of pep band/marching band music).

My husband loves listening to music when he's working or reading. I listen to a lot of music during the day, but if I'm reading or writing and need to do some heavy cognitive processing, I find the music can be distracting. And I notice this happens regardless of whether I'm listening to music with words or without. That is, it seems that the language center in my brain is processing the music, even when it's instrumental.

I'm not the first to notice this connection. Brown, Martinez, and Parsons (2006) examined the similarities of music and language in the brain. Their participants were amateur musicians, who heard snippets of melody and spoken phrases, and were asked to improvise melodies or phrases based on those snippets. Positron emission tomography (PET scan) was used to look at brain activity during these activities.
Direct comparisons of the two tasks revealed activations in nearly identical functional brain areas, including the primary motor cortex, supplementary motor area, Broca’s area, anterior insula, primary and secondary auditory cortices, temporal pole, basal ganglia, ventral thalamus, and posterior cerebellum. Most of the differences between melodic and sentential generation were seen in lateralization tendencies, with the language task favouring the left hemisphere. However, many of the activations for each modality were bilateral, and so there was significant overlap.
To put this in more common language, they found overlap in many major brain areas - including Broca's area, the so-called language center of the brain. And while language tasks tended to involve the left hemisphere, where the language center is located, more heavily than music tasks, both showed activity on both sides of the brain. In fact, while I was looking into the specific brain areas involved, I found this article, which references this handy graphic:

So one possibility is that music activates my language center more heavily, which makes it difficult to complete another linguistic task at the same time. Another potential factor is choice in music. At home, if I'm reading or writing, I'm usually at my desk or on the couch with a book or laptop, while my husband is at the "main" computer, where we store our music files. So in this scenario, he's in charge of what music we listen to. It's possible that the distraction factor is because I didn't select the music. We may react differently to music we are "forced" to listen to and music we select. This could be one reason why Christmas music is so annoying.

In any case, research shows that processing music is, in part, a linguistic task. Not only that, it activates many areas of the brain, including the motor cortex (we tap our feet or dance), the amygdala (we feel emotions in response to the music), and the hippocampus (we associate memories with the music). There are so many positive impacts of music on brain and skills development. Even in those few instances where I find music distracting, I know it has made me a better person.

Tonight, my choir, the Apollo Chorus of Chicago, is performing the first of two concerts featuring music of Broadway. If you're in the Chicago area, be sure to check it out!

Thursday, May 5, 2016

Frustration, Road Rage, and the Fundamental Attribution Error

On my way to a family gathering the other day, I witnessed some serious road rage. For a psychology class in school, I remember developing a survey about road rage, and did some reading on the concept in preparation. I read somewhere (and can't find that source now, unfortunately) that there are some people who respond with aggression to many situations, and others who only respond with aggression when they're driving; that is, they are generally friendly, calm people who get supremely pissed off at other drivers.

I can understand this reaction, because I also used to be really bad about road rage. Not the over-the-top, picking-fights-with-other-drivers rage; more shouting and flipping people off. Neither is particularly healthy, and I made a decision a couple of years ago to give up road rage completely. I decided that road rage is like an adult temper tantrum, and it doesn't actually help the situation. In fact, it's more likely to make things worse.

There are a variety of psychological explanations for road rage. A common one stems from the frustration-aggression hypothesis: when something frustrates you, you respond with aggression, especially when you have no control over the situation and can't really do anything to change it. Being stuck behind a bad driver, particularly when you're in a hurry, is incredibly frustrating.

A study by Lupton (read it here) examined people's views on driving and experiences with road rage, and found that driving was viewed as a pleasurable experience that allowed people to feel independent. When frustrations keep them from feeling that pleasure and remind them that they can't control other drivers, it's understandable that they would react with aggression.

However, attributions are also at play. I recently blogged about the fundamental attribution error, which occurs when you apply situational explanations to your own bad behavior, and dispositional explanations to others' bad behavior. In the case of driving, if I'm driving like a jerk, it's because I'm in a hurry; if someone else is driving like a jerk, well, it's because they're a jerk. And jerks make us angry.

Whether you realize it or not, when you get mad at other drivers, it might be because you're assuming their poor driving is dispositional, rather than situational. It's a lot harder to get mad at other drivers if you assume that they too are in a hurry or have somewhere else they need to be. It might be incorrect - maybe that bad driver is just a jerk. But if it helps keep you from getting upset, who cares?

I was talking to my husband about this, and he commented that, when you're in the moment (with road rage), it's hard to let it go, especially when the other driver starts to road rage. I told him that in that moment, you make a choice: you can choose to escalate or de-escalate. When another driver rages, I choose to let it go. Sure, he may laugh at me, and think I'm an idiot. But who cares what that guy thinks?

Wednesday, May 4, 2016

On Feelings (Woah-oh-oh, Feelings)

According to Paul Ekman, there are 7 basic human emotions that exist across cultures: happiness, sadness, surprise, contempt, anger, disgust, and fear. (It should be noted that Ekman's original research only identified 6 emotions; contempt was added later.) Ekman's work, based on research and observation in many different cultures, remains highly influential, and he has served as consultant for shows and movies, including the series Lie to Me, based on his work in lie detection, and the Pixar movie, Inside Out. Emotions, and their outward expression, serve an important evolutionary function, and also provide evidence in support of evolution, because humans and other primates share many emotional expressions.

These basic emotions are the ingredients for more complex emotions: happiness and contempt, for instance, combine to form smugness. And we all know what happens when you combine fear and anger:

And May the Fourth be with you!

Emotions not only give us cues when interacting with other people, allowing us to alter our behavior in response to their emotions, they also are an important guide in our own decision-making. Previously, we thought choice was determined by our perceptions about value; that is, we assign values to different outcomes, and pick the outcome (or course of action that would lead to that outcome) that has the most value to us. Value is considered a proxy for our emotions. However, no one has really tested that assumption directly.

Until now, that is. A recent study in Psychological Science, which you can read for free here, examined feelings and choice. From this, the researchers developed a "feeling function" that allowed them to relate feelings to value, and ultimately choice. If feelings are a proxy for value, we would expect a perfect or near perfect relationship between the two.

Based on past research, the authors expected losses to have more of an influence than gains; people are loss averse, and will rate a loss of money as more impactful than a win of the same magnitude. We also show diminishing sensitivity; a gain of $10 isn't twice as valuable to us as a gain of $5. The gambling task involved a series of shapes that participants had to choose between. Trials were randomized into one of three types: mixed (one shape had a 50% chance of a gain and a 50% chance of a loss, and the other was a sure option of 0), gain only (one shape had a 50% chance of a gain and 50% of 0, or a sure, smaller gain), and loss only (one shape had a 50% chance of a high loss and a 50% chance of 0, or a sure, smaller loss).

Participants also completed two feeling tasks: one about expected feelings - how they thought they would feel about winning or losing various amounts, and one about experienced feelings - how they actually felt after winning or losing various amounts. They used the expected feelings data to develop the feeling function, and tested it on the experienced feelings data. They found that feelings better predicted choice than values. They also found the diminishing sensitivity relationship, where a loss or gain of $10 has less than twice the impact of a loss or gain of $5. Interestingly, they didn't find evidence for loss aversion; losses and gains of the same magnitude had the same impact.
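To make "diminishing sensitivity" concrete, here's a toy feeling function (my own sketch, not the one the researchers fit - theirs was estimated from participants' ratings). A power curve with an exponent below 1 captures both findings: $10 feels less than twice as intense as $5, and losses mirror gains with no extra weight:

```python
def feeling(amount, sensitivity=0.5):
    """Toy feeling function: intensity grows as a power of the dollar
    amount (exponent < 1 means diminishing sensitivity). It's symmetric,
    so a loss feels exactly as intense as an equal gain - matching the
    study's lack of loss aversion."""
    sign = 1 if amount >= 0 else -1
    return sign * abs(amount) ** sensitivity

print(feeling(5))    # intensity of winning $5
print(feeling(10))   # more intense, but less than double feeling(5)
print(feeling(-10))  # a $10 loss: equal and opposite
```

Playing with the `sensitivity` exponent changes how quickly the curve flattens; adding a multiplier on losses would reintroduce loss aversion, which this study didn't find.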

A lot of people refer to decisions made with the aid of emotions as irrational. But emotions allow us to make quick decisions, as well as to decide between things that are generally equivalent. It's not always necessary to systematically consider all options. Sometimes, it's good to just trust your instinct.

But in all seriousness, May the Fourth be with you!

Tuesday, May 3, 2016

Weight Management and Reality TV

Weight management is one of the topics I study at my job, so it's unsurprisingly something I've blogged about before. Yesterday, the NY Times featured a story about The Biggest Loser, a reality TV weight-loss competition, and a scientific study performed using previous contestants on the show.

The study was done on 14 of the 16 contestants from Season 8 of the show, which aired about 6 years ago. Participants completed 3 days of testing at offices at the National Institutes of Health, where Kevin Hall, the principal investigator, works (at the National Institute of Diabetes and Digestive and Kidney Diseases, part of NIH). Additionally, researchers sent equipment beforehand to track participants' physical activity and weight prior to the visit (just in case participants tried to lose weight before testing began).

They found participants had drastically lower basal metabolic rates (BMR: the rate at which your body burns calories at rest). Over half of the participants now burn 400-800 fewer calories a day than the average person their size. For comparison, a pound equals 3500 calories. If these individuals used standard calculations to determine how many calories they could eat without gaining weight, they would gain between 0.8 and 1.6 pounds a week because of their slowed metabolism. Participants also had low levels of leptin, a hormone that controls hunger, which also explains weight gain; low leptin would make them feel hungry almost all the time.
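The weekly figures come straight from that arithmetic (a quick sketch; the 3,500-calories-per-pound rule of thumb is the one used above):

```python
CALORIES_PER_POUND = 3500  # rough rule of thumb for body-weight change

def weekly_gain_lbs(daily_shortfall):
    """Pounds gained per week if a standard calculation overestimates
    your metabolism by `daily_shortfall` calories per day."""
    return daily_shortfall * 7 / CALORIES_PER_POUND

print(weekly_gain_lbs(400))  # low end of the 400-800 calorie range: 0.8
print(weekly_gain_lbs(800))  # high end: 1.6
```

So a metabolism running 400-800 calories a day slower than the formulas assume translates to roughly 0.8-1.6 pounds gained per week, exactly as the article describes.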

Dr. Robert Huizenga, the physician for the show's contestants, sadly shows a complete lack of understanding that what contestants are being asked to do is unreasonable and unlikely to lead to long-term weight loss. In fact, his response to these findings - other than suggesting that the measurements in the study were inaccurate - was this:
But maintaining weight loss is difficult, he said, which is why he tells contestants that they should exercise at least nine hours a week and monitor their diets to keep the weight off.

“Unfortunately, many contestants are unable to find or afford adequate ongoing support with exercise doctors, psychologists, sleep specialists, and trainers — and that’s something we all need to work hard to change,” he said in an email.
That's right, 9 hours a week. Just for comparison, the recommendation from the American Heart Association is 40 minutes 3 to 4 times a week; that's less than 3 hours. And by the way, that number is actually for people at risk for heart disease, due to high blood pressure or cholesterol. If your BP and cholesterol are normal, you only need 2.5 hours a week of moderate exercise or 1 hour 15 minutes of vigorous exercise.

And don't even get me started on the fact that Dr. Huizenga thinks contestants need a team of weight management specialists after the show is over. Isn't the purpose of the show to demonstrate that regular people can lose weight and get healthy? Oh wait, it's reality TV, one of the most contrived, unrealistic situations you can find. Silly me.

Sure, Dr. Huizenga probably recommends this extreme amount of exercise because of the drastically reduced BMR. But it seems, for a medical professional, the better advice would be: don't do The Biggest Loser. Just don't.

The problem with these shows and other fad diets, where people want to lose weight quickly, is this: the faster the weight is lost, the more drastic the changes needed to lose it. And drastic changes, which can include extreme calorie restriction or unreasonable amounts of exercise (or, more likely, both), 1) are difficult to maintain long-term, and 2) trigger starvation mode, resulting in a slower metabolism. What we didn't know is how long the body stays in starvation mode following extreme weight loss. This study tells us that, even 6 years later, BMR and leptin levels are significantly reduced.

Further, this study highlights the importance of weight maintenance - the things we do to keep the weight off. In so many cases, people participate in diet programs to lose the weight, and are essentially told, "Good job. Have a nice life." But it's not that easy. Ongoing support is needed. Not at the level Dr. Huizenga thinks, but keeping weight off can be just as challenging as losing the weight initially, perhaps even more challenging. There's still a lot we don't know, and more research is needed. But this study gives us some direction for the future of weight management research.

Monday, May 2, 2016

Beer and the Minimal Group Paradigm Revisited

In April, I blogged about the minimal group paradigm - basically, how we form groups with people based on common characteristics. While we can form groups for any random bit of information, like what painting someone prefers, some characteristics are more important to us than others. That is, we consider some characteristics part of our core identity. These are the things that we tend to mention when people say, "Tell me about yourself."

In addition to being a woman, a social psychologist, and a singer (all important parts of my identity), I am a beer lover. I love trying new beers and new breweries. I'm especially a fan of porters, stouts, and sour ales, and will drink pretty much anything described as "barrel-aged."

So discovering that someone else is also a beer lover is like finding a kindred spirit. Friday, a coworker invited me to join a group for an after work outing. I considered going home, sitting on my couch, and reading - it had been a busy week and I needed some me-time. But then I saw the plan was to go to a local brewery called "Exit Strategy." (I had to laugh that I was trying to come up with an exit strategy to avoid going to Exit Strategy. Turns out the founders of Exit Strategy, the Valleaus, quit their jobs to start the brewery - the business was their exit strategy.)

I got online and checked out their website. First of all, awesome beer names, like Judgmental Dick and Nobody Reads the Copy. Second, old-fashioned posters for their beers:

So I decided I had to check this place out, and headed over right after work. First, I'm very glad that I used Google Maps turn-by-turn directions, instead of just looking up the address, because Exit Strategy is in such a nondescript brick building that I nearly drove by it (and would have if Google hadn't said, "Your destination is on the right"). That's my only complaint about the whole experience.

Parking is really easy to find on the street. The inside is super-cute:

It's a brewpub, so they also had food. I ordered the giant pretzel (sadly, didn't think to take a picture) - it's not just a name. This pretzel took up an entire metal tray. Next time, I'll probably plan on sharing it with someone.

Plus I got to bond with the coworker who organized the event, and learn about her favorite beers. Turns out we have many of the same tastes in beer, including our mutual love of stouts and porters. So the whole night was a win.

Sunday, May 1, 2016

How Do You Measure Success?: On Surviving the A-Z Blog Challenge

The A-Z Blog Challenge is over, and I'm thrilled to report it was an incredible success! I blogged 26 times in April - one for each letter of the alphabet, and following the schedule recommended by the A-Z Challenge Blog. I also got to talk through so many of my favorite social psychological issues this month - including topics I've not yet been able to work into my previous posts. It was great fun and I got a lot of positive feedback from both friends and new visitors.

I think the biggest indicator of success, for me, is that I didn't miss a scheduled blog post. There were days when the post came really late, and on those days, I seriously considered just waiting until tomorrow and writing two posts, or just moving a post to a Sunday. But I made myself do it, and it worked. I guess I should apply that same perseverance to other things in my life.

Some lessons learned that I'll apply to the next blog challenge:

  1. Having a theme was a huge help! I can't imagine having to come up with 26 topics on the fly.
  2. Relatedly, writing up a schedule with each topic already identified before April was an even bigger help. I think the problem I encounter with blogging regularly is coming up with a good topic, and I tend to depend too heavily on momentary inspiration to put together a blog post. It might be a good idea to identify certain topics I'd like to cover, and perhaps tie them to certain days or times of year.
  3. I should have written more of my posts ahead of time. Though I did a little of this, most days, I wrote the blog post the day it was supposed to be up, or at most one day in advance. This created a bit of a time crunch. Once I finally did start writing, it was easy to keep the momentum going - I just usually didn't have the time because I had to squeeze writing in between other tasks. Having an evening I devote to writing a few posts wouldn't be too hard if I just make a writing schedule and stick to it.
  4. I need to be better about writing down ideas. I've done this so much outside of the blog challenge - thought of a good blog topic to revisit later, only to forget what it was when I sat down at the computer. Even during the challenge, some days I would realize another potential topic for that letter and decide I could just blog about those topics later. Now I can't remember what they were.
  5. I need to make more time to visit other people's blogs and respond to comments. I was terrible about doing this, since April turned out to be a busy month work-wise. Writing posts ahead of time would allow me more time to be a good fellow blogger and see what others are writing, another good reason for #3 next April.
Still, I'm absolutely thrilled with how well this went, and I love looking at my archive for this year and seeing that I've already written twice as many posts in 2016 as I did in all of 2015.