Tuesday, May 31, 2016

Blast from the Past: Facebook "On This Day"

If you use Facebook, and you've been on it for more than a year, you've probably gotten more than a few "On This Day" memories. Most of them - for me, anyway - are just random things I posted, like a cute picture of puppies or a joke I made. Occasionally, they are something more meaningful.

On this day, 5 years ago, I was sending out the announcement for my dissertation defense. These defenses are considered public, so candidates are required to post an announcement a certain number of days in advance. Usually, announcements are only posted in the academic department where the defense will take place, and sometimes they are publicized to the larger university.

I joked with friends about also posting the announcement on Facebook, though I didn't want random friends to feel compelled to show up. Instead, I decided to have a little fun with it and posted this:

WOSSAMOTTA U DEPARTMENT OF SILLINESS

ANNOUNCES THE FINAL PUBLIC HUMILIATION OF SARA M. HOUSATELLI

B.S. COMPARATIVE NERDINESS, FRAK UNIVERSITY
M.A. BEING AWESOME, UNIVERSITY OF SHEEN, CHARLIE

FLYING HIGH: THE EFFECT OF PAPER AIRPLANE ASSEMBLY SKILLS ON SOCIAL SKILLS

MAY 39, 2011; 10 AM; YOUR MOM’S LIVING ROOM

ALFRED G. PACKER, DCA, CANNIBALISM CHAIR
DOUGLAS ADAMS, HHG
JOE GREENS, DMU
LUCIE MANETTE, TTC

In case you're wondering, this is the format in which defense announcements are written. For comparison, this is the real thing:

LOYOLA UNIVERSITY CHICAGO

DEPARTMENT OF PSYCHOLOGY

ANNOUNCES THE FINAL PUBLIC EXAMINATION OF

SARA M. LOCATELLI
B.S. PSYCHOLOGY, BAKER UNIVERSITY
M.A. SOCIAL PSYCHOLOGY, LOYOLA UNIVERSITY CHICAGO

PERSONAL BELIEFS AND PUBLIC PRINT:
THE INFLUENCE OF PRE-EXISTING ATTITUDES AND
PRETRIAL PUBLICITY INFORMATION ON FINAL VERDICTS

JUNE 8, 2011; 10:00 AM; COFFEY HALL 116

COMMITTEE CHAIR NAME, DEGREE, COMMITTEE CHAIR (Redacted to protect the innocent)
REMAINING MEMBERS, DEGREES...

Spoiler alert: I passed that dissertation defense, so on June 8 this year, it will be exactly 5 years since the first time I was called "Dr."

Monday, May 30, 2016

Follow-Up on Peer Review

I've blogged many times about the peer review process (here and here, especially) - one of the first steps in publishing scientific research in scholarly journals. While the purpose of peer review is, in part, to improve the manuscript, there can certainly be a "too many chefs" component.

Sunday, May 29, 2016

Meta-Analysis and Reproducibility: More from Sara's Week in Psychological Science

Yesterday, I attended a symposium on meta-analysis, a set of methods and statistical techniques for combining the results of multiple studies. The purpose of a meta-analysis is to estimate the true effect of some phenomenon, and also to understand how characteristics of samples, studies, and so on impact study results. I did one of my grad school candidacy exams on meta-analysis, and I have a meta-analysis of my own that I've been working on and trying to get published.

During yesterday's session, one of the presenters talked about an Open Science Framework project in which the researchers are trying to reproduce the findings of multiple meta-analyses in psychology. The issue is that, as with many psychological studies, findings from past meta-analyses are often difficult to reproduce, in part because of the strong impact of subjective decisions made when conducting a meta-analysis: which types of studies to include, how to divide up the data within studies, and which analysis technique to use are all left largely to the discretion of the researcher.
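To make that concrete, here's a minimal sketch in R using the metafor package (my choice for illustration - the project itself may use different tools, and the effect sizes below are made up) showing how a single analytic decision can shift the pooled estimate:

# Toy example: the same five (hypothetical) studies, two defensible
# analytic choices, two different pooled estimates.
library(metafor)

dat <- data.frame(ri = c(0.10, 0.35, 0.22, 0.05, 0.41),  # correlations
                  ni = c(50, 120, 80, 200, 60))           # sample sizes

# Convert correlations to Fisher's z and compute sampling variances
dat <- escalc(measure = "ZCOR", ri = ri, ni = ni, data = dat)

# Fixed-effect model: assumes one true effect shared by all studies
summary(rma(yi, vi, data = dat, method = "FE"))

# Random-effects model: allows the true effect to vary across studies
summary(rma(yi, vi, data = dat, method = "REML"))

Run both and you'll see the pooled estimate and its confidence interval change - and that's before you've even made any decisions about which studies to include in the first place.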

This project will involve meta-analyzing 2000 studies. Obviously, they are going to need help, and if you're so inclined, you can volunteer to be part of the project.

Friday, May 27, 2016

Childhood Abuse, Reliability, and Measurement Error: More from Sara's Week in Psychological Science

This morning, I attended a great session on methodological issues in studying trauma. One of the presentations was about studying childhood abuse. Many measures of childhood abuse are administered retrospectively, to college students or other adults, asking about past experiences. Researchers have noted a variety of issues with this approach, including that reports of childhood abuse seem to vary over time - that is, a person may report having been abused as a child at one timepoint, but not at another.

This leads some to conclude that reports of childhood abuse may be influenced by current levels of distress - people may misremember or misreport childhood abuse depending on how distressed they are feeling as adults. We know from memory research that current experiences can color memories of past experiences; memory is highly malleable. However, any time we measure something in people, we also have to worry about measurement error. Poorly worded questions, respondent fatigue, and other factors may affect how well the measure "works."

The great thing about structural equation modeling (SEM) is that it separates out measurement error, so we can get a purer read on how much particular constructs relate to each other. Some people have used this as a mark against SEM - that it shows a "perfect world" relationship we would rarely see in practice. But in many cases, SEM is a great technique for answering questions about whether inconsistency in a measure reflects measurement error or systematic sources of variation.

The researchers measured past experiences of childhood abuse at the same time as they measured symptoms of distress, using a post-traumatic stress disorder (PTSD) checklist. Two weeks later, they measured both variables again. They built a model in which time 1 PTSD predicted time 2 PTSD, time 1 reports of childhood abuse predicted time 2 reports, and time 2 PTSD predicted time 2 reports. What they found was that time 2 PTSD predicted less than 2% of the variance in time 2 reports of childhood abuse. Time 1 abuse reports were highly predictive of time 2 abuse reports, and once measurement error was factored out, they found reliability in abuse reports above 0.80 (which in the measurement world is considered excellent).
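For the curious, here's roughly what a model like that looks like in lavaan syntax. This is my own sketch, not the presenters' code - the indicator names (ptsd1_a and so on) and the data frame mydata are placeholders:

library(lavaan)

model <- '
  # Each construct is a latent variable with multiple (hypothetical)
  # indicator items; multiple indicators per construct are what let
  # the model separate measurement error from the construct itself
  PTSD_t1  =~ ptsd1_a + ptsd1_b + ptsd1_c
  PTSD_t2  =~ ptsd2_a + ptsd2_b + ptsd2_c
  Abuse_t1 =~ abuse1_a + abuse1_b + abuse1_c
  Abuse_t2 =~ abuse2_a + abuse2_b + abuse2_c

  # Structural paths matching the study description
  PTSD_t2  ~ PTSD_t1
  Abuse_t2 ~ Abuse_t1 + PTSD_t2
'

fit <- sem(model, data = mydata)  # mydata: one row per participant
summary(fit, standardized = TRUE, rsquare = TRUE)

The standardized coefficients and the rsquare output are where a finding like "less than 2% of the variance" would show up.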

What this means is that, when it looks like people are saying different things at different times, measurement error is a much more likely culprit than how a person is currently feeling. Obviously, the study was done over a short timeline (2 weeks), so results may differ over a longer period. This was also a college sample, and reports of abuse, especially physical and sexual abuse, were low. But this study gives some guidance for studying reports of childhood abuse in other samples, and highlights a case where separating measurement error from systematic variation (i.e., actual differences in reports of childhood abuse) is the optimal approach.

Thursday, May 26, 2016

The Trouble with Workshops

As I blogged yesterday, I’m attending the Association for Psychological Science annual meeting, which is being held in Chicago this year. I’ve spent this afternoon in a workshop on how to conduct structural equation modeling (one of my favorite statistical techniques) using an R package called lavaan.

The workshop lasted almost 4 hours and was taught by Yves Rosseel, who developed lavaan. Unfortunately, as I’ve discovered with many workshops, most of the session was spent giving a refresher course on structural equation modeling. We didn’t even get to R or lavaan until almost 1.5 hours into the workshop. This also meant we had little time to do hands-on work using the package; the practice session ended up running concurrently with the break, so we had to choose which we wanted. To be fair, he gave us the slides we would need for the practice session, so we could work through it on our own afterward.

While I found the information interesting, and learned a thing or two, I wanted to spend more time digging into the package and how it could be used to test various structural equation models. That’s especially true because I’ve used lavaan before and had a lot of success with it – it’s very user friendly compared to other R packages I’ve used – but I wanted to learn to do more and go beyond the simple models I ran before.

The problem with workshops, of course, is that there are rarely prerequisites. So it’s difficult to assume everyone will come into the workshop with, say, a strong working knowledge of the underlying statistics, or some basic experience with the program.

But not impossible. Workshop descriptions could tell people what knowledge they are expected to have. Slides sent ahead of time (as these were) could include background material, and attendees could be asked to review it on their own prior to attending the workshop.

Even a simple survey among people signing up for the workshop could be used to assess how much people already know about a topic. (Obviously, asking people what they know can be problematic, but at the very least, you could find out where most of the people will be in terms of exposure to the subject matter, and shape the workshop around the majority group.)

That being said, I enjoyed the workshop and would recommend it to others. Rosseel is an excellent teacher and very funny. My favorite comment from the session was his description of the mystical land of Asymptotia, a magical place where all the data are normally distributed.

Probably a neighbor of Lake Wobegon.

I just wish they had allowed Professor Rosseel to teach 2 workshops: one about structural equation modeling alone, and another about using lavaan.

Wednesday, May 25, 2016

Sara's Week in Psychological Science

The Association for Psychological Science meeting starts tomorrow!


I'm looking forward to hearing about some of the current psychological science research. I blogged previously about some of the workshops being offered. I finally settled on the Structural Equation Modeling with lavaan workshop. lavaan is a package for the R statistical language. I've used it before and found it to be relatively user friendly; the manual is also very helpful. However, there were a few things I couldn't figure out how to do, so the workshop will at least be an opportunity to ask some questions.
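If you haven't seen lavaan syntax before, here's a taste of what I mean by user friendly - a complete confirmatory factor analysis using the classic Holzinger & Swineford dataset that ships with the package (a minimal sketch of my own, not anything from the workshop):

install.packages("lavaan")  # one-time install from CRAN
library(lavaan)

# Three latent factors, each defined (=~) by three observed items;
# the syntax reads almost like the path diagram
model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
  speed   =~ x7 + x8 + x9
'

fit <- cfa(model, data = HolzingerSwineford1939)
summary(fit, fit.measures = TRUE)

Compare that to spelling out every matrix by hand, and you can see why I like it.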

I hope to post some updates while I'm attending the conference. Stay tuned!

Tuesday, May 24, 2016

Patient Data and Myriad Genetics

A few days ago, the ACLU filed a complaint on behalf of 4 individuals who had genetic testing performed, asking Myriad Genetics to release the individuals' data. Though Myriad did so after the complaint had been filed, it didn't provide everything the individuals wanted and argue they have a right to:
Genetic testing labs, including Myriad, normally provide clients with information on gene variants that are known to increase the risk of disease (pathogenic), likely to be pathogenic, or of uncertain significance. But nearly everyone also has variants (often called polymorphisms) that are deemed benign—and companies typically don’t send clients information about those variants. But as researchers pool data from thousands of cancer gene tests, some of those benign variants may be reclassified as dangerous, or may become informative in ways that hadn’t been anticipated. That is one reason some of the patients involved in the complaint wanted Myriad to provide all of their genetic data, and not just the information on the pathogenic variants.

In the complaint, ACLU and the plaintiffs, supported by researchers in cancer genetics, say that Myriad ran afoul of a new regulation promulgated by HHS this past January: that individuals have a right to receive “the full gene variant information generated by the test.”
The issue is that, although the new HHS policy allows individuals to request their full genetic information, Myriad only shares the pathogenic data - and actually re-sent the same results to the 4 individuals after the complaint.

What's strange about the whole thing is that a new policy about data transparency was released so quietly, on the HHS blog. Of course, releasing any policy via blog post is odd in itself.