Thursday, October 4, 2018
Resistance is Futile
In yet another instance of science imitating science fiction, scientists figured out how to create a human hive mind:
A team from the University of Washington (UW) and Carnegie Mellon University has developed a system, known as BrainNet, which allows three people to communicate with one another using only the power of their brain, according to a paper published on the pre-print server arXiv.
In the experiments, two participants (the senders) were fitted with electrodes on the scalp to detect and record their own brainwaves—patterns of electrical activity in the brain—using a method known as electroencephalography (EEG). The third participant (the receiver) was fitted with electrodes which enabled them to receive and read brainwaves from the two senders via a technique called transcranial magnetic stimulation (TMS).
The trio were asked to collaborate using brain-to-brain interactions to solve a task that each of them individually would not be able to complete. The task involved a simplified Tetris-style game in which the players had to decide whether or not to rotate a shape by 180 degrees in order to correctly fill a gap in a line at the bottom of the computer screen.
All of the participants watched the game, although the receiver was in charge of executing the action. The catch is that the receiver was not able to see the bottom half of their screen, so they had to rely on information sent by the two senders using only their minds in order to play.
This system is the first successful demonstration of a “multi-person, non-invasive, direct, brain-to-brain interaction for solving a task,” according to the researchers. There is no reason, they argue, that BrainNet could not be expanded to include as many people as desired, opening up a raft of possibilities for the future.
Pretty cool, but...
Tuesday, May 29, 2018
Stress and Its Effect on the Body
After a stressful winter and spring, I'm finally taking a break from work. So of course, what better time to get sick? After a 4-day migraine (started on my first day of vacation - Friday) with a tension headache and neck spasm so bad I couldn't look left, I ended up in urgent care yesterday afternoon. One injection of muscle relaxer, plus prescriptions for more muscle relaxers and migraine meds, and I'm finally feeling better.
Why does this happen? Why is it that after weeks or months of stress, we get sick when we finally get to "come down"?
I've blogged a bit about stress before. Stress causes your body to release certain hormones, such as adrenaline and norepinephrine, which give an immediate physiological response to stress, and cortisol, which takes a bit longer for you to feel at work in your body. And in fact, cortisol is also involved in many negative consequences of chronic stress. Over time, it can do things like increase blood sugar, suppress the immune system, and contribute to acne breakouts.
You're probably aware that symptoms of sickness are generally caused by your body reacting to and fighting the infection or virus. So the reason you suddenly get sick when the stressor goes away is that your immune system ramps back up, realizes there's a foreign body that doesn't belong, and starts fighting it. You had probably already caught the virus or infection, but didn't have symptoms like fever (your body's attempt to "cook" it out) or runny nose (your body increasing mucus production to push out the bug) that would have let you know you were sick.
And in my case in particular, a study published in Neurology found that migraine sufferers were at increased risk of an attack after the stress "let-down." According to the researchers, this effect is even stronger when there is a huge build-up of stress and a sudden, large let-down; it's better to have mini let-downs throughout the stressful experience.
And here I thought I was engaging in a good amount of self-care throughout my stressful February-May.
Friday, May 11, 2018
Neuroscience, Dopamine, and Why We Struggle to Read
I'm a proud book worm. Each year I challenge myself to read a certain number of books, and do so publicly, thanks to Goodreads. Last year, I read 53 books. This year, I challenged myself to read 60. I was doing really well. Then April and May happened, and with it Blogging A to Z, multiple events and performances, work insanity, and some major life stuff. I found it harder to make time for and concentrate on reading.
I got off track, and was disheartened when I logged into Goodreads and saw that I was behind schedule.
I know I should be proud that I've read 19 books already this year, but that "2 books behind schedule" keeps drawing my attention away from the thing I should be proud of.
And I'm not alone. A lot of people are having difficulty concentrating on and enjoying their time with books. We get distracted by a variety of things, including phone and email. So it's good timing that someone shared with me this article by Hugh McGuire, who built his life on books and reading, and discusses his own difficulty with getting through his ever-growing reading list:
This sickness is not limited to when I am trying to read, or once-in-a-lifetime events with my daughter.
And it's true - sometimes when I've made time to read, I find myself distracted by the digital world: Are there new posts on the blogs I follow? What's going on on Facebook? Hey, Postmodern Jukebox has a new video!
At work, my concentration is constantly broken: finishing writing an article (this one, actually), answering that client’s request, reviewing and commenting on the new designs, cleaning up the copy on the About page. Contacting so and so. Taxes.
It turns out that digital devices and software are finely tuned to train us to pay attention to them, no matter what else we should be doing. The mechanism, borne out by recent neuroscience studies, is something like this:
- New information creates a rush of dopamine to the brain, a neurotransmitter that makes you feel good.
- The promise of new information compels your brain to seek out that dopamine rush.
- With fMRIs, you can see the brain’s pleasure centres light up with activity when new emails arrive.
So, every new email you get gives you a little flood of dopamine. Every little flood of dopamine reinforces your brain’s memory that checking email gives a flood of dopamine. And our brains are programmed to seek out things that will give us little floods of dopamine. Further, these patterns of behaviour start creating neural pathways, so that they become unconscious habits: Work on something important, brain itch, check email, dopamine, refresh, dopamine, check Twitter, dopamine, back to work. Over and over, and each time the habit becomes more ingrained in the actual structures of our brains.
There is a famous study of rats, wired up with electrodes on their brains. When the rats press a lever, a little charge gets released in part of their brain that stimulates dopamine release. A pleasure lever.
Given a choice between food and dopamine, they’ll take the dopamine, often up to the point of exhaustion and starvation. They’ll take the dopamine over sex. Some studies see the rats pressing the dopamine lever 700 times in an hour.
We do the same things with our email. Refresh. Refresh.
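(If you like seeing a mechanism as a toy model: here's a quick sketch in Python of that reinforcement loop. It's my own illustration, not anything from McGuire's article or an actual neuroscience model - just a simple delta-rule value update showing how each rewarded "check" nudges the learned value of checking upward, so the pull to interrupt your work grows and then plateaus.)

```python
def simulate_email_habit(n_checks=10, learning_rate=0.1, reward=1.0):
    """Toy delta-rule model: each rewarding email check nudges the learned
    value of 'check email' upward, so the urge to check grows over time."""
    value = 0.0            # learned value of the 'check email' action
    urges = []
    for _ in range(n_checks):
        prediction_error = reward - value      # dopamine-like surprise signal
        value += learning_rate * prediction_error
        urge = value / (value + 1.0)           # chance of interrupting work to check
        urges.append(round(urge, 2))
    return urges

print(simulate_email_habit())
# e.g. [0.09, 0.16, 0.21, 0.26, 0.29, ...] - the pull to check keeps creeping up
```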
After I put my phone or computer down and pick up the book again, I sometimes have to reread a bit to remind myself where I was or because I wasn't really paying attention the first time I read a paragraph, distracted by what else might be going on in the world.
What can we do to change this? Hugh McGuire decided to set some rules for himself, such as keeping himself from checking Twitter and Facebook during certain times. What about you, readers? Any rules you make for yourself to keep your mind on the task at hand?
Friday, February 2, 2018
Psychology for Writers: Amnesia
When I started getting ready for this blog series, I added some posts to writer groups I belong to, asking people to describe some of the tropes in fiction that drive them crazy. One person commented on how head injuries are portrayed in fiction, and many replied with similar frustrations: people are knocked unconscious for a time and wake up sometime later with no lingering issues, bumps to the head are used to recover memories, and of course, the whole "don't let someone suffering from a concussion fall asleep" nonsense.
Two jobs ago, I worked as a researcher for the Department of Veterans Affairs. Though I didn't study this topic of head injuries myself, many of my colleagues did. (I've also suffered a couple of those in my lifetime, so I can talk about some of the experience firsthand.)
Traumatic brain injury, or TBI, is the medical term for a head injury. I was originally going to write this post on TBI, but realized that would make for a very long post - or else, a post in which I gloss over a lot of detail. So instead, I'm going to talk specifically about amnesia.
TBI is usually classified into one of three categories: mild, moderate, and severe. But any insult to the brain can bring with it side effects. The person suffering the head injury will probably experience what's called "post-traumatic amnesia," where they may lose their memory of things happening just before and/or just after the event. And they probably won't remember the experience of hitting their head at all. For a mild head injury, the individual may have a more severe bout of amnesia that comes on later, though it will still be short-term. In fact, the delay between the head injury and this amnesia can also be seen as an indicator of severity. For more severe head injuries, amnesia and other neurological symptoms, like seizure, may be almost immediate.
The length of this experience of amnesia is also related to severity. For extremely mild injuries, post-traumatic amnesia may last only a few minutes. For a severe injury, it can last days.
And no, another blow to the head won't bring the memory back. In fact, a new head injury shortly after an old one is very bad, because the brain can't heal itself as well. Too many injuries, and the brain may be unable to heal itself at all. This is why some retired professional football players develop disorders that look like dementia. Their brains have essentially become scar tissue.
For me, the amnesia happened about 3 hours after the injury - I couldn't respond to questions about who I was, who my family was, the day, or my dog's name (all questions my mom asked me). I remember this feeling odd but not as terrifying as you might think. In fiction, amnesia sufferers wander around terrified, asking who they are. But for me, it was more like the answers to these questions were on the tip of my tongue, and with just a little more time, I could answer all of them. Less "Oh my god, I don't remember!" and more "That's strange. I swore I knew that." So at some level, I knew that I had the answers to these questions somewhere in my brain. 28 years later, I still remember all of this, and all of the questions I was asked at the time. The only memory I have never recovered is the second when my head came in contact with the wall. (The experience in the ER is also a bit fuzzy, but mostly because I kept falling asleep - I was exhausted by that point. And I slept a lot in the months after, as I recovered.)
I was feeling back to normal, memory-wise, within a few hours. Now, there are longer-term side effects of a head injury - things like fatigue, insomnia, and intermittent memory issues, which comprise what's known as post-concussive syndrome. Another post for another day.
So amnesia via head injury is generally short-term unless the person is severely injured, and at that point, a little amnesia may be the least of the person's worries.
It's actually much more believable for someone to experience amnesia as a result of some kind of psychological trauma. Dissociation is a psychological term for when a person loses his or her sense of self, and encompasses a handful of related disorders. What was previously known as multiple personality disorder is a dissociative disorder, now called dissociative identity disorder. Dissociation becomes a defense mechanism to protect oneself from re-experiencing trauma.
During dissociative fugue - a much better explanation if you want a character to wander into a town with no memory - the person will often run away from their life and commitments. (In fact, the word "fugue" comes from the Latin word for flight.) While in this state, the person may forget who they are and details of their life. It isn't unusual for a person in a fugue to assume another identity, likely because they realize they need one to function. They probably won't appear to be in immediate distress. After all, the fugue was brought on to escape distress, and to most observers, the person will appear normal. The person may eventually come out of the fugue, but it's harder to predict. Unlike a head injury, which has a physical cause, and therefore a clear trajectory of healing, the block here is psychological and will exist as long as the person needs protection from whatever is causing distress.
Finally, there's a type of dissociation/amnesia many have experienced - alcohol-induced blackouts. Once again, short-term and physical in cause. The hangover will probably last longer than the amnesia itself.
Friday, January 26, 2018
Psychology for Writers: On Fight, Flight, and Fear Responses
This will be a new series on my blog, coming out on Fridays. (I may not post one every single week, but they'll always come out on a Friday.) My goal is to impart some wisdom from my discipline - psychology - to help writers, in part because I see a lot of tropes over and over that are not really in line with what we know about human behavior. (And I have so many writer friends with their own expertise - I'd love to see others share some domain knowledge like this to help us write more accurately!)
One of the big tropes I see is how people respond to extreme fear, usually by soiling themselves in some way. George R.R. Martin just loves to have his characters pee and crap themselves, but he's certainly not the only offender. In fact, I just finished Sleeping Beauties by Stephen King (one of my favorite authors) and Owen King, and was really frustrated when once again, the peeing-oneself-in-fear trope reared its ugly head.
Do people pee themselves in fear? Yes they do. It probably doesn't happen anywhere near as often as it does in books, but then we don't usually encounter some of literature's most fear-inducing creatures in real life. So we'll say for the moment that the frequency with which people pee themselves is fine. In fact, that's not even the issue I have.
The issue I have is when this response occurs. In the books (and movies & TV series too), something scary shows up and the character immediately wets him- or herself. Nope. That's not when it happens. It would happen after the fear moment has passed - after the creature has been beaten or moved on to another target, just as the character is beginning to breathe a sigh of relief. That's right - you pee after you've been afraid, not during.
How could that be? Let me talk to you for a moment about the fight vs. flight response, for which I've created this visual aid:
Regardless of whether you're going into battle or running away, your body's response is the same: in response to some stressor, your body is raring to go. Specifically, the sympathetic nervous system is getting you ready to expend energy, because fighting or flighting, you're going to be expending energy. Your body releases chemicals, like epinephrine (adrenaline), norepinephrine, and cortisol. Your heart rate and blood pressure increase. Your pupils dilate. Your muscles get ready for action.
But as these systems are charging up, other systems are slowing or shutting down completely. Your immune response is suppressed. Sexual arousal is suppressed. Digestion slows or may even stop. And your body stops producing waste materials, like urine. It's not very helpful if, just as you're getting ready to fight for your life, you realize you're hungry or need to pee, or if your allergies act up and you've suddenly got itchy eyes, sneezes, and a runny nose. Those feelings go away, temporarily, so you can focus your energy on surviving.
But after the stressful period ends, your parasympathetic nervous system takes over. Unlike the sympathetic nervous system's "fight or flight" response, the parasympathetic nervous system's response is "rest and digest." (I've also heard "feed and breed," because of the aforementioned impact on sexual arousal.) Those processes that were slowed or stopped start up again. Your muscles relax. You can imagine where this is going.
The best portrayal of this concept I've ever seen was in The Green Mile: the scene where Wild Bill, one of the inmates on death row, grabs Percy, one of the guards, and threatens him. The other guards get Wild Bill to release Percy, who rushes to the other side of the corridor and collapses on the floor. Wild Bill says with a smirk that he wouldn't have hurt Percy. And that's when Del, a fellow inmate, points out that Percy has just wet himself.
Percy is away from Wild Bill and safe. While the emotions he was experiencing when Wild Bill grabs him are still present (he's still upset), his body is returning to its previous physiological state. And that's when his bladder lets go.
So there you have it - if you really want your character to wet him- or herself, it should happen after the character begins "coming down."
I know one question I've gotten when discussing this in the past is, why do animals pee on someone/something in fear? Remember, animals don't have the same highly evolved nervous system that we have. Their systems are not going to shut down in the same way a human's does. Also, their reaction is not directly to fear, but is (arguably) an evolved response that is pretty effective in getting predators to let them go. Sure, an animal could bite instead, but that's aggressive and likely to earn an aggressive response. These animals are generally built for flight, not fight, and peeing is a submissive response. But for humans and other creatures higher on the food chain, being an apex predator means we'll have much different responses than animals that are not apex predators.
Bonus: you could use this information to portray the opposite. Sometimes authors want to show how badass their character is by having them be completely unfazed by something another character finds terrifying. You could have your character walking into battle commenting that he/she is starving and hopes to get something to eat after finishing with this nonsense, or something along those lines.
I know I blog about statistics a lot on this blog, but I'm always thrilled to talk about my discipline too! Any psychology topics you'd like to see addressed here? (Or any tropes you're tired of seeing?)
Friday, December 15, 2017
The Power of the Human Voice
Human beings are drawn to the sound of human voices. It's why overhearing half of a conversation can be so distracting. It's why DJs will talk over the intro of the song, but make sure they stop before the singer comes in. It's why Deke Sharon and Dylan Bell, two a cappella arrangers, recommend arrangements be kept short (less than 4 minutes).
And new research shows yet another way a human voice can have a powerful impact - it keeps us from dehumanizing someone we disagree with:
[F]ailing to infer that another person has mental capacities similar to one’s own is the essence of dehumanization—that is, representing others as having a diminished capacity to either think or feel, as being more like an animal or an object than like a fully developed human being. Instead of attributing disagreement to different ways of thinking about the same problem, people may attribute disagreement to the other person’s inability to think reasonably about the problem. [W]e suggest that a person’s voice, through speech, provides cues to the presence of thinking and feeling, such that hearing what a person has to say will make him or her appear more humanlike than reading what that person has to say.
They conducted four experiments to test their hypothesis that dehumanization is less likely to occur when we hear the person speaking their thoughts, rather than simply reading them. It wasn't even necessary to see the person doing the talking - that is, video and audio versus audio only did not result in reliably different evaluations. The authors conclude:
On a practical level, our work suggests that giving the opposition a voice, not just figuratively in terms of language, but also literally in terms of an actual human voice, may enable partisans to recognize a difference in beliefs between two minds without denigrating the minds of the opposition. Modern technology is rapidly changing the media through which people interact, enabling interactions between people around the globe and across ideological divides who might otherwise never interact. These interactions, however, are increasingly taking place over text-based media that may not be optimally designed to achieve a user’s goals. Individuals should choose the context of their interactions wisely. If mutual appreciation and understanding of the mind of another person is the goal of social interaction, then it may be best for the person’s voice to be heard.
This research inspires some interesting questions. For instance, what about computer-generated voices? We know we're getting better at generating realistic voices, but what is the impact when you know the voice is generated by a machine and not another human being? Also, the researchers admit that they couldn't test the impact of visual and audio cues separately. But what if you had an additional condition where you see the person, but their words are displayed as captions instead?
What are your thoughts on this issue? And where would you like to see this research go in the future?
Friday, March 10, 2017
The Truth About Your Brain
I've blogged in the past about Ben Carson, and all the problems when he tries to demonstrate expertise in an area that isn't neuroscience or medicine more generally. So I'm surprisingly not surprised by this speech where he got a lot of things wrong about the human brain:
Memory is tricky. Your brain isn't a recorder or a computer that commits everything that ever happens to you to some storage compartment. An electrode or hypnosis or sharp blow to the head won't suddenly make these instances come flooding back, and even if they did, those instances would probably be horribly inaccurate. Your brain is a complex organ inside an unbelievably complex system that allows us to navigate the world and have a semblance of self by actively interpreting what we encounter. It isn't the book we read that gets committed to memory - it's our brain's interpretation of it and how we connect it to previously learned information that (sometimes, not always) gets written to memory.
In fact, all our memories are interpretations, with our brain filling things in with previous experience and expectations. And it isn't just during encoding that mistakes can be introduced; it's during retrieval as well. Have you ever remembered a time in your past and somehow remember a person being there you didn't even know at the time? This happens to me a lot. There's no way I could have even known about that person then, let alone remember seeing them. But my present life gets mixed up with the past. It's a little like writing a paper and saving it to your computer only to find years later that it was accidentally merged with newer files. That's what your brain is like.
And a lot of research has shown how easy it is to implant false memories that feel just as - maybe more - real than actual memories.
Carson's speech was apparently extemporaneous, but still, when talking about something you've dedicated your life to, you should be able to talk off-the-cuff without resorting to misinformation and tropes from bad movies:
It remembers everything you’ve ever seen. Everything you’ve ever heard. I could take the oldest person here, make a hole right here on the side of the head, and put some depth electrodes into their hippocampus and stimulate, and they would be able to recite back to you verbatim a book they read 60 years ago. It’s all there; it doesn’t go away.
Ben, please stop. You're making everyone with a doctorate look bad.
Tuesday, February 28, 2017
Top-Down, Bottom-Up, and the ABCD of Personality
I've blogged many times about the human brain, taking time to discuss the various brain regions and what behaviors and processes they control. Your brain is an amazing demonstration of evolution in action, even in terms of its structure.
The lowest parts of the brain (the hindbrain - the cerebellum, pons, and medulla oblongata) control the basics of life: breathing, heartbeat, sleep, swallowing, bladder control, movement, etc. The midbrain/forebrain* controls processes that rank a little higher on the continuum, but still not what we'd consider high-level processing: emotion, sleep-wake cycle and arousal, temperature regulation, and the transfer of short-term to long-term memory (the very basics of learning), among other things.
Finally, there's the cerebral cortex, the outermost part of the brain and the last to develop, evolutionarily speaking. It is responsible for what we call consciousness, and this part of the brain in particular is responsible for many of the traits that differentiate humans from other animals - memory, attention, language, and perception. Other animals have a cerebral cortex as well, but it's not nearly as developed as our own.
These various brain structures work together, and sometimes a lower part of the brain will take over for the higher parts of the brain, especially when there is some kind of disorder of higher brain function. Sandeep Gautam over at The Mouse Trap discusses the work of Paul MacLean, and refers to activity coming from the lower brain areas as "bottom-up" and activity from the higher brain areas as "top-down." In his post, he discusses the ABCDs - affect (emotion), behavior, cognition (thought), and desire - and links these bottom-up/top-down processes to different personality traits, offering an eight-part structure of personality: a bottom-up and top-down trait for each of the ABCDs:
- Affective
  - Bottom-Up: How we respond to stimuli, specifically Introversion/Extroversion
  - Top-Down: Analyzing the situation for things that require increased vigilance and potentially anxiety, a trait called Neuroticism (aka: Emotionality)
- Behavioral
  - Bottom-Up: Basic response to stimuli, Impulsivity or Impulsive Sensation Seeking
  - Top-Down: A more thoughtful response to stimuli, including considering how that response might impact oneself and others, which could lead to inhibition. This trait is known as Conscientiousness
- Cognition
  - Bottom-Up: Degree of distractibility or focus when encountering new things, which manifests as the trait Openness to Experience
  - Top-Down: Making connections between concepts, a trait known as Imagination
- Desire/Drives
  - Bottom-Up: Degree of aggression in one's reactions, a trait known as Agreeableness
  - Top-Down: A process driven by expectation, which impacts one's desire to help or hurt others. He refers to this trait as the Honesty-Humility dimension
This structure is a departure from the Big Five personality traits. Obviously, it includes those 5 (Extroversion, Neuroticism, Conscientiousness, Openness to Experience, and Agreeableness), but adds 3 more (Impulsivity, Imagination, and Honesty-Humility). As I've mentioned before, I'm a big fan of the Big Five (more on that here), so I find this new structure interesting but a little strange. Probably what is strangest to me is that 3 of the Big Five are considered bottom-up processes, rather than the more thoughtful, controlled top-down. I would have thought Agreeableness and Openness to Experience were the result of higher-level processing.
It's a somewhat artificial divide of course. Except in the case of injury to a higher-level part of the brain, even bottom-up processes are going to be shaped by higher-level thinking. Your degree of Introversion/Extroversion, for instance, may influence your most basic response to social stimuli, but it's going to take higher-level processing to understand how best to handle that reaction and also determine what you need in that situation (that is, I'm feeling X, so do I need alone time or time with others?).
What do you think about this new taxonomy?
*These two areas tend to be differentiated from each other, but I was always taught about them in combination, under the title "midbrain." The forebrain includes structures like the amygdala, hippocampus, and so on. They rank higher up than the hindbrain, but are still considered "subcortical."
Tuesday, February 7, 2017
Booze Clues
Human beings have been enjoying booze for 9,000 years, according to an article in this month's National Geographic. Though alcoholic beverages were previously considered mere consumables, the author pulls together evidence arguing that these beverages have a special place in our culture, being connected to important traditions and even inspiring us and pushing us forward as a species:
All over the world, in fact, evidence for alcohol production from all kinds of crops is showing up, dating to near the dawn of civilization. University of Pennsylvania biomolecular archaeologist Patrick McGovern believes that’s not an accident. From the rituals of the Stone Age on, he argues, the mind-altering properties of booze have fired our creativity and fostered the development of language, the arts, and religion. Look closely at great transitions in human history, from the origin of farming to the origin of writing, and you’ll find a possible link to alcohol. “There’s good evidence from all over the world that alcoholic beverages are important to human culture,” McGovern says. “Thirty years ago that fact wasn’t as recognized as it is now.” Drinking is such an integral part of our humanity, according to McGovern, that he only half jokingly suggests our species be called Homo imbibens.
The active ingredient common to all alcoholic beverages is made by yeasts: microscopic, single-celled organisms that eat sugar and excrete carbon dioxide and ethanol, the only potable alcohol. That’s a form of fermentation. Most modern makers of beer, wine, or sake use cultivated varieties of a single yeast genus called Saccharomyces (the most common is S. cerevisiae, from the Latin word for “beer,” cerevisia). But yeasts are diverse and ubiquitous, and they’ve likely been fermenting ripe wild fruit for about 120 million years, ever since the first fruits appeared on Earth.
From our modern point of view, ethanol has one very compelling property: It makes us feel good. Ethanol helps release serotonin, dopamine, and endorphins in the brain, chemicals that make us happy and less anxious.
To our fruit-eating primate ancestors swinging through the trees, however, the ethanol in rotting fruit would have had three other appealing characteristics. First, it has a strong, distinctive smell that makes the fruit easy to locate. Second, it’s easier to digest, allowing animals to get more of a commodity that was precious back then: calories. Third, its antiseptic qualities repel microbes that might sicken a primate. Millions of years ago one of them developed a taste for fruit that had fallen from the tree. “Our ape ancestors started eating fermented fruits on the forest floor, and that made all the difference,” says Nathaniel Dominy, a biological anthropologist at Dartmouth College. “We’re preadapted for consuming alcohol.”
Robert Dudley, the University of California, Berkeley physiologist who first suggested the idea, calls it the “drunken monkey” hypothesis.
Drunken Monkey would be a great name for a brewery. Just sayin'.
The article also features a great infographic about alcohol throughout history:
Tuesday, November 15, 2016
There Are No Old Dogs, Just New Tricks
Throughout my adult life, I've run into many people who talk about wanting to learn some new skill: a new language, some topic of study they didn't get to in college, a new instrument, etc. When I've encouraged them to go for it, I often hear the old adage, "You can't teach an old dog new tricks." As a social psychologist with a great deal of training in behaviorism, I know for many things, that just isn't true. It might be more difficult to learn some of these new skills in adulthood, but it's certainly not impossible. In fact, your brain is still developing even into early adulthood. And the amazing concept of brain plasticity (the brain changing itself) means that a variety of cognitive changes, including learning, can continue even past that stage.
A new study in Psychological Science examined training in different skills, comparing individuals from ages 11 to 33, and found that some skills are better learned in adulthood. They included three types of training: numerosity-discrimination (specifically in this study, "the ability to rapidly approximate and compare the number of items within two different sets of colored dots presented on a gray background"), relational reasoning (identifying abstract relationships between items, sort of like "one of these things is not like the other"), and face perception (identifying when two faces presented consecutively on a screen are of the same person or different people). They also measured verbal working memory with a backward digit-span test, which involved presentation of a series of digits that participants had to recall later in reverse order.
Participants completed no more than 1 training session per day over 20 days, through an online training platform. They were told to use an internet-enabled device other than a smartphone (so a computer or tablet). The tasks were adaptive, so that performance on the previous task determined the difficulty of the next task. To compare this training program to one you're probably familiar with, what they created is very similar to Lumosity.
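(For the curious: an adaptive task like this is easy to sketch. Below is a minimal, hypothetical Python version of a backward digit-span trial with a one-up/one-down staircase - my own toy illustration of the general idea, not the platform, tasks, or adaptive rules the researchers actually used.)

```python
import random

def backward_digit_span_trial(n_digits):
    """Show a random digit sequence; the correct response is the sequence reversed."""
    sequence = [random.randint(0, 9) for _ in range(n_digits)]
    # In a real task the digits would appear one at a time and then disappear.
    print("Digits:", " ".join(str(d) for d in sequence))
    response = input("Type them back in REVERSE order (no spaces): ").strip()
    return response == "".join(str(d) for d in reversed(sequence))

def run_session(n_trials=10, start_span=3):
    """One-up/one-down staircase: a correct answer lengthens the next
    sequence, an error shortens it (never below 2 digits)."""
    span, results = start_span, []
    for _ in range(n_trials):
        correct = backward_digit_span_trial(span)
        results.append((span, correct))
        span = span + 1 if correct else max(2, span - 1)
    return results

if __name__ == "__main__":
    for span, correct in run_session():
        print(f"Span {span}: {'correct' if correct else 'wrong'}")
```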
They found that training improved performance on all three tasks (looking at the group overall, controlling for the number of training sessions the participants actually completed). But when they included age as a variable, they found some key differences. The improvements they saw in numerosity-discrimination were due mainly to the results of the adult age group; when adults were excluded, and they only looked at adolescents, the effects became non-significant. The same was also true in relational-reasoning performance. Though there was a trend toward more improvements among adults on face-perception, these differences were not significant. You can take a look at accuracy by skill, testing session, and age group below (asterisks indicate a significant difference):
Another key finding was that there was no evidence of transfer effects - that is, receiving ample training in one task had no impact on performance in a different task. This supports something psychologists have long argued, much to the chagrin of companies that create cognitive ability training programs (ahem, Lumosity): training in a cognitive skill improves your ability in that skill specifically, and doesn't cause any generalized improvement. That's not to say doing puzzles is bad for you - it's great, but it's not going to suddenly improve your overall cognitive abilities.
But the key finding from this study is not only that "old dogs" can learn new tricks, but that for some tricks, older really is better.
EDIT: I accidentally omitted the link to the study abstract. But the good news is, I discovered it's available full-text for free! Link is now included.
Thursday, October 27, 2016
Lying Liars Who Lie
In the words of one of my favorite characters, Gregory House:
One lie can easily turn into two, and small lies can easily become big lies. And today I encountered some recent research that suggests why.
Whenever you tell a lie, you experience a little twinge of emotion - usually guilt. That guilt may not be enough to keep you from lying, especially if the lie benefits you and does not necessarily hurt someone else. And past research has shown that increased exposure decreases our emotional response over time. So just like your first break-up is likely to hurt a lot more than your fifth break-up (cue "The First Cut is the Deepest"), the guilt you feel from your first lie is going to be much greater than the guilt you feel after lie #793.
To test this hypothesis, and get at the specific brain response to lies, researchers had people participate in a game while undergoing functional Magnetic Resonance Imaging (fMRI):
Specifically, we adapted a two-party task used previously to elicit and measure dishonesty. Participants advised a second participant, played by a confederate, about the amount of money in a glass jar filled with pennies. We changed the incentive structure over the course of two experiments such that dishonesty about the amount of money in the jar would either benefit the participant at the expense of their partner (Self-serving–Other-harming, Experiment 1), benefit both (Self-serving–Other-serving, Experiment 1), benefit the partner at the expense of the participant (Self-harming–Other-serving, Experiment 1), benefit the participant only without affecting the partner (Self-serving, Experiment 2) or benefit the partner only without affecting the participant (Other-serving, Experiment 2). Importantly, the participants believed that their partner was not aware of this incentive structure but thought that they were working together at all times to provide the most accurate estimate, which would benefit them both equally. A baseline condition enabled us to infer the amount of dishonesty on each trial without the participant being instructed to act dishonestly or required to admit to dishonesty.

The researchers observed dishonesty escalation - that is, people became more dishonest over time - when lying was self-serving. This was the case even when the lie hurt the other person, though people lied more when the lie served the other person as well. (So they still lied when it hurt their partner, just not as much as when it helped them.) Results from the brain scans showed reduced amygdala activity over time. As I've blogged about previously, the amygdala, part of the brain's limbic system, is involved in emotional response.
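To see how adaptation alone can produce escalation, here's a purely illustrative toy model (my own construction, not the authors' analysis): a guilt-like signal decays with each repeated self-serving lie, and the size of the next lie grows as that restraining signal shrinks.

```python
import math

def simulate_escalation(n_lies=20, decay=0.15, max_lie=10.0):
    """Toy model of adaptation-then-escalation: an amygdala-like 'guilt' signal
    decays exponentially with repeated lying, and lie magnitude scales with how
    much of that restraining signal has been lost. Illustrative numbers only."""
    history = []
    for i in range(n_lies):
        signal = math.exp(-decay * i)           # emotional response adapts with exposure
        lie_magnitude = max_lie * (1 - signal)  # weaker signal -> bigger lie
        history.append((i + 1, signal, lie_magnitude))
    return history

for trial, signal, lie in simulate_escalation():
    print(f"lie #{trial:2d}: guilt signal {signal:.2f} -> lie magnitude {lie:.2f}")
```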
So, to answer Liz Phair's question, "Why I Lie?" the answer is: it just keeps getting easier.
Sunday, October 23, 2016
Trying to Be Less Dumb
Previously, I blogged about a book called You Are Not So Smart. Last night, I started the sequel, You Are Now Less Dumb:
Like the previous book, it deals with cognitive biases, which I've blogged about before. But while the purpose of the previous book was to make you aware of these various biases, the goal of this book is to help you "learn from failings" and "feel more connected with the community of humanity." Less succinctly:
You think seeing is believing, that your thoughts are always based on reasonable intuitions and rational analysis, and that though you may falter and err from time to time, for the most part you stand as a focused, intelligent operator of the most complicated nervous system on earth. You believe that your abilities are sound, your memories are perfect, your thoughts are rational and wholly conscious, the story of your life true and accurate, and your personality stable and stellar. The truth is that your brain lies to you. Inside your skull is a vast and far-reaching personal conspiracy to keep you from uncovering the facts about who you actually are, how capable you tend to be, and how confident you deserve to feel.

You may wonder why, as a psychologist who is familiar with much of this research, I would read this book. Wouldn't it make more sense for someone without that knowledge and education? But the thing I love about reading these types of books, especially when they accurately discuss the research, is that I come away with new insights and connections. I also enjoy the anecdotes and findings they bring in from other fields, which give me a wider view and understanding. For instance, in the first chapter, McRaney talks about the importance of narrative in understanding ourselves and others. Part of his evidence for the importance of these narratives? Centrifuges.
Applying g-forces to the human body can have many interesting effects - not just pushing the body around, but also (when one "pulls too many g's") keeping blood from getting to your brain. As a result, you pass out. The Air Force and places like NASA use centrifuges as they teach pilots techniques they can use to keep blood in their brain, and keep themselves from passing out. During practice, many of the pilots pass out, and interestingly, they may report visions and hallucinations - which sound strikingly similar to visions reported by people with "near-death experiences":
The tunnel, the white light, friends and family coming to greet you, memories zooming around--the pilots experienced all of this. In addition, the centrifuge was pretty good at creating out-of-body experiences.
As Whinnery and other researchers have speculated, the near-death and out-of-body phenomena are both actually the subjective experience of a brain owner watching as his brain tries desperately to figure out what is happening and to orient itself amid its systems going haywire due to oxygen deprivation. Without the ability to map out its borders, the brain often places consciousness outside the head, in a field, swimming in a lake, fighting a dragon--whatever it can connect together as the walls crumble. What the deoxygenated pilots don't experience is a smeared mess of random images and thoughts.
Narrative is so important to survival that it is literally the last thing you give up before becoming a sack of meat.
Friday, October 21, 2016
How a Picture of Margaret Thatcher Demonstrates the Way We Process Faces
You may remember seeing this picture of Margaret Thatcher in your introductory psychology textbook:
Notice anything odd about either of these pictures (other than being upside-down, of course)? I'll give you a hint that there is something wrong, and it's much more obvious if you look at it right-side up:
Obvious, and also slightly terrifying. Thatcher's mouth and eyes have been inverted in the picture on the left. It's completely obvious when the face is viewed right-side up, but often unnoticeable when viewed upside down. This effect, known as the Thatcher Illusion, was first demonstrated in 1980 by Professor Peter Thompson from the University of York. But the demonstration was about more than just making the then-Prime Minister look funny. It helped show the way we process faces.
Previously, people speculated that we processed faces piecemeal - taking in each feature on its own. But Thompson argued that we take a holistic view of the face, registering the overall configuration as well as the individual features. When a face is presented in a way we rarely see it (upside down), we fall back on the individual features, and because the flipped eyes and mouth are locally right-side up, each feature looks normal on its own. When the image is turned right-side up again, holistic processing takes over and we immediately see that the configuration is wrong.
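For the curious, here's a rough Pillow sketch of how you might "Thatcherize" a portrait yourself: flip the eye and mouth regions vertically in place, then view the whole face upside down. The file name and box coordinates are hypothetical and would need hand-tuning for a real photo.

```python
from PIL import Image, ImageOps

def thatcherize(path, eye_boxes, mouth_box, out_path="thatcherized.png"):
    """Flip the eye and mouth regions vertically while leaving the rest of the
    face untouched, then flip the whole image so it is viewed upside down.
    Boxes are (left, upper, right, lower) pixel coordinates."""
    img = Image.open(path).convert("RGB")
    for box in list(eye_boxes) + [mouth_box]:
        region = img.crop(box)
        img.paste(ImageOps.flip(region), box)  # ImageOps.flip flips vertically
    ImageOps.flip(img).save(out_path)          # the upside-down view hides the edit
    return img

# Hypothetical coordinates for a roughly 400x500-pixel portrait:
thatcherize(
    "face.jpg",
    eye_boxes=[(110, 180, 190, 230), (210, 180, 290, 230)],
    mouth_box=(140, 330, 260, 380),
)
```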
Thursday, October 13, 2016
Sweet Dreams Are Made of This
Worth Psychology recently tweeted this link to a New York Magazine story, "5 of Humanity's Best Ideas of What Dreams Actually Are." Believe it or not, I learned a thing or two in the article:
The earliest recorded dream is from the Sumerian king Dumuzi of Uruk, who ruled just before Gilgamesh, sometime around 2500 BC. “An eagle seizes a lamb from the sheepfold,” a translation reads. “A falcon catches a sparrow on the reed fence … The cup lies on its side; Dumuzi lives no more. The sheepfold is given to the winds.” The king was freaked out about his dream, and occasioned the first recorded dream interpretation, care of his sister, who was evidently a professional at these things. Sister’s advice: Some bad shit is about to go down, so you’d do well to hide.

The article then goes through the 5 theories, offering some background and explanation for each. Unfortunately, they give a lot of attention to the Freudian/psychoanalytic perspective that dreams are your brain trying to tell you something.
- Dreams are pragmatic prophecies - This is not to say dreams are actual prophecies, but rather, that because we know our situation well and because human survival comes from our ability to think about and prepare for the future, we could have dreams about things that we expect to happen. Dreams in this theory are about preparation, so that we are ready to deal with situations in our wakeful life.
- Dreams tell you what to do - Unfortunately, for this item on the list, they really just followed along with the previous item, even delving into actual prophetic elements (such as mentioning that Lincoln dreamed of a White House funeral days before his assassination). However, one story they share, about Descartes, gets at a different aspect of dreams - that of consolidating and synthesizing information, and using that to solve problems:
"In the 17th century, René Descartes, the great doubter, had his life course shifted by a series of dreams he had one November evening. In Soul Machine: The Invention of the Modern Mind, historian-psychiatrist George Makari reports that Descartes had a series of sleeping visions that prompted him to realize that 'spatial problems could become algebraic, which crystallized a vision of a natural world underwritten by mathematical laws,' thereby changing his life and eventually the popular, scientific conception of reality."
This realization was likely not a sudden insight, but an issue Descartes had spent a great deal of time thinking about. Because of the memory consolidation aspect of sleep, combined with dreaming about something that had been on his mind, Descartes was able to solve his mental puzzle.
- Dreams are communications from the unconscious mind - The good old "Your dreams are trying to tell you something" hypothesis. I've blogged before about my thoughts on Freud and psychoanalysis, and why social psychological findings are a much better explanation for some of the things Freud and his ilk observed. And of course, I would argue that this "theory" differs very little from the first and second items on this list. (I would also argue that none of these first three items meet the definition of theory, hence the quote marks.)
- Dreams are data - This one is not actually a theory at all, but rather a thinly veiled reason to share the Sleep and Dream Database, a crowdsourcing site that has catalogued 20,000 dreams (which, to be fair, is a cool idea, and a site I'll be visiting myself). Using these data, researchers have been able to examine psychological themes, and the results suggest that, because we are rarely alone in dreams and we tend to dream about people we are close to, we use dreams to explore the quality of our relationships with others. Honestly, that would have been a better bullet point for this list.
- Dreams are your memories in action - And finally, the author mentions the memory consolidation and learning aspects of dreams. Of all these "theories," this is the only one for which they offer research support over anecdotes. The author hints at the neural network aspect of memory, and also shares findings from a study on learning that allowed some participants to nap afterward, resulting in enhanced learning. (In fact, many such studies have been performed.) But they also discuss a study on male zebra finches that provides support for this theory. Birds aren't born knowing how to sing certain patterns, but instead learn them. Researchers at the University of Chicago found that sleeping male zebra finches showed the same patterns of neuron firing they show while singing, suggesting they may be practicing their songs in their dreams.
While we're awake, we think through things and examine patterns, but we also have to devote a lot of energy to being awake and interacting with the world, making decisions that take up space in our limited working memory. Sleep is a time of rejuvenation, when our body replenishes, heals, and repairs. Our cognitive resources are free to work through problems, and our brain is consolidating memories, getting us closer to a synthesized solution. And because our problems range from the intellectual (like Descartes's) to the social and psychological, the content of our dreams can also range from thought problems to interpersonal issues.
Sweet dreams, everyone!
Wednesday, October 12, 2016
Don't Tell Momma What You Know
Not long ago, a post went viral that shouted to the masses "New Research Establishes That Intelligence Is Inherited From The Mother." Hopefully everyone called their mom and thanked her for this incredible gift, not because it's true (it's not), but because it's always good to call your mom and thank her for something.
Not the point of the post. The point is that, no, you don't inherit your intelligence from your mom. At least not completely. In a recent post for Forbes, science writer Emily Willingham ripped this post to shreds:
A garbled post from a website called Second Nexus has gone viral in my feeds (and possibly yours), likely because of its eye-catching headline claim that “New Research Establishes That Intelligence Is Inherited From The Mother.” The piece is bylined “Editorial Staff,” presumably because everyone was too embarrassed to put a real name on it.

Ouch. She goes on to offer some education on genetics, including that 1) women "tend to" have two X chromosomes (but not always), 2) one of those X's had to come from the father, and 3) having two X's doesn't double your odds of receiving some trait, because cells might shut down most of one X, reducing its influence.
But more than that, she offers a far more complex view of intelligence, reminding her audience that only about half of what we call intelligence comes from genes:
While maybe half of our intelligence as we currently define and measure it is inherited, that proportion is in turn fractured into many many genetic variants scattered across our genomes. These variants operate together in various ways to form what we view as intelligence. And each of those fragments of heredity that contributes is itself subject to a host of environmental factors, both in its immediate molecular world and inputs to the whole organism, that will influence function. And that influence continues after birth as an ongoing mutual interplay of gene variants and environment. It’s layer upon layer upon layer of interacting pieces. So no. Not just your mother. Not just the X chromosome. Not even just genes.

Willingham then traces back the sources the original post used to make its claims, including Psychology Spot (a blog she calls "a dumpster fire of poor information about genetics and embryonic development"), and Cosmopolitan.
I saw many of my friends, including fellow scientists, share this hot mess of a post. Why? Willingham has one potential explanation:
Those headlines. It’s an irresistible invitation to humblebragging, whether you have a mother and think you’re a genius or you are a mother and think your children are geniuses or you’re feeling feminist and want to stake a claim that women bring the smarts to this world. That’s a pretty solid built-in audience ready to click…and share.

This is not to say mothers aren't important for their children's development and intelligence. About half of our intelligence comes from genes, and short of genetic engineering, there's not a lot we can do about that. But the other half seems to be environmentally influenced. Mothers (and fathers, and really anyone interacting with children) have a lot more control here. For instance, some research shows that offering growing infants lots of stimulation can result in more developed brains - the environment can actually influence the physical. Taking children out to see the world, and experience new things, does enrich them and boost their intelligence. (Note: this is real stimulation, not on a screen - though some would argue television has deleterious effects on children, the most we can say from research is that it has no effect. And don't get me started on the Baby Einstein videos...)
In my case, I know my mom is responsible for a lot of my intelligence, not just from genetics, but from offering an enriching world. She would teach me new words and ask me to try to use them in a sentence that day. I was probably the only first-grader who knew "tumultuous." She taught me the difference between independent and dependent variables when I entered a science fair in 7th grade. And she instilled in me the curiosity that continues to influence my efforts today. She's also the reason I write.
Regardless of what genes were passed on (or not), there's a lot we can do for children.
Tuesday, October 11, 2016
What Would I Do If I Could Feel?
In one of my favorite childhood stories, The (Wonderful) Wizard of Oz, the Tin Man was unable to feel emotions because he did not have a heart. He, Dorothy (with her dog, Toto), the Scarecrow, and the Cowardly Lion make their way to see the Wizard of Oz in the Emerald City so that each can get their heart's desire (pun fully intended). Though in the book the Wizard fools each character into thinking he has actually given them what they want, in the MGM movie version, as well as one of the best adaptations of the source material (The Wiz), the realization is that each character had what they wanted all along. They just didn't see it, and their thinking had blinded them to the strengths they already possessed.
Though these are just stories, research suggests that how we think about ourselves and our abilities impacts our reality. It can actually change us, for better or for worse. Today in "This Week in Psychological Science," I learned about a new study that finds how we think about emotions actually impacts how they are represented (and experienced) in our brains. Specifically, the researchers set out to examine thinking about emotions as categories, or thinking about them on a continuum:
An important aspect of emotion perception that has been overlooked concerns the difference between the continuous nature of the sensory inputs that people receive and the categorical nature of their thinking about emotion. In facial expressions, the contractions of various facial muscles can vary continuously to create gradations of movements (Jack & Schyns, 2015). But people typically talk about these expressions in categorical terms, calling them expressions of “fear” or “calm,” for instance (Barrett, 2006).

Whether we think about emotions continuously or categorically does have some important ramifications. A nuanced view of emotion can make us better at detecting how others are feeling, while thinking about them in categories means that a facial expression or behavior has to reach some threshold before we decide the other person is feeling a certain emotion. This would in turn impact our behavior and interactions with others. It might also impact how we manage our emotions. Viewing emotions continuously also allows us to understand more complex emotions, made up of combinations of the basic emotions. According to Ekman, there are 7 basic emotions:
They conducted two studies, in which participants examined pictures portraying emotions while undergoing MRI. In the first study, participants saw faces displaying fear or calm to varying degrees, and either categorized them or marked where they fell on a continuum for "fear" and "calm." In the second study, participants judged their own responses to graphic images, either as categorically "bad" or "neutral" or along a continuum. They found different levels of activation in parts of the brain responsible for emotion, specifically the amygdala and the insular cortex (the part of the cerebral cortex folded deep within the lateral sulcus, close to the amygdala and other limbic structures):
Activation was greater when the perceived intensity of the negative emotion was greater. When people were only given categories to work with, they had to force pictures into those holes, even when the displayed emotion wasn't very extreme. So a picture showing very mild fear that a participant categorized as fearful produced greater activation in those parts of the brain. Activation was lower when the perceived intensity of the positive or neutral emotion was greater. Allowing a "gray area," through use of a continuum, would allow for more accurate perception, while being forced to use categories may change perception, leading people to perceive the target as being more similar to the category selected. Obviously, this goes beyond emotions, and could have some important implications for other work involving categories, such as stereotypes.
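As a toy illustration of that "forcing into holes" idea (made-up numbers, not the study's data or method), compare what happens when the same mildly fearful faces are rated on a continuum versus forced into two categories:

```python
# Hypothetical fear intensities for seven face stimuli, on a 0-1 continuum.
intensities = [0.05, 0.15, 0.25, 0.35, 0.55, 0.75, 0.95]

def categorize(intensity, threshold=0.3):
    """Forced choice: anything past the threshold gets labeled 'fearful',
    no matter how mild it actually is."""
    return "fearful" if intensity >= threshold else "calm"

for x in intensities:
    print(f"intensity {x:.2f} -> continuum rating {x:.2f}, category '{categorize(x)}'")
# A face at 0.35 and a face at 0.95 land in the same bin - exactly the kind of
# gradation the continuous scale preserves and the categorical scale throws away.
```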
Thursday, September 29, 2016
Susan Schneider Williams Shares Her Story with the Readers of Neurology
Actor Robin Williams died by suicide in August of 2014. In the time that followed, there was much speculation as to why a man who brought joy to so many people could take his own life. His widow, Susan Schneider Williams, finally shared that Robin had been suffering from Lewy body dementia, a neurodegenerative disease that runs a rapid course and has no cure. Personally, I was surprised that she didn't speak about this sooner - her announcement came a year after the suicide. Today I found out the real reason (via Neuroskeptic) in an article Susan published in the journal Neurology, which is freely available here.
Robin and Susan knew something was wrong, and multiple doctors identified multiple explanations for his constellation of symptoms (Parkinson's disease, major depression, and so on). They prescribed many medications that helped only a little or, in some cases, made it worse. As Susan discussed in her article, Robin was doubtful of every diagnosis he heard. He knew something was wrong, and though he didn't know what exactly, the diagnoses being thrown at him didn't get to the heart of what he was experiencing. Imagine how frustrating and terrifying that must have been, not just for Robin but for the person closest to him. He wasn't diagnosed with Lewy body until after his death. Would knowing the true cause have given him some comfort? I see now why she didn't want to speak out right away. Not only that, she wanted to learn more about the disease that had taken her husband:
Three months after Robin's death, the autopsy report was finally ready for review. When the forensic pathologist and coroner's deputy asked if I was surprised by the diffuse LBD pathology, I said, “Absolutely not,” even though I had no idea what it meant at the time. The mere fact that something had invaded nearly every region of my husband's brain made perfect sense to me.

Since then, Susan has become involved with the American Academy of Neurology and serves on the Board of Directors of the American Brain Foundation. She shares her story now in the hope that it will help others:
In the year that followed, I set out to expand my view and understanding of LBD. I met with medical professionals who had reviewed Robin's last 2 years of medical records, the coroner's report, and brain scans. Their reactions were all the same: that Robin's was one of the worst LBD pathologies they had seen and that there was nothing else anyone could have done. Our entire medical team was on the right track and we would have gotten there eventually. In fact, we were probably close.
But would having a diagnosis while he was alive really have made a difference when there is no cure? We will never know the answer to this.
Hopefully from this sharing of our experience you will be inspired to turn Robin's suffering into something meaningful through your work and wisdom. It is my belief that when healing comes out of Robin's experience, he will not have battled and died in vain. You are uniquely positioned to help with this.
I know you have accomplished much already in the areas of research and discovery toward cures in brain disease. And I am sure at times the progress has felt painfully slow. Do not give up. Trust that a cascade of cures and discovery is imminent in all areas of brain disease and you will be a part of making that happen.
Wednesday, September 28, 2016
Why Antidepressants Fail: The Answer May Lie at the Heart of Social Psychology
Depression is one of the most common mental disorders in the US, affecting almost 7% of the population:
Though antidepressants are a frequent treatment for depression, as well as related disorders, studies estimate that they are ineffective in 30-50% of people. And a new study may have uncovered why:
The new research suggests it is at least partly down to people’s environment whether or not antidepressants work. Antidepressants may give the brain a chance to recover from depression, but more is needed. The rest could be down to being exposed to relatively low levels of stress.

The study was done in mice, so more research is needed to confirm whether this holds true for humans. But this finding aligns with a variety of social psychological (and related) theories. In fact, the influence of environment on physical health has long been the subject of public health and health services research and is part of the motivation for changes to healthcare delivery (like the patient-centered care model).
Tuesday, September 13, 2016
More Fun with Perception
Another fun demonstration of perception, this image of a grid and 12 black dots showed up on Facebook and Twitter yesterday:
You probably can't tell that there are 12 dots, because you can't actually see all of them at the same time. Many people report seeing only 1-2 at a time. I was able to squint and see 4 at a time, but that's the most I can manage. So what's going on in this crazy picture?
A little digging and I found out the image was originally uploaded by a Japanese psychology professor, Akiyoshi Kitaoka, and a similar puzzle was presented in this article, which contains many optical illusion demonstrations.
The reason you can't see all 12 dots is because you have to use peripheral vision to do so, and while our peripheral vision is fine for seeing big things, it's not so great with fine detail:
In this optical illusion, the black dot in the center of your vision should always appear. But the black dots around it seem to appear and disappear. That’s because humans have pretty bad peripheral vision. If you focus on a word in the center of this line you’ll probably see it clearly. But if you try to read the words at either end without moving your eyes, they most likely look blurry. As a result, the brain has to make its best guess about what’s most likely to be going on in the fuzzy periphery — and fill in the mental image accordingly. That means that when you’re staring at that black dot in the center of your field of view, your visual system is filling in what’s going on around it. And with this regular pattern of gray lines on a white background, the brain guesses that there’ll just be more of the same, missing the intermittent black dots. Those dots disappear and reappear as your eye jitters around "like a camera that’s not being held stably," [vision scientist, Derek] Arnold says.
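If you want to play with the effect yourself, here's a rough Pillow sketch that draws a similar gray grid on white with black dots at a handful of intersections. It is not Kitaoka's original image; the sizes, colors, and dot positions are guesses chosen to look roughly similar.

```python
from PIL import Image, ImageDraw

def extinction_grid(cell=80, n_cells=14, line_width=6, dot_radius=9,
                    out_path="twelve_dots.png"):
    """Gray grid on a white background with 12 black dots at intersections --
    a rough reconstruction of the illusion, not the original image."""
    size = cell * n_cells
    img = Image.new("RGB", (size, size), "white")
    draw = ImageDraw.Draw(img)

    gray = (160, 160, 160)
    for i in range(1, n_cells):
        pos = i * cell
        draw.line([(0, pos), (size, pos)], fill=gray, width=line_width)  # horizontal line
        draw.line([(pos, 0), (pos, size)], fill=gray, width=line_width)  # vertical line

    # Twelve intersections, spread out so most of them fall in peripheral vision.
    dot_positions = [(2, 2), (6, 2), (10, 2), (4, 5), (8, 5), (12, 5),
                     (2, 8), (6, 8), (10, 8), (4, 11), (8, 11), (12, 11)]
    for gx, gy in dot_positions:
        x, y = gx * cell, gy * cell
        draw.ellipse([x - dot_radius, y - dot_radius, x + dot_radius, y + dot_radius],
                     fill="black")

    img.save(out_path)
    return img

extinction_grid()
```

Stare at any one dot in the saved image and the others should start winking in and out, just as described above.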
Friday, September 9, 2016
Cheese, Gromit! Cheese!
Finally, a study to explain why I feel compelled to eat the cheese stuck to the pizza box, even when it ends up being 30% cardboard:
Cheese contains a chemical found in addictive drugs, scientists have found. The team behind the study set out to pin-point why certain foods are more addictive than others. Using the Yale Food Addiction Scale, designed to measure a person’s dependence on [foods], scientists found that cheese is particularly potent because it contains casein. The substance, which is present in all dairy products, can trigger the brain’s opioid receptors which are linked to addiction.

Your body contains these receptors in the brain, spinal cord, and digestive system. Opioids have many effects, including pain relief and euphoria, which alone can be addictive (who doesn't like being happy and pain-free?), but they can also directly cause addiction through molecular changes in the brain and the involvement of naturally occurring (aka endogenous) opioids in the human body - chemicals like endorphins, dynorphins, and endomorphin. These chemicals can affect the release of other chemicals in the brain, which can result in behavioral impacts (such as drug-seeking behavior).
Obviously, cheese addiction isn't as severe as addiction to actual opioid drugs (such as, say, morphine or heroin), or else we'd hear about a lot more cheese addiction treatment programs.