Alexander Swan, an assistant professor of psychology at Eureka College, said the video-sharing social network TikTok has been helpful, particularly for college-age students.
“They’re content creating in a way and giving viewers a means to humorously explore their own quarantine,” said Swan, who holds a doctorate in psychological and brain sciences from the University of California-Santa Barbara. “Humor is an incredible way to deal with stress, and I promote this coping strategy a lot in my health psychology course.”
Noting that laughing is a stress-relieving aerobic exercise that releases endorphins and dopamine, Swan added, “So watch those TikTok videos, those comedy films or specials, or have a good laugh session with friends or family over Zoom.”
Other connection apps he suggested are iMessage, Facetime, Google Hangouts, Zoom, WhatsApp and Facebook Messenger. “Though we may be social distancing, it doesn’t mean we have to be alone,” he said. “Loneliness is the devil in the details here and it’s incredibly important to stave off the feelings in any way we can.”
April and May 2020 have been busy months! Despite the seemingly endless back half of the semester, the livestreaming, and video-editing, two very cool things have happened.
I was interviewed for two pieces for the media organization UNILAD. I offered my thoughts on the 20th anniversaries of two enduring films: American Psycho (2000) and Final Destination (2000).
Have you ever clicked on a link like “What does your favorite animal say about you?” wondering what your love of hedgehogs reveals about your psyche? Or filled out a personality assessment to gain new understanding into whether you’re an introverted or extroverted “type”? People love turning to these kinds of personality quizzes and tests on the hunt for deep insights into themselves. People tend to believe they have a “true” and revealing self hidden somewhere deep within, so it’s natural that assessments claiming to unveil it will be appealing.
As psychologists, we noticed something striking about assessments that claim to uncover people’s “true type.” Many of the questions are poorly constructed – their wording can be ambiguous and they often contain forced choices between options that are not opposites. This can be true of BuzzFeed-type quizzes as well as more seemingly sober assessments.
On the other hand, assessments created by trained personality psychologists use questions that are more straightforward to interpret. The most notable example is probably the well-respected Big Five Inventory. Rather than sorting people into “types,” it scores people on the established psychological dimensions of openness to new experience, conscientiousness, extroversion, agreeableness and neuroticism. This simplicity is by design; psychology researchers know that the more respondents struggle to understand the question, the worse the question is.
But the lack of rigor in “type” assessments turns out to be a feature, not a bug, for the general public. What makes tests less valid can ironically make them more interesting. Since most people aren’t trained to think about psychology in a scientifically rigorous way, it stands to reason they also won’t be great at evaluating those assessments. We recently conducted a series of studies to investigate how consumers view these tests. When people try to answer these harder questions, do they think to themselves “This question is poorly written”? Or instead do they focus on its difficulty and think “This question’s deep”? Our results suggest that a desire for deep insight can lead to deep confusion.
Confusing difficult for deep
In our first study, we showed people items from both the Big Five and from the Keirsey Temperament Sorter (KTS), a popular “type” assessment that contains many questions we suspected people find comparatively difficult. Our participants rated each item in two ways. First, they rated difficulty. That is, how confusing and ambiguous did they find it? Second, what was its perceived “depth”? In other words, to what extent did they feel the item seemed to be getting at something hidden deep in the unconscious?
Sure enough, not only were these perceptions correlated, but the KTS was seen as both more difficult and deeper. In follow-up studies, we experimentally manipulated difficulty. In one study, we modified Big Five items to make them harder to answer, like the KTS items, and again we found that participants rated the more difficult versions as “deeper.”
We also noticed that some personality assessments seem to derive their intrigue from having seemingly nothing to do with personality at all. Take one BuzzFeed quiz, for example, that asks about which colors people associate with abstract concepts like letters and days of the week and then outputs “the true age of your soul.” Even if people trust BuzzFeed more for entertainment than psychological truths, perhaps they are actually on board with the idea that these difficult, abstract decisions do reveal some deep insights. In fact, that is the entire idea behind classically problematic measures such as the Rorschach, or “ink blot,” test.
In two studies inspired by that BuzzFeed quiz, we found exactly that. We gave people items from purported “personality assessment” checklists. In one study, we assigned half the participants to the “difficult” condition, wherein the assessment items required them to choose which of two colors they associated with abstract concepts, like the letter “M.” In the “easier” condition, respondents were still required to rate colors on how much they associated them with those abstract concepts, but they more simply rated one color at a time instead of choosing between two.
Again, participants rated the difficult version as deeper. Seemingly, the sillier the assessment, the better people think it can read the hidden self.
Intuition may steer you wrong
One of the implications of this research is that people are going to have a hard time leaving behind the bad ideas baked into popular yet unscientific personality assessments. The most notable example is the Myers-Briggs Type Indicator, which infamously remains quite popular while doing a fairly poor job of assessing personality, due to longstanding issues with the assessment itself and the long-discredited Jungian theory behind it. Our findings suggest that Myers-Briggs-like assessments that have largely been debunked by experts might persist in part because their formats overlap quite well with people’s intuitions about what will best access the “true self.”
People’s intuitions do them no favors here. Intuitions often undermine scientific thinking on topics like physics and biology. Psychology is no different. People arbitrarily divide parts of themselves into “true” and superficial components and seem all too willing to believe in tests that claim to definitively make those distinctions. But the idea of a “true self” doesn’t really work as a scientific concept.
Some people might be stuck in a self-reinforcing yet unproductive line of thought: Personality assessments can cause confusion. That confusion in turn overlaps with their intuitions about how their deep psychology works, and so they tell themselves the confusion is profound. In this way, intuitions about psychology might be especially pernicious. Following them too closely could lead you to know less about yourself, not more.
It’s been a while since I actually made a post on my website updating my academic life. Well, since the education and technology class has finished, that’s pretty much what I am back to. We’ll see if I can keep up.
Anyway, this summer it looks like I’ll be teaching two classes in the Psychology Department: Health Psychology and Intro to Research Methods again. I’m excited to develop a student-focused Health Psych class from the ground up. I am also excited to re-tool and revamp my research methods class from last year. I’m probably going to use a new book that I believe will be better and more approachable.
However, the best part of teaching research methods again is the ability to implement part of the education and technology final project. Take a look at the video my colleague, Molly Metz, and I made below (it’s intentionally silly):
We can’t really implement the full personalized adaptive learning platform in 6 weeks, but I can implement and integrate the ZAPS portion (or something like it) into the class in that time, just to make sure the students know what the class is about and understand the need for psychological science at such an early point in their college careers. We will pretest attitudes and interests, move through the ZAPS process, finish up with a small paper and a posttest of attitudes and the like, then compare the pretest and posttest for any changes. Hopefully there’s a publication in there somewhere (Teaching of Psychology seems like the appropriate place, no?). More importantly, hopefully the new perspective in this type of course will lead to better-prepared students in the upper-division classes and lab classes at UCSB. One can only hope that comes true.
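Since the plan is essentially a pretest/posttest comparison of attitude scores, a minimal analysis sketch might look like the following. The file name and column names here are hypothetical placeholders for illustration, not the actual course data:

```python
# Minimal sketch of the planned pretest/posttest comparison of attitudes.
# The CSV file and column names are assumptions for illustration only.
import pandas as pd
from scipy import stats

# One row per student, with attitude scores before and after the ZAPS-style module
scores = pd.read_csv("attitude_scores.csv")  # columns: student_id, pretest, posttest

# Paired t-test: did attitudes toward psychological science change?
t_stat, p_value = stats.ttest_rel(scores["posttest"], scores["pretest"])
mean_change = (scores["posttest"] - scores["pretest"]).mean()

print(f"Mean attitude change: {mean_change:.2f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```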
My health psych class won’t be as technologically advanced, but I do hope to get the students interested in health psych by having them participate in a health behavior change assignment over the 6-week session. College students are full of bad habits, so maybe a few of them will continue to change their behavior after the course is completed. Showing them real studies with important health implications also matters; my goal is to use the book only as a support, not the complete resource for the course, since relying on the book alone is boring and predictable.
There is apparently lots of work to be done in the next couple of months, since both classes are in the first session of summer school! And then a trip to Berlin for a conference! 2013 is one heckuva year!
Design Problem: Make methods and statistics courses more palatable and help students understand how psychological science works within the broader field of psychology.
Solution #1 (Alex’s Idea)
Introduce a class project or projects in a methods course that facilitates discussion of design, implementation, and analysis.
The laboratory engine (ZAPS) has an online component: students register for the site and can take part in famous psychology experiments from cognitive, social, and other areas of psychology.
Modules allow individual registrants’ results to be aggregated into class data.
Students can also print out their individual performance for aggregation into a spreadsheet for analysis (see the aggregation sketch after this list).
Modules are presented to students using Flash.
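As a rough illustration of the aggregation step mentioned above, here is a hypothetical sketch of pulling each student’s exported performance file into a single class dataset. The folder structure and column names are assumptions about what the lab engine might export, not its actual format:

```python
# Hypothetical sketch: combine each student's exported module results into class data.
# The folder structure and columns are assumptions, not the engine's real export format.
import glob
import os
import pandas as pd

# Each student exports one CSV of their own performance from the online module
files = glob.glob("module_exports/student_*.csv")  # e.g. columns: trial, condition, rt, correct

class_data = pd.concat(
    (pd.read_csv(f).assign(student=os.path.basename(f)) for f in files),
    ignore_index=True,
)

# A class-level summary the in-class analysis activities could start from
summary = class_data.groupby("condition")[["rt", "correct"]].mean()
class_data.to_csv("class_data.csv", index=False)
print(summary)
```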
Aspects of the Project
Depending on the size of the class, the interactive component can either be group-based (for small-medium size classes) or iClicker/crowdsourced-based (for large lecture classes).
Depending on the length of the term, one or two projects can be completed.
In addition to the normal curriculum of the course, the project will incorporate required course content. This integration should help students realize the way psychological science is done in the real world, giving some context to the field in general.
While the module(s) will be chosen prior to the start of the course to ensure that all milestones are hit and the curriculum is matched, the students will discuss the design of the experiment, with various in-class activities to facilitate discussion.
Students will then do the experiment online, effectively being their own subjects.
More in-class activities will facilitate the data-analysis phase of the project.
Following part of the procedure from Ciarocco, Lewandowski, & Van Volkom (2013), students will complete a small results section and a truncated discussion section.
Expectations
Due to the introductory nature of the course, simpler designs are required; the ZAPS experience needs to be highly controlled.
The project will hopefully show students the utility of psychological research and dispel misconceptions about psychology.
The writing portion of the project will prepare the students for the laboratory courses where more writing is involved.
Solution #2 (Molly’s Idea)
As identified in an earlier post, the educational challenge Alex and I have decided to approach concerns the teaching of undergraduate psychology research methods and statistics. To review, undergrad psych majors tend to dislike methods and stats classes due largely to three major issues: misconceptions about psychological science; disconnect between methods, stats, and content; and the lack of inherent interest in the material. As a result, students experience high levels of anxiety, low perceived value of the material, and low interest in engaging with it.
One possible solution I would like to propose is the use of the adaptive learning technology being developed by Knewton (http://www.knewton.com/about/). This platform, currently being developed and tested with private investors, aims to personalize the learning experience as much as possible by utilizing user data and adjusting the lesson plan according to user needs and preferences. Described by the website as a “recommendation engine,” Knewton uses both personal history and education research to suggest tailored lessons and activities to fit specific needs. For example, based on the finding that trouble with algebra word problems is more likely due to impaired critical reading skills than any math-related issues, a student struggling with word problems will be directed to reading comprehension exercises. As with other platforms based on user data (such as the tracking of shoppers using discount cards or optimizing Amazon search functions), the platform delivers better content as it collects more data.
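To make the idea concrete, here is a toy, rule-based sketch of the kind of adaptive routing described above (for example, sending a student whose word-problem trouble traces back to reading comprehension toward reading exercises). This is purely illustrative and is not Knewton’s actual engine or API; the scores, thresholds, and activity names are made up:

```python
# Toy illustration of adaptive-learning routing; NOT Knewton's actual engine.
# Scores, thresholds, and activity names are hypothetical.
from dataclasses import dataclass

@dataclass
class StudentState:
    word_problem_score: float  # 0-1 accuracy on algebra word problems
    reading_score: float       # 0-1 accuracy on reading-comprehension checks
    stats_anxiety: float       # self-reported anxiety, 0 (low) to 1 (high)

def recommend_next_activity(s: StudentState) -> str:
    """Pick the next lesson based on where the data suggest the real difficulty lies."""
    if s.word_problem_score < 0.6 and s.reading_score < 0.6:
        # Trouble with word problems traced to reading, not math
        return "reading-comprehension exercises"
    if s.stats_anxiety > 0.7:
        return "low-stakes practice problems at the student's own pace"
    if s.word_problem_score < 0.6:
        return "scaffolded algebra word problems"
    return "new content: next research-design module"

print(recommend_next_activity(StudentState(word_problem_score=0.4,
                                           reading_score=0.5,
                                           stats_anxiety=0.3)))
```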
Technology such as this is remarkably well-suited to address each of the components of our challenge described above. Students can fill out a survey gauging their knowledge of psychological science, and be directed toward resources to correct any misconceptions. The adaptive learning technology is useful for all students, but will be especially helpful to those experiencing high levels of anxiety by tailoring lessons to their knowledge and taking them at their own pace toward the target objectives. Knewton’s “cross-disciplinary knowledge graphs” will go far in integrating the statistical and methodological concepts learned with the rest of psychology, and it could be programmed to direct students to simulations, research articles, or videos about concepts students are either most interested in or need the most help engaging with. By personalizing the learning experience as much as possible to the needs of each learner, Knewton’s adaptive learning technology can greatly enhance the experience and enjoyment of students in stats and methods courses.