psychology

hbr.org
"Our primitive mind knows something bad is happening, but you can’t see it. This breaks our sense of safety. We’re feeling that loss of safety. I don’t think we’ve collectively lost our sense of general safety like this. Individually or as smaller groups, people have felt this. But all together, this is new. We are grieving on a micro and a macro level."
This message about grief was good to hear and something I need to keep in mind.
Google Docs
Google Doc with some helpful suggestions for getting through sheltering in place:
"Our brains have a bias toward negativity in order to scan for danger and keep ourselves safe. If we don’t want to become depressed and anxious, we have to make an effort to move toward the positive. If you think of everything that can possibly go wrong all of the time, you will have given your brain the experience of bad things happening even if none of your fears come true. Use this as an opportunity to catch your negative thoughts and identify them as old mental habits rather than as truths."
I found this helpful.
Ferns
"But if folks make more money off of customers when they reduce latency, there has to be some power in increasing latency."
This is a hack I can get behind. If you can't slow down the velocity of information on social networks, at least you can physically slow down the social networks on the piece of the network you control.
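Just to sketch what that could look like in practice (a toy example of my own, not anything from the Ferns post): here's a rough Python wrapper around Linux tc/netem that adds artificial latency to traffic headed for a couple of hypothetical social-media domains on a router you control. The interface name, delay, and domain list are all placeholders, and it needs root on a Linux box sitting between you and the internet.

```python
import socket
import subprocess

# Assumption-heavy sketch: a Linux router you control, a LAN-facing
# interface named "eth0", and a made-up list of domains to slow down.
# It follows the classic tc/netem "delay one class of traffic" pattern
# and only shapes outbound packets.
INTERFACE = "eth0"
DELAY = "500ms"
SLOW_DOMAINS = ["facebook.com", "twitter.com"]  # hypothetical targets


def run(cmd):
    """Run a tc command, echoing it first so you can see what changed."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def main():
    # Root prio qdisc with three bands; hang a netem delay off band 3.
    run(["tc", "qdisc", "add", "dev", INTERFACE, "root", "handle", "1:", "prio"])
    run(["tc", "qdisc", "add", "dev", INTERFACE, "parent", "1:3",
         "handle", "30:", "netem", "delay", DELAY])

    # Steer packets headed for the slow domains' current IPs into band 3.
    for domain in SLOW_DOMAINS:
        for ip in set(socket.gethostbyname_ex(domain)[2]):
            run(["tc", "filter", "add", "dev", INTERFACE, "parent", "1:0",
                 "protocol", "ip", "prio", "1", "u32",
                 "match", "ip", "dst", f"{ip}/32", "flowid", "1:3"])


if __name__ == "__main__":
    main()
```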
greatergood.berkeley.edu
"Shockingly, they found a positive and statistically significant relationship between the amount of coverage dedicated to mass shootings and the number of shootings that occurred in the following week."
I wish more people knew about the media contagion problem—especially people in the media.
conceptuallabor.com
I really enjoyed this essay about Conceptual Labor. Sometimes the work we need to do is understanding the work we need to do. It reminded me of a favorite saying of mine from Viktor Frankl: if you have a why, you can get through almost any how. (Paraphrased; it's from Man's Search for Meaning, which I should reread.) I think I saw this link on Mastodon, but not finding links again is my theme today.
The Verge
Casey Newton is back with another look at the human cost of social media.
I asked Harrison, a licensed clinical psychologist, whether Facebook would ever seek to place a limit on the amount of disturbing content a moderator is given in a day. How much is safe?

“I think that’s an open question,” he said.
Important reporting here that I hope will help the people these powerful corporations are forgetting.
Wired
“In a connected, searchable world, it’s hard to share information about extremists and their tactics without also sharing their toxic views... Labeling extremist content or disinformation as ‘fake news’ doesn’t neutralize its ability to radicalize.”
This article describes the information contagion problem perfectly. I wonder if all information should come with specific instructions to stay grounded after consuming it.
nytimes.com
I’ve been meaning to post this article ever since it came out, but I’m being compassionate with my past self about not doing it yet. My future self is used to disappointment, so that guy should be pleasantly surprised it’s off his plate. Anyway, lots of good psychology here to help with productivity. Add reading it to your to-do list.
The Verge
I have to link to this excellent reporting by Casey Newton. This is an important article that shows the human cost of maintaining large centralized social networks. I think it also reveals a sick society where people are constantly uploading psychologically scarring material that other people then have to sift through. I felt like Facebook's response was weak—at some point “we're growing too fast to keep up” and “we're so new at this” stop working as excuses. As Bloomberg points out, companies have always said artificial intelligence is just around the corner to save the day. I think that's why companies view human moderators as a failure of technology rather than a key piece of their success. Matt Haughey ran an indie corner of the open internet for years and knows that content moderation has no easy answers. Just because it's hard doesn't mean we shouldn't hold Facebook accountable. They made decisions that created this problem, and it's a shameful aspect of the internet we need to fix.
Tim Harford
"Trying to get some work done with an internet-enabled device is like trying to diet when there’s a mini-fridge full of beer and ice cream sitting on your desk, always within arm’s reach." Here’s yet another fascinating digital-habit-changing story. I like the connection he makes with economic psychology. I do think we're all working with the Endowment effect and Escalation of commitment as we consider the value of digitizing every aspect of our lives.
latimes.com
I agree with this. Repeating a frame or an idea—even to mock it—distributes and strengthens that idea. I love Colbert, but I stopped watching a long time ago. Laughing wasn't enough to make up for the disturbing source material. It reminds me again that the old Internet cliché “don't feed the trolls” is something the media hasn't adopted yet. See also the great way Jay Smooth put it: “Don't Link to the Line Steppers.”
Towards Data Science
“The data we are shown is not the only data there is.” A good description of a statistical analysis problem and a reminder to think about the causes of the data, not just the data you see in front of you. This reminds me of that old Zen saying: don’t confuse the moon with the finger that points at it.
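To make that concrete for myself (a toy Python example of my own, not from the article): when there's a selection step between everything that happened and the data we're shown, the visible data can show a strong effect that the full data doesn't have.

```python
import random

# The full population of outcomes averages out near zero, but if we only
# ever see the results that cleared some visibility threshold, the visible
# data tells a very different story.
random.seed(42)

population = [random.gauss(0, 1) for _ in range(100_000)]  # everything that happened
shown = [x for x in population if x > 1.0]                  # only what got surfaced to us

print(f"mean of all data:   {sum(population) / len(population):+.3f}")
print(f"mean of shown data: {sum(shown) / len(shown):+.3f}")
# The second number is large and positive even though nothing changed in
# the underlying process; the selection step alone produced the "effect".
```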