[If you already think that intuitions/emotions/gut sense are important from a self-improvement standpoint, feel free to skip this one.]
Here’s something I forget often enough that it seems worth repeating, at least for me: your feelings matter, and they matter a lot more than I tend to give them credit for.
This is a core CFAR rationality idea—listening to your System 1, reconciling your intellectual and emotional sides, and all that. If you’ve been reading what others in the community have been writing for long enough, this probably seems boring and super obvious at this point. Still, though, I notice that I’ve been slow to fully endorse this angle, and there’s not a lot on MLU about getting in touch with your intuitions.
There are good reasons to be hesitant. There are, of course, also bad ones.
For me, it seems like most of my unwillingness to consider this in more detail takes the form of “this sounds like other stuff that I know is bullshit”. And that actually does seem pretty reasonable. Guilt by association is a fast heuristic that often does cash out in the real world.
First, a quick overview of what I mean by caring about your feelings / System 1 / unconscious / emotions:
For one broad example, there are things that we “have” to do, like filing taxes, responding to jury duty, finishing homework, etc., which aren’t very appealing to do. As a result, we have a tendency to put off these tasks; such procrastination is a vicious cycle that screws us over.
Here, it’s not that we don’t know that it’s beneficial to get said task done (indeed, we’d prefer a world where the task was magically done for us already), but there’s something else that holds us back from finishing it on time. Surprising perhaps no one, it turns out that the way we feel about a task influences our ability to get it done.
Anyway, the reason I think I was hesitant to go out and fully push the “feelings matter” angle seems to be that it sounds just like the sort of bad advice your internal stereotypical TV guru spits out: “Your mind controls your body, and your body the world. Your mind is the key to the world.”
I’m not really a fan of those self-help gurus. And that seems to have made me more recalcitrant when it comes to espousing advice that sounds a lot like what they’d say.
(Was I failing to *actually* consider Obvious Advice? Or just falling prey to status and identity considerations? You decide!)
The other reason is that feelings are a fuzzy thing. With introspection being our main method of access, we can’t always count on accuracy. This is especially relevant when some of the most interesting things in this space come from aliefs, i.e. things that we secretly want / think / feel / believe, which we don’t necessarily explicitly endorse, but implicitly expect.
This of course sounds a lot like the sort of quasi-Freudianism that everyone’s learned to stay wary of—the sort of practice that mainly involves making stuff up. If you can be “truly” feeling X, independent of whether or not you explicitly claim to feel X, this opens a lot of doors to bad reasoning.
And that’s problematic, as I want to endorse feelings (and all the other internal stuff in our heads) as an important component of figuring out self-improvement, while also (on some level) trying to anticipate potential criticism based on comparisons to these largely derided schools of thought.
But I think it’s good to put all that aside for the time being.
I’m happy now to assert that feelings matter, and paying attention to this hard-to-objectively-measure quality in your head is actually incredibly relevant to getting things done.
This brings up a whole bunch of questions like “Well, can you teach these sorts of skills, then?” which groups like CFAR and Leverage seem to be looking into.
They might be poorly defined, and maybe there’s a better abstraction we can use. But there’s little use denying that they point to something very valuable.
Meta: It’s sorta neat to see the convergence in this blog, as I end up at rather similar conclusions to CFAR’s, at least when it comes to rationality as a domain. I’m curious whether, looking back at the progression of these posts, the progression in ideas is easily traceable for anyone new to all of this—i.e., if someone were to read all of MLU, would they be able to say, “Ah! Looks like X, Y, and Z led to Owen changing his thoughts!”