People often spend a lot of time criticizing things they mostly or almost always enjoy. It’s easier to pick apart something close to you than something you don’t know much about: you’ll be more familiar with all the relevant arguments and counterarguments, and so better able to defend whichever side you choose.
Imagine a fan of a sports team. A naive view might be that this fan would criticize other teams more than the one they support, because supporting a team means standing by them. If you’ve ever talked to an intense fan during an off year, you’ll know this is not the case. Because of the proximity of the team to their mind—maybe they know all the players, or they watch the coach’s interactions with the press, or they listen to team-specific commentary throughout the week—they know exactly what’s going wrong (and will often generously supply the ideas that will fix it all).
This also shows up in a diversion of effort. It’s easy to know what’s bad about your team, but with so many other teams out there, knowing them all to the same depth would require either intense dedication to the sport as a whole, or a lessened dedication to one’s own team. Sure, the fan spends more time praising their team than any other, but the absolute amount of time spent criticizing it is higher too.
This writing itself is a victim of this principle. I know almost nothing about the history of France, so I’m not spending my time discussing the differences between seigneurialism and feudalism. Even if these or some other such topics were more important than what I do write about, I don’t know enough to write about them. I can’t prioritize what’s globally important, only what’s important and known to me. (Of course, this isn’t my actual method of picking what to write about either. It’s more a random selection of what could be fun and interesting to write about.)
Because of this effect, I suspect that a lot of us spend our nitpicking on ideas that we believe deserve the least nitpicking!
Signaling might also play a big role. If the way to gain status in some community is to know a lot about X, and the way to seem like you know a lot about X is to be negative and picky about its finer details, then we should expect many of the highest-status people in the community to have expressed the strongest misgivings about what the community is centered around.
Tractability may also be a consideration. If your goal is too large or unwieldy for either you or your audience to understand, you’ll probably have more success tackling a smaller subproblem first. The most tractable problems are low-hanging fruit, but just because a problem is easy to solve doesn’t mean it’s important to solve. If the material you’re critiquing is itself easy to understand, it’s not just easier to produce a criticism, it’s also easier to grow an audience for that criticism. Our most important problems are not necessarily the ones with a wide audience, and they’re very likely not easy to solve.
I don’t really know the extent to which this allocation of criticism is a problem, or how to fix it if it is. Isn’t expertise just the ability to have a well-informed opinion, given loads of time to understand a topic? If the expert’s opinions happen to be negative, so be it. Tunnel vision and trapping yourself in a bubble are probably the most salient aspects of this phenomenon to avoid, but they’re difficult to break out of (especially if you don’t even realize you’re in a tunnel or bubble). This is yet another of those cases where I know my thinking could be better, but I just don’t see any easy fix.