I want to discuss the advantages — and pitfalls — of assuming the reasonable best of people.
Most people are mostly good, most of the time. That’s a thrice-qualified statement. Nobody’s perfect, although, as I like to say, that’s no excuse not to try.
Some people are scum. Some are selfish beyond belief; some are actually sadistic. There’s always hope for redemption, but it rarely happens. These, however, are a tiny minority. Far more harm has been done by ignorance than by genuine evil. Sufficiently advanced stupidity is indistinguishable from malice, and it is usually the simpler explanation. It’s also practically easier to deal with. While the ignorant may be stubborn, this stubbornness is born not of conviction but of simple conservatism. If you assume that someone is ignorant rather than malicious, on evidence which could support either conclusion, you’re simply more likely to be right.
Even the smartest and noblest people make mistakes. Even so, it is seldom harmful or risky to assume their best intentions.
And what’s the cure for ignorance? Information. Only if the person concerned either genuinely doesn’t give a damn, or is too proud to accept information, will this sort of help fail completely; and even then it won’t make things worse, but will rather give you more information yourself and justify you in condemning the behaviour or opinions in question.
If someone’s acting like an ass, they probably don’t know it. If you thought you were acting like an ass, you would want to change. Those who genuinely wouldn’t care are small in number indeed. Even if they do know it, if nobody speaks up, they’ll assume that nobody really has much of a problem with it. This goes out to the people who called me on my overconfidence when it was getting out of hand and making me seem arrogant — I couldn’t see it from this side of my eyes, but I hope that hearing it has helped me turn it down. The reason they called me on it was because they assumed that I wasn’t being a git for its own sake, or because I genuinely felt superior or thought their opinions worthless, but rather that I simply didn’t realise how I came across. If they hadn’t assumed the reasonable best of me, everyone concerned would be worse off. Even if they had been mistaken in that assumption, nobody would have been worse off for their making it. That kind of went off in a more introspective direction than I expected, but the example stands.
Cultists and fundies, for the most part, aren’t evil, although they do vile things; a cultist is more likely to be brainwashed than truly malevolent. Exceptions can be made for those who stand to make money off said brainwashing — Scientology, of course, being the most notorious offender here — and those who recognise the damage done and do nothing to avert it. And of course, just because they’re brainwashed doesn’t mean they shouldn’t be stopped from abusing children with threats of hellfire and ostracism, but it does mean they should be treated with sympathy. This is why we now look at criminal sentencing with an eye to rehabilitation, as well as the old (and still justified) motives of vengeance and security.
An additional point on the ignorance and information aspect — there may well be situations where reasonable people can come to different conclusions depending on which part of the information they met first.
(Thanks to pfh for pointing this out on his blog some five years ago.)
In these cases, information will not necessarily cure disagreement. There are two (or more) possible conclusions, and each party remains convinced of a different one even after they've exchanged all their information.
But you can't have something be both true and untrue. If a person is only basing their understanding on part of the information, then their conclusions are less likely to be the true ones. Rarely is it possible for two people to disagree on a point and for both of them to be correct (opinions on the best flavour of ice cream notwithstanding).
If someone has reached an untrue conclusion because of faulty or missing information and then does not change their view when complete or accurate information is presented, then they are a stubborn git and an idiot. Plain and simple.
Indeed. My main point, though, is that we should assume they aren't deliberately being a stubborn git, and give them the information, because this makes them less likely to be a stubborn git and it makes us more justified in rejecting them if they are a stubborn git.
@Quincy, I think you need at least a continuous variable for the effect.
ReplyDeleteFor an example, suppose that some number is typically either around 7 or around 160 (where "around" means, say, ±3). Further, let's suppose that the choice between the two is not random, but correlated with something else, so that it's plausible for someone to have seen a run of the 160±3 kind first. At that point, they're going to rightly reject any single 7±3 example — it's obviously anomalous, erroneous, a misunderstanding or other mistake. That's just common sense.
You can't even blame them rationally; anything 50σ out really is likely to be simply wrong.
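To make that rejection concrete, here's a minimal sketch in Python, using the numbers from the example above; the helper name and the 5σ cut-off are invented for illustration:

```python
# Sketch of the scenario above: an observer whose entire history is
# values of the 160±3 kind meets a single example of the 7±3 kind.
TYPICAL = 160.0  # the only mode this observer has ever seen
SPREAD = 3.0     # the "±3" from the example

def looks_anomalous(value, cutoff_sigma=5.0):
    """Flag a value implausibly far from the only known mode."""
    z = abs(value - TYPICAL) / SPREAD
    return z > cutoff_sigma, z

anomalous, z = looks_anomalous(7.0)
print(f"z = {z:.0f} sigma -> {'reject as erroneous' if anomalous else 'accept'}")
# prints: z = 51 sigma -> reject as erroneous
```

Given only that history, rejection really is the sensible call: a lone observation dozens of sigma out is far more likely to be a mistake than a genuine second mode.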
How could a run of the 160±3 kind happen first? Well, obviously, it could be cultural — their community exposes them preferentially to that one. A nastier situation is when it's time-related, so that 160±3 used to be right, but suddenly 7±3 is more appropriate — and examples crop up sufficiently rarely that each of them is treated individually. As pfh put it, "they'll just keep on as they always have while things go straight to hell".
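The time-related trap can be sketched the same way (again in Python, with invented numbers): the world has shifted to 7±3, but each new example is judged one at a time against the old model, so every single one is discarded and the model never updates:

```python
import random

random.seed(0)
stream = [random.gauss(7.0, 3.0) for _ in range(20)]  # the new 7±3 reality

accepted, rejected = [], []
for x in stream:
    # Each example is judged alone against the old 160±3 model,
    # using the same 5-sigma cut as before.
    if abs(x - 160.0) / 3.0 > 5.0:
        rejected.append(x)  # individually, each one looks like noise
    else:
        accepted.append(x)

print(len(accepted), "accepted,", len(rejected), "rejected")
# prints: 0 accepted, 20 rejected
```

Only by noticing that all the recent examples are being rejected, rather than weighing each in isolation, would the observer see that 160 is no longer right.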
Indeed, but such thinking is not strictly rational, which I think is Quincy's point; if the situation of 160-or-7 rationally follows from the information available to our hypothetical person, and after this is pointed out he clings to his previous belief that it is only 160, then this may be understandable, but it is certainly also stubborn and irrational. My contention is that our course in this situation is to ensure that he is informed, rather than assuming that he is being stupid, before we judge him.
Oh, it's rational enough under the circumstances...
Remember, too, that we ourselves might not believe 160-or-7; we might believe 7, because our order of encountering the numbers is opposite. This can obviously happen in the cultural-filtering case, but also in the time-related one (for younger people, it has always been 7). That would make communication even more difficult.
Note also that there might not be any additional information to give; both parties might have already seen it all, just in a different order.
If we have all the information, and all the information points to 160-or-7, then it is not rational to believe anything other than 160-or-7, no matter what order we encountered the information in. It may be understandable, which was my initial point, but it is most certainly irrational.
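One way to make that precise is Bayesian: the posterior over hypotheses is proportional to a product of likelihoods, and products don't care about order, so two observers who have seen the same evidence must end with the same rational belief however they encountered it. A minimal sketch (Python; the mixture model and the data are invented for illustration):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma=3.0):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def posterior_160_or_7(observations, prior=0.5):
    """P("values are a mix of 160±3 and 7±3" | data), versus the
    rival hypothesis "values are only ever 160±3"."""
    p_mix, p_only = prior, 1.0 - prior
    for x in observations:
        p_mix *= 0.5 * normal_pdf(x, 160.0) + 0.5 * normal_pdf(x, 7.0)
        p_only *= normal_pdf(x, 160.0)
    return p_mix / (p_mix + p_only)

data = [159.0, 162.5, 158.3, 6.8]  # mostly 160-ish, one clear 7-ish value
print(posterior_160_or_7(data))                  # ~1.0: "160-or-7" wins...
print(posterior_160_or_7(list(reversed(data))))  # ...and order doesn't matter
```

Clinging to "only 160" after seeing all of that data is, as above, understandable but irrational.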