|
|
- IdeaLogging, HumanNature.
-
- <[Brent]> Great thoughts! I may be completely wrong, but it strikes me that this is a necessary part of keeping a grip on a massively complex system. If we did the opposite--discarding models every time there's a hiccup in the data--wouldn't it be much more difficult to make sense of the world?
-
- I don't see this as a tragedy; I see it as a necessary part of being finite in an infinite universe. But, again, I could be wrong.
-
-
- <[Brennen]> I started that sentence about tragedy differently at first - something along the lines of "I can't guess whether this tendency is, by itself, adaptive - or simply falls out of more basic human traits with immediate consequences for survival - but I think that it may be, after sex and death, the third essential component of the human tragedy".
-
- Anyway, I recognize the point you're making. There's a reason to retain a model that has basically worked to date, in the absence of a replacement or even a coherent refinement. And of course if we tended to abandon and replace our models wholesale on encountering a single exception, we'd be paralyzed most of the time...
-
- Actually, I think this is key: I feel like, in an evolutionary environment (be that the one in which humans evolved over the long term, or just the social scenes we have to cope with), the pressures you're talking about have a more proximate impact than the effects of retaining bad schemas. To put it another way, fitting the data to the model is probably rewarded (or at least not punished) in the immediate term and on the micro scale, and punished most severely in the long term, on the macro scale - often at the level of the group, organization, or society.
-
- Which brings up another really key thing: models are often held by *groups*, while individual humans almost always form their own semi-private models for dealing with things. The group's model can keep steering collective behavior even when the individuals have quietly worked around it - which explains how an organization full of people who know full well that they're dealing with the world on broken terms can go right ahead doing so...
|