by Steven J. Owens (unless otherwise attributed)
I'm still looking for good phrases to encapsulate the following:
I find that evocative, sometimes humorous, phrases can be effective communication devices. They slip past the listener's defensive reflexes and get them to actually evaluate what's being said.
A few examples, mostly from the world of carpentry and power tools and the like, for some reason...
"Always wear your safety squint."
"Always cut towards a major artery -- that way you'll be sure to be careful!"
"I dunno boss, I cut it three times and it's still too short."
When somebody cuts themselves with a dull blade: "One of them is dull."
I think a lot of the articles/tutorials/books in the software world get hung up on the conceptual vs. the concrete, and thus confuse the reader. "Hung up on the concept" both in a general and in a specific way.
In the general, by extolling the virtues of the concept like it's the best thing since sliced bread, they mislead the reader into expecting more substantive stuff.
In the specific, by getting too focused on the abstract concept as opposed to a concrete implementation, paradoxically, they make it harder for people to grasp the abstract concept.
The worst example of this syndrome I ever read, by the way, was the Vignette StoryServer docs. Half the words in the docs were Capitalized Phrases(tm). Most of them didn't represent anything other than a general concept - like Vignette StoryServer Template(tm), aka a template.
There's a particular kind of handwavy approach to solving complexity that I've seen come up at least three or four times over the years (though I can only think of two examples right now).
A valid approach to complexity (and to many problems) is to move the problem around, generally in order to move it to an area that's either easier for you to solve (because of your particular resource constraints) or easier in general to solve.
However, there is this tendency, most recently highly visible in the Java enterprise application world with XML configuration files, to do this and then proceed to wave your hands, declare the problem solved, and move on. Or, in other examples, to say, "Now I can afford all the complexity I want!" and go hog wild.
Neither way really solves the problem. XML is better for expressing some things, but that doesn't make it better for everything - for example, for building a programming language. This is not a fault of XML - I'm convinced there's a more fundamental, underlying human tendency at work here.
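To make this concrete, here's a minimal sketch. The XML dialect below is invented for illustration - it resembles the little rule-engine languages that tend to grow inside enterprise config files - and the Java that follows expresses the same logic directly. The point is that moving the conditional into XML didn't remove it; it just relocated it somewhere with no compiler, no debugger, and no type checking.

    // Hypothetical XML "configuration" that has quietly become a
    // programming language (the dialect is invented for illustration):
    //
    //   <rule name="bulkDiscount">
    //     <if test="order.quantity greaterThan 100">
    //       <then action="multiply" field="order.price" factor="0.9"/>
    //     </if>
    //   </rule>
    //
    // The same logic as plain Java - shorter, type-checked, debuggable:
    public class BulkDiscount {
        public static double apply(int quantity, double price) {
            // The conditional didn't go away when it moved into XML;
            // the complexity was moved around, not solved.
            if (quantity > 100) {
                return price * 0.9;
            }
            return price;
        }

        public static void main(String[] args) {
            System.out.println(apply(150, 200.0)); // prints 180.0
        }
    }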
Another example, perhaps less obvious, is in "rich UI" design, where you have a zillion little gewgaws and UI tricks that you can, in theory, apply to make the UI "easier" to use. But the underlying fact is that if you don't make sure you're answering the right question to begin with, it doesn't matter how prettily you phrase the answer.
This isn't simply a bandwagon effect, it's more akin to an "out of sight, out of mind" effect. It may be a special case of avoidance behavior, in common with behaviors like pushing problems higher and higher (or deeper and deeper) into layers of abstraction. Maybe something like Passing the Buck, but in a software design sense.
Also see (in Lexicon) "complexity".
Habitual Vision or Code Words or Religious Icons or Hot Button Words - Words take on an identity of their own. Signal vs. meaning. The words are no longer good or bad because of what they mean, but because of the identity they've taken on. "Discrimination" is a good example. To discriminate is a verb, while discrimination is used as a noun; I wonder if there's a conclusion to draw from that. I think this is an underlying aspect of human nature: the tendency to convert oft-repeated meaning into signal. I know there's a similar tendency at the biological/neurological level of perception; I wonder if the two are related.
One person suggested shibboleth, which is an interesting word, but not quite the sense I'm looking for.
The phrase hot button word may be the closest I can come: a word that sets people off, that causes an emotional reaction without conscious thought behind it. But I kind of feel that hot button word focuses attention on the behavior of the listener, rather than on the underlying concept - the human tendency to turn meaning into signal.
I recently added Habitual Vision to this entry, because it's perhaps a good example of the same syndrome in another context: sight. Sight is more of a trained physical skill than most people realize. We tend to think of sight as akin to knowledge - we take in information and we process it. But the way our vision works depends a lot on how many times we've seen something. To use a common example, once you buy a particular (distinctive) model of car, you suddenly see it everywhere. That's because you've memorized the shape - your vision is now trained to see it everywhere.
The same thing happens with words. The downside to this phenomenon is that whereas we map the sight of a car to a thing, we map the sight of a word to a meaning - and meaning is a heck of a lot slipperier. Also, the habitual meaning can overwhelm the reality in not-useful ways.
A vicious corollary of this is that you can use the "habitual hearing" to deliberately deceive people, by carefully saying something using terms you know the hearer will habitually interpret otherwise. For some reason, when I try to explain this facet to people (well, geeks mostly - I don't find that non-geeks hold still long enough :-) - they get this right off the bat, but they have a really hard time grasping the more ethically useful corollary that you have to pay attention to your terms to avoid doing this accidentally.
I've even had the occasional experience where somebody disclaims all responsibility for the hearer misinterpreting them, even when they know that the hearer will interpret certain terms in certain ways (there's a limit to how far you can go to get somebody to hear what you're honestly saying, but still...).
People often seem to reply to a question by providing an ambiguous answer to a different question. Sometimes this is a trivial matter of using different terms (most recent example: "Q: Is xyzzy synchronous or not? A: Xyzzy is non-blocking."), sometimes it's more profound. Sometimes (when done by salesmen, which seems to be often) it's misleading - either deliberately (though the salesman himself may not be aware of it) or unconsciously (because the salesman doesn't really understand the technology and instead has been trained with a bunch of mynah-bird responses keyed to phrases in the questions). In general, it's a pain in the ass.
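To illustrate that particular mismatch, here's a small Java sketch - fetchAsync() is a hypothetical stand-in for whatever "xyzzy" is. A non-blocking API can still be driven synchronously by the caller, so "non-blocking" describes the API's style without actually answering "synchronous or not":

    import java.util.concurrent.CompletableFuture;

    public class XyzzyExample {
        // Hypothetical stand-in for "xyzzy": returns immediately,
        // the work happens on another thread.
        static CompletableFuture<String> fetchAsync() {
            return CompletableFuture.supplyAsync(() -> "result");
        }

        public static void main(String[] args) {
            // Asynchronous use: register a callback and move on.
            CompletableFuture<Void> async =
                fetchAsync().thenAccept(r -> System.out.println("async: " + r));

            // The same non-blocking API used synchronously: join()
            // blocks the caller until the result arrives. So "xyzzy
            // is non-blocking" answers a question about the API's
            // style, not about whether your code waits for it.
            String r = fetchAsync().join();
            System.out.println("sync: " + r);

            async.join(); // make sure the callback ran before exiting
        }
    }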
People often seem to read something (an argument, a document) with blinders on, not trying to understand it but merely scanning shallowly for things they can object to - they go on autopilot, hunting for targets instead of giving it an honest read (skimming isn't quite the word for it, though). They're not actually listening to the words; they're ignoring them and listening for the signal-words. Until you can successfully make the argument for something yourself, until you can get into the skull of the proponent and understand their reasoning, you haven't really understood it. This is not always easy, but I think a critic often betrays their lack of doing so, or even of trying, in the shallowness of their criticism. This does not necessarily mean the failure is premeditated slander; see the next entry.
'Psychologist George Miller long ago said something so important that I call it Miller's Law; he said, "In order to understand what another person is saying, you must assume that it is true and try to find out what it could be true of."' -- http://people.howstuffworks.com/vsd1.htm

There's a tendency, and maybe it's only natural and human, to try to decide whether something is really worth investing the effort to understand it. In intelligent, reasoning people this often results in the shallow "autopilot" reading I describe above, because they invest some effort, but not enough. This is particularly true, I've noticed, in discussions of topics (like software architecture and software project methodologies) where there's a lot of thrashing as people struggle to articulate the topic, let alone the problem and the solution. It takes a lot of effort just to figure out what people are talking about, let alone critique it.
The converse of reading on autopilot is the struggle for proper semantics. Often it's hard to understand somebody's explanation of, for example, a software framework, because they attempt to define a distinct and new set of semantics for it - often the new set of semantics is more than half the value.
The flip side is that humans understand new things by analogy with known things. Too many times, people seem unwilling to make that connection in explaining their thing. Sometimes it's because they're struggling to wrap their own head around the topic, to articulate it and fully explore it. Sometimes it's out of fear - fear of a very real risk that the reader will trivialize the new thing, or fail to make the semantic shift.
Personally, I recognize both sides of the challenge, but I'd rather have the analogy so I can comprehend most of the new thing quickly, and focus my efforts on understanding the tricky bits, and making the semantic shift.
This often goes hand-in-hand with a tendency to describe things too abstractly, out of fear that a concrete example will lead to too much fixation on the particulars of the example. I've had my own challenges in getting people to let go of such fixations, but I still think avoiding concrete examples is a cop-out. Concrete examples are the better approach.