Lexicon
by Steven J. Owens (unless otherwise attributed)
Intentionally Hacking the English Language
I like evocative use of language. It helps make ideas stick,
it helps you remember them when you need them, and it helps you
get the point across.
This is a bunch of terms and phrases, and even some sayings.
Most of them are things I found, heard or read elsewhere. A
couple are my own inventions that seemed to get a reaction out
of people.
In no particular order:
- Scroll Blindness
- The effect of being unable to find information
you need because it is buried in too much other, less interesting
information. From Robert L. Read's "How To Be A Programmer". I like
this phrase not just for what it's about, but for how I can use it as
a metaphor for other types of human behavior.
http://samizdat.mines.edu/howto/HowToBeAProgrammer.html
- Window Blindness
- I coined this phrase the other day, and I like it
so much I'm adding it here. Window Blindness is generally any
example of behavior similar to the sort of bipolar selective memory
that Windows users have about Windows reliability and ease of use.
One day they're ranting about some bug or crash or usability issue,
the next day they're defending Windows reliability and ease of use.
- "Always cut towards a major artery ... that way you'll be careful.
- This is something a friend - one
of the better programmers I know, and also a pretty good carpenter -
told me, and it's a really nifty example of a technique I love, using
humor to make something stick in your head. After hearing this one, I
can't cut anything without remembering and chuckling at this -- which
means I can't help but remember to make sure I'm cutting away from myself.
- "Heat until dangerous"
- Another saying, this time from the cooking
world. When you heat up oil for frying something, you want it to get
hot enough that a drop of water sprinkled on the oil will pop and
skitter around, and you risk an oil burn if you're not careful. "Heat
until dangerous" describes the heat level succinctly, and also reminds
you to be careful.
- Put on your own oxygen mask first
- From standard airline
emergency orientations, used at headrush.typepad.com to make the point
that your responsibility to all of your customers requires you to
avoid overcommitting and underbidding to please just one of your
customers.
http://headrush.typepad.com/creating_passionate_users/2005/11/whos_in_control.html
- Organizational Accent
- From Rands In Repose.
"Getting folks in
the same group, with the same organizational accent to talk coherently
to each other is hard enough. Meetings give us the opportunity to
include other organizations with other accents."
http://randsinrepose.com/archives/agenda-detection/
- Fibber McGee's Closet
- When you "clean" by cramming everything into
a closet. From the old radio show "Fibber McGee and Molly", which had
a closet crammed full of junk that would tidal-wave onto the next
person who opened it. Most programmers can think of good examples of
Fibber McGee's closet in their software projects...
- Abstraction ad Infinitum
- Sometimes layers upon layers upon layers
of abstraction aren't about hiding ugliness, but about avoiding
reality. Or just indecision; it's hard to get the balance right.
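To make it concrete, here's a minimal Java sketch (the names are invented for illustration) of layers that abstract nothing; each one just passes the call along, postponing the actual decision:

    // Three "layers of abstraction" that each just delegate downward.
    public class UserLookup {
        public String findUser(String id) {
            return new UserLookupStrategy().findUser(id);
        }
    }

    class UserLookupStrategy {
        public String findUser(String id) {
            return new UserLookupStrategyImpl().findUser(id);
        }
    }

    class UserLookupStrategyImpl {
        public String findUser(String id) {
            // The "real" logic, buried three layers down, is one line.
            return "user-" + id;
        }
    }

Each layer looks like architecture, but none of them hides any ugliness or makes a decision; a reader still has to traverse all three to find out what actually happens.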
- "You got your dirt in boss's hole..."
- A paraphrase from Cool Hand Luke;
basically it's about insanely unreasonable management types. In the movie,
a prison guard points at a spot of ground (i.e. there's no hole or ditch
there... yet) and says:
Boss Paul: That ditch is Boss Kean's ditch. And I told him that dirt in it's your dirt. What's your dirt doin' in his ditch?
Luke: I don't know, Boss.
Boss Paul: You better get in there and get it out, boy.
- Hit and Run Management
- Joel Spolsky (joelonsoftware.com) came up
with this phrase to describe a style of poor management inherited from
1950s-era Command & Control management:
"Nobody at Juno owned anything, they just worked on it, and different
layers of management happily stuck their finger into every pie, giving
orders left and right in a style which I started calling hit and run
management because managers tended to pop up unannounced, give some
silly order for exactly how they wanted something done, dammit,
without giving any thought to the matter, and leave the room for
everyone else to pick up the pieces."
http://www.joelonsoftware.com/articles/fog0000000072.html
- Financial Hangover
- Like a hangover, only resulting from the
financial impact of a night out, party, vacation, etc. I'm not sure
if this is really good enough to deserve a listing here, but let's run
it up the flagpole and see if people salute.
- Teddy Bear Debugging
- Often the process of explaining a problem to
somebody else is enough to make you see the answer to the problem. Also
sometimes called "rubber ducky debugging."
From Brian Kernighan and Rob Pike's The Practice of Programming:
"One university
computer center kept a teddy bear near the help desk. Students with
mysterious bugs were required to explain them to the bear before they
could speak to a human counselor."
- Push The Shiniest Button
- A good phrase that a friend dropped into
conversation one day, vividly depicting somebody interacting with
things they don't have the faintest understanding of. Not sure
whether he came across the phrase somewhere, or invented it. In
reference to the writings of Annalee Newitz (techsploitation.com) he
said, "she has a lot in common with Jon Katz, i.e. not a serious
investigator so much as someone who pushes whatever geek buttons are
shiniest."
- Boil the Ocean
- An industry phrase meaning to attempt something
futilely ambitious. I first came across the term in one of Joel
Spolsky's columns, in the context of getting everybody on the internet
to install some browser add-on. For some reason, "get the
insanely massive installed base to upgrade all at once" really seems
to resonate with "boil the ocean."
- FLAQ
- A FAQ written by a marketing flack. The term could be
stretched to mean any supposedly-informational content that is in fact
just advertising copy.
FAQs originated on the Internet and tended to be marvels of clearly
written information, mainly because they were written by intelligent
people who wanted to save time and noticed that certain questions came
up again and again (i.e. frequently).
Until, that is, the internet boom. Marketing flacks noticed that a
lot of websites had FAQs, and started putting up FAQ sections in
commercial websites, filled with useless advertising copy.
The foreword to the dead-trees edition of the Java FAQ put forward the
idea that FAQs are so useful precisely because the repeated
discussions end up identifying and explaining the underlying confusion -
the question behind the question - that usually leads to somebody
asking a FAQ.
Another term for this is "X for Y" questions. This is where somebody
is asking the wrong question, because of a misunderstanding of some
prior factor or underlying aspect. "I want to use X to solve problem
Y", when X is the wrong solution. Often "to solve problem Y" is left
unstated.
The original FAQs were literally "frequent", both in being
asked and in being answered; the answer was usually a discussion
which, after going through the wringer a few times, led to recognizing
the "X for Y" pattern.
- Pasta Proposal or Spaghetti Estimation
- A warped misuse of a legitimate technique, the strawman document (see below).
Programmers know the term "spaghetti code" to refer to
messy, tangled code.
"Throw it at the wall and see if it sticks" is an old saying for
how to tell if spaghetti is done cooking.
We came up with "spaghetti estimation" (and the later, more formal
"Pasta Proposal") to describe an odd, annoying, and all too common
situation: a proposal is thrown together by salesmen and thrown at a
potential customer to see what sticks. This often results in
infeasible projects.
Frequently what happens
is that a client prospect thinks that they should buy something, but
doesn't know what. In a sane and rational world, you'd have somebody
good at analysis (i.e. not a salesman) sit down with them and get a
decent idea of what's going on in order to develop a proposal. In the
real world, often this is politically impossible on both sides, for a
variety of reasons. Instead, the salesman makes guesses, pressures
the back office folks to throw together a proposal, and throws that at
the client to see if it sticks. Sometimes this is about features,
other times it's about cost and time estimates (e.g. the client
doesn't know what ballpark they want to be in).
Note: A strawman is a logical fallacy; inventing a weak
argument (the strawman), attributing it to your opponent, beating the
stuffing out of it and then declaring victory. A strawman document is
a useful technique for getting the ball rolling in an ambiguous
situation. People are generally not good at abstracting, but are much
better at reacting to something concrete. Everybody's unsure what's
needed, nobody feels certain enough to put forward a proposal. So you
make something up and explicitly call it a strawman proposal to avoid
panicking people (or being blamed for it being wrong). Now everybody has
something concrete to react to, which helps you figure out what the
real proposal should contain.
- Road-to-Hell Technology
- Paved with good intentions. Client-side
javascript (in a pre-AJAX context) is one such technology. I don't
mind javascript as long as a) we aren't functionally impaired when it
breaks and b) we don't allow it to sidetrack any significant degree of
resources. However, the nature of javascript is such that even with
the best of intentions, both of these will inevitably happen.
- Shiny Technology
- I was just editing this and read "push the
shiniest button" followed by "road-to-hell technology" and had an
epiphany. Certain kinds of technologies are a recipe for trouble, not
because of anything inherently wrong with them, but because of their
allure. Mostly I'm thinking about technologies that non-techs somehow
feel more sure of themselves with; not that they understand the
technology, but that it is for some reason more viscerally accessible
to them (see javascript). This leads them to assume that their
intuition is valid, when it usually isn't. However, most
technologists have had an encounter or two with shiny technology
themselves.
- Slogware, Slogtech
- Product or technology that by its nature you
simply CANNOT "work smarter" at, you have to just slog through it.
Oracle is a good example of this, particularly getting an Oracle
server to exactly, precisely the same patch level as a working
installation so you can move an application over to it.
- Fritterware
- An ex-colleague, Jim Venis (the "ven" part is pronounced
as in "venn diagram") coined this, for technology that tends to suck
you into frittering away countless hours of time fiddling with it and
fine-tuning the settings, without any real concrete payoff.
- Enabling
- A term from Alcoholics Anonymous, meaning to help
somebody avoid confronting their addiction, by helping them avoid
having to deal with the consequences of their addiction. Sometimes
technologists do this, by leaping in
to fix the consequences of bad technology decisions. I'm not
suggesting that you should play games with your users or colleagues,
but it's important to recognize what's going on.
- Irrational Technology
- Relevant to Enabling. Enabling is
particularly tricky when dealing with other technologists, because
technologists (correctly, most of the time) tend to assume that
problems are the result of incomplete knowledge, and that resolving a
specific problem and filling in the knowledge gap that caused it will
prevent a recurrence of similar problems. For rational technologies,
this tends to be true. For irrational technologies, this is not.
When dealing with a fellow technologist who does not want to accept
your advice that a particular technology is irrational, it's important
to be careful not to become an enabler.
- Systematic vs. Idiosyncratic
- I use this dichotomy to illustrate two opposing tendencies in
knowledge. This is closely related to rational/irrational. With
rational technologies, you can assume that there is a pattern (model,
paradigm) to the technology. As you come to understand the pattern,
you start to be able to anticipate it, and hence get better at dealing
with the technology. Some topics defy this, for example automobile
repair and Microsoft Windows products. With these idiosyncratic
technologies, you have to rely on brute-force memorization of tricky
little details and exceptions.
Most idiosyncratic technologies also qualify as slogware or slogtech.
- Transparent Complexity vs Obfuscated Complexity
- Some things just
are complex, no matter how you try to hide the complexity. I am
convinced that the worst excesses of bad UI design are attempts to
hide such inevitable complexity. The result is layers of obfuscation
added onto an already-complex topic. I'd rather just have transparent
complexity, complexity that I can more easily grapple with directly,
than complexity buried under layers of gratuitous obfuscation.
Or to put it another way, "Great, now instead of debugging this
with chopsticks, I have to debug it with chopsticks... while wearing
oven mitts."
- Obvious Complexity vs Subtle Complexity
- Clutter is obvious complexity; subtle complexity may
not seem complex at all - and then it comes back and bites you.
I'm not entirely satisfied with this naming. Other candidates:
Shallow complexity vs. deep complexity.
Obvious complexity vs obfuscated complexity.
There's also "accidental complexity vs essential complexity", via the great Fred Brooks' book "The Mythical Man-Month": Accidental complexity relates to problems which engineers create and can fix; for example, the details of writing and optimizing assembly code or the delays caused by batch processing. Essential complexity is caused by the problem to be solved, and nothing can remove it; if users want a program to do 30 different things, then those 30 things are essential and the program must do those 30 different things.
Note: sadly, "irreducible complexity", which would be quite apt, has
been co-opted by creationists for their nonsense explanations.
Note: see also "high surface area to volume", below.
- Monkey Code
- Coding that a
trained monkey could do. Classic cases in point, all from Java:
getters/setters; delegation for unmodified methods; all the
scaffolding required for JDBC. Any repetitive coding you have
to do should be done by a program, or better yet, should be built into
the technology.
(I intellectually prefer "Monkey Code", but I always
end up saying "Donkey Code" for some reason - maybe "Donkey Kong"
imprinted itself too strongly on my young mind.)
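For the getter/setter case, here's a minimal sketch of the kind of code I mean; it's purely mechanical, which is exactly why tools (IDE generators, and these days Lombok or Java records) exist to produce it for you:

    // Classic Java monkey code: nothing here requires thought,
    // but written by hand, every field costs you six more lines.
    public class Point {
        private int x;
        private int y;

        public int getX() { return x; }
        public void setX(int x) { this.x = x; }

        public int getY() { return y; }
        public void setY(int y) { this.y = y; }
    }

In modern Java, record Point(int x, int y) {} builds the immutable version of this into the technology itself, which is exactly the point: repetitive code should be generated, not typed.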
- Catholic Tech, or Rosary Code
- Technology that has a lot of
inherent monkey code (e.g. "First you say 500 Hail
Marys"...). (coined by James Deikun)
- Semantics
- Often used as a derogatory comment in dismissing a
question of terms, "you're quibbling over semantics". In some
cases it's correct to avoid quibbling over semantics. In other cases,
semantics are half the job. Besides the importance of choosing the
right metaphor and the right terms, working with the wrong set of
terms and labels (often due to a legacy codebase) adds considerable
cognitive overhead.
- Cognitive Overhead
- Additional load imposed on your thinking process, often by extraneous elements like poorly-chosen taxonomy.
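A hypothetical Java example of what bad taxonomy costs; imagine a legacy codebase where "Account" actually means a physical store location, for historical reasons:

    import java.util.List;

    public class ShippingExample {
        // Hypothetical legacy naming: an "Account" is really a store location.
        record Account(String address) {}
        record Order(String id) {}

        static void ship(Order order, String address) {
            System.out.println("Shipping " + order.id() + " to " + address);
        }

        public static void main(String[] args) {
            // Every line that touches Account makes the reader stop and
            // re-translate: "wait, which kind of account?"
            List<Account> stores = List.of(new Account("12 Main St"));
            for (Account a : stores) {
                ship(new Order("o-1"), a.address());
            }
        }
    }

The code is trivial, but every reader pays a small translation tax on every line that touches the misnamed type, and that tax is the cognitive overhead.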
- Spiritual Code
- As in "frustration is good for the soul." (based on
a comment by John Roth on the extreme programming mailing list).
- Koolaid Technology, Way of Life, One True Way
- From the
phrase "drink the koolaid", generally meaning to completely buy into a
mass delusion (for example a cult; put "koolaid" and "cult" together
and you can guess the origin of the phrase, which seems strangely
popular in certain technology circles :-).
There's a frustrating
tendency for technologies to have a web of interdependencies.
Microsoft and Oracle are two good examples. A friend suggests Zope
is an example, but I haven't worked with it.
To get the advantage of the technology, you have to sign up for the
entire way of life implicit in the technology. These are Koolaid
technologies. This is true in both the larger sense, where you might
say there's an entire ecology of interdependent technologies
(Microsoft software development is a good example of this) and in the
smaller sense, IDEs or particular components or component suites.
A technology's Koolaid-ness isn't *always* evil, but I tend to be wary
of it; with Koolaid you're all in or all out, like being "a little bit
pregnant." Koolaid is not a sign that a technology is inherently bad;
gratuitous koolaid is bad. Most koolaid is gratuitous.
A friend uses this less-pejorative phrase, "heavy lifestyle
assumptions", as in "almost all frameworks come with heavy lifestyle
assumptions."
- Clever Stupidity
- Sometimes I use this phrase in a complimentary
sense; sometimes doing things the stupid way can be very clever.
Most of the time, I'm talking about doing things that seem clever but
are ultimately stupid (like most uses of javascript, which don't add
significant value and degrade the user experience in other ways).
- High surface area to volume
- From a post by Tikhon Jelvis at Quora:
"...quite a lot of projects have what I think of as a "high surface area but low volume". That is, they have a lot of code that does small, mostly unrelated things without deep structure or conceptual complexity. Building up thousands of lines of code like this is often both easy and inevitable.
Consider, for example, many GUIs: they have whole layers of menus, dialogs and wizards. All that UI code requires many lines of code (especially in older versions of Java with its anonymous inner classes!). But those lines of code don't do that much. For example, each menu item needs a line declaring its name, specifying which menu it goes into and binding a function to it -- you're looking at 2 or 3 lines per item, even for trivial ones like "help" or "quit"."
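To make Jelvis' point concrete, here's a minimal Swing sketch in the old (pre-lambda) Java style he's describing; note how each menu item costs several lines of shallow, structureless code:

    import javax.swing.*;
    import java.awt.event.*;

    public class MenuDemo {
        public static void main(String[] args) {
            JFrame frame = new JFrame("High surface area, low volume");
            JMenuBar bar = new JMenuBar();
            JMenu file = new JMenu("File");

            // Each item: declare a name, bind a handler, add it to a menu.
            JMenuItem help = new JMenuItem("Help");
            help.addActionListener(new ActionListener() {   // anonymous inner class
                public void actionPerformed(ActionEvent e) {
                    JOptionPane.showMessageDialog(frame, "No help yet.");
                }
            });
            file.add(help);

            JMenuItem quit = new JMenuItem("Quit");
            quit.addActionListener(new ActionListener() {
                public void actionPerformed(ActionEvent e) {
                    System.exit(0);
                }
            });
            file.add(quit);

            bar.add(file);
            frame.setJMenuBar(bar);
            frame.setSize(300, 200);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }

Two trivial items already cost roughly fifteen lines; multiply by every menu, dialog, and wizard in a real GUI and you get thousands of lines with no deep structure at all.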