by Steven J. Owens (unless otherwise attributed)
My general career advice is to work towards a long-term goal; you build a career the way you build a chess strategy, one move at a time, each move supported by previous moves and laying a foundation for future moves. Look at your game globally, act locally.
Concomitant with that, you have to figure out where your passion lies, so you know what you want to build towards as your long-term goal.
Each job you take, look at the implications in five different dimensions, each of which has implications across short-term, medium-term, and long-term ranges:
Not necessarily in that order of priority.
Monetary is the obvious one everybody focuses on: if I get/take this job, will they pay me well? Duh... but how good is well, and how much is well enough? When do you prefer less monetary reward in return for intangibles?
Okay, will I get a chance to add or extend a good skill? To learn a new tool? To tackle a new kind of project, or a new kind of work?
Will this job look good on my resume in general?
Will this job give me the opportunity to add another bullet-point to my resume skills?
Will this job, on my resume, show a good progression/development in my overall career?
Will this job move me into or further along in an industry or a type of development area or a type of job that I want?
And most importantly, will this job nurture my passion?
I have this pet theory I call the "i-curve" (this name was a lot sexier before the internet bust :-). Don't mistake this for distinct career phases or anything; it's just a sort of roadmap. The "curve" part of the picture is that development, activity, motion, jobs, etc., are not distributed evenly along this line. Technologies don't move smoothly along this line either, but they are usually somewhere along it.
You have to figure out where along the line you feel best. Best being a balance between passion and comfort, generally, and comfort meaning, "comfortable tackling that sort of problem".
Ideation is "pure" research, e.g. figuring out a better compression algorithm, or other more creative work of that sort. Pure ideas; higher mathematics is a good example. The early, original work in public key cryptography is another really, really good example of this.
"Innovation" has been co-opted by Microsoft, who have some sort of odd definition of it that doesn't seem to fit the rest of the world's, so I need to find a replacement word; for now, let's stick with the original meaning. Innovation sits between ideation and implementation, and mostly consists of finding something useful to do with those pure ideas, and prototyping it. Sometimes, especially since the Clinton administration (which changed the rules on academic funding and spinoff startups), this is followed by the innovators leaving research or academia and trying to come up with a salable implementation (often called "productizing" a technology). Sometimes years pass before the rest of the world catches up to the innovation and makes it economically feasible, or relevant, or just plain useful enough.
Implementation is the kind of stuff most software people do. Build it to solve the problems in front of you, make it work well. Sometimes it's cutting-edge, bringing innovation into implementation, but mostly it's just using tried-and-true technologies and techniques to implement custom solutions to well-known, well-explored problem spaces.
I suppose that, statistically, most implementation work is just cookie-cutter app development, cobbling together yet-another-order-processing-system. This is more a reflection on the state of our industry, and in any event it's not really where I want to be, both personally and strategically. Personally because there's not much room for me to make a difference in that sort of project. Strategically because that's the kind of job that eventually gets automated away, either bit by bit as components become more standardized, or entirely as one company dominates the market.
Industrialization is when a specific technology starts to become commodified: the general class of problems the technology solves has become so well-defined and well-accepted (which are not the same thing, but which often must converge before a technology arrives), and the solutions so well-implemented, that that piece of the solution becomes less of a variable. The closer something gets to "install and configure it and then forget about it", the more industrialized it is.
SSL is pretty far along the curve, well into industrialization for most web users, for example. For web developers, it's still pretty industrialized, mostly you can just ignore it. You have to know more about it if you want to develop a client program that uses SSL, but most of the hard stuff is already done for you.
Ignorability is when you can pretend the technology isn't there, with some effort. Web servers, application servers, and database servers are generally at this stage. They take work (sometimes lots of work) to set up and configure, and in some cases (databases) to apply to your problem. But they generally don't occupy the focus of your attention.
Invisibility is when you just plain forget it's there most of the time. Phones are generally a good example of this. So are ethernet cards; it's hard to forget the network's there (because it doesn't always work :-), but when was the last time you spent more than 30 seconds noticing your ethernet card?
Let's look at the example of quantum tunneling. This is a weird effect that happens at a ridiculously small scale (16 nanometers), where electrons sort of pop up where you don't expect them (cue the sound of quantum physicists screaming in horror in the background at my simplistic explanation). Quantum tunneling was ideated decades ago, but largely stayed in the realm of pure theory until chip features became so small that quantum tunneling became something designers had to factor into their designs.
I'd like to put together some brief case studies of how things progress through these stages. Three examples I can think of, offhand: