Wednesday, December 25, 2013

A New Broom

At the end of the year, I look back and think about the auld lang syne [1]...like many of you, I would suspect. This past year, for a lot of us, saw a jump into yet another technology stack as the newest broom - in this case dustjs - swept the UI engineering world (thanks to the powerhouse known as LinkedIn and, later, PayPal, who took the ball and ran with it). It was bound to happen sooner or later, of course. As one person commented about learning to code, you must realize that if you choose to become a developer you will learn to code every day for the rest of your life. [2]

This realization - or remembrance on my part - brings up another point, however. People, especially geeks in my line of work, get über excited about new technology, and a lot of that excitement is misplaced. Let me say at the outset, there are usually legitimate reasons for getting excited about a technology stack, but we are too often subject to more hype than reality.

Long, long ago in a galaxy far, far away...when I started on my career path, I worked for a company that was built on COBOL 74, courses covering C were finally making it into university programs, and RDBMS languages like PROGRESS and PARADOX were making inroads on PCs (which were still fairly new themselves). Those first few years when businesses were setting up token-ring networks and figuring out how to share data across multiple workstations that weren't linked to a mainframe were exciting - primarily because it was a minor revolution. The cost of several PCs, even in those days, was considerably lower than the cost of a mainframe and it became a lot less expensive to do business.

In a few years we were all connecting to a little corner of the Internet called "the World Wide Web" and there was another minor revolution as the online world enabled a globalization of business that previously had been the domain of international businessmen (and let's be honest, there really weren't international businesswomen in those days - so we're slightly less sexist now - not much, but a little). In the almost 20 years since, we've gone from HTML, with its simple styling tags, to CSS, JavaScript, and HTML5 in front-end code, and gone through (something like six production versions of) ColdFusion, ASP and ASP.Net...and C#.Net...and, well...everything Dot Net, Perl (and LAMP), PHP, Stripe, Ruby (and Ruby on Rails), Python...and on and on. Yes, the world has improved...except...it really hasn't.

In my time, I've written software for a number of organizations, and nearly every time I've changed organizations I've changed the language(s) I've used. This realization leads me to two conclusions - both relevant to today's environment. First, I cannot recall the last coding class I took - one that actually applied to what I was doing or about to do - in which I did not already know as much about the topic as, or more than, the instructor. On the other hand, I can easily recall the last time I used what I learned in a number of courses in maths, language, history, and philosophy - so that is what I will make sure I pass along to the next generation. [3] Second, the technology used by someone (organizations included) is nearly irrelevant.

Why would I, a geek at my core, say something like the technology is nearly irrelevant? Because I believe it to be true. For evidence, we need only look to any of a number of organizations, both public and private, functioning quite well on technology that is decades old. We might also consider that while there is no doubt I am not a fan of complexity, newer technologies are sometimes improvements, or even requirements, even though they typically add complexity to one or more of the many layers an organization has. To see the truth of this, one need simply follow the path of the World Wide Web from document markup and delivery to ecommerce.

You may be surprised at the number of organizations I have worked with over my career that have said "we're not responding to <something> fast enough because our technology is outdated, but we'll be able to respond much more quickly if we start using <your favorite technology>". You might also be surprised at the number of times an organization has been wrong when it has made such a claim. Let's look at this idea from a few points of view.

First, let's consider the learning curve associated with any new technology. If we have, for example, engineers expert in C, they will likely be able to switch to C++ with little effort. If, on the other hand, we switch to Java (which is a much more likely scenario) they are likely not experts and may in fact be novices, not only with the language itself, but with purer object-oriented languages in general. It takes time and effort to become an expert, time and effort that make the claim of being able to respond more quickly if a different technology is used questionable.

Since responding quickly is often a question of productivity, let's look at another activity - developers building productivity tools. Historically, manufacturing operations had a person (or persons) designated as 'tool and die maker'. However, because of the distributed nature of application development, that position has been abandoned. There may be productivity teams who look at how productivity can be increased, but typically those groups are too far separated from the process and paint with too broad a brush to be as effective as possible. Further exacerbating the problem - when the underlying technology changes, tools used previously are obsolete, even if the problems those tools solve are not. This alteration of process alone increases the learning curve associated with new technology, and if the underlying problems have not been addressed, new tools must be built to replace the obsolete ones.

These are just two of the myriad issues associated with changing a technology stack. So, if it's not the technology that makes an organization successful, what is it? How the organization functions.

One of our idiomatic expressions is "a new broom sweeps clean", and unfortunately this is true in organizations - sometimes through the diminishing of knowledge capital by significant turnover in human resources, but more often simply because the new person has a different way of doing things or is used to a different technology. How new brooms are put to use is part of how an organization functions, and if each new broom sweeps clean, there is little history from which to learn. [4]

Another aspect of how an organization functions - where money is budgeted - demonstrates the organization's priorities. In short, budgets are power - the larger the budget, the greater the power - and an easy way of securing a larger budget is by making the case that <your least favorite technology> is outdated and must be replaced. You can claim that you won't be able to compete for better job applicants, or that support for <your least favorite technology> is going away, or that <some other technology> is faster and more reliable. Some of your claims might even be true, but they need not be verifiable to induce a mild fear response and increase your budget. Of course, once you have the budget, you have to spend it to keep it - otherwise you lose your standing of power in the next round.

Of course, the truth of the matter is that there are job applicants who are expert in <your least favorite technology> and happy to work for well-functioning organizations, and most established technology will continue to be supported, simply because the organizations providing it understand the importance of backward compatibility. As for claims of <your favorite technology> being faster and more reliable...sometimes those are true, but typically only after it has become an established technology itself. Increased complexity rarely produces speed or reliability as offspring; rather, complex systems fail in complex ways and, unfortunately, failure is not an option - it's a certainty built into every system.

For all the hype, the technology should not be the ultimate concern. For instance, LinkedIn could just as easily run their entire site using classic ASP rather than dustjs. I understand their reasons for not doing so, and I understand their reasons for getting away from a fragmented stack [5], but I also recognize that getting to a point where the stack is fragmented is a function, not of the technology, but of the organizational culture and the way in which the organization operates.

There are usually a number of legitimate reasons for changing a technology stack, but let's step away from the hype of how "<my favorite technology> will help us get to market faster" (there are other, better ways to do that - Lean UX [6], for example) or the "how <my favorite technology> is better than <your favorite technology>" arguments that have been known to start something akin to a holy war in engineering departments. We're better than such puerile behavior, and maybe if we're not focused on how great the squirrel scampering by is, we can focus on building from our strengths and making more awesome...because sometimes we do not need a new broom, we need to learn how to sweep.

Notes: links open in a new window
  1. Days long past or, in American vernacular, the good ol' days.
  2. Dachis, Adam. Don't Learn to Code: Learn to Work with Technology.
  3. You can read more of my thoughts about coding classes in "Coding education, coding life".
  4. Those who cannot remember the past are condemned to repeat it. [Santayana, George. The Life of Reason; or the Phases of Human Progress. New York: Charles Scribner's Sons, 1905.]
  5. Basavaraj, Veena. Leaving JSPs in the Dust: moving LinkedIn to dust.js and client-side templates.
  6. Gothelf, Jeff. Lean UX: Getting Out of the Deliverables Business.
