
Wednesday, February 28, 2018

My Pen Is My Tongue

A series of tweets about self-documenting code
A few days back I sent out a series of tweets about "self-documenting code". Self-documenting code is an idea that's been around for many years, like stories about the wee folk...and like the wee folk, no one has seen self-documenting code.

The short version of the tweet series is that if you're writing code, you should be writing documentation as well - it's really too important to skip. This post, however, is not really about self-documenting code, but rather about how to write documentation, and more specifically a certain piece of documentation that you should never neglect.



I entered the whole techno-geek world at a time when computer labs were a real thing. Punchcards and shelves full of binders stuffed with documentation were commonplace. Documentation isn't like that anymore. When Java came along, I was almost enthusiastic about using JavaDoc because of the clarity it brought to writing documentation. Now that nearly all code written by large, technologically advanced firms is either in Java or JavaScript (or ECMAScript), JavaDoc and JsDoc are - or should be - the de facto standard.

There is seldom serious argument against using one of these two tools anymore. There is disagreement about how the tools should be used, however. In the JsDoc community, one of the points of contention is the @author tag. To be clear, the JsDoc tool authors have stopped using the author tag themselves, and no 'contributor' tag has been added. It might seem, from this use (or non-use), that these tags are unimportant, and in fact that is a common perception, especially in light of the advances in source code management, or what we used to call "version control".

However, not only should you use the authorship tag(s), you should be encouraging everyone else to use them as well.
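For anyone who hasn't used it, here's a minimal sketch of what that looks like in a JsDoc comment. The function, author, and details are hypothetical - the point is simply that the @author tag travels with the solution rather than with any one commit:

    /**
     * Calculates a risk score for a transaction.
     *
     * See the original author for the reasoning behind the chosen weights -
     * the "why" that the code alone cannot tell you.
     *
     * @author Jane Doe <jane.doe@example.com>
     * @param {Object} transaction - The transaction to score.
     * @param {number} transaction.amount - Amount in the smallest currency unit.
     * @returns {number} A risk score between 0 and 1.
     */
    function calculateRiskScore(transaction) {
      // Implementation elided; the documentation is the point here.
      return Math.min(1, transaction.amount / 1000000);
    }

However the code changes hands, that tag survives squashed history and rewritten branches.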

It would come as a surprise to no one if I reminded you that we write code to solve problems. Not only are we writing code to solve problems, we're writing code to solve complex problems. For example, no one would write code to add two numbers...doing simple calculations on large data sets, perhaps, but there is a "complexity bar", below which we wouldn't dream of using code to address a problem. The first step of writing code is understanding the problem you're trying to solve.

As a hypothetical example, let's assume you've inherited a project. You've read the documentation that describes the code's solution to the problem, but after getting a small understanding of the problem combined with the solution being used, you have a list of questions. Why was this particular solution chosen over other solutions, for example? You can make some assumptions, but wouldn't it be nice to be able to contact the author to ask for their insight? Code, even well-documented code, is only a partial story. As every fan of a book turned into a motion picture knows, even faithful adaptations leave out bits that someone thought important. The first reason to include authorship information in your documentation, then, is the abundance of information it can point you to.

The common response to this concept is that the authorship information is not needed in the documentation because source control software, like git (my personal favorite), can track that information and expose it through tools like blame.

This response, however, misses the purpose of such tools. Version control is tied to a specific change...in git parlance, a commit. Yes, you can look at a particular line and see the last change of that line - the author of that change - but that is qualitatively different information than the author of a solution...and that information is generally only the last change. In order to get authorship information you must follow changes to a specific line back through history, and if at any point history was squashed or rewritten, that information is gone. Version control tools are excellent at solving the problem the author intended them to solve, as the author understood the problem; do not expect another author's code to solve a problem as you understand it.

Another reason authorship is important is that we, as an industry...and really we as the human race...have difficulty acknowledging the contributions of women and persons of color. The list of women who have significantly contributed in STEM fields without attribution is long...far too long. Not including attribution participates in that system of oppression by reinforcing the status quo. If we want to have any hope of disrupting patterns of discrimination, patterns that have existed for millennia, we must combat them at every turn.

A while ago I wrote a post called Visibility and Obscurity that described a situation in which attribution was changed on work I had done. In academia this is typically called plagiarism, and in most instances it's a punishable offense. Even outside academia, claiming to have done something you have not done can have serious consequences - Scott Thompson's resume scandal is evidence of that.

We should be writing code we can release with pride. Build things you're proud of and put your name on them...and give that same consideration to others. Amplify voices that are too often silenced or ignored - it does not diminish your contribution and it makes a difference. If it only makes a difference to the woman or person of color who finally has their contribution recognized - that's enough. If the only people who see an authorship reference are your employees, your colleagues, that's enough - they are important too.

Happy coding.

Monday, June 26, 2017

Visibility and Obscurity

Several years ago, when I first started at PayPal, the front-end development environment was still fairly young. As a result, tools that might have existed in other environments were missing.

As a veteran coder, I quickly grew tired of repetitive tasks - I wanted to be writing code - and set about writing scripts that developed into a significant tool suite. I shared that tool suite with both front-end and back-end developers (there were no full-stack developers in those days) and the use of those tools spread throughout the company, across the globe.

Out of that activity, there were two different experiences that bear examination. I'll address the later of the two experiences first.

In later years, as the development environment matured, another engineer - one responsible for establishing a standard development environment - took control of the tool suite (totally understandable) and put his name on my work (not understandable). The tools I had birthed, nurtured through numerous changes in the development environment, and continually promoted so they would be visible to all engineers were adopted, and their new foster father promoted himself as their creator when they became visible to upper management.

This is not an unusual situation. It happens all too often - much more frequently to women, of course - that someone other than the individual who has done the work takes credit, especially as the work becomes more visible.

That experience taught me two lessons. First, how you handle it says volumes to those who see the situation. Second, obscurity can be moments away, behind someone else's shadow, even when you think the visibility you've worked to cultivate over years is secure.

The second experience was much more pleasant. On a regular visit to a development office, I was introduced to an engineer who had recently joined the company. The engineer and I exchanged pleasantries - the normal "nice to meet you" bit - and then the engineer who introduced us told her my username (which was explicitly tied to the aforementioned tool suite)...and her expression and demeanor shifted dramatically. As someone who's never been in the "popular" club (yes, I've been a nerd and geek since before secondary school), that reception was quite an ego boost.

I had no real expectation of receiving such a reception - none of my long-time friends who'd seen me develop the tools reacted in the same manner - and it caught me by surprise. That reception also taught me a lesson - there will be some ways in which you're always more visible than you believe you are.

History is eager to write out of the picture those who have struggled to build great things - whether it's a woman who's made a significant contribution to our community (like Nicole Sullivan, the creator of OOCSS) or a man who is more interested in the work than the credit (like Nikola Tesla).

When you find yourself in these situations - situations of visibility and/or obscurity - how you navigate those shoals says volumes about your ambition, your drive, your values - such as integrity and trust - and what you know to be true about yourself. In those situations, may you have fair winds and running seas.

Happy coding.

Wednesday, May 31, 2017

Unnecessary Complexity: A case against ReactJs

I'll admit, even though the title of this post might imply otherwise, my experience with ReactJs is limited. Unlike a lot of UI engineers, I have been working primarily in pure HTML, CSS, and JavaScript since I began more than two decades ago. Oh, sure, I've used popular JavaScript “libraries” in the past – like YUI – and I've written more than a few over the years for some pretty big companies. I've also used some pretty popular “frameworks” – like BackboneJs – and combined them with other JavaScript libraries (e.g., NodeJs, Express, and DustJs). Overall, even though I had no prima facie opinion of ReactJs, I've avoided it – in much the same way that I've avoided winning the lottery – but all that has changed with the current workscape as more and more companies adopt ReactJs.

I should mention that I'm generally not a fan of any websites or applications built without using Progressive Enhancement, but then if you've read much of my writing you already know that, so the subtitle – A case against ReactJs – is a little misleading as this isn't just a case against ReactJs but against a practice of which the use of ReactJs is just an example.

I also must point out that I'm not a fan of the WSOD (white screen of death) that results from many JavaScript-driven pages. While that's more a general issue with client-side frameworks and how they're woven into a front-end architecture, it also applies to ReactJs. I'm also not a fan of loads of JavaScript that is dependency heavy, intercepts DOM events, encourages a development process that isn't progressive, or discourages graceful degradation.

So, why single out ReactJs when it's clearly not the only library to do this? Good question. It's popular. Massively popular. From the number of job descriptions including it as either a requirement or “nice-to-have”, it's pretty easy to see that without knowledge of or experience with this particular library (it's not a framework), it's becoming very difficult to even get past the CV screening phase.

Although ReactJs is not the only example of client-side libraries that dot the Interwebz landscape, as one of (arguably) the most popular libraries, it bears close examination. And although ReactJs isn't the only example of what's wrong with UI engineering – there are plenty of other examples – most of them boil down to the willingness of engineers to sacrifice the user experience in an effort to make their job easier.
The more layers are piled into increasingly complex systems, the more failure paths we introduce. We’ve learned that automation does not eliminate errors. Rather, it changes the nature of the errors that are made, and it makes possible new kinds of errors.
Capt. Chesley B. “Sully” Sullenberger

I hear you, and yes, they do say that it's a poor craftsman that blames his tools, which means all this flak I'm directing toward ReactJs might be misplaced. Am I not just blaming a tool for poorly-written code? There are two distinct lines of response that I would take. First, saying that it's a poor craftsman that blames his tools does not imply that the tools used are unimportant. No craftsman would wield a dull blade that made rough cuts when fine cuts were the goal. Every craftsman also knows that other famous tool-related expression: when the only tool you have is a hammer, everything looks like a nail. There is a tool appropriate to every job. Second, I would posit that libraries and frameworks – things like ReactJs – are not, in fact, tools.

If we look at the artisan analogy, the tools in that case are HTML, CSS, and JavaScript. Libraries like ReactJs, and even frameworks like AngularJs, are not really an artisan's tools – they're not the base ingredients that make up a dish; the closer analogy is that they're the prepared foods other artisans have made. As prepared foods, they make the kitchen's job easier, but at a cost, because they remain in the end product even after it's gone to market. They're the frozen, processed food of the Interwebz, encouraging people working in the kitchens to masquerade as Beard Award winners.

Taking this foodie analogy further, as each new processed food is typically built upon other processed foods, the list of ingredients (dependencies) grows longer as “more layers are piled into increasingly complex systems”. These increasingly complex systems (dishes) are not only more fragile but also fraught with other issues, such as increased payload size (which is an issue for anyone connecting over a data-limited network, like mobile) and performance issues as all downloaded code executes in the browser. In the end, users end up with a bloated mess, but at least the “engineers” got the code out in time. They are literally, as the saying goes, “getting shit done”...and just as we wouldn't call someone working in a kitchen combining prepared foods into what must only loosely be described as a “dish” a “chef”, we shouldn't call those who create the monstrosities only loosely termed a “user interface” an “engineer”.

We must, as a community, get back to building actual user interfaces. We must, as a community, stop the madness. We must, as a community, become engineers again. Get the HTML out of your JavaScript. Get the CSS out of your JavaScript. Build agnostic, slim interfaces that everyone can use. We must, as a community, because we are the only ones who can.
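To make that concrete, here's a minimal sketch of the kind of layered, agnostic interface I mean. The element ids and endpoint are hypothetical, and it assumes a server-rendered search form that already works with JavaScript disabled:

    (function () {
      'use strict';

      // Assumed markup (hypothetical): <form id="search" action="/search" method="get">
      // and a results container <div id="results">, both usable with no JavaScript at all.
      var form = document.getElementById('search');
      var results = document.getElementById('results');

      // Feature-detect; if anything is missing, the plain HTML form still works.
      if (!form || !results || !window.fetch) {
        return;
      }

      form.addEventListener('submit', function (event) {
        event.preventDefault();

        var query = new URLSearchParams(new FormData(form)).toString();

        fetch(form.action + '?' + query, { headers: { Accept: 'text/html' } })
          .then(function (response) {
            if (!response.ok) {
              throw new Error('Request failed: ' + response.status);
            }
            return response.text();
          })
          .then(function (html) {
            // Swap in the server-rendered fragment - the same markup the
            // no-JavaScript path would have produced on a full page load.
            results.innerHTML = html;
          })
          .catch(function () {
            // On any failure, fall back to the browser's default behavior.
            form.submit();
          });
      });
    })();

No HTML lives in the JavaScript, no CSS lives in the JavaScript, and every user-agent - scripted or not - gets the content.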

Friday, January 10, 2014

What Isn't Said

If you're a follower of this blog, you'll notice that my posts tend to fall into three general categories: posts about how to do something, like put PayPal on your Facebook page, build a slider toggle, or include reference notes; posts that show a different side to things in technology industry news that catch my eye; and posts that are general career advice from someone who has spent a few years in a very turbulent industry. I'm not sure if this post fits in any of those three categories, or if I'm starting a fourth after reading the blogs written by two men I consider to be, at the very least, something more than acquaintances (http://www.thejourneyismydestination.com/ and http://www.codercowboy.com/). I should point out, I suppose, that neither of these has the reputation in the industry of Eric Meyer (http://meyerweb.com/) or Nicholas Zakas (http://www.nczonline.net/), but I suppose that gives them a little more influence in my estimation because they are writing not because they have to but because they need to, and I see something of myself in that - and besides, their year-end posts were good.

In addition to the inspiration from other bloggers, this time, as I looked back on the past year and looked forward to a new year (as many of us do at the start of a new year), I came across interviewing tips from recruiters, and one in particular caught my eye as I read through the post, asking myself the interview questions as part of my year-end self-reflection. The question caught my eye in part because, as someone who has conducted several interviews and 'phone screens', I find it to be a question that I've been asked but have never asked - it's simply "what is your greatest weakness?"

This time, perhaps I found insight that has eluded me in previous years, or perhaps I have rediscovered a forgotten truth, but I recognize that there are those who see my greatest weakness only as a weakness, while I see my greatest weakness as a strength as well. This difference in perspective likely comes about because we all expect other people not only to understand our actions - because they're based on beliefs that spring from rational thought - but to share those rationality-generated beliefs our actions are based upon. However, that universally-held, unspoken belief is false - there are those who do not understand our actions and do not share our beliefs, and likely never will - their perception is fixed and the die is cast.

Here's where I offer a bit of advice. When this happens to you - and it's extremely likely it will at some point - you will attempt, perhaps without fully realizing what is happening, to cast what others see as your greatest weakness as a strength. This is a Sisyphean task, and no matter how many times you roll that boulder up the hill, scrabbling for every inch of dirt, it will roll back down - and all the while, none of us acknowledge or challenge our perspective unless we trust each other - really trust each other - and remember that we're human, doing the best we can with any given situation.

As the past year closes and a new one begins, I am also reminded that good leaders know the strengths of those on their team, and beyond that, great leaders see strength where sometimes even team members see only weakness. As we work together, maybe we should take Peter Drucker's words to heart and listen for what's not said - search for those points of weakness - and talk about why we see them as weakness, and then use our trust in each other to move beyond that and see them - really see them - not only as weakness but as a hidden strength, because whether we're the team captain or just one of the players, we can all benefit from the humanity that comes from trusting each other.

Monday, December 16, 2013

You will be assimilated

Freedom is irrelevant. Self-determination is irrelevant. You must comply.
Borg Collective

You will be assimilated. Resistance is futile.
Hugh

In "Conversion and Acquisition", I wrote about the inverse relationship between conversion and acquisition, possible causes of the inverse relationship and how it might be fixed. In this (much shorter) post, we're going to look at this same issue from another angle.

Let's assume that you have not implemented a forced acquisition method and you ask yourself "what do I know about my customers", and then ask the same question after you implement a forced acquisition method - will your answer be the same? Unlikely. The motivation and values of repeat customers are likely different from the motivation and values of people who are occasional users. Let's consider the simplest of these differences - repeat customers have a vested interest, to at least some degree, in your continued operation, whereas those who intend to be single-use visitors are not invested in your business to any degree.

Why is this important? Every day we make assumptions and decisions based on what we know about our users. If our representative sample changes, those assumptions and decisions must also change. There may be simple assumptions about the design of a web page that are incorrect - assumptions that we can address by A/B testing - but what if there are assumptions associated with the risk of a transaction or possible fraud? Those are considerably more difficult to test and correct.

In short, the more you rely on knowing your users the more contraindicated a forced acquisition method is.

Oh yeah, it's a bloody evil thing to do, too...just look at the Borg.

Saturday, November 9, 2013

What's in a name?

Women face a number of barriers in science-based endeavors, perhaps more so than in other fields.[1] This matter is not really even open for debate. What is up for debate is whether or not it's justified and whether or not we will actually do anything about it.

Much debate surrounds the causes of the gender disparity evident in many fields. Some argue that girls and women do not pursue STEM[2] educational programs and therefore either show a lack of interest in the topics or aren't generally qualified to pursue the programs. This is almost certainly due, in part, to traditional gender roles, but it cannot be limited to that, as the limitations based on traditional gender roles have decreased as time has passed and societal norms have adjusted.

Another portion of the lack of pursuit of STEM programs by women is almost certainly self-inflicted doubt. This can be seen in a 1946 conversation between Einstein (yes, that Einstein) and a South African girl named Tyfanny. In corresponding with her, after she revealed her gender, Einstein said,
I do not mind that you are a girl, but the main thing is that you yourself do not mind. There is no reason for it.[3]
Einstein recognized, in Tyfanny's words, the self-doubt resulting from generations repeating the societal refrain "you're a girl".

These problems are significant, and we must fight tenaciously to overcome them; however, these facts alone are not enough. These are facts of history - facts that society has dealt with for years - and yet one might argue that while female representation is much lower in STEM-related fields, it is significantly imbalanced in many fields.[4] Why is this? Why are we not convinced that science, technology, engineering, and mathematics, especially, are about ideas and not something as trivial as gender? Are we really so blind as to be unconvinced that women can think as well as men?

I refuse to believe that it is something in our conscious behavior, and I posit that our bias goes much deeper than we originally thought. We have convinced ourselves that, even if the larger populace does not subscribe to a meritocracy, those of us in STEM-related fields are well into a meritocracy - but we have deceived ourselves.

In what should have been a mind-blowing study written more than a decade ago, Rhea E. Steinpreis, Katie A. Anders, and Dawn Ritzke revealed that both men and women demonstrated gender bias in hiring recommendations.[5] The subjects for this particular study were all PhD-level psychologists - people who should recognize that science is about ideas and not gender, people who should recognize trivial and non-trivial information for what it is. In a similar study, written just last year, it was demonstrated that even among science faculty at research-intensive universities, gender biases favor male students.[6]

What these two studies illuminate is that our gender bias is so thoroughly ingrained that even individuals who are trained to deal directly with data, identifying what is trivial and non-trivial on a daily basis, are incapable of suppressing something as trivial and unreliable as name-based gender bias. Before anyone starts with the 'academia vs. real-world' arguments, a cursory search regarding this topic yields some very interesting anecdotal evidence that supports the same hypothesis.[7]

We are, like the characters in Shakespeare's Romeo and Juliet, using names as a priori judgments. These two studies also speak volumes about our decision quality, our hiring and staffing policies, our integrity and values, our knowledge about our ability to evaluate people and ourselves, and even our ability to manage diversity.

When otherwise qualified candidates are eliminated from the process based upon their name, it's easy to see where a significant portion of the disparity originates. We can work to correct gender stereotypes and eliminate gender roles from early education, we can do a number of things to encourage girls to enjoy and pursue STEM education and programs, we can even build gender-based groups that encourage and promote not just gender balance, but women in the workforce, on university and work campuses across the country. None of our efforts to increase education, ban words, or anything of the sort will mean anything until we eliminate the gender bias that demonstrably occurs at the first step of any selection process.

Of course, one of the worst parts of this situation is that even though this has been a known issue for more than a decade, we've done nothing to change the situation even though it is incredibly easy. How easy? Here are four simple policies that every organization could adopt with little to no impact on their schedules or bureaucracy, which would alter the landscape significantly:
  1. Publicize the existence of gender biases in relation to CVs and resumes and what is being done to compensate for or correct them.
  2. Replace names with unique codes on all CVs and/or resumes that are submitted, prior to their being screened (a sketch of how this might work follows this list).
  3. Restrict access to names and codes during the selection process.
  4. Identify discussion of a candidate's name as especially problematic and a punishable offense.
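As a minimal, hypothetical sketch of what policies 2 and 3 could look like in practice - the field names and data here are purely illustrative - the anonymization step can be very small:

    // Hypothetical intake step: replace candidate names with opaque codes
    // before screening. The name-to-code mapping stays with HR (or an
    // automated system) and is never given to the people doing the screening.
    const crypto = require('crypto');

    const nameToCode = new Map(); // held outside the screening process

    function anonymize(candidate) {
      const code = crypto.randomUUID(); // opaque, non-guessable stand-in
      nameToCode.set(code, candidate.name);

      // Screeners receive everything except the name.
      const { name, ...screenable } = candidate;
      return { code, ...screenable };
    }

    // Example: screeners see only the code and the qualifications.
    console.log(anonymize({
      name: 'Ada Lovelace',
      yearsOfExperience: 10,
      skills: ['mathematics', 'analytical engines']
    }));

The mapping could live in whatever system of record the organization already uses; the point is simply that the people screening never see it.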
As a bonus, when these policies are introduced, other name-based biases will be reduced or eliminated as well - the most notable are race, nationality, and religion, because when Juliet, in Romeo and Juliet, asks...
What's in a name? That which we call a rose by any other name would smell as sweet.
Romeo and Juliet, Act II, Scene II
...as it turns out, there's more than enough information, and if you don't believe me, just ask Romeo.





Notes and references. Links in the notes and references list open in a new window
  1. You can find the research regarding the types of barriers women in science face, published by AAUW in "Why So Few? Women in Science, Technology, Engineering, and Mathematics", at http://www.aauw.org/research/why-so-few/
  2. Science, Technology, Engineering, and Mathematics
  3. This tidbit is revealed in "Dear Professor Einstein: Albert Einstein’s Letters to and from Children" by Alice Calaprice, along with views on gender's relationship to the study of science that were far ahead of his time - i.e. it doesn't matter.
  4. One recent edition of philosophers' sound-bites (Philosophy Bites, by David Edmonds & Nigel Warburton) references 44 males and 8 females - a paltry 15%.
  5. The study is called "The Impact of Gender on the Review of the Curricula Vitae of Job Applicants and Tenure Candidates: A National Empirical Study" and you can easily find it online and read it in its entirety - which I recommend.
  6. The study is called "Science faculty’s subtle gender biases favor male students", by Corinne A. Moss-Racusin, John F. Dovidio, Victoria L. Brescoll, Mark J. Graham, and Jo Handelsman. You can read it at http://www.pnas.org/content/109/41/16474.full.pdf+html.
  7. In the blog post "I understood gender discrimination once I added 'Mr.' to my resume and landed a job", an individual seeking employment in a non-STEM-related field relates how self-identifying as a male on his CV made a positive change in the response rate to his inquiries.

One last note: If you follow this blog, you might have noticed that I've been missing of late. To offer explanation (not justification or apology), I will say that sometimes personal lives get very busy, we have a temporary shortage of creativity (e.g. writer's block), and we need time to work up the courage to say what we need to say the way we need to say it rather than just exclaim "WTF!" and be done with it. For me, it's been a mixture of all of these as I've seen my oldest niece married, contemplated my daughter's education, and ruminated for quite some time on how to address gender disparity in hiring, even discussing the policies that will correct it with women in technology companies before writing this post.

Tuesday, September 3, 2013

I'm not dead

There's a three-person scene in Monty Python and the Holy Grail in which the dead are being collected. Every time I read another blog post about how Progressive Enhancement[1] is dead, and today there was another,[2] I'm reminded of this conversation. For those who are unfamiliar with this particular classic, here's a bit of the dialogue from that scene.
Citizen: (Carrying a not-yet-dead-man over his shoulder) Here's one.
Collector: Nine pence.
Not-yet-dead-man: I'm not dead.
Collector: What?
Citizen: Nothing. (handing the Dead Collector his money) There's your nine pence.
Not-yet-dead-man: I'm not dead!
Collector: 'Ere, he says he's not dead.
Citizen: Yes he is.
Not-yet-dead-man: I'm not.
Collector: He isn't.
Citizen: Well, he will be soon, he's very ill.
Not-yet-dead-man: I'm getting better.
Citizen: No you're not, you'll be stone dead in a moment.

Aside from the comedic dialogue that goes on in my head when I hear these arguments, I have other, more philosophical thoughts - such as "the conclusion presented is not supported by the arguments presented because they are [insert your favorite logical fallacy here]". Rather than evaluate any one post, I'll sum up the general approach and say that after repeatedly hearing arguments against Progressive Enhancement, I've determined that the arguments that aren't fallacious usually go something like this:
  • because of browser improvements Progressive Enhancement is no longer needed
  • Progressive Enhancement takes too much time and money

While I might argue the factual basis of the "too much time and money" argument, and my experience runs counter to this statement, I'd rather focus on the first argument because it is the most prevalent. Let me also say that this will not be an attempt to placate anyone in the 'Progressive Enhancement is dead' group - I'll leave that to others who are better suited to twee statements intended to make people feel better.

Instead of arguments about the effectiveness or apologies for offense, what I will say is that the issues Progressive Enhancement was specifically intended to address are not, and cannot be, addressed by newer browsers, even if browsers are "the world's most advanced, widely-distributed application runtime (sic)". The primary goals of Progressive Enhancement are to make information accessible through sparse and semantic code - of course it reduces other issues - e.g. XSS attacks in inline style and scripts - as well.

Since we are being paid to think, it may serve us better, in determining whether Progressive Enhancement is not yet dead, to first ask whether the goals of Progressive Enhancement are still relevant rather than claiming it is dead because browsers are much more advanced. Of course, each of these topics should be covered in its own post; however, for this venture a simple, cursory look will have to be sufficient.

First, we ought to ask why a sparse code base is important and whether it is served by better browsers. In a previous post, I mentioned an Oakley site that was issuing 400+ requests and ran around 850MB. Granted, in the interim they've modified the site and it's now down to a slim(mer) 80+ requests and 3.5MB. If we're paying per MB downloaded, we might not worry about 3MB, but 850MB might be another story, not to mention the amount of time required. In an increasingly mobile world, sparseness must be a consideration.

Second, we ought to consider semantic code. As the variety of user-agents increases, the relevance of semantic markup becomes more pronounced. While the effects of semantic markup on SEO may be debatable, other effects are not.[3]

Third, and possibly most importantly, we ought to ask why accessibility is important and whether it is served by better browsers. The answer to this must be that while accessibility is greatly affected by better browsers, there are still significant accessibility issues surrounding current methods of development. While it may be argued that Progressive Enhancement does not solve all of these issues, it does answer a greater number of these issues than other methods.

To suggest that accessibility is unimportant, or even less important than other factors, as is implied when Progressive Enhancement is shunned, is a question of integrity, trust, and value. One cannot be egalitarian or even trustworthy (or, arguably, functioning within the law in many countries) if one does not address issues of accessibility. We, as a development community, simply must consider accessibility first, because it affects every other decision we make, from design to content to the manner in which content is delivered to the user-agent.

Here we've come to it - accessibility is the most important argument for Progressive Enhancement, and one that is seldom made, and one that those in the "Progressive Enhancement is dead" group have completely ignored. To consider browsers as the sole user-agent consuming the information presented on 'the web' is either simply misinformed or grossly short-sighted. Since this has been ignored for some time, let me be clear - browsers may be totally awesome run-time environments, but they're not the only user-agents out there, and proceeding as if they are will open you up to all manner of evil.
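To make that point concrete, here's a minimal, hypothetical sketch of one such user-agent - a small Node script standing in for a crawler, feed reader, or screen-reader pipeline - looking at a page the only way it can, without running any scripts:

    // This user-agent never executes JavaScript, so content that exists only
    // after client-side rendering simply does not exist as far as it is concerned.
    const https = require('https');

    https.get('https://example.com/', function (res) {
      let html = '';
      res.on('data', function (chunk) { html += chunk; });
      res.on('end', function () {
        // A progressively enhanced page yields real, semantic content here;
        // a client-rendered page yields little more than an empty mount point.
        const headings = html.match(/<h1[^>]*>[\s\S]*?<\/h1>/g) || [];
        console.log('Headings visible without running scripts:', headings);
      });
    });

Run it against a page that ships nothing but a script tag and an empty mount point, and the list comes back empty.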

Of course, to suggest that Progressive Enhancement is the only method that will address these three simple issues is likely to devolve into a false dichotomy (and hence a logical fallacy). I must, therefore, acknowledge that there are other methods which can address the issues, but none that are as simple.

At the end of our quick glance at Progressive Enhancement it appears the need for it, or something like it, not only remains, but grows stronger, unless of course you're going to say sparseness and accessibility aren't important and play the part of the Collector and strike our Not-yet-dead Progressive Enhancement with a club before putting it on the cart of dead methodologies.


Notes: links open in a new window
  1. Progressive Enhancement is a strategy in which a document is coded and then additional technologies, i.e. CSS and JavaScript, are layered on top of the document to ensure accessibility and semantic markup. Learn more about Progressive Enhancement at http://en.wikipedia.org/wiki/Progressive_enhancement.
  2. Progressive Enhancement: Zed’s Dead, Baby by Tom Dale. 
  3. Semantic markup is not only cleaner, it's also more accessible. Read more about the importance of semantic markup on "How Important is Semantic HTML".

Wednesday, August 7, 2013

If I were to be completely honest

Much is made of being honest in the workplace, and rightly so. If we cannot rely on one another, the progress we may make and, indeed, the success of the team are at significant risk. In fact, I've blogged about honesty - in at least a tangential manner - on more than one occasion in the past.[1]

Typically, we think about "honest" in self-referential terms - e.g. I am being honest - but I believe that's inappropriate. Being "honest"[2] is a social construct that can only be understood externally. In essence, "honest" is an attribute one can assign to someone (or something) else, but attempting to assign it to oneself makes little sense, in part because assigning it to ourselves is in itself not honest - basically, if we are to be honest, we are not honest. We can reasonably argue the semantics of honesty, truth, and deception with regard to general communication; however, that's tangential to this post - which makes it more a conversation for the pub (or the comment section). I could just parrot Mr. Ballinger's recent post about honesty,[3] but since this is my blog, I'm going to put my own take on it and go beyond the little "white lies" we tell each other and focus on the deeper, more insidious dishonesty pervasive in organizations - a method of relating to people we might call the "game of thrones".

In the vast majority of organizations I've enjoyed working with, there has been an underlying political current that can easily carry away the unwary. In some cases people are far enough on the periphery that they can easily survive in the eddies of political maneuvering or simply wade in the shoals with little risk; in other cases, people fall victim to strong or hidden currents, often without even being aware they were at risk. In popular culture, the Song of Ice and Fire series[4] gives us one scenario after another wherein words are carefully chosen to be accurate but not precise or, in some cases, to actively deceive, and always delivered with concern and sincerity - and nearly always resulting in a character's death. This pattern is prevalent enough in corporate culture that there is a term for it - grin-fucking.[5]

While our lives at work would be markedly more pleasant if this pattern of political maneuvering passed into the dustbin of history, and there are all sorts of blog posts that suggest just that,[6] it is extremely unlikely that such an event will ever happen. Where does this leave us - the few brave souls who would stand in the breach to prevent the rampaging hordes - or perhaps the skittering vermin?

First, keep in mind that just as in Martin's fantasy, the threat does not generally come from the King-Beyond-the-Wall but from those whose cause you fight for every day. This is a sad, hard fact - one that people will try to deny for their group as they claim that their group does not (or would not) engage in grin-fucking. Trust everyone but cut the cards[7] - it is more likely that, contrary to popular idioms, what you don't know will hurt you.

Second, just because other people are grin-fucking doesn't mean you have to engage in the behavior. There will be risk involved with disassociating yourself from those who engage in the game of thrones. It is unlikely, even with the support of upper management, that corporate culture will shift dramatically when the currents are strong. Sometimes it will be the case that, as Cersei says, you win or you die;[8] in other cases you will be able to avoid the headsman. In both cases, however, you will find that your life will be better - which is really what is important.

Notes:
  1. Trust everyone at the table, but cut the cards anyway and Ethical conflicts are like the TARDIS
  2. Honest is defined as "free from fraud or deception; genuine or real; humble or plain; reputable or respectable; creditable or praiseworthy; marked by integrity; marked by free, forthright, and sincere expression; innocent or simple"
  3. No filter: the meanest thing Paul Graham said to a startup
  4. The Song of Ice and Fire is also known by the name of the first novel in the series A Game of Thrones by George R. R. Martin
  5. Dishonest assent accompanied by a smile, and often other social niceties. Urban Dictionary definition
  6. Don't be a Grin Fucker and Public and Open Debate is the Highest Form of Democracy are two of the better entries on the list.
  7. "Trust everyone but cut the cards" is Robert's Rule #18, and comes from a series of rules loosely fashioned after the Wizard's Rules.
  8. George R. R. Martin quotes.

Wednesday, May 1, 2013

Doing the impossible (with Robert's Rule #31 & #32)

This post is not going to be a 'how-to', or an example of critical thought applied to a current topic, but rather a prosaic reflection with career advice mixed in - somewhat like earlier posts.

When I was a young child, we were discussing poetry in primary school and I pulled a book of American poets off the shelf and found It Couldn't Be Done by Edgar A. Guest[1]. I still recall the note my schoolteacher sent home saying how much the poem reminded her of me, even though I cannot find the note - of course that was quite a number of years ago.

While I would not encourage anyone to be excessively optimistic (I would say pollyannaish, but I believe that unfairly associates optimism with feminism), there have been a number of credible studies that demonstrate the benefits of positive thinking[2]. Yet there is something beyond even positive thinking that I feel is crucial to our survival in a corporate environment - an indomitable will. For some, an indomitable will manifests as an "incorruptible patience" or "a destructive pursuit of perfection"[3]; for others, there are other ways - but they have this in common: they are not skill related and won't be found on the Programmer Competency Matrix[4]. In fact, it's in those times when we're faced with a situation we know is beyond our bounds that this applies, and it's what made Bert Bell's belief - that on any given Sunday any team could beat any other team in the league - real. (Robert's Rule #31 - success is about more than skill - is based on that belief.)

A personal story - several years ago I was preparing to fly to California for an in-person interview for a position that I considered a dream job when I found out that my sister, who lived 2000 miles away, was critically ill, and a few hours before I was to leave for my day-long interview, I learned she had died. My grief was beyond anything I had borne before, but I also recognized that the only thing I could do at that point was request PTO and book a flight - and one more day would make little difference in the grief or support that I, or anyone else in my family, could offer.

Even though I knew a rigorous interview process was beyond my bounds in that circumstance, I went and did my best. After I returned home, I contacted my employer and booked a flight, and left the next day. After I secured the job (yes, I did get it), I discovered that some who interviewed me noticed (what they interpreted as) a lack of enthusiasm - several commented to me that interviewing in that situation was something they thought couldn't be done, yet it was done - and well enough to secure the job.

So, here's the lesson I learned that day - if you want something enough and your will is indomitable, you will likely succeed - success is not guaranteed, mind, but very likely.

There was another lesson I learned that day - one that's probably more important - that I carry with me, especially every time I interview a candidate: on any given day we see only a part of a person, and, like in jazz, the important bits might be those not heard, so make allowances for what you don't see (Robert's Rule #32).

Or, in the words of Bill (in Bill & Ted's Excellent Adventure), "be excellent to each other".

Notes:
  1. If you've never read this poem, do so. I recognize as a (more cynical) adult that it's a bit trite, but it's somewhat uplifting and motivational, and there are times we all need that.
  2. The article How the Power of Positive Thinking Won Scientific Credibility is an interesting read regarding the evolution of thought and research in this field, and has links to a number of those studies.
  3. These are two traits of a fantastic programmer from Signs that you're a good programmer.
  4. http://sijinjoseph.com/programmer-competency-matrix/

Monday, March 19, 2012

Don't be afraid to be wrong (Robert's Rule #19)

[Tweeted 2011-06-14]

I know it seems pretty obvious, but leadership is leading. You would think that this doesn't need to be said, but often it does, because apparently we forget.

You might be amazed at just how many people in the industry are unwilling, likely because of office politics, to step out on a limb. Sure, you could be wrong; but you might not be. Since this isn't really about being ill-informed, we'll assume that you're doing more than making lucky guesses, but even if you're not, if you're not making judgements and, more importantly, letting people know what those judgements are, you're not leading.

What's the worst that could happen if you express a reasoned judgement and you're wrong? It's usually not as bad as you think. (Remember, we're assuming that you're just wrong, not ill-informed and wrong, because that's another problem.) Some of the world's greatest minds were wrong; some of them were wrong frequently. So, we're (most of the time) talking about other people knowing you were wrong...and if that's the case then this may come as a surprise...everyone is wrong sometimes, and you hiding that you're wrong doesn't mean other people think you're never wrong.

On the other hand, what if you hide your judgement all the time? If you hide your judgements people won't assume you're right all the time, they'll assume that you believe you don't know enough to make a reasoned judgement or that you're too afraid to express your judgement for some other reason. Generally, neither of those are well-received in scientific or engineering communities.

So, make a reasoned judgement, learn to express the judgement in a manner that reveals your reasoning, and most importantly, don't be afraid to be wrong (Robert's Rule #19).

Friday, March 16, 2012

Trust everyone at the table, but cut the cards anyway (Robert's Rule #18)

[Tweeted 2011-06-06]

Living in an area where there are multiple casinos within a short drive or a long walk, I've learned to see some things using gaming metaphors. One of these metaphors is trust everyone at the table, but cut the cards anyway (Robert's Rule #18), and if you are an empiricist like Hume, then this rule will automatically make sense.

How does it apply to work? First, if you are not able to trust your colleagues, work (and probably life) will be miserable. Of course the reverse is also true; trusting your colleagues will go a long way toward keeping work from being the worst part of your life. In fact, I've had some jobs I should have hated because they were such a poor fit, and yet I didn't because of my colleagues.

Second, not only will the inability to trust your colleagues make life miserable, it will also make it very difficult to accomplish what you need to accomplish. The amount of time you spend countering the machinations of office politics in a hostile environment will outweigh whatever other successes you have. In addition, in those times in which you don't succeed, your discomfort will be worse because of the negative self-talk that comes out of the lack of trust and your assumptions about yourself.

Of course, this doesn't mean that you should just blindly trust. After all, your trust can be pretty easily misplaced, and this is your livelihood we're talking about here. You can't just go about willy-nilly assuming that everything your colleagues do and say is true, and even a series of lucky guesses ends sometime, which is a very good reason to confirm what you believe to be true.

Of course all of this is to say trust everyone at the table, but cut the cards anyway.

Saturday, March 3, 2012

Ethical conflicts are like the TARDIS (Robert's Rule #7)

[Tweeted 2011-05-03]

One of the things that systems theory gives us is a self-reinforcing loop. Granted, systems theory isn't the only thing that contains this concept; however, its centrality to systems theory is significant.


One of the fables that I have enjoyed over the years goes a little something like this.... Once upon a time a thief was being pursued by the authorities. In his haste to evade capture, he dashed into a temple and donned the robes of a holy man. The authorities entered the temple to find a man in the robes of a holy man and asked if he had seen the thief they were pursuing. He assured them he had not. However, in order to continue to evade capture he had to continue to play the role of the holy man. After many years, a severe drought fell upon the land and the ancient scriptures said only the sacrifice of a holy man could end the drought. In his final act in the role of the holy man, the thief sacrificed himself. As he lay dying, rain began to fall.


We all know examples of little actions that grow into larger, more important actions. The most common example of this is deception. To maintain deception ever larger deceptions are required; they are the ultimate pyramid scheme. This is why most advisers caution against something even as small as padding your resume.


Once, while interviewing candidates for my employer, I was given the task of assessing technical abilities in web languages. All candidates had been pre-screened by telephone and resumes had been reviewed, so by the time the candidates arrived, we were confident in presenting several problems so that we could observe and assess their abilities. I should say that I find this task pretty meaningless in many cases. With the advent of online resources and the tendency of the development community to share resources and experiences, a few minutes of searching can answer most questions. Because of this, I also tend to be very lenient in my assessments and somewhat laid-back in approach.


However, in one instance, I presented the problem (in written form) and the candidate immediately handed the problem back to me saying "I can't do this". After 10 minutes of me encouraging him to attempt an answer, even if he was not confident of the result, he continued to refuse.


I have to add here that one thing we don't tell candidates is that if we believe that they are not qualified for the position they are seeking they may still be qualified for other positions in the company, and we may adjust the interview to determine if they are a fit for a different position.


The candidate I was interviewing was likely qualified for a different position; however, it was plain to see that he had been deceptive with his resume and during the pre-screen. It was a small thing, really, to say "yes, I can do that" when really he couldn't. The problem was that saying "yes, I can do that" when he clearly knew that he couldn't, blocked him from progressing further with that employer. Completely.


Simple, but difficult, especially when there's so much on the line. Now he's just one more example that helps me keep in mind Robert's Rule #7 - like the TARDIS, the first ethical conflict is usually so small it appears minor and so large it becomes a major career factor. And the candidate will continue to work for small start-ups, never really progressing to the next level.