So, who shot first, Han or Greedo? We all know that Han did, right? He may be a smuggler and a total cheat, but he's no fool...right?
Here's the thing. We hate change. All of us. We hate change so much that addressing change is even built into our religions in one form or another, whether our religion insists that the supreme being does not change or whether our religion promotes the idea that it is our response to change that is problematic. Only one thing is clear - since before Heraclitus' proclamation that no one steps into the same river twice, we have recognized that change in the human sphere is inevitable.
"What," you may ask, "does that have to do with Han and Greedo?"
Han and Greedo had met before, and though Greedo had been humiliated in the past, this meeting was decidedly different. Greedo was prepared to shoot Han; Han recognized that this meeting represented a change in behavior, and shot first.
We all face changes in jobs, technology, and lifestyles that affect our careers. The trick is to know when to pull the trigger and do things differently. For some changes, the trigger will undoubtedly get pulled for you rather than by you, and in some cases you'll likely be on the wrong end of the blaster and be left in the cantina. Try to minimize those instances. Wait until the last moment if you must, but don't be the one who shoots second. (Robert's Rule #24)
A long time ago in a galaxy far, far away... I gave a lecture called Getting Paid to Think to an academic society. In it I presented a simple hypothesis - an education in the humanities and thinking (e.g., Philosophy) is more beneficial than a skill-based education (e.g., Computer Science). This blog is dedicated to getting you to think as I discuss a variety of topics, most of which are related to my career in the tech industry.
Monday, March 19, 2012
Don't be afraid to be wrong (Robert's Rule #19)
[Tweeted 2011-06-14]
I know it seems pretty obvious, but leadership is leading. You would think that this doesn't need to be said, but often it does, because apparently we forget.
You might be amazed at just how many people in the industry are unwilling, often because of office politics, to step out on a limb. Sure, you could be wrong; but you might not be. This isn't really about being ill-informed; we'll assume you're doing more than making lucky guesses. But even if you're not, the point stands: if you aren't making judgements and, more importantly, letting people know what those judgements are, you're not leading.
What's the worst that could happen if you express a reasoned judgement and you're wrong? It's usually not as bad as you think. (Remember, we're assuming that you're merely wrong, not ill-informed and wrong; that's a different problem.) Some of the world's greatest minds were wrong; some of them were wrong frequently. So most of the time we're talking about other people knowing you were wrong, and if that's the case, this may come as a surprise: everyone is wrong sometimes, and hiding your mistakes doesn't mean other people think you're never wrong.
On the other hand, what if you hide your judgement all the time? People won't assume you're right all the time; they'll assume either that you don't believe you know enough to make a reasoned judgement or that you're too afraid to express one. Neither assumption is well-received in scientific or engineering communities.
So, make a reasoned judgement, learn to express the judgement in a manner that reveals your reasoning, and most importantly, don't be afraid to be wrong (Robert's Rule #19).
Thursday, March 15, 2012
Too often 'we can' erroneously becomes 'we should' and 'we will' (Robert's Rule #17)
[Tweeted 2011-06-03]
Once upon a time I was in a meeting discussing a web application scheduled for deployment in the immediate future. As we worked out the implementation details, we came upon the issue of needing to access private, restricted, highly confidential information. An additional wrinkle was that the database had to be maintained by the system of record, which lived on the internal network.
As we were discussing the options, one person (I'll call him n00b) suggested that we could easily solve the problem by joining the server to the DMZ and the internal network simultaneously. My response was an immediate "No, we can't do that." "Oh yes we can," the n00b replied. "All we need to do is install two network cards and use one for the DMZ and one for the internal network." In all honesty, I was not the first one to laugh out loud; my manager was.
The n00b was insulted and said he had used this approach for one of his clients (outside of work), so I explained that such a plan would create a bridge between the DMZ and our internal network, leaving not only the database server vulnerable but the internal network as well. The n00b had a few more, equally appalling suggestions, but in the end the group collectively brought him to a measure of enlightenment.
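The bridging point can be sketched as a toy reachability model. All host and segment names here are hypothetical, and this deliberately abstracts away real routing tables and firewall rules; the assumption is simply that a host with NICs on two segments will pass traffic between them.

```python
from collections import defaultdict, deque

# Hypothetical topology: each host lists the network segments its NICs sit on.
# "db-server" below follows the n00b's plan: one NIC in the DMZ, one internal.
hosts = {
    "web-server": ["dmz"],
    "db-server":  ["dmz", "internal"],  # the dual-NIC plan
    "payroll":    ["internal"],
}

def reachable_segments(start, hosts):
    """Return every segment an attacker starting on `start` can reach,
    assuming any multi-homed host forwards traffic between its segments."""
    seg_to_hosts = defaultdict(set)
    for host, segs in hosts.items():
        for seg in segs:
            seg_to_hosts[seg].add(host)
    seen, queue = {start}, deque([start])
    while queue:
        seg = queue.popleft()
        for host in seg_to_hosts[seg]:   # hosts touching this segment...
            for nxt in hosts[host]:      # ...drag in every segment they touch
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen

print(reachable_segments("dmz", hosts))  # the dual-NIC host pulls 'internal' into reach
```

Make db-server single-homed (internal only) and the same call returns only `{'dmz'}`: compromising the web server no longer gives a path to the internal network, which is the whole argument against the two-NIC shortcut.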
Of course we had the technical ability to do what n00b suggested, just like I've had the technical ability to do hundreds of other blatantly stupid things and several more that weren't quite blatantly stupid (even if they were of equally questionable value).
Perhaps more disturbing than a n00b fighting for a bad idea is the thought that, had the n00b been higher up the food chain, the situation might have turned out differently. I've certainly been in situations where I knew what was being asked was a bad idea, one likely to come back and bite me in the nether regions, and still had to implement it because 'the decider' had made the decision.
We all face such situations; in fact, they're far from uncommon. This is why Rule #17 states that too often 'we can' erroneously becomes 'we should' and 'we will'. Robert's Rule #17 is simply a recognition of a sometimes disturbing truth we, as technologists and engineers, live with every day.