
Success doesn't always breed more success; sometimes it causes more harm later on, and a small dose of skepticism is always welcome.

This idea has been baking in the back of my head for quite a while now, and it happened to me and the team I am part of at work over the last couple of months. I think it is worth analyzing what actually happened and worth keeping the lessons learned in mind at all times, especially during estimations.

Now on with the story. When I initially joined the company I currently work at, we started slowly, doing work and getting to know the application and its business logic. We had a lot of doubts in the beginning, so during the estimation meetings at the start of a sprint we were constantly unsure about the tasks we had to estimate. At first glance this seems like a bad thing, but it actually had a silver lining: that doubt kept us analyzing the tasks more in depth. We also looked into the existing code more often, which helped us too.

And quite often we would discover hidden issues that were not obvious at first glance: hidden dependencies between tasks, additional required changes, special edge cases that had to be covered, or incompatibilities between the newly designed features and the existing ones, or even between the new features themselves.

Eventually we got lucky, or unlucky, depending on how you see things, and we became pretty successful. After a while we knew the application well and no longer had to spend so much time learning it when starting a new task. We delivered more features with each consecutive sprint. Everyone was really impressed by our progress, including us.

But we weren't aware that we were becoming cocky and full of ourselves. This led us to stop analyzing tasks so deeply during the estimation meetings, which caused us to underestimate quite a few of them. That, in turn, made it look like we were delivering less and working less, when in fact things were much more complicated than we had initially anticipated. We analyzed tasks only superficially and didn't look into the code as often to get a sense of what changes they might imply.

We also became friendlier with one another. This caused us to agree with each other's opinions and ideas without scrutinizing them enough. It's good if everyone gets along, in fact it is crucial, but you should still look for faults and downsides in your colleagues' ideas even if you get along really well.

It took a while to get things back on track, and this is where the sprint reviews and retrospectives really came in handy. They were our chance to explain to the customer that things were much more complicated than initially anticipated and to give a proper explanation, with sound arguments, for why things turned out badly.

Thinking back, there should always be a healthy dose of skepticism during estimations. In small doses, skepticism makes us think things through properly and in depth. Too much of it, on the other hand, makes us get so lost in details that we forget the bigger picture.

It can be a really slippery slope, and it can happen slowly enough that people don't notice it. I actually read about this online; it's a phenomenon that happens in other fields too, such as the military. The Israeli army reportedly has a technique dedicated to situations like this: whenever a group makes a decision, at least one person is designated by default to disagree with everyone, forcing the entire team to analyze things thoroughly. To be honest, during meetings I am the kind of guy who disagrees with everyone just a bit, to make sure they have thought things through.

This doesn't happen only at the small scale it happened to me and my team. Quite a few big companies have fallen into this trap, companies like Microsoft, Nokia and so on, and not all of them managed to recover. I will leave the discussion about big companies here, because it has already been covered enough everywhere on the internet and many people are sick of it.

Another place where this can happen is during interviews. It has happened to me quite a few times: the interviewer asked me to solve a very well known problem, I became full of myself, and after I finished implementing the algorithm or solution I didn't test it properly. Speaking of testing, a healthy dose of doubt and skepticism forces you to think of every possible edge case to test after you have implemented something at work. Too much of it, though, can cripple you and force you to test every redundant case.

So in the end, a little bit of doubt and skepticism should always be present; they are healthy up to a point. And everyone should have the reflex, properly embedded in their behavior, of noticing when they are becoming way too sure of themselves, before it causes any damage.




