The idea of this post came from a question that Ajit raised during our conversation. He has been attending some interviews recently and has spotted an interesting tendency among the interviewers, i.e. “they tend to ask a lot of can-you-define-xyz-testing kind of questions”! Ajit was curious why they are so interested in the definition of some testing buzzword, when they should instead be judging the competence of the tester being interviewed by asking her some practical questions.
I can understand how stupid it can be to ask such “define this testing terminology” kind of questions. The stupidity becomes more obvious if we take a look at the following reasons:
1. No testing terminology can be defined in a single unanimous way. Any particular term may mean different things to two different testers, testing organizations, schools of testing, or testing gurus, depending upon their own understanding of that particular testing terminology. Then how can the interviewer ask the candidate to define something and expect their definitions to match?
2. A single underlying testing principle might be called by different names (different testing terminologies) by different groups of testers, depending on their organizational practice, education, school of thought etc. Again, how can the interviewer ask the candidate to define something and expect their definitions to match?
3. Let’s say I have memorized the definition of a testing buzzword and I am lucky! My definition (luckily/coincidentally/by chance) matches the definition that my interviewer had memorized from his testing-institute days. Suddenly, I look more competent compared to other candidates whose definitions (due to bad luck) did not match the interviewer’s. But what does this prove? Does this prove that I have applied this understanding in practice? Does this prove that I can apply my understanding of that buzzword in real-life testing? Does this prove that I am a better software tester than the earlier candidates who could not define it in a way that maps to the understanding of the interviewer? If not, then I wonder what makes the interviewer think that asking for such definitions is a good way to judge how skilled the candidate being interviewed is!
I am not saying that knowing the basic fundamentals (read as theoretical testing) is a waste of time. I agree, learning the fundamentals of testing is important to start a career in testing. But wait, doesn’t that hold true for any other profession too? Isn’t theoretical knowledge necessary in any other profession? Then why all the fuss? The problem seems to arise when people in the interviewer’s chair start imagining that whatever they know is the ultimate truth about testing and that all the rest (read as candidates) must also know it. This problem becomes even worse when the interviewers start to think that testing is just all about textbook definitions and fundamentals (types of testing, levels of testing, testing life cycle, verification vs. validation, smoke vs. sanity testing, test automation, testing certifications, and so on).
Stories like Ajit’s often make me wonder if software-testing definitions are so important for becoming a skilled tester that they must be asked in interviews! Is it an absolute necessity to know what something is called (its definition) to be able to actually do it? I think the answer lies with Mother Nature. An average human child starts to smile, cry, sleep, play, crawl, stand and even walk well before it actually learns to SPEAK (define?)! If a child can do so many things even before knowing what they are called and how they are described (defined), then why should it be mandatory for a tester to define something in order to apply it in practical testing? I have come across many exceptionally good testers who are great at testing even though they often don’t realize that there might be a term to describe what they do or how they do it. At the same time, I have come across many testers who are like bookworms of a testing bible (they know every nook and corner of the textbooks and each and every word in them), but when it comes to applying that knowledge to testing, they often fail miserably.
Then why do we give so much importance to testing definitions in interviews? Is it wise to ask these questions just because we were asked the same in our interviews? Is asking for testing definitions really a good way to judge whether a candidate is a skillful tester [as if there were no better way to judge a tester’s competency]? Let me hear what you think.