In an effort to understand the testing industry, explore ideas about the software development process, and develop some interesting content, I am going to start conducting interviews with people in the software and testing industry for the Web Performance blog. Robin Goldsmith of Go Pro Management was kind enough to become the first Web Performance interviewee.
John: Robin, please tell us something about yourself and your background.
Robin: What you see is what you get: stunningly good looking, and I have been in the computer business since computers were working on wind power. I had a very technical background but also the good fortune to get involved with management consulting and thereby gain a management perspective too. I have been involved with Quality Assurance since before it was called QA. I’ve also been involved in outsourcing since the early days.
In the mid-1980s I developed and taught what I believe was the first and only course on buying software, and I've been involved in software acquisition from every angle. I've developed software for sale, assisted buyers and sellers, evaluated software, bought it, supported it, and have been the victim of what was bought. I've been very fortunate to be involved with a very well-known client in one of the first and largest efforts to outsource software to India.
John: Why would you use an automated testing tool as opposed to developing in-house testing scripts?
Robin: Any tool vendor, by their very nature and concentration, is able to develop proficiencies above and beyond what a normal company could develop in-house when a tool is only part of what they are doing. In addition, there really is no reason to reinvent the wheel, to coin a phrase.
Having said that, I was a systems programmer, and I developed a lot of tools. I developed them because there were situations where there was no tool available, or the available tools didn’t do what I needed.
The other thing is that I certainly appreciate there can be considerable issues involved in using and adapting the various commercial tools. But when you consider that there are not enough technical people simply to use the existing testing tools and develop a good testing environment, the chance that a customer has someone on staff with the time and skill to develop a tool is slim, and it's probably not the best use of their time anyway.
John: Why do you think there is a lack of skills in the industry?
Robin: I think the testing world is not attracting people. A lot of organizations feel that testing is not as important as programming. For example, during the technology boom when many companies were recruiting on college campuses, the recruiters would come to the campus and ask whether a candidate could program. If they said yes, they became a programmer; and if they said no, they became a tester. The irony is that when the testers started work they had to use automated testing tools, and to use them they had to know how to program. Developers, many of whom consider testing not as rewarding or creative as programming, encourage this view. Consequently, many people would not intentionally or voluntarily seek a career in testing. However, I think they are mistaken. I ask my students which takes more creativity: creating the error or finding the error? Of course, the greatest creativity is needed to make excuses for the errors that developers created but testers didn't find.
John: Where do you see testing tools technology heading?
Robin: I don't know that I see it heading anywhere in particular. What I mean is that a year or two ago there was a lot of excitement about performance testing. I might be traveling in different circles, but maybe organizations have gotten the issue of testing tools more under their belt; I haven't been aware of the same level of excitement in the tool market as there was a few years ago. Back then, functional tools seemed to reach a plateau, and then most of the attention focused on performance testing; but it too now seems to have hit a plateau. Certainly there are areas of the web that continue to create new technologies and new test tool needs. There is greater attention on security now, and if you think about it, security and performance are intimately related.
John: Here at Quotium Technologies we have been thinking a lot about the need for Independent Software Vendors (ISVs) to demonstrate software scalability to their clients in the client's environment. Can you recall any examples of ISV projects in your career where the ISV had to demonstrate scalability? What were the pitfalls, and what advice would you have for ISVs handling that issue?
Robin: Nothing comes to mind. A lot of my work involves presenting, training, and working directly with larger established companies and those with testing organizations. I think ISVs don't fit those models and are not getting the training. Also, I participate in a lot of conferences, and I don't see ISVs there. I suspect software companies are busy developing and don't consider themselves professional testing organizations.
John: Can you recall any scalability projects from the customer’s perspective?
Robin: Suitable use of testing is one of the hardest things for buyers to understand. For instance, very few buyers know how to structure an acquisition properly, and especially they fail to recognize how to use Proactive Testing. Yes, buyers need to define their acceptance criteria and test the ISV's application; but it's not just about running tests after the software has been delivered. In my training and consulting, I also emphasize how getting a vendor to provide a plan for reasonable and suitable testing can be one of the buyer's most valuable tools for managing the acquisition.
The effectiveness of both functional and performance testing depends primarily on having an effective testing process. Everyone says this, but for most folks it's only words. A testing process defines what needs to be tested. An automated tool can't do that; it can only assist the execution of the tests that have been identified. Proactive Testing, as we present and practice it, not only helps focus effort on the most important testing; identifying reusable test designs can also greatly enhance the speed and quantity of test automation. Most importantly, Proactive Testing helps developers speed delivery by preventing rework and showstopper errors.
John: Lastly, what are the trends in software development and testing?
Robin: It seems to me that once again things have quieted down, especially on the functional testing side. More and more organizations are less and less interested in writing automated scripts; they are looking for ways to automate the process. Therefore, more and more organizations are moving to data-driven testing and the even more efficient techniques using Action Words or similar keyword-driven frameworks, which involve some up-front technical effort to enable non-technical people to create considerable numbers of automated tests. So far, though, these approaches seem mainly to be used for functional testing. However, while it's not widely recognized, the same type of thing can apply to performance testing as well.
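For readers unfamiliar with the Action Words idea, here is a minimal sketch of a keyword-driven test runner in Python. It is my own illustration, not Robin's method or any particular framework's API: the action names, handlers, and table format are all hypothetical. The pattern is that technical staff implement each keyword handler once, and non-technical testers then author tests as plain rows of keywords and arguments, often maintained in a spreadsheet.

    # Minimal keyword-driven ("Action Words") runner -- a hypothetical sketch.
    # Technical staff implement each action word once as a small function.
    def open_page(url):
        print(f"opening {url}")  # a real handler would drive a browser here

    def enter_text(field, value):
        print(f"typing '{value}' into the '{field}' field")

    def check_title(expected):
        print(f"asserting the page title is '{expected}'")

    # The dispatch table maps each action word to its handler.
    ACTIONS = {
        "open_page": open_page,
        "enter_text": enter_text,
        "check_title": check_title,
    }

    # Non-technical testers author tests as rows of action words plus
    # arguments; in practice these rows usually live in a spreadsheet.
    test_case = [
        ("open_page", "https://example.com/login"),
        ("enter_text", "username", "jdoe"),
        ("enter_text", "password", "secret"),
        ("check_title", "Welcome"),
    ]

    def run(steps):
        for keyword, *args in steps:
            ACTIONS[keyword](*args)  # look up and execute each step

    run(test_case)

The same dispatch structure could feed a load generator rather than a browser driver, which is the sense in which the technique carries over to performance testing.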
The other trend is related to these techniques for automating the automation, but it's somewhat discouraging. To the best of my knowledge, what still is not spoken about very much in organizations, regardless of what type of automated testing is conducted and whether the tools are home-grown or commercial, is that the key element of effective test tool utilization continues to be the adequacy of the testing process. In fact, I fear that emphasis on the tools themselves tends to distract from identifying what to test, and especially from doing it in a systematic and economical way that covers the risks and enables more reusable tests.
John: Thanks, Robin, for taking part in the Web Performance interview. I really appreciate your answers and your talking with me and the readership.
Note: Robin and I continued to chat, discussing his recently released book, Discovering REAL Business Requirements for Software Project Success, within the context of the "bookends of software development," that is, business process analysis at the start of a project and performance testing at the end. Robin's book critiques and suggests important improvements to the conventional approaches many organizations use in developing software. Robin points out that traditional development continues to encounter requirements creep because developers continue to follow a "Field of Dreams" approach, thinking that whatever they decide to build must be what the business needs. Robin shows how to shift the focus to discovering the REAL business requirements first, and using more than 21 ways to test that the business requirements are accurate and complete, before getting wrapped up in what is going to be built. While presented with respect to software, the same approach and techniques are equally valuable for quality, sales, and marketing.