Friday, October 10, 2014

How To Rapidly Infuse Technical Practices Into Your Agile Teams

Martin Fowler once wrote, “The scrum community needs to redouble its efforts to ensure that people understand the importance of strong technical practices.” Jeff Sutherland has commented, “It is extremely easy to integrate Scrum with XP practices even on large distributed teams. This can improve productivity, reduce project risk, and enhance software quality.” Ron Jeffries wrote, “Manual testing cannot sustain the two week delivery cycle. Therefore, for a truly successful iterative software project, automated testing is absolutely necessary.” 

You have to trust the team to figure out what they need to learn, and that they will learn it: they should not wait to be trained.

But what if your teams do not know how to do these things? Agile promotes self-learning: if you are going to trust a team to know how to do its work, then you also have to trust its members to figure out what they need to learn, and to actually learn it, rather than waiting to be trained. Still, it can take a long time for a team's members to learn a whole host of new skills on their own. Agile transformation is about making that happen rather than waiting for it to happen. How, then, can you get a team to learn the technical side of agile?

One approach that we have seen work really well is to create training teams. These teams provide intensive hands-on classroom training, as well as on-site coaching as a follow-up to the training. In 2012 one of us (Scott) created such a team for training a very large IT organization in agile technical practices, and this article recounts that experience.

The directive was to infuse knowledge of technical practices into all of the in-house software development teams across the organization's many locations around the country. The practices included test-driven development (TDD), test automation, automated quality analysis, and continuous integration (CI). Rather than go through a lengthy narrative on each approach taken, we summarize the approaches here and then discuss them:
  1. The training team consisted of two members (including one of us), each with strong development experience and experience with many agile technical practices and tools.
  2. Created a standard set of test tool stacks that could be installed easily, starting with stacks that supported testing for the most common development tool stacks in that environment (Java, Oracle).
  3. Trained entire teams at a time in a classroom, in person, with some lecture and discussion but mostly hands-on practice. (A sketch of a typical test-first exercise appears after this list.)
  4. The course took three weeks, with short two-hour sessions three times a week, minimizing impact on work.
  5. Web-based training was not used, because effective Web-based training materials are more difficult to build. Further, we wanted to ensure that entire teams were trained rapidly and that we had their undivided attention for a period of time.
  6. Followed up with those teams after training them, in person, using a “coaching” approach.
  7. Measured code quality, test coverage, and practice usage after the teams had been trained, and reported their progress in a management scorecard-style report.
  8. Kept track of which teams had been trained, and reported that in the scorecard. This put pressure on group managers to get their teams trained.
  9. We found that many (but not all) testers could be turned into test programmers in about three weeks, further enhancing the overall benefits.
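
To make the classroom material concrete: a typical hands-on, test-first exercise pairs a failing unit test with just enough production code to make it pass. The sketch below is our illustration, not material from the actual course; the ShoppingCart example and the use of JUnit 4 are assumptions.

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import java.util.ArrayList;
import java.util.List;

/**
 * A TDD micro-exercise of the kind used in hands-on training:
 * the test is written first and watched to fail, and then the
 * simplest ShoppingCart implementation is written to make it pass.
 */
public class ShoppingCartTest {

    @Test
    public void totalIsSumOfItemPrices() {
        ShoppingCart cart = new ShoppingCart();
        cart.add("book", 25.00);
        cart.add("pen", 2.50);
        assertEquals(27.50, cart.total(), 0.001);
    }
}

/** The simplest implementation that makes the test pass. */
class ShoppingCart {
    private final List<Double> prices = new ArrayList<>();

    void add(String name, double price) {
        prices.add(price);
    }

    double total() {
        return prices.stream().mapToDouble(Double::doubleValue).sum();
    }
}
```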

The result was that adoption of agile technical practices advanced rapidly across the organization. Some teams were eager for the changes and some were resistant, but most were simply supportive of the new direction and willing to give it a try and see how it worked. In one case a team (one of the most experienced and productive) rejected the use of TDD; since management had decided to make TDD mandatory, a senior manager spoke to that team's manager, and soon afterward the team was using TDD. Fortunately this was an outlying instance, since we value the Agile Manifesto principle of trusting teams to get the job done in the way that is best for them. To be fair, TDD has a very long learning curve, and it is normal for some people to resist, because it is a very substantial change in how one works.

In another case, a team that had been given a large, newly released application to support quickly recognized, because of the training, that the code it had inherited was fragile and difficult to test. The team reached out to one of the coaches, received in-depth coaching on how to deal with the problematic code, and within a month became self-sustaining and was already seeing improvements in code quality.

Creating the initial training materials was a very substantial task. It took two experts about four months to assemble an appropriate test tool stack and create the three-week course. We then started running teams through the course and following up with them afterward, traveling to their location and staying for several months, sitting with the teams to see if they needed help putting what they had learned in the classroom into practice on their actual projects. Our goal was to cement what they had learned in a real work setting. Training experts will note that our approach was therefore both experiential and contextual.

At the same time, we sought out people with an agile mindset and technical expertise to take on the role of internal agile technical expert. This gave the organization continuing in-house expertise long after we left.

How this all works together

Training an entire team together was very effective, both because it made it possible to assess the outcome in terms of the team's work, and because no one on a team had been trained while others had not. The entire team was offline together, which ultimately made it more productive as a group. It also made the training a team learning experience: everyone shared a common training experience and therefore shared common reference points. It was a shared journey. Management reporting later focused on the team's performance with the new skills rather than on individual performance.

Having a standard set of tool stacks was essential, both for training purposes and for sharing techniques across the organization. The intention was not to standardize and prevent the use of alternative tools, but to create a common, evolving baseline to foster communication and cross-training. The standard test tool stack also made it possible to rapidly stand up the tools on a tester's workstation, appliance-like. Testers who have transitioned to become test programmers are often less experienced technically than software developers, and installing and maintaining complex tools can be frustrating for them, so automated installation is very helpful.

Many organizations with remote locations want to use Web-based training for agile methods. There is no reason that Web-based training cannot be used, and indeed there is a lot of Web-based training available for some of the tools; for example, to learn the version control tool "git" there are online interactive tutorials. However, we wanted an intense, immersive experience, in order to ensure that developers were entirely focused on the training, in a group setting. The amount that needs to be learned is very broad and deep, and immersion is necessary. If the training were not in a classroom, we anticipated that people would not be allowed to devote entire days to it, which is what is needed in order to ramp up quickly. The whole point was to bite the bullet, make an investment, and then reap the benefits as quickly as possible.

Assessment is always part of a learning program; any teacher knows that you need to test students. The highly innovative Khan Academy system is built around continuous individual assessment, with an automated dashboard and progression to the next step when assessment criteria have been met, allowing each student to progress at his or her own rate. We realized that we needed to build assessment into our training process, both to assure that it was effective and to measure its effectiveness, which we felt was true to the empiricism of agile.

Thus, after a team was trained, the team was strongly encouraged not only to record and display its code quality and test coverage, but also to demo its progress to the business on a regular basis. This was done by integrating each team's test coverage tool, code quality tool, and automated build output into an aggregated dashboard, which we then shared with management in a bi-weekly presentation. Reporting test coverage was mandatory, and this was strongly supported by group management.

Management was able to see the test coverage of the teams that had been trained, and to see that more and more teams were trained over time. This helped to assure management that the very substantial investment of the teams' time in training was worthwhile. It also provided an important metric for agile adoption, since technical practices are a key part of agile. In addition to reporting test coverage, agile coaches working with each team provided a subjective assessment of each team's progress, generally focused on impediments that management could do something about or should be aware of.
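
The article does not describe the dashboard's internals, but the aggregation step can be pictured with a minimal sketch: scrape a coverage figure out of each team's report and emit one scorecard row per team. The directory layout, the Cobertura-style "line-rate" attribute, the team names, and the 70% threshold below are all hypothetical stand-ins for whatever the real tools emitted.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Minimal sketch of a management-scorecard aggregator: reads a
 * line-coverage percentage from each team's coverage report and
 * prints one scorecard row per team.
 */
public class ScorecardAggregator {

    // Matches e.g. line-rate="0.83" in a Cobertura-style XML report (assumed format).
    private static final Pattern LINE_RATE = Pattern.compile("line-rate=\"([0-9.]+)\"");

    public static void main(String[] args) throws IOException {
        String[] teams = {"team-alpha", "team-beta"};  // hypothetical team names
        for (String team : teams) {
            Path report = Paths.get("reports", team, "coverage.xml");  // hypothetical path
            String row = Files.exists(report)
                    ? formatRow(team, extractCoverage(report))
                    : team + ": not yet reporting";
            System.out.println(row);
        }
    }

    private static double extractCoverage(Path report) throws IOException {
        String xml = new String(Files.readAllBytes(report));
        Matcher m = LINE_RATE.matcher(xml);
        return m.find() ? Double.parseDouble(m.group(1)) * 100.0 : 0.0;
    }

    private static String formatRow(String team, double coveragePct) {
        // A simple red/green flag for the scorecard; the threshold is illustrative.
        String flag = coveragePct >= 70.0 ? "GREEN" : "RED";
        return String.format("%s: %.1f%% line coverage [%s]", team, coveragePct, flag);
    }
}
```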

We found that many (but not all) testers could be turned into test programmers in about three weeks.

One of the most interesting discoveries was that most testers can become test programmers. The IT role of "tester" is generally applied to someone who conducts manual acceptance testing, which consists of writing "scripts" and then executing those scripts each time testing is needed. These scripts are written instructions for testing the application, and generally consist of clicking on links, entering data into forms, checking the results that are displayed, and so on. It is enormously tedious and labor-intensive. Manual testing of this kind is not compatible with agile projects, because agile projects rely on continuous testing, and that continuous testing is what enables short, two-week development iterations. It is therefore imperative to shift to automated tests wherever possible. The question is, what do you do with all of your testers?

We found that most, but not all, testers could learn how to write automated tests. Automated tests are programs: they are test programs, executable code. This means that testers become programmers. Some testers are eager to learn programming and are able to make the transition; others are not. Among those who were, we found that they became productive test programmers after about three weeks, provided that they had a testing tool stack set up for them and the mentorship of a test automation coach.
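
As an illustration of what that transition looks like (our sketch; the article does not name the organization's actual test tools), a manual script step such as "enter a name, click Search, verify the displayed result" becomes a short test program. Here it is written with Selenium WebDriver and JUnit 4, with a hypothetical URL, element IDs, and expected text.

```java
import static org.junit.Assert.assertEquals;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

/**
 * An automated version of a manual test script: open the page,
 * enter data into a form, click, and check the displayed result.
 */
public class CustomerSearchTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new ChromeDriver();
    }

    @Test
    public void searchByNameShowsMatchingCustomer() {
        driver.get("https://example.test/customers");         // hypothetical URL
        driver.findElement(By.id("name")).sendKeys("Smith");  // enter data into the form
        driver.findElement(By.id("search")).click();          // click the search button
        String result = driver.findElement(By.id("firstResult")).getText();
        assertEquals("Smith, Jane", result);                  // check the displayed result
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}
```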

We also took the same build, measure, and dashboard system and began applying the same techniques to software developed by external solution providers. This enabled us to measure the code quality of that software. Code quality ties back to many things, such as standards compliance, test coverage, complexity, and other attributes. Using this approach, we were able to convince contract management to add quality clauses to contracts. It also separated the vendors who merely said they were agile from those who really were agile in their technical practices, and it exposed a huge problem with one very large vendor in particular, which resulted in a significant reduction in cost, potential liability, and support problems. Overall, the use of code quality metrics helped in both supplier and contract management. One program using this platform realized multi-million-dollar savings: informing the program managers about relative quality led to a significant reduction in support and re-work.

“Our space is too complex”

The same techniques have also been applied successfully in the embedded machine systems space. Although overall system complexity is significantly higher there, the same delivery mechanism, teaching/coaching approach, and condensed, high-impact training have been just as effective. Cost savings are even more significant in this space, especially when testing moves away from hardware-in-the-loop testing toward more automated software-in-the-loop testing, reducing capital expenditures.
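
For a sense of what software-in-the-loop testing means in practice (a minimal sketch with entirely hypothetical names, not code from the engagements described): the hardware is hidden behind an interface and replaced by a software simulation, so the control logic can be exercised in an ordinary automated build rather than on a hardware rig.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

/** Hardware abstraction: in production this wraps a real sensor. */
interface TemperatureSensor {
    double readCelsius();
}

/** Control logic under test: trips a power cutoff above a threshold. */
class OverheatProtector {
    private final TemperatureSensor sensor;
    private final double limitCelsius;

    OverheatProtector(TemperatureSensor sensor, double limitCelsius) {
        this.sensor = sensor;
        this.limitCelsius = limitCelsius;
    }

    boolean shouldCutPower() {
        return sensor.readCelsius() > limitCelsius;
    }
}

/**
 * Software-in-the-loop test: the sensor is simulated in software
 * (a lambda), so no hardware-in-the-loop rig is needed to exercise
 * the control logic.
 */
public class OverheatProtectorTest {

    @Test
    public void cutsPowerAboveLimit() {
        OverheatProtector protector = new OverheatProtector(() -> 95.0, 90.0);
        assertTrue(protector.shouldCutPower());
    }

    @Test
    public void keepsPowerBelowLimit() {
        OverheatProtector protector = new OverheatProtector(() -> 85.0, 90.0);
        assertFalse(protector.shouldCutPower());
    }
}
```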

So far, we have not found a software space that did not benefit from this approach.

Conclusion

The general approach used here was highly effective, and it can be replicated outside of the domain of testing and continuous integration. For example, we believe that it can be applied to enterprise architecture, to governance, to release management and deployment, to security, and generally to any specialized area that is part of the solution delivery pipeline. The legacy approach of having discrete steps that are performed by different groups in sequence then gives way to an approach in which the development team performs every function, but with the support of specialists through training, coaching, automated testing, and process oversight. The role of specialized groups changes from doing to teaching; from being a gatekeeper to being a guide; from inserting speed bumps to helping teams to create a safer path.


Scott Barnes
Cliff Berg
