AGILE BENEFITS AND PRINCIPLES

AN IN-DEPTH LOOK AT THE MERITS OF OUR METHODOLOGY

Agile methodology will:

  • Define and validate user requirements quickly
  • Develop features in order of priority, highest first
  • Set and achieve project milestones to reach full system functionality

Agile development will:

  • Build quality control into the process
  • Provide continuous feedback opportunities
  • Document acceptance criteria for each feature
  • Build automated tests into software
  • Reduce defects in software developed

AGILE PRACTICE/TOOL | ARTIFACTS | FREQUENCY* | PARTICIPANTS
Continuous Integration of Automated Builds | Automated email indicating which development files were changed and whether the subsequent build was a success | Continuous, with every developer check-in | ENVISAGE Operations Team
Planning Poker | Relative estimate for each of the functional requirements (user stories) desired by the customer | Weekly | ENVISAGE Operations Team
Iteration Demonstrations | Meeting minutes documenting what functionality was demonstrated, what functionality was accepted by the customer and what priority the customer assigned to any uncoded functionality | Weekly or bi-weekly* | Customer and ENVISAGE integrated team
Velocity Charts | Chart indicating how many story points were completed in an iteration and the total remaining story point count | Weekly or bi-weekly | Customer and ENVISAGE integrated team
Paper Prototyping | Scenarios representing typical job functions and a paper representation of the intended application design | As needed with new design | Customer and ENVISAGE integrated team; Subject Matter Experts (SMEs); ENVISAGE Operations Team
Test-Driven Features | Acceptance criteria for each functional requirement (user story), with success or failure results from the development team in automated execution of the tests | Continuous, with every developer check-in; written documentation provided on request | Customer and ENVISAGE integrated team; SMEs; ENVISAGE Operations Team
Test-Driven Development | Automated email indicating the number of code function points covered by an automated unit test | Continuous, with every build | ENVISAGE Operations Team
Iteration and Project Retrospectives | Meeting notes indicating what needs to be recorded for institutional knowledge, what was learned, what should be changed and what is still puzzling | Internally, with each iteration; externally, with each release to production* | ENVISAGE Operations Team (internal); Customer and ENVISAGE integrated team (release)

CONTINUOUS INTEGRATION OF AUTOMATED BUILDS

The ENVISAGE team follows a continuous integration cycle that includes automated product builds. Each time a developer checks in modified source code, a battery of Acadis® automated regression tests runs automatically to ensure that the code is clean and does not impact any other area of the application.
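
As an illustration only, a check-in build step of this kind might look like the sketch below: run the regression suite, then email the team which files changed and whether the build passed. The test directory, email addresses and local mail relay are assumptions, not the actual ENVISAGE/Acadis tooling.

```python
# Minimal sketch of a post-check-in build step. Paths, addresses and the
# mail relay are hypothetical; this is not the ENVISAGE/Acadis tooling.
import smtplib
import subprocess
from email.message import EmailMessage

def run_regression_suite() -> subprocess.CompletedProcess:
    """Run every automated regression test checked into the repository."""
    return subprocess.run(
        ["python", "-m", "unittest", "discover", "-s", "tests"],
        capture_output=True, text=True,
    )

def notify(changed_files, result) -> None:
    """Email the operations team the changed files and the build outcome."""
    status = "SUCCESS" if result.returncode == 0 else "FAILURE"
    msg = EmailMessage()
    msg["Subject"] = f"Automated build: {status}"
    msg["From"] = "build@example.com"        # hypothetical addresses
    msg["To"] = "ops-team@example.com"
    msg.set_content("Changed files:\n" + "\n".join(changed_files)
                    + "\n\nTest output:\n" + result.stdout + result.stderr)
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    result = run_regression_suite()
    notify(["acadis/registration.py"], result)  # file list comes from the VCS
```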

USER STORIES AND PLANNING POKER

The ENVISAGE team uses a method of relative estimating called “planning poker.” Each desired feature is written as a user story with three elements: who wants the functionality, what action the user wants to perform and what business value the user realizes once the feature is implemented. The feature is then broken down into the smallest piece of usable functionality, so that it can be completed in a single iteration and deliver immediate business value. The ENVISAGE Operations team estimates the complexity of that feature relative to features developed in the past. Combined with Velocity Charts, these estimates let the team predict when features will be delivered to the customer.
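
For illustration, a user story and its planning-poker estimate can be captured as simply as the sketch below. The field names, the Fibonacci-style point scale and the example story are assumptions, not ENVISAGE's actual artifact format.

```python
# Illustrative user story record for planning poker; the scale and the
# example story are invented.
from dataclasses import dataclass

POKER_SCALE = (1, 2, 3, 5, 8, 13)  # a common relative-sizing deck

@dataclass
class UserStory:
    who: str     # who wants the functionality
    action: str  # what action the user wants to perform
    value: str   # business value realized once the feature is implemented
    points: int  # relative complexity agreed on in planning poker

story = UserStory(
    who="a training coordinator",
    action="schedule a class from a saved template",
    value="cuts setup time for recurring courses",
    points=5,
)
assert story.points in POKER_SCALE
```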

ITERATION DEMONSTRATIONS

The ENVISAGE team will work with the customer at the beginning of a project to define the length of an iteration. The goal of any iteration is to deliver working features with business value that could be deployed. Often, large features are broken into smaller pieces.

At the beginning of each iteration, the team works with the customer to understand which of the small features (stories) are of the highest business value and prioritizes those stories at the top. The team works each story to completion in priority order. By breaking large modules/functions down into more digestible and more easily evaluated features, the team is often able to eliminate functionality that was thought important during design but would never be used. At the end of each iteration, the team demonstrates the functionality developed during that iteration. The demonstration is expected to show that the defined acceptance criteria have been met, and ideally results in acceptance of the functionality.
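
As a rough sketch of that prioritization, an iteration can be filled by taking stories in the customer's priority order until the team's expected velocity is spent. All story names and numbers below are invented.

```python
# Illustrative iteration planning: sort by customer priority, pull stories
# until the expected velocity (in story points) is used up.
def plan_iteration(stories, velocity):
    """stories: list of (name, points, priority); lower number = more important."""
    plan, budget = [], velocity
    for name, points, _ in sorted(stories, key=lambda s: s[2]):
        if points <= budget:
            plan.append(name)
            budget -= points
    return plan

candidates = [
    ("export roster to spreadsheet", 5, 2),
    ("record attendance by scan", 8, 1),
    ("print completion certificate", 3, 3),
]
print(plan_iteration(candidates, velocity=13))
# -> ['record attendance by scan', 'export roster to spreadsheet']
```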

Because iterations are short, it is rare for customers and development teams to get out of sync. If they do, only a small amount of rework is required to get back in sync. This eliminates the common problem of a software solution being delivered several months (or years) after the initial requirements gathering and no longer meeting the business need, either because the requirements were not well understood or because they changed in the time it took to deliver the software.

The end of one iteration becomes the beginning of the next unless the customer and ENVISAGE team decide that the functionality delivered is of sufficient value to take the time to move to release (perform regression testing, final exploratory testing, deployment to production and communication to the user base). If key stakeholders are participating in the iterations, the communication cost for a release is significantly reduced. ENVISAGE works to minimize the cost of regression tests through the use of automated tests.

VELOCITY CHARTS

With each iteration, the team records which stories were accepted by the customer and how many story points each represented. The team uses a rolling average of the past six iterations to determine how many story points it can complete in an iteration. Because the team uses relative estimating in planning poker, it can predict the rate at which it completes story points and project a delivery date for all features that have been estimated. The graphical representation of this information is called a burn-down chart.
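
A worked example of that projection, with an invented iteration history and the six-iteration rolling window described above:

```python
# Hedged sketch of the velocity projection: a rolling average of the last
# six iterations estimates how many iterations the remaining work needs.
import math
from statistics import mean

def projected_iterations(points_per_iteration, remaining_points, window=6):
    """Estimate iterations remaining from a rolling average of recent velocity."""
    velocity = mean(points_per_iteration[-window:])
    return math.ceil(remaining_points / velocity)

history = [18, 22, 20, 19, 24, 21, 23]  # story points accepted per iteration
print(projected_iterations(history, remaining_points=120))
# mean of the last six = 21.5 points/iteration, so 120 points -> 6 iterations
```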

PAPER PROTOTYPING

In an effort to minimize the cost of developing software, the ENVISAGE team employs paper prototyping in brand-new design areas to identify what design will meet user needs. This process involves members of the ENVISAGE team meeting with customer subject matter experts to understand the typical business functions performed. The team creates paper applications, with individual pieces of paper representing each screen of the application. Users are then asked to participate in a usability study, where they “click” on buttons with a finger and “type” in fields by writing on copies or transparencies. The development team observes the users and notes any areas where they have difficulty performing the tasks. At the end of the usability session, the user is interviewed about missing functionality or additional feature needs. The team performs several usability sessions with different users and adjusts the paper prototype based on the feedback. This allows the team to adjust the application as many times as necessary, and to make very large changes to accommodate new functionality or usability issues, without incurring significant cost.

TEST-DRIVEN FEATURES

Test-driven features are the process of linking each requirement (user story) to specific, measurable and verifiable customer acceptance criteria. The requirement is documented in a testing application and tied to the code. If the code matches the stated requirements, the automated tests pass. This approach ensures that functionality does not pass from development to quality assurance until the specific criteria have been met. Test-driven features force the development team to meet and often exceed the functional and performance requirements of the application. In addition, the execution of the automated test suite occurs with each build, preventing undetected breakage of other areas of the application.
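
The miniature sketch below illustrates the idea with Python's unittest: each acceptance criterion from a user story becomes an automated test that must pass before the feature leaves development. The registration function and its capacity rule are invented stand-ins, not the Acadis test suite.

```python
# Illustrative acceptance tests tied to a user story's criteria. The
# function under test and its rules are invented examples.
import unittest

def register_student(roster, student, capacity):
    """Add a student to the roster unless the class is full."""
    if len(roster) >= capacity:
        return False
    roster.append(student)
    return True

class RegistrationAcceptance(unittest.TestCase):
    # Criterion 1: a student can register while seats remain.
    def test_registers_when_space_available(self):
        roster = []
        self.assertTrue(register_student(roster, "Lee", capacity=2))
        self.assertIn("Lee", roster)

    # Criterion 2: registration is refused once the class is full.
    def test_rejects_when_full(self):
        roster = ["Kim", "Ada"]
        self.assertFalse(register_student(roster, "Lee", capacity=2))

if __name__ == "__main__":
    unittest.main()
```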

TEST-DRIVEN DEVELOPMENT

The practice of test-driven development (TDD) is a test-first approach. As opposed to the traditional approach of writing a complete module or function first and then performing tests once development is complete, TDD requires programmers to continually test the software from the outset. Tests are created for most or all of the individual functions of a module. Generally excluded from TDD is the presentation layer or user interface, as TDD does not lend itself to testing graphical elements, and the cost-to-benefit ratio for evolving applications is therefore not justifiable. Tests are initially designed to fail based on agreed-upon acceptance criteria. Coded functions are then created and continually tested and refined until they pass. Multiple tests may be used to validate that a large set of behaviors passes. Tests are retained and reused anytime the function is changed or as part of a larger test.
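
A compact sketch of that test-first cycle follows, using an invented validation rule: the test is written first and fails (“red”), then the minimal implementation makes it pass (“green”).

```python
# Sketch of the red-green TDD cycle on one function. The course-code
# format is a made-up example rule.
import re
import unittest

# Step 2 ("green"): the minimal implementation, written only after the
# tests below had been seen to fail.
def is_valid_course_code(code: str) -> bool:
    """A course code is two capital letters, a hyphen, then three digits."""
    return re.fullmatch(r"[A-Z]{2}-\d{3}", code) is not None

# Step 1 ("red"): these tests existed before the function and failed.
class CourseCodeTests(unittest.TestCase):
    def test_accepts_well_formed_code(self):
        self.assertTrue(is_valid_course_code("LE-101"))

    def test_rejects_malformed_codes(self):
        for bad in ("le-101", "LE101", "LE-1", ""):
            self.assertFalse(is_valid_course_code(bad))

if __name__ == "__main__":
    unittest.main()
```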

The benefits of using test-driven development practices with full regression testing include well-architected pieces of code that adhere to programming best practices, and a full set of regression tests that allow the application to be refactored at a rapid pace if business processes change.

Code coverage of new modules will be measured with automated tools that capture and report results, and is expected to exceed 80%.
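
For illustration, a coverage gate of that kind can be scripted as below using the open-source coverage.py library (an assumption for the sketch; the table that follows names NCover as the reporting tool).

```python
# Hedged sketch of an 80% coverage gate using coverage.py
# (pip install coverage); the "tests" directory is hypothetical.
import unittest
import coverage

cov = coverage.Coverage()
cov.start()
suite = unittest.defaultTestLoader.discover("tests")
unittest.TextTestRunner().run(suite)
cov.stop()
percent = cov.report()  # prints the report and returns total coverage
if percent < 80.0:
    raise SystemExit(f"Coverage {percent:.1f}% is below the 80% target")
```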

AUTOMATED TOOL | REPORT | DESCRIPTION | FREQUENCY
FitNesse | Test results report | A report that documents the results of all automated tests run during a build process against the customer acceptance criteria, showing # of tests run, # passed, # failed and # ignored | Run during iteration demos
NCover | Code coverage report | A report of the % of code that is covered by automated tests | Provided on demand
VersionOne | Outstanding bug list | A report that shows outstanding bugs in the software | Provided with each release
Various | Scalability test report | Various reports including performance (# transactions per second) under load, response times and existing latency | As necessary

RETROSPECTIVES

Sometimes referred to as a “lessons learned meeting” or “post mortem,” a retrospective is held by the ENVISAGE team internally at the end of each iteration. At the end of each production deployment milestone, a retrospective meeting will be held with the customer and the ENVISAGE team to improve the process for the future.

In retrospectives, the team asks the following questions:

  • What did we do well that we could benefit from doing again in the future?
  • What did we learn?
  • What should we do differently next time?
  • What still puzzles us?

Answering these questions allows the team to benefit from previous learning experiences.

TRADITIONAL QUALITY ASSURANCE PRACTICES

In addition to the specific quality measures that Agile provides, ENVISAGE employs several traditional quality assurance practices.

EXPLORATORY TESTING

Once the customer is satisfied with the functionality demonstrated during the iteration demos (see above), the ENVISAGE Quality Assurance team will run a series of functional testing scenarios to ensure the delivered feature:

  • Integrates well with the rest of the Acadis® software
  • Meets security requirements
  • Meets or exceeds performance standards
  • Conforms to UI standards

This process is generally shorter than in a traditional application cycle because much of the functionality has already been tested in the course of development, and automated regression suites verify that no problems were introduced through changes or the addition of new features.

DOCUMENTATION

For each milestone release, ENVISAGE will provide appropriate build documentation. All defects identified in testing, error resolutions, and configuration changes will be recorded in the defect tracking system. Defect fixes will be ranked by impact and priority and separated from system enhancements.

DEFECT FIX PRIORITIZATION

The customer and ENVISAGE team will be responsible for determining which defects should be fixed before user acceptance testing occurs. The number of defects is smaller than in traditional application development projects because functionality does not enter quality assurance until it meets the functional criteria.

CUSTOMER INVOLVEMENT AND SIGN-OFF

Customer involvement and customer satisfaction go hand-in-hand during projects. Early in the process, the customer and ENVISAGE team will be involved in the prioritization of feature development and customer subject matter experts will be involved in the usability testing of designed features through paper prototyping. Because most of the testing is automated, the development team will be able to adjust the application and feature priorities based on this critical input. We have found that this process ensures the user community gets exactly what is needed and is actively engaged in evaluating trade-offs.

Throughout the process, the customer and the ENVISAGE team will be responsible for reviewing the functionality delivered with each iteration and for ensuring that the demonstrated software meets the stated needs. At the end of each release cycle, ENVISAGE will promote the software to the customer test servers for user acceptance testing (UAT) and final sign off.