requirements-testing
Revision as of 11:01, 16 July 2005
hTest is a concept for documenting acceptance tests and project requirements in XHTML. It exists to facilitate the progress and uptake of Agile development methodologies across a wide spectrum of projects and organizations.
There are numerous popular approaches to requirements definition and acceptance test/use case documentation, and various automated tools to run test fixtures and report back results.
Many of the accepted formats focus on diagram notation of use cases, but a lot of the time, test plans and documentation are captured in unstructured and arbitrary styles, often in Word documents or similar. One of the major issues with working with requirements in this document format is that it is very easy to fall out of step with revisions and changes, especially once development cycles are up and running on a project. Older requirements information tends to rot and become incorrect very quickly, unless documents evolve directly with the development process itself. Word documents are hard to author collaboratively in this way, and encourage clients and developers to take an adversarial stance.
Could there be a declarative equivalent of a UML use case diagram in XHTML? What would this actually achieve? It's easy enough to do in XML, but that wouldn't be viewable directly in a document format. It's a good idea not to overcomplicate the simple description of a user role and its possible interactions with other roles/services.
A simple HTML vocabulary could make it much easier to share structured requirements information through a wiki type format, which could thus be updated in realtime to reflect changing requirements and new knowledge about the conditions of a system.
Automated Acceptance Testing
To realize the full benefits of automated acceptance testing, there needs to be a way to integrate code generation and test reporting directly into a workflow, by connecting with the tools already being used in the common office environment. FitNesse (wiki based) and Arbiter (document based) are two steps in this direction. Arbiter uses list items to define test procedures, whereas FitNesse uses tables to define expectations. Apart from that, they work in a very similar way: upload a document/wiki page -> parse the test definitions -> generate the test code -> run the test code -> report back results. But they're still too attached to the concept of a "document" as a separate editable file, rather than a dynamic web resource.
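The upload -> parse -> generate -> run -> report pipeline could be sketched roughly as follows. This is a minimal illustration, not the actual FitNesse or Arbiter implementation; all function names are invented, and the "generated" tests are stubs that simply pass.

```python
# Hypothetical sketch of the parse -> generate -> run -> report pipeline.
# None of these names come from a real tool's API.

def parse_test_definitions(document):
    """Pull individual test steps out of a plain-text document (one per line)."""
    return [line.strip() for line in document.splitlines() if line.strip()]

def generate_test_code(steps):
    """Turn each step into a (step, callable) pair. Stubbed: every step passes."""
    return [(step, lambda: True) for step in steps]

def run_tests(cases):
    """Execute each generated check and collect (step, passed) results."""
    return [(step, check()) for step, check in cases]

def report(results):
    """Format results for display."""
    return ["%s: %s" % (step, "pass" if ok else "fail") for step, ok in results]

document = """
Go to http://localhost/login
Set username to "user"
Click submit
"""
print("\n".join(report(run_tests(generate_test_code(parse_test_definitions(document))))))
```

The point of keeping the stages separate is that the parse step could just as easily read a wiki page or an XHTML resource as a static file, which is exactly the shift away from "document as editable file" argued for above.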
A list of the common actions allowed by a web acceptance testing tool (in this case, Arbiter/Simpletest):

- Start at [link] (Go to [link])
- Click [text][submit]
- Set field [name] to [value] (Add [name] to [value])
- Should see (Expect to see)
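A dispatch table could translate these action phrases into test method calls. The sketch below maps them onto Simpletest's WebTestCase method names (get, click, setField, assertText); the table and the translate function itself are hypothetical, and the emitted strings are only approximations of PHP source.

```python
import re

# Illustrative mapping from action phrases to Simpletest WebTestCase
# method names. The dispatch table is invented for this sketch.
ACTIONS = [
    (re.compile(r'^Start at (\S+)$'),          'get'),
    (re.compile(r'^Click (.+)$'),              'click'),
    (re.compile(r'^Set field (\S+) to (.+)$'), 'setField'),
    (re.compile(r'^Should see (.+)$'),         'assertText'),
]

def translate(step):
    """Translate one action phrase into a PHP-ish method call string."""
    for pattern, method in ACTIONS:
        match = pattern.match(step)
        if match:
            args = ', '.join("'%s'" % g for g in match.groups())
            return '$this->%s(%s);' % (method, args)
    raise ValueError('Unrecognised step: ' + step)

print(translate('Start at http://localhost/login'))
print(translate('Set field username to user'))
```

A small, closed vocabulary like this is what makes the translation tractable: each phrase maps to exactly one assertion or browser action.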
Communicating and Sharing Acceptance Test Data
Who is the audience? Clients, developers, stakeholders. What do they want to see? An itemized list of test cases: whether the tests passed or not, and, if they failed, where the failures occurred.
Current markup ideas:
<div class="test">
  <h3 class="case">Local Login Test (incomplete)</h3>
  <ol class="results">
    <li class="pass">Start at http://localhost/login <em>(pass)</em></li>
    <li class="fail">Expect to see "Please enter your login" <em>(fail)</em></li>
  </ol>
</div>
There are some concerns here with duplicating the pass/fail result as inline text in the EM or SPAN while the class already exists on the LI. This markup would, at least, be compatible with what we're currently reporting from Simpletest.
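A report in this markup could be generated mechanically from a list of (step, passed) results. The sketch below is illustrative (the function name and result format are invented); note that it deliberately reproduces the duplication concern, emitting the status both as a class and as inline EM text.

```python
from xml.sax.saxutils import escape

def render_test(title, results):
    """Render a list of (step, passed) results as the class="test" markup.

    Sketch only; duplicates the status as both a class and inline text,
    mirroring the concern raised about this markup.
    """
    complete = all(ok for _, ok in results)
    lines = ['<div class="test">',
             '  <h3 class="case">%s (%s)</h3>'
             % (escape(title), 'complete' if complete else 'incomplete'),
             '  <ol class="results">']
    for step, ok in results:
        status = 'pass' if ok else 'fail'
        lines.append('    <li class="%s">%s <em>(%s)</em></li>'
                     % (status, escape(step), status))
    lines += ['  </ol>', '</div>']
    return '\n'.join(lines)

print(render_test('Local Login Test', [
    ('Start at http://localhost/login', True),
    ('Expect to see "Please enter your login"', False),
]))
```

Because the status appears in the class attribute, a stylesheet can colour passes and failures without relying on the duplicated inline text.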
xoTest: XOXO outlines for defining macros
Example of how the compact attribute could be used to define a test "macro". In this case, "Login to localhost" becomes shorthand for referring to the sequence of nested steps.
<ol class="xoxo test">
  <li>Add A Link
    <dl>
      <dt>Tests the process of adding a link</dt>
      <dd>additional notes here</dd>
    </dl>
    <ol>
      <li>Login to localhost
        <ol compact="compact" class="macro">
          <li>Go to http://localhost/login</li>
          <li>Set username to "user"</li>
          <li>Set password to "pass"</li>
          <li>Click submit</li>
        </ol>
      </li>
      <li>Click "Add Link"</li>
      <li>Set url to "http://microformats.org/wiki/XOXO"</li>
      <li>Set description to "Extensible Open XHTML Outlines"</li>
      <li>Click submit</li>
      <li>Should see "Link was saved successfully"</li>
    </ol>
  </li>
</ol>
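The macro expansion itself is a simple substitution: a step whose nested list carries class="macro" stands for its child steps. The sketch below models the outline as plain Python lists rather than parsed XHTML, with a macro table standing in for the nested compact lists; the structure and names are illustrative.

```python
# Sketch of test-macro expansion. A macro name stands for a sequence of
# steps, mirroring the nested ol with class="macro" above. The MACROS
# table here plays the role of the parsed compact lists.
MACROS = {
    'Login to localhost': [
        'Go to http://localhost/login',
        'Set username to "user"',
        'Set password to "pass"',
        'Click submit',
    ],
}

def expand(steps):
    """Replace each macro name with its step sequence; pass other steps through."""
    flat = []
    for step in steps:
        flat.extend(MACROS.get(step, [step]))
    return flat

test = [
    'Login to localhost',
    'Click "Add Link"',
    'Set url to "http://microformats.org/wiki/XOXO"',
    'Set description to "Extensible Open XHTML Outlines"',
    'Click submit',
    'Should see "Link was saved successfully"',
]
for step in expand(test):
    print(step)
```

Expanding before code generation keeps the generator simple: it only ever sees a flat sequence of primitive actions, while authors still get the shorthand.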
This raises some interesting possibilities for both documentation and test generation. As well as functioning as use case documentation, this basic list could be used as input to generate a set of JWebUnit (Java) or Simpletest (PHP) test cases and assertions.