requirements-testing

About hTest

hTest is a concept for documenting acceptance tests and project requirements in XHTML. It exists to support the adoption of Agile development methodologies across a wide range of projects and organizations.

The problem

There are numerous popular approaches to requirements definition and acceptance test/use case documentation, and various automated tools to run test fixtures and report back results.

Many of the accepted formats focus on diagram notation of use cases, but much of the time, test plans and documentation are captured in unstructured, arbitrary styles, often in Word documents or similar. A major issue with requirements kept in this form is that they very easily fall out of step with revisions and changes, especially once development cycles are up and running on a project. Older requirements information tends to rot and become incorrect very quickly unless the documents evolve directly with the development process itself. Word documents are hard to author collaboratively in this way, and encourage clients and developers to take an adversarial stance.

Documenting Requirements

Could there be a declarative equivalent of a UML use case diagram in XHTML? What would this actually achieve? It's easy enough to do in plain XML, but that wouldn't be viewable directly as a document. It would be wise not to overcomplicate the simple description of a user role and its possible interactions with other roles/services.

A simple HTML vocabulary could make it much easier to share structured requirements information through a wiki-style format, which could then be updated in real time to reflect changing requirements and new knowledge about the conditions of a system.
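
As a rough sketch only (the element structure and class names here are hypothetical, not part of any proposed standard), a single use case might be marked up like this:

 <div class="usecase">
  <h3>Log in to the site</h3>
  <p class="role">Registered user</p>
  <ol class="interactions">
    <li>User submits their login details</li>
    <li>System validates the credentials</li>
    <li>System redirects the user to their account page</li>
  </ol>
 </div>

Because it's plain XHTML, a page like this is directly readable in a browser and editable in a wiki, while still being structured enough for tools to extract the role and interaction steps.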


Automated Acceptance Testing

To get the full benefit of automated acceptance testing, there needs to be a way to plug code generation and test reporting directly into a workflow, by integrating with the tools already used in the common office environment. FitNesse (wiki based) and Arbiter (document based) are two steps in this direction. Arbiter uses list items to define test procedures, whereas FitNesse uses tables to define expectations. Apart from that, they work in a very similar way: upload a document/wiki page -> parse the test definitions -> generate the test code -> run the test code -> report back results. But they're still too attached to the concept of a "document" as a separate editable file, rather than a dynamic web resource.
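
For comparison, the FIT framework that FitNesse builds on reads its expectations from ordinary HTML tables. A minimal sketch of such a table, assuming a hypothetical eg.Division fixture, might look like this (the first row names the fixture class; a trailing () marks an output column to be checked, while plain headers are inputs):

 <table>
  <tr><td colspan="3">eg.Division</td></tr>
  <tr><td>numerator</td><td>denominator</td><td>quotient()</td></tr>
  <tr><td>10</td><td>2</td><td>5</td></tr>
 </table>

This is already very close to the spirit of hTest: the expectations live in a browsable document rather than in test code.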

Test Vocabulary

A list of the common actions allowed by a web acceptance testing tool (in this case, Arbiter/SimpleTest); a sketch of possible markup follows the list:

 - Start at [link] (Go to [link])
 - Click [text][submit]
 - Set field [name] to [value] (Add [name] to [value])
 - Should see (Expect to see)
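
As a sketch of how these actions might be captured in markup (the class names are hypothetical, not an established Arbiter vocabulary), a test procedure could be written as an ordered list:

 <ol class="testprocedure">
  <li class="start">Start at http://localhost/login</li>
  <li class="setfield">Set field "login" to "demo"</li>
  <li class="click">Click "Sign in"</li>
  <li class="expect">Should see "Welcome back"</li>
 </ol>

A parser could then map each class to the corresponding SimpleTest call and run the steps in order.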

Communicating and Sharing Acceptance Test Data

Who is the audience? Clients, developers, and other stakeholders. What do they want to see? An itemized list of test cases: whether each test completed or not, and if incomplete, where the failures occurred.

Current markup:

 <div class="testcase">
  <h3>Local Login Test</h3>
  <ol class="incomplete">
    <li class="pass">Start at http://localhost/login <em>(pass)</em></li>
    <li class="fail">Expect to see "Please enter your login" <em>(fail)</em></li>
  </ol>
 </div>

There are some concerns here about duplicating the pass/fail result as inline text in the EM or SPAN when the class is already present on the LI. At least this markup would be compatible with what we're currently reporting from SimpleTest.
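
One possible way to avoid that duplication, assuming CSS generated content is acceptable for the target browsers, would be to drop the inline EM and derive the label from the class instead:

 <style type="text/css">
  .testcase li.pass:after { content: " (pass)"; }
  .testcase li.fail:after { content: " (fail)"; }
 </style>

The trade-off is that browsers without generated-content support would no longer show the result inline, which may be exactly why the text is duplicated in the current markup.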