result:
document name(s):
document author:

# | question | further detail | yes/no | action | notes

1. do all requirements have a unique identifier?
2. are all requirements atomic? (i.e. can they be broken down further into separate requirements?)
3. have the requirements been clearly and unambiguously articulated?
4. are all requirements prioritised? (using the standard priority convention, 1-4)
5. are the priorities explained?
6. do all requirements have identified contacts assigned?
7. are all requirements stated as pure business requirements, i.e. without stipulating a solution? (if not, there needs to be a clear justification)
8. are all relevant constraints clearly documented?
9. have all assumptions, risks, issues, dependencies and decisions been clearly documented, assigned a unique ID and assigned to owners?
10. have requirements been cross referenced where relevant?
11. have all diagrams or use cases been cross referenced to specific requirements?
12. are all requirements feasible and testable? (check with a developer and a tester if you're not sure!)
13. have supporting documents been cross referenced where relevant?
14. have diagrammatic conventions been clearly explained?
15. do all hyperlinks work correctly?
16. has the document been checked for correct spelling, grammar and formatting?
17. has the document been peer-reviewed?
18. is there a justification for the BA confidence rating?
19. has the project initiation document been referenced for scope?
20. has the document been walked through with the relevant business representatives?

rework:
name of QA reviewer:
date of QA review:

result:
document name(s):
document author:

# | question | further detail | yes/no | action | notes

1. have all data elements been identified that this change will create, read, update or delete? (e.g. new data items required, changes to data items)
2. have specific data requirements been clearly articulated?
3. have data elements been effectively modelled?
4. has the appropriate data modelling convention been followed?
5. is there commentary accompanying the diagram(s) to aid comprehension?
6. have static / master data elements been clearly identified and are they easily distinguishable?
7. have business rules for the data items in question been clearly articulated and cross referenced?
8. has database-neutral terminology been used?
9. have any manual data fixes been clearly distinguished from automated ones?
10. have data quality rules been specified and cross referenced?

rework:
name of QA reviewer:
date of QA review:

result:
document name(s):
document author:

# | question | further detail | yes/no | action | notes

1. have all business process impacts been identified?
2. has an appropriate business process notation been used?
3. have business rules for the processes in question been clearly articulated and cross referenced?
4. have impacts on existing process repositories been identified?
5. have process boundaries been adequately identified?
6. have all process actors been identified?
7. has the process been documented at an appropriate level of detail?
8. are processes cross-referenced to requirements?

rework:
name of QA reviewer:
date of QA review:

result:
document name(s):
document author:

# | question | further detail | yes/no | action | notes

1. have goals been clearly defined?
2. have goals been linked to the business case?
3. have the system boundaries been clearly defined? (system = people + process + technology)
4. have all functions been defined?
5. have all actors been defined?
6. have all data flows been defined?
7. have all data stores been defined?
8. have all exceptions been defined?
9. has an appropriate diagramming convention been used and explained? (e.g. use case, DFD)

rework:
name of QA reviewer:
date of QA review:
