Technical rationale

Using a public domain reference model such as the HSL model is necessary for a number of reasons, which are discussed below.

HFI assessment using the Human-System Lifecycle (HSL) reference and assessment models is based on the approach adopted by MoD for Software Capability Evaluation (SCE). MoD guidance on the conduct of SCE is given in DPMG TECH/490. SCE is now considered fully mature and has been used by MoD for contractor selection.

The rationale behind HFI PRA is discussed under the headings below.

For the author's immediate purposes, compliance with customer mandates may prove a simple answer for the UK MoD, but the wider question of whether process assessment defines "What Matters" in HFI is still open. A very brief case for process assessment is made at the end of this section.

Experience has shown that Software Capability Evaluation (SCE) offers a number of clear benefits. The work undertaken by DERA CHS has been aimed at ensuring that HFI PRA delivers a similar scale of benefits. The value chain is expected to be as follows.

A formal Pre-Contract Award Evaluation (PCAE) provides a structured, accountable means of assessing project risks relating to quality in use. The process improvement (PI) actions raised as a consequence provide a similarly systematic means of risk mitigation (in the case of PCAE, the tender would comprise a solution to the requirements plus a binding PI programme). The results of the evaluation provide an input to contractor selection and to the project's risk management plan. PCAE can be used either to assess a few shortlisted contractors, with the results informing both contractor selection and risk reduction, or to assess a single-source or preferred contractor, with the results used for risk reduction only. An HFI PRA assessment can be stand-alone or part of a larger assessment.
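
As an informal illustration of how PCAE findings might be carried into contractor selection and the project risk management plan, the Python sketch below links a finding to a proposed PI action and to a risk-register entry. The record fields and the example values are hypothetical; they are not a prescribed HFI PRA format.

from dataclasses import dataclass

# Hypothetical record structures for carrying PCAE findings into contractor
# selection and the project risk management plan. Field names and the example
# values are illustrative only, not a prescribed HFI PRA format.

@dataclass
class Finding:
    process: str          # assessed process, e.g. an HSL process name
    weakness: str         # observed weakness relevant to quality in use
    risk: str             # consequent project risk

@dataclass
class PIAction:
    finding: Finding
    action: str           # binding process improvement committed in the tender
    due_milestone: str    # project milestone by which it must be in place

def risk_register_entry(pi: PIAction) -> dict:
    """Translate a PI action into an entry for the risk management plan."""
    return {
        "risk": pi.finding.risk,
        "mitigation": pi.action,
        "review_at": pi.due_milestone,
    }

# Example: a single finding and its mitigation (values are invented).
f = Finding(
    process="Context of use analysis",
    weakness="No evidence of end-user task analysis on previous projects",
    risk="Operator workload not understood until trials",
)
print(risk_register_entry(PIAction(f, "Conduct task analysis with user unit", "PDR")))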

The obvious disadvantage of conducting an assessment is cost. However, this disadvantage is not a function of HFI PRA but is common to all serious approaches to capability evaluation (CE) and PI. Using an established model offers the benefits of scale and maturity. The estimated cost of a CE can be weighed against the project risk and a return on investment (ROI) estimate made. It is expected that CE using HFI PRA will always turn out to be very cost-effective.
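
As a rough, purely illustrative version of that trade-off, the sketch below compares the cost of a capability evaluation against the reduction in risk exposure it is expected to deliver. All figures are invented; real estimates would come from the project's own risk and cost data.

# Illustrative ROI estimate for a capability evaluation (CE).
# All figures are hypothetical and would come from project risk and cost data.

ce_cost = 50_000                 # estimated cost of conducting the CE (GBP)
risk_cost = 2_000_000            # cost of the quality-in-use risk materialising
p_before = 0.20                  # probability of the risk without the CE/PI actions
p_after = 0.05                   # probability with the CE and resulting PI actions

exposure_reduction = risk_cost * (p_before - p_after)
roi = (exposure_reduction - ce_cost) / ce_cost

print(f"Reduction in risk exposure: GBP {exposure_reduction:,.0f}")
print(f"Return on investment: {roi:.1f}x")   # -> 5.0x for these figures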

Use of standard assessment models and tools

The format of the HSL model follows that of ISO 15504 Software process assessment. Although initially developed for software capability assessment, this standard provides a generic framework for all process capability assessment. Using ISO 15504 ensures compatibility with off-the-shelf tools and methods for assessment. In order to support independence, longevity, a broad procurement base and international operations, the HSL process model has been developed as a draft international standard.

An assessment using HFI PRA can use a number of standard approaches to assessment. Using an assessment framework based on established standards is essential for formal PCAEs: it helps to make the assessment lawyer-proof in the event of appeals or litigation by unsuccessful tenderers.

ISO 15504 Process assessment presents a standard for process capability determination. It defines a normative approach to the assessment of process maturity. The processes presented in the HFI PRA model conform to ISO 15504 requirements for variant processes. ISO 9000:2000 has an assessment standard for the process improvement purposes required under the new version of the QA standard; HFI PRA can be used in conjunction with that relatively simple assessment scheme. CMM has standard approaches to Capability Evaluation, Assessment and Process Improvement, and HFI PRA can be used in conjunction with these.
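
As an informal illustration of how an ISO 15504-style capability determination works, the Python sketch below rates one process against capability-level attributes on the N/P/L/F scale and derives a capability level. The attribute names follow ISO 15504, but the derivation rule is simplified and the example ratings are invented; none of this is a prescribed part of the HFI PRA model.

# Simplified sketch of an ISO 15504-style capability rating, for illustration
# only. Attributes are rated Not / Partially / Largely / Fully achieved (N/P/L/F).

# Process attributes grouped by capability level (levels 1-5).
ATTRIBUTES_BY_LEVEL = {
    1: ["PA1.1 Process performance"],
    2: ["PA2.1 Performance management", "PA2.2 Work product management"],
    3: ["PA3.1 Process definition", "PA3.2 Process deployment"],
    4: ["PA4.1 Process measurement", "PA4.2 Process control"],
    5: ["PA5.1 Process innovation", "PA5.2 Process optimisation"],
}

def capability_level(ratings: dict) -> int:
    """Derive a capability level from attribute ratings.

    Simplified rule: a level is achieved if its attributes are rated at
    least Largely (L) and all lower-level attributes are rated Fully (F).
    """
    achieved = 0
    for level in sorted(ATTRIBUTES_BY_LEVEL):
        attrs = ATTRIBUTES_BY_LEVEL[level]
        at_least_largely = all(ratings.get(a, "N") in ("L", "F") for a in attrs)
        lower_fully = all(
            ratings.get(a, "N") == "F"
            for lower in range(1, level)
            for a in ATTRIBUTES_BY_LEVEL[lower]
        )
        if at_least_largely and lower_fully:
            achieved = level
        else:
            break
    return achieved

# Hypothetical profile for one human-centred process.
example = {
    "PA1.1 Process performance": "F",
    "PA2.1 Performance management": "L",
    "PA2.2 Work product management": "L",
    "PA3.1 Process definition": "P",
}
print(capability_level(example))   # -> 2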

Compatibility with other assessments

The process model in HFI PRA has been designed to be compatible with other process models that might be used for related assessments.

Recently there has been international and MoD interest in model-based CE for System Engineering, but this is still maturing. MoD (DERA) have played an active part in the development of International Standards for assessment methods and for System Engineering processes, and this work is in harmony with the needs of the AMS. The HSL model has been designed to anticipate developments in System Engineering models. ISO 15288 System lifecycle processes presents a standard for the processes required to develop systems.

HSL model based on experience

HFI PRA assessment and the HSL model are based on a considerable body of expertise and a growing base of application experience. The principal source of material for the model has been ISO 18529 Human-centred lifecycle process descriptions, which has the same structure and intended forms of use, but is aimed at situations where the link between acquisition and customer organisational development does not need to be made, and where safety considerations are less pressing.

At the time of writing, information exchange with the US DoD and Canadian DND has started. Similar approaches to HFI PRA are being investigated in North America based on US standards. The intention is to co-ordinate model development in a way that supports international procurement and multi-national operations, and that pools experience.

Should a PPO or a DEC Manager or an Armed Service Director wish to evaluate their business processes, there are other established models that may be appropriate.

Compliance with customer mandates

Where some high-level customer mandate has been given, whether in the form of an instruction or a project requirement, it is reasonable to check whether it is being met. Obviously, this in no way helps Human Factors to define its professional scope of interest. The quality of what we find will be subject to the quality of the mandate, which is itself subject to all sorts of political constraints.

In the case of the UK Defence Procurement Agency, there are three high-level HFI instructions (authoritative guidance) which must either be complied with or have a written justification for deviation. For the HFI CMM study, these instructions are an obvious starting point, and it can be expected that other organisations with their own instructions will tailor the HSL model to their own legitimate sources of authority.

Compliance with legislation

The author's informal analysis of European Health and Safety at Work legislation as regards process requirements needs to be re-visited and validated. However, it should be possible to define legislative anchor points for process assessments in specific situations, e.g. machinery control in Europe or office IT in the US. This would be very powerful. The availability of COTS assessment methods and COTS improvement methods with the new ISO 9000:2000 would greatly help to promote user-centred design, on the principle of "what gets measured gets done". The prospect of common assessment methods to support diverse Health and Safety legislation is very appealing.

Accountability for public expenditure

What matters as regards accountability for public expenditure is good management and/or a good plan. The assumption in much of the government sector is that a good management plan is the answer. If nothing else, an approved management plan provides legitimacy for the expenditure of public funds. The impression of an organised programme of work gives rise to an expectation of effective outcomes. There is much to commend this. Capability maturity models adopt a managerial ethos, and assume that managed processes are better than non-managed processes. However, there are two significant failure modes.

Firstly, on large projects that are generally fairly well run, there is a quality system covering document standards, configuration control, allocation of resources, etc. This more or less guarantees that HF management plans will be at a high level of maturity, but it says nothing about whether the project is doing anything that will deliver Quality In Use.

The second failure mode is at the other end of the scale, where a small organisation works very informally, but does the right thing in the right way. This is the classic problem with approaches such as CMM or ISO 9000 and their underpinning managerial ethos.

Adoption of HCD standard as best practice defence (a biased view of ISO 13407/18529)

The author and the HCPIG community consider that this approach is the most likely to be successful.

ISO 18529 and the HSL model, like a number of the other approaches, make assumptions about the managerial ethos, but have specified outcomes which are the real basis of the assessment. However, the HSL model may well be too big to constitute a 'quick assessment'. A quick self-assessment against key processes may prove to be a very valuable tool in determining whether a project meets established best practice.
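
As an informal illustration of what such a quick self-assessment might look like, the sketch below scores a project against a handful of key human-centred processes on a simple yes/partial/no scale and flags the gaps. The process list is drawn loosely from ISO 13407/18529-style human-centred design activities; the wording, scoring scheme and example answers are hypothetical, not part of the HSL model.

# Hypothetical quick self-assessment against key human-centred processes.
# The process list and scoring scheme are illustrative only.

SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

KEY_PROCESSES = [
    "Context of use understood and documented",
    "User and organisational requirements specified",
    "Design solutions produced with user involvement",
    "Designs evaluated against requirements with users",
    "HF activities planned and integrated with the project plan",
]

def quick_self_assessment(answers: dict) -> None:
    """Print a simple best-practice gap report from yes/partial/no answers."""
    total = 0.0
    for process in KEY_PROCESSES:
        answer = answers.get(process, "no")
        total += SCORES[answer]
        if answer != "yes":
            print(f"GAP: {process} ({answer})")
    coverage = total / len(KEY_PROCESSES)
    print(f"Best-practice coverage: {coverage:.0%}")

quick_self_assessment({
    "Context of use understood and documented": "yes",
    "User and organisational requirements specified": "partial",
    "Design solutions produced with user involvement": "yes",
    "Designs evaluated against requirements with users": "no",
    "HF activities planned and integrated with the project plan": "partial",
})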

Ensuring that HF activities are carried out

The basis of this view is that there are specific activities which, when undertaken, will deliver Quality In Use.

The problems with this view are that

a) the activities may be undertaken but their outputs not incorporated into the design (deliverables become shelfware), and

b) the activities may be undertaken in an informal manner.

Alignment of acquisition and manpower development

This issue has been included because of its importance in some circumstances, where business evolution/organisational change and new technology need to be aligned. It is the basis of the US Military MANPRINT initiative. The UK Defence Procurement Agency instructions address this topic. The HSL model starts to address this issue by the definition of related processes, i.e. the inclusion of HS.4, which relates to manpower development rather than acquisition. There may be metrics that attempt to address this alignment directly, but the author is unaware of them.

Use of resources

There have been attempts to assess user-centred design by seeing whether

a) HF specialists are employed on the project and/or

b) users are involved with the project.

The involvement of HF specialists is (in the view of the author) neither necessary nor sufficient as an indicator of whether Quality In Use will be achieved. The involvement of users is (again a personal view) necessary, but not sufficient - worthy of consideration as an indicator, but not on its own.

Commitment and culture/Organisational maturity

There is considerable support for the view that the right culture in an organisation provides a valuable indicator of whether Quality In Use can be achieved. There is evidence to support this view (e.g. the work of Flanagan and Eason). Such indicators may well have a valuable role in self-assessment and self-improvement. They cannot be used by outside agencies for assessment purposes, since they are too subjective. They are of limited diagnostic value, since any improvement activity is going to comprise the improvement of specific processes. Culture indicators may well have a role complementing process metrics.

Production of deliverables

There is a view that producing particular deliverables is an assurance of Quality In Use. The author has seen too much Human Factors shelfware to credit this view with any value. Whether or not the production of any specific document contributes to Quality In Use assurance can only be judged in a specific context, as a contribution to design outcomes.

Participative design

There is a view that participation by those affected by the introduction of a new system is the essence of user-centred design. There may well be participation metrics. The underlying philosophy would be not so much socialist as one of empowerment, and certainly not a managerial ethos.

Case for process assessment

The case for process assessment is that it can concentrate on outcomes: did the project achieve Quality In Use? To get at this, there is a range of indicators, such as practices and work products, that can be used to gather evidence. There may be something better, but it has not been found yet.
