VSJ – April 2005 – Work in Progress

Jean Davison, David Deeks, Mark Ward-Casey and Mark Wilson of Sunderland University, an IAP Partner, write about a new arrow in the PISO quiver, which David and Jean have described here before.

As regular readers of these IAP pages may recall, PISO® (Process Improvement for Strategic Objectives) is a method that allows system users to redesign their own work-based systems. Reinforcing the method’s emphasis on the need to engage stakeholders in the redesign, the first major development arising out of PISO® was pisoSIA®. A method in its own right, pisoSIA® is designed to support the identification and analysis of all stakeholders in a system – whether users or otherwise.

As mentioned in the February issue, further new areas are currently being researched to augment the PISO® suite of tools, one of which is pisoMETRICS®. Maintaining the PISO® ethos, pisoMETRICS® will be simple and ‘end-user-usable’. It will provide a means to measure system aspects at the beginning and end of the PISO® analysis. Its aim is to offer a complete yet simple means of judging the viability of a project, providing clear metrics to inform the decision as to whether to keep things as they are, or to select one of the PISO® re-designs as most appropriate for implementation.

To meet this aim, pisoMETRICS® will provide metrics on three different bases. Two of these, Cost/Benefit Analysis and Risk Assessment, are well known and commonly used. For these, the PISO® research and development team is considering existing approaches and selecting/adapting as appropriate.

Cost/Benefit Analysis is self-explanatory. It provides a comparison of costs and benefits. If costs are high and benefits low, a project is unlikely to proceed. If the opposite is true, the project is likely to be given the go-ahead. Risk Assessment is similarly self-evident. It provides a means of evaluating the risk of taking a particular course of action. What risks are inherent in keeping things as they are? What risks are attached in making changes X, Y or Z? The PISO® approach places emphasis upon the fact that the stakeholders affected by a systems development are wider ranging than users of, or direct engagers with, a system. As these wider circles of stakeholders are considered – each with their own expectations of the ‘system’ – alternative costs and benefits associated with the various stakeholder groups may well pose varying levels of risk connected to implementation.

The third aspect is the PISO® team’s own development, which they have called ‘warm glow analysis™’. This provides a measure of how stakeholders feel, in relative terms, about the current system compared with the proposed one(s). If management were to make a proposed change, could any stakeholder resistance to it be quantified?

Proposed change can lead to uncertainty as to whether or not to implement a system – including a natural reluctance among stakeholders who would prefer to stay with what is familiar. It is important to consider the effect of stakeholders upon a system (Davison et al 2002, 2004, Coakes and Elliman 1997) and PISO®’s identification of the wider circle of stakeholders can prove a key factor for successful implementation. Recognition of such a range of stakeholders increases the likelihood of identifying widely varying expectations of a project outcome – and similarly varying views as to whether proposed changes are desirable. Management stakeholders may well consider success from an economic standpoint, with end user or client/customer stakeholders considering the usability of a system or the quality/service it provides. Many projects fail to meet their potential simply because stakeholders resist the proposed changes – and this resistance can be open or covert. As well as occasions of flat refusal to use new systems, how many implemented ‘solutions’ somehow do not work as well as expected because users do not engage as fully or enthusiastically as had been hoped?

As implied above, such resistance identifies one form of risk. Risks are those potential events that lead to problems for a project, system or organisation. It is essential that risks are identified and managed successfully so that any business can run smoothly and continue to progress. A study by Parker and Mobey (2004) found that ‘lack of user involvement’ was ranked lowest as a critical risk factor in project management. The study concluded that the risk of choosing an inappropriate system was high and that greater user involvement could potentially help minimise these risks. Such research supports the PISO® team’s experience – thorough stakeholder involvement greatly aids successful systems implementation, by reducing the risk of stakeholder resistance.

PISO® has been very successful in developing stakeholders’ enthusiasm for a system’s development. In fact, there have been numerous instances of end user/client stakeholders demanding that management make the changes that their PISO® project has developed, with ‘resistance to change’ being found within the management rather than the end-users. A very high proportion of PISO® projects is subsequently considered by a majority of stakeholders to have been ‘successfully’ implemented. But what does ‘successfully’ mean? And from whose perspective? A study by Seddon et al (1998) that viewed stakeholder perspectives of system effectiveness demonstrated that different measures are necessary to gauge the effectiveness of a project in differing contexts. It is anticipated that as the stakeholders are involved in the development of a system from its current form up to the negotiated final form, some level of agreement as to improvements should be seen – but how to quantify it? How do we actually measure whether stakeholders are in accord with the changes, and to what extent? Any systems development should satisfy users’ needs and continue to do so (Jayaratna 1994) – but in order to establish that these needs have been met, some form of measurement is required.

The solution has proved deceptively simple. It may be recalled that PISO® models the current and proposed systems using data flow diagrams (dfds). It is acknowledged that, with the current emphasis upon an object-oriented view of systems development, such diagrams have come to be regarded as ‘old hat’ in many computing quarters – but they have found a most appropriate home within PISO®. In particular they bring clearly defined logicalisation steps missing from so many graphical representations (e.g. Process Mapping, Value Stream Mapping) and these steps have been found to be crucial to reliable re-design. Stakeholders quickly learn to use dfds extremely effectively, with the logicalised diagrams being the key to the development of re-engineered systems based upon agreed strategic objectives.

Such detailed graphical illustrations of the processes, data stores and data flows involved in the system have provided the basis for the ‘warm glow analysis™’ approach. In early trials, this has proven simple to use yet gives a rigorous measurement of how well stakeholders rate the current system in comparison to the proposed one(s).

‘Warm glow analysis™’ involves users being asked to allocate an approval rating from 0-10 to each of the processes on the ‘current physical system’ data flow diagram and then again to each of the processes on the ‘new physical system’ diagram. They are deliberately given no advice whatever as to the basis on which to measure a process except by ‘feelings’. Thus stakeholders are not forced into preconceived notions of their perception of the system and are able to give an independent rating. Subjective user perception (the level of ‘warm glow’) is therefore converted into objective measurement (numbers from 0 to 10). The measurable aspect is the mean difference between the ‘before’ and ‘after’ ratings.
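The arithmetic behind the metric is straightforward. As a minimal sketch of the calculation just described – the process names and ratings below are invented for illustration, not drawn from any PISO® trial:

```python
# Hypothetical warm-glow ratings (0-10), one per process on the
# 'current physical system' and 'new physical system' DFDs.
current = {"log enquiry": 4, "allocate job": 5, "report outcome": 6}
proposed = {"log enquiry": 7, "allocate job": 8, "report outcome": 6}

def warm_glow_swing(before, after):
    """Mean difference between 'after' and 'before' ratings
    across the processes common to both diagrams."""
    shared = before.keys() & after.keys()
    return sum(after[p] - before[p] for p in shared) / len(shared)

# A positive swing suggests the stakeholders prefer the proposed system.
print(warm_glow_swing(current, proposed))  # 2.0 for the invented figures above
```

The larger the positive swing, the stronger the indication of stakeholder support for the redesign.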

A criticism sometimes levelled at the approach is that delegates involved in redesigning their own systems will be predisposed towards giving a favourable outcome – i.e. they designed the new system, so they are almost bound to allocate higher scores to it. This misses the point entirely. If stakeholder involvement in a system’s development has influenced them to prefer the new system, this only demonstrates the advantage of PISO®’s emphasis upon having stakeholders design the changes. The reason for individuals thinking a certain way does not matter, however. The metric important to management is whether these user stakeholders prefer a new system or not, and to what extent. Such a measure of acceptability indicates the risk of a project from an end user perspective – the larger the positive swing, the less risk of the user stakeholders rejecting it.

Early trials have included The University of Sunderland, Northumbria Police and Northumberland Health Care. The ‘current system’ average rating so far is 5.2, and the ‘proposed system’ average 7.4, indicating general support for the changes developed using PISO®. No PISO® project has so far shown a lower result for the proposed system – although individual stakeholders have given lower ratings to specific processes in the proposed system.

This latter point indicates that much is to be gained by noting the differing process ratings given by different stakeholders. An interesting example was the Northumberland Health Care analysis. A patient stakeholder rated old and new system processes equally highly, indicating her satisfaction with the service she already received. However, staff stakeholders showed a significant swing towards the proposed system, indicating how much more easily they could provide the same level of care if the new processes were adopted. This reinforces the findings of Seddon et al (1998) who, as mentioned above, stated that different stakeholders would hold differing views of system effectiveness.

The signs are that pisoMETRICS® will provide a beneficial and practical tool for the evaluation of system development projects. Incorporating the ‘warm glow’ aspect of user acceptance of system change with the more conventional Cost/Benefit and Risk Analysis approaches provides an enhanced base for project evaluation.

References

Coakes, E. and Elliman, T. (1997). Stakeholders and Systems in a Turbulent Environment. 5th European Conference on Information Systems, Cork, Cork Publishing Ltd.

Davison, J., Deeks, D. and Thompson, J.B. (2002). Developing a Stakeholder Identification and Analysis Technique for use in Information System Redesign. 6th World Multiconference on Systemics, Cybernetics and Informatics (SCI 2002), Orlando, FL, USA.

Davison, J., Deeks, D. and Bruce, L. (2004). Stakeholder Identification and Analysis: A Medium to Aid Change Management in Information Systems Reengineering Projects. Journal of Systemics, Cybernetics and Informatics, 2(2).

Jayaratna, N. (1994). Understanding and Evaluating Methodologies: NIMSAD, a Systemic Framework. London, McGraw-Hill.

Parker, D. and Mobey, A. (2004). Action research to explore perceptions of risk in project management. International Journal of Productivity and Performance Management. 53(1) pp. 18-32.

Seddon, P.B., Staples, D.S., Patnayakuni, R. and Bowtell, M.J. (1998). The IS Effectiveness Matrix: The Importance of Stakeholder and System in Measuring IS Success. Proceedings of the International Conference on Information Systems, Finland.

Talk to David on 0191 5152666 or email david.deeks@sunderland.ac.uk. See www.piso.org.uk for more details.

[Interesting project or development? Let us know at eo@iap.org.uk!]
