VSJ – May 2005 – Symposium Report

As regular readers know, Council member Paul Lynham traditionally writes a report on the annual symposium to which we devote the whole of ‘IAP News’. Here’s this year’s.

The annual symposium of the Institution of Analysts and Programmers was held on Thursday, 11 March in Trinity House, Tower Hill, London. Mike Ryan, the Director General of the Institution, began by calling for IT to be allowed to lead the way. He also noted that more business analysts are approaching the IAP as prospective members. The IAP is extending to businesses the partnerships that it has always had with academia. Mike went on to introduce this year’s chairman, David Morgan, a long-standing Council member of the Institution.

David introduced the first speaker, Peter Green, IT Director at the Daily Telegraph. Peter talked authoritatively about the Radiate Theory, which can be summarised as allowing IT to drive the business. From a legacy viewpoint, this might be seen as ‘the tail wagging the dog’. The concept of the theory is to make functional IT a business integration tool, allowing the integration of users, senior managers and business units.

Often, Business Unit Managers consider requests for new systems or IT upgrades only in terms of their relevance to that business unit, rather than to the business as a whole. Peter went on to give diagrammatic examples of how other units within an organisation perceive a request from a sponsoring business unit (unit A). For instance: “if the system can do X, then we will change our model to that of unit A”; “if unit A changes its model, it will cause our unit additional costs”; “the factory is unclear whether it can keep up with the new model”; “unit D loves the new model”.

Such requests can be managed by gathering all business units together for brainstorming sessions. The idea is to create a large map of the upsides and downsides. Each element can be expanded, so that several layers are produced. This can then help in deciding on the true benefits of the change and in fully costing it. Peter went on to show a simple formula that can be used.

As with any serious project, such a change must not only be correctly defined but also have a champion to give it active support at board level. Peter gave some pointers to implementing such changes successfully, including taking an iterative approach, the importance of soft systems management, and checking the validity of the statements of objectives and reviewing them continually.

Because IT isn’t a service in this model, IT costs become a function of business unit costs. Again Peter demonstrated a formula that could be used. He summed up by reiterating that technology should not be servicing business, but driving it.

Seb Bacon, Technical Director of Jamkit, gave an interesting presentation entitled ‘The Open Source Jungle’. Seb has a personal interest in getting charities and public bodies to use OSS (Open Source Software). He believes that OSS is practical, cultural and ethical and that it is an intersection between proprietary business and open academia. It is ready for servers, enterprise applications and some desktops. Seb offered an overview of the history of OSS and described source code as ‘free speech’. He gave an example of some source code that had been developed to interpret DVD encoding for use on non-MS operating systems. Delegates were greatly amused on seeing the code, bearing in mind his remark about free speech, as the code used one- and two-letter variable names and was comment-free!

Seb went on to provide the highlights of OSS development chronologically:

1969 saw the birth of UNIX, created by Ken Thompson and Dennis Ritchie.

In 1978, the Berkeley System Distribution (BSD) version of UNIX was released. A year later, DARPA adopted UNIX and contracted Berkeley.

The 1980s saw the deregulation and commercialisation of software.

In 1983, AT&T released its first public UNIX. BSD was the first OS to support networking for the Internet.

In 1989, UC Berkeley extracted their networking code from the rest of BSD under a loose licence.

1991 saw the removal of all UNIX code from BSD; the rewrite was completed with the help of volunteers over the Internet.

Seb talked about hackers (as in programmers who like to discover how things work), with a section entitled ‘Cracks in Paradise’. He described how Richard Stallman had habitually written code to modify the behaviour of printer drivers. When, however, printers were supplied without source code, this ceased to be possible. Stallman started the GNU project in 1983. In 1989 the GNU General Public Licence was published, introducing the Copyleft concept (a play on Copyright). This can be summed up as, “you can use this software however you like, but if you use it publicly, you must give away the source code and release it under this licence”. Such licences have been described as ‘viral’ as they spawn further software and licences that must comply with the philosophy.

Seb compared BSD and GNU software by their ideals: the BSD licence aims to produce better software, whereas the GNU licence aims to produce more and more free software. Free in this context means free as in freedom to change it, rather than free as in ‘free beer’.

Seb described how Linus Torvalds wrote Linux, supplying the kernel that the GNU system still lacked, and touched on the role of Eric Raymond. He summed up by pointing out the advantages and disadvantages of OSS.

David Deeks from Sunderland University introduced PISO – Process Improvement for Strategic Objectives. David described how this methodology was formed and summarised it as a step-by-step business improvement process. He went on to outline the phases involved and the use of data flow diagrams (DFDs). PISO uses DFDs to improve system efficiency and to meet agreed strategic objectives.

The first stage, Stage 0, identifies an area requiring improvement. Stage 1 develops those objectives and Stage 2 analyses the current system (DFD logic). The final stage designs the new system.

Each stage is then broken down into further steps. For instance, Step 0.1 identifies the general area of improvement, Step 0.2 identifies the initial stakeholders and Step 0.3 establishes the improvement area boundaries.

David stressed that the stakeholders don’t merely ‘get involved’ – they do the design. The method is a rigorous approach to requirements definition, and the agreed strategic objectives drive the redesign. He referred to a case study of a GP system, which was driven by a need for increased efficiency. Other systems, however, were driven by a need for quality. The two are not mutually exclusive, as quality doesn’t come at the expense of efficiency: part of the process is the logicalisation of the system for efficiency.

David went on to describe commercial products that have come out of this work, including pisoSIA and a number of applications such as pisoMetrics and pisoLEAN. He also detailed training courses, certification and training organisations.

Henry Nash, of The Apsley Group, gave a presentation on ‘NBIC – The Next Big Thing’. He described NBIC as the convergence of nanotechnology, biotechnology, IT and cognitive science. Other descriptions include BANG (Bits, Atoms, Neurons and Genes), COMBINE (Cogno meets Bio, Info and Nano technology) and ‘The Singularity’.

Henry outlined the exponential pace of innovation and cited Moore’s observation that the power of computers doubles every 18 months. He exemplified the future trend: $1000 will buy you the equivalent power and memory of a rodent brain by 2010, a human brain by 2020, a thousand human brains by 2035 and all the human brains in existence by 2060!
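Projections like these are simple compound-doubling arithmetic. As a rough sketch (assuming an 18-month doubling period, one common reading of Moore’s observation, and treating 2005 as the baseline year – the milestone years themselves are Henry’s illustration, not derived here), a few lines of Python show how quickly fixed-price computing power compounds:

```python
def growth_factor(years, doubling_period=1.5):
    """Multiplier on computing power per dollar after `years`,
    assuming power doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# From a 2005 baseline: projected power per $1000 in later years.
for target_year in (2010, 2020, 2035, 2060):
    factor = growth_factor(target_year - 2005)
    print(f"{target_year}: ~{factor:,.0f}x the 2005 baseline")
```

By 2020, for example, this gives roughly a thousandfold increase (2^10) over 2005 – the scale of jump behind moving $1000 of hardware from ‘rodent brain’ to ‘human brain’ territory.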

This applies not only to transistors, but to other technologies. For example, the Human Genome Project was originally expected to take 15 years and cost $3 billion. It took eight years to sequence the first 7%, but the remainder took only another two. Celera’s map cost just $200 million. Such stories will be repeated with Proteomics and the Human Cognome Project.

Henry went on to highlight some current examples of NBIC. There are artificial retinas with 5000-pixel resolution fed into patients’ optic nerves. There are neural interfaces implanted with 1000 neural connections. Synthetic biology and nanotechnology will allow the first artificial strands of DNA to be written permitting medicines to be created in situ, rather than having to be transported to the site of need.

The predicted IT involvement in NBIC may be:

2005-2010: The Tool era – IT provides increasingly more powerful tools.

2010-2015: The Embedded era – IT is now an active part of NBIC services.

2015-2025: Non-human intelligence.

2020-2030: True NBIC era – NBIC now at work in most industries.

On the hardware side, Henry predicted:

2005-2010: 1 sq mm computing nodes appear, at less than 10 cents each.

2010-2020: Such technology becomes invisible and nanotechnology becomes increasingly a part of the device.

Henry highlighted the technical issues to be solved in the next 10 to 15 years. Drug discovery will go exponential, with ‘in silico’ experiments rather than ‘in vitro’ becoming the norm. Most diseases will be curable or their progression halted. (Jim Bates remarked later that his goal was to survive until then!) Self-replication will transform accessibility to this technology. He summed up by saying, “IT changed the world. NBIC will change humanity”.

Brian Swan, of Exoftware, gave an excellent presentation on Agile Software. Brian started by highlighting the number of projects that were either challenged or impaired due to incomplete or changing requirements. He went on to list some of the members of the Agile family, including XP, SCRUM, Adaptive Software Development, Lean development, Feature Driven Development, Crystal and DSDM.

Agile methods favour individuals and interactions over processes and tools, with working software favoured over documentation, customer collaboration over contract negotiations and responding to change over following a plan.

Brian highlighted the principles of Agile development. The highest priority is to satisfy the customer through early and continuous delivery of valuable software. It should welcome changing requirements, even late in the development cycle. It should also deliver working software frequently and build projects around motivated people. The best way to convey messages is by face-to-face contact. Working software is the primary measure of progress and there is continual attention to technical excellence. Simplicity – the art of maximising the amount of work not done – is essential. The team also reflects on how to become more effective.

One of the results of an Agile development process is that productivity increases. Success is inversely proportional to size, with small projects having a much better chance of delivery than massive ones.

Dr Philip McLauchlan of Imagineer Systems presented a session entitled ‘Look – no wires’. While supervising a postgraduate student, he found that objects could be eliminated from frames (for example, a tennis player from a Wimbledon video). This led to the idea of removing from films objects such as the wires used to allow people apparently to fly through the air. The result is a computer vision solution for postproduction in film and allied industries. In fact, the software developed can now be used for 2D tracking in visual effects, wire and rig removal, restoration, stabilisation, matte creation and rotoscoping.

The software uses the freely available Gandalf library from SourceForge and also makes use of OpenRL. Several applications have been developed, including Mokey and Monet. Monet incorporates Aspex Line Dancer acceleration.

Philip demonstrated how wires are removed from a film clip. This involved marking out areas on an initial frame, such as anchor points and the wires to be removed. He then ran through a few frames, just to ensure the software could track the wires. Finally, the whole clip was run and the wires were removed and replaced by the background automatically – very impressive. Some leading postproduction companies have already purchased the software.

Steven Jenkins of Safemessage gave an enlightening presentation on the safety of email. Generally, people feel safe when using email. However, there are security weaknesses, and email is no more secure than sending a postcard. The contents of an email are virtually public, easily intercepted and can be forwarded (unintentionally or maliciously). Further, identities can be spoofed. Data lives forever because copies can be archived all over the world without the sender’s knowledge or approval.

The costs of email leaks include loss of valuable confidential data, client confidence and credibility. Direct costs are those of litigation and the effect on share price.

There are hacking and packet-sniffing shareware applications that can be used for email interception. So is encryption the answer? Well, only partly. Emails can still be forwarded and delivery status is unknown. Often, encrypted material is quarantined at the firewall. Email client incompatibility may present a problem. And headers will still appear in plain text.

So what is the answer? Any solution to address these problems should encrypt the entire message and any attachments. It must give control over forwarding, printing and the life of the message. It must provide direct routing. And it must not allow an unencrypted version to exist. Finally, it must be easy for the user to work with.

Steve concluded that everyone should be aware that email is insecure and that confidential information should only be sent over a system that provides absolute security, not just basic encryption.

In conclusion, David Morgan called on the IAP President, Jim Bates. Jim felt that this year’s Symposium had been an outstanding success. He closed the event by inviting delegates to the Pitcher and Piano, where a good time was had by all.
