VSJ – December 2002 – Review of the Annual Symposium

Every year, Council member Paul Lynham, FIAP writes a review of the Annual Symposium. As this year’s marked the Institution’s 30th anniversary, we decided to devote the whole of this month’s IAP News to it.

Mike Ryan, the Director General, opened proceedings with a summary of events leading up to the Institution’s formation. He recalled how Bob Charles, its founder, had remarkable foresight in realising the need for an organisation to guide and promote software development professionals. Mike then welcomed the chairman for the event, Edward Wolton, past chairman of the BCS Young Professionals Group.

The first speaker was Jim Austin, Professor of Neural Computation at the University of York. Jim presented a fascinating session entitled Artificial Intelligence and The Brain. He began with the neuron, a concept introduced in 1943. He compared the brain to a computer; it uses 25 watts, calculates at 0.001 times the speed of a modern processor, yet is quicker at tasks like recognition. It has 10¹² neurons, each with 1000 connections. It can hold 10¹⁵ numbers and is 78,125 times more complex than a Pentium III. Accuracy is believed to be about 4 bits.

Computers are good at adding, storing and pushing data around. They are bad at being reliable (compared with the brain), at finding knowledge and at complex tasks, such as recognising faces. Unlike neurons, they are not learning machines.

Neurons don’t use bits, as digital computers do, but frequencies. However, how the data are encoded is unclear. With a NAND gate (a series of which could make up a computer), a single FALSE input sets the output to TRUE, whereas a neuron’s output uses a weighted system:

Neuron output = threshold(input A × weight A + input B × weight B)

The weights allow us to learn. If circumstances change, the weighting system can be altered to give a better output. Unlike a gate, a neuron can take a ‘majority vote’; it can drop its threshold and still arrive at the correct answer. It has adapted to the information available. So neural systems have better tolerance to failure.
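To make the weighted-threshold idea concrete, here is a minimal sketch in Python; the weights and threshold values are illustrative only, not taken from Jim’s hardware:

```python
# Minimal sketch of the weighted-threshold neuron described above.
# The weights and thresholds are illustrative values, not from the talk.

def neuron(inputs, weights, threshold):
    """Fire (return 1) if the weighted sum of the inputs reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two inputs with equal weights: behaves like a strict AND gate.
print(neuron([1, 1], [0.5, 0.5], threshold=1.0))  # 1
print(neuron([1, 0], [0.5, 0.5], threshold=1.0))  # 0

# Dropping the threshold lets the neuron take a 'majority vote':
# with three inputs it still fires when one input is missing or wrong.
print(neuron([1, 1, 0], [0.4, 0.4, 0.4], threshold=0.6))  # 1
```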

Using these ideas, Jim has built a ‘Cortex’ card with 28 processors and also has a complete machine, with 400 million neuron evaluations per second. A system has been produced to recognise addresses for the Post Office. If an address is spelt incorrectly, conventional software for finding the postcode will not work. However, the AI system gives better results. Another application is in intelligent text searching where a search for ‘chair’ can find words with the same meaning – bench and seat, for example.
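Jim didn’t go into the neural internals in depth. Purely as an illustration of what tolerant matching means in practice, a conventional (non-neural) sketch using Python’s standard difflib might look like the following; the street names, postcodes and misspelling are invented and this is not how the Cortex system works:

```python
# Illustration only: tolerant lookup of a misspelt street name using
# standard edit-distance-style matching (difflib), NOT the neural
# associative memory used in Jim's system. The data are invented.
import difflib

postcode_by_street = {
    "Acacia Avenue": "AB1 2CD",
    "Station Road": "AB3 4EF",
    "High Street": "AB5 6GH",
}

query = "Staton Rd"  # misspelt and abbreviated
match = difflib.get_close_matches(query, list(postcode_by_street), n=1, cutoff=0.6)
if match:
    print(match[0], "->", postcode_by_street[match[0]])  # Station Road -> AB3 4EF
```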

A company, Cybula Ltd, has been set up to exploit the technology and Jim demonstrated some example systems.

The second speaker was Mike Wetton, Director of Allery Scotts Limited. His presentation was entitled Web Sites – A Marketing Perspective. Mike gave a great overview of aspects of marketing using the Web and what it can deliver.

He cited an example of how a Web site was integrated into the marketing mix. Mike was asked to give a critique of a new Web site. He saw that it didn’t incorporate any strings that could help search engines to locate it. However, the MD was pleased with the site, as the company had already sold three machines through it. Mike asked how that was possible when the site apparently couldn’t be found. It transpired that the company had used traditional advertising placed in different media in foreign countries and that the advertisements pointed to their Web site. So it was being used as a colour brochure, available to global customers and promoted through traditional advertising – a very effective strategy!

Mike covered how the Web can be used by clever applications, giving an example of a site that was used to search for appeals against planning application rejections. Information that could be used to help prospective appeals could easily be found using the site’s search facility.

A Web site’s objectives are the paramount consideration in its planning. Questions such as ‘what is it for?’ and ‘what will it do?’ must be asked. Woolly objectives lead to disaster.

Mike outlined how customer relationships can be built using simple techniques. People like PowerPoint presentations, even when other attractive gimmicks fail. Also, they don’t want to fill in massive forms. One idea was to use a PowerPoint presentation on a site and to ask users for their telephone numbers and email addresses to access it. Potential customers could then be tracked, telephoned to confirm their degree of interest and emailed to follow up with newsletters and offers.

Mike pointed out the folly of organisations buying certain keywords with companies that would guarantee their site being at the top of the returned search lists for, say, the word ‘holiday’. Serious browsers usually use search strings, rather than single keywords. With carefully selected strings, a site could appear at the top of the lists on major search engines for next to nothing. For example, looking for ‘holiday’ would return millions of pages, yet ‘Cornish holiday cottage’ would return only a few, with better targeting. Therefore a good strategy is to produce results where there is little competition. Careful selection of strings is important and needs to be well researched. A tourist board using the word ‘holiday’ will attract no Americans as they would say ‘vacation’.

Paul Burton, of the Meteorological Office, spoke on Modelling The Weather. He gave a brief history of the Met Office. It was founded in the Board of Trade in 1854 and published its first public forecast in the Times in 1861. In 1919 it became part of the Air Ministry and started live TV forecasts in 1954. 1962 saw its first electronic computer installed. In 1964 it became part of the MoD. In 1996 it started operating as a ‘Trading Fund’ so that it has some independence. It has customers across the whole spectrum of industry.

Numerical weather prediction involves several stages: observation and first guess, processing of the observations, data assimilation, analysis, the numerical model forecast and post-processing to produce the final forecast. Observations are made from aircraft, balloons, satellites, sea and land. Data assimilation starts with a background field three hours before the prediction. The forecast is run taking into account observations at different times, iterating by changing the starting field until the best fit is found.
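As a rough, highly simplified sketch of that ‘iterate until best fit’ idea (this is not Met Office code, and the numbers are invented), the assimilation step can be pictured like this:

```python
# Schematic sketch (not Met Office code) of the assimilation idea described
# above: start from a background "first guess" field and iteratively adjust
# it so that it best fits the observations. All numbers are invented.

background = [10.0, 12.0, 11.0]      # first-guess temperatures at three points
observations = [10.5, 11.2, 11.9]    # measurements at those points
obs_weight, bg_weight = 1.0, 0.5     # how much to trust obs vs background

field = background[:]                # starting field to be adjusted
for _ in range(100):                 # iterate until (near) best fit
    for i in range(len(field)):
        # gradient of a simple cost: misfit to observations + drift from background
        grad = (2 * obs_weight * (field[i] - observations[i])
                + 2 * bg_weight * (field[i] - background[i]))
        field[i] -= 0.1 * grad       # small correction step

print([round(x, 2) for x in field])  # analysis: a compromise between the two
```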

The main software system is the Unified Model. It consists of 700K lines of FORTRAN 77/90 and C code. The hardware has continually changed over the years. Supercomputers have been used: the Cray T3E-1200 and T3E-900, and an IBM 9672 R45/R25 mainframe with 6 CPUs and 800TB of mass data storage. The daily work involves 10 million observations, 40 operational forecasts and 1.2×10¹⁶ calculations, with 150GB of input data producing 0.5TB of output.

Paul gave the pros and cons of parallel systems, which are essentially ‘many hands make light work’ versus ‘too many cooks spoil the broth’. He went on to show a history of hardware use, ranging from an IBM 360/195 scalar processor architecture in 1970 through to vector, parallel vector, multi-parallel scalar processors to the current NEC SX6 multi-parallel vector processors.
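The trade-off Paul described is the familiar one captured by Amdahl-style reasoning: extra processors speed up the parallel part, but serial work and communication overhead eventually dominate. The toy calculation below is not from the talk and its parameters are invented; it simply shows speedup rising and then falling as processors are added:

```python
# Illustration (not from the talk) of the parallel trade-off: more processors
# help the parallel part, but serial work and per-processor communication
# overhead eventually dominate. Parameters are invented for illustration.

def speedup(n_procs, parallel_fraction=0.95, comms_cost=0.002):
    serial = 1.0 - parallel_fraction
    # run time relative to one processor: serial part + divided parallel part
    # + a per-processor communication overhead ("too many cooks")
    time = serial + parallel_fraction / n_procs + comms_cost * n_procs
    return 1.0 / time

for n in (1, 8, 32, 256, 1024):
    print(n, round(speedup(n), 1))
# Speedup rises with more processors, peaks, then falls as overhead grows.
```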

Next year the Met Office’s HQ moves to Exeter, where it will have 30 nodes of NEC SX6s in two computer halls for resilience, giving six times its current capability. This will lead to increased resolution – 40km for the global model, 20km for the Atlantic and Europe and less than 10km for the UK. It is hoped that high resolution will be achievable for severe weather events (as these are very costly). Ensemble techniques (running a collection of many different models) will give probabilistic forecasting in the medium range. The Met Office also hopes to gain a better understanding of the local impact of climate change, a more sophisticated science model and better feedback between systems.

Paul said that weather forecasting has been transformed by the use of high performance computing. It allows us to understand the impact of man’s activities on Earth’s climate. He ended by saying that the Met Office has an insatiable appetite for CPU cycles and that the most powerful computer in the world is currently being used by weather forecasters in Japan.

The first speaker after lunch was John Blowers of AngelBourse – a stock market for angel investors. John said that the company, based in Bristol, was launched in September 2002. It is an online stock market for unlisted companies.

John listed the market opportunities, stating that there are many SMEs seeking capital but the cost of going to existing exchanges is high. There is also a shortage of investment opportunities and a lack of liquidity in angel investments – there is usually no defined exit route.

Large companies (around 35 000) make up 0.2% of the market while 33.4% (1.2 million) are growth companies. So 66.4% are small businesses or sole traders, numbering 2.3 million.

The idea (an old one, as there used to be regional UK stock markets) provides companies with an opportunity to release equity for liquidity or expansion, or to discover their share price, while slashing the costs of a public offering. Investors benefit from the high-growth phase of the business cycle, as well as gaining access to new investments and the liquidity created in angel investments.

AngelBourse is also developing new products, including a Property Bourse and OTC trading for company prospectuses, bonds and companies for sale. It has international developments in Germany and France with Deutsche Telekom, Xerra and Euronext. The company works with strategic partners who are sophisticated investors. Shares can be traded on the Web as on a stock exchange: investors can buy, sell and manage their portfolios.

The Web site uses Oracle for the back end database of investors and companies. NetChemistry manages this in the US. The front end is created in-house by AngelBourse’s technical team and uses HTML and JavaScript. WorldPay is used for the payment system.

In summary, the company is business rather than IT-driven. It uses pragmatic solutions for testing economic times. At present, outsourcing is the primary model, but when the time is right, proprietary systems may be developed.

Shirief Nosseir, of Computer Associates, presented The Value of Modelling. He pointed out that the e-Business challenge is to bridge the communication gap between a company’s business and IT goals.

The track record for large projects is poor. Only 25% are successful, 28% fail and the remainder cause major problems. In the US this translates to losses of $75 billion. 60% of development problems are rooted in the Analysis and Design phases and 80% of resources are spent on maintenance. The average cost overrun is 189% and the average time overrun is 222%, leading to a 27-month backlog of end-user requests.

Shirief maintains that modelling can help align business and IT goals. He used the house-building analogy. The building inspector checks that the rooms are the correct height, the doors are in the right positions, and the wiring and plumbing meet minimum safety standards. However, the people who are to live in the house are the ones who need to check whether the views are good and the rooms are the right size. Modelling can be used to ensure that the user’s goals are met.

The ‘AllFusion’ screen shots displayed the classical CUSTOMER, ORDER, ORDER ITEM class diagram with the corresponding SQL statements to create tables and add indexes, automatically generated from the model. The product supports both forward and reverse engineering, so models can be produced, redesigned and the tables amended correspondingly. It also caters for all standard UML diagrams. CA has the lion’s share of the modelling market but we didn’t get a price for this system. Shirief answered scepticism that the generated code might be inefficient by saying that the code produced in the target language could be tuned manually. There is also a product that can be used to model the presentation layer.
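As a rough illustration of what forward engineering from a model means, a toy sketch might look like this. It is not the CA product; the column definitions are invented, and ORDER has been renamed ORDER_HDR to avoid the SQL reserved word:

```python
# Toy sketch of forward engineering: generating table-creation SQL from a
# simple model description. Illustrative only, not the AllFusion product;
# column names and types are invented.

model = {
    "CUSTOMER":   [("customer_id", "INTEGER PRIMARY KEY"), ("name", "VARCHAR(80)")],
    "ORDER_HDR":  [("order_id", "INTEGER PRIMARY KEY"), ("customer_id", "INTEGER")],
    "ORDER_ITEM": [("order_id", "INTEGER"), ("line_no", "INTEGER"), ("qty", "INTEGER")],
}

def generate_ddl(model):
    """Turn each entity in the model into a CREATE TABLE statement."""
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in columns)
        yield f"CREATE TABLE {table} (\n  {cols}\n);"

print("\n\n".join(generate_ddl(model)))
```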

Andrew Hill, Flight Data Management (FDM) Programme Manager at Eurocontrol, gave a presentation on FDM – A New Concept For Europe. Eurocontrol is overseen by a board of European ministers; each country has its own air traffic organisation, with Eurocontrol as the umbrella body.

Andrew clearly demonstrated a need for change. There is massive duplication of data used by the Aircraft Operator, the Initial Flight Plan Processing System, the Tactical System (allocating slots for aircraft) and the Air Traffic Control Flight Data Processing System! Many data exchanges are one-way and many data flows are partial. Stakeholders have inconsistent Environment Data describing routes and navigation beacons. A 4D trajectory is needed for each flight and this is recalculated many times, by different systems, each using its own algorithms and parameters.

As an example, he showed the problem of flying near a military airbase. Normally non-military aircraft would not be allowed to enter such airspace and so a route has to be planned around it. However, the military might exceptionally release it to a civil aircraft after a flight plan has already been filed. The route will then be changed but other traffic control centres may not be informed. In fact, up to 44% of flights have differences from the flight plan!

Correcting inconsistencies with flight data within the ECAC region costs 30.4 million Euro every year. The new concept is to have central Flight Objects that all stakeholders can access. Questions that arise from this idea include who is going to pay for it and how much intelligence it should have. Communications issues, integration with legacy systems, and safety and security problems will also need addressing.
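The talk did not specify what a Flight Object would contain, but as a purely hypothetical sketch of the idea – one shared record per flight that every stakeholder reads and updates, rather than each system keeping its own copy – it might look something like this:

```python
# Hypothetical sketch of the 'central Flight Object' idea: a single shared
# record per flight that all stakeholders access. Field names and the update
# mechanism are invented, not taken from the Eurocontrol programme.
from dataclasses import dataclass, field

@dataclass
class FlightObject:
    callsign: str
    departure: str
    destination: str
    trajectory_4d: list = field(default_factory=list)  # (lat, lon, level, time)
    version: int = 0                                   # bumped on every change

    def update_trajectory(self, new_trajectory):
        """Replace the agreed 4D trajectory so every stakeholder sees one version."""
        self.trajectory_4d = new_trajectory
        self.version += 1

flight = FlightObject("ABC123", "EGLL", "LFPG")
flight.update_trajectory([(51.5, -0.5, 300, "12:00"), (49.0, 2.5, 50, "12:45")])
print(flight.version)  # 1 – all systems would now refer to this single record
```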

Some research has been carried out experimenting with different architectural models. QASE, a PC software package, was used to allow candidate physical models to be assessed, simulating air traffic data and showing the utilisation of the architectural components.

In the short term (2003-4), Andrew suggested that existing processes would be streamlined. For the medium term (2005-7) the existing interface standards will be updated, for example, to support transmission, reception and processing of some new messaging. The long-term goal (2007-10) will be for a significant new data-sharing environment, which will require infrastructure changes.

Jim Bates, the Institution’s President, rounded off the afternoon with some of his experiences in computer forensics.

[Don’t forget to email eo@iap.org.uk with your reports, news, opinions and events. We’re back to the usual format in February.]
