Computers in business and industry
The early days of computing in New Zealand were, not surprisingly, dominated by the government, but not long after the public sector made its pioneering moves into electronic data processing, private firms began to take advantage of the opportunities presented by computer technology.
In the vanguard, naturally, were the subsidiaries of worldwide and overseas conglomerates which could take advantage of existing developments by their parent companies. But local pioneers were not long in coming; a Lower Hutt paint manufacturer, BALM paints, subsequently absorbed by Dulux, spawned not only one of the earliest industrial computer systems, but also one of New Zealand’s software pioneers. Perce Harpham emerged from his Dulux experience to found Systems & Programs Ltd, now Progeni, New Zealand’s biggest software house.
From these early days of private sector computing it was a matter of virtually uninterrupted exponential growth, despite a severe scarcity of computer expertise. A census taken in the mid-1970s revealed at least 150 computers around the country, an industry in its own right already employing more than a thousand people.
The early machines were proverbial monsters, with little internal storage; even the program instructions themselves were read and executed from ungainly storage ‘drums’. Internal circuitry was built from transistors and other discrete components or even failure-prone thermionic valves. The first major innovations driving the New Zealand private sector business market came around 1963, with the arrival of integrated circuitry, random-access memory and the first of the real ‘families’ of computers, the IBM 360 series. This gave some promise of growth in computing power to match the growth of the businesses themselves. The 360 brought the second wave of New Zealand business computing, with major local companies like Dalgety and Cable Price Downer taking on the new machines.
Early business applications were, for the most part, fairly standard tasks — essentially substitutions for clerical work. The key justifications were always higher productivity, reduced cost and fewer staff. Manual procedures were duplicated fairly closely and often with little imagination. There was none of the ‘improved management information’ sought these days as first priority from commercial computer installations. Some commentators would contend that this ‘bottom-line’ attitude has never quite left New Zealand management and its use of computing power. As one analyst of long standing summed it up, ‘Creditors, debtors, general ledger and those classic accounting jobs made up most of the early workload; with insurance companies it was premiums — all the same job really. Trying to get people to pay the bills.’
The accounting emphasis came partly from need and partly because ‘there were already people around who knew how to do it’. Such applications could essentially be run in the same mode as the earlier ‘unit record’ equipment — the tabulators: put the master record, balance forward and payment details in one end, and a new balance forward came out the other. And, as to whether the ‘bottom-line’ benefits actually materialised, the picture is doubtful and, in retrospect, muddy. Few definite statistics are available to indicate whether those early computers ever made a sufficient return on investment to justify their purchase. It is safe to say a good few did not and became cursed — unfairly perhaps — as millstones around the neck of the organisation. More than one company found itself complaining that it had to raise the prices of its goods to pay for the escalating costs of the computer that had been bought to reduce costs.
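The ‘balance forward’ run described above is, at heart, a simple batch calculation. The sketch below is a minimal modern illustration of the idea only; the record layout, field names and use of Python are assumptions for clarity, not a depiction of any actual unit-record or early disc system of the period.

```python
# Illustrative sketch of a 'balance forward' batch run.
# Record layout and field names are assumed for illustration only;
# the real systems of the day worked from punched cards, tape or drum files.

from dataclasses import dataclass


@dataclass
class MasterRecord:
    account: str
    balance_forward: float   # balance carried over from the previous run


@dataclass
class Transaction:
    account: str
    amount: float             # positive = new charge, negative = payment received


def run_batch(masters: list[MasterRecord],
              transactions: list[Transaction]) -> list[MasterRecord]:
    """One pass: old balance forward plus this period's movements
    becomes the new balance forward for the next run."""
    movements: dict[str, float] = {}
    for t in transactions:
        movements[t.account] = movements.get(t.account, 0.0) + t.amount
    return [
        MasterRecord(m.account, m.balance_forward + movements.get(m.account, 0.0))
        for m in masters
    ]


if __name__ == "__main__":
    masters = [MasterRecord("A001", 120.00), MasterRecord("A002", 0.00)]
    txns = [Transaction("A001", -50.00), Transaction("A002", 75.00)]
    for record in run_batch(masters, txns):
        print(record.account, f"{record.balance_forward:.2f}")
```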
The computers of the 1960s were dedicated machines, most of them running a single application at a time, scheduled externally by paper instructions. Even when the sophistication of operating systems and multiple concurrent streams of work arrived, the aim was still to speed up the manual processes. Vast paper files became vast disc files and vast computer-generated reports, and disciplines of ‘back-up’ and security were primitive at best. Too often, the valuable information ‘resource’ of the organisation could be lost or damaged beyond repair. Then it was a matter of going back to the paper files and painstakingly regenerating everything.
There were inevitably stories of the failure of entire computer projects and of companies which were badly ‘burned’ by the whole experience. It is hardly surprising, in such a small market, that many of the disappointed users turned back to computer bureaux run by supposed experts, and that the bureaux thrived. A high proportion of the early 1970s ‘large’ computer installations belonged to businesses subsisting entirely on running data processing for other companies.
At the same time, there seems to have been a hesitancy at the very beginning to admit failure openly. ‘Not that many actually backed out’, one consultant contended. ‘A lot of inefficient systems were kept going.’
Computing in the ’60s and early ’70s was a job requiring dedication and a reasonable knowledge of the intricacies of the particular machine. The turn of that decade had seen the maturing of ‘standard’ programming languages; but a standard was never exact, and if, by detailed knowledge of the machine or by bending the standard, you could make a program run faster, that was all to the good.
In the earliest days, there was a moderate intake of recruit programmers from accounting and other conventional office disciplines: people who had ‘been round the traps’, as one early user put it, and knew the business of the organisation. It was, however, not long before these yielded to the beginnings of the ‘technocrats’, who were becoming visible even in the late ’60s and are usually thought of as boffins of some kind. ‘Most of the new people had an electrical bent or they were the sort of people who took motorbikes to bits over the weekend.’ There also appeared to be a predilection for mathematics graduates. This view is perhaps an oversimplification, and many early computer recruits came from totally unexpected backgrounds; some of them continue as influential figures in New Zealand computing to this day.
Since government agencies were the first in the field, it is not surprising that much of the original selection and training of computing staff was done by the government, and there was a steady flow of staff from the public to the private sector. It has been claimed by some that central government effectively became, and remained for more than a decade, the trainer for most of the industry in New Zealand: ‘The wastage of staff from government, particularly Treasury, was terrible,’ said one government pioneer. It was only after persuasion from the government that the technical colleges started practical courses in computing to supply some of the missing expertise.
A good deal of reliance was placed on ‘aptitude tests’. Although they are not as extensively used today there is, in retrospect, little doubt that they fulfilled a need at that time. They even had their strong advocates. ‘When we did trust to our instincts and take on someone who couldn’t do the test, it was usually the test that proved right in the end,’ said one early manager.
The real problem came in the interface with the user. ‘Because the programmer knew about computing, he — or she, there were a few women in those days — was expected to know all about accountancy and the way the business was run as well.’ The natural result of such assumptions was poor specification and poor design of systems. ‘The end users were unable to describe what they wanted, and expected the computer people to fill in the gaps.’ In the mid-’70s, industry observers could still complain, with some justification, that ‘most DP personnel have tended to neglect general management principles … and, the other side of the coin, general managers normally are not as familiar with systems design principles and computing techniques as is desirable, given the relative level of expenditure on computing’.
One other major source of computing expertise in the early days was the computer companies themselves. ‘A lot of the early DP managers at IBM 360 installations were former IBM sales executives,’ recalled one of those early users. ‘If you bought a 360, you bought a man with it.’ IBM was then, and has remained, the dominant force in the market, with ICT, later ICL, holding second place well into the ’70s. Burroughs and NCR also expanded quickly, capitalising on their existing presence in the accounting machine market.
When applications became less basic and were required to meet the wider individual needs of the organisation, program development emerged as a painstaking, ‘hand-crafted’ business that often appeared to be an end in itself. The design of complex individual systems was lovingly recounted in the computer press of the day and at conferences. The idea of designing and developing one’s own database management system, as Winstones did in the early ’70s, would be unthinkable today, but not so then. The users of those days seemed to look to the computer supplier chiefly for hardware power, and the suppliers were happy to go along with this viewpoint, considering themselves to be in business to sell hardware. ‘Hopefully, the supplier will come to discuss not only his equipment but also his programs,’ said one optimist, somewhat ahead of his time.
Software packages took a long time to emerge into full business acceptance; the usual criticism of the ’70s packages was that they were too rigid to meet the different needs of individual organisations. The commercial system was ‘not a simple-minded aggregation of boxes acquired from the public market,’ said a contemporary IBM commentator. ‘It must match the using company as a suit must match the wearer.’ A tall order for those days (1974). Systems development expertise was still a rare commodity, and combined computing and business knowledge even rarer.
What was happening was that a new concept was beginning to pass from theory into practice; the idea of the computer or complex of computers as a management information system. The computer was to be seen no longer purely as a business processing workhorse for individual applications, but as ‘co-ordinating prompt control of information as an aid to better decision-making’. ‘Applications’ were not simply band-aids or poultices to be slapped on that part of the organisation that was hurting. Thought was being given to the long-term requirement for information, when it was needed, where it was needed and in the format in which management needed it.
Unfortunately, it was and still is a difficult task to be all things to all people — to anticipate all possible demands for information. The techniques for canvassing requirements and translating them into an appropriate universal ‘model’ of the necessary data were lacking, and many management information systems were fated to fall far short of expectations. Nevertheless, a start had been made, and there were two technical developments helping the process. Surprisingly, they at first seemed to offer two conflicting solutions.
The first of these was the move of the minicomputer into commercial areas. The original ‘minis’ were thought of primarily as machines for scientific and industrial applications, but once accepted in a commercial role, the minicomputer showed itself to be particularly appropriate to the smaller New Zealand scale of business. An early prediction that the growth rate in minicomputer use would reach 20 to 25 per cent a year, compared to 15 per cent for large computers, was amply justified. In a way, the profile of the minicomputer foreshadowed the later growth of the micro.
But it also had another effect. It brought the dream, if not at once the reality, of ‘distributed processing’ — taking specific tasks away from the central mainframe and into the individual department or district office. In this way, management much further down the chain could control the flow of information necessary to carry out their tasks, the essential ingredient of a management information system.
The second was the advent of on-line facilities. In its simplest form this merely replaced batch processing by giving operators direct access to the files; they could then update the records or abstract information ‘on-line’. In its management information mode, the communication lines were extended into remote locations, where they provided the same, or some would say superior, resources to those of the on-site mini. On-line computing came first in the guise of the ‘star’-shaped network, centred on one large processor. Indeed, it could be said that the terminals made the computer system more centralised than it had been. The need for immediate information, and the likelihood that one terminal operator would be looking at many sets of data, bred the idea of the centralised ‘database’ in place of separate ‘files’ for different jobs.
Individual proponents of centralised and distributed processing are still with us today, though it would probably be fair to say that the distinction is becoming somewhat blurred.
The 1980s brought many developments, but they will be remembered more than anything as the years of the microcomputer — the years when the small desktop machine grew from its hobby and educational role to take its place in serious business information handling. For the computer market worldwide this meant another massive shake-up, but for New Zealand it had its own significance. The computer now became relevant to a new and smaller scale of business, one much represented locally. As with most hardware developments, its growth here was retarded by a lack of suitable application software for local use, but more important was the lack of a suitable marketing base. The early micros did not come from the big companies with an established presence; for the most part they were sold by relatively small traders with agency arrangements. For small indirect vendors to become acquainted with their new product, so as to be able to advise prospective customers whether it would meet their needs and to provide after-sales support, proved a difficult and costly task. Not all succeeded.
A further handicap was the 40 per cent sales tax imposed on ‘office machinery’ by the Labour government in 1975, largely for fear of job erosion. In retrospect, this was an inexplicable decision, widely regarded as having prevented small businesses in New Zealand from taking advantage of benefits of computing power for several vital years.
There are arguments that the high price of small computer equipment had a beneficial effect — holding back decisions until there had been more opportunity for careful thought; and leading, perhaps, to the development of software which used the machine’s capabilities more efficiently than did software developed overseas. Because of the relative size of businesses and the cost distortion effect of the tax, it has been said that where a United States or European company used a mini or a mainframe computer the New Zealand company had to do the same job with a micro. We certainly seem to have been saved from the worst excesses of ‘personal computer fever’; a rash of unco-ordinated applications, using contradictory data, growing on a multitude of separate machines bought by the more ‘enterprising’ executives within the company.
These were the positive effects, but the negative impact of the now abolished tax will be felt for some years to come, in the shape of a long-term lag in development; longer consideration has not always meant better solutions and there are still false paths to go down, several years after overseas users have had the chance to learn from their experience.
The big impetus for business micro development came when the tried and trusted names began moving into this part of the world. IBM’s Personal Computer met an enormous pent-up demand when it came to these shores in early 1983, 18 months after its release in the United States. Against the expectations of some sceptics, it rapidly established a leading position in the market here and world-wide, and begot its enthusiastic imitators. The IBM PC led to a flood of software from local and overseas developers to work with the new machine, and, in turn, the flood of IBM PC software begot IBM PC-like machines designed to accommodate that software. The microcomputer customer base, IBM and non-IBM, grew steadily. What was more astonishing was the growth in local agencies for micros. Almost every significant machine in the world was represented on the small New Zealand market within a few years. There were repeated predictions of ‘shakeouts’ in the industry. With only three million of us, the sceptics said, sooner or later everyone would have a micro, and with so many vendors operating in the market, some would have to founder. But the predictions proved unduly pessimistic. More agencies rapidly came in to replace those few which did expire, and to pick up on an ever-expanding range of products. The biggest casualty came later with the demise of not one, but two successive local enterprises named Access Data. First Access Data Ltd, then its successor Access Data Corporation, distributors of the Altos multi-user micro, went into receivership in 1983. There were various theories to account for the collapse. Access Data had been one of the more visible presences on the local micro market, and its failure sent shock waves through a vulnerable user community which had always had to ask itself, ‘will my local microcomputer distributor still be here to support me next year?’
While the microcomputer grabbed the limelight, significant changes were also taking place in the use and acceptance of the larger computers. The IBM imitators began eroding some of the dominant presence of IBM and the other big United States names. National Semiconductor, Fujitsu and Amdahl all found footholds in this country. Eating away at the bottom of the mainframe market was the ever-expanding mini, with new entrants like Prime Computing showing sudden major success in the local and Australian market. With minis pushing into mainframe territory and the powerful micros taking over the ‘mini’ slots in the market, traditional boundaries between the three categories became blurred.
As the local computer users matured, they began in any case to think in terms of a connected ‘system’, a co-ordinated ‘solution’ to business information management, using whatever individual pieces of hardware and software were most appropriate. Increasingly, users were encouraged to make an exhaustive analysis of the information required by all parts of the company before buying. Standard packages and equivalent hardware and software offerings from many sources, and the broad range on offer from a single supplier made this approach easier. But the old ‘hardware first’ technical approach has proved a habit difficult to kick.
The spreading of the ‘information system’ tentacles outside the computer room, both by means of on-line systems and the take-on of micros by non-DP-trained executives, together with the ‘systems’ view of a company’s total information requirements, led to consideration of new ways of handling the flow of information. There were those who questioned how effectively the policies of management made themselves felt through the intermediate filter of the data processing department’s attitude to information processing, and the role of the DP manager himself came under scrutiny. But it was soon clear that there were some hard decisions to be made on the integrity of the company-wide information base once everyone was allowed to dip into it, and that control was still necessary. Despite some adventurous proclaiming of the ‘information democracy’, the DP manager stayed on to fill this need.
There were also other problems arising at this time. New users meant a new software development load at a time when software development was already seriously behind schedule in most organisations. Local software ingenuity came up with part of the solution, with practical applications of two separate approaches that were being considered both here and overseas. One was to improve the productivity of programmers by providing them with better programming facilities; in this, Progeni played a leading part with its PROGENI TOOLS. The other was LINC, one of the early ‘fourth generation’ systems development languages. Such languages aimed at allowing the problem to be specified in pure information and business terms and converted automatically into the technical vocabulary of the computer system. Many such ‘4GLs’ were springing up all over the world, but LINC, supported by Burroughs Ltd, on whose machines it was designed to run, received worldwide acceptance.
While there were suggestions that fourth-generation languages would do away with the need for specially trained programmers, this proved not to be the case, although many successful applications have been prepared with little or no assistance from data processing staff. The chief merit of fourth-generation languages is not that management can do all its own systems development and programming, but that the vocabulary is at least understandable to users, who can draw up their specifications in a way that considerably reduces the load on the designer and reduces the risk of misunderstanding. Such systems are also relatively easy to alter. A technical systems developer can show the potential user a quickly-developed system, and the latter will have sufficient understanding of it to point out any deficiencies and suggest improvements, a technique known as prototyping.
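To make the fourth-generation idea concrete, the following sketch shows the flavour of stating a report in business terms and leaving the procedural detail to a generic engine. It is a hypothetical illustration only, written in modern Python; the specification format, field names and sample data are assumptions for clarity and bear no relation to LINC or any other actual 4GL syntax.

```python
# Hypothetical sketch of the 'fourth generation' idea: the user states WHAT
# is wanted in business vocabulary and a generic engine supplies the HOW
# (loops, sorting, layout). All names and data here are invented.

customers = [
    {"name": "Smith Ltd", "region": "Wellington", "owing": 430.00},
    {"name": "Brown & Co", "region": "Auckland", "owing": 0.00},
    {"name": "Jones Bros", "region": "Wellington", "owing": 1250.00},
]

# The 'specification': selection, ordering and columns in business terms,
# with no file handling or control flow written by the user.
report_spec = {
    "select": lambda c: c["owing"] > 0,    # overdue accounts only
    "sort_by": "owing",                    # largest debt first
    "columns": ["name", "region", "owing"],
}


def run_report(data, spec):
    """Generic engine: applies the selection, sorting and column layout
    described by the specification."""
    rows = [r for r in data if spec["select"](r)]
    rows.sort(key=lambda r: r[spec["sort_by"]], reverse=True)
    for row in rows:
        print("  ".join(str(row[col]) for col in spec["columns"]))


run_report(customers, report_spec)
```

Changing the report is then a matter of editing the specification rather than rewriting procedural code, which is the property that made prototyping with such languages practical.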
A more general solution to the information gap between user and systems developer was sought in the provision of simpler methods of information retrieval, manipulation and reporting. So arose the ‘information centre’ concept. Promoted initially by IBM, it was clearly a non-machine-specific answer. It advocated the provision of small software packages for the standard tasks of information retrieval, analysis and presentation, teaching the non-technically-minded users how to use these facilities, and then leaving them substantially to their own devices to handle their personal needs, with the occasional helping hand from professional DP staff. Several large firms adopted the idea, with mixed success. Sooner or later, in all but the simpler cases, it was found that the DP staff were needed to take an active role, rather than merely offering discreet help. Nevertheless the concept has considerable merit, requiring, as it does, the active involvement of users in the development process.
An unusual feature of New Zealand’s use of computing is the relative lack of direct application in the primary and manufacturing industries. But perhaps this is not surprising. Kenneth Owen in a paper on Computers in Industry prepared for the International Federation for Information Processing had this to say:
Information processing for those whose job is to process information can be a neat and tidy affair; but for those whose job is to make things, or to run industrial processes, information processing is a more difficult and untidy matter.
Successful industrial computing systems can transform the business performance of manufacturing and process companies. These benefits have been demonstrated in large companies and are becoming progressively more relevant to smaller companies also as the cost of computing continues to fall. Computer aided design and computer aided manufacture have become familiar and fashionable as CAD and CAM, but there is more to industrial computing than CAD and CAM — and indeed, there is more to CAD and CAM than is represented by today’s state of the art in mechanical engineering.
There has been the odd outstanding effort in flexible automation and control of the manufacturing process — Fisher & Paykel must rate a mention here — but statistics accumulated by the DSIR show that even the use of computer-driven numerical control machine tools has been slow to take off. Robots are virtually absent. Effective use of the information-control aspect of manufacturing — material requirements planning, work scheduling and monitoring — is still not widespread.
‘Office Automation’ has been the other watchword of the early ’80s. From the initial perspective of automating the work of the typist and secretary with a word processor, the rise of networking capability and the desktop computer has brought new meaning to the term. Office automation now includes, theoretically at least, its use by middle and upper management to draw information from the company database, formulate reports and generally assist in the decision-making process. But successful use of the new potential has been patchy. There are still very few companies where electronic mail whizzes back and forth over the local office network, and the use of the desktop micro seems again to have stagnated at the word processing level, with the possible exception of the ‘spreadsheet’ financial modelling techniques now becoming popular with financial managers.
A more recent arrival among the ways of presenting electronic information is videotex, which has grown to prominence on the local market. Originally devised in the United Kingdom as a way of providing information in the home, using the domestic television set as a terminal, it has matured into a simple communication and presentation technique for general business use. The user calls up information by page number or ‘menu’ selection, often on a purpose-built terminal, and can send messages back in a similar fashion. It took interminable bureaucratic discussion to bring the concept to fruition in New Zealand. Who would control the information — the Post Office or private industry? Which of several competing communication procedures would be used, and what effect would it have on the load on Post Office communications lines? But over the past two years, videotex has begun to be taken up with enthusiasm by operators of channels for all-purpose information, and particularly by suppliers of financial information — brokers, finance houses, the Bank of New Zealand and Databank.
The supply of business information through more conventional terminal networks also took off in the 1980s, on the basis of information services originally developed for in-house use by large companies, and through the Post Office’s Oasis link into the United States databases of commercial and technical information. Prominent observers of the New Zealand business scene, however, still lament the lack of use of such information channels which should, they say, be more widely called upon for product and market information vital to our international trading efforts.
Looking back, it would be safe to say that the computer industry in New Zealand has come through a long period of evolution as a fragmented entity: many small ventures, a few large, and a sometimes uncomfortable mix of locally-owned industry and the subsidiaries and hangers-on of multinationals. Some notables have emerged: Progeni, Microprocessor Developments Ltd and David Reid with their efforts to build local computers, and many more on the software and consultancy side, including early-comers such as Progeni (which started life as SPL), Datacom (formerly CBL) and Computer Consultants Ltd. In the wake of LINC’s success in particular, the potential of locally produced software as an export commodity has been widely recognised.
The computer industry had hoped for recognition and encouragement from the Industries Development Commission’s review of local electronics. But the final version of the Electronics Industry Plan rather watered down the original intention to move from consumer to professional electronics, and the computing services industry expressed particular disappointment at the lack of recognition of its efforts or consideration of its potential. In spite of the lack of official encouragement there has been steady progress. NZI’s Paxus Information Services Division has grown from the bureau beginnings of IDAPS (NZ) to a multi-faceted Australia/New Zealand operation, and has absorbed many smaller but highly regarded companies along the way, including the Australian Hartley computer hardware manufacturing operation. The Andas Group has risen from Armstrong & Springhall’s office equipment operation to embark on a similar path, both under its own name and through the associated Powercorp complex of companies.
The rapid development of the use of computing in the late ’70s and early ’80s led to increased concern over attendant social problems, from the privacy of personal data, to the effect of electronic information and electronic ‘labour’ on the structure and employment prospects of industry. The question of privacy was highlighted by the development and use of the Wanganui Law-enforcement System, and the collapse of the Creditmen Duns credit agency with the subsequent offering of credit information tapes for sale. But more heat has been generated than real light, and this country still lacks any broad-ranging legal privacy protection. Concern with the employment consequences of automation waxes and wanes. After some clear concern in the early years of this decade, the benefits of computer aid, particularly in clerical and office work, seem to have calmed some of the fears. But the misgivings are still undoubtedly there, and a further downturn in employment could bring them fiercely to the fore again.
These questions are examined more fully in the chapter on Social Implications. They are very important but the problems will be overcome. Computing in New Zealand, it can be safely said, enters its second quarter-century in every sense as an established part of the business world.