Chapter Three - The Technologies
Information systems are concerned with the interpretation, communication and presentation of data, rather than with the traditional data processing tasks such as alphanumeric sorting, calculation or simple printing. In other words, information systems add value to data.
The most important aspect of this added value is the ability to connect the microprocessors in desktop computers, communications equipment, specialised equipment like electronic point-of-sale terminals, the contemporary equivalents of the old mainframe computer, telephones, machine tools, warehouse systems, photocopiers, typewriters and printers, alarm systems, televisions and so on. In this way, the processing power that would once have been localised in a large central computer is liberated and distributed throughout an organization. This diffusion of intelligence has changed the way people use computers, the nature of computing, and the structure and operations of organizations. It represents a challenge for large retailers - who have introduced computer technology in a more-or-less unplanned fashion in the past - and an opportunity for smaller retailers.
With the price of very sophisticated technology continuing to drop, small retailers may have the means as well as the flexibility to seize new opportunities as they become apparent. Communications networks can provide a quick and efficient way of searching out and contacting new suppliers; multimedia displays may allow a small retailer to hold less stock; smart-cards may help to avoid bad debt; low-cost computer-aided design (CAD) tools can give new impetus to craft retailers like tailors, cabinet-makers or hair-dressers, not to mention kitchen and bathroom suppliers and other specialists whose trade may benefit from aids to visualisation.
Larger retailers may be capable of using their market power to challenge smaller and specialist stores, but the adoption of an information-based approach to retailing has changed the competitive environment, creating pressures from globally expanding competitors and from new types of retailer who combine technology and customer-focus.
The applications of IT in retailing can be grouped into three broad categories:
point-of-sale systems,
back-office systems, and
communications systems.
All of these are witnessing fast-moving developments, with the main areas of activity currently being EFT and smart-cards, database marketing, and EDI and high speed communications technologies. But technological progress is so wide-ranging and rapid that it is difficult to distinguish relevant from irrelevant developments, and significant from insignificant products. Much of the bad investment in IT is due to a form of information overload - ironically something that IT itself should long ago have helped to alleviate.
Bad investment in IT is probably commoner than most people think. We are regrettably all too gullible when it comes to the promise of technology bringing the unalloyed benefits of progress, with the result that the very pace at which new technological products are announced or come to market seems to validate the claims of vendors, consultants, and software houses. Surely, when technology evolves this fast, the problem you can name today will have a solution tomorrow. But to make this judgement is to fall prey to a fundamental error: the problems most organizations confront most of the time are not themselves technological, while the products and services on offer from IT and telecommunications (IT&T) companies are. These products and services may indeed be solutions, as the sales pitch has it, but they are not as a rule solutions to the problems that really matter.
That said, the problems that really matter can often be eased, if not entirely eradicated, by the proper use of technology. The priority for any existing or would-be IT user must be to identify the business processes and corporate strategies appropriate to a technological approach. To do so, it is important to understand the potential of IT and the direction in which the technology is being driven.
At the heart of all IT development in recent years has been the idea of distributing processing power. This has many different guises, and a confusing multiplicity of names whose currency depends on which aspect of distribution is being emphasised: client/server computing, networking, downsizing, computer-communications convergence, end-user computing, enterprise-wide computing and telematics are a few of the commonest.
Briefly, the spread of the microprocessor has stimulated the development of a model of information systems in which many computers co-exist within an organization, rather than the one large mainframe of earlier years. These computers are often smaller and cheaper than mainframes to install and run (downsizing emphasises cost) and are often programmed and controlled to a large extent by the people who use them (end-user computing emphasises the way small computer users often run their own software and configure their own systems).
Within a single department or building, the computers are often linked together (networking) in such a way that some of them, or the programmes running on them, provide services like printing, database management, or network control to other computers or programmes (client/server computing). Departmental or local area networks (LANs) are often linked together across many departments or buildings in what is called a wide area network (WAN) facilitating an integrated approach to information systems within a whole organization (enterprise-wide computing). WANs have always made use of telephone lines, but now that telephone systems are themselves increasingly controlled by computer technology, the distinction between a network of telephones and a network of computers is beginning to fade. In time, we will learn to understand that a telephone is just a computer for processing speech signals (computer-communications convergence, or telematics). The flow of information, rather than the processing of data or the transmission of messages, is now the paradigm of computing and communications, to the extent that all the major nations and trading blocs of the world have declared their commitment to a policy of developing data or information superhighways as (to quote EC President Jacques Delors) 'the true arteries of the future economy'.
Within retail, these developments have had an impact in key areas. For example, in-store systems are increasingly based on networked EPoS terminals which can be linked using a LAN to a back-office computer maintaining price data and information about stock availability. Handheld bar-code scanners can also be networked to the back office system to input information about shelf levels, and the back office system might be linked using a WAN to local suppliers, a distribution depot, or a head office system to support replenishment, ordering and management information systems.
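The price look-up traffic described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual system; the bar-codes, item names and prices are invented.

```python
# Minimal sketch of a networked EPoS price look-up: the back-office system
# holds the price file, and each checkout queries it by bar-code instead of
# storing prices locally. All codes and prices are invented for illustration.

BACK_OFFICE_PRICES = {
    "5000112345678": ("Tea, 80 bags", 1.49),
    "5000112399999": ("Biscuits, 300g", 0.89),
}

def scan_item(barcode):
    """What a networked checkout does when a bar-code is scanned."""
    try:
        description, price = BACK_OFFICE_PRICES[barcode]
    except KeyError:
        return None  # unknown item: the cashier keys it in manually
    return {"barcode": barcode, "description": description, "price": price}

# A price change made once in the back office reaches every checkout at the
# next scan -- which is the point of networking the terminals.
BACK_OFFICE_PRICES["5000112345678"] = ("Tea, 80 bags", 1.29)  # promotion
item = scan_item("5000112345678")
```

The design point is that prices live in one place: updating the back-office record updates what every terminal sees, with no per-till maintenance.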
Distributed processing in this context means that the right information, and the computing power to process it, can be delivered just where it is needed in the supply, demand or management chains. Suppliers receive sales and order information, checkouts are updated with details of the latest prices and promotions, corporate management can extract data about store performance, and marketing departments can track campaigns and customers.
Inevitably, there are problems connected with distributed processing - particularly relating to the precise nature and extent of distribution. For example, it is far easier to maintain a single customer database - which, say, a marketing department can use to generate mail-out labels to a targeted group of customers - than to allow marketing to keep its own customer database which must be kept consistent with the corporate customer database. Yet many retailers who embarked on the road to database or relationship marketing in the 1980s discovered that their systems couldn't handle the demands - capturing transaction data, producing regular sales reports and running so-called ad hoc queries from the marketing department meant that the system all but ground to a halt. Database queries are compute-intensive and can tie up machines for long periods. There is no simple technological solution to this problem: it makes sense to give marketing a customer database of its own (a parallel marketing database or PMD) and make sure that it is checked against the central database periodically, but this can be a tiresome administrative problem or a complex technological one. It may make just as much sense to invest in a computer to maintain a single database with enough power to support the projected traffic in transactions and queries. This is a relatively expensive approach, but for a company like Wal-Mart, which runs a computer known as an AT&T Teradata parallel processing database engine, it has proved effective.
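The PMD arrangement - marketing working on a copy that is periodically brought back in line with the central database - can be sketched as follows. This is a toy model under invented record layouts, not a description of any real product.

```python
# Sketch of the 'parallel marketing database' (PMD) approach: marketing
# queries a snapshot of the customer file, and the snapshot is refreshed
# against the central database periodically. Records are invented.
import copy

central_db = {
    101: {"name": "J. Smith", "postcode": "M1 1AA"},
    102: {"name": "A. Jones", "postcode": "LS2 9JT"},
}

# Marketing takes a snapshot so its compute-intensive queries do not tie
# up the central machine.
pmd = copy.deepcopy(central_db)

# Meanwhile the central database moves on...
central_db[101]["postcode"] = "M2 5BB"          # customer moved house
central_db[103] = {"name": "P. Patel", "postcode": "B1 2HE"}  # new customer

def reconcile(pmd, central):
    """Periodic refresh: bring the marketing copy back in line and
    report which existing records had drifted since the last refresh."""
    stale = [k for k in pmd if pmd[k] != central.get(k)]
    pmd.clear()
    pmd.update(copy.deepcopy(central))
    return stale

drifted = reconcile(pmd, central_db)
```

Between refreshes the copy is out of date - which is exactly the administrative burden the text describes, traded off against the load a single shared database would have to carry.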
Both approaches - one using a PMD machine networked into the corporate system and running a copy of a central database, and the other using a parallel processing database engine as a server for all the computers across a corporate network - may be described as 'distributed'. The lesson is that jargon can cover a multitude of sins and virtues. Just because your system is called the same thing as your neighbour's doesn't mean it does the same thing - and what a system does is immeasurably more important than what it is called.
Mainframes and Downsizing
The Wal-Mart example demonstrates how it is possible to integrate systems spread throughout an organization, so that store personnel, store managers, suppliers and corporate staff are linked by local area networks (LANs) at store and office level and by a wide area network (WAN) nationally. But it also demonstrates how processing power can be distributed around the system, so that users at every level have access to the right management information, and this is in many ways a more significant technological development.
At Wal-Mart stores, hard-wired EPoS terminals and handheld computers using wireless links send data on sales and stock on display, via satellite transmitter, to company headquarters in Bentonville, Arkansas, and - in the case of many fast-moving items - direct to manufacturers. The company maintains a 1.3 terabyte database of purchase information, roughly equivalent in size to 2 million average-length books. The ability to store, search and retrieve data in such quantities is itself a technical feat which new approaches to hardware and software have made cost-effective in recent years.
The Wal-Mart system uses a Teradata parallel processing database engine - a dedicated computer using a number of different processor units to split up the tasks involved in maintaining and searching a huge database. The Teradata machine was for some years unique in its market, because its market was so specialised as to remain small. Recently, a number of manufacturers have developed systems similar to Teradata's, many of them using off-the-shelf microprocessors to reduce costs. The very possibility of storing and processing vast amounts of data at relatively low cost has begun to encourage a whole new growth area in software technology called 'data-mining'. This is based on the idea that data stored in sufficient quantity will yield a rich harvest of useful information if looked at in the right way. Wal-Mart has been a pioneer in this field.
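The data-mining idea - that raw transactions, stored in quantity, yield patterns when aggregated - can be illustrated in miniature. The stores, categories and transactions below are invented; real systems work over millions of records, but the principle is the same.

```python
# Toy illustration of data-mining: aggregate raw transaction records to
# surface local purchasing preferences. All data is invented.
from collections import Counter

transactions = [
    {"store": "Leeds", "category": "garden"},
    {"store": "Leeds", "category": "garden"},
    {"store": "Leeds", "category": "shirts"},
    {"store": "York",  "category": "shirts"},
    {"store": "York",  "category": "shirts"},
]

def local_preferences(transactions, store):
    """Rank product categories by purchase frequency for one store."""
    counts = Counter(t["category"] for t in transactions if t["store"] == store)
    return [category for category, _ in counts.most_common()]

leeds_ranking = local_preferences(transactions, "Leeds")
```

A store manager looking at such a ranking sees information ('garden lines sell well here') where the raw log held only data - the shift the chapter describes.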
About 1,000 of its suppliers have full on-line access to the information on the Wal-Mart database concerning their products, while some 1,800 workstations at headquarters use the data to track sales trends across the country and all 1,600 store managers are supplied with information on local purchaser preferences analysed in terms of over 2,000 'customer traits', such as their propensity to buy garden equipment or patterned shirts.
IT vendors talk a lot about distributed systems, or distributed computing, by which - broadly speaking - they mean anything that isn't a conventional mainframe arrangement. Until comparatively recently, the sort of information available to Wal-Mart's suppliers, store managers, or headquarters analysts would have come from a centralised mainframe-based datacentre. Anyone requiring information would have submitted a written request. The mainframe staff would have prepared the appropriate program and, when there was some free time on the system, they would have run the program. The information would have been printed out and delivered to the original requester by internal mail.
The datacentre-based mainframe operation is generally known as batch processing, since programs are run and data entered in batches, at the most convenient time for the people who operate the mainframe. The alternative to batch processing in a conventional mainframe environment is on-line transaction processing (OLTP), in which programs are run continuously and data entered as and when it becomes available to the system's users. OLTP is the conventional model for sales and order processing and, of course, for EPoS systems: data, such as item price and availability, may be retrieved from a central mainframe database, a transaction registered and the database updated. Probably the most familiar variants of OLTP systems are to be found in banks and travel agents. But OLTP demands the use of structured data and strictly enforced transaction types. You can't use an OLTP system to ask speculative questions or to analyse trends - the very type of operation so typical of an information or knowledge-based operation.
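The batch/OLTP distinction can be made concrete with a toy stock file. The item names and quantities are invented; the point is only the difference in when updates are applied.

```python
# Sketch of batch processing vs on-line transaction processing (OLTP).
# In batch processing, transactions queue up and are applied together at a
# convenient time; in OLTP each transaction updates the live database as it
# arrives. All data is invented for illustration.

stock = {"widget": 100}

# --- Batch style: collect transactions, apply them in one run ---
batch_queue = [("widget", -3), ("widget", -5), ("widget", +20)]

def run_batch(stock, queue):
    """Apply every queued transaction, then empty the queue."""
    for item, delta in queue:
        stock[item] += delta
    queue.clear()

# Until the batch runs, the stock figure is out of date.
run_batch(stock, batch_queue)   # stock now reflects all three movements

# --- OLTP style: each transaction is applied the moment it occurs ---
def oltp_transaction(stock, item, delta):
    stock[item] += delta
    return stock[item]          # the caller sees the live figure at once

level = oltp_transaction(stock, "widget", -2)
```

Note what neither style gives you: a way to ask a speculative question such as 'which lines are trending down?' - that requires the analytical, interactive working the chapter goes on to associate with PCs.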
The tendency for businesses to require information rather than mere data has highlighted the deficiencies of conventional mainframe operations. In contrast to the million-dollar mainframe, thousand-dollar PCs have been analysing trends, modelling operations and producing answers to what-if questions almost since they were first invented in the early 1970s. These capabilities of the PC have given rise to a whole family of ideas which now inform management thinking. The PC has encouraged widespread acceptance of a belief that computers enable a migration from accountancy and administrative functions to management information and decision support. The key aspect of PC technology is its interactivity - the ability of a user to control the system in real-time and to select, change and process data on-screen. Interactivity is the important and unifying characteristic of word-processing, spreadsheet modelling, desk-top publishing, computer-aided design and almost everything else people use PCs for. It is not a characteristic of mainframes, which have been designed to be fast, efficient and reliable processors of large volumes of data.
There has usually been a trade-off between the flexibility of distributed systems and the power and reliability of mainframes. Increasingly, new small systems approach mainframe speed and reliability, and even surpass them, so that getting rid of your mainframe - downsizing - may become a realistic option (particularly for smaller companies). Downsizing can save money through the use of cheaper hardware and software, and through the replacement of expensive development and maintenance staff by packaged software and support or maintenance supplied directly by system suppliers or third-party specialists.
That said, the distributed systems approach should also allow companies to expand their information systems by adding more user workstations, or bringing in bigger server computers to run the network or manage a database. The ability to grow or shrink systems is called scalability, and distributed systems are generally more scalable than mainframes. It is a pleasing irony that this scalability allows a mainframe to be connected to a networked distributed system if its special characteristics are required for some function, such as managing a large database or processing payroll records. The idea that systems can be extended as well as shrunk has given rise to the term 'rightsizing'.
Burlington Coat Factory:
Downsizing systems to meet the challenge of growth
Burlington Coat Factory is a well-established clothes retailer based in New Jersey, USA. During the 1980s, the company grew rapidly, ending the 1970s with around 40 stores and a turnover of about $30 million and starting the 1990s with four times as many stores and an impressive 40 times the turnover. This growth created problems for the company's information systems. In particular, an old Honeywell-Bull mainframe couldn't handle the increased volumes and a requirement to communicate with each of the stores. The company bought merchandise at its New Jersey headquarters and shipped required items to its retail outlets. The outlets themselves had adopted their own systems to cope with in-store administration and record keeping, but the result was inefficient and chaotic.
'It was a reasonable system for a small company,' Burlington's MIS director, Michael Prince, told an interviewer, 'but when we got big it was nowhere to be.'
It was largely Prince's vision which saw the company begin to develop a new corporate information strategy based on the open-systems philosophy. An open system - unlike most proprietary systems - allows equipment to be added on relatively easily and the computers at the heart of the system to be upgraded and increased in power with minimal difficulty (in the jargon, open systems offer connectivity and scalability). In 1985, Burlington set up a new central distribution centre to serve new stores. This involved changing many of the company's established business processes some years before business process redesign became a fashionable term. Having looked at a range of computer systems to support these new processes - including proprietary IBM mainframes and powerful minicomputers from Digital Equipment Corporation - Burlington decided that a computer known as a parallel processor from a young start-up called Sequent should be at the heart of the new system. The Sequent Symmetry ran non-proprietary software (including the current industry standard Oracle database management system) and used open standards. Prince began to develop a system to take on some of the tasks required by the whole Burlington operation; this was essentially a network connecting the Sequent to something like 250 new in-store workstations. He believed he was taking a gamble, although subsequently many larger retailers have followed suit.
By the late eighties Burlington was running its financial and distribution systems on the Sequent, which cleared up many of the problems surrounding the delivery of merchandise to the stores. The mainframe still looked after purchase orders and sales order entry, among other functions. These were centralised operations and so seemed to fit better with the mainframe. But Prince had become convinced that the mainframe could be eliminated - and not only could, but should. The job of moving all the data and applications from the mainframe was long and complicated. It took some two years in all, but in early 1993, the company was able to shut down its mainframe.
Today, Burlington has six Sequents - all on the same network as the in-store workstations. The system allows staff at the Burlington stores to enter sales and replenishment data which can be passed direct to the company's distribution centres and to corporate headquarters where powerful software allows the data to be analysed for sales volumes, profitability measures and trends. Ironically, this allows management the sort of control it never could have hoped for in the days of a centralised mainframe. Now, Burlington's managers have information to work with, whereas before they merely had data (and often they could not even get that). Undoubtedly more important is the fact that an open system can grow (or shrink) with the company - always allowing it to have the most appropriate and cost-effective technology for its requirements. This is known as 'rightsizing' rather than the more familiar 'downsizing'. Burlington's mainframe experience demonstrated just how easy it is for growth to overtake the capacity of some information systems to deliver benefits. In general, the affordable approach to rightsizing offered by open systems will be welcome to a dynamic organization likely to change in size or shape.
Estimates of the savings made by companies moving from large centralised computer systems to distributed networks suggest that Burlington has saved around half of the capital cost of a comparable mainframe system by adopting its solution. There will be significant savings on software, too, which is traditionally overpriced in the mainframe arena. On the other hand, staff costs may be higher - especially considering Burlington's new dependence on relatively rare skills like network management. Prince employs as many IS staff as he would if Burlington were still mainframe-based, if not more. In the longer term, there are undoubted savings on maintenance and on costs hidden or accrued due to inadequate or badly-timed information. The key benefit, however, remains flexibility.
Migrating to open systems is a hard job. 'This is not rocket science,' Prince has said, 'this is slogging through.' But for a medium-sized operation like Burlington with a need to keep flexible and close to its outlets, migration is the first option.
Types of Distribution and Client/Server Computing
The concept of distributed processing dates back to before the PC, but it took on a completely new accent as PCs began to spread throughout organizations. Originally, distributed processing referred to the way a single task could be divided up between different processors. It was an analytical concept. With PCs, LANs, departmental minicomputers and corporate mainframes all coexisting within a single organization, distributed processing began to refer to the idea that you should somehow be able to integrate all these different computing engines into one big system. This was about synthesis.
The logic was simple - for an enterprise to make the best use of its computing facilities, it was important to link up all the various islands of technology. The only question was how to do it. There are, in fact, four generally agreed models of distributed processing which can be applied to enterprises in which PCs and mainframes co-exist. All of them depend on the idea that different computers or different types of computer are suited to different tasks. This idea also underpins the popular notion of client/server computing, in which one computer, software program, or routine calls on the specialised services of another computer, software program, or routine. There is considerable confusion between the two terms 'distributed processing' and 'client/server computing', but to all intents and purposes they mean the same thing.
The four types of distribution are:
1) Distributed presentation. This means that one computer or group of computers stores and manages data and runs the applications programs which use it, while another group acts as terminals but with powerful interfaces or so-called presentation managers to make the data and applications easy to use and understand.
2) Remote data management. This is currently the most widely-used form of distributed or client/server computing, and describes the Wal-Mart system. Here databases are kept on one specialised computer with end-users able to call up data as they require it for use in their own application programs on their own computers with their own user interfaces.
3) Distributed data or database management. This is like remote data management, but the data can be kept on more than one computer. It is difficult to coordinate the data in such a system, and doing so may not be cost-effective. In practice, users tend to keep some data on their own computers (for example, address books or particular spreadsheet models). As long as this data is exclusively local to them no problem arises. The trouble starts when data can be shared by more than one user.
4) Distributed process (or distributed logic). This form of distribution allows applications programs themselves to be split between clients and servers. A typical example would be an advanced EPoS system in which the checkout computer processed a transaction locally and sent its results to a central computer for further processing. This is the most flexible form of distributed system and the one predicted to grow most in the next few years. (The US IT market researcher, the Gartner Group, predicts that spending on these applications will grow from 3% of total expenditure on distributed systems in 1993 to 25% in 1997.)
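The fourth model can be sketched in code. This is an invented toy, not a real EPoS product: the client side does its share of the processing (totalling the basket), and the server side receives only a summary for further processing.

```python
# Sketch of 'distributed logic': the checkout processes a transaction
# locally, then sends a compact summary to the central system. All names,
# items and prices are invented for illustration.

def checkout_process(items):
    """Client side: total the basket locally, at the till."""
    total = sum(price for _name, price in items)
    return {"item_count": len(items), "total": round(total, 2)}

class CentralSystem:
    """Server side: accumulates checkout summaries for management information."""
    def __init__(self):
        self.takings = 0.0
        self.transactions = 0

    def receive(self, summary):
        self.takings += summary["total"]
        self.transactions += 1

head_office = CentralSystem()
basket = [("tea", 1.29), ("biscuits", 0.89)]
head_office.receive(checkout_process(basket))
```

Splitting the logic this way keeps the till responsive even if the central machine is busy, and cuts network traffic to one summary per sale rather than one message per keystroke - the flexibility the Gartner figures point to.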
The common feature of all these types of distribution is that the end-user systems - PCs, workstations, EPoS terminals or whatever - have intelligence built-in. That means they are capable of presenting and processing data independently of a central mainframe or database engine.
This does not mean that the mainframe is dead, as a number of wilder-eyed IT commentators are eager to argue. Wal-Mart, for example, has several of the largest models of IBM mainframe at its headquarters. What it does mean is that users no longer have to wait for the datacentre to process a request if they want information. They can call it up, process it and display it themselves on their desktop systems, PCs or EPoS terminals.
The benefits are clearly demonstrated in the Wal-Mart example. In the past, store managers and corporate decision-makers would have had to wait for days or weeks before they got the information they needed to make realistic decisions. It would have been impossible to supply information to manufacturers. Today, store managers and corporate decision-makers can make decisions on a day-to-day basis, learn the impact of these decisions within hours, cut inventory, focus their marketing efforts, and collaborate with manufacturers to supply the right products for customers.
In the longer term, the sort of flexibility and speed offered by IT enables a company like Wal-Mart to expand beyond its home base. Wal-Mart now has 19 regional distribution centres linked into its system and 32 geographically focused buyers who can tailor stock lines to account for differences in consumer preferences at regional levels. The ability to analyse store transactions using its vast database has set Wal-Mart well on the road to micro-managing - the ability to focus sales and marketing efforts on a single identifiable customer. Already, daily transactions can be analysed in terms of local weather conditions, in-store positioning of items, or regional tastes. And what can be analysed can be planned for.
The impossibility of simply transplanting a store from one locality to another has always been a major impediment to retail expansion. The other stumbling-block has been the difficulty of managing stores at a distance. IT changes all that. With processing power distributed around a retail organization it is now possible to manage a global network of outlets from a single headquarters and simultaneously to target each store s offering at its precise catchment area. In this way, economies of scale can co-exist with variable product mixes and selling strategies. The result is plain to see in the compelling growth of IT-aware retailers towards a global market.
EPoS - The Key Technology
The single most important technological development in post-war retailing has been the introduction of the intelligent or electronic cash register (ECR). Without it, there would be no electronic trading or EFT. It spawned the electronic point-of-sale (EPoS) terminal and created the potential for a complete and radical transformation of retailing.
The first ECRs were simply cash-registers with electronic, rather than electro-mechanical or simply mechanical, keys and displays. They were effectively calculators with a cash-box and, in themselves, had only a marginal impact on retail operations. But the technological developments which they presaged have turned out to be dramatic.
The key technologies which brought electronics to the cash register were the microprocessor and the low power light-emitting diode (LED) display. (LEDs were rapidly overtaken by the more versatile but similar liquid crystal display - LCD.) In the complex recent history of electronics, the development of the microprocessor and LED display was inextricably tied up with the design and production of the miniature electronic calculator - the first product to demonstrate the true commercial potential of the revolutionary integrated circuit or chip.
The story of the miniature electronic calculator dates back to 1965 and what was then still a small but pioneering electronics company, Texas Instruments, determined to find or create a market for its chips. TI applied for a patent on the calculator in 1968 and launched the world's first pocket device, the Pocketronic, in 1971. At that time, the calculator was still a crude instrument - it weighed 2.5 lbs, was based on a complicated four-chip design and used a thermal printing device as output - but the single-chip microprocessor and the LED display were under development and began to appear in small calculators within months of the Pocketronic's launch.
For obvious reasons, the cash register and the calculator or adding machine have always been complementary, and among the pioneers of the computer industry you will find manufacturers of cash registers in almost the same abundance as manufacturers of tabulators, sorters and calculators; they were all in the business of 'business machines'. That being the case, it is hardly surprising that the miniature calculator spawned a whole new generation of cash registers. The only surprise is that it was predominantly Japanese companies like Casio and NEC who saw the opportunity to adapt calculator technologies to cash registers. And it was Japanese companies who used their production and marketing skills to dominate the market for electronic cash registers in the 1970s.
No sooner had TI launched its Pocketronic than it and one other company - Intel - actually produced single-chip microprocessors, ostensibly for use in calculating equipment. Intel's first device, the 4004, was actually designed as a result of a 1969 commission by a Japanese company, Busicom, to produce a small calculator. It turned out that microprocessors were more useful than their designers had imagined. To do their job of adding, subtracting, multiplying and dividing, these devices necessarily incorporated all the features of a general purpose computer. It was inevitable that sooner or later somebody would connect a microprocessor to a keyboard, some memory and a display unit and make a working miniature computer.
Just who was the first to produce such a computer is a debatable matter, but once it had been done it was only a matter of time before what became known as a microcomputer found itself inside a cash register. The important feature of a computer is that it is a general purpose programmable machine. In other words, you can make essentially the same piece of hardware behave like a calculator, a cash register, a typewriter, or a lathe - depending only on the instructions you give it (the programme or software) and what you attach to its input and output. For computer manufacturers, the virtue of this flexibility is that they can sell essentially the same product in a range of different markets - invariably at a price comparable to that of so-called dedicated equipment. The buyers of this computer-based equipment benefit from the versatility of the equipment - a computer-based cash register can do a great deal more than any of its longer-established competitors, just as a word processor is more versatile than a typewriter.
During the 1980s, computer companies like IBM, Nixdorf and NCR introduced intelligent cash registers making use of their existing small computers, while specialist EPoS suppliers offered their own complete systems. Typically, they would buy computers and other component parts of the system, put them together and write specialist software. These are commonly known as turnkey systems, since the user has only to turn a metaphorical key and the system is working.
Simple electronic cash registers can represent a considerable advance on the mechanical variety, but the advance is quantitative. The electronic cash register is more accurate, faster, more secure and easier to use, but it remains just a cash register. On the other hand, a full-blown EPoS terminal is qualitatively superior. It allows price changes to be programmed in, it can be used with automatic input devices (such as bar-code readers), it can be programmed to analyse transactions, or communicate with other terminals or a central processor, it can print out tabulated or analysed receipts, it can monitor customer flow; in other words, it can become the focal point of information processing for a retail operation, providing essential tools for management in a competitive environment.
The electronic till began to give way to EPoS terminals based on modified personal computers in the mid-1980s. These would often include small TV-type visual displays or very visible character displays using bright LEDs or backlit LCD panels. One important reason for the popularity of EPoS technology in the 1980s - particularly within the cash-focused service sector (bars and fast food outlets, for example) - was its inherent ability to enforce anti-pilfering measures: notably, the accurate recording of all transactions, and the use of customer-readable displays. Another reason was the simplification of data entry, which meant that cashiers did not have to remember price lists, but could, for example, use a flat membrane keyboard overlaid with the names of particular stock lines.
Inevitably, the EPoS terminals in larger outlets were linked to centralised computers used for stock control and the generation of management information. EPoS data ripples back through the system, allowing management to gauge sales trends more accurately, to replenish stocks more efficiently, and to introduce targeted promotional strategies and flexible pricing. At the point-of-sale itself keyboard data entry has been augmented by new automated techniques - notably, bar code scanning - which should free the time of cashiers to perform other tasks. In some cases, this has led to a greater throughput of customers and increased productivity, but in many situations (particularly supermarkets) the weak link in the point-of-sale is now the check-out itself where many items may have to be individually scanned and packaged. Not surprisingly, this has led technologists to experiment with more versatile approaches to recording prices and store designers to consider the way technology may help to get rid of check-outs and cash desks altogether. The combination of electronic funds transfer (EFT) with virtual reality or interactive multimedia may transform the whole character of retailing.
EPoS terminals are increasingly used to capture customer information for use in promotions and database marketing. In many cases, this has relied on store cards which tie transactions to named customers. Some stores - like Marks & Spencer - offer their own credit cards; others - for example, the Boston office goods store, Staples - offer customers a free club membership card when they make their first purchase. In smaller stores (for example, drinks retailers like the UK's Victoria Wine or niche stores like Radio Shack) staff may be expected to identify and enrol customers to a club or mailing-list. In these cases, stores rely on checkout staff to identify the customers they want to target rather than use crude or impractically complex analytic techniques on the data. Grocery supermarkets have resisted the trend to use EPoS for the collection of targeted customer data, largely because it runs counter to the other goal of EPoS: to speed up checkout traffic. The spread of warehouse clubs like Costco, which have 'named members', may create some pressure on supermarkets to cultivate database marketing. Other developments may allow the introduction of unattended checkouts, combining sophisticated scanning devices with a regulated store card or a smart card for payment.
To date, the move from ECRs to EPoS terminals has been driven largely by technological developments themselves. The very possibility of using PC-type hardware and software to develop so-called turnkey retail applications has encouraged major computer companies like NCR (now known as AT&T Global Information Solutions), IBM, Olivetti, Siemens-Nixdorf and ICL to produce dedicated EPoS terminals, thus completing a circle begun when NCR was formed as National Cash Register. Smaller companies with a claim to specialisation may use off-the-shelf components to produce retail systems under their own names, typically contributing the software or interconnection and installation skills in their own right. The components of a standalone EPoS system - a central processor unit, display, printer, keypad, weighing scales and cash drawers - are widely available from a variety of sources, and the use of PCs has stimulated both the introduction of new data entry devices and the development of store-wide and company-wide networking.
The key to wider usage of retailing technology is the adoption of standards. The proponents of open systems argue that the universal adoption of internationally recognised technical standards in computing will stimulate the market, encouraging competition and increasing choice. In retail, this has certainly been the case, although the standards followed tend to be de facto rather than de jure. Traditionally, like any other large enterprises, large retailers would adopt mainframe-based computing to facilitate back-office functions like payroll, accounts, delivery scheduling, and stock control. The computers used proprietary (non-standard) systems and were limited in their support of data entry at remote terminals. In short, mainframes can support form-based data entry (for example, order entry), but not the complexities or speed of sales transactions at an averagely well-stocked supermarket. This is simply because mainframe terminals are dumb - all the processing takes place at the mainframe, which may have to handle tens or hundreds of transactions at any time. The answer is to put processing power in the terminal itself, which is precisely what EPoS systems do. An EPoS terminal will record a transaction involving tens of items without once having to refer to a central computer - all the details of product pricing and promotional offers such as 'Link & Save' are held on the EPoS terminal itself, and the terminal performs all the necessary calculations and communication tasks. Price data would be downloaded from, and sales data uploaded to, a central mainframe only when necessary and when there is capacity to communicate and process such data. For example, price lists may be downloaded from the mainframe once a day, or whenever a terminal is switched on, while sales figures are typically uploaded by polling all the EPoS terminals in a store or group of stores overnight to enable replenishment and reassortment orders to be put together most efficiently.
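The division of labour described above - pricing and calculation done locally at the terminal, with occasional price downloads and an overnight poll of sales data - can be sketched in outline. This is a minimal illustration, not any vendor's actual software; the class and method names are invented.

```python
from datetime import datetime

class EposTerminal:
    """Hypothetical sketch of an EPoS terminal that prices and records
    transactions locally, deferring all mainframe communication."""

    def __init__(self):
        self.price_list = {}   # product code -> unit price, downloaded daily
        self.sales_log = []    # transactions held until the overnight poll

    def download_prices(self, mainframe_prices):
        # Performed once a day, or whenever the terminal is switched on.
        self.price_list = dict(mainframe_prices)

    def ring_up(self, items):
        # Price every item from the local table - no central query needed.
        total = sum(self.price_list[code] * qty for code, qty in items)
        self.sales_log.append(
            {"time": datetime.now(), "items": items, "total": total})
        return total

    def poll(self):
        # Overnight poll: hand the day's sales to the centre and clear the log.
        uploaded, self.sales_log = self.sales_log, []
        return uploaded

terminal = EposTerminal()
terminal.download_prices({"A100": 1.20, "B205": 0.55})
total = terminal.ring_up([("A100", 2), ("B205", 3)])  # 2*1.20 + 3*0.55 = 4.05
batch = terminal.poll()                               # one transaction uploaded
```

A real terminal would of course also handle promotional offers, receipts and communications failures; the point of the sketch is only that no step between `download_prices` and `poll` touches the central computer.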
None of this would be possible without EPoS systems, and it would not be anywhere near as widespread as it has become without the adoption of open systems standards. These have underwritten the development of second generation EPoS, combining processing power with routine data communications. Within stores, this tends to mean networks in a star configuration in which all the EPoS terminals are linked to a central hub, typically using low-cost unshielded twisted pair (UTP) wiring, but in the future increasingly employing wireless links. The network will be based on either the Unix operating system or Microsoft's disk operating system (MS-DOS) plus networking software such as Novell NetWare. Between stores, wide area networks using leased telephone lines are the norm, but there are a number of increasingly sophisticated approaches to moving data from, say, an outlying store to corporate headquarters - for example, the use of very small aperture terminal (VSAT) satellite links, pioneered by Wal-Mart.
Networked retail applications are now offered by a number of suppliers and they will usually run on a variety of computing and communications hardware. Among the leading players in this field are AT&T Global Information Solutions, Siemens-Nixdorf, IBM, and ICL.
The Market for EPoS
Retailing has gone through a particularly difficult period in recent years, as global recession has restrained the level of consumer spending in almost every sector. Retail volumes have remained static or even fallen almost everywhere over the last three or four years and the growth of the EPoS market has correspondingly slowed. The impact of recession on the EPoS market has not been uniform. Food supermarkets have generally suffered less than most other types of retail outlet, and increased consumer price sensitivity has propelled expansion in the discount sector which, in turn, has stimulated heavy discounting among other retailers.
New networked EPoS systems have been seen by many of the largest retailers as strategic weapons in a competitive war. On one level, networked EPoS allows significant productivity gains to be made and enables retailers to make savings by reducing stock depth. But networked EPoS has also proved invaluable in stimulating demand through in-store promotions. The ability to create special money-saving offers tied to combination purchases or quantity discounts has been perhaps the most popular feature of new EPoS systems for retailers. The technology allows promotions to be implemented without delay so as to maximise sales opportunities. Stores can respond rapidly to local conditions, overstocking, and low demand. Particular product lines can be heavily promoted to generate sales. When every penny counts, the money spent on EPoS may be a valuable investment.
For many stores, EPoS has also opened up new business opportunities and enhanced the financial picture. The combination of EPoS with EFT (so-called EFTPoS ) now allows retailers faster access to consumer cash, particularly through the growing use of bank debit cards. Store cards increase customer loyalty, simplify database marketing, and allow retailers to offer a range of financial services to customers, from cash advances to car insurance.
Food supermarkets and hypermarkets have always been the biggest customers for EPoS terminals and related products, largely because of the sheer volume of transactions and the need to reduce check-out queuing times. Market research figures suggest that expansion will slow down in this sector over the next few years, with growth increasing in previously under-represented areas of retail - particularly, smaller food stores, and restaurants and services. According to market researchers Frost & Sullivan, supermarkets and hypermarkets will see their share of the market for EPoS systems decline from 37% to 32% by 1997, while small to medium sized retail chains will account for a correspondingly bigger share of the market as they move straight from ECRs to second generation EPoS [Table XXX].
ECRs currently represent something like a half of all transaction terminals worldwide, but estimates suggest that this proportion will decline to about a third by the end of the decade, reversing their position with respect to EPoS terminals. Internationally, the most significant growth is expected in newer markets. In Europe, Germany is the largest single user of EPoS terminals, while the US leads the world. The market for EPoS will undoubtedly be stimulated by the trend towards globalisation of retail: the terminals provide an essential source of management information for any international retailer, and it is hard to imagine how the increasingly international ambitions of companies like Wal-Mart, Toys R Us, Ahold and Marks & Spencer could be satisfied without the export of technology as well as store design and management strategies. Accordingly, the most substantial growth for EPoS is likely to be seen in those areas with the greatest potential for growth and the most activity by foreign retailers. Japan and East Asia in general display both characteristics, as do Spain, Italy and France. The current global market for EPoS terminals is worth approximately $3bn annually. This is expected to grow to $7bn by the end of the decade.
Developments in EPoS technology are likely to be driven by two factors: the demand by unit retailers and small independents for affordable systems with basic functionality, and the demand by the leading multiples for more sophisticated systems. In all areas connectivity issues will continue to play an important role, and the large retailers are likely to spend a greater proportion of their budgets on peripherals, such as bar code readers, communications devices, printers, magnetic card readers, and, ultimately, smart card readers. The in-store system of the future will undoubtedly feature EPoS systems able to communicate with intelligent shelving and storerooms, to print out marketing information, to transfer funds from smart card credits or, over a network, from the customer's bank account, and to use artificial intelligence techniques to offer individually tailored promotions. The complexity of such systems will ensure that installation, training, maintenance and support will represent an increasingly large proportion of a retailer's total expenditure on IT.
EPoS: Organizational Impact and Critical Success Factors
In some ways, the EPoS terminal is the perfect demonstration of how computer systems have spread to every level of an organization and how they affect almost every job. Checkout staff operating EPoS terminals now have at their immediate disposal as much computing power as there would have been in a mainframe computer 25 years ago. The data collected at a checkout can provide vital input into the information systems that assist store and company management. Store managers are able to monitor checkout traffic, stock levels and staff availability. They can obtain rapid and accurate feedback about special offers or how changes to store layout affect purchasing patterns. Corporate executives can extract information from EPoS data to help assess profitability and determine market positioning and strategic direction.
By combining EPoS-generated product data and customer data extracted from EFTPoS (electronic funds transfer at point-of-sale) systems, it is possible for a high volume retail outlet to determine far more accurately than ever before who is buying what, where and when. (EFTPoS here covers the use of credit, charge or debit cards, and automatic cheque printing, although strictly speaking only the use of debit cards or systems for direct account-to-account transfers should be described as EFTPoS.)
The existence of laws which protect the privacy of individuals and their financial data has encouraged many store chains to introduce their own credit cards, which facilitate customer profiling and market positioning. But even the most basic data from credit, charge and debit cards can provide useful information. For example, simply recording account numbers against transactions can give a reasonably accurate picture of individual or household purchasing patterns. EPoS also allows some labour intensive processes to be automated, freeing staff to spend time on the implementation of quality and customer care procedures, or to develop new skills as a prelude to diversification. On the other hand, computerisation can be used to speed up checkout operations and to reduce staffing levels.
In practice, most retail operations will do a little of both. There are undoubted and attractive immediate productivity gains to be made from the introduction of EPoS, but time and again the lesson of industry is that automation by itself is a short-term solution in a competitive market. Once everybody has EPoS, that particular route to competitive edge disappears. In the longer term, the abiding benefit of information technology is precisely that it enables a transformation of corporate structure and management style which will provide ongoing benefits in terms of organizational nimbleness and market awareness. The EPoS terminal may remain no more than a glorified cash register; it may become a critical element of a management information system (MIS); but it will only achieve its full potential when it becomes a communications and information resource for checkout staff.
Comparisons between checkout operations at supermarkets and other category multiples (for example, hardware and do-it-yourself stores) are instructive. DIY stores have found that customer service is a critical element of success in the market. This is hardly surprising, since DIY is a specialist interest and customers frequently require help in selecting purchases. Supermarkets have gone some way down this road in their marketing of speciality foods and wines, but while DIY store checkout staff are often expected to provide a level of specialist assistance to customers, supermarket management has identified checkout queuing as an operational problem which would be exacerbated by any such approach. The perceived requirements of supermarket and DIY store management are different and, consequently, so has been their approach to IT.
This can be expressed in terms of the way IT is used to reinforce the so-called critical success factors (CSFs) of retailing. Five CSFs are commonly identified:
1) Increasing sales revenues;
2) Increasing gross profits;
3) Controlling operating costs;
4) Increasing productivity of capital and labour; and
5) Adding value for the customer.
These are associated with a variety of key variables, such as stock availability, stock depth, sales per unit area, and the range of in-store facilities on offer. The practical and strategic issues involved in managing a retail operation are often reducible to decisions about ranking CSFs. Because CSFs are not always compatible, they must be prioritised. For example, providing creche or restaurant facilities may encourage customers to spend more time in a store or even visit it more frequently - in this way, it adds value and may even help to increase sales revenues, but it will inevitably decrease the return on capital and the average profit-contribution per employee.
For the supermarket seeking to increase customer throughput, transaction size per visit, and the contribution of checkout staff the EPoS terminal is predominantly a productivity tool. For a store chain primarily concerned with adding value for the customer, the terminal allows checkout staff to provide information and other services that would otherwise be impractical. In some cases, for example, the Ikea furniture stores, this has led to a separation of tasks even within the EPoS function. Essentially the same equipment linked to the same database plays different roles depending on its location - on the one hand, there are streamlined checkouts collecting payment for goods at the store exit; on the other, there are in-store terminals where stock-availability can be checked, and orders taken for immediate or future pick-up, or for delivery.
The differences in the way EPoS is used are obviously not simply a matter of whim or convention. Different goods are sold in different ways, and nobody measures up their kitchen to buy a box of breakfast cereal. Supermarket transactions involve mostly a high volume of low value items, while a furniture store may reasonably expect to sell fewer but costlier items per transaction. This distinction is enshrined in the separation between durables and consumables. As a rule, customers will spend more time and effort choosing a durable than they will selecting a basket of consumables with a comparable total value. They will expect help in buying a durable and will, in return, be prepared to wait for delivery.
In this context, it is clear that different CSFs will be accorded different priorities. But the point about retailing in the late twentieth century is that there are no certainties any more. Expansion and diversification takes retailers into new and unfamiliar territories and competitive pressures demand that they banish complacency. A significant proportion of Ikea s business, for example, comes from off-the-shelf and impulse purchases, just like some food stores. In the 1980s, Marks & Spencer successfully launched itself as a food retailer. And many supermarket chains have realised that there is nothing to stop them selling hardware and no iron law to forbid their customers seeking help in making certain purchases.
In the past, management style invariably reflected organizational structure, and organizational structure in retail tended to reflect category and positioning. All low-end clothes stores looked the same and were managed in the same way; all supermarkets were laid out and operated on more-or-less identical lines. The impact of IT has been to allow management to cut across categories at a time when rigid structures were in any case proving to be counter-productive. Many commentators on the business and management scene have observed the tendency for levels of management to decrease and for organization to become 'flatter'. Competition and recession have given flatter organizations an evolutionary advantage - they can adapt more easily. But IT has given them the opportunity.
All management is communication, and a flatter organization, after all, is simply one in which there are more channels of communication. IT makes communication easier, simply because it encourages the development of networks for data transfer, but there is no inevitable connection between the use of computers and networks and the flattening of organizational hierarchies. IT is simply a tool that can help to flatten an organization when it becomes necessary. To use it requires the sort of vision that has driven a company like Wal-Mart to achieve pre-eminence in its market by spending more than $500 million on IT in the second half of the 1980s, to develop a satellite network linking over 1,600 stores and 3,000 suppliers into an electronic replenishment system based on EPoS and associated technologies. It is famously and rightly pointed out that Wal-Mart has only three levels in its management hierarchy and that the computer has spread throughout all these levels.
Electronic Funds Transfer (EFT)
EFT, or EFTPoS, means the automation of authorisation procedures enabling payment for goods or services by means other than cash or cheque. In the context of retailing, this typically involves credit, debit or charge cards, although EFT technology has for some years been used by banks and large companies for credit transfers, payroll and similar transactions. EFT has not been adopted by retailers as rapidly as EPoS for a number of reasons. Firstly, the use of various forms of payment cards is not universally accepted. It may incur extra costs for retailers because of merchant fees charged by the credit or charge card companies, and the persistent problem of card theft complicates transactions even where it does not involve undesirable risks.
Retailers themselves often argue that they alone get no benefit from handling 'plastic'. Card issuers benefit from merchant fees, high rates of interest, or - in the case of banks - lower costs associated with clearance, while card holders benefit from greater convenience and - in the case of credit and charge cards - easy or even free credit. Retailers, on the other hand, only have costs to contend with: merchant fees, terminal equipment and communications links. Smaller retailers are increasingly charging customers a premium when they pay by credit or charge card.
The use of credit and charge cards is widespread in the US, but less popular in other countries. In Germany, Eurocheque cards are widely used with over 20 million in circulation. These can be used as cheque guarantee, cash dispenser or point-of-sale payment cards. Deutsche Bundespost has also promoted prepaid and charge cards for use with public telephones. Eurocheque cards are widely used in the Netherlands and Greece. In the UK, credit cards have been unpopular with retailers, largely because they are time-consuming and expensive, and despite plans to establish a national EFTPoS system, the most successful recent innovation has been the introduction of debit cards (Switch, Delta), already popular in Belgium (Bancontact, Mister Cash and Postomat) and in Portugal (Multibanco). Italy has only recently begun to adopt electronic payment systems - a fact which reflects the lack of cohesion of the Italian retail sector - while France has made determined efforts to develop so-called smart cards (see below).
The technology involved in transaction authorisation terminals is relatively simple. There are two separate processes involved: data capture and authorisation. Electronic data capture involves card recognition and the recording of account and transaction information. This may be transmitted on-line to a central system for authorisation, or the data can be stored and uploaded to the central system off-line at a later time. Authorisation involves determining whether the card is valid and if the proposed transaction is of an acceptable value. This is a conventional database query in which card and transaction details are checked against central records. Clearly, with low value transactions, the risk involved in off-line authorisation is minimised, and retailers must gauge whether the increase in trade attributable to their stores accepting 'plastic' is sufficient to offset the risks of loss from fraudulent transactions. Apart from the costs involved in verifying transactions, retailers are often unhappy with the time the process takes. In a fast throughput environment, such as a supermarket checkout, this may involve unacceptable delays. In fact, most stores impose floor limits below which a card does not require authorisation, only data capture, and the retailer is guaranteed payment by the card issuer. For higher value transactions, authorisation is mandatory either by voice communication or direct connection from the terminal.
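The floor-limit rule described above - capture every transaction, but refer to the card issuer only above a threshold - can be sketched as follows. The amounts, function names and the `authorise_online` callback are all hypothetical; a real terminal would also consult stopped-card lists and print receipts.

```python
FLOOR_LIMIT = 50.00  # below this, capture data only; no authorisation call

def process_card_payment(amount, card_number, authorise_online):
    """Capture every transaction; go on-line only above the floor limit."""
    record = {"card": card_number, "amount": amount}  # electronic data capture
    if amount < FLOOR_LIMIT:
        # Issuer guarantees payment below the floor limit - no delay at the till.
        record["authorised"] = True
        record["online"] = False
    else:
        # Higher-value sale: must be authorised by voice or direct connection.
        record["authorised"] = authorise_online(card_number, amount)
        record["online"] = True
    return record

# A stand-in authoriser that approves anything under a notional credit limit.
approve = lambda card, amount: amount <= 500.00
small = process_card_payment(12.99, "4929...", approve)   # captured off-line
large = process_card_payment(180.00, "4929...", approve)  # authorised on-line
```

The trade-off the text describes is visible in the structure: lowering `FLOOR_LIMIT` reduces fraud exposure but increases the number of slow on-line authorisations at the checkout.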
It seems unlikely that payment card-based EFT will expand significantly in any market until authorisation can be faster, cheaper and more reliable. There are too many practical and cultural barriers. Authorisation terminal manufacturers like the world-leading VeriFone company may be able to extend their penetration of the market, but the real limits on EFT's growth are the bottleneck at the central computer end of the system, and the growth of organised fraud, particularly involving collusion between staff and fraudsters. The imperative for large retailers to increase checkout flow rates also inhibits the growth of EFT in the most significant retail sectors. Meanwhile small stores and unit retailers gain no particular benefit from increasing their capacity to process payment cards.
The smart card may change the ground rules for EFT. This technology was pioneered by the French computer company, Bull, as long ago as the early 1970s, supported in France through a central organisation called Groupement des Cartes Bancaires, and developed by Bull, CII, Philips and Flonic Schlumberger. It has been taken up in some countries for a limited number of uses, but promises a great deal.
Unlike conventional payment cards which use magnetic strips to hold information, the smart card actually contains semiconductor memory and often a processor of its own. This represents a qualitative enhancement of the technology. The magnetic strip payment card would typically contain details of an account and its associated borrowing limit. Every transaction has to be checked against previous recent transactions to make sure that the borrowing limit has not been exceeded. A smart card, on the other hand, will maintain records of transactions in the card's own memory along with account and limit information. The card would be credited with sums drawn by the cardholder on his or her bank account. The authorisation terminal reads all the required data and calculates whether a limit has been exceeded without having to connect to any other computer. Purchases are paid for by the direct deduction of the price from the card's memory, according to prior agreements between banks and retailers. The transfer of funds to the retailer can be effected on-line using a terminal or off-line by a paper settlement process. There will be a saving in the operating costs associated with on-line connection to a central computer, but more importantly the smart card will avoid checkout delays while authorisation is sought.
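The off-line decision the smart card makes might look something like this in outline, with balance, limit and transaction history all held in the card's own memory. The class is a simplified illustration, not the design of any actual card such as the Bull CP8.

```python
class SmartCard:
    """Sketch of an electronic-purse smart card: the purchase decision is
    made from data on the card itself, with no on-line authorisation."""

    def __init__(self, balance, limit):
        self.balance = balance    # value loaded from the holder's account
        self.limit = limit        # per-transaction spending limit, on-card
        self.transactions = []    # transaction records, kept in card memory

    def pay(self, retailer, amount):
        # The terminal reads balance and limit from the card itself and
        # decides locally whether the purchase can go ahead.
        if amount > self.balance or amount > self.limit:
            return False
        self.balance -= amount    # price deducted directly from card memory
        self.transactions.append((retailer, amount))
        return True

card = SmartCard(balance=40.00, limit=25.00)
ok = card.pay("store A", 15.50)       # allowed: within balance and limit
refused = card.pay("store A", 30.00)  # refused: exceeds the on-card limit
```

Settlement with the retailer - transferring the deducted sums to the store's bank - would happen separately, on-line or by the paper process the text mentions.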
The smart card has other significant benefits. Security can be enhanced because the card's processing power allows more sophisticated methods of authentication than the use of four-digit PINs or visual signature checks. Biometric authentication, for example, might involve fingerprints, retinal patterns, voice prints, hand shapes or dynamic handwriting characteristics, records of which are stored in the smart card as digital signatures. Other forms of authentication could involve programming the card with personal details or passwords that are easy to remember but that only a legitimate user would know. Embossing and elaborate paperwork would no longer be necessary, and users could obtain details of their own account status without having to wait for monthly statements. No doubt, smart card readers could be attached to office PCs and home computers.
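One way to hold such an authentication secret on the card without exposing it is to store only a digest, which the card's own processor compares against whatever the user presents. This is a schematic sketch only - a modern cryptographic hash stands in for whatever scheme a real card would use, and the passphrase is invented.

```python
import hashlib

class CardAuthenticator:
    """Hypothetical sketch: the card stores a digest of the holder's secret
    (a passphrase, or encoded biometric features), never the secret itself."""

    def __init__(self, secret):
        self.stored_digest = hashlib.sha256(secret.encode()).digest()

    def verify(self, presented):
        # The card's own processor compares digests; the reference value
        # never leaves the card and cannot be read back in clear.
        return hashlib.sha256(presented.encode()).digest() == self.stored_digest

card = CardAuthenticator("easy to remember, hard to guess")
granted = card.verify("easy to remember, hard to guess")
denied = card.verify("1234")
```

The same comparison-on-card principle applies whether the secret is a passphrase or a digitised biometric template; only the encoding step differs.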
There are two forms of terminal associated with smart cards - a reader/writer device with no on-line capability used to guarantee settlement for the retailer by certifying transactions for later submission to the bank, and a full-function EFTPoS terminal which can store transactions, hold lists of stopped cards, and transmit data directly to a bank. The transition from magnetic strip to smart card will be managed by introducing dual purpose certifiers and dual function cards. The smart card can provide the retailer with an instantaneous look at the user's full credit position, thus avoiding misuse of cards by legitimate card holders. This has been the single biggest source of loss in the US, and trials by Mastercard have suggested that smart cards may reduce this form of loss by as much as 60% - a result which could lead to lower interest rates and merchant fees.
The potential of smart cards has only been touched on. They could be used as versatile prepayment cards, rather like a programmable phone card. They could provide a variety of services - information retrieval via a simple terminal, 'electronic keys', promotional discounts or loyalty bonuses based on combined or frequent purchases. They could include a range of documentary data on users - passport details, social security number, driver's licence, donor card. For retailers they offer the possibility of a single, simple system for user authentication, authorisation, hot card list management and transaction data collection. They are very difficult to forge. For users they are as convenient as cash and much safer to carry. It seems probable that pre-paid debit cards will eventually be accepted for a range of low-value cash transactions - particularly in countries with severe inflation. The 'electronic purse', as it has been dubbed, is now the focus of standardisation efforts by leading industry players. But the unknown quantity in the equation is the banks and payment authorities, whose active involvement is necessary for this technology to flourish beyond the limitations of low-value transactions.
The banks are unenthusiastic about smart cards for a number of reasons. The lack of standards means that smart cards do not have the versatility of cash or 'dumb' credit or debit cards. Start-up costs are high - even the most basic smart cards are around five times more expensive than their dumb relatives, and the banks would be expected to fund the installation of costly card reader/writers at retail outlets as well as terminals within the banks themselves. Finally, in the context of the deregulation of banking and the intense competition within financial services markets, smart cards could help retailers offer banking services to their customers at the expense of the banks themselves. One early retail venture was the introduction of Bull CP8 smart cards to the Swiss Migros department store chain in 1988. The Postomat+ cards were backed by the Swiss PTT, but the example is unusual. As market researchers Frost & Sullivan have pointed out, 'The world's largest smart card application is ... pre-paid debit cards that operate public phones in France.'