PORTABLE MBA IN FINANCE AND ACCOUNTING: CHAPTER 16

16 INFORMATION TECHNOLOGY AND THE FIRM

Theodore Grossman

INTRODUCTION

The personal use of information technology was discussed in an earlier chapter. This chapter discusses the firm's use of information technology. Of all the chapters in this book, the two dealing with information technology will have the shortest half-life. Because of the constant flow of new technology, what is written about today will have changed somewhat by tomorrow. This chapter presents a snapshot of how technology is used today in industry, finance, and accounting. By the time you compare your experiences with the contents of this chapter, some of the information will no longer be applicable. Change means progress. Unfortunately, many companies will not have adapted; consequently, they will have lost opportunity and threatened their own futures.

HISTORICAL PERSPECTIVE

To understand the present and future of information technology, it is important to understand its past. In the 1960s and 1970s, most companies' information systems were enclosed in the "glass house." If you entered any company that had its own computer, it was located behind a glass wall, with a security system that allowed only those with access rights to enter the facility. One computer controlled all of a company's data processing functions. Referred to as a host centric environment, the computer was initially used for accounting purposes: accounts payable, accounts receivable, order entry, payroll, and so on. In the late 1970s and 1980s, most companies purchased in-house computer systems and stopped outsourcing their data processing. Recognizing the power and potential of information technology, companies directed the use of their technology toward operations, marketing, and sales, and they created a new executive position, the Chief Information Officer (CIO), to oversee this process.

In the 1980s, many companies gradually changed from host centric to distributed computing. Instead of processing all of the information on one large mainframe computer, companies positioned minicomputers to act as processors for departments or special applications. The minicomputers were, in many cases, networked together to share data. Databases became distributed, with data residing in different locations yet accessible to all the machines in the network.

The personal computer had the greatest impact on the organization. It brought true distributed processing. Now everybody had their own computer, capable of performing feats that, until then, were available only on the company's mainframe computer. This created both opportunities and headaches for the company, some of which will be addressed in the section on controls. As companies entered the 1990s, these computers were networked, forging the opportunity to share data and resources, as well as to work in cooperative groups. In the mid-1990s, these networks were further enhanced through connection to larger, wide area networks (WANs) and to the ultimate WAN, the Internet. Companies are doing what was unthinkable just a couple of years ago: they are allowing their customers and their suppliers direct connection into their own computers. New technology is being introduced every day, and new terms are creeping into our language (Internet, intranet, extranet, etc.). It is from this perspective that we start by looking at computer hardware.

HARDWARE
Most of the early computers were large mainframe machines, usually manufactured by IBM, and they were powerful batch processors. Large numbers of documents (e.g., invoices or orders) were entered into the computer and then processed, producing various reports and special documents, such as checks or accounts receivable statements. The technology was extremely unfriendly territory. In many companies, millions of lines of software were written to run on this mainframe technology. Generally speaking, these machines were programmed in a language called COBOL and used an operating system that was proprietary to that hardware. Not only was it difficult to run programs on more than one manufacturer's computer but, because there were slight differences in the configurations and operating systems, it was difficult to run the same software on different computers even if they were produced by the same manufacturer.

In the 1980s, technology evolved from proprietary operating systems to minicomputers with open systems. These were the first open systems: computers that ran the UNIX operating system. Although Bell Labs originally developed UNIX in the 1970s as an operating system for scientific applications, it later became an accepted standard for commercial applications. Because the operating system was platform independent, it and its associated applications could run on a variety of manufacturers' computers, creating both opportunities for users and competition within the computer industry. Users were no longer inexorably tied to one manufacturer. UNIX became the standard as companies moved into the 1990s. However, standards changed rapidly in the nineties, and UNIX lost ground to the development of client-server technology.

In the early 1990s, technologists predicted the demise of the mainframe. IBM's stock declined sharply as the market realized that the company's chief source of margin was headed toward extinction. However, the mainframe has reinvented itself as a super server, and, while it has been replaced for some of the processing load, the mainframe and IBM are still positioned to occupy important roles in the future.

Server technology is heading toward a design in which processors are built around multiple, smaller processors, all operating in parallel. Units referred to as symmetrical multiprocessors (SMPs) contain between two and eight processors; SMPs are available from a range of manufacturers and operating systems, and they provide processing power typically not available in a uniprocessor. To face the demanding environment of multiple, simultaneous queries against databases that exceed hundreds of gigabytes, massively parallel processors, or MPPs, are being utilized more and more. MPPs are processors that have hundreds of smaller processors within one unit. The goal of SMPs and MPPs is to split the processing load among the processors.
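The load-splitting idea behind SMPs and MPPs can be sketched in a few lines of modern code. The example below illustrates only the principle, not how such hardware is actually programmed; the workload, summing slices of a large number range, is an invented stand-in for work such as scanning slices of a big database table.

# A minimal sketch of load splitting: one large job is divided into
# independent chunks, worker processes handle the chunks in parallel,
# and the partial results are combined at the end.
from multiprocessing import Pool

def process_chunk(bounds):
    """Stand-in for real work, such as scanning one slice of a table."""
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    n, workers = 10_000_000, 4                      # four workers ~ a small SMP
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(processes=workers) as pool:
        partials = pool.map(process_chunk, chunks)  # chunks run in parallel
    print(sum(partials))                            # combine partial results

Whether two processors or two hundred are involved, the pattern is the same: divide the work, run the pieces simultaneously, and merge the results.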
In a typical factory in the 1800s, one motor usually powered all of the machinery, to which it was connected by a series of gears, belts, and pulleys. Today, that is no longer the case, as each machine has its own motor or, in some cases, multiple specialized motors. For example, the automobile's main motor is the engine, but there are also many other motors that perform such tasks as opening and closing windows, raising and lowering the radio antenna, and powering the windshield wipers. Computers are the firm's motors, and like motors, they, too, have evolved. Initially, firms used a host centric mainframe, one large computer; today, they use many computers to perform both specialized and general functions.

In the early 1990s, Xerox's prestigious Palo Alto Research Center introduced "ubiquitous computing," a model that it believes reflects the way companies and their employees will work in the future. In ubiquitous computing, each worker will have available differing quantities of three different sizes of computers: 20 to 50 Post-it-note-size portable computers, three or four computers the size of a writing tablet, and one computer the size of a six-foot-by-six-foot whiteboard. All of the computers will work together by communicating with a network through, in most cases, wireless connections.

The progress of chip technology has been highly predictable. In 1965, Gordon Moore, who would go on to cofound Intel, formulated Moore's Law, which predicts that the density of the components on a computer chip will double every 18 to 24 months, thereby doubling the chip's processing power. This hypothesis has proven to be very accurate. Exhibit 16.1 shows the growth of the various Intel CPU chips that have powered the personal computer and many other machines. As can be seen, the PC's power has just about doubled every 18 to 24 months. This growth can be seen more dramatically when the graph is plotted logarithmically, as in Exhibit 16.2.

EXHIBIT 16.1 Moore's Law—charting the growth of the power of the PC. [Line chart of MIPS (millions of instructions per second), 0 to 1,000 on a linear scale, by year, 1980 to 2002.]

EXHIBIT 16.2 Moore's Law—charting the growth of the PC (logarithmically). [The same data plotted with MIPS on a logarithmic scale, 0.1 to 1,000.]
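The doubling rule is easy to check with simple arithmetic. The sketch below projects processing power forward from an assumed starting point; the 1980 baseline of 0.3 MIPS and the 21-month doubling period are illustrative assumptions chosen to match the general shape of the exhibits, not figures taken from them.

# A rough projection of Moore's Law: power doubles every 18 to 24 months.
def projected_mips(year, base_year=1980, base_mips=0.3, months_per_doubling=21):
    doublings = (year - base_year) * 12 / months_per_doubling
    return base_mips * 2 ** doublings

for year in (1980, 1990, 2000):
    print(year, round(projected_mips(year), 1))   # 0.3, ~15.8, ~826.9

# At a 21-month doubling period, power grows roughly 50-fold per decade,
# which is why the linear chart hugs zero until the final few years.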
SOFTWARE

Exhibit 16.3 represents the information systems paradigm. Operational control systems, which run the company's day-to-day operations, are typically used by the lowest level of the organization, are run on a scheduled basis, and usually involve large volumes of input data, output reports, and information. These systems might be accounts payable, accounts receivable, payroll, order entry, or inventory control.

Decision support systems are generally used by middle-level managers to supply them with information that they can use to make decisions. Usually run on an ad hoc basis and involving small amounts of data, they include budgets, exception reporting, cash-flow forecasting, accounts receivable dunning reports, "what if" analyses, audit analysis reports, and variance analyses. Many of the newer application packages come with facilities that let managers without any programming knowledge create their own decision reports.

Strategic information systems are used by senior management to make decisions on corporate strategy. For example, a retail company might use demographic census data, along with a computerized geographical mapping system, to evaluate the most appropriate locations at which it should open new stores. A manufacturing company, given its demands for both skilled and unskilled labor, might use a similar method to determine the optimal location for a new plant.

EXHIBIT 16.3 Types of information systems. [A diagram of the three types: operational control systems, decision support systems, and strategic information systems.]

While most older hardware has given way to newer computers, most companies use a combination of newly acquired and older, self-developed software. The latter was developed over a period of years, perhaps 20 or more, using COBOL, which, until the early 1990s, was the standard programming language in business applications. Today, many companies' mission critical systems still run on mainframe technology, using programs written in COBOL; in fact, there are billions of lines of COBOL programming code still functional in U.S. business. These "legacy" systems have become a major issue for many companies, though, and were the key issue behind the Y2K problem. In many instances, they have grown like patchwork quilts, as they have been written and modified by programmers who are no longer with their firms. More often than not, documentation of these changes and enhancements is not available, and the guidelines for many of these software applications no longer exist. Replacing these applications is cost prohibitive, and the distraction to the organization caused by the need to retrain workers would be tremendous.

Nonetheless, as a result of the Y2K problem, many of these systems were replaced, but large volumes of them were merely patched to allow for the millennium change. These systems will eventually have to be replaced. If history is any lesson, though, many of these systems will not be replaced until it is too late. In any event, the business community should not again face the singular deadline it faced at the end of 1999.

Today, most programmers write in C++, C, or fourth-generation programming languages. C++ is an object oriented programming language; object oriented languages provide the programmer with a facility to create a programming object, or module, that may be reused in many applications. Fourth-generation programming languages are usually provided with sophisticated relational database systems. These database systems provide high-level tools and programming languages that allow programmers to create applications quickly, without having to concern themselves with the physical and logical structure of the data. Oracle, Informix, Sybase, and Progress are some of the more popular relational database package companies.
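The claim that these tools free programmers from the physical and logical structure of the data can be illustrated with any modern SQL interface. The sketch below uses Python's built-in sqlite3 module as a stand-in for packages such as Oracle or Informix; the table layout and vendor names are invented for the example.

# The relational idea: the programmer declares WHAT data is wanted, and
# the database engine decides HOW to fetch it, hiding the storage layout.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE vendor (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO vendor (name, city) VALUES (?, ?)",
    [("Acme Supply", "Boston"), ("Globex", "Chicago")],
)

# One declarative statement replaces pages of record-navigation code.
for (name,) in conn.execute("SELECT name FROM vendor WHERE city = ?", ("Boston",)):
    print(name)  # -> Acme Supply
conn.close()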
INTERNET TECHNOLOGY

Nothing has impacted technology and society in the past 10 years more than the Internet. When Bill Clinton was inaugurated in January 1993, there were 50 pages on the Internet; today, there are more than 200 billion. The underlying technology behind the Internet has its roots in a project begun by the U.S. government in the early 1970s. The network was originally developed by a consortium of research colleges and universities and the federal government that was looking for a way to share research data, provide a secure means of communicating, and back up defense facilities. The original network, called ARPANET, was sponsored by the Department of Defense's Advanced Research Projects Agency (ARPA). It was replaced in the 1980s by the current network, which was originally not very user friendly and was used mostly by techies. The Internet's popularity exploded with the development of the World Wide Web and the software programs that made it much more user friendly to explore.

The Internet works on a set of software standards, the first of which, TCP/IP, was developed in the 1970s. The entire theory behind the Internet and TCP/IP, which enables computers to speak to each other over the Internet, was to create a network that had no central controller. The Internet is unlike a string of Christmas lights, where if one light in the series goes out, the rest of the lights stop functioning; rather, if one computer in the network is disabled, the rest of the network continues to perform.

Each computer on the Internet has an Internet, or IP, address. Similar to one's postal address, it consists of a series of numbers (e.g., 155.48.178.21), and it tells the network where to deliver your e-mail and data. When you access an Internet site through its URL (e.g., www.babson.edu), a series of computers on the Internet, called domain name servers (DNS), convert the URL to an IP address. When an e-mail, message, or data is sent to someone over the Internet, it is broken into a series of packets. These packets, similar to postcards, contain the IP address of the sender, the IP address of the recipient, the packet number of the message (e.g., 12 of 36), and the data itself. The packets may travel many different routes along the Internet; frequently, packets belonging to the same message do not travel the same route. The receiving computer then reassembles these packets into a complete message.
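Both mechanisms just described, the name lookup and the numbered packets, can be sketched briefly. The snippet below is a simplification rather than a real TCP/IP implementation; the lookup line assumes a live network connection, and the message and packet size are invented for the example.

import math
import random
import socket

# DNS: a domain name server converts a host name into an IP address.
print(socket.gethostbyname("www.babson.edu"))    # the address varies

# Packets: the message is split into numbered pieces, like postcards
# marked "2 of 4," which may travel different routes across the network.
message = b"Please remit payment for invoice 1047."
SIZE = 10                                        # real packets carry ~1,500 bytes
total = math.ceil(len(message) / SIZE)
packets = [(i + 1, total, message[i * SIZE:(i + 1) * SIZE]) for i in range(total)]

random.shuffle(packets)                          # simulate out-of-order arrival
received = sorted(packets, key=lambda p: p[0])   # the receiver reorders by number
assert b"".join(data for _, _, data in received) == message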
The second standard that makes the Internet work is HTML, or Hypertext Markup Language. This language allows data to be displayed on the user's screen. It also allows a user to click on an Internet link and jump to a new page on the Internet. While HTML remains the underlying programming language of the World Wide Web, there are many more user-friendly software packages, like FrontPage 2000, that help create HTML code. Moreover, HTML, while powerful in its own right, is not dynamic and has its limitations. Therefore, languages such as JavaScript, Java, and Perl, which create animation, perform calculations, create dynamic Web pages, and access and update databases with information on the host's Web server, were developed to complement HTML. Using a Web browser (e.g., Netscape Navigator or Microsoft's Internet Explorer), the computer converts the HTML and these other languages into the information that users see on their computer monitors.

Internet technology has radically changed the manner in which corporate information systems process their data. In the early and mid-1990s, corporate information systems used distributed processing techniques. Under this method, some of the processing took place on the central computer (the server) and the rest on the users' (the clients') computers, hence the term client-server computing. Many companies implemented applications using this technology, which ensured that processing power was utilized at both ends and that systems were scalable. The problem with client-server processing was that different computers (even within the IBM-compatible PC family) used different drivers and required tweaking to make the systems work properly. Also, if the software needed to be changed at the client end, and there were many clients (some companies have thousands of PC clients), maintaining the software for all of those clients could be a nightmare. Even with specialized tools developed for that purpose, it never quite worked perfectly.

As companies recognized the opportunity to send data over the Internet, whether for their customers or their employees, they started to migrate all of their applications to a browser interface. This change has required companies to rethink where the locus of their processing will occur. Prior to the 1990s, companies' networks were host centric: all of their processing was conducted on one large mainframe. In the early 1990s, companies began using client-server architecture. Today, with current browser technology and the Internet, the locus has shifted back to a host centric environment. The difference, though, is that the browser on the users' computers is used to display and capture data, while the data processing actually occurs back at the central host on a series of specialized servers, not on one large mainframe computer. The only program users need is a standard browser, which solves the incompatibility problem presented by distributed data processing. No specialized software is stored on the users' computers.

Internet technology was largely responsible for many of the productivity enhancements of the 1990s. Intel's microprocessors, Sun's and Hewlett-Packard's servers, Cisco's communications hardware, and Microsoft's Windows operating systems have all facilitated this evolution. While Windows is the predominant client operating system, most servers run the Windows NT or 2000, UNIX, or Linux operating systems.

TODAY'S APPLICATION SYSTEMS

In the 1970s and 1980s, application software systems were stand-alone. There was little sharing of data, leading to frequent redundancy of information. For example, in older systems there might have been separate vendor data files for both inventory and accounts payable, resulting in the possibility of multiple versions of the truth: each file may have contained address information, yet the addresses may have differed for the same vendor. Today, however, software applications are integrated across functional applications (accounts payable, accounts receivable, marketing, sales, manufacturing, etc.). Database systems contain only one vendor record, which all systems utilize. These changes in software architecture better reflect the integration of functions that has occurred within most companies.

Accounting systems, while used primarily for accounting data, also provide a source of data for sales and marketing. While retail stores' point of sale cash registers are used as a repository for cash and to account for it, they are also the source of data for inventory, sales, and customer marketing. For example, some major retailers ask their customers for their zip codes when point of sale transactions are entered, and that data is shared by all of the companies' major applications.

Accounts receivable systems serve two purposes. On the one hand, they allow the company to control an important asset, its accounts receivable. On the other, the availability of credit enables customers to buy items, both commercial and retail, that they otherwise would not be able to buy if they had to pay in cash. Credit card companies, which make their money from transaction fees and interest charges, understand this function well. Frequently, they reevaluate the spending and credit patterns of their client base and award increased credit limits to their customers. Their goal is to encourage their customers to buy more, without necessarily paying off their balance any sooner than necessary.
Information systems make it possible for these companies to both control and promote their products, which in this case are credit card transactions. These examples of horizontally integrated systems, as well as the understanding of the strategic and competitive uses of information technology, demonstrate where industry is headed.

ACCOUNTING INFORMATION SYSTEMS

As mentioned earlier, computer-based accounting systems were, for most companies, the first computerized applications. As the years progressed, these systems have become integrated, and they consist of the following modules (a brief sketch of what that integration means follows the list):

• Accounts Payable.
• Order Entry and Invoicing.
• Accounts Receivable.
• Purchase Order Management and Replenishment.
• Inventory Control.
• Human Resource Management.
• Payroll.
• Fixed Assets.
• General Ledger and Financial Statements.
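As a toy illustration of integration, the sketch below posts one customer invoice and updates order entry, inventory control, and accounts receivable together, so the modules cannot disagree about the same transaction. The item codes, names, and amounts are invented for the example.

# A toy "integrated" posting: one business event updates three modules at
# once, giving every application a single, shared version of the truth.
inventory = {"WIDGET": 100}      # inventory control: on-hand units by item
receivables = {}                 # accounts receivable: open balance by customer
invoices = []                    # order entry and invoicing: the journal

def post_invoice(customer, item, qty, unit_price):
    if inventory[item] < qty:
        raise ValueError("insufficient stock")
    inventory[item] -= qty                                  # relieve inventory
    amount = qty * unit_price
    receivables[customer] = receivables.get(customer, 0) + amount  # book the A/R
    invoices.append((customer, item, qty, amount))          # record the invoice
    return amount

post_invoice("Acme Supply", "WIDGET", 10, 4.50)
print(inventory["WIDGET"], receivables["Acme Supply"])      # 90 45.0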
Whereas in past years some of these modules were acquired and others were self-developed, today most companies purchase packaged software. In the 1980s, "shrink-wrapped" software was developed and introduced. Lotus Corporation, along with other companies, was a pioneer, selling software like its 1-2-3 application in shrink-wrapped packages. The software was accompanied by sufficient documentation and telephone support to ensure that even companies with limited technical expertise could manage their own destinies.

There are a host of software packages that will satisfy the needs of companies of all sizes. Smaller companies can find software selections that run on personal computers and networks, are integrated, and satisfy most of the companies' requirements. Quicken and Computer Associates have offerings that provide most of the necessary functional modules for small and medium-size companies, respectively. The more advanced packages, like Macola and AccPac, are equipped with interfaces to bar-code scanners and scales, which together track inventory and work in process and weigh packages as they are shipped, producing not only invoices but also shipping documents for most of the popular freight companies, such as FedEx and UPS. These packages range in price from $100 for the entire suite of accounting applications for the smallest packages to approximately $800 per module for the larger packages, which, of course, have more robust features. While some of the smaller packages are available through computer stores and software retailers, the larger packages are acquired through independent software vendors (ISVs), who, for a consulting fee, will sell, install, and service the software. The practice of using third-party ISVs began in the 1980s, when large hardware and software manufacturers realized that they were incapable of servicing all of the smaller companies that would be installing their products, many of whom required a lot of hand-holding. Consequently, a cottage industry of distributors and value added dealers developed, in which companies earn profits on the sale of hardware and software and the ensuing consulting services.

Larger companies are following a trend toward large, integrated packages from companies like SAP and Oracle. These packages integrate not only the accounting functions but also the manufacturing, warehousing, sales, marketing, and distribution functions. Such systems are referred to as enterprise resource planning (ERP) systems. Many of these ERP systems, available from companies such as SAP, Oracle, and Baan, also interface with Web applications to enable electronic commerce transactions. SAP has spawned an entire industry of consulting companies that assist large companies in implementing its software, a process that may take several years to complete. As in any software implementation, one must always factor into the timetable the process's cost and the distraction it causes the organization. In today's lean business environment, people have little extra time for new tasks. Implementing a major new system, or, for that matter, any system, requires a major commitment of time and effort.

INFORMATION TECHNOLOGY IN BANKING AND FINANCE

The financial services industry is the leading industry in its use of information technology. As shown in Exhibit 16.4, according to a survey conducted in 1999 by the Computer Sciences Corporation, this sector spent 5.0% of its annual revenue on IT, more than double that of almost any other industry except the technology-driven telecommunications industry. The exhibit also illustrates how integral a role real-time information plays in the financial services industry, whether it be for accessing stock quotes or processing bank deposits. The industry has become a transaction processing [...]

[...] transmit their invoices and then to receive the subsequent payments. While industries use different versions of EDI in different ways, their goals are always the same: minimize processing time and lower inventory costs and overhead expenses. An industry organization in Washington, D.C., developed and maintains a standard format that dictates how all transactions are sent, ensuring that all companies [...]

[...] developed in the 1960s and 1970s were accounting oriented, data processing (which is what information technology was then called) typically reported to the Chief Financial Officer, creating a control atmosphere consistent with accounting controls. A central group of trained data entry operators was responsible for entering and verifying data. Access to the "glass house" was restricted, and in some cases [...]

[...] now and will be in the future as they struggle with new IT investments.

INTERNET/INTRANET/EXTRANET

The Internet, intranets, and extranets provide companies with a plethora of opportunities to find new ways of transacting business. An alternative to some of the older technology, an intranet, a subsystem of the Internet, was developed in 1996 to allow employees from within a company to access data in [...]

[...] returns and credits, will present significant angst for the auditors and controllers of these firms. Nonetheless, the financial services industry has embraced e-commerce and now offers most of its products over the Internet. Online services include, among others, the purchase of stocks and bonds, online mortgages, life insurance, and online banking. Because they are nontangible, these products and services [...]

[...] annually by 2004. Most of the focus of the investor community during 1999 and 2000 was on the B2C space, with millions of dollars made and lost as a result of people not understanding the business model. Most of the money raised in venture capital was used for advertising to gain brand recognition, whereas very little was invested in infrastructure. As a result, the B2C landscape is littered with the corpses [...]
[...] e-commerce. The Internet works well in many cases because, while it is not delivering the product itself, it is delivering information about the product, often in levels of detail and consistency that were never available in the physical world. As noted earlier, the real action is and will be in B2B e-commerce. Companies of every shape and size are realizing the opportunities for both ordering and selling their [...]

[...] allowing cross docking, whereby goods being received for specific stores were immediately sent to the shipping area designated for those stores, thus putting an end to all time lags. Wal-Mart now has the lowest cost of inbound logistics in its industry. Its selling G&A is 6% below that of its nearest competitor, enabling it to be the most aggressive retailer in its industry. Wal-Mart aligned its IT strategy and infrastructure [...]

[...] and in some cases access to the data entry and report distribution areas was also restricted. Because everything was self-contained, control was not a major issue. In the late seventies and early eighties, online terminals began appearing on users' desks, outside of the glass house, allowing them access to data. Initially, these terminals were used for information inquiry. Yet, even this limited function [...]

[...] 1980s and early 1990s, Wal-Mart adopted an everyday low pricing strategy. To accomplish this goal, Wal-Mart needed to change the manner in which it both conducted business with its suppliers and managed the inbound logistics, warehousing, and distribution of merchandise to its stores. It needed to abolish warehousing as much as possible and quicken the process by which stores ordered and received merchandise [...]

[...] to access computerized sales information directly from its computers, which, in turn, allows them to gauge Wal-Mart's demand and then stage production to match it. Wal-Mart effectively shifted the burden of warehousing merchandise from its own warehouses to the vendors, eliminating the costs of both warehouse maintenance and surplus inventory. The distribution centers [...]
