The purpose of this section is to provide reviews of the wide variety of computer software and related material available for use by economists. Products reviewed may include econometric, statistical and various optimisation and simulation packages available to economists for use in both teaching and research. From time to time overview articles on the computer software available in particular areas of economics will be published. The section will also cover other functional software useful to academic economists (such as spreadsheets, database managers and mathematical wordprocessors), databases, bibliographical and other resources available on CD-ROM and the Internet, CAL material and other software sold with textbooks.

Some Software Reviews from previous volumes of this section of The Economic Journal are available in electronic form to users of the World-Wide Web via the SOSIG server at the University of Bristol. The URL is http://sosig/cticce/ej-rev.htm. I am grateful to members of the CTI Centre for Economics at Bristol for their assistance with this service. Please send any feedback and comments about the service to me.

Individuals who wish to express opinions on this or any other matters concerning the content and form of this section of The Economic Journal should contact the editor: Guy Judge, Software Review Editor, Economic Journal, Department of Economics, University of Portsmouth, Milton Campus, Locksway Road, Southsea PO4 8JF. Telephone 01705 876543; email judgeg@pbs.port.ac.uk; fax 01705 844037. Software publishers who wish to have their product reviewed, or who require further information, should contact the Software Review Editor at the above address. The opinions expressed in the review articles are solely those of the reviewer and are not attributable to The Economic Journal or the Royal Economic Society.

A Review of Data Envelopment Analysis Software

Warwick DEA for Windows Version 1.00, Operational Research and Systems Group, University of Warwick, UK. Five-user licence £1,600 + VAT (full version); £800 + VAT for educational users.
IDEAS Version 5.1, 1 Consulting Inc., Amherst, MA, USA. $945 (full version). Site licences available.
Frontier Analyst Version 1.1, Banxia Software Ltd, Glasgow, UK. Commercial single-user licence £1,650 + VAT (two-user licence £2,370 + VAT); educational single-user licence £470 + VAT (further licence information on request).
Test PC: Gateway 2000 Pentium 90, 16MB RAM, 1GB hard disk, 64K colour display.

Intended Use and Area of Application

This review covers three commercially available packages used to undertake data envelopment analysis (DEA). DEA is a non-parametric method, based on linear programming, for analysing production by mapping out a production frontier. The relationship between the inputs and outputs of a production process is described in terms of the technically most efficient combinations of inputs for producing a given output, and the efficiency of the other units in the sample is then measured relative to this frontier (see Charnes et al. (1994) for more details); a minimal statement of the underlying linear programme is given below. Any choice between the packages under review may depend on the potential users and applications, for example managerial, policy or academic use.

Ease of Use

Warwick and Frontier Analyst run under Windows, while IDEAS runs under DOS, although a DOS version of the Warwick package is available.
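As a point of reference for the models the packages' manuals describe, the basic linear programme solved for each unit under constant returns to scale (the input-oriented CCR envelopment form) is sketched here; the notation is illustrative only and is not taken from any of the packages' documentation. For a sample of n units, each using m inputs x_{ij} to produce s outputs y_{rj}, the efficiency score of unit o solves

\min_{\theta,\lambda}\; \theta \quad \text{subject to} \quad \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta x_{io},\; i=1,\dots,m; \qquad \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro},\; r=1,\dots,s; \qquad \lambda_j \ge 0,

so that \theta = 1 identifies a unit on the frontier; adding the constraint \sum_{j} \lambda_j = 1 gives the variable-returns-to-scale (BCC) variant that all three packages also offer.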
Documentation with the Frontier Analyst software is basic and easy to follow, perhaps reflecting the targeting of a non-academic market. It features an easy-to-follow introduction to performance measurement, building up all the outputs of the package in turn, from performance information through to reference sets. The user is then taken through the practicalities of undertaking analysis and making use of results. Most features of the manual are available on-line within the package.

The Warwick user manual clearly guides the user through all aspects of DEA and shows how the software can be used to solve several different models. The more intricate aspects of DEA modelling, for example restricting the weights applied to variables, are also dealt with. The user is taken through model set-up and data management, and examples are used to explain in detail how to undertake all forms of analysis, with all outputs fully explained, including efficiencies, peers, targets and weights. A certain amount of this information is available on-line within the package.

IDEAS features comprehensive documentation but, as with many DOS-based packages, it is not as instantly user friendly as those that run under Windows. Data preparation, analysis and results reports are comprehensively run through, and the computation of most results is dealt with rigorously. How to set up all the available models is covered in detail, and there is a useful quick-reference section in the manual.

All the packages are straightforward to install, feature useful examples and give advice on data preparation, and all take previously prepared data files given a modicum of manipulation. Hardware requirements are shown in Table 1.

Table 1. System requirements

Software         | Windows/DOS                                                                                                      | RAM | Display                      | Hard disk
Frontier Analyst | Windows 3.1 or higher (Windows 95 compatible)                                                                    | 4MB | Colour VGA                   | 4MB
Warwick          | Windows 3.1 or higher (DOS version available); IBM 386 with co-processor or higher (486 or Pentium recommended) | 2MB | Monitor supported by Windows | 2MB
IDEAS            | DOS 5.0 or above; 486 or Pentium processor recommended                                                           | EMS | Colour display optional      | 550KB

Package Facilities

Data input and management are straightforward in the Windows packages, as spreadsheet data editors are provided for entering new data.
Both take imported data, Warwick taking pre-prepared files and Frontier Analyst data pasted from spreadsheets or tab-delimited text; data can then be edited within the packages. Data management in IDEAS is also possible, as is the importing of data as text. All the packages allow analysis of large numbers of units (Frontier Analyst: 256 units and 32 inputs and outputs; Warwick: numbers of units, inputs and outputs restricted only by available RAM; IDEAS: 1,000 units and 20 inputs and outputs).

Performance

The packages were tested on a data set used in previously published analyses (Hollingsworth and Parkin, 1995; Parkin and Hollingsworth, 1997) and publicly available (ISD, 1994). The data are from 75 hospital units, and up to six inputs and six outputs were used. All the packages produced accurate and identical results under assumptions of both constant and variable returns to scale, the Warwick software also providing information on a unit's returns to scale. All packages produce efficiency scores for each unit, information on comparator units and suggested targets for becoming efficient. IDEAS and Warwick also provide details of the weights given to the variables in the analysis.

Conclusions

IDEAS, while perhaps the least user-friendly package, does produce an array of results, and if careful use is made of the extensive documentation provided the user can quickly become familiar with the software. With this in mind, and given the number of models available for analysis, IDEAS is a useful package for operational research experts and others doing academic research. IDEAS for Windows is under development.

The Warwick software is an easy-to-use Windows package that, as with IDEAS, allows use of various DEA models. Warwick results can easily be subjected to further analysis and saved in a format appropriate for spreadsheets. Warwick or IDEAS would be the packages of choice for the expert analyst. However, the Warwick package may also be of use for straightforward managerial applications, once the appropriate DEA model for analysis has been specified.

Frontier Analyst is probably the most user-friendly of the packages, particularly in terms of presentation of results, giving many different outputs, such as the frequency with which units appear as peers, in neat graphs within the package. As such, it would probably appeal most for straightforward managerial and policy applications, and may open up DEA to a wider, non-expert audience. However, it must be borne in mind that, although more options are planned for the future, at present the package allows only basic DEA analysis to be undertaken: there is no capacity for restricting the weights applied, and there are few options for changing the specification of the models in detail. It is very much a package for 'do-it-yourself' managerial applications. Given that DEA methods are still very much under development, especially in terms of the choice of DEA model for specific policy applications, this may be an important consideration.

In terms of the cost-effectiveness of the packages reviewed, much depends on the intended use and user. For academic users, discounts make the packages relatively accessible. The ideal package would be one which combined the analytical capability of IDEAS and Warwick with the user-friendly features of Frontier Analyst.

BRUCE HOLLINGSWORTH
University of Newcastle upon Tyne

References

Charnes, A., Cooper, W. W., Lewin, A. Y. and Seiford, L. M. (1994).
Data Envelopment Analysis: Theory, Methodology and Application. Dordrecht: Kluwer.
Hollingsworth, B. and Parkin, D. (1995). 'The efficiency of Scottish acute hospitals: an application of data envelopment analysis.' Institute of Mathematics and its Applications Journal of Mathematics Applied to Medicine and Biology, vol. 12, pp. 161–73.
Information and Statistics Division (1994). Scottish Health Services Costs, SFR5/Costs Book System Software. Edinburgh: Information and Statistics Division, National Health Service in Scotland Management Executive.
Parkin, D. and Hollingsworth, B. (1997). 'Measuring production efficiency of acute hospitals in Scotland, 1991–1994: validity issues in data envelopment analysis.' Applied Economics, forthcoming.

GARCH Modelling in Finance: A Review of the Software Options

GAUSS: Aptech Systems Inc., 23804 S. E. Kent-Kangley Road, Maple Valley, WA 93038, USA (tel: 206 432-7855; fax: 206 432-7832).
RATS: Estima, 1800 Sherman Avenue, Suite 612, Evanston, IL 60201, USA (tel: 708 864-8772; fax: 708 864-6221).
TSP: TSP International, PO Box 61015 Station A, Palo Alto, CA 94306, USA (tel: 415 326-1927; fax: 415 328-4163).

Introduction

The econometric analysis of financial data has received an increasing amount of attention from academics over the past decade or so. Great advances in the ability of computers to record, process, store and retrieve huge volumes of data within an acceptable time have encouraged this interest. In broad terms, much financial data (particularly high-frequency asset pricing series) has a number of specific characteristics which make its analysis more challenging than that of many other types of data. First, the fact that asset price series are generated at extremely high frequencies means that the number of observations can rapidly expand into hundreds of thousands or, in extreme cases, even millions. Second, the inability of traditional linear ('textbook') structural and time series models to capture the most important features of the data has led to the development and implementation of a number of relatively new non-linear models. Non-linear models are generally more complex and difficult to estimate than linear models, and often an analytical solution does not exist, so that a slower, iterative procedure is required. Both of these facets of financial data imply that, as well as the more traditional features of software that are of interest (such as flexibility, user-friendliness, comprehensiveness, etc.), the speed of the program in arriving at the correct parameter values is of paramount importance. It may be, for example, that some modelling strategies are simply not feasible for very high-frequency data, since so much CPU time is required.

The purpose of this review is to compare the relative properties of three popular software packages (GAUSS, RATS and TSP) for estimating one special case of a class of models of particular importance in financial econometrics, namely the GARCH(1, 1) model, which is used here as an illustrative example. An outline of the remainder of this review is as follows. The next three sections give a brief description of each of the software packages used for comparison, followed by two sections describing the GARCH modelling framework used as an illustrative example.
The final sections offer some analysis and concluding remarks.

GAUSS version 3.2

GAUSS is, unlike RATS and TSP, not an econometrics package as such; rather, it is a higher-level programming language made up principally of commands and procedures for statistical analysis. Hence in principle all applications, even those such as linear regressions, must be programmed from first principles. However, twelve modules, known as 'GAUSS Applications', are available for specific problems and remove much of the work associated with using GAUSS. The most important add-in modules for the financial econometrician (together with their most useful features) are likely to be Linear Regression (OLS, 3SLS, SURE estimation), Maxlik (maximum likelihood estimation using various descent algorithms), Optmum (function optimisation), Time Series (ARIMA modelling) and CML (constrained maximum likelihood). GAUSS versions are available for DOS, Windows, Windows 95 and UNIX platforms.

RATS version 4.2

RATS is an econometric package designed specifically for time series analysis, although there are a number of procedures which would be useful for the analysis of cross-sectional or panel data. RATS contains many features which may potentially be useful to the financial econometrician, including estimation of ARIMA models and of many members of the GARCH family, simultaneous equations and VAR modelling, testing for unit roots and cointegration, generation of forecasts and simulations, Monte Carlo studies and bootstrapping. Various estimation techniques are available, including ordinary least squares, generalised least squares, non-linear least squares, maximum likelihood and the generalised method of moments (GMM). All of these are either pre-programmed into RATS as instructions or are collected together in sets of instructions known as 'procedures'. RATS is available for DOS, Macintosh, Windows or UNIX/VMS.

TSP version 4.2

TSP is an easy-to-use general purpose package intended primarily for the estimation and simulation of econometric models. TSP also has a broad list of features useful for the econometric analysis of financial data, namely OLS, ARIMA modelling, Kalman filter estimation, ARCH/GARCH, Monte Carlo studies using data generated from various distributions, forecast generation, unit root and cointegration testing, VAR estimation, and GMM (multiple and non-linear). DOS and UNIX versions are available.

A Brief Note on GARCH Models

Virtually no reader interested in econometrics can have failed to notice the vast number of papers using the GARCH family of models; indeed, hardly a week goes by without another variant or extension being proposed. Suffice it to say here that this model class is of paramount importance in financial econometrics because it relaxes the (almost invariably unrealistic for financial data) assumption that the variance of the error process is conditionally, as well as unconditionally, fixed. GARCH models are also able, at least in part, to model the unconditional leptokurtosis observed in many financial series. This is not the appropriate place for a comprehensive review or list of references, but the interested reader is directed towards the excellent survey by Bollerslev et al. (1992) and the references therein. The simple GARCH(1, 1) model utilised in this study can be described by the following equations:

x_t = \sqrt{h_t}\, v_t, \qquad h_t = b_0 + b_v x_{t-1}^2 + b_h h_{t-1}, \tag{1}

where x_t is the dependent variable, h_t is the conditional variance, v_t is the Normal innovations stream, and b_0, b_v and b_h are parameters to be estimated.
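The review's own data-generating program (footnote 2) is not reproduced here; as a hedged illustration of what such a generator involves, the free-format Fortran 90 sketch below simulates a single series from equation (1), drawing the Normal innovations via the Box-Muller transform. The program name, the Box-Muller construction and the placement of the parameter values (those quoted in the next section) are illustrative assumptions rather than details taken from the review.

program simulate_garch11
  implicit none
  integer, parameter :: n = 500                        ! series length, as in the study
  real, parameter    :: b0 = 0.0108, bv = 0.1244, bh = 0.8516
  real, parameter    :: two_pi = 6.2831853
  real    :: x(n), h(n), u1, u2, v
  integer :: t

  call random_seed()
  h(1) = b0 / (1.0 - bv - bh)                          ! start at the unconditional variance
  do t = 1, n
     call random_number(u1)                            ! uniform draws for Box-Muller
     call random_number(u2)
     u1 = max(u1, tiny(1.0))                           ! guard against log(0)
     v  = sqrt(-2.0*log(u1)) * cos(two_pi*u2)          ! standard Normal innovation v_t
     if (t > 1) h(t) = b0 + bv*x(t-1)**2 + bh*h(t-1)   ! variance recursion of equation (1)
     x(t) = sqrt(h(t)) * v                             ! observation for period t
  end do
  print '(f12.6)', x                                   ! one observation per output record
end program simulate_garch11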
Simulation Example

In order to test the ease of use and, in particular, the speed of the three packages within a consistent and logical framework, the following 'mini-Monte Carlo' study is employed (footnote 1). First, 500 series, each of length 500 observations, are generated using a Fortran program (footnote 2) for a GARCH(1, 1) model with standard Normal innovations, as given in equation (1) above. For the purpose of data generation, the parameters are set to b_0 = 0.0108, b_v = 0.1244, b_h = 0.8516. These values are derived from a GARCH estimation on a set of Swiss Franc/Sterling exchange rates, and are reasonable values given the results in the rest of the empirical literature on this subject. The task of each package was to read in each of the data files, estimate a GARCH(1, 1) model on the data, and then write the results to a text output file. All three packages were run under DOS on an IBM-compatible personal computer with a Pentium processor running at 133 MHz and with 16 MB of RAM.

Results and Analysis

A summary of some of the most important features is given in Table 1. The first feature worth noting is the surprising speed with which TSP was able to estimate all 500 GARCH models: it was over eight times quicker than the next fastest (RATS). This lightning speed is a little surprising, but may be a result of the longer history of TSP, which was originally developed, without so many 'bells and whistles', to run on machines that were very slow and had small working memories relative to those currently available. Speed is still of the essence today for the estimation of non-linear models on large samples or models, or where a Monte Carlo study is employed.

Ease of use can be proxied by the number of lines of computer code required in each case. There are a number of stages involved in estimating a GARCH model: reading in the data; specifying a mean equation, a variance equation and a log-likelihood function; running a linear regression to obtain appropriate starting values; and choosing the method of optimisation (footnote 3). (The log-likelihood in question is written out at the end of this section.) All of the packages essentially follow these same steps, the crucial difference between them being the extent to which the user must specify or code up the steps himself, or the extent to which they have already been incorporated into a prewritten routine. GAUSS and TSP are clearly at opposite ends of the spectrum in this case; in the latter, the user need only write the word 'arch' (plus some options and the name of the dependent variable) in order to run a GARCH model (footnote 4). This extremely easy-to-use approach has a clear advantage for those who are seeking to run the program quickly but who are not interested in the technicalities. It may, however, be viewed as a disadvantage, since the model appears as a 'black box' and the researcher does not even get to see the underlying model or log-likelihood function, let alone specify it.

Table 1. A Summary of the Performance and other Features of GAUSS, RATS and TSP for Estimating 500 GARCH(1, 1) Models.

Thus the user must also trade off a great deal of flexibility for this simplicity: ARCH, GARCH and GARCH-M are the only variants that are preprogrammed in the current version of TSP.
If the researcher wanted any of the large number of other variants, such as EGARCH or the GJR formulation, he would have to program up the whole GARCH routine from scratch in order to be able to modify it. Adaptations would clearly be much easier in GAUSS or RATS, where the user must specify the mean and variance equations and the log-likelihood function anyway. Some sample code for GARCH estimation using GAUSS and RATS is readily available on various World Wide Web pages and FTP sites and, in the case of RATS, is also distributed with the program and documented in the manual.

GAUSS was the only package which, for most of the models and using the default convergence criteria, seemed to experience difficulty converging to a sensible optimum value of the parameters. Although convergence was not always achieved by the other two packages, their estimated parameter values were never far from the theoretical values. The problem was particularly serious with GAUSS since it caused the whole system to crash out of the program, whereas RATS and TSP would simply move on to the next data series. This is, in the opinion of the author, likely to be a particularly worrying problem in general, since the estimation here was 'easy': the true data-generating process was known and appropriately specified in the estimated equation (footnote 5).

Another point of comparison between the three packages is the amount of useful output that is automatically displayed with the regression results. As well as the parameter values, standard errors and the corresponding t-ratios, TSP and RATS also report the number of observations used. Both also give information on the number of iterations taken and the maximised value of the log-likelihood (TSP reports these provided the silent option is not specified). These pieces of information may be useful in themselves, or may be required to compute diagnostics or for hypothesis testing. RATS and GAUSS could, of course, potentially provide more diagnostic information, although the code for this would need to be written additionally.

Programming support is a valuable tool for the applied econometrician. To the knowledge of this author, the only supplier to provide support for users who run into programming difficulties (as opposed to difficulties with the physical operation of the software) is Estima, the supplier of RATS. They operate an e-mail service, where users who have run into difficulties with the operation of the software, or whose programs just do not work, can send an e-mail to the help-line. This service is extremely useful for researchers who have invested a great deal of time in writing programs, only to tear their hair out when those programs will not run or will not give sensible answers. The lack of technical support, and of good documentation for the applications modules, in the case of GAUSS has already been highlighted in an earlier review in this journal by Heywood and Rebello (1996).

The cost of each package may also be an important consideration for hard-up academics on tight budgets. RATS and TSP are both obtainable from their respective creators in the United States at a cost of $500 for both the Windows and DOS versions together. RATS is also distributed in the United Kingdom by Timberlake Consulting for a price of £350 for the DOS version. GAUSS cannot be obtained directly from Aptech, but again is available from Timberlake at a cost of £455. Since this does not include any of the add-in modules required for easy and effective use, GAUSS is by far the most expensive of the three packages reviewed.
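To make the mean equation, variance equation and log-likelihood steps referred to above concrete, the objective maximised in this exercise is the conditional Gaussian log-likelihood implied by equation (1). A standard statement, given here for reference and not taken from any of the three manuals, is

\ell(b_0, b_v, b_h) = -\frac{1}{2} \sum_{t=2}^{T} \left[ \ln(2\pi) + \ln h_t + \frac{x_t^2}{h_t} \right], \qquad h_t = b_0 + b_v x_{t-1}^2 + b_h h_{t-1},

with h_1 initialised, for example, at the sample variance of x_t. This is the function that GAUSS and RATS users code explicitly and that TSP's 'arch' instruction handles internally; in each case the BHHH algorithm (footnote 3) is used to maximise it.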
Conclusions

To sum up, all three packages can be used to estimate simple models like a GARCH(1, 1) without too many difficulties. The three essentially follow the same steps; the essential difference lies in the extent to which those stages are already programmed into instructions which can be called, or must instead be written by the researcher. GAUSS and RATS clearly represent the most flexible environments in which to work, while TSP is noteworthy for its speed and compactness of code. However, RATS would be the preferred choice of this author, given the evidence above, since it provides a useful compromise between simplicity and flexibility. One certainly needs to be a considerably more adept programmer in order to be able to use GAUSS. Furthermore, a large number of GARCH variants, such as the GJR, E-GARCH and Q-GARCH models, have been programmed up in RATS and are available from Estima's FTP site or dial-up bulletin board.

CHRIS BROOKS
University of Reading

References

Bollerslev, T., Chou, R. Y. and Kroner, K. F. (1992). 'ARCH modelling in finance: a review of the theory and empirical evidence.' Journal of Econometrics, vol. 52, pp. 5–59.
Heywood, C. and Rebello, J. (1996). 'Time series analysis of cross-sections: a software guide.' Economic Journal, vol. 106 (438), pp. 271–5.

Fortran PowerStation (Version 4.0)

Microsoft Corporation, One Microsoft Way, Redmond, WA 98052-6399, USA. Tel 206 936 8661. With the IMSL routines: standard price $749, academic price $295; without the IMSL routines: standard price $595, academic price $199.

Fortran PowerStation is Microsoft's 'Integrated Development Environment' (or IDE) for building software with the Fortran 90 language. While the imminent death of Fortran has been predicted at least since the advent of PL/1 on mainframes, or APL on smaller computers (where are these languages now?), Fortran has survived, giving rise to the old riddle: 'What will be the computer language of the next century? Nobody knows, but it will be called Fortran.' Why does Fortran continue to be a useful option for such economics tasks as simulating the general equilibrium effects of monetary policy rules or computing nearest-neighbour semiparametric estimates with large data sets? The answer seems to lie in three qualities of good Fortran programs: (1) Fortran programs typically execute quickly; (2) Fortran programs port well from machine to machine; and (3) the very limitations of Fortran commands often mean that programs written by one person can be understood by others. There is a body of public domain source code in Fortran available for accomplishing a wide variety of economic, statistical, mathematical and engineering tasks. The fact that many of these routines are tested to run quickly and error-free makes the language attractive. Many recent textbooks on numerical analysis (for example, Sewell (1988)) include source code examples that are written in Fortran.

1. Fortran 90

The advent of Fortran 90 has increased the likelihood that Fortran will survive the death of those aged practitioners who do not care to learn how to debug a program written in a newer 'object oriented' language. Fortran 90 is the first major language to standardise structures that can take advantage of parallel computer architectures.
The trend in modern computers to be designed around more, rather than faster, processors is likely to continue into desk-top computers. Additional processors will only add more speed if the software is written so that several tasks can be accomplished at the same time rather than serially, which is how most programmers still conceive of their work. The importance of Fortran 90 is that it marks the first time that all the vendors, engineers, scientists and developers from around the world, with their different programming styles and their separate interests in the new incarnation of Fortran, could agree on a common interface by which the programmer can signal to the compiler that an operation may be done in a parallel manner. Further, this monumental task is accomplished with a parsimony of commands that clarifies foreign code and allows programs to be written without large dictionaries of possible commands. Thus, evaluating a vector dot product, a task that should be done in parallel rather than serially, is always signalled with the intrinsic function dot_product(vector_a, vector_b), no matter what particular machine or implementation of Fortran 90 is being run.

In addition, Fortran 90 clears up some of the maddening inflexibility of Fortran 77. For example, variable names are no longer restricted to six characters, and the fixed-format source code (with the first six columns of each line reserved for statement labels or continuation characters and code confined to a line seventy-two characters wide, a throwback to the days of computer punch cards) can now be written in a free-format mode, much like other languages. Many of the popular extensions to Fortran 77 which are non-standard but almost universally used (such as DO ... END DO constructs) are now codified into Fortran 90.

Microsoft has greatly enhanced the usefulness of Fortran 90 with its implementation, Fortran PowerStation. This platform offers many options for developing programs written in Fortran 90. Of particular interest to economists are features that facilitate (1) the input of source code; (2) debugging the program; and (3) compiling a fast-running executable program with full access to the Windows 95 system routines. I will cover each of the three features in detail. It should be noted that even if the main intention is to write programs to be run on a department-wide platform, such as a mainframe computer, PowerStation offers powerful tools for the development and testing stages of program design. These tools can be used to produce and debug source code that can then be ported to a high-performance platform for execution with a different compiler. PowerStation's strict adherence to the Fortran 90 standard should make this porting process painless.

2. Producing a Source Code

Microsoft has made the composition of a Fortran program much easier than the simple expedient of using a word processor to save a text file. Command words are immediately highlighted in blue, comments immediately turn green, and other options can be switched on to use colour as a visual signal to help in programming. For fixed-format input, the sixth column is always highlighted in green, greatly reducing the risk of syntax errors resulting from inadvertently shifting a statement by a column. For those of us who write in free format, there are also nice features. A toggled option automatically adds the blank spaces needed to indent the next line to conform with the one above. (This greatly facilitates the use of indenting to indicate 'if ... end if' blocks and 'do' loops; a short free-format fragment in this style is shown below.)
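As an illustration of the free-format style and of the Fortran 90 features mentioned above (long variable names, DO ... END DO and the dot_product intrinsic), the following small fragment may help; it is an illustrative sketch only, with names such as asset_weights and portfolio_return invented for the example rather than taken from the PowerStation documentation or its sample programs.

program free_format_example
  implicit none
  ! long, descriptive names are legal in Fortran 90 (no longer limited to six characters)
  real, dimension(3) :: asset_weights = (/ 0.5, 0.3, 0.2 /)
  real, dimension(3) :: asset_returns = (/ 0.04, 0.07, 0.01 /)
  real    :: portfolio_return
  integer :: i

  ! dot_product signals an operation the compiler may evaluate in parallel
  portfolio_return = dot_product(asset_weights, asset_returns)

  ! free-format source: no column-six continuation marks; indentation used freely
  do i = 1, size(asset_weights)
     if (asset_weights(i) > 0.25) then
        print *, 'large position in asset ', i
     end if
  end do

  print *, 'portfolio return = ', portfolio_return
end program free_format_example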
A simple toggle key gives the user a full screen for editing source code. Selecting a command and pressing F1 gives the description of the command from the on-line reference manual. Thumbing through large files is greatly eased by the use of 'bookmarks'. The list goes on and on. Indeed, I can think of only a few things that are missing from the editor. While matching parentheses are indicated by a key stroke, matching 'if ... end if' or 'do ... end do' statements are not. I would like a quicker, more visually informative way to check the dependencies among the subroutines. Finally, a back-up option, in which previous versions of the program were not automatically overwritten at compilation time, would prevent those uncomfortable times when a program that runs well is turned into a program that does not run at all. However, these are minor complaints. No matter where I run a Fortran program, I now always compose my code on the PowerStation. I have come to depend on its many features.

3. Debugging

Perhaps the most impressive aspect of the PowerStation is its ease in debugging programs. Syntax errors detected in compilation are indicated by a message in a separate window; double clicking the message sends one straight to the offending statement in the source code. Errors in program logic, always much harder to detect, can be sought with a debugger. Break points are set with the click of a mouse button, and conditions for the break point can be imposed with a dialogue box. Thus, if I want execution of the program to stop at a given statement only if a certain expression is false, it is easy to tell the debugger to do so. Alternatively, one can step through the program a statement at a time using a variety of buttons which allow subroutines to be skipped or examined in detail. Examining the value of variables is made easy through two windows, one containing all the variables and arrays of the subroutine where the program has stopped, and another containing variables or expressions of particular interest to the programmer. These expressions may be typed into the window, or selected in the source code and dragged into the window with the mouse. When the values of variables or expressions change, the value appears in red, helping to detect errors. Indeed, the debugger is so rich in features that there are many I have not yet learned to take advantage of. Suffice it to say that I now check all of my Fortran code with a run through Microsoft's debugger before I am convinced the output is reliable.

4. Writing an Application

Microsoft offers many options for designing stand-alone applications. Before compiling any program for the first time, the author must choose a 'project' type for her program. This determines whether her program will have simple character output, graphical output, or output that can use many of the routines embedded in the Windows system. Indeed, the centrality of the concept of a project leads to the greatest weakness in the PowerStation program. Most of the frustrating mistakes made by programmers new to the PowerStation come from misunderstanding the subtle distinction between a project (the application actually being run in Windows) and the text file that represents the source code of the Fortran program. It is the open project (not the source code in the main window) which is compiled, linked and run. Novices to the PowerStation will often fail to link a project to the code, and then fail to understand why the code in the main window does not compile.
Microsoft should have written a 'bare bones' option for novices, where a program not formally linked to a project is automatically associated with a simple default project when the program is compiled and executed. Compiling and linking a project automatically creates an application. Menu-driven options allow a programmer to optimize her code for different processors. A wide variety of other options can be set; for example, one option allows commands from platforms other than this particular version of PowerStation to compile.

Does the code execute quickly? I was very impressed. A GAUSS program (albeit one with many 'do loops', which shows GAUSS at its very slowest) took about ten times as long to execute as the Fortran program which replaced it. I see many of the programmers in our department now executing their Fortran code on desk-top Pentium-based computers rather than porting their code over to our mainframe computer, because the gain in speed on the mainframe is not worth the effort of porting the programs to it. Perhaps this evidence is not a fair test of other methods of executing code, but the fact remains that all the programmers I have seen starting on the PowerStation have been pleasantly surprised with the speed of execution.

The ease of compiling an easy-to-use application opens up new possibilities for the submission of economic research journal articles. With the new complexity in the use of numerical analysis to handle important economic questions, verification of the sensitivity of results is now problematic. Some authors have made their source code available, but many are hesitant to do so because so much fixed cost has gone into the development of the software code. Writing several publications from the modified source code is a way to recoup the large fixed cost before other authors can use the code. With an application written in Fortran PowerStation, an author can submit a menu-driven application which allows others to check her claims of robustness or to check solutions of problems with known, analytically derived properties. Further, the compiled application could be made available on a Web site for use as an educational or policy-making aid. This would be even more true if Microsoft were to include some useful graphics routines for displaying output in two and three dimensions. (A two-dimensional graphing routine is included in the sample programs, but it is a bit clumsy.)

For programmers who are facile with UNIX, PowerStation includes a number of powerful command-line tools very much like their UNIX counterparts: NMAKE (like UNIX make), LIB (like UNIX ar), LINK, and FL32 (like UNIX f77). With some effort, one can import UNIX make files for LAPACK and other large libraries, and use the Microsoft tools to build the library. For those programmers with large C++ libraries, mixed programming in Fortran 90 and C++ is made easier by the fact that Microsoft Developer Studio (the IDE included with PowerStation) is the same IDE used by Visual C/C++.

There are many other features on the Fortran PowerStation CD-ROM. The version I use, PowerStation Pro, has a compiled version of the IMSL mathematical and statistical library, which includes some useful functions and subroutines. Included in all versions is the Numerical Recipes Fortran 77 library. In reading the manual, I came across a reference to Fortran 90 for Scientists and Engineers, which I wrote down hoping to look it up in the library. It turned out to be included in hypertext on the CD-ROM.
Many Fortran products have the feel of being written by programmers who would really rather be working with C. I have worked with Fortran compilers where using the debugger involved translating the program first to C and then using a C debugger. Microsoft PowerStation, on the other hand, feels as if it was written by people who understand and love Fortran. They understand its moods and its strengths. It is a pleasure to program with such a product.

BEN CRAIG
Federal Reserve Bank of Cleveland

Footnotes

1. Note that this exercise is merely undertaken to compare the properties of the packages, and is not intended to constitute a full Monte Carlo study.
2. This should ensure that none of the packages has an unfair advantage over the others, as might be the case if the data were generated from within one of the programs under consideration.
3. In all cases, the BHHH method was used, since this was available in all three cases and is well documented in the literature.
4. Peculiarly, a GARCH(1, 1) model is the default for the command 'ARCH', and not an ARCH model as one might have suspected from the name.
5. Although it must be said that the author has experienced particularly extreme problems with non-convergence in the past using TSP.

Reference

Sewell, G. (1988). The Numerical Solution of Ordinary and Partial Differential Equations. New York: Academic Press.

© Royal Economic Society 1997