UBC Theses and Dissertations

UBC Theses Logo

UBC Theses and Dissertations

A GIS application for regional streamflow analysis 1993

You don't seem to have a PDF reader installed, try download the pdf

Item Metadata

Download

Media
ubc_1993_fall_cathcart_jaime.pdf
ubc_1993_fall_cathcart_jaime.pdf [ 5.4MB ]
Metadata
JSON: 1.0050485.json
JSON-LD: 1.0050485+ld.json
RDF/XML (Pretty): 1.0050485.xml
RDF/JSON: 1.0050485+rdf.json
Turtle: 1.0050485+rdf-turtle.txt
N-Triples: 1.0050485+rdf-ntriples.txt
Citation
1.0050485.ris

Full Text

A GIS APPLICATION FOR REGIONAL STREAMFLOW ANALYSIS by JAIME GRANT CATHCART B.A.Sc., The University of British Columbia, 1987 A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF APPLIED SCIENCE in THE FACULTY OF GRADUATE STUDIES (The Department of Civil Engineering) We accept this thesis as conforming to the required standard THE UNIVERSITY OF BRITISH COLUMBIA June 1993 © Jaime Grant Cathcart, 1993 In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission. Department of  riu;^irl'eCrri(1 The University of British Columbia Vancouver, Canada Date  DE-6 (2/88) ABSTRACT Regional analysis is a statistical procedure used by hydrologists to estimate streamflow parameters for ungauged streams. The procedure involves analyzing historical flow records for streams in a particular region, in order to estimate flows for other streams in the region. Traditionally, the process of obtaining and summarizing relevant streamflow data has been rather inefficient and time- consuming. Regional streamflow analysis is performed frequently in British Columbia, and there is a need for a system designed to facilitate this process. This thesis involves the development of such a system. The system is comprised of a computer program (DATSUM) which statistically analyzes and then summarizes streamflow data for British Columbia, in a format which can be read and displayed by a geographic information system (GIS). Water Survey of Canada (WSC) streamflow records are the raw database, and the GIS used for this project is a locally developed product, known as QUIKMap. The system operates on DOS based personal computers, and information is presented to the user as a series of symbols on a digitized map of the major rivers and streams of B.C. Each symbol represents a WSC streamflow station, and detailed information about each station can be simply obtained by selecting the stations of choice with an electronic computer mouse. This information can then be used for regional analyses or other streamflow dependent applications. The thesis also involves adapting a flow estimating program, called FLOOD, to make it compatible with the DATSUM program. The FLOOD program estimates extreme high or low flows for streams with little or no streamflow data, and is perfectly suited to be used in conjunction with the DATSUM / GIS system. Finally, use of the DATSUM / GIS system is illustrated with a regional analysis of the Kitseguecla River. The results are then used as input for the FLOOD program, and the river's 200 year flood is estimated. iii TABLE OF CONTENTS Pace ABSTRACT ^ LIST OF TABLES  ^vii LIST OF FIGURES  ^viii ACKNOWLEDGMENT  ^ix Chapter 1. INTRODUCTION  ^1 2. REGIONAL ANALYSIS  ^5 2.1^Regional Analysis^ 5 2.2 Regional Flood Frequency Analysis^ 6 2.3^Limitations of Regional Flood Frequency Analysis^7 2.4 DATSUM/QUIKMap/FLOOD System  ^8 3. GEOGRAPHIC INFORMATION SYSTEMS (GIS) ^ 10 3.1 Why a Geographic Information System Was Chosen ^ 10 3.2 What is a Geographic Information System? ^ 11 3.3 Advantages of GIS Over Traditional Spatial Data Management Systems ^ 12 3.4 Disadvantages of GIS's 13 3.5 GIS and Hydrology 14 3.5.1 Using GIS to Generate Hydrologic Model Inputs ^ 15 3.5.2 GIS and Expert Systems ^ 16 3.5.3 GIS/Hydrologic Model Interfaces ^ 18 iv TABLE OF CONTENTS (continued) Chapter 3.5.4 Database/GIS Interfaces 3.6^Selection of a GIS Package ^ 4.^DEVELOPMENT OF THE DATSUM PROGRAM ^ Page 18 19 21 4.1 Initial Considerations 21 4.2 WSC Data Files ^ 22 4.3 DATSUM Summary Files ^ 22 4.4 DATSUM Program Organization 23 4.5 Instructions for Use of DATSUM ^ 25 4.5.1^Installing the Program 25 4.5.2 Creating and Loading WSC Data Files 25 4.5.3 Running the Program ^ 27 4.5.4 Exiting the Program 29 5. OPERATION OF THE DATSUM/QUIKMAP SYSTEM ^ 31 5.1 System Organization 31 5.2 Operating the QUIKMap Application 31 6. THE FLOOD PROGRAM ^ 40 6.1 Description of the FLOOD Program ^ 40 6.2 Advantage Over Frequency Analysis 40 7. SAMPLE APPLICATION ^ 42 7.1 Kitseguecla River 42 7.2 Data Summary Procedure ^ 42 TABLE OF CONTENTS (continued) ^ Page Chapter 7.3^Regional Analysis 7.4^Additional Data 8.^DISCUSSION AND CONCLUSIONS ^ BIBLIOGRAPHY ^ Appendix 47 48 51 56 A. DATSUM PROGRAM DETAILS ^ 59 A.1 DATSUM Summary File 60 A.1.1^Fields of the DATSUM Summary Files 60 A.1.2 Example Summary File - Region 08JA ^ 62 A.2 Macros: Names and Functions ^ 65 A.3 Programming Code ^ 76 A.4 Installing Customized QUIKMap Files 116 B. FLOOD PROGRAM DETAILS ^ 117 6.1 FLOOD Program Organization 118 B.2 Instructions for Use ^ 119 B.2.1^Installing the Program ^ 119 B.2.2 Running the Program 119 B.2.3 Exiting the Program ^ 122 B.3 Macros: Names and Functions 123 B.4 Programming Code ^ 127 vi LIST OF TABLES Table Page 7.1 Regional Flood Data for Kitseguecla River Near Skeena Crossing 46 7.2 Estimates of Peak Flood Parameters for Kitseguecla River ^ 47 7.3 Extreme Flow Records for Station 08EF004 ^ 49 7.4 Estimates of 200 Year Mean Daily and Instantaneous Floods on Kitseguecla River ^ 50 vii LIST OF FIGURES Page Figure 4.1 Sample DATSUM Screen Message  ^28 4.2 Sample DATSUM Input Screen  ^28 4.3 Sequence of DATSUM Screens Requesting Extreme Flow Information 30 5.1^QUIKMap Data File Directory Menu  ^32 5.2 QUIKMap Data File Directory Menu  ^32 5.3 Basemap of B.C. with WSC Streamflow Regions  ^33 5.4 Sequence of QUIKMap Data File Directory Menus  ^34 5.5 WSC Streamflow Stations - Region 08JA  ^35 5.6 Sample QUIKMap Map Legend  ^37 5.7 Sample QUIKMap Data Display Screen  ^38 7.1^Use of QUIKMap's "Radius" Function  ^45 A.2 Flow Chart Illustrating DATSUM Macro Relationships  ^73 B.2.1 Initial Input Dialog Box  ^120 B.2.2 Additional Flow Data Dialog Box ^  120 B.3 Flow Chart Illustrating FLOOD Macro Relationships^126 ACKNOWLEDGMENT The author wishes to express special thanks to his supervisor, Dr. S.O. (Denis) Russell, for his invaluable guidance, assistance and encouragement throughout the development of this thesis. Thanks is also extended to the engineers and management of Crippen Consultants, particularly Wynn Morgan and Mario Merlo, for their generous technical and financial assistance. Finally, the author is indebted to the Science Council of British Columbia for endorsing the project and providing substantial financial assistance. ix CHAPTER 1 INTRODUCTION Water is a special substance. It is the medium from which life began and it is vital for the continuing existence of all species. It enhances human lives with its many forms of natural beauty and its amazing resource potential. Canadians are very fortunate, as they have an abundant supply of water. However, this water supply is not unlimited, and Canadians must take care to manage their water resources in a responsible and knowledgeable fashion, so that future generations may also enjoy the benefits of Canada's water. A key element in the management of water resources is the ability to understand the water cycle and its related processes, such as rainfall, runoff and stream flows. For centuries man has been attempting to manage water resources, be it to control floods, irrigate fields, generate electricity, or preserve natural beauty. The water cycle, however, is unpredictable, and the greatest challenge has been to accurately estimate or measure the water supply. A whole science, known as hydrology, has developed around this pursuit. Over time, many water supply estimating techniques have evolved, but the science of hydrology is still largely empirical. With the advent of computers, estimating techniques have become more complex and sophisticated, as systems are able to manage increasing amounts of data. Despite this, there is still great difficulty and inadequacy in trying to model a system with so many variables. One approach to estimating stream flows which has been popular for years, but which is still evolving and improving with the growth of micro-computers, is 1 regional analysis. Regional analysis is a statistical procedure based on the idea that as stream flow patterns and volumes are determined by precipitation and runoff conditions, then all streams and rivers in the same geographical area will tend to have similar flows. The procedure involves analyzing historical records for streams in a particular region in order to estimate flows for ungauged streams in the same region. Over the past twenty years a new area of computer technology, known as geographic information systems (GIS), has been slowly evolving. In the past five years, with the growth of the personal computer, development of this technology has accelerated rapidly and it is now being embraced by hydrologists and engineers. A GIS is a spatial data handling system. It is capable of analyzing and displaying spatially related data, which makes it particularly suited to hydrologic modeling; many hydrologic parameters are dependent on area and thus their areal extent must be assessed before their values can be determined. Considerable work has been done in the development of GIS applications for the generation of input values for hydrologic models. In addition, systems which interface between GIS's and hydrologic models have been developed. In contrast, very little work appears to have been done to create data\GIS interfaces. All types of electronic databases are currently available, but they do not have much practical use if they cannot be readily utilized by other computer systems. This thesis involves, amongst other things, the creation of such an interface. The objective of this thesis is to statistically analyze and summarize streamf low data for British Columbia and to present it in a way that will assist hydrologists in performing regional analyses. In British Columbia, streamflow records for hundreds of streams have been kept for varying periods of time over the last 100 years. This data is compiled and updated on an annual basis by Water Survey of Canada (WSC), and is now available in CD-ROM format. 2 This thesis involves the development of an Excel computer program (DATSUM) which processes raw data files from the CD information disc to create summary data files for Water Survey of Canada (WSC) streamflow regions in B.C., in a format which can be read and displayed by a geographic information system (GIS). The GIS is able to display a detailed digitized map of the major streams and rivers of British Columbia, and then upon request of the user, show the data summary files on the map, so that regional relationships between the various streams can be readily established by a hydrologist. It should be noted that many of the attributes usually associated with a GIS are not utilized by this project. The GIS, in this case, serves only to display data and does not perform any complex analysis. In addition to the DATSUM program, this thesis involves the rewriting, in Excel language, of a flow estimating program (FLOOD) developed by Dr. Denis Russell, a civil engineering professor at The University of British Columbia. This program, originally written in Pascal language, estimates design flows (peak flows, low flows, etc.) on the basis of a stream's mean annual flow and its coefficient of variation, two parameters displayed by the GIS system. It was considered necessary to rewrite the program so that the data summary program and the flow program could be operated with the same software. This text is set out in the following manner. Chapter 2 describes the need for design flow estimating techniques, with emphasis on regional flood frequency analysis. Chapter 3 describes geographic information systems in detail and outlines the major areas or research and development of hydrological applications. Chapter 4 deals with the development of the data summary program DATSUM. Chapter 5 explains how the GIS program operates with the files created by the DATSUM program. Chapter 6 describes the design flow estimating program FLOOD, and explains how it can be used. Chapter 7 demonstrates how the whole 3 system operates, with an example application for the Kitseguecla River, in central British Columbia. Finally, a summary and discussion of the implications of the system are given in Chapter 8. 4 CHAPTER 2 REGIONAL ANALYSIS British Columbia contains thousands of streams and rivers. Most of the major waterways have a gauging station where streamflows are recorded. However, many small streams and rivers have no gauging stations or historical streamflow records. As the interest in small stream development and management in the province continues to increase, due in part to the province's energy focus changing from large scale hydro development to small scale hydro, there is an increasing need for a simple, yet effective, method for estimating flows for ungauged streams. 2.1^Regional Analysis The most common and effective method for estimating streamflows is regional analysis. Regional analysis is a statistical procedure based on the idea that streams and rivers in the same geographical area will tend to have similar flows, since stream flow patterns and volumes are determined by precipitation and runoff conditions. The procedure involves analyzing historical records for streams in a particular region, to provide a basis for estimating flows for ungauged streams in the same region. The estimates are made by adjusting the recorded flows according to relationships between various catchment characteristics or parameters. Generally, such analysis requires significant time and expertise, and it is the intent of this thesis to develop a system to simplify and expedite this process. 5 2.2^Recional Flood Frequency Analysis Regional analysis is most often performed as part of a flood estimating procedure. In fact, this is so often the case that the majority of hydrology texts only discuss regional analysis in terms of regional flood frequency analysis, a flood estimating technique which combines regional analysis with frequency analysis. Frequency analysis is a statistical technique for estimating the size of a peak flow event which will be equaled or exceeded within a specified time period. There are a number of different flood frequency analysis techniques available, but their differences are minimal, and there are certain elements which are common to most. These elements are described in the text" Hydrology of Floods in Canada!' (Watt et al. 1989) as follows: 1. The regional boundaries are defined and gauging stations are screened for acceptable records. 2. A single-station frequency analysis is carried out for acceptable stations within the region. 3. Relations are postulated between certain flood statistics and the physiographic and climatic characteristics of the drainage basins. 4. Values of required physiographic and climatic characteristics are determined for each drainage basin. 5. The relations postulated in (3) are cast in the form of prediction equations, the coefficients of which are determined using multiple regression (a method used to relate basin characteristics to flow parameters) or other statistical techniques. 6 Typically, a regional flood frequency analysis is performed for various "homogeneous" regions of a large geographic area, such as a province or state, and then the derived equations are published and made available to interested parties. These parties then use the information to make quick estimates of design flows. One example of such a project is a regional flood frequency analysis for the state of Alaska (Lamke 1979). This was an extensive study, undertaken by the United States Department of the Interior Geological Survey, involving the analysis of streamflow data for 260 gauged basins. The state was divided into two climatic regions and regression equations were generated for floods of different recurrence intervals. The average standard errors of the regression equations for the two areas were 48 and 74 percent, which is fairly representative of the amount of error which can be expected when applying this type of analysis to such a huge area. 2.3^Limitations of Regional Flood Frequency Analysis Regional flood frequency analyses have been performed for various parts of Canada ( Watt et al. 1989) and the technique is reasonably reliable and effective in areas with similar physiographic, geomorphologic, and climatic characteristics. However, there are a number of difficulties, drawbacks and limitations involved with using this technique. To begin with, the technique only works when there is sufficient data. All gauging stations used in the analysis must have similar periods of record, and only those stations with 10 or more years of record should be used (Watt et al. 1988). In British Columbia this is a serious problem as many areas of the province have minimal streamflow records. Another problem is that the technique is only reliable and effective in areas with similar physiographic, geomorphologic, and climatic characteristics. In addition, it is often very difficult to define the boundaries of a "homogeneous" region 7 and to determine the significance of different characteristics within a region. In British Columbia, the terrain and climate are so variable that it is virtually impossible to produce reliable flood frequency models for large areas of the province. Regional flood frequency analysis can be performed on small regions of the province as needed, but many small projects cannot afford the time or expense to undertake an extensive study, and must rely on previously published studies. In many cases these studies, as with the Alaska study discussed previously, involve large areas and consequently have large errors. In addition, the system is unable to directly incorporate site specific information into the analysis. Frequency analysis requires information on mean annual floods, but additional information, such as the largest flood, or the flood which has not been exceeded in a number of years, is not given consideration. Finally, a major weakness of this technique is that once the prediction equations and regression coefficients have been produced, the user is unable to incorporate engineering judgement into the estimating procedure. Generally, the user of the equations is not familiar with all the decisions and judgement involved in producing the regression equations, and thus the equations may be used improperly. Basically, it becomes a "black box" procedure with all the associated difficulties. 2.4 DATSUM/QUIKMap/FLOOD System The GIS application developed in this thesis (DATSUM/QUIKMap system), when used in conjunction with the FLOOD program, provides an alternate means of estimating flood flows which is simple and efficient. It offers many benefits over traditional techniques by requiring the user to provide considerable judgement and input to the analysis process. Each analysis is performed on a region by region basis, rather than trying to "homogenize" large areas, with the size of region and 8 significance of different records determined by the individual user. This involvement forces the user to be aware of the systems limitations and the relative validity of results. In addition, the system is quick and user-friendly and does require the complex mathematics associated with regression analysis. As well, a means is provided for incorporating various types of flow data into the analysis which could serve to significantly improve estimates. Finally, results are presented with confidence limits (low, probable and high values) so as to provide the user with a gauge or 'feel" for the quality of the results. The most common basin characteristic used in regional analysis is basin area, but other factors such as basin slope, mean elevation, mean annual precipitation, orientation relative to mountains, and number and size of lakes are also significant. As this project is aimed at producing a system suitable for a prefeasibility level of study, the data generated and produced is presented in a unit area format and is designed to facilitate a regional analysis based primarily on catchment areas. If the user wishes to make a more sophisticated regional analysis incorporating additional information, then that information must be obtained from a different source. The type of data generated and produced by this system is discussed in Chapter 4 and listed in Appendix A.1. 9 CHAPTER 3 GEOGRAPHIC INFORMATION SYSTEMS (GIS) 3.1 Why a Geoqraphic Information System Was Chosen Dr. Denis Russell, a civil engineering professor at the University of British Columbia, has written a computer program, named FLOOD, for estimating design flows on the basis of estimates of a stream's mean annual flow and its coefficient of variation (cv) (Russell 1991). The program works very well for streams that have available streamflow records, as it is easy to calculate mean flows and cv's. However, the area where this program has the most useful application is with streams that have either no or very sparse records. This is the case for the majority of small streams in the province. In the past, if one wished to use this program to estimate flows for an ungauged stream, one would have had to undertake the following lengthy process : Obtain a map of the province, locate the target stream, and identify any WSC streamflow stations in the area. 2. Extract the streamf low data for every WSC station in the area. (In the past, this data was only available from WSC in book form, but recently, a CD-ROM data source has been published.) 3. Analyze the data to generate unit area means and cv's for every station. 4. Perform a simple regional analysis to estimate the mean and cv for the stream in question. 5. Enter the estimated data into the FLOOD program. 10 Since steps 1) through 3) can be quite time consuming, it was considered desirable to develop a system which would eliminate them and present all the required data in a simple user-friendly package. The intent of this thesis is to develop such a package. It should be noted that this system does not attempt to estimate flow values, but rather is designed to aid engineers and hydrologists in making estimates. A program which attempts to make estimates of streamflow values could be dangerous. People are inclined to believe computer outputs with blind faith, and it is felt that more realistic and reliable values can be obtained by combining regional information with engineering judgment. In other words, the system will provide the user with information about the various streamflow stations in a particular region, and the user will have to "eyeball" the figures and estimate the best composite value for the stream in question. The FLOOD program prompts the user for low, probable and high estimates of a value, which lends itself very well to this "eyeballing" procedure. At first it was thought that the best way to create a data analysis and display system would be to develop a program using C++ software, which would generate a map of the province and display the mean and cv data for each streamflow station at the correct location on the map. However, this proved to be a daunting task, and upon further investigation, it was discovered that geographic information systems (GIS) could provide the required facility. Why reinvent the wheel when many systems designed specifically to handle spatial data were commercially available? 3.2 What is a Geographic Information System ? The latest computer tool to be applied to the problem of estimating streamflows is the geographic information system (GIS). A GIS is a spatial data handling system. Basically, it can be thought of as an image (map) display system connected to a database system. However, it is more sophisticated than that, as a 11 GIS must not only be able to store and display data, but it must also be able to manipulate and analyze data, and then display the results of these analyses (Marble 1984). A GIS is able to determine, in a quantitative fashion, the complex relationships which exist between map elements. 3.3 Advantages of GIS Over Traditional Spatial Data Management Systems GIS's were developed out of a need to be able to overlay spatial data sets, so that relationships between the different layers of data could be determined. Traditionally, if one wished to integrate two or more spatial data sets, one would have to transform the data sets to a common map scale, prepare a transparent overlay for each set, arrange the different overlays so that coordinate systems are aligned, then manually prepare a composite overlay which shows areas where the different data types overlap to form various data groups. This procedure can be extremely complex and time consuming, and thus its use has been limited. The development of GIS technology has made this type of analysis very efficient and enables the user to selectively analyze only the data which is pertinent to a particular situation. As well, the availability of a system which is able to quickly and efficiently process large amounts of data encourages the consideration of different options or alternate approaches to an analysis, which otherwise may have been missed due to time, cost and manpower constraints. In addition to the ability to analyze spatial data, GIS's provide other advantages over the traditional spatial data management systems. For instance, once a data base has been established and stored in a GIS, it is easily modified and updated as needed. This is in contrast to analog map documents which are costly and time consuming to change. Another advantage of GIS's is that they allow the user to quickly move from one area of the map to another, and to zoom in and 12 out of a map image as required. Traditional paper maps provide only one view and are difficult to work with if the area of interest spans more than one map sheet. GIS's are also superior over traditional maps in the sense that they provide a readily available system for producing graphics for reports or presentations. Most GIS software systems are able to quickly produce high quality scale plots or prints, and supporting map features, such as roads, lakes, streams or culverts can be incorporated in the graphics whether or not they are GIS features (ie. connected to the data base). Finally, digitized maps are now being produced by government and private agencies which are compatible with most GIS packages, thus providing a source of high quality basemaps. In British Columbia, there is currently an ambitious mapping program underway to produce a digital terrain model of the entire province. This project is known as TRIM (Terrain Resource Information Management) and is a joint venture between government and industry in an effort to create a common geographic reference base (Balser 1989). By 1996, over 7000 digitized maps, at a scale of 1:20 000, will provide planimetric and topographic data for the whole province. This will serve to dramatically facilitate the use of geographic information systems in B.C. 3.4^Disadvantages of GIS's The technology of geographic information systems is very impressive and it is easy to get carried away and use GIS's inappropriately. Other computer systems may be far more suitable for a particular application and care must be taken when assessing the suitability of a project. It must be kept in mind that a generic GIS is not a modeling tool; it is a spatial data handling tool that can enhance modeling. GIS's have a number of drawbacks. For instance, building a GIS data base can be very time consuming and expensive, and often it is justified only if the 13 particular area impacts on many lives, and/or the model will be used for long term ongoing monitoring and studies, and/or the investigation is very complex. In addition, GIS's generally require large amounts of data, and locating and obtaining appropriate data may be a problem. Finally, geographic information systems is a specialized technology requiring highly skilled operators to fully utilize its capabilities. This requirement may limit the usefulness of a system to engineers or hydrologists. A note of caution which is particularly applicable to GIS's: as computer systems become more complex and automated, it is imperative that the user understand both the geographic information systems and any models used in conjunction with them. The user must be able to anticipate the expected range of output values in order to assess the quality of results. GIS's provide "a vehicle for generating impressive, seemingly accurate maps expressing complex spatial relationships with minimal understanding of actual spatial relationships" (Berry 1987). There is danger in the fact that technology is advancing so rapidly that it may be getting ahead of knowledge about using systems and interpreting results. 3.5 GIS and Hydrology GIS technology has been developing for approximately 20 years and has found extensive application in the forest industry. It is starting to gain acceptance in many engineering fields, such as mining and geology, and over the last ten years has seen increased use in hydrologic modelling (Haan, Nofziger and Zhang 1989). However, there is still not widespread use of GIS in most hydrology circles. This is surprising, as many hydrologic parameters are areally-based and interrelated, and geographic information systems are ideally suited to deal with this type of data. Perhaps the reluctance to adopt this technology is due in part to the fact that a GIS is a specialized tool requiring skilled personnel to fully utilize and understand its 14 capabilities, and many engineers and hydrologists would rather deal with systems that do not have such stringent operational requirements. Regardless, GIS use continues to gain momentum and will become more prominent as computer technology advances. 3.5.1 Using GIS to Generate Hydrologic Model Inputs The majority of water related GIS projects has involved using GIS's for either water resource management or the derivation of hydrologic parameters for input into hydrologic models. This second area of study has received considerable attention recently as hydrologic computer models have found greater use with the advent of the personal computer. The integration of GIS's with hydrologic models is playing an increasing role in the design, calibration and comparison of hydrologic models. Geographic information systems have been applied to hydrologic modeling in several recent studies. Moeller (1991) developed a GIS application to be used in conjunction with the SCS unit hydrograph methodology in the HEC-1 hydrologic model. His system automates the tedious and time consuming process of calculating areas for the derivation of SCS curve numbers, percent impervious values, and subbasin areas. These three parameters lent themselves very well to GIS analysis as they are all areally-based so that an areal extent was required in order to quantify their values. Muzik (1988) developed a system similar to Moeller's, in that he used a GIS to store data on physical land characteristics and rainfall data, and then generate SCS runoff curve numbers. He developed his system in association with another computer system which retrieved information from the GIS and computed several runoff parameters. 15 Wolfe and Neal (1988) used a GIS to provide data input to a finite element model. The layering capabilities of the GIS were used to overlay maps of soil type and land cover in order to identify areas which were hydrologically similar. Stuebe and Johnston (1991) developed a GIS application to assist in all stages of modeling stormwater runoff. The system was used to generate input for the USDA SCS TR-55 hydrologic model. Elevation, soil and landcover data were entered into the GIS and "hydrologic response units" were created and assigned a runoff curve number. Runoff volume values were then calculated for each grid cell and the watershed was delineated. The system then used the runoff area and runoff volume values to estimate a runoff volume for the watershed. 3.5.2 GIS and Expert Systems Another area of study which is producing some encouraging results is the combination of GIS applications with expert system technology. An expert system is a computer program designed to mimic the decision making or problem solving processes of a human expert. These systems are designed to deal with uncertainty and therefore have enormous potential in the field of hydrology, where situations are rarely well defined. An expert system can be used to capture the expertise of a human expert in a manner which can be interpreted and used by other computer systems, such as a GIS. GIS/expert systems are generally designed to produce input parameters for hydrologic models, but they differ from basic GIS's because they combine the efficiency of the GIS with the "mature hydrologic expertise" of an expert system to optimize the outputs. VanBlargan, Ragan and Schaake (1990 and 1991) have done a considerable amount of work in this area and have produced a system which automatically generates drainage area boundaries, times of concentration, and hydrographs. The system requires digital elevation and stream data as input. It 16 delineates basin areas by calculating the flow direction of every grid point on the basis of a number of rules (ie. the expert system) and then groups all the grid points which flow towards a common stream. Travel times across each cell are also computed and then the time of concentration for a basin is calculated by examining each possible flow path to determine the longest hydraulic path. Unit hydrographs are then generated from time area curves defined from the area and travel time values. The unit hydrographs can then generate complete streamflow hydrographs for any given storm. In addition to significant time savings over traditional means, this expert system may provide improved estimates of values. For instance, when determining times of concentration, the expert system examines every possible flow path and considers the slope and land use for each grid cell. In contrast, traditional methods could not afford this attention to detail, as it would be far too time consuming to be practical. Thus, flow paths were generally estimated as the longest distance seen on a contour map, a simplification which could produce considerable error. Obviously, expert system/GIS packages offer benefits over basic GIS applications, and this area of study will probably see the greatest development and use in the future. To date, however, hydrologists seem reluctant to embrace expert system technology. Perhaps this reluctance is due to the fact that engineers tend to be very cautious and skeptical of any system which claims to handle uncertainty. Water related processes (weather, runoff, streamflow, etc.) are inherently uncertain, and engineers have been witness to many failed attempts to integrate this uncertainty into conventional computer programs. With continued successful application, this technology will gain acceptance and should play a major role, likely in conjunction with geographic information systems, in the future development of hydrologic modelling. 17 3.5.3 GIS/Hydrologic Model Interfaces As explained previously, geographic information systems are finding application as input data generators for hydrologic models. Currently, most systems require an operator to use a GIS to generate hydrologic parameters, which are then manually entered into a hydrologic model. The two systems operate independent of each other, which is inefficient. In response to this, some work has recently been done in the area of developing GIS/hydrologic model interfaces. An interface acts as a link between the two systems, so that the data produced by a GIS is customized to suit whatever application is being run with the hydrologic model. Fisher (1989) describes a conceptual design for such an interface. This system would provide a graphic, menu-driven interface whose structure would be easily adaptable to various applications, including different hydrologic models. More work needs to be done in this area and once the various systems start "talking" to each other, GIS's will find greater application and use in hydrology. 3.5.4 Database/GIS Interfaces One area of study that seems to have received very little attention is the development of interfaces between raw data bases and geographic information systems. All types of electronic databases are currently available, but they do not have much practical use if they cannot be readily utilized by other computer systems. This thesis involves, amongst other things, the development of an interface between Water Survey of Canada's CD-ROM streamflow data base and a geographic information system called QUIKMap. This interface analyzes raw data to generate streamflow parameters which are then formulated so that they can be read and displayed by the GIS. The system is designed to summarize large quantities of data so as to provide users with a simple means of viewing data on a region by region basis, for the province of British Columbia. In contrast to most 18 other hydrological GIS applications, the GIS, in this case, serves mainly to display data and does not perform any complex analysis. As far as the author knows, this project is the first attempt in Canada to use a GIS system to aid in estimating streamflow parameters for ungauged streams for a whole province, in such a simple, yet effective, fashion. 3.6 Selection of a GIS Package For the purposes of this study, the GIS software QUIKMap, developed by the Axys Software company of Sidney, British Columbia, was chosen. This software package was selected because: 1. The system is able to store and display spatially related data. 2. An appropriate map of British Columbia was in print and available for use with the system. The Ministry of Environment has produced a digitized 1:2 500 000 scale map of B.C., called the Blue Atlas, which is available to any interested party. This map represents all the major streams and rivers in B.C., and is completely compatible with the QUIKMap software, making it perfectly suited to our application. 3. The system is very easy to use and operate. In contrast to many of the more sophisticated GIS's, QUIKMap does not require the operating skills of a full-time GIS technician. It was the intent of the author to produce a product which could be used simply and quickly by any hydrologist. 4. The system is relatively inexpensive, at approximately $800, compared to the average costs of $3,000 to $4,000, and a high of approximately $20,000, for other GIS software. This low price will not prevent small 19 consulting firms from purchasing the software and utilizing the results of this project. It should be noted that the reason this system is relatively inexpensive, is that it does not contain complex data analysis and manipulation capabilities and it does not handle polyline data as well as many other systems. Technically, QUIKMap would not be considered by many to be a "true" GIS, but rather a desktop mapping system. However, as this project is largely concerned with presenting point data, which QUIKMap handles very well, this is not a concern. 5. It is a product developed, sold and supported by British Columbians. As this project is largely funded by the Science Council of B.C., it was felt that it would be appropriate to utilize local products and expertise. 20 CHAPTER 4 DEVELOPMENT OF THE DATSUM PROGRAM Once a GIS package was selected, the next step was to determine how the system displayed data, how the data available on the CD-ROM could be formatted to be compatible with the GIS, and how it could be analyzed to produce the required results. 4.1^Initial Considerations The first factor to be considered was that the CD-ROM data is in ASCII format, while the GIS system requires database files in dBASE III format. The second factor was that the CD-ROM data is only available in a very basic form, and it needed to be analyzed and manipulated before being exported to the GIS. After much consideration, it was decided that the requirements could best be met with a spreadsheet program, and the software package Excel, produced by the Microsoft Corporation, was chosen. In addition to meeting all the requirements listed above, this software, or its compatible competition, Lotus 1-2-3, is available at almost all engineering consulting firms. At first it was hoped that an interactive system could be developed that would create and display data as needed. However, the GIS, Excel and CD-ROM software do not have the capability to interact, and so data summary files have to be created independent of the GIS. The files are saved to a directory which is then accessed by the GIS. 21 4.2 WSC Data Files Another unforeseen problem was that CD-ROM data files can only be exported and labelled one file at a time. As the province of British Columbia contains over 1600 streamflow stations, and two files were needed (monthly data and extreme data) for every station, over 3200 files had to be extracted and saved before any analysis could even begin. This proved to be an extremely time consuming and labour intensive process, and as a result, this project was limited to creating data summary files for only the south-west and central-west areas of the province. This required only approximately 1000 data files and was deemed sufficient to demonstrate how the system operates. If the user wishes to view data for other areas of the province, then data summary files can be created as needed. Fortunately, once the data summary files are created, they should be relevant for at least three or four years. It will not be necessary to produce new data summary files every year as the values calculated by the system are statistically based, and an additional year of records will not be too significant. The files created for this project are based on data up to and including 1990, and it should not be necessary to recalculate the values until the 1993 data is available. By then, hopefully, the CD system will be able to produce batches of files, which will dramatically streamline the data extraction process. If the latest data is only needed for a particular region, and one is not interested in producing data for the entire province, then up-to-date data files can be created easily. 4.3 DATSUM Summary Files All the streamflow stations in Canada are organized in terms of regions (ie. region 08JA), and every station is given a name associated with a particular region(ie. 08JA001, 08JA002, etc.). Thus, it was logical to organize the data summary files in terms of regions, and the Excel data summary program, DATSUM, 22 is designed to do just that. Basically, the program takes all the raw CD-ROM data files for a particular region of the province and performs a number of calculations to create streamflow parameters for each file. It then exports all the calculated values to a data summary file, where additional information pertaining to the data display in the GIS is added. A data summary file, which can be read by the GIS, is created for every region in the province. An example of the data summary file produced for region 08JA is shown in Appendix A.1.2. Also given is a description of every data type created in the database summary file. It should be noted that this data summary file is shown as it appears in either a dBASE III or Excel environment, but that it is displayed differently in QUIKMap. An example of how QUIKMap displays an individual line from a summary file is shown in Figure 5.7. The data summary files created by DATSUM contain a variety of different streamflow parameters. These specific parameters where chosen because they are some of the most commonly used hydrological values, and because this version of DATSUM was designed specifically to create inputs for the FLOOD program. However, with a basic knowledge of the Excel language, the program can be easily modified to calculate other streamflow values. 4.4 DATSUM Program Organization The DATSUM program is written in Microsoft Excel macro language, and operating the program requires a basic knowledge of the Excel worksheet environment and commands. If the user is not familiar with Excel, then the Excel User's Manual (Microsoft Corporation, 1991) should be referenced. The program DATSUM consists of nine macrosheets and two worksheets. The macrosheets are named MACRO1A.XLM, QUIKDATA.XLM, INDEX.XLM, FLOW.XLM, RATIO.XLM, SETUP.XLM, OPEN.XLM, DATSUM.XLM, and FUNCTION.XLA, and the worksheets are named DUMM001E and SUMBASE.XLS. 23 The sheet MACRO1A.XLM contains small macros that perform very simple and specialized functions related to formatting the individual streamflow data files. The sheet QUIKDATA.XLM contains macros that specify what sort of symbols will represent each streamflow data file. The sheet INDEX.XLM contains macros that take information from the raw index data files and transfer it to the summary files. The sheet FLOW.XLM contains macros that link other macros together to format individual streamflow data files, while the sheet RATIO.XLM contains macros that perform similar functions for individual max flow files. The sheet SETUP.XLM contains macros designed to setup the other macros so that they will operate for a specific data file or group of files. The sheet OPEN.XLM contains macros which open and close different combinations of DATSUM files. The sheet DATSUM.XLM contains only one macro, and it is the master program macro which opens and closes files as needed and orchestrates all the other macros. The sheet FUNCTION.XLA is an "add-in" macrosheet and it contains an "add-in" macro which calculates the correlation coefficient of an array of numbers. Finally, the sheet DUMM001E is a dummy file used by the system when a certain type of file is missing, and SUMBASE.XLS is a generic worksheet onto which the results of a DATSUM analysis are exported and saved. A listing of the 67 macros which make up the program, and a description of their functions, is given in Appendix A.2, as is a flowchart which illustrates the inter- macro relationships (Figure A.2). A copy of the programming code is given in Appendix A.3. Some of the code is not shown in its entirety, but in general, the missing code is text which is not significant to understanding the programming. Note that macrosheet FUNCTION.XLA is not mentioned in either Appendices A.2 or A.3 because it is very simple and does not merit inclusion. 24 4.5 Instructions for Use of DATSUM The following text outlines instructions for installing and operating the DATSUM program. This program has been designed to operate in the Microsoft Excel software environment, although with modification, it may run in the Lotus 1-2- 3 environment. If the instructions are followed carefully, the program should run without incident. However, it should be noted that the program is not very robust, in that it has not been written with many safeguards against user errors, and could potentially be destroyed with improper use. Therefore, it is highly recommended that a back-up copy of the program always be kept on hand, so that if the user inadvertently destroys the program, it can easily be restored to its original state. 4.5.1 Installing the Program Install the DATSUM program by copying the files MACRO1A.XLM, QUIKDATA.XLM, INDEX.XLM, FLOW.XLM, OPEN.XLM, RATIO.XLM, SETUP.XLM and DATSUM.XLM into a directory in the C:\  drive titled QUIKMAP\FLOWDATA MACRO (ie. C:\QUIKMAP\FLOWDATA\MACRO). Next, copy the file FUNCTION.XLA into the EXCEL directory titled XLSTART (ie. C:\EXCEL \XLSTART), and the file SUMBASE.XLS into a directory in the C:\ drive titled QUIKMAP\FLOWDATA\SUMMARY (ie. C:\QUIKMAP\FLOWDATA  \SUMMARY). This is imperative! The DATSUM macros have many references to files in these directories. Therefore, the program will not run properly if the files are loaded into different directories, unless the file references are changed appropriately. 4.5.2 Creating and Loading WSC Data Files The DATSUM program creates streamflow data summary files for streamflow regions in British Columbia. In order to do this, it requires monthly flow and extreme flow data files for every streamflow station in a particular region. As mentioned 25 earlier in the text, this data is compiled and updated annually by Water Survey of Canada (WSC), and is available in CD-ROM format. The following are instructions for obtaining the raw data files from the CD-ROM in the correct format, and loading these files so that they can be accessed by the DATSUM program. Note that the DATSUM program could be used to create streamflow data summary files for any province in Canada, but that this thesis limits the discussion to B.C. The WSC CD-ROM surface water database package, known as HYDAT (Environment Canada, 1990), can be obtained from the Water Resources Branch or Environment Canada. The package consists of a user's manual, some data access software, and a CD-ROM disk containing surface water data for all of Canada. The package allows the user to view and copy a variety of data types, in a variety of data formats. The DATSUM program has been designed to operate on files formatted in a very particular manner, and the following steps should be taken to ensure that the files are created properly. For the sake of brevity, it is assumed that the reader has access to the HYDAT User's Manual, and so the instructions below are given in a very simplified form. 1. Enter the province window and select the province of interest; in this case, British Columbia. 2. Create an index file of all of the flow files in a particular region. Use the filter function to select only those files in a particular region (ie. all stations with numbers beginning with 08MH, for example) and only those files with flow records (ie parameter=discharge). Save the file with the "Output", "Print", "File" and "All" commands, and name the file regionnameINDE (ie. for region 08MH, name the index file 08MHINDE). 26 3. For every station within a particular region, create mean flow and extreme flow files. Two files must be created for every station, except in the situation noted below. Do this by selecting the station of choice, then selecting either "Mean" or "Extreme". Wait for the system to create your file on the screen, then select the "Output", "Print", "File" and "All" commands. Finally, name the file with its station number and the letter F or E, to indicate whether the data is mean (F)low or (E)xtreme (ie. 08MHOO1F or 08MH001E, 08MHOO2F etc.) Take note that every streamflow station may not have extreme data. If this is the case, then an extreme file cannot be created. Instead, simply record the name of the station so that this information can be inputted to DATSUM when the system prompts the user for such information. Also, make sure that the dummy file, DUMM001E, is copied to the same directory as all the files for that region. This is further explained in step 4, below. 4. Once all the required files have been created for every station in a region, create a directory with the region name, and then transfer all the files to it. Create this directory as a subdirectory of CAQUIKMAP\FLOWDATA (ie. for region 08MH, save the INDEO8MH file, the DUMM001E file (if needed), and all the 08MH***F and 08MH**"E files to CAQUIKMAP\FLOWDATA\08MH.) It may sound unnecessarily complicated to be creating so many different directories, but the user could potentially be dealing with hundreds of files, and this process is necessary to maintain order. 4.5.3 Running the Program Enter the Microsoft Excel environment and open the file OPEN.XLM. Run the "OPEN" macro by pressing the keys <CTRL> < o >. This will automatically open all the files necessary to run the DATSUM program. Start the DATSUM program by 27 pressing the keys <CTRL> < d >. A number of messages and prompts will appear on the screen (Figure 4.1). Simply follow the prompts and make appropriate selections. Figure 4.1 - Sample DATSUM Screen Message. The first significant input required by the program will be a four character WSC streamflow region name (Figure 4.2). Figure 4.2 - Sample DATSUM Input Screen. 28 After this data has been entered, the system will prompt the user for the names of those stations in the region without extreme data records (Figure 4.3). Once this information has been entered, the program is ready to run. The user will be required to wait momentarily, and then a beep will sound to indicate when the analysis is complete. The program saves the results of the analysis to the CAQUIKMAP\FLOWDATAISUMMARY directory in both Excel (.xls extension) and DBase III (.dbf extension) format. 4.5.4 Exiting the Program Once the program has saved the results of an analysis, the user is given the option of either exiting the program or running it again. If the exit option is chosen, the program will close all the DATSUM macros except DATSUM.XLM and OPEN.XLM. The user must close these files manually without saving any changes. 29 Do you wish to create a dummy extreme data file for a sheamflow station? Answer YES or NO_ IYES R.M.= '';§•§V. Enter the name of the missing station_ Make sure to put the characters "E" and "." at the end of the name_ ie. 081411001 E. Microsoft Excel As you should be aware, WSC's extreme flow data base is not complete. Thus, the program must substitute a dummy file for those stations without extreme data records. You will now be prompted to input the names of those stations for the current region. Figure 4.3 - Sequence of DATSUM Screens Requesting Extreme Flow Information. 30 CHAPTER 5 OPERATION OF THE DATSUM/QUIKMAP SYSTEM 5.1^System Organization This chapter will describe how the DATSUM summary files are displayed by the QUIKMap GIS software. This description will be very basic, however, as it is assumed that the reader is familiar with the operation of QUIKMap. If this is not the case, and the reader has difficulty following the text, then the QUIKMap operation manual should be referenced. This particular application of QUIKMap has been designed to present streamflow information on all the different streamf low stations in B.C., for the purpose of regional analysis. When performing a regional analysis, one does not usually know the names of the various streamflow regions around the stream of interest. Thus, one must first identify these regions, before one can request detailed streamflow information. To facilitate this process, the system has been setup with two subsystems; the first allows the user to view the locations of all the streamflow regions in the province and identify those in the area of interest, while the second permits the user to access detailed streamflow information. What follows is a description of these two subsystems and how they can be operated. 5.2 Operating the QUIKMap Application Before entering the QUIKMap operating environment, it is necessary to change the current directory of the computer system to the directory containing all of the relevant data files (ie. CAQUIKMAP\FLOWDATA\SUMMARY). This will 31 ensure that the system will be able to access all the necessary files. Once this is done, the user may enter the QUIKMap program and select the appropriate base map. The next step is to then select the data file of interest. Upon selecting the data file option from the pull-down menu, a small window will appear on the screen and display the following: I WSC Station Locations I WSC Station Data Figure 5.1 - QUIKMap Data File Directory Menu If the user selects "WSC Station Locations", then another window will appear on the screen to allow the user to select different regions of the province. This window will appear as follows: All BC WSC Stations South-west BC WSC Stations South-east BC WSC Stations Central-west BC WSC Stations Central-east BC WSC Stations North-west BC WSC Stations North-east BC WSC Stations Figure 5.2 - QUIKMap Data File Directory Menu Upon selection of the appropriate region, a number of small names (the streamflow regions) will appear on the map (Figure 5.3). In order to identify the streamflow regions in the area of the stream in question, the user need just 32 Figure 5.3 - Basemap of B.C. With WSC Streamflow Regions activate the coordinate display option, so that the latitude and longitude of the cursor appears on the screen, and then move the cursor to the location of the stream. The user can zoom in and out of the map display in order to isolate individual stations or view larger areas as needed. Once the user has identified the streamflow regions of significance, she/he is now in a position to ask the system to provide data on the various streams in the region. To do this, the user returns to the initial pop-up window and selects "WSC Station Data". This initiates another series of pop-up windows and the user simply clicks the mouse on the suitable choices and works through the system to select the appropriate streamflow regions (Figure 5.4). 08D Regions South-west BC 08E Regions South-east BC 08F Regions 'Central-west BO orlo* 108J Regions' 'Region oeJA Central-east BC 08K Regions Region 08JB North-west BC 08M Regions Region 08JD North-east BC 080 Regions Figure 5.4 Sequence of QUIKMap Data File Directory Menus. Subsequently, various symbols will appear on the screen, each symbol representing a different streamflow station (Figure 5.5). The symbols appear in a variety of shapes, sizes and colours, depending on the characteristics of each stream. In addition to the symbols, a small box containing information about the stations will appear beside each symbol. If the station has five or more years of record, then the station name, unit area mean annual flow value, and 34 Figure 5.5 - WSC Streamflow Stations - Region 08JA flow pattern index value are given. (The flow pattern index is the coefficient of variation of the mean monthly flows and gives an indication of how the flow values change from month to month.) If there are less than five years of record, then only the station name and an "insufficient data" message are given. This message signals to the user that any statistical information should be used with caution because of the limited data. A symbol legend, as in Figure 5.6, can be brought up on the screen to guide the user, and from the symbols alone, considerable information about each stream can be obtained. In addition, if more detailed information is required, the user need simply place the cursor on the appropriate symbol and press the key <E>, and a detailed data window, as in Figure 5.7, will appear on the screen. This window displays the data summary file created by the DATSUM program, and as explained earlier, the type of information displayed can be controlled by editing the DATSUM program. 36 MAP LEGEND CATCHMENT SIZE (km2) 0 (brown)^ 0 - 100 (yellow)^100 - 500 (blue)^500 - 1,000 (light blue)^1,000 - 5,000 El hed)^5,000 - 10,000 (violet)^10,000 - 15,000 (green)^15,000< UNIT AREA RUNOFF (cme per km2) 0^0.000 - 0.0250 0.0250 - 0.050 0.050 - 0.075 0.075 - 0.100 0.100< ):(^INCOMPLETE DATA SOLID SYMBOLS =ACTIVE STAllONS EMPTY SYMBOLS = DISCONTINUED STATIONS INSUFFICIENT DATA = <5 years of data Figure 5.6 - Sample QUIKMap Map Legend 37 STREAMFLOW STATION GENERAL INFORMATION Station Number^08JA014 Station Name^VAN TINE CREEK NEAR THE MOUTH Rainfall Catchment Area (km2) 153 Station Coordinates - latitude degrees 53 Station Coordinates - latitude minutes 15 Station Coordinates - latitude seconds 30 Station Coordinates - latitude degrees 125 Station Coordinates - latitude minutes 24 Station Coordinates - latitude seconds 50 STREAMFLOW STATISTICAL PARAMETERS Mean of the mean annual runoffs (m3/s)^ 0.9 Unit area mean of the mean annual runoffs (m3/s/km2)^0.006 Coefficient of Variation of the mean annual runoffs^0.52 Flow Pattern Index (0 means no monthly change in flow)^1.22 Time span over which streamflow records were kept^1974-1990 Number of years of streamflow record^ 17 Number of complete years of streamflow record^16 Mean of the annual max daily flows (m3/s) 11.35 Unit area mean of the annual max daily flows (m3/s/km2)^0.074 Coefficient of Variation of the annual max daily flows^0.54 Mean of the annual ratios of max inst. flow to max daily flow^1.16 Coefficient of Variation of the annual ratios^ 0.14 Correlation Coefficient between max daily and ratio inst./daily^0.34 Number of complete years of extreme flow record^14 QUIKMap DISPLAY ATTRIBUTES Type of Data Symbol/Line type^1 Size of Symbol 5 Symbol Colour 14 Symbol Hatch Pattern^1 Label Text^ 08JA014/nMAR = 0.006/nFPI = 1.22 Label Size 1 Special Instructions NOTE: Any negative value (ie. -99.99) indicates missing or incomplete data. Figure 5.7 - Sample QUIKMap Data Display Screen. 38 One can see that the system provides a very efficient and user-friendly way to obtain streamflow information, which can aid hydrologists in numerous applications. One example would be to use the information presented as input to a flow estimating program, such as FLOOD. 39 CHAPTER 6 THE FLOOD PROGRAM 6.1^Description of the FLOOD Program The FLOOD program was developed by Dr. Denis Russell, a professor in the Civil Engineering department of the University of British Columbia. This program is designed to estimate extreme high (flood) and low flows for streams with little or no streamflow data. The program uses regional data to create initial flow estimates, and then updates the estimates according to additional information, through the application of Bayes' theorem (Russell 1981). The basic concept is to set up a compound probability distribution on the basis of prior information, and then update the weights of component distributions in light of additional information. This paper will not attempt to describe in detail the theory behind the method, but rather refers the reader to Dr. Russell's paper (Russell 1991). It should be noted that the original version of the FLOOD program allows the user to select the underlying probability distribution type from a list of four, while the simplified Excel version of the program limits the user to only the Extreme Value Type I (Gumbel) distribution, as this is the most appropriate distribution for extreme, or flood, flows. 6.2 Advantage Over Frequency Analysis Traditionally, the most common method of translating raw streamflow data into future probabilities of occurrence, is frequency analysis. This technique requires annual peak flow data, but does not give consideration to other types of data, such as the largest flood, or the flood which has not been exceeded, in a 40 number of years. If the data base is very large, then the effect of this additional data is often inherent in the analysis, but if the data base is small, which is often the case, then the significance of these values is not accounted for. This sort of data, which is often available through a high water mark or the recollections of local residents, may prove to be very important in arriving at final flow estimates. The FLOOD program provides a means of incorporating this information into the flow estimating procedure. In Appendix B, one can find instructions for use of the Excel version of FLOOD, and a description of the various files which make up the program. 41 CHAPTER 7 SAMPLE APPLICATION 7.1^Kitsequecla River For the purpose of demonstrating this system, a regional analysis was performed to estimate streamflow parameters for the Kitseguecla River, and then these results were used as input for the FLOOD program, to estimate peak flows. For this demonstration, it was assumed that the 200 year daily and instantaneous peak floods were required. WSC station 08EF004, The Kitseguecla River near Skeena Crossing, is located in central-west British Columbia and has a drainage area of 728 km2. The river flows in a south-westerly direction through mountainous regions, and there are wide variations in precipitation and runoff in the area. The river was gauged from 1960 until 1971, and there are eleven years of max daily and max instantaneous flow records. 7.2 Data Summary Procedure As described earlier in this document, the DATSUM program was used to create streamflow data summary files for various WSC regions, including all of those in the central-west area of the province. Thus, as the Kitseguecla River lies in this area, a regional streamflow database was available, and the hydrologist was able to use the GIS system to quickly identify the relevant WSC streamflow stations, and then to access and summarize the data for each station. This was done in the following manner, and the results of the analysis are shown in Table 7.1: 42 1. The user entered the GIS system and requested the "Streams of B.C." basemap, a digitized map of the province showing all the major streams and rivers. 2. From the pull-down menus, the user selected the "WSC Station Locations" and the "Central-west BC WSC Regions" options, resulting in a display of all of the WSC streamflow regions in the central-west area of the province (similar to Figure 5.3, but only displaying stations in the central-west region of the province.). The user then moved the cursor to the location of the Kitseguecla River, and identified the streamflow regions in the local area as 08DB, 08EB, 08EC, 08EE, 08EF, 08EG. 3. From the pull-down menus the "WSC Station Data" and "Central-west BC" options were then selected, followed by selections of each of the six streamflow regions of interest. This resulted in a screen display of a symbol and message for every streamflow station within these regions (similar to Figure 5.5), a total of 62. 4. The next step was to limit the analysis to the most relevant stations. As these stations are located in a mountainous region there are wide variations in basin characteristics and runoff values. Therefore, it is difficult to determine what factors may make one station more relevant than another. The selection process can be made on the basis of a variety of different parameters, depending on the type of information available and the extent of the analysis. If this was to be a detailed analysis, then the selection would likely be based on a combination of different factors and the process would involve a considerable amount of time and hydrological judgement. However, as this analysis is simply for 43 the purpose of demonstrating the DATSUM/QUIKMap system, and as the number of streamflow stations involved is very large, a simple approach was adopted. In this case, proximity to the Kitseguecla River was chosen as the limiting factor. The QUIKMap system has a function to facilitate this type of selection, and so the "Radius" option was chosen. This allows the user to select only those stations within a specified radius of a chosen point. A radius of 100 km around the station 08EF004 was chosen and the resulting screen image is shown in Figure 7.1. All those stations marked by a small arrow have been "tagged", and with an additional command, the screen will redraw and display only those stations, a total of 31. 5. Further reductions were made by eliminating those stations with less than 5 years of streamflow records and those with incomplete extreme flow records, resulting in a final total of 11 stations. The data for these stations was then viewed and the results are summarized in Table 7.1. It should be noted that the system also has the option of allowing the user to print out individual data files as needed. Traditionally, the process of identifying, selecting, viewing and summarizing data is very time-consuming, but with this efficient and user-friendly GIS-data file package, Table 7.1 was generated in approximately 30 minutes. 44 Figure 7.1 - Use of QUIKMap's "Radius" Function Table 7.1 - Regional Flood Data for Kitseguecla River Near Skeena Crossing Station No. River Period of Record No. of years of Ex. Record Drainage Unit Area Area km2^MAR FPI Mean Flood CV Flood Mean Ratio Correlation Orientation Coefficient 08DB001 Nass River above 1929-1990 22 18500 0.042 0.96 0.187 0.16 1.04 0.23 N to S Shumal Creek 08EB005 Skeena River above 1970-1990 16 12400 0.029 1.08 0.164 0.19 1.05 0.2 N to S Babine River 08EC001 Babine River at 1929-1985 10 6480 0.007 0.7 0.016 0.38 1.04 -0.51 N to S Babine 08EC013 Babine River at Outlet of Nilkitkwa Lake 1972-1990 14 6790 0.007 0.7 0.021 0.35 1.01 -0.19 NW to SE 08EE008 Goathom Creek near 1960-1990 26 132 0.013 1.04 0.118 0.46 1.26 0.31 S to N Telkwa 08EE020 Telkwa River below 1975-1990 13 368 0.038 0.92 0.244 0.33 1.12 0.78 W to E Tsai Creek 08EE025 Two Mile Creek in 1982-1990 7 20 0.006 0.26 0.023 0.68 1.15 0.92 E to W District Lot 4834 08EE028 Station Creek above 1985-1990 6 11 0.025 1.04 0.191 0.21 1.29 -0.09 S to N Diversions 08EF001 Skeena River at Usk 1928-1990 33 42200 0.022 0.94 0.117 0.2 1.03 -0.03 N to S 08EF005 Zymoetz River above 1963-1990 25 2980 0.035 0.86 0.22 0.55 1.25 0.54 NE to SW O.K. Creek 08EG011 Zymagotitz River near 1960-1990 28 376 0.062 0.71 0.452 0.4 1.48 0.49 NW to SE Terrace 08Ef004 Kitseguecla River near Skeena Crossing 1960-1971 11 728 0.021 0.97 0.184 0.36 1.5 0.56 NE to SW low^probable^high mean max daily flood (m3/s/km2) coefficient of variation ratio of daily to instantaneous correlation coefficient 0.02 0.16 0.4 0.16 0.36 0.6 1.12 1.28 1.48 0.24 7.3^Reclional Analysis The next step was to perform an analysis of this data so as to estimate streamflow parameters for use in the FLOOD program. In order to calculate max daily and max instantaneous flows for various return periods, the FLOOD program requires inputs of low, probable and high estimates of a stream's mean daily flood and coefficient of variation, its ratio of peak daily to instantaneous flows, and the correlation coefficient between the ratio and the mean daily flood. For Kitseguecla River these values were estimated as follows: Table 7.2 - Estimates of Peak Flood Parameters for Kitseguecla River Reasons for the selection of these values are as follows: 1. Mean daily flood: The mean daily flood values differ by as much as an order of magnitude, and as there is no clear correlation between flow and drainage area, orientation, or period of record, it seemed reasonable to use the average value as the most probable estimate, and then to specify a wide range between the low and high estimates. Some thought was given to discarding the low and high values, but as station 47 08EE025 is one of only two stations within a 30 km radius of station 08EF004, this approach was abandoned. 2. Coefficient of variation: The coefficients of variation fall within a much narrower range than the daily flood values, but in a similar fashion, they do not correlate strongly to any basin characteristics, and so the same approach as used for the daily flood estimates was adopted. 3. Ratio: The ratios of max daily to max instantaneous peak flows tend to decrease as the catchment area increases. Thus, the ratio estimates were based on the ratio values of stations with catchments of similar size to station 08EF004. Three stations, 08EE020, 08EF005, and 08EG011, were identified with this characteristic, and the most probable ratio estimate was given as the average of their ratios. The high and low estimates were simply the high and low values of these three stations. 4. Correlation Coefficient:^In contrast to the ratio values, the correlation coefficients between the max daily floods and ratios of max daily to max instantaneous flows are not strongly related to catchment size, so the average value was used. 7.4^Additional Data The flood program permits the user to input additional flood data in order to improve or update the flood estimates. Station 08EF004 has 11 years of flood records (Table 7.3), and increasing amounts of this data were inputted to FLOOD to demonstrate how additional data alters the estimates. Initially, five years of flood data were used (1960-63,65), then nine years of data(1960-63,65-69), and finally, nine years of data plus the fact that a flow of 229 m3/s was the largest flow in 11 48 Table 7.3 - Extreme Flow Records for Station 08EF004 KITSEGUECLA RIVER NEAR SKEENA CROSSING - STATION NO. 08EF004  ̂ Max Instantaneous Discharges Max Dail ^Discharges Year Flow (m3/s) Time Date Flow (m3/s) Date 1960 106 07:30 PST 15-Oct 87.2 15-Oct 1961 265 12:15 PST 31-Oct 137 31-Oct 1962 97.7 12:05 PST 1-Nov 86.4 1-Nov 1963 190 23:10 PST 21-May 153 22-May 1965 171 07:30 PST 22-Oct 114 22-Oct 1966 603 04:30 PST 24-Oct * 229 24-Oct 1967 125 23:00 PST 6-Jun 112 2-Jun 1968 251 03:00 PST 21-May 203 21-May 1969 167 00:30 PST 24-May 119 24-May 1970 237 12:44 PST 26-Jul 160 26-Jul 1971 119 18:36 PST 6-Jun 75.3 7-Jun years. All 11 years of flow were not used as the system is designed for situations where there is very little data, and can only handle 10 pieces of additional data. However, all 11 yeas of data were used in a conventional flood frequency analysis, and the results of this, in comparison with the FLOOD results, are shown in Table 7.4. Table 7.4 - Estimates of 200 Year Mean Daily and Instantaneous Floods on Kitseguecla River Additional Data Proaram Conventional low^probable^high none 38.2^316.4^661.6 5 years actual (1960-63, 65) 187.8^254.8^321.9 224.7 9 years actual (1960-63, 65-69) 203.1^280.9^358.7 319.9 9 years actual (1960-63, 65-69) 231.1^295.7^360.4 313.9 + largest flood in 11 years Instantaneous Flows 9 years actual (1960-63, 65-69) 327.0^403.1^481.9 735.7 + largest flood in 11 years As one can see, the FLOOD results are initially a little more conservative than the conventional estimate, and the confidence limits are wide apart, but as more data is used, the FLOOD estimate becomes less conservative and the confidence limits narrow. 50 CHAPTER 8 DISCUSSION AND CONCLUSIONS Streamflows are essential information for environmental assessments and the design of water related facilities. Fortunately, many large streams and rivers have extensive historical records which make the process of predicting design flows, such as peak floods and low flows, relatively simple. However, the majority of streams in the province of British Columbia have little or no historical records. Thus, flows for those streams must be estimated by some other means, and the most commonly accepted procedure is regional analysis. Regional streamflow analysis is performed frequently around the province of British Columbia. However, to date, there has been no attempt to develop any sort of a package, or system, to facilitate or streamline the process. The development of such a system would not only increase the efficiency of data summary and analysis, but would encourage the consideration of larger amounts of data, thereby improving the quality of results. At present, the labour intensive nature of data summary work prohibits the examination of all but the most relevant material, and numerous scenarios are not considered. The personal computer, with its ability to handle huge quantities of data, has revolutionized the field of hydrology. The advent of the computer, coupled with the development of software packages designed specifically for managing spatially oriented data (GIS), provides an ideal environment for the development of a streamflow summary and analysis system. This thesis involved the development of such a system, which is known as the DATSUM/QUIKMap system. This system 51 uses a spreadsheet program called DATSUM to process raw streamflow data files from a CD information disk in order to create summary data files for Water Survey of Canada (WSC) streamflow regions in B.C. These files are generated in a format which can be read and displayed by a geographic information system known as QUIKMap. The data is displayed in a manner designed to aid hydrologists in using the system to perform regional analyses. In contrast to most hydrologic GIS applications, this system is very simple to operate and does not require the skills of a specialized GIS operator. Use of the DATSUM/QUIKMap system will facilitate water related studies in British Columbia, and potentially, all of Canada. The DATSUM/QUIKMap system, when combined with the FLOOD program, creates an application which offers many advantages over traditional regional flood frequency techniques. For instance, it requires the user to provide considerable judgement and input to the analysis process which forces the user to be aware of the systems limitations and the relative validity of results. In addition, it is a quick and simple technique which does not involve the complicated mathematics associated with regression analysis. A well, it provides low, probable and high estimates of flood flows and permits the use of additional information to update estimates. The DATSUM program was specifically designed to create streamflow data summary files for WSC regions in B.C., but as WSC is a federal organization, and streamflow data files are produced in a standard format for the whole country, the program should operate equally well on data for any province. Thus, a user need only obtain a digitized map of his/her specific area of interest, which is compatible with the QUIKMap GIS software, and a streamflow data analysis and summary system could be created for any area of the country. Currently, the availability of electronic databases and digitized maps in Canada is rather limited, and this is restricting the development and use of GIS 52 related applications in hydrology and other disciplines. However, in British Columbia, there is an extensive mapping program underway to produce a digital terrain model of the entire province. By 1996, over 7000 digitized maps, at a scale of 1:20 000, will provide planimetric and topographic data. These maps will be compatible with most commercially available GIS packages which will serve to dramatically facilitate the use of GIS systems in B.C. As with any software system, the DATSUM/QUIKMap system is far from being developed to its fullest potential. Computer technology is changing so rapidly that most software and hardware systems are almost obsolete by the time they are produced. This system underwent numerous revisions during its development, and there is still opportunity for improvement. For instance, the CD-ROM system is only able to export one named data file at a time, rather than a batch of files. This is an extremely limiting feature when one wishes to retrieve hundreds of data files. Another area needing improvement is the interaction capabilities of the different components of the system. The current system requires the user to operate three independent software systems to access, manipulate and view data. Unfortunately, the WSC CD-ROM software does not have the capability to interact with other software packages, so raw data must be extracted from the CD-ROM, exported to the DATSUM program for analysis, then saved to a directory which is accessed by the GIS. The system is not particularly efficient by today's software standards, and it requires considerable file management input by the user, but it is still a major improvement over currently available options. Ideally, the system should be an automated package which is able to access, manipulate and display data with minimal user involvement; perhaps, as technology advances, such a system could be created. A possible future expansion of the current system would be to modify the DATSUM program to generate hydrographs, either for daily or monthly flow records. 53 The QUIKMap system has the capability to display "picture" files stored in the database, and the ability to view hydrographs would be useful for regional analyses. In a similar fashion, it is conceivable that unit hydrographs, as well as numerous other rainfall parameters, could be generated and displayed if a digitized rainfall database was available. At present there is no such database produced in B.C., but as rainfall records are currently kept at many locations throughout the province, it seems reasonable to assume that an electronic database similar to the WSC streamflow database will be made available in the near future. However, the system, as it currently stands, could be used to aid in the generation of synthetic unit hydrographs for ungauged basins. Synthetic hydrographs can be generated by relating the physical geometry of a gauged area to its hydrograph, and then comparing the basin's characteristics with those of an ungauged area. If basin parameters additional to those currently displayed by the system are required, the DATSUM program could be easily modified to produce the desired values. Another possible future application of this system would be to integrate it with an expert system. An expert system is a computer program designed to mimic the decision making or problem solving processes of a human expert. These systems are designed to deal with uncertainty and have enormous potential as they can be used to capture expertise. As estimating streamflows is such an inexact science, due to the enormous number of variables involved and the uncertain nature of many of those variables, an expert system could aid engineers and hydrologists in making estimates and supporting decisions. Some GIS/expert system applications have been created which generate input parameters for hydrologic models by combining the data handling capabilities of a GIS with the "mature hydrologic expertise" of an expert system to optimize outputs. In a similar fashion, an expert system could be created to operate in 54 conjunction with the DATSUM/QUIKMap system. The expert system would utilize a number of heuristics to guide the user through the regional analysis procedure. Regional analysis relies heavily on human judgement and it would be useful to have a readily available "second opinion". Regardless of how sophisticated a system may become, it is important to leave the final decision making or estimating processes in the hands of the user. Since data is never complete, judgement is always required, and this can only be provided by humans. Thus, systems should be designed to assist humans, not replace them. A program which attempts to make estimates of streamflow values could be hazardous, as people are inclined to believe computer outputs with blind faith. Geographic information systems are particularly dangerous as they are such complex systems which handle large volumes of data and produce impressive, seemingly accurate maps and data. Although it is tempting to simply accept such output values, the user must take care to understand how the systems work so that expected ranges of values can be anticipated in order to assess the quality of results. The system developed for this thesis permits streamflow data to be organized and presented in a user-friendly environment on personal computers, so as to facilitate the process of regional analysis, as well as numerous other streamf low oriented procedures. Water related professionals, such as hydrologists or biologists, will find it to be a valuable tool for obtaining information, and when used in conjunction with other applications, such as the FLOOD program, new levels of results should be attained. Finally, with advancing technology, this system could be refined and expanded to create new applications for the enhancement of water related studies throughout Canada. 55 BIBLIOGRAPHY Axys Software, QUIKMap - Version 2.51: User's Guide, Sidney, B.C., Canada, 1992. Balser, R., "Terrain Resource Information Management (TRIM): A Standardized Geo-referencing Database for the Province of British Columbia." In GIS '89: A Wider Perspective, Proceedings of a Symposium, Vancouver, B.C., March 1989, 23-28. Berry, J. K., "Computer-Assisted Map Analysis: Potential and Pitfalls" in Photogrammetric Engineering and Remote Sensing, Vol. 53, 1988, 1405-1410. Crippen Consultants, Microhydro - Guidance Manual of Procedures for Assessment of Micro Hydro Potential, Report ER80-9E, produced for Energy, Mines and Resources Canada, Vancouver, Canada, October 1980. Energy, Mines and Resources Canada - Efficiency and Alternate Energy Technology Branch, Canadian Small Hydropower Handbook: British Columbia Region, Ottawa, Canada, June 1989. Environment Canada, Hydat User's Manual (CD-ROM Version 3.0), Ottawa, Canada, 1990. Fisher, G. T., "Geographic Information System/Watershed Model Interface" in Hydraulic Engineering, ed. M. Ports, ASCE, 1989, 851-856. Kumar, S., and J. A. Meech, A Hypermanual on Expert Systems, produced for Energy, Mines and Resources Canada, Mineral Sciences Laboratories, CANMET, Ottawa, Canada, 1992. Lamke, R. D., Flood Characteristics of Alaskan Streams, produced for United States Department of the Interior Geological Survey, Report 78-129, Anchorage, Alaska, 1979. Linsley, R. K., M. A. Kohler and J. L. Paulhus, Hydrology for Engineers, 3d. ed., Toronto: McGraw-Hill Book Company, 1982. 56 Marble, D. F., "Geographic Information Systems: An Overview." In Pecora 9 Proceedings: Spatial Information Technologies for Remote Sensing Today and Tomorrow October 1984, 18-24. Microsoft Corporation, Microsoft Excel User's Guide, Redmond, WA, USA, 1991. Microsoft Corporation, Microsoft Excel Function Reference, Redmond, WA, USA, 1991. Moeller, R. A., "Application of a Geographic Information System to Hydrologic Modeling Using HEC-1" in Civil Engineering Applications of Remote Sensing and Geographic Information Systems, ed. D. B. Stafford, ASCE, 1991, 269-277. Muzik, I., "Application of a GIS to SCS Procedure for Designing Flood Hydrographs." In Proceedings of the International Symposium on Modeling in Agricultural. Forest, and Rangeland Hydrology, Chicago 1988, American Society of Agricultural Engineers, Michigan, 494-500. Ontario Ministry of Energy, Small Hydro Engineering Manual, 1988. Russell, S. 0., "Flood Probability Estimation," Journal of the Hydraulics Division, ASCE, Vol. 108, No. HY1, 1981. Russell, S. 0., "Estimating Flows From Limited Data," University of British Columbia, Civil Engineering Department, May, 1991. Stuebe, M. M. and D.M. Johnston, "Runoff Volume Estimation Using GIS Techniques", Water Resources Bulletin, Vol. 26, No. 4, August, 1990, 611-620. VanBlarrgan, E. J., R. M. Ragan and J. S. Schaake, "Hydrologic Geographic Information System", Transportation Research Record, No. 1261, 1990, 44-51. VanBlarrgan, E. J., R. M. Ragan and J. S. Schaake, "Automated Estimation of Hydrologic Parameters" in Civil Engineering Applications of Remote Sensing and Geographic Information Systems, ed. D. B. Stafford, ASCE, 1991, 278-286. Watt, W. E., et al., Hydrology of Floods In Canada: A Guide to Planning and Design, National Research Council of Canada, 1989. Wolfe, M. L. and C. M. Neale, "Input Data Development for a Distributed Parameter Hydrologic Model (FESHM)." In Proceedings of the International Symposium on Modeling in Agricultural, Forest, and Rangeland Hydrology, Chicago 1988, American Society of Agricultural Engineers, Michigan, 462-463. 57 Zhang, H., C. T. Haan and D. L. Nofziger, "Hydrologic Modeling With GIS: An Overview," in Applied Enqineerinq In Agriculture, ASCE, Vol. 6, No. 4, July 90, 453- 458. 58 APPENDIX A DATSUM PROGRAM DETAILS Appendix A.1 - DATSUM Summary File A.1.1 Fields of the DATSUM Summary Files This is a brief description of the different fields in the streamflow data summary files, which are designed to be read by the QUIKMAP software. KEY: This is the streamflow station identification number as given by Water Survey of Canada. This serves as the file identifier in the QUIKMAP nomenclature. STA_NAME : This is the streamflow station name as given by Water Survey of Canada. M_A_RUNOFF : Mean of the mean annual runoffs (m3/s). The mean annual runoffs are calculated from mean monthly flow data. UA_MAR : Unit area mean of the mean annual runoffs (m3/s/km2). CV_RUNOFF : Coefficient of Variation (standard deviation / mean) of the mean annual runoffs. AREA: Rainfall catchment area (km2). YEARS : Time span over which streamf low records were kept. Records were not necessarily kept for every year during this time span. COUNT: Number of years of streamflow record. COMP_COUNT : Number of complete years of streamflow record. M RATIO : Mean of the annual ratios of max instantaneous flow to max daily flow. CV_RATIO : Coefficient of Variation of the annual ratios of max instantaneous flow to max daily flow. FLOOD_COUNT : Number of complete years of extreme flow record. MN _ LAT_ DEG : Streamflow station coordinate - latitude degrees. MN _ LAT_ MIN : Streamflow station coordinate - latitude minutes. MN _ LAT_ SEC: Streamflow station coordinate - latitude seconds. MN LON DEG : Streamflow station coordinate - longitude degrees. 60 MN _ LON _MIN : Streamflow station coordinate - longitude minutes. MN _ LON _SEC : Streamflow station coordinate - longitude seconds. LBL TEXT : Text to appear on the QUIKMAP screen beside the symbol. DATA TYPE: Specifies whether the data is symbol or polyline data. LBL_SIZE : Specifies the size of the LBL_TEXT message. SYM_LN_TYP : Contains the numeric code for the symbol or polyline type to be displayed. SIZE_THICK : Specifies the symbol size or the polyline line thickness. DATA COLOR : Specifies the symbol colour by numeric code. HATCH PATT : Defines the hatch pattern used to fill the symbol or polyline. OPERATION : Field used to specify different forms of presentation of the data. For example, "B" specifies boxing the LBL_TEXT, while "I" specifies that the data be ignored. M MAXDAILY : Mean of the annual max daily flows (m3/s). CV_MAXDAILY : Coefficient of Variation of the max daily flows. CC MAXDtoMAXI : Correlation Coefficient between max daily flows and the ratios of max instantaneous flow to max daily flow. F_P_INDEX : Flow Pattern Index. This is the coefficient of variation of mean monthly flows and indicates the degree of change of flow values, from month to month. UA_MMD : Unit area mean of the max daily flows (m3/s/km2). 61 A.1.2 Example Summary File - Region 08JA KEY STA NAME M_A_RUNOFF UA_MAR 08JA001 NECHAKO RIVER AT FORT FRASER -999.9 -9.990 08JA002 OOTSA RIVER AT OOTSA LAKE 99.2 0.022 08JA003 WHITESAIL RIVER NEAR OOTSA LAKE 33.4 0.033 08JA004 TETACHUCK RIVER NEAR OOTSA LAKE 65.3 0.020 08JA005 TAHTSA RIVER NEAR OOTSA LAKE 51.4 0.036 08JA006 TAHTSA RIVER AT OUTLET OF TAHTSA LAKE 26.3 0.046 08JA007 NECHAKO RIVER AT OUTLET OF NATALKUZ LAKE -999.9 -9.990 08JA009 CHESLATTA RIVER NEAR OOTSA LAKE -999.9 -9.990 08JA010 NECHAKO RIVER BELOW BIG BEND CREEK 176.0 0.013 08JA011 NECHAKO RIVER NEAR OOTSA LAKE -999.9 -9.990 08JA013 SKINS LAKE SPILLWAY, NECHAKO RESERVOIR 106.3 -9.990 08JA014 VAN TINE CREEK NEAR THE MOUTH 0.9 0.006 08JA015 LAVENTIE CREEK NEAR THE MOUTH 5.0 0.058 08JA016 MACIVOR CREEK NEAR THE MOUTH 1.0 0.018 08JA017 NECHAKO RIVER BELOW CHESLATTA FALLS 61.7 0.004 CV_RUNOFF AREA YEARS COUNT COMP_COUNT M_RATIO CV_RATIO -99.99 17700 1915-1954 29 0 -99.99 -99.99 0.13 4450 1929-1952 24 18 -99.99 -99.99 0.12 1010 1930-1952 23 6 1.01 0.02 0.10 3320 1930-1952 23 5 1.02 0.02 -99.99 1420 1930-1952 23 1 1.02 0.02 -99.99 578 1930-1953 13 1 -99.99 -99.99 -99.99 12100 1932-1933 2 0 -99.99 -99.99 -99.99 1580 1950-1951 2 0 -99.99 -99.99 -99.99 13900 1950-1952 3 1 1.01 0.00 -99.99 1949-1950 2 0 -99.99 -99.99 0.58 1955-1990 36 35 -99.99 -99.99 0.52 153 1974-1990 17 16 1.16 0.14 0.12 86.5 1976-1990 15 14 1.81 0.34 0.21 53.4 1976-1990 15 11 1.22 0.10 0.12 15600 1980-1990 11 10 1.01 0.01 62 FLOOD_COUNT MN_LAT_DEG MN_LAT_MIN MN_LAT_SEC MN_LON_DEG 0 54 3 18 124 0 53 37 30 125 17 53 38 20 126 17 53 21 12 125 17 53 45 20 126 0 53 41 0 127 0 53 25 40 125 0 53 38 36 124 2 53 33 50 124 0 53 26 40 125 0 53 46 15 125 14 53 15 30 125 14 53 39 9 127 10 53 48 7 126 10 53 41 2 124 MN_LON_MIN MN_LON_SEC LBL_TEXTI DATA_ TYPE 33 39 08JA001/nINSUFFICIENT DATA S 44 0 08JA002/nMAR=0.022/nFPI=0.79 S 44 56 08JA003/nMAR=0.033/nFPI=013 S 34 10 08JA004/nMAR=0.020/nFPI=0.5 S 42 0 08JA005/nINSUFFICIENT DATA S 14 0 08JA006/nINSUFFICIENT DATA S 6 40 08JA007/nINSUFFICIENT DATA S 58 30 08JA009/nINSUFFICIENT DATA S 51 36 08JA010/nINSUFFICIENT DATA S 1 40 08JA011/nINSUFFICIENT DATA S 58 14 08JA013/nMAR=-9.99/nFPI=0.36 S 24 50 08JA014/nMAR=0.006/nFPI=1.22 S 32 13 08JA015/nMAR=0.058/nFP1=1 S 21 0 08JA016/nMAR=0.018/nFPI=1.2 S 50 8 08JA017/nMAR=0.004/nFP1=0.73 S 63 LBL_SIZE SYM_LN_TYP SIZE_THICK DATA_COLOR HATCH_PATT OPERATION 2 11 15 10 0 B 2 1 9 11 0 B 2 2 9 11 0 B 2 1 9 11 0 B 2 2 9 11 08 2 2 7 9 0 B 2 11 13 13 0 B 2 11 9 11 0 B 2 1 13 13 0 B 2 11 3 6 08 2 11 3 6 1 B 2 1 5 14 1 B 2 3 3 6 1 B 2 1 3 6 1 B 2 1 15 10 1 B M_MAXDAILY CV_MAXDAILY CC_MAXDtoMAXI F_P_INDEX UA_MMD -99.99 -99.99 -99.99 0.66 -99.990 -99.99 -99.99 -99.99 0.79 -99.990 109.56 0.16 -0.02 0.73 0.108 155.47 0.18 0.07 0.50 0.047 214.47 0.21 0.30 0.72 0.151 -99.99 -99.99 -99.99 0.90 -99.990 -99.99 -99.99 -99.99 0.22 -99.990 -99.99 -99.99 -99.99 0.54 -99.990 517.00 0.02 -1.00 0.81 0.037 -99.99 -99.99 -99.99 0.52 -99.990 -99.99 -99.99 -99.99 0.36 -99.990 11.35 0.54 0.34 1.22 0.074 51.54 0.68 0.14 1.00 0.596 7.12 0.25 -0.15 1.20 0.133 270.60 0.16 -0.07 0.73 0.017 64 Appendix A.2 - Macros: Names and Functions This is a brief description of the Excel macros used to create the DATSUM program. The macros are found in eight different macrosheets, and they are listed below in the order in which they appear in each macrosheet. The group of macros under the heading "Command Macros" appear twice in the list. They are emphasized because the user is able to invoke them independent of the running of the DATSUM program. These macros generally provide quick and simple ways to manage files. Command Macros: These are macros which can be manually invoked by striking the key combination shown beside each macro name. RUN DATSUM^<CTRL> <d> Master program macro which opens and closes files as needed and orchestrates all the other macros. OPEN^<CTRL> < o > Opens all macro files necessary for the creation of streamflow data summary files. CLOSE^<CTRL> < c > Closes all the DATSUM macros without saving any of the changes. Useful for exiting the program. SAVEFILE^<CTRL> < a > Closes the file and prompts whether you want to save changes. SNGL_FLOW_FILE^<CTRL> < S > Formats an individual mean flow file without sending data to a summary file. This is the macro that should be invoked if one wishes to look at a single streamflow data file. SNGL_RATIO_FILE^<CTRL> < s > Formats an individual max flow data file without sending data to a summary file. 65 MACROIA.XLM : Small macros that perform very simple and specialized functions related to formatting the individual streamflow data files. PARSEI, PARSE2, PARSE3, PARSE4 Parse different lines of the raw data files to put them into a format which is easily manipulated with EXCEL commands. average Selects a column of numbers and calculates the average value. (ie. mean of the mean annual flows). stdev Selects a column of numbers and calculates the coefficient of variation. (ie. coefficient of variation of the mean annual flows). parseloop Parses a selected column of cells. COUNTER Counts the number of cells in a column which contain values. MOVEDATA Creates a summary line of data at the bottom of the file. SUMDATA Copies the summary line of data created by MOVEDATA macro to the summary file. SAVEFILE^ <CTRL> <a> Closes the file and prompts whether you want to save changes. EXTRADATA Checks the file to see whether there is a second page of data below the lines referencing latitude and longitude, and if so, appends it at the appropriate place. SAVEOLD Closes the file without saving any changes. NO_REGULATED Checks the mean annual flow values in the data file and removes any letters attached to any values, such as "R" for regulated or "E" for estimated. 66 NO REGULATED2 Checks the mean monthly flow values in the data file and removes any letters attached to any values, such as "R" for regulated or "E" for estimated. QUIKDATA.XLM : Macros that specify what sort of symbols will represent each streamf low data file. HATCH Specifies the hatch pattern for each symbol. Current stations have a solid hatch pattern while discontinued stations have a blank hatch pattern. SHAPE Specifies symbol shapes depending on mean of mean annual runoff values. SYMSIZE Specifies symbol size and colour depending on catchment area. OPERATION Specifies the operation code. In this case, "B" is specified to indicate boxed labels. See page 11-11 in the Quikmap manual. DATAFILL Invokes a number of QUIKDATA macros. ALIGNMENT Specifies left alignment for all columns. SYMBOL Specifies that each file be represented by a symbol. DATABASE1 Marks all columns of data as a database so that the file can be formatted as a dBASE III file. SHOW_TEXT Creates the LBL TEXT data in the summary file. YES_DATA Macro invoked by the SHOW_TEXT macro if MAR and CV data have been calculated for a specific station. Creates a data line which contains the station name, MAR and CV. This data line is formatted so that it can be read as a label by the QUIKMAP software. 67 NO_DATA Macro invoked by the SHOW_TEXT macro if MAR and CV data have not been calculated for a specific station. Creates a data line which contains the station name and a message indicating that no values have been calculated. This data line is formatted so that it can be read as a label by the QUIKMAP software. ROUND_NUM Rounds off MAR and CV values to desired significant figures, permitting the values to be used by the YES_DATA macro. (Otherwise, the figures are too long and the YES_DATA macro has difficulty handling them. STATION MOVE Copies the station names to another area of the summary spreadsheet so that this data can be used by the YES_DATA and NO_DATA macros. FORMAT_RATNUM Formats the ratio numbers to two decimal places. CLEAR_DATA Clears data which was used to format other data, but which is not needed in the final summary file. LATLONG_CHECK Checks the latitude and longitude values in the data summary file, and places a zero in the cell if the cell is blank. INDEX.XLM : Macros that take data from the index files and copy it to the summary files. INDEXX Parses data lines and then selects data and copies it to a summary file. PARSELOOP Parses the data in the index file. SENDNAME Sends station name and number to the data summary file. SENDYEAR Sends years of record and count to the data summary file. 68 COUNTCHECK Checks whether the number of data lines in the summary file agrees with the number of data lines in the index file. That is, it checks to see whether all the streamflow files have been processed and had their information sent to the data summary file. EXTRAINDEX Checks the file to see if there is a second page of data below the lines referencing latitude and longitude, and if so, appends it at the appropriate place. FORMAT_NUMBER Formats the unit area mean annual runoff values to have 3 decimal places, the cv values to have 2 decimal places and the MAR values to have 1 decimal place. FLOW.XLM : Macros that link other macros together to format streamflow data files. FLOW Links many of the other macros together to parse a raw data file, perform a number of calculations, summarize information at the bottom of the file, send the data summary to a summary file, and then exit the file without saving any changes. SNGL_FLOW_FILE^<CTRL> <S> Performs same functions as the FLOW macro, without sending the data to a summary file or exiting the file. This is the macro that should be invoked if one wishes to look at a single streamflow data file. RATIO.XLM : Macros involved in formatting streamflow max flow files. RATIO Parses raw data max flow files and enacts the COUNT RAT macro. PARSE5, PARSE6 Parse different lines of the raw data files to put them into a format which is easily manipulated with EXCEL commands. CLEAN_UP Removes lines of record which are incomplete. 69 EXTRARATIO Checks the file to see whether there is a second page of data below the line referencing data type, and if so, appends it at the appropriate place. RATIO_CALC Calculates the ratio of max instantaneous flow to max daily flow for every year of complete record. SUM_CALC Calculates the mean, standard deviation, and coefficient of variation of the annual max daily flows and of the ratios of max annual instantaneous to max annual daily flows. Also calculates the correlation coefficient between the max daily flows and the ratios of max instantaneous to max daily flows. MOVE_RAT Creates a summary line of data at the bottom of the file. SUM_RAT Copies the summary line of data created by the MOVE_RAT macro to the summary file. COUNT_RAT Counts the number of complete years of max flow record, and performs one task (macro YES RAT) if there is at least one year of complete record, and performs another task (macro NO_RAT) if there is not. YES_RATIO If the macro COUNT RAT indicates that there are complete years of record for the current station, then this macro links many other macros together to format the file, perform statistical calculations, and copy the results to a summary file. NO_RATIO If the macro COUNT_RAT indicates that there are NO complete years of record for the current station, then this macro places the number -9999 in the appropriate columns of the summary file. SNGL RATIO_FILE^<CTRL> <s> Formats an individual max flow data file without sending data to a summary file. ONE RATIO If the macro COUNT_RAT indicates that there is only one complete year of record for the current station, then this macro calculates the ratio for that year 70 and places it in the M_RATIO column of the summary file and -99.99 in the CV_RATIO column. CALC_ UA_ MEAN In the summary file, this macro calculates the unit area mean of the annual max daily flows, if the catchment area is known. Otherwise, it places the value -99.99 in the appropriate column. Note that the value -99.99 is already in place if there are no complete years of data (see SUM_RAT and ONE RATIO). SETUP.XLM : Macros involved in setting up the other macros so that they will operate for a specific data file or group of files. MASTER_FLOW Master macro that opens individual streamflow data files and invokes the FLOW macro. This is the macro that is used to create the flow data for the data summary files. MASTER_RATIO Master macro that opens individual max flow data files and invokes the RATIO macro. This is the macro that is used to create the ratio data for the data summary files. FILEOPEN_SETUP Copies all the station names from the appropriate index file to the SETUP.XLM macro sheet so that the MASTER_FLOW and MASTER_RATIO macros can be configured to open the appropriate files. RENAME_FILE Changes the file references on the SETUP.XLM macro sheet. BEGIN_SETUP Sets up the macro sheet SETUP.XLM so that the macros FILEOPEN SETUP and RENAME_FILE can operate. Also sets up references in the DATSUM.XLM macrosheet. MASTER_FILE_SETUP Master command macro which invokes the BEGIN_SETUP, RENAME_FILE and FILEOPEN_SETUP macros. Once the MASTER_FILE_SETUP macro has been run all the appropriate macro sheets should have been configured correctly so that the MASTER_FLOW and MASTER_RATIO macros operate in the desired fashion. 71 MASTER_FILE_CREATE Master command macro which invokes the MASTER_FLOW and MASTER_RATIO macros. This macro should only be run after the macro sheets have been properly configured by the MASTER_FILE_SETUP macro. MASTER_MASTER Master command macro which invokes all the appropriate macros so as to create a complete summary file. OPEN.XLM : Macros designed to open and close various combinations of files. OPEN^ <CTRL> <o> Opens all macro files necessary for the creation of streamflow data summary files. CLOSE^ <CTRL> <c> Closes all the DATSUM macrosheets, except OPEN.XLM, without saving any of the changes. DATSUM.XLM: Macros designed to control the other macros. RUN_DATSUM Master program macro which opens and closes files as needed and orchestrates all the other macros. 72 Figure A.2 - Flow Chart Illustrating DATSUM Macro Relationships (page 1 of 3) 73 BEGIN_SETUP FILEOPEN_SETUP INDEX.XLM^EXTRAINDEX MACRO1A.XLM^SAVEOLD RENAME FILE FLOW MACRO1A.XLM^EXTRADATA PARSE2 parseloop PARSE3 PARSE4 NO_REGULATED NO REGULATED2 average stdev COUNTER MOVEDATA SUMDATA SAVEOLD INDEXX INDEXXLM EXTRA1NDEX PARSELOOP SENDNAME SENDYEAR FORMAT NUMBER COUNTCHECK MACRO1A.XLM SAVEOLD DATAFILL QUIKDATA.XLM HATCH SHAPE SYMSIZE OPERATION SYMBOL SHOW_TEXT QUIKDATA.XLM^ROUND_NUM DATABASE1 STATON_MOVE CLEARDATA YES_DATA NO_DATA RATIO RAT10.XLM^IXTRARATIO PARSES PARSE() COUNT RAT RATIO XLM^ NO_RATIO ONE_RATIO YES_RAT10 Figure A.2 - Flow Chart Illustrating DATSUM Macro Relationships (page 2 of 3) 74 MACRO1A.XLM ^ SAVEOLD RA110.XLM CLEAN_UP CALC_UA_MEAN MACRO1A.XLM ^ SAVEOLD RAT1O.XLM CLEAN_UP RATIO CALC SUM_CALC MOVE_RAT SUM_RAT RAT1O.XLM CALC_UA_MEAN MACRO1A.XLM ^ SAVEOLD Figure A.2 - Flow Chart Illustrating DATSUM Macro Relationships (page 3 of 3) 75 PARSE1 PARSE3PARSE2 =ECHO(FALSE) =ECHO(FALSE) =ECHO(FALSE) =RETURN() =RETURN() =SELECTCRC[3]*) Appendix A.3 - Programming Code Macrosheet : MACRO1A.XLM =PARSECO[li^ STILL CRE =PARSEC[1912][ 42.8 ][ 53.0 =PARSE('[LOCATION - LATH ][4 =EDIT.DELETE(1) =SELECTCRC[1) =EDIT.DELETE(1) =SELECT(RCF4r) =RETURN() 76 PARSE4 stdevaverage =EDIT.DELETE(1) =RETURN()=RETURN() =ECHO(FALSE) =ECHO(FALSE) =ECHO(FALSE) =PARSE('[^][ LONG =SELECT(MAR') =SELECT('SD') =SELECT(RC[3]') =FORMULA(=AVERAGE(R7C14:BOT2)") =FORMULA("=STDEV(R7C14:BOT2)") =SELECT("RC[1]") =EDIT.DELETE(1) =SELECTCRC[-4]') =RETURN() 77 parseloop COUNTER =ECHO(FALSE) =ECHO(FALSE) =SELECT("R7C1") =SELECT(R7C14') =SELECT.EN D(4) =SE LEGIT R[1]C") =S E LECT(' R7C1 :BOT1 ") =SELECTCCOUNT") =RETURN() =ELSE() =DEFINE.NAME("BOT1") =IF(ISBLANK(ACTIVE.CELL())) =RUN("MACRO1A.XLMIPARSET) =FORMULA(1) =SELECT(COUNT) =FORMULA(=COUNT(R7C14:130T2)") =END.IF() =RETURN() 78 MOVEDATA MOVEDATA (confd) SUMDATA =ECHO(FALSE) =SELECT(' R[10]C[8r) =SELECT(RC:RC[30]) =COPY() =PASTE.SPECIAL(3,1,FALSE,FALSE) =COPY() =SELECTCR[3]C[11r) =CAN CELCO PY0 =ACTIVATE('SUMBASE.XLS') =PASTE() =SE LECTC R[-9]C[-8]" ) =SELECT(R16384C3') =CANCEL.COPY() =SELECT(RC:RC[2]) =SELECT.EN D(3) =SELECTCRF7P[-131") =COPY() =SELECTCR[1]C") =IF(ISERROR(ACTIVE.CELL())) =SELECT("R[9]C[11r) =PASTE() =SELECTCR[7]CF3r) =PASTE.SPECIAL(3,1,FALSE,FALSE) =GANGS...COPY() =FORMULA(-999.9) =CAN CEL.COPY() =ACTIVATE.NEXTO =ELSE() =SELECT('R1C1") =RETURN() =COPY() =COPY() =SELECTCR[7]C[-31") =SELECT('R16384C1") =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT.END(3) =END.IF() =SELECT("RC[24]") =CANCEL.COPY() =PASTE() =SELECT(`R[-71C[5]") =CANCEL.COPY() =IF(ISERROR(ACTIVE.CELL())) =SELECTCR[-21CF21r) =SELECT(' R[7]C(-4r) =IF(ISERROR(ACTIVE.CELL())) =FORMULA(-9.99) =SELECT('R[2]C[24]") =ELSE() =FORMULA(-9.99) =COPY() =ELSE() =SELECTCR[7]C[-41") =COPY() =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT("R[2)C[24]") =CANCEL.COPY() =PASTE.SPECIAL(3,1,FALSE,FALSE) =END.IF() =CANCEL. COPY() =SELECT("R[-5]C[4r) =END.IF() =IF(ISERROR(ACTIVE.CELL())) =RETURN() =SELECT('R[5]C[-3r) =FORMULA(-99.99) =ELSE() =COPY() =SELECT("R[51CF31") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =END.IF() =SELECT('R[-10]C[6]") =COPY() =SELECTCR[10]C[-5r) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT(13[-3]C") =COPY() =SELECT(' R[3]C[31") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT(' R[-10]C[-4r) =SE LECT(" RC:RC[2]*) =COPY() 79 SAVEFILE EXTRADATA SAVEOLD =SAVE.AS(, 1) =ECHO(FALSE) =SAVE.AS(,0) =FILE.CLOSE() =SELECT(*R7C1") =FILE.CLOSE() =RETURN() =SELECT.END(4) =RETURN() =SELECT(R[13]C") =PARSEC[1912][ 42.8 j[ 53.0 ][ 24.5 =S ET.NAME(' E)CTRA",ACTIVE.CELLO) =IF(ISTEXT(EXTRA)) =SELECT("R(11C") =DEFINE.NAMECDELB1 =SELECT("R[-131C") =DEFINE.NAME(DELT") =SELECT(' DELT:DELB") =EDIT.DELETE(2) =END.IF() =SELECT(131C1') =RETURN() 80 NO_REGULATED NO_REGULATED2 =ECHO(FALSE) =ECHO(FALSE) =SELECT("R8C14") =SELECT('B0T2') =IF(ISBLANK(ACTIVE.CELL())) =SELECT("R[21C") =SELECT("R7C14') =DEFINE.NAME(MMEND") =DEFINE.NAME(B0T2') =SELECT.END(1) =ELSE() =SELECT(RC[1]") =SELECTCR7C141 =DEFINE.NAME('MMSTART') =SELECT.END(4) =FOR.CELL("Regulate2MMSTART:MMEND",TRUE) =DEFINE.NAME(BOT2") =SELECT(Regulate2) =END.IF() =IF(ISNUMBER(Regulate2)) =SELECT(R7C14') =GOTO(R[22]C) =FOR.CELL('Regulate",'R7C14:BOT2",TRUE) =ELSE() =SELECT(Regulate) =COPY() =IF(ISNUMBER(Regulate)) =SELECT(RC[15]") =GOTO(R[221C) =PASTE() =ELSE() =CANCEL.COPY() =COPY() =IF(LEN(ACTIVE.CELL())) =SELECT('RC[2]") =PARSE(10.239][R1') =PASTE() =ELSE.IF(LEN(ACTIVE.CELL()=5)) =CANCEL.COPY() =PARSE(141.51[131") =IF(LEN(ACTIVE.CELL())=6) =ELSE.IF(LEN(ACTIVE.CELL()=4)) =PARSE([0.239][R]") =PARSE(1311j[R]") =ELSE.IF(LEN(ACTIVE.CELL()=5)) =ELSE.IF(LEN(ACTIVE.CELL0=3)) =PARSEC[41.51[1:11') =PARSE('[10][R]*) =ELSE.IF(LEN(ACTIVE.CELL()=4)) =ELSE.IF(LEN(ACTIVE.CELL()=2)) =PARSE(1311][R]") =PARSEr[11[R]*) =ELSE.IF(LEN(ACTIVE.CELL0=3)) =END.IF() =PARSE(110][R]*) =SELECT("RC") =ELSE.IF(LEN(ACTIVE.CELL()=2)) =COPY() =PARSEC[1][13]") =SELECT('FICH5r) =END.IF() =PASTE() =SELECT("RC") =CANCEL.COPY() =COPY() =END.IF() =SELECT("RCF2r) =NEXT() =PASTE() =RETURN() =CANCEL.COPY() =END.IF() =NEXT() =RETURN() 81 Macrosheet : QUIKDATA.XLM HATCH =SELECT('R1C7') =SELECT.END(4) =DEFINE.NAME('YEAREND') =SELECT('R2C7:YEAREND') =COPY() =SELECT('R2C36') =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =PARSE(11915-1[19541') =SELECT("R3C37") =SET.NAME('blank5',ACTIVE.CELL()) =I F(ISBLANK(blank5)) =SELECT("R2C37") =DEFINE.NAME('YEAREND2') =ELSE() =SELECT("R2C37") =SELECT.EN D(4) =DEFINE.NAME('YEAREND2') =END.IF() =SELECT("R2C37") =FOR.CELL(year1","R2C37:YEAREND2',FALSE) =IF(DEREF(year1)=1990) =SELECT("RC25') =FORMULA(1) =ELSE() =SELECT(RC25") =FORMULA(0) =END.IF() =SELECT("R[1]C') =NEXT() =RETURN() 82 SHAPE =SELECT("R1C4") =SELECT. END(4) =DEFINE.NAME(RUNEND') =SE LECT('R2C4') =FOR.CELL('run 1' ,'R2C4:RUNEND',FALSE) =IF(DEREF(run1)<O) =SELECT("RC22") =FORMULA(11) =ELSE.IF(AND(DEREF(runi )>-1,DEREF(run1)<0.025)) =SELECT("RC22') =FORMULA(1) =ELSE.IF(AND(DEREF(run1)<0.05,0R(DEREF(run1)=0.025,DEREF(run1)>0.025))) =SELECT('RC22') =FORMULA(2) =ELSE.IF(AND(D ER EF(run1)<0.075,0R(DEREF(run1).05,DER E F(run1)>0.05))) =SELECT('RC22") =FORMULA(3) =ELSE.IF(AND(DEREF(run1)<0.1,0R(DEREF(run1)=0.075,DEREF(run1)>0.075))) =SELECT("RC22') =FORMULA(64) =ELSE.IF(OR(DEREF(run1)=0.1,DEREF(run1)>0.1)) =SELECT("RC22') =FORMULA(6) =END.IF() =SELECT('R[1]C') =NEXT() =RETURN() 83 SYMSIZE =SELECT(R1C5") =SELECT.END(4) =SELECT(' RCM') =DEFINE.NAME(AREAEND") =SELECT(' R2C6') =FOR.CELL(areaR2C6:AREAEND',FALSE) =IF(DEREF(area)<100) =SELECT(' RC23") =FORMULA(1) =SELECT(RC24") =FORMULA(6) =SELECT(RC21') =FORMULA(2) =ELSE.IF(AND(DEREF(area)<500,0R(DEREF(area)=100,DEREF(area)>100))) =SELECT("RC23") =FORMULA(2) =SELECT("RC24") =FORMULA(14) =SELECT("RC21') =FORMULA(2) =ELSE .IF(AND(DEREF(area)<1000,0R(DEREF(area)=500,DEREF(area)>500))) =S ELECT(' RC23') =FORMULA(3) =SELECT("RC24") =FORM ULA(9) =SELECT('RC21") =FORM ULA(2) =ELS E.IF(AN D(DE R EF(area)<5000,0R(D EREF(area)=1000,DER E F(area)>1000))) =SELECT('RC23") =FORMULA(4) =S ELECT(" RC24') =FORMULA(11) =SELECT(RC21") =FORM ULA(2) =E LS E.IF(AN D(D EREF(area)<10000,0R(DER E F(area)=5000,DER E F(area)>5000))) =SELECT(RC23") =FORMULA(5) =SELECT("RC24') =FORMULA(12) =SELECT('RC21') =FORMULA(2) =ELSE.IF(AN D(DEREF(area)<15000,0R(DEREF(area)=10000,DEREF(area)>10000))) =SELECT(' RC23') =FORM ULA(6) =SELECT('RC24') =FORMULA(13) =SELECT("RC21') =FORM ULA(2) 84 SYMSIZE (coned) OPERATION =SET.NAME("blank6',ACTIVE.CELLO)=SELECT(' RC23') =IF(ISBLANK(blank6))=FORMULA(7) =SELECT(RC24') =SELECT(R2C26") =FORMULA(10) =FORMULA("B") =SELECT('RC21") =ELSE() =FORMULA(2) =SELECT("R2C25') =END.IF() =SELECT.END(4) =SELECT(13[1]C") =SELECTCRC[1].) =DEFINE.NAME("OPEND')=NEXT() =RETURN() =SELECT("R2C26") =ELSE.IF(OR(DEREF(area)=15000,DEREF(area)>15000)) =SELECT("R3C1') =FORMULA(B") =COPY() =SELECT("R3C26:0PEND') =PASTE() =CANCEL.COPY() =END.IF() =RETURN() 85 DATAFILL SYMBOLALIGNMENT =DEFINE.NAME(SYSEND")=RETURN() =RUN("QUIKDATA.XLMHATCH") =S ELECT("R1 C24') =IF(ISBLANK(blank6)) =RUN("QUIKDATA.XLMISHAPE") =SELECT.END(4) =SELECT("R2C20') =RUN(QUIKDATA.XLMISYMSIZE") =DEFINE.NAME("ALEND") =FORMULA("S") =RUN(QUIKDATA.XLMIOPERATION") =SELECT("1:11C1:ALEND") =ELSE() =RUN(QUIKDATA.XLMISYMBOL') =ALIGNMENT(2) =SELECT("R2C19") =R UN("Q U1KDATA.XLMISHOW TEXT) =SELECT("R1C1') =SELECT. END(4) =RUN(QUIKDATA.XLMIDATABASE1") =RETURN() =SELECT('FIC[1].) =SELECT(R2C20") =FORMULA(S) =COPY() =SELECT(R3C20:SYS END") =PASTE() =CANCEL.COPY() =END.IF() =RETURN() 86 DATABASE1 SHOW_TEXT SHOW_TEXT (cont'd) =SELECT(R1C31') =RUN("QUIKDATA.XLM!ROUND_NUM') =SELECT("R[1]C[-1]") =SELECT.END(4) =RUN('QUIKDATA.XLMISTATION_MOVE) =NEXT() =SELECTCRC[11") =SELECT(R1C19") =SELECT("textop3:texbot3') =DEFINE.NAME("DATAEND') =SELECT.END(4) =COPY() =SELECT("R1C1:DATAEND') =DEFINE.NAME(texbot1") =SELECT('R2C19") =SET.DATABASE() =SELECT("R2C19:texbot1") =PASTE.SPECIAL(3,1,FALSE,FALSE) =RETURN() =COPY() =CANCEL.COPY() =SELECT("texbot1") =SELECT('namtop2:texbot3') =SELECT(*F1[2]CF1r) =CLEAR(1) =PASTE() =END.IF() =SELECT("RC") =RETURN() =DEFINE.NAME('textop2") =SELECT(R3C19") =SET.NAME(blank9",ACTIVE.CELLO) =IF(ISBLANK(blank9)) =SELECT('textop2') =FOR.CELL(text","R4C18:textop2',FALSE) =IF(LEN(text)>26) =RUN("QUIKDATA.XLMIYES_DATA") =ELSE() =RUNCQUIKDATA.XLMINO_DATA") =END.IF() =SELECT(-13[1]CF1r) =NEXT() =SELECT('R4C19') =DEFINE.NAME("textop3") =COPY() =SELECT("R2C19") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT('namtop2:textop3") =CLEAR(1) =ELSE() =SELECT(lextop2") =SELECT.END(4) =DEFINE.NAME(texbot2") =SELECTCRC[1]") =DEFINE.NAME('texbot3') =SELECT("textop2") =SELECTCRC[11') =DEFINE.NAME(textop3') =SELECT("RCF1r) =FOR.CELL.(text",lextop2:texbot7,FALSE) =IF(LEN(text)>26) =RUN("OUIKDATA.XLMWES_DATA') =ELSE() =RUN('QUIKDATA.XLMINO_DATA") =END.IF() 87 YES_DATA NO_DATA =SELECTCRC[1r) =SELECT("RC[1].) =COPY() =RETURN() =FORMULA(=REPL4CE(RC18,1,7,RC1)) =FORMULA("=REPLACE(RC18,1,7,RC1)") =SELECTORCH r) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL...COPY() =SELECTCRCH 4y) =I F(LEN(ACTIVE.CELLO)=3) =SELECTCRC[15]") =FORMULA(=REPLACE(RC18,14,3,RC4)) =ELSE.IF(LEN(ACTIVE.CELL())) =SELECT("RC[15]') =FORMULA(=REPLACE(RC18,14,4,RC4)") =ELSE.IF(LEN(ACTIVE.CELLO)=5) =SELECTCRC[15r) =FORMULAC=REPLACE(RC18,14,5,RC4)") =END.IF() =COPY() =SELECTCRCF1r) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT(RC[1]") =FORMULA(=REPLACE(RC18,25,4,RC5)) =RETURN() 88 ROUND_NUM ROUND_NUM (contd) STATION_MOVE =SELECT('R3C4') =SELECT("martop2") =SELECT(R1C1") =SET.NAME("blank8",ACTIVE.CELLO) =SELECT("RC[1]") =SELECT.END(4) =IF(ISBLANK(blank8)) =DEFINE.NAME("martop3") =DEFINE.NAME("nambot1") =SELECTCR2C4") =SELECTCRC[1r) =SELECT(' R2C1:nambot1") =COPY() =DEFINE.NAME("fpitop3") =COPY() =SELECT("R4C3") =SELECTCRCH") =SELECT("nambot1') =PASTE() =FORMULAC=ROUND(RCE-113)") =SELECT(' R[2]C") =CANCEL.COPY() =COPY() =PASTE() =SELECT(" R2C30") =SELECT("R[1]C:marbot3") =CANCEL.COPY() =COPY() =PASTE() =SELECT("RC") =SELECT("R4C6') =CANCEL.COPY() =DEFINE.NAME("namtop2") =PASTE() =SELECT("R2C30:fpibot1") =RETURN() =CANCELCOPY0 =COPY() =SELECTCR4C41 =SELECT.END(4) =FORMULAC=ROUND(RCF1J,3)") =SELECTCR[2]Cf-24r) =SELECT(' R4C5') =PASTE() =FORMULAC=ROUND(RC[1],2)") =SELECT("RCF1r) =COPY() =FORMULAC=ROUN D(RC[1],2)") =PASTE.SP ECIAL() =COPY() =CANCEL.COPY() =SELECT("R[11C:fpibot3") =SELECT("R4C3') =PASTE() =CLEAR(1) =CANCEL.COPY() =SELECTCR4C61 =SELECT("martop3:fpibot3") =CLEAR(1) =COPY() =ELSE() =PASTE.SP ECIAL(3,1,FALSE, FALSE) =SELECT("R2C4") =CANCEL.COPY() =SELECT.END(4) =SELECT("martop2:marbot2") =DEFINE.NAME("marbot1") =CLEAR(1) =SELECTCRC[261") =SELECT("fpitop2:fpibot2') =DEFINE.NAME("fpibot1") =CLEAR(1) =SELECT("R2C4:marbotl ') =END.IF() =COPY() =RETURN() =SELECT.END(4) =SELECT(' R[2]C[-1]") =PASTE() =SELECT(' RC") =DEFINE.NAME("martop2") =SELECT(' RC[3]" ) =DEFINE.NAME("fpitop2") =SELECT(' martop2") =SELECT.END(4) =DEFIN E.NAME("marbot2") =SELECT("RC[1]") =DEFINE.NAME("marbot3") =SELECT(' FIC[1r) =DEFINE.NAME("fpibot3') =SELECT(' RCM") =DEFINE.NAME("fpibot2") 89 FORMAT RATNUM LATLONG_CHECKCLEAR_DATA =SELECT.END(4) =SELECT.EN D(4)=CLEAR(1) =FORMAT.NUMBER(0.00") =SELECT(R2C13') =FOR.CELL(Blank","R2C13:latbor ,FALSE)=SELECT(R1C29') =SELECT.END(4) =SELECT(Blank) =DEFINE.NAME(ccbot1') =IF(ISBLANK(Blank)) =SELECT(R2C27:ccbot1') =FORM U LA(0) =FORMAT.NUMBER(0.00') =SELECTCRC[1r) =SELECT('R1C31") =FORMULA(0) =SELECT.END(4) =SELECT(' RC[1] ) =DEFINE.NAME("uambot1") =FORMULA(0) =SE LECT('R2C31:uambot1') =SELECT(RC[1]") =FORMAT.NUMBER(0.000") =FORMULA(0) =RETURN() =SELECT(RC[1].) =SELECT("1:11C11') =SELECT(' R2C27:R300C27') =DEFINE.NAME(cvrbot1') =SELECT('R1C27') =SELECT('RC[1n =SELECT("R2C10:cvrbot1') =RETURN() =DEFINE.NAME("latbot") =FORMULA(0) =SELECTCRC[11') =FORMULA(0) =END.IF() =NEXT() =RETURN() =SELECT(131C12') 90 INDElOC PARSELOOP =ECHO(FALSE) =S ELECT(' R6C1 =DEFINE.NAME('BOT1')=RETURN() Macrosheet : INDEX.XLM =RUN(INDEX2.XLMIEXTRAINDEX) =SET.NAM ECblank1 ',ACTIVE.CE LW) =RUN(' NDEX2.XLMIPARSELOOP) =IF(ISBLANK(blank1)) =RUN(I N DEX2.XLMISEN DNAME) =SELECT(R3C1:R4C1') =RUN(INDEX2.XLMISENDYEAR) =PARSEC[PRil NO. ][NAME =RUN(INDEX2.XLMIFORMAT NUMBER) =ELSE() =RUN(INDEX2.XLMICOUNTCHECK) =SELECT("R3C1') =RUN(MACRO1 A.XLMISAVE0 LD') =SELECT. EN D(4) =SELECT('R3C1:BOT1') =PARSEC[PFI][ NO. ][NAME =END.IF() =RETURN() 91 SENDNAME SENDYEAR =SELECT("R5C3") =SELECTCR5C61 =SET.NAME("blankT,ACTIVE.CELLO) =SET.NAME(blankT,ACTIVE.CELLO) AF(ISBLANK(blank2)) =IF(ISBLANK(blank3)) =S E LECT(' R4C2: R4C3") =SE LECT(' R4C5: R4C6') =COPY() =COPY() =ACTIVATE(S U MBASE.XLS' ) =ACTIVATE('SUMBASE.XLS') =S ELECT(' R2C1 ) =S ELECT(' R2CT ) =PASTE() =PASTE() =CANCEL.COPY() =CANCEL.COPY() =ACTIVATE.N EXTO =ACTIVATE .NEXT() =ELSE() =ELSE() =S ELECT(' R4C3') =SELECT(' R4C6' ) =SELECT.END(4) =SELECT. E N D(4) =DEFI NE. NAME('END1 ) =DEFINE.NAME("END2') =S ELECT(' R4C2:EN D1') =SELECT(' R4C5:EN DT) =COPY() =COPY() =ACTIVATE(' SUMBAS E.XLS') =ACTIVATE(SUMBASE.XLS') =S ELECT(' R2C1 ') =SELECT("R2C7") =PASTE() =PASTE() =CAN CE L.COPY() =CANCELCOPY() =ACTIVATE. N EXT() =ACTIVATE.NEXTO =END.IF() =EN D.I F() =RETURN() =RETURN() 92 COUNTCHECK EXTRAINDEX =ACTIVATE(SUMBASE.XLS') =SELECT('R3C1') =SELECT(R3C4') =SELECT.END(4) =SET.NAME("blank4',ACTIVE.CELL()) =SELECT('R[2]C') =IF(ISBLANK(blank4)) =PARSEC[PR][ NO.^][NAME =ACTIVATE.NEXTO =SET.NAME( EXTRA1',ACTIVE.CELLO) =GOTO(R[30]C) =IF(ISTEXT(EXTRA1)) =END.IF() =DEFINE.NAME(IDELB1') =SELECT("R2C4') =SELECT(R[-2]C') =SELECT.END(4) =DEFINE.NAME('DELT1') =DEFINE.NAME(END3') =SELECT('DELT1:DELB1') =SELECT(R[4]C") =EDIT.DELETE(2) =FORMULA(=COUNTA(R2C4:END3)") =END.IF() =COPY() =SELECT('R1C1') =ACTIVATE.NEXT() =RETURN() =SELECT(R3C3') =SELECT.END(4) =SELECT('R[3]C") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SET.NAME(surn',DEREF(ACTIVE.CELL())) =SELECT(14[4]C') =DEFINE.NAME(END4') =SELECT('R[2]C') =FORMULA(=COUNTA(R4C3:END4)') =SET.NAME(index',DEREF(ACTIVE.CELL())) =SELECTCRCF2r) =FORMULA(INDEX COUNT =') =SELECT(*R[1]C') =FORMULA('SUMMARY COUNT =') =IF(sum=index) =GOTO(R[5]C) =ELSE() =ALERT(*The numbers of stations in the summary =HALT() =END.IF() =RETURN() 93 FORMAT_NUMBER =ACTIVATE(' SUMBASE.XLS') =SELECT("R1 C5") =SELECT.END(4) =DE Fl N E.NAME(" LAST1 ') =SE LECT(" R2C5:LAST1 ') =FORMAT.N UM BER("0.00') =SELECT(' R 1 04') =SELECT.EN D(4) =DEFINE.NAME("END3") =SELECT(' R2C4:EN D3') =FORMAT.N UM BER("0.000') =S ELECT(' R 1 C3') =SELECT.EN D(4) =DEFINE.NAME("LAST2") =S ELECT(' R2C3:LAST2') =FORMAT.NUMBERC0.01 =SELECT("R1 C30') =SELECT.EN D(4) =DEFINE.NAME("LAST4") =SELECT("R2C30:LAST4') =FORMAT.N UM BER("0.00') =SELECT(" R1 C9') =S ELECT.EN D(4) =S ELECT(' RC[21') =DEFINE.NAME("LAST3") =SELECT("R2C10:LAST3') =FORMAT.NUMBER("0.000') =ACTIVATE.NEXTO =RETURN() 94 Macrosheet : FLOW.XLM FLOW =ECHO(FALSE) =RUN(MACRO1A.XLMIEXTRADATA') =SELECT('R5C1') =RUN('MACRO1A.XLMIPARSE2") =RUN("MACRO1A.XLM!parseloop") =SELECT("R8C1') =IF(ISBLANK(ACTIVE.CELL())) =SELECT(R[3]C") =ELSE() =SELECT("RF11C") =SELECT.END(4) =SELECT.END(4) =SELECT("R[2]C") =END.IF() =RUN(MACRO1A.XLM1PARSE3') =SELECT("R[1]C") =RUN("MACRO1A.XLMIPARSE4") =SELECT("R[1]C") =SELECTCRC:R[6]C[9]") =EDIT.DELETE(1) =SELECT(RC[5]") =FORMULA(per unit area') =SELECT.END(1) =SELECT("R[1]C") =FORMULArMean Annual Runoff = ") =SELECTCR[11C") =FORMULA("Standard Deviation = ") =SELECT('R[1]C') =FORMULACCoefficient of Variation =") =SELECT(*R[2]C") =FORMULA("Years of Complete Record = ") =SELECTCR[-4]C[3]") =DEF1NE.NAME('MAR') =RUN("MACRO1A.XLMINO_REGULATED') =RUN(MACRO1A.XLMINO_REGULATED2") =RUN(MACRO1A.XLMlaverage") =SELECT(*R[1]C") =DEF1NE.NAME("SD') =R U N('MACRO1 A.XLMistdev') =SELECT("R[1]C") =FORMULA('=SD/MAR") =SELECTCR[21C') =DEFINE.NAME(COUNT') =RUN(MACRO1A.XLM!COUNTER") =SET.NAME(counr,DEREF(ACTIVE.CELL())) =SELECTCRE2jC[21') 95 FLOW (cont'd) =FORMULA(=SD/MAR') =SELECTCRHIC") =FORMULAC=SD/RF4P[3]") =SELECT("R(-1]C") =FORMULAC=MAR/RF3ppr) =SELECTCR[5]q-51') =FORMULACStreamflow Pattern Index = ') =SELECTCRC[3]") =FORMULAC=STDEV(RFlOJCF21:RHOJC[9])/AVERAGE(RHOIC[-2]:RHOP[9])") =SELECTCRF1p[2]") =IF(count<5) =FORMULA("TEMP001/nINSUFFICIENT DATA') =END.IF() =IF(count>4) =FORMULA("TEMP001/nMAR.000/nFPla.).00') =END.IF() =RUN("MACRO1A.XLM!MOVEDATA") =SELECT(RC1') =RUN(MACRO1A.XLM!SUMDATA') =RUN(MACRO1A.XLM!SAVEOLD") =RETURN() 96 SNGL_FLOW_FILE =ECHO(FALSE) =RUN("MACRO1A.XLM!EXTRADATA') =SELECT(' R5C1 ') =RUN(MACR01A.XLM!PARSE2') =RUN("MACRO1A.XLM!parseloop') =SELECT('REIC1') =I F(ISBLAN K(ACTIVE.CELL())) =SELECT('N3]C') =ELSE() =SELECT('FIF11C') =SELECT.END(4) =SELECT. END(4) =SELECT(' WIC') =END.IF() =RUN(MACR01A.XLMIPARSE3') =SELECT('R[1]C") =RUN("MACR01A.XLMIPARSE4') =SELECT('R(11C') =SELECT('FIC:13[61C[91') =EDIT.DELETE(1) =SELECTCRC[51') =FORMULA("per unit area") =SELECT.END(1) =SELECT("R[1]C") =FORMULA("Mean Annual Runoff = ") =SELECT("R[1 IC") =FORMULA("Standard Deviation = ") =SELECT("R[1 IC") =FORMULA("Coefficient of Variation = ") =SELECT("R[2]C") =FORMULA("Years of Complete Record = ") =SELECT("R[-4]C[3]") =DEFINE.NAME(MMAR") =RUN("MACRO1A.XLMNO_REGULATED") =RUN(°MACRO1A.XLMNO_REGULATED2") =RUN("MACRO1A.XLM!average") =SELECT("R[1]C") =DEFINE.NAME("SD") =RUN("MACRO1A.XLM!stdev") =SELECT("R[1 IC") =FORMULA("=SD/MAR") =SELECT("11[2]C") =DEFINE.NAME("COUNT") =RUN("MACRO1 A.XLMICOUNTER") =SET.NAME("count",DEREF(ACTIVE.CELL())) -SELECT(" R[-2]C[2]") =FORMULA("=SD/MAR") =SELECT("R[-11C") 97 SNGL_FLOW_FILE (cont'd) =FORMULAC=SD/R[-4]C[3]") =SELECT("RHIC") =FORMULA("=MAR/RF31C[3]") =SELECTCR[5]C[-5]") =FORMULA(Streamflow Pattern Index =") =SELECTCRC[31") =FORMULAC=STDEV(R[-10]C[-2]:R(-101C[9])/AVERAGE(RHO1CF21:RHOIC[9])") =SELECTCRE1)Cpy) =IF(count<5) =FORMULA(INSUFFICIENT DATA") =END.IF() =IF(count>4) =FORMULA("DATA OK') =END.IF() =SELECT("R1C16:R200C32") =CLEAR(1) =SELECT(131C1") =RETURN() 98 RATIO PARSE6PARSES Macrosheet : RATIO.XLM =ECHO(FALSE) =ECHO(FALSE) =ECHO(FALSE) =R UN(' RATIO.XLM!EXTRARATIO') =PARSECEYEARli =PARSE(11982][ =SE LECT("R5C1 =RETURN() =RETURN() =R UN ("RATIO.XLM!PARS E5") =SELECT.EN D(4) =SELECT. END(4) =DEFINE.NAME(RATEND*) =SELECT(R7C1:RATEND") =RUN(' RATIO.XLMIPARSE6') =RUN(RATIO.XLM!COUNT RAT') =RETURN() 99 CLEAN_UP EXTRARATIO =ECHO(FALSE) =ECHO(FALSE) =SELECT("R7C1') =SELECTCR7C1") =SELECT. END(4) =S E LECT. EN D(4) =DEFINE.NAME("CLEND") =SELECT('R[1 1]C") =SET.NAME(EXTRAV,ACTIVE.CELLO)=SELECT("R7C5') =IF(ISBLANK(ACTIVE.CELL())) =SELECT("R[1 ]a.) =SELECT(R) =SELECT('R:R[-1 1]) =ED IT.DELETE(2) =EDIT.DELETE(2) =SELECT(RC5') =END.IF() =ELSE() =SELECT('R7C1') =SELECT("RC9") =SELECT.EN D(4) =IF(ISBLANK(ACTIVE.CELL())) =SELECT("R[21C') =SELECT('W) =SELECTCR:R[31') =ED1T.DELETE(2) =EDIT.DELETE(2) =SELECT(RC5') =RETURN() =FOR.CELL("cleare,"R7C1:CLEND',FALSE) =IF(ISTEXT(EXTRA2)) =ELSE() =SELECT('Rf 11C[-4]") =END.IF() =END.IF() =NEXT() =RETURN() 100 RATIO_CALC SUM_CALC =ECHO(FALSE) =ECHO(FALSE) =SELECT(R7C15') =SELECT("R7C2') =SELECT.END(4) =SELECT.END(4) =SELECTCRC[21') =DEFINE.NAME("MAXIBOT') =DEFINE.NAME("RATBOT') =SELECT("R7C7') =SELECT(*R7C17") =SELECT.END(4) =FORMULAC=RCF151/RCH Or) =DEFINE.NAMECMAXDBOT") =COPY() =SELECT('R7C1") =SELECT("R8C17:RATBOT") =SELECT.END(4) =PASTE.SPECIAL(1,1,FALSE,FALSE) =SELECTCR[4]C") =RETURN() =FORMULA("Mean of Ratio of max instant/max daily =') =SELECT('R[2]C") =FORMULA('StDev of Ratio of max instant/max daily =') =SELECT('R[2]C') =FORMULA("C.V. of Ratio of max instant/max daily =') =SELECT('R[2]C') =FORMULAffears of Complete Record =") =SELECT("R[2]C') =FORMULA("Mean of annual max daily flows =') =SELECT("R[2]C') =FORMULA("StDev of annual max daily flows =") =SELECT('R[2]C") =FORMULACC.V. of annual max daily flows =') =SELECTCRPIC") =FORMULA("Correlation Coefficient between max daily flows and ratio =") =SELECT('FI[-14]C[6]") =DEFINE.NAME("MEANRAT") =FORMULA("=AVERAGE(R7C17:RATBOT)") =SELECT('R[2]C") =DEFINE.NAME("SDRAT') =FORMULA("—STDEV(R7C17:RATBOT)') =SELECT("R[2]C') =DEFINE.NAME("CVRAT') =FORMULA('=SDRAT/MEANRAT") =SELECT("R[2]C') =DEFINE.NAME("FCOUNT") =FORMULA("=COUNT(R7C17:RATBOT)') =SELECTCR[21C') =DEFINE.NAME("MEANMAX') =FORMULA("=AVERAGE(R7C7:MAXDBOT)') =SELECT('13[21C-) =DEFINE.NAME('SDMAX") =FORMULA("=STDEV(R7C7:MAXDBOT)') =SELECT('R[2]C') =DEFINE.NAME("CVMAX') =FORMULA("=SDMAX/MEANMAX") =SELECT("R[2]C") =DEFINE.NAME("CC') 1 01 MOVE_RAT SUM_RAT =ECHO(FALSE) =SELECT(`SUM_RAT LI N E1") =SELECT("MEAN RAT") =S ELECT(' RC2:RC[2]*) =COPY() =COPY() =S ELECT(' R[171CF5r) =ACTIVATE('S UM BASE.XLS") =PASTE.SPECIAL(3,1,FALSE,FALSE) =S ELECT(' R16384C10") =DEFINE.NAME('SUM_RAT_LINE1') =SELECT.END(3) =SELECTCCVRAT) =SELECT(' ADP') =COPY() =PASTE() =S ELECT(' R[131CF4r ) =CANCEL.COPY() =PASTE.SPECIAL(3,1,FALSE,FALSE) =ACTIVATE.NEXT() =SELECT(' FCOU NT") =SELECT(*SUM_RAT LIN E2`) =COPY() =SELECTCRC6:RC[2]") =SELECT(' R[11]C[-3]') =COPY() =PASTE.SPECIAL(3,1,FALSE,FALSE) =ACTIVATE("SUMBASE.XLS") =SELECT("MEANMAX") =SELECT(' R16384C27") =COPY() =SELECT.END(3) =SELECT(' R[9]CF1 r) =SELECT('R[11C*) =PASTE.S P ECIAL(3,1,FALSE, FALSE) =PASTE() =DEFINE.NAME("SUM_RAT_LINE2") =CANCEL.COPY() =SELECT(*CVMAX") =RUN("RATIO.XLMICALC_UA_MEAN") =COPY() =ACTIVATE.NEXT() =S ELECT(' R[5]C") =RETURN() =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT("CC") =COPY() =SELECT("R[3]C[1]') =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =RETURN() 102 COUNT RAT YES_RATIO =ECHO(FALSE) =ECHO(FALSE) =R U N(' RATIO.XLM!CLEAN_U P')=SELECT('R7C1') =RUN("RATIO.XLM!RATIO_CALC")=SELECT.END(4) =RUN(RATIO.XLM!SUM_CALC')=SELECT("RC[1]) =RUN("RATIO.XLMISUM_RAT')=SELECT("R4C1') =DEFINE.NAME(RATBOT2') =RUN(RATIO.XLM!MOVE_RAT') =SET.NAME("CH ECK',ACTIVE.CELL()) =RUN('MACRO1A.XLM!SAVEOLD") =FORMULA("=COUNT(R7C2:RATBOT2)) =RETURN() F(CH ECK) =RUN("RATIO.XLM!NO_RATIO') =ELSE.IF(CHECK=1) =RUN(RATIO.XLM!ONE_RATIO') =ELSE() =RUN(RATIO.XLMIYES_RATIO') =EN D.I F() =RETURN() 103 NO_RATIO SNGL_RATIO_FILE =R U N(" MACRO1A.XLM!SAVE0 LW) =ECHO(FALSE) =ACTIVATE(SUMBASE.XLS") =RUN(RATIO.XLMIEXTRARATI0m) =SELECT("R16384C10') =SELECT('R5C1') =RUN("RATIO.XLMIPARSE5')=SELECT.END(3) =SELECT('R[1]C") =SELECT.END(4) =FORMULA(-99.99) =SELECT.END(4) =DEFINE.NAME(RATEND")=SELECT("RC[1]*) =SELECT('R7C1:RATEND")=FORMULA(-99.99) =RUN(RATIO.XLMIPARSE6")=SELECTCRC[1r) =FORMULA(0) =SELECT("R7C1') =SELECT("R16384C27') =SELECT.END(4) =SELECT.END(3) =SELECT("RC[1]) =DEFINE.NAME("RATBOT2')=SELECT("R[1]C") =FORMULA(-99.99) =SELECT("R4C1') =SET.NAME("CHECIC,ACTIVE.CELL())=SELECT(RC[1]") =FORMULA(-99.99) =FORMULA("=COUNT(R7C2:RATBOT2)) =SELECT(RC[1]") =IF(CHECK=0) =FORMULA(-99.99) =ALERT(*There are no years of record with bol =RUN(MACRO1A.XLMISAVEOLD")=SELECT('RC[2]) =FORMULA(-99.99) =ELSE() =RUN(RATIO.XLMICLEAN_UP*) =RUN(RATIO.XLMIRATIO_CALC") =RUN("RATIO.XLMISUM_CALC") =END.IF() =RETURN() 104 =RETURN() ONE_RATIO CALC_UA_MEAN =SELECT("R7C17") =I F(G ET.CELL(5)>0) =FO RM U LA(' =R7C2/R7C7") =S ELECT(' RC[251" ) =FORM U LAC=RCF4VRC[-25]")=COPY() =SELECT('R16384C10') =SELECT(' RC[25]") =SELECT.EN D(3) =FORMULA(-99.99) =SELECT(' R[11C') =END.IF() =RU NC RATIO .XLMICLEAN_U P =SELECT('FIC[-21]") =ACTIVATE('SUMBASE.XLS') =ELSE() =PASTE.SPECIAL(3,1,FALSE,FALSE) =RETURN() =S ELECT(' RC[1r) =FORMULA(-99.99) =S ELECT(' RC[1 ]") =FORMULA(1) =ACTIVATE.NEXT() =SELECT('R7C2') =COPY() =ACTIVATE("SUMBASE.XLS') =S ELECT(' R16384C27") =SELECT.END(3) =SELECT("13[1]C“) =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT(' RC[1r) =FORMULA(-99.99) =SELECT(' RC[1r) =FORMULA(1) =SELECT(' RCF2r) =RUN("RATIO.XLMICALC_UA_MEAN") =ACTIVATE.NEXT() =RUN("MACRO1A.XLMISAVEOLD') =RETURN() 105 Macrosheet : SETUP.XLM MASTER_FLOW =OPEN r CAQU I KMAP \ FLOWDATMAR EA \TEMPOO1F1 =RUN(FLOW.XLM!FLOW) P EN(' CAQU IKMAP \FLOWDATA\AREA\INDETEMP.') =RUN(' INDEX.XLMUNDEXX") =ACTIVATE(' SUMBASE.XLS") =R U N("Q U IKDATA.XLM!DATAFI LL') =RUN('QUIKDATA.XLMICLEAR_DATA') =RETURN() 106 MASTER_RATIO =OP EN( CAQ U IKMAP \FLOWDATA \AREA \TEMP001 E.") =RUN(RATIO.XLMRATIO") =ACTIVATE("SUMBASE.XLS') =RUN("QUIKDATA.XLMIFORMAT_RATNUM") =RETURN() 107 FILEOPEN_SETUP FILEOPEN_SETUP (cont'd) =oPEN("CAQUIKMAP\FLOWDATA\AREA\INDETEMP.') =INSERT(2) =RUN(INDEX.XLMEXTRAINDEX") =FORMULA(".") =SELECT('R3C1') =SELECT(R[2]C") =SELECT.END(4) =NEXT() =DEFINE.NAME("BOT3') =SELECT(R5C3:stat1end") =SELECT("R3C1:B0T3") =COPY() =PARSENPRII NO.^][NAME =ACTIVATE("SETUP.XLM") =SELECT("R3C3") =SELECT("R2C2") =FORMULA("F.") =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECTCRC[1]") =CANCELCOPY() =FORMULA("E.") =ACTIVATE.NEXTO =SELECT.END(4) =SELECT('R4C4') =DEFINE.NAME("stat2end") =FOR.CELL("num2","R3C4:stat2encr,FALSE) =SELECTMCF1r) =INSERT(2) =DEFINE.NAME("stat1end") =FORMULA(".") =SELECT("R4C3") =SELECT("13[21C-) =FORMULA(=REPLACE(RC2,8,2,R3C3)) =NEXT() =SELECT("R4C4") =SELECT("R5C4:stat2encr) =FORMULAC=REPLACE(RC2,8,2,R3C4)") =COPY() =SELECT("R4C3:R4C4") =ACTIVATE('SETUP.XLM") =COPY() =SELECT("R2C4) =SELECT(R5C3:stat2encr) =PASTE.SPECIAL(3,1,FALSE,FALSE) =PASTE() =CANCEL.COPY() =CANCEL.COPY() =ACTIVATE.NEXTO =SELECT(134C4:stat2end") =RU N("MACRO1A.XLM! SAVEO LD") =COPY() =ALERT("The program is now ready to work its mag =PASTE.SPECIAL(3,1,FALSE,FALSE) =RETURN() =CANCEL.COPY() =ALERT("As you should be aware, WSC's extreme flow data bas =SELECT("R1C4") =FORMULA(INPUT("Do you wish to create a dummy extreme de =1F(LEN(ACTIVE.CELLO)=3) =SELECT(R4C4:stat2end") =FORMULA.REPLACE(INPUT("Enter the name of the missing s• =ELSE() =GOTO(R48C5) =END.19) =SELECT(131C4') =FORMULA(INPUT("Do you wish to create a dummy extreme de =1F(LEN(ACTIVE.CELLO)=3) =SELECT(R404:stat2encr) =FORMULA.REPLACE(INPUTCEnter the name of the missing s' =ELSE() =GOTO(R48C5) =END.IF() =GOTO(R[-8]C) =SELECT("R4C3") =FOR.CELL('num1",'R3C3:stattencr ,FALS E) 108 RENAME_FILE =ACTIVATE("SETUP.XLM") =SELECT("R4C2') =SET.NAME(only1',ACTIVE.CELL()) =IF(ISBLANK(only1)) =SELECT("R2C2') =DEFINE.NAME('statonly1') =SELECT(R2C4') =DEFINE.NAME(statonly2-) .SELECT('R2C1") =FORMULA.REPLACE('AREA",area,2,1,TRUE,FALSE) =FORMULA.REPLACE("TEMPOO1F.",statonly1 ,2,1,TRUE,FALSE) =SELECT('R2C3') =FORMULA.REPLACE(AREA",area,2,1,TRUE,FALSE) =FORMULA.R EP LACE(TEMP001 E.",statonly2,2,1,TRUE,FALSE) =ELSE() =SELECT(R2C2') =SELECT.EN D(4) =DEFINE.NAME("nambot1") =SELECT(`R[11CF1r) =DEFINE.NAME("statbot1") =SELECT("R2C1') =FORMU LA. REP LACE(AREA',area,2,1 ,TRU E,FALSE) =SELECT("R2C1:R3C1') =COPY() =SELECT('R4C1:statbot1 ') =PASTE() =CANCEL.COPY() =SELECT(R3C2") =FOR.CELL("stat1 ','R3C1:statbot1 ',FALSE) =CLEAR(1) =SELECT("R[2]C") =NEXT() =SELECTCR2C21 =FOR.CELL("stat1 ','R2C2:nambot1 ',TRUE) =SELECTCRC(-1r) =FORMULA.REPLACE(TEMPOO1F.',stat1 ,2,1 ,TRUE,FALSE) =SE LECT('13[21C[1]*) =NEXT() =SELECT('R2C4') _-.:S ELECT. E N D(4) =DEFINE.NAME("nambot2') .SELECTCR[1]C[-11') =DEFINE.NAME(statbot2-) =SELECT("R2C3') =FORMULA.REPLACE("AREA",area,2,1,TRUE,FALSE) =SE LECT("R2C3:R3C3") =COPY() =SELECT(R4C3:statbot2') 109 RENAME_FILE (conrcl) =PASTE() =CANCEL.COPY() =SELECT(' R3C4') =FOR.CELL(stat2',`R3C3:statbot2',FALSE) =CLEAR(1) =SELECT(' R[2]C') =NEXT() =SELECT(' R2C4') =FOR.CELL(stat2R2C4:narnbot2",TRUE) =SELECTMCF11) =FORMULA.REPLACE(TEMPOO1E.',stat2,2,1 ,TRUE,FALSE) =SELECT(' WPM") =NEXT() =END.IF() =SELECT('R500C1:R506C3') =COPY() =SELECT.END(3) =SELECT("R[1]C") =PASTE() =CANCEL.COPY() =RETURN() 110 BEGIN_SETUP (cont'd)BEGIN_SETUP =ACTIVATE(SETUP.XLM") =RETURN() TEMP DATA_INPUT MASTER_FILE_SETUP =RUN(SETUP.XLMIFILEOPEN_SETUP")SUMTEMP =ACTIVATE(SUMTEMP.XLS') =RUN("SETUP.XLMIRENAME_FILE) INDETEMP =RETURN() =RUN(SETUP.XLMIBEGIN_SETUP") =FORMULA(INPUT("The results have been saved in a file named SUMBASE.X1 SUMMARY\SUMTEMP =SAVE.AS(CAQUIKMAP\FLOWDATA\SUMMARYNSUMTEMP.XLS',1) =SAVE.AS("CAQUIKMAP\FLOWDATA\SUMMARY\SUMTEMP.DBF',8) 112 MASTER_FILE_CREATE MASTER_MASTER =RUN("SETUP.XLMIMASTER_RATIO*)=RETURN() =RUN("SETUP.XLMIMASTER_FLOW") =RUN(SETUP.XLM!MASTER_FILE_SETUP") =RUN("SETUP.XLMIMASTER_RATIO") =RUN(SETUP.XLMNASTER_FLOW") =RUNCQUIKDATA.XLMILATLONG_CHECK") =RUN("QUIKDATA.XLM!DATABASE1") =BEEP() =RETURN() 113 OPEN CLOSE =ECHO(FALSE) =ECHO(FALSE) =RETURN() =SAVE.AS(,0) Macrosheet : OPEN.XLM =OPE N ("CAW IKMAKFLOWDATA \MACRO \FLOW.XLM') =ACTIVATE("SETUP.XLM") =OPEN (C:KW I KMAKFLOWDATA \MACRO \MACRO1A.XLM") =SAVE.AS(,0) =OPEN ("C :\OU IKMAKFLOWDATA \MACRO\INDEX.XLM') =CLOSE(FALSE) =OPEN("CAQU IKMAKFLOWDATA \MACROQUIKDATA.XLM') =ACTI VATE( FLOW.XLM-) =0 PEN ("C: \QU IKMAKFLOWDATA \MACRO \RATIO.XLM') =SAVE.AS(,0) =OPEN(OQU IKMAKFLOWDATA \MACRO \SETU P.XLM*) =CLOSE(FALSE) =OPEN(CAQU I KMAKFLOWDATA MACROZATSUM.XLM") =ACTIVATE(I N D EX.XLM') =CLOSE(FALSE) =ACTIVATE("QUIKDATA.XLM") =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATE("MACR01A.XLM") =SAVE.AS(,O) =CLOSE(FALSE) =ACTIVATE(RATIO.XLM") =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATE(' OATS UM.XLM") =SAVE.AS(,0) =CLOSE(FALSE) =RETURN() 114 Macrosheet : DATSUM.XLM RUN_DATSUM =IF(ALERT('Welcome to the DATSUM program. This program creates data summary files for WSC streamfiow regions. I trus =ALERT('The program will prompt you for input as required. If you make a mistake and mess things up, don't panic, just close =OPEN(CAQUIKMARFLOWDATA\SUMMARY\SUMBASE.XLS') =RUN(SETUP.XLMIMASTER_MASTER') =ACTIVATE(SETUP.XLM') =SAVE.AS(,0) =CLOSE(FALSE) =OPEN(CAQUIKMARFLOWDATA\MACRO\SETUP.XLM") =ACTIVATE("SUMBASE.XLS') =SAVE.AS(CAQUIKMAP\FLOWDATA\SUMMARY■SUM08EB.XLS',1) =SAVE.AS(CAQUIKMARFL0WDATA\SUMMARY1SUMO8EB.DBP,8) =CLOSE(FALSE) =SELECT('R2C2') =FORMULA(INPUT('The results have been saved in a file named SUM08EB.XLS. Do you wish to run the program again? An =1F(LEN(ACTIVE.CELL())=3,GOTO(R[-12]C),GOTO(R[1]C)) =ALERT("You have chosen to exit the program. All of the summary files you created have been saved in directory CAQUIKMA =ACTIVATE("SETUP.XLM') =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATErFLOW.XLM1 =SAVE.AS(,0) =CLOSE(FALSE) =ACT1VATE("INDEX.XLM") =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATE("QUIKDATA.XLM") =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATE('MACRO1A.XLM') =SAVE.AS(,0) =CLOSE(FALSE) =ACTIVATE("RATIO.XLM“) =SAVE.AS(,0) =CLOSE(FALSE) =RETURN() 115 Appendix A.4 - Installing Customized QUIKMap Files The QUIKMap program contains a number of files which can be modified to customize different applications. Basically, the majority of these files influence the way in which data is presented. The following is a list of QUIKMap files which have been created to be compatible with the DATSUM summary files. These files should be loaded into the CAQUIKMAP\FLOWDATA1SUMMARY directory before the QUIKMap software is operated. BASEMAPS.QCK : Menu file which allows the user to select a basemap from a number of choices. DATAFILE.QCK : Menu file which allows the user to select a WSC streamflow summary file (files created by DATSUM). NE_STAT.DBF : NW_STAT.DBF : CE_STAT.DBF : Database files which contain the names and locations of every CW_STAT.DBF : WSC streamflow region in British Columbia. SE_STAT.DBF : SW_STAT.DBF ALL_STAT.DBF LEGEND.LGD : File which contains a legend for the DATSUM summary files. This file can be invoked on the QUIKMap screen by pressing the keys <ALT> < L>. SUM****.EDT : This file controls how a DATSUM summary file appears on the QUIKMap screen. A different .EDT file needs to be created for every summary file, and the .EDT file must have the same name as the summary file with which it is associated. For example, the data file SUM08ED.DBF needs an .EDT file named SUM08ED.EDT. If the .EDT file is omitted, then the data will appear on the screen in standard QUIKMap form. If many data files have the same format, then .EDT files can be created by simply copying a model .EDT file many times, and renaming it appropriately. 116 APPENDIX B FLOOD PROGRAM DETAILS Appendix B.1 - FLOOD Program Organization As explained in the text, the FLOOD program was originally developed in Pascal software language, and has been rewritten for this thesis in Excel language. It is assumed that users of this new version of FLOOD will be familiar with Excel, but if this is not the case, then the Excel User's Manual (Microsoft Corporation, 1991) should be referenced. The Excel version of FLOOD consists of two worksheets, FLOOD.XLS and SUMMARY.XLS, and two macrosheets, DIALOG1.XLM and CALCS.XLM. The sheet FLOOD.XLS is a large spreadsheet on which numerous calculations are made, based on the user's input and resulting instructions from the macros. The sheet SUMMARY.XLS is a blank file into which the results of an analysis are exported. The sheet DIALOG1.XLM creates dialog boxes on the screen which permit the user to input values. Finally, the sheet CALCS.XLM contains twenty-five macros which perform various functions, and which, when run collectively, operate the program. A listing of the various macros and a description of their functions is given in Appendix B.3, as is a flowchart which illustrates the inter-macro relationships (Figure B.3). A copy of the programming code is given in Appendix B.4. Some of the code is not shown in its entirety, but in general, the missing code is text which is not significant to understanding the programming. 118 Appendix B.2 - Instructions for Use The following text presents instructions for installing and operating the FLOOD program. This program has been designed to operate in the Microsoft Excel software environment, although with modification, it may run in the Lotus 1-2- 3 environment. If the instructions are followed carefully, the program should run without incident. However, it should be noted that the program is not very robust, in that it has not been written with many safeguards against user errors, and could potentially be destroyed with improper use. Therefore, it is highly recommended that a back-up copy of the program always be kept on hand, so that if the user inadvertently destroys the program, it can easily be restored to its original state. B.2.1 Installing the Program Install the FLOOD program by copying the files FLOOD.XLS, CALCS.XLM, SUMMARY.XLS, and DIALOG1.XLM into a directory in the C:\  drive, titled FLOOD (ie. C:\FLOOD). This is imperative! The CALCS.XLM macro has many references to files in this directory. Therefore, the program will not run properly if the files are loaded into a different directory, unless the file references are changed appropriately. B.2.2 Running the Program Enter the Microsoft Excel environment. Open the files CALCS.XLM and DIALOG1.XLM. Now open the file FLOOD.XLS. Loading this file will automatically invoke the macro calcs.xlm!Master_Master, which will run the flood program and prompt the user for the appropriate inputs. The two dialog boxes, shown in Figures B.2.1 and B.2.2, will appear on the screen to allow the user to enter data. The user can move the cursor to the various input spaces in the dialog boxes by either using the mouse and clicking on the appropriate box, or by pressing the <Tab> key on the 119 '01 -Mean Estimate Low Probable High -C.V. Estimate Low Probable High  • -.Adjust Flows 1,3 Addition Factor o Multiplication Factor Low ^Î I Probable I Î High ^I 1̂ Corr. Coeff. " I. InPut O Auto Return Periods (2.5. 10, 20, 50. 100.200 500, 1000 and 1500 years) ther Return Periods Enter value as YEARS RP1 RP2 RP3 RP4 RP5 Use either the tab key or the mouse to move the cursor around the form. Figure B.2.1 - Initial Input Dialog Box Input a maximum of 10 data values (ex. 8 actual. 1 mth lamest, 1 flood exceeded) {Actual Flood Flows— newel flow02 HowO3 fluiv04 flow05 flow08 flow07 How08 110509 Howl 0 -Flood Exceeded m times in n Years— Flow^Times Year: flow01 ^I Î flow02 I I^I -Mth Largest Flood in n Years Flow^Rank Years flow01 ^Î 1 I^1 flow02 Note: If you choose to adjust the flows, results for both adjusted and unadjusted flows will be givenj Figure B.2.2 - Additional Flow Data Dialog Box 120 keyboard. Once the values have been entered in the spaces on the dialog box, the user need just press the <Enter> key to proceed. The first dialog box, titled "Initial Input", prompts the user to enter basic data, which is the minimum data required for the program to operate. The program requires estimates of the low, probable and high values of the mean annual extreme (ie. flood or low) flow and its coefficient of variation. The low value is the one which the user is 95% sure the actual value will exceed, the probable is the most likely value, and the high is the value the user is 95% sure the actual value will not exceed. The user must then specify the design return periods and is given the option of entering individual values, or selecting a standard set of return periods, namely 2, 5, 10, 20, 50, 100, 200, 500, 1000 and 1500 years. The second dialog box, titled "Additional Flow Data", allows the user to enter additional relevant data, which the program uses to update its flow estimates. The user is able to enter actual flood flows, and/or Mth largest floods in N years, and/or floods exceeded M times in N years. A maximum of ten additional flow values is permitted. One final option available to the user, is the ability to adjust flows by an addition or multiplication factor. This can be useful, for instance, when one is interested in knowing both max daily and max instantaneous flood flow estimates. In this case, the user would enter all the usual information about the max daily flows, and then would select the multiplication factor option and enter low, probable and high estimates of that factor, as well as a correlation coefficient between the factor and the max daily flows. These instructions may seem confusing, but should become clear after reading Chapter 7, Sample Application. If the reader is still confused, then it is suggested that the reader refer to Dr. Denis Russell's paper, "Estimating Flows from Limited Data"(Russell 1991). 121 Once the program has been run, it will open the SUMMARY.XLS file and save the results to this file under a new file name, specified by the user. The program will then close the FLOOD.XLS file without saving any of the changes, so that the file is ready to run again. In order to run the program again, the user need just open the FLOOD.XLS file and then follow the prompts. Note that the input values from the previous run of the program will appear in the dialog boxes, allowing the user to check his previously inputted values, and facilitating the running of "what-if' scenarios. The results of the analysis will be found in the CAFLOOD directory, under the user specified file name. The file is in standard Excel worksheet format, and can be manipulated with standard Excel commands. Note that if the user utilized the "Adjust Flows" option of the program, then the results will contain two sets of numbers, one adjusted and one unadjusted, and that the unadjusted values may appear off the screen. Simply scroll page down in order to view these values. B.2.3 Exiting the Proaram Exit the program by closing all files in the usual manner, but make sure to not save any of the changes. Notes 1. Dialog1.xlm is protected. To unprotect the file so that it can be edited, the password is DIALOG1. 2. Cell R1C1 in file FLOOD.XLS is linked to the CALCS.XLM file, so that you may experience difficulties making changes to file references in file CALCS.XLM. If you wish to eliminate the link, open file CALCS.XLM, click on the pull-down menu Formula, select Define Name, and delete the entry Auto_Open_calcs. 122 Appendix B.3 - Macros: Names and Functions This is a brief description of the Excel macros used to create the FLOOD program. All of the following macros are found in macrosheet CALCS.XLM, and they are given below in the order in which they appear in the macrosheet. MASTER_MASTER < CTRL> < m > Master macro which links all the other flood macros. The letters in brackets "< >" indicate the key strokes which can be used to invoke this command macro. MASTER_MASTER2 Master macro which links all the other flood macros. This version is automatically invoked if the user tries to input more than 10 additional data values. RETURN PERIOD Copies the return periods from the DIALOG1.XLM macrosheet and then calculates the low, probable and high return period flows for each return period. INPUT_ M _STDV Copies the user input estimates of low, probable, and high means and coefficient of variations from the DIALOG1.XLM macrosheet to the appropriate cells in the flood worksheet. DIALOG_BOX Invokes the dialog boxes which allow the user to input values. COPY NUM Copies the 25 pairs of return period flows and associated probabilities to another area on the spreadsheet, where they will be ranked in ascending order according to flow. Copies the data from one of three areas depending on the situation. Copies the data from one area if additional flow data has been entered and the probabilities adjusted, from another area if this is not the case, and from a third area if the flows have been adjusted by an addition or multiplication factor. SORT_NUM Ranks the 25 pairs of return period flows and associated probabilities in ascending order according to flow. CALC_LOW Calculates the 10th percentile flow value. 123 CALC_HIGH Calculates the 90th percentile flow value. MOVE_NUM1 Copies the return periods and low, probable and high estimates of return period flows to a summary area of the spreadsheet. MASTER_CALC Coordinates various macros to calculate the estimated low, probable and high return period flows for a number of specified return periods. ACTUAL DATA Copies previously inputed actual data values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities. MOVE_LIKELIHOOD1 Moves a column of actual data likelihood values to the likelihood summary area of the spreadsheet. FLOOD_EXCEEDED Copies previously inputted flood exceeded values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities. MOVE_LIKELIHOOD2 Copies the updated probabilities (by flood exceeded macro) to another area of the spreadsheet. Mth_LARGEST Copies previously inputted Mth largest data values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities. MOVE_LIKELIHOOD3 Copies the updated probabilities (by Mth largest macro) to another area of the spreadsheet. MASTER_FLOOD Uses additional flood data previously inputted by the user (actual data, flood exceeded or Mth largest), to update the probabilities of an event occurring. COUNT_DATA Counts the number of additional data values entered by the user. If the number is greater than 10, then the MASTER_MASTER2 macro is invoked, otherwise the MASTER MASTER macro is invoked. 124 MASTER_FACTOR Master control macro which combines a number of other macros to change the flow values by an addition or multiplication factor. ADD_FACTOR Changes the flow values by adding a specified number to each flow value. Performs one set of actions if the values have previously been altered by additional flow data, and another set, if not. MULT_FACTOR Changes the flow values by multiplying each flow value by a specified number. Performs one set of actions if the values have previously been altered by additional flow data, and another set, if not. MOVE_NUM_FACT When the "change flows by a factor" option is chosen, this macro copies the return periods to a new area of the worksheet, and then invokes other macros to calculate the new factored return period flows for each return period. MOVE_NUM2 Copies the low, probable and high estimates of the factored return period flows to a summary area of the spreadsheet. SUMMARY Saves the results of the analysis to a separate file, prompts the user to name the summary file, then closes the FLOOD.XLS file without saving any changes, so that it is ready to be used again. 125 Figure B.3 - Flow Chart Illustrating Flood Macro Relationships 126 MASTER_MASTER2MASTER_MASTER =RUN(CALCS.XLM!COUNT DATA')=SET.VALUE(excess,0) =SET.VALUE(excess,1)=IF(ALERT('Welcome to the FLOOD program. This pro =excess+1=ALERT("The FLOOD program was originally designed =RUN(CALCS.XLMIDIALOG_BOV)=ALERT("The FLOOD program uses compound probabi =RUN(CALCS.XLMIRETURN_PERIOD')=ALERT("The program is now ready to run. It will take a Appendix B.4 - Programming Code Macrosheet : CALCS.XLM =RUN(*CALCS.XLMIDIALOG_BOr) =ALERT("The program is now ready to run. It will take a few seco =RUN(CALCS.XLMICOUNT DATA') =RUN(CALCS.XLMIINPUT_M_STDV`) .4F(excess>0,GOTO(R[7]C),GOTO(Rf 11C)) =RUN(CALCS.XLM!MASTER_FLOOD') =RUN(CALCS.XLIWINPUT M_STDV') =RUN(CALCS.XLMIMASTER_FACTOR') =RUN(cALCS.XLMIMASTER_FLOOD') =RUNCCALCS.XLMISUMMARY') =RUN("CALCS.XLMIRETURN_PERIOD') =RETURN() =RUN("CALCS.XLMIMASTER_FACTOR') =RUN(CALCS.XLMISUMMARY') =SET.VALUE(excess,0) =RETURN() 127 RETURN_PERIOD INPUT M_STDV =ECHO(FALSE) =ECHO(FALSE) =ACTIVATECDIALOG 1.XLM") =ACTIVATE("DIALOG 1.XLM") =1F(C: \FLOODOIALOG1.XLM1Auto_RP=TRUE) =SELECT(R6C7:R8C7') =ACTIVATE(' FLOOD.XLS") =COPY() =SELECT(FLOOD.XLSIR49C6:R58C6") =ACTIVATENEXTO =COPY() =SELECT("R2C8') =SELECT(CAFLOOD \FLOOD.XLS1R49C1) =PASTE() =PASTE() =CANCELCOPY() =CANCELCOPY() =ACTIVATE(DIALOG1.XLM') =END.1F() =SELECT('R12C7:R14C7') =ACTIVATE(DIALOG1.XLM') =COPY() =SELECT(C: \ FLO° D\DIALOG1.XLM1R21C7:R2507) =ACTIVATE.NEXTO =COPY() =SELECT('R5C8") =ACTIVATE(FLOOD.XLS") =PASTE() =SELECT("R63C1") =CANCEL.COPY() =SELECT.END(3) =ACTIVATE('FLOOD.XLS') =SELECT(13[1]C") =SELECT(132C8) =PASTE() =COPY() =CANCEL.COPY() =SELECT("R4C3') =FOR.CELL("rp1",'CAFLOOD \FL000.XLSIR49C1:R63C1,TRUE) =PASTE() =SELECT(rp1) =CANCEL.COPY() =COPY() =SELECT(R3C8") =SELECT('FLOOD.XLSIreturn_period') =COPY() =PASTE() =SELECT(' R4C5") =CANCEL.COPY() =PASTE() =RUN(CALCS.XLM!MASTER_CALC') =CANCEL.COPY() =NEXT() =SELECT('R4C8') =RETURN() =COPY() =SELECT('R4C7") =PASTE() =CAN CEL.COPY() =SELECT(R5C8") =COPY() =SELECT('R5C2") =PASTE() =CANCEL.COPY() =SELECT('R6C8') =COPY() =SELECT('R7C2") =PASTE() =CANCEL.COPY() =SELECT(' R7C8') =COPY() =SELECT('R9C2') =PASTE() =CANCEL.COPY() =RETURN() 128 DIALOG_BOX =DIALOG.BOX(C:\FLOOD\DIALOG 1 .XLM1Initial_Input) =DIALOG .BOX(C: \FLOODDIALOG1.XLMIAdditional_Data) =RETURN() 129 COPY_NUM SORT_NUM =ECHO(FALSE) =ECHO(FALSE) ..1F(C: \FLOOD \FLOOD.XLSlinput type=0) =SORT(1,'R45C42",1) =SELECT("R16C6:R40C6") =RETURN() =I FCCAFLOOD \FLOOD.XLS1factor_type,GOTO(13[11C),GOTO(R[25]C)) =SELECT(R45C42:R69C43') =COPY() =SELECT("R45C42') nPASTE.SPECIAL(3,1,FALSE,FALSE) =CANCELCOPY() =SELECT(R16C4:R40C4') =COPY() =SELECT("R45C43') =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =ELSE() =SELECT(1316C42:R40042") =COPY() =SELECT(1345C42') =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT(*R16C40:R40C40') =COPY() =SELECT("R45C43') =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCELCOPY0 =END.IF() =GOTO(R[111C) =SELECT('R16C49:R40C49') =COPY() =SELECT(*R45C42") =PASTE.SPECIAL(3,1, FALSE, FALSE) =CANCEL.COPY() =SELECT("R16C47:R40C47') =COPY() =SELECT(' R45C43") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =RETURN() 130 CALC_LOW =ECHO(FALSE) =SELECT(' R71 C42') =I FCC AFLOO D FLOOD.XLSIlcum 1 >0.1) =FORMU LA(=(0.10/1cum 1 )*low1 ) =GOTO(R[1 9IC) =ELSE. IF(CAFLOOD \FLOOD.XLS'Ilcum2>0.1) =FO RMU LAC=low1 +(((.10-lcum1 )/(lcum2-lcum 1 ))*(low2-low1 ))') =GOTO(R[16]C) =ELSE.' F(C: \FLOOD \FLOOD.XLS'Ilcum3>0.1) =FORMULAr=low2+(((.10-lcum2)/(lcum3-lcum2))*(low3-low2))) =GOTO(R[131C) =ELS E.I FCC: \FLOOD \FLOOD.XLSIlcum4>0.1) =FORMULAC4ow3+0(.10-lcum3)/(lcurn4-lcum3))*(low4-low3))*) =GOTO(R[1 01C) =ELS E.I F(CAFLOOD \FLOOD.XLS'Ilcum5>0.1) =FORMULAC=low4+(((.10-lcum4)/(lcum5-lcum4))*(low5-low4))) =GOTO(R[7]C) =ELS E.I F(CAFLOO D \FLOO D.XLS'Ilcum6>0.1 ) =FORMULAC=low5+(((.10-lcum5)/(lcum6-lcum5))*(low6-low5W) =GOTO(R[41C) =ELSE.IF(CAFLOOD \FLOO D.XLS'Ilcum7>0.1 ) =FORMULAC=low6+0(.10-lcum6)/(lcum7-lcum6))*(low7-low6))") =GOTO(R[1]C) =END.IF() =RETURN() 131 CALC_HIGH MOVE_NUM1 =ECHO(FALSE) =ECHO(FALSE) =SELECT(" R73C42') =SELECT("R71C42') =I F(CAFLOOD \FLOOD.XLSThcum1>0.1) =COPY() =FORMULA("=high1") =SELECT(iowbor) =GOTO(R[19]C) =SELECT.EN D(3) =ELSE.IF(C: \FLOOD \ FLOC D.XLS1hcum2>0.1) =SELECT(13[1]C") =GOTO(R[16JC) =CANCEL.COPY() =ELSE.IF(C:\FLOOD \FLOOD.XLSThcum3>0.1) =SELECT(' R72C42") =GOTO(R[131C) =SELECT("probot") =ELSE. I F(CAFLOOD FLOOD.XLS'fficum4>0.1) =S ELECT.EN D(3) =PASTE.SPECIAL(3,1,FALSE,FALSE)=GOTO(R[10]C) =ELSE., F(CAFLOO D FLOOD.XLS'fficum5>0.1) =CANCEL.COPY() =GOTO(R[7]C) =COPY() =ELS E.I F(CAFLOOD \ FLOOD. XLS'fficum6>0.1) =SELECT("highbot") =GOTO(R[41C) =SELECTCR[1]C") =PASTE.SPECIAL(3,1,FALSE,FALSE)...ELSE. I F(CAFLOOD \FLOOD.XLS'fficum7>0.1) =GOTO(R[1 1C) =RETURN() =FORMULAC=high2+(((hcum2-0.10)/(hcum2-hcum1))*(high1-high2))") =PASTE.SPECIAL(3,1,FALSE,FALSE) =FORM U LAr=high3+(((hcum3-0.10)/(hcum3-hcum2))*(high2-high3)r) =COPY() =FORMULAC=high4+(((hcum4-0.10)/(hcum441cum3))*(high3-high4))") =SELECT(1=1[1]C") =FORMULAC=high5+(((hcum5-0.10)/(hcum5-hcum4))*(high4-high5))") =SELECT(-R73C42") =FORMULAr=high6+(((hcum6-0.10)/(houm6-hcum5))*(high5-high6))") =S ELECT. EN D(3) =FORMULAC=high7+(((hcum7-0.10)/(hcum7-hcum6))*(high6-high7)y) =CANCEL.COPY() MASTER_CALC =RUN("CALCS.XLM!COPY_NUM") =RUN(CALCS.XLM!SORT_NUM") =RUN(CALCS.XLM!CALC_LOW*) =RUN("CALCS.XLM!CALC_HIGH') =IFCCAFLOOD\FLOOD.XLS1factor_type=0,RUN(CALCS.XLM!MOVE_NUM1'),RUN(CALCS.XLM!MOVE_NUM2")) =RETURN() 133 ACTUAL_DATA MOVE_LIKELIHOOD1 =ECHO(FALSE) =ECHO(FALSE) =SELECT( R 16C11:R40C11°)=ACTI VATE( DIALOG1 .XLM") =SELECT(CAFLOOD \DIALOG1.XLM1R6C15:R15C15) =SELECT(R16C26') =COPY() =SELECT.END(1) =ACTIVATE("FLOOD.XLS") =SELECT('FIC[1]") =PASTE.SPECIAL(3,1,FALSE,FALSE)=SELECT(CAFLOOD \FLOOD.XLS1R3C22) =PASTE() =CANCEL.COPY() =CANCEL.COPY() =RETURN() =IFCCAFLOOD \DIALOG 1. XLM1Actual_Data1>O,GOTO(R[11C),GOTO(R[16]C)) =COPY() =FOR.CELL("ad1','CAFL000 \FLOOD.XLS1R3C22:R12C22,TRUE) =SELECT(ad1) =COPY() =SELECT("afte) =PASTE() =CANCEL.COPY() =RUN(CALCS.XLMIMOVE_LIKELIHOOD1') =NEXT() =FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type) =RETURN() 134 FLOOD_EXCEEDED =ECHO(FALSE) =ACTIVATE(DIALOG1.XLM") =I FCCAFLOOD \DIALOG1.XLM!Flood_Exceeded1>O,GOTO(R[1]C),GOTO(R[41]C)) =SELECT(CAFLOOD \DIALOG1.XLM1R39015:R44C15) =COPY() =ACTIVATECFLOOD.XLS') =SELECT(CAFLOOD \FLOOD.XLS1R3C23) =PASTE() =CANCELCOPY() =SELECT(lex1') =COPY() =SELECT(afv') =PASTE() =CANCEL.COPY() =SELECT(lem1') =COPY() =SELECT(rn') =PASTE() =CANCEL.COPY() =SELECT('fen1') =COPY() =SELECT(n) =PASTE() =CANCEL.COPY() =RUN(CALCS.XLM!MOVE_LIKELIHOOD2') =FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type) =IFCC: \FLOOD \FLOOD.XLSIfex2>O,GOTO(R[11C),GOTO(R[17]C)) =SELECT('fex2') =COPY() =SELECT(afv") =PASTE() =CANCELCOPY0 =SELECT("fem2') =COPY() =SELECT("ni') =PASTE() =CANCEL.COPY() =SELECT(len2") =COPY() =SELECT(' n") =PASTE() =CANCEL.COPY() =RUN(CALCS.XLM!MOVE_LIKELIHOOD2') =RETURN() 135 MOVE_LIKELIH0002 Mth_LARGEST =ECHO(FALSE) =ECHO(FALSE) =SELECT(1316C12:R40C12') =ACTI VATE(" DIALOG 1.XLM') =COPY() =IFCC:\FLOO D \ DIALOG 1.XLM1Mth_Largest>0,GOTO(R[1]C),GOTO(R[41]C)) =S ELECT( R16C26') =SE LECT(C: \FLOOD \ DIALOG 1.XLM1 R28C15:R33C15) =SELECT.EN D(1) =COPY() =SELECT('RC[1]') =ACTIVATE('FLOOD.XLS') =PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT(CAFLOOD \FLOOD.XLS1R3C24) =CANCEL.COPY() =PASTE() =RETURN() =CANCEL.COPY() =SELECT(MII') =COPY() =SELECT("afv“) =PASTE() =CANCEL.COPY() =SELECT("m1m1') =COPY() =SELECT(nth') =PASTE() =CANCEL.COPY() =SELECT(' mln1") =COPY() =SELECT("W) =PASTE() =CANCEL.COPY() =R U N(CALCS.XLMIMOVE_LIKE LI HOO D3') =FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type) =IF(C: \FLOOD \FLOOD.XLS1m12>O,GOTO(R[1]C),GOTO(R[17]C)) =SELECT(m12') =COPY() =SELECT(afv“) =PASTE() =CANCEL.COPY() =S ELECT(' m1m2') =COPY() =SELECT("mth') =PASTE() =CAN CE L.COPY() =SE LECT('mln2') =COPY() =SELECT(n) =PASTE() =CANCEL.COPY() =R U N(' CALCS.XLMIMOVE_LI KELI HOO D3') =RETURN() 136 MOVE_LIKELIHOOD3 MASTER_FLOOD =RUN(CALCS.XLMACTUAL_DATA')=ECHO(FALSE) =RUN(CALCS.XLM!Mth_LARG EST')=COPY() =SELECT(' R16C26") =RETURN() =SELECT(*R16C13:R40013') =RUN(CALCS.XLMIFLOOD_EXCEEDED") =SELECT.END(1) =SELECT('RC[1]*) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =RETURN() 137 COUNT_DATA MASTER_FACTOR =ECHO(FALSE) =ECHO(FALSE) =ACTIVATE('FLOOD.XLS") =RUN(CALCS2.XLM!MOVE_NUM_FACT) =ELSE.IF(C: \FLOOMIALOG1.XLM1Multiplication_Factor=TRUE)=CLOSE(FALSE) =RETURN() =END.IF() =IF((CO U NTCCAFLOODO IALOG 1 . XLM1R6C1 5:R1 5( =IFCC: \FLOOD \DIALOG1.XLMAddition_Factor=TRUE) =RUN(CALCS.XLM!ADD_FACTOR")=ALERT("Nice try! You are only allowed to enter a ma =ACTIVATE(D IALOG 1 .XLM") =ACTIVATE("DIALOG1.XLM") =OPEN("C: \FLOOD \FLOOD.XLS") =RUNCCALCS.XLM!MULT_FACTOR1 =RUN(CALCS.XLMMASTER_MASTER2') =RUN(CALCS2.XLM!MOVE_NUM_FACT) =RETURN() 138 ADD_FACTOR =ECHO(FALSE) =SELECT(CAFLOOD \ DIALOG 1.XLM11355C15:R58C15) =COPY() =ACTIVATE(' FLOO D. XLS') =SELECTCCAFLOOD \FLOOD.XLS'Ilfact) =PASTE() =CAN CEL.COPY() =FORMULA(1;C: \FLOOD \FLOOD.XLS1R3C49) =SELECT(CAFLOOD FLO° D.XLS'Imean_factor) =COPY() =SELECT(CAFLOOD\FLOOD.XLSIaddiact) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CAN CEL.COPY() =SELECT(CAFLOOD\FLOOD.XLS'Imult_fact) =FORMULA(Wa") =IFCCAFLOOD\FLOOD.XLSlinput type=0,GOTO(R[l jC),GOTO(R[l 9]C)) =SELECT(CAFLOOD \FLOOD.XLSIR16C4:R40C4) =COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t) =PASTE() =CANCEL.COPY() =SELECT('CAFLOOD \FLOOD.XLS'Irev_mean_t) =FORMULAC=RCF4314-mean_factor) =COPY() =SE LECT(CAFLOO D FLO° D. XLS1 rev_mean_VC: \FLOOD \FLOOD.XLS'Irev_mean_b) =PASTE() =CANCEL.COPY() =SELECT(CAFLOOD \FLOOD.XLS'Irev_stdev_t) =FORMULAC=SQRTWRC[43]^2)*(stdev_factor^2))+(2*ccfRCF43]*stdev_factor))") =COPY() =SELECT(CAFLOOD FLO° D.XLS'Irev_stdev_VC: \ FLOOD \ FLOOD.XLS'Irev stdev_b) =PASTE() =CANCEL.COPY() =GOTO(RF291C[1]) =SELECT('CAFLOOD \FLOOD.XLS'Inorm_refit_prob_tC: \FLOOD \ FLO° D.XLS'Inorm_refit_prob_b) =COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t) =PASTE.S PECIAL(3,1,FALS E, FALSE) =CANCEL.COPY() =SELECT('CAFLOOD \FLOOD.XLSIrev_meani) =FORMULA("=RC[-7]+mean_factor") =COPY() =SELECT(CAFLOOD FLO° D.XLS1 rev_meant:'CAFLOO D \FLOOD.XLS'Irev_mean_b) =PASTE() =CANCEL.COPY() .SELECT(CAFLOOD \ FLO° D. XLS'Irev_stdev_t) =FORM U LAC=SQ RTWRC[7]^2)*(stdev_factor^2))+(2*ccr RC[7]'stdev_factor))') 139 ADD_FACTOR (continued) =COPY() =SELECTCCAFLOOD\FL000.XLS1rev_stdev_VC:\FLOOD\FLOOD.XLS1n =PASTE() =CANCEL.COPY() =RETURN() 140 MULT_FACTOR =ECHO(FALSE) =SELECT(CAFLOODZIALOG1.XLM'IR55C15:R58C15) =COPY() =ACTIVATE('FLOOD.XLS") =SELECT(CAFLOOD\FLOOD.XLS'Ilfact) =PASTE() =CANCEL.COPY() =FORMULA(2;C:\FLOOD\FLOOD.XLS1R3C49) =SELECT('CAFLOOD\FLOOD.XLSImean_factor) =COPY() =SELECTCCAFLOOD\FLOOD.XLS'Imultfact) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECT(CAFLOOD\FLOOD.XLSladd _fact) =FORMULA(rila') =IFCCAFLOOD\FLOOD.XLSlinput type),GOTO(R[1]C),GOTO(R[19C)) =SELECT('CAFLOOD\FL000.XLS1R1604:R4004) =COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t) =PASTE() =CANCEL.COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_meanf) =FORMULAC=RCF43rmean_factor+ccrRCF42rstdev _factor) =COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_mean_UCAFLOOD\FLOOD.XLS'Irev_mean_b) =PASTE() =CANCEL.COPY() =SELECT(CAFLOOD\FLOOD.XLS'Irev_stdev_t) =FORMULAr=SQRT(((RCF441^2)*(stdevfactor^2))+((mean_factorA2)*(RC[43]^2))+(2*ccrRC =COPY() =SELECTCCAFLOOD\FLOOD.XLSirev_stdev_VC:\FLOOD\FLOOD.XLS'Irev_stdev_b) =PASTE() =CANCEL.COPY() =GOTO(R[18]C) =SELECTCCAFLOOD\FLOOD.XLS'Inorm_refit_prob_VCAFLOOD\FLOOD.XLS'Inorm_refit_pro =COPY() =SELECT(CAFLOOD\FLOOD.XLSIrev_prob_t) =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCEL.COPY() =SELECTCCAFLOOD\FLOOD.XLS'Irev_mean j) =FORMULA(=RC[7]* mean _factor") =COPY() =SELECT('C:\FLOOD\FLOOD.XLS 'Irev_meanf'CAFLOOD\FLOOD.XLS'Irev_mean_b) =PASTE() =CANCEL.COPY() =SELECT(C:\FLOOD\FLOOD.XLS 'Irev_stdevf) =FORMULAC=SQRTWRCHIA2)*(stdev_factor^2))+((mean_factor^2)*(RCF71^2))+(2*ccr RCH 141 MULT_FACTOR (continued) =COPY() =SELECT(C:\FLOOD\FLOOD.XLS1rev_stdev_VCAFLOODTLOOD.XLS1rev_stdev_b ) =PASTE() =CANCEL.COPY() =RETURN() 142 MOVE_NUM_FACT MOVE_NUM2 =ECHO(FALSE) =ECHO(FALSE) =ACTIVATE(' FLOOD.XLS') =SELECT("R71C42") =SELECT(iowbot_fact")=COPY() =SELECT(CAFLOOD \FLOOD.XLS1R71C1) =SELECT.END(3) =PASTE() =SELECTCR[1]C') =PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY() =SELECT(rp) =SELECT(R72C42") =COPY() =COPY() =SELECT("probotiact")=sELECT('FLOOD.XLS!return_period-) =PASTE() =SE LECT. END(3) =CAN CEL.COPY() =SE LECT("R[11C`) =NEXT() =CANCEL.COPY() =RETURN() =SELECT(R73C42") =SE LECT(CAFLOOMFLOOD.XLS1 R49C1:R63C1) =COPY() =FOR.CELL('rp','CAFLOOD \FLOOD.XLS1R71C1:R85C1,TRUE) =CANCEL.COPY() =RUN(CALCS.XLMIMASTER_CALC") =PASTE.SPECIAL(3,1,FALSE,FALSE) =COPY() =SELECT(highbot fact") =SELECT.END(3) =SELECT(11[1]C") =PASTE.SPECIAL(3,1,FALSE,FALSE) =CANCELCOPY0 =RETURN() 143 SUMMARY =ECHO(FALSE) =IF(CAFLOOD \FLOOD.XLS1factor_type=0) =ACTIVATE(FLOOD.XLS') =SELECT('R42C1:R63C4') =COPY() =OPEN("c:\flood■summary.xle) =SELECT(R1C1') =PASTE() =CANCELCOPY() =ACTIVATE.NEXTO =ACTIVATE('SUMMARY.XLS') =SAVE.AS(INPUT("The results of your analysis have been plac =SELECT(131 C1') =ELSE() =SELECT(*FLOOD.XLSItop_sum:highbot_facr) =COPY() =OPEN(c:\flood\summary.xls ") =SELECT(131C1') =PASTE() =CANCELCOPY() =ACTIVATE.NEXT() =ACTIVATE(SUMMARY.XLS') =SAVE.ASONPUT(*The results of your analysis have been plac =SELECT("R1C1') =END.IF() =ACTIVATE("FLOOD.XLS") =CLOSE(FALSE) =RETURN() 144

Cite

Citation Scheme:

    

Usage Statistics

Country Views Downloads
United States 3 0
China 1 2
City Views Downloads
Beijing 1 0
Sunnyvale 1 0
Redmond 1 0
Ashburn 1 0

{[{ mDataHeader[type] }]} {[{ month[type] }]} {[{ tData[type] }]}

Share

Share to:

Comment

Related Items