UBC Theses and Dissertations


A GIS application for regional streamflow analysis Cathcart, Jaime G. 1993-09-17


Full Text

A GIS APPLICATION FOR REGIONAL STREAMFLOW ANALYSIS

by

JAIME GRANT CATHCART

B.A.Sc., The University of British Columbia, 1987

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF APPLIED SCIENCE in THE FACULTY OF GRADUATE STUDIES (The Department of Civil Engineering)

We accept this thesis as conforming to the required standard

THE UNIVERSITY OF BRITISH COLUMBIA
June 1993
© Jaime Grant Cathcart, 1993

In presenting this thesis in partial fulfilment of the requirements for an advanced degree at the University of British Columbia, I agree that the Library shall make it freely available for reference and study. I further agree that permission for extensive copying of this thesis for scholarly purposes may be granted by the head of my department or by his or her representatives. It is understood that copying or publication of this thesis for financial gain shall not be allowed without my written permission.

Department of Civil Engineering
The University of British Columbia
Vancouver, Canada

ABSTRACT

Regional analysis is a statistical procedure used by hydrologists to estimate streamflow parameters for ungauged streams. The procedure involves analyzing historical flow records for streams in a particular region, in order to estimate flows for other streams in the region. Traditionally, the process of obtaining and summarizing relevant streamflow data has been rather inefficient and time-consuming. Regional streamflow analysis is performed frequently in British Columbia, and there is a need for a system designed to facilitate this process. This thesis involves the development of such a system.

The system is comprised of a computer program (DATSUM) which statistically analyzes and then summarizes streamflow data for British Columbia, in a format which can be read and displayed by a geographic information system (GIS).
Water Survey of Canada (WSC) streamflow records are the raw database, and the GIS used for this project is a locally developed product, known as QUIKMap.

The system operates on DOS-based personal computers, and information is presented to the user as a series of symbols on a digitized map of the major rivers and streams of B.C. Each symbol represents a WSC streamflow station, and detailed information about each station can be simply obtained by selecting the stations of choice with an electronic computer mouse. This information can then be used for regional analyses or other streamflow-dependent applications.

The thesis also involves adapting a flow estimating program, called FLOOD, to make it compatible with the DATSUM program. The FLOOD program estimates extreme high or low flows for streams with little or no streamflow data, and is perfectly suited to be used in conjunction with the DATSUM/GIS system.

Finally, use of the DATSUM/GIS system is illustrated with a regional analysis of the Kitseguecla River. The results are then used as input for the FLOOD program, and the river's 200-year flood is estimated.

TABLE OF CONTENTS

ABSTRACT
LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGMENT

Chapter
1. INTRODUCTION
2. REGIONAL ANALYSIS
   2.1 Regional Analysis
   2.2 Regional Flood Frequency Analysis
   2.3 Limitations of Regional Flood Frequency Analysis
   2.4 DATSUM/QUIKMap/FLOOD System
3. GEOGRAPHIC INFORMATION SYSTEMS (GIS)
   3.1 Why a Geographic Information System Was Chosen
   3.2 What is a Geographic Information System?
   3.3 Advantages of GIS Over Traditional Spatial Data Management Systems
   3.4 Disadvantages of GIS's
   3.5 GIS and Hydrology
       3.5.1 Using GIS to Generate Hydrologic Model Inputs
       3.5.2 GIS and Expert Systems
       3.5.3 GIS/Hydrologic Model Interfaces
       3.5.4 Database/GIS Interfaces
   3.6 Selection of a GIS Package
4. DEVELOPMENT OF THE DATSUM PROGRAM
   4.1 Initial Considerations
   4.2 WSC Data Files
   4.3 DATSUM Summary Files
   4.4 DATSUM Program Organization
   4.5 Instructions for Use of DATSUM
       4.5.1 Installing the Program
       4.5.2 Creating and Loading WSC Data Files
       4.5.3 Running the Program
       4.5.4 Exiting the Program
5. OPERATION OF THE DATSUM/QUIKMAP SYSTEM
   5.1 System Organization
   5.2 Operating the QUIKMap Application
6. THE FLOOD PROGRAM
   6.1 Description of the FLOOD Program
   6.2 Advantage Over Frequency Analysis
7. SAMPLE APPLICATION
   7.1 Kitseguecla River
   7.2 Data Summary Procedure
   7.3 Regional Analysis
   7.4 Additional Data
8. DISCUSSION AND CONCLUSIONS

BIBLIOGRAPHY

Appendix
A. DATSUM PROGRAM DETAILS
   A.1 DATSUM Summary File
       A.1.1 Fields of the DATSUM Summary Files
       A.1.2 Example Summary File - Region 08JA
   A.2 Macros: Names and Functions
   A.3 Programming Code
   A.4 Installing Customized QUIKMap Files
B.
FLOOD PROGRAM DETAILS
   B.1 FLOOD Program Organization
   B.2 Instructions for Use
       B.2.1 Installing the Program
       B.2.2 Running the Program
       B.2.3 Exiting the Program
   B.3 Macros: Names and Functions
   B.4 Programming Code

LIST OF TABLES

Table
7.1 Regional Flood Data for Kitseguecla River Near Skeena Crossing
7.2 Estimates of Peak Flood Parameters for Kitseguecla River
7.3 Extreme Flow Records for Station 08EF004
7.4 Estimates of 200 Year Mean Daily and Instantaneous Floods on Kitseguecla River

LIST OF FIGURES

Figure
4.1 Sample DATSUM Screen Message
4.2 Sample DATSUM Input Screen
4.3 Sequence of DATSUM Screens Requesting Extreme Flow Information
5.1 QUIKMap Data File Directory Menu
5.2 QUIKMap Data File Directory Menu
5.3 Basemap of B.C. with WSC Streamflow Regions
5.4 Sequence of QUIKMap Data File Directory Menus
5.5 WSC Streamflow Stations - Region 08JA
5.6 Sample QUIKMap Map Legend
5.7 Sample QUIKMap Data Display Screen
7.1 Use of QUIKMap's "Radius" Function
A.2 Flow Chart Illustrating DATSUM Macro Relationships
B.2.1 Initial Input Dialog Box
B.2.2 Additional Flow Data Dialog Box
B.3 Flow Chart Illustrating FLOOD Macro Relationships

ACKNOWLEDGMENT

The author wishes to express special thanks to his supervisor, Dr. S.O. (Denis) Russell, for his invaluable guidance, assistance and encouragement throughout the development of this thesis. Thanks are also extended to the engineers and management of Crippen Consultants, particularly Wynn Morgan and Mario Merlo, for their generous technical and financial assistance. Finally, the author is indebted to the Science Council of British Columbia for endorsing the project and providing substantial financial assistance.

CHAPTER 1
INTRODUCTION

Water is a special substance. It is the medium from which life began and it is vital for the continuing existence of all species. It enhances human lives with its many forms of natural beauty and its amazing resource potential. Canadians are very fortunate, as they have an abundant supply of water. However, this water supply is not unlimited, and Canadians must take care to manage their water resources in a responsible and knowledgeable fashion, so that future generations may also enjoy the benefits of Canada's water. A key element in the management of water resources is the ability to understand the water cycle and its related processes, such as rainfall, runoff and stream flows.

For centuries man has been attempting to manage water resources, be it to control floods, irrigate fields, generate electricity, or preserve natural beauty. The water cycle, however, is unpredictable, and the greatest challenge has been to accurately estimate or measure the water supply. A whole science, known as hydrology, has developed around this pursuit. Over time, many water supply estimating techniques have evolved, but the science of hydrology is still largely empirical. With the advent of computers, estimating techniques have become more complex and sophisticated, as systems are able to manage increasing amounts of data. Despite this, there is still great difficulty in adequately modeling a system with so many variables.

One approach to estimating stream flows which has been popular for years, but which is still evolving and improving with the growth of micro-computers, is regional analysis. Regional analysis is a statistical procedure based on the idea that, as stream flow patterns and volumes are determined by precipitation and runoff conditions, all streams and rivers in the same geographical area will tend to have similar flows. The procedure involves analyzing historical records for streams in a particular region in order to estimate flows for ungauged streams in the same region.

Over the past twenty years a new area of computer technology, known as geographic information systems (GIS), has been slowly evolving.
In the past five years, with the growth of the personal computer, development of this technology has accelerated rapidly and it is now being embraced by hydrologists and engineers. A GIS is a spatial data handling system. It is capable of analyzing and displaying spatially related data, which makes it particularly suited to hydrologic modeling; many hydrologic parameters are dependent on area, and thus their areal extent must be assessed before their values can be determined.

Considerable work has been done in the development of GIS applications for the generation of input values for hydrologic models. In addition, systems which interface between GIS's and hydrologic models have been developed. In contrast, very little work appears to have been done to create data/GIS interfaces. All types of electronic databases are currently available, but they do not have much practical use if they cannot be readily utilized by other computer systems. This thesis involves, amongst other things, the creation of such an interface.

The objective of this thesis is to statistically analyze and summarize streamflow data for British Columbia and to present it in a way that will assist hydrologists in performing regional analyses. In British Columbia, streamflow records for hundreds of streams have been kept for varying periods of time over the last 100 years. This data is compiled and updated on an annual basis by Water Survey of Canada (WSC), and is now available in CD-ROM format.

This thesis involves the development of an Excel computer program (DATSUM) which processes raw data files from the CD information disc to create summary data files for Water Survey of Canada (WSC) streamflow regions in B.C., in a format which can be read and displayed by a geographic information system (GIS).
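The kind of per-station reduction such a summary program performs can be sketched as follows. This is an illustrative outline only; the function and field names are assumptions made for the example, not taken from DATSUM itself:

```python
import statistics

def summarize_station(annual_flows_m3s, basin_area_km2):
    """Reduce a station's annual mean flows to the unit-area statistics
    a regional analysis needs. Illustrative sketch; not DATSUM's actual
    output format."""
    mean_flow = statistics.mean(annual_flows_m3s)
    cv = statistics.stdev(annual_flows_m3s) / mean_flow  # coefficient of variation
    return {
        "mean_m3s": mean_flow,
        "unit_mean_m3s_per_km2": mean_flow / basin_area_km2,
        "cv": cv,
        "years_of_record": len(annual_flows_m3s),
    }

# e.g. a hypothetical station with five years of record on a 250 km^2 basin
print(summarize_station([12.0, 15.5, 9.8, 13.2, 11.1], 250.0))
```

Expressing the mean on a unit-area basis is what lets stations draining basins of very different sizes be compared side by side on a map.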
The GIS is able to display a detailed digitized map of the major streams and rivers of British Columbia, and then, upon request of the user, show the data summary files on the map, so that regional relationships between the various streams can be readily established by a hydrologist. It should be noted that many of the attributes usually associated with a GIS are not utilized by this project. The GIS, in this case, serves only to display data and does not perform any complex analysis.

In addition to the DATSUM program, this thesis involves the rewriting, in Excel language, of a flow estimating program (FLOOD) developed by Dr. Denis Russell, a civil engineering professor at The University of British Columbia. This program, originally written in Pascal, estimates design flows (peak flows, low flows, etc.) on the basis of a stream's mean annual flow and its coefficient of variation, two parameters displayed by the GIS system. It was considered necessary to rewrite the program so that the data summary program and the flow program could be operated with the same software.

This text is set out in the following manner. Chapter 2 describes the need for design flow estimating techniques, with emphasis on regional flood frequency analysis. Chapter 3 describes geographic information systems in detail and outlines the major areas of research and development of hydrological applications. Chapter 4 deals with the development of the data summary program DATSUM. Chapter 5 explains how the GIS program operates with the files created by the DATSUM program. Chapter 6 describes the design flow estimating program FLOOD, and explains how it can be used. Chapter 7 demonstrates how the whole system operates, with an example application for the Kitseguecla River, in central British Columbia. Finally, a summary and discussion of the implications of the system are given in Chapter 8.

CHAPTER 2
REGIONAL ANALYSIS

British Columbia contains thousands of streams and rivers.
Most of the major waterways have a gauging station where streamflows are recorded. However, many small streams and rivers have no gauging stations or historical streamflow records. As the interest in small stream development and management in the province continues to increase, due in part to the province's energy focus changing from large scale hydro development to small scale hydro, there is an increasing need for a simple, yet effective, method for estimating flows for ungauged streams.

2.1 Regional Analysis

The most common and effective method for estimating streamflows is regional analysis. Regional analysis is a statistical procedure based on the idea that streams and rivers in the same geographical area will tend to have similar flows, since stream flow patterns and volumes are determined by precipitation and runoff conditions. The procedure involves analyzing historical records for streams in a particular region, to provide a basis for estimating flows for ungauged streams in the same region. The estimates are made by adjusting the recorded flows according to relationships between various catchment characteristics or parameters. Generally, such analysis requires significant time and expertise, and it is the intent of this thesis to develop a system to simplify and expedite this process.

2.2 Regional Flood Frequency Analysis

Regional analysis is most often performed as part of a flood estimating procedure. In fact, this is so often the case that the majority of hydrology texts only discuss regional analysis in terms of regional flood frequency analysis, a flood estimating technique which combines regional analysis with frequency analysis. Frequency analysis is a statistical technique for estimating the size of a peak flow event which will be equaled or exceeded within a specified time period.
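For a single gauged station, one common way to carry out such a frequency analysis is to fit an extreme-value distribution to the annual peak series. The sketch below uses a Gumbel (EV1) distribution fitted by the method of moments; this is a standard textbook technique chosen purely for illustration, not necessarily the one applied in this thesis:

```python
import math

EULER_GAMMA = 0.5772  # Euler-Mascheroni constant, 4 decimal places

def gumbel_quantile(mean, cv, return_period_yr):
    """T-year event from a Gumbel (EV1) distribution fitted by the
    method of moments to a mean and coefficient of variation."""
    std = cv * mean
    alpha = math.sqrt(6.0) * std / math.pi        # scale parameter
    u = mean - EULER_GAMMA * alpha                # location (mode) parameter
    prob_non_exceed = 1.0 - 1.0 / return_period_yr
    reduced_variate = -math.log(-math.log(prob_non_exceed))
    return u + alpha * reduced_variate

# e.g. mean annual flood of 100 m^3/s with cv = 0.35: the 200-year flood
print(round(gumbel_quantile(100.0, 0.35, 200), 1))   # → 228.8
```

The return period enters only through the reduced variate, which is why long-record stations dominate the reliability of such estimates.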
There are a number of different flood frequency analysis techniques available, but their differences are minimal, and there are certain elements which are common to most. These elements are described in the text "Hydrology of Floods in Canada" (Watt et al. 1989) as follows:

1. The regional boundaries are defined and gauging stations are screened for acceptable records.
2. A single-station frequency analysis is carried out for acceptable stations within the region.
3. Relations are postulated between certain flood statistics and the physiographic and climatic characteristics of the drainage basins.
4. Values of required physiographic and climatic characteristics are determined for each drainage basin.
5. The relations postulated in (3) are cast in the form of prediction equations, the coefficients of which are determined using multiple regression (a method used to relate basin characteristics to flow parameters) or other statistical techniques.

Typically, a regional flood frequency analysis is performed for various "homogeneous" regions of a large geographic area, such as a province or state, and then the derived equations are published and made available to interested parties. These parties then use the information to make quick estimates of design flows. One example of such a project is a regional flood frequency analysis for the state of Alaska (Lamke 1979). This was an extensive study, undertaken by the United States Department of the Interior Geological Survey, involving the analysis of streamflow data for 260 gauged basins. The state was divided into two climatic regions and regression equations were generated for floods of different recurrence intervals. The average standard errors of the regression equations for the two areas were 48 and 74 percent, which is fairly representative of the amount of error which can be expected when applying this type of analysis to such a huge area.

2.3 Limitations of Regional Flood Frequency Analysis

Regional flood frequency analyses have been performed for various parts of Canada (Watt et al. 1989) and the technique is reasonably reliable and effective in areas with similar physiographic, geomorphologic, and climatic characteristics. However, there are a number of difficulties, drawbacks and limitations involved with using this technique.

To begin with, the technique only works when there is sufficient data. All gauging stations used in the analysis must have similar periods of record, and only those stations with 10 or more years of record should be used (Watt et al. 1988). In British Columbia this is a serious problem, as many areas of the province have minimal streamflow records.

Another problem is that the technique is only reliable and effective in areas with similar physiographic, geomorphologic, and climatic characteristics. In addition, it is often very difficult to define the boundaries of a "homogeneous" region and to determine the significance of different characteristics within a region. In British Columbia, the terrain and climate are so variable that it is virtually impossible to produce reliable flood frequency models for large areas of the province. Regional flood frequency analysis can be performed on small regions of the province as needed, but many small projects cannot afford the time or expense to undertake an extensive study, and must rely on previously published studies. In many cases these studies, as with the Alaska study discussed previously, involve large areas and consequently have large errors.

In addition, the system is unable to directly incorporate site specific information into the analysis. Frequency analysis requires information on mean annual floods, but additional information, such as the largest flood, or the flood which has not been exceeded in a number of years, is not given consideration.

Finally, a major weakness of this technique is that once the prediction equations and regression coefficients have been produced, the user is unable to incorporate engineering judgement into the estimating procedure. Generally, the user of the equations is not familiar with all the decisions and judgement involved in producing the regression equations, and thus the equations may be used improperly. Basically, it becomes a "black box" procedure with all the associated difficulties.

2.4 DATSUM/QUIKMap/FLOOD System

The GIS application developed in this thesis (DATSUM/QUIKMap system), when used in conjunction with the FLOOD program, provides an alternate means of estimating flood flows which is simple and efficient. It offers many benefits over traditional techniques by requiring the user to provide considerable judgement and input to the analysis process. Each analysis is performed on a region by region basis, rather than trying to "homogenize" large areas, with the size of region and significance of different records determined by the individual user. This involvement forces the user to be aware of the system's limitations and the relative validity of results.

In addition, the system is quick and user-friendly and does not require the complex mathematics associated with regression analysis. As well, a means is provided for incorporating various types of flow data into the analysis which could serve to significantly improve estimates. Finally, results are presented with confidence limits (low, probable and high values) so as to provide the user with a gauge or "feel" for the quality of the results.

The most common basin characteristic used in regional analysis is basin area, but other factors such as basin slope, mean elevation, mean annual precipitation, orientation relative to mountains, and number and size of lakes are also significant. As this project is aimed at producing a system suitable for a prefeasibility level of study, the data generated and produced is presented in a unit area format and is designed to facilitate a regional analysis based primarily on catchment areas. If the user wishes to make a more sophisticated regional analysis incorporating additional information, then that information must be obtained from a different source. The type of data generated and produced by this system is discussed in Chapter 4 and listed in Appendix A.1.

CHAPTER 3
GEOGRAPHIC INFORMATION SYSTEMS (GIS)

3.1 Why a Geographic Information System Was Chosen

Dr. Denis Russell, a civil engineering professor at the University of British Columbia, has written a computer program, named FLOOD, for estimating design flows on the basis of estimates of a stream's mean annual flow and its coefficient of variation (cv) (Russell 1991). The program works very well for streams that have available streamflow records, as it is easy to calculate mean flows and cv's. However, the area where this program has the most useful application is with streams that have either no or very sparse records. This is the case for the majority of small streams in the province. In the past, if one wished to use this program to estimate flows for an ungauged stream, one would have had to undertake the following lengthy process:

1. Obtain a map of the province, locate the target stream, and identify any WSC streamflow stations in the area.
2. Extract the streamflow data for every WSC station in the area.
   (In the past, this data was only available from WSC in book form, but recently, a CD-ROM data source has been published.)
3. Analyze the data to generate unit area means and cv's for every station.
4. Perform a simple regional analysis to estimate the mean and cv for the stream in question.
5. Enter the estimated data into the FLOOD program.

Since steps 1) through 3) can be quite time consuming, it was considered desirable to develop a system which would eliminate them and present all the required data in a simple user-friendly package. The intent of this thesis is to develop such a package. It should be noted that this system does not attempt to estimate flow values, but rather is designed to aid engineers and hydrologists in making estimates. A program which attempts to make estimates of streamflow values could be dangerous. People are inclined to believe computer outputs with blind faith, and it is felt that more realistic and reliable values can be obtained by combining regional information with engineering judgment. In other words, the system will provide the user with information about the various streamflow stations in a particular region, and the user will have to "eyeball" the figures and estimate the best composite value for the stream in question. The FLOOD program prompts the user for low, probable and high estimates of a value, which lends itself very well to this "eyeballing" procedure.

At first it was thought that the best way to create a data analysis and display system would be to develop a program in C++ which would generate a map of the province and display the mean and cv data for each streamflow station at the correct location on the map. However, this proved to be a daunting task, and upon further investigation, it was discovered that geographic information systems (GIS) could provide the required facility.
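The low/probable/high prompts mentioned above lend themselves to simple three-point arithmetic. As a purely illustrative sketch, the three eyeballed values can be collapsed into a mean and standard deviation using the familiar PERT approximations; these are an assumption for the example, not the actual formulas inside FLOOD:

```python
def three_point_to_moments(low, probable, high):
    """Collapse low/probable/high 'eyeball' estimates into a mean and
    standard deviation via the common PERT (beta) approximations.
    Illustrative only; not FLOOD's internal method."""
    mean = (low + 4.0 * probable + high) / 6.0   # mode weighted 4x
    std = (high - low) / 6.0                     # range spans ~6 sigma
    return mean, std

# e.g. unit-area mean annual flow estimated at 0.03 / 0.05 / 0.09 m^3/s per km^2
m, s = three_point_to_moments(0.03, 0.05, 0.09)
print(m, s)
```

The attraction of a three-point scheme is exactly the one argued above: the user's judgment, not the program, supplies the spread of plausible values.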
Why reinvent the wheel when many systems designed specifically to handle spatial data were commercially available?

3.2 What is a Geographic Information System?

The latest computer tool to be applied to the problem of estimating streamflows is the geographic information system (GIS). A GIS is a spatial data handling system. Basically, it can be thought of as an image (map) display system connected to a database system. However, it is more sophisticated than that, as a GIS must not only be able to store and display data, but it must also be able to manipulate and analyze data, and then display the results of these analyses (Marble 1984). A GIS is able to determine, in a quantitative fashion, the complex relationships which exist between map elements.

3.3 Advantages of GIS Over Traditional Spatial Data Management Systems

GIS's were developed out of a need to be able to overlay spatial data sets, so that relationships between the different layers of data could be determined. Traditionally, if one wished to integrate two or more spatial data sets, one would have to transform the data sets to a common map scale, prepare a transparent overlay for each set, arrange the different overlays so that coordinate systems are aligned, then manually prepare a composite overlay which shows areas where the different data types overlap to form various data groups. This procedure can be extremely complex and time consuming, and thus its use has been limited. The development of GIS technology has made this type of analysis very efficient and enables the user to selectively analyze only the data which is pertinent to a particular situation. As well, the availability of a system which is able to quickly and efficiently process large amounts of data encourages the consideration of different options or alternate approaches to an analysis, which otherwise may have been missed due to time, cost and manpower constraints.

In addition to the ability to analyze spatial data, GIS's provide other advantages over traditional spatial data management systems. For instance, once a data base has been established and stored in a GIS, it is easily modified and updated as needed. This is in contrast to analog map documents, which are costly and time consuming to change. Another advantage of GIS's is that they allow the user to quickly move from one area of the map to another, and to zoom in and out of a map image as required. Traditional paper maps provide only one view and are difficult to work with if the area of interest spans more than one map sheet. GIS's are also superior to traditional maps in the sense that they provide a readily available system for producing graphics for reports or presentations. Most GIS software systems are able to quickly produce high quality scale plots or prints, and supporting map features, such as roads, lakes, streams or culverts, can be incorporated in the graphics whether or not they are GIS features (i.e. connected to the data base).

Finally, digitized maps are now being produced by government and private agencies which are compatible with most GIS packages, thus providing a source of high quality basemaps. In British Columbia, there is currently an ambitious mapping program underway to produce a digital terrain model of the entire province. This project is known as TRIM (Terrain Resource Information Management) and is a joint venture between government and industry in an effort to create a common geographic reference base (Balser 1989). By 1996, over 7000 digitized maps, at a scale of 1:20 000, will provide planimetric and topographic data for the whole province.
This will serve to dramatically facilitate the use of geographic information systems in B.C.

3.4 Disadvantages of GIS's

The technology of geographic information systems is very impressive, and it is easy to get carried away and use GIS's inappropriately. Other computer systems may be far more suitable for a particular application, and care must be taken when assessing the suitability of a project. It must be kept in mind that a generic GIS is not a modeling tool; it is a spatial data handling tool that can enhance modeling.

GIS's have a number of drawbacks. For instance, building a GIS data base can be very time consuming and expensive, and often it is justified only if the particular area impacts on many lives, and/or the model will be used for long term ongoing monitoring and studies, and/or the investigation is very complex. In addition, GIS's generally require large amounts of data, and locating and obtaining appropriate data may be a problem. Finally, geographic information systems are a specialized technology requiring highly skilled operators to fully utilize their capabilities. This requirement may limit the usefulness of a system to engineers or hydrologists.

A note of caution which is particularly applicable to GIS's: as computer systems become more complex and automated, it is imperative that the user understand both the geographic information systems and any models used in conjunction with them. The user must be able to anticipate the expected range of output values in order to assess the quality of results. GIS's provide "a vehicle for generating impressive, seemingly accurate maps expressing complex spatial relationships with minimal understanding of actual spatial relationships" (Berry 1987).
There is danger in the fact that technology is advancing so rapidly that it may be getting ahead of knowledge about using systems and interpreting results.

3.5 GIS and Hydrology

GIS technology has been developing for approximately 20 years and has found extensive application in the forest industry. It is starting to gain acceptance in many engineering fields, such as mining and geology, and over the last ten years has seen increased use in hydrologic modelling (Haan, Nofziger and Zhang 1989). However, there is still not widespread use of GIS in most hydrology circles. This is surprising, as many hydrologic parameters are areally-based and interrelated, and geographic information systems are ideally suited to deal with this type of data. Perhaps the reluctance to adopt this technology is due in part to the fact that a GIS is a specialized tool requiring skilled personnel to fully utilize and understand its capabilities, and many engineers and hydrologists would rather deal with systems that do not have such stringent operational requirements. Regardless, GIS use continues to gain momentum and will become more prominent as computer technology advances.

3.5.1 Using GIS to Generate Hydrologic Model Inputs

The majority of water-related GIS projects have involved using GIS's for either water resource management or the derivation of hydrologic parameters for input into hydrologic models. This second area of study has received considerable attention recently, as hydrologic computer models have found greater use with the advent of the personal computer. The integration of GIS's with hydrologic models is playing an increasing role in the design, calibration and comparison of hydrologic models.

Geographic information systems have been applied to hydrologic modeling in several recent studies. Moeller (1991) developed a GIS application to be used in conjunction with the SCS unit hydrograph methodology in the HEC-1 hydrologic model.
His system automates the tedious and time consuming process of calculating areas for the derivation of SCS curve numbers, percent impervious values, and subbasin areas. These three parameters lent themselves very well to GIS analysis, as they are all areally-based, so that an areal extent was required in order to quantify their values.

Muzik (1988) developed a system similar to Moeller's, in that he used a GIS to store data on physical land characteristics and rainfall data, and then generate SCS runoff curve numbers. He developed his system in association with another computer system which retrieved information from the GIS and computed several runoff parameters.

Wolfe and Neal (1988) used a GIS to provide data input to a finite element model. The layering capabilities of the GIS were used to overlay maps of soil type and land cover in order to identify areas which were hydrologically similar.

Stuebe and Johnston (1991) developed a GIS application to assist in all stages of modeling stormwater runoff. The system was used to generate input for the USDA SCS TR-55 hydrologic model. Elevation, soil and landcover data were entered into the GIS, and "hydrologic response units" were created and assigned a runoff curve number. Runoff volume values were then calculated for each grid cell and the watershed was delineated. The system then used the runoff area and runoff volume values to estimate a runoff volume for the watershed.

3.5.2 GIS and Expert Systems

Another area of study which is producing some encouraging results is the combination of GIS applications with expert system technology. An expert system is a computer program designed to mimic the decision making or problem solving processes of a human expert. These systems are designed to deal with uncertainty and therefore have enormous potential in the field of hydrology, where situations are rarely well defined.
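The per-cell runoff calculation these curve-number studies automate is the standard SCS relation, which converts a precipitation depth and a curve number into a direct runoff depth. A minimal sketch, in the U.S. customary units in which the method is usually stated:

```python
def scs_runoff_depth_in(precip_in, curve_number):
    """Direct runoff depth (inches) from the standard SCS curve-number
    relation: Q = (P - Ia)^2 / (P - Ia + S), with Ia = 0.2 S."""
    s = 1000.0 / curve_number - 10.0   # potential maximum retention (in)
    ia = 0.2 * s                       # initial abstraction
    if precip_in <= ia:
        return 0.0                     # all rainfall abstracted; no runoff
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

# e.g. 3 inches of rain on a grid cell with CN = 75
print(round(scs_runoff_depth_in(3.0, 75), 2))   # → 0.96
```

What the GIS contributes is not this arithmetic but the areal bookkeeping: deriving a curve number for every cell from overlaid soil and land-cover layers, then aggregating the per-cell depths over the delineated watershed.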
An expert system can be used to capture the expertise of a human expert in a manner which can be interpreted and used by other computer systems, such as a GIS.

GIS/expert systems are generally designed to produce input parameters for hydrologic models, but they differ from basic GIS's because they combine the efficiency of the GIS with the "mature hydrologic expertise" of an expert system to optimize the outputs. VanBlargan, Ragan and Schaake (1990 and 1991) have done a considerable amount of work in this area and have produced a system which automatically generates drainage area boundaries, times of concentration, and hydrographs. The system requires digital elevation and stream data as input. It delineates basin areas by calculating the flow direction of every grid point on the basis of a number of rules (i.e. the expert system) and then groups all the grid points which flow towards a common stream. Travel times across each cell are also computed, and the time of concentration for a basin is then calculated by examining each possible flow path to determine the longest hydraulic path. Unit hydrographs are then generated from time-area curves defined from the area and travel time values. The unit hydrographs can then generate complete streamflow hydrographs for any given storm.

In addition to significant time savings over traditional means, this expert system may provide improved estimates of values. For instance, when determining times of concentration, the expert system examines every possible flow path and considers the slope and land use for each grid cell. In contrast, traditional methods could not afford this attention to detail, as it would be far too time-consuming to be practical.
Thus, flow paths were generally estimated as the longest distance seen on a contour map, a simplification which could produce considerable error.

Obviously, expert system/GIS packages offer benefits over basic GIS applications, and this area of study will probably see the greatest development and use in the future. To date, however, hydrologists seem reluctant to embrace expert system technology. Perhaps this reluctance is due to the fact that engineers tend to be very cautious and skeptical of any system which claims to handle uncertainty. Water-related processes (weather, runoff, streamflow, etc.) are inherently uncertain, and engineers have been witness to many failed attempts to integrate this uncertainty into conventional computer programs. With continued successful application, this technology will gain acceptance and should play a major role, likely in conjunction with geographic information systems, in the future development of hydrologic modelling.

3.5.3 GIS/Hydrologic Model Interfaces

As explained previously, geographic information systems are finding application as input data generators for hydrologic models. Currently, most systems require an operator to use a GIS to generate hydrologic parameters, which are then manually entered into a hydrologic model. The two systems operate independently of each other, which is inefficient. In response to this, some work has recently been done on developing GIS/hydrologic model interfaces. An interface acts as a link between the two systems, so that the data produced by a GIS is customized to suit whatever application is being run with the hydrologic model. Fisher (1989) describes a conceptual design for such an interface. This system would provide a graphic, menu-driven interface whose structure would be easily adaptable to various applications, including different hydrologic models.
More work needs to be done in this area; once the various systems start "talking" to each other, GIS's will find greater application and use in hydrology.

3.5.4 Database/GIS Interfaces

One area of study that seems to have received very little attention is the development of interfaces between raw databases and geographic information systems. All types of electronic databases are currently available, but they have little practical use if they cannot be readily utilized by other computer systems. This thesis involves, amongst other things, the development of an interface between Water Survey of Canada's CD-ROM streamflow database and a geographic information system called QUIKMap. This interface analyzes raw data to generate streamflow parameters, which are then formatted so that they can be read and displayed by the GIS. The system is designed to summarize large quantities of data so as to provide users with a simple means of viewing data on a region-by-region basis, for the province of British Columbia. In contrast to most other hydrological GIS applications, the GIS, in this case, serves mainly to display data and does not perform any complex analysis. As far as the author knows, this project is the first attempt in Canada to use a GIS to aid in estimating streamflow parameters for ungauged streams for a whole province, in such a simple, yet effective, fashion.

3.6 Selection of a GIS Package

For the purposes of this study, the GIS software QUIKMap, developed by the Axys Software company of Sidney, British Columbia, was chosen. This software package was selected because:

1. The system is able to store and display spatially related data.

2. An appropriate map of British Columbia was in print and available for use with the system. The Ministry of Environment has produced a digitized 1:2 500 000 scale map of B.C., called the Blue Atlas, which is available to any interested party.
This map represents all the major streams and rivers in B.C., and is completely compatible with the QUIKMap software, making it perfectly suited to our application.

3. The system is very easy to use and operate. In contrast to many of the more sophisticated GIS's, QUIKMap does not require the operating skills of a full-time GIS technician. It was the intent of the author to produce a product which could be used simply and quickly by any hydrologist.

4. The system is relatively inexpensive, at approximately $800, compared to average costs of $3,000 to $4,000, and a high of approximately $20,000, for other GIS software. This low price will not prevent small consulting firms from purchasing the software and utilizing the results of this project. It should be noted that the reason this system is relatively inexpensive is that it does not contain complex data analysis and manipulation capabilities, and it does not handle polyline data as well as many other systems. Technically, QUIKMap would not be considered by many to be a "true" GIS, but rather a desktop mapping system. However, as this project is largely concerned with presenting point data, which QUIKMap handles very well, this is not a concern.

5. It is a product developed, sold and supported by British Columbians. As this project is largely funded by the Science Council of B.C., it was felt that it would be appropriate to utilize local products and expertise.

CHAPTER 4
DEVELOPMENT OF THE DATSUM PROGRAM

Once a GIS package was selected, the next step was to determine how the system displayed data, how the data available on the CD-ROM could be formatted to be compatible with the GIS, and how it could be analyzed to produce the required results.

4.1 Initial Considerations

The first factor to be considered was that the CD-ROM data is in ASCII format, while the GIS requires database files in dBASE III format.
The second factor was that the CD-ROM data is only available in a very basic form, and it needed to be analyzed and manipulated before being exported to the GIS. After much consideration, it was decided that these requirements could best be met with a spreadsheet program, and the software package Excel, produced by the Microsoft Corporation, was chosen. In addition to meeting all the requirements listed above, this software, or its compatible competition, Lotus 1-2-3, is available at almost all engineering consulting firms.

At first it was hoped that an interactive system could be developed that would create and display data as needed. However, the GIS, Excel and CD-ROM software do not have the capability to interact, and so data summary files have to be created independently of the GIS. The files are saved to a directory which is then accessed by the GIS.

4.2 WSC Data Files

Another unforeseen problem was that CD-ROM data files can only be exported and labelled one file at a time. As the province of British Columbia contains over 1600 streamflow stations, and two files were needed (monthly data and extreme data) for every station, over 3200 files had to be extracted and saved before any analysis could even begin. This proved to be an extremely time-consuming and labour-intensive process, and as a result, this project was limited to creating data summary files for only the south-west and central-west areas of the province. This required only approximately 1000 data files and was deemed sufficient to demonstrate how the system operates. If the user wishes to view data for other areas of the province, then data summary files can be created as needed. Fortunately, once the data summary files are created, they should be relevant for at least three or four years.
It will not be necessary to produce new data summary files every year, as the values calculated by the system are statistically based, and an additional year of records will not change them significantly.

The files created for this project are based on data up to and including 1990, and it should not be necessary to recalculate the values until the 1993 data is available. By then, hopefully, the CD system will be able to produce batches of files, which will dramatically streamline the data extraction process. If the latest data is only needed for a particular region, and one is not interested in producing data for the entire province, then up-to-date data files can be created easily.

4.3 DATSUM Summary Files

All the streamflow stations in Canada are organized in terms of regions (i.e. region 08JA), and every station is given a name associated with a particular region (i.e. 08JA001, 08JA002, etc.). Thus, it was logical to organize the data summary files in terms of regions, and the Excel data summary program, DATSUM, is designed to do just that. Basically, the program takes all the raw CD-ROM data files for a particular region of the province and performs a number of calculations to create streamflow parameters for each file. It then exports all the calculated values to a data summary file, where additional information pertaining to the data display in the GIS is added. A data summary file, which can be read by the GIS, is created for every region in the province. An example of the data summary file produced for region 08JA is shown in Appendix A.1.2, along with a description of every data type created in the database summary file. It should be noted that this data summary file is shown as it appears in either a dBASE III or Excel environment, but that it is displayed differently in QUIKMap. An example of how QUIKMap displays an individual line from a summary file is shown in Figure 5.7.

The data summary files created by DATSUM contain a variety of different streamflow parameters.
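To illustrate the kind of statistics involved, the sketch below computes a few of these parameters (the mean annual runoff, unit-area runoff, and the flow pattern index, i.e. the coefficient of variation of the mean monthly flows, as defined in Chapter 5) in Python rather than the Excel macro language that DATSUM actually uses; the monthly flows and catchment area are invented for illustration.

```python
# Sketch of the core statistics DATSUM derives for one station.
# (Hypothetical data; DATSUM itself is written in Excel macro language.)
from statistics import mean, pstdev

def cv(values):
    """Coefficient of variation: standard deviation divided by the mean."""
    return pstdev(values) / mean(values)

# Monthly mean flows (m3/s), one row per complete year of record (illustrative).
monthly_flows = [
    [0.2, 0.3, 0.5, 2.1, 4.8, 3.2, 1.1, 0.6, 0.7, 1.4, 0.9, 0.4],
    [0.3, 0.2, 0.6, 1.8, 5.2, 2.9, 1.0, 0.5, 0.8, 1.6, 1.0, 0.5],
]
catchment_area_km2 = 153.0

annual_means = [mean(year) for year in monthly_flows]
mean_annual_runoff = mean(annual_means)                  # m3/s
unit_area_mar = mean_annual_runoff / catchment_area_km2  # m3/s/km2
cv_annual = cv(annual_means)

# Flow pattern index: CV of the mean monthly flows (averaged across years).
mean_by_month = [mean(year[m] for year in monthly_flows) for m in range(12)]
flow_pattern_index = cv(mean_by_month)
```

The same quantities could equally be produced with spreadsheet functions; the point is that each summary-file entry reduces to simple means and coefficients of variation over a station's record.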
These specific parameters were chosen because they are some of the most commonly used hydrological values, and because this version of DATSUM was designed specifically to create inputs for the FLOOD program. However, with a basic knowledge of the Excel macro language, the program can easily be modified to calculate other streamflow values.

4.4 DATSUM Program Organization

The DATSUM program is written in the Microsoft Excel macro language, and operating the program requires a basic knowledge of the Excel worksheet environment and commands. If the user is not familiar with Excel, then the Excel User's Manual (Microsoft Corporation, 1991) should be referenced.

The program DATSUM consists of nine macrosheets and two worksheets. The macrosheets are named MACRO1A.XLM, QUIKDATA.XLM, INDEX.XLM, FLOW.XLM, RATIO.XLM, SETUP.XLM, OPEN.XLM, DATSUM.XLM and FUNCTION.XLA, and the worksheets are named DUMM001E and SUMBASE.XLS.

The sheet MACRO1A.XLM contains small macros that perform very simple and specialized functions related to formatting the individual streamflow data files. The sheet QUIKDATA.XLM contains macros that specify what sort of symbols will represent each streamflow data file. The sheet INDEX.XLM contains macros that take information from the raw index data files and transfer it to the summary files. The sheet FLOW.XLM contains macros that link other macros together to format individual streamflow data files, while the sheet RATIO.XLM contains macros that perform similar functions for individual max flow files. The sheet SETUP.XLM contains macros designed to set up the other macros so that they will operate for a specific data file or group of files. The sheet OPEN.XLM contains macros which open and close different combinations of DATSUM files. The sheet DATSUM.XLM contains only one macro: the master program macro, which opens and closes files as needed and orchestrates all the other macros.
The sheet FUNCTION.XLA is an "add-in" macrosheet containing an "add-in" macro which calculates the correlation coefficient of an array of numbers. Finally, the sheet DUMM001E is a dummy file used by the system when a certain type of file is missing, and SUMBASE.XLS is a generic worksheet onto which the results of a DATSUM analysis are exported and saved.

A listing of the 67 macros which make up the program, and a description of their functions, is given in Appendix A.2, as is a flowchart which illustrates the inter-macro relationships (Figure A.2). A copy of the programming code is given in Appendix A.3. Some of the code is not shown in its entirety, but in general, the missing code is text which is not significant to understanding the programming. Note that macrosheet FUNCTION.XLA is not mentioned in either Appendix A.2 or A.3 because it is very simple and does not merit inclusion.

4.5 Instructions for Use of DATSUM

The following text outlines instructions for installing and operating the DATSUM program. This program has been designed to operate in the Microsoft Excel software environment, although with modification, it may run in the Lotus 1-2-3 environment. If the instructions are followed carefully, the program should run without incident. However, it should be noted that the program is not very robust, in that it has not been written with many safeguards against user errors, and could potentially be destroyed with improper use. Therefore, it is highly recommended that a back-up copy of the program always be kept on hand, so that if the user inadvertently destroys the program, it can easily be restored to its original state.

4.5.1 Installing the Program

Install the DATSUM program by copying the files MACRO1A.XLM, QUIKDATA.XLM, INDEX.XLM, FLOW.XLM, OPEN.XLM, RATIO.XLM, SETUP.XLM and DATSUM.XLM into a directory in the C:\ drive titled QUIKMAP\FLOWDATA\MACRO (i.e. C:\QUIKMAP\FLOWDATA\MACRO). Next, copy the file FUNCTION.XLA into the EXCEL directory titled XLSTART (i.e.
C:\EXCEL\XLSTART), and the file SUMBASE.XLS into a directory in the C:\ drive titled QUIKMAP\FLOWDATA\SUMMARY (i.e. C:\QUIKMAP\FLOWDATA\SUMMARY). This is imperative! The DATSUM macros contain many references to files in these directories; the program will not run properly if the files are loaded into different directories, unless the file references are changed appropriately.

4.5.2 Creating and Loading WSC Data Files

The DATSUM program creates streamflow data summary files for streamflow regions in British Columbia. In order to do this, it requires monthly flow and extreme flow data files for every streamflow station in a particular region. As mentioned earlier in the text, this data is compiled and updated annually by Water Survey of Canada (WSC), and is available in CD-ROM format. The following are instructions for obtaining the raw data files from the CD-ROM in the correct format, and loading these files so that they can be accessed by the DATSUM program. Note that the DATSUM program could be used to create streamflow data summary files for any province in Canada, but this thesis limits the discussion to B.C.

The WSC CD-ROM surface water database package, known as HYDAT (Environment Canada, 1990), can be obtained from the Water Resources Branch of Environment Canada. The package consists of a user's manual, some data access software, and a CD-ROM disk containing surface water data for all of Canada. The package allows the user to view and copy a variety of data types, in a variety of data formats. The DATSUM program has been designed to operate on files formatted in a very particular manner, and the following steps should be taken to ensure that the files are created properly. For the sake of brevity, it is assumed that the reader has access to the HYDAT User's Manual, and so the instructions below are given in a very simplified form.

1. Enter the province window and select the province of interest; in this case, British Columbia.

2.
Create an index file of all of the flow files in a particular region. Use the filter function to select only those files in a particular region (i.e. all stations with numbers beginning with 08MH, for example) and only those files with flow records (i.e. parameter = discharge). Save the file with the "Output", "Print", "File" and "All" commands, and name the file regionnameINDE (i.e. for region 08MH, name the index file 08MHINDE).

3. For every station within a particular region, create mean flow and extreme flow files. Two files must be created for every station, except in the situation noted below. Do this by selecting the station of choice, then selecting either "Mean" or "Extreme". Wait for the system to create your file on the screen, then select the "Output", "Print", "File" and "All" commands. Finally, name the file with its station number and the letter F or E, to indicate whether the data is mean (F)low or (E)xtreme (i.e. 08MH001F or 08MH001E, 08MH002F, etc.). Take note that not every streamflow station will have extreme data. If this is the case, then an extreme file cannot be created. Instead, simply record the name of the station so that this information can be input to DATSUM when the system prompts the user for it. Also, make sure that the dummy file, DUMM001E, is copied to the same directory as all the files for that region. This is further explained in step 4, below.

4. Once all the required files have been created for every station in a region, create a directory with the region name, and then transfer all the files to it. Create this directory as a subdirectory of C:\QUIKMAP\FLOWDATA (i.e. for region 08MH, save the 08MHINDE file, the DUMM001E file (if needed), and all the 08MH***F and 08MH***E files to C:\QUIKMAP\FLOWDATA\08MH).
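The naming convention in the steps above is mechanical enough to sketch. The hypothetical helper below (the station numbers and the missing-extreme station are purely illustrative) generates the directory and file names expected for one region:

```python
# Sketch of the file-naming convention of section 4.5.2 ("F" = mean flow
# file, "E" = extreme flow file; station list is illustrative).
def datsum_paths(region, station_numbers, missing_extremes=()):
    """Return the expected directory and file names for one WSC region."""
    base = "C:\\QUIKMAP\\FLOWDATA\\" + region
    files = [region + "INDE"]            # index file, e.g. 08MHINDE
    for num in station_numbers:
        station = "%s%03d" % (region, num)
        files.append(station + "F")      # mean flow data file
        if station not in missing_extremes:
            files.append(station + "E")  # extreme flow data file
    return base, files

base, files = datsum_paths("08MH", [1, 2], missing_extremes={"08MH002"})
# files: ['08MHINDE', '08MH001F', '08MH001E', '08MH002F']
```

A helper like this makes it easy to check a region's directory for missing or misnamed files before running DATSUM.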
It may sound unnecessarily complicated to be creating so many different directories, but the user could potentially be dealing with hundreds of files, and this process is necessary to maintain order.

4.5.3 Running the Program

Enter the Microsoft Excel environment and open the file OPEN.XLM. Run the "OPEN" macro by pressing the keys <CTRL> <o>. This will automatically open all the files necessary to run the DATSUM program. Start the DATSUM program by pressing the keys <CTRL> <d>. A number of messages and prompts will appear on the screen (Figure 4.1). Simply follow the prompts and make appropriate selections.

Figure 4.1 - Sample DATSUM Screen Message.

The first significant input required by the program will be a four-character WSC streamflow region name (Figure 4.2).

Figure 4.2 - Sample DATSUM Input Screen.

After this data has been entered, the system will prompt the user for the names of those stations in the region without extreme data records (Figure 4.3). Once this information has been entered, the program is ready to run. The user will be required to wait momentarily, and then a beep will sound to indicate when the analysis is complete. The program saves the results of the analysis to the C:\QUIKMAP\FLOWDATA\SUMMARY directory in both Excel (.xls extension) and dBASE III (.dbf extension) format.

4.5.4 Exiting the Program

Once the program has saved the results of an analysis, the user is given the option of either exiting the program or running it again. If the exit option is chosen, the program will close all the DATSUM macros except DATSUM.XLM and OPEN.XLM. The user must close these files manually without saving any changes.
Figure 4.3 - Sequence of DATSUM Screens Requesting Extreme Flow Information.

CHAPTER 5
OPERATION OF THE DATSUM/QUIKMAP SYSTEM

5.1 System Organization

This chapter will describe how the DATSUM summary files are displayed by the QUIKMap GIS software. This description will be very basic, however, as it is assumed that the reader is familiar with the operation of QUIKMap. If this is not the case, and the reader has difficulty following the text, then the QUIKMap operation manual should be referenced.

This particular application of QUIKMap has been designed to present streamflow information on all the different streamflow stations in B.C., for the purpose of regional analysis. When performing a regional analysis, one does not usually know the names of the various streamflow regions around the stream of interest. Thus, one must first identify these regions before one can request detailed streamflow information. To facilitate this process, the system has been set up with two subsystems; the first allows the user to view the locations of all the streamflow regions in the province and identify those in the area of interest, while the second permits the user to access detailed streamflow information. What follows is a description of these two subsystems and how they can be operated.

5.2 Operating the QUIKMap Application

Before entering the QUIKMap operating environment, it is necessary to change the current directory of the computer system to the directory containing all of the relevant data files (i.e. C:\QUIKMAP\FLOWDATA\SUMMARY). This will ensure that the system will be able to access all the necessary files. Once this is done, the user may enter the QUIKMap program and select the appropriate base map. The next step is to select the data file of interest.
Upon selecting the data file option from the pull-down menu, a small window will appear on the screen and display the following:

    WSC Station Locations
    WSC Station Data

Figure 5.1 - QUIKMap Data File Directory Menu

If the user selects "WSC Station Locations", then another window will appear on the screen to allow the user to select different regions of the province. This window will appear as follows:

    All BC WSC Stations
    South-west BC WSC Stations
    South-east BC WSC Stations
    Central-west BC WSC Stations
    Central-east BC WSC Stations
    North-west BC WSC Stations
    North-east BC WSC Stations

Figure 5.2 - QUIKMap Data File Directory Menu

Upon selection of the appropriate region, a number of small names (the streamflow regions) will appear on the map (Figure 5.3). In order to identify the streamflow regions in the area of the stream in question, the user need just activate the coordinate display option, so that the latitude and longitude of the cursor appear on the screen, and then move the cursor to the location of the stream. The user can zoom in and out of the map display in order to isolate individual stations or view larger areas as needed.

Figure 5.3 - Basemap of B.C. With WSC Streamflow Regions

Once the user has identified the streamflow regions of significance, she/he is in a position to ask the system to provide data on the various streams in the region. To do this, the user returns to the initial pop-up window and selects "WSC Station Data".
This initiates another series of pop-up windows, and the user simply clicks the mouse on the suitable choices and works through the system to select the appropriate streamflow regions (Figure 5.4).

Figure 5.4 - Sequence of QUIKMap Data File Directory Menus.

Subsequently, various symbols will appear on the screen, each symbol representing a different streamflow station (Figure 5.5). The symbols appear in a variety of shapes, sizes and colours, depending on the characteristics of each stream. In addition to the symbols, a small box containing information about the station will appear beside each symbol. If the station has five or more years of record, then the station name, unit area mean annual flow value, and flow pattern index value are given. (The flow pattern index is the coefficient of variation of the mean monthly flows and gives an indication of how the flow values change from month to month.) If there are less than five years of record, then only the station name and an "insufficient data" message are given. This message signals to the user that any statistical information should be used with caution because of the limited data.

Figure 5.5 - WSC Streamflow Stations - Region 08JA

A symbol legend, as in Figure 5.6, can be brought up on the screen to guide the user, and from the symbols alone, considerable information about each stream can be obtained. In addition, if more detailed information is required, the user need simply place the cursor on the appropriate symbol and press the key <E>, and a detailed data window, as in Figure 5.7, will appear on the screen.
This window displays the data summary file created by the DATSUM program, and as explained earlier, the type of information displayed can be controlled by editing the DATSUM program.

MAP LEGEND

CATCHMENT SIZE (km2)
    (brown)       0 - 100
    (yellow)      100 - 500
    (blue)        500 - 1,000
    (light blue)  1,000 - 5,000
    (red)         5,000 - 10,000
    (violet)      10,000 - 15,000
    (green)       15,000 <

UNIT AREA RUNOFF (cms per km2)
    0.000 - 0.025
    0.025 - 0.050
    0.050 - 0.075
    0.075 - 0.100
    0.100 <
    INCOMPLETE DATA

SOLID SYMBOLS = ACTIVE STATIONS
EMPTY SYMBOLS = DISCONTINUED STATIONS
INSUFFICIENT DATA = < 5 years of data

Figure 5.6 - Sample QUIKMap Map Legend

STREAMFLOW STATION GENERAL INFORMATION
    Station Number: 08JA014
    Station Name: VAN TINE CREEK NEAR THE MOUTH
    Rainfall Catchment Area (km2): 153
    Station Coordinates - latitude: 53 deg 15 min 30 sec
    Station Coordinates - longitude: 125 deg 24 min 50 sec

STREAMFLOW STATISTICAL PARAMETERS
    Mean of the mean annual runoffs (m3/s): 0.9
    Unit area mean of the mean annual runoffs (m3/s/km2): 0.006
    Coefficient of variation of the mean annual runoffs: 0.52
    Flow pattern index (0 means no monthly change in flow): 1.22
    Time span over which streamflow records were kept: 1974-1990
    Number of years of streamflow record: 17
    Number of complete years of streamflow record: 16
    Mean of the annual max daily flows (m3/s): 11.35
    Unit area mean of the annual max daily flows (m3/s/km2): 0.074
    Coefficient of variation of the annual max daily flows: 0.54
    Mean of the annual ratios of max inst. flow to max daily flow: 1.16
    Coefficient of variation of the annual ratios: 0.14
    Correlation coefficient between max daily flow and ratio inst./daily: 0.34
    Number of complete years of extreme flow record: 14

QUIKMap DISPLAY ATTRIBUTES
    Type of Data:
    Symbol/Line type: 1
    Size of Symbol: 5
    Symbol Colour: 14
    Symbol Hatch Pattern: 1
    Label Text: 08JA014/nMAR = 0.006/nFPI = 1.22
    Label Size: 1
    Special Instructions:

NOTE: Any negative value (i.e. -99.99) indicates missing or incomplete data.

Figure 5.7 - Sample QUIKMap Data Display Screen.

One can see that the system provides a very efficient and user-friendly way to obtain streamflow information, which can aid hydrologists in numerous applications. One example would be to use the information presented as input to a flow estimating program, such as FLOOD.

CHAPTER 6
THE FLOOD PROGRAM

6.1 Description of the FLOOD Program

The FLOOD program was developed by Dr. Denis Russell, a professor in the Civil Engineering department of the University of British Columbia. This program is designed to estimate extreme high (flood) and low flows for streams with little or no streamflow data. The program uses regional data to create initial flow estimates, and then updates the estimates according to additional information, through the application of Bayes' theorem (Russell 1981). The basic concept is to set up a compound probability distribution on the basis of prior information, and then update the weights of the component distributions in light of additional information. This paper will not attempt to describe the theory behind the method in detail, but rather refers the reader to Dr. Russell's paper (Russell 1991).
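The weight-updating idea can be illustrated numerically. In the sketch below (not FLOOD's actual code; the distributions and all numbers are invented for illustration), three Gumbel components represent low, probable and high prior estimates of the flood distribution, and their weights are rescaled by the likelihood of a single additional observation, as Bayes' theorem prescribes:

```python
# Minimal sketch of Bayes' theorem applied to a compound distribution:
# prior weights on candidate flood distributions are updated by the
# likelihood of one observed flood (all numbers illustrative).
import math

def gumbel_pdf(x, mu, beta):
    """Density of the Extreme Value Type I (Gumbel) distribution."""
    z = (x - mu) / beta
    return math.exp(-(z + math.exp(-z))) / beta

# Three candidate Gumbel components: (location mu, scale beta, prior weight).
components = [
    (100.0, 30.0, 0.25),   # "low" estimate
    (150.0, 40.0, 0.50),   # "probable" estimate
    (220.0, 60.0, 0.25),   # "high" estimate
]

observed_flood = 180.0  # e.g. a known high-water mark (m3/s)

# Posterior weight is proportional to prior weight times likelihood.
unnorm = [w * gumbel_pdf(observed_flood, mu, b) for mu, b, w in components]
total = sum(unnorm)
posterior = [u / total for u in unnorm]
```

The observation near the "probable" component shifts weight toward it and away from the "low" component, which is exactly the behaviour described above: additional information such as a high-water mark sharpens the initial regional estimate.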
It should be noted that the original version of the FLOOD program allows the user to select the underlying probability distribution type from a list of four, while the simplified Excel version of the program limits the user to only the Extreme Value Type I (Gumbel) distribution, as this is the most appropriate distribution for extreme, or flood, flows.

6.2 Advantage Over Frequency Analysis

Traditionally, the most common method of translating raw streamflow data into future probabilities of occurrence is frequency analysis. This technique requires annual peak flow data, but does not give consideration to other types of data, such as the largest flood, or the flood which has not been exceeded, in a number of years. If the database is very large, then the effect of this additional data is often inherent in the analysis, but if the database is small, which is often the case, then the significance of these values is not accounted for. This sort of data, which is often available through a high-water mark or the recollections of local residents, may prove to be very important in arriving at final flow estimates. The FLOOD program provides a means of incorporating this information into the flow estimating procedure.

In Appendix B, one can find instructions for use of the Excel version of FLOOD, and a description of the various files which make up the program.

CHAPTER 7
SAMPLE APPLICATION

7.1 Kitseguecla River

For the purpose of demonstrating this system, a regional analysis was performed to estimate streamflow parameters for the Kitseguecla River, and these results were then used as input for the FLOOD program to estimate peak flows. For this demonstration, it was assumed that the 200-year daily and instantaneous peak floods were required. WSC station 08EF004, the Kitseguecla River near Skeena Crossing, is located in central-west British Columbia and has a drainage area of 728 km2.
The river flows in a south-westerly direction through mountainous regions, and there are wide variations in precipitation and runoff in the area. The river was gauged from 1960 until 1971, and there are eleven years of max daily and max instantaneous flow records.

7.2 Data Summary Procedure

As described earlier in this document, the DATSUM program was used to create streamflow data summary files for various WSC regions, including all of those in the central-west area of the province. Thus, as the Kitseguecla River lies in this area, a regional streamflow database was available, and the hydrologist was able to use the GIS system to quickly identify the relevant WSC streamflow stations, and then to access and summarize the data for each station. This was done in the following manner, and the results of the analysis are shown in Table 7.1:

1. The user entered the GIS system and requested the "Streams of B.C." basemap, a digitized map of the province showing all the major streams and rivers.

2. From the pull-down menus, the user selected the "WSC Station Locations" and the "Central-west BC WSC Regions" options, resulting in a display of all of the WSC streamflow regions in the central-west area of the province (similar to Figure 5.3, but only displaying stations in the central-west region of the province). The user then moved the cursor to the location of the Kitseguecla River, and identified the streamflow regions in the local area as 08DB, 08EB, 08EC, 08EE, 08EF and 08EG.

3. From the pull-down menus, the "WSC Station Data" and "Central-west BC" options were then selected, followed by selections of each of the six streamflow regions of interest. This resulted in a screen display of a symbol and message for every streamflow station within these regions (similar to Figure 5.5), a total of 62.

4. The next step was to limit the analysis to the most relevant stations. As these stations are located in a mountainous region, there are wide variations in basin characteristics and runoff values.
Therefore, it is difficult to determine what factors may make one station more relevant than another. The selection process can be made on the basis of a variety of different parameters, depending on the type of information available and the extent of the analysis. If this were to be a detailed analysis, then the selection would likely be based on a combination of different factors and the process would involve a considerable amount of time and hydrological judgement. However, as this analysis is simply for the purpose of demonstrating the DATSUM/QUIKMap system, and as the number of streamflow stations involved is very large, a simple approach was adopted. In this case, proximity to the Kitseguecla River was chosen as the limiting factor. The QUIKMap system has a function to facilitate this type of selection, and so the "Radius" option was chosen. This allows the user to select only those stations within a specified radius of a chosen point. A radius of 100 km around station 08EF004 was chosen and the resulting screen image is shown in Figure 7.1. All those stations marked by a small arrow have been "tagged", and with an additional command, the screen will redraw and display only those stations, a total of 31.

5. Further reductions were made by eliminating those stations with less than 5 years of streamflow records and those with incomplete extreme flow records, resulting in a final total of 11 stations. The data for these stations was then viewed and the results are summarized in Table 7.1. It should be noted that the system also has the option of allowing the user to print out individual data files as needed.

Traditionally, the process of identifying, selecting, viewing and summarizing data is very time-consuming, but with this efficient and user-friendly GIS-data file package, Table 7.1 was generated in approximately 30 minutes.

Figure 7.1 - Use of QUIKMap's "Radius" Function

Table 7.1 - Regional Flood Data for Kitseguecla River Near Skeena Crossing

Station No. | River                                    | Period of Record | Years of Ext. Record | Drainage Area (km2) | Unit Area MAR | FPI  | Mean Flood | CV Flood | Mean Ratio | Corr. Coeff. | Orientation
08DB001     | Nass River above Shumal Creek            | 1929-1990        | 22 | 18500 | 0.042 | 0.96 | 0.187 | 0.16 | 1.04 |  0.23 | N to S
08EB005     | Skeena River above Babine River          | 1970-1990        | 16 | 12400 | 0.029 | 1.08 | 0.164 | 0.19 | 1.05 |  0.20 | N to S
08EC001     | Babine River at Babine                   | 1929-1985        | 10 |  6480 | 0.007 | 0.70 | 0.016 | 0.38 | 1.04 | -0.51 | N to S
08EC013     | Babine River at Outlet of Nilkitkwa Lake | 1972-1990        | 14 |  6790 | 0.007 | 0.70 | 0.021 | 0.35 | 1.01 | -0.19 | NW to SE
08EE008     | Goathorn Creek near Telkwa               | 1960-1990        | 26 |   132 | 0.013 | 1.04 | 0.118 | 0.46 | 1.26 |  0.31 | S to N
08EE020     | Telkwa River below Tsai Creek            | 1975-1990        | 13 |   368 | 0.038 | 0.92 | 0.244 | 0.33 | 1.12 |  0.78 | W to E
08EE025     | Two Mile Creek in District Lot 4834      | 1982-1990        |  7 |    20 | 0.006 | 0.26 | 0.023 | 0.68 | 1.15 |  0.92 | E to W
08EE028     | Station Creek above Diversions           | 1985-1990        |  6 |    11 | 0.025 | 1.04 | 0.191 | 0.21 | 1.29 | -0.09 | S to N
08EF001     | Skeena River at Usk                      | 1928-1990        | 33 | 42200 | 0.022 | 0.94 | 0.117 | 0.20 | 1.03 | -0.03 | N to S
08EF005     | Zymoetz River above O.K. Creek           | 1963-1990        | 25 |  2980 | 0.035 | 0.86 | 0.220 | 0.55 | 1.25 |  0.54 | NE to SW
08EG011     | Zymagotitz River near Terrace            | 1960-1990        | 28 |   376 | 0.062 | 0.71 | 0.452 | 0.40 | 1.48 |  0.49 | NW to SE
08EF004     | Kitseguecla River near Skeena Crossing   | 1960-1971        | 11 |   728 | 0.021 | 0.97 | 0.184 | 0.36 | 1.50 |  0.56 | NE to SW

                                     |  low | probable | high
mean max daily flood (m3/s/km2)      | 0.02 |   0.16   | 0.4
coefficient of variation             | 0.16 |   0.36   | 0.6
ratio of daily to instantaneous      | 1.12 |   1.28   | 1.48
correlation coefficient              |      |   0.24   |

7.3 Regional Analysis

The next step was to perform an analysis of this data so as to estimate streamflow parameters for use in the FLOOD program. In order to calculate max daily and max instantaneous flows for various return periods, the FLOOD program requires inputs of low, probable and high estimates of a stream's mean daily flood and coefficient of variation, its ratio of peak daily to instantaneous flows, and the correlation coefficient between the ratio and the mean daily flood.
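The low, probable and high inputs can be pulled directly from the Table 7.1 values. A minimal sketch follows, using the regional average as the probable estimate and the regional extremes as the bounds; the values actually adopted in Table 7.2 were rounded with additional judgement.

```python
# Values transcribed from Table 7.1: unit-area mean floods (m3/s/km2)
# and flood CVs for the eleven selected stations, and the
# daily-to-instantaneous ratios of the three similar-sized catchments
# (08EE020, 08EF005, 08EG011).
mean_floods = [0.187, 0.164, 0.016, 0.021, 0.118, 0.244,
               0.023, 0.191, 0.117, 0.220, 0.452]
flood_cvs = [0.16, 0.19, 0.38, 0.35, 0.46, 0.33, 0.68, 0.21, 0.20, 0.55, 0.40]
ratios = [1.12, 1.25, 1.48]

def low_probable_high(values):
    """Probable = regional average; low/high = regional extremes."""
    return min(values), sum(values) / len(values), max(values)

flood_est = low_probable_high(mean_floods)   # probable rounds to 0.16
cv_est = low_probable_high(flood_cvs)        # probable rounds to 0.36
ratio_est = low_probable_high(ratios)        # probable rounds to 1.28
```

The probable values reproduce the averages reported in Table 7.2; the thesis trimmed the low and high bounds slightly (e.g. 0.452 to 0.4) by judgement.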
For Kitseguecla River these values were estimated as follows:

Table 7.2 - Estimates of Peak Flood Parameters for Kitseguecla River

Reasons for the selection of these values are as follows:

1. Mean daily flood: The mean daily flood values differ by as much as an order of magnitude, and as there is no clear correlation between flow and drainage area, orientation, or period of record, it seemed reasonable to use the average value as the most probable estimate, and then to specify a wide range between the low and high estimates. Some thought was given to discarding the low and high values, but as station 08EE025 is one of only two stations within a 30 km radius of station 08EF004, this approach was abandoned.

2. Coefficient of variation: The coefficients of variation fall within a much narrower range than the daily flood values, but, in a similar fashion, they do not correlate strongly with any basin characteristics, and so the same approach as used for the daily flood estimates was adopted.

3. Ratio: The ratios of max daily to max instantaneous peak flows tend to decrease as the catchment area increases. Thus, the ratio estimates were based on the ratio values of stations with catchments of similar size to station 08EF004. Three stations, 08EE020, 08EF005, and 08EG011, were identified with this characteristic, and the most probable ratio estimate was given as the average of their ratios. The high and low estimates were simply the high and low values of these three stations.

4. Correlation coefficient: In contrast to the ratio values, the correlation coefficients between the max daily floods and the ratios of max daily to max instantaneous flows are not strongly related to catchment size, so the average value was used.

7.4 Additional Data

The FLOOD program permits the user to input additional flood data in order to improve or update the flood estimates.
Station 08EF004 has 11 years of flood records (Table 7.3), and increasing amounts of this data were input to FLOOD to demonstrate how additional data alters the estimates. Initially, five years of flood data were used (1960-63, 65), then nine years of data (1960-63, 65-69), and finally, nine years of data plus the fact that a flow of 229 m3/s was the largest flow in 11 years.

Table 7.3 - Extreme Flow Records for Station 08EF004

KITSEGUECLA RIVER NEAR SKEENA CROSSING - STATION NO. 08EF004

Year | Max Inst. Flow (m3/s) | Time      | Date     | Max Daily Flow (m3/s) | Date
1960 | 106                   | 07:30 PST | 15-Oct   | 87.2                  | 15-Oct
1961 | 265                   | 12:15 PST | 31-Oct   | 137                   | 31-Oct
1962 | 97.7                  | 12:05 PST | 1-Nov    | 86.4                  | 1-Nov
1963 | 190                   | 23:10 PST | 21-May   | 153                   | 22-May
1965 | 171                   | 07:30 PST | 22-Oct   | 114                   | 22-Oct
1966 | 603                   | 04:30 PST | 24-Oct * | 229                   | 24-Oct
1967 | 125                   | 23:00 PST | 6-Jun    | 112                   | 2-Jun
1968 | 251                   | 03:00 PST | 21-May   | 203                   | 21-May
1969 | 167                   | 00:30 PST | 24-May   | 119                   | 24-May
1970 | 237                   | 12:44 PST | 26-Jul   | 160                   | 26-Jul
1971 | 119                   | 18:36 PST | 6-Jun    | 75.3                  | 7-Jun

All 11 years of flow were not used, as the system is designed for situations where there is very little data and can only handle 10 pieces of additional data. However, all 11 years of data were used in a conventional flood frequency analysis, and the results of this, in comparison with the FLOOD results, are shown in Table 7.4.

Table 7.4 - Estimates of 200 Year Mean Daily and Instantaneous Floods on Kitseguecla River

Max Daily Flows
Additional Data                            | Program low | Program probable | Program high | Conventional
none                                       | 38.2        | 316.4            | 661.6        |
5 years actual (1960-63, 65)               | 187.8       | 254.8            | 321.9        | 224.7
9 years actual (1960-63, 65-69)            | 203.1       | 280.9            | 358.7        | 319.9
9 years actual + largest flood in 11 years | 231.1       | 295.7            | 360.4        | 313.9

Max Instantaneous Flows
9 years actual + largest flood in 11 years | 327.0       | 403.1            | 481.9        | 735.7

As one can see, the FLOOD results are initially a little more conservative than the conventional estimate, and the confidence limits are wide apart, but as more data is used, the FLOOD estimate becomes less conservative and the confidence
limits narrow.

CHAPTER 8
DISCUSSION AND CONCLUSIONS

Streamflows are essential information for environmental assessments and the design of water related facilities. Fortunately, many large streams and rivers have extensive historical records which make the process of predicting design flows, such as peak floods and low flows, relatively simple. However, the majority of streams in the province of British Columbia have little or no historical records. Thus, flows for those streams must be estimated by some other means, and the most commonly accepted procedure is regional analysis.

Regional streamflow analysis is performed frequently around the province of British Columbia. However, to date, there has been no attempt to develop any sort of a package, or system, to facilitate or streamline the process. The development of such a system would not only increase the efficiency of data summary and analysis, but would encourage the consideration of larger amounts of data, thereby improving the quality of results. At present, the labour intensive nature of data summary work prohibits the examination of all but the most relevant material, and numerous scenarios are not considered.

The personal computer, with its ability to handle huge quantities of data, has revolutionized the field of hydrology. The advent of the computer, coupled with the development of software packages designed specifically for managing spatially oriented data (GIS), provides an ideal environment for the development of a streamflow summary and analysis system. This thesis involved the development of such a system, which is known as the DATSUM/QUIKMap system. This system uses a spreadsheet program called DATSUM to process raw streamflow data files from a CD information disk in order to create summary data files for Water Survey of Canada (WSC) streamflow regions in B.C. These files are generated in a format which can be read and displayed by a geographic information system known as QUIKMap.
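The summary step DATSUM performs can be pictured as follows. This is a simplified sketch rather than the actual Excel macros: the field names follow Appendix A, -99.99 is the missing-value flag used by the real files, but the annual runoff values shown are illustrative numbers, not actual WSC data.

```python
import math

def summarize_station(key, name, area_km2, annual_runoffs):
    """Reduce a station's annual mean runoffs (m3/s) to the kind of
    record DATSUM writes for the GIS.  Field names follow Appendix A;
    -99.99 flags a value that cannot be computed."""
    n = len(annual_runoffs)
    if n == 0:
        return {"KEY": key, "STA_NAME": name, "M_A_RUNOFF": -99.99,
                "UA_MAR": -99.99, "CV_RUNOFF": -99.99,
                "AREA": area_km2, "COUNT": 0}
    mar = sum(annual_runoffs) / n                      # mean annual runoff
    if n > 1:
        std = math.sqrt(sum((q - mar) ** 2 for q in annual_runoffs) / (n - 1))
    else:
        std = 0.0
    cv = std / mar if mar else -99.99                  # coefficient of variation
    ua_mar = mar / area_km2 if area_km2 else -99.99    # unit-area MAR (m3/s/km2)
    return {"KEY": key, "STA_NAME": name,
            "M_A_RUNOFF": round(mar, 1), "UA_MAR": round(ua_mar, 3),
            "CV_RUNOFF": round(cv, 2), "AREA": area_km2, "COUNT": n}

# Illustrative annual runoffs (hypothetical values) for one station
rec = summarize_station("08JA003", "WHITESAIL RIVER NEAR OOTSA LAKE",
                        1010, [30.1, 35.6, 32.2, 38.0, 31.1])
```

One such record per station, written as a dBASE III table, is what QUIKMap then reads and plots.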
The data is displayed in a manner designed to aid hydrologists in using the system to perform regional analyses. In contrast to most hydrologic GIS applications, this system is very simple to operate and does not require the skills of a specialized GIS operator. Use of the DATSUM/QUIKMap system will facilitate water related studies in British Columbia and, potentially, all of Canada.

The DATSUM/QUIKMap system, when combined with the FLOOD program, creates an application which offers many advantages over traditional regional flood frequency techniques. For instance, it requires the user to provide considerable judgement and input to the analysis process, which forces the user to be aware of the system's limitations and the relative validity of results. In addition, it is a quick and simple technique which does not involve the complicated mathematics associated with regression analysis. As well, it provides low, probable and high estimates of flood flows and permits the use of additional information to update estimates.

The DATSUM program was specifically designed to create streamflow data summary files for WSC regions in B.C., but as WSC is a federal organization, and streamflow data files are produced in a standard format for the whole country, the program should operate equally well on data for any province. Thus, a user need only obtain a digitized map of his/her specific area of interest, which is compatible with the QUIKMap GIS software, and a streamflow data analysis and summary system could be created for any area of the country.

Currently, the availability of electronic databases and digitized maps in Canada is rather limited, and this is restricting the development and use of GIS related applications in hydrology and other disciplines. However, in British Columbia, there is an extensive mapping program underway to produce a digital terrain model of the entire province. By 1996, over 7000 digitized maps, at a scale of 1:20 000, will provide planimetric and topographic data.
These maps will be compatible with most commercially available GIS packages, which will serve to dramatically facilitate the use of GIS systems in B.C.

As with any software system, the DATSUM/QUIKMap system is far from being developed to its fullest potential. Computer technology is changing so rapidly that most software and hardware systems are almost obsolete by the time they are produced. This system underwent numerous revisions during its development, and there is still opportunity for improvement. For instance, the CD-ROM system is only able to export one named data file at a time, rather than a batch of files. This is an extremely limiting feature when one wishes to retrieve hundreds of data files.

Another area needing improvement is the interaction capabilities of the different components of the system. The current system requires the user to operate three independent software systems to access, manipulate and view data. Unfortunately, the WSC CD-ROM software does not have the capability to interact with other software packages, so raw data must be extracted from the CD-ROM, exported to the DATSUM program for analysis, then saved to a directory which is accessed by the GIS. The system is not particularly efficient by today's software standards, and it requires considerable file management input by the user, but it is still a major improvement over currently available options.
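Some of this file shuffling could, in principle, be scripted. A hypothetical sketch follows; the directory layout and file-name pattern are assumptions for illustration, not part of the actual system.

```python
import shutil
from pathlib import Path

def collect_exports(export_dir, gis_dir, pattern="08*.txt"):
    """Sweep files that the CD-ROM software has exported one at a time
    (hypothetical locations and naming pattern) into the directory read
    by the GIS, sparing the user the per-file copying described above."""
    export_dir, gis_dir = Path(export_dir), Path(gis_dir)
    gis_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(export_dir.glob(pattern)):
        shutil.move(str(f), str(gis_dir / f.name))
        moved.append(f.name)
    return moved
```

Even this small amount of automation would remove one of the three manual hand-offs between the CD-ROM software, DATSUM and QUIKMap.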
Ideally, the system should be an automated package which is able to access, manipulate and display data with minimal user involvement; perhaps, as technology advances, such a system could be created.

A possible future expansion of the current system would be to modify the DATSUM program to generate hydrographs, either for daily or monthly flow records. The QUIKMap system has the capability to display "picture" files stored in the database, and the ability to view hydrographs would be useful for regional analyses. In a similar fashion, it is conceivable that unit hydrographs, as well as numerous other rainfall parameters, could be generated and displayed if a digitized rainfall database was available. At present there is no such database produced in B.C., but as rainfall records are currently kept at many locations throughout the province, it seems reasonable to assume that an electronic database similar to the WSC streamflow database will be made available in the near future.

However, the system, as it currently stands, could be used to aid in the generation of synthetic unit hydrographs for ungauged basins. Synthetic hydrographs can be generated by relating the physical geometry of a gauged area to its hydrograph, and then comparing the basin's characteristics with those of an ungauged area. If basin parameters additional to those currently displayed by the system are required, the DATSUM program could be easily modified to produce the desired values.

Another possible future application of this system would be to integrate it with an expert system. An expert system is a computer program designed to mimic the decision making or problem solving processes of a human expert. These systems are designed to deal with uncertainty and have enormous potential, as they can be used to capture expertise.
As estimating streamflows is such an inexact science, due to the enormous number of variables involved and the uncertain nature of many of those variables, an expert system could aid engineers and hydrologists in making estimates and supporting decisions.

Some GIS/expert system applications have been created which generate input parameters for hydrologic models by combining the data handling capabilities of a GIS with the "mature hydrologic expertise" of an expert system to optimize outputs. In a similar fashion, an expert system could be created to operate in conjunction with the DATSUM/QUIKMap system. The expert system would utilize a number of heuristics to guide the user through the regional analysis procedure. Regional analysis relies heavily on human judgement, and it would be useful to have a readily available "second opinion".

Regardless of how sophisticated a system may become, it is important to leave the final decision making or estimating processes in the hands of the user. Since data is never complete, judgement is always required, and this can only be provided by humans. Thus, systems should be designed to assist humans, not replace them. A program which attempts to make estimates of streamflow values could be hazardous, as people are inclined to believe computer outputs with blind faith. Geographic information systems are particularly dangerous as they are such complex systems which handle large volumes of data and produce impressive, seemingly accurate maps and data. Although it is tempting to simply accept such output values, the user must take care to understand how the systems work so that expected ranges of values can be anticipated in order to assess the quality of results.

The system developed for this thesis permits streamflow data to be organized and presented in a user-friendly environment on personal computers, so as to facilitate the process of regional analysis, as well as numerous other streamflow oriented procedures.
Water related professionals, such as hydrologists or biologists, will find it to be a valuable tool for obtaining information, and when used in conjunction with other applications, such as the FLOOD program, new levels of results should be attained. Finally, with advancing technology, this system could be refined and expanded to create new applications for the enhancement of water related studies throughout Canada.

BIBLIOGRAPHY

Axys Software, QUIKMap - Version 2.51: User's Guide, Sidney, B.C., Canada, 1992.

Balser, R., "Terrain Resource Information Management (TRIM): A Standardized Geo-referencing Database for the Province of British Columbia." In GIS '89: A Wider Perspective, Proceedings of a Symposium, Vancouver, B.C., March 1989, 23-28.

Berry, J. K., "Computer-Assisted Map Analysis: Potential and Pitfalls" in Photogrammetric Engineering and Remote Sensing, Vol. 53, 1988, 1405-1410.

Crippen Consultants, Microhydro - Guidance Manual of Procedures for Assessment of Micro Hydro Potential, Report ER80-9E, produced for Energy, Mines and Resources Canada, Vancouver, Canada, October 1980.

Energy, Mines and Resources Canada - Efficiency and Alternate Energy Technology Branch, Canadian Small Hydropower Handbook: British Columbia Region, Ottawa, Canada, June 1989.

Environment Canada, Hydat User's Manual (CD-ROM Version 3.0), Ottawa, Canada, 1990.

Fisher, G. T., "Geographic Information System/Watershed Model Interface" in Hydraulic Engineering, ed. M. Ports, ASCE, 1989, 851-856.

Kumar, S., and J. A. Meech, A Hypermanual on Expert Systems, produced for Energy, Mines and Resources Canada, Mineral Sciences Laboratories, CANMET, Ottawa, Canada, 1992.

Lamke, R. D., Flood Characteristics of Alaskan Streams, produced for United States Department of the Interior Geological Survey, Report 78-129, Anchorage, Alaska, 1979.

Linsley, R. K., M. A. Kohler and J. L. Paulhus, Hydrology for Engineers, 3d. ed., Toronto: McGraw-Hill Book Company, 1982.

Marble, D. F., "Geographic Information Systems: An Overview."
In Pecora 9 Proceedings: Spatial Information Technologies for Remote Sensing Today and Tomorrow, October 1984, 18-24.

Microsoft Corporation, Microsoft Excel User's Guide, Redmond, WA, USA, 1991.

Microsoft Corporation, Microsoft Excel Function Reference, Redmond, WA, USA, 1991.

Moeller, R. A., "Application of a Geographic Information System to Hydrologic Modeling Using HEC-1" in Civil Engineering Applications of Remote Sensing and Geographic Information Systems, ed. D. B. Stafford, ASCE, 1991, 269-277.

Muzik, I., "Application of a GIS to SCS Procedure for Designing Flood Hydrographs." In Proceedings of the International Symposium on Modeling in Agricultural, Forest, and Rangeland Hydrology, Chicago 1988, American Society of Agricultural Engineers, Michigan, 494-500.

Ontario Ministry of Energy, Small Hydro Engineering Manual, 1988.

Russell, S. O., "Flood Probability Estimation," Journal of the Hydraulics Division, ASCE, Vol. 108, No. HY1, 1981.

Russell, S. O., "Estimating Flows From Limited Data," University of British Columbia, Civil Engineering Department, May, 1991.

Stuebe, M. M. and D. M. Johnston, "Runoff Volume Estimation Using GIS Techniques", Water Resources Bulletin, Vol. 26, No. 4, August, 1990, 611-620.

VanBlarrgan, E. J., R. M. Ragan and J. S. Schaake, "Hydrologic Geographic Information System", Transportation Research Record, No. 1261, 1990, 44-51.

VanBlarrgan, E. J., R. M. Ragan and J. S. Schaake, "Automated Estimation of Hydrologic Parameters" in Civil Engineering Applications of Remote Sensing and Geographic Information Systems, ed. D. B. Stafford, ASCE, 1991, 278-286.

Watt, W. E., et al., Hydrology of Floods In Canada: A Guide to Planning and Design, National Research Council of Canada, 1989.

Wolfe, M. L. and C. M. Neale, "Input Data Development for a Distributed Parameter Hydrologic Model (FESHM)."
In Proceedings of the International Symposium on Modeling in Agricultural, Forest, and Rangeland Hydrology, Chicago 1988, American Society of Agricultural Engineers, Michigan, 462-463.

Zhang, H., C. T. Haan and D. L. Nofziger, "Hydrologic Modeling With GIS: An Overview," in Applied Engineering in Agriculture, ASCE, Vol. 6, No. 4, July 90, 453-458.

APPENDIX A
DATSUM PROGRAM DETAILS

Appendix A.1 - DATSUM Summary File

A.1.1 Fields of the DATSUM Summary Files

This is a brief description of the different fields in the streamflow data summary files, which are designed to be read by the QUIKMAP software.

KEY: This is the streamflow station identification number as given by Water Survey of Canada. This serves as the file identifier in the QUIKMAP nomenclature.

STA_NAME: This is the streamflow station name as given by Water Survey of Canada.

M_A_RUNOFF: Mean of the mean annual runoffs (m3/s). The mean annual runoffs are calculated from mean monthly flow data.

UA_MAR: Unit area mean of the mean annual runoffs (m3/s/km2).

CV_RUNOFF: Coefficient of Variation (standard deviation / mean) of the mean annual runoffs.

AREA: Rainfall catchment area (km2).

YEARS: Time span over which streamflow records were kept.
Records were not necessarily kept for every year during this time span.

COUNT: Number of years of streamflow record.

COMP_COUNT: Number of complete years of streamflow record.

M_RATIO: Mean of the annual ratios of max instantaneous flow to max daily flow.

CV_RATIO: Coefficient of Variation of the annual ratios of max instantaneous flow to max daily flow.

FLOOD_COUNT: Number of complete years of extreme flow record.

MN_LAT_DEG: Streamflow station coordinate - latitude degrees.
MN_LAT_MIN: Streamflow station coordinate - latitude minutes.
MN_LAT_SEC: Streamflow station coordinate - latitude seconds.
MN_LON_DEG: Streamflow station coordinate - longitude degrees.
MN_LON_MIN: Streamflow station coordinate - longitude minutes.
MN_LON_SEC: Streamflow station coordinate - longitude seconds.

LBL_TEXT: Text to appear on the QUIKMAP screen beside the symbol.

DATA_TYPE: Specifies whether the data is symbol or polyline data.

LBL_SIZE: Specifies the size of the LBL_TEXT message.

SYM_LN_TYP: Contains the numeric code for the symbol or polyline type to be displayed.

SIZE_THICK: Specifies the symbol size or the polyline line thickness.

DATA_COLOR: Specifies the symbol colour by numeric code.

HATCH_PATT: Defines the hatch pattern used to fill the symbol or polyline.

OPERATION: Field used to specify different forms of presentation of the data. For example, "B" specifies boxing the LBL_TEXT, while "I" specifies that the data be ignored.

M_MAXDAILY: Mean of the annual max daily flows (m3/s).

CV_MAXDAILY: Coefficient of Variation of the max daily flows.

CC_MAXDtoMAXI: Correlation Coefficient between max daily flows and the ratios of max instantaneous flow to max daily flow.

F_P_INDEX: Flow Pattern Index.
This is the coefficient of variation of meanmonthly flows and indicates the degree of change of flow values, from month tomonth.UA_MMD : Unit area mean of the max daily flows (m3/s/km2).61A.1.2 Example Summary File - Region 08JAKEY STA NAME M_A_RUNOFF UA_MAR08JA001 NECHAKO RIVER AT FORT FRASER -999.9 -9.99008JA002 OOTSA RIVER AT OOTSA LAKE 99.2 0.02208JA003 WHITESAIL RIVER NEAR OOTSA LAKE 33.4 0.03308JA004 TETACHUCK RIVER NEAR OOTSA LAKE 65.3 0.02008JA005 TAHTSA RIVER NEAR OOTSA LAKE 51.4 0.03608JA006 TAHTSA RIVER AT OUTLET OF TAHTSA LAKE 26.3 0.04608JA007 NECHAKO RIVER AT OUTLET OF NATALKUZ LAKE -999.9 -9.99008JA009 CHESLATTA RIVER NEAR OOTSA LAKE -999.9 -9.99008JA010 NECHAKO RIVER BELOW BIG BEND CREEK 176.0 0.01308JA011 NECHAKO RIVER NEAR OOTSA LAKE -999.9 -9.99008JA013 SKINS LAKE SPILLWAY, NECHAKO RESERVOIR 106.3 -9.99008JA014 VAN TINE CREEK NEAR THE MOUTH 0.9 0.00608JA015 LAVENTIE CREEK NEAR THE MOUTH 5.0 0.05808JA016 MACIVOR CREEK NEAR THE MOUTH 1.0 0.01808JA017 NECHAKO RIVER BELOW CHESLATTA FALLS 61.7 0.004CV_RUNOFF AREA YEARS COUNT COMP_COUNT M_RATIO CV_RATIO-99.99 17700 1915-1954 29 0 -99.99 -99.990.13 4450 1929-1952 24 18 -99.99 -99.990.12 1010 1930-1952 23 6 1.01 0.020.10 3320 1930-1952 23 5 1.02 0.02-99.99 1420 1930-1952 23 1 1.02 0.02-99.99 578 1930-1953 13 1 -99.99 -99.99-99.99 12100 1932-1933 2 0 -99.99 -99.99-99.99 1580 1950-1951 2 0 -99.99 -99.99-99.99 13900 1950-1952 3 1 1.01 0.00-99.99 1949-1950 2 0 -99.99 -99.990.58 1955-1990 36 35 -99.99 -99.990.52 153 1974-1990 17 16 1.16 0.140.12 86.5 1976-1990 15 14 1.81 0.340.21 53.4 1976-1990 15 11 1.22 0.100.12 15600 1980-1990 11 10 1.01 0.0162FLOOD_COUNT MN_LAT_DEG MN_LAT_MIN MN_LAT_SEC MN_LON_DEG0 54 3 18 1240 53 37 30 12517 53 38 20 12617 53 21 12 12517 53 45 20 1260 53 41 0 1270 53 25 40 1250 53 38 36 1242 53 33 50 1240 53 26 40 1250 53 46 15 12514 53 15 30 12514 53 39 9 12710 53 48 7 12610 53 41 2 124MN_LON_MIN MN_LON_SEC LBL_TEXTI DATA_ TYPE33 39 08JA001/nINSUFFICIENT DATA S44 0 
08JA002/nMAR=0.022/nFPI=0.79 S44 56 08JA003/nMAR=0.033/nFPI=013 S34 10 08JA004/nMAR=0.020/nFPI=0.5 S42 0 08JA005/nINSUFFICIENT DATA S14 0 08JA006/nINSUFFICIENT DATA S6 40 08JA007/nINSUFFICIENT DATA S58 30 08JA009/nINSUFFICIENT DATA S51 36 08JA010/nINSUFFICIENT DATA S1 40 08JA011/nINSUFFICIENT DATA S58 14 08JA013/nMAR=-9.99/nFPI=0.36 S24 50 08JA014/nMAR=0.006/nFPI=1.22 S32 13 08JA015/nMAR=0.058/nFP1=1 S21 0 08JA016/nMAR=0.018/nFPI=1.2 S50 8 08JA017/nMAR=0.004/nFP1=0.73 S63LBL_SIZE SYM_LN_TYP SIZE_THICK DATA_COLOR HATCH_PATT OPERATION2 11 15 10 0 B2 1 9 11 0 B2 2 9 11 0 B2 1 9 11 0 B2 2 9 11 082 2 7 9 0 B2 11 13 13 0 B2 11 9 11 0 B2 1 13 13 0 B2 11 3 6 082 11 3 6 1 B2 1 5 14 1 B2 3 3 6 1 B2 1 3 6 1 B2 1 15 10 1 BM_MAXDAILY CV_MAXDAILY CC_MAXDtoMAXI F_P_INDEX UA_MMD-99.99 -99.99 -99.99 0.66 -99.990-99.99 -99.99 -99.99 0.79 -99.990109.56 0.16 -0.02 0.73 0.108155.47 0.18 0.07 0.50 0.047214.47 0.21 0.30 0.72 0.151-99.99 -99.99 -99.99 0.90 -99.990-99.99 -99.99 -99.99 0.22 -99.990-99.99 -99.99 -99.99 0.54 -99.990517.00 0.02 -1.00 0.81 0.037-99.99 -99.99 -99.99 0.52 -99.990-99.99 -99.99 -99.99 0.36 -99.99011.35 0.54 0.34 1.22 0.07451.54 0.68 0.14 1.00 0.5967.12 0.25 -0.15 1.20 0.133270.60 0.16 -0.07 0.73 0.01764Appendix A.2 - Macros: Names and FunctionsThis is a brief description of the Excel macros used to create the DATSUMprogram. The macros are found in eight different macrosheets, and they are listedbelow in the order in which they appear in each macrosheet. The group of macrosunder the heading "Command Macros" appear twice in the list. They areemphasized because the user is able to invoke them independent of the running ofthe DATSUM program. 
These macros generally provide quick and simple ways to manage files.

Command Macros: These are macros which can be manually invoked by striking the key combination shown beside each macro name.

RUN DATSUM  <CTRL><d>
Master program macro which opens and closes files as needed and orchestrates all the other macros.

OPEN  <CTRL><o>
Opens all macro files necessary for the creation of streamflow data summary files.

CLOSE  <CTRL><c>
Closes all the DATSUM macros without saving any of the changes. Useful for exiting the program.

SAVEFILE  <CTRL><a>
Closes the file and prompts whether you want to save changes.

SNGL_FLOW_FILE  <CTRL><S>
Formats an individual mean flow file without sending data to a summary file. This is the macro that should be invoked if one wishes to look at a single streamflow data file.

SNGL_RATIO_FILE  <CTRL><s>
Formats an individual max flow data file without sending data to a summary file.

MACRO1A.XLM: Small macros that perform very simple and specialized functions related to formatting the individual streamflow data files.

PARSE1, PARSE2, PARSE3, PARSE4
Parse different lines of the raw data files to put them into a format which is easily manipulated with EXCEL commands.

average
Selects a column of numbers and calculates the average value (i.e. mean of the mean annual flows).

stdev
Selects a column of numbers and calculates the coefficient of variation.
(i.e. coefficient of variation of the mean annual flows).

parseloop
Parses a selected column of cells.

COUNTER
Counts the number of cells in a column which contain values.

MOVEDATA
Creates a summary line of data at the bottom of the file.

SUMDATA
Copies the summary line of data created by the MOVEDATA macro to the summary file.

SAVEFILE  <CTRL><a>
Closes the file and prompts whether you want to save changes.

EXTRADATA
Checks the file to see whether there is a second page of data below the lines referencing latitude and longitude, and if so, appends it at the appropriate place.

SAVEOLD
Closes the file without saving any changes.

NO_REGULATED
Checks the mean annual flow values in the data file and removes any letters attached to any values, such as "R" for regulated or "E" for estimated.

NO_REGULATED2
Checks the mean monthly flow values in the data file and removes any letters attached to any values, such as "R" for regulated or "E" for estimated.

QUIKDATA.XLM: Macros that specify what sort of symbols will represent each streamflow data file.

HATCH
Specifies the hatch pattern for each symbol. Current stations have a solid hatch pattern while discontinued stations have a blank hatch pattern.

SHAPE
Specifies symbol shapes depending on mean of mean annual runoff values.

SYMSIZE
Specifies symbol size and colour depending on catchment area.

OPERATION
Specifies the operation code. In this case, "B" is specified to indicate boxed labels. See page 11-11 in the Quikmap manual.

DATAFILL
Invokes a number of QUIKDATA macros.

ALIGNMENT
Specifies left alignment for all columns.

SYMBOL
Specifies that each file be represented by a symbol.

DATABASE1
Marks all columns of data as a database so that the file can be formatted as a dBASE III file.

SHOW_TEXT
Creates the LBL_TEXT data in the summary file.

YES_DATA
Macro invoked by the SHOW_TEXT macro if MAR and CV data have been calculated for a specific station. Creates a data line which contains the station name, MAR and CV.
This data line is formatted so that it can be read as a label by the QUIKMAP software.

NO_DATA
Macro invoked by the SHOW_TEXT macro if MAR and CV data have not been calculated for a specific station. Creates a data line which contains the station name and a message indicating that no values have been calculated. This data line is formatted so that it can be read as a label by the QUIKMAP software.

ROUND_NUM
Rounds off MAR and CV values to desired significant figures, permitting the values to be used by the YES_DATA macro. (Otherwise, the figures are too long and the YES_DATA macro has difficulty handling them.)

STATION_MOVE
Copies the station names to another area of the summary spreadsheet so that this data can be used by the YES_DATA and NO_DATA macros.

FORMAT_RATNUM
Formats the ratio numbers to two decimal places.

CLEAR_DATA
Clears data which was used to format other data, but which is not needed in the final summary file.

LATLONG_CHECK
Checks the latitude and longitude values in the data summary file, and places a zero in the cell if the cell is blank.

INDEX.XLM: Macros that take data from the index files and copy it to the summary files.

INDEXX
Parses data lines and then selects data and copies it to a summary file.

PARSELOOP
Parses the data in the index file.

SENDNAME
Sends station name and number to the data summary file.

SENDYEAR
Sends years of record and count to the data summary file.
That is, it checks to see whether all the streamflow files have been processed and had their information sent to the data summary file.

EXTRAINDEX
Checks the file to see if there is a second page of data below the lines referencing latitude and longitude, and if so, appends it at the appropriate place.

FORMAT_NUMBER
Formats the unit area mean annual runoff values to have 3 decimal places, the CV values to have 2 decimal places, and the MAR values to have 1 decimal place.

FLOW.XLM : Macros that link other macros together to format streamflow data files.

FLOW
Links many of the other macros together to parse a raw data file, perform a number of calculations, summarize information at the bottom of the file, send the data summary to a summary file, and then exit the file without saving any changes.

SNGL_FLOW_FILE    <CTRL> <S>
Performs the same functions as the FLOW macro, without sending the data to a summary file or exiting the file. This is the macro that should be invoked if one wishes to look at a single streamflow data file.

RATIO.XLM : Macros involved in formatting streamflow max flow files.

RATIO
Parses raw data max flow files and enacts the COUNT_RAT macro.

PARSE5, PARSE6
Parse different lines of the raw data files to put them into a format which is easily manipulated with EXCEL commands.

CLEAN_UP
Removes lines of record which are incomplete.

EXTRARATIO
Checks the file to see whether there is a second page of data below the line referencing data type, and if so, appends it at the appropriate place.

RATIO_CALC
Calculates the ratio of max instantaneous flow to max daily flow for every year of complete record.

SUM_CALC
Calculates the mean, standard deviation, and coefficient of variation of the annual max daily flows and of the ratios of max annual instantaneous to max annual daily flows.
Also calculates the correlation coefficient between the max daily flows and the ratios of max instantaneous to max daily flows.

MOVE_RAT
Creates a summary line of data at the bottom of the file.

SUM_RAT
Copies the summary line of data created by the MOVE_RAT macro to the summary file.

COUNT_RAT
Counts the number of complete years of max flow record, and performs one task (macro YES_RATIO) if there is at least one year of complete record, and performs another task (macro NO_RATIO) if there is not.

YES_RATIO
If the macro COUNT_RAT indicates that there are complete years of record for the current station, then this macro links many other macros together to format the file, perform statistical calculations, and copy the results to a summary file.

NO_RATIO
If the macro COUNT_RAT indicates that there are NO complete years of record for the current station, then this macro places the number -9999 in the appropriate columns of the summary file.

SNGL_RATIO_FILE    <CTRL> <s>
Formats an individual max flow data file without sending data to a summary file.

ONE_RATIO
If the macro COUNT_RAT indicates that there is only one complete year of record for the current station, then this macro calculates the ratio for that year and places it in the M_RATIO column of the summary file and -99.99 in the CV_RATIO column.

CALC_UA_MEAN
In the summary file, this macro calculates the unit area mean of the annual max daily flows, if the catchment area is known. Otherwise, it places the value -99.99 in the appropriate column. Note that the value -99.99 is already in place if there are no complete years of data (see SUM_RAT and ONE_RATIO).

SETUP.XLM : Macros involved in setting up the other macros so that they will operate for a specific data file or group of files.

MASTER_FLOW
Master macro that opens individual streamflow data files and invokes the FLOW macro. This is the macro that is used to create the flow data for the data summary files.

MASTER_RATIO
Master macro that opens individual max flow data files and invokes the RATIO macro.
This is the macro that is used to create the ratio data for the data summary files.

FILEOPEN_SETUP
Copies all the station names from the appropriate index file to the SETUP.XLM macro sheet so that the MASTER_FLOW and MASTER_RATIO macros can be configured to open the appropriate files.

RENAME_FILE
Changes the file references on the SETUP.XLM macro sheet.

BEGIN_SETUP
Sets up the macro sheet SETUP.XLM so that the macros FILEOPEN_SETUP and RENAME_FILE can operate. Also sets up references in the DATSUM.XLM macrosheet.

MASTER_FILE_SETUP
Master command macro which invokes the BEGIN_SETUP, RENAME_FILE and FILEOPEN_SETUP macros. Once the MASTER_FILE_SETUP macro has been run, all the appropriate macro sheets should have been configured correctly so that the MASTER_FLOW and MASTER_RATIO macros operate in the desired fashion.

MASTER_FILE_CREATE
Master command macro which invokes the MASTER_FLOW and MASTER_RATIO macros. This macro should only be run after the macro sheets have been properly configured by the MASTER_FILE_SETUP macro.

MASTER_MASTER
Master command macro which invokes all the appropriate macros so as to create a complete summary file.

OPEN.XLM : Macros designed to open and close various combinations of files.

OPEN    <CTRL> <o>
Opens all macro files necessary for the creation of streamflow data summary files.

CLOSE    <CTRL> <c>
Closes all the DATSUM macrosheets, except OPEN.XLM, without saving any of the changes.

DATSUM.XLM : Macros designed to control the other macros.

RUN_DATSUM
Master program macro which opens and closes files as needed and orchestrates all the other macros.

Figure A.2 - Flow Chart Illustrating DATSUM Macro Relationships (3 pages)
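The master macros described above follow a simple batch pattern: build the list of station files for a region, then process each station's flow file and max flow file and append one row per station to a single summary table. The sketch below mirrors that pattern in Python; the function and parameter names are hypothetical stand-ins for the Excel macros, not part of DATSUM itself.

```python
# Illustrative sketch only: the MASTER_FILE_CREATE pattern, with
# hypothetical Python names standing in for the Excel macros.
def run_datsum(station_files, process_flow, process_ratio):
    """station_files: list of (flow_file, max_flow_file) pairs, one per station."""
    summary = []
    for flow_file, ratio_file in station_files:
        row = {"flow_file": flow_file}
        row.update(process_flow(flow_file))    # e.g. MAR, SD, CV, years of record
        row.update(process_ratio(ratio_file))  # e.g. mean and CV of max-flow ratios
        summary.append(row)
    return summary
```

Each per-station processor plays the role the FLOW and RATIO macros play in DATSUM, and the returned list corresponds to the rows DATSUM accumulates in the summary file.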
Appendix A.3 - Programming Code

Macrosheet : MACRO1A.XLM
[Code listings for PARSE1, PARSE2, PARSE3, PARSE4, average, stdev, parseloop, COUNTER, MOVEDATA and SUMDATA]
[Code listings for the remaining MACRO1A.XLM macros: SAVEFILE, EXTRADATA, SAVEOLD, NO_REGULATED and NO_REGULATED2]

Macrosheet : QUIKDATA.XLM
[Code listings for HATCH, SHAPE, SYMSIZE, OPERATION, DATAFILL, ALIGNMENT, SYMBOL, DATABASE1, SHOW_TEXT, YES_DATA, NO_DATA, ROUND_NUM, STATION_MOVE, FORMAT_RATNUM, CLEAR_DATA and LATLONG_CHECK]

Macrosheet : INDEX.XLM
[Code listings for INDEXX, PARSELOOP, SENDNAME, SENDYEAR, COUNTCHECK, EXTRAINDEX and FORMAT_NUMBER]

Macrosheet : FLOW.XLM
[Code listings for FLOW and SNGL_FLOW_FILE]

Macrosheet : RATIO.XLM
[Code listings for RATIO, PARSE5, PARSE6, CLEAN_UP, EXTRARATIO, RATIO_CALC, SUM_CALC, MOVE_RAT, SUM_RAT, COUNT_RAT, YES_RATIO, NO_RATIO, SNGL_RATIO_FILE, ONE_RATIO and CALC_UA_MEAN]

Macrosheet : SETUP.XLM
[Code listings for MASTER_FLOW, MASTER_RATIO, FILEOPEN_SETUP, RENAME_FILE, BEGIN_SETUP, TEMPDATA_INPUT, MASTER_FILE_SETUP, MASTER_FILE_CREATE and MASTER_MASTER]

Macrosheet : OPEN.XLM
[Code listings for OPEN and CLOSE]
UM.XLM")=SAVE.AS(,0)=CLOSE(FALSE)=RETURN()114Macrosheet : DATSUM.XLMRUN_DATSUM=IF(ALERT('Welcome to the DATSUM program. This program creates data summary files for WSC streamfiow regions. I trus=ALERT('The program will prompt you for input as required. If you make a mistake and mess things up, don't panic, just close=OPEN(CAQUIKMARFLOWDATA\SUMMARY\SUMBASE.XLS')=RUN(SETUP.XLMIMASTER_MASTER')=ACTIVATE(SETUP.XLM')=SAVE.AS(,0)=CLOSE(FALSE)=OPEN(CAQUIKMARFLOWDATA\MACRO\SETUP.XLM")=ACTIVATE("SUMBASE.XLS')=SAVE.AS(CAQUIKMAP\FLOWDATA\SUMMARY■SUM08EB.XLS',1)=SAVE.AS(CAQUIKMARFL0WDATA\SUMMARY1SUMO8EB.DBP,8)=CLOSE(FALSE)=SELECT('R2C2')=FORMULA(INPUT('The results have been saved in a file named SUM08EB.XLS. Do you wish to run the program again? An=1F(LEN(ACTIVE.CELL())=3,GOTO(R[-12]C),GOTO(R[1]C))=ALERT("You have chosen to exit the program. All of the summary files you created have been saved in directory CAQUIKMA=ACTIVATE("SETUP.XLM')=SAVE.AS(,0)=CLOSE(FALSE)=ACTIVATErFLOW.XLM1=SAVE.AS(,0)=CLOSE(FALSE)=ACT1VATE("INDEX.XLM")=SAVE.AS(,0)=CLOSE(FALSE)=ACTIVATE("QUIKDATA.XLM")=SAVE.AS(,0)=CLOSE(FALSE)=ACTIVATE('MACRO1A.XLM')=SAVE.AS(,0)=CLOSE(FALSE)=ACTIVATE("RATIO.XLM“)=SAVE.AS(,0)=CLOSE(FALSE)=RETURN()115Appendix A.4 - Installing Customized QUIKMap FilesThe QUIKMap program contains a number of files which can be modified tocustomize different applications. Basically, the majority of these files influence theway in which data is presented. The following is a list of QUIKMap files which havebeen created to be compatible with the DATSUM summary files. 
These files should be loaded into the C:\QUIKMAP\FLOWDATA\SUMMARY directory before the QUIKMap software is operated.

BASEMAPS.QCK : Menu file which allows the user to select a basemap from a number of choices.

DATAFILE.QCK : Menu file which allows the user to select a WSC streamflow summary file (files created by DATSUM).

NE_STAT.DBF, NW_STAT.DBF, CE_STAT.DBF, CW_STAT.DBF, SE_STAT.DBF, SW_STAT.DBF, ALL_STAT.DBF : Database files which contain the names and locations of every WSC streamflow region in British Columbia.

LEGEND.LGD : File which contains a legend for the DATSUM summary files. This file can be invoked on the QUIKMap screen by pressing the keys <ALT><L>.

SUM****.EDT : This file controls how a DATSUM summary file appears on the QUIKMap screen. A different .EDT file needs to be created for every summary file, and the .EDT file must have the same name as the summary file with which it is associated. For example, the data file SUM08ED.DBF needs an .EDT file named SUM08ED.EDT. If the .EDT file is omitted, the data will appear on the screen in standard QUIKMap form. If many data files share the same format, their .EDT files can be created by simply copying a model .EDT file and renaming the copies appropriately.

APPENDIX B
FLOOD PROGRAM DETAILS

Appendix B.1 - FLOOD Program Organization

As explained in the text, the FLOOD program was originally developed in the Pascal programming language, and has been rewritten for this thesis in the Excel macro language. It is assumed that users of this new version of FLOOD will be familiar with Excel; if this is not the case, the Excel User's Manual (Microsoft Corporation, 1991) should be consulted.

The Excel version of FLOOD consists of two worksheets, FLOOD.XLS and SUMMARY.XLS, and two macrosheets, DIALOG1.XLM and CALCS.XLM. The sheet FLOOD.XLS is a large spreadsheet on which numerous calculations are made, based on the user's input and resulting instructions from the macros.
The sheet SUMMARY.XLS is a blank file into which the results of an analysis are exported. The sheet DIALOG1.XLM creates dialog boxes on the screen which permit the user to input values. Finally, the sheet CALCS.XLM contains twenty-five macros which perform various functions and which, when run collectively, operate the program. A listing of the various macros and a description of their functions is given in Appendix B.3, as is a flowchart which illustrates the inter-macro relationships (Figure B.3). A copy of the programming code is given in Appendix B.4. Some of the code is not shown in its entirety, but in general, the missing code is text which is not significant to understanding the programming.

Appendix B.2 - Instructions for Use

The following text presents instructions for installing and operating the FLOOD program. The program has been designed to operate in the Microsoft Excel software environment, although with modification it may run in the Lotus 1-2-3 environment. If the instructions are followed carefully, the program should run without incident. However, it should be noted that the program is not very robust, in that it has not been written with many safeguards against user errors, and it could potentially be destroyed by improper use. Therefore, it is highly recommended that a back-up copy of the program always be kept on hand, so that if the user inadvertently destroys the program, it can easily be restored to its original state.

B.2.1 Installing the Program

Install the FLOOD program by copying the files FLOOD.XLS, CALCS.XLM, SUMMARY.XLS, and DIALOG1.XLM into a directory in the C:\ drive titled FLOOD (i.e., C:\FLOOD). This is imperative! The CALCS.XLM macro has many references to files in this directory, so the program will not run properly if the files are loaded into a different directory, unless the file references are changed appropriately.

B.2.2 Running the Program

Enter the Microsoft Excel environment. Open the files CALCS.XLM and DIALOG1.XLM.
Now open the file FLOOD.XLS. Loading this file will automatically invoke the macro calcs.xlm!Master_Master, which will run the flood program and prompt the user for the appropriate inputs. The two dialog boxes, shown in Figures B.2.1 and B.2.2, will appear on the screen to allow the user to enter data.

Figure B.2.1 - Initial Input Dialog Box (fields for the low, probable and high estimates of the mean and its coefficient of variation; the optional addition or multiplication adjustment factor and its correlation coefficient; and the design return periods, either user-entered or the automatic set of 2, 5, 10, 20, 50, 100, 200, 500, 1000 and 1500 years)

Figure B.2.2 - Additional Flow Data Dialog Box (fields for a maximum of 10 data values: actual flood flows, floods exceeded m times in n years, and Mth largest floods in n years; a note on the form states that if the flows are adjusted, results for both adjusted and unadjusted flows will be given)

The user can move the cursor to the various input spaces in the dialog boxes by either using the mouse and clicking on the appropriate box, or by pressing the <Tab> key on the
The user must then specify the design return periods and is given theoption of entering individual values, or selecting a standard set of return periods,namely 2, 5, 10, 20, 50, 100, 200, 500, 1000 and 1500 years.The second dialog box, titled "Additional Flow Data", allows the user to enteradditional relevant data, which the program uses to update its flow estimates. Theuser is able to enter actual flood flows, and/or Mth largest floods in N years, and/orfloods exceeded M times in N years. A maximum of ten additional flow values ispermitted. One final option available to the user, is the ability to adjust flows by anaddition or multiplication factor. This can be useful, for instance, when one isinterested in knowing both max daily and max instantaneous flood flow estimates.In this case, the user would enter all the usual information about the max dailyflows, and then would select the multiplication factor option and enter low, probableand high estimates of that factor, as well as a correlation coefficient between thefactor and the max daily flows.These instructions may seem confusing, but should become clear afterreading Chapter 7, Sample Application. If the reader is still confused, then it issuggested that the reader refer to Dr. Denis Russell's paper, "Estimating Flows fromLimited Data"(Russell 1991).121Once the program has been run, it will open the SUMMARY.XLS file andsave the results to this file under a new file name, specified by the user. Theprogram will then close the FLOOD.XLS file without saving any of the changes, sothat the file is ready to run again.In order to run the program again, the user need just open the FLOOD.XLSfile and then follow the prompts. 
Note that the input values from the previous run ofthe program will appear in the dialog boxes, allowing the user to check hispreviously inputted values, and facilitating the running of "what-if' scenarios.The results of the analysis will be found in the CAFLOOD directory, underthe user specified file name. The file is in standard Excel worksheet format, andcan be manipulated with standard Excel commands. Note that if the user utilizedthe "Adjust Flows" option of the program, then the results will contain two sets ofnumbers, one adjusted and one unadjusted, and that the unadjusted values mayappear off the screen. Simply scroll page down in order to view these values.B.2.3 Exiting the Proaram Exit the program by closing all files in the usual manner, but make sure tonot save any of the changes.Notes 1. Dialog1.xlm is protected. To unprotect the file so that it can be edited, thepassword is DIALOG1.2. Cell R1C1 in file FLOOD.XLS is linked to the CALCS.XLM file, so that you mayexperience difficulties making changes to file references in file CALCS.XLM. Ifyou wish to eliminate the link, open file CALCS.XLM, click on the pull-downmenu Formula, select Define Name, and delete the entry Auto_Open_calcs.122Appendix B.3 - Macros: Names and FunctionsThis is a brief description of the Excel macros used to create the FLOODprogram. All of the following macros are found in macrosheet CALCS.XLM, andthey are given below in the order in which they appear in the macrosheet.MASTER_MASTER < CTRL> < m >Master macro which links all the other flood macros. The letters in brackets "< >"indicate the key strokes which can be used to invoke this command macro.MASTER_MASTER2Master macro which links all the other flood macros. 
This version is automatically invoked if the user tries to input more than 10 additional data values.

RETURN_PERIOD
Copies the return periods from the DIALOG1.XLM macrosheet and then calculates the low, probable and high return period flows for each return period.

INPUT_M_STDV
Copies the user's input estimates of the low, probable and high means and coefficients of variation from the DIALOG1.XLM macrosheet to the appropriate cells in the flood worksheet.

DIALOG_BOX
Invokes the dialog boxes which allow the user to input values.

COPY_NUM
Copies the 25 pairs of return period flows and associated probabilities to another area on the spreadsheet, where they will be ranked in ascending order according to flow. The data are copied from one of three areas, depending on the situation: from one area if additional flow data have been entered and the probabilities adjusted, from another if this is not the case, and from a third if the flows have been adjusted by an addition or multiplication factor.

SORT_NUM
Ranks the 25 pairs of return period flows and associated probabilities in ascending order according to flow.

CALC_LOW
Calculates the 10th percentile flow value.

CALC_HIGH
Calculates the 90th percentile flow value.

MOVE_NUM1
Copies the return periods and the low, probable and high estimates of return period flows to a summary area of the spreadsheet.

MASTER_CALC
Coordinates various macros to calculate the estimated low, probable and high return period flows for a number of specified return periods.

ACTUAL_DATA
Copies previously inputted actual data values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities.

MOVE_LIKELIHOOD1
Moves a column of actual data likelihood values to the likelihood summary area of the spreadsheet.

FLOOD_EXCEEDED
Copies previously inputted flood exceeded values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities.

MOVE_LIKELIHOOD2
Copies the updated probabilities
(by the flood exceeded macro) to another area of the spreadsheet.

Mth_LARGEST
Copies previously inputted Mth largest data values from the DIALOG1.XLM macrosheet to the FLOOD.XLS worksheet, and then uses these values to update probabilities.

MOVE_LIKELIHOOD3
Copies the updated probabilities (by the Mth largest macro) to another area of the spreadsheet.

MASTER_FLOOD
Uses additional flood data previously inputted by the user (actual data, flood exceeded or Mth largest) to update the probabilities of an event occurring.

COUNT_DATA
Counts the number of additional data values entered by the user. If the number is greater than 10, the MASTER_MASTER2 macro is invoked; otherwise the MASTER_MASTER macro is invoked.
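The interpolation performed by CALC_LOW (and, mirrored at the upper tail, by CALC_HIGH) can be written compactly outside the spreadsheet. The sketch below is an illustration, not the program's code: it assumes the 25 (flow, probability) pairs have already been ranked in ascending order of flow, as SORT_NUM does, and it interpolates linearly on the cumulative probability at the target level (0.10 for the low estimate, 0.90 for the high). The function name and argument layout are invented for the example.

```python
def percentile_flow(flows, probs, p=0.10):
    """Flow at cumulative probability p, given flows ranked in ascending
    order with their associated probabilities."""
    prev_cum = 0.0
    prev_flow = 0.0
    cum = 0.0
    for flow, prob in zip(flows, probs):
        cum += prob
        if cum > p:
            if prev_cum == 0.0:
                # target falls in the first interval: prorate from zero,
                # as CALC_LOW's first branch does
                return (p / cum) * flow
            # linear interpolation between the bracketing pairs
            return prev_flow + ((p - prev_cum) / (cum - prev_cum)) * (flow - prev_flow)
        prev_cum, prev_flow = cum, flow
    return flows[-1]

# Cumulative probability passes 0.10 between the flows 10 and 20:
ten_pct = percentile_flow([10.0, 20.0, 30.0, 40.0], [0.05, 0.10, 0.50, 0.35])
# -> 15.0
```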
FLOOD program. This pro=excess+1=ALERT("The FLOOD program was originally designed=RUN(CALCS.XLMIDIALOG_BOV)=ALERT("The FLOOD program uses compound probabi=RUN(CALCS.XLMIRETURN_PERIOD')=ALERT("The program is now ready to run. It will take aAppendix B.4 - Programming CodeMacrosheet : CALCS.XLM =RUN(*CALCS.XLMIDIALOG_BOr) =ALERT("The program is now ready to run. It will take a few seco=RUN(CALCS.XLMICOUNT DATA') =RUN(CALCS.XLMIINPUT_M_STDV`).4F(excess>0,GOTO(R[7]C),GOTO(Rf 11C)) =RUN(CALCS.XLM!MASTER_FLOOD')=RUN(CALCS.XLIWINPUT M_STDV') =RUN(CALCS.XLMIMASTER_FACTOR')=RUN(cALCS.XLMIMASTER_FLOOD') =RUNCCALCS.XLMISUMMARY')=RUN("CALCS.XLMIRETURN_PERIOD') =RETURN()=RUN("CALCS.XLMIMASTER_FACTOR')=RUN(CALCS.XLMISUMMARY')=SET.VALUE(excess,0)=RETURN()127RETURN_PERIOD INPUT M_STDV=ECHO(FALSE) =ECHO(FALSE)=ACTIVATECDIALOG 1.XLM") =ACTIVATE("DIALOG 1.XLM")=1F(C: \FLOODOIALOG1.XLM1Auto_RP=TRUE) =SELECT(R6C7:R8C7')=ACTIVATE(' FLOOD.XLS") =COPY()=SELECT(FLOOD.XLSIR49C6:R58C6") =ACTIVATENEXTO=COPY() =SELECT("R2C8')=SELECT(CAFLOOD \FLOOD.XLS1R49C1) =PASTE()=PASTE() =CANCELCOPY()=CANCELCOPY() =ACTIVATE(DIALOG1.XLM')=END.1F() =SELECT('R12C7:R14C7')=ACTIVATE(DIALOG1.XLM') =COPY()=SELECT(C: \ FLO° D\DIALOG1.XLM1R21C7:R2507) =ACTIVATE.NEXTO=COPY() =SELECT('R5C8")=ACTIVATE(FLOOD.XLS") =PASTE()=SELECT("R63C1") =CANCEL.COPY()=SELECT.END(3) =ACTIVATE('FLOOD.XLS')=SELECT(13[1]C") =SELECT(132C8)=PASTE() =COPY()=CANCEL.COPY() =SELECT("R4C3')=FOR.CELL("rp1",'CAFLOOD \FL000.XLSIR49C1:R63C1,TRUE) =PASTE()=SELECT(rp1) =CANCEL.COPY()=COPY() =SELECT(R3C8")=SELECT('FLOOD.XLSIreturn_period') =COPY()=PASTE() =SELECT(' R4C5")=CANCEL.COPY() =PASTE()=RUN(CALCS.XLM!MASTER_CALC') =CANCEL.COPY()=NEXT() =SELECT('R4C8')=RETURN() =COPY()=SELECT('R4C7")=PASTE()=CAN CEL.COPY()=SELECT(R5C8")=COPY()=SELECT('R5C2")=PASTE()=CANCEL.COPY()=SELECT('R6C8')=COPY()=SELECT('R7C2")=PASTE()=CANCEL.COPY()=SELECT(' R7C8')=COPY()=SELECT('R9C2')=PASTE()=CANCEL.COPY()=RETURN()128DIALOG_BOX=DIALOG.BOX(C:\FLOOD\DIALOG 1 
.XLM1Initial_Input)=DIALOG .BOX(C: \FLOODDIALOG1.XLMIAdditional_Data)=RETURN()129COPY_NUM SORT_NUM=ECHO(FALSE) =ECHO(FALSE)..1F(C: \FLOOD \FLOOD.XLSlinput type=0) =SORT(1,'R45C42",1)=SELECT("R16C6:R40C6") =RETURN()=I FCCAFLOOD \FLOOD.XLS1factor_type,GOTO(13[11C),GOTO(R[25]C)) =SELECT(R45C42:R69C43')=COPY()=SELECT("R45C42')nPASTE.SPECIAL(3,1,FALSE,FALSE)=CANCELCOPY()=SELECT(R16C4:R40C4')=COPY()=SELECT("R45C43')=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=ELSE()=SELECT(1316C42:R40042")=COPY()=SELECT(1345C42')=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=SELECT(*R16C40:R40C40')=COPY()=SELECT("R45C43')=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCELCOPY0=END.IF()=GOTO(R[111C)=SELECT('R16C49:R40C49')=COPY()=SELECT(*R45C42")=PASTE.SPECIAL(3,1, FALSE, FALSE)=CANCEL.COPY()=SELECT("R16C47:R40C47')=COPY()=SELECT(' R45C43")=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=RETURN()130CALC_LOW=ECHO(FALSE)=SELECT(' R71 C42')=I FCC AFLOO D FLOOD.XLSIlcum 1 >0.1)=FORMU LA(=(0.10/1cum 1 )*low1 )=GOTO(R[1 9IC)=ELSE. IF(CAFLOOD \FLOOD.XLS'Ilcum2>0.1)=FO RMU LAC=low1 +(((.10-lcum1 )/(lcum2-lcum 1 ))*(low2-low1 ))')=GOTO(R[16]C)=ELSE.' 
F(C: \FLOOD \FLOOD.XLS'Ilcum3>0.1)=FORMULAr=low2+(((.10-lcum2)/(lcum3-lcum2))*(low3-low2)))=GOTO(R[131C)=ELS E.I FCC: \FLOOD \FLOOD.XLSIlcum4>0.1)=FORMULAC4ow3+0(.10-lcum3)/(lcurn4-lcum3))*(low4-low3))*)=GOTO(R[1 01C)=ELS E.I F(CAFLOOD \FLOOD.XLS'Ilcum5>0.1)=FORMULAC=low4+(((.10-lcum4)/(lcum5-lcum4))*(low5-low4)))=GOTO(R[7]C)=ELS E.I F(CAFLOO D \FLOO D.XLS'Ilcum6>0.1 )=FORMULAC=low5+(((.10-lcum5)/(lcum6-lcum5))*(low6-low5W)=GOTO(R[41C)=ELSE.IF(CAFLOOD \FLOO D.XLS'Ilcum7>0.1 )=FORMULAC=low6+0(.10-lcum6)/(lcum7-lcum6))*(low7-low6))")=GOTO(R[1]C)=END.IF()=RETURN()131CALC_HIGH MOVE_NUM1=ECHO(FALSE) =ECHO(FALSE)=SELECT(" R73C42') =SELECT("R71C42')=I F(CAFLOOD \FLOOD.XLSThcum1>0.1) =COPY()=FORMULA("=high1") =SELECT(iowbor)=GOTO(R[19]C) =SELECT.EN D(3)=ELSE.IF(C: \FLOOD \ FLOC D.XLS1hcum2>0.1) =SELECT(13[1]C")=GOTO(R[16JC) =CANCEL.COPY()=ELSE.IF(C:\FLOOD \FLOOD.XLSThcum3>0.1) =SELECT(' R72C42")=GOTO(R[131C) =SELECT("probot")=ELSE. I F(CAFLOOD FLOOD.XLS'fficum4>0.1) =S ELECT.EN D(3)=PASTE.SPECIAL(3,1,FALSE,FALSE)=GOTO(R[10]C)=ELSE., F(CAFLOO D FLOOD.XLS'fficum5>0.1) =CANCEL.COPY()=GOTO(R[7]C) =COPY()=ELS E.I F(CAFLOOD \ FLOOD. XLS'fficum6>0.1) =SELECT("highbot")=GOTO(R[41C) =SELECTCR[1]C")=PASTE.SPECIAL(3,1,FALSE,FALSE)...ELSE. I F(CAFLOOD \FLOOD.XLS'fficum7>0.1)=GOTO(R[1 1C) =RETURN()=FORMULAC=high2+(((hcum2-0.10)/(hcum2-hcum1))*(high1-high2))") =PASTE.SPECIAL(3,1,FALSE,FALSE)=FORM U LAr=high3+(((hcum3-0.10)/(hcum3-hcum2))*(high2-high3)r) =COPY()=FORMULAC=high4+(((hcum4-0.10)/(hcum441cum3))*(high3-high4))") =SELECT(1=1[1]C")=FORMULAC=high5+(((hcum5-0.10)/(hcum5-hcum4))*(high4-high5))") =SELECT(-R73C42")=FORMULAr=high6+(((hcum6-0.10)/(houm6-hcum5))*(high5-high6))") =S ELECT. 
EN D(3)=FORMULAC=high7+(((hcum7-0.10)/(hcum7-hcum6))*(high6-high7)y) =CANCEL.COPY()MASTER_CALC=RUN("CALCS.XLM!COPY_NUM")=RUN(CALCS.XLM!SORT_NUM")=RUN(CALCS.XLM!CALC_LOW*)=RUN("CALCS.XLM!CALC_HIGH')=IFCCAFLOOD\FLOOD.XLS1factor_type=0,RUN(CALCS.XLM!MOVE_NUM1'),RUN(CALCS.XLM!MOVE_NUM2"))=RETURN()133ACTUAL_DATA MOVE_LIKELIHOOD1=ECHO(FALSE) =ECHO(FALSE)=SELECT( R 16C11:R40C11°)=ACTI VATE( DIALOG1 .XLM")=SELECT(CAFLOOD \DIALOG1.XLM1R6C15:R15C15) =SELECT(R16C26')=COPY() =SELECT.END(1)=ACTIVATE("FLOOD.XLS") =SELECT('FIC[1]")=PASTE.SPECIAL(3,1,FALSE,FALSE)=SELECT(CAFLOOD \FLOOD.XLS1R3C22)=PASTE() =CANCEL.COPY()=CANCEL.COPY() =RETURN()=IFCCAFLOOD \DIALOG 1. XLM1Actual_Data1>O,GOTO(R[11C),GOTO(R[16]C)) =COPY()=FOR.CELL("ad1','CAFL000 \FLOOD.XLS1R3C22:R12C22,TRUE)=SELECT(ad1)=COPY()=SELECT("afte)=PASTE()=CANCEL.COPY()=RUN(CALCS.XLMIMOVE_LIKELIHOOD1')=NEXT()=FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type)=RETURN()134FLOOD_EXCEEDED=ECHO(FALSE)=ACTIVATE(DIALOG1.XLM")=I FCCAFLOOD \DIALOG1.XLM!Flood_Exceeded1>O,GOTO(R[1]C),GOTO(R[41]C))=SELECT(CAFLOOD \DIALOG1.XLM1R39015:R44C15)=COPY()=ACTIVATECFLOOD.XLS')=SELECT(CAFLOOD \FLOOD.XLS1R3C23)=PASTE()=CANCELCOPY()=SELECT(lex1')=COPY()=SELECT(afv')=PASTE()=CANCEL.COPY()=SELECT(lem1')=COPY()=SELECT(rn')=PASTE()=CANCEL.COPY()=SELECT('fen1')=COPY()=SELECT(n)=PASTE()=CANCEL.COPY()=RUN(CALCS.XLM!MOVE_LIKELIHOOD2')=FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type)=IFCC: \FLOOD \FLOOD.XLSIfex2>O,GOTO(R[11C),GOTO(R[17]C))=SELECT('fex2')=COPY()=SELECT(afv")=PASTE()=CANCELCOPY0=SELECT("fem2')=COPY()=SELECT("ni')=PASTE()=CANCEL.COPY()=SELECT(len2")=COPY()=SELECT(' n")=PASTE()=CANCEL.COPY()=RUN(CALCS.XLM!MOVE_LIKELIHOOD2')=RETURN()135MOVE_LIKELIH0002 Mth_LARGEST=ECHO(FALSE) =ECHO(FALSE)=SELECT(1316C12:R40C12') =ACTI VATE(" DIALOG 1.XLM')=COPY() =IFCC:\FLOO D \ DIALOG 1.XLM1Mth_Largest>0,GOTO(R[1]C),GOTO(R[41]C))=S ELECT( R16C26') =SE LECT(C: \FLOOD \ DIALOG 1.XLM1 R28C15:R33C15)=SELECT.EN D(1) =COPY()=SELECT('RC[1]') 
=ACTIVATE('FLOOD.XLS')=PASTE.SPECIAL(3,1,FALSE,FALSE) =SELECT(CAFLOOD \FLOOD.XLS1R3C24)=CANCEL.COPY() =PASTE()=RETURN() =CANCEL.COPY()=SELECT(MII')=COPY()=SELECT("afv“)=PASTE()=CANCEL.COPY()=SELECT("m1m1')=COPY()=SELECT(nth')=PASTE()=CANCEL.COPY()=SELECT(' mln1")=COPY()=SELECT("W)=PASTE()=CANCEL.COPY()=R U N(CALCS.XLMIMOVE_LIKE LI HOO D3')=FORMULA(1,'CAFLOOD \FLOOD.XLSlinput_type)=IF(C: \FLOOD \FLOOD.XLS1m12>O,GOTO(R[1]C),GOTO(R[17]C))=SELECT(m12')=COPY()=SELECT(afv“)=PASTE()=CANCEL.COPY()=S ELECT(' m1m2')=COPY()=SELECT("mth')=PASTE()=CAN CE L.COPY()=SE LECT('mln2')=COPY()=SELECT(n)=PASTE()=CANCEL.COPY()=R U N(' CALCS.XLMIMOVE_LI KELI HOO D3')=RETURN()136MOVE_LIKELIHOOD3 MASTER_FLOOD=RUN(CALCS.XLMACTUAL_DATA')=ECHO(FALSE)=RUN(CALCS.XLM!Mth_LARG EST')=COPY()=SELECT(' R16C26") =RETURN()=SELECT(*R16C13:R40013') =RUN(CALCS.XLMIFLOOD_EXCEEDED")=SELECT.END(1)=SELECT('RC[1]*)=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=RETURN()137COUNT_DATA MASTER_FACTOR=ECHO(FALSE) =ECHO(FALSE)=ACTIVATE('FLOOD.XLS") =RUN(CALCS2.XLM!MOVE_NUM_FACT)=ELSE.IF(C: \FLOOMIALOG1.XLM1Multiplication_Factor=TRUE)=CLOSE(FALSE)=RETURN() =END.IF()=IF((CO U NTCCAFLOODO IALOG 1 . XLM1R6C1 5:R1 5( =IFCC: \FLOOD \DIALOG1.XLMAddition_Factor=TRUE)=RUN(CALCS.XLM!ADD_FACTOR")=ALERT("Nice try! You are only allowed to enter a ma=ACTIVATE(D IALOG 1 .XLM") =ACTIVATE("DIALOG1.XLM")=OPEN("C: \FLOOD \FLOOD.XLS") =RUNCCALCS.XLM!MULT_FACTOR1=RUN(CALCS.XLMMASTER_MASTER2') =RUN(CALCS2.XLM!MOVE_NUM_FACT)=RETURN()138ADD_FACTOR=ECHO(FALSE)=SELECT(CAFLOOD \ DIALOG 1.XLM11355C15:R58C15)=COPY()=ACTIVATE(' FLOO D. 
XLS')=SELECTCCAFLOOD \FLOOD.XLS'Ilfact)=PASTE()=CAN CEL.COPY()=FORMULA(1;C: \FLOOD \FLOOD.XLS1R3C49)=SELECT(CAFLOOD FLO° D.XLS'Imean_factor)=COPY()=SELECT(CAFLOOD\FLOOD.XLSIaddiact)=PASTE.SPECIAL(3,1,FALSE,FALSE)=CAN CEL.COPY()=SELECT(CAFLOOD\FLOOD.XLS'Imult_fact)=FORMULA(Wa")=IFCCAFLOOD\FLOOD.XLSlinput type=0,GOTO(R[l jC),GOTO(R[l 9]C))=SELECT(CAFLOOD \FLOOD.XLSIR16C4:R40C4)=COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t)=PASTE()=CANCEL.COPY()=SELECT('CAFLOOD \FLOOD.XLS'Irev_mean_t)=FORMULAC=RCF4314-mean_factor)=COPY()=SE LECT(CAFLOO D FLO° D. XLS1 rev_mean_VC: \FLOOD \FLOOD.XLS'Irev_mean_b)=PASTE()=CANCEL.COPY()=SELECT(CAFLOOD \FLOOD.XLS'Irev_stdev_t)=FORMULAC=SQRTWRC[43]^2)*(stdev_factor^2))+(2*ccfRCF43]*stdev_factor))")=COPY()=SELECT(CAFLOOD FLO° D.XLS'Irev_stdev_VC: \ FLOOD \ FLOOD.XLS'Irev stdev_b)=PASTE()=CANCEL.COPY()=GOTO(RF291C[1])=SELECT('CAFLOOD \FLOOD.XLS'Inorm_refit_prob_tC: \FLOOD \ FLO° D.XLS'Inorm_refit_prob_b)=COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t)=PASTE.S PECIAL(3,1,FALS E, FALSE)=CANCEL.COPY()=SELECT('CAFLOOD \FLOOD.XLSIrev_meani)=FORMULA("=RC[-7]+mean_factor")=COPY()=SELECT(CAFLOOD FLO° D.XLS1 rev_meant:'CAFLOO D \FLOOD.XLS'Irev_mean_b)=PASTE()=CANCEL.COPY().SELECT(CAFLOOD \ FLO° D. 
XLS'Irev_stdev_t)=FORM U LAC=SQ RTWRC[7]^2)*(stdev_factor^2))+(2*ccr RC[7]'stdev_factor))')139ADD_FACTOR (continued)=COPY()=SELECTCCAFLOOD\FL000.XLS1rev_stdev_VC:\FLOOD\FLOOD.XLS1n=PASTE()=CANCEL.COPY()=RETURN()140MULT_FACTOR=ECHO(FALSE)=SELECT(CAFLOODZIALOG1.XLM'IR55C15:R58C15)=COPY()=ACTIVATE('FLOOD.XLS")=SELECT(CAFLOOD\FLOOD.XLS'Ilfact)=PASTE()=CANCEL.COPY()=FORMULA(2;C:\FLOOD\FLOOD.XLS1R3C49)=SELECT('CAFLOOD\FLOOD.XLSImean_factor)=COPY()=SELECTCCAFLOOD\FLOOD.XLS'Imultfact)=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=SELECT(CAFLOOD\FLOOD.XLSladd _fact)=FORMULA(rila')=IFCCAFLOOD\FLOOD.XLSlinput type),GOTO(R[1]C),GOTO(R[19C))=SELECT('CAFLOOD\FL000.XLS1R1604:R4004)=COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_prob_t)=PASTE()=CANCEL.COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_meanf)=FORMULAC=RCF43rmean_factor+ccrRCF42rstdev _factor)=COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_mean_UCAFLOOD\FLOOD.XLS'Irev_mean_b)=PASTE()=CANCEL.COPY()=SELECT(CAFLOOD\FLOOD.XLS'Irev_stdev_t)=FORMULAr=SQRT(((RCF441^2)*(stdevfactor^2))+((mean_factorA2)*(RC[43]^2))+(2*ccrRC=COPY()=SELECTCCAFLOOD\FLOOD.XLSirev_stdev_VC:\FLOOD\FLOOD.XLS'Irev_stdev_b)=PASTE()=CANCEL.COPY()=GOTO(R[18]C)=SELECTCCAFLOOD\FLOOD.XLS'Inorm_refit_prob_VCAFLOOD\FLOOD.XLS'Inorm_refit_pro=COPY()=SELECT(CAFLOOD\FLOOD.XLSIrev_prob_t)=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=SELECTCCAFLOOD\FLOOD.XLS'Irev_mean j)=FORMULA(=RC[7]* mean _factor")=COPY()=SELECT('C:\FLOOD\FLOOD.XLS 'Irev_meanf'CAFLOOD\FLOOD.XLS'Irev_mean_b)=PASTE()=CANCEL.COPY()=SELECT(C:\FLOOD\FLOOD.XLS'Irev_stdevf)=FORMULAC=SQRTWRCHIA2)*(stdev_factor^2))+((mean_factor^2)*(RCF71^2))+(2*ccr RCH141MULT_FACTOR (continued)=COPY()=SELECT(C:\FLOOD\FLOOD.XLS1rev_stdev_VCAFLOODTLOOD.XLS1rev_stdev_b )=PASTE()=CANCEL.COPY()=RETURN()142MOVE_NUM_FACT MOVE_NUM2=ECHO(FALSE) =ECHO(FALSE)=ACTIVATE(' FLOOD.XLS') =SELECT("R71C42")=SELECT(iowbot_fact")=COPY()=SELECT(CAFLOOD \FLOOD.XLS1R71C1) =SELECT.END(3)=PASTE() =SELECTCR[1]C')=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCEL.COPY()=SELECT(rp) 
=SELECT(R72C42")=COPY() =COPY()=SELECT("probotiact")=sELECT('FLOOD.XLS!return_period-)=PASTE() =SE LECT. END(3)=CAN CEL.COPY() =SE LECT("R[11C`)=NEXT() =CANCEL.COPY()=RETURN() =SELECT(R73C42")=SE LECT(CAFLOOMFLOOD.XLS1 R49C1:R63C1) =COPY()=FOR.CELL('rp','CAFLOOD \FLOOD.XLS1R71C1:R85C1,TRUE) =CANCEL.COPY()=RUN(CALCS.XLMIMASTER_CALC") =PASTE.SPECIAL(3,1,FALSE,FALSE)=COPY()=SELECT(highbot fact")=SELECT.END(3)=SELECT(11[1]C")=PASTE.SPECIAL(3,1,FALSE,FALSE)=CANCELCOPY0=RETURN()143SUMMARY=ECHO(FALSE)=IF(CAFLOOD \FLOOD.XLS1factor_type=0)=ACTIVATE(FLOOD.XLS')=SELECT('R42C1:R63C4')=COPY()=OPEN("c:\flood■summary.xle)=SELECT(R1C1')=PASTE()=CANCELCOPY()=ACTIVATE.NEXTO=ACTIVATE('SUMMARY.XLS')=SAVE.AS(INPUT("The results of your analysis have been plac=SELECT(131 C1')=ELSE()=SELECT(*FLOOD.XLSItop_sum:highbot_facr)=COPY()=OPEN(c:\flood\summary.xls ")=SELECT(131C1')=PASTE()=CANCELCOPY()=ACTIVATE.NEXT()=ACTIVATE(SUMMARY.XLS')=SAVE.ASONPUT(*The results of your analysis have been plac=SELECT("R1C1')=END.IF()=ACTIVATE("FLOOD.XLS")=CLOSE(FALSE)=RETURN()144
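The ADD_FACTOR and MULT_FACTOR macros embed moment formulas for a flow adjusted by a correlated factor (for example, converting max daily to max instantaneous floods). Re-expressed outside the spreadsheet, they correspond to the standard results for the sum and, to first order, the product of two correlated variables. The sketch below is an interpretation of those formulas, not the macros' code; the variable names are invented, and the cell-level bookkeeping of the macros is not reproduced.

```python
import math

def adjusted_moments(mean_q, sd_q, mean_f, sd_f, rho, multiply):
    """Mean and standard deviation of the adjusted flow: Q*F when
    multiply is True, Q+F otherwise, with corr(Q, F) = rho."""
    if multiply:
        # E[QF] = mu_Q*mu_F + rho*sd_Q*sd_F (exact for correlated Q, F)
        mean = mean_q * mean_f + rho * sd_q * sd_f
        # first-order variance of a product of correlated variables
        var = ((mean_f ** 2) * sd_q ** 2 + (mean_q ** 2) * sd_f ** 2
               + 2.0 * rho * mean_q * mean_f * sd_q * sd_f)
    else:
        # sum of correlated variables
        mean = mean_q + mean_f
        var = sd_q ** 2 + sd_f ** 2 + 2.0 * rho * sd_q * sd_f
    return mean, math.sqrt(var)

# e.g. max instantaneous ~ 1.2 x max daily, with rho = 0.6 (made-up numbers):
m, s = adjusted_moments(100.0, 30.0, 1.2, 0.1, 0.6, multiply=True)
```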