Lancaster, G T
Programming in COBOL is a concise how-to book that teaches the programming language in a short, effective, step-by-step manner that can be easily followed by anyone with a basic grounding in information technology. Covering first the advantages of COBOL over other programming languages, the book discusses COBOL's four divisions (identification, environment, data, and procedure), then describes the testing of COBOL source programs and presents program questions. The book is valuable for those who wish to learn basic COBOL but do not have the time to take manufacturers' …
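The four-division structure mentioned here is fixed by the language: identification, environment, data, and procedure, in that order. As a quick illustration, here is a small Python sketch (not from the book; the helper name is invented) that scans a COBOL source text for its division headers:

```python
import re

# The four COBOL divisions, in the order the language requires them.
DIVISIONS = ["IDENTIFICATION", "ENVIRONMENT", "DATA", "PROCEDURE"]

def find_divisions(source: str) -> list:
    """Return the division names in the order they appear in the source."""
    found = []
    for line in source.upper().splitlines():
        m = re.match(r"\s*(\w+)\s+DIVISION\b", line)
        if m and m.group(1) in DIVISIONS:
            found.append(m.group(1))
    return found

sample = """
       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO.
       ENVIRONMENT DIVISION.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 GREETING PIC X(12) VALUE "HELLO, WORLD".
       PROCEDURE DIVISION.
           DISPLAY GREETING.
           STOP RUN.
"""

print(find_divisions(sample))  # all four division names, in source order
```

Running this against a well-formed program prints the four names in the canonical order, which is a cheap sanity check on a source file's overall layout.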
COBOL for Students has established itself as one of the most successful teaching texts on COBOL programming and is now in its fourth edition. The first part of the book concentrates on the fundamentals of the language and takes students to the point where they can write modestly sized programs using sequential files. Part two assumes competence in elementary COBOL and explains design and other programming techniques which should be part of the professional programmer's repertoire. Part three extends the student's knowledge of the language by explaining some of the more advanced features of COBOL…
Nowadays, billions of lines of code are written in the COBOL programming language. This book offers an analysis, a diagnosis, a strategy, an MDD (model-driven development) method, and a tool for transforming legacy COBOL into modernized applications that fit Internet computing, Service-Oriented Architecture (SOA), and the Cloud. It serves as a blueprint for those in charge of finding solutions to this considerable challenge.
This research thesis reports a detailed study of the Report Writer feature of the COBOL language, undertaken in order to integrate it into the IRIS 50 COBOL compiler. To reuse existing compiler processing, the author developed a simulation of the Report Writer using COBOL statements generated in the declarative part of the Procedure Division. After a brief presentation of the IRIS 50 computer, the author presents the general plan of the compiler, with the modifications and additions due exclusively to the Report Writer. The next part addresses the practical implementation and the problems encountered and solved along the way.
Lorents, Alden C.
Various schools are struggling with the introduction of Object Oriented (OO) programming concepts and GUI (graphical user interfaces) within the traditional COBOL sequence. OO programming has been introduced in some of the curricula with languages such as C++, Smalltalk, and Java. Introducing OO programming into a typical COBOL sequence presents…
… This research develops a transformation system to convert COBOL code into a generic imperative model, recapturing the initial design and recovering the requirements implemented by the legacy code…
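As a toy illustration of this kind of transformation (not the paper's actual system), the following Python sketch maps two COBOL statement forms onto a generic imperative model, here rendered as plain assignments:

```python
import re

# Two illustrative rewrite rules: COBOL MOVE and ADD statements become
# assignments in a generic imperative notation.
RULES = [
    (re.compile(r"MOVE\s+(\S+)\s+TO\s+(\S+)", re.I), r"\2 = \1"),
    (re.compile(r"ADD\s+(\S+)\s+TO\s+(\S+)", re.I), r"\2 = \2 + \1"),
]

def to_imperative(stmt: str) -> str:
    """Translate one COBOL statement, or flag it as untranslated."""
    stmt = stmt.strip().rstrip(".")
    for pattern, template in RULES:
        m = pattern.fullmatch(stmt)
        if m:
            return m.expand(template)
    return "/* untranslated: " + stmt + " */"

print(to_imperative("MOVE 0 TO TOTAL."))   # TOTAL = 0
print(to_imperative("ADD AMT TO TOTAL."))  # TOTAL = TOTAL + AMT
```

A real system works on a parse tree rather than regular expressions, but the principle is the same: each legacy construct is mapped onto a language-neutral imperative form from which the original design can be studied.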
Lee, W.F. Jr.
Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS) is a near real-time nuclear materials/precious metals safeguards and accountability control system. Using COBOL and RSTS/E on a dedicated PDP-11/34, the system performs on-line inventory update, inquiry, and report functions. Processed transactions, consisting of intra-laboratory movements, on-site receipts, and off-site shipments, are maintained for inquiry and report preparation. A secure, controlled, but friendly user environment is maintained by chaining between menu and data-manipulation tasks. The use of menus, security and access control, screen manipulation, file access and contention, word-processing activities, task-size problems, and other aspects of this application are discussed.
Mathur, F. P.
Several common higher-level programming languages are described. FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 are summarized and compared. FORTRAN is the most widely used scientific programming language. ALGOL is a more powerful language for scientific programming. COBOL is used for most commercial programming applications. LISP 1.5 is primarily a list-processing language. PL/1 attempts to combine the desirable features of FORTRAN, ALGOL, and COBOL into a single language.
AIRMICS (Army Institute for Research in Management Information, Communications, and Computer Sciences) report on software tools for software maintenance (ASQBG-1-89-001), October 1988. Among the tools surveyed are a C program analyser, the COBOL Structuring Facility, VS COBOL II, F-Scan, and static code analysers for COBOL and FORTRAN.
This article describes the integration of a knowledge-based system with a large COBOL-DB2-based offender management system. The knowledge-based application, developed for the purpose of offender sentence calculation, is shown to provide several benefits, including a shortened development cycle, simplified maintenance, and improved accuracy over a previous COBOL-based application.
Groover, J. L.; Jones, S. C.; King, W. L.
Program for data management system allows sophisticated inquiries while utilizing simplified language. Online system is composed of several programs. System is written primarily in COBOL with routines in ASSEMBLER and FORTRAN V.
For over 30 years, the Idaho Transportation Department (ITD) has had an LRS called MACS (MilePoint And Coded Segment), implemented on a mainframe using a COBOL/CICS platform. As ITD began embracing newer technologies and moving toward…
The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programmers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…
Roubtsov, Serguei; Telea, Alexandru; Holten, Danny
Software quality assessment of large industrial legacy COBOL systems, whether for maintenance or migration purposes, poses a serious challenge. We present the Software Quality Assessment and Visualisation Toolset (SQuAVisiT), which assists users in performing this task. First, it allows a fully…
Ali, Azad; Smith, David
This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Among the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…
Nord, Daryl; Seymour, Tom
After a brief discussion of the history and current status of business data processing versus computer science, this article focuses on the characteristics of a business data processing curriculum as compared to a computer science curriculum, including distinctions between the FORTRAN and COBOL programming languages. (SH)
Arnaudov, D.D.; Govorun, N.N.
The organization of the main files of the JINR Information Retrieval System is described. The System has four main files: the MD file, which consists of abstracts of documents; the OMPOD file, where the index records of documents are gathered; the MZD file, which consists of list heads; and the OMD file, the file of descriptors. The last three files are considered in some detail. The System is implemented in the COBOL language on a CDC computer.
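A descriptor file of this kind plays the role of an inverted index. A minimal Python sketch of the idea (the data and names are illustrative, not JINR's actual record formats):

```python
from collections import defaultdict

# Inverted index: each descriptor maps to the set of document ids
# indexed under it.
index = defaultdict(set)

def add_document(doc_id, descriptors):
    """Index one document under each of its descriptors."""
    for d in descriptors:
        index[d].add(doc_id)

def search(*descriptors):
    """Documents indexed under ALL of the given descriptors."""
    sets = [index[d] for d in descriptors]
    return set.intersection(*sets) if sets else set()

add_document(1, ["COBOL", "retrieval"])
add_document(2, ["COBOL", "compiler"])
print(sorted(search("COBOL")))              # [1, 2]
print(sorted(search("COBOL", "compiler")))  # [2]
```

Intersecting the per-descriptor sets answers conjunctive queries without scanning the document file itself, which is the point of keeping a separate descriptor file.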
Approved for public release; distribution is unlimited. Before the widespread use of Database Management Systems (DBMS), programmers had to rely on third-generation languages such as COBOL, Pascal, and PL/I to implement their application programs. These programs are usually very hard to maintain and modify unless very disciplined structured programming techniques are used. With a DBMS, however, the ease of development, maintenance, and modification of data-man…
A general discussion about the level of artificial intelligence in computer programs is presented. The suitability of various languages for the development of complex, intelligent programs is discussed, considering fourth-generation language as well as the well established structured COBOL language. It is concluded that the success of automation in many administrative fields depends to a large extent on the development of intelligent programs.
The department for the measurement of individual doses performs regular dose controls, by means of film badges, for approximately 14000 individuals. The operation is facilitated by a Honeywell Bull Mini 6 Mod 43 computer. The software, written in COBOL, registers input data such as delivery of badges, film development, calibration, invoices, recording of individual doses, and customers. The print-out covers customers, badge codes, dosimeter lists, development specifications, dose statements, addresses, bills, dose statistics, and the register of individuals. Because the service is charged for, the activity is financially self-supporting. (G.B.)
Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business-oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury Autocode in terms of a phrase-structure language, covering the source language, the target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate…
In the days before personal computers, BASIC was the easy programming language to learn, and serious programmers learned FORTRAN or COBOL to do "real work." Today, many people have discovered that Perl is both a great beginning programming language and one that enables them to write powerful programs with little effort. If you're interested in discovering how to program (or how others program), Perl For Dummies, 4th Edition, is for you. If you already know something about programming (but not about Perl), this book is also for you. If you're already an expert programmer, you're still w…
The FORTRAN program DATA-ENTRY-3 was developed from the COBOL program DATA-ENTRY-1, which solves a large class of elementary data-capture, data-formatting, and data-editing problems of managerial accounting. Most of the work involved finding methods to make DATA-ENTRY-3, which is written for a small-machine environment (PDP-11/10 under the RT-11 operating system), logically equivalent to DATA-ENTRY-1, which is written for a large-machine environment (CDC 6600 under a time-sharing operating system). This report explains how structured programming helped, and briefly describes the function of each subroutine.
Organick, Elliott Irving; Plummer, Robert P
Programming Language Structures deals with the structures of programming languages and introduces the reader to five important programming languages: Algol, Fortran, Lisp, Snobol, and Pascal. The fundamental similarities and differences among these languages are discussed. A unifying framework is constructed that can be used to study the structure of other languages, such as Cobol, PL/I, and APL. Several of the tools and methodologies needed to construct large programs are also considered. Comprised of 10 chapters, this book begins with a summary of the relevant concepts and principles about al…
Munchausen, J.H.; Glazer, K.A.
This paper describes the effort by Southern California Edison Company (SCE) and the Electric Power Research Institute (EPRI) to develop an expert systems work station designed to support the San Onofre Nuclear Generating Station (SONGS). The expert systems work station utilizes IntelliCorp KEE (Knowledge Engineering Environment) and EPRI-IntelliCorp PLEXSYS (PLant EXpert SYStem) technology, and SCE Piping and Instrumentation Diagrams (P and ID's) and host-based computer applications to assist plant operations and maintenance personnel in the development of safety tagout boundaries. Of significance in this venture is the merging of conventional computer applications technology with expert systems technology. The EPRI PLEXSYS work station will act as a front-end for the SONGS Tagout Administration and Generation System (TAGS), a conventional CICS/COBOL mainframe computer application
Wheeler, L.E.; Scott, P.H.
The Oak Ridge Gaseous Diffusion Plant Inventory Control and Accountability System (ORICAS) utilizes state-of-the-art hardware, software, and communication to provide a computerized near real-time inventory of materials within a Uranium Enrichment Plant. Work stations are located in five strategic areas within the plant. Accountability areas include material receipt, enrichment, withdrawal, sampling, intraplant transfer, and shipment. Perpetual current inventory is maintained and is available to authorized users on-line and in printed reports. The system meets DOE material reporting requirements and provides accountability safeguards for early detection of possible loss or diversion. Hardware consists of multiple data input terminals and printers linked to a time-shared computer. Major software includes COBOL and IDMS (an Integrated Data Base Management System)
This talk is about automated code analysis and transformation tools to support scientific computing. Code bases are difficult to manage because of size, age, or safety requirements. Tools can help scientists and IT engineers understand their code, locate problems, and improve quality. Tools can also help transform the code, by implementing complex refactorings, replatforming, or migration to a modern language. Such tools are themselves difficult to build. This talk describes DMS, a meta-tool for building software analysis tools. DMS is a kind of generalized compiler, and can be configured to process arbitrary programming languages, to carry out arbitrary analyses, and to convert specifications into running code. It has been used for a variety of purposes, including converting embedded mission software in the US B-2 Stealth Bomber, providing the US Social Security Administration with a deep view of how their 200 million lines of COBOL are connected, and reverse-engineering legacy factory process-control code…
The paper reviews the challenges of today's oil industry, which is dominated in Europe by offshore production. Some of the key computer applications are examined, discussing new software development methods which have been adopted in order to achieve significant reductions in development times. The range of modern software development tools is considered, along with the decreasing impact of traditional programming languages such as COBOL and FORTRAN. The use and benefits of non-procedural languages are also discussed, together with some views on their relevance to high energy physics. The paper concludes with a look into the not-too-distant future, stressing the need for new approaches to software development and for improved information-handling facilities. (orig.)
Amid punched cards, slide rules, Fortran and Cobol textbooks, and courses on algorithms, information systems, and game theory, the young Alfredo Amore, Xavier Caicedo, and Diego Escobar received their degrees on 28 August 1970 as the country's first Systems and Computing Engineers. To mark the program's 40th anniversary, the Revista de Ingeniería sets out to recall the creation of this program and its first graduates, who constitute a defining milestone in the history of Colombian engineering, conceived at the Universidad de los Andes.
Greene, W. A.
A special report writer (SRR) was developed which performs multiple correlations on files containing several data hierarchies. Output reports are specified in a simple notation, readily learned by persons having limited familiarity with ADP. The SRR system can be adopted by other NASA installations, while the basic techniques themselves are compatible with the information-management needs of a wide range of organizations. Specifically, the program lends itself to generalization and can be readily adapted for other file-management purposes. Extensive details on the characteristics of the SRR program are presented, along with a full explanation of the system for those contemplating its application to other data bases. The complete COBOL program and documentation are available.
Drikos, G.; Psaromiligos, J.; Geotgiou, G.; Kamenopoulou, V.K.
Dose record keeping is the making and keeping of personal dose records for radiation workers. It is an essential part of the process of monitoring the exposure of individuals to radiation and shares the same objectives. Dose record keeping is becoming more and more critical because of the importance of statistical analysis and epidemiological studies in radiation protection, and of the increasing cooperation and exchange of personnel between countries. The GAEC's personnel dosimetry laboratory provides personnel dosimetry for the whole country and keeps the official central dose record. The personnel dosimetry information system was established in electronic form in 1989, written in the COBOL language. Since then, various arguments have emerged that forced a change of the database used. Among them: 1. There was no distinction between establishments and their laboratories. 2. Workers did not have a unique code number; consequently, the total dose of a person working in more than one place could not be estimated. Workers were directly related to their workplace, so if somebody changed workplace he was treated as a new entry, resulting in an overestimation of the number of monitored workers and introducing a source of errors into the collective and average dose calculations. 3. With the increasing applications of ionising radiation, many types of dosemeters became indispensable, e.g. for beta and gamma, for neutrons, and for the extremities. Also, the new category of outside workers appeared, requiring special treatment. None of these distinctions was achievable with the previous system. 4. In recent years there has been increasing interest in the statistical analysis of personal doses. A program written in COBOL does not offer many possibilities and has little flexibility for such analysis. The new information system has been rebuilt as a relational database with more possibilities and more flexibility. (authors)
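The second argument above is essentially a data-modelling point: with a unique worker code as the key, doses from several workplaces aggregate per person. A small Python sketch of that design point (the records and field names are invented for illustration, not GAEC's schema):

```python
from collections import defaultdict

# Each record keys on the worker's unique code, not the workplace.
dose_records = [
    # (worker_code, workplace, dose_mSv)
    ("W001", "Hospital A", 1.2),
    ("W001", "Clinic B",   0.8),   # same person, second workplace
    ("W002", "Hospital A", 0.5),
]

# Total dose per worker, summed across all workplaces.
totals = defaultdict(float)
for worker, workplace, dose in dose_records:
    totals[worker] += dose

print(dict(totals))  # W001's doses from both workplaces are combined
# Counting distinct worker codes also avoids overestimating the number
# of monitored workers:
print(len(totals))   # 2 workers, not 3 records
```

Under the old workplace-keyed scheme, W001 would appear as two different workers with two partial doses, which is exactly the overestimation the abstract describes.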
Kuzmak, Peter M.; Walters, Richard F.; Penrod, Gail
This paper describes an interface connecting InterSystems MUMPS (M/VX) to an external relational DBMS, the SYBASE Database Management System. The interface enables MUMPS to operate in a relational environment and gives the MUMPS language full access to a complete set of SQL commands. MUMPS generates SQL statements as ASCII text and sends them to the RDBMS. The RDBMS executes the statements and returns ASCII results to MUMPS. The interface suggests that the language features of MUMPS make it an attractive tool for use in the relational database environment. The approach described in this paper separates MUMPS from the relational database. Positioning the relational database outside of MUMPS promotes data sharing and permits a number of different options to be used for working with the data. Other languages like C, FORTRAN, and COBOL can access the RDBMS database. Advanced tools provided by the relational database vendor can also be used. SYBASE is an advanced high-performance transaction-oriented relational database management system for the VAX/VMS and UNIX operating systems. SYBASE is designed using a distributed open-systems architecture, and is relatively easy to interface with MUMPS.
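The interface pattern described above, SQL sent as plain text and results returned as ASCII text to be parsed, can be sketched as follows. The tab-separated wire format and the function names here are assumptions for illustration, not the paper's actual protocol:

```python
def build_query(table, columns, where):
    """Assemble an SQL statement as plain ASCII text for the RDBMS."""
    return "SELECT " + ", ".join(columns) + " FROM " + table + " WHERE " + where

def parse_ascii_results(text):
    """Parse tab-separated rows; the first line holds the column headers."""
    lines = text.strip().splitlines()
    header = lines[0].split("\t")
    return [dict(zip(header, line.split("\t"))) for line in lines[1:]]

sql = build_query("patients", ["id", "name"], "ward = 'A'")
reply = "id\tname\n7\tSmith\n9\tJones\n"   # simulated RDBMS reply
rows = parse_ascii_results(reply)
print(sql)   # SELECT id, name FROM patients WHERE ward = 'A'
print(rows)
```

The appeal of the approach is that the text channel is language-neutral: the same RDBMS can serve MUMPS, C, FORTRAN, or COBOL clients, since each side only has to produce and consume ASCII.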
Constantin Marian MATEI
Banks are still using legacy systems, as the core of their business is comprised within such systems. Since technology and client demands are changing rapidly, banks have to adapt their systems in order to remain competitive. The issue is to identify correctly what bank users prefer in terms of software reliability, and how modern the system is. For instance, there are users who enjoy working with the old screen format, and there are users who prefer newer layouts, Web interfaces, and so on. We also need to know the constraints generated by the use of legacy systems, and how these systems can be improved or replaced. The scope of the article is to present a solution for modernizing a legacy banking system using an SOA approach. The research is based on the modernization of a legacy system developed in COBOL/400 under IBM iSeries. The modernization process uses an SOA approach based on Java technologies.
Management of the database originating from individual and environmental monitoring carried out in the UNIFESP/HSP complex, SP, Brazil
Medeiros, Regina Bitelli; Daros, Kellen Adriana Curci; Almeida, Natalia Correia de; Pires, Silvio Ricardo; Jorge, Luiz Tadeu [Universidade Federal de Sao Paulo (UNIFESP), SP (Brazil)
The Radiological Protection Sector of the Sao Paulo Hospital/Federal University of Sao Paulo, SP, Brazil manages the records of 457 dosemeters. Since users must be informed of their absorbed doses monthly, and individual records must be kept until the person reaches the age of 75 and for at least 30 years after the end of the individual's occupational exposure, it became necessary to construct a database and a computerized control to manage the accumulated doses. Between 1991 and 1999, this control was carried out by means of a relational database (Cobol 85, under the GCOS 7 operating system, ABC Telematic Bull). After this period, when the company responsible for dosimetry began providing computerized results, the data were stored in a Paradox database (Borland). In 2004 the databases were integrated into a third database, developed in Oracle, and a system was created that allows institutional Intranet users to consult their accumulated doses annually and the total effective dose accumulated over their working life.
Sterzel, J.; Havelka, J.; Chrast, M.
A program was written for producing technological procedures for mining operations: driving of horizontal workings, driving of raises, boring of long survey boreholes, driving of wide workings, and withdrawal of working supports. Each of these types of technological procedure is specific as concerns data and information content, but the basic structure is the same. The procedure is divided into three parts. Part one shows basic data on the working and the technological operations, from which wages are calculated and material is inventoried. Part two offers the necessary information on retreat paths, special duties during blasting operations and material handling, and all other information required by safety specifications. Part three consists of a signature list containing the names of the team of the given section and of the area foremen, and approval columns. The programs are written in COBOL-DOS/4, allowing the operation of up to 30 remote terminals of the EC 7920 type. The advantage of computer-assisted production of technological procedures is the possibility of deriving new technological procedures from those already defined, simply by changing parameters. The database is also a source for the analytical activities of practically all production units. (J.B.)
Bramlette, J.D.; Ewart, S.M.; Jones, C.E.
Westinghouse Idaho Nuclear Company, Inc. (WINCO) developed and implemented a computerized hazardous chemical tracking system, referred to as Haz-Trac, for use at the Idaho Chemical Processing Plant (ICPP). Haz-Trac is designed to improve the accuracy and reliability of chemical information, which enhances the overall quality and safety of ICPP operations. The system tracks all chemicals and chemical components from the time they enter the ICPP until the chemical changes form, is used, or becomes a waste. The system runs on a Hewlett-Packard (HP) 3000 Series 70 computer. It is written in COBOL and uses VIEW/3000, TurboIMAGE/DBMS 3000, OMNIDEX, and SPEEDWARE. The HP 3000 may be accessed throughout the ICPP, and from remote locations, using data communication lines. Haz-Trac went into production in October 1989. Currently, over 1910 chemicals and chemical components are tracked on the system. More than 2500 personnel hours were saved during the first six months of operation. Cost savings have been realized by reducing the time needed to collect and compile reporting information, identifying and disposing of unneeded chemicals, and eliminating duplicate inventories. Haz-Trac maintains information required by the Superfund Amendments and Reauthorization Act (SARA), the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and the Occupational Safety and Health Administration (OSHA).
Description of program or function: SLIB77 is a source librarian program designed to maintain FORTRAN source code in compressed form on magnetic disk. The program was prepared to meet program-maintenance requirements for ongoing development and continual improvement of very large programs involving many programmers from a number of different organizations. SLIB77 automatically maintains in one file the source of the current program as well as all previous modifications. Although written originally for FORTRAN programs, SLIB77 is suitable for use with data files, text files, operating systems, and other programming languages, such as Ada, C, and COBOL. It can handle libraries with records of up to 160 characters. Records are grouped into DECKs and assigned deck names by the user. SLIB77 assigns a number to each record in each DECK. Records can be deleted or restored singly or as a group within each deck. Modification records are grouped and assigned modification identification names by the user. The program assigns numbers to each new record within the deck. The program has two modes of execution, BATCH and EDIT. The BATCH mode is controlled by an input file and is used to make changes permanent and create new library files. The EDIT mode is controlled by interactive terminal input, and a built-in line editor is used for modification of single decks. Transferring a library from one computer system to another is accomplished using a Portable Library File created by SLIB77 in a BATCH run.
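The deck model described above can be sketched in a few lines. This Python sketch is a simplification (SLIB77's actual record numbering and modification-ID mechanics are more involved): records are grouped into a named deck, each record is numbered, and deletions are tracked per record so they can be restored and the current source always reproduced:

```python
class Deck:
    """A named group of numbered source records, with soft deletion."""

    def __init__(self, name):
        self.name = name
        self.records = []  # each record: [number, text, deleted-flag]

    def add(self, text):
        self.records.append([len(self.records) + 1, text, False])

    def delete(self, first, last):
        """Mark records in the numbered range as deleted (reversible)."""
        for rec in self.records:
            if first <= rec[0] <= last:
                rec[2] = True

    def restore(self, first, last):
        """Undo a deletion over the numbered range."""
        for rec in self.records:
            if first <= rec[0] <= last:
                rec[2] = False

    def current(self):
        """The active source: undeleted records, in order."""
        return [text for _, text, deleted in self.records if not deleted]

deck = Deck("MAIN")
deck.add("      PROGRAM DEMO")
deck.add("      PRINT *, 'OLD'")
deck.add("      END")
deck.delete(2, 2)            # delete record 2
print(deck.current())        # record 2 is gone; records 1 and 3 remain
```

Because deletion only sets a flag, the full modification history stays in the file, which is what lets a librarian like SLIB77 reconstruct any earlier state of the program.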
Strand, R.; Cox, T.L.; Sjoreen, A.; Alvic, D.
The National Center for Toxicological Research (NCTR) is the basic research arm of the US Food and Drug Administration (FDA). The NCTR has upgraded and standardized its computer operations on Digital Equipment Corporation VAX minicomputers using Software AG's ADABAS data base management system for all research applications. The NCTR is currently performing a large study to improve the functionality of the animal husbandry systems and applications called Breeding/Multigeneration Support System (BMSS). When functional, it will operate on VAX equipment using the ADABAS data base management system, TDMS, and COBOL. Oak Ridge National Laboratory (ORNL) is supporting NCTR in the design, prototyping, and software engineering of the BMSS. This document summarizes the internal design elements that include data structures, file structures, and system attributes that were required to facilitate the decision support requirements defined in the external design work. Prototype pseudocode then was developed for the recommended system attributes and file and data structures. Finally, ORNL described the processing requirements including the initial access of the BMSS, integration of the existing INLIFE system and the STUDY DEFINITION system under development, data system initialization and maintenance, and BMSS testing and verification. This document describes ORNL's recommendations for the internal design of the BMSS. ORNL will provide research support to NCTR in the additional phases of systems life cycle development for BMSS. ORNL has prepared this document according to NCTR's Standard Operating Procedures for Systems Development. 6 figs., 5 tabs.
An accurate flowchart is an important part of the documentation for any computer program. The flowchart offers the user an easy-to-follow overview of program operation, and offers the maintenance programmer an effective debugging tool. The TAMU FLOWCHART System was developed to flowchart any program written in the FORTRAN language. It generates a line-printer flowchart which is representative of the program logic. This flowchart provides the user with a detailed representation of the program action taken as each program statement is executed. The TAMU FLOWCHART System should prove to be a valuable aid to groups working with complex FORTRAN programs. Each statement in the program is displayed within a symbol which represents the program action during processing of the enclosed statement. Symbols available include: subroutine, function, and entry statements; arithmetic statements; input and output statements; arithmetic and logical IF statements; subroutine calls with or without argument-list returns; computed and assigned GO TO statements; DO statements; STOP and RETURN statements; and CONTINUE and ASSIGN statements. Comment cards within the source program may be suppressed, or displayed and associated with a succeeding source statement. Each symbol is annotated with a label (if present in the source code), a block number, and the statement sequence number. Program flow and options within the program are represented by line segments and direction indicators connecting symbols. The TAMU FLOWCHART System should be able to accurately flowchart any working FORTRAN program. The program is written in COBOL for batch execution and has been implemented on an IBM 370 series computer with an OS operating system and a central-memory requirement of approximately 380K 8-bit bytes. The TAMU FLOWCHART System was developed in 1977.
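The statement-to-symbol mapping described above can be sketched as a simple classifier. This toy Python version (the rules are illustrative and far cruder than the real system's) maps a FORTRAN statement to the flowchart symbol it would be drawn in:

```python
def symbol_for(stmt: str) -> str:
    """Pick a flowchart symbol for one FORTRAN statement (toy rules)."""
    s = stmt.strip().upper()
    if s.startswith(("SUBROUTINE", "FUNCTION", "ENTRY", "STOP", "RETURN")):
        return "terminal"
    if s.startswith(("READ", "WRITE", "PRINT")):
        return "input/output"
    if s.startswith("IF"):
        return "decision"
    if s.startswith(("GO TO", "GOTO")):
        return "connector"
    if s.startswith("DO "):         # trailing space avoids e.g. DOUBLE
        return "loop"
    if "=" in s:                    # assignment
        return "process"
    return "other"

for line in ["X = Y + 1", "IF (X .GT. 0) GO TO 10", "WRITE (6,*) X", "STOP"]:
    print(line, "->", symbol_for(line))
```

A real flowcharter works from parsed statements rather than string prefixes, but the rule-table shape is the same: one symbol category per statement class, applied statement by statement.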