WorldWideScience

Sample records for lll cobol compiler

  1. Report printer (COBOL IRIS 50)

    International Nuclear Information System (INIS)

    Paul, Daniele

    1973-10-01

    The research thesis reports a detailed study of the Report Writer feature of the COBOL language, undertaken in order to integrate it into the IRIS 50 COBOL compiler. To reuse existing compiler processing, the author developed a simulation of the Report Writer using COBOL statements generated in the declarative part of the Procedure Division. After a brief presentation of the IRIS 50 computer, the author presents the general plan of the compiler, with the modifications and additions due exclusively to the Report Writer. The final part addresses the practical implementation and the problems encountered and solved along the way.

  2. Programming in COBOL

    CERN Document Server

    Lancaster, G T

    2014-01-01

    Programming in COBOL is a simple yet concise how-to book that teaches the programming language in a short yet effective step-by-step manner, which can be easily understood by anyone with sufficient knowledge in information technology. Covering first the advantages of COBOL over other programming languages, the book discusses COBOL's divisions - identification, environment, procedure, and data, and then describes the testing of the COBOL source programs and program questions. The book is valuable for those who wish to learn basic COBOL language, but do not have the time to take manufacturers' o

  3. Cobol for students

    CERN Document Server

    Parkin, Andrew

    1995-01-01

    COBOL for Students has established itself as one of the most successful teaching texts on COBOL programming and is now in its fourth edition. The first part of the book concentrates on the fundamentals of the language and takes students to the point where they can write modestly sized programs using sequential files. Part two assumes competence in elementary COBOL and explains design and other programming techniques which should be part of the professional programmer's repertoire. Part three extends the student's knowledge of the language by explaining some of the more advanced features of COBOL…

  4. Cobol software modernization

    CERN Document Server

    Barbier, Franck

    2015-01-01

    Nowadays, billions of lines of code are written in the COBOL programming language. This book offers an analysis, a diagnosis, a strategy, an MDD (model-driven development) method and a tool to transform legacy COBOL into modernized applications that comply with Internet computing, Service-Oriented Architecture (SOA) and the Cloud. It serves as a blueprint for those in charge of finding solutions to this considerable challenge.

  5. GUI and Object Oriented Programming in COBOL.

    Science.gov (United States)

    Lorents, Alden C.

    Various schools are struggling with the introduction of Object Oriented (OO) programming concepts and GUI (graphical user interfaces) within the traditional COBOL sequence. OO programming has been introduced in some of the curricula with languages such as C++, Smalltalk, and Java. Introducing OO programming into a typical COBOL sequence presents…

  6. Transforming Cobol Legacy Software to a Generic Imperative Model

    National Research Council Canada - National Science Library

    Moraes, Dina L.

    1999-01-01

    .... This research develops a transformation system to convert COBOL code into a generic imperative model, recapturing the initial design and deciphering the requirements implemented by the legacy code...

  7. A brief description and comparison of programming languages FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 from a critical standpoint

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Several common higher level program languages are described. FORTRAN, ALGOL, COBOL, PL/1, and LISP 1.5 are summarized and compared. FORTRAN is the most widely used scientific programming language. ALGOL is a more powerful language for scientific programming. COBOL is used for most commercial programming applications. LISP 1.5 is primarily a list-processing language. PL/1 attempts to combine the desirable features of FORTRAN, ALGOL, and COBOL into a single language.

  8. LPTR irradiation of LLL vanadium tensile specimens and LLL Nb-1Zr tensile specimens

    International Nuclear Information System (INIS)

    MacLean, S.C.; Rowe, C.L.

    1977-01-01

    The LPTR irradiation of 14 LLL vanadium tensile specimens and 14 LLL Nb-1Zr tensile specimens is described. Sample packaging, the irradiation schedule and neutron fluences for three energy ranges are given

  9. N-Dimensional LLL Reduction Algorithm with Pivoted Reflection

    Directory of Open Access Journals (Sweden)

    Zhongliang Deng

    2018-01-01

    The Lenstra-Lenstra-Lovász (LLL) lattice reduction algorithm and many of its variants have been widely used in cryptography, multiple-input-multiple-output (MIMO) communication systems and carrier phase positioning in global navigation satellite systems (GNSS) to solve the integer least squares (ILS) problem. In this paper, we propose an n-dimensional LLL reduction algorithm (n-LLL), expanding the Lovász condition in the LLL algorithm to n-dimensional space in order to obtain a further reduced basis. We also introduce pivoted Householder reflection into the algorithm to optimize the reduction time. For an m-order positive definite matrix, analysis shows that the n-LLL reduction algorithm will converge within finitely many steps and always produce better results than the original LLL reduction algorithm for n > 2. The simulations clearly show that n-LLL is better than the original LLL in reducing the condition number of an ill-conditioned input matrix, with a 39% improvement on average for typical cases, which can significantly reduce the search space for solving the ILS problem. The simulation results also show that the pivoted reflection significantly reduces the number of swaps in the algorithm, by 57%, making n-LLL a more practical reduction algorithm.
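
    For readers unfamiliar with the baseline being improved here, the sketch below is a minimal, textbook implementation of classical LLL reduction (size reduction plus the Lovász condition with the usual delta = 3/4), written in Python purely for illustration; it is not the paper's n-LLL variant or its pivoted Householder reflection, and the function names are ours.

      from fractions import Fraction

      def gram_schmidt(basis):
          """Exact Gram-Schmidt orthogonalization; returns (B*, mu)."""
          n, dim = len(basis), len(basis[0])
          bstar = []
          mu = [[Fraction(0)] * n for _ in range(n)]
          for i in range(n):
              v = [Fraction(x) for x in basis[i]]
              for j in range(i):
                  num = sum(Fraction(basis[i][k]) * bstar[j][k] for k in range(dim))
                  den = sum(bstar[j][k] * bstar[j][k] for k in range(dim))
                  mu[i][j] = num / den
                  v = [v[k] - mu[i][j] * bstar[j][k] for k in range(dim)]
              bstar.append(v)
          return bstar, mu

      def lll_reduce(basis, delta=Fraction(3, 4)):
          """Classical LLL reduction of an integer basis (list of integer vectors)."""
          basis = [list(map(int, b)) for b in basis]
          n, k = len(basis), 1
          while k < n:
              bstar, mu = gram_schmidt(basis)
              for j in range(k - 1, -1, -1):          # size-reduce b_k against earlier vectors
                  q = round(mu[k][j])
                  if q:
                      basis[k] = [basis[k][i] - q * basis[j][i] for i in range(len(basis[k]))]
              bstar, mu = gram_schmidt(basis)          # refresh after size reduction
              sq = lambda v: sum(x * x for x in v)
              # Lovász condition: ||b*_k||^2 >= (delta - mu_{k,k-1}^2) * ||b*_{k-1}||^2
              if sq(bstar[k]) >= (delta - mu[k][k - 1] ** 2) * sq(bstar[k - 1]):
                  k += 1
              else:
                  basis[k], basis[k - 1] = basis[k - 1], basis[k]
                  k = max(k - 1, 1)
          return basis

      # Reduce a small 3-dimensional basis.
      print(lll_reduce([[1, 1, 1], [-1, 0, 2], [3, 5, 6]]))

    Recomputing the Gram-Schmidt data on every pass, as done here for clarity, is exactly the kind of cost that more practical variants avoid by maintaining it incrementally.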

  10. Segment LLL Reduction of Lattice Bases Using Modular Arithmetic

    Directory of Open Access Journals (Sweden)

    Sanjay Mehrotra

    2010-07-01

    The algorithm of Lenstra, Lenstra, and Lovász (LLL) transforms a given integer lattice basis into a reduced basis. Storjohann improved the worst-case complexity of LLL algorithms by a factor of O(n) using modular arithmetic. Koy and Schnorr developed a segment-LLL basis reduction algorithm that generates a lattice basis satisfying a weaker condition than the LLL-reduced basis, with an O(n) improvement over the LLL algorithm. In this paper we combine Storjohann's modular arithmetic approach with the segment-LLL approach to further improve the worst-case complexity of the segment-LLL algorithms by a factor of n^0.5.

  11. Geodesic continued fractions and LLL

    NARCIS (Netherlands)

    Beukers, F

    2014-01-01

    We discuss a proposal for a continued fraction-like algorithm to determine simultaneous rational approximations to d real numbers α1,…,αd. It combines an algorithm of Hermite and Lagarias with ideas from LLL-reduction. We dynamically LLL-reduce a quadratic form with parameter t as t↓0.

  12. Laser fusion experiments at LLL

    Energy Technology Data Exchange (ETDEWEB)

    Ahlstrom, H.G.

    1980-06-16

    These notes present the experimental basis and status for laser fusion as developed at LLL. Two other chapters, one authored by K.A. Brueckner and the other by C. Max, present the theoretical implosion physics and laser plasma interaction physics. The notes consist of six sections. The first is an introductory section which provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future.

  13. Laser fusion experiments at LLL

    International Nuclear Information System (INIS)

    Ahlstrom, H.G.

    1980-01-01

    These notes present the experimental basis and status for laser fusion as developed at LLL. Two other chapters, one authored by K.A. Brueckner and the other by C. Max, present the theoretical implosion physics and laser plasma interaction physics. The notes consist of six sections. The first is an introductory section which provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future

  14. DT fusion neutron irradiation of LLL Nb3Sn and LLL superconductor wires at 4.2 °K

    International Nuclear Information System (INIS)

    MacLean, S.C.

    1977-01-01

    The DT fusion neutron irradiation of one LLL superconductor wire and one LLL Nb3Sn foil at 4.2 °K is described. The sample position, beam-on time, and neutron dose record are given. The results from two "profile" dosimetry foils measuring the lateral variation in neutron flux are included.

  15. OASIS: a COBOL-11 menu-driven information system

    International Nuclear Information System (INIS)

    Lee, W.F. Jr.

    1982-01-01

    Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS) is a near real-time nuclear materials/precious metals safeguard and accountability control system. Using COBOL and RSTS/E on a dedicated 11/34, the system performs on-line inventory update, inquiry and report functions. Processed transactions consisting of intra-laboratory movements, on-site receipts and off-site shipments are maintained for inquiry and report preparation. A secure, controlled but friendly user environment is maintained by chaining between menu and data manipulation tasks. The use of menus, security and access control, screen manipulation, file access and contention, word processing activities, task size problems and other aspects of this application will be discussed

  16. On the theoretical link between LLL-reduction and Lambda-decorrelation

    Science.gov (United States)

    Lannes, A.

    2013-04-01

    The LLL algorithm, introduced by Lenstra et al. (Math Ann 261:515-534, 1982), plays a key role in many fields of applied mathematics. In particular, it is used as an effective numerical tool for preconditioning the integer least-squares problems arising in high-precision geodetic positioning and Global Navigation Satellite Systems (GNSS). In 1992, Teunissen developed a method for solving these nearest-lattice point (NLP) problems. This method is referred to as Lambda (for Least-squares AMBiguity Decorrelation Adjustment). The preconditioning stage of Lambda corresponds to its decorrelation algorithm. From an epistemological point of view, the latter was devised through an innovative statistical approach completely independent of the LLL algorithm. Recent papers pointed out some similarities between the LLL algorithm and the Lambda-decorrelation algorithm. We try to clarify this point in the paper. We first introduce a parameter measuring the orthogonality defect of the integer basis in which the NLP problem is solved, the LLL-reduced basis of the LLL algorithm, or the Λ-basis of the Lambda method. With regard to this problem, the potential qualities of these bases can then be compared. The Λ-basis is built by working at the level of the variance-covariance matrix of the float solution, while the LLL-reduced basis is built by working at the level of its inverse. As a general rule, the orthogonality defect of the Λ-basis is greater than that of the corresponding LLL-reduced basis; these bases are however very close to one another. To specify this tight relationship, we present a method that provides the dual LLL-reduced basis of a given Λ-basis. As a consequence of this basic link, all the recent developments made on the LLL algorithm can be applied to the Lambda-decorrelation algorithm. This point is illustrated in a concrete manner: we present a parallel Λ-type decorrelation algorithm derived from the parallel LLL algorithm of Luo and Qiao (Proceedings of
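
    As a small illustration of the kind of quantity the abstract refers to, the following Python sketch computes the textbook orthogonality defect of a basis (product of basis-vector norms divided by the lattice volume); the paper's own measure may differ in detail, and the helper name is ours.

      import numpy as np

      def orthogonality_defect(B):
          """Textbook orthogonality defect; columns of B are the basis vectors.

          defect(B) = (prod_i ||b_i||) / sqrt(det(B^T B)); it equals 1 for an
          orthogonal basis and grows as the basis becomes more skewed.
          """
          B = np.asarray(B, dtype=float)
          col_norms = np.linalg.norm(B, axis=0)
          volume = np.sqrt(np.linalg.det(B.T @ B))
          return float(np.prod(col_norms) / volume)

      # A skewed basis has a large defect; an LLL- or Lambda-style reduction of
      # the same lattice should bring the defect back toward 1.
      skewed = [[1, 201],
                [0, 200]]          # columns (1, 0) and (201, 200)
      print(orthogonality_defect(skewed))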

  17. Hydrogeochemical and stream-sediment reconnaissance program at LLL

    International Nuclear Information System (INIS)

    Tinney, J.F.

    1977-03-01

    The Lawrence Livermore Laboratory (LLL) is conducting a Hydrogeochemical and Stream-Sediment Reconnaissance (HSSR) survey in support of ERDA's National Uranium Resource Evaluation (NURE) program. Included in the LLL portion of this survey are seven western states (Arizona, California, Idaho, Nevada, Oregon, Utah, and Washington). Similar surveys are being carried out in the rest of the continental United States, including Alaska, as part of a systematic nationwide study of the distribution of uranium in surface water, groundwater, and stream sediment. The overall objective is to identify favorable areas for uranium exploration. This paper describes the program being conducted by LLL to complete our portion of the survey by 1981. The topics discussed are geology and sample acquisition, sample preparation and analysis, and data-base management

  18. A verified LLL algorithm

    NARCIS (Netherlands)

    Divasón, Jose; Joosten, Sebastiaan; Thiemann, René; Yamada, Akihisa

    2018-01-01

    The Lenstra-Lenstra-Lovász basis reduction algorithm, also known as LLL algorithm, is an algorithm to find a basis with short, nearly orthogonal vectors of an integer lattice. Thereby, it can also be seen as an approximation to solve the shortest vector problem (SVP), which is an NP-hard problem,

  19. Achievements and developments at LLL

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    In discussing Laboratory trends, the Director cited numerous examples of progress recorded during the past year. This summary highlights some of the programmatic and research milestones that marked 1976 and that represent the diversity of efforts carried out at LLL. 6 figures

  20. The LLL algorithm: survey and applications

    CERN Document Server

    Nguyen, Phong Q

    2010-01-01

    The first book to offer a comprehensive view of the LLL algorithm, this text surveys computational aspects of Euclidean lattices and their main applications. It includes many detailed motivations, explanations and examples.

  1. What Have We Learnt about Mobile LifeLong Learning (mLLL)?

    Science.gov (United States)

    Seta, Luciano; Kukulska-Hulme, Agnes; Arrigo, Marco

    2014-01-01

    Mobile technologies are becoming ubiquitous in education, yet the wider implications of this phenomenon are not well understood. The paper discusses how mobile lifelong learning (mLLL) may be defined, and the challenges of forging a suitable definition in an ever-shifting technological and socio-economic landscape. mLLL appears as a ubiquitous…

  2. Why 1.02? The root Hermite factor of LLL and stochastic sandpile models

    OpenAIRE

    Ding, Jintai; Kim, Seungki; Takagi, Tsuyoshi; Wang, Yuntao

    2018-01-01

    In lattice-based cryptography, a disturbing and puzzling fact is that there is a conspicuous gap between the actual performance of LLL and what can be said about it theoretically. To date, no plausible mathematical explanation has been proposed. In this paper, we provide compelling evidence that LLL behaves essentially identically to a certain stochastic variant of the sandpile model that we introduce. This allows us to explain many observations on the LLL algorithm that have so far been...

  3. The history of the LLL-algorithm

    NARCIS (Netherlands)

    Smeets, I.; Lenstra, A.; Lenstra, H.; Lovász, L.; van Emde Boas, P.

    2010-01-01

    The 25th birthday of the LLL-algorithm was celebrated in Caen from 29th June to 1st July 2007. The three day conference kicked off with a historical session of four talks about the origins of the algorithm. The speakers were the three L’s and close bystander Peter van Emde Boas. These were the

  4. HISTORY OF ANAESTHESIA IN SOUTH AFRICA

    African Journals Online (AJOL)

    family general practitioner who is always in the picture. HISTORY … gradually increased up to 125 minims daily of laudanum. This gives … about eighteen minutes, most nobly, scarcely uttering a word or … could recall was that of a confused, uneasy …

  5. Biologic activity of the novel small molecule STAT3 inhibitor LLL12 against canine osteosarcoma cell lines

    Directory of Open Access Journals (Sweden)

    Couto Jason I

    2012-12-01

    Background: STAT3 has been shown to be dysregulated in nearly every major cancer, including osteosarcoma (OS). Constitutive activation of STAT3, via aberrant phosphorylation, leads to proliferation, cell survival and resistance to apoptosis. The present study sought to characterize the biologic activity of a novel allosteric STAT3 inhibitor, LLL12, in canine OS cell lines. Results: We evaluated the effects of LLL12 treatment on 4 canine OS cell lines and found that LLL12 inhibited proliferation, induced apoptosis, reduced STAT3 phosphorylation, and decreased the expression of several transcriptional targets of STAT3 in these cells. Lastly, LLL12 exhibited synergistic anti-proliferative activity with the chemotherapeutic doxorubicin in the OS lines. Conclusion: LLL12 exhibits biologic activity against canine OS cell lines through inhibition of STAT3-related cellular functions, supporting its potential use as a novel therapy for OS.

  6. Second order statistical behavior of LLL and BKZ

    NARCIS (Netherlands)

    Y. Yu (Yang); L. Ducas (Léo)

    2017-01-01

    textabstractThe LLL algorithm (from Lenstra, Lenstra and Lovász) and its generalization BKZ (from Schnorr and Euchner) are widely used in cryptanalysis, especially for lattice-based cryptography. Precisely understanding their behavior is crucial for deriving appropriate key-size for cryptographic

  7. LLL DBASE glossary and parameter definitions, Part 1

    International Nuclear Information System (INIS)

    Rohrer, R.F.

    1975-01-01

    This report lists, defines, and updates parameters in DBASE, an LLL test effects data bank in which data is stored from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for classified parameters

  8. The small molecule, LLL12, inhibits STAT3 phosphorylation and induces apoptosis in medulloblastoma and glioblastoma cells.

    Directory of Open Access Journals (Sweden)

    Sarah Ball

    Tumors of the central nervous system represent a major source of cancer-related deaths, with medulloblastoma and glioblastoma being the most common malignant brain tumors in children and adults respectively. While significant advances in treatment have been made, with the 5-year survival rate for medulloblastoma at 70-80%, treating patients under 3 years of age still poses a problem due to the deleterious effects of radiation on the developing brain, and the median survival for patients with glioblastoma is only 15 months. The transcription factor STAT3 has been found constitutively activated in a wide variety of cancers, and in recent years it has become an attractive therapeutic target. We designed a non-peptide small molecule STAT3 inhibitor, LLL12, using structure-based design. LLL12 was able to inhibit STAT3 phosphorylation, decrease cell viability and induce apoptosis in medulloblastoma and glioblastoma cell lines with elevated levels of p-STAT3 (Y705). IC50 values for LLL12 were found to be between 1.07 µM and 5.98 µM in the five cell lines expressing phosphorylated STAT3. STAT3 target genes were found to be downregulated and a decrease in STAT3 DNA binding was observed following LLL12 treatment, indicating that LLL12 is an effective STAT3 inhibitor. LLL12 was also able to inhibit colony formation and wound healing, and decreased IL-6 and LIF secretion. Our results suggest that LLL12 is a potent STAT3 inhibitor and that it may be a potential therapeutic treatment for medulloblastoma and glioblastoma.

  9. A New Block Processing Algorithm of LLL for Fast High-dimension Ambiguity Resolution

    Directory of Open Access Journals (Sweden)

    LIU Wanke

    2016-02-01

    Due to the high dimension and precision of the ambiguity vector under multi-frequency, multi-system GNSS observations, a major limit on the computational efficiency of ambiguity resolution is the long reduction time of the conventional LLL algorithm. To address this problem, a new block processing LLL algorithm is proposed, based on an analysis of the relationship between the reduction time and the dimension and precision of the ambiguity. The new algorithm shortens the reduction time, and thereby improves the computational efficiency of ambiguity resolution, by block processing the ambiguity variance-covariance matrix, which decreases the dimension of each reduction matrix. The new algorithm is validated with two groups of measured data. The results show that, with a reasonable number of blocks, the computational efficiency of the new algorithm increased by 65.2% and 60.2% respectively compared with that of the LLL algorithm.

  10. LLL hydrodiagnostics upgrade program

    International Nuclear Information System (INIS)

    Carpluk, G.T.; Innes, T.G.; Trost, S.R.

    1978-01-01

    The continued success of the nuclear weapon design and test programs is a direct result of the effective hydrodynamic programs developed at Livermore. Hydrodynamic studies provide the weapon designer with experimental verification and analysis of weapon designs through the use of explosively driven, nonfissile materials. Hydrodiagnostic techniques include (1) flash radiography; (2) high-speed photography; (3) electrical-pin diagnostics; and (4) interferometry. The focus of all hydrodynamic testing at LLL is the Site 300 Explosive Test Area located 18 miles east of Livermore. This remote, 10-mi 2 facility consists of: (1) areas for machining and assembling explosives; (2) environmental test facilities; (3) administrative and support facilities; and (4) five underground, reinforced-concrete bunkers equipped with high-speed cameras, electrical data-acquisition systems, and electron accelerators for flash radiography. The report summarizes the improvements made in radiography, optics, electronics, and interferometry, and indicates the projects that remain to be completed

  11. US NRC/LLL liaison with the Federal Republic of Germany for the GKSS-PSS steam condensation tests. Progress report No. 1

    International Nuclear Information System (INIS)

    McCauley, E.W.

    1979-01-01

    This progress report for the USNRC/LLL liaison program with the Federal Republic of Germany, regarding the boiling water reactor containment multivent steam condensation tests being conducted by GKSS (Gesellschaft fuer Kernenergieverwertung in Schiffbau und Schiffahrt), addresses program activity during the period of May-June 1979. During this period the program scope was defined, initial contacts between LLL and GKSS were established, and the first trip by an LLL representative to the GKSS test facility was taken.

  12. Mass spectrometer data system at LLL

    International Nuclear Information System (INIS)

    Friesen, R.D.

    1975-01-01

    The data systems on the three mass spectrometers at LLL are computer-controlled, pulse-counting systems synchronized to a repeatedly-swept magnetic field. The data are accumulated in the memory of the computer or in a Nuclear Data ND 180 in a multi-scaler mode of operation. This mode of sweeping allows a continuous check of the background stability and makes tune-up easier. But the main benefit is a reduction in the required ion emission rate stability. By the use of standards to set the system dead time, we have been able to utilize the sensitivity of a pulse counting system without the expense of exotic equipment

  13. Secondary School Students' LLL Competencies, and Their Relation with Classroom Structure and Achievement.

    Science.gov (United States)

    Klug, Julia; Lüftenegger, Marko; Bergsmann, Evelyn; Spiel, Christiane; Schober, Barbara

    2016-01-01

    There is a strong urge to foster lifelong learning (LLL) competencies with its key components - motivation and self-regulated learning - from early on in the education system. School in general is presently not considered to be successful in systematically imparting motivation and self-regulated learning strategies. There is strong evidence that decisive motivational determinants decrease the longer students stay in school. At present, the central sources of information about the situation in Austria are international monitoring studies, which only examine selected aspects of specific target groups, and their interpretability concerning mean values is constricted due to cultural differences. Thus, it is important to conduct additional and more differentiated national surveys of the actual state. This is why this study aimed at answering the following questions: (1) how well are Austrian students equipped for the future, in terms of their lifelong learning competencies, (2) can perceived classroom structure predict students' LLL, and (3) is there a correlation of students' LLL with their achievement in the school subjects math and German language. 5366 students (52.1% female) from 36 Austrian schools took part in the online-questionnaire (mean age 15.35 years, SD = 2.45), which measured their perceived LLL competencies in the subjects math and German language, their perceived classroom structure and their achievement. Results showed that the great majority of Austrian students - independent from domain and sex - know and are able to apply cognitive as well as metacognitive learning strategies. With regard to motivation the picture is less satisfactory: whilst students' self-efficacy is not the problem, there is a lack of interest in the school subjects and they often report to follow performance approach goals. Classroom structure positively predicted students' goals, interest, self-efficacy and learning strategies. Self-efficacy, performance approach goals, meta

  14. Secondary school students' LLL competencies, and their relation with classroom structure and achievement

    Directory of Open Access Journals (Sweden)

    Julia Klug

    2016-05-01

    There is a strong urge to foster lifelong learning (LLL) competencies with its key components - motivation and self-regulated learning - from early on in the education system. School in general is presently not considered to be successful in systematically imparting motivation and self-regulated learning strategies. There is strong evidence that decisive motivational determinants decrease the longer students stay in school. At present, the central sources of information about the situation in Austria are international monitoring studies, which only examine selected aspects of specific target groups, and their interpretability concerning mean values is constricted due to cultural differences. Thus, it is important to conduct additional and more differentiated national surveys of the actual state. This is why this study aimed at answering the following questions: (1) how well are Austrian students equipped for the future, in terms of their lifelong learning competencies, (2) can perceived classroom structure predict students' LLL, and (3) is there a correlation of students' LLL with their achievement in the school subjects math and German language. 5366 students (52.1% female) from 36 Austrian schools took part in the online-questionnaire (mean age 15.35 years, SD = 2.45), which measured their perceived LLL competencies in the subjects math and German language, their perceived classroom structure and their achievement. Results showed that the great majority of Austrian students - independent from domain and sex - know and are able to apply cognitive as well as metacognitive learning strategies. With regard to motivation the picture is less satisfactory: whilst students' self-efficacy is not the problem, there is a lack of interest in the school subjects and they often report to follow performance approach goals. Classroom structure positively predicted students' goals, interest, self-efficacy and learning strategies. Self

  15. Secondary School Students’ LLL Competencies, and Their Relation with Classroom Structure and Achievement

    Science.gov (United States)

    Klug, Julia; Lüftenegger, Marko; Bergsmann, Evelyn; Spiel, Christiane; Schober, Barbara

    2016-01-01

    There is a strong urge to foster lifelong learning (LLL) competencies with its key components – motivation and self-regulated learning – from early on in the education system. School in general is presently not considered to be successful in systematically imparting motivation and self-regulated learning strategies. There is strong evidence that decisive motivational determinants decrease the longer students stay in school. At present, the central sources of information about the situation in Austria are international monitoring studies, which only examine selected aspects of specific target groups, and their interpretability concerning mean values is constricted due to cultural differences. Thus, it is important to conduct additional and more differentiated national surveys of the actual state. This is why this study aimed at answering the following questions: (1) how well are Austrian students equipped for the future, in terms of their lifelong learning competencies, (2) can perceived classroom structure predict students’ LLL, and (3) is there a correlation of students’ LLL with their achievement in the school subjects math and German language. 5366 students (52.1% female) from 36 Austrian schools took part in the online-questionnaire (mean age 15.35 years, SD = 2.45), which measured their perceived LLL competencies in the subjects math and German language, their perceived classroom structure and their achievement. Results showed that the great majority of Austrian students – independent from domain and sex – know and are able to apply cognitive as well as metacognitive learning strategies. With regard to motivation the picture is less satisfactory: whilst students’ self-efficacy is not the problem, there is a lack of interest in the school subjects and they often report to follow performance approach goals. Classroom structure positively predicted students’ goals, interest, self-efficacy and learning strategies. Self-efficacy, performance

  16. Status of the LBL/LLL development program

    International Nuclear Information System (INIS)

    Berkner, K.H.; Cooper, W.S.; Ehlers, K.W.; Pyle, R.V.; Hooper, E.B. Jr.

    1977-11-01

    The status and near-term goals of the LBL/LLL neutral-beam-development program are described. The emphasis is on the technology of systems based on the acceleration and neutralization of positive ions; this approach will be used in the near term, probably through 1985 at least. For more efficient injection, part of the plan is to develop a negative-ion approach suitable for 200- to 400-kV injectors on confinement experiments in the 1985 to 1990 period. However, the negative-ion based program is still very much in the research phase, and it is difficult to project how it will phase into fusion reactor fueling experiments

  17. Status of the LBL/LLL development program

    International Nuclear Information System (INIS)

    Berkner, K.H.; Cooper, W.S.; Ehlers, K.W.; Pyle, R.V.; Hooper, E.B. Jr.

    1978-01-01

    The status and near-term goals of the LBL/LLL neutral-beam-development program are described. The emphasis in this paper is on the technology of systems based on the acceleration and neutralization of positive ions; this approach will be used in the near term, probably through 1985 at least. For more efficient injection, part of our plan is to develop a negative-ion approach suitable for 200- to 400-kV injectors on confinement experiments in the 1985 to 1990 period. However, the negative-ion based program is still very much in the research phase, and it is difficult to project how it will phase into fusion reactor fueling experiments

  18. Computerized mass spectrometer data system at LLL

    International Nuclear Information System (INIS)

    Friesen, R.D.; Dupzyk, R.J.

    1976-01-01

    The data systems on the three mass spectrometers at LLL are computer-controlled, pulse-counting systems synchronized to a repeatedly swept magnetic field. The data are accumulated in the memory of the computer or in a Nuclear Data ND 180 in a multi-scaler mode of operation. This mode of data acquisition allows a continuous check of the background stability and makes tune-up easier. But the main benefit is a reduction in the required ion emission rate stability. By the use of standards to set the system dead time, we have been able to utilize the sensitivity of a pulse counting system without the expense of exotic equipment

  19. Laser damage testing at LLL: an overview and an update

    International Nuclear Information System (INIS)

    Milam, D.; Lowdermilk, W.H.; Wirtenson, G.R.

    1978-01-01

    Damage thresholds for single layers of common coating materials such as MgF2, SiO2, ThF4, Al2O3, ZrO2, and TiO2 are given. Laser-induced damage of coated and uncoated optically polished surfaces has been studied at LLL for laser pulsewidths between 0.17 ns and 3 ns. Two 1064 nm Nd lasers generated this range of pulsewidths. This report contains a review of the results.

  20. LLL magnetic fusion energy program: an overview

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    Over the last 12 months, significant progress has been made in the LLL magnetic fusion energy program. In the 2XIIB experiment, a tenfold improvement was achieved in the plasma confinement factor (the product of plasma density and confinement time), plasma temperature and pressure were pushed to values never before reached in a magnetic fusion experiment, and plasma startup by neutral beam injection was demonstrated for the first time. A new laser-pellet startup technique for Baseball IIT has been successfully tested and is now being incorporated in the experiment. Technological improvements have been realized, such as a breakthrough in fabricating niobium-tin conductors for superconducting magnets. These successes, together with complementary progress in theory and reactor design, have led to a proposal to build the MX facility, which could be on the threshold of a mirror fusion reactor.

  1. Role of computers in quality assurance in the LLL Criticality Safety Program

    International Nuclear Information System (INIS)

    Koponen, B.L.

    1978-01-01

    Some of the aspects of computational criticality safety quality assurance that have been emphasized in recent years at LLL are summarized. In particular, the discussion covers computer code changes that have been made to help the criticality analyst reduce the number of errors that he makes and to locate those that he does make, and how a computerized ''benchmark'' data base aids him in the validation of his computational methods.

  2. Lepton flavor violating decays τ→lll and μ→eγ in the Higgs triplet model

    International Nuclear Information System (INIS)

    Akeroyd, A. G.; Aoki, Mayumi; Sugiyama, Hiroaki

    2009-01-01

    Singly and doubly charged Higgs bosons in the Higgs triplet model mediate the lepton flavor violating (LFV) decays τ→lll and μ→eγ. The lepton flavor violating decay rates are proportional to products of two triplet Yukawa couplings (h_ij) which can be expressed in terms of the parameters of the neutrino mass matrix and an unknown triplet vacuum expectation value. We determine the parameter space of the neutrino mass matrix in which a signal for τ→lll and/or μ→eγ is possible at ongoing and planned experiments. The conditions for respecting the stringent upper limit for μ→eee are studied in detail, with emphasis given to the possibility of |h_ee|≅0, which can only be realized if Majorana phases are present.
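
    For orientation, the schematic relations below summarize how triplet Yukawa couplings are tied to the neutrino mass matrix and how the tree-level three-lepton rates scale in this class of models; they are standard type-II-seesaw expressions quoted for context only (normalization conventions vary between papers), not formulas reproduced from this article.

      % Schematic type-II seesaw relations; conventions vary between papers.
      \begin{align}
        (m_\nu)_{ij} &= \sqrt{2}\, h_{ij}\, v_\Delta , \\
        \mathrm{BR}(\tau \to \bar{\ell}_i \ell_j \ell_k)
          &\;\propto\; \frac{\bigl| h^{*}_{\tau i}\, h_{jk} \bigr|^{2}}
                            {G_F^{2}\, m_{\Delta^{\pm\pm}}^{4}} .
      \end{align}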

  3. LLL K Division nuclear test effects and geologic data base: glossary and parameter definitions (U)

    International Nuclear Information System (INIS)

    Howard, N.W.

    1979-01-01

    The report lists, defines, and updates Parameters in DBASE, an LLL test effects data bank in which data is stored from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for parameters for which some of the data may be classified

  4. The time for atomic and molecular data bases is now. An overview of data management research at LLL

    International Nuclear Information System (INIS)

    Hampel, V.E.; Henry, E.A.

    1977-01-01

    We have created two numerical data bases of atomic and molecular (A+M) data required for laser induced fusion studies. One file contains primarily atomic energy levels and atomic transition data released by Charlotte E. Moore in NBS publications. The second file is based on the spectroscopic constants for more than 1000 molecular levels of approximately 160 heteronuclear diatomic molecules prepared by S. N. Suchard. Additional data bases are contemplated in support of the accelerating research activities in these fields. The present paucity of authenticated, computer-readable A+M data is not unlike that observed two decades ago in nuclear fission research. At that time, emphasis was also given to the accurate measurement of physical parameters and to reaction rates which eventually led to the ENDF/B series of evaluated neutron cross sections. Today, powerful computers have a more dominant role in modeling and predicting the results of promising experiments. Their effective use, however, depends more than ever before upon the availability of comprehensive and accurate files of A+M data. At the Lawrence Livermore Laboratory (LLL), these requirements are accentuated by the heavy reliance on computers. Also, trends are presently becoming apparent among users of the national computer network for Magnetic Fusion Energy (MFE), with its center at LLL, to coalesce organization-dependent data files into central data bases containing bibliographic information and numerical data as a common resource. The Data Management Research Project (LLL/DMRP) is collaborating with the National Bureau of Standards (NBS/NSRDS) to be able to respond to the emerging requirements. This should contribute to a ''Public Well'' of atomic and molecular data, unencumbered by legal or monetary constraints. (author)

  5. Ionization, charge exchange, and secondary electron emission in the extractor of an LBL/LLL neutral beam source

    International Nuclear Information System (INIS)

    Fink, J.H.; McDowell, C.E.

    1975-01-01

    Using a computer code, bombardment of the electrodes resulting from ionization, charge-exchange, and back-ion emission from the neutralizer cell is studied in the positive-ion extractor region of a Lawrence Berkeley Laboratory/Lawrence Livermore Laboratory (LBL/LLL) neutral beam source. Ion and electron trajectories are presented, grid dissipations estimated, and proposals made for future designs

  6. Engineering a compiler

    CERN Document Server

    Cooper, Keith D

    2012-01-01

    As computing has changed, so has the role of both the compiler and the compiler writer. The proliferation of processors, environments, and constraints demands an equally large number of compilers. To adapt, compiler writers retarget code generators, add optimizations, and work on issues such as code space or power consumption. Engineering a Compiler re-balances the curriculum for an introductory course in compiler construction to reflect the issues that arise in today's practice. Authors Keith Cooper and Linda Torczon convey both the art and the science of compiler construction and show best practice algorithms for the major problems inside a compiler. · Focuses on the back end of the compiler, reflecting the focus of research and development over the last decade. · Applies the well-developed theory behind scanning and parsing to introduce concepts that play a critical role in optimization and code generation. · Introduces the student to optimization through data-flow analysis, SSA form, and a selection of sc...

  7. Compiler Feedback using Continuous Dynamic Compilation during Development

    DEFF Research Database (Denmark)

    Jensen, Nicklas Bo; Karlsson, Sven; Probst, Christian W.

    2014-01-01

    to optimization. This tool can help programmers understand what the optimizing compiler has done and suggest automatic source code changes in cases where the compiler refrains from optimizing. We have integrated our tool into an integrated development environment, interactively giving feedback as part...

  8. Positive ion portion of the LBL/LLL Neutral Beam Program

    International Nuclear Information System (INIS)

    Pyle, R.V.; Baker, W.R.; Anderson, O.A.

    1978-06-01

    The positive ion portion of the Neutral Beam Development Program at the Lawrence Berkeley (LBL) and Livermore (LLL) Laboratories has two purposes: (a) to carry out general research and development in a timely way to assure that users' needs can be met in principle, and (b) to carry out specific development for users. To meet the first requirement, we have programs to develop sources capable of producing beams with high (85%) atomic fractions, long pulse lengths (10 sec to DC), and at beam energies up to 150 keV. We are also pursuing the development of on-line computer diagnostics and controls, the sophisticated high-power electronics required by neutral beam systems, and energy recovery. To meet the second requirement, we are developing prototype source modules to meet the requirements of the TMX and MFTF experiments at Lawrence Livermore Laboratory, the TFTR experiment at the Princeton Plasma Physics Laboratory, and the Doublet III experiment at General Atomic Co. The Lawrence Laboratories are also constructing and will demonstrate at LBL a complete prototype neutral injection system for TFTR, and are designing a similar system for Doublet III

  9. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    … MANAGEMENT INFORMATION, COMMUNICATIONS, AND COMPUTER SCIENCES (AIRMICS). SOFTWARE TOOLS FOR SOFTWARE MAINTENANCE (ASQBG-1-89-001), October 1988. … C Program Analyzer … C-Tractor … Cobol Structuring Facility, VS Cobol II, F-Scan … Cobol, Fortran … Static Code Analyzer, Fortran …

  10. Advanced compiler design and implementation

    CERN Document Server

    Muchnick, Steven S

    1997-01-01

    From the Foreword by Susan L. Graham: This book takes on the challenges of contemporary languages and architectures, and prepares the reader for the new compiling problems that will inevitably arise in the future. The definitive book on advanced compiler design This comprehensive, up-to-date work examines advanced issues in the design and implementation of compilers for modern processors. Written for professionals and graduate students, the book guides readers in designing and implementing efficient structures for highly optimizing compilers for real-world languages. Covering advanced issues in fundamental areas of compiler design, this book discusses a wide array of possible code optimizations, determining the relative importance of optimizations, and selecting the most effective methods of implementation. * Lays the foundation for understanding the major issues of advanced compiler design * Treats optimization in-depth * Uses four case studies of commercial compiling suites to illustrate different approache...

  11. Time for atomic and molecular data bases is now (an overview of data management research at LLL)

    International Nuclear Information System (INIS)

    Hampel, V.E.; Henry, E.A.

    1977-01-01

    Two numerical data bases of atomic and molecular (A and M) data required for laser-induced fusion studies were created. One file contains primarily atomic energy levels and atomic transition data released by Charlotte E. Moore in NBS publications. The second file is based on the spectroscopic constants for more than 1000 molecular levels of approximately 160 heteronuclear diatomic molecules prepared by S. N. Suchard. Additional data bases are contemplated in support of the accelerating research activities in these fields. The present paucity of authenticated, computer-readable A and M data is not unlike that observed two decades ago in nuclear fission research. At that time, emphasis was also given to the accurate measurement of physical parameters and to reaction rates which eventually led to the ENDF/B series of evaluated neutron cross sections. Today, powerful computers have a more dominant role in modeling and predicting the results of promising experiments. Their effective use, however, depends more than ever before upon the availability of comprehensive and accurate files of A and M data. At the Lawrence Livermore Laboratory (LLL), these requirements are accentuated by the heavy reliance on computers. Also, trends are presently becoming apparent among users of the national computer network for Magnetic Fusion Energy, with its center at LLL, to coalesce organization-dependent data files into central data bases containing bibliographic information and numerical data as a common resource. The Data Management Research Project is collaborating with the National Bureau of Standards (NBS/NSRDS) to be able to respond to the emerging requirements. This should contribute to a ''Public Well'' of atomic and molecular data, unencumbered by legal or monetary constraints. 14 figures

  12. HAL/S-FC compiler system specifications

    Science.gov (United States)

    1976-01-01

    This document specifies the informational interfaces within the HAL/S-FC compiler, and between the compiler and the external environment. This Compiler System Specification is for the HAL/S-FC compiler and its associated run time facilities which implement the full HAL/S language. The HAL/S-FC compiler is designed to operate stand-alone on any compatible IBM 360/370 computer and within the Software Development Laboratory (SDL) at NASA/JSC, Houston, Texas.

  13. A Class-Specific Optimizing Compiler

    Directory of Open Access Journals (Sweden)

    Michael D. Sharp

    1993-01-01

    Class-specific optimizations are compiler optimizations specified by the class implementor to the compiler. They allow the compiler to take advantage of the semantics of the particular class so as to produce better code. Optimizations of interest include the strength reduction of class::array address calculations, elimination of large temporaries, and the placement of asynchronous send/recv calls so as to achieve computation/communication overlap. We will outline our progress towards the implementation of a C++ compiler capable of incorporating class-specific optimizations.
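
    To make the first of those optimizations concrete, the sketch below shows the generic idea of strength-reducing an array address calculation: the per-iteration multiply is hoisted out of the loop and replaced by an additive induction-variable update. It is written in Python purely for illustration (the paper's setting is a C++ compiler operating on class::array abstractions), and all names are ours.

      def row_sum_naive(flat, n_cols, row):
          """Address of element (row, j) recomputed with a multiply on every iteration."""
          total = 0
          for j in range(n_cols):
              total += flat[row * n_cols + j]   # multiply + add per element
          return total

      def row_sum_reduced(flat, n_cols, row):
          """Same loop after strength reduction: the multiply is computed once and the
          address then advances by a constant stride (add only) inside the loop."""
          total = 0
          addr = row * n_cols                   # hoisted out of the loop
          for j in range(n_cols):
              total += flat[addr]
              addr += 1                         # induction-variable update
          return total

      flat = list(range(12))                    # a 3 x 4 "matrix" stored row-major
      assert row_sum_naive(flat, 4, 2) == row_sum_reduced(flat, 4, 2) == sum(range(8, 12))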

  14. A Note on Compiling Fortran

    Energy Technology Data Exchange (ETDEWEB)

    Busby, L. E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-09-01

    Fortran modules tend to serialize compilation of large Fortran projects, by introducing dependencies among the source files. If file A depends on file B, (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: The first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
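
    A minimal sketch of that two-pass scheme, orchestrated from Python, is shown below. It assumes gfortran and its -fsyntax-only flag as the "verify syntax" option mentioned above; the file names are placeholders, and a real build would still have to run the first pass in module-dependency order.

      import subprocess
      from concurrent.futures import ThreadPoolExecutor

      sources = ["modules_a.f90", "modules_b.f90", "main.f90"]  # hypothetical project

      # Pass 1 (fast, serial): emit the .mod files so every file's dependencies exist.
      for src in sources:
          subprocess.run(["gfortran", "-fsyntax-only", src], check=True)

      # Pass 2 (slow, parallel): generate object code now that no file waits on another.
      def compile_object(src):
          subprocess.run(["gfortran", "-c", "-O2", src], check=True)

      with ThreadPoolExecutor() as pool:
          list(pool.map(compile_object, sources))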

  15. Integrated system for production of neutronics and photonics calculational constants. Volume XVI. Tabular and graphical presentation of 175 neutron group constants derived from the LLL evaluated neutron data library (ENDL)

    International Nuclear Information System (INIS)

    Plechaty, E.F.; Cullen, D.E.; Howerton, R.J.; Kimlinger, J.R.

    1975-01-01

    As of February 3, 1975, 175 neutron group constants had been derived from the Evaluated Nuclear Data Library (ENDL) at LLL. In this volume, tables and graphs of the constants are presented along with the conventions used in their preparation. (U.S.)

  16. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; sending to the selected node only the compiled software to be executed by the selected node or the selected node's descendants.
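
    As a toy model of the selection rule described above (a compiling node keeps artifacts addressed to itself and forwards each remaining artifact only to the child whose subtree contains its target node), here is a short Python sketch; the class and method names are invented for illustration and are not the system's actual API.

      class Node:
          def __init__(self, name, children=()):
              self.name = name
              self.children = list(children)
              self.received = []

          def subtree_names(self):
              names = {self.name}
              for c in self.children:
                  names |= c.subtree_names()
              return names

          def distribute(self, artifacts):
              """artifacts: dict mapping target node name -> compiled artifact."""
              for target, blob in artifacts.items():
                  if target == self.name:
                      self.received.append(blob)        # keep what is addressed to us
              for child in self.children:
                  reachable = child.subtree_names()
                  forwarded = {t: b for t, b in artifacts.items()
                               if t != self.name and t in reachable}
                  if forwarded:                          # forward only relevant pieces
                      child.distribute(forwarded)

      leaf1, leaf2 = Node("leaf1"), Node("leaf2")
      root = Node("root", [Node("mid", [leaf1]), leaf2])
      root.distribute({"root": "obj_root", "leaf1": "obj_leaf1", "leaf2": "obj_leaf2"})
      print(leaf1.received, leaf2.received)   # ['obj_leaf1'] ['obj_leaf2']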

  17. Evaluation of HAL/S language compilability using SAMSO's Compiler Writing System (CWS)

    Science.gov (United States)

    Feliciano, M.; Anderson, H. D.; Bond, J. W., III

    1976-01-01

    NASA/Langley is engaged in a program to develop an adaptable guidance and control software concept for spacecraft such as shuttle-launched payloads. It is envisioned that this flight software be written in a higher-order language, such as HAL/S, to facilitate changes or additions. To make this adaptable software transferable to various onboard computers, a compiler writing system capability is necessary. A joint program with the Air Force Space and Missile Systems Organization was initiated to determine if the Compiler Writing System (CWS) owned by the Air Force could be utilized for this purpose. The present study explores the feasibility of including the HAL/S language constructs in CWS and the effort required to implement these constructs. This will determine the compilability of HAL/S using CWS and permit NASA/Langley to identify the HAL/S constructs desired for their applications. The study consisted of comparing the implementation of the Space Programming Language using CWS with the requirements for the implementation of HAL/S. It is the conclusion of the study that CWS already contains many of the language features of HAL/S and that it can be expanded for compiling part or all of HAL/S. It is assumed that persons reading and evaluating this report have a basic familiarity with (1) the principles of compiler construction and operation, and (2) the logical structure and applications characteristics of HAL/S and SPL.

  18. Towards a New LLL Paradigm? EU Policy on Key Competences and Reskilling: Facets and Trends

    Directory of Open Access Journals (Sweden)

    N. Papadakis

    2009-06-01

    Policy initiatives such as the European Year of Creativity and Innovation (2009) and the EU Framework on “Key Competences” (2006 and onwards) aim at contributing to the ongoing reconceptualisation of skills (gradually correlated with Reskilling, Employability, Sustainability and Competitiveness) and operate within the context of a changing balance between technocracy, pedagogy and politics. For instance, according to the EU cluster on Key Competences, “major themes are applied throughout the Framework: creativity, critical thinking, initiative taking, play a major role in all eight key competences”. This explicit changing role of creativity gains in political visibility and requires a contextually embedded and multidisciplinary approach. From such a perspective, the present paper analyzes the political context and the impact of interest politics on the transformations of LLL and reskilling within the EU policy agenda, and raises methodological and epistemological issues on the interface between educational and policy analysis.

  19. Compiler issues associated with safety-related software

    International Nuclear Information System (INIS)

    Feinauer, L.R.

    1991-01-01

    A critical issue in the quality assurance of safety-related software is the ability of the software to produce identical results, independent of the host machine, operating system, or compiler version under which the software is installed. A study is performed using the VIPRE-01, FREY-01, and RETRAN-02 safety-related codes. Results from an IBM 3083 computer are compared with results from a CYBER 860 computer. All three of the computer programs examined are written in FORTRAN; the VIPRE code uses the FORTRAN 66 compiler, whereas the FREY and RETRAN codes use the FORTRAN 77 compiler. Various compiler options are studied to determine their effect on the output between machines. Since the Control Data Corporation and IBM machines inherently represent numerical data differently, methods of producing equivalent accuracy of data representation were an important focus of the study. This paper identifies particular problems in the automatic double-precision option (AUTODBL) of the IBM FORTRAN 1.4.x series of compilers. The IBM FORTRAN version 2 compilers provide much more stable, reliable compilation for engineering software. Careful selection of compilers and compiler options can help guarantee identical results between different machines. To ensure reproducibility of results, the same compiler and compiler options should be used to install the program as were used in the development and testing of the program.
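
    One concrete reason such discrepancies arise, independent of the specific codes above, is that floating-point arithmetic is not associative, so any compiler option that changes evaluation order or operand precision changes the low-order bits of the result. The short Python example below illustrates the effect; it is a generic demonstration, not taken from the cited study.

      # Reassociating a floating-point sum changes the result, which is one reason
      # two compilers (or optimization levels) can give non-identical output for
      # the same source program.
      xs = [1e16, 1.0, -1e16, 1.0]

      left_to_right = ((xs[0] + xs[1]) + xs[2]) + xs[3]   # 1e16 absorbs the first 1.0
      regrouped     = (xs[0] + xs[2]) + (xs[1] + xs[3])   # cancels first, keeps both 1.0s

      print(left_to_right)   # 1.0
      print(regrouped)       # 2.0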

  20. C to VHDL compiler

    Science.gov (United States)

    Berdychowski, Piotr P.; Zabolotny, Wojciech M.

    2010-09-01

    The main goal of C to VHDL compiler project is to make FPGA platform more accessible for scientists and software developers. FPGA platform offers unique ability to configure the hardware to implement virtually any dedicated architecture, and modern devices provide sufficient number of hardware resources to implement parallel execution platforms with complex processing units. All this makes the FPGA platform very attractive for those looking for efficient heterogeneous, computing environment. Current industry standard in development of digital systems on FPGA platform is based on HDLs. Although very effective and expressive in hands of hardware development specialists, these languages require specific knowledge and experience, unreachable for most scientists and software programmers. C to VHDL compiler project attempts to remedy that by creating an application, that derives initial VHDL description of a digital system (for further compilation and synthesis), from purely algorithmic description in C programming language. This idea itself is not new, and the C to VHDL compiler combines the best approaches from existing solutions developed over many previous years, with the introduction of some new unique improvements.

  1. HAL/S-FC compiler system functional specification

    Science.gov (United States)

    1974-01-01

    Compiler organization is discussed, including overall compiler structure, internal data transfer, compiler development, and code optimization. The user, system, and SDL interfaces are described, along with compiler system requirements. Run-time software support package and restrictions and dependencies are also considered of the HAL/S-FC system.

  2. SPARQL compiler for Bobox

    OpenAIRE

    Čermák, Miroslav

    2013-01-01

    The goal of this work is to design and implement a SPARQL compiler for the Bobox system. In addition to lexical and syntactic analysis corresponding to the W3C standard for the SPARQL language, it performs semantic analysis and optimization of queries. The compiler constructs an appropriate model for execution in Bobox, which depends on the physical database schema.

  3. Algorithmic synthesis using Python compiler

    Science.gov (United States)

    Cieszewski, Radoslaw; Romaniuk, Ryszard; Pozniak, Krzysztof; Linczuk, Maciej

    2015-09-01

    This paper presents a Python to VHDL compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and translates it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the programmed circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. This can be achieved by using many computational resources at the same time. Creating parallel programs implemented in FPGAs in pure HDL is difficult and time consuming. By using a higher level of abstraction and a high-level synthesis compiler, implementation time can be reduced. The compiler has been implemented using the Python language. This article describes the design, implementation and results of the created tools.
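
    A minimal sketch may help make the idea concrete: the kind of algorithmic description such a compiler consumes is ordinary Python with loops, array accesses and integer arithmetic. The function below is illustrative only and is not taken from the paper; a synthesis tool of this kind would be expected to map the loop body onto registers, an adder and a shifter in the generated VHDL.

        # Illustrative only: a hardware-friendly algorithm written as ordinary Python.
        def moving_average(samples, window=4):
            """Fixed-window moving average over a stream of integer samples."""
            acc = 0
            out = []
            for i, s in enumerate(samples):
                acc += s
                if i >= window:
                    acc -= samples[i - window]   # drop the oldest sample
                out.append(acc // window)        # power-of-two divide maps to a shift
            return out

        if __name__ == "__main__":
            print(moving_average([1, 2, 3, 4, 5, 6, 7, 8]))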

  4. Compiling a 50-year journey

    DEFF Research Database (Denmark)

    Hutton, Graham; Bahr, Patrick

    2017-01-01

    Fifty years ago, John McCarthy and James Painter published the first paper on compiler verification, in which they showed how to formally prove the correctness of a compiler that translates arithmetic expressions into code for a register-based machine. In this article, we revisit this example...

  5. An exploratory discussion on business files compilation

    International Nuclear Information System (INIS)

    Gao Chunying

    2014-01-01

    Business files compilation for an enterprise is a distillation and recreation of its spiritual wealth, from which applicable information can be made available to those who want to use it in a fast, extensive and precise way. Proceeding from the effects of business files compilation on scientific research, production and development, this paper discusses in five points how to define topics, analyse historical materials, search or select data and process it into an enterprise archives collection. Firstly, it expounds the importance and necessity of business files compilation in the production, operation and development of a company; secondly, it presents processing methods from topic definition, material searching and data selection to final examination and correction; thirdly, it defines principles and classifications so that different categories and levels of processing methods are available for business files compilation; fourthly, it discusses the specific method of implementing a file compilation through a documentation collection, on the principle that topic definition is geared to demand; fifthly, it addresses the application of information technology to business files compilation, in view of the wide need for business files, so as to raise the level of enterprise archives management. The present discussion focuses on the examination and correction principle of enterprise historical material compilation and the basic classifications as well as the major forms of business files compilation achievements. (author)

  6. Proving correctness of compilers using structured graphs

    DEFF Research Database (Denmark)

    Bahr, Patrick

    2014-01-01

    it into a compiler implementation using a graph type along with a correctness proof. The implementation and correctness proof of a compiler using a tree type without explicit jumps is simple, but yields code duplication. Our method provides a convenient way of improving such a compiler without giving up the benefits...

  7. 12 CFR 411.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Semi-annual compilation. 411.600 Section 411.600 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES NEW RESTRICTIONS ON LOBBYING Agency Reports § 411.600 Semi-annual compilation. (a) The head of each agency shall collect and compile the...

  8. ALGOL compiler. Syntax and semantic analysis

    International Nuclear Information System (INIS)

    Tarbouriech, Robert

    1971-01-01

    In this research thesis, the author reports the development of an ALGOL compiler which performs the following main tasks: systematic scan of the source programme to recognise the different components (identifiers, reserved words, constants, separators), analysis of the source programme structure to build up its statements and arithmetic expressions, processing of symbolic names (identifiers) to associate them with the values they represent, and memory allocation for data and programme. Several issues are thus addressed: characteristics of the machine for which the compiler is developed, exact definition of the language (grammar, identifier and constant formation), a syntax processing programme to provide the compiler with the necessary elements (language vocabulary, precedence matrix), and a description of the first two phases of compilation: lexicographic analysis and syntax analysis. The last phase (machine-code generation) is not addressed.
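
    As an illustration of the first phase described above (illustration only; the thesis targets ALGOL on a specific machine, and the token categories and names below are hypothetical), a lexicographic analyser splits the source text into identifiers, reserved words, constants and separators, which the syntax analyser then consumes.

        # Toy lexicographic analysis: classify identifiers, reserved words,
        # integer constants and separators before syntax analysis.
        import re

        RESERVED = {'begin', 'end', 'if', 'then', 'else', 'for', 'do'}
        TOKEN = re.compile(r"\s*(?:(\d+)|([A-Za-z][A-Za-z0-9]*)|(\S))")

        def tokenize(src):
            tokens = []
            for number, word, sep in TOKEN.findall(src):
                if number:
                    tokens.append(('const', int(number)))
                elif word:
                    tokens.append(('reserved' if word in RESERVED else 'ident', word))
                else:
                    tokens.append(('sep', sep))
            return tokens

        print(tokenize("if x1 then y := y + 10"))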

  9. Supporting documents for LLL area 27 (410 area) safety analysis reports, Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B. N. [comp.]

    1977-02-01

    The following appendices are common to the LLL Safety Analysis Reports Nevada Test Site and are included here as supporting documents to those reports: Environmental Monitoring Report for the Nevada Test Site and Other Test Areas Used for Underground Nuclear Detonations, U. S. Environmental Protection Agency, Las Vegas, Rept. EMSL-LV-539-4 (1976); Selected Census Information Around the Nevada Test Site, U. S. Environmental Protection Agency, Las Vegas, Rept. NERC-LV-539-8 (1973); W. J. Hannon and H. L. McKague, An Examination of the Geology and Seismology Associated with Area 410 at the Nevada Test Site, Lawrence Livermore Laboratory, Livermore, Rept. UCRL-51830 (1975); K. R. Peterson, Diffusion Climatology for Hypothetical Accidents in Area 410 of the Nevada Test Site, Lawrence Livermore Laboratory, Livermore, Rept. UCRL-52074 (1976); J. R. McDonald, J. E. Minor, and K. C. Mehta, Development of a Design Basis Tornado and Structural Design Criteria for the Nevada Test Site, Nevada, Lawrence Livermore Laboratory, Livermore, Rept. UCRL-13668 (1975); A. E. Stevenson, Impact Tests of Wind-Borne Wooden Missiles, Sandia Laboratories, Tonopah, Rept. SAND 76-0407 (1976); and Hydrology of the 410 Area (Area 27) at the Nevada Test Site.

  10. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-10-01

    This is the third issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section approximately every six months. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations

  11. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1977-03-01

    This is the second issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation of compilations and evaluations is designed to keep the nuclear scientific community informed of the availability of compiled or evaluated NSD data, and contains references to laboratory reports, journal articles and books containing selected compilations and evaluations. It excludes references to ''mass-chain'' evaluations normally published in the ''Nuclear Data Sheets'' and ''Nuclear Physics''. The material contained in this compilation is sorted according to eight subject categories: general compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes; half-lives, energies and spectra; nuclear decay processes: gamma-rays; nuclear decay processes: fission products; nuclear decay processes: (others); atomic processes

  12. CAPS OpenACC Compilers: Performance and Portability

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    The announcement in late 2011 of the new OpenACC directive-based programming standard, supported by the CAPS, CRAY and PGI compilers, has opened the door to more scientific applications that can be ported to many-core systems. Following a porting methodology, this talk will first review the principles of programming with OpenACC and then the advanced features available in the CAPS compilers to further optimize OpenACC applications: library integration, tuning directives with auto-tune mechanisms to build applications adaptive to different GPUs. CAPS compilers use hardware vendors' backends such as NVIDIA CUDA and OpenCL, making them the only OpenACC compilers supporting various many-core architectures. About the speaker Stéphane Bihan is co-founder and currently Director of Sales and Marketing at CAPS enterprise. He has held several R&D positions in companies such as ARC international plc in London, Canon Research Center France, ACE compiler experts in Amsterdam and the INRIA r...

  13. HAL/S-360 compiler test activity report

    Science.gov (United States)

    Helmers, C. T.

    1974-01-01

    The levels of testing employed in verifying the HAL/S-360 compiler were as follows: (1) typical applications program case testing; (2) functional testing of the compiler system and its generated code; and (3) machine oriented testing of compiler implementation on operational computers. Details of the initial test plan and subsequent adaptation are reported, along with complete test results for each phase which examined the production of object codes for every possible source statement.

  14. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be reconfigured over the lifetime of the system. FPGAs also have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python-based compiler.

  15. Automatic Parallelization An Overview of Fundamental Compiler Techniques

    CERN Document Server

    Midkiff, Samuel P

    2012-01-01

    Compiling for parallelism is a longstanding topic of compiler research. This book describes the fundamental principles of compiling "regular" numerical programs for parallelism. We begin with an explanation of analyses that allow a compiler to understand the interaction of data reads and writes in different statements and loop iterations during program execution. These analyses include dependence analysis, use-def analysis and pointer analysis. Next, we describe how the results of these analyses are used to enable transformations that make loops more amenable to parallelization, and
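
    A small, hedged illustration (not taken from the book) of what dependence analysis decides: the first loop below has no cross-iteration reads of values written by other iterations and may be run in parallel, while the second carries a flow dependence from iteration i-1 to iteration i and must be kept sequential as written.

        # Illustrative dependence-analysis distinction, written in plain Python.
        def scale_in_place(a):
            # Each iteration touches only a[i]: no loop-carried dependence,
            # so a parallelizing compiler may distribute the iterations.
            for i in range(len(a)):
                a[i] = 2 * a[i]

        def prefix_sum_in_place(a):
            # Iteration i reads a[i - 1], which iteration i - 1 wrote:
            # a loop-carried flow dependence forces sequential execution.
            for i in range(1, len(a)):
                a[i] = a[i - 1] + a[i]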

  16. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.

  17. A compiler for variational forms

    OpenAIRE

    Kirby, Robert C.; Logg, Anders

    2011-01-01

    As a key step towards a complete automation of the finite element method, we present a new algorithm for automatic and efficient evaluation of multilinear variational forms. The algorithm has been implemented in the form of a compiler, the FEniCS Form Compiler FFC. We present benchmark results for a series of standard variational forms, including the incompressible Navier-Stokes equations and linear elasticity. The speedup compared to the standard quadrature-based approach is impressive; in s...

  18. Tennessee Offender Management Information System

    OpenAIRE

    Beck, Tim

    1993-01-01

    This article describes the integration of a knowledge-based system with a large COBOL-DB2-based offender management system. The knowledge-based application, developed for the purpose of offender sentence calculation, is shown to provide several benefits, including a shortened development cycle, simplified maintenance, and improved accuracy over a previous COBOL-based application.

  19. Compiler Construction Using Java, JavaCC, and Yacc

    CERN Document Server

    Dos Reis, Anthony J

    2012-01-01

    Broad in scope, involving theory, the application of that theory, and programming technology, compiler construction is a moving target, with constant advances in compiler technology taking place. Today, a renewed focus on do-it-yourself programming makes a quality textbook on compilers, that both students and instructors will enjoy using, of even more vital importance. This book covers every topic essential to learning compilers from the ground up and is accompanied by a powerful and flexible software package for evaluating projects, as well as several tutorials, well-defined projects, and tes

  20. PIG 3 - A simple compiler for mercury

    Energy Technology Data Exchange (ETDEWEB)

    Bindon, D C [Computer Branch, Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-06-15

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  1. PIG 3 - A simple compiler for mercury

    International Nuclear Information System (INIS)

    Bindon, D.C.

    1961-06-01

    A short machine language compilation scheme is described which will read programmes from paper tape, punched cards, or magnetic tape. The compiler occupies pages 8-15 of the ferrite store during translation. (author)

  2. Compilation of data on elementary particles

    International Nuclear Information System (INIS)

    Trippe, T.G.

    1984-09-01

    The most widely used data compilation in the field of elementary particle physics is the Review of Particle Properties. The origin, development and current state of this compilation are described with emphasis on the features which have contributed to its success: active involvement of particle physicists; critical evaluation and review of the data; completeness of coverage; regular distribution of reliable summaries including a pocket edition; heavy involvement of expert consultants; and international collaboration. The current state of the Review and new developments such as providing interactive access to the Review's database are described. Problems and solutions related to maintaining a strong and supportive relationship between compilation groups and the researchers who produce and use the data are discussed

  3. Compilation of functional languages using flow graph analysis

    NARCIS (Netherlands)

    Hartel, Pieter H.; Glaser, Hugh; Wild, John M.

    A system based on the notion of a flow graph is used to specify formally and to implement a compiler for a lazy functional language. The compiler takes a simple functional language as input and generates C. The generated C program can then be compiled, and loaded with an extensive run-time system to

  4. Research and Practice of the News Map Compilation Service

    Science.gov (United States)

    Zhao, T.; Liu, W.; Ma, W.

    2018-04-01

    Based on the needs of the news media for maps, this paper studies the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics, constructs a hot news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  5. RESEARCH AND PRACTICE OF THE NEWS MAP COMPILATION SERVICE

    Directory of Open Access Journals (Sweden)

    T. Zhao

    2018-04-01

    Full Text Available Based on the needs of the news media for maps, this paper studies the news map compilation service: it conducts demand research on the service of compiling news maps, designs and compiles a public authority base map suitable for media publication, and constructs a news base map material library. It studies the compilation of domestic and international news maps with timeliness, strong pertinence and cross-regional characteristics, constructs a hot news thematic gallery and news map customization services, conducts research on types of news maps, establishes closer liaison and cooperation methods with the news media, and guides the news media to use correct maps. Through the practice of the news map compilation service, this paper presents two cases of news map preparation services used by different media, compares and analyses the cases, summarizes the state of research on the news map compilation service, and puts forward outstanding problems and development suggestions for the service.

  6. abc the aspectBench compiler for aspectJ a workbench for aspect-oriented programming language and compilers research

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    Aspect-oriented programming (AOP) is gaining popularity as a new way of modularising cross-cutting concerns. The aspectbench compiler (abc) is a new workbench for AOP research which provides an extensible research framework for both new language features and new compiler optimisations. This poste...

  7. A compilation of energy costs of physical activities.

    Science.gov (United States)

    Vaz, Mario; Karaolis, Nadine; Draper, Alizon; Shetty, Prakash

    2005-10-01

    There were two objectives: first, to review the existing data on energy costs of specified activities in the light of the recommendations made by the Joint Food and Agriculture Organization/World Health Organization/United Nations University (FAO/WHO/UNU) Expert Consultation of 1985. Second, to compile existing data on the energy costs of physical activities for an updated annexure of the current Expert Consultation on Energy and Protein Requirements. Electronic and manual search of the literature (predominantly English) to obtain published data on the energy costs of physical activities. The majority of the data prior to 1955 were obtained using an earlier compilation of Passmore and Durnin. Energy costs were expressed as physical activity ratio (PAR); the energy cost of the activity divided by either the measured or predicted basal metabolic rate (BMR). The compilation provides PARs for an expanded range of activities that include general personal activities, transport, domestic chores, occupational activities, sports and other recreational activities for men and women, separately, where available. The present compilation is largely in agreement with the 1985 compilation, for activities that are common to both compilations. The present compilation has been based on the need to provide data on adults for a wide spectrum of human activity. There are, however, lacunae in the available data for many activities, between genders, across age groups and in various physiological states.
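
    As a hedged numerical illustration of the ratio defined above (the figures are invented, not taken from the compilation): an activity costing 21 kJ/min in a subject whose basal metabolic rate is 5 kJ/min has a PAR of 21/5 = 4.2.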

  8. Fiscal 2000 report on advanced parallelized compiler technology. Outlines; 2000 nendo advanced heiretsuka compiler gijutsu hokokusho (Gaiyo hen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-03-01

    Research and development was carried out on automatic parallelizing compiler technology, which improves the practical performance, cost/performance ratio, and ease of use of the multiprocessor systems now used for constructing supercomputers and expected to provide a fundamental architecture for microprocessors in the 21st century. Efforts were made to develop an automatic multigrain parallelization technology for extracting multigrain parallelism from a program and making full use of it, and a parallelizing tuning technology for accelerating parallelization by feeding back to the compiler dynamic information and user knowledge acquired during execution. Moreover, a benchmark program was selected and studies were made to set execution rules and evaluation indexes for establishing technologies to objectively evaluate the performance of parallelizing compilers for existing commercial parallel processing computers; this was achieved through the implementation and evaluation of the 'Advanced parallelizing compiler technology research and development project.' (NEDO)

  9. Regulatory and technical reports compilation for 1980

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzi, L.

    1981-04-01

    This compilation lists formal regulatory and technical reports and conference proceedings issued in 1980 by the US Nuclear Regulatory Commission. The compilation is divided into four major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The second major section of this compilation consists of a key-word index to report titles. The third major section contains an alphabetically arranged listing of contractor report numbers cross-referenced to their corresponding NRC report numbers. Finally, the fourth section is an errata supplement

  10. Compiling quantum circuits to realistic hardware architectures using temporal planners

    Science.gov (United States)

    Venturelli, Davide; Do, Minh; Rieffel, Eleanor; Frank, Jeremy

    2018-04-01

    To run quantum algorithms on emerging gate-model quantum hardware, quantum circuits must be compiled to take into account constraints on the hardware. For near-term hardware, with only limited means to mitigate decoherence, it is critical to minimize the duration of the circuit. We investigate the application of temporal planners to the problem of compiling quantum circuits to newly emerging quantum hardware. While our approach is general, we focus on compiling to superconducting hardware architectures with nearest neighbor constraints. Our initial experiments focus on compiling Quantum Alternating Operator Ansatz (QAOA) circuits whose high number of commuting gates allows great flexibility in the order in which the gates can be applied. That freedom makes it more challenging to find optimal compilations but also means there is a greater potential win from more optimized compilation than for less flexible circuits. We map this quantum circuit compilation problem to a temporal planning problem, and generate a test suite of compilation problems for QAOA circuits of various sizes targeting a realistic hardware architecture. We report compilation results from several state-of-the-art temporal planners on this test set. This early empirical evaluation demonstrates that temporal planning is a viable approach to quantum circuit compilation.
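
    A minimal sketch, under stated assumptions, of the two hardware constraints that make this a planning problem: a two-qubit gate is only executable when its logical qubits are currently mapped to adjacent physical qubits, and two commuting gates can share a time slot only when they act on disjoint qubits. The coupling graph and function names below are hypothetical, not the paper's.

        # Toy model of nearest-neighbour and concurrency constraints in circuit compilation.
        COUPLING = {(0, 1), (1, 2), (2, 3)}  # hypothetical linear 4-qubit device

        def executable(gate, placement):
            """gate = (qa, qb) on logical qubits; placement maps logical -> physical."""
            pa, pb = placement[gate[0]], placement[gate[1]]
            return (pa, pb) in COUPLING or (pb, pa) in COUPLING

        def can_share_slot(g1, g2):
            """Commuting gates may run concurrently only on disjoint qubit sets."""
            return set(g1).isdisjoint(set(g2))

        placement = {0: 0, 1: 1, 2: 2, 3: 3}
        print(executable((0, 1), placement))   # True: adjacent on the device
        print(executable((0, 3), placement))   # False: a SWAP must be planned first
        print(can_share_slot((0, 1), (2, 3)))  # True: disjoint qubits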

  11. Regulatory and technical reports: compilation for 1975-1978

    International Nuclear Information System (INIS)

    1982-04-01

    This brief compilation lists formal reports issued by the US Nuclear Regulatory Commission in 1975 through 1978 that were not listed in the Regulatory and Technical Reports Compilation for 1975 to 1978, NUREG-0304, Vol. 3. This compilation is divided into two sections. The first consists of a sequential listing of all reports in report-number order. The second section consists of an index developed from keywords in report titles and abstracts

  12. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-10-01

    This is the fourth issue of a report series on published and to-be-published compilations and evaluations of nuclear structure and decay (NSD) data. This compilation is published and distributed by the IAEA Nuclear Data Section every year. The material contained in this compilation is sorted according to eight subject categories: General compilations; basic isotopic properties; nuclear structure properties; nuclear decay processes, half-lives, energies and spectra; nuclear decay processes, gamma-rays; nuclear decay processes, fission products; nuclear decay processes (others); atomic processes

  13. Advanced C and C++ compiling

    CERN Document Server

    Stevanovic, Milan

    2014-01-01

    Learning how to write C/C++ code is only the first step. To be a serious programmer, you need to understand the structure and purpose of the binary files produced by the compiler: object files, static libraries, shared libraries, and, of course, executables.Advanced C and C++ Compiling explains the build process in detail and shows how to integrate code from other developers in the form of deployed libraries as well as how to resolve issues and potential mismatches between your own and external code trees.With the proliferation of open source, understanding these issues is increasingly the res

  14. Integrated system for production of neutronics and photonics calculational constants. Volume 15, Part C. The LLL Evaluated Nuclear Data Library (ENDL): translation of ENDL neutron-induced interaction data into the ENDF/B format

    International Nuclear Information System (INIS)

    Howerton, R.J.

    1976-01-01

    The LLL evaluated nuclear data library (ENDL) has been translated into the evaluated neutron data file/version B (ENDF/B) format. This translation is for the convenience of those who wish to use ENDL data but who are more familiar with ENDF/B formats and procedures. Only that portion of ENDL dealing with neutron-induced interactions (including photon production from neutron-induced reactions) has been translated

  15. Compilation of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    Lundergan, C.D.; Mead, P.L.

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078)

  16. Compilation of Sandia Laboratories technical capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Lundergan, C. D.; Mead, P. L. [eds.]

    1975-11-01

    This report is a compilation of 17 individual documents that together summarize the technical capabilities of Sandia Laboratories. Each document in this compilation contains details about a specific area of capability. Examples of application of the capability to research and development problems are provided. An eighteenth document summarizes the content of the other seventeen. Each of these documents was issued with a separate report number (SAND 74-0073A through SAND 74-0091, except -0078). (RWR)

  17. VFC: The Vienna Fortran Compiler

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    1999-01-01

    Full Text Available High Performance Fortran (HPF) offers an attractive high-level language interface for programming scalable parallel architectures, providing the user with directives for the specification of data distribution and delegating to the compiler the task of generating an explicitly parallel program. Available HPF compilers can handle regular codes quite efficiently, but dramatic performance losses may be encountered for applications which are based on highly irregular, dynamically changing data structures and access patterns. In this paper we introduce the Vienna Fortran Compiler (VFC), a new source-to-source parallelization system for HPF+, an optimized version of HPF, which addresses the requirements of irregular applications. In addition to extended data distribution and work distribution mechanisms, HPF+ provides the user with language features for specifying certain information that decisively influences a program’s performance. This comprises data locality assertions, non-local access specifications and the possibility of reusing runtime-generated communication schedules of irregular loops. Performance measurements of kernels from advanced applications demonstrate that with a high-level data parallel language such as HPF+ a performance close to hand-written message-passing programs can be achieved even for highly irregular codes.
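
    The data-distribution directives mentioned above can be pictured with a small owner-computes sketch (illustrative only; VFC itself is a Fortran system and the code below is not part of it): a BLOCK distribution assigns contiguous chunks of an array to processors, and each processor updates only the elements it owns.

        # Owner-computes view of a BLOCK distribution over p processors (illustration only).
        def block_owner(i, n, p):
            """Processor owning element i of an n-element array under BLOCK distribution."""
            block = -(-n // p)      # ceiling division: elements per processor
            return i // block

        n, p = 10, 4
        print([block_owner(i, n, p) for i in range(n)])  # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3]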

  18. Extension of Alvis compiler front-end

    Energy Technology Data Exchange (ETDEWEB)

    Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl [AGH University of Science and Technology, Department of Applied Computer Science, Al. Mickiewicza 30, 30-059 Krakow (Poland)

    2015-12-31

    Alvis is a formal modelling language that enables verification of distributed concurrent systems. The semantics of an Alvis model is expressed as an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language to define parameters’ types and operations on them. Thanks to the compiler’s modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler internal model and describes how the default specification language can be altered by new plugins.

  19. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  20. Semantics-based compiling: A case study in type-directed partial evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    , block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs......-directed compilation, in the spirit of Scott and Strachey. Our conclusion is that lambda-calculus normalization suffices for compiling by specializing an interpreter....

  1. A Symmetric Approach to Compilation and Decompilation

    DEFF Research Database (Denmark)

    Ager, Mads Sig; Danvy, Olivier; Goldberg, Mayer

    2002-01-01

    Just as an interpreter for a source language can be turned into a compiler from the source language to a target language, we observe that an interpreter for a target language can be turned into a compiler from the target language to a source language. In both cases, the key issue is the choice of...

  2. Compilation of current high energy physics experiments

    International Nuclear Information System (INIS)

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche

  3. Compiler design handbook optimizations and machine code generation

    CERN Document Server

    Srikant, YN

    2003-01-01

    The widespread use of object-oriented languages and Internet security concerns are just the beginning. Add embedded systems, multiple memory banks, highly pipelined units operating in parallel, and a host of other advances and it becomes clear that current and future computer architectures pose immense challenges to compiler designers-challenges that already exceed the capabilities of traditional compilation techniques. The Compiler Design Handbook: Optimizations and Machine Code Generation is designed to help you meet those challenges. Written by top researchers and designers from around the

  4. Writing Compilers and Interpreters A Software Engineering Approach

    CERN Document Server

    Mak, Ronald

    2011-01-01

    Long-awaited revision to a unique guide that covers both compilers and interpreters. Revised, updated, and now focusing on Java instead of C++, this long-awaited, latest edition of this popular book teaches programmers and software engineering students how to write compilers and interpreters using Java. You'll write compilers and interpreters as case studies, generating general assembly code for a Java Virtual Machine that takes advantage of the Java Collections Framework to shorten and simplify the code. In addition, coverage includes Java Collections Framework, UML modeling, object-oriented p

  5. Promising Compilation to ARMv8 POP

    OpenAIRE

    Podkopaev, Anton; Lahav, Ori; Vafeiadis, Viktor

    2017-01-01

    We prove the correctness of compilation of relaxed memory accesses and release-acquire fences from the "promising" semantics of [Kang et al. POPL'17] to the ARMv8 POP machine of [Flur et al. POPL'16]. The proof is highly non-trivial because both the ARMv8 POP and the promising semantics provide some extremely weak consistency guarantees for normal memory accesses; however, they do so in rather different ways. Our proof of compilation correctness to ARMv8 POP strengthens the results of the Kan...

  6. Design of methodology for incremental compiler construction

    Directory of Open Access Journals (Sweden)

    Pavel Haluza

    2011-01-01

    Full Text Available The paper deals with possibilities for incremental compiler construction. It presents compiler construction possibilities both for languages with a fixed set of lexical units and for languages with a variable set of lexical units. The methodology design for incremental compiler construction is based on the known algorithms for standard compiler construction and is derived for both groups of languages. The group of languages with a fixed set of lexical units contains languages where each lexical unit has a constant meaning, e.g., common programming languages. For this group of languages the paper tries to solve the problem of incremental semantic analysis, which is based on incremental parsing. In the group of languages with a variable set of lexical units (e.g., the professional typographic system TeX), it is possible to change arbitrarily the meaning of each character of the input file at any time during processing. The change takes effect immediately and its validity can be somehow limited or is given by the end of the input. For this group of languages the paper tries to solve the case when macros are used to temporarily change the category of arbitrary characters.

  7. Compiling the First Monolingual Lusoga Dictionary

    Directory of Open Access Journals (Sweden)

    Minah Nabirye

    2011-10-01

    Full Text Available

    Abstract: In this research article a study is made of the approach followed to compile the first-ever monolingual dictionary for Lusoga. Lusoga is a Bantu language spoken in Uganda by slightly over two million people. Being an under-resourced language, the Lusoga orthography had to be designed, a grammar written, and a corpus built, before embarking on the compilation of the dictionary. This compilation was aimed at attaining an academic degree, hence requiring a rigorous research methodology. Firstly, the prevailing methods for compiling dictionaries were mainly practical and insufficient in explaining the theoretical linguistic basis for dictionary compilation. Since dictionaries are based on meaning, the theory of meaning was used to account for all linguistic data considered in dictionaries. However, meaning is considered at a very abstract level, far removed from the process of compiling dictionaries. Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular theory explains how the different modules of a language contribute information to the different parts of the dictionary article or dictionary information in general. Secondly, the research also had to contend with the different approaches for analysing Bantu languages for Bantu and European audiences. A description of the Bantu- and European-centred approaches to Bantu studies was undertaken in respect of (a) the classification of Lusoga words, and (b) the specification of their citations. As a result, Lusoga lexicography deviates from the prevailing Bantu classification and citation of nouns, adjectives and verbs in particular. The dictionary was tested on two separate occasions and all the feedback was considered in the compilation process. This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary.

  8. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  9. Electronic circuits for communications systems: A compilation

    Science.gov (United States)

    1972-01-01

    The compilation of electronic circuits for communications systems is divided into thirteen basic categories, each representing an area of circuit design and application. The compilation items are moderately complex and, as such, would appeal to the applications engineer. However, the rationale for the selection criteria was tailored so that the circuits would reflect fundamental design principles and applications, with an additional requirement for simplicity whenever possible.

  10. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is the key technology of ever-growing pervasive computing, answering overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not always useful. HCgorilla needs a more complicated strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software's responsibility, we focus in this article on our recent results concerning the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial code or the Java interface output, and to output code executable in parallel by HCgorilla. The prototype compilers are written in Java. An evaluation using an arithmetic test program shows that the prototype compilers are reasonable compared with hand compilation.

  11. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities...... for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  12. 49 CFR 801.57 - Records compiled for law enforcement purposes.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Records compiled for law enforcement purposes. 801... compiled for law enforcement purposes. Pursuant to 5 U.S.C. 552(b)(7), any records compiled for law or..., would disclose investigative procedures and practices, or would endanger the life or security of law...

  13. An Efficient Compiler for Weighted Rewrite Rules

    OpenAIRE

    Mohri, Mehryar; Sproat, Richard

    1996-01-01

    Context-dependent rewrite rules are used in many areas of natural language and speech processing. Work in computational phonology has demonstrated that, given certain conditions, such rewrite rules can be represented as finite-state transducers (FSTs). We describe a new algorithm for compiling rewrite rules into FSTs. We show the algorithm to be simpler and more efficient than existing algorithms. Further, many of our applications demand the ability to compile weighted rules into weighted FST...
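
    The semantics of one such rule can be sketched naively (this is not the paper's algorithm, which compiles whole rule systems into finite-state transducers): a context-dependent rewrite rule phi -> psi / lambda _ rho replaces phi by psi only when phi is preceded by lambda and followed by rho.

        # Naive, unweighted illustration of a single rewrite rule's meaning using a regex;
        # the paper's approach instead compiles such rules into (weighted) transducers.
        import re

        def apply_rule(s, phi, psi, left, right):
            return re.sub(rf"(?<={left}){phi}(?={right})", psi, s)

        # Hypothetical rule: "a" -> "b" only between "c" and "d".
        print(apply_rule("cad cat aad", "a", "b", "c", "d"))  # -> cbd cat aad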

  14. HOPE: Just-in-time Python compiler for astrophysical computations

    Science.gov (United States)

    Akeret, Joel; Gamper, Lukas; Amara, Adam; Refregier, Alexandre

    2014-11-01

    HOPE is a specialized Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimization on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. By using HOPE, the user benefits from being able to write common numerical code in Python while getting the performance of a compiled implementation.
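
    In sketch form, and assuming the decorator is exposed as hope.jit as described in the package documentation (an assumption, not verified here), usage looks like the following: the function body stays within the supported numerical subset of Python and is translated to C++ and compiled the first time it is called.

        # Minimal sketch of decorator-driven JIT compilation (assumed API: hope.jit).
        import numpy as np
        import hope

        @hope.jit
        def pairwise_dist2(x, y, out):
            # Plain numerical loops: translated to C++ and compiled on first call.
            for i in range(x.shape[0]):
                for j in range(y.shape[0]):
                    d0 = x[i, 0] - y[j, 0]
                    d1 = x[i, 1] - y[j, 1]
                    out[i, j] = d0 * d0 + d1 * d1

        x = np.random.rand(100, 2)
        y = np.random.rand(80, 2)
        out = np.empty((100, 80))
        pairwise_dist2(x, y, out)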

  15. Compiling the First Monolingual Lusoga Dictionary | Nabirye | Lexikos

    African Journals Online (AJOL)

    Another theory, the theory of modularity, was used to bridge the gap between the theory of meaning and the compilation process. The modular ... This article, then, gives an overall summary of all the steps involved in the compilation of the Eiwanika ly'Olusoga, i.e. the Monolingual Lusoga Dictionary. Keywords: lexicography ...

  16. Data compilation for particle-impact desorption, 2

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeutchi, Fujio.

    1985-07-01

    The particle impact desorption is one of the elementary processes of hydrogen recycling in controlled thermonuclear fusion reactors. We have surveyed the literature concerning the ion impact desorption and photon stimulated desorption published through the end of 1984 and compiled the data on the desorption cross sections and yields with the aid of a computer. This report presents the results of the compilation in graphs and tables as functions of incident energy, surface temperature and surface coverage. (author)

  17. An Initial Evaluation of the NAG f90 Compiler

    Directory of Open Access Journals (Sweden)

    Michael Metcalf

    1992-01-01

    Full Text Available A few weeks before the formal publication of the ISO Fortran 90 Standard, NAG announced the world's first f90 compiler. We have evaluated the compiler by using it to assess the impact of Fortran 90 on the CERN Program Library.

  18. Compiled MPI: Cost-Effective Exascale Applications Development

    Energy Technology Data Exchange (ETDEWEB)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A; Hoefler, T

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over

  19. Solidify, An LLVM pass to compile LLVM IR into Solidity

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-12

    The software currently compiles LLVM IR into Solidity (Ethereum’s dominant programming language) using LLVM’s pass library. Specifically, this compiler allows us to convert an arbitrary DSL into Solidity. We focus specifically on converting domain-specific languages into Solidity due to their ease of use and provable properties. By creating a toolchain to compile lightweight domain-specific languages into Ethereum's dominant language, Solidity, we allow non-specialists to effectively develop safe and useful smart contracts. For example, lawyers from a certain firm can have a proprietary DSL that codifies basic laws safely converted to Solidity to be securely executed on the blockchain. In another example, a simple provenance tracking language can be compiled and securely executed on the blockchain.

  20. Compilation of current high-energy-physics experiments

    International Nuclear Information System (INIS)

    Wohl, C.G.; Kelly, R.L.; Armstrong, F.E.

    1980-04-01

    This is the third edition of a compilation of current high energy physics experiments. It is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and ten participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), the Institute for Nuclear Study, Tokyo (INS), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. The compilation includes summaries of all high energy physics experiments at the above laboratories that (1) were approved (and not subsequently withdrawn) before about January 1980, and (2) had not completed taking of data by 1 January 1976

  1. SVM Support in the Vienna Fortran Compilation System

    OpenAIRE

    Brezany, Peter; Gerndt, Michael; Sipkova, Viera

    1994-01-01

    Vienna Fortran, a machine-independent language extension to Fortran which allows the user to write programs for distributed-memory systems using global addresses, provides the forall-loop construct for specifying irregular computations that do not cause inter-iteration dependences. Compilers for distributed-memory systems generate code that is based on runtime analysis techniques and is only efficient if, in addition, aggressive compile-time optimizations are applied. Since these optimization...

  2. A Compilation of Internship Reports - 2012

    Energy Technology Data Exchange (ETDEWEB)

    Stegman M.; Morris, M.; Blackburn, N.

    2012-08-08

    This compilation documents all research projects undertaken by the 2012 summer Department of Energy - Workforce Development for Teachers and Scientists interns during their internship program at Brookhaven National Laboratory.

  3. Semantics-Based Compiling: A Case Study in Type-Directed Partial Evaluation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Vestergaard, René

    1996-01-01

    in the style of denotational semantics; – the output of the generated compiler is effectively three-address code, in the fashion and efficiency of the Dragon Book; – the generated compiler processes several hundred lines of source code per second. The source language considered in this case study is imperative......, block-structured, higher-order, call-by-value, allows subtyping, and obeys stack discipline. It is bigger than what is usually reported in the literature on semantics-based compiling and partial evaluation. Our compiling technique uses the first Futamura projection, i.e., we compile programs...... by specializing a definitional interpreter with respect to the program. Specialization is carried out using type-directed partial evaluation, which is a mild version of partial evaluation akin to lambda-calculus normalization. Our definitional interpreter follows the format of denotational semantics, with a clear...
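
    The first Futamura projection referred to in these abstracts can be illustrated in miniature (a sketch under stated assumptions, not the paper's type-directed partial evaluator): specializing an interpreter with respect to a fixed program yields a compiled version of that program. Below, specialization is modelled crudely by building one closure per expression node, so the interpretive tree walk is paid once rather than at every run.

        # Miniature first Futamura projection: compile = specialize(interpreter, program).
        def compile_expr(e):
            """e is ('lit', n), ('var', name) or ('add', e1, e2)."""
            tag = e[0]
            if tag == 'lit':
                n = e[1]
                return lambda env: n
            if tag == 'var':
                name = e[1]
                return lambda env: env[name]
            if tag == 'add':
                f, g = compile_expr(e[1]), compile_expr(e[2])
                return lambda env: f(env) + g(env)
            raise ValueError(tag)

        prog = ('add', ('var', 'x'), ('lit', 40))
        compiled = compile_expr(prog)   # specialization happens once, here
        print(compiled({'x': 2}))       # 42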

  4. Compilation of results 1987

    International Nuclear Information System (INIS)

    1987-01-01

    A compilation is carried out which presents, in concentrated form, reports on research and development within the nuclear energy field covering a period of two and a half years. The preceding report was edited in December 1984. The projects are presented with title, project number, responsible unit, person to contact and short result reports. The result reports consist of short summaries of each project. (L.F.)

  5. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    Science.gov (United States)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and was implemented in hardware by researchers and slow in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  6. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  7. Compilations and evaluations of nuclear structure and decay data

    International Nuclear Information System (INIS)

    Lorenz, A.

    The material contained in this compilation is sorted according to eight subject categories: 1. General Compilations; 2. Basic Isotopic Properties; 3. Nuclear Structure Properties; 4. Nuclear Decay Processes: Half-lives, Energies and Spectra; 5. Nuclear Decay Processes: Gamma-rays; 6. Nuclear Decay Processes: Fission Products; 7. Nuclear Decay Processes: (Others); 8. Atomic Processes

  8. 1988 Bulletin compilation and index

    International Nuclear Information System (INIS)

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information

  9. 1988 Bulletin compilation and index

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1989-02-01

    This document is published to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1988 calendar year. A table of contents and one index have been provided to assist in finding information.

  10. Automating Visualization Service Generation with the WATT Compiler

    Science.gov (United States)

    Bollig, E. F.; Lyness, M. D.; Erlebacher, G.; Yuen, D. A.

    2007-12-01

    As tasks and workflows become increasingly complex, software developers are devoting increasing attention to automation tools. Among many examples, the Automator tool from Apple collects components of a workflow into a single script, with very little effort on the part of the user. Tasks are most often described as a series of instructions. The granularity of the tasks dictates the tools to use. Compilers translate fine-grained instructions to assembler code, while scripting languages (ruby, perl) are used to describe a series of tasks at a higher level. Compilers can also be viewed as transformational tools: a cross-compiler can translate executable code written on one computer to assembler code understood on another, while transformational tools can translate from one high-level language to another. We are interested in creating visualization web services automatically, starting from stand-alone VTK (Visualization Toolkit) code written in Tcl. To this end, using the OCaml programming language, we have developed a compiler that translates Tcl into C++, including all the stubs, classes and methods to interface with gSOAP, a C++ implementation of the SOAP 1.1/1.2 protocols. This compiler, referred to as the Web Automation and Translation Toolkit (WATT), is the first step towards automated creation of specialized visualization web services without input from the user. The WATT compiler seeks to automate all aspects of web service generation, including the transport layer, the division of labor and the details related to interface generation. The WATT compiler is part of ongoing efforts within the NSF funded VLab consortium [1] to facilitate and automate time-consuming tasks for the science related to understanding planetary materials. Through examples of services produced by WATT for the VLab portal, we will illustrate features, limitations and the improvements necessary to achieve the ultimate goal of complete and transparent automation in the generation of web

  11. Compilation of current high energy physics experiments - Sept. 1978

    Energy Technology Data Exchange (ETDEWEB)

    Addis, L.; Odian, A.; Row, G. M.; Ward, C. E. W.; Wanderer, P.; Armenteros, R.; Joos, P.; Groves, T. H.; Oyanagi, Y.; Arnison, G. T. J.; Antipov, Yu; Barinov, N.

    1978-09-01

    This compilation of current high-energy physics experiments is a collaborative effort of the Berkeley Particle Data Group, the SLAC library, and the nine participating laboratories: Argonne (ANL), Brookhaven (BNL), CERN, DESY, Fermilab (FNAL), KEK, Rutherford (RHEL), Serpukhov (SERP), and SLAC. Nominally, the compilation includes summaries of all high-energy physics experiments at the above laboratories that were approved (and not subsequently withdrawn) before about June 1978, and had not completed taking of data by 1 January 1975. The experimental summaries are supplemented with three indexes to the compilation, several vocabulary lists giving names or abbreviations used, and a short summary of the beams at each of the laboratories (except Rutherford). The summaries themselves are included on microfiche. (RWR)

  12. DrawCompileEvolve: Sparking interactive evolutionary art with human creations

    DEFF Research Database (Denmark)

    Zhang, Jinhong; Taarnby, Rasmus; Liapis, Antonios

    2015-01-01

    This paper presents DrawCompileEvolve, a web-based drawing tool which allows users to draw simple primitive shapes, group them together or define patterns in their groupings (e.g. symmetry, repetition). The user’s vector drawing is then compiled into an indirectly encoded genetic representation, which can be evolved interactively, allowing the user to change the image’s colors, patterns and ultimately transform it. The human artist has direct control while drawing the initial seed of an evolutionary run and indirect control while interactively evolving it, thus making DrawCompileEvolve a mixed...

  13. Vectorization vs. compilation in query execution

    NARCIS (Netherlands)

    J. Sompolski (Juliusz); M. Zukowski (Marcin); P.A. Boncz (Peter)

    2011-01-01

    Compiling database queries into executable (sub-) programs provides substantial benefits compared to traditional interpreted execution. Many of these benefits, such as reduced interpretation overhead, better instruction code locality, and providing opportunities to use SIMD
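
    A minimal sketch of the contrast being drawn, assuming a toy selection query (illustrative only, not the systems compared in the paper): tuple-at-a-time interpretation pays per-row dispatch overhead, while a vectorized primitive processes a whole column batch at once.

        import numpy as np

        # Tuple-at-a-time interpretation: per-row dispatch overhead dominates.
        def interpreted_filter(rows, threshold):
            out = []
            for price, qty in rows:              # one interpreter step per tuple
                if price * qty > threshold:
                    out.append((price, qty))
            return out

        # Vectorized execution: one primitive works on a whole column batch,
        # amortising interpretation overhead and opening the door to SIMD.
        def vectorized_filter(price, qty, threshold):
            mask = price * qty > threshold       # a single batched multiply/compare
            return price[mask], qty[mask]

        rows = [(2.0, 10), (5.0, 1), (3.0, 4)]
        price = np.array([r[0] for r in rows])
        qty = np.array([r[1] for r in rows])
        print(interpreted_filter(rows, 8.0))     # [(2.0, 10), (3.0, 4)]
        print(vectorized_filter(price, qty, 8.0))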

  14. Production compilation : A simple mechanism to model complex skill acquisition

    NARCIS (Netherlands)

    Taatgen, N.A.; Lee, F.J.

    2003-01-01

    In this article we describe production compilation, a mechanism for modeling skill acquisition. Production compilation has been developed within the ACT-Rational (ACT-R; J. R. Anderson, D. Bothell, M. D. Byrne, & C. Lebiere, 2002) cognitive architecture and consists of combining and specializing

  15. 32 CFR 806b.19 - Information compiled in anticipation of civil action.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Information compiled in anticipation of civil action. 806b.19 Section 806b.19 National Defense Department of Defense (Continued) DEPARTMENT OF THE AIR... compiled in anticipation of civil action. Withhold records compiled in connection with a civil action or...

  16. Charged particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, C.

    1999-01-01

    We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal reason for setting up the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The main goal of the NACRE network was transparency in the procedure of calculating the rates. More specifically this compilation aims at: 1. updating the experimental and theoretical data; 2. distinctly identifying the sources of the data used in rate calculation; 3. evaluating the uncertainties and errors; 4. providing numerically integrated reaction rates; 5. providing reverse reaction rates and analytical approximations of the adopted rates. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. The compilation is concerned with the reaction rates that are large enough for target lifetimes shorter than the age of the Universe, taken equal to 15 x 10^9 y. The reaction rates are provided for temperatures lower than T = 10^10 K. In parallel with the rate compilation a cross section data base has been created and located at the site http://pntpm.ulb.ac.be/nacre..htm. (authors)
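
    For orientation, the quantity tabulated in such compilations is the thermonuclear reaction rate per particle pair; in standard nuclear-astrophysics notation (a general textbook form, not quoted from the NACRE paper) it reads:

        N_A \langle \sigma v \rangle
            = N_A \left( \frac{8}{\pi \mu} \right)^{1/2} (kT)^{-3/2}
              \int_0^{\infty} \sigma(E)\, E\, e^{-E/kT}\, dE

    where \mu is the reduced mass of the interacting pair, T the temperature and \sigma(E) the cross section; the tabulated rates correspond to this integral evaluated numerically over the compiled cross-section and resonance data.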

  17. Asian collaboration on nuclear reaction data compilation

    International Nuclear Information System (INIS)

    Aikawa, Masayuki; Furutachi, Naoya; Kato, Kiyoshi; Makinaga, Ayano; Devi, Vidya; Ichinkhorloo, Dagvadorj; Odsuren, Myagmarjav; Tsubakihara, Kohsuke; Katayama, Toshiyuki; Otuka, Naohiko

    2013-01-01

    Nuclear reaction data are essential for research and development in nuclear engineering, radiation therapy, nuclear physics and astrophysics. Experimental data must be compiled in a database and be accessible to nuclear data users. One of the nuclear reaction databases is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC) under the auspices of the International Atomic Energy Agency. Recently, collaboration among the Asian NRDC members is being further developed under the support of the Asia-Africa Science Platform Program of the Japan Society for the Promotion of Science. We report the activity for three years to develop the Asian collaboration on nuclear reaction data compilation. (author)

  18. Fault-tolerant digital microfluidic biochips compilation and synthesis

    CERN Document Server

    Pop, Paul; Stuart, Elena; Madsen, Jan

    2016-01-01

    This book describes for researchers in the fields of compiler technology, design and test, and electronic design automation the new area of digital microfluidic biochips (DMBs), and thus offers a new application area for their methods.  The authors present a routing-based model of operation execution, along with several associated compilation approaches, which progressively relax the assumption that operations execute inside fixed rectangular modules.  Since operations can experience transient faults during the execution of a bioassay, the authors show how to use both offline (design time) and online (runtime) recovery strategies. The book also presents methods for the synthesis of fault-tolerant application-specific DMB architectures. ·         Presents the current models used for the research on compilation and synthesis techniques of DMBs in a tutorial fashion; ·         Includes a set of “benchmarks”, which are presented in great detail and includes the source code of most of the t...

  19. PGHPF – An Optimizing High Performance Fortran Compiler for Distributed Memory Machines

    Directory of Open Access Journals (Sweden)

    Zeki Bozkus

    1997-01-01

    High Performance Fortran (HPF) is the first widely supported, efficient, and portable parallel programming language for shared and distributed memory systems. HPF is realized through a set of directive-based extensions to Fortran 90. It enables application developers and Fortran end-users to write compact, portable, and efficient software that will compile and execute on workstations, shared memory servers, clusters, traditional supercomputers, or massively parallel processors. This article describes a production-quality HPF compiler for a set of parallel machines. Compilation techniques such as data and computation distribution, communication generation, run-time support, and optimization issues are elaborated as the basis for an HPF compiler implementation on distributed memory machines. The performance of this compiler on benchmark programs demonstrates that high efficiency can be achieved executing HPF code on parallel architectures.
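
    As a rough sketch of the data-distribution bookkeeping such a compiler must generate (a generic illustration of an HPF-style BLOCK mapping, not PGHPF's actual runtime interface), the following computes the index range owned by each processor and the owner of a given global index.

        # Minimal sketch of HPF-style BLOCK distribution bookkeeping (generic).
        def block_bounds(n, nprocs, p):
            """Half-open global index range [lo, hi) owned by processor p when an
            array of n elements is BLOCK-distributed over nprocs processors."""
            block = -(-n // nprocs)          # ceiling division: elements per block
            lo = min(p * block, n)
            hi = min(lo + block, n)
            return lo, hi

        def owner(i, n, nprocs):
            """Processor that owns global index i under the same mapping."""
            block = -(-n // nprocs)
            return i // block

        n, nprocs = 10, 4
        print([block_bounds(n, nprocs, p) for p in range(nprocs)])
        # [(0, 3), (3, 6), (6, 9), (9, 10)]
        print(owner(7, n, nprocs))           # 2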

  20. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    Science.gov (United States)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were inspected.
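
    To make the first of those phases concrete, a minimal lexical analyser can be written as a single regular-expression scan that classifies characters into tokens; the sketch below is illustrative and not drawn from any of the surveyed tools.

        import re

        # Minimal lexical analyser: classify an input string into (kind, text) tokens.
        TOKEN_SPEC = [
            ("NUMBER", r"\d+(?:\.\d+)?"),
            ("IDENT",  r"[A-Za-z_]\w*"),
            ("OP",     r"[+\-*/=()]"),
            ("SKIP",   r"\s+"),
        ]
        TOKEN_RE = re.compile("|".join(f"(?P<{k}>{p})" for k, p in TOKEN_SPEC))

        def tokenize(src):
            for m in TOKEN_RE.finditer(src):
                if m.lastgroup != "SKIP":
                    yield m.lastgroup, m.group()

        print(list(tokenize("area = width * 2.5")))
        # [('IDENT', 'area'), ('OP', '='), ('IDENT', 'width'), ('OP', '*'), ('NUMBER', '2.5')]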

  1. abc: The AspectBench Compiler for AspectJ

    DEFF Research Database (Denmark)

    Allan, Chris; Avgustinov, Pavel; Christensen, Aske Simon

    2005-01-01

    abc is an extensible, optimising compiler for AspectJ. It has been designed as a workbench for experimental research in aspect-oriented programming languages and compilers. We outline a programme of research in these areas, and we review how abc can help in achieving those research goals...

  2. The Katydid system for compiling KEE applications to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    Components of a system known as Katydid are developed in an effort to compile knowledge-based systems developed in a multimechanism integrated environment (KEE) to Ada. The Katydid core is an Ada library supporting KEE object functionality, and the other elements include a rule compiler, a LISP-to-Ada translator, and a knowledge-base dumper. Katydid employs translation mechanisms that convert LISP knowledge structures and rules to Ada and utilizes basic prototypes of a run-time KEE object-structure library module for Ada. Preliminary results include the semiautomatic compilation of portions of a simple expert system to run in an Ada environment with the described algorithms. It is suggested that Ada can be employed for AI programming and implementation, and the Katydid system is being developed to include concurrency and synchronization mechanisms.

  3. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    Energy Technology Data Exchange (ETDEWEB)

    Foster, I.; Tick, E. (comp.)

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31--November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working in compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led to different communities emphasizing different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.

  4. DLVM: A modern compiler infrastructure for deep learning systems

    OpenAIRE

    Wei, Richard; Schwartz, Lane; Adve, Vikram

    2017-01-01

    Deep learning software demands reliability and performance. However, many of the existing deep learning frameworks are software libraries that act as an unsafe DSL in Python and a computation graph interpreter. We present DLVM, a design and implementation of a compiler infrastructure with a linear algebra intermediate representation, algorithmic differentiation by adjoint code generation, domain-specific optimizations and a code generator targeting GPU via LLVM. Designed as a modern compiler ...

  5. Cross-compilation of ATLAS online software to the PowerPC-VxWorks system

    International Nuclear Information System (INIS)

    Tian Yuren; Li Jin; Ren Zhengyu; Zhu Kejun

    2005-01-01

    BES III selected the ATLAS online software as the framework of its run-control system. BES III applies the PowerPC-VxWorks system on its front-end readout system, so it is necessary to cross-compile this software to the PowerPC-VxWorks system. The article demonstrates several aspects related to this project, such as the structure and organization of the ATLAS online software, the application of the CMT tool while cross-compiling, the selection and configuration of the cross-compiler, and methods to solve various problems due to the differences of compiler and operating system, etc. The software, after cross-compiling, runs normally and, together with the software running on the Linux system, makes up a complete run-control system. (authors)

  6. Transportation legislative data base: State radioactive materials transportation statute compilation, 1989--1993

    International Nuclear Information System (INIS)

    1994-04-01

    The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United States. The TLDB has been operated by the National Conference of State Legislatures (NCSL) under cooperative agreement with the US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management since 1992. The data base system serves the legislative and regulatory information needs of federal, state, tribal and local governments, the affected private sector and interested members of the general public. Users must be approved by DOE and NCSL. This report is a state statute compilation that updates the 1989 compilation produced by Battelle Memorial Institute, the previous manager of the data base. This compilation includes statutes not included in the prior compilation, as well as newly enacted laws. Statutes not included in the prior compilation show an enactment date prior to 1989. Statutes that deal with low-level radioactive waste transportation are included in the data base as are statutes from the states of Alaska and Hawaii. Over 155 new entries to the data base are summarized in this compilation

  7. A Language for Specifying Compiler Optimizations for Generic Software

    Energy Technology Data Exchange (ETDEWEB)

    Willcock, Jeremiah J. [Indiana Univ., Bloomington, IN (United States)

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  8. Compiling the parallel programming language NestStep to the CELL processor

    OpenAIRE

    Holm, Magnus

    2010-01-01

    The goal of this project is to create a source-to-source compiler which will translate NestStep code to C code. The compiler's job is to replace NestStep constructs with a series of function calls to the NestStep runtime system. NestStep is a parallel programming language extension based on the BSP model. It adds constructs for parallel programming on top of an imperative programming language. For this project, only constructs extending the C language are relevant. The output code will compil...

  9. Compilation of solar abundance data

    International Nuclear Information System (INIS)

    Hauge, Oe.; Engvold, O.

    1977-01-01

    Interest in the previous compilations of solar abundance data by the same authors (ITA--31 and ITA--39) has led to this third, revised edition. Solar abundance data of 67 elements are tabulated and in addition upper limits for the abundances of 5 elements are listed. References are made to 167 papers. A recommended abundance value is given for each element. (JIW)

  10. Compilation of piping benchmark problems - Cooperative international effort

    Energy Technology Data Exchange (ETDEWEB)

    McAfee, W J [comp.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations.

  11. Compiling knowledge-based systems from KEE to Ada

    Science.gov (United States)

    Filman, Robert E.; Bock, Conrad; Feldman, Roy

    1990-01-01

    The dominant technology for developing AI applications is to work in a multi-mechanism, integrated, knowledge-based system (KBS) development environment. Unfortunately, systems developed in such environments are inappropriate for delivering many applications - most importantly, they carry the baggage of the entire Lisp environment and are not written in conventional languages. One resolution of this problem would be to compile applications from complex environments to conventional languages. Described here are the first efforts to develop a system for compiling KBS developed in KEE to Ada (trademark). This system is called KATYDID, for KEE/Ada Translation Yields Development Into Delivery. KATYDID includes early prototypes of a run-time KEE core (object-structure) library module for Ada, and translation mechanisms for knowledge structures, rules, and Lisp code to Ada. Using these tools, part of a simple expert system was compiled (not quite automatically) to run in a purely Ada environment. This experience has given us various insights on Ada as an artificial intelligence programming language, potential solutions of some of the engineering difficulties encountered in early work, and inspiration on future system development.

  12. Compilation of piping benchmark problems - Cooperative international effort

    International Nuclear Information System (INIS)

    McAfee, W.J.

    1979-06-01

    This report is the culmination of an effort initiated in 1976 by the IWGFR to evaluate detailed and simplified analysis methods for piping systems with particular emphasis on piping bends. The procedure was to collect from participating member IWGFR countries descriptions of tests and test results for piping systems or bends, to compile, evaluate, and issue a selected number of these problems for analysis, and to compile and make a preliminary evaluation of the analysis results. The Oak Ridge National Laboratory agreed to coordinate this activity, including compilation of the original problems and the final analyses results. Of the problem descriptions submitted three were selected to be used. These were issued in December 1977. As a follow-on activity, addenda were issued that provided additional data or corrections to the original problem statement. A variety of both detailed and simplified analysis solutions were obtained. A brief comparative assessment of the analyses is contained in this report. All solutions submitted have also been included in order to provide users of this report the information necessary to make their own comparisons or evaluations

  13. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part IV: Normal and Inverted Letter 'h' and 'H' Architecture.

  14. Compilation and analysis of Escherichia coli promoter DNA sequences.

    OpenAIRE

    Hawley, D K; McClure, W R

    1983-01-01

    The DNA sequences of 168 promoter regions (-50 to +10) for Escherichia coli RNA polymerase were compiled. The complete listing was divided into two groups depending upon whether or not the promoter had been defined by genetic (promoter mutations) or biochemical (5' end determination) criteria. A consensus promoter sequence based on homologies among 112 well-defined promoters was determined that was in substantial agreement with previous compilations. In addition, we have tabulated 98 promoter ...

  15. Installation of a new Fortran compiler and effective programming method on the vector supercomputer

    International Nuclear Information System (INIS)

    Nemoto, Toshiyuki; Suzuki, Koichiro; Watanabe, Kenji; Machida, Masahiko; Osanai, Seiji; Isobe, Nobuo; Harada, Hiroo; Yokokawa, Mitsuo

    1992-07-01

    The Fortran compiler, version 10, was replaced with the new version 12 (V12) on the Fujitsu computer system at JAERI in May 1992. A benchmark test of the performance of the V12 compiler was carried out with 16 representative nuclear codes in advance of the installation of the compiler. The performance improvement of the compiler is a factor of 1.13 on average. The effect of the enhanced functions of the compiler and the compatibility with the nuclear codes are also examined. The assistant tool for vectorization, TOP10EX, has been developed. In this report, the results of the evaluation of the V12 compiler and the usage of the tools for vectorization are presented. (author)

  16. Compiler-Agnostic Function Detection in Binaries

    NARCIS (Netherlands)

    Andriesse, D.A.; Slowinska, J.M.; Bos, H.J.

    2017-01-01

    We propose Nucleus, a novel function detection algorithm for binaries. In contrast to prior work, Nucleus is compiler-agnostic, and does not require any learning phase or signature information. Instead of scanning for signatures, Nucleus detects functions at the Control Flow Graph-level, making it
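
    A heavily simplified sketch of the general idea (our reading, not the published Nucleus algorithm; block addresses and edge sets are made up): treat as function entries those basic blocks that are targets of call edges, or that are never reached through intraprocedural control-flow edges.

        # Simplified sketch of CFG-level function entry detection (illustrative only).
        # Basic blocks are labelled by address; edges are split into intraprocedural
        # control flow ("flow") and call edges ("call").
        flow_edges = {0x100: [0x110], 0x110: [0x120, 0x130], 0x200: [0x210]}
        call_edges = {0x120: [0x200]}
        blocks = {0x100, 0x110, 0x120, 0x130, 0x200, 0x210, 0x300}

        def detect_entries(blocks, flow_edges, call_edges):
            call_targets = {t for ts in call_edges.values() for t in ts}
            flow_targets = {t for ts in flow_edges.values() for t in ts}
            entries = set()
            for b in blocks:
                if b in call_targets:            # explicitly called
                    entries.add(b)
                elif b not in flow_targets:      # never fallen into or jumped to,
                    entries.add(b)               # e.g. address-taken or tail-called code
            return sorted(entries)

        print([hex(b) for b in detect_entries(blocks, flow_edges, call_edges)])
        # ['0x100', '0x200', '0x300']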

  17. Study of energetic electrons in the outer radiation-belt regions using data obtained by the LLL spectrometer on OGO-5 in 1968

    International Nuclear Information System (INIS)

    West, H.I. Jr.; Buck, R.M.; Davidson, G.

    1979-01-01

    An account is given of measurements of electrons made by the LLL magnetic electron spectrometer (60 to 3000 keV in seven differential energy channels) on the Ogo-5 satellite in the earth's outer-belt regions during 1968 and early 1969. The data were analyzed specifically to determine pitch-angle diffusion lifetimes as a function of energy in the L-range 2 to 5. As a part of this effort, the general dynamics of these regions were studied in terms of the time-dependent energy spectra, and pitch-angle distributions for the seven energy groups were obtained as a function of L with representative values presented for L = 2.5 to 6. The pitch-angle-diffusion results were used to analyze the dynamics of the electrons injected following the intense storms on October 31 and November 1, 1968, in terms of radial diffusion; the derived diffusion coefficients provide a quite reasonable picture of electron transport in the radiation belts. Both the radial- and pitch-angle-diffusion results are compared with earlier results. 53 references

  18. Regulatory and technical reports (abstract index journal): Annual compilation for 1987

    International Nuclear Information System (INIS)

    1988-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  19. Compilation of Instantaneous Source Functions for Varying ...

    African Journals Online (AJOL)

    Compilation of Instantaneous Source Functions for Varying Architecture of a Layered Reservoir with Mixed Boundaries and Horizontal Well Completion Part III: B-Shaped Architecture with Vertical Well in the Upper Layer.

  20. Regulatory and technical reports. Compilation for second quarter 1982, April to June

    International Nuclear Information System (INIS)

    1982-08-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. A detailed explanation of the entries precedes each index

  1. Gravity Data for Indiana (300 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity data (300 records) were compiled by Purdue University. This data base was received in February 1993. Principal gravity parameters include Free-air...

  2. Compiler Driven Code Comments and Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Karlsson, Sven

    2011-01-01

    We demonstrate the ability of our tool to transform code, and suggest code refactoring that increases its amenability to optimization. The preliminary results show that, with our tool-set, automatic loop parallelization with the GNU C compiler, gcc, yields 8.6x best-case speedup over...

  3. Regulatory and technical reports: compilation for third quarter 1982 July-September

    International Nuclear Information System (INIS)

    1982-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number Index; Personal Author Index; Subject Index; NRC Originating Organization Index (Staff Reports); NRC Contract Sponsor Index (Contractor Reports); Contractor Index; and Licensed Facility Index

  4. Perspex machine: V. Compilation of C programs

    Science.gov (United States)

    Spanner, Matthew P.; Anderson, James A. D. W.

    2006-01-01

    The perspex machine arose from the unification of the Turing machine with projective geometry. The original, constructive proof used four special, perspective transformations to implement the Turing machine in projective geometry. These four transformations are now generalised and applied in a compiler, implemented in Pop11, that converts a subset of the C programming language into perspexes. This is interesting both from a geometrical and a computational point of view. Geometrically, it is interesting that program source can be converted automatically to a sequence of perspective transformations and conditional jumps, though we find that the product of homogeneous transformations with normalisation can be non-associative. Computationally, it is interesting that program source can be compiled for a Reduced Instruction Set Computer (RISC), the perspex machine, that is a Single Instruction, Zero Exception (SIZE) computer.

  5. Compiling gate networks on an Ising quantum computer

    International Nuclear Information System (INIS)

    Bowdrey, M.D.; Jones, J.A.; Knill, E.; Laflamme, R.

    2005-01-01

    Here we describe a simple mechanical procedure for compiling a quantum gate network into the natural gates (pulses and delays) for an Ising quantum computer. The aim is not necessarily to generate the most efficient pulse sequence, but rather to develop an efficient compilation algorithm that can be easily implemented in large spin systems. The key observation is that it is not always necessary to refocus all the undesired couplings in a spin system. Instead, the coupling evolution can simply be tracked and then corrected at some later time. Although described within the language of NMR, the algorithm is applicable to any design of quantum computer based on Ising couplings
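
    A loose sketch of the track-and-correct idea (our reading of the abstract, not the authors' pulse-sequence code; the coupling values are invented): accumulate the undesired Ising coupling angle for each spin pair during delays, and compute the residual correction to fold into a later point in the sequence instead of refocusing immediately.

        import math

        # Loose sketch: track undesired Ising (ZZ) coupling evolution instead of
        # refocusing it immediately, then compute the residual angle to correct later.
        # Couplings J are in rad/s, delays in s; the numbers are purely illustrative.
        J = {("q1", "q2"): 2 * math.pi * 50.0, ("q2", "q3"): 2 * math.pi * 20.0}

        accumulated = {pair: 0.0 for pair in J}

        def track_delay(t):
            """Record the coupling angle picked up by every pair during a delay."""
            for pair, j in J.items():
                accumulated[pair] = (accumulated[pair] + j * t) % (2 * math.pi)

        def corrections():
            """Angle each pair still needs (mod 2 pi) to undo the tracked evolution;
            in a real pulse sequence this would be folded into later gates."""
            return {pair: (2 * math.pi - a) % (2 * math.pi)
                    for pair, a in accumulated.items()}

        track_delay(0.003)      # 3 ms of free evolution
        track_delay(0.001)      # another 1 ms, still uncorrected
        print(corrections())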

  6. Using MaxCompiler for High Level Synthesis of Trigger Algorithms

    CERN Document Server

    Summers, Sioni Paris; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  7. Compilation status and research topics in Hokkaido University Nuclear Reaction Data Centre

    International Nuclear Information System (INIS)

    Aikawa, M.; Furutachi, N.; Katō, K.; Ebata, S.; Ichinkhorloo, D.; Imai, S.; Sarsembayeva, A.; Zhou, B.; Otuka, N.

    2015-01-01

    Nuclear reaction data are necessary and applicable for many application fields. The nuclear reaction data must be compiled into a database for convenient availability. One such database is the EXFOR database maintained by the International Network of Nuclear Reaction Data Centres (NRDC). As a member of the NRDC, the Hokkaido University Nuclear Reaction Data Centre (JCPRG) compiles charged-particle induced reaction data and contributes about 10 percent of the EXFOR database. In this paper, we show the recent compilation status and related research topics of JCPRG. (author)

  8. 36 CFR 902.57 - Investigatory files compiled for law enforcement purposes.

    Science.gov (United States)

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Investigatory files compiled for law enforcement purposes. 902.57 Section 902.57 Parks, Forests, and Public Property PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION FREEDOM OF INFORMATION ACT Exemptions From Public Access to Corporation Records § 902.57 Investigatory files compiled...

  9. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  10. Regular expressions compiler and some applications

    International Nuclear Information System (INIS)

    Saldana A, H.

    1978-01-01

    We deal with the high-level programming language of a Regular Expressions Compiler (REC). The first chapter is an introduction in which the history of the REC development and the problems related to its numerous applications are described. The syntactic and semantic rules as well as the language features are discussed just after the introduction. Concerning the applications, as examples an adaptation is given for solving numerical problems and another for data manipulation. The last chapter is an exposition of ideas and techniques about compiler construction. Examples of the adaptation to numerical problems show applications to education, vector analysis, quantum mechanics, physics, mathematics and other sciences. The rudiments of an operating system for a minicomputer are the examples of the adaptation to symbolic data manipulation. REC is a programming language that could be applied to solve problems in almost any human activity. Handling of computer graphics, control equipment, research on languages, microprocessors and general research are some of the fields in which this programming language can be applied and developed. (author)

  11. Compiling models into real-time systems

    International Nuclear Information System (INIS)

    Dormoy, J.L.; Cherriaux, F.; Ancelin, J.

    1992-08-01

    This paper presents an architecture for building real-time systems from models, and model-compiling techniques. This has been applied for building a real-time model-based monitoring system for nuclear plants, called KSE, which is currently being used in two plants in France. We describe how we used various artificial intelligence techniques for building it: a model-based approach, a logical model of its operation, a declarative implementation of these models, and original knowledge-compiling techniques for automatically generating the real-time expert system from those models. Some of those techniques have just been borrowed from the literature, but we had to modify or invent other techniques which simply did not exist. We also discuss two important problems, which are often underestimated in the artificial intelligence literature: size, and errors. Our architecture, which could be used in other applications, combines the advantages of the model-based approach with the efficiency requirements of real-time applications, while in general model-based approaches present serious drawbacks on this point

  12. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
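
    For intuition, a toy example of the compilation target (a generic arithmetic-circuit/network-polynomial encoding of a two-node Bayesian network, not PRIMULA's actual representation): multiplying evidence indicators with network parameters and summing gives the probability of the clamped evidence.

        # Toy network A -> B compiled into its network polynomial
        #   f = sum_a sum_b  lambda_a * lambda_b * theta_a * theta_{b|a}
        # Evaluating f with evidence indicators set to 0/1 gives P(evidence).
        theta_a = {"a0": 0.6, "a1": 0.4}
        theta_b_given_a = {("b0", "a0"): 0.9, ("b1", "a0"): 0.1,
                           ("b0", "a1"): 0.2, ("b1", "a1"): 0.8}

        def evaluate(lam_a, lam_b):
            total = 0.0
            for a in ("a0", "a1"):
                for b in ("b0", "b1"):
                    total += lam_a[a] * lam_b[b] * theta_a[a] * theta_b_given_a[(b, a)]
            return total

        # P(B = b1): clamp B's indicator, leave A summed out.
        print(evaluate({"a0": 1, "a1": 1}, {"b0": 0, "b1": 1}))   # 0.38
        # No evidence: the polynomial sums to 1.
        print(evaluate({"a0": 1, "a1": 1}, {"b0": 1, "b1": 1}))   # 1.0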

  13. Safety and maintenance engineering: A compilation

    Science.gov (United States)

    1974-01-01

    A compilation is presented for the dissemination of information on technological developments which have potential utility outside the aerospace and nuclear communities. Safety of personnel engaged in the handling of hazardous materials and equipment, protection of equipment from fire, high wind, or careless handling by personnel, and techniques for the maintenance of operating equipment are reported.

  14. Compilation of cross-sections. Pt. 1

    International Nuclear Information System (INIS)

    Flaminio, V.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1983-01-01

    A compilation of integral cross-sections for hadronic reactions is presented. This is an updated version of CERN/HERA 79-1, 79-2, 79-3. It contains all data published up to the beginning of 1982, but some more recent data have also been included. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  15. Compilation of information on melter modeling

    International Nuclear Information System (INIS)

    Eyler, L.L.

    1996-03-01

    The objective of the task described in this report is to compile information on modeling capabilities for the High-Temperature Melter and the Cold Crucible Melter and issue a modeling capabilities letter report summarizing existing modeling capabilities. The report is to include strategy recommendations for future modeling efforts to support the High Level Waste (HLW) melter development

  16. Verified compilation of Concurrent Managed Languages

    Science.gov (United States)

    2017-11-01

    Final technical report, Purdue University, November 2017; approved for public release.

  17. Compilation of cross-sections. Pt. 4

    International Nuclear Information System (INIS)

    Alekhin, S.I.; Ezhela, V.V.; Lugovsky, S.B.; Tolstenkov, A.N.; Yushchenko, O.P.; Baldini, A.; Cobal, M.; Flaminio, V.; Capiluppi, P.; Giacomelli, G.; Mandrioli, G.; Rossi, A.M.; Serra, P.; Moorhead, W.G.; Morrison, D.R.O.; Rivoire, N.

    1987-01-01

    This is the fourth volume in our series of data compilations on integrated cross-sections for weak, electromagnetic, and strong interaction processes. This volume covers data on reactions induced by photons, neutrinos, hyperons, and K_L^0. It contains all data published up to June 1986. Plots of the cross-sections versus incident laboratory momentum are also given. (orig.)

  18. Compilation of nuclear safety criteria potential application to DOE nonreactor facilities

    International Nuclear Information System (INIS)

    1992-03-01

    This bibliographic document compiles nuclear safety criteria applied to the various areas of nuclear safety addressed in a Safety Analysis Report for a nonreactor nuclear facility (NNF). The criteria listed are derived from federal regulations, Nuclear Regulatory Commission (NRC) guides and publications, DOE and DOE contractor publications, and industry codes and standards. The titles of the chapters and sections of Regulatory Guide 3.26, "Standard Format and Content of Safety Analysis Reports for Fuel Reprocessing Plants" were used to format the chapters and sections of this compilation. In each section the criteria are compiled in four groups, namely: (1) Code of Federal Regulations, (2) USNRC Regulatory Guides, (3) Codes and Standards, and (4) Supplementary Information

  19. Parallelizing More Loops with Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    2012-01-01

    an interactive compilation feedback system that guides programmers in iteratively modifying their application source code. This helps leverage the compiler’s ability to generate loop-parallel code. We employ our system to modify two sequential benchmarks dealing with image processing and edge detection...
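
    As a hedged illustration of the kind of refactoring such feedback typically suggests (not an example taken from the paper), privatizing a scratch variable removes the spurious loop-carried dependence that blocks automatic parallelization.

        # Before (as a parallelizing compiler sees it): the shared scratch variable
        # `tmp` is written every iteration, forcing the compiler to assume a
        # loop-carried dependence and keep the loop sequential.
        def smooth_shared(pixels):
            out = [0] * len(pixels)
            tmp = 0
            for i in range(1, len(pixels) - 1):
                tmp = pixels[i - 1] + pixels[i] + pixels[i + 1]
                out[i] = tmp // 3
            return out

        # After: `tmp` is private to each iteration, the iterations are independent,
        # and the loop becomes amenable to automatic parallelization.
        def smooth_private(pixels):
            out = [0] * len(pixels)
            for i in range(1, len(pixels) - 1):
                tmp = pixels[i - 1] + pixels[i] + pixels[i + 1]
                out[i] = tmp // 3
            return out

        data = [3, 6, 9, 12, 15]
        assert smooth_shared(data) == smooth_private(data) == [0, 6, 9, 12, 0]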

  20. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  1. Deep knowledge and knowledge compilation for dynamic systems

    International Nuclear Information System (INIS)

    Mizoguchi, Riichiro

    1994-01-01

    Expert systems are viewed as knowledge-based systems which efficiently solve real-world problems based on the expertise contained in their knowledge bases elicited from domain experts. Although such expert systems, which depend on the heuristics of domain experts, have contributed to the current success, they are known to be brittle and hard to build. This paper is concerned with research on model-based diagnosis and knowledge compilation for dynamic systems conducted by the author's group to overcome these difficulties. Firstly, we summarize the advantages and shortcomings of expert systems. Secondly, deep knowledge and knowledge compilation are discussed. Then, the latest results of our research on model-based diagnosis are overviewed. The future direction of knowledge base technology research is also discussed. (author)

  2. 1991 OCRWM bulletin compilation and index

    International Nuclear Information System (INIS)

    1992-05-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management, to provide current information about the national program for managing spent fuel and high-level radioactive waste. The document is a compilation of issues from the 1991 calendar year. A table of contents and an index have been provided to reference information contained in this year's Bulletins

  3. You know the Science. Do you know your Code?

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    This talk is about automated code analysis and transformation tools to support scientific computing. Code bases are difficult to manage because of size, age, or safety requirements. Tools can help scientists and IT engineers understand their code, locate problems, and improve quality. Tools can also help transform the code, by implementing complex refactorings, replatforming, or migration to a modern language. Such tools are themselves difficult to build. This talk describes DMS, a meta-tool for building software analysis tools. DMS is a kind of generalized compiler, and can be configured to process arbitrary programming languages, to carry out arbitrary analyses, and to convert specifications into running code. It has been used for a variety of purposes, including converting embedded mission software in the US B-2 Stealth Bomber, providing the US Social Security Administration with a deep view of how their 200 million lines of COBOL are connected, and reverse-engineering legacy factory process control code i...

  4. National energetic balance. Statistical compilation 1985-1991

    International Nuclear Information System (INIS)

    1992-01-01

    This document compiles the statistical information supplied by the governmental and private institutions that make up the national energy sector in Paraguay. The first part refers to the overall energy supply; the second covers the energy transformation centres; and the last part presents the energy flows, consolidated balances and other economic-energy indicators

  5. 13 CFR 146.600 - Semi-annual compilation.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Semi-annual compilation. 146.600 Section 146.600 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION NEW RESTRICTIONS ON LOBBYING.... (c) Information that involves intelligence matters shall be reported only to the Select Committee on...

  6. ZettaBricks: A Language Compiler and Runtime System for Anyscale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Amarasinghe, Saman [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2015-03-27

    This grant supported the ZettaBricks and OpenTuner projects. ZettaBricks is a new implicitly parallel language and compiler where defining multiple implementations of multiple algorithms to solve a problem is the natural way of programming. ZettaBricks makes algorithmic choice a first class construct of the language. Choices are provided in a way that also allows our compiler to tune at a finer granularity. The ZettaBricks compiler autotunes programs by making both fine-grained as well as algorithmic choices. Choices also include different automatic parallelization techniques, data distributions, algorithmic parameters, transformations, and blocking. Additionally, ZettaBricks introduces novel techniques to autotune algorithms for different convergence criteria. When choosing between various direct and iterative methods, the ZettaBricks compiler is able to tune a program in such a way that delivers near-optimal efficiency for any desired level of accuracy. The compiler has the flexibility of utilizing different convergence criteria for the various components within a single algorithm, providing the user with accuracy choice alongside algorithmic choice. OpenTuner is a generalization of the experience gained in building an autotuner for ZettaBricks. OpenTuner is a new open source framework for building domain-specific multi-objective program autotuners. OpenTuner supports fully-customizable configuration representations, an extensible technique representation to allow for domain-specific techniques, and an easy to use interface for communicating with the program to be autotuned. A key capability inside OpenTuner is the use of ensembles of disparate search techniques simultaneously; techniques that perform well will dynamically be allocated a larger proportion of tests.
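
    A very rough sketch of the algorithmic-choice idea (generic, and not the ZettaBricks language or the OpenTuner API): expose two interchangeable algorithms for the same problem and let a trivial tuner time both on a representative input and pick the winner.

        import time

        # Two interchangeable algorithms for the same problem (sorting is a stand-in);
        # a real autotuner would also explore cutoffs, parameters and compositions.
        def algo_insertion(xs):
            out = []
            for x in xs:
                i = len(out)
                while i > 0 and out[i - 1] > x:
                    i -= 1
                out.insert(i, x)
            return out

        def algo_builtin(xs):
            return sorted(xs)

        def autotune(choices, sample):
            best, best_t = None, float("inf")
            for name, fn in choices.items():
                t0 = time.perf_counter()
                fn(sample)
                dt = time.perf_counter() - t0
                if dt < best_t:
                    best, best_t = name, dt
            return best

        sample = list(range(2000, 0, -1))
        choice = autotune({"insertion": algo_insertion, "builtin": algo_builtin}, sample)
        print("selected:", choice)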

  7. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    In the process of ocean development, the technology for maritime communication systems is a hot research field, of which information security is vital for the normal operation of the whole system; it is also one of the difficulties in the research of maritime communication systems. In this paper, a kind of maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of the working mode and the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queue theory, named UFBOQ (unrolling factor based on queue), is proposed to make parallel optimization in the compiler front end with consideration of the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, reducing the effect of memory access latency on parallel computing efficiency, given the continuous data storage characteristics of cryptographic algorithms. The UFBOQ algorithm and scalar replacement prove effective and appropriate, achieving close to linear speedup.
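
    As a hedged simplification of the constraint described above (not the UFBOQ algorithm itself; the sizes are invented), the unroll factor must be capped so that the unrolled body still fits the instruction memory, and scalar replacement then keeps a reused value in a local instead of re-reading memory in every unrolled copy of the body.

        # Simplified sketch: cap the loop unroll factor by instruction-memory capacity.
        def unroll_factor(requested, body_size, overhead, imem_capacity):
            """Largest factor <= requested such that
            factor * body_size + overhead <= imem_capacity (at least 1)."""
            max_fit = (imem_capacity - overhead) // body_size
            return max(1, min(requested, max_fit))

        print(unroll_factor(requested=8, body_size=40, overhead=64, imem_capacity=256))
        # 4  -> 4 * 40 + 64 = 224 <= 256

        # Scalar replacement (conceptually): load a reused value into a local once
        # instead of re-reading it in every unrolled copy of the loop body.
        def saxpy_unrolled(a, x, y):
            n = len(x) - len(x) % 2
            for i in range(0, n, 2):           # unrolled by a factor of 2
                a_local = a                    # scalar kept in a register-like local
                y[i] = a_local * x[i] + y[i]
                y[i + 1] = a_local * x[i + 1] + y[i + 1]
            for i in range(n, len(x)):         # remainder loop
                y[i] = a * x[i] + y[i]
            return y

        print(saxpy_unrolled(2.0, [1, 2, 3], [10, 10, 10]))   # [12.0, 14.0, 16.0]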

  8. Fifth Baltic Sea pollution load compilation (PLC-5). An executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Svendsen, L.M.; Staaf, H.; Pyhala, M.; Kotilainen, P.; Bartnicki, J.; Knuuttila, S.; Durkin, M.

    2012-07-01

    This report summarizes and combines the main results of the Fifth Baltic Sea Pollution Load Compilation (HELCOM 2011), which covers waterborne loads to the sea, together with data on atmospheric loads that are submitted by countries to the co-operative programme for monitoring and evaluation of the long-range transmission of air pollutants in Europe (EMEP), which subsequently compiles and reports this information to HELCOM.

  9. Using MaxCompiler for the high level synthesis of trigger algorithms

    International Nuclear Information System (INIS)

    Summers, S.; Rose, A.; Sanders, P.

    2017-01-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  10. Using MaxCompiler for the high level synthesis of trigger algorithms

    Science.gov (United States)

    Summers, S.; Rose, A.; Sanders, P.

    2017-02-01

    Firmware for FPGA trigger applications at the CMS experiment is conventionally written using hardware description languages such as Verilog and VHDL. MaxCompiler is an alternative, Java based, tool for developing FPGA applications which uses a higher level of abstraction from the hardware than a hardware description language. An implementation of the jet and energy sum algorithms for the CMS Level-1 calorimeter trigger has been written using MaxCompiler to benchmark against the VHDL implementation in terms of accuracy, latency, resource usage, and code size. A Kalman Filter track fitting algorithm has been developed using MaxCompiler for a proposed CMS Level-1 track trigger for the High-Luminosity LHC upgrade. The design achieves a low resource usage, and has a latency of 187.5 ns per iteration.

  11. Languages, compilers and run-time environments for distributed memory machines

    CERN Document Server

    Saltz, J

    1992-01-01

    Papers presented within this volume cover a wide range of topics related to programming distributed memory machines. Distributed memory architectures, although having the potential to supply the very high levels of performance required to support future computing needs, present awkward programming problems. The major issue is to design methods which enable compilers to generate efficient distributed memory programs from relatively machine independent program specifications. This book is the compilation of papers describing a wide range of research efforts aimed at easing the task of programmin

  12. Combining Compile-Time and Run-Time Parallelization

    Directory of Open Access Journals (Sweden)

    Sungdo Moon

    1999-01-01

    This paper demonstrates that significant improvements to automatic parallelization technology require that existing systems be extended in two ways: (1) they must combine high‐quality compile‐time analysis with low‐cost run‐time testing; and (2) they must take control flow into account during analysis. We support this claim with the results of an experiment that measures the safety of parallelization at run time for loops left unparallelized by the Stanford SUIF compiler’s automatic parallelization system. We present results of measurements on programs from two benchmark suites – SPECFP95 and NAS sample benchmarks – which identify inherently parallel loops in these programs that are missed by the compiler. We characterize the remaining parallelization opportunities, and find that most of the loops require run‐time testing, analysis of control flow, or some combination of the two. We present a new compile‐time analysis technique that can be used to parallelize most of these remaining loops. This technique is designed not only to improve the results of compile‐time parallelization, but also to produce low‐cost, directed run‐time tests that allow the system to defer binding of parallelization until run time when safety cannot be proven statically. We call this approach predicated array data‐flow analysis. We augment array data‐flow analysis, which the compiler uses to identify independent and privatizable arrays, by associating predicates with array data‐flow values. Predicated array data‐flow analysis allows the compiler to derive “optimistic” data‐flow values guarded by predicates; these predicates can be used to derive a run‐time test guaranteeing the safety of parallelization.
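
    As a rough illustration of deferring the parallelization decision to run time (a minimal sketch of the idea only, not SUIF's predicated array data-flow analysis), the following fragment guards a parallel loop with a cheap disjointness predicate that a compiler could not prove statically; the function and region names are made up for the example.

        # Sketch of the idea only (not SUIF's predicated array data-flow analysis):
        # independence of the write region a[lo:hi] and the read region
        # a[src:src+n] cannot be proven statically, so a cheap run-time predicate
        # selects between a parallel and a sequential version of the loop.

        from concurrent.futures import ThreadPoolExecutor

        def copy_scaled(a, lo, hi, src, factor):
            n = hi - lo
            regions_disjoint = (src + n <= lo) or (hi <= src)   # run-time test

            def body(i):
                a[lo + i] = factor * a[src + i]

            if regions_disjoint:                 # safe: iterations are independent
                with ThreadPoolExecutor() as pool:
                    list(pool.map(body, range(n)))
            else:                                # fall back to the original loop
                for i in range(n):
                    body(i)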

  13. Gravity Data for Southwestern Alaska (1294 records compiled)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The gravity station data (1294 records) were compiled by the Alaska Geological Survey and the U.S. Geological Survey, Menlo Park, California. This data base was...

  14. On the performance of the HAL/S-FC compiler. [for space shuttles

    Science.gov (United States)

    Martin, F. H.

    1975-01-01

    The HAL/S compilers which will be used in the space shuttles are described. Acceptance test objectives and procedures are described, the raw results are presented and analyzed, and conclusions and observations are drawn. An appendix is included containing an illustrative set of compiler listings and results for one of the test cases.

  15. Expectation Levels in Dictionary Consultation and Compilation ...

    African Journals Online (AJOL)

    Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of the structure ...

  16. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1981-03-01

    A request list for nuclear data which was produced from a computerized data file by the National Nuclear Data Center is presented. The request list is given by target nucleus (isotope) and then reaction type. The purpose of the compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. Requesters are identified by laboratory, last name, and sponsoring US government agency

  17. Expectation Levels in Dictionary Consultation and Compilation*

    African Journals Online (AJOL)

    Abstract: Dictionary consultation and compilation is a two-way engagement between two parties, namely a dictionary user and a lexicographer. How well users cope with looking up words in a Bantu language dictionary and to what extent their expectations are met, depends on their consultation skills, their knowledge of ...

  18. Methods for the Compilation of a Core List of Journals in Toxicology.

    Science.gov (United States)

    Kuch, T. D. C.

    Previously reported methods for the compilation of core lists of journals in multidisciplinary areas are first examined, with toxicology used as an example of such an area. Three approaches to the compilation of a core list of journals in toxicology were undertaken and the results analyzed with the aid of models. Analysis of the results of the…

  19. 21 CFR 20.64 - Records or information compiled for law enforcement purposes.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs (vol. 1, revised as of 2010-04-01). Records or information compiled for law enforcement purposes. Section 20.64, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES, GENERAL PUBLIC INFORMATION, Exemptions. § 20.64 Records or information compiled for law enforcement purposes. (a) Records or...

  20. Mode automata and their compilation into fault trees

    International Nuclear Information System (INIS)

    Rauzy, Antoine

    2002-01-01

    In this article, we advocate the use of mode automata as a high level representation language for reliability studies. Mode automata are states/transitions based representations with the additional notion of flow. They can be seen as a generalization of both finite capacity Petri nets and block diagrams. They can be assembled into hierarchies by means of composition operations. The contribution of this article is twofold. First, we introduce mode automata and we discuss their relationship with other formalisms. Second, we propose an algorithm to compile mode automata into Boolean equations (fault trees). Such a compilation is of interest for two reasons. First, assessment tools for Boolean models are much more efficient than those for states/transitions models. Second, the automated generation of fault trees from higher level representations makes easier their maintenance through the life cycle of systems under study
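
    To make the compilation idea concrete, here is a toy sketch (an assumption-laden illustration, not Rauzy's algorithm): a small states/transitions model of a hypothetical two-pump system is unfolded into a Boolean expression over its triggering events, i.e. the cut sets of a fault tree.

        # Toy illustration (not the article's compilation algorithm): the events
        # along each simple path that reaches a failure state are AND-ed, and the
        # paths are OR-ed, giving a fault-tree-like Boolean expression.
        # The two-pump system below is hypothetical.

        transitions = {
            "nominal":  [("pump1_fails", "degraded")],
            "degraded": [("pump2_fails", "lost"), ("repair", "nominal")],
        }

        def paths_to(target, state="nominal", visited=frozenset(), events=frozenset()):
            """Yield the event sets of simple paths from `state` to `target`."""
            if state == target:
                yield events
                return
            for event, nxt in transitions.get(state, []):
                if nxt not in visited:
                    yield from paths_to(target, nxt, visited | {state}, events | {event})

        cut_sets = list(paths_to("lost"))
        top_event = " OR ".join("(" + " AND ".join(sorted(cs)) + ")" for cs in cut_sets)
        print(top_event)                      # (pump1_fails AND pump2_fails)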

  1. Fusing a Transformation Language with an Open Compiler

    NARCIS (Netherlands)

    Kalleberg, K.T.; Visser, E.

    2007-01-01

    Program transformation systems provide powerful analysis and transformation frameworks as well as concise languages for language processing, but instantiating them for every subject language is an arduous task, most often resulting in half-completed frontends. Compilers provide mature frontends with

  2. Compiler Technology for Parallel Scientific Computation

    Directory of Open Access Journals (Sweden)

    Can Özturan

    1994-01-01

    There is a need for compiler technology that, given the source program, will generate efficient parallel codes for different architectures with minimal user involvement. Parallel computation is becoming indispensable in solving large-scale problems in science and engineering. Yet, the use of parallel computation is limited by the high costs of developing the needed software. To overcome this difficulty we advocate a comprehensive approach to the development of scalable architecture-independent software for scientific computation based on our experience with the equational programming language (EPL). Our approach is based on a program decomposition, parallel code synthesis, and run-time support for parallel scientific computation. The program decomposition is guided by the source program annotations provided by the user. The synthesis of parallel code is based on configurations that describe the overall computation as a set of interacting components. Run-time support is provided by the compiler-generated code that redistributes computation and data during object program execution. The generated parallel code is optimized using techniques of data alignment, operator placement, wavefront determination, and memory optimization. In this article we discuss annotations, configurations, parallel code generation, and run-time support suitable for parallel programs written in the functional parallel programming language EPL and in Fortran.

  3. Notes on Compiling a Corpus- Based Dictionary

    Directory of Open Access Journals (Sweden)

    František Čermák

    2011-10-01

    ABSTRACT: On the basis of a sample analysis of a Czech adjective, a definition based on the data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing at the drawbacks of definitions found in traditional dictionaries. Steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. These are supplemented by additional remarks and caveats useful in the compilation of a dictionary. Thus, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

    OPSOMMING (Summary): Notes on compiling a corpus-based dictionary. On the basis of a sample analysis of a Czech adjective, a definition based on data drawn from the Czech National Corpus (cf. Čermák and Schmiedtová 2003) is gradually compiled and finally offered, pointing at the shortcomings of definitions found in traditional dictionaries. The steps undertaken here are then generalized and used, in an ordered sequence (similar to a work-flow ordering), as topics, briefly discussed in the second part, to which lexicographers of monolingual dictionaries should pay attention. They are supplemented by additional remarks and caveats useful in the compilation of a dictionary. In this way, a brief survey of some of the major steps of dictionary compilation is presented here, supplemented by the original Czech data, analyzed in their raw, though semiotically classified form.

    Sleutelwoorde (Keywords): MONOLINGUAL DICTIONARIES, CORPUS LEXICOGRAPHY, SYNTAGMATICS AND PARADIGMATICS IN DICTIONARIES, DICTIONARY ENTRY, TYPES OF LEMMAS, PRAGMATICS, TREATMENT OF

  4. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  5. Compilation of data from hadronic atoms

    International Nuclear Information System (INIS)

    Poth, H.

    1979-01-01

    This compilation is a survey of the existing data on hadronic atoms (pionic atoms, kaonic atoms, antiprotonic atoms, sigmonic atoms). It collects measurements of the energies, intensities and line widths of X-rays from hadronic atoms. Averaged values for each hadronic atom are given and the data are summarized. The listing contains data on 58 pionic atoms, 54 kaonic atoms, 23 antiprotonic atoms and 20 sigmonic atoms. (orig./HB) [de

  6. HAL/S-FC and HAL/S-360 compiler system program description

    Science.gov (United States)

    1976-01-01

    The compiler is a large multi-phase design and can be broken into four phases: Phase 1 inputs the source language and does a syntactic and semantic analysis generating the source listing, a file of instructions in an internal format (HALMAT) and a collection of tables to be used in subsequent phases. Phase 1.5 massages the code produced by Phase 1, performing machine independent optimization. Phase 2 inputs the HALMAT produced by Phase 1 and outputs machine language object modules in a form suitable for the OS-360 or FCOS linkage editor. Phase 3 produces the SDF tables. The four phases described are written in XPL, a language specifically designed for compiler implementation. In addition to the compiler, there is a large library containing all the routines that can be explicitly called by the source language programmer plus a large collection of routines for implementing various facilities of the language.
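
    The phase structure described above can be illustrated with a deliberately tiny pipeline. The sketch below is only a schematic analogue (the real compiler is written in XPL and its intermediate form is HALMAT; the toy assignment language, tuple IR and constant folding here are invented for illustration).

        # Schematic multi-phase pipeline; only an analogy to the design described
        # above, with a toy "x = a + b" language standing in for the source.

        def phase1_parse(source):
            """Syntactic/semantic analysis -> intermediate code plus a symbol table."""
            code, symbols = [], set()
            for line in source.splitlines():
                target, expr = (p.strip() for p in line.split("="))
                a, op, b = expr.split()
                code.append(("binop", op, a, b, target))
                symbols.add(target)
            return code, symbols

        def phase1_5_optimize(code):
            """Machine-independent optimization: fold operations on constants."""
            folded = []
            for kind, op, a, b, target in code:
                if a.isdigit() and b.isdigit():
                    folded.append(("const", str(eval(a + op + b)), target))
                else:
                    folded.append((kind, op, a, b, target))
            return folded

        def phase2_codegen(code):
            """Emit 'object code' -- here just readable pseudo-instructions."""
            return [" ".join(instr).upper() for instr in code]

        ir, symtab = phase1_parse("x = 1 + 2\ny = x + 3")
        print(phase2_codegen(phase1_5_optimize(ir)))
        # ['CONST 3 X', 'BINOP + X 3 Y']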

  7. Internal combustion engines for alcohol motor fuels: a compilation of background technical information

    Energy Technology Data Exchange (ETDEWEB)

    Blaser, Richard

    1980-11-01

    This compilation, a draft training manual containing technical background information on internal combustion engines and alcohol motor fuel technologies, is presented in 3 parts. The first is a compilation of facts from the state of the art on internal combustion engine fuels and their characteristics and requisites and provides an overview of fuel sources, fuels technology and future projections for availability and alternatives. Part two compiles facts about alcohol chemistry, alcohol identification, production, and use, examines ethanol as spirit and as fuel, and provides an overview of modern evaluation of alcohols as motor fuels and of the characteristics of alcohol fuels. The final section compiles cross references on the handling and combustion of fuels for I.C. engines, presents basic evaluations of events leading to the use of alcohols as motor fuels, reviews current applications of alcohols as motor fuels, describes the formulation of alcohol fuels for engines and engine and fuel handling hardware modifications for using alcohol fuels, and introduces the multifuel engines concept. (LCL)

  8. Borrowing and Dictionary Compilation: The Case of the Indigenous ...

    African Journals Online (AJOL)

    rbr

    Keywords: BORROWING, DICTIONARY COMPILATION, INDIGENOUS LANGUAGES, LEXICON, MORPHEME, VOCABULARY, DEVELOPING LANGUAGES, LOAN WORDS, TERMINOLOGY, ETYMOLOGY, LEXICOGRAPHY. Opsomming (Summary): Borrowing and dictionary compilation: The case of the indigenous South African ...

  9. Scientific Programming with High Performance Fortran: A Case Study Using the xHPF Compiler

    Directory of Open Access Journals (Sweden)

    Eric De Sturler

    1997-01-01

    Recently, the first commercial High Performance Fortran (HPF) subset compilers have appeared. This article reports on our experiences with the xHPF compiler of Applied Parallel Research, version 1.2, for the Intel Paragon. At this stage, we do not expect very high performance from our HPF programs, even though performance will eventually be of paramount importance for the acceptance of HPF. Instead, our primary objective is to study how to convert large Fortran 77 (F77) programs to HPF such that the compiler generates reasonably efficient parallel code. We report on a case study that identifies several problems when parallelizing code with HPF; most of these problems affect current HPF compiler technology in general, although some are specific to the xHPF compiler. We discuss our solutions from the perspective of the scientific programmer, and present timing results on the Intel Paragon. The case study comprises three programs of different complexity with respect to parallelization. We use the dense matrix-matrix product to show that the distribution of arrays and the order of nested loops significantly influence the performance of the parallel program. We use Gaussian elimination with partial pivoting to study the parallelization strategy of the compiler. There are various ways to structure this algorithm for a particular data distribution. This example shows how much effort may be demanded from the programmer to support the compiler in generating an efficient parallel implementation. Finally, we use a small application to show that the more complicated structure of a larger program may introduce problems for the parallelization, even though all subroutines of the application are easy to parallelize by themselves. The application consists of a finite volume discretization on a structured grid and a nested iterative solver. Our case study shows that it is possible to obtain reasonably efficient parallel programs with xHPF, although the compiler

  10. Approximate Compilation of Constraints into Multivalued Decision Diagrams

    DEFF Research Database (Denmark)

    Hadzic, Tarik; Hooker, John N.; O’Sullivan, Barry

    2008-01-01

    We present an incremental refinement algorithm for approximate compilation of constraint satisfaction models into multivalued decision diagrams (MDDs). The algorithm uses a vertex splitting operation that relies on the detection of equivalent paths in the MDD. Although the algorithm is quite gene...

  11. Compiler-Enforced Cache Coherence Using a Functional Language

    Directory of Open Access Journals (Sweden)

    Rich Wolski

    1996-01-01

    The cost of hardware cache coherence, both in terms of execution delay and operational cost, is substantial for scalable systems. Fortunately, compiler-generated cache management can reduce program serialization due to cache contention; increase execution performance; and reduce the cost of parallel systems by eliminating the need for more expensive hardware support. In this article, we use the Sisal functional language system as a vehicle to implement and investigate automatic, compiler-based cache management. We describe our implementation of Sisal for the IBM Power/4. The Power/4, briefly available as a product, represents an early attempt to build a shared memory machine that relies strictly on the language system for cache coherence. We discuss the issues associated with deterministic execution and program correctness on a system without hardware coherence, and demonstrate how Sisal (as a functional language) is able to address those issues.

  12. A compilation of subsurface hydrogeologic data

    International Nuclear Information System (INIS)

    1986-03-01

    This report presents a compilation of both fracture properties and hydrogeological parameters relevant to the flow of groundwater in fractured rock systems. Methods of data acquisition as well as the scale of and conditions during the measurement are recorded. Measurements and analytical techniques for each of the parameters under consideration have been reviewed with respect to their methodology, assumptions and accuracy. Both the rock type and geologic setting associated with these measurements have also been recorded. 373 refs

  13. QMODULE: CAMAC modules recognized by the QAL compiler

    International Nuclear Information System (INIS)

    Kellogg, M.; Minor, M.M.; Shlaer, S.; Spencer, N.; Thomas, R.F. Jr.; van der Beken, H.

    1977-10-01

    The compiler for the Q Analyzer Language, QAL, recognizes a certain set of CAMAC modules as having known characteristics. The conventions and procedures used to describe these modules are discussed as well as the tools available to the user for extending this set as required

  14. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate anomalies with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine airborne and satellite magnetic data in the compilation, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  15. Nuclear power plant operational data compilation system

    International Nuclear Information System (INIS)

    Silberberg, S.

    1980-01-01

    Electricite de France R and D Division has set up a nuclear power plant operational data compilation system. This data bank, created on the basis of American documents, provides results about plant operation and the behaviour of operational materials. At present, French units in commercial operation are taken into account. Results obtained after five years of data bank operation are given. (author)

  16. Digital Bedrock Compilation: A Geodatabase Covering Forest Service Lands in California

    Science.gov (United States)

    Elder, D.; de La Fuente, J. A.; Reichert, M.

    2010-12-01

    This digital database contains bedrock geologic mapping for Forest Service lands within California. This compilation began in 2004 and the first version was completed in 2005. The second publication of this geodatabase was completed in 2010 and filled major gaps in the southern Sierra Nevada and Modoc/Medicine Lake/Warner Mountains areas. This digital map database was compiled from previously published and unpublished geologic mapping, with source mapping and review from the California Geological Survey, the U.S. Geological Survey and others. Much of the source data was itself compilation mapping. This geodatabase is huge, containing ~107,000 polygons and ~280,000 arcs. Mapping was compiled from more than one thousand individual sources and covers over 41,000,000 acres (~166,000 km2). It was compiled from source maps at various scales - from ~1:4,000 to 1:250,000 - and represents the best available geologic mapping at the largest scale possible. An estimated 70-80% of the source information was digitized from geologic mapping at 1:62,500 scale or better. The Forest Service ACT2 Enterprise Team compiled the bedrock mapping and developed a geodatabase to store this information. This geodatabase supports feature classes for polygons (e.g., map units), lines (e.g., contacts, boundaries, faults and structural lines) and points (e.g., orientation data, structural symbology). Lookup tables provide detailed information for feature class items. Lookup/type tables contain legal values and hierarchical groupings for geologic ages and lithologies. Type tables link coded values with descriptions for line and point attributes, such as line type, line location and point type. This digital mapping is at the core of many quantitative analyses and derivative map products. Queries of the database are used to produce maps and to quantify rock types of interest. These include the following: (1) ultramafic rocks - where hazards from naturally occurring asbestos are high, (2) granitic rocks - increased

  17. Incidence and risk factors for lower limb lymphedema after gynecologic cancer surgery with initiation of periodic complex decongestive physiotherapy.

    Science.gov (United States)

    Deura, Imari; Shimada, Muneaki; Hirashita, Keiko; Sugimura, Maki; Sato, Seiya; Sato, Shinya; Oishi, Tetsuro; Itamochi, Hiroaki; Harada, Tasuku; Kigawa, Junzo

    2015-06-01

    Lower limb lymphedema (LLL) is one of the most frequent postoperative complications of retroperitoneal lymphadenectomy for gynecologic cancer. LLL often impairs quality of life, activities of daily living, sleep, and sex in patients with gynecologic cancer. We conducted this study to evaluate the incidence and risk factors for LLL after gynecologic cancer surgery in patients who received assessment and periodic complex decongestive physiotherapy (CDP). We retrospectively reviewed 126 cases of gynecologic cancer that underwent surgery involving retroperitoneal lymphadenectomy at Tottori University Hospital between 2009 and 2012. All patients received physical examinations to detect LLL and underwent CDP by nurse specialists within several months after surgery. The International Society of Lymphology staging of lymphedema severity was used as the diagnostic criteria. Of 126 patients, 57 (45.2%) had LLL, comprising 45 and 12 patients with stage 1 and stage 2 LLL, respectively. No patient had stage 3 LLL. LLL was present in 37 (29.4%) patients at the initial physical examination. Multivariate analysis revealed that adjuvant concurrent chemoradiotherapy and age ≥ 55 years were independent risk factors for ≥ stage 2 LLL. To minimize the incidence of ≥ stage 2 LLL, gynecologic oncologists should be vigilant for this condition in patients who are ≥ 55 years and in those who undergo adjuvant chemoradiotherapy. Patients should be advised to have a physical assessment for LLL and to receive education about CDP immediately after surgery involving retroperitoneal lymphadenectomy for gynecologic cancer.

  18. HAL/S-360 compiler system specification

    Science.gov (United States)

    Johnson, A. E.; Newbold, P. N.; Schulenberg, C. W.; Avakian, A. E.; Varga, S.; Helmers, P. H.; Helmers, C. T., Jr.; Hotz, R. L.

    1974-01-01

    A three phase language compiler is described which produces IBM 360/370 compatible object modules and a set of simulation tables to aid in run time verification. A link edit step augments the standard OS linkage editor. A comprehensive run time system and library provide the HAL/S operating environment, error handling, a pseudo real time executive, and an extensive set of mathematical, conversion, I/O, and diagnostic routines. The specifications of the information flow and content for this system are also considered.

  19. T.J. Kriel (original compiler), D.J. Prinsloo and B.P. Sathekge (compilers revised edition). Popular Northern Sotho Dictionary

    OpenAIRE

    Kwena J. Mashamaite

    2011-01-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  20. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature

    International Nuclear Information System (INIS)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included

  1. Compiler-Directed Transformation for Higher-Order Stencils

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Straalen, Brian Van [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Colella, Phillip [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-07-20

    As the cost of data movement increasingly dominates performance, developers of finite-volume and finite-difference solutions for partial differential equations (PDEs) are exploring novel higher-order stencils that increase numerical accuracy and computational intensity. This paper describes a new compiler reordering transformation applied to stencil operators that performs partial sums in buffers, and reuses the partial sums in computing multiple results. This optimization has multiple effects on improving stencil performance that are particularly important to higher-order stencils: it exploits data reuse, reduces floating-point operations, and exposes efficient SIMD parallelism to backend compilers. We study the benefit of this optimization in the context of Geometric Multigrid (GMG), a widely used method to solve PDEs, using four different Jacobi smoothers built from 7-, 13-, 27- and 125-point stencils. We quantify performance, speedup, and numerical accuracy, and use the Roofline model to qualify our results. Ultimately, we obtain over 4× speedup on the smoothers themselves and up to a 3× speedup on the multigrid solver. Finally, we demonstrate that high-order multigrid solvers have the potential of reducing total data movement and energy by several orders of magnitude.
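
    The core trick, buffering partial sums and reusing them across several outputs, can be shown on a simple 3x3 box stencil (a hedged NumPy illustration, much lower order than the paper's smoothers and not its actual transformation): each horizontal partial sum is computed once and then contributes to the three output rows that overlap it.

        # Illustration of partial-sum buffering on a 3x3 box stencil (toy example,
        # not the paper's compiler transformation or its multigrid smoothers).

        import numpy as np

        def box3x3_naive(x):
            """9 loads and 8 adds per output point."""
            n, m = x.shape
            y = np.zeros_like(x)
            for i in range(1, n - 1):
                for j in range(1, m - 1):
                    y[i, j] = x[i-1:i+2, j-1:j+2].sum()
            return y

        def box3x3_partial_sums(x):
            """Buffer horizontal partial sums and reuse each in the three output
            rows it contributes to: fewer adds and far fewer re-loads."""
            row_sums = x[:, :-2] + x[:, 1:-1] + x[:, 2:]     # one partial sum per (row, window)
            y = np.zeros_like(x)
            y[1:-1, 1:-1] = row_sums[:-2, :] + row_sums[1:-1, :] + row_sums[2:, :]
            return y

        x = np.random.rand(6, 6)
        assert np.allclose(box3x3_naive(x), box3x3_partial_sums(x))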

  2. Expert Programmer versus Parallelizing Compiler: A Comparative Study of Two Approaches for Distributed Shared Memory

    Directory of Open Access Journals (Sweden)

    M. F. P. O'Boyle

    1996-01-01

    This article critically examines current parallel programming practice and optimizing compiler development. The general strategies employed by compiler and programmer to optimize a Fortran program are described, and then illustrated for a specific case by applying them to a well-known scientific program, TRED2, using the KSR-1 as the target architecture. Extensive measurement is applied to the resulting versions of the program, which are compared with a version produced by a commercial optimizing compiler, KAP. The compiler strategy significantly outperforms KAP and does not fall far short of the performance achieved by the programmer. Following the experimental section each approach is critiqued by the other. Perceived flaws, advantages, and common ground are outlined, with an eye to improving both schemes.

  3. Source list of nuclear data bibliographies, compilations, and evaluations

    International Nuclear Information System (INIS)

    Burrows, T.W.; Holden, N.E.

    1978-10-01

    To aid the user of nuclear data, many specialized bibliographies, compilations, and evaluations have been published. This document is an attempt to bring together a list of such publications with an indication of their availability and cost

  4. T.J. Kriel (original compiler, D.J. Prinsloo and B.P. Sathekge (compilers revised edition. Popular Northern Sotho Dictionary

    Directory of Open Access Journals (Sweden)

    Kwena J. Mashamaite

    2011-10-01

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform prospective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the case with this dictionary.

  5. Design Choices in a Compiler Course or How to Make Undergraduates Love Formal Notation

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff

    2008-01-01

    The undergraduate compiler course offers a unique opportunity to combine many aspects of the Computer Science curriculum. We discuss the many design choices that are available for the instructor and present the current compiler course at the University of Aarhus, the design of which displays at l...

  6. A Coarse-Grained Reconfigurable Architecture with Compilation for High Performance

    Directory of Open Access Journals (Sweden)

    Lu Wan

    2012-01-01

    We propose a fast data relay (FDR) mechanism to enhance existing CGRAs (coarse-grained reconfigurable architectures). FDR can not only provide multicycle data transmission concurrently with computations but also convert resource-demanding inter-processing-element global data accesses into local data accesses to avoid communication congestion. We also propose the supporting compiler techniques that can efficiently utilize the FDR feature to achieve higher performance for a variety of applications. Our results on FDR-based CGRA are compared with two other works in this field: ADRES and RCP. Experimental results for various multimedia applications show that FDR combined with the new compiler delivers up to 29% and 21% higher performance than ADRES and RCP, respectively.

  7. Construction experiences from underground works at Forsmark. Compilation Report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders [Vattenfall Power Consultant AB, Stockholm (Sweden); Christiansson, Rolf [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)

    2007-02-15

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily regarding rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. It goes without saying, however, that during the course of the work on this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  8. Construction experiences from underground works at Forsmark. Compilation Report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-02-01

    The main objective of this report, the Construction Experience Compilation Report (CECR), is to compile experiences from the underground works carried out at Forsmark, primarily construction experiences from the tunnelling of the two cooling water tunnels of the Forsmark nuclear power units 1, 2 and 3, and from the underground excavations of the undersea repository for low and intermediate reactor waste, SFR. In addition, a brief account is given of the operational experience of the SFR, primarily regarding rock support solutions. The authors of this report have separately participated throughout the entire construction periods of the Forsmark units and the SFR in the capacity of engineering geologists performing geotechnical mapping of the underground excavations and acted as advisors on tunnel support; Anders Carlsson participated in the construction works of the cooling water tunnels and the open cut excavations for Forsmark 1, 2 and 3 (geotechnical mapping) and the Forsmark 3 tunnel (advice on tunnel support). Rolf Christiansson participated in the underground works for the SFR (geotechnical mapping, principal investigator for various measurements, and advice on tunnel support and grouting). The report is to a great extent based on earlier published material as presented in the list of references. It goes without saying, however, that during the course of the work on this report, unpublished notes, diaries, drawings, photos and personal recollections of the two authors have been utilised in order to obtain as complete a compilation of the construction experiences as possible.

  9. Specification and Compilation of Real-Time Stream Processing Applications

    NARCIS (Netherlands)

    Geuns, S.J.

    2015-01-01

    This thesis is concerned with the specification, compilation and corresponding temporal analysis of real-time stream processing applications that are executed on embedded multiprocessor systems. An example of such an application is software-defined radio. These applications typically

  10. Marshall information retrieval and display system (MIRADS)

    Science.gov (United States)

    Groover, J. L.; Jones, S. C.; King, W. L.

    1974-01-01

    Program for data management system allows sophisticated inquiries while utilizing simplified language. Online system is composed of several programs. System is written primarily in COBOL with routines in ASSEMBLER and FORTRAN V.

  11. Compilation of a preliminary checklist for the differential diagnosis of neurogenic stuttering

    Directory of Open Access Journals (Sweden)

    Mariska Lundie

    2014-06-01

    Objectives: The aim of this study was to describe and highlight the characteristics of NS in order to compile a preliminary checklist for accurate diagnosis and intervention. Method: An explorative, applied mixed method, multiple case study research design was followed. Purposive sampling was used to select four participants. A comprehensive assessment battery was compiled for data collection. Results: The results revealed a distinct pattern of core stuttering behaviours in NS, although discrepancies existed regarding stuttering severity and frequency. It was also found that DS and NS can co-occur. The case history and the core stuttering pattern are important considerations during differential diagnosis, as these are the only consistent characteristics in people with NS. Conclusion: It is unlikely that all the symptoms of NS are present in an individual. The researchers scrutinised the findings of this study and the findings of previous literature to compile a potentially workable checklist.

  12. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia; Angulo, C.; Arnould, M.

    2000-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates, issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (authors)
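
    For orientation, the rates tabulated in such compilations are Maxwellian averages of the cross section. The snippet below is only a schematic numerical evaluation of that average from a tabulated sigma(E) (toy grid, toy cross section, constants and units set to 1; not NACRE's actual fitting or evaluation procedure).

        # Schematic only (toy grid, toy cross section, constants/units set to 1;
        # not NACRE's procedure): the Maxwellian-averaged rate
        #   <sigma*v> = sqrt(8/(pi*mu)) * (kT)**-1.5 * Int sigma(E) * E * exp(-E/kT) dE
        # evaluated numerically from a tabulated cross section.

        import numpy as np

        def maxwellian_rate(E, sigma, kT, mu=1.0):
            """E, sigma: tabulated energies and cross sections; kT in the same
            energy units.  Returns <sigma*v> up to unit conventions."""
            integrand = sigma * E * np.exp(-E / kT)
            return np.sqrt(8.0 / (np.pi * mu)) * kT ** -1.5 * np.trapz(integrand, E)

        E = np.linspace(0.01, 2.0, 400)          # hypothetical energy grid
        sigma = np.exp(-5.0 / np.sqrt(E))        # toy Gamow-like energy dependence
        print(maxwellian_rate(E, sigma, kT=0.1))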

  13. Charged-particle induced thermonuclear reaction rates: a compilation for astrophysics

    International Nuclear Information System (INIS)

    Grama, Cornelia

    1999-01-01

    The rapidly growing wealth of nuclear data becomes less and less easily accessible to the astrophysics community. Mastering this volume of information and making it available in an accurate and usable form for incorporation into stellar evolution or nucleosynthesis models become urgent goals of prime necessity. We report on the results of the European network NACRE (Nuclear Astrophysics Compilation of REaction rates). The principal motivation for the setting-up of the NACRE network has been the necessity of building up a well-documented and detailed compilation of rates for charged-particle induced reactions on stable targets up to Si and on unstable nuclei of special significance in astrophysics. This work is meant to supersede the only existing compilation of reaction rates issued by Fowler and collaborators. The cross section data and/or resonance parameters for a total of 86 charged-particle induced reactions are given and the corresponding reaction rates are calculated and given in tabular form. When cross section data are not available in the whole needed range of energies, the theoretical predictions obtained in the framework of the Hauser-Feshbach model are used. Uncertainties are analyzed and realistic upper and lower bounds of the rates are determined. Reverse reaction rates and analytical approximations of the adopted rates are also provided. (author)

  14. Not mere lexicographic cosmetics: the compilation and structural ...

    African Journals Online (AJOL)

    This article offers a brief overview of the compilation of the Ndebele music terms dictionary, Isichazamazwi SezoMculo (henceforth the ISM), paying particular attention to its structural features. It emphasises that the reference needs of the users as well as their reference skills should be given a determining role in all ...

  15. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1997, July--September

    International Nuclear Information System (INIS)

    Stevenson, L.L.

    1998-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. This report contains the third quarter 1997 abstracts

  16. Data compilation for particle impact desorption

    International Nuclear Information System (INIS)

    Oshiyama, Takashi; Nagai, Siro; Ozawa, Kunio; Takeuchi, Fujio.

    1984-05-01

    The desorption of gases from solid surfaces by incident electrons, ions and photons is one of the important processes of hydrogen recycling in the controlled thermonuclear reactors. We have surveyed the literature concerning the particle impact desorption published through 1983 and compiled the data on the desorption cross sections and desorption yields with the aid of a computer. This report presents the results obtained for electron stimulated desorption, the desorption cross sections and yields being given in graphs and tables as functions of incident electron energy, surface temperature and gas exposure. (author)

  17. Compilation of actinide neutron nuclear data

    International Nuclear Information System (INIS)

    1979-01-01

    The Swedish nuclear data committee has compiled a selected set of neutron cross section data for the 16 most important actinide isotopes. The aim of the report is to present available data in a comprehensible way to allow a comparison between different evaluated libraries and to judge the reliability of these libraries from the experimental data. The data are given in graphical form below about 1 eV and above about 10 keV, while the 2200 m/s cross sections and resonance integrals are given in numerical form. (G.B.)

  18. Recent Efforts in Data Compilations for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Dillmann, Iris

    2008-01-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on 'Nuclear Physics Data Compilation for Nucleosynthesis Modeling' held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The 'JINA Reaclib Database' on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A 'high priority list' for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. 'Workflow tools' aim to make the evaluation process transparent and allow users to follow the progress

  19. Recent Efforts in Data Compilations for Nuclear Astrophysics

    Science.gov (United States)

    Dillmann, Iris

    2008-05-01

    Some recent efforts in compiling data for astrophysical purposes are introduced, which were discussed during a JINA-CARINA Collaboration meeting on "Nuclear Physics Data Compilation for Nucleosynthesis Modeling" held at the ECT* in Trento/Italy from May 29th-June 3rd, 2007. The main goal of this collaboration is to develop an updated and unified nuclear reaction database for modeling a wide variety of stellar nucleosynthesis scenarios. Presently a large number of different reaction libraries (REACLIB) are used by the astrophysics community. The "JINA Reaclib Database" on http://www.nscl.msu.edu/~nero/db/ aims to merge and fit the latest experimental stellar cross sections and reaction rate data of various compilations, e.g. NACRE and its extension for Big Bang nucleosynthesis, Caughlan and Fowler, Iliadis et al., and KADoNiS. The KADoNiS (Karlsruhe Astrophysical Database of Nucleosynthesis in Stars, http://nuclear-astrophysics.fzk.de/kadonis) project is an online database for neutron capture cross sections relevant to the s process. The present version v0.2 is already included in a REACLIB file from Basel university (http://download.nucastro.org/astro/reaclib). The present status of experimental stellar (n,γ) cross sections in KADoNiS is shown. It contains recommended cross sections for 355 isotopes between 1H and 210Bi, over 80% of them deduced from experimental data. A "high priority list" for measurements and evaluations for light charged-particle reactions set up by the JINA-CARINA collaboration is presented. The central web access point to submit and evaluate new data is provided by the Oak Ridge group via the http://www.nucastrodata.org homepage. "Workflow tools" aim to make the evaluation process transparent and allow users to follow the progress.

  20. Indexed compilation of experimental high energy physics literature

    International Nuclear Information System (INIS)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is presented, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given

  1. A Forth interpreter and compiler's study for computer aided design

    International Nuclear Information System (INIS)

    Djebbar, F. Zohra Widad

    1986-01-01

    The wide field of application of FORTH led us to develop an interpreter. It has been implemented on an MC 68000 microprocessor-based computer running ASTERIX, a UNIX-like operating system (a real-time system written by the C.E.A.). This work has been done in two different versions: - The first one, fully written in C, ensures good portability to a wide variety of microprocessors. However, performance estimates showed excessive execution times and led to a new, optimized version. - This new version is characterized by the compilation of the most frequently used words of the FORTH base. This gives an interpreter with good performance and an execution speed close to that obtained with the C compiler. (author) [fr
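
    For readers unfamiliar with the language, the toy sketch below shows the two ingredients the report revolves around, a dictionary of executable words and the compilation of new colon definitions out of existing words; it is an illustration only and is unrelated to the ASTERIX/MC 68000 implementation described above.

        # Toy Forth-style interpreter (illustration only): a data stack, a
        # dictionary of words, and ':' definitions "compiled" from existing words.

        def run(source, stack=None, words=None):
            stack = [] if stack is None else stack
            words = words if words is not None else {
                "+":   lambda s: s.append(s.pop() + s.pop()),
                "*":   lambda s: s.append(s.pop() * s.pop()),
                "dup": lambda s: s.append(s[-1]),
                ".":   lambda s: print(s.pop()),
            }
            tokens = source.split()
            i = 0
            while i < len(tokens):
                tok = tokens[i]
                if tok == ":":                        # compile a new word
                    name = tokens[i + 1]
                    end = tokens.index(";", i)
                    body = tokens[i + 2:end]
                    words[name] = lambda s, body=body: run(" ".join(body), s, words)
                    i = end + 1
                elif tok in words:                    # execute a known word
                    words[tok](stack)
                    i += 1
                else:                                 # numbers are pushed
                    stack.append(int(tok))
                    i += 1
            return stack

        run(": square dup * ;  3 square 4 + .")       # prints 13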

  2. Regulatory and technical reports (abstract index journal). Annual compilation for 1984. Volume 9, No. 4

    International Nuclear Information System (INIS)

    1985-01-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  3. Compilation of the nuclear codes available in CTA

    International Nuclear Information System (INIS)

    D'Oliveira, A.B.; Moura Neto, C. de; Amorim, E.S. do; Ferreira, W.J.

    1979-07-01

    The present work is a compilation of some nuclear codes available in the Divisao de Estudos Avancados of the Instituto de Atividades Espaciais (EAV/IAE/CTA). The codes are organized according to the classification given by the Argonne National Laboratory. For each code the following are given: author, institution of origin, abstract, programming language and existing bibliography. (Author) [pt

  4. Regulatory and technical reports: (Abstract index journal). Compilation for first quarter 1997, January--March

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-06-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation is published quarterly and cumulated annually. Reports consist of staff-originated reports, NRC-sponsored conference reports, NRC contractor-prepared reports, and international agreement reports

  5. A Journey from Interpreters to Compilers and Virtual Machines

    DEFF Research Database (Denmark)

    Danvy, Olivier

    2003-01-01

    We review a simple sequence of steps to stage a programming-language interpreter into a compiler and virtual machine. We illustrate the applicability of this derivation with a number of existing virtual machines, mostly for functional languages. We then outline its relevance for today's language
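
    A highly condensed flavour of such a derivation, for a toy expression language and in Python rather than a functional language (so only an analogy to the paper's staging steps), is sketched below: the direct interpreter, the compiler to a linear instruction list, and the stack-based virtual machine agree on every input.

        # Condensed illustration of interpreter -> compiler + virtual machine
        # staging for a toy expression language; not the paper's derivation,
        # just the flavour of it.

        # 1. Direct interpreter over nested tuples like ("add", 1, ("mul", 2, 3))
        def interp(e):
            if isinstance(e, int):
                return e
            op, a, b = e
            return interp(a) + interp(b) if op == "add" else interp(a) * interp(b)

        # 2. Compiler to a linear instruction sequence...
        def compile_expr(e, code=None):
            code = [] if code is None else code
            if isinstance(e, int):
                code.append(("push", e))
            else:
                op, a, b = e
                compile_expr(a, code)
                compile_expr(b, code)
                code.append((op,))
            return code

        # 3. ...and a stack-based virtual machine that runs it
        def vm(code):
            stack = []
            for instr in code:
                if instr[0] == "push":
                    stack.append(instr[1])
                else:
                    b, a = stack.pop(), stack.pop()
                    stack.append(a + b if instr[0] == "add" else a * b)
            return stack.pop()

        e = ("add", 1, ("mul", 2, 3))
        assert interp(e) == vm(compile_expr(e)) == 7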

  6. Updated site compilation of the Latin American Pollen Database

    NARCIS (Netherlands)

    Flantua, S.G.A.; Hooghiemstra, H.; Grimm, E.C.; Behling, H.; Bush, M.B; González-Arrango, C.; Gosling, W.D.; Ledru, M.-P.; Lozano-Garciá, S.; Maldonado, A.; Prieto, A.R.; Rull, V.; van Boxel, J.H.

    2015-01-01

    The updated inventory of the Latin American Pollen Database (LAPD) offers a wide range of new insights. This paper presents a systematic compilation of palynological research in Latin America. A comprehensive inventory of publications in peer-reviewed and grey literature shows a major expansion of

  7. Compiling an OPEC Word List: A Corpus-Informed Lexical Analysis

    Directory of Open Access Journals (Sweden)

    Ebtisam Saleh Aluthman

    2017-01-01

    The present study is conducted within the borders of lexicographic research, where corpora have increasingly become all-pervasive. The overall goal of this study is to compile an open-source OPEC Word List (OWL) that is available for lexicographic research and vocabulary learning related to English language learning for the purposes of oil marketing and the oil industries. To achieve this goal, an OPEC Monthly Reports Corpus (OMRC) comprising 1,004,542 words was compiled. The OMRC consists of 40 OPEC monthly reports released between 2003 and 2015. Consideration was given to both range and frequency criteria when compiling the OWL, which consists of 255 word types. Along with this basic goal, this study aims to investigate the coverage of the most well-recognised word lists, the General Service List of English Words (GSL) (West, 1953) and the Academic Word List (AWL) (Coxhead, 2000), in the OMRC corpus. The 255 word types included in the OWL do not overlap with either the AWL or the GSL. Results suggest the necessity of compiling this discipline-specific word list for ESL students of the oil marketing industries. The availability of the OWL makes significant pedagogical contributions to curriculum design, learning activities and the overall process of vocabulary learning in the context of teaching English for specific purposes (ESP). OPEC stands for Organisation of Petroleum Exporting Countries.
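
    The range-and-frequency selection described above can be sketched in a few lines; the thresholds, tokenization and exclusion list below are placeholders, not the study's actual criteria.

        # Rough sketch of range-and-frequency word-list extraction; thresholds
        # and the exclusion set (standing in for GSL/AWL coverage) are made up.

        import re
        from collections import Counter

        def build_word_list(reports, exclude, min_range=0.5, min_freq=50):
            """reports: list of document strings; exclude: words already covered
            elsewhere.  A word qualifies if it occurs in at least `min_range` of
            the reports and at least `min_freq` times overall."""
            doc_tokens = [re.findall(r"[a-z]+", doc.lower()) for doc in reports]
            freq = Counter(tok for toks in doc_tokens for tok in toks)
            ranges = Counter(tok for toks in doc_tokens for tok in set(toks))
            n_docs = len(doc_tokens)
            return sorted(
                w for w, f in freq.items()
                if f >= min_freq and ranges[w] / n_docs >= min_range and w not in exclude
            )

        reports = ["crude output rose", "crude demand fell", "opec crude supply"]
        print(build_word_list(reports, exclude={"the"}, min_range=0.5, min_freq=2))
        # ['crude']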

  8. Compilation of accident statistics in PSE

    International Nuclear Information System (INIS)

    Jobst, C.

    1983-04-01

    The objective of the investigations on transportation carried out within the framework of the 'Project - Studies on Safety in Waste Management (PSE II)' is the determination of the risk of accidents in the transportation of radioactive materials by rail. The fault tree analysis is used for the determination of risks in the transportation system. This method offers a possibility for the determination of frequency and consequences of accidents which could lead to an unintended release of radionuclides. The study presented compiles all data obtained from the accident statistics of the Federal German Railways. (orig./RB) [de

  9. abc: An extensible AspectJ compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie

    2005-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its frontend is built, using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The backend is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general overview...

  10. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    account for the multilingual concept literacy glossaries being compiled under the auspices of .... a theory, i.e. the set of premises, arguments and conclusions required for explaining ... fully address cognitive and communicative needs, especially of laypersons. ..... tion at UCT, and in indigenous languages as auxiliary media.

  11. Individual risk. A compilation of recent British data

    International Nuclear Information System (INIS)

    Grist, D.R.

    1978-08-01

    A compilation of data is presented on individual risk obtained from recent British population and mortality statistics. Risk data presented include: risk of death, as a function of age, due to several important natural causes and due to accidents and violence; risk of death as a function of location of accident; and risk of death from various accidental causes. (author)

  12. Shear-wave velocity compilation for Northridge strong-motion recording sites

    Science.gov (United States)

    Borcherdt, Roger D.; Fumal, Thomas E.

    2002-01-01

    Borehole and other geotechnical information collected at the strong-motion recording sites of the Northridge earthquake of January 17, 1994 provide an important new basis for the characterization of local site conditions. These geotechnical data, when combined with analysis of strong-motion recordings, provide an empirical basis to evaluate site coefficients used in current versions of US building codes. Shear-wave-velocity estimates to a depth of 30 meters are derived for 176 strong-motion recording sites. The estimates are based on borehole shear-velocity logs, physical property logs, correlations with physical properties and digital geologic maps. Surface-wave velocity measurements and standard penetration data are compiled as additional constraints. These data as compiled from a variety of databases are presented via GIS maps and corresponding tables to facilitate use by other investigators.

  13. Methodology and procedures for compilation of historical earthquake data

    International Nuclear Information System (INIS)

    1987-10-01

    This report was prepared subsequent to the recommendations of the project initiation meeting in Vienna, November 25-29, 1985, under the IAEA Interregional project INT/9/066 Seismic Data for Nuclear Power Plant Siting. The aim of the project is to co-ordinate national efforts of Member States in the Mediterranean region in the compilation and processing of historical earthquake data in the siting of nuclear facilities. The main objective of the document is to assist the participating Member States, especially those who are initiating an NPP siting programme, in their effort to compile and process historical earthquake data and to provide a uniform interregional framework for this task. Although the document is directed mainly to the Mediterranean countries using illustrative examples from this region, the basic procedures and methods herein described may be applicable to other parts of the world such as Southeast Asia, Himalayan belt, Latin America, etc. 101 refs, 7 figs

  14. Digital compilation bedrock geologic map of the Milton quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-8A Dorsey, R, Doolan, B, Agnew, PC, Carter, CM, Rosencrantz, EJ, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the Milton...

  15. The RHNumtS compilation: Features and bioinformatics approaches to locate and quantify Human NumtS

    Directory of Open Access Journals (Sweden)

    Saccone Cecilia

    2008-06-01

    Full Text Available Abstract Background To a greater or lesser extent, eukaryotic nuclear genomes contain fragments of their mitochondrial genome counterpart, deriving from the random insertion of damaged mtDNA fragments. NumtS (Nuclear mt Sequences) are not equally abundant in all species, and are redundant and polymorphic in terms of copy number. In population and clinical genetics, it is important to have a complete overview of NumtS quantity and location. Searching PubMed for NumtS or Mitochondrial pseudo-genes yields hundreds of papers reporting Human NumtS compilations produced by in silico or wet-lab approaches. A comparison of published compilations clearly shows significant discrepancies among data, due both to unwise application of Bioinformatics methods and to a not yet correctly assembled nuclear genome. To optimize quantification and location of NumtS, we produced a consensus compilation of Human NumtS by applying various bioinformatics approaches. Results Location and quantification of NumtS may be achieved by applying database similarity searching methods: we have applied various methods such as Blastn, MegaBlast and BLAT, changing both parameters and database; the results were compared, further analysed and checked against the already published compilations, thus producing the Reference Human Numt Sequences (RHNumtS) compilation. The resulting NumtS total 190. Conclusion The RHNumtS compilation represents a highly reliable reference basis, which may allow designing a lab protocol to test the actual existence of each NumtS. Here we report preliminary results based on PCR amplification and sequencing on 41 NumtS selected from RHNumtS among those with lower score. In parallel, we are currently designing the RHNumtS database structure for implementation in the HmtDB resource. In the future, the same database will host NumtS compilations from other organisms, but these will be generated only when the nuclear genome of a specific organism has reached a high

  16. Digital compilation bedrock geologic map of the Lincoln quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-5A Stanley, R, DelloRusso, V, Haydock, S, Lapp, E, O'Loughlin, S, Prewitt, J,and Tauvers, PR, 1995, Digital compilation bedrock geologic map...

  17. Data compilation of angular distributions of sputtered atoms

    International Nuclear Information System (INIS)

    Yamamura, Yasunori; Takiguchi, Takashi; Tawara, Hiro.

    1990-01-01

    Sputtering on a surface is generally caused by the collision cascade developed near the surface. The process is in principle the same as that causing radiation damage in the bulk of solids. Sputtering has long been regarded as an undesirable dirty effect which destroys the cathodes and grids in gas discharge tubes or ion sources and contaminates plasma and the surrounding walls. However, sputtering is used today for many applications such as sputter ion sources, mass spectrometers and the deposition of thin films. Plasma contamination and the surface erosion of first walls due to sputtering are still the major problems in fusion research. The angular distribution of the particles sputtered from solid surfaces can possibly provide the detailed information on the collision cascade in the interior of targets. This report presents a compilation of the angular distribution of sputtered atoms at normal incidence and oblique incidence in the various combinations of incident ions and target atoms. The angular distribution of sputtered atoms from monatomic solids at normal incidence and oblique incidence, and the compilation of the data on the angular distribution of sputtered atoms are reported. (K.I.)

  18. Ada Integrated Environment III Computer Program Development Specification. Volume III. Ada Optimizing Compiler.

    Science.gov (United States)

    1981-12-01

    [Abstract not recoverable: the record text consists of OCR fragments from the report's appendices, referring to compiler-generated Symbol Map (SYMAP), Statement Map (SMAP), and Type Map (TMAP) files, the PUNIT command, and an example compiler command stream for the code generator of the Texas Instruments Ada Optimizing Compiler.]

  19. A new compiler for the GANIL Data Acquisition description

    International Nuclear Information System (INIS)

    Saillant, F.; Raine, B.

    1997-01-01

    An important feature of the GANIL Data Acquisition System is the description of the experiments by means of a language developed at GANIL. The philosophy is to attribute to each element (parameters, spectra, etc.) an operational name that is used at every level of the system. This language references a library of modules to free the user from the technical details of the hardware. The compiler has recently been entirely re-developed using technologies such as the object-oriented language C++ and an object-oriented software development method and tool. This makes it possible to provide new functionality or to support a new electronic module within a very short time and without any deep modification of the application. A new Dynamic Library of Modules has also been developed; its complete description is available on the GANIL web site http://ganinfo.in2p3.fr/acquisition/homepage.html. The new compiler brings many new functionalities, the most important of which is the notion of 'register', whatever the module standard. All the registers described in the module provider's documentation can now be accessed by name. Another important new feature is the notion of 'function' that can be executed on a module. A set of new instructions has also been implemented to execute commands on CAMAC crates. The new compiler also makes it possible to describe specific interfaces with the GANIL Data Acquisition System; this has been used to describe the coupling of the CHIMERA Data Acquisition System with the INDRA one through a shared memory in the VME crate. (authors)

  20. A quantum CISC compiler and scalable assembler for quantum computing on large systems

    Energy Technology Data Exchange (ETDEWEB)

    Schulte-Herbrueggen, Thomas; Spoerl, Andreas; Glaser, Steffen [Dept. Chemistry, Technical University of Munich (TUM), 85747 Garching (Germany)

    2008-07-01

    Using the cutting-edge high-speed parallel cluster HLRB-II (with a total LINPACK performance of 63.3 TFlops/s), we present a quantum CISC compiler into time-optimised or decoherence-protected complex instruction sets. They comprise effective multi-qubit interactions with up to 10 qubits. We show how to assemble these medium-sized CISC modules in a scalable way for quantum computation on large systems. Extending the toolbox of universal gates by optimised complex multi-qubit instruction sets paves the way to fighting decoherence in realistic Markovian and non-Markovian settings. The advantage of quantum CISC compilation over standard RISC compilation into one- and two-qubit universal gates is demonstrated inter alia for the quantum Fourier transform (QFT) and for multiply-controlled NOT gates. The speed-up is up to a factor of six, thus giving significantly better performance under decoherence. Implications for upper limits to time complexities are also derived.

  1. Regulatory and technical reports (abstract index journal): Annual compilation for 1994. Volume 19, Number 4

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order. These precede the following indexes: secondary report number index, personal author index, subject index, NRC originating organization index (staff reports), NRC originating organization index (international agreements), NRC contract sponsor index (contractor reports), contractor index, international organization index, and licensed facility index. A detailed explanation of the entries precedes each index.

  2. Digital compilation bedrock geologic map of the Warren quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-4A Walsh, GJ, Haydock, S, Prewitt, J, Kraus, J, Lapp, E, O'Loughlin, S, and Stanley, RS, 1995, Digital compilation bedrock geologic map of the...

  3. Fiscal 1998 research report on super compiler technology; 1998 nendo super konpaira technology no chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    For next-generation supercomputing systems, research was conducted on parallel and distributed compiler technology for enhancing effective performance, and on related software and architectures that enhance performance in coordination with compilers. For parallel compiler technology, scalable automated parallelizing compiler technology, parallel tuning tools, and an operating system that uses multi-processor resources effectively are identified as important concrete technical development issues. In addition, by extending these research results to the architecture technology of single-chip multi-processors, the possibility of developing and expanding the PC, WS and HPC (high-performance computer) markets and of creating new industries is pointed out. Although wide-area distributed computing is attracting attention as a next-generation computing industry, the concrete industrial fields that will use such computing are not yet clear, and research remains at an exploratory stage. (NEDO)

  4. The Compilation of Multilingual Concept Literacy Glossaries at the ...

    African Journals Online (AJOL)

    In order to support concept literacy, especially for students for whom English is not the native language, a number of universities in South Africa are compiling multilingual glossaries through which the use of languages other than English may be employed as auxiliary media. Terminologies in languages other than English ...

  5. Thoughts and views on the compilation of monolingual dictionaries ...

    African Journals Online (AJOL)

    The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily ...

  6. 12 CFR 503.2 - Exemptions of records containing investigatory material compiled for law enforcement purposes.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Exemptions of records containing investigatory material compiled for law enforcement purposes. 503.2 Section 503.2 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY PRIVACY ACT § 503.2 Exemptions of records containing investigatory material compiled for law enforcement...

  7. Compilation of historical information of 300 Area facilities and activities

    International Nuclear Information System (INIS)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided

  8. Compilation of historical information of 300 Area facilities and activities

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, M.S.

    1992-12-01

    This document is a compilation of historical information of the 300 Area activities and facilities since the beginning. The 300 Area is shown as it looked in 1945, and also a more recent (1985) look at the 300 Area is provided.

  9. Data compilation for radiation effects on ceramic insulators

    International Nuclear Information System (INIS)

    Fukuya, Koji; Terasawa, Mititaka; Nakahigashi, Shigeo; Ozawa, Kunio.

    1986-08-01

    Data on radiation effects on ceramic insulators were compiled from the literature and summarized from the viewpoint of fast-neutron irradiation effects. The data were classified according to property and ceramic. The properties are dimensional stability, mechanical properties, thermal properties, and electrical and dielectric properties. Data sheets were prepared for each table or graph in the literature. The characteristic feature of the data base is briefly described. (author)

  10. Just-In-Time compilation of OCaml byte-code

    OpenAIRE

    Meurer, Benedikt

    2010-01-01

    This paper presents various improvements that were applied to OCamlJIT2, a Just-In-Time compiler for the OCaml byte-code virtual machine. OCamlJIT2 currently runs on various Unix-like systems with x86 or x86-64 processors. The improvements, including the new x86 port, are described in detail, and performance measures are given, including a direct comparison of OCamlJIT2 to OCamlJIT.

  11. GRESS, FORTRAN Pre-compiler with Differentiation Enhancement

    International Nuclear Information System (INIS)

    1999-01-01

    1 - Description of program or function: The GRESS FORTRAN pre-compiler (SYMG) and run-time library are used to enhance conventional FORTRAN-77 programs with analytic differentiation of arithmetic statements for automatic differentiation in either forward or reverse mode. GRESS 3.0 is functionally equivalent to GRESS 2.1. GRESS 2.1 is an improved and updated version of the previously released GRESS 1.1. Improvements in the implementation of the CHAIN option have resulted in a 70 to 85% reduction in execution time and up to a 50% reduction in memory required for forward chaining applications. 2 - Method of solution: GRESS uses a pre-compiler to analyze FORTRAN statements and determine the mathematical operations embodied in them. As each arithmetic assignment statement in a program is analyzed, SYMG generates the partial derivatives of the term on the left with respect to each floating-point variable on the right. The result of the pre-compilation step is a new FORTRAN program that can produce derivatives for any REAL (i.e., single or double precision) variable calculated by the model. Consequently, GRESS enhances FORTRAN programs or subprograms by adding the calculation of derivatives along with the original output. Derivatives from a GRESS enhanced model can be used internally (e.g., iteration acceleration) or externally (e.g., sensitivity studies). By calling GRESS run-time routines, derivatives can be propagated through the code via the chain rule (referred to as the CHAIN option) or accumulated to create an adjoint matrix (referred to as the ADGEN option). A third option, GENSUB, makes it possible to process a subset of a program (i.e., a do loop, subroutine, function, a sequence of subroutines, or a whole program) for calculating derivatives of dependent variables with respect to independent variables. A code enhanced with the GENSUB option can use forward mode, reverse mode, or a hybrid of the two modes. 3 - Restrictions on the complexity of the problem: GRESS
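
    As an illustration of the statement-level, chain-rule differentiation the abstract describes, the following minimal Python sketch propagates partial derivatives through each assignment in forward mode. It is not GRESS itself (GRESS rewrites FORTRAN-77 source at pre-compilation time); the Dual class and the independent() helper are hypothetical names used only for this example.

```python
# Minimal forward-mode automatic-differentiation sketch (illustrative only; not the GRESS API).
from dataclasses import dataclass, field

@dataclass
class Dual:
    val: float                                  # ordinary value of the variable
    der: dict = field(default_factory=dict)     # partial derivatives w.r.t. named independent variables

    def _combine(self, other, d_self, d_other, val):
        # chain rule: d(result)/dx = d_self * d(self)/dx + d_other * d(other)/dx
        keys = set(self.der) | set(other.der)
        return Dual(val, {k: d_self * self.der.get(k, 0.0) + d_other * other.der.get(k, 0.0) for k in keys})

    def __add__(self, other):
        return self._combine(other, 1.0, 1.0, self.val + other.val)

    def __mul__(self, other):
        return self._combine(other, other.val, self.val, self.val * other.val)

def independent(name, value):
    """Declare an independent (input) variable whose derivatives are tracked."""
    return Dual(value, {name: 1.0})

# y = a*b + a  ->  dy/da = b + 1,  dy/db = a
a = independent("a", 2.0)
b = independent("b", 3.0)
y = a * b + a
print(y.val, y.der)   # 8.0 {'a': 4.0, 'b': 2.0}
```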

  12. Indexed compilation of experimental high energy physics literature. [Synopsis

    Energy Technology Data Exchange (ETDEWEB)

    Horne, C.P.; Yost, G.P.; Rittenberg, A.

    1978-09-01

    An indexed compilation of approximately 12,000 experimental high energy physics documents is presented. A synopsis of each document is given, and the documents are indexed according to beam/target/momentum, reaction/momentum, final-state-particle, particle/particle-property, accelerator/detector, and (for a limited set of the documents) experiment. No data are given.

  13. Final report: Compiled MPI. Cost-Effective Exascale Application Development

    Energy Technology Data Exchange (ETDEWEB)

    Gropp, William Douglas [Univ. of Illinois, Urbana-Champaign, IL (United States)

    2015-12-21

    This is the final report on Compiled MPI: Cost-Effective Exascale Application Development, and summarizes the results of this project. The project investigated runtime environments that improve the performance of MPI (Message-Passing Interface) programs; work at Illinois in the last period of the project looked at optimizing data accesses expressed with MPI datatypes.

  14. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross-section library for the MCNP code is compiled, covering 39 nuclides from H-1 to Am-241 that are important for shielding calculations. The compilation is performed with a code system consisting of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are verified by analysing benchmark experiments. (author)

  15. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1991-01-01

    Reliability data are essential in probabilistic safety assessment, with component reliability parameters being particularly important. Plant-specific component failure data would be most appropriate, but such data are rather limited. However, similar components are used in different designs, so generic data, that is, data not specific to the plant being analyzed but relating to components more generally, are important. The International Atomic Energy Agency has compiled the Generic Component Reliability Data Base from data available in the open literature. It is part of the IAEA computer code package for fault/event tree analysis. The Data Base contains 1010 different records covering most of the components used in probabilistic safety analyses of nuclear power plants. The data base input was quality controlled and data sources were noted. The data compilation procedure and the problems associated with using generic data are explained. (UK)

  16. A program-compiling method of nuclear data on-line fast analysis

    International Nuclear Information System (INIS)

    Li Shangbai

    1990-01-01

    This paper discusses how to perform assembly floating-point operations by using subroutines of the Applesoft system, and introduces a program-compiling method for fast on-line analysis of nuclear data on an Apple microcomputer

  17. Effects of a low level laser on periodontal tissue in hypofunctional teeth.

    Directory of Open Access Journals (Sweden)

    Hidetaka Hayashi

    Full Text Available Malocclusions, such as an open bite and high canines, are often encountered in orthodontic practice. Teeth without occlusal stimuli are known as hypofunctional teeth, and numerous atrophic changes have been reported in the periodontal tissue, including reductions in blood vessels in the periodontal ligament (PDL), heavy root resorption, and reduced bone mineral density (BMD) in the alveolar bone. Low Level Laser (LLL) has been shown to have a positive effect on bone formation and the vasculature. Although the recovery of hypofunctional teeth remains unclear, LLL is expected to have a positive influence on periodontal tissue in occlusal hypofunction. The aim of the present study was to elucidate the relationship between LLL and periodontal tissue in occlusal hypofunction. Twenty-four male rats aged 5 weeks were randomly divided into control and hypofunctional groups. An anterior metal cap and bite plate were attached to the maxillary and mandibular incisors in the hypofunctional group to simulate occlusal hypofunction in the molars. LLL irradiation was applied to the maxillary first molar through the gingival sulcus in half of the rats. Rats were divided into four groups: control, control+LLL, hypofunctional, and hypofunctional+LLL. Exposure to LLL irradiation was performed for 3 minutes every other day for 2 weeks. Animals were examined by Micro-CT at 5 and 7 weeks and were subsequently sacrificed. Heads were resected and examined histologically and immunohistologically. The hypofunctional group had obvious stricture of the PDL. However, no significant differences were observed in the PDL and alveolar bone between the hypofunctional+LLL and the control groups. In addition, the expression of basic fibroblast growth factor (bFGF)- and vascular endothelial growth factor (VEGF)-positive cells were higher in the hypofunctional + LLL group than in the hypofunctional group. These results indicated that LLL enhanced the production of bFGF and VEGF in the

  18. The compiled catalogue of galaxies in machine-readable form and its statistical investigation

    International Nuclear Information System (INIS)

    Kogoshvili, N.G.

    1982-01-01

    The compilation of a machine-readable catalogue of relatively bright galaxies was undertaken in Abastumani Astrophysical Observatory in order to facilitate the statistical analysis of a large observational material on galaxies from the Palomar Sky Survey. In compiling the catalogue of galaxies the following problems were considered: the collection of existing information for each galaxy; a critical approach to data aimed at the selection of the most important features of the galaxies; the recording of data in computer-readable form; and the permanent updating of the catalogue. (Auth.)

  19. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    Weston, L.W.; Larson, D.C.

    1993-02-01

    This compilation represents the current needs for nuclear data measurements and evaluations as expressed by interested fission and fusion reactor designers, medical users of nuclear data, nuclear data evaluators, CSEWG members and other interested parties. The requests and justifications are reviewed by the Data Request and Status Subcommittee of CSEWG as well as most of the general CSEWG membership. The basic format and computer programs for the Request List were produced by the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory. The NNDC produced the Request List for many years. The Request List is compiled from a computerized data file. Each request has a unique isotope, reaction type, requestor and identifying number. The first two digits of the identifying number are the year in which the request was initiated. Every effort has been made to restrict the notations to those used in common nuclear physics textbooks. Most requests are for individual isotopes as are most ENDF evaluations, however, there are some requests for elemental measurements. Each request gives a priority rating which will be discussed in Section 2, the neutron energy range for which the request is made, the accuracy requested in terms of one standard deviation, and the requested energy resolution in terms of one standard deviation. Also given is the requestor with the comments which were furnished with the request. The addresses and telephone numbers of the requestors are given in Appendix 1. ENDF evaluators who may be contacted concerning evaluations are given in Appendix 2. Experimentalists contemplating making one of the requested measurements are encouraged to contact both the requestor and evaluator who may provide valuable information. This is a working document in that it will change with time. New requests or comments may be submitted to the editors or a regular CSEWG member at any time

  20. Compilation of requests for nuclear data

    International Nuclear Information System (INIS)

    1983-01-01

    The purpose of this compilation is to summarize the current needs of US Nuclear Energy programs and other applied technologies for nuclear data. It is the result of a biennial review in which the Department of Energy (DOE) and contractors, Department of Defense Laboratories and contractors, and other interested groups have been asked to review and revise their requests for nuclear data. It was felt that the evaluators of cross section data and the users of these evaluations should be involved in the review of the data requests to make this compilation more useful. This request list is ordered by target nucleus (Isotope) and then reaction type (Quantity). Each request is assigned a unique identifying number. The first two digits of this number give the year the request was initiated. All requests for a given Isotope and Quantity are grouped (or blocked) together. The requests in a block are followed by any status comments. Each request has a unique Isotope, Quantity and Requester. The requester is identified by laboratory, last name, and sponsoring US government agency, e.g., BET, DEI, DNR. All requesters, together with their addresses and phone numbers, are given in appendix B. A list of the evaluators responsible for ENDF/B-V evaluations with their affiliation appears in appendix C. All requests must give the energy (or range of energy) for the incident particle when appropriate. The accuracy needed in percent is also given. The error quoted is assumed to be 1-sigma at each measured point in the energy range requested unless a comment specifies otherwise. Sometimes a range of accuracy indicated by two values is given or some statement is given in the free text comments. An incident particle energy resolution in percent is sometimes given

  1. Compilation of a global inventory of emissions of nitrous oxide

    NARCIS (Netherlands)

    Bouwman, A.F.

    1995-01-01

    A global inventory with 1°x1° resolution was compiled of emissions of nitrous oxide (N2O) to the atmosphere, including emissions from soils under natural vegetation, fertilized agricultural land, grasslands and animal excreta, biomass burning, forest clearing,

  2. Market research for Idaho Transportation Department linear referencing system.

    Science.gov (United States)

    2009-09-02

    For over 30 years, the Idaho Transportation Department (ITD) has had an LRS called MACS (MilePoint And Coded Segment), which is being implemented on a mainframe using a COBOL/CICS platform. As ITD began embracing newer technologies and moving tow...

  3. Architectural and compiler techniques for energy reduction in high-performance microprocessors

    Science.gov (United States)

    Bellas, Nikolaos

    1999-11-01

    The microprocessor industry has started viewing power, along with area and performance, as a decisive design factor in today's microprocessors. The increasing cost of packaging and cooling systems poses stringent requirements on the maximum allowable power dissipation. Most of the research in recent years has focused on the circuit, gate, and register-transfer (RT) levels of the design. In this research, we focus on the software running on a microprocessor and we view the program as a power consumer. Our work concentrates on the role of the compiler in the construction of "power-efficient" code, and especially its interaction with the hardware so that unnecessary processor activity is saved. We propose techniques that use extra hardware features and compiler-driven code transformations that specifically target activity reduction in certain parts of the CPU which are known to be large power and energy consumers. Design for low power/energy at this level of abstraction entails larger energy gains than in the lower stages of the design hierarchy in which the design team has already made the most important design commitments. The role of the compiler in generating code which exploits the processor organization is also fundamental in energy minimization. Hence, we propose a hardware/software co-design paradigm, and we show what code transformations are necessary by the compiler so that "wasted" power in a modern microprocessor can be trimmed. More specifically, we propose a technique that uses an additional mini cache located between the instruction cache (I-Cache) and the CPU core; the mini cache buffers instructions that are nested within loops and are continuously fetched from the I-Cache. This mechanism can create very substantial energy savings, since the I-Cache unit is one of the main power consumers in most of today's high-performance microprocessors. Results are reported for the SPEC95 benchmarks in the R-4400 processor which implements the MIPS2 instruction
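
    The mini-cache idea in the preceding record can be illustrated with a rough back-of-the-envelope model (this is not the authors' design or data; the function name and numbers below are hypothetical): if a hot loop body fits in the small buffer, repeated iterations are served from the buffer rather than from the larger instruction cache.

```python
# Rough illustrative model of a small loop buffer placed in front of the I-Cache (hypothetical numbers).
def icache_fetches(loop_body_insts: int, iterations: int, buffer_entries: int):
    """Return (fetches_without_buffer, fetches_with_buffer) for one hot loop."""
    without = loop_body_insts * iterations
    if loop_body_insts <= buffer_entries:
        # First iteration fills the buffer; subsequent iterations hit the buffer instead of the I-Cache.
        with_buffer = loop_body_insts
    else:
        # Loop does not fit in the buffer: every instruction fetch still goes to the I-Cache.
        with_buffer = without
    return without, with_buffer

base, buffered = icache_fetches(loop_body_insts=24, iterations=10_000, buffer_entries=32)
print(f"I-Cache fetches: {base} -> {buffered} ({100 * (1 - buffered / base):.1f}% avoided)")
```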

  4. An Optimizing Compiler for Petascale I/O on Leadership Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Choudhary, Alok [Northwestern Univ., Evanston, IL (United States); Kandemir, Mahmut [Pennsylvania State Univ., State College, PA (United States)

    2015-03-18

    In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology that targets I/O intensive HPC applications running on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.

  5. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  6. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  7. User's manual for the computer-aided plant transient data compilation

    International Nuclear Information System (INIS)

    Langenbuch, S.; Gill, R.; Lerchl, G.; Schwaiger, R.; Voggenberger, T.

    1984-01-01

    The objective of this project is the compilation of data for nuclear power plants needed for transient analyses. The concept has already been described. This user's manual gives a detailed description of all functions of the dialogue system that supports data acquisition and retrieval. (orig.) [de

  8. The Effect of One Session Low Level Laser Therapy of Extracted Follicular Units on the Outcome of Hair Transplantation

    OpenAIRE

    Tabaie, Seyed Mehdi; Berenji Ardestani, Hoda; Azizjalali, Mir Hadi

    2016-01-01

    Introduction: Photobiostimulation with low level laser (LLL) has been used in medicine for a long time and its effects have been shown in many diseases. Some studies have evaluated the effect of LLL on androgenic alopecia. One of the most important limitations of the use of LLL in the treatment of alopecia is the requirement for multiple sessions, which is hardly accepted by patients. This study was conducted to evaluate the effect of the irradiation of extracted follicular hair units by LLL ...

  9. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute-intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute-intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system
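
    A DOALL loop is one whose iterations carry no cross-iteration dependences, so they can run in any order or in parallel. The sketch below shows the general idea in Python using process-based workers; it only illustrates the loop class the abstract refers to and is not the authors' JIT-integrated toolchain (which targets Java code).

```python
# Illustrative DOALL parallelization with process pools (not the authors' implementation).
from concurrent.futures import ProcessPoolExecutor
import math

def body(i: int) -> float:
    # Hotspot loop body: each iteration depends only on its own index i (the DOALL property).
    return math.sqrt(i) * math.sin(i)

def doall_parallel(n: int, workers: int = 8):
    # Split the iteration space across workers; map() preserves the order of results.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(body, range(n), chunksize=max(1, n // workers)))

if __name__ == "__main__":
    results = doall_parallel(100_000)
    print(len(results), results[1])
```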

  10. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute-intensive programs generally consume a significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute-intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in the front-end of a JIT compiler to parallelize sequential code just before native translation. However, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  11. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as the Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are respectively employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theories, and the MP2 calculations are done for benchmarking purposes. It is found that the combination of ifc with ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on one single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The floating-point marks by SpecFP2000 have similar trends to the results of GAUSSIAN 98 package.

  12. Compiling a corpus-based dictionary grammar: an example for ...

    African Journals Online (AJOL)

    In this article it is shown how a corpus-based dictionary grammar may be compiled — that is, a mini-grammar fully based on corpus data and specifically written for use in and integrated with a dictionary. Such an effort is, to the best of our knowledge, a world's first. We exemplify our approach for a Northern Sotho ...

  13. Compilation of data on γ - γ → hadrons

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1986-06-01

    Data on γγ → hadrons extracted from e+e- reactions are compiled. The review includes inclusive cross-sections, structure functions, exclusive cross-sections and resonance widths. Data up to 1st July 1986 are included. All the data in this review can be found and retrieved in the Durham-RAL HEP database, together with a wide range of other reaction data. Users throughout Europe can interactively access the database through CMS on the RAL computer. (author)

  14. Engineering Amorphous Systems, Using Global-to-Local Compilation

    Science.gov (United States)

    Nagpal, Radhika

    Emerging technologies are making it possible to assemble systems that incorporate a myriad of information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust, prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research and that allow us to compile a specification of global behavior into a robust program for local behavior.

  15. Digital compilation bedrock geologic map of the Mt. Ellen quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-6A Stanley, RS, Walsh, G, Tauvers, PR, DiPietro, JA, and DelloRusso, V, 1995, Digital compilation bedrock geologic map of the Mt. Ellen...

  16. HOPE: A Python just-in-time compiler for astrophysical computations

    Science.gov (United States)

    Akeret, J.; Gamper, L.; Amara, A.; Refregier, A.

    2015-04-01

    The Python programming language is becoming increasingly popular for scientific applications due to its simplicity, versatility, and the broad range of its libraries. A drawback of this dynamic language, however, is its low runtime performance which limits its applicability for large simulations and for the analysis of large data sets, as is common in astrophysics and cosmology. While various frameworks have been developed to address this limitation, most focus on covering the complete language set, and either force the user to alter the code or are not able to reach the full speed of an optimised native compiled language. In order to combine the ease of Python and the speed of C++, we developed HOPE, a specialised Python just-in-time (JIT) compiler designed for numerical astrophysical applications. HOPE focuses on a subset of the language and is able to translate Python code into C++ while performing numerical optimisation on mathematical expressions at runtime. To enable the JIT compilation, the user only needs to add a decorator to the function definition. We assess the performance of HOPE by performing a series of benchmarks and compare its execution speed with that of plain Python, C++ and the other existing frameworks. We find that HOPE improves the performance compared to plain Python by a factor of 2 to 120, achieves speeds comparable to that of C++, and often exceeds the speed of the existing solutions. We discuss the differences between HOPE and the other frameworks, as well as future extensions of its capabilities. The fully documented HOPE package is available at http://hope.phys.ethz.ch and is published under the GPLv3 license on PyPI and GitHub.
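
    The decorator-based workflow described above can be sketched as follows. The decorator name hope.jit follows the package documentation at the URL given in the record, but the exact API and the supported Python subset should be verified there; treat the snippet as an assumed usage pattern rather than verified code.

```python
# Assumed usage pattern for a decorator-based Python JIT such as HOPE (API assumed, see http://hope.phys.ethz.ch).
import numpy as np
import hope

@hope.jit            # adding the decorator is the only change the user makes to the function
def sum_of_squares(x, y):
    # Numerical kernel written in the supported Python subset; translated to C++ on first call.
    total = 0.0
    for i in range(x.shape[0]):
        total += x[i] * x[i] + y[i] * y[i]
    return total

x = np.random.rand(100_000)
y = np.random.rand(100_000)
print(sum_of_squares(x, y))   # first call triggers JIT compilation; later calls run the native code
```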

  17. ERES: A PC program for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingin

    1994-01-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  18. Compilation of excitation cross sections for He atoms by electron impact

    International Nuclear Information System (INIS)

    Kato, T.; Itikawa, Y.; Sakimoto, K.

    1992-03-01

    Experimental and theoretical data are compiled on the cross section for the excitation of He atoms by electron impact. The available data are compared graphically. The survey of the literature has been made through the end of 1991. (author)

  19. Digital compilation bedrock geologic map of the South Mountain quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG95-3A Stanley, R.S., DelloRusso, V., Tauvers, P.R., DiPietro, J.A., Taylor, S., and Prahl, C., 1995, Digital compilation bedrock geologic map of...

  20. ERES: A PC program for nuclear data compilation in EXFOR format

    Energy Technology Data Exchange (ETDEWEB)

    Shubing, Li [NanKai University, Tianjin (China); Qichang, Liang; Tingin, Liu [Chinese Nuclear Data Center, Institute of Atomic Energy, Beijing (China)

    1994-02-01

    This document describes the use of the personal computer software package ERES for compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request from the IAEA Nuclear Data Section. (author)

  1. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley...

  2. A Global Database of Soil Phosphorus Compiled from Studies Using Hedley Fractionation

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides concentrations of soil phosphorus (P) compiled from the peer-reviewed literature that cited the Hedley fractionation method (Hedley and...

  3. Cultural Learning Processes through Local Wisdom: A Case Study on Adult and Lifelong Learning in Thailand

    Science.gov (United States)

    Ratana-Ubol, Archanya; Henschke, John A.

    2015-01-01

    This article provides the background and concept of Thailand Lifelong Learning [LLL], even attempting a definition. The Thai LLL vision encompasses strategies for developing human qualities such as integrity, self-reliance, adaptability, resilience, and spirituality, to name a few. In some regards LLL seeks to recapture a more fully-developed…

  4. Regulatory and technical reports, compilation for 1979. Volume 4. Bibliographical report Jan-Dec 79

    International Nuclear Information System (INIS)

    Oliu, W.E.; McKenzie, L.; Aragon, R.

    1980-07-01

    The compilation lists formal regulatory and technical reports issued in 1979 by the U.S. Nuclear Regulatory Commission (NRC) staff and by NRC contractors. The compilation is divided into three major sections. The first major section consists of a sequential listing of all NRC reports in report-number order. The first portion of this sequential section lists staff reports, the second portion lists NRC-sponsored conference proceedings, and the third lists contractor reports. Each report citation in the sequential section contains full bibliographic information

  5. ANDEX. A PC software assisting the nuclear data compilation in EXFOR

    International Nuclear Information System (INIS)

    Osorio, V.

    1991-01-01

    This document describes the use of the personal computer software ANDEX, which assists the compilation of experimental nuclear reaction data in the internationally agreed EXFOR format. The software is available upon request, on a set of two diskettes, free of charge. (author)

  6. The Compilation of a Shona Children's Dictionary: Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Peniah Mabaso

    2011-10-01

    Full Text Available Abstract: This article outlines the challenges encountered by the African Languages Research Institute (ALRI team members in the compilation of the monolingual Shona Children's Dictionary. The focus is mainly on the problems met in headword selection. Solutions by the team members when dealing with these problems are also presented.

  7. Regulatory and technical reports (Abstract Index Journal). Compilation for third quarter 1985, July-September. Volume 10, No. 3

    International Nuclear Information System (INIS)

    1985-10-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. This compilation covers the period from July through September 1985

  8. The Relationship Between Academic Motivation and Lifelong Learning During Residency: A Study of Psychiatry Residents.

    Science.gov (United States)

    Sockalingam, Sanjeev; Wiljer, David; Yufe, Shira; Knox, Matthew K; Fefergrad, Mark; Silver, Ivan; Harris, Ilene; Tekian, Ara

    2016-10-01

    To examine the relationship between lifelong learning (LLL) and academic motivation for residents in a psychiatry residency program, trainee factors that influence LLL, and psychiatry residents' LLL practices. Between December 2014 and February 2015, 105 of 173 (61%) eligible psychiatry residents from the Department of Psychiatry, University of Toronto, completed a questionnaire with three study instruments: an LLL needs assessment survey, the Jefferson Scale of Physician Lifelong Learning (JeffSPLL), and the Academic Motivation Scale (AMS). The AMS included a relative autonomy motivation score (AMS-RAM) measuring the overall level of intrinsic motivation (IM). A significant correlation was observed between JeffSPLL and AMS-RAM scores (r = 0.39, P motivation identification domain (mean difference [M] = 0.38; 95% confidence interval [CI] [0.01, 0.75]; P = .045; d = 0.44) compared with senior residents. Clinician scientist stream (CSS) residents had significantly higher JeffSPLL scores compared with non-CSS residents (M = 3.15; 95% CI [0.52, 5.78]; P = .020; d = 0.57). The use of rigorous measures to study LLL and academic motivation confirmed prior research documenting the positive association between IM and LLL. The results suggest that postgraduate curricula aimed at enhancing IM, for example, through support for learning autonomously, could be beneficial to cultivating LLL in learners.

  9. Low level light in combination with metabolic modulators for effective therapy

    Science.gov (United States)

    Dong, Tingting; Zhang, Qi; Hamblin, Michael R.; Wu, Mei X.

    2015-03-01

    Vascular damage occurs frequently in the injured brain, causing hypoxia, and is associated with poor clinical outcomes. We found high levels of glycolysis, reduced ATP generation, and increased formation of reactive oxygen species (ROS) and apoptosis in neurons under hypoxia. Strikingly, these adverse events were reversed significantly by noninvasive exposure of the injured brain to low-level light (LLL). LLL illumination sustained the mitochondrial membrane potential, constrained cytochrome C leakage in hypoxic cells, and protected them from apoptosis, underscoring a unique property of LLL. The effect of LLL was further bolstered by combination with metabolic substrates such as pyruvate or lactate, both in vivo and in vitro. The combined treatment retained the memory and learning activities of injured mice at a normal level, whereas those treated with LLL or pyruvate alone, or with sham light, displayed partial or severe deficits in these cognitive functions. In accordance with the well-protected learning and memory function, the hippocampal region primarily responsible for learning and memory was completely protected by the combination of LLL and pyruvate, in marked contrast to the severe loss of hippocampal tissue due to secondary damage in control mice. These data clearly suggest that energy metabolic modulators can additively or synergistically enhance the therapeutic effect of LLL in tissues with insufficient energy production, such as the injured brain.

  10. Self-diffusion in electrolyte solutions a critical examination of data compiled from the literature

    CERN Document Server

    Mills, R

    1989-01-01

    This compilation - the first of its kind - fills a real gap in the field of electrolyte data. Virtually all self-diffusion data in electrolyte solutions as reported in the literature have been examined and the book contains over 400 tables covering diffusion in binary and ternary aqueous solutions, in mixed solvents, and of non-electrolytes in various solvents.An important feature of the compilation is that all data have been critically examined and their accuracy assessed. Other features are an introductory chapter in which the methods of measurement are reviewed; appendices containing tables

  11. SQuAVisiT : A Software Quality Assessment and Visualisation Toolset

    NARCIS (Netherlands)

    Roubtsov, Serguei; Telea, Alexandru; Holten, Danny

    2007-01-01

    Software quality assessment of large COBOL industrial legacy systems, both for maintenance or migration purposes, mounts a serious challenge. We present the Software Quality Assessment and Visualisation Toolset (SQuAVisiT), which assists users in performing the above task. First, it allows a fully

  12. SQuAVisiT: a software quality assessment and visualisation toolset

    NARCIS (Netherlands)

    Roubtsov, S.; Telea, A.C.; Holten, D.H.R.

    2007-01-01

    Software quality assessment of large COBOL industrial legacy systems, both for maintenance or migration purposes, mounts a serious challenge. We present the software quality assessment and visualisation toolset (SQuAVisiT), which assists users in performing the above task. First, it allows a fully

  13. Statistical Compilation of the ICT Sector and Policy Analysis | Page 5 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The project is designed to expand the scope of conventional investigation beyond the telecommunications industry to include other vertically integrated components of the ICT sector such as manufacturing and services. ... Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia.

  14. Compilation of the abstracts of nuclear computer codes available at CPD/IPEN

    International Nuclear Information System (INIS)

    Granzotto, A.; Gouveia, A.S. de; Lourencao, E.M.

    1981-06-01

    A compilation of all computer codes available at IPEN in S. Paulo is presented. These computer codes are classified according to the Argonne National Laboratory and Nuclear Energy Agency classification scheme. (E.G.) [pt

  15. An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Kandemir, Mahmut Taylan [PSU; Choudary, Alok [Northwestern; Thakur, Rajeev [ANL

    2014-03-01

    In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology that targets I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report compared to the previous report are IOGenie and SSD/NVM-specific optimizations.

  16. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Ukai, K.; Nakamura, T.

    1984-09-01

    An updated data compilation of single pion photoproduction experiments below 2 GeV is presented. This data bank includes not only the data of single pion photoproduction processes but also those of proton Compton scattering (γp → γp) and of the inverse process of γn → π-p (π-p → γn). The total numbers of data points are 6240 for γp → π+n, 5715 for γp → π0p, 2835 for γn → π-p, 177 for γn → π0n, 669 for γp → γp, and 112 for π-p → γn processes. The compiled data are stored in the central computer (FACOM M-380R) of the Institute of Nuclear Study, University of Tokyo, for direct use of this data bank, and on magnetic tapes with the standard label for other laboratories. The FACOM computer is compatible with IBM 370 series or IBM 303X or 308X series machines. The data on the magnetic tapes are available on request. (Kato, T.)

  17. Programming time-multiplexed reconfigurable hardware using a scalable neuromorphic compiler.

    Science.gov (United States)

    Minkovich, Kirill; Srinivasa, Narayan; Cruz-Albrecht, Jose M; Cho, Youngkwan; Nogin, Aleksey

    2012-06-01

    Scalability and connectivity are two key challenges in designing neuromorphic hardware that can match biological levels. In this paper, we describe a neuromorphic system architecture design that addresses an approach to meet these challenges using traditional complementary metal-oxide-semiconductor (CMOS) hardware. A key requirement in realizing such neural architectures in hardware is the ability to automatically configure the hardware to emulate any neural architecture or model. The focus for this paper is to describe the details of such a programmable front-end. This programmable front-end is composed of a neuromorphic compiler and a digital memory, and is designed based on the concept of synaptic time-multiplexing (STM). The neuromorphic compiler automatically translates any given neural architecture to hardware switch states and these states are stored in digital memory to enable desired neural architectures. STM enables our proposed architecture to address scalability and connectivity using traditional CMOS hardware. We describe the details of the proposed design and the programmable front-end, and provide examples to illustrate its capabilities. We also provide perspectives for future extensions and potential applications.

  18. Numerical performance and throughput benchmark for electronic structure calculations in PC-Linux systems with new architectures, updated compilers, and libraries.

    Science.gov (United States)

    Yu, Jen-Shiang K; Hwang, Jenn-Kang; Tang, Chuan Yi; Yu, Chin-Hui

    2004-01-01

    A number of recently released numerical libraries, including the Automatically Tuned Linear Algebra Subroutines (ATLAS) library, Intel Math Kernel Library (MKL), GOTO numerical library, and AMD Core Math Library (ACML) for AMD Opteron processors, are linked against the executables of the Gaussian 98 electronic structure calculation package, which is compiled by updated versions of Fortran compilers such as the Intel Fortran compiler (ifc/efc) 7.1 and the PGI Fortran compiler (pgf77/pgf90) 5.0. The ifc 7.1 delivers about 3% improvement on 32-bit machines compared to the former version 6.0. The performance improvement from pgf77 3.3 to 5.0 is also around 3% when utilizing the original unmodified optimization options enclosed in the software. Nevertheless, if extensive compiler tuning options are used, the speed can be further accelerated by about 25%. The performances of these fully optimized numerical libraries are similar. The double-precision floating-point (FP) instruction sets (SSE2) are also functional on AMD Opteron processors operated in 32-bit compilation, and the Intel Fortran compiler performs better optimization. Hardware-level tuning is able to improve memory bandwidth by adjusting the DRAM timing, and efficiency in the CL2 mode is a further 2.6% higher than in the CL2.5 mode. The FP throughput is measured by simultaneous execution of two identical copies of each of the test jobs. The resulting performance impact suggests that the IA64 and AMD64 architectures are able to deliver significantly higher throughput than IA32, which is consistent with the SpecFPrate2000 benchmarks.

  19. JLAPACK – Compiling LAPACK FORTRAN to Java

    Directory of Open Access Journals (Sweden)

    David M. Doolin

    1999-01-01

    Full Text Available The JLAPACK project provides the LAPACK numerical subroutines translated from their subset Fortran 77 source into class files, executable by the Java Virtual Machine (JVM) and suitable for use by Java programmers. This makes it possible for Java applications or applets, distributed on the World Wide Web (WWW), to use established legacy numerical code that was originally written in Fortran. The translation is accomplished using a special purpose Fortran-to-Java (source-to-source) compiler. The LAPACK API will be considerably simplified to take advantage of Java's object-oriented design. This report describes the research issues involved in the JLAPACK project, and its current implementation and status.

  20. Statistical Compilation of the ICT Sector and Policy Analysis | Page 2 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... to widen and deepen, so too does its impact on economic development. ... The outcomes of such efforts will subsequently inform policy discourse and ... Studies. Statistical Compilation of the ICT Sector and Policy Analysis project : country experiences; Malaysia ... Asian outlook: New growth dependent on new productivity.

  1. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually.

  2. Regulatory and technical reports (abstract index journal). Volume 20, No. 2: Compilation for second quarter April--June 1995

    International Nuclear Information System (INIS)

    1995-09-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually

  3. Herbal hepatotoxicity: a tabular compilation of reported cases.

    Science.gov (United States)

    Teschke, Rolf; Wolff, Albrecht; Frenzel, Christian; Schulze, Johannes; Eickhoff, Axel

    2012-11-01

    Herbal hepatotoxicity is a field that has rapidly grown over the last few years along with increased use of herbal products worldwide. To summarize the various facets of this disease, we undertook a literature search for herbs, herbal drugs and herbal supplements with reported cases of herbal hepatotoxicity. A selective literature search was performed to identify published case reports, spontaneous case reports, case series and review articles regarding herbal hepatotoxicity. A total of 185 publications were identified and the results compiled. They show 60 different herbs, herbal drugs and herbal supplements with reported potential hepatotoxicity; additional information, including synonyms of individual herbs, botanical names and cross-references, is provided. If known, details are presented for specific ingredients and chemicals in herbal products, and for references with authors that can be matched to each herbal product and to its effect on the liver. Based on stringent causality assessment methods and/or positive re-exposure tests, causality was highly probable or probable for Ayurvedic herbs, Chaparral, Chinese herbal mixture, Germander, Greater Celandine, green tea, a few Herbalife products, Jin Bu Huan, Kava, Ma Huang, Mistletoe, Senna, Syo Saiko To and Venencapsan(®). In many other publications, however, causality was not properly evaluated by a liver-specific causality assessment method validated for hepatotoxicity, such as the scale of CIOMS (Council for International Organizations of Medical Sciences). This compilation presents details of herbal hepatotoxicity, thereby assisting the clinical assessment of involved physicians in the future. © 2012 John Wiley & Sons A/S.

  4. Guide to Good Practice in using Open Source Compilers with the AGCC Lexical Analyzer

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available Quality software always demands a compromise between users' needs and hardware resources. Being faster means expensive devices such as powerful processors and virtually unlimited amounts of RAM, or re-engineering the code to adapt that piece of software to the client's hardware architecture. This is the purpose of optimizing code: to get the utmost software performance from a program under given conditions. There are tools for designing and writing the code, but the ultimate tool for optimizing remains the modest compiler, an often neglected software jewel that is the result of hundreds of working hours by the best specialists in the world. Even so, only two compilers fulfil the needs of professional developers: a proprietary solution from a giant of the IT industry, and the open-source GNU compiler, for which we develop the AGCC lexical analyzer that helps produce even more efficient software applications. It relies on the most popular hacks and tricks used by professionals and discovered by the author, who is proud to present them below.
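
    As a rough illustration of the kind of source-level optimization an optimizing compiler such as GCC typically performs at higher optimization levels, here is a generic C sketch (not taken from the AGCC paper) showing loop-invariant hoisting and a pointer-based rewrite:

```c
/* Generic sketch, not from the AGCC paper: loop-invariant hoisting and
 * strength reduction, the sort of rewrite GCC applies automatically at -O2
 * but which is useful to recognize when reading optimized code. */
#include <stddef.h>

/* Naive version: recomputes the invariant product scale*offset on every
 * iteration and addresses the arrays through an index. */
void scale_naive(double *dst, const double *src, size_t n,
                 double scale, double offset)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * scale + scale * offset;
}

/* Hand-optimized version: the invariant expression is hoisted out of the
 * loop and the arrays are walked with pointers. */
void scale_opt(double *dst, const double *src, size_t n,
               double scale, double offset)
{
    const double bias = scale * offset;   /* computed once, outside the loop */
    const double *end = src + n;
    while (src != end)
        *dst++ = *src++ * scale + bias;
}
```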

  5. Computer and compiler effects on code results: status report

    International Nuclear Information System (INIS)

    1996-01-01

    Within the framework of the international effort on the assessment of computer codes, which are designed to describe the overall reactor coolant system (RCS) thermalhydraulic response, core damage progression, and fission product release and transport during severe accidents, there has been a continuous debate as to whether the code results are influenced by different code users or by different computers or compilers. The first aspect, the 'Code User Effect', has already been investigated. In this paper the other aspects are discussed and proposals are given on how to make large system codes insensitive to different computers and compilers. Hardware errors and memory problems are not considered in this report. The codes investigated herein are integrated code systems (e.g. ESTER, MELCOR) and thermalhydraulic system codes with extensions for severe accident simulation (e.g. SCDAP/RELAP, ICARE/CATHARE, ATHLET-CD), and codes to simulate fission product transport (e.g. TRAPMELT, SOPHAEROS). Since all of these codes are programmed in Fortran 77, the discussion herein is based on this programming language, although some remarks are made about Fortran 90. Some observations of differing code results obtained on different computers are reported, possible reasons for this unexpected behaviour are listed, and methods to avoid portability problems are then discussed.
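
    One well-known mechanism behind such computer- and compiler-dependent results is the non-associativity of floating-point arithmetic; the short C sketch below (an illustration of the general issue, not taken from the report, which concerns Fortran codes) shows how a change in the evaluation order of the same sum changes the computed value:

```c
/* Illustration only (not from the report): floating-point addition is not
 * associative, so a compiler that reassociates or vectorizes a reduction,
 * or a machine that evaluates in extended precision, can change results. */
#include <stdio.h>

int main(void)
{
    float x = 1.0e8f, y = 1.0f, z = -1.0e8f;

    float order1 = (x + y) + z;  /* y is absorbed into x, so order1 == 0.0f */
    float order2 = (x + z) + y;  /* exact cancellation first, so order2 == 1.0f */

    printf("order1 = %g, order2 = %g\n", order1, order2);
    return 0;
}
```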

  7. A Compilation of Global Bio-Optical in Situ Data for Ocean-Colour Satellite Applications

    Science.gov (United States)

    Valente, Andre; Sathyendranath, Shubha; Brotus, Vanda; Groom, Steve; Grant, Michael; Taberner, Malcolm; Antoine, David; Arnone, Robert; Balch, William M.; Barker, Kathryn

    2016-01-01

    A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite-data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GePCO), span between 1997 and 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data were from multi-project archives acquired via the open internet services or from individual projects, acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control and conversion to a standard format. The final result is a merged table designed for validation of satellite-derived ocean-colour products and available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. By making available the metadata, it is also possible to analyse each set of data separately. The compiled data are available at doi:10.1594/PANGAEA.854832 (Valente et al., 2015).

  8. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    Science.gov (United States)

    2015-06-07

    to SAT, and then reduces higher order terms to quadratic terms through a series of gadgets. Our mappings allow both positive and negative preconditions...to its being specific to this type of problem) and likely benefits from a homogeneous parameter setting (Venturelli et al. 2014), as it generates a...Guzik, A. 2013. Resource efficient gadgets for compiling adiabatic quantum optimization problems. Annalen der Physik 525(10-11):877-888. Blum, A

  9. Towards droplet size-aware biochemical application compilation for AM-EWOD biochips

    DEFF Research Database (Denmark)

    Pop, Paul; Alistar, Mirela

    2015-01-01

    a droplet size-aware compilation by proposing a routing algorithm that considers the droplet size. Our routing algorithm is developed for a novel digital microfluidic biochip architecture based on Active Matrix Electrowetting on Dielectric, which uses a thin film transistor array for the electrodes. We also...

  10. 27 CFR 478.24 - Compilation of State laws and published ordinances.

    Science.gov (United States)

    2010-04-01

    ... published ordinances. (a) The Director shall annually revise and furnish Federal firearms licensees with a... Director annually revises the compilation and publishes it as “State Laws and Published Ordinances—Firearms... and published ordinances. 478.24 Section 478.24 Alcohol, Tobacco Products, and Firearms BUREAU OF...

  11. Statistical Compilation of the ICT Sector and Policy Analysis | Page 4 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  12. Statistical Compilation of the ICT Sector and Policy Analysis | Page 3 ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  13. The Outlook for Computer Professions: 1985 Rewrites the Program.

    Science.gov (United States)

    Drake, Larry

    1986-01-01

    The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programmers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…

  14. Yes! The Business Department Teaches Data Processing

    Science.gov (United States)

    Nord, Daryl; Seymour, Tom

    1978-01-01

    After a brief discussion of the history and current status of business data processing versus computer science, this article focuses on the characteristics of a business data processing curriculum as compared to a computer science curriculum, including distinctions between the FORTRAN and COBOL programming languages. (SH)

  15. Lower limb lymphedema: experiences and perceptions of cancer patients in the late palliative stage.

    Science.gov (United States)

    Frid, Marianne; Strang, Peter; Friedrichsen, Maria J; Johansson, Karin

    2006-01-01

    Lower limb lymphedema (LLL) is a common but neglected problem in palliative cancer patients. No studies have focused on these patients' experiences of lymphedema. The aims of this study were to explore patients' experiences regarding LLL and how they manage to deal with this in the late palliative stage. Thirteen patients with cancer-related LLL were included to satisfy a maximum variation sampling strategy. Interviews were analyzed using a qualitative phenomenographic method. LLL influenced the patients' thoughts about the future. Body image was often strongly influenced. Interactions with other persons were perceived as both positive and negative, and a range of coping strategies were expressed. LLL can exert a considerable influence on the physical experiences and the psychosocial situation of cancer patients in palliative care. Areas in need of increased education, attention, and further research are highlighted.

  16. Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont

    Data.gov (United States)

    Vermont Center for Geographic Information — Digital Data from VG96-03�Digital compilation bedrock geologic map of part of the Waitsfield quadrangle, Vermont: VGS Open-File Report VG96-3A, 2 plates, scale...

  17. [Influence of human body target's spectral characteristics on visual range of low light level image intensifiers].

    Science.gov (United States)

    Zhang, Jun-Ju; Yang, Wen-Bin; Xu, Hui; Liu, Lei; Tao, Yuan-Yaun

    2013-11-01

    To study the effect of different human targets' spectral reflectance characteristics on the visual range of low light level (LLL) image intensifiers, we established an equation for the spectral reflectance distribution of a human body target, based on the spectral characteristics of night-sky radiation and the spectral reflectance coefficients of common clothes; analyzed the spectral reflectance characteristics of human targets wearing clothes of different colors and materials; and, starting from the actual detection-range equation of the LLL image intensifier, discussed its detection capability for different human targets. The study shows that the effect of a human target's spectral reflectance characteristics on LLL image intensifier range appears mainly through the average reflectivity ρ̄ and the initial contrast C0 between the target and the background. The reflectance coefficient and spectral reflection intensity of cotton clothes are higher than those of polyester clothes, and the detection capability of the LLL image intensifier is stronger for a human target wearing cotton clothes. Experimental results show that LLL image intensifiers have longer visual ranges for targets wearing cotton clothes than for targets wearing polyester clothes of the same color, and longer visual ranges for targets wearing light-colored clothes than for targets wearing dark-colored clothes. Under full-moon illumination conditions, LLL image intensifiers are more sensitive to the clothes' material.

  18. DJ Prinsloo and BP Sathekge (compil- ers — revised edition).

    African Journals Online (AJOL)

    The compilers of this new edition have successfully highlighted the important additions to the last edition of the dictionary. It is important to inform pro- spective users about new information. It is also a marketing strategy to announce the contents of a new product in both the preface and at the back of the cover page, as is the ...

  19. A compilation of Sr and Nd isotope data on Mexico

    International Nuclear Information System (INIS)

    Verma, S.P.; Verma, M.P.

    1986-01-01

    A compilation is given of the available Sr and Nd isotope data on Mexican volcanic-plutonic terranes which cover about one-third of Mexico's territory. The available data are arranged according to a subdivision of the Mexican territory in terms of geological provinces. Furthermore, site and province averages and standard deviations are calculated and their petrogenetic implications are pointed out. (author)

  20. Technique to increase performance of C-program for control systems. Compiler technique for low-cost CPU; Seigyoyo C gengo program no kosokuka gijutsu. Tei cost CPU no tame no gengo compiler gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Y [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    The software of automotive control systems has become increasingly large and complex. High-level languages (primarily C) and their compilers have become more important as a means of reducing coding time. Most compilers represent real numbers in the floating-point format specified by IEEE standard 754. Due to cost requirements, most microprocessors in the automotive industry have no hardware for operations in the IEEE format, resulting in slow execution speed and large code size. Alternative formats to increase execution speed and reduce code size are proposed. Experimental results for the alternative formats show the improvement in execution speed and code size. 4 refs., 3 figs., 2 tabs.
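
    The abstract does not specify the alternative formats; a common substitute for IEEE 754 floating point on FPU-less control microprocessors is fixed-point arithmetic, sketched below as a generic illustration (the Q16.16 layout and the helper names are assumptions, not the paper's format):

```c
/* Generic fixed-point sketch (not the paper's format): a signed 32-bit
 * value with 16 fractional bits (Q16.16), so all arithmetic uses the
 * integer unit only. */
#include <stdint.h>
#include <stdio.h>

typedef int32_t q16_16;                  /* Q16.16 fixed-point number */
#define Q16_ONE (1 << 16)                /* representation of 1.0     */

static inline q16_16 q16_from_double(double x) { return (q16_16)(x * Q16_ONE); }
static inline double q16_to_double(q16_16 x)   { return (double)x / Q16_ONE; }

static inline q16_16 q16_mul(q16_16 a, q16_16 b)
{
    /* Widen to 64 bits so the intermediate product cannot overflow,
     * then shift back down to the Q16.16 scale. */
    return (q16_16)(((int64_t)a * (int64_t)b) >> 16);
}

/* Example: one step of a first-order low-pass filter y += k*(x - y),
 * the kind of arithmetic a control loop performs every sample. */
static q16_16 lowpass_step(q16_16 y, q16_16 x, q16_16 k)
{
    return y + q16_mul(k, x - y);
}

int main(void)
{
    q16_16 y = 0, x = q16_from_double(1.0), k = q16_from_double(0.25);
    for (int i = 0; i < 8; i++) {
        y = lowpass_step(y, x, k);
        printf("step %d: y = %f\n", i, q16_to_double(y));
    }
    return 0;
}
```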

  1. PELE-IC test problems

    International Nuclear Information System (INIS)

    Gong, E.Y.; Alexander, E.E.; McMaster, W.H.; Quinones, D.F.

    1979-01-01

    This report provides prospective users of the Lawrence Livermore Laboratory (LLL) fluid-structure interaction computer code, PELE-IC, a variety of test problems for verifying the code on CDC 7600 computer systems at facilities external to the LLL environment. The test problems have been successfully run on CDC 7600 computers at the LLL and Lawrence Berkeley Laboratory (LBL) computer centers

  2. Northern hemisphere mid-latitude geomagnetic anomaly revealed from Levantine Archaeomagnetic Compilation (LAC).

    Science.gov (United States)

    Shaar, R.; Tauxe, L.; Agnon, A.; Ben-Yosef, E.; Hassul, E.

    2015-12-01

    The rich archaeological heritage of Israel and nearby Levantine countries provides a unique opportunity for archaeomagnetic investigation at high resolution. Here we present a summary of our ongoing effort to reconstruct geomagnetic variations of the past several millennia in the Levant at decadal to millennial resolution. This effort in the Southern Levant, namely the "Levantine Archaeomagnetic Compilation" (LAC), presently consists of data from over 650 well-dated archaeological objects including pottery, slag, ovens, and furnaces. In this talk we review the methodological challenges in achieving a robust master secular variation curve with realistic error estimates from a large number of different datasets. We present the current status of the compilation, including the southern and western Levant LAC data (Israel, Cyprus, and Jordan) and other published north-eastern Levant data (Syria and southern Turkey), and outline the main findings emerging from these data. The main feature apparent from the new compilation is an extraordinary intensity high that developed over the Levant region during the first two millennia BCE. The climax of this event is a double-peak intensity maximum starting at ca. 1000 BCE and ending at ca. 735 BCE, accompanied by at least two geomagnetic spike events. Paleomagnetic directions from this period show anomalies of up to 20 degrees from the averaged GAD field. This leads us to postulate that the maximum in intensity is a manifestation of an intense mid-latitude local positive geomagnetic anomaly that persisted for over two centuries.

  3. Supplementary material on passive solar heating concepts. A compilation of published articles

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-05-01

    A compilation of published articles and reports dealing with passive solar energy concepts for heating and cooling buildings is presented. The following are included: fundamental of passive systems, applications and technical analysis, graphic tools, and information sources. (MHR)

  4. NEA contributions to the worldwide collection, compilation and dissemination of nuclear reaction data

    International Nuclear Information System (INIS)

    Dupont, E.

    2012-01-01

    The NEA Data Bank is an international centre of reference for basic nuclear tools used in the analysis and prediction of phenomena in different nuclear applications. The Data Bank collects and compiles computer codes and scientific data and contributes to their improvement for the benefit of scientists in its member countries. In line with this mission, the Data Bank is a core centre of the International Network of Nuclear Reaction Data Centres (NRDC), which co-ordinates the worldwide collection, compilation and dissemination of nuclear reaction data. The NRDC network was established in 1976 from the earlier Four-Centres' Network created in 1966 by the United States, the NEA, the International Atomic Energy Agency (IAEA) and the former Soviet Union. Today, the NRDC is a worldwide co-operation network under the auspices of the IAEA, with 14 nuclear data centres from 8 countries and 2 international organisations belonging to the network. The main objective of the NRDC is to preserve, update and disseminate experimental nuclear reaction data that have been compiled for more than 40 years in a shared database (EXFOR). The EXFOR database contains basic nuclear data on low- to medium-energy experiments for incident neutron, photon and various charged-particle-induced reactions on a wide range of isotopes, natural elements and compounds. Today, with more than 140 000 data sets from approximately 20 000 experiments, EXFOR is by far the most important and complete experimental nuclear reaction database in the world and is widely used in the field of nuclear science and technology. The Data Bank is responsible for the collection and compilation of nuclear reaction data measured in its geographical area. Since 1966, the Data Bank has contributed around 5 000 experiments to the EXFOR database, and it continues to compile new data while maintaining the highest level of quality throughout the database. NRDC co-ordination meetings are held on a biennial basis. Recent meetings

  5. Interpretation, compilation and field verification procedures in the CARETS project

    Science.gov (United States)

    Alexander, Robert H.; De Forth, Peter W.; Fitzpatrick, Katherine A.; Lins, Harry F.; McGinty, Herbert K.

    1975-01-01

    The production of the CARETS map data base involved the development of a series of procedures for interpreting, compiling, and verifying data obtained from remote sensor sources. Level II land use mapping from high-altitude aircraft photography at a scale of 1:100,000 required production of a photomosaic mapping base for each of the 48, 50 x 50 km sheets, and the interpretation and coding of land use polygons on drafting film overlays. CARETS researchers also produced a series of 1970 to 1972 land use change overlays, using the 1970 land use maps and 1972 high-altitude aircraft photography. To enhance the value of the land use sheets, researchers compiled series of overlays showing cultural features, county boundaries and census tracts, surface geology, and drainage basins. In producing Level I land use maps from Landsat imagery, at a scale of 1:250,000, interpreters overlaid drafting film directly on Landsat color composite transparencies and interpreted on the film. They found that such interpretation involves pattern and spectral signature recognition. In studies using Landsat imagery, interpreters identified numerous areas of change but also identified extensive areas of "false change," where Landsat spectral signatures but not land use had changed.

  6. Data compilation of single pion photoproduction below 2 GeV

    International Nuclear Information System (INIS)

    Inagaki, Y.; Nakamura, T.; Ukai, K.

    1976-01-01

    A compilation of data from single pion photoproduction experiments below 2 GeV is presented, together with keywords that specify each experiment. These data are written on a magnetic tape. The data format and the indices for the keywords are given. Various programs for using this tape are also presented. The results of the compilation are divided into two types: the reference card, which gives information about the experiment, and the data card. These reference and data cards are written entirely in A-type format on an original tape. Copy tapes, written in various formats on request, are available. There are two kinds of copy tapes: one is the same as the original tape, and the other differs in the data card, which is written in F-type format following the data type. One experiment on this tape is represented by three kinds of cards: one reference card in A-type format, many data cards in F-type format, and one identifying card. Various programs written in FORTRAN are available for these original and copy tapes. (Kato, T.)

  7. Basic circuit compilation techniques for an ion-trap quantum machine

    International Nuclear Information System (INIS)

    Maslov, Dmitri

    2017-01-01

    We study the problem of compilation of quantum algorithms into optimized physical-level circuits executable in a quantum information processing (QIP) experiment based on trapped atomic ions. We report a complete strategy: starting with an algorithm in the form of a quantum computer program, we compile it into a high-level logical circuit that goes through multiple stages of decomposition into progressively lower-level circuits until we reach the physical execution-level specification. We skip the fault-tolerance layer, as it is not within the scope of this work. The different stages are structured so as to best assist with the overall optimization while taking into account numerous optimization criteria, including minimizing the number of expensive two-qubit gates, minimizing the number of less expensive single-qubit gates, optimizing the runtime, minimizing the overall circuit error, and optimizing classical control sequences. Our approach allows a trade-off between circuit runtime and quantum error, as well as to accommodate future changes in the optimization criteria that may likely arise as a result of the anticipated improvements in the physical-level control of the experiment. (paper)

  8. Compilation of monographs on α-, β-, γ- and X-ray spectrometry

    International Nuclear Information System (INIS)

    Debertin, K.

    1977-11-01

    The working group 'α-, β-, γ-Ray Spectrometry' of the International Committee for Radionuclide Metrology (ICRM) compiled about 35 monographs on α-, β-, γ- and X-ray spectrometry which were published in the years 1970 to 1976. Support was obtained by the Zentralstelle fuer Atomkernenergie-Dokumentation (ZAED) in Karlsruhe. (orig.) [de

  9. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-05-15

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts, considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  10. Report of the Panel on Neutron Data Compilation. Brookhaven National Laboratory, USA, 10-14 February 1969

    International Nuclear Information System (INIS)

    1969-05-01

    After surveying current world needs for bibliographic and compilation activities in the field of neutron data, the report of this Panel of 31 individual technical experts, considers the immediate and future role of the world's neutron data centres in this task. In Chapter V the Panel's findings are summarized in the form of recommendations directed to the centres and their associated national and international advisory committees together with all users of the centres. The Panel's recommendations can be summarised as follows: a) The need for bibliographic indexing and numerical compilation of neutron data on an international basis has been clearly demonstrated and should continue for the foreseeable future; b) The operation of CINDA has been extremely satisfactory; c) Neutron data should be compiled at all energies by all centres subject to any mutually agreed exceptions and priorities; d) A fine-meshed classification scheme for neutron reactions should be formulated and put into use before the end of 1969 in accordance with the timetable; e) A scheme for associating a detailed statement of the main characteristics of each experiment with compilations of the resulting data should be formulated and put into preliminary operation before the end of 1969; f) The immediate primary tasks of the principal data centres are to complete the compilation of existing numerical data, whilst keeping abreast of new data, and to agree and implement an improved compilation, storage and retrieval system; g) Input of experimental data can be facilitated by specific measures; h) Centres should publish review publications which they believe will serve the user community; i) The centres should provide data to users in a variety of media: printed listings, graphs, paper tape, punched cards and magnetic tape - but should encourage standardization within each medium so as to free effort to meet special requirements of users having limited computer facilities; j) Centres should hold and

  11. The lowest Landau level in QCD

    Directory of Open Access Journals (Sweden)

    Bruckmann Falk

    2017-01-01

    Full Text Available The thermodynamics of Quantum Chromodynamics (QCD) in external (electro-)magnetic fields shows some unexpected features like inverse magnetic catalysis, which have been revealed mainly through lattice studies. Many effective descriptions, on the other hand, use Landau levels or approximate the system by just the lowest Landau level (LLL). Analyzing lattice configurations we ask whether such a picture is justified. We find the LLL to be separated from the rest by a spectral gap in the two-dimensional Dirac operator and analyze the corresponding LLL signature in four dimensions. We determine to what extent the quark condensate is LLL dominated at strong magnetic fields.

  12. abc: An Extensible AspectJ Compiler

    DEFF Research Database (Denmark)

    Avgustinov, Pavel; Christensen, Aske Simon; Hendren, Laurie J.

    2006-01-01

    checking and code generation, as well as data flow and control flow analyses. The AspectBench Compiler (abc) is an implementation of such a workbench. The base version of abc implements the full AspectJ language. Its front end is built using the Polyglot framework, as a modular extension of the Java...... language. The use of Polyglot gives flexibility of syntax and type checking. The back end is built using the Soot framework, to give modular code generation and analyses. In this paper, we outline the design of abc, focusing mostly on how the design supports extensibility. We then provide a general...

  13. Thoughts and Views on the Compilation of Monolingual Dictionaries in South Africa

    Directory of Open Access Journals (Sweden)

    N.C.P Golele

    2011-10-01

    Full Text Available Abstract: Developing and documenting the eleven official languages of South Africa on all levels of communication in order to fulfil all the roles and uses characteristic of truly official languages is a great challenge. To meet this need various bodies such as the National Lexicography Units have been established by the Pan South African Language Board (PanSALB). As far as dictionary compilation is concerned, acquaintance with the state-of-the-art developments in the theory and practice of lexicography is necessary. The focus of the African languages should be directed onto the compilation of monolingual dictionaries. It is important that these monolingual dictionaries should be usable right from the start on a continuous basis. Continued attention should be given to enlarging the corpora and actual consultation of these corpora on the macro- and microstructural levels. The end-products should be of a high lexicographic standard, well-balanced in terms of lemma selection, length of the articles, maximum utilisation of available dictionary space etc. They should also be planned and compiled in such a way that the transition from paper dictionaries to electronic dictionaries could be easily and naturally accomplished. Advanced and continued training in the compilation of monolingual dictionaries should be presented. Keywords: MONOLINGUAL DICTIONARIES, OFFICIAL LANGUAGES, DICTIONARY COMPILATION, CORPORA, NATIONAL LEXICOGRAPHY UNITS, TARGET USERS, DICTIONARY USE, DICTIONARY CULTURE, CORE TERMS Opsomming (Summary): Thoughts and views on the compilation of monolingual dictionaries in South Africa. The development and documentation of the eleven official languages of South Africa at all levels of communication, in order to fulfil all the roles and uses of truly official languages, is a great challenge. To meet this need, bodies such as the National Lexicography Units were established by the Pan South African Language Board (PanSALB). What

  14. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index.

  15. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index.

  16. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1996 July--September. Volume 21, Number 3

    International Nuclear Information System (INIS)

    1997-02-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: secondary report number index; personal author index; subject index; NRC originating organization index (staff reports); NRC originating organization index (international agreements); NRC contract sponsor index (contractor reports); contractor index; international organization index; and licensed facility index. A detailed explanation of the entries precedes each index

  17. Regulatory and technical reports (abstract index journal): Compilation for third quarter 1994, July--September. Volume 19, Number 3

    International Nuclear Information System (INIS)

    1994-12-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors. It is NRC's intention to publish this compilation quarterly and to cumulate it annually. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, NUREG/CR-XXXX, and NUREG/IA-XXXX. These precede the following indexes: Secondary Report Number Index, Personal Author Index, Subject Index, NRC Originating Organization Index (Staff Reports), NRC Originating Organization Index (International Agreements), NRC Contract Sponsor Index (Contractor Reports), Contractor Index, International Organization Index, Licensed Facility Index. A detailed explanation of the entries precedes each index

  18. Compiler generation and autotuning of communication-avoiding operators for geometric multigrid

    Energy Technology Data Exchange (ETDEWEB)

    Basu, Protonu [Univ. of Utah, Salt Lake City, UT (United States); Venkat, Anand [Univ. of Utah, Salt Lake City, UT (United States); Hall, Mary [Univ. of Utah, Salt Lake City, UT (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Van Straalen, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-04-17

    This paper describes a compiler approach to introducing communication-avoiding optimizations in geometric multigrid (GMG), one of the most popular methods for solving partial differential equations. Communication-avoiding optimizations reduce vertical communication through the memory hierarchy and horizontal communication across processes or threads, usually at the expense of introducing redundant computation. We focus on applying these optimizations to the smooth operator, which successively reduces the error and accounts for the largest fraction of the GMG execution time. Our compiler technology applies both novel and known transformations to derive an implementation comparable to manually-tuned code. To make the approach portable, an underlying autotuning system explores the tradeoff between reduced communication and increased computation, as well as tradeoffs in threading schemes, to automatically identify the best implementation for a particular architecture and at each computation phase. Results show that we are able to quadruple the performance of the smooth operation on the finest grids while attaining performance within 94% of manually-tuned code. Overall, we improve the multigrid solve time by 2.5× without sacrificing programmer productivity.
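
    To make the communication-avoiding idea concrete, the C sketch below (an illustration of the general technique, not the paper's generated code) fuses several 1-D Jacobi smoother sweeps behind a single deep halo exchange, trading redundant computation near the halo for fewer messages:

```c
/* Illustration of communication-avoiding smoothing (not the paper's code):
 * instead of exchanging a 1-cell halo before each of K 1-D Jacobi sweeps,
 * exchange a K-cell-deep halo once and run K sweeps locally, shrinking the
 * updatable window by one cell per sweep (redundant work near the halo). */
#include <string.h>

#define K    4                 /* sweeps fused between halo exchanges       */
#define NLOC 1024              /* cells owned by this process or tile       */
#define NTOT (NLOC + 2 * K)    /* owned cells plus K-deep halo on each side */

/* Stand-in halo exchange: periodic wrap-around within one array, where a
 * real code would post MPI sends/receives of 'depth' cells per side. */
static void exchange_halo(double *u, int depth)
{
    memcpy(&u[0], &u[NLOC], (size_t)depth * sizeof(double));              /* left halo  */
    memcpy(&u[NLOC + depth], &u[depth], (size_t)depth * sizeof(double));  /* right halo */
}

/* One fused block of K weighted-Jacobi sweeps for -u'' = f on a uniform
 * grid; f is assumed to be valid over the full NTOT range as well. */
void smooth_ca(double *u, const double *f, double w, double h2)
{
    static double unew[NTOT];

    exchange_halo(u, K);                       /* one deep exchange, not K  */

    for (int s = 0; s < K; s++) {
        /* After s sweeps, only cells s+1 .. NTOT-2-s can be updated with
         * fully up-to-date neighbours; the window shrinks each sweep.    */
        int lo = s + 1, hi = NTOT - 2 - s;
        for (int i = lo; i <= hi; i++)
            unew[i] = (1.0 - w) * u[i]
                    + w * 0.5 * (u[i - 1] + u[i + 1] + h2 * f[i]);
        memcpy(&u[lo], &unew[lo], (size_t)(hi - lo + 1) * sizeof(double));
    }
    /* After K sweeps the owned cells K .. K+NLOC-1 carry K smoother steps. */
}
```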

  19. Aeromagnetic map compilation: procedures for merging and an example from Washington

    Directory of Open Access Journals (Sweden)

    C. Finn

    2000-06-01

    Full Text Available Rocks in Antarctica and offshore have widely diverse magnetic properties. Consequently, aeromagnetic data collected there can improve knowledge of the geologic, tectonic and geothermal characteristics of the region. Aeromagnetic data can map concealed structures such as faults, folds and dikes, ascertain basin thickness and locate buried volcanic, as well as some intrusive and metamorphic rocks. Gridded, composite data sets allow a view of continental-scale trends that individual data sets do not provide and link widely-separated areas of outcrop and disparate geologic studies. Individual magnetic surveys must be processed so that they match adjacent surveys prior to merging. A consistent representation of the Earth's magnetic field (the International Geomagnetic Reference Field, IGRF) must be removed from each data set. All data sets need to be analytically continued to the same flight elevation with their datums shifted to match adjacent data. I advocate minimal processing to best represent the individual surveys in the merged compilation. An example of a compilation of aeromagnetic surveys from Washington illustrates the utility of aeromagnetic maps for providing synoptic views of regional tectonic features.

  20. Proposal for a new self-compiled questionnaire in patients affected by temporo-mandibular joint disorders (TMD).

    Science.gov (United States)

    Agrillo, A; Ramieri, V; Bianca, C; Nastro Siniscalchi, E; Fatone, F M G; Arangio, P

    2010-07-01

    In this work, we propose a self-compiled questionnaire for patients showing dysfunctions of the temporomandibular joint. The questionnaire, composed of 33 closed multiple-choice questions, represents one of the steps in the diagnostic procedure, together with the clinical notes compiled by the medical specialist and the other necessary diagnostic investigations. It also aims to simplify the anamnesis and clinical procedure and the gathering of all information useful for a correct clinical diagnosis, and thus for an appropriate therapy.

  1. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 treats data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of the use of CRECTJ. (author)

  2. The Study of Contemporary Stone-archives Compilation (当代石刻档案编纂的研究)

    Institute of Scientific and Technical Information of China (English)

    赵彦昌; 朱效荣

    2016-01-01

    Stone-archives are original records of human activities, consciously preserved with stone as the carrier. Because of their great value, they have inspired people from all walks of life to study stone-archives compilation. The achievements of stone-archives compilation since the founding of the PRC are very rich, but no one has yet systematically summarized and analyzed them. This paper analyzes the present situation of contemporary stone-archives compilation from the perspective of its evolution and methods, and reveals the value of stone-archives compilation achievements.

  3. A compilation of structure functions in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Roberts, R.G.; Whalley, M.R.

    1991-01-01

    A compilation of data on the structure functions F2, xF3, and R = σL/σT from lepton deep-inelastic scattering off protons and nuclei is presented. The relevant experiments at CERN, Fermilab and SLAC from 1985 are covered. All the data in this review can be found in and retrieved from the Durham-RAL HEP Databases (HEPDATA on the RAL and CERN VM systems and on DURPDG VAX/VMS) together with data on a wide variety of other reactions. (author)

  4. ERES--a PC software for nuclear data compilation in EXFOR format

    International Nuclear Information System (INIS)

    Li Shubing; Liang Qichang; Liu Tingjin

    1993-01-01

    The major functions and implementation of the software ERES (EXFOR Edit System) are introduced. ERES was developed for nuclear data compilation in the EXFOR (EXchange FORmat) format, running on IBM-PC/XT or IBM-PC/AT. EXFOR is the format for the exchange of experimental neutron data accepted by the four neutron data centers in the world.

  5. Risk factors and a prediction model for lower limb lymphedema following lymphadenectomy in gynecologic cancer: a hospital-based retrospective cohort study.

    Science.gov (United States)

    Kuroda, Kenji; Yamamoto, Yasuhiro; Yanagisawa, Manami; Kawata, Akira; Akiba, Naoya; Suzuki, Kensuke; Naritaka, Kazutoshi

    2017-07-25

    Lower limb lymphedema (LLL) is a chronic and incapacitating condition afflicting patients who undergo lymphadenectomy for gynecologic cancer. This study aimed to identify risk factors for LLL and to develop a prediction model for its occurrence. Pelvic lymphadenectomy (PLA) with or without para-aortic lymphadenectomy (PALA) was performed on 366 patients with gynecologic malignancies at Yaizu City Hospital between April 2002 and July 2014; we retrospectively analyzed 264 eligible patients. The intervals between surgery and diagnosis of LLL were calculated; the prevalence and risk factors were evaluated using the Kaplan-Meier and Cox proportional hazards methods. We developed a prediction model with which patients were scored and classified as low-risk or high-risk. The cumulative incidence of LLL was 23.1% at 1 year, 32.8% at 3 years, and 47.7% at 10 years post-surgery. LLL developed after a median 13.5 months. Using regression analysis, body mass index (BMI) ≥25 kg/m2 (hazard ratio [HR], 1.616; 95% confidence interval [CI], 1.030-2.535), PLA + PALA (HR, 2.323; 95% CI, 1.126-4.794), postoperative radiation therapy (HR, 2.469; 95% CI, 1.148-5.310), and lymphocyst formation (HR, 1.718; 95% CI, 1.120-2.635) were found to be independently associated with LLL; age, type of cancer, number of lymph nodes, retroperitoneal suture, chemotherapy, lymph node metastasis, herbal medicine, self-management education, or infection were not associated with LLL. The predictive score was based on the 4 associated variables; patients were classified as high-risk (scores 3-6) and low-risk (scores 0-2). LLL incidence was significantly greater in the high-risk group than in the low-risk group (HR, 2.19; 95% CI, 1.440-3.324). The cumulative incidence at 5 years was 52.1% [95% CI, 42.9-62.1%] for the high-risk group and 28.9% [95% CI, 21.1-38.7%] for the low-risk group. The area under the receiver operator characteristics curve for the prediction model was 0.631 at 1 year, 0
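
    The abstract gives the four scored risk factors and the 0-6 score range but not the individual point weights; the C sketch below illustrates how such an additive prediction score could be applied, with the per-factor points (1, 2, 2, 1) chosen here purely as hypothetical placeholders roughly mirroring the reported hazard ratios, not the weights from the paper:

```c
/* Hypothetical illustration only: the per-factor points below are NOT the
 * published weights; they are placeholders showing how a 0-6 additive risk
 * score with a high-risk cut-off at 3 points could be evaluated. */
#include <stdbool.h>
#include <stdio.h>

struct lll_risk_factors {
    bool bmi_ge_25;        /* body mass index >= 25 kg/m2             */
    bool pla_plus_pala;    /* pelvic plus para-aortic lymphadenectomy */
    bool postop_radiation; /* postoperative radiation therapy         */
    bool lymphocyst;       /* lymphocyst formation                    */
};

static int lll_risk_score(const struct lll_risk_factors *p)
{
    int score = 0;
    score += p->bmi_ge_25        ? 1 : 0;   /* hypothetical point values */
    score += p->pla_plus_pala    ? 2 : 0;
    score += p->postop_radiation ? 2 : 0;
    score += p->lymphocyst       ? 1 : 0;
    return score;                           /* range 0..6 */
}

int main(void)
{
    struct lll_risk_factors patient = { true, false, true, false };
    int score = lll_risk_score(&patient);
    printf("score = %d -> %s risk\n", score, score >= 3 ? "high" : "low");
    return 0;
}
```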

  6. Customised Column Generation for Rostering Problems: Using Compile-time Customisation to create a Flexible C++ Engine for Staff Rostering

    DEFF Research Database (Denmark)

    Mason, Andrew J.; Ryan, David; Hansen, Anders Dohn

    2009-01-01

    , but is difficult to maintain, and incurs the time penalties of run-time customisation. Our new approach is to customise the software at compile time, allowing compiler optimisations to be fully exploited to give faster code. The code has also proven to be easier to read and debug....

  7. National Energy Strategy: A compilation of public comments; Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    1990-04-01

    This Report presents a compilation of what the American people themselves had to say about problems, prospects, and preferences in energy. The Report draws on the National Energy Strategy public hearing record and accompanying documents. In all, 379 witnesses appeared at the hearings to exchange views with the Secretary, Deputy Secretary, and Deputy Under Secretary of Energy, and Cabinet officers of other Federal agencies. Written submissions came from more than 1,000 individuals and organizations. Transcripts of the oral testimony and question-and-answer (Q-and-A) sessions, as well as prepared statements submitted for the record and all other written submissions, form the basis for this compilation. Citations of these sources in this document use a system of identifying symbols explained below and in the accompanying box. The Report is organized into four general subject areas concerning: (1) efficiency in energy use, (2) the various forms of energy supply, (3) energy and the environment, and (4) the underlying foundations of science, education, and technology transfer. Each of these, in turn, is subdivided into sections addressing specific topics --- such as (in the case of energy efficiency) energy use in the transportation, residential, commercial, and industrial sectors, respectively. 416 refs., 44 figs., 5 tabs.

  8. Data compilation of respiration, feeding, and growth rates of marine pelagic organisms

    DEFF Research Database (Denmark)

    2013-01-01

    's adaptation to the environment, with consequently less universal mass scaling properties. Data on body mass, maximum ingestion and clearance rates, respiration rates and maximum growth rates of animals living in the ocean epipelagic were compiled from the literature, mainly from original papers but also from...

  9. Compilation of properties data for Li{sub 2}TiO{sub 3}

    Energy Technology Data Exchange (ETDEWEB)

    Roux, N [CEA Centre d` Etudes de Saclay, 91 - Gif-sur-Yvette (France)

    1998-03-01

    Properties data obtained at CEA for Li{sub 2}TiO{sub 3} are reported. The compilation includes : stability of Li{sub 2}TiO{sub 3} {beta} phase, specific heat, thermal diffusivity, thermal conductivity, linear thermal expansion, thermal creep, interaction with water and acid. (author)

  10. ACE - an algebraic compiler and encoder for the Chalk River datatron computer

    International Nuclear Information System (INIS)

    Kennedy, J.M.; Okazaki, E.A.; Millican, M.

    1960-03-01

    ACE is a program written for the Chalk River Datatron (Burroughs 205) Computer to enable the machine to compile a program for solving a problem from instructions supplied by the user in a notation related much more closely to algebra than to the machine's own code. (author)

  11. ccPDB: compilation and creation of data sets from Protein Data Bank.

    Science.gov (United States)

    Singh, Harinder; Chauhan, Jagat Singh; Gromiha, M Michael; Raghava, Gajendra P S

    2012-01-01

    ccPDB (http://crdd.osdd.net/raghava/ccpdb/) is a database of data sets compiled from the literature and Protein Data Bank (PDB). First, we collected and compiled data sets from the literature used for developing bioinformatics methods to annotate the structure and function of proteins. Second, data sets were derived from the latest release of PDB using standard protocols. Third, we developed a powerful module for creating a wide range of customized data sets from the current release of PDB. This is a flexible module that allows users to create data sets using a simple six step procedure. In addition, a number of web services have been integrated in ccPDB, which include submission of jobs on PDB-based servers, annotation of protein structures and generation of patterns. This database maintains >30 types of data sets such as secondary structure, tight-turns, nucleotide interacting residues, metals interacting residues, DNA/RNA binding residues and so on.

  12. Sharing analysis in the Pawns compiler

    Directory of Open Access Journals (Sweden)

    Lee Naish

    2015-09-01

    Full Text Available Pawns is a programming language under development that supports algebraic data types, polymorphism, higher order functions and “pure” declarative programming. It also supports impure imperative features including destructive update of shared data structures via pointers, allowing significantly increased efficiency for some operations. A novelty of Pawns is that all impure “effects” must be made obvious in the source code and they can be safely encapsulated in pure functions in a way that is checked by the compiler. Execution of a pure function can perform destructive updates on data structures that are local to or eventually returned from the function without risking modification of the data structures passed to the function. This paper describes the sharing analysis which allows impurity to be encapsulated. Aspects of the analysis are similar to other published work, but in addition it handles explicit pointers and destructive update, higher order functions including closures and pre- and post-conditions concerning sharing for functions.
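
    The record does not show Pawns syntax, so the following is only a loose Python illustration of the discipline described: a function may destructively update structures that are local to it or that it will return, but must never modify structures passed in by its caller. All names are invented for the example.

        def sorted_unique(xs):
            """Pure interface: the caller's list is never mutated."""
            local = list(xs)       # private copy, so in-place update is safe
            local.sort()           # destructive update of data we own
            out = []
            for x in local:
                if not out or out[-1] != x:
                    out.append(x)  # building a structure that will be returned
            return out

        data = [3, 1, 2, 3, 1]
        print(sorted_unique(data))  # [1, 2, 3]
        print(data)                 # [3, 1, 2, 3, 1] -- argument unchanged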

  13. Molecular dynamics and diffusion a compilation

    CERN Document Server

    Fisher, David

    2013-01-01

    The molecular dynamics technique was developed in the 1960s as the outgrowth of attempts to model complicated systems by using either a) direct physical simulation or (following the great success of Monte Carlo methods) by b) using computer techniques. Computer simulation soon won out over clumsy physical simulation, and the ever-increasing speed and sophistication of computers has naturally made molecular dynamics simulation into a more and more successful technique. One of its most popular applications is the study of diffusion, and some experts now even claim that molecular dynamics simulation is, in the case of situations involving well-characterised elements and structures, more accurate than experimental measurement. The present double volume includes a compilation (over 600 items) of predicted solid-state diffusion data, for all of the major materials groups, dating back nearly four decades. The double volume also includes some original papers: "Determination of the Activation Energy for Formation and ...

  14. Compilation of Existing Neutron Screen Technology

    Directory of Open Access Journals (Sweden)

    N. Chrysanthopoulou

    2014-01-01

    Full Text Available The presence of fast neutron spectra in new reactors is expected to induce a strong impact on the contained materials, including structural materials, nuclear fuels, neutron reflecting materials, and tritium breeding materials. Therefore, introduction of these reactors into operation will require extensive testing of their components, which must be performed under neutronic conditions representative of those expected to prevail inside the reactor cores when in operation. Due to limited availability of fast reactors, testing of future reactor materials will mostly take place in water cooled material test reactors (MTRs by tailoring the neutron spectrum via neutron screens. The latter rely on the utilization of materials capable of absorbing neutrons at specific energy. A large but fragmented experience is available on that topic. In this work a comprehensive compilation of the existing neutron screen technology is attempted, focusing on neutron screens developed in order to locally enhance the fast over thermal neutron flux ratio in a reactor core.

  15. The FORTRAN NALAP code adapted to a microcomputer compiler

    International Nuclear Information System (INIS)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso

    2010-01-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, to support the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly removing unnecessary routines, eliminating old statements, introducing new ones and also including extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  16. The FORTRAN NALAP code adapted to a microcomputer compiler

    Energy Technology Data Exchange (ETDEWEB)

    Lobo, Paulo David de Castro; Borges, Eduardo Madeira; Braz Filho, Francisco Antonio; Guimaraes, Lamartine Nogueira Frutuoso, E-mail: plobo.a@uol.com.b, E-mail: eduardo@ieav.cta.b, E-mail: fbraz@ieav.cta.b, E-mail: guimarae@ieav.cta.b [Instituto de Estudos Avancados (IEAv/CTA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    The Nuclear Energy Division of the Institute for Advanced Studies (IEAv) is conducting the TERRA project (TEcnologia de Reatores Rapidos Avancados), Technology for Advanced Fast Reactors project, aimed at a space reactor application. In this work, to support the TERRA project, the NALAP code adapted to a microcomputer compiler called Compaq Visual Fortran (Version 6.6) is presented. This code, adapted from the light water reactor transient code RELAP 3B, simulates thermal-hydraulic responses for sodium cooled fast reactors. The strategy to run the code on a PC was divided into several steps, mainly removing unnecessary routines, eliminating old statements, introducing new ones and also including extended precision mode. The source program was able to solve three sample cases under conditions of protected transients suggested in the literature: the normal reactor shutdown, with a delay of 200 ms to start the control rod movement and a delay of 500 ms to stop the pumps; reactor scram after a transient of loss of flow; and transients protected from overpower. Comparisons were made with results from the time when the NALAP code was acquired by the IEAv, back in the 80's. All the responses for these three simulations reproduced the calculations performed with the CDC compiler in 1985. Further modifications will include the usage of gas as coolant for the nuclear reactor to allow a Closed Brayton Cycle Loop - CBCL - to be used as a heat/electric converter. (author)

  17. Compilation of the FY 1999 Department of the Navy Working Capital Fund Financial Statements

    National Research Council Canada - National Science Library

    2000-01-01

    ...) Cleveland Center consistently and accurately compiled and consolidated financial data received from Navy field organizations and other sources to prepare the FY 1999 Navy Working Capital Fund financial statements...

  18. Compilations of measured and calculated physicochemical property values for PCBs, PBDEs, PCDDs and PAHs

    Data.gov (United States)

    U.S. Environmental Protection Agency — The dataset consists of compilations of measured and calculated physicochemical property values for PCBs, PBDEs, PCDDs and PAHs. The properties included in this...

  19. A Structural Weight Estimation Program (SWEEP) for Aircraft. Volume 2 - Program Integration and Data Management Module. Part 1: Program Integration

    Science.gov (United States)

    1974-06-01


  20. Regulatory and technical reports (Abstract Index Journal): Annual compilation for 1988

    International Nuclear Information System (INIS)

    1989-05-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  1. Regulatory and technical reports (abstract index journal): Annual compilation for 1986

    International Nuclear Information System (INIS)

    1987-03-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  2. Comparative analysis of the number of neurofilaments in rat sciatic nerve undergoing neuropraxia treated by low-level laser and therapeutic ultrasound

    International Nuclear Information System (INIS)

    Matamala, F; Cornejo, R; Paredes, M; Farfan, E; Garrido, O. S; Alves, N

    2014-01-01

    Therapy by low-level laser (LLL) or ultrasound (US) is commonly used as treatment after nerve crush. The aim of this study was to determine the effectiveness of such treatments in repairing the neuronal cytoskeleton by evaluating the variation in the number of neurofilaments. For this, an experimental design was performed involving 30 rats divided into 6 groups: 1 - healthy control; 2 - injured control; 3 - irradiated by LLL 2 J/cm²; 4 - irradiated by LLL 10 J/cm²; 5 - irradiated by US 0.5 W/cm²; and 6 - irradiated by US 1 W/cm². With the exception of group 1, all specimens were anesthetized and underwent right sciatic nerve compression using 40 N of pressure for 45 seconds. Twenty-four hours after compression, irradiation by LLL and US was started according to the protocol. We found that the increase in the number of neurofilaments was related to the applied dose of LLL and US. The average number of neurofilaments per 0.25 mm² obtained in each group was: 1 - 128; 2 - 100; 3 - 156; 4 - 140; 5 - 100; 6 - 148. We concluded that the application of LLL and therapeutic US increases the number of neurofilaments in rat sciatic nerve undergoing neuropraxia, with LLL being more effective than US. Furthermore, we concluded that the effectiveness of these therapies in inducing regeneration of the injured nerve is related to the type of protocol used, demonstrating the need to establish an adequate radiation dose in order to obtain the best therapeutic response and thus achieve successful treatment

  3. Low-level laser treatment accelerated hair regrowth in a rat model of chemotherapy-induced alopecia (CIA).

    Science.gov (United States)

    Wikramanayake, Tongyu Cao; Villasante, Alexandra C; Mauro, Lucia M; Nouri, Keyvan; Schachner, Lawrence A; Perez, Carmen I; Jimenez, Joaquin J

    2013-05-01

    Chemotherapy-induced alopecia (CIA) is one of the most distressing side effects of antineoplastic chemotherapy for which there is no effective interventional approach. A low-level laser (LLL) device, the HairMax LaserComb®, has been cleared by the FDA to treat androgenetic alopecia. Its effects may be extended to other settings; we have demonstrated that LaserComb treatment induced hair regrowth in a mouse model for alopecia areata. In the current study, we tested whether LLL treatment could promote hair regrowth in a rat model for CIA. Chemotherapy agents cyclophosphamide, etoposide, or a combination of cyclophosphamide and doxorubicin were administered in young rats to induce alopecia, with or without LLL treatment. As expected, 7-10 days later, all the rats developed full body alopecia. However, rats receiving laser treatment regrew hair 5 days earlier than rats receiving chemotherapy alone or sham laser treatment (with the laser turned off). The accelerated hair regrowth in laser-treated rats was confirmed by histology. In addition, LLL treatment did not provide local protection to subcutaneously injected Shay chloroleukemic cells. Taken together, our results demonstrated that LLL treatment significantly accelerated hair regrowth after CIA without compromising the efficacy of chemotherapy in our rat model. Our results suggest that LLL should be explored for the treatment of CIA in clinical trials because LLL devices for home use (such as the HairMax LaserComb®) provide a user-friendly and noninvasive approach that could be translated to increased patient compliance and improved efficacy.

  4. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user-friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
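
    The record concerns R and its foreign function interface; as a rough, language-agnostic analogue of passing a 64-bit vector to compiled code, the sketch below uses Python's ctypes. The shared library and the function sum_i64 are hypothetical stand-ins for illustration and are not part of spam or spam64.

        import ctypes
        import numpy as np

        # hypothetical compiled library exposing: int64_t sum_i64(const int64_t *x, int64_t n)
        lib = ctypes.CDLL("./libexample.so")
        lib.sum_i64.restype = ctypes.c_int64
        lib.sum_i64.argtypes = [ctypes.POINTER(ctypes.c_int64), ctypes.c_int64]

        x = np.arange(10, dtype=np.int64)   # 64-bit vector on the interpreted side
        ptr = x.ctypes.data_as(ctypes.POINTER(ctypes.c_int64))
        print(lib.sum_i64(ptr, x.size))     # 45, computed in compiled code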

  5. LISP software generative compilation within the frame of a SLIP system; La compilation generative de programmes LISP dans le cadre d'un systeme SLIP

    Energy Technology Data Exchange (ETDEWEB)

    Sitbon, Andre

    1968-04-24

    After having outlined the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on), and the interest of the use of the LISP structure and its associated language, the author notices that some problems remain regarding the memorisation of the computing process obtained by interpretation. Thus, he introduces a generative compiler which produces an executable programme, and which is written in a language very close to the used machine language, i.e. the FAP assembler language.

  6. Nuclear fuel cycle risk assessment: survey and computer compilation of risk-related literature. [Once-through Cycle and Plutonium Recycle

    Energy Technology Data Exchange (ETDEWEB)

    Yates, K.R.; Schreiber, A.M.; Rudolph, A.W.

    1982-10-01

    The US Nuclear Regulatory Commission has initiated the Fuel Cycle Risk Assessment Program to provide risk assessment methods for assistance in the regulatory process for nuclear fuel cycle facilities other than reactors. Both the once-through cycle and plutonium recycle are being considered. A previous report generated by this program defines and describes fuel cycle facilities, or elements, considered in the program. This report, the second from the program, describes the survey and computer compilation of fuel cycle risk-related literature. Sources of available information on the design, safety, and risk associated with the defined set of fuel cycle elements were searched and documents obtained were catalogued and characterized with respect to fuel cycle elements and specific risk/safety information. Both US and foreign surveys were conducted. Battelle's computer-based BASIS information management system was used to facilitate the establishment of the literature compilation. A complete listing of the literature compilation and several useful indexes are included. Future updates of the literature compilation will be published periodically. 760 annotated citations are included.

  7. Selecting informative food items for compiling food-frequency questionnaires: Comparison of procedures

    NARCIS (Netherlands)

    Molag, M.L.; Vries, J.H.M. de; Duif, N.; Ocké, M.C.; Dagnelie, P.C.; Goldbohm, R.A.; Veer, P. van 't

    2010-01-01

    The authors automated the selection of foods in a computer system that compiles and processes tailored FFQ. For the selection of food items, several methods are available. The aim of the present study was to compare food lists made by MOM2, which identifies food items with highest between-person

  8. Compilation of a soil map for Nigeria: a nation-wide soil resource ...

    African Journals Online (AJOL)

    This paper presents the results of a nation-wide soil and land form inventory of Nigeria. The data compilation was conducted in the framework of two projects with the objective to calculate agricultural production potential under different input levels and assess the water erosion hazard. The information on spatial distribution ...

  9. Energy and technology review

    Energy Technology Data Exchange (ETDEWEB)

    Selden, R.W.

    1977-05-01

    Topics covered include: geothermal energy development at LLL, energy conversion engineering, continuing education at LLL, and the Western states uranium resource survey. Separate abstracts were prepared for 3 sections. (MCG)

  10. Run-Time and Compiler Support for Programming in Adaptive Parallel Environments

    Directory of Open Access Journals (Sweden)

    Guy Edjlali

    1997-01-01

    Full Text Available For better utilization of computing resources, it is important to consider parallel programming environments in which the number of available processors varies at run-time. In this article, we discuss run-time support for data-parallel programming in such an adaptive environment. Executing programs in an adaptive environment requires redistributing data when the number of processors changes, and also requires determining new loop bounds and communication patterns for the new set of processors. We have developed a run-time library to provide this support. We discuss how the run-time library can be used by compilers of High Performance Fortran (HPF)-like languages to generate code for an adaptive environment. We present performance results for a Navier-Stokes solver and a multigrid template run on a network of workstations and an IBM SP-2. Our experiments show that if the number of processors is not varied frequently, the cost of data redistribution is not significant compared to the time required for the actual computation. Overall, our work establishes the feasibility of compiling HPF for a network of nondedicated workstations, which are likely to be an important resource for parallel programming in the future.
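
    A minimal sketch of the bookkeeping described above: when the number of available processors changes at run time, recompute each processor's portion of a distributed array. An HPF-style BLOCK distribution is assumed here; the record does not state the library's actual redistribution scheme.

        def block_bounds(n, nprocs):
            """Half-open index ranges (lo, hi), one per processor, for a BLOCK distribution."""
            base, extra = divmod(n, nprocs)
            bounds, lo = [], 0
            for p in range(nprocs):
                size = base + (1 if p < extra else 0)
                bounds.append((lo, lo + size))
                lo += size
            return bounds

        n = 1000
        print(block_bounds(n, 4))  # loop bounds while 4 processors are available
        print(block_bounds(n, 6))  # new loop bounds after 2 more processors join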

  11. Reporting session of UWTF operation. Compilation of documents

    International Nuclear Information System (INIS)

    Shimizu, Kaoru; Togashi, Akio; Irinouchi, Shigenori

    1999-07-01

    This is a compilation of the papers and OHP transparencies presented, as well as the discussions and comments, at the UWTF reporting session. UWTF stands for the Second Uranium Waste Treatment Facility, which was constructed for compression of metallic wastes and used filters, which are part of the uranium-bearing solid wastes generated from the Tokai Works, Japan Nuclear Cycle Development Institute. UWTF has been processing wastes since June 4, 1998. In the session, based on one year of experience of UWTF operation, the difficulties met and suggestions to the waste sources are mainly discussed. A brief summary of the UWTF construction, a description of the waste treatment process, and the operation report for fiscal year 1998 are attached. (A. Yamamoto)

  12. Manufacturing of neutral beam sources at Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Baird, E.D.; Duffy, T.J.; Harter, G.A.; Holland, E.D.; Kloos, W.A.; Pastrone, J.A.

    1979-01-01

    Over 50 neutral beam sources (NBS) of the joint Lawrence Berkeley Laboratory (LBL)/Lawrence Livermore Laboratory (LLL) design have been manufactured, since 1973, in the LLL Neutral Beam Source Facility. These sources have been used to provide start-up and sustaining neutral beams for LLL mirror fusion experiments, including 2XIIB, TMX, and Beta II. Experimental prototype 20-kV and 80-kV NBS have also been designed, built, and tested for the Mirror Fusion Test Facility (MFTF)

  13. The mathematical model of the task of compiling the time-table

    Directory of Open Access Journals (Sweden)

    О.Є. Литвиненко

    2004-01-01

    Full Text Available A mathematical model of the task of compiling the time-table in a high school has been developed. It has been shown that, after identical transformations, the task may be reduced to the canonical form of extremal combinatorial tasks with a nonlinear structure. An algorithm for solving the task that realizes a scheme of directed sorting of variants is indicated.

  14. A Debate over the Teaching of a Legacy Programming Language in an Information Technology (IT) Program

    Science.gov (United States)

    Ali, Azad; Smith, David

    2014-01-01

    This paper presents a debate between two faculty members regarding the teaching of the legacy programming course (COBOL) in a Computer Science (CS) program. Among the two faculty members, one calls for the continuation of teaching this language and the other calls for replacing it with another modern language. Although CS programs are notorious…

  15. Irradiation of strawberries. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1994-12-01

    The document contains a compilation of all available scientific and technical data on the irradiation of strawberries. It is intended to assist governments in considering the authorization of this particular application of radiation processing of food and in ensuring its control in the facility and the control of irradiated food products moving in trade. The compilation was prepared in response to the requirement of the Codex General Standard for Irradiated Foods and associated Code that radiation treatment of food be justified on the basis of a technological need or of a need to improve the hygienic quality of food. It was prepared also in response to the recommendations of the FAO/IAEA/WHO/ITC-UNCTAD/GATT International conference on the Acceptance, Control of and Trade in Irradiated Food (Geneva, 1989) concerning the need for regulatory control of radiation processing of food. Refs, 1 tab

  16. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p⁵ ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p⁵ ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data, namely the experimental and theoretical values available from the literature and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible in this way to obtain approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author) 35 refs

  17. Compilation of electron collision excitation cross sections for neutral argon

    International Nuclear Information System (INIS)

    Blanco Ramos, F.

    1993-01-01

    The present work presents a compilation and critical analysis of the available data on electron collision excitation cross sections for neutral Argon levels. This study includes: 1.- A detailed description in intermediate coupling of all the levels belonging to the 20 configurations 3p⁵ ns (n=4 to 12), np (n=4 to 8) and nd (n=3 to 8) of neutral Argon. 2.- Calculation of the electron collision excitation cross sections in the Born and Born-Oppenheimer-Ochkur approximations for all the levels in the 14 configurations 3p⁵ ns (n=4 to 7), np (n=4 to 7) and nd (n=3 to 8). 3.- Comparison and discussion of the compiled data, namely the experimental and theoretical values available from the literature and those from this work. 4.- Analysis of the regularities and systematic behaviors in order to determine which values can be considered more reliable. It is shown that the concept of a one-electron cross section proves quite useful for this purpose. In some cases it has been possible in this way to obtain approximate analytical expressions interpolating the experimental data. 5.- All the experimental and theoretical values studied are graphically presented and compared. 6.- The last part of the work includes a listing of several general-purpose programs for Atomic Physics calculations developed for this work. (Author)

  18. Review of monitoring instruments for transuranics in fuel fabrication and reprocessing plants. A progress report to the physical and technological programs, Division of Biomedical and Environmental Research, U.S. Energy Research and Development Administration

    International Nuclear Information System (INIS)

    Kordas, J.F.; Phelps, P.L.

    1977-01-01

    A comprehensive review of the monitoring instruments for transuranic elements released from nuclear fuel fabrication and reprocessing plants has been compiled. The extent of routine operational releases has been reviewed for the light water reactor (LWR) fuel cycle (including plutonium recycle), the breeder reactor fuel cycle, and the high-temperature gas cooled reactor (HTGR) fuel cycle. The stack monitoring instrumentation that is presently in use at the various fabrication and reprocessing plants around the country is examined. Sampling difficulties including the inlet-probe arrangement and the effectiveness of the entire sampling system are discussed, as are the measurement problems for alpha-emitting, long-lived, transuranic aerosols, ¹²⁹I, ¹⁰⁶Ru, and tritium oxide. The potential problems in the HTGR fuel cycle such as the measurement of releases of alpha-emitting aerosols and of gaseous releases of ²²⁰Rn and ¹⁴C are also considered. Monitoring requirements range from the detection of low-level, routine releases to high-level accidental releases. Both first and second kinds of detection errors are considered in a discussion of adequate detection limits. The presently deployed monitors are critically examined in this light and the drawbacks and limitations of each are noted. Prototype instrumentation is studied, including Argonne's mechanical separation technique, Battelle's mass separation by surface ionization method, and in particular, LLL's transuranic aerosol measurement system. The potentials, sensitivities, advantages, and limitations of each system are enumerated. The additional potential uses of the LLL system are also discussed

  19. The Concept of "Simultaneous Feedback": Towards a New Methodology for Compiling Dictionaries

    Directory of Open Access Journals (Sweden)

    Gilles-Maurice de Schryver

    2011-10-01

    Full Text Available

    Abstract: Good lexicographers are constantly striving to enhance the quality of their dictionaries. Since dictionaries are ultimately judged by their target users, there is an urgency to provide for the target users' needs. In order to determine such needs more accurately, it has become common practice to submit users of a dictionary to a series of tests to monitor their success in information retrieval. In most cases such feedback unfortunately comes too late so that it can at best be considered for implementation in the next or revised edition of the dictionary. In this article it is argued that feedback from the target users should be obtained while the compilation of the dictionary is still in progress, a process referred to as "simultaneous feedback". This concept, which offers a new methodology for compiling dictionaries, overcomes the major problem of creating and publishing entire dictionaries before feedback from target users can be obtained. By this new methodology, the release of several small-scale parallel dictionaries triggers feedback that is immediately channelled to the compilation process of a main dictionary. As such, the target users constantly guide the compilers during the entire compilation process. After a theoretical presentation of the new concept, the feasibility of simultaneous feedback is illustrated with reference to the creation of a bilingual Ciluba-Dutch learner's dictionary. It is shown how this main project has been successfully complemented by three parallel projects.

    Keywords: SIMULTANEOUS FEEDBACK, NEW METHODOLOGY, MAIN DICTIONARY, PARALLEL DICTIONARIES, TARGET USERS' DESIRES, QUESTIONNAIRES, ELECTRONIC CORPORA, WORD-FREQUENCY STUDIES, CONCORDANCES, AFRICAN LANGUAGES, CILUBÀ

    Summary: The concept of "simultaneous feedback": Towards a new methodology for the compilation of dictionaries. Good lexicographers constantly strive to improve the quality of their dictionaries

  20. Automatic compilation from high-level biologically-oriented programming language to genetic regulatory networks.

    Science.gov (United States)

    Beal, Jacob; Lu, Ting; Weiss, Ron

    2011-01-01

    The field of synthetic biology promises to revolutionize our ability to engineer biological systems, providing important benefits for a variety of applications. Recent advances in DNA synthesis and automated DNA assembly technologies suggest that it is now possible to construct synthetic systems of significant complexity. However, while a variety of novel genetic devices and small engineered gene networks have been successfully demonstrated, the regulatory complexity of synthetic systems that have been reported recently has somewhat plateaued due to a variety of factors, including the complexity of biology itself and the lag in our ability to design and optimize sophisticated biological circuitry. To address the gap between DNA synthesis and circuit design capabilities, we present a platform that enables synthetic biologists to express desired behavior using a convenient high-level biologically-oriented programming language, Proto. The high level specification is compiled, using a regulatory motif based mechanism, to a gene network, optimized, and then converted to a computational simulation for numerical verification. Through several example programs we illustrate the automated process of biological system design with our platform, and show that our compiler optimizations can yield significant reductions in the number of genes (~ 50%) and latency of the optimized engineered gene networks. Our platform provides a convenient and accessible tool for the automated design of sophisticated synthetic biological systems, bridging an important gap between DNA synthesis and circuit design capabilities. Our platform is user-friendly and features biologically relevant compiler optimizations, providing an important foundation for the development of sophisticated biological systems.

  1. Mentoring in Early Childhood Education: A Compilation of Thinking, Pedagogy and Practice

    Science.gov (United States)

    Murphy, Caterina, Ed.; Thornton, Kate, Ed.

    2015-01-01

    Mentoring is a fundamental and increasingly important part of professional learning and development for teachers in Aotearoa New Zealand. This book is a much-needed resource for mentors, leaders and teachers in early childhood education. It is the first of its kind: a wide ranging compilation that explores the thinking, pedagogy and practice of…

  2. Physical activity and lower limb lymphedema among uterine cancer survivors.

    Science.gov (United States)

    Brown, Justin C; John, Gabriella M; Segal, Saya; Chu, Christina S; Schmitz, Kathryn H

    2013-11-01

    Physical activity (PA) is known to provide physical and mental health benefits to uterine cancer survivors. However, it is unknown if PA is associated with lower limb lymphedema (LLL), an accumulation of protein-rich fluid in the lower limbs. Therefore, we sought to examine the association between PA and LLL in uterine cancer survivors, with a focus on walking. We conducted a cross-sectional study using mailed surveys among uterine cancer survivors who received care at a university-based cancer center. We asked about PA, walking, and LLL symptoms using validated self-report questionnaires. PA was calculated using MET-hours per week, and walking was calculated using blocks per day. The response rate to our survey was 43%. Among the 213 uterine cancer survivors in our survey, 36% were classified as having LLL. Compared with participants who reported trend = 0.003). Stratified analyses suggested the association between PA and LLL existed only among women with body mass index (BMI) trend = 0.007) compared with women with BMI ≥ 30 kg·m⁻² (P trend = 0.47). Compared with participants who reported trend trend = 0.007) and women with BMI ≥ 30 kg·m⁻² (P trend = 0.03). Participation in higher levels of PA or walking is associated with reduced proportions of LLL in dose-response fashion. These findings should be interpreted as preliminary and should be investigated in future studies.

  3. JANUS: A Compilation System for Balancing Parallelism and Performance in OpenVX

    Science.gov (United States)

    Omidian, Hossein; Lemieux, Guy G. F.

    2018-04-01

    Embedded systems typically do not have enough on-chip memory for an entire image buffer. Programming systems like OpenCV operate on entire image frames at each step, making them use excessive memory bandwidth and power. In contrast, the paradigm used by OpenVX is much more efficient; it uses image tiling, and the compilation system is allowed to analyze and optimize the operation sequence, specified as a compute graph, before doing any pixel processing. In this work, we are building a compilation system for OpenVX that can analyze and optimize the compute graph to take advantage of parallel resources in many-core systems or FPGAs. Using a database of prewritten OpenVX kernels, it automatically adjusts the image tile size as well as using kernel duplication and coalescing to meet a defined area (resource) target, or to meet a specified throughput target. This allows a single compute graph to target implementations with a wide range of performance needs or capabilities, e.g. from handheld to datacenter, that use minimal resources and power to reach the performance target.
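
    A toy model of the trade-off described above, with every cost and throughput number invented for illustration: duplicate a kernel until an assumed throughput target is met, stopping if an assumed area budget would be exceeded.

        def choose_duplication(base_area, base_throughput, area_budget, target_throughput):
            copies = 1
            while copies * base_throughput < target_throughput:
                if (copies + 1) * base_area > area_budget:
                    break  # the target cannot be met within the area budget
                copies += 1
            return copies, copies * base_area, copies * base_throughput

        print(choose_duplication(base_area=120, base_throughput=30,
                                 area_budget=600, target_throughput=100))
        # (4, 480, 120): four copies fit the budget and exceed the target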

  4. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs.

  5. A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills

    Directory of Open Access Journals (Sweden)

    Elena Berg

    2016-12-01

    Full Text Available A Cognitive Approach to the Compilation of Test Materials for the Evaluation of Translator's Skills. This paper discusses the importance of a cognitive approach to the evaluation of the translator's skills. The authors set forth their recommendations for the compilation of test materials for the evaluation of translators' cognitive ability. A cognitive approach to the compilation of texts for the evaluation of the translator's skills: the article addresses the importance of a cognitive approach to the evaluation of the translator's skills, and the authors present their recommendations for compiling test materials for the evaluation of the translator's cognitive abilities.

  6. Draft report on compilation of generic safety issues for light water reactor nuclear power plants

    International Nuclear Information System (INIS)

    1997-07-01

    A generally accepted approach to characterizing the safety concerns in nuclear power plants is to express them as safety issues which need to be resolved. When such safety issues are applicable to a generation of plants of a particular design or to a family of plants of similar design, they are termed generic safety issues. Examples of generic safety issues are those related to reactor vessel embrittlement, control rod insertion reliability or strainer clogging. The safety issues compiled in this document are based on broad international experience. This compilation is one element in the framework of IAEA activities to assist Member States in reassessing the safety of operating nuclear power plants. Refs

  7. Regulatory and technical reports (abstract index journal). Compilation for third quarter 1984, July-September. Volume 9, No. 3

    International Nuclear Information System (INIS)

    1984-11-01

    This compilation consists of bibliographic data and abstracts for the formal regulatory and technical reports issued by the US Nuclear Regulatory Commission (NRC) Staff and its contractors. The main citations and abstracts in this compilation are listed in NUREG number order: NUREG-XXXX, NUREG/CP-XXXX, and NUREG/CR-XXXX. These precede the following indexes: Contractor Report Number, Personal Author, Subject, NRC Originating Organization (Staff Reports), NRC Contract Sponsor (Contractor Reports), Contractor, and Licensed Facility

  8. A systematic monograph of the Recent Pentastomida, with a compilation of their hosts

    NARCIS (Netherlands)

    Christoffersen, M.L.; De Assis, J.E.

    2013-01-01

    We compile all published information on the Recent Pentastomida published to date, including complete synonyms, and species distributions. All host species are cited and their names updated. A taxonomical history of the group, a synthesis of phylogenetic information for the taxon, and a summary of

  9. De la problématique des articles synopsis dans la compilation des ...

    African Journals Online (AJOL)

    In other words, in Gabon attention must be focused on the use of synopsis articles in the compilation of dictionaries, with a view to reflecting in these works the linguistic and cultural diversity of this country. This is all the more true since the use of synopsis articles in no way depends on the typology of the ...

  10. VizieR Online Data Catalog: Compilation of stellar rotation data (Kovacs, 2018)

    Science.gov (United States)

    Kovacs, G.

    2018-03-01

    The three datasets included in table1-1.dat, table1-2.dat and table1-6.dat respectively, correspond to the type of stars listed in Table 1 in lines 1 [Praesepe], 2 [HJ_host] and 6 [Field(C)]. These data result from the compilation of rotational and other stellar data from the literature. (4 data files).

  11. DETERMINATION OF A MATHEMATICAL MODEL TO ESTIMATE THE AREA AND DRY WEIGHT OF THE LEAF LIMBO OF Prunus persica CV. Jarillo DETERMINACIÓN DE UN MODELO MATEMÁTICO PARA LA ESTIMACIÓN DEL ÁREA FOLIAR Y PESO SECO DEL LIMBO DE Prunus persica CV. Jarillo

    Directory of Open Access Journals (Sweden)

    Enrique Quevedo García

    2012-06-01

    Full Text Available Abstract. A study was conducted to determine the variables that estimate the leaf limbo area and the leaf limbo dry weight of peach, Prunus persica (L.) Batsch cv. Jarillo. Fifty leaves, aged 2.5 months, were selected and the following were measured: leaf limbo length and width, petiole length, leaf length, petiole diameter, leaf limbo fresh weight, petiole fresh weight, leaf fresh weight, leaf limbo dry weight, petiole dry weight, leaf dry weight, limbo length/width ratio, petiole length/limbo length ratio and leaf limbo area. The results allowed regression equations to be obtained for estimating the leaf area and the limbo dry weight. Using the linear models LA = b1 + b2(LLL x LLW) and LA = b1 + b2LLL + b3LLW, a leaf area equation was determined. Alternative models to calculate the limbo dry weight, LLDW = -b1 + b2LLFW and LLDW = -b1 + b2LLL + b3PL, were also evaluated. The best equations found, with an R² of 0.99, were LA = 1.572 + 0.65169(LLL x LLW), LA = -23.106 + 2.8064LLW + 3.6761LLL and LLDW = -0.002 + 0.401(LLFW). Summary. A study was conducted to determine the variables that would estimate the leaf limbo area and the limbo dry weight of peach, Prunus persica (L.) Batsch cv. Jarillo. Fifty leaves, 2.5 months old, were selected and the following were measured: limbo width, limbo length, petiole length, leaf length, petiole diameter, limbo fresh weight, petiole fresh weight, leaf fresh weight, limbo dry weight, petiole dry weight, leaf dry weight, limbo length/width ratio, petiole length/limbo length ratio, and leaf limbo area. The results obtained made it possible to derive regression equations for estimating the leaf limbo area and the limbo dry weight. An equation for determining the leaf limbo area was found with the linear models LA = b1 + b2(LLL x LLW) and LA = b1 + b2LLL + b3LLW. Alternative models for calculating the limbo dry weight, LLDW = -b1 + b2LLFW and LLDW = -b1 + b2LLL + b3PL, were also evaluated. The best equations
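
    The best-fit equations are quoted in the abstract; the sketch below simply evaluates them. Units are not stated in the record (centimetres for lengths and grams for weights are assumed here), and the input values are made up for illustration.

        def leaf_area_product(lll, llw):
            """LA = 1.572 + 0.65169 * (LLL x LLW)"""
            return 1.572 + 0.65169 * (lll * llw)

        def leaf_area_linear(lll, llw):
            """LA = -23.106 + 2.8064*LLW + 3.6761*LLL"""
            return -23.106 + 2.8064 * llw + 3.6761 * lll

        def limbo_dry_weight(llfw):
            """LLDW = -0.002 + 0.401*LLFW"""
            return -0.002 + 0.401 * llfw

        lll, llw, llfw = 12.0, 3.5, 1.2      # example leaf measurements (assumed)
        print(leaf_area_product(lll, llw))   # ~28.9
        print(leaf_area_linear(lll, llw))    # ~30.8
        print(limbo_dry_weight(llfw))        # ~0.48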

  12. SERKON program for compiling a multigroup library to be used in BETTY calculation

    International Nuclear Information System (INIS)

    Nguyen Phuoc Lan.

    1982-11-01

    A SERKON-type program was written to compile data sets generated by FEDGROUP-3 into a multigroup library for BETTY calculation. A multigroup library was generated from the ENDF/B-IV data file and tested against the TRX-1 and TRX-2 lattices with good results. (author)

  13. Compilations and evaluations of data on the interaction of electromagnetic radiation with matter

    International Nuclear Information System (INIS)

    Lorenz, A.

    1978-05-01

    The material contained in this report deals with data on the interaction of electromagnetic radiation with matter, listing major compilations of X-ray, photon and gamma-ray cross sections and attenuation coefficients, as well as selected reports featuring data on Compton scattering, photoelectric absorption and pair production

  14. Microfluidic very large scale integration (VLSI) modeling, simulation, testing, compilation and physical synthesis

    CERN Document Server

    Pop, Paul; Madsen, Jan

    2016-01-01

    This book presents state-of-the-art techniques for the modeling, simulation, testing, compilation and physical synthesis of mVLSI biochips. The authors describe a top-down modeling and synthesis methodology for mVLSI biochips, inspired by microelectronics VLSI methodologies. They introduce a modeling framework for the components and the biochip architecture, and a high-level microfluidic protocol language. Coverage includes a topology graph-based model for the biochip architecture, and a sequencing graph model for the biochemical application, showing how the application model can be obtained from the protocol language. The techniques described facilitate programmability and automation, enabling developers in the emerging, large biochip market. · Presents the current models used for research on compilation and synthesis techniques of mVLSI biochips in a tutorial fashion; · Includes a set of "benchmarks" that are presented in great detail and include the source code of several of the techniques p...

  15. Methodological challenges involved in compiling the Nahua pharmacopeia.

    Science.gov (United States)

    De Vos, Paula

    2017-06-01

    Recent work in the history of science has questioned the Eurocentric nature of the field and sought to include a more global approach that would serve to displace center-periphery models in favor of approaches that take seriously local knowledge production. Historians of Iberian colonial science have taken up this approach, which involves reliance on indigenous knowledge traditions of the Americas. These traditions present a number of challenges to modern researchers, including availability and reliability of source material, issues of translation and identification, and lack of systematization. This essay explores the challenges that emerged in the author's attempt to compile a pre-contact Nahua pharmacopeia, the reasons for these challenges, and the ways they may - or may not - be overcome.

  16. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations.

    Energy Technology Data Exchange (ETDEWEB)

    Kalinina, Elena Arkadievna [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samsa, Michael [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-11-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to "ensure it has heard from as many points of view as possible." The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs

  17. A Literature Review and Compilation of Nuclear Waste Management System Attributes for Use in Multi-Objective System Evaluations

    International Nuclear Information System (INIS)

    Kalinina, Elena Arkadievna; Samsa, Michael

    2015-01-01

    The purpose of this work was to compile a comprehensive initial set of potential nuclear waste management system attributes. This initial set of attributes is intended to serve as a starting point for additional consideration by system analysts and planners to facilitate the development of a waste management system multi-objective evaluation framework based on the principles and methodology of multi-attribute utility analysis. The compilation is primarily based on a review of reports issued by the Canadian Nuclear Waste Management Organization (NWMO) and the Blue Ribbon Commission on America's Nuclear Future (BRC), but also an extensive review of the available literature for similar and past efforts as well. Numerous system attributes found in different sources were combined into a single objectives-oriented hierarchical structure. This study provides a discussion of the data sources and the descriptions of the hierarchical structure. A particular focus of this study was on collecting and compiling inputs from past studies that involved the participation of various external stakeholders. However, while the important role of stakeholder input in a country's waste management decision process is recognized in the referenced sources, there are only a limited number of in-depth studies of the stakeholders' differing perspectives. Compiling a comprehensive hierarchical listing of attributes is a complex task since stakeholders have multiple and often conflicting interests. The BRC worked for two years (January 2010 to January 2012) to 'ensure it has heard from as many points of view as possible.' The Canadian NWMO study took four years and ample resources, involving national and regional stakeholders' dialogs, internet-based dialogs, information and discussion sessions, open houses, workshops, round tables, public attitude research, website, and topic reports. The current compilation effort benefited from the distillation of these many varied inputs conducted by the

  18. Pin count-aware biochemical application compilation for mVLSI biochips

    DEFF Research Database (Denmark)

    Lander Raagaard, Michael; Pop, Paul

    2015-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers and are able to integrate the necessary functions for biochemical analysis on-chip. In this paper we are interested in flow-based biochips, in which the fluidic flow is manipulated using integrated microvalves, which are cont...... a biochemical application. We focus on the compilation task, where the strategy is to delay operations, without missing their deadlines, such that the sharing of control signals is maximized. The evaluation shows a significant reduction in the number of control pins required....
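
    A toy sketch of the scheduling idea stated above: operations are delayed within their deadlines so that operations driven by the same control pattern coincide and can share an activation. The data model and the greedy rule are invented for illustration and are not the paper's algorithm.

        # each operation: (control pattern, earliest step, deadline step)
        ops = [("valve_A", 0, 2), ("valve_A", 1, 3), ("valve_B", 0, 1), ("valve_A", 2, 2)]

        schedule = {}  # time step -> {pattern: [operation indices]}
        # process in deadline order; join an existing activation of the same pattern
        # inside the operation's window, otherwise delay the operation to its deadline
        for i in sorted(range(len(ops)), key=lambda k: ops[k][2]):
            pattern, earliest, deadline = ops[i]
            reuse = [t for t in range(earliest, deadline + 1) if pattern in schedule.get(t, {})]
            t = reuse[0] if reuse else deadline
            schedule.setdefault(t, {}).setdefault(pattern, []).append(i)

        activations = sum(len(p) for p in schedule.values())
        print(schedule)     # {1: {'valve_B': [2]}, 2: {'valve_A': [0, 3, 1]}}
        print(activations)  # 2 shared activations instead of 4 separate ones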

  19. Path to Stochastic Stability: Comparative Analysis of Stochastic Learning Dynamics in Games

    KAUST Repository

    Jaleel, Hassan; Shamma, Jeff S.

    2018-01-01

    dynamics: Log-Linear Learning (LLL) and Metropolis Learning (ML). Although both of these dynamics have the same stochastically stable states, LLL and ML correspond to different behavioral models for decision making. Moreover, we demonstrate through

  20. Code of ethics for the national pharmaceutical system: Codifying and compilation.

    Science.gov (United States)

    Salari, Pooneh; Namazi, Hamidreza; Abdollahi, Mohammad; Khansari, Fatemeh; Nikfar, Shekoufeh; Larijani, Bagher; Araminia, Behin

    2013-05-01

    Pharmacists, as health-care providers, face ethical issues in terms of pharmaceutical care, relationships with patients and cooperation with the health-care team. Beyond pharmacy itself, pharmaceutical companies in various fields of manufacturing, importing or distributing have their own ethical issues. Therefore, pharmacy practice is vulnerable to ethical challenges and needs special codes of conduct. In response to this need, and based on a shared project between ethics experts from relevant research centers, the needs were fully identified and a specific code of conduct was written for each area. The code of conduct was subject to comments from all experts involved in the pharmaceutical sector and was critically reviewed in several meetings. The prepared code of conduct comprises a professional code of ethics for pharmacists, an ethics guideline for pharmaceutical manufacturers, an ethics guideline for pharmaceutical importers, an ethics guideline for pharmaceutical distributors, and an ethics guideline for policy makers. The document was compiled based on the principles of bioethics and professionalism. Compiling the code of ethics for the national pharmaceutical system is the first step in implementing ethics in pharmacy practice, and further efforts to teach professionalism and the ethical code, as a necessary complement, are highly recommended.

  1. IAEA's experience in compiling a generic component reliability data base

    International Nuclear Information System (INIS)

    Tomic, B.; Lederman, L.

    1988-01-01

    Reliability data are an essential part of probabilistic safety assessment, and the quality of the data can determine the quality of the study as a whole. Component failure data originating from the plant being analyzed would obviously be most appropriate, but complete reliance on plant experience is rarely possible, mainly because operating experience is rather limited. Nuclear plants, although of different design, often use fairly similar components, so some experience can be combined and transferred from one plant to another. In addition, information about component failures is also available from experts with knowledge of component design, manufacturing and operation. This brings us to the importance of assessing generic data (here, 'generic' means everything that is not specific to the plant being analyzed). The generic data available in the open literature can be divided into three broad categories. The first includes data bases used in previous analyses; these can be plant specific, or generic data updated with plant-specific information (the latter case deserves special attention). The second is based on compilations of plants' operating experience, usually drawn from some kind of event reporting system. The third category includes data sources based on expert opinion (single or aggregated), or on a combination of expert opinion and other nuclear and non-nuclear experience. This paper reflects insights gained in compiling data from generic data sources and highlights advantages and pitfalls of using generic component reliability data in PSAs

  2. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    Energy Technology Data Exchange (ETDEWEB)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors.

  3. Regulatory and technical reports (abstract index journal): Annual compilation for 1996, Volume 21, No. 4

    International Nuclear Information System (INIS)

    Sheehan, M.A.

    1997-04-01

    This compilation is the annual cumulation of bibliographic data and abstracts for the formal regulatory and technical reports issued by the U.S. Nuclear Regulatory Commission (NRC) Staff and its contractors

  4. Compiling for Novel Scratch Pad Memory based Multicore Architectures for Extreme Scale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, Aviral

    2016-02-05

    The objective of this proposal is to develop tools and techniques (in the compiler) to manage data of a task and communication among tasks on the scratch pad memory (SPM) of the core, so that any application (a set of tasks) can be executed efficiently on an SPM based manycore architecture.
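
    In code, the essence of such SPM management is explicit staging: copy a tile of a task's data into the small on-chip buffer, compute on it there, and copy the results back, rather than relying on a hardware cache. The C++ sketch below is a conceptual illustration under assumed names and sizes, not output of the proposed compiler; a real SPM would be a fixed on-chip address range filled by DMA transfers.

        #include <algorithm>
        #include <cstddef>
        #include <cstring>
        #include <vector>

        // Illustrative scratch-pad capacity (in floats); chosen for the example.
        constexpr std::size_t kSpmWords = 256;

        // Process a large array in SPM-sized tiles: copy in, compute, copy out.
        void scaleOnSpm(std::vector<float>& data, float factor) {
            float spm[kSpmWords];  // stand-in for the core's scratch pad memory
            for (std::size_t base = 0; base < data.size(); base += kSpmWords) {
                std::size_t n = std::min(kSpmWords, data.size() - base);
                std::memcpy(spm, data.data() + base, n * sizeof(float));   // "DMA in"
                for (std::size_t i = 0; i < n; ++i) spm[i] *= factor;      // compute on SPM
                std::memcpy(data.data() + base, spm, n * sizeof(float));   // "DMA out"
            }
        }

        int main() {
            std::vector<float> v(1000, 2.0f);
            scaleOnSpm(v, 1.5f);
            return v.back() == 3.0f ? 0 : 1;
        }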

  5. The Research and Compilation of City Maps in the National Geomatics Atlas of the PEOPLE'S Republic of China

    Science.gov (United States)

    Wang, G.; Wang, D.; Zhou, W.; Chen, M.; Zhao, T.

    2018-04-01

    The research and compilation of the new-century edition of the National Huge Atlas of the People's Republic of China is a special basic-work project of the Ministry of Science and Technology of the People's Republic of China, and the research and compilation of the National Geomatics Atlas of the People's Republic of China is its main component. The National Geomatics Atlas of China consists of four map groups and a place-name index. The four map groups are the nationwide thematic map group, the provincial fundamental geographical map group, the land-cover map group and the city map group. The city map group is an important component of the National Geomatics Atlas of China and mainly shows the process of urbanization in China. This paper, focusing on the design and compilation of 39 city-wide maps, briefly introduces mapping-area research and scale design, the mapping technical route, content selection and cartographic generalization, symbol design and visualization of the maps, etc.

  6. Rubus: A compiler for seamless and extensible parallelism

    Science.gov (United States)

    Adnan, Muhammad; Aslam, Faisal; Sarwar, Syed Mansoor

    2017-01-01

    Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, designed for machines with single-core CPUs, cannot efficiently exploit the parallelism available on multi-core processors. To exploit the processing power of multi-core processors, researchers are therefore working on new tools and techniques that facilitate parallel programming. To this end, languages such as CUDA and OpenCL have been introduced for writing parallel code. Their main shortcoming is that the programmer must specify all of the complex details manually in order to parallelize the code across multiple cores, so code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources; the amount of parallelism achieved is therefore proportional to the skills of the programmer and the time spent on code optimization. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details: it analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedups and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores. Whereas, for a matrix multiplication benchmark the average execution speedup of 84 times has been
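
    The kind of transformation such an auto-parallelizing compiler performs can be shown by hand. The fragment below is only an illustration in C++ with an OpenMP pragma (Rubus itself operates on Java bytecode and targets GPUs): because each iteration writes a distinct element and reads only shared inputs, the loop has no cross-iteration dependences and can be distributed across cores, which is precisely the analysis the compiler automates.

        #include <cstdio>
        #include <vector>

        int main() {
            const int n = 1 << 20;
            std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

            // Independent iterations: safe to parallelize.
            #pragma omp parallel for
            for (int i = 0; i < n; ++i)
                c[i] = 2.0f * a[i] + b[i];

            std::printf("c[0] = %.1f\n", c[0]);
            return 0;
        }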

  7. Rubus: A compiler for seamless and extensible parallelism.

    Directory of Open Access Journals (Sweden)

    Muhammad Adnan

    Full Text Available Nowadays, a typical processor may have multiple processing cores on a single chip. Furthermore, a special-purpose processing unit called the Graphics Processing Unit (GPU), originally designed for 2D/3D games, is now available for general-purpose use in computers and mobile devices. However, traditional programming languages, designed for machines with single-core CPUs, cannot efficiently exploit the parallelism available on multi-core processors. To exploit the processing power of multi-core processors, researchers are therefore working on new tools and techniques that facilitate parallel programming. To this end, languages such as CUDA and OpenCL have been introduced for writing parallel code. Their main shortcoming is that the programmer must specify all of the complex details manually in order to parallelize the code across multiple cores, so code written in these languages is difficult to understand, debug and maintain. Furthermore, parallelizing legacy code can require rewriting a significant portion of it in CUDA or OpenCL, which can consume significant time and resources; the amount of parallelism achieved is therefore proportional to the skills of the programmer and the time spent on code optimization. This paper proposes a new open-source compiler, Rubus, to achieve seamless parallelism. The Rubus compiler relieves the programmer from manually specifying the low-level details: it analyses and transforms a sequential program into a parallel program automatically, without any user intervention. This achieves massive speedups and better utilization of the underlying hardware without requiring a programmer's expertise in parallel programming. For five different benchmarks, an average speedup of 34.54 times has been achieved by Rubus as compared to Java on a basic GPU having only 96 cores. Whereas, for a matrix multiplication benchmark the average execution speedup of 84

  8. The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States

    Science.gov (United States)

    Horton, John D.; San Juan, Carma A.; Stoeser, Douglas B.

    2017-06-30

    The State Geologic Map Compilation (SGMC) geodatabase of the conterminous United States (https://doi.org/10.5066/F7WH2N65) represents a seamless, spatial database of 48 State geologic maps that range from 1:50,000 to 1:1,000,000 scale. A national digital geologic map database is essential in interpreting other datasets that support numerous types of national-scale studies and assessments, such as those that provide geochemistry, remote sensing, or geophysical data. The SGMC is a compilation of the individual U.S. Geological Survey releases of the Preliminary Integrated Geologic Map Databases for the United States. The SGMC geodatabase also contains updated data for seven States and seven entirely new State geologic maps that have been added since the preliminary databases were published. Numerous errors have been corrected and enhancements added to the preliminary datasets using thorough quality assurance/quality control procedures. The SGMC is not a truly integrated geologic map database because geologic units have not been reconciled across State boundaries. However, the geologic data contained in each State geologic map have been standardized to allow spatial analyses of lithology, age, and stratigraphy at a national scale.

  9. Compilation of contract research for the Materials Engineering Branch, Division of Engineering: Annual report for FY 1987

    International Nuclear Information System (INIS)

    1988-06-01

    This compilation of annual reports by contractors to the Materials Engineering Branch of the NRC Office of Research concentrates on achievements in safety research for the primary system of commercial light water power reactors, particularly with regard to reactor vessels, primary system piping, steam generators, nondestructive examination of primary components, and in safety research for decommissioning and decontamination, on-site storage, and engineered safety features. This report, covering research conducted during Fiscal Year 1987, is the sixth volume of the series of NUREG-0975, ''Compilation of Contractor Research for the Materials Engineering Branch, Division of Engineering.''

  10. WHO GLOBAL TUBERCULOSIS REPORTS: COMPILATION AND INTERPRETATION

    Directory of Open Access Journals (Sweden)

    I. A. Vаsilyevа

    2017-01-01

    Full Text Available The purpose of this article is to inform national specialists involved in tuberculosis control about the methods used to compile WHO global tuberculosis statistics, which are used when developing strategies and programmes for tuberculosis control and evaluating their efficiency. The article explains in detail the main WHO epidemiological rates used in international publications on tuberculosis, along with data on their registered values, and the new approaches to compiling the lists of countries with the highest burden of tuberculosis, drug-resistant tuberculosis, and tuberculosis with concurrent HIV infection. The article compares the rates in the Russian Federation with global data as well as with data from countries within the WHO European Region and countries with the highest TB burden. It presents material on the achievement of the global goals in tuberculosis control and the main provisions of the WHO End TB Strategy for 2015-2035, adopted as part of the UN Sustainable Development Goals.

  11. Parallelizing Compiler Framework and API for Power Reduction and Software Productivity of Real-Time Heterogeneous Multicores

    Science.gov (United States)

    Hayashi, Akihiro; Wada, Yasutaka; Watanabe, Takeshi; Sekiguchi, Takeshi; Mase, Masayoshi; Shirako, Jun; Kimura, Keiji; Kasahara, Hironori

    Heterogeneous multicores have been attracting much attention as a way to attain high performance while keeping power consumption low in a wide range of areas. However, heterogeneous multicores impose very difficult programming on developers, and the long application development period lowers product competitiveness. In order to overcome this situation, this paper proposes a compilation framework that bridges the gap between programmers and heterogeneous multicores. In particular, this paper describes a compilation framework based on the OSCAR compiler. It realizes coarse-grain task parallel processing, data transfer using a DMA controller, and power reduction control from user programs with DVFS and clock gating on various heterogeneous multicores from different vendors. This paper also evaluates the processing performance and power reduction achieved by the proposed framework on a newly developed 15-core heterogeneous multicore chip named RP-X, integrating 8 general-purpose processor cores and 3 types of accelerator cores, which was developed by Renesas Electronics, Hitachi, Tokyo Institute of Technology and Waseda University. The framework attains speedups of up to 32x for an optical flow program with eight general-purpose processor cores and four DRP (Dynamically Reconfigurable Processor) accelerator cores against sequential execution by a single processor core, and an 80% power reduction for real-time AAC encoding.

  12. Some measurements of Java-to-bytecode compiler performance in the Java Virtual Machine

    OpenAIRE

    Daly, Charles; Horgan, Jane; Power, James; Waldron, John

    2001-01-01

    In this paper we present a platform independent analysis of the dynamic profiles of Java programs when executing on the Java Virtual Machine. The Java programs selected are taken from the Java Grande Forum benchmark suite, and five different Java-to-bytecode compilers are analysed. The results presented describe the dynamic instruction usage frequencies.

  13. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    International Nuclear Information System (INIS)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs
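
    Although the report's own fitting procedure is not reproduced here, a common way to recompute an LD50 from dose-mortality data (one standard approach, not necessarily the one used in this compilation) is to fit a probit model, with mortality probability \Phi(\alpha + \beta \log_{10} D), and read off the dose at which that probability equals one half:

        \Phi\bigl(\alpha + \beta \log_{10} \mathrm{LD}_{50}\bigr) = \tfrac{1}{2}
        \quad\Longrightarrow\quad
        \alpha + \beta \log_{10} \mathrm{LD}_{50} = 0
        \quad\Longrightarrow\quad
        \mathrm{LD}_{50} = 10^{-\alpha/\beta}.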

  14. A ‘Social Form Of Knowledge’ in Practice: Unofficial Compiling of 1960s Pop Music on CD-R

    Directory of Open Access Journals (Sweden)

    Paul Martin

    2012-01-01

    Full Text Available In this article I explore the ‘unofficial’ (and technically illegal) compiling of marginally known 1960s pop records on Compact Disc Recordable (CD-R). I do so by situating it within the proposition by the late Raphael Samuel that history is ‘social knowledge’ and a practice rather than a profession, and I propose that this compiling activity exemplifies that proposition. The core of the paper is centred on a 2007 survey which I conducted via three online 1960s music enthusiast discussion forums. I draw on the sixteen responses to demonstrate how the motivations, values and intentions of those respondents engaging in the practice of CD-R compiling are historically and socially centred. In doing so, I seek to problematise the music industry’s undifferentiated condemnation of all copying as theft, by showing how, far from stealing, these CD-R compilers are adding to the musical social knowledge of 1960s pop and rock music. I further situate them within a longer lineage of ‘unofficial listening’ dating back to at least the 1930s. In using the term ‘unofficial’ in both a legal and a public-historical sense (e.g. to take issue with a received narrative), I point to wider definitions of what historically has or has not been musically ‘official’ to listen to. I also seek to point to the practice of CD-R compiling as a historical ‘moment’ in technological change, which might otherwise go unremarked upon as the CD-R itself heads towards utilitarian obsolescence. Although the issues and concepts raised in the paper can be little more than pointed to, it is hoped it might act as a platform for historical engagement with a subject more commonly discussed in sociological terms. As public historians we should be reflexive and interdisciplinary, and it is with this mindset that this article is written.

  15. LISP software generative compilation within the frame of a SLIP system

    International Nuclear Information System (INIS)

    Sitbon, Andre

    1968-01-01

    After having outlined the limitations associated with the use of some programming languages (Fortran, Algol, assembler, and so on) and the advantages of the LISP structure and its associated language, the author notes that some problems remain regarding the memorisation of the computing process obtained by interpretation. He therefore introduces a generative compiler which produces an executable programme and which is written in a language very close to the machine language used, i.e. the FAP assembler language

  16. GfW-handbook for data compilation of irradiation tested electronic components

    International Nuclear Information System (INIS)

    Wulf, F.; Braeunig, D.; Gaebler, W.

    1981-06-01

    The present second edition of the Data Compilation of Irradiation Tested Electronic Components continues the first edition and is published as a loose-leaf handbook. In addition to the 190 reports provided in the first edition, the present handbook contains a further 44 test reports on currently used semiconductor devices, in a comprehensive but easy-to-handle graphical and tabular presentation. Statistical values are given in order to facilitate evaluation of parts' lifetimes in a radiation environment. (orig.) [de

  17. Rad Chem data acquisition chassis users manual

    International Nuclear Information System (INIS)

    Jones, B.A.

    1980-01-01

    The Shiva laser at LLL requires many forms of diagnostics to measure and analyze fusion experiments. This manual describes the operation of a microprocessor-controlled data acquisition system designed at LLL to measure neutron activation during fusion experiments on the Shiva laser

  18. Lifelong Learning as a goal - Do autonomy and self-regulation in school result in well prepared pupils?

    NARCIS (Netherlands)

    Lüftenegger, M.; Schober, B.; Van de Schoot, R.; Wagner, P.; Finsterwald, M.; Spiel, C.

    2012-01-01

    Fostering lifelong learning (LLL) is a topic of high relevance for current educational policy. School lays the cornerstone for the key components of LLL, specifically persistent motivation to learn and self-regulated learning behavior. The present study investigated the impact of classroom

  19. Preventing Run-Time Bugs at Compile-Time Using Advanced C++

    Energy Technology Data Exchange (ETDEWEB)

    Neswold, Richard [Fermilab

    2018-01-01

    When writing software, we develop algorithms that tell the computer what to do at run-time. Our solutions are easier to understand and debug when they are properly modeled using class hierarchies, enumerations, and a well-factored API. Unfortunately, even with these design tools, we end up having to debug our programs at run-time. Worse still, debugging an embedded system changes its dynamics, making it tough to find and fix concurrency issues. This paper describes techniques using C++ to detect run-time bugs *at compile time*. A concurrency library, developed at Fermilab, is used for examples in illustrating these techniques.
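
    The flavor of these techniques can be sketched with standard C++ features; the following is a generic illustration (the names and the DAC example are invented for this sketch, not code from the Fermilab library): strong types and static_assert turn unit mix-ups and bad configuration constants into compile errors instead of run-time surprises.

        #include <cstdio>

        // A strong type for a physical quantity: it cannot be confused with a
        // plain int or with another unit without a compile error.
        struct Millivolts {
            int value;
            explicit constexpr Millivolts(int v) : value(v) {}
        };

        struct Milliamps {
            int value;
            explicit constexpr Milliamps(int v) : value(v) {}
        };

        // The setpoint API accepts only Millivolts; passing Milliamps or a bare
        // int is rejected by the compiler rather than misbehaving at run time.
        void setDacOutput(Millivolts mv) { std::printf("DAC <- %d mV\n", mv.value); }

        // Compile-time sanity checks on configuration constants.
        constexpr int kChannels = 8;
        static_assert(kChannels > 0 && (kChannels & (kChannels - 1)) == 0,
                      "channel count must be a positive power of two");

        int main() {
            setDacOutput(Millivolts{1500});   // OK
            // setDacOutput(1500);            // compile error: no implicit conversion
            // setDacOutput(Milliamps{20});   // compile error: wrong unit
            return 0;
        }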

  20. Principal facts for gravity data collected in the southern Albuquerque Basin area and a regional compilation, central New Mexico

    Science.gov (United States)

    Gillespie, Cindy L.; Grauch, V.J.S.; Oshetski, Kim; Keller, Gordon R.

    2000-01-01

    Principal facts for 156 new gravity stations in the southern Albuquerque basin are presented. These data fill a gap in existing data coverage. The compilation of the new data and two existing data sets into a regional data set of 5562 stations that cover the Albuquerque basin and vicinity is also described. Bouguer anomaly and isostatic residual gravity data for this regional compilation are available in digital form from ftp://greenwood.cr.usgs.gov/pub/openfile-reports/ofr-00-490.

  1. Compilation of neutron flux density spectra and reaction rates in different neutron fields. V.3

    International Nuclear Information System (INIS)

    Ertek, C.

    1980-04-01

    Upon the recommendation of the International Working Group of Reactor Radiation Measurements (IWGRRM), a compilation of documents containing neutron flux density spectra and the reaction rates obtained by activation and fission foils in different neutron fields is presented

  2. Encounters of aircraft with volcanic ash clouds; A compilation of known incidents, 1953-2009

    Science.gov (United States)

    Guffanti, Marianne; Casadevall, Thomas J.; Budding, Karin

    2010-01-01

    Information about reported encounters of aircraft with volcanic ash clouds from 1953 through 2009 has been compiled to document the nature and scope of risks to aviation from volcanic activity. The information, gleaned from a variety of published and other sources, is presented in database and spreadsheet formats; the compilation will be updated as additional encounters occur and as new data and corrections come to light. The effects observed by flight crews and extent of aircraft damage vary greatly among incidents, and each incident in the compilation is rated according to a severity index. Of the 129 reported incidents, 94 incidents are confirmed ash encounters, with 79 of those having various degrees of airframe or engine damage; 20 are low-severity events that involve suspected ash or gas clouds; and 15 have data that are insufficient to assess severity. Twenty-six of the damaging encounters involved significant to very severe damage to engines and (or) airframes, including nine encounters with engine shutdown during flight. The average annual rate of damaging encounters since 1976, when reporting picked up, has been approximately 2 per year. Most of the damaging encounters occurred within 24 hours of the onset of ash production or at distances less than 1,000 kilometers from the source volcanoes. The compilation covers only events of relatively short duration for which aircraft were checked for damage soon thereafter; documenting instances of long-term repeated exposure to ash (or sulfate aerosols) will require further investigation. Of 38 source volcanoes, 8 have caused 5 or more encounters, of which the majority were damaging: Augustine (United States), Chaiten (Chile), Mount St. Helens (United States), Pacaya (Guatemala), Pinatubo (Philippines), Redoubt (United States), Sakura-jima (Japan), and Soufriere Hills (Montserrat, Lesser Antilles, United Kingdom). Aircraft have been damaged by eruptions ranging from small, recurring episodes to very large

  3. Octopus: LLL's computing utility

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  4. Investigating Pre-Service Early Childhood Teachers' Views and Intentions about Integrating and Using Computers in Early Childhood Settings: Compilation of an Instrument

    Science.gov (United States)

    Nikolopoulou, Kleopatra; Gialamas, Vasilis

    2009-01-01

    This paper discusses the compilation of an instrument in order to investigate pre-service early childhood teachers' views and intentions about integrating and using computers in early childhood settings. For the purpose of this study a questionnaire was compiled and administered to 258 pre-service early childhood teachers (PECTs), in Greece. A…

  5. Certification of School Librarians: A Compilation of State Requirements, 1958. Bulletin, 1958, No. 12

    Science.gov (United States)

    Mahar, Mary Helen

    1958-01-01

    This bulletin on requirements by States and Territories for the certification of school librarians was prepared to provide a compilation of current certification regulations for school librarians and a summary of practices in formulating these regulations. The requirements for each State were obtained from either the State school library…

  6. Uddannelseshistorie 2009

    DEFF Research Database (Denmark)

    Anthology on lifelong learning, with contributions on recent Nordic developments within LLL and on the tradition from Comenius to the present day.

  7. Accelerator safety program at the Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Graham, C.L.

    1976-01-01

    A proposed accelerator safety standard for the Lawrence Livermore Laboratory (LLL) is given. All accelerators will comply with this standard when it is included in the LLL Health and Safety Manual. The radiation alarm and radiation safety system for a radiography facility are also described

  8. Compilation of PRF Canyon Floor Pan Sample Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Pool, Karl N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Minette, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wahl, Jon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Greenwood, Lawrence R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coffey, Deborah S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McNamara, Bruce K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Bryan, Samuel A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Scheele, Randall D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Delegard, Calvin H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sinkov, Sergey I. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Soderquist, Chuck Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fiskum, Sandra K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Brown, Garrett N. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clark, Richard A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    On September 28, 2015, debris collected from the PRF (236-Z) canyon floor, Pan J, was observed to exhibit chemical reaction. The material had been transferred from the floor pan to a collection tray inside the canyon the previous Friday. Work in the canyon was stopped to allow Industrial Hygiene to perform monitoring of the material reaction. Canyon floor debris that had been sealed out was sequestered at the facility, a recovery plan was developed, and drum inspections were initiated to verify no additional reactions had occurred. On October 13, in-process drums containing other Pan J material were inspected and showed some indication of chemical reaction, limited to discoloration and degradation of inner plastic bags. All Pan J material was sealed back into the canyon and returned to collection trays. Based on the high airborne levels in the canyon during physical debris removal, ETGS (Encapsulation Technology Glycerin Solution) was used as a fogging/lock-down agent. On October 15, subject matter experts confirmed a reaction had occurred between nitrates (both Plutonium Nitrate and Aluminum Nitrate Nonahydrate (ANN) are present) in the Pan J material and the ETGS fixative used to lower airborne radioactivity levels during debris removal. Management stopped the use of fogging/lock-down agents containing glycerin on bulk materials, declared a Management Concern, and initiated the Potential Inadequacy in the Safety Analysis determination process. Additional drum inspections and laboratory analysis of both reacted and unreacted material are planned. This report compiles the results of many different sample analyses conducted by the Pacific Northwest National Laboratory on samples collected from the Plutonium Reclamation Facility (PRF) floor pans by the CH2MHill’s Plateau Remediation Company (CHPRC). Revision 1 added Appendix G that reports the results of the Gas Generation Rate and methodology. The scope of analyses requested by CHPRC includes the determination of

  9. Compilation of neutron flux density spectra and reaction rates in different neutron fields

    International Nuclear Information System (INIS)

    Ertek, C.

    1979-07-01

    Upon the recommendation of International Working Group of Reactor Radiation Measurements (IWGRRM), the compilation of neutron flux density spectra and the reaction rates obtained by activation and fission foils in different neutron fields is presented. The neutron fields considered are as follows: 1/E; iron block; LWR core and pressure vessel; LMFBR core and blanket; CTR first wall and blanket; fission spectrum

  10. Current status of the HAL/S compiler on the Modcomp classic 7870 computer

    Science.gov (United States)

    Lytle, P. J.

    1981-01-01

    A brief history of the HAL/S language, including the experience of other users of the language at the Jet Propulsion Laboratory, is presented. The current status of the compiler, as implemented on the Modcomp Classic 7870 computer, and future applications in the Deep Space Network (DSN) are discussed. The primary applications in the DSN will be in the Mark IVA network.

  11. Automated Analysis of ARM Binaries using the Low-Level Virtual Machine Compiler Framework

    Science.gov (United States)

    2011-03-01

    Thesis by Jeffrey B. Scott: "Automated Analysis of ARM Binaries Using the Low-Level Virtual Machine Compiler Framework." Only fragmentary abstract snippets are available; they note that ABACAS offers a level of flexibility in software development that would be very useful later in the software engineering life cycle, and they cite "Blackjacking: Security Threats to BlackBerry Devices, PDAs and Cell Phones in the Enterprise" (Wiley Publishing, 2007).

  12. An improved bathymetry compilation for the Bellingshausen Sea, Antarctica, to inform ice-sheet and ocean models

    Directory of Open Access Journals (Sweden)

    A. G. C. Graham

    2011-02-01

    Full Text Available The southern Bellingshausen Sea (SBS is a rapidly-changing part of West Antarctica, where oceanic and atmospheric warming has led to the recent basal melting and break-up of the Wilkins ice shelf, the dynamic thinning of fringing glaciers, and sea-ice reduction. Accurate sea-floor morphology is vital for understanding the continued effects of each process upon changes within Antarctica's ice sheets. Here we present a new bathymetric grid for the SBS compiled from shipborne multibeam echo-sounder, spot-sounding and sub-ice measurements. The 1-km grid is the most detailed compilation for the SBS to-date, revealing large cross-shelf troughs, shallow banks, and deep inner-shelf basins that continue inland of coastal ice shelves. The troughs now serve as pathways which allow warm deep water to access the ice sheet in the SBS. Our dataset highlights areas still lacking bathymetric constraint, as well as regions for further investigation, including the likely routes of palaeo-ice streams. The new compilation is a major improvement upon previous grids and will be a key dataset for incorporating into simulations of ocean circulation, ice-sheet change and history. It will also serve forecasts of ice stability and future sea-level contributions from ice loss in West Antarctica, required for the next IPCC assessment report in 2013.

  13. Groundwater-quality data associated with abandoned underground coal mine aquifers in West Virginia, 1973-2016: Compilation of existing data from multiple sources

    Science.gov (United States)

    McAdoo, Mitchell A.; Kozar, Mark D.

    2017-11-14

    This report describes a compilation of existing water-quality data associated with groundwater resources originating from abandoned underground coal mines in West Virginia. Data were compiled from multiple sources for the purpose of understanding the suitability of groundwater from abandoned underground coal mines for public supply, industrial, agricultural, and other uses. This compilation includes data collected for multiple individual studies conducted from July 13, 1973 through September 7, 2016. Analytical methods varied by the time period of data collection and requirements of the independent studies.This project identified 770 water-quality samples from 294 sites that could be attributed to abandoned underground coal mine aquifers originating from multiple coal seams in West Virginia.

  14. Regulatory and technical reports (Abstract Index Journal): Compilation for third quarter 1986, July-September

    International Nuclear Information System (INIS)

    1986-11-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors, as well as proceedings of conferences and workshops. The entries in the compilation are indexed for access by title and abstract, contractor report number, personal author, subject, NRC organization, contractor, and licensed facility

  15. Research at GANIL. A compilation 1996-1997

    Energy Technology Data Exchange (ETDEWEB)

    Balanzat, E.; Bex, M.; Galin, J.; Geswend, S. [eds.

    1998-12-01

    The present compilation gives an overview of experimental results obtained with the GANIL facility during the period 1996-1997. It includes nuclear physics activities as well as interdisciplinary research. The scientific domain presented here extends well beyond traditional nuclear physics and includes atomic physics, condensed matter physics, nuclear astrophysics, radiation chemistry, radiobiology and applied physics. In the nuclear physics field, many new results have been obtained concerning nuclear structure as well as the dynamics of nuclear collisions and the nuclear disassembly of complex systems. The results presented deal in particular with the problem of energy equilibration, timescales and the origin of multifragmentation. Nuclear structure studies using both stable and radioactive beams deal with halo systems, the study of shell closures far from stability, the existence of nuclear molecules, and measurements of fundamental data such as half-lives, nuclear masses, nuclear radii, and quadrupole and magnetic moments. In addition to the traditional fields of atomic and solid state physics, new themes such as radiation chemistry and radiobiology are progressively being tackled. (K.A.)

  16. Data compilation and assessment for water resources in Pennsylvania state forest and park lands

    Science.gov (United States)

    Galeone, Daniel G.

    2011-01-01

    As a result of a cooperative study between the U.S. Geological Survey and the Pennsylvania Department of Conservation and Natural Resources (PaDCNR), available electronic data were compiled for Pennsylvania state lands (state forests and parks) to allow PaDCNR to initially determine if data exist to make an objective evaluation of water resources for specific basins. The data compiled included water-quantity and water-quality data and sample locations for benthic macroinvertebrates within state-owned lands (including a 100-meter buffer around each land parcel) in Pennsylvania. In addition, internet links or contacts for geographic information system coverages pertinent to water-resources studies also were compiled. Water-quantity and water-quality data primarily available through January 2007 were compiled and summarized for site types that included streams, lakes, ground-water wells, springs, and precipitation. Data were categorized relative to 35 watershed boundaries defined by the Pennsylvania Department of Environmental Protection for resource-management purposes. The primary sources of continuous water-quantity data for Pennsylvania state lands were the U.S. Geological Survey (USGS) and the National Weather Service (NWS). The USGS has streamflow data for 93 surface-water sites located in state lands; 38 of these sites have continuous-recording data available. As of January 2007, 22 of these 38 streamflow-gaging stations were active; the majority of active gaging stations have over 40 years of continuous record. The USGS database also contains continuous ground-water elevation data for 32 wells in Pennsylvania state lands, 18 of which were active as of January 2007. Sixty-eight active precipitation stations (primarily from the NWS network) are located in state lands. The four sources of available water-quality data for Pennsylvania state lands were the USGS, U.S. Environmental Protection Agency, Pennsylvania Department of Environmental Protection (PaDEP), and

  17. Compilation and evaluation of a Paso del Norte emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Funk, T.H.; Chinkin, L.R.; Roberts, P.T. [Sonoma Technology, Inc., 1360 Redwood Way, Suite C, 94954-1169 Petaluma, CA (United States); Saeger, M.; Mulligan, S. [Pacific Environmental Services, 5001 S. Miami Blvd., Suite 300, 27709 Research Triangle Park, NC (United States); Paramo Figueroa, V.H. [Instituto Nacional de Ecologia, Avenue Revolucion 1425, Nivel 10, Col. Tlacopac San Angel, Delegacion Alvaro Obregon, C.P., 01040, D.F. Mexico (Mexico); Yarbrough, J. [US Environmental Protection Agency - Region 6, 1445 Ross Avenue, Suite 1200, 75202-2733 Dallas, TX (United States)

    2001-08-10

    Emission inventories of ozone precursors are routinely used as input to comprehensive photochemical air quality models. Photochemical model performance and the development of effective control strategies rely on the accuracy and representativeness of an underlying emission inventory. This paper describes the tasks undertaken to compile and evaluate an ozone precursor emission inventory for the El Paso/Ciudad Juarez/Southern Dona Ana region. Point, area and mobile source emission data were obtained from local government agencies and were spatially and temporally allocated to a gridded domain using region-specific demographic and land-cover information. The inventory was then processed using the US Environmental Protection Agency (EPA) recommended Emissions Preprocessor System 2.0 (UAM-EPS 2.0) which generates emissions files compatible with the Urban Airshed Model (UAM). A top-down evaluation of the emission inventory was performed to examine how well the inventory represented ambient pollutant compositions. The top-down evaluation methodology employed in this study compares emission inventory ratios of non-methane hydrocarbon (NMHC)/nitrogen oxide (NO{sub x}) and carbon monoxide (CO)/NO{sub x} ratios to corresponding ambient ratios. Detailed NMHC species comparisons were made in order to investigate the relative composition of individual hydrocarbon species in the emission inventory and in the ambient data. The emission inventory compiled during this effort has since been used to model ozone in the Paso del Norte airshed (Emery et al., CAMx modeling of ozone and carbon monoxide in the Paso del Norte airshed. In: Proc of Ninety-Third Annual Meeting of Air and Waste Management Association, 18-22 June 2000, Air and Waste Management Association, Pittsburgh, PA, 2000)

  18. Animal mortality resulting from uniform exposures to photon radiations: Calculated LD50s and a compilation of experimental data

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T.D.; Morris, M.D.; Wells, S.M.; Young, R.W.

    1986-12-01

    Studies conducted during the 1950s and 1960s of radiation-induced mortality to diverse animal species under various exposure protocols were compiled into a mortality data base. Some 24 variables were extracted and recomputed from each of the published studies, which were collected from a variety of available sources, primarily journal articles. Two features of this compilation effort are (1) an attempt to give an estimate of the uniform dose received by the bone marrow in each treatment so that interspecies differences due to body size were minimized and (2) a recomputation of the LD50 where sufficient experimental data are available. Exposure rates varied in magnitude from about 10^-2 to 10^3 R/min. This report describes the data base, the sources of data, and the data-handling techniques; presents a bibliography of studies compiled; and tabulates data from each study. 103 refs., 44 tabs.

  19. Description of source term data on contaminated sites and buildings compiled for the waste management programmatic environmental impact statement (WMPEIS)

    International Nuclear Information System (INIS)

    Short, S.M.; Smith, D.E.; Hill, J.G.; Lerchen, M.E.

    1995-10-01

    The U.S. Department of Energy (DOE) and its predecessor agencies have historically had responsibility for carrying out various national missions primarily related to nuclear weapons development and energy research. Recently, these missions have been expanded to include remediation of sites and facilities contaminated as a result of past activities. In January 1990, the Secretary of Energy announced that DOE would prepare a Programmatic Environmental Impact Statement on the DOE's environmental restoration and waste management program; the primary focus was the evaluation of (1) strategies for conducting remediation of all DOE contaminated sites and facilities and (2) potential configurations for waste management capabilities. Several different environmental restoration strategies were identified for evaluation, ranging from doing no remediation to strategies where the level of remediation was driven by such factors as final land use and health effects. A quantitative assessment of the costs and health effects of remediation activities and residual contamination levels associated with each remediation strategy was made. These analyses required that information be compiled on each individual contaminated site and structure located at each DOE installation and that the information compiled include quantitative measurements and/or estimates of contamination levels and extent of contamination. This document provides a description of the types of information and data compiled for use in the analyses. Also provided is a description of the database used to manage the data, a detailed discussion of the methodology and assumptions used in compiling the data, and a summary of the data compiled into the database as of March 1995. As of this date, over 10,000 contaminated sites and structures and over 8,000 uncontaminated structures had been identified across the DOE complex of installations

  20. Compilation of nucleon-nucleon and nucleon-antinucleon elastic scattering data

    International Nuclear Information System (INIS)

    Carter, M.K.; Collins, P.D.B.; Whalley, M.R.

    1986-01-01

    A compilation of the data on pp, pn, nn, p-barp, p-barn, n-barp, and n-barn is presented, in both tabular and graphical form, including when available the total and elastic cross sections, the differences of the total cross section in different spin states, the ratio of the real to imaginary part of the forward scattering amplitude, the elastic differential cross sections, the polarization asymmetry and the spin correlation parameters, for all laboratory-frame momenta >=2 GeV/c. All the data in this review can be found in and retrieved from the Durham-RAL HEP data base together with data on a wide variety of other reactions. (author)

  1. Intelligence of programs

    Energy Technology Data Exchange (ETDEWEB)

    Novak, D

    1982-01-01

    A general discussion of the level of artificial intelligence in computer programs is presented. The suitability of various languages for the development of complex, intelligent programs is discussed, considering fourth-generation languages as well as the well-established structured COBOL language. It is concluded that the success of automation in many administrative fields depends to a large extent on the development of intelligent programs.

  2. Oral Mucositis Prevention By Low-Level Laser Therapy in Head-and-Neck Cancer Patients Undergoing Concurrent Chemoradiotherapy: A Phase III Randomized Study

    Energy Technology Data Exchange (ETDEWEB)

    Gouvea de Lima, Aline [Departamento de Radiologia, Disciplina de Oncologia, Faculdade de Medicina da Universidade de Sao Paulo, Sao Paulo, SP (Brazil); Villar, Rosangela Correa [Instituto de Radiologia, Servico de Radioterapia, Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo, Sao Paulo, SP (Brazil); Castro, Gilberto de, E-mail: gilberto.castro@usp.br [Department of Clinical Oncology, Instituto do Cancer do Estado de Sao Paulo, Sao Paulo, SP (Brazil); Antequera, Reynaldo [Divisao de Odontologia, Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo, Sao Paulo, SP (Brazil); Gil, Erlon; Rosalmeida, Mauro Cabral [Instituto de Radiologia, Servico de Radioterapia, Hospital das Clinicas da Faculdade de Medicina da Universidade de Sao Paulo, Sao Paulo, SP (Brazil); Federico, Miriam Hatsue Honda; Snitcovsky, Igor Moises Longo [Departamento de Radiologia, Disciplina de Oncologia, Faculdade de Medicina da Universidade de Sao Paulo, Sao Paulo, SP (Brazil)

    2012-01-01

    Purpose: Oral mucositis is a major complication of concurrent chemoradiotherapy (CRT) in head-and-neck cancer patients. Low-level laser (LLL) therapy is a promising preventive therapy. We aimed to evaluate the efficacy of LLL therapy to decrease severe oral mucositis and its effect on RT interruptions. Methods and Materials: In the present randomized, double-blind, Phase III study, patients received either gallium-aluminum-arsenide LLL therapy 2.5 J/cm^2 or placebo laser, before each radiation fraction. Eligible patients had to have been diagnosed with squamous cell carcinoma or undifferentiated carcinoma of the oral cavity, pharynx, larynx, or metastases to the neck with an unknown primary site. They were treated with adjuvant or definitive CRT, consisting of conventional RT 60-70 Gy (range, 1.8-2.0 Gy/d, 5 times/wk) and concurrent cisplatin. The primary endpoints were the oral mucositis severity in Weeks 2, 4, and 6 and the number of RT interruptions because of mucositis. The secondary endpoints included patient-reported pain scores. To detect a decrease in the incidence of Grade 3 or 4 oral mucositis from 80% to 50%, we planned to enroll 74 patients. Results: A total of 75 patients were included, and 37 patients received preventive LLL therapy. The mean delivered radiation dose was greater in the patients treated with LLL (69.4 vs. 67.9 Gy, p = .03). During CRT, the number of patients diagnosed with Grade 3 or 4 oral mucositis treated with LLL vs. placebo was 4 vs. 5 (Week 2, p = 1.0), 4 vs. 12 (Week 4, p = .08), and 8 vs. 9 (Week 6, p = 1.0), respectively. More of the patients treated with placebo had RT interruptions because of mucositis (6 vs. 0, p = .02). No difference was detected between the treatment arms in the incidence of severe pain. Conclusions: LLL therapy was not effective in reducing severe oral mucositis, although a marginal benefit could not be excluded. It reduced RT interruptions in these head-and-neck cancer patients, which might

  3. Compilation and evaluation of atomic and molecular data relevant to controlled thermonuclear research needs: USA programs

    International Nuclear Information System (INIS)

    Barnett, C.F.

    1976-01-01

    The U.S. role in the compilation and evaluation of atomic data for controlled thermonuclear research is discussed in the following three areas: (1) atomic structure data, (2) atomic collision data, and (3) surface data

  4. 1989 OCRWM [Office of Civilian Radioactive Waste Management] Bulletin compilation and index

    International Nuclear Information System (INIS)

    1990-02-01

    The OCRWM Bulletin is published by the Department of Energy, Office of Civilian Radioactive Waste Management to provide current information about the national program for managing spent fuel and high-level radioactive waste. This document is a compilation of issues from the 1989 calendar year. A table of contents and one index have been provided to assist in finding information contained in this year's Bulletins. The pages have been numbered consecutively at the bottom for easy reference. 7 figs

  5. Abstracts: NRC Waste Management Program reports

    Energy Technology Data Exchange (ETDEWEB)

    Heckman, R.A.; Minichino, C.

    1979-11-01

    This document consists of abstracts of all reports published by the Nuclear Regulatory Commission (NRC) Waste Management Program at Lawrence Livermore Laboratory (LLL). It will be updated at regular intervals. Reports are arranged in numerical order, within each category. Unless otherwise specified, authors are LLL scientists and engineers.

  6. Abstracts: NRC Waste Management Program reports

    International Nuclear Information System (INIS)

    Heckman, R.A.; Minichino, C.

    1979-11-01

    This document consists of abstracts of all reports published by the Nuclear Regulatory Commission (NRC) Waste Management Program at Lawrence Livermore Laboratory (LLL). It will be updated at regular intervals. Reports are arranged in numerical order, within each category. Unless otherwise specified, authors are LLL scientists and engineers

  7. Mineralogy and geochemistry of rocks and fracture fillings from Forsmark and Oskarshamn: Compilation of data for SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Drake, Henrik; Sandstroem, Bjoern [Isochron GeoConsulting HB, Goeteborg (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden)

    2006-11-15

    This report is a compilation of the data so far available for the safety assessment SR-Can carried out by SKB. The data consist of mineralogy, geochemistry, porosity, density and redox properties for both the dominant rock types and fracture fillings at the Forsmark and Oskarshamn candidate areas. In addition to the compilation of existing information, the aim has been to identify missing data and to clarify some concepts, e.g. deformation zones. The objective of the report is to present the available data requested for the modelling of the chemical stability of the two sites. The report includes no interpretation of the data.

  8. OpenMP-accelerated SWAT simulation using Intel C and FORTRAN compilers: Development and benchmark

    Science.gov (United States)

    Ki, Seo Jin; Sugimura, Tak; Kim, Albert S.

    2015-02-01

    We developed a practical method to accelerate execution of the Soil and Water Assessment Tool (SWAT) using open (free) computational resources. The SWAT source code (rev 622) was recompiled using a non-commercial Intel FORTRAN compiler on the Ubuntu 12.04 LTS Linux platform and renamed iOMP-SWAT in this study. The GNU utilities make, gprof, and diff were used to build the iOMP-SWAT package, profile memory usage, and verify that parallel and serial simulations produce identical results. Among the 302 SWAT subroutines, the slowest routines were identified using GNU gprof and then modified using the Open Multi-Processing (OpenMP) library on an 8-core shared-memory system. In addition, a C wrapping function, cross-compiled with the original SWAT FORTRAN package, was used to rapidly set large arrays to zero. A universal speedup ratio of 2.3 was achieved using input data sets with a large number of hydrological response units. As we focus specifically on accelerating a single SWAT run, the use of iOMP-SWAT for parameter calibrations will significantly improve the performance of SWAT optimization.
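
    The pattern described (profile, then parallelize the hottest loops and zero large arrays in bulk) looks roughly like the following sketch. It is written in C++ with OpenMP purely for illustration; the actual iOMP-SWAT changes are in FORTRAN with a C helper, and the routine name and runoff formula here are stand-ins based on the SCS curve-number method that SWAT uses.

        #include <cstring>
        #include <vector>

        // Bulk-zero a large array, analogous to the C wrapper the authors used
        // instead of slow element-by-element zeroing.
        void zeroArray(std::vector<double>& a) {
            std::memset(a.data(), 0, a.size() * sizeof(double));
        }

        // A profiling-identified hot loop: independent work per hydrological
        // response unit (HRU), so iterations can be shared across the 8 cores.
        double dailyRunoff(const std::vector<double>& rainfall,
                           const std::vector<double>& curveNumber) {
            double total = 0.0;
            #pragma omp parallel for reduction(+:total)
            for (long i = 0; i < static_cast<long>(rainfall.size()); ++i) {
                double s = 25400.0 / curveNumber[i] - 254.0;      // retention (mm)
                double p = rainfall[i];
                double q = p > 0.2 * s ? (p - 0.2 * s) * (p - 0.2 * s) / (p + 0.8 * s)
                                       : 0.0;                      // runoff (mm)
                total += q;
            }
            return total;
        }

        int main() {
            std::vector<double> rain(100000, 30.0), cn(100000, 75.0), scratch(100000);
            zeroArray(scratch);
            return dailyRunoff(rain, cn) > 0.0 ? 0 : 1;
        }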

  9. Activities of the cross-section compilation and evaluation centers at the Brookhaven National Laboratory

    International Nuclear Information System (INIS)

    Chernick, J.

    1967-01-01

    The growth of the compilation and evaluation efforts at the Brookhaven National Laboratory is reviewed. The current work of the Sigma Center is discussed, including the status of the publication of supplements to BNL-325 and the current state of the SCISRS-I tape. Future needs for BNL-325 type publications and SCISRS-II cross-section tapes are outlined. The history of the Cross-Section Evaluation Center at the Brookhaven National Laboratory is similarly reviewed. The status of current work is discussed, including the growth of the ENDF/A tape. The status of US efforts to produce a cross-section tape (ENDF/B) at an early date to satisfy the needs of US reactor designers is discussed. The continued importance of integral experiments and their accurate analysis to provide checks of the cross-section tapes is pointed out. The role of the Brookhaven National Laboratory in collaboration on an international basis is reviewed, including its current relationship to the ENEA Neutron Data Compilation Centre, the International Atomic Energy Agency and other nuclear centres. (author)

  10. Association of self-reported cognitive concerns with mobility in people with lower limb loss.

    Science.gov (United States)

    Kelly, Valerie E; Morgan, Sara J; Amtmann, Dagmar; Salem, Rana; Hafner, Brian J

    2018-01-01

    This study tested the hypothesis that greater perceived cognitive concerns are associated with worse mobility in a cohort of prosthesis users with lower limb loss (LLL). We performed a secondary analysis of cross-sectional self-report data from a volunteer sample of people with LLL due to dysvascular and non-dysvascular causes. Perceived cognitive difficulties were assessed using the Quality of Life in Neurological Disorders Applied Cognition - General Concerns (Neuro-QoL ACGC). Mobility was measured with the Activities-Specific Balance Confidence Scale (ABC) and the Prosthetic Limb Users Survey of Mobility (PLUS-M). Simple linear regressions examined univariate relationships between cognitive concerns and mobility. Multiple linear regression analyses included demographic and amputation-related variables that could influence this relationship. Analysis of data from 1291 people with LLL demonstrated that greater cognitive concerns, measured by the Neuro-QoL ACGC, were associated with poorer perceived mobility, measured by both the ABC and PLUS-M instruments. This relationship remained statistically significant after adjusting for demographic and amputation-related factors. These results suggest that greater cognitive concerns are associated with worse mobility among a broad range of people with LLL. An improved understanding of this relationship is critical for optimizing rehabilitation outcomes for this population. Implications for rehabilitation: Rehabilitation for people with lower limb loss (LLL) typically focuses on physical impairments and mobility limitations, but cognition is increasingly recognized to have an impact on functional outcomes. Greater perceived cognitive concerns are associated with poorer mobility among a broad range of people with LLL, even when adjusting for demographic and amputation-related factors. Cognitive status can impact relevant rehabilitative outcomes, including mobility, and should be considered when planning prosthetic and therapeutic

  11. Construction experiences from underground works at Oskarshamn. Compilation report

    International Nuclear Information System (INIS)

    Carlsson, Anders; Christiansson, Rolf

    2007-12-01

    The main objective with this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and from the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. Being a compilation report, it is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  12. Construction experiences from underground works at Oskarshamn. Compilation report

    Energy Technology Data Exchange (ETDEWEB)

    Carlsson, Anders (Vattenfall Power Consultant AB, Stockholm (SE)); Christiansson, Rolf (Swedish Nuclear Fuel and Waste Management Co., Stockholm (SE))

    2007-12-15

    The main objective with this report is to compile experiences from the underground works carried out at Oskarshamn, primarily construction experiences from the tunnelling of the cooling water tunnels of the Oskarshamn nuclear power units 1, 2 and 3, from the underground excavations of Clab 1 and 2 (Central Interim Storage Facility for Spent Nuclear Fuel), and from the Aespoe Hard Rock Laboratory. In addition, an account is given of the operational experience of Clab 1 and 2 and of the Aespoe HRL, primarily regarding scaling and rock support solutions. Being a compilation report, it is in substance based on earlier published material as presented in the list of references. Approximately 8,000 m of tunnels, including three major rock caverns with a total volume of about 550,000 m3, have been excavated. The excavation works of the various tunnels and rock caverns were carried out during the period 1966-2000. In addition, minor excavation works were carried out at the Aespoe HRL in 2003. The depth location of the underground structures varies from near surface down to 450 m. As an overall conclusion it may be said that the rock mass conditions in the area are well suited for underground construction. This conclusion is supported by the experiences from the rock excavation works in the Simpevarp and Aespoe area. These works have shown that no major problems occurred during the excavation works; nor have any stability or other rock engineering problems of significance been identified after the commissioning of the Oskarshamn nuclear power units O1, O2 and O3, BFA, Clab 1 and 2, and the Aespoe Hard Rock Laboratory. The underground structures of these facilities were built according to plan and have since then been operated as planned. Thus, the quality of the rock mass within the construction area is such that it lends itself to excavation of large rock caverns with a minimum of rock support.

  13. A methodology to compile food metrics related to diet sustainability into a single food database: Application to the French case.

    Science.gov (United States)

    Gazan, Rozenn; Barré, Tangui; Perignon, Marlène; Maillot, Matthieu; Darmon, Nicole; Vieux, Florent

    2018-01-01

    The holistic approach required to assess diet sustainability is hindered by the lack of comprehensive databases compiling relevant food metrics. Those metrics are generally scattered across different data sources with various levels of aggregation, hampering their matching. The objective was to develop a general methodology to compile food metrics describing the dimensions of diet sustainability into a single database and to apply it to the French context. Each step of the methodology is detailed: identification and selection of indicators and food metrics, definition of the food list, food matching and value assignment. For the French case, nutrient and contaminant content, bioavailability factors, distributions of dietary intakes, portion sizes, food prices, and greenhouse gas emission, acidification and marine eutrophication estimates were allocated to 212 commonly consumed generic foods. This generic database compiling 279 metrics will allow the simultaneous evaluation of the four dimensions of diet sustainability, namely the health, economic, social and environmental dimensions. Copyright © 2016 Elsevier Ltd. All rights reserved.
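    A toy C sketch of the "food matching and value assignment" step may make the methodology concrete. All food names, metric values and the crude substring-matching rule are invented for illustration; they do not reproduce the actual matching rules or data of the French database.

```c
/* Hypothetical sketch: attaching a metric (here greenhouse gas emissions) from
 * a source table to a generic food list by crude name matching. */
#include <stdio.h>
#include <string.h>

typedef struct { const char *name; double ghg_kg_co2e; } SourceEntry;           /* metric source */
typedef struct { const char *name; double ghg_kg_co2e; int matched; } GenericFood;

int main(void) {
    SourceEntry source[] = { {"apple, raw, with skin", 0.4}, {"beef, minced, cooked", 27.0} };
    GenericFood foods[]  = { {"apple", 0.0, 0}, {"beef", 0.0, 0}, {"lentils", 0.0, 0} };
    size_t ns = sizeof source / sizeof *source;
    size_t nf = sizeof foods  / sizeof *foods;

    for (size_t i = 0; i < nf; i++) {
        for (size_t j = 0; j < ns; j++) {
            if (strstr(source[j].name, foods[i].name)) {   /* stands in for the matching step */
                foods[i].ghg_kg_co2e = source[j].ghg_kg_co2e;
                foods[i].matched = 1;
                break;
            }
        }
        if (foods[i].matched)
            printf("%-8s -> %.1f kg CO2e per kg\n", foods[i].name, foods[i].ghg_kg_co2e);
        else
            printf("%-8s -> no match; value assignment needs another source\n", foods[i].name);
    }
    return 0;
}
```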

  14. Neutron data compilation. Report of a Panel sponsored by the International Atomic Energy Agency and held in Brookhaven, 10-14 February 1969

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1969-02-15

    The IAEA organized and convened a Panel on Neutron Data Compilation. This Panel was organized by the Agency following the recommendations made by the International Nuclear Data Committee (INDC), which agreed that a general review of world neutron data compilation activities was desirable. In this context neutron data compilation encompasses the collection, storage and dissemination of bibliographic information and of qualitative and numerical data on the interaction of neutrons with nuclei and atoms for all incident energies. Such information and data have important applications in low energy neutron physics and in many important areas of nuclear technology. The principal objective of the Panel on Neutron Data Compilation, which was held at Brookhaven National Laboratory during 10-14 February 1969, was to review how the world's principal data centers located at Brookhaven, Saclay, Obninsk and Vienna could ideally meet the demands and needs of experimental and theoretical neutron physicists, evaluators, reactor physicists as well as other existing and potential users. Fourteen papers were considered during formal sessions of the Panel and are reported on the following pages. The members of the Panel separated into five working groups to consider specific terms of reference and make recommendations. Their reports were discussed.

  15. Discussion on the compilation of document of iso quality management system for radiation sterilization enterprises

    International Nuclear Information System (INIS)

    Li Chunhong; Ha Yiming; Zhou Hongjie; Feng Zhiguo; Wang Feng

    2006-01-01

    Based on the characteristics of radiation sterilization enterprises and the requirements of ISO9001, ISO13485 and ISO11137, the compilation of documents for the quality manual, procedure documents and technological documents during certification of the ISO quality management system for radiation sterilization enterprises is discussed. (authors)

  16. Compilation of data and descriptions for United States and foreign liquid metal fast breeder reactors

    International Nuclear Information System (INIS)

    Appleby, E.R.

    1975-08-01

    This document is a compilation of design and engineering information pertaining to liquid metal cooled fast breeder reactors which have operated, are operating, or are currently under construction in the United States and abroad. All data have been taken from publicly available documents, journals, and books.

  17. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias

    2015-08-12

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.
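    The following C sketch is meant only to illustrate the general idea of a hybrid sparse volume, in which each brick adopts the representation that suits its local sparsity. It is not the JiTTree implementation; in particular, it keeps the per-brick branch in sample() that the paper removes by generating specialized traversal code at runtime (JIT). All sizes, thresholds and names are invented.

```c
/* Illustrative sketch only: each brick chooses between a dense array and a
 * list of non-zero voxels depending on its occupancy. */
#include <stdio.h>
#include <stdlib.h>

#define BRICK 8                            /* 8x8x8 voxels per brick */
#define BRICK_VOXELS (BRICK * BRICK * BRICK)

typedef struct { int idx; float val; } Voxel;

typedef struct {
    int dense;                             /* 1: full array, 0: list of non-zeros */
    float *values;                         /* dense storage */
    Voxel *nz; int n_nz;                   /* sparse storage */
} Brick;

/* Choose a representation from the brick's occupancy. */
Brick build_brick(const float *voxels) {
    Brick b = {0};
    int occupied = 0;
    for (int i = 0; i < BRICK_VOXELS; i++) occupied += voxels[i] != 0.0f;

    if (occupied > BRICK_VOXELS / 4) {     /* mostly full: dense wins */
        b.dense = 1;
        b.values = malloc(sizeof(float) * BRICK_VOXELS);
        for (int i = 0; i < BRICK_VOXELS; i++) b.values[i] = voxels[i];
    } else {                               /* mostly empty: store non-zeros only */
        b.nz = malloc(sizeof(Voxel) * (occupied ? occupied : 1));
        for (int i = 0; i < BRICK_VOXELS; i++)
            if (voxels[i] != 0.0f) b.nz[b.n_nz++] = (Voxel){i, voxels[i]};
    }
    return b;
}

/* Traversal must branch on the representation -- the overhead JIT removes. */
float sample(const Brick *b, int idx) {
    if (b->dense) return b->values[idx];
    for (int i = 0; i < b->n_nz; i++)
        if (b->nz[i].idx == idx) return b->nz[i].val;
    return 0.0f;
}

int main(void) {
    float voxels[BRICK_VOXELS] = {0};
    voxels[7] = 1.5f;                      /* a nearly empty brick */
    Brick b = build_brick(voxels);
    printf("dense=%d sample(7)=%.1f\n", b.dense, sample(&b, 7));
    free(b.values); free(b.nz);
    return 0;
}
```

    The run-time branch on b->dense in every lookup is exactly the kind of traversal overhead that motivates compiling, just in time, a traversal function specialized to the representations actually chosen.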

  18. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure

    KAUST Repository

    Labschutz, Matthias; Bruckner, Stefan; Groller, M. Eduard; Hadwiger, Markus; Rautek, Peter

    2015-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  19. JiTTree: A Just-in-Time Compiled Sparse GPU Volume Data Structure.

    Science.gov (United States)

    Labschütz, Matthias; Bruckner, Stefan; Gröller, M Eduard; Hadwiger, Markus; Rautek, Peter

    2016-01-01

    Sparse volume data structures enable the efficient representation of large but sparse volumes in GPU memory for computation and visualization. However, the choice of a specific data structure for a given data set depends on several factors, such as the memory budget, the sparsity of the data, and data access patterns. In general, there is no single optimal sparse data structure, but a set of several candidates with individual strengths and drawbacks. One solution to this problem is hybrid data structures, which locally adapt themselves to the sparsity. However, they typically suffer from increased traversal overhead which limits their utility in many applications. This paper presents JiTTree, a novel sparse hybrid volume data structure that uses just-in-time compilation to overcome these problems. By combining multiple sparse data structures and reducing traversal overhead we leverage their individual advantages. We demonstrate that hybrid data structures adapt well to a large range of data sets. They are especially superior to other sparse data structures for data sets that locally vary in sparsity. Possible optimization criteria are memory, performance and a combination thereof. Through just-in-time (JIT) compilation, JiTTree reduces the traversal overhead of the resulting optimal data structure. As a result, our hybrid volume data structure enables efficient computations on the GPU, while being superior in terms of memory usage when compared to non-hybrid data structures.

  20. Neutron data compilation at the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Lemmel, H.D.; Attree, P.M.; Byer, T.A.; Good, W.M.; Hjaerne, L.; Konshin, V.A.; Lorens, A.

    1968-03-01

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  1. Neutron data compilation at the International Atomic Energy Agency

    Energy Technology Data Exchange (ETDEWEB)

    Lemmel, H D; Attree, P M; Byer, T A; Good, W M; Hjaerne, L; Konshin, V A; Lorens, A [Nuclear Data Unit, International Atomic Energy Agency, Vienna (Austria)

    1968-03-15

    The paper describes the present status of the neutron data compilation center of the IAEA Nuclear Data Unit, which is now in full operation. An outline is given of the principles and objectives, the working routines, and the services available within the two-fold functions of the Unit: a) to promote cooperation and international neutron data exchange between the four major centers at Brookhaven, Saclay, Obninsk and Vienna, which share responsibilities in a geographical distribution of labour; b) to collect systematically the neutron data arising from countries in East Europe, Asia, Australia, Africa, South and Central America and to offer certain services to these countries. A brief description of DASTAR, the DAta STorage And Retrieval system, and of CINDU, the data Catalog of the IAEA Nuclear Data Unit, is given. (author)

  2. Irradiation of red meat. A compilation of technical data for its authorization and control

    International Nuclear Information System (INIS)

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication, "Irradiation of Poultry Meat and its Products - A Compilation of Technical Data for its Authorization and Control". 146 refs

  3. Regulatory and technical reports (abstract index journal), Compilation for third quarter 1993, July--September

    International Nuclear Information System (INIS)

    1993-11-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors, proceedings of conferences and workshops, grants, and international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  4. Irradiation of red meat. A compilation of technical data for its authorization and control

    Energy Technology Data Exchange (ETDEWEB)

    International consultative group on food irradiation

    1996-08-01

    The aim of this monograph is to provide the rationale and justification for treating red meats with ionizing radiation for improving microbiological safety, parasite control and extending non-frozen shelf-life. It is intended to complement a previous publication, "Irradiation of Poultry Meat and its Products - A Compilation of Technical Data for its Authorization and Control". 146 refs.

  5. What we heard within WPDD on stakeholder involvement in decommissioning, 2001-2004. A Compilation of Papers

    International Nuclear Information System (INIS)

    2006-01-01

    At its sixth meeting, the WPDD held a topical session on Stakeholder Involvement in Decommissioning Projects. The topical session was jointly planned and run with members of the NEA Forum on Stakeholder Confidence (FSC). The topical session provided a stimulus to review the contributions in the area of stakeholder involvement that the WPDD has received since its inception. This report contains a compilation of all papers regarding stakeholder involvement in decommissioning given at WPDD meetings and workshops between 2001 and the end of 2004. The compilation, together with other relevant material collected by the FSC, will serve as background material for a review focusing on lessons to be learnt and including examples of key statements by representatives from different NEA member states involved in or affected by decommissioning projects. The review is intended to be published during 2006 in an NEA brochure.

  6. Investigation of the Lifelong Learning Tendency of College Students

    Science.gov (United States)

    Aksoy, Hamit; Erbay, Hasan; Kör, Hakan; Engin, Melih

    2017-01-01

    The concept of "lifelong learning (LLL)" has emerged from the need to renew previously acquired and long-standing knowledge over time. LLL features in the work of various kinds of international foundations. The European Union is perceived as more efficient than international companies in terms of lifelong learning practices.…

  7. Compilation of contract research for the Materials Engineering Branch, Division of Engineering Technology. Annual report for FY 1985. Volume 4

    International Nuclear Information System (INIS)

    1986-03-01

    The compilation of annual reports by contractors to the Materials Engineering Branch of the NRC Office of Research concentrates on achievements in safety research for the primary system of commercial light water power reactors, particularly with regard to reactor vessels, primary system piping, steam generators, and non-destructive examination of primary system components. This report, covering research conducted during Fiscal Year 1985, is the fourth volume of the NUREG-0975 series, Compilation of Contractor Research for the Materials Engineering Branch, Division of Engineering Technology.

  8. Compilation of contract research for the Materials Engineering Branch, Division of Engineering Technology. Annual report for FY 1984. Volume 3

    International Nuclear Information System (INIS)

    1985-04-01

    This compilation of annual reports by contractors to the Materials Engineering Branch of the NRC Office of Research concentrates on achievements in safety research for the primary system of commercial light water power reactors, particularly with regard to reactor vessels, primary system piping, steam generators, and non-destructive examination of primary system components. This report, covering research conducted during Fiscal Year 1984, is the third volume of the NUREG-0975 series, Compilation of Contractor Research for the Materials Engineering Branch, Division of Engineering Technology.

  9. Hydrogeological conditions in the Finnsjoen area. Compilation of data and conceptual model

    International Nuclear Information System (INIS)

    Andersson, J.E.; Nordqvist, R.; Nyberg, G.; Smellie, J.; Tiren, S.

    1991-02-01

    In the present report, all available data gathered from the Finnsjoen area that are of potential use for numerical modelling are compiled and discussed. The data have been collected during different phases over the period 1977-1989. This inevitably means that the quality of the measured and interpreted data varies in accordance with the continuous development of improved equipment and interpretation techniques. The present report is an updated version of the SKB progress report 89-24 with the same title and authors, see introduction. (au)

  10. A compilation of structural property data for computer impact calculation (5/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of materials (mild steel, stainless steel, lead and wood) are compiled. These materials are the main structural elements of shipping casks. Structural data, such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, Poisson's ratio and stress-strain relationships, have been tabulated against temperature or strain rate. Volume 5 contains the structural property data for wood. (author)

  11. Compiling the functional data-parallel language SaC for Microgrids of Self-Adaptive Virtual Processors

    NARCIS (Netherlands)

    Grelck, C.; Herhut, S.; Jesshope, C.; Joslin, C.; Lankamp, M.; Scholz, S.-B.; Shafarenko, A.

    2009-01-01

    We present preliminary results from compiling the high-level, functional and data-parallel programming language SaC into a novel multi-core design: Microgrids of Self-Adaptive Virtual Processors (SVPs). The side-effect free nature of SaC in conjunction with its data-parallel foundation make it an

  12. Compilation of the FY 1996 Army Financial Statements at the Defense Finance and Accounting Service Indianapolis Center

    National Research Council Canada - National Science Library

    Lane, F

    1998-01-01

    ... Consolidated Financial Statements of the Army General Fund. We evaluated the processes, including internal controls and methods that the DFAS Indianapolis Center used to compile the Army FY 1996 General Fund financial statements...

  13. Hazardous chemical tracking system (HAZ-TRAC)

    International Nuclear Information System (INIS)

    Bramlette, J.D.; Ewart, S.M.; Jones, C.E.

    1990-07-01

    Westinghouse Idaho Nuclear Company, Inc. (WINCO) developed and implemented a computerized hazardous chemical tracking system, referred to as Haz-Trac, for use at the Idaho Chemical Processing Plant (ICPP). Haz-Trac is designed to provide a means to improve the accuracy and reliability of chemical information, which enhances the overall quality and safety of ICPP operations. The system tracks all chemicals and chemical components from the time they enter the ICPP until the chemical changes form, is used, or becomes a waste. The system runs on a Hewlett-Packard (HP) 3000 Series 70 computer. The system is written in COBOL and uses VIEW/3000, TurboIMAGE/DBMS 3000, OMNIDEX, and SPEEDWARE. The HP 3000 may be accessed throughout the ICPP, and from remote locations, using data communication lines. Haz-Trac went into production in October 1989. Currently, over 1910 chemicals and chemical components are tracked on the system. More than 2500 personnel hours were saved during the first six months of operation. Cost savings have been realized by reducing the time needed to collect and compile reporting information, identifying and disposing of unneeded chemicals, and eliminating duplicate inventories. Haz-Trac maintains information required by the Superfund Amendments and Reauthorization Act (SARA), the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), and the Occupational Safety and Health Administration (OSHA).

  14. Hazardous chemical tracking system (HAZ-TRAC)

    Energy Technology Data Exchange (ETDEWEB)

    Bramlette, J D; Ewart, S M; Jones, C E

    1990-07-01

    Westinghouse Idaho Nuclear Company, Inc. (WINCO) developed and implemented a computerized hazardous chemical tracking system, referred to as Haz-Trac, for use at the Idaho Chemical Processing Plant (ICPP). Haz-Trac is designed to provide a means to improve the accuracy and reliability of chemical information, which enhances the overall quality and safety of ICPP operations. The system tracks all chemicals and chemical components from the time they enter the ICPP until the chemical changes form, is used, or becomes a waste. The system runs on a Hewlett-Packard (HP) 3000 Series 70 computer. The system is written in COBOL and uses VIEW/3000, TurboIMAGE/DBMS 3000, OMNIDEX, and SPEEDWARE. The HP 3000 may be accessed throughout the ICPP, and from remote locations, using data communication lines. Haz-Trac went into production in October 1989. Currently, over 1910 chemicals and chemical components are tracked on the system. More than 2500 personnel hours were saved during the first six months of operation. Cost savings have been realized by reducing the time needed to collect and compile reporting information, identifying and disposing of unneeded chemicals, and eliminating duplicate inventories. Haz-Trac maintains information required by the Superfund Amendments and Reauthorization Act (SARA), the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), and the Occupational Safety and Health Administration (OSHA).

  15. Regulatory and technical reports: Abstracts index journal: Compilation for second quarter, April-June 1987

    International Nuclear Information System (INIS)

    1987-08-01

    This journal includes all formal reports in the NUREG Series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstracts, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  16. Regulatory and technical reports (abstract index journal): Compilation for first quarter 1988, January-March

    International Nuclear Information System (INIS)

    1988-06-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  17. Regulatory and technical reports (Abstract Index Journal): Compilation for third quarter, July-September 1987

    International Nuclear Information System (INIS)

    1987-11-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  18. Regulatory and technical reports (Abstract index journal): Compilation for first quarter 1987, January-March

    International Nuclear Information System (INIS)

    1987-05-01

    This journal includes all formal reports in the NUREG series prepared by the NRC staff and contractors; proceedings of conferences and workshops; as well as international agreement reports. The entries in this compilation are indexed for access by title and abstract, secondary report number, personal author, subject, NRC organization for staff and international agreements, contractor, international organization, and licensed facility

  19. A compilation of structural property data for computer impact calculation (1/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi; Nagata, Norio.

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of materials (mild steel, stainless steel, lead and wood) are compiled. These materials are the main structural elements of shipping casks. Structural data, such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, Poisson's ratio and stress-strain relationships, have been tabulated against temperature or strain rate. Volume 1 contains the structural property data and the data-processing computer program. (author)

  20. A compilation of structural property data for computer impact calculation (3/5)

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-10-01

    The paper describes structural property data for computer impact calculations of nuclear fuel shipping casks. Data for four kinds of materials (mild steel, stainless steel, lead and wood) are compiled. These materials are the main structural elements of shipping casks. Structural data, such as the coefficient of thermal expansion, the modulus of longitudinal elasticity, the modulus of transverse elasticity, Poisson's ratio and stress-strain relationships, have been tabulated against temperature or strain rate. Volume 3 contains the structural property data for stainless steel. (author)