WorldWideScience

Sample records for program mead estimates

  1. Patriot/Medium Extended Air Defense System Combined Aggregate Program (Patriot/MEADS CAP)

    Science.gov (United States)

    2013-12-01

    Approved for public release; distribution unlimited. ...States, Germany, and Italy to replace the U.S. Patriot air defense systems, Patriot and Hawk systems in Germany, and the Nike system in Italy. The MEADS...combat demonstrated capability against these threats. MEADS will employ a netted distributed architecture with modular components to increase...

  2. Effects of Mead Wort Heat Treatment on the Mead Fermentation Process and Antioxidant Activity.

    Science.gov (United States)

    Czabaj, Sławomir; Kawa-Rygielska, Joanna; Kucharska, Alicja Z; Kliks, Jarosław

    2017-05-14

    The effects of mead wort heat treatment on the mead fermentation process and antioxidant activity were tested. The experiment was conducted with the use of two different honeys (multiflorous and honeydew) collected from the Lower Silesia region (Poland). Heat treatment was performed with the use of a traditional technique (gently boiling), the more commonly used pasteurization, and without heat treatment (control). During the experiment fermentation dynamics were monitored using high performance liquid chromatography with refractive index detection (HPLC-RID). Total antioxidant capacity (TAC) and total phenolic content (TPC) were estimated for worts and meads using UV/Vis spectrophotometric analysis. The formation of 5-hydroxymethylfurfural (HMF) was monitored by HPLC analyses. Heat treatment had a great impact on the final antioxidant capacity of meads.

  3. Estimating survival rates of quagga mussel (Dreissena rostriformis bugensis) veliger larvae under summer and autumn temperature regimes in residual water of trailered watercraft at Lake Mead, USA

    Directory of Open Access Journals (Sweden)

    Wook Jin Choi

    2013-01-01

    Full Text Available On 6 January 2007, invasive quagga mussels [Dreissena rostriformis bugensis (Andrusov, 1897)] were discovered in the Boulder Basin of Lake Mead, Nevada, a popular site for recreational boating in the southwestern United States. Recreational watercraft are considered a primary vector for overland dispersal of quagga mussel veliger larvae between water bodies. Thus, effective decontamination of veligers in residual water carried by trailered recreation boats is critical to controlling this species’ spread. The survival rate of quagga mussel veligers was measured during exposure to environmental temperature conditions mimicking those experienced in the residual water of trailered vessels during warm summer and cooler autumn months in the semi-arid southwestern United States. Under warm summer conditions, quagga mussel veligers survived approximately five days, while under cooler autumn conditions they survived 27 days. When tested under autumn temperature conditions, veliger survival times increased with increased level of larval development. The results suggested a greater likelihood of veliger transport in the residual water of trailered watercraft during autumn months. The results indicated that presently recommended vessel quarantine times to kill all externally attached juvenile and adult dreissenid mussels prior to launching in an uninfested water body should be increased to generate 100% veliger mortality in residual water unable to be fully drained from the internal areas of watercraft.

  4. The Economic Costs of a Shrinking Lake Mead: a Spatial Hedonic Analysis

    Science.gov (United States)

    Singh, A.; Saphores, J. D.

    2017-12-01

    Persistent arid conditions and population growth in the Southwest have taken a toll on the Colorado River. This has led to substantial drawdowns of many water reservoirs around the Southwest, and especially of Lake Mead, which is Las Vegas' main source of drinking water. Due to its importance, Lake Mead has received a great deal of media attention about its "bathtub ring" and the exposure of rock that used to be underwater. Drops in water levels have caused some local marinas to close, thereby affecting the aesthetic and recreational value of Lake Mead, which is located in the country's largest National Recreation Area (NRA), and surrounded by protected land. Although a rich literature analyzes how water quality impacts real estate values, relatively few studies have examined how dropping water levels are capitalized in surrounding residential properties. In this context, the goal of this study is to quantify how Lake Mead's water level changes are reflected in changes in local property values, an important source of tax income for any community. Since Lake Mead is the primary attraction within its recreation area, we are also concerned with how this recreation area, which is a few miles southeast of Las Vegas, is capitalized in real estate values of the Las Vegas metropolitan area as few valuation studies have examined how proximity to national parks influences residential property value. We estimate spatial hedonic and geographically weighted regression models of single family residences to delineate the value of proximity to the Lake Mead NRA and to understand how this value changed with Lake Mead's water levels. Our explanatory variables include common structural characteristics, fixed effects to account for unobserved locally constant characteristics, and specific variables such as distance to the Las Vegas strip and to downtown casinos. Because the sharpest declines in Lake Mead water levels happened in 2010 (NASA, 2010) and winter 2016 saw an unexpected
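
    A minimal sketch of the hedonic idea follows, assuming hypothetical variable names and synthetic data; the study's actual specifications (spatial hedonic and geographically weighted regressions with fixed effects) are richer than this plain OLS illustration.

```python
# Hedonic sketch with synthetic data (hypothetical variables); the study uses
# spatial hedonic and geographically weighted models, not plain OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
sqft = rng.normal(2000, 400, n)          # structural characteristic
dist_strip = rng.uniform(1, 20, n)       # miles to the Las Vegas Strip
dist_nra = rng.uniform(1, 30, n)         # miles to the Lake Mead NRA boundary
lake_level = rng.normal(1100, 20, n)     # lake elevation (ft) at sale date

# Interaction term: does the premium for NRA proximity shrink as the lake drops?
interact = (lake_level - 1100) * (30 - dist_nra) / 30

log_price = (11.5 + 4e-4 * sqft - 0.02 * dist_strip - 0.03 * dist_nra
             + 1e-3 * lake_level + 5e-4 * interact + rng.normal(0, 0.1, n))

X = sm.add_constant(np.column_stack([sqft, dist_strip, dist_nra,
                                     lake_level, interact]))
fit = sm.OLS(log_price, X).fit()
print(fit.params)  # the last coefficient captures the level-proximity interaction
```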

  5. Lake Mead Intake No. 3

    Directory of Open Access Journals (Sweden)

    Jon Hurt

    2017-12-01

    Full Text Available As a result of a sustained drought in the Southwestern United States, and in order to maintain existing water capacity in the Las Vegas Valley, the Southern Nevada Water Authority constructed a new deep-water intake (Intake No. 3) located in Lake Mead. The project included a 185 m deep shaft, a 4.7 km tunnel under very difficult geological conditions, and marine works for a submerged intake. This paper presents the experience that was gained during the design and construction and the innovative solutions that were developed to handle the difficult conditions that were encountered during tunneling with a dual-mode slurry tunnel-boring machine (TBM) at up to 15 bar (1 bar = 10^5 Pa) pressure. Specific attention is given to the main challenges that were overcome during the TBM excavation, which included the mode of operation, face support pressures, pre-excavation grouting, and maintenance; to the construction of the intake, which involved deep underwater shaft excavation with blasting using shaped charges; to the construction of the innovative concrete-and-steel intake structure weighing over 1200 t; to the placement of the intake structure in the underwater shaft; and to the docking and connection to an intake tunnel excavated by hybrid TBM. Keywords: Sub-aqueous tunneling, Tunnel-boring machine excavation, Water intakes

  6. NEWBOX: A computer program for parameter estimation in diffusion problems

    International Nuclear Information System (INIS)

    Nestor, C.W. Jr.; Godbee, H.W.; Joy, D.S.

    1989-01-01

    In the analysis of experiments to determine amounts of material transferred from one medium to another (e.g., the escape of chemically hazardous and radioactive materials from solids), there are at least 3 important considerations. These are (1) is the transport amenable to treatment by established mass transport theory; (2) do methods exist to find estimates of the parameters which will give a best fit, in some sense, to the experimental data; and (3) what computational procedures are available for evaluating the theoretical expressions. The authors have made the assumption that established mass transport theory is an adequate model for the situations under study. Since the solutions of the diffusion equation are usually nonlinear in some parameters (diffusion coefficient, reaction rate constants, etc.), use of a method of parameter adjustment involving first partial derivatives can be complicated and prone to errors in the computation of the derivatives. In addition, the parameters must satisfy certain constraints; for example, the diffusion coefficient must remain positive. For these reasons, a variant of the constrained simplex method of M. J. Box has been used to estimate parameters. It is similar, but not identical, to the downhill simplex method of Nelder and Mead. In general, they calculate the fraction of material transferred as a function of time from expressions obtained by the inversion of the Laplace transform of the fraction transferred, rather than by taking derivatives of a calculated concentration profile. With the above approaches to the 3 considerations listed at the outset, they developed a computer program NEWBOX, usable on a personal computer, to calculate the fractional release of material from 4 different geometrical shapes (semi-infinite medium, finite slab, finite circular cylinder, and sphere), accounting for several different boundary conditions
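
    The parameter-adjustment idea can be sketched with SciPy's Nelder-Mead implementation; note that NEWBOX itself uses M. J. Box's constrained simplex, and that the model, data, and surface-to-volume ratio below are hypothetical. The positivity constraint on the diffusion coefficient is handled here by optimizing its logarithm.

```python
# Simplex-style fit of a diffusion coefficient to fractional-release data.
# Illustrative only: NEWBOX uses Box's constrained simplex, not Nelder-Mead,
# and inverts Laplace transforms for the general geometries.
import numpy as np
from scipy.optimize import minimize

SV = 10.0  # hypothetical surface-to-volume ratio, 1/cm

def frac_released(t, D):
    # small-time semi-infinite-medium solution: f(t) = (S/V) * 2 * sqrt(D t / pi)
    return SV * 2.0 * np.sqrt(D * t / np.pi)

t_obs = np.array([1.0, 4.0, 9.0, 16.0, 25.0])          # h
f_obs = np.array([0.012, 0.025, 0.036, 0.049, 0.061])  # hypothetical data

def sse(p):
    D = np.exp(p[0])  # log-parameterization keeps D > 0, as the constraint requires
    return np.sum((frac_released(t_obs, D) - f_obs) ** 2)

res = minimize(sse, x0=[np.log(1e-7)], method="Nelder-Mead")
print("estimated D (cm^2/h):", np.exp(res.x[0]))
```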

  7. Om at oversætte Mead

    DEFF Research Database (Denmark)

    Willert, Søren

    2005-01-01

    George Herbert Mead wrote a genuinely bad book that is nonetheless a classic. As "Sindet, selvet og samfundet" [Mind, Self and Society] it has now come into being in a publication-ready Danish edition, and the translation is a story in itself.

  8. George Herbert Mead on consciousness: antidote to Cartesian absurdities?

    DEFF Research Database (Denmark)

    Willert, Søren

    The article explicates George Herbert Mead's theory of consciousness as presented in Mind, Self and Society. According to Mead, the term consciousness may refer to three different sets of phenomena: (1) the environment as implied by our goal-directed action; Mead names this consciousness aspect experience; it is shared by humans and subhuman animals alike; (2) consciousness of environmental experience; Mead names this consciousness aspect awareness; it is exclusively human; (3) the peculiar sensed qualities attaching to consciousness, equalling what is today named qualia. Descartes-inspired psychology makes the third consciousness aspect all-important. Within Mead's framework for a darwinistically inspired psychology, it becomes theoretically insignificant.

  9. HYDRAULICS, MEADE COUNTY, KENTUCKY, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — Hydraulic data include spatial datasets and data tables necessary for documenting the hydrologic procedures for estimating flood discharges for a flood insurance...

  10. Influence of Sweetness and Ethanol Content on Mead Acceptability

    Directory of Open Access Journals (Sweden)

    Gomes Teresa

    2015-06-01

    Full Text Available Mead is a traditional alcoholic beverage obtained by fermenting mead wort; however, its production frequently remains an empirical exercise. Different meads can be produced, depending on fermentation conditions. Nevertheless, to date few studies have been developed on factors that may influence mead quality. The main objective of this work was to study the influence of sweetness and ethanol content on mead acceptability. Different meads were produced with two sweetness levels (sweet and dry meads) and three ethanol contents (18, 20, and 22% (v/v)), adjusted by brandy addition. Afterwards, mead acceptability was evaluated by sensory analysis through a consumers’ panel (n=108), along with chemical analysis by HPLC-RID of glucose, fructose, ethanol, glycerol and acetic acid. The sweet (75 g glucose+fructose/L) and dry (23 g glucose+fructose/L) meads presented glycerol contents equal to 5.10±0.54 and 5.96±0.95 g/L, respectively, which were desirable since glycerol improves mead quality. Low concentrations of acetic acid were determined (0.46±0.08 and 0.57±0.09 g/L), avoiding the vinegar off-character. Concerning sensory analysis, the alcohol content of mead had no effect on the sensory attributes studied, namely aroma, sweetness, flavour, alcohol feeling and general appreciation. Regarding sweetness, the “sweet meads” were the most appreciated by the consumers (score of 5.4±2.56), whereas the “dry meads” (score of 2.7±2.23) showed low acceptability. In conclusion, this work revealed that sweetness is a key sensory attribute for mead acceptance by consumers, whereas ethanol content (18 to 22% (v/v)) is not.

  11. 76 FR 2579 - Safety Zone; Lake Mead Intake Construction, Lake Mead, Boulder City, NV

    Science.gov (United States)

    2011-01-14

    ... blasting activities. Background and Purpose Vegas Tunnel Construction will be conducting intermittent blasting operations for the placement of a water intake pipe in Lake Mead during the first 6 months of 2011... Energy Supply, Distribution, or Use. We have determined that it is not a ``significant energy action...

  12. George Herbert Mead's Contribution to the Philosophy of American Education.

    Science.gov (United States)

    Renger, Paul, III

    1980-01-01

    George Herbert Mead's general philosophy showed that he regarded the development of distinctively human behavior as essentially the result of an individual's meaningful participation in the social process of the community to which he belongs. Mead believed that education was a social process involving the meaningful interaction and communication…

  13. G. H. Mead in the history of sociological ideas.

    Science.gov (United States)

    da Silva, Filipe Carreira

    2006-01-01

    My aim is to discuss the history of the reception of George Herbert Mead's ideas in sociology. After discussing the methodological debate between presentism and historicism, I address the interpretations of those responsible for Mead's inclusion in the sociological canon: Herbert Blumer, Jürgen Habermas, and Hans Joas. In the concluding section, I assess these reconstructions of Mead's thought and suggest an alternative more consistent with my initial methodological remarks. In particular, I advocate a reconstruction of Mead's ideas that apprehends simultaneously its evolution over time and its thematic breadth. Such a historically minded reconstruction can be not only a useful corrective to possible anachronisms incurred by contemporary social theorists, but also a fruitful resource for their theory-building endeavors. Only then can meaningful and enriching dialogue with Mead begin. Copyright 2006 Wiley Periodicals, Inc.

  14. Cost-estimating relationships for space programs

    Science.gov (United States)

    Mandell, Humboldt C., Jr.

    1992-01-01

    Cost-estimating relationships (CERs) are defined and discussed as they relate to the estimation of theoretical costs for space programs. The paper primarily addresses CERs based on analogous relationships between physical and performance parameters to estimate future costs. Analytical estimation principles are reviewed examining the sources of errors in cost models, and the use of CERs is shown to be affected by organizational culture. Two paradigms for cost estimation are set forth: (1) the Rand paradigm for single-culture single-system methods; and (2) the Price paradigms that incorporate a set of cultural variables. For space programs that are potentially subject to even small cultural changes, the Price paradigms are argued to be more effective. The derivation and use of accurate CERs is important for developing effective cost models to analyze the potential of a given space program.

  15. Indirect estimators in US federal programs

    CERN Document Server

    1996-01-01

    In 1991, a subcommittee of the Federal Committee on Statistical Methodology met to document the use of indirect estimators - that is, estimators which use data drawn from a domain or time different from the domain or time for which an estimate is required. This volume comprises the eight reports that describe the use of indirect estimators, based on case studies from a variety of federal programs. As a result, many researchers will find that this book provides a valuable survey of how indirect estimators are used in practice and addresses some of the pitfalls of these methods.

  16. Mead production: selection and characterization assays of Saccharomyces cerevisiae strains.

    Science.gov (United States)

    Pereira, Ana Paula; Dias, Teresa; Andrade, João; Ramalhosa, Elsa; Estevinho, Letícia M

    2009-08-01

    Mead is a traditional drink, which results from the alcoholic fermentation of diluted honey carried out by yeasts. However, when it is produced in a homemade way, mead producers encounter several problems, namely, the lack of uniformity in the final product, delayed and arrested fermentations, and the production of "off-flavours" by the yeasts. These problems are usually associated with the inability of yeast strains to respond and adapt to unfavourable and stressful growth conditions. The main objective of this work was to evaluate the capacity of Saccharomyces cerevisiae strains, isolated from honey of the Trás-os-Montes region (Northeast Portugal), to produce mead. Five strains from honey, as well as one laboratory strain and one commercial wine strain, were evaluated in terms of their fermentation performance under ethanol, sulphur dioxide and osmotic stress. All the strains showed similar behaviour under these conditions. Two yeast strains isolated from honey and the commercial wine strain were further tested for mead production, using two different honeys (a dark and a light honey) enriched with two supplements (one commercial and one developed by the research team) as fermentation media. The results obtained in this work show that S. cerevisiae strains isolated from honey are appropriate for mead production. However, it is of extreme importance to take into account the characteristics of the honey and the supplements used in the fermentation medium formulation, in order to achieve the best results in mead production.

  17. Multi-pitch Estimation using Semidefinite Programming

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm; Vandenberghe, Lieven

    2017-01-01

    Multi-pitch estimation concerns the problem of estimating the fundamental frequencies (pitches) and amplitudes/phases of multiple superimposed harmonic signals, with applications in music, speech, vibration analysis, etc. In this paper we formulate a complex-valued multi-pitch estimator via a semidefinite programming representation of an atomic decomposition over a continuous dictionary of complex exponentials and extend this to real-valued data via a real semidefinite program with the same dimensions (i.e. half the size). We further impose a continuous frequency constraint naturally occurring from assuming a Nyquist sampled signal by adding an additional semidefinite constraint. We show that the proposed estimator has superior performance compared to state-of-the-art methods for separating two closely spaced fundamentals and approximately achieves the asymptotic Cramér-Rao lower bound.

  18. The isostatic state of Mead crater

    Science.gov (United States)

    Banerdt, W. B.; Konopliv, A. S.; Rappaport, N. J.; Sjogren, W. L.; Grimm, R. E.; Ford, P. G.

    1994-01-01

    We have analyzed high-resolution Magellan Doppler tracking data over Mead crater, using both line-of-sight and spherical harmonic methods, and have found a negative gravity anomaly of about 4-5 mgal (at spacecraft altitude, 182 km). This is consistent with no isostatic compensation of the present topography; the uncertainty in the analysis allows perhaps as much as 30% compensation at shallow depths (approximately 25 km). This is similar to observations of large craters on Earth, which are not generally compensated, but contrasts with at least some lunar basins which are inferred to have large Moho uplifts and corresponding positive Bouguer anomalies. An uncompensated load of this size requires a lithosphere with an effective elastic lithosphere thickness greater than 30 km. In order for the crust-mantle boundary not to have participated in the deformation associated with the collapse of the transient cavity during the creation of the crater, the yield strength near the top of the mantle must have been significantly higher on Earth and Venus than on the Moon at the time of basin formation. This might be due to increased strength against frictional sliding at the higher confining pressures within the larger planets. Alternatively, the thinner crusts of Earth and Venus compared to that of the Moon may result in higher creep strength of the upper mantle at shallower depths.

  19. MEAD Marine Effects of Atmospheric Deposition

    Science.gov (United States)

    Jickells, T.; Spokes, L.

    2003-04-01

    The coastal seas are one of the most valuable resources on the planet but they are threatened by human activity. We rely on the coastal area for mineral resources, waste disposal, fisheries and recreation. In Europe, high population densities and high levels of industrial activity mean that the pressures arising from these activities are particularly acute. One of the main problems concerning coastal seas is the rapid increase in the amounts of nitrogen-based pollutants entering the water. They come from many sources, the most important ones being traffic, industry and agriculture. These pollutants can be used by algae as nutrients. The increasing concentrations of these nutrients have led to excessive growth of algae, some of which are harmful. When algae die and decay, oxygen in the water is used up and the resulting lower levels of oxygen may lead to fish kills. Human activity has probably doubled the amount of chemically and biologically reactive nitrogen present globally. In Europe the increases have been greater than this, leading to real concern over the health of coastal waters. Rivers have, until recently, been thought to be the most important source of reactive nitrogen to the coastal seas but we now know that inputs from the atmosphere are large and can equal, or exceed, those from the rivers. Our initial hypothesis was that atmospheric inputs are important and potentially different in their effect on coastal ecosystems from riverine inputs, and hence require different management strategies. However, we had almost no information on the direct effects of atmospheric deposition on marine ecosystems, though clearly such a large external nitrogen input should lead to enhanced phytoplankton growth. The aim of this European Union funded MEAD project has been to determine how inputs of nitrogen from the atmosphere affect the chemistry and biology of coastal waters. To try to answer this, we have conducted field experiments in the Kattegat, an area where we know

  20. George Herbert Mead on Humans and Other Animals: Social Relations After Human-Animal Studies

    OpenAIRE

    Rhoda Wilkie; Andrew McKinnon

    2013-01-01

    The turn towards nonhuman animals within sociology has shed a critical light on George Herbert Mead, his apparent prioritisation of language and the anthropocentric focus of Symbolic Interactionism (SI). Although Herbert Blumer canonised Mead as the founder of this perspective he also played a key role in excising the evolutionary and 'more-than-human' components in Mead's work. This intervention not only misrepresented Mead's intellectual project, it also made symbols the predominant concern...

  1. George Herbert Mead: contributions to the history of social psychology

    OpenAIRE

    Souza, Renato Ferreira de

    2011-01-01

    This article aims to contribute to the historical understanding of an author/character of Psychology. We analyze, and add to, knowledge about George Herbert Mead and the developments of his psychosocial theory. To this end, we make explicit in the text one of the analytical approaches used in our dissertation, namely: through the social approach to the history of psychology, we confront Mead's life with moments in the constitution of psychology, highlighting central aspects of this dialogue that are not always identified...

  2. Generating High-Resolution Lake Bathymetry over Lake Mead using the ICESat-2 Airborne Simulator

    Science.gov (United States)

    Li, Y.; Gao, H.; Jasinski, M. F.; Zhang, S.; Stoll, J.

    2017-12-01

    Precise lake bathymetry (i.e., elevation/contour) mapping is essential for optimal decision making in water resources management. Although the advancement of remote sensing has made it possible to monitor global reservoirs from space, most of the existing studies focus on estimating the elevation, area, and storage of reservoirs—and not on estimating the bathymetry. This limitation is attributed to the low spatial resolution of satellite altimeters. With the significant enhancement of ICESat-2—the Ice, Cloud & Land Elevation Satellite #2, which is scheduled to launch in 2018—producing satellite-based bathymetry becomes feasible. Here we present a pilot study for deriving the bathymetry of Lake Mead by combining Landsat area estimations with airborne elevation data using the prototype of ICESat-2—the Multiple Altimeter Beam Experimental Lidar (MABEL). First, an ISODATA classifier was adopted to extract the lake area from Landsat images during the period from 1982 to 2017. Then the lake area classifications were paired with MABEL elevations to establish an Area-Elevation (AE) relationship, which in turn was applied to the classification contour map to obtain the bathymetry. Finally, the Lake Mead bathymetry image was embedded onto the Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM), to replace the existing constant values. Validation against sediment survey data indicates that the bathymetry derived from this study is reliable. This algorithm has the potential for generating global lake bathymetry when ICESat-2 data become available after next year's launch.
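
    The area-elevation (AE) pairing step can be sketched as follows; the numbers are hypothetical, and the fitted form (a low-order polynomial) is one plausible choice rather than necessarily the one used in the study.

```python
# Sketch of building an AE relationship and assigning elevations to
# classification contours (hypothetical numbers).
import numpy as np

# Paired observations: Landsat-classified lake area (km^2) against a
# (near-)coincident water-surface elevation (m) from the lidar.
area = np.array([350.0, 400.0, 450.0, 500.0, 550.0, 600.0, 640.0])
elev = np.array([320.0, 330.0, 341.0, 350.0, 358.0, 365.0, 370.0])

ae = np.poly1d(np.polyfit(area, elev, deg=2))  # elevation = f(area)

# Each classification contour encloses a known area; f assigns it an
# elevation, and stacking the contours yields the bathymetry surface.
contour_areas = np.array([360.0, 480.0, 620.0])
print(ae(contour_areas))
```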

  3. Peak flood estimation using gene expression programming

    Science.gov (United States)

    Zorn, Conrad R.; Shamseldin, Asaad Y.

    2015-12-01

    As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene-expression programming (GEP) in predicting specific return period events in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events with relative errors of 29% and 18%, respectively. This compares to errors of 48% and 44% for the same flood events under the RFE method. While the effectiveness of GEP in predicting specific return period events is made apparent, it is argued that the derived equations should be used in conjunction with existing methodologies rather than as a replacement.
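
    For readers who want to experiment, tree-based genetic programming, a close relative of gene-expression programming, is available in the gplearn package; the sketch below uses hypothetical catchment attributes and a synthetic target, and is not the paper's model.

```python
# Symbolic-regression stand-in for GEP (gplearn implements tree-based genetic
# programming, a close relative); data and target relation are synthetic.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(1)
n = 14  # e.g., one row per gauged calibration site
area = rng.uniform(5, 200, n)     # catchment area, km^2
rain = rng.uniform(800, 1600, n)  # mean annual rainfall, mm
slope = rng.uniform(0.5, 10, n)   # channel slope, %

X = np.column_stack([area, rain, slope])
q100 = 0.05 * area**0.8 * (rain / 1000.0) * np.sqrt(slope)  # synthetic "law"

gp = SymbolicRegressor(population_size=500, generations=20,
                       function_set=("add", "sub", "mul", "div", "sqrt"),
                       parsimony_coefficient=0.01, random_state=0)
gp.fit(X, q100)
print(gp._program)  # evolved closed-form expression for the flood quantile
```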

  4. Reinterpreting Internalization and Agency through G. H. Mead's Perspectival Realism

    Science.gov (United States)

    Martin, Jack

    2006-01-01

    Toward the end of his life, George Herbert Mead developed a theory of perspectives that may be used to reinterpret his social, developmental psychology. This paper attempts such a reinterpretation, leading to the emergence of a theory of perspective taking in early childhood that looks quite different from that which is assumed in most extant work…

  5. Mead and Dewey: Thematic Connections on Educational Topics.

    Science.gov (United States)

    Dennis, Lawrence J.; Stickel, George W.

    1981-01-01

    Common themes emerge from the writings of John Dewey and George Herbert Mead on four educational topics discussed here: (1) play; (2) science teaching; (3) history teaching; and (4) industrial education. Both men deplored the fragmentation of education and believed moral insight could be furthered through social understanding, science, and…

  6. George Herbert Mead, Um, osoba i društvo

    OpenAIRE

    Bačeković, Alica

    2004-01-01

    A review of the book George Herbert Mead, Um, osoba i društvo [Mind, Self and Society] from the standpoint of a social behaviorist, edited and with an introduction by Charles W. Morris, translated from English by Srđan Dvornik, Naklada Jesenski i Turk, Croatian Sociological Association, Zagreb 2003, xxx + 392 pp.

  7. Mead features fermented by Saccharomyces cerevisiae (lalvin k1 ...

    African Journals Online (AJOL)

    Eduardo Morales

    Full Length Research Paper: Mead features fermented by Saccharomyces cerevisiae (Lalvin K1-1116). Eduardo Marin Morales, Valmir Eduardo Alcarde and Dejanira de Franceschi de Angelis. Department of Biochemistry and Microbiology, Institute of Biosciences, UNESP - Univ Estadual Paulista, Av. 24-A, ...

  8. Mead features fermented by Saccharomyces cerevisiae (lalvin k1 ...

    African Journals Online (AJOL)

    Alcoholic beverages are produced practically in every country in the world representing a significant percentage of the economy. Mead is one of the oldest beverages and it is easily obtained by the fermentation of a mixture of honey and water. However, it is still less studied compared to other beverages and does not have ...

  9. Harry Stack Sullivan Colloquium: George Herbert Mead and Harry Stack Sullivan: an unfinished synthesis.

    Science.gov (United States)

    Cottrell, L S

    1978-05-01

    How do you create a new self? However he may phrase this question, it is a central theoretical and practical concern of the therapist every time he confronts a client who comes to him for help. What are the processes out of which the human self emerges? However he may phrase the question, it is a central concern of the social psychologist. The obvious convergence of interests indicated by these two questions should occasion no surprise among students of Sullivan and Mead. What perhaps should be surprising is that an effective synthesis of their theories has progressed no further than it has to date. My remarks today are based on the conviction that a more adequate psychiatric theory and practice and a more complete social psychological theory and research program depend on such a synthesis. Behavioral scientists concerned with the development of a truly interactionist social psychology are, I believe, generally agreed that George Herbert Mead (1863-1931), philosopher and social psychologist, and Harry Stack Sullivan (1892-1949), psychiatrist and social psychologist, have laid conceptual foundations upon which such a discipline can be erected. Now a vast assortment of activities is tagged as social psychology and its boundaries are, indeed, difficult to draw. However, for our present purposes we can define its focus as the study of the processes and products of inter- and intrapersonal and inter- and intragroup interaction, let the boundaries fall where they will.

  10. Simulation of groundwater flow to assess future withdrawals associated with Base Realignment and Closure (BRAC) at Fort George G. Meade, Maryland

    Science.gov (United States)

    Raffensperger, Jeff P.; Fleming, Brandon J.; Banks, William S.L.; Horn, Marilee A.; Nardi, Mark R.; Andreasen, David C.

    2010-01-01

    Increased groundwater withdrawals from confined aquifers in the Maryland Coastal Plain to supply anticipated growth at Fort George G. Meade (Fort Meade) and surrounding areas resulting from the Department of Defense Base Realignment and Closure Program may have adverse effects in the outcrop or near-outcrop areas. Specifically, increased pumping from the Potomac Group aquifers (principally the Patuxent aquifer) could potentially reduce base flow in small streams below rates necessary for healthy biological functioning. Additionally, water levels may be lowered near, or possibly below, the top of the aquifer within the confined-unconfined transition zone near the outcrop area. A three-dimensional groundwater flow model was created to incorporate and analyze data on water withdrawals, streamflow, and hydraulic head in the region. The model is based on an earlier model developed to assess the effects of future withdrawals from well fields in Anne Arundel County, Maryland and surrounding areas, and includes some of the same features, including model extent, boundary conditions, and vertical discretization (layering). The resolution (horizontal grid discretization) of the earlier model limited its ability to simulate the effects of withdrawals on the outcrop and near-outcrop areas. The model developed for this study included a block-shaped higher-resolution local grid, referred to as the child model, centered on Fort Meade, which was coupled to the coarser-grid parent model using the shared node Local Grid Refinement capability of MODFLOW-LGR. A more detailed stream network was incorporated into the child model. In addition, for part of the transient simulation period, stress periods were reduced in length from 1 year to 3 months, to allow for simulation of the effects of seasonally varying withdrawals and recharge on the groundwater-flow system and simulated streamflow. This required revision of the database on withdrawals and estimation of seasonal variations in

  11. Estimating Supplies Program (ESP), Version 1.00, User's Guide

    National Research Council Canada - National Science Library

    Tropeano, Anne

    2000-01-01

    The Estimating Supplies Program (ESP) is an easy-to-use Windows™-based software program for military medical providers, planners, and trainers that calculates the amount of supplies and equipment required to treat a patient stream...

  12. Appendix C: Biomass Program inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  13. Hans Joas & Daniel R. Huebner (eds.), The Timeliness of George Herbert Mead

    OpenAIRE

    Baggio, Guido

    2018-01-01

    The Timeliness of George Herbert Mead is a significant contribution to the recent “Mead renaissance.” It gathers some contributions first presented at the conference celebrating the 150th anniversary of the birth of George Herbert Mead, held in April 2013 at the University of Chicago and organized by Hans Joas, Andrew Abbott, Daniel Huebner, and Christopher Takacs. The volume brings scholarship on G. H. Mead up to date, highlighting Mead’s relevance for areas of research completely ignored by p...

  14. UA criterion for NPP safety estimation SALP program

    International Nuclear Information System (INIS)

    Gorynina, L.V.; Tishchenko, V.A.

    1992-01-01

    The SALP program adopted by the NRC is considered. The program is intended for the acquisition and evaluation of data on the activities of firms holding licences for NPP operation and/or construction. The criteria for evaluation and the mechanism for determining the rating of a firm's activity quality are discussed

  15. Hazardous Waste Minimization Assessment: Fort Meade, MD

    Science.gov (United States)

    1991-01-01

    Contents include: History/Geography; Tenants; Environmental Programs; Air Pollution Control; Water Pollution Control; Radiation...; Solvent-Based Paints - Product Substitution: Powder Coatings; Solvent-Based Paints - Product Substitution: Water-Based Formulations; Solvent-Ba...

  16. The Use of the Nelder-Mead Method in Determining Projection Parameters for Globe Photographs

    Science.gov (United States)

    Gede, M.

    2009-04-01

    A photo of a terrestrial or celestial globe can be handled as a map. The only hard issue is its projection: the so-called Tilted Perspective Projection, which, if the optical axis of the photo intersects the globe's centre, simplifies to the Vertical Near-Side Perspective Projection. When georeferencing such a photo, the exact parameters of the projection are also needed. These parameters depend on the position of the viewpoint of the camera. Several hundred globe photos had to be georeferenced during the Virtual Globes Museum project, which made it necessary to automate the calculation of the projection parameters. The author developed a program for this task which uses the Nelder-Mead method to find the optimum parameters when a set of control points is given as input. The Nelder-Mead method is a numerical algorithm for minimizing a function in a many-dimensional space. The function in the present application is the average error of the control points calculated from the actual values of the parameters. The parameters are the geographical coordinates of the projection centre, the image coordinates of the same point, the rotation of the projection, the height of the perspective point and the scale of the photo (calculated in pixels/km). The program reads Global Mapper's Ground Control Point (.GCP) file format as input and creates projection description files (.PRJ) for the same software. The initial values of the geographical coordinates of the projection centre are calculated as the average of the control points, while the other parameters are set to experiential values which represent the most common circumstances of taking a globe photograph. The algorithm runs until the change in the parameters sinks below a pre-defined limit. The minimum search can be refined by using the previous result parameter set as new initial values. This paper introduces the calculation mechanism and examples of its usage. Other possible uses of the method are...
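
    The structure of that fitting loop looks roughly as follows; the projection function here is a simplified stand-in (the real program evaluates the Tilted/Vertical Near-Side Perspective equations), and the control points and initial values are hypothetical.

```python
# Nelder-Mead fit of projection parameters to control points. The projection
# below is a simplified stand-in for the true perspective equations; 'height'
# would enter those equations but is unused in this stub.
import numpy as np
from scipy.optimize import minimize

def project(lon, lat, params):
    lon0, lat0, x0, y0, rot, height, scale = params
    x = scale * (lon - lon0) * np.cos(np.radians(lat0))
    y = scale * (lat - lat0)
    c, s = np.cos(rot), np.sin(rot)
    return x0 + c * x - s * y, y0 + s * x + c * y

def mean_error(params, lon, lat, px, py):
    x, y = project(lon, lat, params)
    return np.mean(np.hypot(x - px, y - py))  # average control-point error

# Hypothetical control points: geographic coordinates and pixel positions.
lon = np.array([10.0, 20.0, 30.0, 40.0]); lat = np.array([40.0, 45.0, 50.0, 55.0])
px = np.array([512.0, 600.0, 690.0, 770.0]); py = np.array([400.0, 355.0, 310.0, 262.0])

# Initial centre at the control-point average, as the abstract describes.
x0 = [lon.mean(), lat.mean(), px.mean(), py.mean(), 0.0, 6.6, 9.0]
res = minimize(mean_error, x0, args=(lon, lat, px, py), method="Nelder-Mead")
print(res.x)  # refine by restarting from res.x, as the paper suggests
```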

  17. Getting an "Inside": The Role of Objects in Mead's Theory of Self.

    Science.gov (United States)

    McCarthy, E. Doyle

    The paper examines George Herbert Mead's account of the individual's relation to the physical world. Mead (1863-1931) taught social psychology and philosophy at the University of Chicago from 1893-1931 and is best known for his theory of self. This theory maintains that the self is formed in a particular historical context and that it includes…

  18. The Texture of Educational Inquiry: An Exploration of George Herbert Mead's Concept of the Scientific.

    Science.gov (United States)

    Franzosa, Susan Douglas

    1984-01-01

    Explores the implications of Mead's philosophic social psychology for current disputes concerning the nature of the scientific in educational studies. Mead's contextualization of the knower and the known are found to be compatible with a contemporary critique of positivist paradigms and a critical reconceptualization of educational inquiry.…

  19. George Herbert Mead's Lecture on Philosophy of Education at the University of Chicago (1910-1911).

    Science.gov (United States)

    Biesta, Gert J. J.

    This paper recounts the influence of two of the great educational philosophers of this century, John Dewey and George Herbert Mead. Both men came to the University of Chicago from teaching at the University of Michigan. The men were life-long personal friends and professional colleagues. Although Mead published little during his life, his…

  20. Educating Communal Agents: Building on the Perspectivism of G.H. Mead

    Science.gov (United States)

    Martin, Jack

    2007-01-01

    In their search for more communal forms of agency that might guide education, contemporary educational psychologists have mostly neglected the theorizing of George Herbert Mead. In this essay, Jack Martin aims to remedy such oversight by interpreting Mead's social-psychological and educational theorizing of selfhood and agency through the lenses…

  1. 33 CFR 162.220 - Hoover Dam, Lake Mead, and Lake Mohave (Colorado River), Ariz.-Nev.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Hoover Dam, Lake Mead, and Lake... REGULATIONS § 162.220 Hoover Dam, Lake Mead, and Lake Mohave (Colorado River), Ariz.-Nev. (a) Lake Mead and... the axis of Hoover Dam and that portion of Lake Mohave (Colorado River) extending 4,500 feet...

  2. EDIN0613P weight estimating program. [for launch vehicles

    Science.gov (United States)

    Hirsch, G. N.

    1976-01-01

    The weight estimating relationships and program developed for space power system simulation are described. The program was developed to size a two-stage launch vehicle for the space power system. The program is part of an overall simulation system called EDIN (Engineering Design and Integration). The program sizes the overall vehicle, generates major component weights and derives a large amount of overall vehicle geometry. The program is written in FORTRAN V and is designed for use on the Univac Exec 8 (1110). By utilizing the flexibility of this program while remaining cognizant of the limits that generalized input imposes on output depth and accuracy, this program concept can be a useful tool for estimating purposes at the conceptual design stage of a launch vehicle.

  3. KERNELHR: A program for estimating animal home ranges

    Science.gov (United States)

    Seaman, D.E.; Griffith, B.; Powell, R.A.

    1998-01-01

    Kernel methods are state of the art for estimating animal home-range area and utilization distribution (UD). The KERNELHR program was developed to provide researchers and managers a tool to implement this extremely flexible set of methods with many variants. KERNELHR runs interactively or from the command line on any personal computer (PC) running DOS. KERNELHR provides output of fixed and adaptive kernel home-range estimates, as well as density values in a format suitable for in-depth statistical and spatial analyses. An additional package of programs creates contour files for plotting in geographic information systems (GIS) and estimates core areas of ranges.
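
    A minimal fixed-kernel sketch conveys the idea (this is not KERNELHR, and the location data are hypothetical): estimate the utilization distribution with a kernel density, then find the density contour enclosing 95% of it.

```python
# Fixed-kernel utilization distribution and 95% home-range area (a sketch,
# not KERNELHR; locations are synthetic, bandwidth from Scott's rule).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
locs = rng.normal(loc=[0.0, 0.0], scale=[500.0, 300.0], size=(200, 2))  # metres

kde = gaussian_kde(locs.T)  # fixed (non-adaptive) kernel

xs = np.linspace(locs[:, 0].min() - 500, locs[:, 0].max() + 500, 200)
ys = np.linspace(locs[:, 1].min() - 500, locs[:, 1].max() + 500, 200)
gx, gy = np.meshgrid(xs, ys)
dens = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

# Density threshold whose superlevel set holds 95% of the distribution.
cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
flat = np.sort(dens.ravel())[::-1]
cum = np.cumsum(flat) * cell
idx = min(np.searchsorted(cum, 0.95), flat.size - 1)
print("95% home-range area (m^2):", (dens >= flat[idx]).sum() * cell)
```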

  4. Geohydrology of the Unconsolidated Valley-Fill Aquifer in the Meads Creek Valley, Schuyler and Steuben Counties, New York

    Science.gov (United States)

    Miller, Todd S.; Bugliosi, Edward F.; Reddy, James E.

    2008-01-01

    The Meads Creek valley encompasses 70 square miles of predominantly forested uplands in the upper Susquehanna River drainage basin. The valley, which was listed as a Priority Waterbody by the New York State Department of Environmental Conservation in 2004, is prone to periodic flooding, mostly in its downstream end, where development is occurring most rapidly. Hydraulic characteristics of the unconsolidated valley-fill aquifer were evaluated, and seepage rates in losing and gaining tributaries were calculated or estimated, in an effort to delineate the aquifer geometry and identify the factors that contribute to flooding. Results indicated that (1) Meads Creek gained about 61 cubic feet of flow per second (about 6.0 cubic feet per second per mile of stream channel) from ground-water discharge and inflow from tributaries in its 10.2-mile reach between the northernmost and southernmost measurement sites; (2) major tributaries in the northern part of the valley are not significant sources of recharge to the aquifer; and (3) major tributaries in the central and southern part of the valley provide recharge to the aquifer. The ground-water portion of streamflow in Meads Creek (excluding tributary inflow) was 11.3 cubic feet per second (ft3/s) in the central part of the valley and 17.2 ft3/s in the southern part - a total of 28.5 ft3/s. Ground-water levels were measured in 29 wells finished in unconfined deposits for construction of a potentiometric-surface map to depict directions of ground-water flow within the valley. In general, ground water flows from the edges of the valley toward Meads Creek and ultimately discharges to it. The horizontal hydraulic gradient for the entire 12-mile-long aquifer averages about 30 feet per mile, whereas the gradient in the southern fourth of the valley averages about half that - about 17 feet per mile. A water budget for the aquifer indicated that 28 percent of recharge was derived from precipitation that falls on the aquifer, 32

  5. Development of computer program for estimating decommissioning cost - 59037

    International Nuclear Information System (INIS)

    Kim, Hak-Soo; Park, Jong-Kil

    2012-01-01

    Programs for estimating decommissioning cost have been developed for many different purposes and applications. The estimation of decommissioning cost requires a large amount of data, such as unit cost factors, plant area and its inventory, waste treatment, etc. This makes it difficult to use manual calculation or typical spreadsheet software such as Microsoft Excel. The cost estimation for eventual decommissioning of nuclear power plants is a prerequisite for safe, timely and cost-effective decommissioning. To estimate the decommissioning cost more accurately and systematically, KHNP (Korea Hydro and Nuclear Power Co. Ltd) developed a decommissioning cost estimating computer program called 'DeCAT-Pro' (Decommissioning Cost Assessment Tool - Professional; hereinafter called 'DeCAT'). This program allows users to easily assess the decommissioning cost with various decommissioning options. It also provides detailed reporting of decommissioning funding requirements, as well as detailed project schedules, cash flow, staffing plans and levels, and waste volumes by waste classification and type. KHNP is planning to implement functions for estimating the plant inventory using 3-D technology and for classifying the conditions of radwaste disposal and transportation automatically. (authors)
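
    At its core, a unit-cost-factor estimate is a sum of factor-times-inventory products; the sketch below illustrates that arithmetic with made-up factors and quantities, and is not DeCAT's actual data or algorithm.

```python
# Generic unit-cost-factor roll-up (hypothetical numbers; not DeCAT's data).
unit_cost = {                       # $ per unit of work
    "concrete_demolition_m3": 250.0,
    "steel_cutting_t": 900.0,
    "surface_decontamination_m2": 40.0,
}
inventory = {                       # plant inventory by work category
    "concrete_demolition_m3": 12_000,
    "steel_cutting_t": 3_500,
    "surface_decontamination_m2": 80_000,
}
base = sum(unit_cost[k] * inventory[k] for k in inventory)
contingency = 0.25                  # assumed planning contingency
print(f"estimated decommissioning cost: ${base * (1 + contingency):,.0f}")
```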

  6. [George Herbert Mead. Thought as the conversation of interior gestures].

    Science.gov (United States)

    Quéré, Louis

    2010-01-01

    For George Herbert Mead, thinking amounts to holding an "inner conversation of gestures". Such a conception does not seem especially original at first glance. What makes it truly original is the "social-behavioral" approach of which it is a part, and particularly two ideas. The first is that the conversation in question is a conversation of gestures or attitudes, and the second, that thought and reflexive intelligence arise from the internalization of an external process supported by the social mechanism of communication: that of conduct organization. It is important, then, to understand what distinguishes such ideas from those of the founder of behavioral psychology, John B. Watson, for whom thinking amounts to nothing other than subvocal speech.

  7. The QUELCE Method: Using Change Drivers to Estimate Program Costs

    Science.gov (United States)

    2016-08-01

    ...Assign Conditional Probabilities...Apply Uncertainty to Cost Formula Inputs for Scenarios...Perform Monte Carlo Simulation to...Introduction: The Cost Estimation Challenge. Because large-scale programs...challenged [Bliss 2012]. Improvements in cost estimation that would make these assumptions more precise and reduce early lifecycle uncertainty can...

  8. Population Estimation with Mark and Recapture Method Program

    International Nuclear Information System (INIS)

    Limohpasmanee, W.; Kaewchoung, W.

    1998-01-01

    Population estimation is important information required for insect control planning, especially control with SIT. Moreover, it can be used to evaluate the efficiency of a control method. Due to the complexity of the calculations, population estimation with mark and recapture methods has not been widely used. This program was therefore developed in QBasic with the purpose of making the calculations more accurate and easier. The program implements six methods, following Seber's, Jolly-Seber's, Jackson's, Ito's, Hamada's and Yamamura's methods. The results were compared with those of the original methods and found to be accurate and easier to apply.
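
    The simplest two-sample case conveys the principle; the sketch below shows the Lincoln-Petersen estimate in Chapman's nearly unbiased form, whereas the program described above implements the more elaborate multi-sample methods (Jolly-Seber and others).

```python
# Two-sample mark-recapture estimate (Lincoln-Petersen, Chapman's form).
def chapman_estimate(marked, caught, recaptured):
    """marked: animals marked and released in sample 1 (M);
    caught: animals taken in sample 2 (C);
    recaptured: marked animals among sample 2 (R)."""
    return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

# Example: mark 200 insects; later catch 150, of which 30 carry marks.
print(chapman_estimate(200, 150, 30))  # about 978 individuals
```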

  9. Estimating the Impact of Low-Income Universal Service Programs

    OpenAIRE

    Daniel A. Ackerberg; David R. DeRemer; Michael H. Riordan; Gregory L. Rosston; Bradley S. Wimmer

    2013-01-01

    This policy study uses U.S. Census microdata to evaluate how subsidies for universal telephone service vary in their impact across low-income racial groups, gender, age, and home ownership. Our demand specification includes both the subsidized monthly price (Lifeline program) and the subsidized initial connection price (Linkup program) for local telephone service. Our quasi-maximum likelihood estimation controls for location differences and instruments for price endogeneity. The microdata all...

  10. A sociohistorical examination of George Herbert Mead's approach to science education.

    Science.gov (United States)

    Edwards, Michelle L

    2016-07-01

    Although George Herbert Mead is widely known for his social psychological work, his views on science education also represent a significant, yet sometimes overlooked contribution. In a speech delivered in March 1906 entitled "The Teaching of Science in College," Mead calls for cultural courses on the sciences, such as sociology of science or history of science courses, to increase the relevancy of natural and physical science courses for high school and university students. These views reflect Mead's perspective on a number of traditional dualisms, including objectivity versus subjectivity and the social sciences versus natural and physical sciences. Taking a sociohistorical outlook, I identify the context behind Mead's approach to science education, which includes three major influences: (1) German intellectual thought and the Methodenstreit debate, (2) pragmatism and Darwin's theory of evolution, and (3) social reform efforts in Chicago and the General Science Movement. © The Author(s) 2014.

  11. Performance investigation of an advanced multi-effect adsorption desalination (MEAD) cycle

    KAUST Repository

    Thu, Kyaw; Kim, Young Deuk; Shahzad, Muhammad Wakil; Saththasivam, Jayaprakash; Ng, Kim Choon

    2015-01-01

    This article presents the development of an advanced adsorption desalination system with quantum performance improvement. The proposed multi-effect adsorption desalination (MEAD) cycle utilizes a single heat source, i.e., low-temperature hot water

  12. A technical report on structural evaluation of the Meade County reinforced concrete bridge.

    Science.gov (United States)

    2009-01-01

    This is a technical report on the first phase of the evaluation of the Meade County reinforced concrete bridge. The first three chapters introduce the main problem and provide a general review of the existing evaluation methods and the procedures...

  13. George Herbert Mead and Sören Kierkegaard as theorists of the self

    DEFF Research Database (Denmark)

    Willert, Søren

    The self concepts of Mead and Kierkegaard respectively show striking similarities. A comparative analysis of the two self concepts is carried out. Similarities are indeed present at a structural level. Clear-cut differences appear when semantic deep structures of the concepts used by the two thinkers are included in the analysis. The observed differences reflect the fact that they drew their inspiration from widely divergent intellectual traditions: Kierkegaard from Christian theology, Mead from a darwinistically inspired world-view.

  14. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program, HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, sample problem, and a program listing are also given

  15. George Herbert Mead: contribuições para a história da psicologia social / George Herbert Mead: contributions to the history of social psychology

    Directory of Open Access Journals (Sweden)

    Renato Ferreira de Souza

    2011-08-01

    Full Text Available This article aims to contribute to the historical understanding of an author/character of Psychology. We analyze, and add to, knowledge about George Herbert Mead and the developments of his psychosocial theory. To this end, we make explicit in the text one of the analytical approaches used in our dissertation, namely: through the social approach to the history of psychology, we confront Mead's life with moments in the constitution of psychology, highlighting central aspects of this dialogue that are not always identified. We correlate Mead's history with social, political, economic and scientific questions, as well as its connections with practices and cultural values specific to his time. We seek to understand his limited diffusion in psychological science, thus continuing the process of (re)turning to the author.

  16. SECPOP90: Sector population, land fraction, and economic estimation program

    Energy Technology Data Exchange (ETDEWEB)

    Humphreys, S.L.; Rollstin, J.A.; Ridgely, J.N.

    1997-09-01

    In 1973 Mr. W. Athey of the Environmental Protection Agency wrote a computer program called SECPOP which calculated population estimates. Since that time, two things have changed which suggested the need for updating the original program - more recent population censuses and the widespread use of personal computers (PCs). The revised computer program uses the 1990 and 1992 Population Census information and runs on current PCs as "SECPOP90." SECPOP90 consists of two parts: site and regional. The site provides population and economic data estimates for any location within the continental United States. Siting analysis is relatively fast running. The regional portion assesses site availability for different siting policy decisions; i.e., the impact of available sites given specific population density criteria within the continental United States. Regional analysis is slow. This report compares the SECPOP90 population estimates and the nuclear power reactor licensee-provided information. Although the source, and therefore the accuracy, of the licensee information is unknown, this comparison suggests SECPOP90 makes reasonable estimates. Given the total uncertainty in any current calculation of severe accidents, including the potential offsite consequences, the uncertainty within SECPOP90 population estimates is expected to be insignificant. 12 refs., 55 figs., 7 tabs.

  17. SECPOP90: Sector population, land fraction, and economic estimation program

    International Nuclear Information System (INIS)

    Humphreys, S.L.; Rollstin, J.A.; Ridgely, J.N.

    1997-09-01

    In 1973 Mr. W. Athey of the Environmental Protection Agency wrote a computer program called SECPOP which calculated population estimates. Since that time, two things have changed which suggested the need for updating the original program - more recent population censuses and the widespread use of personal computers (PCs). The revised computer program uses the 1990 and 1992 Population Census information and runs on current PCs as "SECPOP90." SECPOP90 consists of two parts: site and regional. The site provides population and economic data estimates for any location within the continental United States. Siting analysis is relatively fast running. The regional portion assesses site availability for different siting policy decisions; i.e., the impact of available sites given specific population density criteria within the continental United States. Regional analysis is slow. This report compares the SECPOP90 population estimates and the nuclear power reactor licensee-provided information. Although the source, and therefore the accuracy, of the licensee information is unknown, this comparison suggests SECPOP90 makes reasonable estimates. Given the total uncertainty in any current calculation of severe accidents, including the potential offsite consequences, the uncertainty within SECPOP90 population estimates is expected to be insignificant. 12 refs., 55 figs., 7 tabs

  18. Temporally stratified sampling programs for estimation of fish impingement

    International Nuclear Information System (INIS)

    Kumar, K.D.; Griffith, J.S.

    1977-01-01

    Impingement monitoring programs often expend valuable and limited resources and fail to provide a dependable estimate of either total annual impingement or those biological and physicochemical factors affecting impingement. In situations where initial monitoring has identified "problem" fish species and the periodicity of their impingement, intensive sampling during periods of high impingement will maximize information obtained. We use data gathered at two nuclear generating facilities in the southeastern United States to discuss techniques of designing such temporally stratified monitoring programs and their benefits and drawbacks. Of the possible temporal patterns in environmental factors within a calendar year, differences among seasons are most influential in the impingement of freshwater fishes in the Southeast. Data on the threadfin shad (Dorosoma petenense) and the role of seasonal temperature changes are utilized as an example to demonstrate ways of most efficiently and accurately estimating impingement of the species.
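
    The stratified design described above maps onto the textbook stratified-sampling estimator: expand each stratum's sample mean by the number of nights in the stratum and sum, with a finite-population-corrected variance. The sketch below is a minimal illustration with invented seasonal strata and nightly counts, not the study's data.

```python
import math

# Hypothetical seasonal strata for one year of impingement sampling:
# (stratum name, nights in stratum N_h, sampled nightly counts y_hi)
strata = [
    ("winter", 90, [420, 510, 380, 460, 600]),   # high threadfin shad impingement
    ("spring", 92, [150, 90, 130, 110]),
    ("summer", 92, [20, 35, 15, 25]),
    ("autumn", 91, [60, 80, 55, 70]),
]

total, var = 0.0, 0.0
for name, N_h, sample in strata:
    n_h = len(sample)
    mean_h = sum(sample) / n_h
    s2_h = sum((y - mean_h) ** 2 for y in sample) / (n_h - 1)
    total += N_h * mean_h                              # expand stratum mean to a stratum total
    var += N_h ** 2 * (1 - n_h / N_h) * s2_h / n_h     # finite-population-corrected variance

se = math.sqrt(var)
print(f"Estimated annual impingement: {total:.0f} fish (SE {se:.0f})")
```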

  19. A remark on empirical estimates in multistage stochastic programming

    Czech Academy of Sciences Publication Activity Database

    Kaňková, Vlasta

    2002-01-01

    Roč. 9, č. 17 (2002), s. 31-50 ISSN 1212-074X R&D Projects: GA ČR GA402/01/0539; GA ČR GA402/02/1015; GA ČR GA402/01/0034 Institutional research plan: CEZ:AV0Z1075907 Keywords : multistage stochastic programming * empirical estimates * Markov dependence Subject RIV: BB - Applied Statistics, Operational Research

  20. A philosophical examination of Mead's pragmatist constructivism as a referent for adult science education

    Science.gov (United States)

    Furbish, Dean Russel

    The purpose of this study is to examine pragmatist constructivism as a science education referent for adult learners. Specifically, this study seeks to determine whether George Herbert Mead's doctrine, which conflates pragmatist learning theory and philosophy of natural science, might facilitate (a) scientific concept acquisition, (b) learning scientific methods, and (c) preparation of learners for careers in science and science-related areas. A philosophical examination of Mead's doctrine in light of these three criteria has determined that pragmatist constructivism is not a viable science education referent for adult learners. Mead's pragmatist constructivism does not portray scientific knowledge or scientific methods as they are understood by practicing scientists themselves, that is, according to scientific realism. Thus, employment of pragmatist constructivism does not adequately prepare future practitioners for careers in science-related areas. Mead's metaphysics does not allow him to commit to the existence of the unobservable objects of science such as molecular cellulose or mosquito-borne malarial parasites. Mead's anti-realist metaphysics also affects his conception of scientific methods. Because Mead does not commit existentially to the unobservable objects of realist science, Mead's science does not seek to determine what causal role if any the hypothetical objects that scientists routinely posit while theorizing might play in observable phenomena. Instead, constructivist pragmatism promotes subjective epistemology and instrumental methods. The implication for learning science is that students are encouraged to derive scientific concepts based on a combination of personal experience and personal meaningfulness. Contrary to pragmatist constructivism, however, scientific concepts do not arise inductively from subjective experience driven by personal interests. The broader implication of this study for adult education is that the philosophically laden

  1. Notas sobre a presença de Mead na obra de Habermas Notes on Mead's presence in the work of Habermas

    Directory of Open Access Journals (Sweden)

    Luciana Aparecida de Araújo Penitente

    2013-01-01

    Full Text Available Habermas considers the question of individuation and socialization on the basis of the studies of George Herbert Mead, who, in his view, was the first to reflect substantially on a model of a socially produced self. Mead offers the theoretical groundwork for the development of a theory of human evolution that involves the processes of individuation and socialization. Through the paradigm of mutual understanding, that is, of the intersubjective relation of individuals who are socialized through communication and recognize one another mutually, Mead makes possible the shift from the paradigm of self-consciousness, the self-reference of a subject acting in isolation, to that of the individual who engages in social exchanges through language. Hence one of the main components of Mead's theory, from which Habermas draws for his Theory of Communicative Action, is the process of constitution of the "I", its identity. Mead takes individuation to be a process that is linguistically mediated by socialization and by the construction of a life history in which subjects are conscious of themselves. It is this linguistic medium established between subjects, together with the medium of intrasubjective and life-historical understanding, that makes possible the formation of an identity of socialized subjects. It is intersubjective recognition and intersubjectively mediated self-understanding that foster the formation of identity. This conceptual framework is fundamental to Habermas's notion of the post-conventional self.

  2. Estimated emission reductions from California's enhanced Smog Check program.

    Science.gov (United States)

    Singer, Brett C; Wenzel, Thomas P

    2003-06-01

    The U.S. Environmental Protection Agency requires that states evaluate the effectiveness of their vehicle emissions inspection and maintenance (I/M) programs. This study demonstrates an evaluation approach that estimates mass emission reductions over time and includes the effect of I/M on vehicle deterioration. It includes a quantitative assessment of benefits from pre-inspection maintenance and repairs and accounts for the selection bias effect that occurs when intermittent high emitters are tested. We report estimates of one-cycle emission benefits of California's Enhanced Smog Check program, ca. 1999. Program benefits equivalent to metric tons per day of prevented emissions were calculated with a "bottom-up" approach that combined average per vehicle reductions in mass emission rates (g/gal) with average per vehicle activity, resolved by model year. Accelerated simulation mode test data from the statewide vehicle information database (VID) and from roadside Smog Check testing were used to determine 2-yr emission profiles of vehicles passing through Smog Check and infer emission profiles that would occur without Smog Check. The number of vehicles participating in Smog Check was also determined from the VID. We estimate that in 1999 Smog Check reduced tailpipe emissions of HC, CO, and NOx by 97, 1690, and 81 t/d, respectively. These correspond to 26, 34, and 14% of the HC, CO, and NOx that would have been emitted by vehicles in the absence of Smog Check. These estimates are highly sensitive to assumptions about vehicle deterioration in the absence of Smog Check. Considering the estimated uncertainty in these assumptions yields a range for calculated benefits: 46-128 t/d of HC, 860-2200 t/d of CO, and 60-91 t/d of NOx. Repair of vehicles that failed an initial, official Smog Check appears to be the most important mechanism of emission reductions, but pre-inspection maintenance and repair also contributed substantially. Benefits from removal of nonpassing
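
    The "bottom-up" roll-up described above reduces to a few lines of arithmetic. The sketch below uses invented per-vehicle reductions, fuel use, and fleet counts by model year, not the study's values, to show how g/gal reductions scale to metric tons per day.

```python
# Hypothetical bottom-up benefit roll-up: per-vehicle emission-rate reductions
# (g/gal) times per-vehicle fuel use, resolved by model year, summed to t/d.
reductions_g_per_gal = {1985: 12.0, 1990: 8.0, 1995: 4.0}   # assumed HC reductions
fuel_gal_per_day = {1985: 1.6, 1990: 1.8, 1995: 2.0}        # assumed average activity
fleet_counts = {1985: 400_000, 1990: 900_000, 1995: 1_200_000}

tons_per_day = sum(
    reductions_g_per_gal[my] * fuel_gal_per_day[my] * fleet_counts[my]
    for my in fleet_counts
) / 1e6  # grams -> metric tons
print(f"Fleet HC benefit: {tons_per_day:.1f} t/d")
```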

  3. Evaporation from Lake Mead, Nevada and Arizona, March 2010 through February 2012

    Science.gov (United States)

    Moreo, Michael T.; Swancar, Amy

    2013-01-01

    Evaporation from Lake Mead was measured using the eddy-covariance method for the 2-year period starting March 2010 and ending February 2012. When corrected for energy imbalances, annual eddy-covariance evaporation was 2,074 and 1,881 millimeters (81.65 and 74.07 inches), within the range of previous estimates. There was a 9-percent decrease in the evaporation rate and a 10-percent increase in the lake surface area during the second year of the study compared to the first. These offsetting factors resulted in a nearly identical 720 million cubic meters (584,000 acre feet) evaporation volume for both years. Monthly evaporation rates were best correlated with wind speed, vapor pressure difference, and atmospheric stability. Differences between individual monthly evaporation and mean monthly evaporation were as much as 20 percent. Net radiation provided most of the energy available for evaporative processes; however, advected heat from the Colorado River was an important energy source during the second year of the study. Peak evaporation lagged peak net radiation by 2 months because a larger proportion of the net radiation that reaches the lake goes to heating up the water column during the spring and summer months. As most of this stored energy is released, higher evaporation rates are sustained during fall months even though net radiation declines. The release of stored heat also fueled nighttime evaporation, which accounted for 37 percent of total evaporation. The annual energy-balance ratio was 0.90 on average and varied only 0.01 between the 2 years, thus implying that 90 percent of estimated available energy was accounted for by turbulent energy measured using the eddy-covariance method. More than 90 percent of the turbulent-flux source area represented the open-water surface, and 94 percent of 30-minute turbulent-flux measurements originated from wind directions where the fetch ranged from 2,000 to 16,000 meters. Evaporation uncertainties were estimated to be 5
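
    The energy-balance correction mentioned above is standard practice: the eddy-covariance latent heat flux is divided by the energy-balance ratio so that the turbulent fluxes close the available-energy budget. A minimal sketch with illustrative flux values, not the study's measurements:

```python
# Energy-balance correction of eddy-covariance evaporation (values are
# illustrative placeholders; advective terms such as river inflow omitted).
LE = 100.0   # latent heat flux from eddy covariance, W/m2
H = 35.0     # sensible heat flux, W/m2
Rn = 160.0   # net radiation, W/m2
dS = 10.0    # change in heat storage in the water column, W/m2

ebr = (LE + H) / (Rn - dS)          # energy-balance ratio, ~0.90 in the study
LE_corrected = LE / ebr             # force turbulent fluxes to close the budget
lambda_v = 2.45e6                   # latent heat of vaporization, J/kg
evap_mm_per_day = LE_corrected / lambda_v * 86400  # kg/m2/s -> mm/day
print(f"EBR = {ebr:.2f}, corrected evaporation = {evap_mm_per_day:.2f} mm/day")
```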

  4. Estimating Arrhenius parameters using temperature programmed molecular dynamics

    International Nuclear Information System (INIS)

    Imandi, Venkataramana; Chatterjee, Abhijit

    2016-01-01

    Kinetic rates at different temperatures and the associated Arrhenius parameters, whenever Arrhenius law is obeyed, are efficiently estimated by applying maximum likelihood analysis to waiting times collected using the temperature programmed molecular dynamics method. When transitions involving many activated pathways are available in the dataset, their rates may be calculated using the same collection of waiting times. Arrhenius behaviour is ascertained by comparing rates at the sampled temperatures with ones from the Arrhenius expression. Three prototype systems with corrugated energy landscapes, namely, solvated alanine dipeptide, diffusion at the metal-solvent interphase, and lithium diffusion in silicon, are studied to highlight various aspects of the method. The method becomes particularly appealing when the Arrhenius parameters can be used to find rates at low temperatures where transitions are rare. Systematic coarse-graining of states can further extend the time scales accessible to the method. Good estimates for the rate parameters are obtained with 500-1000 waiting times.
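
    The estimation chain described here, maximum-likelihood rates from exponentially distributed waiting times followed by a linear Arrhenius fit, can be sketched in a few lines. The data below are synthetic, with an assumed prefactor and barrier; k_hat = n / sum(t_i) is the exponential-rate MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
kB = 8.617e-5  # Boltzmann constant, eV/K

# Synthetic waiting-time data at several temperatures (assumed Arrhenius process)
A_true, Ea_true = 1e13, 0.5  # prefactor (1/s) and barrier (eV), illustrative
temps = np.array([300.0, 400.0, 500.0, 600.0])
k_hat = []
for T in temps:
    k = A_true * np.exp(-Ea_true / (kB * T))
    waits = rng.exponential(1.0 / k, size=800)      # ~500-1000 waiting times
    k_hat.append(len(waits) / waits.sum())          # MLE for an exponential rate

# Arrhenius fit: ln k = ln A - Ea/(kB*T) is linear in 1/T
slope, intercept = np.polyfit(1.0 / temps, np.log(k_hat), 1)
print(f"Ea ~ {-slope * kB:.3f} eV, A ~ {np.exp(intercept):.2e} 1/s")
```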

  5. Estimating Arrhenius parameters using temperature programmed molecular dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Imandi, Venkataramana; Chatterjee, Abhijit, E-mail: abhijit@che.iitb.ac.in [Department of Chemical Engineering, Indian Institute of Technology Bombay, Mumbai 400076 (India)

    2016-07-21

    Kinetic rates at different temperatures and the associated Arrhenius parameters, whenever Arrhenius law is obeyed, are efficiently estimated by applying maximum likelihood analysis to waiting times collected using the temperature programmed molecular dynamics method. When transitions involving many activated pathways are available in the dataset, their rates may be calculated using the same collection of waiting times. Arrhenius behaviour is ascertained by comparing rates at the sampled temperatures with ones from the Arrhenius expression. Three prototype systems with corrugated energy landscapes, namely, solvated alanine dipeptide, diffusion at the metal-solvent interphase, and lithium diffusion in silicon, are studied to highlight various aspects of the method. The method becomes particularly appealing when the Arrhenius parameters can be used to find rates at low temperatures where transitions are rare. Systematic coarse-graining of states can further extend the time scales accessible to the method. Good estimates for the rate parameters are obtained with 500-1000 waiting times.

  6. Cost estimation model for advanced planetary programs, fourth edition

    Science.gov (United States)

    Spadoni, D. J.

    1983-01-01

    The development of the planetary program cost model is discussed. The Model was updated to incorporate cost data from the most recent US planetary flight projects and extensively revised to more accurately capture the information in the historical cost data base. This data base is comprised of the historical cost data for 13 unmanned lunar and planetary flight programs. The revision was made with a twofold objective: to increase the flexibility of the model in its ability to deal with the broad scope of scenarios under consideration for future missions, and to maintain and possibly improve upon the confidence in the model's capabilities with an expected accuracy of 20%. The Model development included a labor/cost proxy analysis, selection of the functional forms of the estimating relationships, and test statistics. An analysis of the Model is discussed and two sample applications of the cost model are presented.
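
    Cost estimating relationships of the kind selected here are commonly power laws fit in log-log space. A minimal sketch with invented mass-cost pairs, not the model's historical data base:

```python
import numpy as np

# Hypothetical historical data: spacecraft dry mass (kg) vs. cost ($M)
mass = np.array([300.0, 450.0, 700.0, 1100.0, 1600.0])
cost = np.array([120.0, 160.0, 230.0, 330.0, 450.0])

# Fit a cost estimating relationship cost = a * mass^b by regression on logs
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)
print(f"CER: cost ~ {a:.1f} * mass^{b:.2f}")
print(f"Predicted cost for a 900 kg spacecraft: {a * 900**b:.0f} $M")
```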

  7. Multigene Genetic Programming for Estimation of Elastic Modulus of Concrete

    Directory of Open Access Journals (Sweden)

    Alireza Mohammadi Bayazidi

    2014-01-01

    Full Text Available This paper presents a new multigene genetic programming (MGGP) approach for estimation of elastic modulus of concrete. The MGGP technique models the elastic modulus behavior by integrating the capabilities of standard genetic programming and classical regression. The main aim is to derive precise relationships between the tangent elastic moduli of normal and high strength concrete and the corresponding compressive strength values. Another important contribution of this study is to develop a generalized prediction model for the elastic moduli of both normal and high strength concrete. Numerous concrete compressive strength test results are obtained from the literature to develop the models. A comprehensive comparative study is conducted to verify the performance of the models. The proposed models outperform the existing traditional models, as well as those derived using other powerful soft computing tools.
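
    As a rough illustration of symbolic regression on this problem, the sketch below fits an expression to synthetic strength-modulus data with the third-party gplearn package. Note this is only a single-gene stand-in: true MGGP evolves several gene trees and combines them by linear regression, which gplearn does not do. The ACI-style relation E ≈ 4.7·sqrt(fc) is assumed here only to generate plausible data.

```python
# Illustrative stand-in only: gplearn evolves a single symbolic expression,
# whereas MGGP linearly combines several evolved "genes".
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(1)
fc = rng.uniform(20, 100, size=(200, 1))               # compressive strength, MPa (synthetic)
E = 4.7 * np.sqrt(fc[:, 0]) + rng.normal(0, 0.5, 200)  # ACI-like E ~ 4.7*sqrt(fc), GPa

model = SymbolicRegressor(population_size=500, generations=20,
                          function_set=('add', 'sub', 'mul', 'sqrt'),
                          random_state=1)
model.fit(fc, E)
print(model._program)                                  # the evolved expression
```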

  8. George Herbert Mead, La Philosophie du temps en perspective(s)

    OpenAIRE

    Leclerc, Natalia

    2013-01-01

    The volume entitled La Philosophie du temps en perspective(s) gathers several texts by G. H. Mead, preceded by a substantial introduction by Michèle Leclerc-Olive, which presents "The figures of time in the philosophy of George Herbert Mead." Its pedagogical character is greatly appreciated: it allows the reader to enter progressively into Mead's thought, or rather into the different facets and stages of his reflection. The volume is extremely dense, and the present review does not...

  9. Homelessness in Modern Society: An Integration of Mead and Berger and Implications for a Paradigm of Mass Communication.

    Science.gov (United States)

    Jones, Charlotte

    George Herbert Mead's theory of mind, self, and society is synthesized in this paper, as is the extension of that basic theory by Peter Berger and Thomas Luckmann. The paper argues that Mead's functionalist perspective, while rich and internally consistent, is naive in that it lacks a theory of institutions, and it shows how Berger and Luckmann's…

  10. Redefining the Subject, Redefining the Social, Reconsidering Education: George Herbert Mead's Course on Philosophy of Education at the University of Chicago.

    Science.gov (United States)

    Biesta, Gert J. J.

    1999-01-01

    George Mead's posthumously published works express a genuine philosophy of education. This paper contributes to the reconstruction of Mead's educational philosophy, examining a typescript of student notes from his course on philosophy of education at the University of Chicago. The essay discusses the typescript against the backdrop of Mead's…

  11. Morphometric and histopathological parameters of gonadal development in adult common carp from contaminated and reference sites in Lake Mead, Nevada

    Science.gov (United States)

    Patino, R.; Goodbred, S.L.; Draugelis-Dale, R.; Barry, C.E.; Scott, Foott J.; Wainscott, M.R.; Gross, T.S.; Covay, K.J.

    2003-01-01

    This study examined the hypothesis that exposure to sublethal concentrations of contaminants alters the gonadal condition of feral common carp Cyprinus carpio. Adult common carp in Lake Mead, Nevada, were collected from a contaminated site (Las Vegas Bay) that receives municipal and industrial effluent and from a reference site (Overton Arm) with a relatively low level of contamination. Fish were sampled seven times over a 1-year period extending over two separate spawning seasons. Morphometric and histopathological parameters of gonadal and germ cell development were determined. In males, the pattern of seasonal changes in the gonadosomatic index (GSI) was similar between the sites and showed no clear association with site-specific seasonal temperature profiles. However, Las Vegas Bay males had consistently lower GSI values and, on one of the sampling dates, a lower proportion of sperm relative to other germ cell stages (determined histologically). Further, Las Vegas Bay males had a higher incidence of gonadal macrophage aggregates, which are putative tissue biomarkers of contaminant exposure in fishes. In females, seasonal GSI profiles, the frequency of fish with postovulatory follicles (an index of spawning activity), and the timing of new follicle recruitment all showed differences between sites, but these differences generally matched differences in water temperature profile. Also, the peak size-frequency of full-grown follicles did not differ between sites, and estimates of fecundity for the second spawning season indicated that females from the reference site unexpectedly produced a lower number of gametes. Overall, site differences in gonadal condition were observed in carp of both sexes but they seemed to be associated with site differences in contaminant levels only in males. The apparent lack of association between contaminant level and gonadal condition in female carp from mildly mesotrophic Lake Mead may indicate a lack of contaminant effects in

  12. 75 FR 16120 - Notice of Issuance of Exposure Draft on Accrual Estimates for Grant Programs

    Science.gov (United States)

    2010-03-31

    FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Issuance of Exposure Draft on Accrual Estimates for Grant Programs. AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. ... The Board has issued an exposure draft of a proposed Accounting Technical Release entitled Accrual Estimates for Grant Programs. The proposed Technical Release...

  13. The quagga mussel crisis at Lake Mead National Recreation Area, Nevada (U.S.A.).

    Science.gov (United States)

    Hickey, Valerie

    2010-08-01

    Parks are cornerstones of conservation, and non-native invasive species drive extensive changes to biological diversity in parks. Knowing this, national park staff at Lake Mead National Recreation Area in the southwestern United States had a program in place for early detection of the non-native, invasive quagga mussel (Dreissena rostriformis bugensis). Upon finding the mussel in January 2007, managers moved quickly to access funding and the best available science to implement a response. Managers considered four options--doing nothing, closing the park, restricting movement on the lakes, and educating park visitors and enforcing existing laws--and decided to focus on education and enforcement. Nonetheless, quagga spread throughout the park and soon began to appear throughout the western United States. I examined why efforts to control the expansion failed and determined the general lessons to be learned from this case. Concentrating human visitation on the lakes through land-use zoning opened a pathway for invasion, reduced management options, and led to the rapid spread of quagga. To reconcile competing mandates to protect nature and provide recreation, zoning in parks has become a common practice worldwide. It reduces stress on some areas of a park by restricting and thus concentrating human activity in particular areas. Concentrating the human activity in one area does three things: it cements pathways that repeatedly import and export vectors of non-native invasive species; it creates the disturbed area necessary to enable non-native invasive species to gain a foothold; and it establishes a source of invasions that, without appropriate controls, can quickly spread to a park's wilderness areas.

  14. Cost estimate for a proposed GDF Suez LNG testing program

    Energy Technology Data Exchange (ETDEWEB)

    Blanchat, Thomas K.; Brady, Patrick Dennis; Jernigan, Dann A.; Luketa, Anay Josephine; Nissen, Mark R.; Lopez, Carlos; Vermillion, Nancy; Hightower, Marion Michael

    2014-02-01

    At the request of GDF Suez, a Rough Order of Magnitude (ROM) cost estimate was prepared for the design, construction, testing, and data analysis for an experimental series of large-scale liquefied natural gas (LNG) spills on land and water that would result in the largest pool fires and vapor dispersion events ever conducted. Due to the expected cost of this large, multi-year program, the authors utilized Sandia's structured cost estimating methodology. This methodology ensures that the efforts identified can be performed for the cost proposed at a plus or minus 30 percent confidence. The scale of the LNG spill, fire, and vapor dispersion tests proposed by GDF could produce hazard distances and testing safety issues that need to be fully explored. Based on our evaluations, Sandia can utilize much of our existing fire testing infrastructure for the large fire tests and some small dispersion tests (with some modifications) in Albuquerque, but we propose to develop a new dispersion testing site at our remote test area in Nevada because of the large hazard distances. While this might impact some testing logistics, the safety aspects warrant this approach. In addition, we have included a proposal to study cryogenic liquid spills on water and subsequent vaporization in the presence of waves. Sandia is working with DOE on applications that provide infrastructure pertinent to wave production. We present an approach to conduct repeatable wave/spill interaction testing that could utilize such infrastructure.

  15. Hydrogeochemical and stream sediment reconnaissance basic data for Meade River quadrangle, Alaska

    International Nuclear Information System (INIS)

    1981-01-01

    Field and laboratory data are presented for 515 water samples from the Meade River Quadrangle, Alaska. The samples were collected by Los Alamos National Laboratory; laboratory analysis and data reporting were performed by the Uranium Resource Evaluation Project at Oak Ridge, Tennessee

  16. 75 FR 5115 - Temporary Concession Contract for Lake Mead National Recreation Area, AZ/NV

    Science.gov (United States)

    2010-02-01

    ... National Recreation Area, AZ/NV AGENCY: National Park Service, Department of the Interior. ACTION: Notice of intention to award temporary concession contract for Lake Mead National Recreation Area. SUMMARY: Pursuant to 36 CFR 51.24, public notice is hereby given that the National Park Service intends to award a...

  17. A technical report on structural evaluation of the Meade County reinforced concrete bridge : research [summary].

    Science.gov (United States)

    2009-01-01

    Meade County Bridge is a two-lane highway reinforced concrete bridge with two girders each with 20 continuous spans. The bridge was built in 1965. It has been reported that in the early years of the bridge's service period, a considerable number of cracks ...

  18. Persistence of echimidine, a hepatotoxic pyrrolizidine alkaloid, from honey into mead

    Science.gov (United States)

    Honey produced by bees foraging on Echium plantagineum is known to contain dehydropyrrolizidine alkaloids characteristic of the plant. Following a prolific growth of E. plantagineum in the wake of Australian bushfires, two samples of mead, a fermented drink made from honey, and the honey used to pre...

  19. 75 FR 36371 - Draft Environmental Impact Statement Addressing Campus Development at Fort Meade, MD

    Science.gov (United States)

    2010-06-25

    ...'s (NSA) continually evolving requirements and for Intelligence Community use. The purpose of the..., or e-mail ...@nsa.gov. SUPPLEMENTARY INFORMATION: Background: The NSA is a tenant DOD agency on Fort Meade. NSA is a high-technology organization that is on the frontier of communications and data...

  20. Influence and canonical supremacy: an analysis of how George Herbert Mead demoted Charles Horton Cooley in the sociological canon.

    Science.gov (United States)

    Jacobs, Glenn

    2009-01-01

    This analysis assesses the factors underlying Charles Horton Cooley's place in the sociological canon as they relate to George Herbert Mead's puzzling diatribe, echoed in secondary accounts, against Cooley's social psychology and view of the self, published scarcely a year after his death. The illocutionary act of publishing his critique stands as an effort to project the image of Mead's intellectual self and enhance his standing among sociologists within and outside the orbit of the University of Chicago. It expressed Mead's ambivalence toward his precursor Cooley, whose influence he never fully acknowledged. In addition, it typifies the contending fractal distinctions of the scientifically discursive versus literary styles of Mead and Cooley, who both founded the interpretive sociological tradition. The contrasting styles and attitudes toward writing of the two figures are discussed, and their implications for the problems of scale that have stymied the symbolic interactionist tradition are explored.

  1. Lake Mead National Recreational Area air tour management plan and planning and National Environmental Policy Act scoping document

    Science.gov (United States)

    2004-04-19

    The Federal Aviation Administration (FAA), in cooperation with the National Park Service (NPS), has initiated the development of an Air Tour Management Plan (ATMP) for Lake Mead National Recreation Area (LAME) pursuant to the National Parks Air Tour ...

  2. The construction of mind, self, and society: the social process behind G. H. Mead'S social psychology.

    Science.gov (United States)

    Huebner, Daniel R

    2012-01-01

    Mind, Self, and Society, the posthumously published volume by which George Herbert Mead is primarily known, poses acute problems of interpretation so long as scholarship does not consider the actual process of its construction. This paper utilizes extensive archival correspondence and notes in order to analyze this process in depth. The analysis demonstrates that the published form of the book is the result of a consequential interpretive process in which social actors manipulated textual documents within given practical constraints over a course of time. The paper contributes to scholarship on Mead by indicating how this process made possible certain understandings of his social psychology and by relocating the materials that make up the single published text within the disparate contexts from which they were originally drawn. © 2012 Wiley Periodicals, Inc.

  3. George Herbert Mead and the Allen controversy at the University of Wisconsin.

    Science.gov (United States)

    Cook, Gary A

    2007-01-01

    This essay uses previously unpublished correspondence of George Herbert Mead to tell the story of his involvement in the aftermath of a political dispute that took place at the University of Wisconsin during the years 1914-1915. It seeks thereby to clarify the historical significance of an article he published on this controversy in late 1915. Taken together with relevant information about the educational activities of William H. Allen of the New York Bureau of Municipal Research, Mead's correspondence and article throw helpful light upon his understanding of how an educational survey of a university should proceed; they also show how he went about the task of evaluating a failed attempt at such a survey. © 2007 Wiley Periodicals, Inc.

  4. Appendix E: Wind Technologies Program inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  5. Appendix B: Hydrogen, Fuel Cells, and Infrastructure Technologies Program inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  6. A dynamic programming approach to missing data estimation using neural networks

    CSIR Research Space (South Africa)

    Nelwamondo, FV

    2013-01-01

    Full Text Available ...method where dynamic programming is not used. This paper also suggests a different way of formulating a missing data problem such that dynamic programming is applicable to estimating the missing data...

  7. Appendix G: Building Technologies Program inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  8. Appendix J: Weatherization and Intergovernmental Program (WIP) inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  9. Estimating the Population-Level Effectiveness of Vaccination Programs in the Netherlands.

    NARCIS (Netherlands)

    van Wijhe, Maarten; McDonald, Scott A; de Melker, Hester E; Postma, Maarten J; Wallinga, Jacco

    There are few estimates of the effectiveness of long-standing vaccination programs in developed countries. To fill this gap, we investigate the direct and indirect effectiveness of childhood vaccination programs on mortality at the population level in the Netherlands.

  10. Appendix F: FreedomCAR and Vehicle Technologies Program inputs for FY 2008 benefits estimates

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2009-01-18

    Document summarizes the results of the benefits analysis of EERE’s programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

  11. 78 FR 255 - Resumption of the Population Estimates Challenge Program

    Science.gov (United States)

    2013-01-03

    ... commenter would like the Census Bureau to continue to leave open the option for a challenging county-level... "Challenging Certain Population and Income Estimates" to "Procedure for Challenging Population Estimates" to... governmental unit. In those instances where a non-functioning county-level government or statistical equivalent...

  12. Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure

    Science.gov (United States)

    Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.

    2014-01-01

    Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently with program delivery (e.g., project monitoring) or applied to the post-hoc assessment of program activities.…

  13. A stump-to-mill timber production cost-estimating program for cable logging eastern hardwoods

    Science.gov (United States)

    Chris B. LeDoux

    1987-01-01

    ECOST utilizes stand inventory data, cruise data, and the logging plan for the tract in question. The program produces detailed stump-to-mill cost estimates for specific proposed timber sales. These estimates are then used, in combination with specific landowner objectives, to assess the economic feasibility of cable logging a given area. The program output is...

  14. The Program Module of Information Risk Numerical Estimation

    Directory of Open Access Journals (Sweden)

    E. S. Stepanova

    2011-03-01

    Full Text Available This paper presents an algorithm for information risk analysis, implemented in a program module, that is based on threat matrices and fuzzy cognitive maps describing potential threats to resources.

  15. ESTIMATING FINANCIAL SUPPORT OF REGIONAL PROGRAMS OF SOCIAL ECONOMIC DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Iryna Kokhan

    2016-03-01

    Full Text Available This article analyzes the experience of financially supporting regional programs of social and economic development and the areas of use of internal and external resources of the region. Dynamic and balanced development of regions is one of the most important conditions for the further establishment of market relations and social transformations in Ukraine. The aim is to evaluate the financial support of the approved regional programs and to establish the amounts of their financing. The assessment of the social and economic situation in the Ivano-Frankivsk region against nationwide tendencies suggests that economic growth depends on the amounts and sources provided by the state. To determine how closely the amount of financing for the programs is connected with gross regional product, the Pearson correlation coefficient was calculated. It was shown that the amount of financing for regional programs of social and economic development influences the growth rate of gross regional product. Over the research period, regional authorities became noticeably more active in adopting and financing targeted regional programs. It was determined that dynamic activity of the regional community and its territorial units, pursued within defined strategic priorities for programs of social and economic development, will reduce disproportions and differences in the development of territorial units in the region and will positively influence the growth of gross regional product, providing a steady increase in social welfare. Keywords: social economic regional development, ecology programs, social programs, gross regional domestic product, Pearson's correlation coefficient. JEL: R58
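
    For readers unfamiliar with the statistic, the Pearson coefficient used in the article is a one-liner in most environments. The series below are invented placeholders, not the article's data:

```python
import numpy as np

# Hypothetical series: annual program financing (million UAH) and gross
# regional product (billion UAH) for a region such as Ivano-Frankivsk.
financing = np.array([120, 150, 180, 210, 260, 300])
grp = np.array([18.2, 20.1, 23.5, 26.0, 30.4, 33.9])

r = np.corrcoef(financing, grp)[0, 1]  # Pearson's r
print(f"Pearson correlation: r = {r:.3f}")
```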

  16. SEISRISK II; a computer program for seismic hazard estimation

    Science.gov (United States)

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
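
    The Poisson occurrence assumption stated above gives the familiar hazard arithmetic: the probability of at least one exceedance in T years at annual rate r is 1 - exp(-rT). A minimal sketch with an illustrative rate, not a SEISRISK II input:

```python
import math

# Poisson hazard arithmetic of the kind SEISRISK II performs at each grid site
# (illustrative rate; real inputs are source zones and attenuation functions).
annual_rate_of_exceedance = 0.002   # assumed rate for some ground-motion level
T = 50.0                            # exposure time, years

p_exceed = 1.0 - math.exp(-annual_rate_of_exceedance * T)
print(f"P(exceedance in {T:.0f} yr) = {p_exceed:.3f}")

# Inverting: ground motion with 10% probability of exceedance in 50 years
# corresponds to an annual exceedance rate of:
rate_10_50 = -math.log(1.0 - 0.10) / T
print(f"Target annual rate = {rate_10_50:.5f} (return period {1/rate_10_50:.0f} yr)")
```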

  17. Estimating Effective Subsidy Rates of Student Aid Programs

    OpenAIRE

    Stacey H. CHEN

    2008-01-01

    Every year millions of high school students and their parents in the US are asked to fill out complicated financial aid application forms. However, few studies have estimated the responsiveness of government financial aid schemes to changes in financial needs of the students. This paper identifies the effective subsidy rate (ESR) of student aid, as defined by the coefficient of financial needs in the regression of financial aid. The ESR measures the proportion of subsidy of student aid under ...

  18. Improved stove programs need robust methods to estimate carbon offsets

    OpenAIRE

    Johnson, Michael; Edwards, Rufus; Masera, Omar

    2010-01-01

    Current standard methods result in significant discrepancies in carbon offset accounting compared to approaches based on representative community based subsamples, which provide more realistic assessments at reasonable cost. Perhaps more critically, neither of the currently approved methods incorporates uncertainties inherent in estimates of emission factors or non-renewable fuel usage (fNRB). Since emission factors and fNRB contribute 25% and 47%, respectively, to the overall uncertainty in ...

  19. Dependent samples in empirical estimation of stochastic programming problems

    Czech Academy of Sciences Publication Activity Database

    Kaňková, Vlasta; Houda, Michal

    2006-01-01

    Roč. 35, 2/3 (2006), s. 271-279 ISSN 1026-597X R&D Projects: GA ČR GA402/04/1294; GA ČR GD402/03/H057; GA ČR GA402/05/0115 Institutional research plan: CEZ:AV0Z10750506 Keywords: stochastic programming * stability * probability metrics * Wasserstein metric * Kolmogorov metric * simulations Subject RIV: BB - Applied Statistics, Operational Research

  20. Performance investigation of an advanced multi-effect adsorption desalination (MEAD) cycle

    KAUST Repository

    Thu, Kyaw

    2015-12-01

    This article presents the development of an advanced adsorption desalination system with quantum performance improvement. The proposed multi-effect adsorption desalination (MEAD) cycle utilizes a single heat source i.e., low-temperature hot water (as low as 55°C). Passive heating of the feed water (no direct heating) is adopted using total internal heat recovery from the kinetic energy of desorbed vapor and water vapor uptake potential of the adsorbent. Thus, the evaporation in the MEAD cycle ensues at low temperatures ranging from 35°C to 7°C yet providing significantly high performance ratio. The energy from the regenerated vapor is recovered for multiple evaporation/condensation of saline water by a water-run-around circuit between the top brine temperature (TBT) effect and the AD condenser. The adsorbent material is the hydrophilic mesoporous silica gel with high pore surface area. Numerical simulation for such a cycle is developed based on experimentally verified model extending to multi-effect cycle. The system is investigated under several operation conditions such as cycle time allocation, heat source temperature and the number of intermediate effects. It is observed that most of the evaporating-condensing effects operate at low temperature i.e., below 35°C as opposed to conventional multi-effect distillation (MED) cycle. For a MEAD cycle with 7 intermediate effects, the specific water production rate, the performance ratio and the gain output ratio are found to be 1.0 m3/h per tonne of silica gel, 6.3 and 5.1, respectively. Low scaling and fouling potentials, owing to evaporation at low temperatures, together with a high recovery ratio, make the cycle suitable for effectively and efficiently handling highly concentrated feed water such as produced water, brine rejected from other desalination plants and zero liquid discharge (ZLD) systems. © 2015 Elsevier Ltd.
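
    As a plausibility check on the quoted figures of merit, the sketch below applies common desalination conventions: performance ratio (PR) as kilograms of distillate per 2326 kJ of heat input, and gain output ratio (GOR) as kilograms of distillate per kilogram of motive steam. The flow and heat values are invented so that the result lands near the paper's PR of 6.3 and GOR of 5.1; they are not taken from the paper, and the paper's exact definitions may differ.

```python
# Back-of-envelope check of desalination figures of merit (definitions vary
# by author; these are common conventions, not necessarily the paper's).
m_distillate = 1000.0      # kg of product water per hour (assumed)
Q_input_kJ = 369_000.0     # heat input per hour, kJ (assumed)
m_steam = 196.0            # kg of motive steam per hour (assumed)

PR = m_distillate * 2326.0 / Q_input_kJ   # kg product per 2326 kJ of heat
GOR = m_distillate / m_steam              # kg product per kg of steam
print(f"PR = {PR:.1f}, GOR = {GOR:.1f}")
```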

  1. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  2. Geohydrologic reconnaissance of Lake Mead National Recreation Area; Las Vegas Wash to Opal Mountain, Nevada

    Science.gov (United States)

    Laney, R.L.

    1981-01-01

    The study is a geohydrologic reconnaissance of about 170 square miles in the Lake Mead National Recreation Area from Las Vegas Wash to Opal Mountain, Nevada. The study is one of a series that describes the geohydrology of the recreation area and that identifies areas where water supplies can be developed. Precipitation in this arid area is about 5 inches per year. Streamflow is seasonal and extremely variable except for that in the Colorado River, which adjoins the area. Pan evaporation is more than 20 times greater than precipitation; therefore, regional ground-water supplies are meager except near the Colorado River, Lake Mead, and Lake Mohave. Large ground-water supplies can be developed near the river and lakes, and much smaller supplies may be obtained in a few favorable locations farther from the river and lakes. Ground water in most of the areas probably contains more than 1,000 milligrams per liter of dissolved solids, but water that contains less than 1,000 milligrams per liter of dissolved solids can be obtained within about 1 mile of the lakes. Crystalline rocks of metamorphic, intrusive and volcanic origin crop out in the area. These rocks are overlain by conglomerate and mudstone of the Muddy Creek Formation, gravel and conglomerate of the older alluvium, and sand and gravel of the Chemehuevi Formation and younger alluvium. The crystalline rocks, where sufficiently fractured, yield water to springs and would yield small amounts of water to favorably located wells. The poorly cemented and more permeable beds of the older alluvium, Chemehuevi Formation, and younger alluvium are the better potential aquifers, particularly along the Colorado River and Lakes Mead and Mohave. Thermal springs in the gorge of the Colorado River south of Hoover Dam discharge at least 2,580 acre-feet per year of water from the volcanic rocks and metamorphic and plutonic rocks. The discharge is much greater than could be infiltrated in the drainage basin above the springs.

  3. Airborne gamma-ray spectrometer and magnetometer survey, Meade River Quadrangle, Alaska. Final report

    International Nuclear Information System (INIS)

    1981-02-01

    The results obtained from an airborne high sensitivity gamma-ray spectrometer and magnetometer survey over the Meade River map area of Alaska are presented. Based on the criteria outlined in the general section on interpretation, a total of eight uranium anomalies have been outlined on the interpretation map. Most of these are only weakly to moderately anomalous. Zones 3 and 7 are relatively better than the others though none of the anomalies are thought to be of any economic significance. No follow-up work is recommended.

  4. Performance investigation of an advanced multi-effect adsorption desalination (MEAD) cycle

    International Nuclear Information System (INIS)

    Thu, Kyaw; Kim, Young-Deuk; Shahzad, Muhammad Wakil; Saththasivam, Jayaprakash; Ng, Kim Choon

    2015-01-01

    Highlights: • Multi-effect adsorption desalination (MEAD) cycle for improved performance. • Passive heating of saline water recovering kinetic energy from desorption. • All effects operate at low temperature i.e., below 35 °C unlike conventional cycle. • High PR (6.3) with low temperature heat source. • Analyzed using p–T–q diagram tracking the temperatures and uptakes. - Abstract: This article presents the development of an advanced adsorption desalination system with quantum performance improvement. The proposed multi-effect adsorption desalination (MEAD) cycle utilizes a single heat source i.e., low-temperature hot water (as low as 55 °C). Passive heating of the feed water (no direct heating) is adopted using total internal heat recovery from the kinetic energy of desorbed vapor and water vapor uptake potential of the adsorbent. Thus, the evaporation in the MEAD cycle ensues at low temperatures ranging from 35 °C to 7 °C yet providing significantly high performance ratio. The energy from the regenerated vapor is recovered for multiple evaporation/condensation of saline water by a water-run-around circuit between the top brine temperature (TBT) effect and the AD condenser. The adsorbent material is the hydrophilic mesoporous silica gel with high pore surface area. Numerical simulation for such a cycle is developed based on experimentally verified model extending to multi-effect cycle. The system is investigated under several operation conditions such as cycle time allocation, heat source temperature and the number of intermediate effects. It is observed that most of the evaporating–condensing effects operate at low temperature i.e., below 35 °C as opposed to conventional multi-effect distillation (MED) cycle. For a MEAD cycle with 7 intermediate effects, the specific water production rate, the performance ratio and the gain output ratio are found to be 1.0 m3/h per tonne of silica gel, 6.3 and 5.1, respectively. Low scaling and fouling

  5. Sosiaalipsykologian sydän : George H. Mead ja symbolinen interaktionismi sosiaalipsykologian tutkimustraditioissa ja toimijuuden teoriassa

    OpenAIRE

    Hankamäki, Jukka Sakari

    2016-01-01

    In this study my aim is to clarify George H. Mead’s (1863–1931) impact history and his significance for the development of social psychology. Another task is to systematically analyse the problems and paradoxes faced by Mead in his theory of meaning and concept of the human being. The third target is to draw a holistic theory of human being for the needs of present social psychology and the theory of agency. My approach is philosophical and epistemological, and my method is hermeneutical,...

  6. Estimating intervention effects of prevention programs: accounting for noncompliance.

    Science.gov (United States)

    Stuart, Elizabeth A; Perry, Deborah F; Le, Huynh-Nhu; Ialongo, Nicholas S

    2008-12-01

    Individuals not fully complying with their assigned treatments is a common problem encountered in randomized evaluations of behavioral interventions. Treatment group members rarely attend all sessions or do all "required" activities; control group members sometimes find ways to participate in aspects of the intervention. As a result, there is often interest in estimating both the effect of being assigned to participate in the intervention, as well as the impact of actually participating and doing all of the required activities. Methods known broadly as "complier average causal effects" (CACE) or "instrumental variables" (IV) methods have been developed to estimate this latter effect, but they are more commonly applied in medical and treatment research. Since the use of these statistical techniques in prevention trials has been less widespread, many prevention scientists may not be familiar with the underlying assumptions and limitations of CACE and IV approaches. This paper provides an introduction to these methods, described in the context of randomized controlled trials of two preventive interventions: one for perinatal depression among at-risk women and the other for aggressive disruptive behavior in children. Through these case studies, the underlying assumptions and limitations of these methods are highlighted.
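
    The simplest CACE/IV estimator referred to above is the Wald ratio: the intent-to-treat effect divided by the difference in compliance rates between arms. A minimal sketch with hypothetical trial numbers:

```python
# Wald/IV estimate of the complier average causal effect (CACE) from a
# randomized trial with one-sided noncompliance (numbers are hypothetical).
mean_outcome_treat_arm = 12.0    # mean symptom score, assigned to intervention
mean_outcome_control_arm = 15.0  # mean symptom score, assigned to control
compliance_rate_treat = 0.60     # share of assignees who actually participated
compliance_rate_control = 0.00   # controls could not access the program

itt = mean_outcome_treat_arm - mean_outcome_control_arm          # intent-to-treat
cace = itt / (compliance_rate_treat - compliance_rate_control)   # effect on compliers
print(f"ITT = {itt:.1f}, CACE = {cace:.1f}")
```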

  7. Estimation of irradiation temperature within the irradiation program Rheinsberg

    CERN Document Server

    Stephan, I; Prokert, F; Scholz, A

    2003-01-01

    The temperature monitoring within the irradiation programme Rheinsberg II was performed by diamond powder monitors. The method is based on the effect of temperature on the irradiation-induced increase of the diamond lattice constant and is described by a Russian code. In order to determine the irradiation temperature, the lattice constant is measured by means of an X-ray diffractometer after irradiation and subsequent isochronal annealing. The kink of the linearized temperature-lattice constant curves provides a value for the irradiation temperature, which has to be corrected according to the local neutron flux. The results of the lattice constant measurements show strong scatter, and there is also a systematic error. The results of temperature monitoring by diamond powder are therefore not satisfactory. The most probable value lies between 255 C and 265 C and is close to the value estimated from the thermal conditions of the irradiation experiments.
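
    The kink-finding step described here amounts to breakpoint detection in a piecewise-linear curve. The sketch below invents synthetic lattice-constant data with a kink at 260 C and recovers it by minimizing the combined residual of two linear fits over candidate breakpoints; it illustrates the idea only and is not the Russian code referenced above.

```python
import numpy as np

# Toy kink detection: the lattice constant stays flat up to the irradiation
# temperature, then anneals out linearly (synthetic, illustrative data).
T = np.arange(150.0, 400.0, 10.0)                         # annealing temps, deg C
a = np.where(T < 260, 3.5680, 3.5680 - 4e-6 * (T - 260))  # lattice constant, Angstrom
a = a + np.random.default_rng(2).normal(0, 2e-7, T.size)  # measurement scatter

def sse_two_segments(i):
    """Sum of squared residuals of separate linear fits left/right of index i."""
    sse = 0.0
    for seg in (slice(0, i), slice(i, T.size)):
        p = np.polyfit(T[seg], a[seg], 1)
        sse += np.sum((a[seg] - np.polyval(p, T[seg])) ** 2)
    return sse

best = min(range(3, T.size - 3), key=sse_two_segments)
print(f"Estimated kink (irradiation temperature): ~{T[best]:.0f} deg C")
```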

  8. Joko Tingkir program for estimating tsunami potential rapidly

    Energy Technology Data Exchange (ETDEWEB)

    Madlazim, E-mail: m-lazim@physics.its.ac.id; Hariyono, E., E-mail: m-lazim@physics.its.ac.id [Department of Physics, Faculty of Mathematics and Natural Sciences, Universitas Negeri Surabaya (UNESA), Jl. Ketintang, Surabaya 60231 (Indonesia)

    2014-09-25

    The purpose of the study was to estimate P-wave rupture durations (T_dur), dominant periods (T_d), and exceedance durations (T_50Ex) simultaneously for local events, shallow earthquakes which occurred off the coast of Indonesia. Although all of the earthquakes had magnitudes greater than 6.3 and depths less than 70 km, some of the earthquakes generated a tsunami while other events (Mw = 7.8) did not. Analysis of the above-stated parameters using Joko Tingkir helped understand the tsunami generation of these earthquakes. Measurements from vertical-component broadband P-wave velocity records and determination of the above-stated parameters can provide a direct procedure for rapidly assessing the potential for tsunami generation. The results of the present study and the analysis of the seismic parameters helped explain why some events generated a tsunami, while the others did not.

  9. A computer program for the estimation of time of death

    DEFF Research Database (Denmark)

    Lynnerup, N

    1993-01-01

    In the 1960s Marshall and Hoare presented a "Standard Cooling Curve" based on their mathematical analyses of the postmortem cooling of bodies. Although fairly accurate under standard conditions, the "curve" or formula is based on the assumption that the ambient temperature is constant and that t... A computer program for the estimation of time of death from the cooling of bodies is presented. It is proposed that by having a computer program that solves the equation, giving the length of the cooling period in response to a certain rectal temperature, and which allows easy comparison of multiple solutions, the uncertainties related to ambient temperature...
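
    A common concrete form of the Marshall-Hoare model is Henssge's double exponential, Q = 1.25·exp(Bt) - 0.25·exp(5Bt), where Q is the normalized rectal-ambient temperature difference and B depends on body weight under standard conditions. The sketch below, with an assumed 70 kg body and invented temperatures, solves for the cooling period by root finding; whether the paper's program uses exactly these constants is not stated in the abstract.

```python
import math
from scipy.optimize import brentq

def henssge_Q(t, B):
    """Henssge's double-exponential form of the Marshall-Hoare cooling model."""
    return 1.25 * math.exp(B * t) - 0.25 * math.exp(5.0 * B * t)

# Assumed scenario: 70 kg body, ambient 18 C, rectal 28 C, standard conditions.
T_rect, T_amb, T0, mass = 28.0, 18.0, 37.2, 70.0
B = -1.2815 * mass ** -0.625 + 0.0284   # 1/h, unclothed body in still air
Q = (T_rect - T_amb) / (T0 - T_amb)     # normalized temperature difference

t_death = brentq(lambda t: henssge_Q(t, B) - Q, 0.1, 60.0)  # hours since death
print(f"Estimated postmortem interval: ~{t_death:.1f} h")
```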

  10. Estimating radiological consequences using the Java programming language

    International Nuclear Information System (INIS)

    Crawford, J.; Hayward, M.; Harris, F.; Domel, R.

    1998-01-01

    At the Australian Nuclear Science and Technology Organisation (ANSTO) a model is being developed to determine critical parameters affecting radioactive doses to humans following a release of radionuclides into the atmosphere. Java programming language was chosen because of the Graphical User Interface (GUI) capabilities and its portability across computer platforms, which were a requirement for the application, called RadCon. The mathematical models are applied over the 2D region, performing time varying calculations of dose to humans for each grid point, according to user selected options. The information combined includes: two dimensional time varying air and ground concentrations, transfer factors from soil to plant, plant to animal, plant to humans, plant interception factors to determine amount of radionuclide on plant surfaces, dosimetric data, such as dose conversion factors and user defined parameters, e.g. soil types, lifestyle, diet of animals and humans. Details of the software requirements, pathway parameters and implementation of RadCon are given.
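
    The pathway chaining described above is, at its core, a product of concentrations, transfer factors, intake rates, and dose conversion factors at each grid point. A minimal one-pathway sketch with placeholder values (the abstract names Java as RadCon's language; Python is used here purely for illustration):

```python
# Minimal sketch of a pathway dose roll-up of the kind RadCon performs at each
# grid point (all numeric values are placeholders, not RadCon data).
ground_conc = 5.0e3        # Bq/m2 of a deposited radionuclide at one grid point
soil_to_plant = 1.0e-3     # transfer factor, (Bq/kg plant)/(Bq/m2 soil) -- assumed
plant_intake = 150.0       # kg of leafy vegetables consumed per year -- assumed
dose_conv = 1.3e-8         # ingestion dose conversion factor, Sv/Bq -- assumed

plant_conc = ground_conc * soil_to_plant          # Bq/kg in the crop
annual_intake = plant_conc * plant_intake         # Bq/yr ingested
dose = annual_intake * dose_conv                  # Sv/yr
print(f"Ingestion dose at this grid point: {dose*1e6:.2f} microSv/yr")
```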

  11. Dexamethasone intravitreal implant in previously treated patients with diabetic macular edema: Subgroup analysis of the MEAD study

    OpenAIRE

    Augustin, A.J.; Kuppermann, B.D.; Lanzetta, P.; Loewenstein, A.; Li, X.; Cui, H.; Hashad, Y.; Whitcup, S.M.; Abujamra, S.; Acton, J.; Ali, F.; Antoszyk, A.; Awh, C.C.; Barak, A.; Bartz-Schmidt, K.U.

    2015-01-01

    Background Dexamethasone intravitreal implant 0.7 mg (DEX 0.7) was approved for treatment of diabetic macular edema (DME) after demonstration of its efficacy and safety in the MEAD registration trials. We performed subgroup analysis of MEAD study results to evaluate the efficacy and safety of DEX 0.7 treatment in patients with previously treated DME. Methods Three-year, randomized, sham-controlled phase 3 study in patients with DME, best-corrected visual acuity (BCVA) of 34-68 Early Treatment...

  12. Estimating radiological consequences using the Java programming language

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, J.; Hayward, M. [Australian Nuclear Science and Technology Organisation (ANSTO), Lucas Heights, NSW (Australia). Information Management Div; Harris, F.; Domel, R. [Australian Nuclear Science and Technology Organisation (ANSTO), Lucas Heights, NSW (Australia). Safety Div.

    1998-12-31

    At the Australian Nuclear Science and Technology Organisation (ANSTO) a model is being developed to determine critical parameters affecting radioactive doses to humans following a release of radionuclides into the atmosphere. Java programming language was chosen because of the Graphical User Interface (GUI) capabilities and its portability across computer platforms, which were a requirement for the application, called RadCon. The mathematical models are applied over the 2D region, performing time varying calculations of dose to humans for each grid point, according to user selected options. The information combined includes: two dimensional time varying air and ground concentrations, transfer factors from soil to plant, plant to animal, plant to humans, plant interception factors to determine amount of radionuclide on plant surfaces, dosimetric data, such as dose conversion factors and user defined parameters, e.g. soil types, lifestyle, diet of animals and humans. Details of the software requirements, pathway parameters and implementation of RadCon are given. 10 refs., 2 tabs., 4 figs.

  13. A New Hybrid Nelder-Mead Particle Swarm Optimization for Coordination Optimization of Directional Overcurrent Relays

    Directory of Open Access Journals (Sweden)

    An Liu

    2012-01-01

    Full Text Available Coordination optimization of directional overcurrent relays (DOCRs is an important part of an efficient distribution system. This optimization problem involves obtaining the time dial setting (TDS and pickup current (Ip values of each DOCR. The optimal results should have the shortest primary relay operating time for all fault lines. Recently, the particle swarm optimization (PSO algorithm has been considered an effective tool for linear/nonlinear optimization problems with application in the protection and coordination of power systems. With a limited runtime period, the conventional PSO considers the optimal solution as the final solution, and an early convergence of PSO results in decreased overall performance and an increase in the risk of mistaking local optima for global optima. Therefore, this study proposes a new hybrid Nelder-Mead simplex search method and particle swarm optimization (proposed NM-PSO algorithm to solve the DOCR coordination optimization problem. PSO is the main optimizer, and the Nelder-Mead simplex search method is used to improve the efficiency of PSO due to its potential for rapid convergence. To validate the proposal, this study compared the performance of the proposed algorithm with that of PSO and original NM-PSO. The findings demonstrate the outstanding performance of the proposed NM-PSO in terms of computation speed, rate of convergence, and feasibility.
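
    As a rough illustration of the hybrid scheme, the sketch below runs a plain PSO loop and periodically hands the global best to a Nelder-Mead refinement step. It is a minimal Python sketch on a stand-in objective, not the authors' DOCR formulation; the inertia and acceleration constants are assumed values.

        # Minimal hybrid NM-PSO sketch: PSO explores, Nelder-Mead refines
        # the incumbent best. Objective and bounds are placeholders.
        import numpy as np
        from scipy.optimize import minimize

        def objective(x):                  # stand-in for the DOCR cost
            return float(np.sum(x ** 2))

        rng = np.random.default_rng(0)
        dim, n_particles, iters = 4, 20, 50
        lo, hi = -5.0, 5.0
        x = rng.uniform(lo, hi, (n_particles, dim))     # positions
        v = np.zeros_like(x)                            # velocities
        pbest = x.copy()
        pbest_f = np.array([objective(p) for p in x])
        g = pbest[np.argmin(pbest_f)].copy()            # global best

        for t in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            f = np.array([objective(p) for p in x])
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            g = pbest[np.argmin(pbest_f)].copy()
            if t % 10 == 9:                # periodic simplex refinement
                res = minimize(objective, g, method="Nelder-Mead")
                if res.fun < objective(g):
                    g = res.x
        print("best value:", objective(g))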

  14. Counting the cost: estimating the economic benefit of pedophile treatment programs.

    Science.gov (United States)

    Shanahan, M; Donato, R

    2001-04-01

    The principal objective of this paper is to identify the economic costs and benefits of pedophile treatment programs, incorporating both the tangible and intangible costs of sexual abuse to victims. Cost estimates of cognitive behavioral therapy programs in Australian prisons are compared against the tangible and intangible costs to victims of being sexually abused. Estimates are prepared that take into account a number of problematic issues. These include the range of possible recidivism rates for treatment programs; the uncertainty surrounding the number of child sexual molestation offences committed by recidivists; and the methodological problems associated with estimating the intangible costs of sexual abuse on victims. Despite the variation in parameter estimates that impact the cost-benefit analysis of pedophile treatment programs, it is found that the potential economic costs of child sexual abuse are substantial and the economic benefits to be derived from appropriate and effective treatment programs are high. Based on a reasonable set of parameter estimates, in-prison cognitive therapy treatment programs for pedophiles are likely to be of net benefit to society. Despite this, a critical area of future research must include further methodological developments in estimating the quantitative impact of child sexual abuse in the community.

  15. Estimating BrAC from transdermal alcohol concentration data using the BrAC estimator software program.

    Science.gov (United States)

    Luczak, Susan E; Rosen, I Gary

    2014-08-01

    Transdermal alcohol sensor (TAS) devices have the potential to allow researchers and clinicians to unobtrusively collect naturalistic drinking data for weeks at a time, but the transdermal alcohol concentration (TAC) data these devices produce do not consistently correspond with breath alcohol concentration (BrAC) data. We present and test the BrAC Estimator software, a program designed to produce individualized estimates of BrAC from TAC data by fitting mathematical models to a specific person wearing a specific TAS device. Two TAS devices were worn simultaneously by 1 participant for 18 days. The trial began with a laboratory alcohol session to calibrate the model and was followed by a field trial with 10 drinking episodes. Model parameter estimates and fit indices were compared across drinking episodes to examine the calibration phase of the software. Software-generated estimates of peak BrAC, time of peak BrAC, and area under the BrAC curve were compared with breath analyzer data to examine the estimation phase of the software. In this single-subject design with breath analyzer peak BrAC scores ranging from 0.013 to 0.057, the software created consistent models for the 2 TAS devices, despite differences in raw TAC data, and was able to compensate for the attenuation of peak BrAC and latency of the time of peak BrAC that are typically observed in TAC data. This software program represents an important initial step for making it possible for non-mathematician researchers and clinicians to obtain estimates of BrAC from TAC data in naturalistic drinking environments. Future research with more participants and greater variation in alcohol consumption levels and patterns, as well as examination of gain scheduling calibration procedures and nonlinear models of diffusion, will help to determine how precise these software models can become. Copyright © 2014 by the Research Society on Alcoholism.

  16. Closing the Education Gender Gap: Estimating the Impact of Girls' Scholarship Program in the Gambia

    Science.gov (United States)

    Gajigo, Ousman

    2016-01-01

    This paper estimates the impact of a school fee elimination program for female secondary students in The Gambia to reduce gender disparity in education. To assess the impact of the program, two nationally representative household surveys were used (1998 and 2002/2003). By 2002/2003, about half of the districts in the country had benefited from the…

  17. Dose estimation in nuclear medicine patients: implementation of a calculation program and methodology

    International Nuclear Information System (INIS)

    Prieto, C.; Espana, M.L.; Tomasi, L.; Lopez Franco, P.

    1998-01-01

    Our hospital is developing a nuclear medicine quality assurance program in order to comply with the medical exposure Directive 97/43 EURATOM and the legal requirements established in our legislation. This program includes the quality control of equipment and, in addition, the estimation of doses to patients undergoing nuclear medicine examinations. This paper is focused on the second aspect, and presents a new computer program, developed in our Department, to estimate the absorbed dose in different organs and the effective dose to patients, based upon the data from ICRP Publication 53 and its addendum. (Author) 16 refs
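
    The central computation of such a program is small: organ absorbed doses and the effective dose follow from the administered activity multiplied by radiopharmaceutical-specific dose coefficients. The coefficients below are placeholders for illustration, not values from ICRP Publication 53.

        # Dose estimate sketch: dose (mSv) = activity (MBq) x coefficient
        # (mSv/MBq). Coefficients are placeholders, NOT ICRP 53 values.
        activity_mbq = 600.0                    # administered activity
        organ_coeff = {                         # hypothetical coefficients
            "bone_surface": 0.0063,
            "kidneys": 0.0073,
            "bladder_wall": 0.0048,
        }
        effective_coeff = 0.0057                # hypothetical E coefficient

        for organ, c in organ_coeff.items():
            print(f"{organ}: {activity_mbq * c:.2f} mSv")
        print(f"effective dose: {activity_mbq * effective_coeff:.2f} mSv")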

  18. TETRA-COM: a comprehensive SPSS program for estimating the tetrachoric correlation.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2012-12-01

    We provide an SPSS program that implements descriptive and inferential procedures for estimating tetrachoric correlations. These procedures have two main purposes: (1) bivariate estimation in contingency tables and (2) constructing a correlation matrix to be used as input for factor analysis (in particular, the SPSS FACTOR procedure). In both cases, the program computes accurate point estimates, as well as standard errors and confidence intervals that are correct for any population value. For purpose (1), the program computes the contingency table together with five other measures of association. For purpose (2), the program checks the positive definiteness of the matrix, and if it is found not to be Gramian, performs a nonlinear smoothing procedure at the user's request. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
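
    Outside SPSS, the underlying model is easy to exhibit: thresholds are fixed from the margins of the 2x2 table, and the latent bivariate-normal correlation is chosen to maximize the multinomial likelihood. The Python sketch below implements this generic two-step estimator on an invented table; it is not the TETRA-COM code.

        # Two-step ML estimate of the tetrachoric correlation (generic
        # sketch, invented 2x2 counts; not the TETRA-COM implementation).
        import numpy as np
        from scipy.stats import norm, multivariate_normal
        from scipy.optimize import minimize_scalar

        table = np.array([[40.0, 10.0],      # rows: X=0/1, cols: Y=0/1
                          [15.0, 35.0]])
        n = table.sum()
        tx = norm.ppf(table[0].sum() / n)    # threshold for X from margin
        ty = norm.ppf(table[:, 0].sum() / n) # threshold for Y from margin

        def negloglik(rho):
            p00 = multivariate_normal.cdf([tx, ty], mean=[0.0, 0.0],
                                          cov=[[1.0, rho], [rho, 1.0]])
            p01 = norm.cdf(tx) - p00         # X=0, Y=1
            p10 = norm.cdf(ty) - p00         # X=1, Y=0
            p11 = 1.0 - p00 - p01 - p10
            probs = np.clip([p00, p01, p10, p11], 1e-12, 1.0)
            counts = table.ravel()           # order matches probs above
            return -np.sum(counts * np.log(probs))

        fit = minimize_scalar(negloglik, bounds=(-0.999, 0.999),
                              method="bounded")
        print("tetrachoric r:", round(fit.x, 3))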

  19. Deconvolution of gamma energy spectra from NaI (Tl) detector using the Nelder-Mead zero order optimisation method

    International Nuclear Information System (INIS)

    RAVELONJATO, R.H.M.

    2010-01-01

    The aim of this work is to develop a method for gamma-ray spectrum deconvolution from a NaI(Tl) detector. Deconvolution programs written in Matlab 7.6 using the Nelder-Mead method were developed to determine multiplet shape parameters. The simulation parameters were: centroid distance/FWHM ratio, signal/continuum ratio and counting rate. The test spectra were synthetic, built with 3σ uncertainty. The tests gave suitable results for centroid distance/FWHM ratio ≥ 2, signal/continuum ratio ≥ 2 and counting levels of 100 counts. The technique was applied to measure the activity of soil and rock samples from the Anosy region. The rock activity varies from (140±8) Bq.kg-1 to (190±17) Bq.kg-1 for potassium-40; from (343±7) Bq.kg-1 to (881±6) Bq.kg-1 for thorium-232 and from (100±3) Bq.kg-1 to (164±4) Bq.kg-1 for uranium-238. The soil activity varies from (148±1) Bq.kg-1 to (652±31) Bq.kg-1 for potassium-40; from (1100±11) Bq.kg-1 to (5700±40) Bq.kg-1 for thorium-232 and from (190±2) Bq.kg-1 to (779±15) Bq.kg-1 for uranium-238. Among 11 samples, the activity value discrepancies compared to a high-resolution HPGe detector vary from 0.62% to 42.86%. The fitting residuals are between -20% and +20%. The Figure of Merit values are around 5%. These results show that the method developed is reliable for such an activity range and that convergence is good. Thus, a NaI(Tl) detector combined with the deconvolution method developed may replace an HPGe detector within acceptable limits, if identification of each nuclide in the radioactive series is not required [fr]
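
    The core of the approach — fitting overlapping Gaussian peaks plus a continuum by direct Nelder-Mead minimization — can be sketched in Python as a stand-in for the original Matlab programs. The synthetic two-peak spectrum and starting values below are invented.

        # Multiplet deconvolution sketch: two Gaussians on a linear
        # continuum, fitted with the Nelder-Mead simplex (synthetic data).
        import numpy as np
        from scipy.optimize import minimize

        ch = np.arange(200.0)                           # channel axis
        def model(p, x):
            a1, c1, s1, a2, c2, s2, b0, b1 = p
            return (a1 * np.exp(-0.5 * ((x - c1) / s1) ** 2)
                    + a2 * np.exp(-0.5 * ((x - c2) / s2) ** 2)
                    + b0 + b1 * x)

        rng = np.random.default_rng(1)
        truth = [500, 90, 5, 300, 105, 5, 20, 0.05]
        y = rng.poisson(model(truth, ch)).astype(float) # counting noise

        def chi2(p):                                    # weighted misfit
            return np.sum((y - model(p, ch)) ** 2 / np.maximum(y, 1.0))

        start = [400, 88, 4, 250, 107, 4, 10, 0.0]      # rough guess
        fit = minimize(chi2, start, method="Nelder-Mead",
                       options={"maxiter": 20000, "xatol": 1e-6,
                                "fatol": 1e-6})
        print("fitted centroids:", fit.x[1], fit.x[4])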

  20. Kinetics of selenium release in mine waste from the Meade Peak Phosphatic Shale, Phosphoria Formation, Wooley Valley, Idaho, USA

    Science.gov (United States)

    Lisa L. Stillings; Michael C. Amacher

    2010-01-01

    Phosphorite from the Meade Peak Phosphatic Shale member of the Permian Phosphoria Formation has been mined in southeastern Idaho since 1906. Dumps of waste rock from mining operations contain high concentrations of Se which readily leach into nearby streams and wetlands. While the most common mineralogical residence of Se in the phosphatic shale is elemental Se, Se(0...

  1. Lake water quality: Chapter 4 in A synthesis of aquatic science for management of Lakes Mead and Mohave

    Science.gov (United States)

    Tietjen, Todd; Holdren, G. Chris; Rosen, Michael R.; Veley, Ronald J.; Moran, Michael J.; Vanderford, Brett; Wong, Wai Hing; Drury, Douglas D.

    2012-01-01

    Given the importance of the availability and quality of water in Lake Mead, it has become one of the most intensely sampled and studied bodies of water in the United States. As a result, data are available from sampling stations across the lake (fig. 4-1 and see U.S. Geological Survey Automated Water-Quality Platforms) to provide information on past and current (2012) water-quality conditions and on invasive species that influence—and are affected by—water quality. Water quality in Lakes Mead and Mohave generally exceeds standards set by the State of Nevada to protect water supplies for public uses: drinking water, aquatic ecosystem health, recreation, or agricultural irrigation. In comparison to other reservoirs studied by the U.S. Environmental Protection Agency (USEPA) for a national lake assessment (U.S. Environmental Protection Agency, 2010), Lake Mead is well within the highest or ‘good’ category for recreation and aquatic health (see U.S. Environmental Protection Agency National Lakes Assessment and Lake Mead for more details). While a small part of the lake, particularly Las Vegas Bay, is locally influenced by runoff from urbanized tributaries such as Las Vegas Wash, contaminant loading in the lake as a whole is low compared to other reservoirs in the nation, which are influenced by runoff from more heavily urbanized watersheds (Rosen and Van Metre, 2010).

  2. Algorithms and programs of dynamic mixture estimation unified approach to different types of components

    CERN Document Server

    Nagy, Ivan

    2017-01-01

    This book provides a general theoretical background for constructing recursive Bayesian estimation algorithms for mixture models. It collects the recursive algorithms for estimating dynamic mixtures of various distributions and brings them into a unified form, providing a scheme for constructing the estimation algorithm for a mixture of components modeled by distributions with reproducible statistics. It offers recursive estimation of dynamic mixtures that is free of iterative processes and as close to analytical solutions as possible. In addition, these methods can be used online and simultaneously perform learning, which improves their efficiency during estimation. The book includes detailed program codes for solving the presented theoretical tasks. Codes are implemented in an open source platform for engineering computations. The program codes given serve to illustrate the theory and demonstrate the work of the included algorithms.
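
    One building block of such recursive schemes can be shown compactly: with component densities held fixed, mixture weights are updated point by point from Bayesian responsibilities with exponential forgetting. This is a deliberate simplification for illustration (the book's algorithms also estimate component parameters).

        # Recursive (online) estimation of mixture weights: Bayes-step
        # responsibilities update forgetting-weighted pseudo-counts.
        import numpy as np
        from scipy.stats import norm

        comps = [norm(-2.0, 1.0), norm(3.0, 1.0)]   # known components
        counts = np.array([1.0, 1.0])               # Dirichlet pseudo-counts
        w = counts / counts.sum()                   # current weights
        forget = 0.99                               # forgetting factor

        rng = np.random.default_rng(2)
        data = np.concatenate([rng.normal(-2, 1, 300),
                               rng.normal(3, 1, 700)])
        rng.shuffle(data)

        for xt in data:
            like = np.array([w[k] * comps[k].pdf(xt) for k in range(2)])
            resp = like / like.sum()                # responsibilities
            counts = forget * counts + resp         # recursive update
            w = counts / counts.sum()
        print("estimated weights:", np.round(w, 3))  # near [0.3, 0.7]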

  3. Human-health pharmaceutical compounds in Lake Mead, Nevada and Arizona, and Las Vegas Wash, Nevada, October 2000-August 2001

    Science.gov (United States)

    Boyd, Robert A.; Furlong, Edward T.

    2002-01-01

    The U.S. Geological Survey and the National Park Service conducted a reconnaissance study to investigate the occurrence of selected human-health pharmaceutical compounds in water samples collected from Lake Mead on the Colorado River and Las Vegas Wash, a waterway used to transport treated wastewater from the Las Vegas metropolitan area to Lake Mead. Current research indicates many of these compounds can bioaccumulate and may adversely affect aquatic organisms by disrupting physiological processes, impairing reproductive functions, increasing cancer rates, contributing to the development of antibiotic-resistant strains of bacteria, and acting in undesirable ways when mixed with other substances. These compounds may be present in effluent because a high percentage of prescription and non-prescription drugs used for human-health purposes are excreted from the body as a mixture of parent compounds and degraded metabolite compounds; also, they can be released to the environment when unused products are discarded by way of toilets, sinks, and trash in landfills. Thirteen of 33 targeted compounds were detected in at least one water sample collected between October 2000 and August 2001. All concentrations were less than or equal to 0.20 micrograms per liter. The most frequently detected compounds in samples from Las Vegas Wash were caffeine, carbamazepine (used to treat epilepsy), cotinine (a metabolite of nicotine), and dehydronifedipine (a metabolite of the antianginal Procardia). Less frequently detected compounds in samples collected from Las Vegas Wash were antibiotics (clarithromycin, erythromycin, sulfamethoxazole, and trimethoprim), acetaminophen (an analgesic and anti-inflammatory), cimetidine (used to treat ulcers), codeine (a narcotic and analgesic), diltiazem (an antihypertensive), and 1,7-dimethylxanthine (a metabolite of caffeine). Fewer compounds were detected in samples collected from Lake Mead than from Las Vegas Wash. Caffeine was detected in all samples

  4. Value drivers: an approach for estimating health and disease management program savings.

    Science.gov (United States)

    Phillips, V L; Becker, Edmund R; Howard, David H

    2013-12-01

    Health and disease management (HDM) programs have faced challenges in documenting savings related to their implementation. The objective of this study was to describe OptumHealth's (Optum) methods for estimating anticipated savings from HDM programs using Value Drivers. Optum's general methodology was reviewed, along with details of 5 high-use Value Drivers. The results showed that the Value Driver approach offers an innovative method for estimating savings associated with HDM programs. The authors demonstrated how real-time savings can be estimated for 5 Value Drivers commonly used in HDM programs: (1) use of beta-blockers in treatment of heart disease, (2) discharge planning for high-risk patients, (3) decision support related to chronic low back pain, (4) obesity management, and (5) securing transportation for primary care. The validity of savings estimates is dependent on the type of evidence used to gauge the intervention effect, generating changes in utilization and, ultimately, costs. The savings estimates derived from the Value Driver method are generally reasonable to conservative and provide a valuable framework for estimating financial impacts from evidence-based interventions.

  5. BAESNUM, a conversational computer program for the Bayesian estimation of a parameter by a numerical method

    International Nuclear Information System (INIS)

    Colombo, A.G.; Jaarsma, R.J.

    1982-01-01

    This report describes a conversational computer program which, via Bayes' theorem, numerically combines the prior distribution of a parameter with a likelihood function. Any type of prior and likelihood function can be considered. The present version of the program includes six types of prior and employs the binomial likelihood. As input the program requires the form and parameters of the prior distribution and the sample data. As output it gives the posterior distribution as a histogram. The use of the program for estimating the constant failure rate of an item is briefly described
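
    The numerical scheme described — prior times binomial likelihood evaluated on a grid, then normalized into a posterior histogram — fits in a few lines. The lognormal prior below stands in for one of the six prior types; the failure data are invented.

        # Grid-based Bayes update: prior x binomial likelihood, normalized.
        # Prior choice and data are illustrative, not from the report.
        import numpy as np
        from scipy.stats import lognorm, binom

        p = np.linspace(1e-4, 0.5, 2000)   # grid over failure probability
        dx = p[1] - p[0]
        prior = lognorm.pdf(p, s=1.0, scale=0.05)
        k, n = 3, 100                      # observed: 3 failures in 100
        post = prior * binom.pmf(k, n, p)
        post /= post.sum() * dx            # numerical normalization
        mean = np.sum(p * post) * dx
        print("posterior mean failure probability:", round(mean, 4))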

  6. A Sum-of-Squares and Semidefinite Programming Approach for Maximum Likelihood DOA Estimation

    Directory of Open Access Journals (Sweden)

    Shu Cai

    2016-12-01

    Full Text Available Direction of arrival (DOA) estimation using a uniform linear array (ULA) is a classical problem in array signal processing. In this paper, we focus on DOA estimation based on the maximum likelihood (ML) criterion, transform the estimation problem into a novel formulation, named sum-of-squares (SOS), and then solve it using semidefinite programming (SDP). We first derive the SOS and SDP method for DOA estimation in the scenario of a single source and then extend it under the framework of alternating projection for multiple DOA estimation. The simulations demonstrate that the SOS- and SDP-based algorithms can provide stable and accurate DOA estimation when the number of snapshots is small and the signal-to-noise ratio (SNR) is low. Moreover, it has a higher spatial resolution compared to existing methods based on the ML criterion.
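
    Setting the SOS/SDP machinery aside, the ML criterion itself is easy to exhibit for a single source: the estimate maximizes the power of the sample covariance steered in each candidate direction. The sketch below does this by brute-force grid search on a simulated ULA (a baseline under assumed geometry and noise, not the paper's solver).

        # Single-source ML DOA by grid search on a simulated ULA.
        import numpy as np

        rng = np.random.default_rng(3)
        m, snapshots, d = 8, 50, 0.5        # sensors, snapshots, spacing
        theta_true = 20.0                   # degrees

        def steer(theta_deg):               # ULA steering vector
            k = 2 * np.pi * d * np.sin(np.deg2rad(theta_deg))
            return np.exp(1j * k * np.arange(m))

        s = (rng.normal(size=snapshots)
             + 1j * rng.normal(size=snapshots)) / np.sqrt(2)
        noise = (rng.normal(size=(m, snapshots))
                 + 1j * rng.normal(size=(m, snapshots))) * 0.3
        X = np.outer(steer(theta_true), s) + noise
        R = X @ X.conj().T / snapshots      # sample covariance

        def ml_objective(theta):            # steered power
            a = steer(theta)
            return np.real(a.conj() @ R @ a) / m

        grid = np.linspace(-90, 90, 3601)
        est = grid[np.argmax([ml_objective(t) for t in grid])]
        print("estimated DOA:", est, "deg")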

  7. Solutions to estimation problems for scalar hamilton-jacobi equations using linear programming

    KAUST Repository

    Claudel, Christian G.; Chamoin, Timothee; Bayen, Alexandre M.

    2014-01-01

    This brief presents new convex formulations for solving estimation problems in systems modeled by scalar Hamilton-Jacobi (HJ) equations. Using a semi-analytic formula, we show that the constraints resulting from a HJ equation are convex, and can be written as a set of linear inequalities. We use this fact to pose various (and seemingly unrelated) estimation problems related to traffic flow engineering as a set of linear programs. In particular, we solve data assimilation and data reconciliation problems for estimating the state of a system when the model and measurement constraints are incompatible. We also solve traffic estimation problems, such as travel time estimation or density estimation. For all these problems, a numerical implementation is performed using experimental data from the Mobile Century experiment. In the context of reproducible research, the code and data used to compute the results presented in this brief have been posted online and are accessible to regenerate the results. © 2013 IEEE.

  8. ADOLESCENT CASUAL SEX BEHAVIOR IN PONOROGO REGENCY FROM THE PERSPECTIVE OF GEORGE HERBERT MEAD'S SYMBOLIC INTERACTIONISM

    Directory of Open Access Journals (Sweden)

    M. Harir Muzakki

    2010-12-01

    Full Text Available Abstract: This study seeks to reveal the process by which casual sex arises among adolescents, viewed through George Herbert Mead's symbolic interactionism, and the patterns of casual sexual interaction among adolescents in Ponorogo regency. The study is descriptive-analytical and designed with a qualitative approach. Data were gathered through interviews with adolescents engaging in casual sex. There are several stages before an actor performs the act or has sexual intercourse: impulse, perception, manipulation and, finally, consummation. From the results it can be concluded that the process begins with attraction to the opposite sex, followed by making approaches, exchanging glances, getting acquainted, and then dating. In the next stage they hold hands, kiss, fondle breasts, and then have sexual intercourse. There are two patterns of casual sexual interaction among these adolescents: first, adolescents have sex with their own boyfriend or girlfriend; second, they buy or hire the services of other women.

  9. The constant experience of self: conceptual approaches between Dewey and Mead

    Directory of Open Access Journals (Sweden)

    Tiago Barcelos Pereira Salgado

    2012-09-01

    Full Text Available In this article we adopt a relational approach to communication. In this sense, we seek to understand the communication process through a praxeological model, as elaborated by Louis Quéré (1991), rather than through an epistemological model, as found in earlier formulations of communicative practice. With pragmatism as the guiding thread of our argument, we are interested in understanding and relating the notions of self and experience as discussed by John Dewey (1896, 1980, 2010) and George Herbert Mead (1934). Both concepts seem appropriate for thinking about communication. Our purpose is then to see to what extent we can reasonably argue that the self is in constant experience, and the implications that permeate this conceptual relationship.

  10. Mercury concentrations in Quagga Mussels, Dreissena bugensis, from Lakes Mead, Mohave and Havasu.

    Science.gov (United States)

    Mueting, Sara A; Gerstenberger, Shawn L

    2010-04-01

    The recent invasion of the Dreissenid species the quagga mussel, Dreissena bugensis, into Lakes Mead, Mohave and Havasu has raised questions about its ability to alter contaminant cycling. Mussels were collected from 25 locations in the three lakes. The overall average was 0.036 +/- 0.016 microg g(-1) Hg dry wt. Concentrations across the three lakes ranged from 0.014 to 0.093 microg g(-1) Hg dry wt. There were no significant differences in mercury concentrations among the three lakes (F = 0.07; p = 0.794). These baseline data on contaminants in quagga mussels from the lower Colorado River suggest this species may be used to biomonitor lake health.

  11. Remembering Mead' s 'I-me'-dialectic in organizational socialization theory

    DEFF Research Database (Denmark)

    Revsbæk, Line

    From the standpoint of a recent case study on newcomer innovation during organizational entry, G. H. Mead's theory of becoming a self in community is explored. It is argued that Mead's concept of the 'I-me' dialectic is a key notion for understanding newcomer innovation in process-theory terms. The emphasis on the spontaneous response of the 'I' in Mead's theory supplements the otherwise dominant assimilation perspectives in the field of organizational socialization, and suggests understanding newcomer innovation and assimilation not as an 'either/or' but as simultaneous processes. The presentation of Mead's theory is accentuated with case study narratives. Drawing on Mead's theory, the experience of 'being insider' is understood as a situational attribute, rather than as something a community member starts out not being and at some point becomes for the extent of the membership period.

  12. Developing the remote sensing-based early warning system for monitoring TSS concentrations in Lake Mead.

    Science.gov (United States)

    Imen, Sanaz; Chang, Ni-Bin; Yang, Y Jeffrey

    2015-09-01

    Adjustment of the water treatment process to changes in water quality is a focus area for engineers and managers of water treatment plants. The desired and preferred capability depends on timely and quantitative knowledge of water quality monitoring in terms of total suspended solids (TSS) concentrations. This paper presents the development of a suite of nowcasting and forecasting methods by using high-resolution remote-sensing-based monitoring techniques on a daily basis. First, the integrated data fusion and mining (IDFM) technique was applied to develop a near real-time monitoring system for daily nowcasting of the TSS concentrations. Then a nonlinear autoregressive neural network with external input (NARXNET) model was selected and applied for forecasting analysis of the changes in TSS concentrations over time on a rolling basis onward using the IDFM technique. The implementation of such an integrated forecasting and nowcasting approach was assessed by a case study at Lake Mead hosting the water intake for Las Vegas, Nevada, in the water-stressed western U.S. Long-term monthly averaged results showed no simultaneous impact from forest fire events on accelerating the rise of TSS concentration. However, the results showed a probable impact of a decade of drought on increasing TSS concentration in the Colorado River Arm and Overton Arm. Results of the forecasting model highlight the reservoir water level as a significant parameter in predicting TSS in Lake Mead. In addition, the R-squared value of 0.98 and the root mean square error of 0.5 between the observed and predicted TSS values demonstrates the reliability and application potential of this remote sensing-based early warning system in terms of TSS projections at a drinking water intake. Copyright © 2015 Elsevier Ltd. All rights reserved.
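
    The forecasting component is a nonlinear autoregression with an exogenous input. A minimal NARX-style sketch — lagged TSS values plus a water-level series feeding a small neural network — is shown below on synthetic data; it is a generic stand-in, not the IDFM/NARXNET system itself.

        # NARX-style sketch: predict next-day TSS from lagged TSS plus an
        # exogenous water-level input, on synthetic data (not NARXNET).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)
        t = np.arange(1000)
        level = 350 + 5 * np.sin(2 * np.pi * t / 365)   # fake lake level
        tss = 2 + 0.1 * (360 - level) + 0.05 * rng.normal(size=t.size)
        lev = level - level.mean()                      # centered input

        lags = 3                                        # TSS lags used
        X = np.column_stack([tss[i:i - lags] for i in range(lags)]
                            + [lev[lags - 1:-1]])
        y = tss[lags:]

        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                             random_state=0).fit(X[:800], y[:800])
        print("held-out R^2:", round(model.score(X[800:], y[800:]), 3))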

  13. Update to the Fissile Materials Disposition program SST/SGT transportation estimation

    International Nuclear Information System (INIS)

    John Didlake

    1999-01-01

    This report is an update to "Fissile Materials Disposition Program SST/SGT Transportation Estimation," SAND98-8244, June 1998. The Department of Energy Office of Fissile Materials Disposition requested this update as a basis for providing the public with an updated estimate of the number of transportation loads, load miles, and costs associated with the preferred alternative in the Surplus Plutonium Disposition Final Environmental Impact Statement (EIS)

  14. Informing Estimates of Program Effects for Studies of Mathematics Professional Development Using Teacher Content Knowledge Outcomes.

    Science.gov (United States)

    Phelps, Geoffrey; Kelcey, Benjamin; Jones, Nathan; Liu, Shuangshuang

    2016-10-03

    Mathematics professional development is widely offered, typically with the goal of improving teachers' content knowledge, the quality of teaching, and ultimately students' achievement. Recently, new assessments focused on mathematical knowledge for teaching (MKT) have been developed to assist in the evaluation and improvement of mathematics professional development. This study presents empirical estimates of average program change in MKT and its variation with the goal of supporting the design of experimental trials that are adequately powered to detect a specified program effect. The study drew on a large database representing five different assessments of MKT and collectively 326 professional development programs and 9,365 teachers. Results from cross-classified hierarchical growth models found that standardized average change estimates across the five assessments ranged from a low of 0.16 standard deviations (SDs) to a high of 0.26 SDs. Power analyses using the estimated pre- and posttest change estimates indicated that hundreds of teachers are needed to detect changes in knowledge at the lower end of the distribution. Even studies powered to detect effects at the higher end of the distribution will require substantial resources to conduct rigorous experimental trials. Empirical benchmarks that describe average program change and its variation provide a useful preliminary resource for interpreting the relative magnitude of effect sizes associated with professional development programs and for designing adequately powered trials. © The Author(s) 2016.
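
    The reported change estimates translate directly into sample-size requirements. As a back-of-envelope check — assuming a simple paired t-test design rather than the paper's cross-classified growth models — the statsmodels power solver gives:

        # Teachers needed to detect pre-post gains of d = 0.16 and 0.26
        # at alpha = .05, power = .80, assuming a paired t-test design.
        from statsmodels.stats.power import TTestPower

        solver = TTestPower()
        for effect in (0.16, 0.26):
            n = solver.solve_power(effect_size=effect, alpha=0.05,
                                   power=0.80)
            print(f"d = {effect}: about {round(n)} teachers")
        # d = 0.16 needs roughly 300 teachers; d = 0.26 roughly 120,
        # consistent with "hundreds of teachers" in the abstract.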

  15. Software documentation and user's manual for fish-impingement sampling design and estimation method computer programs

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.

    1977-11-01

    This report contains a description of three computer programs that implement the theory of sampling designs and the methods for estimating fish-impingement at the cooling-water intakes of nuclear power plants as described in companion report ANL/ES-60. Complete FORTRAN listings of these programs, named SAMPLE, ESTIMA, and SIZECO, are given and augmented with examples of how they are used

  16. Key Aspects of the Federal Direct Loan Program's Cost Estimates: Department of Education. Report to Congressional Requesters.

    Science.gov (United States)

    Calbom, Linda M.; Ashby, Cornelia M.

    Because of concerns about the Department of Education's reliance on estimates to project costs of the William D. Ford Federal Direct Loan Program (FDLP) and a lack of historical information on which to base those estimates, Congress asked the General Accounting Office (GAO) to review how the department develops its cost estimates for the program,…

  17. Budget estimates: Fiscal year 1994. Volume 3: Research and program management

    Science.gov (United States)

    1994-01-01

    The research and program management (R&PM) appropriation provides the salaries, other personnel and related costs, and travel support for NASA's civil service workforce. This FY 1994 budget funds costs associated with 23,623 full-time equivalent (FTE) work years. Budget estimates are provided for all NASA centers by categories such as space station and new technology investments, space flight programs, space science, life and microgravity sciences, advanced concepts and technology, center management and operations support, launch services, mission to planet earth, tracking and data programs, aeronautical research and technology, and safety, reliability, and quality assurance.

  18. Development of a package program for estimating ground level concentrations of radioactive gases

    International Nuclear Information System (INIS)

    Nilkamhang, W.

    1986-01-01

    A package program for estimating the ground-level concentration of radioactive gas from an elevated release was developed for use on an IBM PC microcomputer. The main program, GAMMA PLUME NT10, is based on the well-known VALLEY MODEL, a Fortran computer code intended for mainframe computers. Two options were added, namely calculation of the radioactive gas ground-level concentration in Ci/m3 and of the dose equivalent rate in mrem/hr. In addition, a menu program and an editor program were developed to make the package easier to use, since options can be readily selected and input data modified as required through the keyboard. The accuracy and reliability of the program are almost identical to the mainframe version. The ground-level concentration of radioactive radon gas due to ore processing in the nuclear chemistry laboratory of the Department of Nuclear Technology was estimated. In processing radioactive ore at a rate of 2 kg/day, about 35 pCi/s of radioactive gas was released from a 14 m stack. When meteorological data for Don Muang (averaged over the 5 years 1978-1982) were used, the maximum ground-level concentration and dose equivalent rate were found to be 0.00094 pCi/m3 and 5.0 x 10-10 mrem/hr, respectively. The processing time required for the above problem was about 7 minutes per case on an IBM PC, which is acceptable for a computer of this class.
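
    VALLEY-type codes evaluate the Gaussian plume model; its ground-level, plume-centerline form is compact enough to sketch. The source strength and stack height below come from the abstract, while the wind speed and the dispersion-coefficient power laws are illustrative assumptions, not the values in GAMMA PLUME NT10.

        # Ground-level, centerline Gaussian plume concentration:
        #   C = Q / (pi * u * sy * sz) * exp(-H^2 / (2 * sz^2))
        import numpy as np

        Q = 35.0    # source strength, pCi/s (figure from the abstract)
        H = 14.0    # effective stack height, m (from the abstract)
        u = 3.0     # wind speed, m/s (assumed)

        def sigmas(x_m):          # hypothetical stability-class power laws
            return 0.08 * x_m ** 0.90, 0.06 * x_m ** 0.85

        x = np.linspace(50, 3000, 500)   # downwind distances, m
        sy, sz = sigmas(x)
        C = Q / (np.pi * u * sy * sz) * np.exp(-H ** 2 / (2 * sz ** 2))
        i = np.argmax(C)
        print(f"max ~{C[i]:.2e} pCi/m^3 at {x[i]:.0f} m downwind")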

  19. Empirical estimates in stochastic programs with probability and second order stochastic dominance constraints

    Czech Academy of Sciences Publication Activity Database

    Omelchenko, Vadym; Kaňková, Vlasta

    2015-01-01

    Roč. 84, č. 2 (2015), s. 267-281 ISSN 0862-9544 R&D Projects: GA ČR GA13-14445S Institutional support: RVO:67985556 Keywords : Stochastic programming problems * empirical estimates * light and heavy tailed distributions * quantiles Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/E/omelchenko-0454495.pdf

  20. Gompertz: A Scilab Program for Estimating Gompertz Curve Using Gauss-Newton Method of Least Squares

    Directory of Open Access Journals (Sweden)

    Surajit Ghosh Dastidar

    2006-04-01

    Full Text Available A computer program for estimating the Gompertz curve using the Gauss-Newton method of least squares is described in detail. It is based on the estimation technique proposed in Reddy (1985). The program is developed using Scilab (version 3.1.1), a freely available scientific software package that can be downloaded from http://www.scilab.org/. Data is to be fed into the program from an external disk file which should be in Microsoft Excel format. The output will contain sample size, tolerance limit, a list of initial as well as final estimates of the parameters, standard errors, the values of the Gauss-Newton normal equations, namely GN1, GN2 and GN3, number of iterations, variance (σ2), Durbin-Watson statistic, goodness-of-fit measures such as R2, D value, covariance matrix and residuals. It also displays a graphical output of the estimated curve vis-à-vis the observed curve. It is an improved version of the program proposed in Dastidar (2005).
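
    The Gauss-Newton iteration for the Gompertz curve y = a*exp(-b*exp(-c*t)) is short enough to sketch outside Scilab. The undamped textbook iteration below fits synthetic data in Python; it illustrates the method generically rather than reproducing the program's input/output handling.

        # Gauss-Newton fit of the Gompertz curve (generic sketch).
        import numpy as np

        def f(p, t):
            a, b, c = p
            return a * np.exp(-b * np.exp(-c * t))

        def jacobian(p, t):                 # partials wrt a, b, c
            a, b, c = p
            e = np.exp(-c * t)
            g = np.exp(-b * e)
            return np.column_stack([g, -a * e * g, a * b * t * e * g])

        rng = np.random.default_rng(5)
        t = np.arange(0.0, 20.0)
        y = f([100.0, 5.0, 0.3], t) + rng.normal(0, 1.0, t.size)

        p = np.array([80.0, 3.0, 0.2])      # initial estimates
        for _ in range(50):
            r = y - f(p, t)                 # residuals
            J = jacobian(p, t)
            step, *_ = np.linalg.lstsq(J, r, rcond=None)
            p += step                       # Gauss-Newton update
            if np.linalg.norm(step) < 1e-8:
                break
        print("a, b, c =", np.round(p, 3))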

  2. PROFIT-PC: a program for estimating maximum net revenue from multiproduct harvests in Appalachian hardwoods

    Science.gov (United States)

    Chris B. LeDoux; John E. Baumgras; R. Bryan Selbe

    1989-01-01

    PROFIT-PC is a menu-driven, interactive PC (personal computer) program that estimates optimum product mix and maximum net harvesting revenue based on projected product yields and stump-to-mill timber harvesting costs. Required inputs include the number of trees/acre by species and 2-inch diameter-at-breast-height class, delivered product prices by species and product...

  3. Survey of engineering computational methods and experimental programs for estimating supersonic missile aerodynamic characteristics

    Science.gov (United States)

    Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.

    1982-01-01

    This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.

  4. Artificial Neural Networks and Gene Expression Programming based age estimation using facial features

    Directory of Open Access Journals (Sweden)

    Baddrud Z. Laskar

    2015-10-01

    Full Text Available This work concerns estimating human age automatically through analysis of facial images, a task with many real-world applications. Due to rapid advances in machine vision, facial image processing, and computer graphics, automatic age estimation from faces has become a prominent research topic, with applications in biometrics, security, surveillance, control, forensic art, entertainment, online customer management and support, and cosmetology. As it is difficult to estimate an exact age, this system estimates an age range, using four classifications to assign a person's data to one of the age groups. What is distinctive about this study is the use of two techniques, Artificial Neural Networks (ANN) and Gene Expression Programming (GEP), to estimate age and then compare the results. Newer methodologies such as GEP have been explored here and significant results were found. The dataset was preprocessed to provide more reliable results. The proposed approach has been developed, trained and tested using both methods, with the public FG-NET data set used to test the system. The quality of the proposed system for age estimation using facial features is shown by extensive experiments on the available FG-NET database.
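
    The classification stage can be sketched generically: a feature vector per face is mapped into one of four age groups by a small neural network. The features and labels below are random placeholders (so accuracy stays near chance, about 0.25); with real facial measurements the same pipeline applies.

        # Age-group classification sketch with placeholder features.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        X = rng.normal(size=(800, 20))        # placeholder face features
        y = rng.integers(0, 4, size=800)      # four age-group labels
        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

        clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                            random_state=0).fit(Xtr, ytr)
        # Near 0.25 here by construction; real features make it learnable.
        print("held-out accuracy:", round(clf.score(Xte, yte), 3))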

  5. Lift/cruise fan V/STOL technology aircraft design definition study. Volume 3: Development program and budgetary estimates

    Science.gov (United States)

    Obrien, W. J.

    1976-01-01

    The aircraft development program, budgetary estimates in CY 1976 dollars, and cost reduction program variants are presented. Detailed cost matrices are also provided for the mechanical transmission system, turbotip transmission system, and the thrust vector hoods and yaw doors.

  6. Satellite-based mapping of field-scale stress indicators for crop yield forecasting: an application over Mead, NE

    Science.gov (United States)

    Yang, Y.; Anderson, M. C.; Gao, F.; Wardlow, B.; Hain, C.; Otkin, J.; Sun, L.; Dulaney, W.

    2017-12-01

    In agricultural regions, water is one of the most widely limiting factors of crop performance and production. Evapotranspiration (ET) describes crop water use through transpiration and water lost through direct soil evaporation, which makes it a good indicator of soil moisture availability and vegetation health and thus has been an integral part of many yield estimation efforts. The Evaporative Stress Index (ESI) describes temporal anomalies in a normalized evapotranspiration metric (fRET) as derived from satellite remote sensing and has demonstrated capacity to explain regional yield variability in water limited crop growing regions. However, its performance in some regions where the vegetation cycle is intensively managed appears to be degraded. In this study we generated maps of ET, fRET, and ESI at high spatiotemporal resolution (30-m pixels, daily timesteps) using a multi-sensor data fusion method, integrating information from satellite platforms with good temporal coverage and other platforms that provide field-scale spatial detail. The study was conducted over the period 2010-2014, covering a region around Mead, Nebraska that includes both rainfed and irrigated crops. Correlations between ESI and measurements of corn yield are investigated at both the field and county level to assess the value of ESI as a yield forecasting tool. To examine the role of phenology in ESI-yield correlations, annual input fRET timeseries were aligned by both calendar day and by biophysically relevant dates (e.g. days since planting or emergence). Results demonstrate that mapping of fRET and ESI at 30-m has the advantage of being able to resolve different crop types with varying phenology. The study also suggests that incorporating phenological information significantly improves yield-correlations by accounting for effects of phenology such as variable planting date and emergence date. The yield-ESI relationship in this study well captures the inter-annual variability of yields

  7. Estimating data from figures with a Web-based program: Considerations for a systematic review.

    Science.gov (United States)

    Burda, Brittany U; O'Connor, Elizabeth A; Webber, Elizabeth M; Redmond, Nadia; Perdue, Leslie A

    2017-09-01

    Systematic reviewers often encounter incomplete or missing data, and the information desired may be difficult to obtain from a study author. Thus, systematic reviewers may have to resort to estimating data from figures with little or no raw data in a study's corresponding text or tables. We discuss a case study in which participants used a publicly available Web-based program, called webplotdigitizer, to estimate data from 2 figures. We evaluated the intraclass correlation coefficient and the accuracy of the estimates relative to the true data to inform considerations when using estimated data from figures in systematic reviews. The estimates for both figures were consistent, although the distribution of estimates in the figure of a continuous outcome was slightly higher. For the continuous outcome, the percent difference ranged from 0.23% to 30.35% while the percent difference of the event rate ranged from 0.22% to 8.92%. For both figures, the intraclass correlation coefficient was excellent (>0.95). Systematic reviewers should consider and be transparent when estimating data from figures when the information cannot be obtained from study authors and perform sensitivity analyses of pooled results to reduce bias. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Consolidated Fuel Reprocessing Program. Operating experience with pulsed-column holdup estimators

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1986-01-01

    Methods for estimating pulsed-column holdup are being investigated as part of the Safeguards Assessment task of the Consolidated Fuel Reprocessing Program (CFRP) at the Oak Ridge National Laboratory. The CFRP was a major sponsor of test runs at the Barnwell Nuclear Fuel plant (BNFP) in 1980 and 1981. During these tests, considerable measurement data were collected for pulsed columns in the plutonium purification portion of the plant. These data have been used to evaluate and compare three available methods of holdup estimation

  9. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
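
    A flavor of the redundancy equations such a repository holds: triple modular redundancy (TMR) with a perfect voter has reliability R_TMR = 3R^2 - 2R^3, where R = exp(-lambda*t) is one module's reliability. The failure rate below is an assumed value.

        # TMR vs. simplex reliability over mission time (assumed lambda).
        import numpy as np

        lam = 1e-4                    # module failure rate per hour
        t = np.linspace(0, 20000, 5)  # mission times, hours
        R = np.exp(-lam * t)          # single-module reliability
        R_tmr = 3 * R ** 2 - 2 * R ** 3
        for ti, rs, rt in zip(t, R, R_tmr):
            print(f"t={ti:7.0f} h  simplex={rs:.4f}  TMR={rt:.4f}")
        # TMR beats simplex only while R(t) > 0.5, visible above.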

  10. The Expected Loss in the Discretization of Multistage Stochastic Programming Problems - Estimation and Convergence Rate

    Czech Academy of Sciences Publication Activity Database

    Šmíd, Martin

    2009-01-01

    Roč. 165, č. 1 (2009), s. 29-45 ISSN 0254-5330 R&D Projects: GA ČR GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : multistage stochastic programming problems * approximation * discretization * Monte Carlo Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.961, year: 2009 http://library.utia.cas.cz/separaty/2008/E/smid-the expected loss in the discretization of multistage stochastic programming problems - estimation and convergence rate.pdf

  11. Advanced Transportation System Studies. Technical Area 3: Alternate Propulsion Subsystems Concepts. Volume 3; Program Cost Estimates

    Science.gov (United States)

    Levack, Daniel J. H.

    2000-01-01

    The objective of this contract was to provide definition of alternate propulsion systems for both earth-to-orbit (ETO) and in-space vehicles (upper stages and space transfer vehicles). For such propulsion systems, technical data to describe performance, weight, dimensions, etc. was provided along with programmatic information such as cost, schedule, needed facilities, etc. Advanced technology and advanced development needs were determined and provided. This volume separately presents the various program cost estimates that were generated under three tasks: the F-1A Restart Task, the J-2S Restart Task, and the SSME Upper Stage Use Task. The conclusions, technical results, and the program cost estimates are described in more detail in Volume I - Executive Summary and in individual Final Task Reports.

  12. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    Science.gov (United States)

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package fastclime for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.

  13. The Hatfield SCT lunar atlas photographic atlas for Meade, Celestron, and other SCT telescopes

    CERN Document Server

    2014-01-01

    In a major publishing event for lunar observers, the justly famous Hatfield atlas is updated in even more usable form. This version of Hatfield’s classic atlas solves the problem of mirror images, making identification of left-right reversed imaged lunar features both quick and easy. SCT and Maksutov telescopes – which of course include the best-selling models from Meade and Celestron – reverse the visual image left to right. Thus it is extremely difficult to identify lunar features at the eyepiece of one of the instruments using a conventional Moon atlas, as the human brain does not cope well when trying to compare the real thing with a map that is a mirror image of it. Now this issue has at last been solved.   In this atlas the Moon’s surface is shown at various sun angles, and inset keys show the effects of optical librations. Smaller non-mirrored reference images are also included to make it simple to compare the mirrored SCT plates and maps with those that appear in other atlases. This edition s...

  14. The Hatfield SCT lunar atlas photographic atlas for Meade, Celestron and other SCT telescopes

    CERN Document Server

    Cook, Jeremy

    2005-01-01

    Schmidt-Cassegrain telescopes (SCT) and Schmidt-Maksutov telescopes - which include the best-selling models from Meade, Celestron, and other important manufacturers - reverse the visual image left for right, giving a "mirror image". This makes it extremely difficult for observers to identify lunar features at the eyepiece of one of these instruments using conventional atlases, which show the Moon "upside-down" with south at the top. The human brain just doesn't cope well with trying to compare the real thing with a map that is a mirror image of it! The Hatfield SCT Lunar Atlas solves the problem. Photographs and detailed key maps are exactly as the Moon appears through the eyepiece of an SCT or Maksutov telescope. Smaller IAU-standard reference photographs are included on each page, to make it simple to compare the mirrored SCT photographs and maps with those that appear in other conventional atlases. Every owner of an SCT - and that's most amateur astronomers - will want this!

  15. 2003 status report: savings estimates for the ENERGY STAR(R) voluntary labeling program

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla

    2004-11-09

    ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR program activities, focused primarily on labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2002, what we expect in 2003, and provide savings forecasts for two market penetration scenarios for the period 2003 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  16. 2002 status report: Savings estimates for the ENERGY STAR(R) voluntary labeling program

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla; Koomey, Jonathan

    2003-03-03

    ENERGY STAR [registered trademark] is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR program activities, focused primarily on labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2001, what we expect in 2002, and provide savings forecasts for two market penetration scenarios for the period 2002 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  17. Savings estimates for the ENERGY STAR (registered trademark) voluntary labeling program: 2001 status report

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Carrie A.; Brown, Richard E.; Mahajan, Akshay; Koomey, Jonathan G.

    2002-02-15

    ENERGY STAR(Registered Trademark) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR program activities, focused primarily on labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2000, what we expect in 2001, and provide savings forecasts for two market penetration scenarios for the period 2001 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  18. Savings estimates for the ENERGY STAR (registered trademark) voluntary labeling program: 2001 status report

    International Nuclear Information System (INIS)

    Webber, Carrie A.; Brown, Richard E.; Mahajan, Akshay; Koomey, Jonathan G.

    2002-01-01

    ENERGY STAR(Registered Trademark) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR program activities, focused primarily on labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2000, what we expect in 2001, and provide savings forecasts for two market penetration scenarios for the period 2001 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period

  19. Avoided cost estimation and post-reform funding allocation for California's energy efficiency programs

    International Nuclear Information System (INIS)

    Baskette, C.; Horii, B.; Price, S.; Kollman, E.

    2006-01-01

    This paper summarizes the first comprehensive estimation of California's electricity avoided costs since the state reformed its electricity market. It describes avoided cost estimates that vary by time and location, thus facilitating targeted design, funding, and marketing of demand-side management (DSM) and energy efficiency (EE) programs that could not have occurred under the previous methodology of system average cost estimation. The approach, data, and results reflect two important market structure changes: (a) wholesale spot and forward markets now supply electricity commodities to load serving entities; and (b) the evolution of an emissions market that internalizes and prices some of the externalities of electricity generation. The paper also introduces the multiplier effect of a price reduction due to DSM/EE implementation on electricity bills of all consumers. It affirms that area- and time-specific avoided cost estimates can improve the allocation of the state's public funding for DSM/EE programs, a finding that could benefit other parts of North America (e.g. Ontario and New York), which have undergone electricity deregulation. (author)

  20. MoisturEC: A New R Program for Moisture Content Estimation from Electrical Conductivity Data.

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D; Werkema, Dale; Lane, John W

    2018-03-06

    Noninvasive geophysical estimation of soil moisture has potential to improve understanding of flow in the unsaturated zone for problems involving agricultural management, aquifer recharge, and optimization of landfill design and operations. In principle, several geophysical techniques (e.g., electrical resistivity, electromagnetic induction, and nuclear magnetic resonance) offer insight into soil moisture, but data-analysis tools are needed to "translate" geophysical results into estimates of soil moisture, consistent with (1) the uncertainty of this translation and (2) direct measurements of moisture. Although geostatistical frameworks exist for this purpose, straightforward and user-friendly tools are required to fully capitalize on the potential of geophysical information for soil-moisture estimation. Here, we present MoisturEC, a simple R program with a graphical user interface to convert measurements or images of electrical conductivity (EC) to soil moisture. Input includes EC values, point moisture estimates, and definition of either Archie parameters (based on experimental or literature values) or empirical data of moisture vs. EC. The program produces two- and three-dimensional images of moisture based on available EC and direct measurements of moisture, interpolating between measurement locations using a Tikhonov regularization approach. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.
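
    The Archie branch of such an EC-to-moisture conversion is compact enough to sketch: bulk conductivity sigma_b = sigma_w * phi^m * S^n, with volumetric moisture theta = phi * S. The petrophysical parameters below are illustrative assumptions, not MoisturEC defaults.

        # Archie's-law conversion from bulk EC to volumetric moisture.
        # Parameter values are illustrative, not MoisturEC defaults.
        import numpy as np

        sigma_w = 0.05                 # pore-water conductivity, S/m
        porosity, m, n = 0.35, 1.8, 2.0

        def moisture_from_ec(sigma_bulk):
            S = (sigma_bulk / (sigma_w * porosity ** m)) ** (1.0 / n)
            return porosity * np.clip(S, 0.0, 1.0)   # theta = phi * S

        ec = np.array([0.002, 0.005, 0.010])     # bulk EC readings, S/m
        print(np.round(moisture_from_ec(ec), 3)) # volumetric moisture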

  1. MoisturEC: a new R program for moisture content estimation from electrical conductivity data

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Werkema, Dale D.; Lane, John W.

    2018-01-01

    Noninvasive geophysical estimation of soil moisture has potential to improve understanding of flow in the unsaturated zone for problems involving agricultural management, aquifer recharge, and optimization of landfill design and operations. In principle, several geophysical techniques (e.g., electrical resistivity, electromagnetic induction, and nuclear magnetic resonance) offer insight into soil moisture, but data‐analysis tools are needed to “translate” geophysical results into estimates of soil moisture, consistent with (1) the uncertainty of this translation and (2) direct measurements of moisture. Although geostatistical frameworks exist for this purpose, straightforward and user‐friendly tools are required to fully capitalize on the potential of geophysical information for soil‐moisture estimation. Here, we present MoisturEC, a simple R program with a graphical user interface to convert measurements or images of electrical conductivity (EC) to soil moisture. Input includes EC values, point moisture estimates, and definition of either Archie parameters (based on experimental or literature values) or empirical data of moisture vs. EC. The program produces two‐ and three‐dimensional images of moisture based on available EC and direct measurements of moisture, interpolating between measurement locations using a Tikhonov regularization approach.

  2. P3T+: A Performance Estimator for Distributed and Parallel Programs

    Directory of Open Access Journals (Sweden)

    T. Fahringer

    2000-01-01

    Full Text Available Developing distributed and parallel programs on today's multiprocessor architectures is still a challenging task. Particularly distressing is the lack of effective performance tools that support the programmer in evaluating changes in code, problem and machine sizes, and target architectures. In this paper we introduce P3T+, a performance estimator for mostly regular HPF (High Performance Fortran) programs that also partially covers message passing programs (MPI). P3T+ is unique in modeling programs, compiler code transformations, and parallel and distributed architectures. It computes at compile time a variety of performance parameters including work distribution, number of transfers, amount of data transferred, transfer times, computation times, and number of cache misses. Several novel technologies are employed to compute these parameters: loop iteration spaces, array access patterns, and data distributions are modeled by employing highly effective symbolic analysis. Communication is estimated by simulating the behavior of the communication library used by the underlying compiler. Computation times are predicted through pre-measured kernels on every target architecture of interest. We carefully model the most critical architecture-specific factors such as cache line sizes, number of cache lines available, startup times, message transfer time per byte, etc. P3T+ has been implemented and is closely integrated with the Vienna High Performance Compiler (VFC) to support programmers in developing parallel and distributed applications. Experimental results for realistic kernel codes taken from real-world applications are presented to demonstrate both the accuracy and usefulness of P3T+.
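
    As a concrete example of the architecture-specific communication modeling the abstract describes (startup time plus transfer time per byte), the sketch below uses the standard linear communication-cost model. The coefficient values are placeholders, not measurements from P3T+.

```python
def transfer_time(message_bytes: int, startup_s: float = 2.0e-5,
                  per_byte_s: float = 1.0e-9) -> float:
    """Linear communication-cost model T(n) = alpha + beta * n, where alpha
    is the message startup time and beta the transfer time per byte
    (both illustrative placeholder values)."""
    return startup_s + per_byte_s * message_bytes

# Estimated time to exchange a 1 MiB message under the assumed coefficients:
print(f"{transfer_time(1 << 20):.6f} s")
```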

  3. Cost of employee assistance programs: comparison of national estimates from 1993 and 1995.

    Science.gov (United States)

    French, M T; Zarkin, G A; Bray, J W; Hartwell, T D

    1999-02-01

    The cost and financing of mental health services is gaining increasing importance with the spread of managed care and cost-cutting measures throughout the health care system. The delivery of mental health services through structured employee assistance programs (EAPs) could be undermined by revised health insurance contracts and cutbacks in employer-provided benefits at the workplace. This study uses two recently completed national surveys of EAPs to estimate the costs of providing EAP services during 1993 and 1995. EAP costs are determined by program type, worksite size, industry, and region. In addition, information on program services is reported to determine the most common types and categories of services and whether service delivery changes have occurred between 1993 and 1995. The results of this study will be useful to EAP managers, mental health administrators, and mental health services researchers who are interested in the delivery and costs of EAP services.

  4. The cost of crime to society: new crime-specific estimates for policy and program evaluation.

    Science.gov (United States)

    McCollister, Kathryn E; French, Michael T; Fang, Hai

    2010-04-01

    Estimating the cost to society of individual crimes is essential to the economic evaluation of many social programs, such as substance abuse treatment and community policing. A review of the crime-costing literature reveals multiple sources, including published articles and government reports, which collectively represent the alternative approaches for estimating the economic losses associated with criminal activity. Many of these sources are based upon data that are more than 10 years old, indicating a need for updated figures. This study presents a comprehensive methodology for calculating the cost to society of various criminal acts. Tangible and intangible losses are estimated using the most current data available. The selected approach, which incorporates both the cost-of-illness and the jury compensation methods, yields cost estimates for more than a dozen major crime categories, including several categories not found in previous studies. Updated crime cost estimates can help government agencies and other organizations execute more prudent policy evaluations, particularly benefit-cost analyses of substance abuse treatment or other interventions that reduce crime. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  5. 2005 Status Report Savings Estimates for the ENERGY STAR(R)Voluntary Labeling Program

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Carrie A.; Brown, Richard E.; Sanchez, Marla

    2006-03-07

    ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), Energy Star labels exist for more than forty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2004, what we expect in 2005, and provide savings forecasts for two market penetration scenarios for the periods 2005 to 2010 and 2005 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  6. 2004 status report: Savings estimates for the Energy Star(R)voluntarylabeling program

    Energy Technology Data Exchange (ETDEWEB)

    Webber, Carrie A.; Brown, Richard E.; McWhinney, Marla

    2004-03-09

    ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2003, what we expect in 2004, and provide savings forecasts for two market penetration scenarios for the periods 2004 to 2010 and 2004 to 2020. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  7. 2007 Status Report: Savings Estimates for the ENERGY STAR(R)VoluntaryLabeling Program

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Marla; Webber, Carrie A.; Brown, Richard E.; Homan, Gregory K.

    2007-03-23

    ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products, buildings and practices. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than thirty products, spanning office equipment, residential heating and cooling equipment, commercial and residential lighting, home electronics, and major appliances. This report presents savings estimates for a subset of ENERGY STAR labeled products. We present estimates of the energy, dollar and carbon savings achieved by the program in the year 2006, what we expect in 2007, and provide savings forecasts for two market penetration scenarios for the periods 2007 to 2015 and 2007 to 2025. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period.

  8. George Herbert Mead y la psicología social de los objetos

    Directory of Open Access Journals (Sweden)

    Doménech Miquel

    2003-01-01

    Full Text Available For more than a decade, various disciplines in the social sciences have claimed the need for a semiology of the material. Social reality is undoubtedly eminently symbolic, but such symbolism is not confined exclusively to the textual, discursive or linguistic. There are practices beyond this dimension that produce sense and meaning, and objects and things are implicated in them. Which elements define such a semiology? How should these practices be interpreted? How do they relate to the production of the social? The answers come from the formulation of a material culture; its elaboration, however, requires revisiting the proposals that G. H. Mead made in this respect. Indeed, in his work it is possible to find an explanation of the role that objects play in the constitution and maintenance of social identities; to understand how they provide the self with a stable and familiar environment; to examine how the acts of touching and grasping, as the basic relation with the material, play a key role in the construction and maintenance of reality; and, finally, to observe how the relation of the self with the physical world is configured as a social relation. In the present work we review all these questions, and we conclude that they constitute the first steps toward sketching a social psychology of objects.

  9. Parameter estimation of an ARMA model for river flow forecasting using goal programming

    Science.gov (United States)

    Mohammadi, Kourosh; Eslami, H. R.; Kahawita, Rene

    2006-11-01

    Summary: River flow forecasting constitutes one of the most important applications in hydrology. Several methods have been developed for this purpose, and one of the best known techniques is the autoregressive moving average (ARMA) model. In the research reported here, the goal was to minimize the error for a specific season of the year as well as for the complete series. Goal programming (GP) was used to estimate the ARMA model parameters. Shaloo Bridge station on the Karun River, with 68 years of observed stream flow data, was selected to evaluate the performance of the proposed method. When compared with the usual method of maximum likelihood estimation, the results favored the newly proposed algorithm.
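
    To make the idea concrete, the sketch below fits an ARMA(1,1) model by numerically minimizing a weighted combination of seasonal and overall squared error, a simple weighted-sum surrogate for the goal-programming formulation described above. The series, the seasonal mask, and the weight are illustrative assumptions, not the study's data or exact method.

```python
import numpy as np
from scipy.optimize import minimize

def arma11_residuals(params, x):
    """One-step-ahead residuals of x_t = c + phi*x_{t-1} + theta*e_{t-1} + e_t."""
    c, phi, theta = params
    e = np.zeros_like(x)
    for t in range(1, len(x)):
        e[t] = x[t] - c - phi * x[t - 1] - theta * e[t - 1]
    return e

def objective(params, x, season, w=0.7):
    """Penalize the error in the season of interest more heavily than overall."""
    e = arma11_residuals(params, x)
    return w * np.mean(e[season] ** 2) + (1 - w) * np.mean(e ** 2)

flows = np.random.default_rng(42).normal(size=120)  # stand-in for observed flows
season = np.zeros(flows.size, dtype=bool)
season[3::12] = True                                # e.g., every April
fit = minimize(objective, x0=[0.0, 0.5, 0.1], args=(flows, season))
print(fit.x)  # estimated c, phi, theta
```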

  10. Estimate of the area occupied by reforestation programs in Rio de Janeiro state

    Directory of Open Access Journals (Sweden)

    Hugo Barbosa Amorim

    2012-03-01

    Full Text Available This study was based on a preliminary survey and inventory of existing reforestation programs in Rio de Janeiro state, using geoprocessing techniques and field data collection. The reforested area was found to occupy 18,426.96 ha, which amounts to 0.42% of the territory of the state. Most of the reforested area consists of eucalyptus (98%), followed by pine plantations (0.8%), with the remainder distributed among 10 other species. The Médio Paraíba region contributes the most to the reforested area of the state (46.6%). The estimated volume of eucalyptus timber was nearly two million cubic meters. This study helped crystallize the ongoing perception among those working in the forestry sector of Rio de Janeiro state that the planted area and stock of reforestation timber in the state are still incipient.

  11. TRAC-PF1: an advanced best-estimate computer program for pressurized water reactor analysis

    International Nuclear Information System (INIS)

    Liles, D.R.; Mahaffy, J.H.

    1984-02-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos National Laboratory to provide advanced best-estimate predictions of postulated accidents in light water reactors. The TRAC-PF1 program provides this capability for pressurized water reactors and for many thermal-hydraulic experimental facilities. The code features either a one-dimensional or a three-dimensional treatment of the pressure vessel and its associated internals; a two-phase, two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field; flow-regime-dependent constitutive equation treatment; optional reflood tracking capability for both bottom flood and falling-film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. This report describes the thermal-hydraulic models and the numerical solution methods used in the code. Detailed programming and user information also are provided

  12. Semantic Edge Based Disparity Estimation Using Adaptive Dynamic Programming for Binocular Sensors.

    Science.gov (United States)

    Zhu, Dongchen; Li, Jiamao; Wang, Xianshun; Peng, Jingquan; Shi, Wenjun; Zhang, Xiaolin

    2018-04-03

    Disparity calculation is crucial for binocular sensor ranging. Edge-based disparity estimation is an important branch of sparse stereo matching research and plays an important role in visual navigation. In this paper, we propose a robust sparse stereo matching method based on semantic edges. Some simple matching costs are used first, and then a novel adaptive dynamic programming algorithm is proposed to obtain optimal solutions. This algorithm makes use of the disparity or semantic consistency constraint between the stereo images to adaptively search parameters, which improves the robustness of our method. The proposed method is compared quantitatively and qualitatively with the traditional dynamic programming method, several dense stereo matching methods, and an advanced edge-based method. Experiments show that our method provides superior performance in these comparisons.
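
    For readers unfamiliar with dynamic-programming stereo, the sketch below shows the classic non-adaptive scanline formulation (match versus occlusion costs with backtracking) that methods like the one above build on. The occlusion cost is an arbitrary placeholder, and the paper's adaptive parameter search and semantic-edge constraints are not reproduced here.

```python
import numpy as np

def scanline_disparity(left_row, right_row, occlusion_cost=5.0):
    """Classic (non-adaptive) scanline dynamic-programming stereo matching.
    Aligns one rectified left/right scanline pair; returns a disparity per
    left pixel, or -1 where the pixel is judged occluded."""
    n, m = len(left_row), len(right_row)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, :] = occlusion_cost * np.arange(m + 1)
    cost[:, 0] = occlusion_cost * np.arange(n + 1)
    move = np.zeros((n + 1, m + 1), dtype=int)  # 0=match, 1=occlude left, 2=occlude right
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            options = (
                cost[i - 1, j - 1] + abs(float(left_row[i - 1]) - float(right_row[j - 1])),
                cost[i - 1, j] + occlusion_cost,
                cost[i, j - 1] + occlusion_cost,
            )
            move[i, j] = int(np.argmin(options))
            cost[i, j] = options[move[i, j]]
    disparity = np.full(n, -1)
    i, j = n, m
    while i > 0 and j > 0:  # backtrack along the optimal path
        if move[i, j] == 0:
            disparity[i - 1] = (i - 1) - (j - 1)
            i, j = i - 1, j - 1
        elif move[i, j] == 1:
            i -= 1
        else:
            j -= 1
    return disparity

print(scanline_disparity([10, 10, 50, 90], [10, 50, 90, 90]))
```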

  13. READSCAN: A fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    KAUST Repository

    Naeem, Raeece

    2012-11-28

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences on a 20.1 million reads simulated dataset in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan; Contact: raeece.naeem@gmail.com; Supplementary information: Supplementary data are available at Bioinformatics online. © 2012 The Author(s).

  14. Optimal Input Design for Aircraft Parameter Estimation using Dynamic Programming Principles

    Science.gov (United States)

    Morelli, Eugene A.; Klein, Vladislav

    1990-01-01

    A new technique was developed for designing optimal flight test inputs for aircraft parameter estimation experiments. The principles of dynamic programming were used for the design in the time domain. This approach made it possible to include realistic practical constraints on the input and output variables. A description of the new approach is presented, followed by an example for a multiple input linear model describing the lateral dynamics of a fighter aircraft. The optimal input designs produced by the new technique demonstrated improved quality and expanded capability relative to the conventional multiple input design method.

  15. EFFAIR: a computer program for estimating the dispersion of atmospheric emissions from a nuclear site

    International Nuclear Information System (INIS)

    Dormuth, K.W.; Lyon, R.B.

    1978-11-01

    Analysis of the transport of material through the turbulent atmospheric boundary layer is an important part of environmental impact assessments for nuclear plants. Although this is a complex phenomenon, practical estimates of ground level concentrations downwind of release are usually obtained using a simple Gaussian formula whose coefficients are obtained from empirical correlations. Based on this formula, the computer program EFFAIR has been written to provide a flexible tool for atmospheric dispersion calculations. It is considered appropriate for calculating dilution factors at distances of 10² to 10⁴ metres from an effluent source if reflection from the inversion lid is negligible in that range. (author)
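
    The "simple Gaussian formula" referred to above, in its most common ground-level form with full ground reflection, is easy to state in code. The sketch below is a generic textbook implementation, not the EFFAIR source; in practice the dispersion widths sigma_y and sigma_z would come from empirical stability-class correlations rather than being passed in directly.

```python
import numpy as np

def ground_level_concentration(q, u, y, h_eff, sigma_y, sigma_z):
    """Gaussian plume concentration at ground level (z = 0) with reflection:
       C = Q / (pi * u * sy * sz) * exp(-y^2 / (2 sy^2)) * exp(-H^2 / (2 sz^2))
    q: release rate (e.g., Bq/s), u: wind speed (m/s), y: crosswind offset (m),
    h_eff: effective release height (m), sigma_y/sigma_z: dispersion widths (m)."""
    return (q / (np.pi * u * sigma_y * sigma_z)
            * np.exp(-y ** 2 / (2 * sigma_y ** 2))
            * np.exp(-h_eff ** 2 / (2 * sigma_z ** 2)))

# Centerline dilution factor (C/Q) 1 km downwind under assumed sigma values:
print(ground_level_concentration(1.0, 5.0, 0.0, 50.0, 80.0, 40.0))
```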

  16. READSCAN: A fast and scalable pathogen discovery program with accurate genome relative abundance estimation

    KAUST Repository

    Naeem, Raeece; Rashid, Mamoon; Pain, Arnab

    2012-01-01

    Summary: READSCAN is a highly scalable parallel program to identify non-host sequences (of potential pathogen origin) and estimate their genome relative abundance in high-throughput sequence datasets. READSCAN accurately classified human and viral sequences on a 20.1 million reads simulated dataset in <27 min using a small Beowulf compute cluster with 16 nodes (Supplementary Material). Availability: http://cbrc.kaust.edu.sa/readscan; Contact: raeece.naeem@gmail.com; Supplementary information: Supplementary data are available at Bioinformatics online. © 2012 The Author(s).

  17. Estimating Typhoon Rainfall over Sea from SSM/I Satellite Data Using an Improved Genetic Programming

    Science.gov (United States)

    Yeh, K.; Wei, H.; Chen, L.; Liu, G.

    2010-12-01

    This paper proposes an improved multi-run genetic programming (GP) method and applies it to predict rainfall using meteorological satellite data. GP is a well-known evolutionary programming and data mining method used to automatically discover the complex relationships among nonlinear systems. The main advantage of GP is that it optimizes the appropriate types of functions and their associated coefficients simultaneously. This study improves the ability to escape from local optimums during the optimization procedure: the GP is run several times in succession, replacing the terminal nodes at the next run with the best solution of the current run. The improved model thereby obtains a highly nonlinear mathematical equation to estimate the rainfall. In the case study, this improved GP is combined with SSM/I satellite data to establish a suitable method for estimating rainfall at the sea surface during typhoon periods. The estimated rainfalls are then verified with data from four rainfall stations located at Peng-Jia-Yu, Don-Gji-Dao, Lan-Yu, and Green Island, four small islands around Taiwan. The results show that the improved GP can generate a sophisticated and accurate nonlinear mathematical equation through two-run learning procedures that outperforms traditional multiple linear regression, empirical equations, and back-propagation networks.

  18. Monitoring multiple species: Estimating state variables and exploring the efficacy of a monitoring program

    Science.gov (United States)

    Mattfeldt, S.D.; Bailey, L.L.; Grant, E.H.C.

    2009-01-01

    Monitoring programs have the potential to identify population declines and differentiate among the possible cause(s) of these declines. Recent criticisms regarding the design of monitoring programs have highlighted a failure to clearly state objectives and to address detectability and spatial sampling issues. Here, we incorporate these criticisms to design an efficient monitoring program whose goals are to determine environmental factors which influence the current distribution and measure change in distributions over time for a suite of amphibians. In designing the study we (1) specified a priori factors that may relate to occupancy, extinction, and colonization probabilities and (2) used the data collected (incorporating detectability) to address our scientific questions and adjust our sampling protocols. Our results highlight the role of wetland hydroperiod and other local covariates in the probability of amphibian occupancy. There was a change in overall occupancy probabilities for most species over the first three years of monitoring. Most colonization and extinction estimates were constant over time (years) and space (among wetlands), with one notable exception: local extinction probabilities for Rana clamitans were lower for wetlands with longer hydroperiods. We used information from the target system to generate scenarios of population change and gauge the ability of the current sampling to meet monitoring goals. Our results highlight the limitations of the current sampling design, emphasizing the need for long-term efforts, with periodic re-evaluation of the program in a framework that can inform management decisions.

  19. Estimating landholders' probability of participating in a stewardship program, and the implications for spatial conservation priorities.

    Directory of Open Access Journals (Sweden)

    Vanessa M Adams

    Full Text Available The need to integrate social and economic factors into conservation planning has become a focus of academic discussions and has important practical implications for the implementation of conservation areas, both private and public. We conducted a survey in the Daly Catchment, Northern Territory, to inform the design and implementation of a stewardship payment program. We used a choice model to estimate the likely level of participation in two legal arrangements--conservation covenants and management agreements--based on payment level and proportion of properties required to be managed. We then spatially predicted landholders' probability of participating at the resolution of individual properties and incorporated these predictions into conservation planning software to examine the potential for the stewardship program to meet conservation objectives. We found that the properties that were least costly, per unit area, to manage were also the least likely to participate. This highlights a tension between planning for a cost-effective program and planning for a program that targets properties with the highest probability of participation.

  20. PEDIC - A computer program to estimate the effect of evacuation on population exposure following acute radionuclide releases to the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Strenge, D. L.; Peloquin, R. A.

    1981-01-01

    The computer program PEDIC is described for estimation of the effect of evacuation on population exposure. The program uses joint frequency, annual average meteorological data and a simple population evacuation model to estimate exposure reduction due to movement of people away from radioactive plumes following an acute release of activity. Atmospheric dispersion is based on a sector averaged Gaussian model with consideration of plume rise and building wake effects. Appendices to the report provide details of the computer program design, a program listing, input card preparation instructions and sample problems.

  1. Savings estimates for the Energy Star(registered trademark) voluntary labeling program

    International Nuclear Information System (INIS)

    Webber, Carrie A.; Brown, Richard E.; Koomey, Jonathan G.

    2000-01-01

    ENERGY STAR(R) is a voluntary labeling program designed to identify and promote energy-efficient products. Operated jointly by the Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE), ENERGY STAR labels exist for more than twenty products, spanning office equipment, residential heating and cooling equipment, new homes, commercial and residential lighting, home electronics, and major appliances. We present estimates of the energy, dollar and carbon savings already achieved by the program and provide savings forecasts for several market penetration scenarios for the period 2001 to 2010. The target market penetration forecast represents our best estimate of future ENERGY STAR savings. It is based on realistic market penetration goals for each of the products. We also provide a forecast under the assumption of 100 percent market penetration; that is, we assume that all purchasers buy ENERGY STAR-compliant products instead of standard efficiency products throughout the analysis period. Finally, we assess the sensitivity of our target penetration case forecasts to greater or lesser marketing success by EPA and DOE, lower-than-expected future energy prices, and higher or lower rates of carbon emissions by electricity generators.

  2. The Air Force Mobile Forward Surgical Team (MFST): Using the Estimating Supplies Program to Validate Clinical Requirement

    National Research Council Canada - National Science Library

    Nix, Ralph E; Onofrio, Kathleen; Konoske, Paula J; Galarneau, Mike R; Hill, Martin

    2004-01-01

    The primary objective of the study was to provide the Air Force with the ability to validate clinical requirements of the MFST assemblage, with the goal of using NHRC's Estimating Supplies Program (ESP)…

  3. Estimation of radiation exposure from lung cancer screening program with low-dose computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Su Yeon; Jun, Jae Kwan [Graduate School of Cancer Science and Policy, National Cancer Center, Seoul (Korea, Republic of)

    2016-12-15

    The National Lung Screening Trial (NLST) demonstrated that screening with low-dose computed tomography (LDCT) reduced lung cancer mortality in a high-risk population. Recently, the United States Preventive Services Task Force (USPSTF) gave a B recommendation to annual LDCT screening for individuals at high risk. With these promising results, Korea developed a lung cancer screening guideline and is planning a pilot study for implementation of national lung cancer screening. With widespread adoption of lung cancer screening with LDCT, there are concerns about the harms of screening, including high false-positive rates and radiation exposure. Over the 3 rounds of screening in the NLST, 96.4% of positive results were false positives. Although the initial screening is performed at low dose, subsequent diagnostic examinations following positive results additively contribute to a patient's lifetime exposure. As large-scale screening programs are implemented, there is no established risk assessment of the radiation exposure resulting from a long-term screening program. Thus, the purpose of this study was to estimate the cumulative radiation exposure of an annual LDCT lung cancer screening program over a 20-year period.

  4. A PC program for estimating organ dose and effective dose values in computed tomography

    International Nuclear Information System (INIS)

    Kalender, W.A.; Schmidt, B.; Schmidt, M.; Zankl, M.

    1999-01-01

    Dose values in CT are specified by the manufacturers for all CT systems and operating conditions in phantoms. It is not trivial, however, to derive dose values in patients from this information. Therefore, we have developed a PC-based program which calculates organ dose and effective dose values for arbitrary scan parameters and anatomical ranges. Values for primary radiation are derived from measurements or manufacturer specifications; values for scattered radiation are derived from Monte Carlo calculations tabulated for standard anthropomorphic phantoms. Based on these values, organ doses can be computed by the program for arbitrary scan protocols in conventional and in spiral CT. Effective dose values are also provided, both with ICRP 26 and ICRP 60 tissue-weighting coefficients. Results for several standard CT protocols are presented in tabular form in this paper. In addition, potential for dose reduction is demonstrated, for example, in spiral CT and in quantitative CT. Providing realistic patient dose estimates for arbitrary CT protocols is relevant both for the physician and the patient, and it is particularly useful for educational and training purposes. The program, called WinDose, is now in use at the Erlangen University hospitals (Germany) as an information tool for radiologists and patients. Further extensions are planned. (orig.)
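
    The effective-dose step such a program performs can be pictured as a weighted sum of organ doses, E = Σ w_T · H_T. The snippet below uses the ICRP 60 tissue-weighting factors mentioned in the abstract; the organ-dose inputs are hypothetical, and the snippet is a minimal sketch, not WinDose's implementation.

```python
# ICRP 60 tissue weighting factors (they sum to 1.0)
ICRP60_WT = {
    "gonads": 0.20, "bone_marrow": 0.12, "colon": 0.12, "lung": 0.12,
    "stomach": 0.12, "bladder": 0.05, "breast": 0.05, "liver": 0.05,
    "oesophagus": 0.05, "thyroid": 0.05, "skin": 0.01, "bone_surface": 0.01,
    "remainder": 0.05,
}

def effective_dose(organ_doses_mSv):
    """Effective dose E = sum over tissues of w_T * H_T (result in mSv)."""
    return sum(ICRP60_WT[t] * h for t, h in organ_doses_mSv.items())

# Hypothetical organ doses from a chest CT protocol:
print(effective_dose({"lung": 20.0, "breast": 18.0, "thyroid": 2.0}))  # 3.4 mSv
```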

  5. Education Demographic and Geographic Estimates Program (EDGE): Locale Boundaries User's Manual. NCES 2016-012

    Science.gov (United States)

    Geverdt, Douglas E.

    2015-01-01

    The National Center for Education Statistics (NCES) Education Demographic and Geographic Estimates (EDGE) program develops geographic data to help policymakers, program administrators, and the public understand relationships between educational institutions and the communities they serve. One of the commonly used geographic data items is the NCES…

  6. Dynamic Programming and Error Estimates for Stochastic Control Problems with Maximum Cost

    International Nuclear Information System (INIS)

    Bokanowski, Olivier; Picarelli, Athena; Zidani, Hasnaa

    2015-01-01

    This work is concerned with stochastic optimal control for a running maximum cost. A direct approach based on dynamic programming techniques is studied leading to the characterization of the value function as the unique viscosity solution of a second order Hamilton–Jacobi–Bellman (HJB) equation with an oblique derivative boundary condition. A general numerical scheme is proposed and a convergence result is provided. Error estimates are obtained for the semi-Lagrangian scheme. These results can apply to the case of lookback options in finance. Moreover, optimal control problems with maximum cost arise in the characterization of the reachable sets for a system of controlled stochastic differential equations. Some numerical simulations on examples of reachable analysis are included to illustrate our approach

  7. Dynamic Programming and Error Estimates for Stochastic Control Problems with Maximum Cost

    Energy Technology Data Exchange (ETDEWEB)

    Bokanowski, Olivier, E-mail: boka@math.jussieu.fr [Laboratoire Jacques-Louis Lions, Université Paris-Diderot (Paris 7) UFR de Mathématiques - Bât. Sophie Germain (France); Picarelli, Athena, E-mail: athena.picarelli@inria.fr [Projet Commands, INRIA Saclay & ENSTA ParisTech (France); Zidani, Hasnaa, E-mail: hasnaa.zidani@ensta.fr [Unité de Mathématiques appliquées (UMA), ENSTA ParisTech (France)

    2015-02-15

    This work is concerned with stochastic optimal control for a running maximum cost. A direct approach based on dynamic programming techniques is studied leading to the characterization of the value function as the unique viscosity solution of a second order Hamilton–Jacobi–Bellman (HJB) equation with an oblique derivative boundary condition. A general numerical scheme is proposed and a convergence result is provided. Error estimates are obtained for the semi-Lagrangian scheme. These results can apply to the case of lookback options in finance. Moreover, optimal control problems with maximum cost arise in the characterization of the reachable sets for a system of controlled stochastic differential equations. Some numerical simulations on examples of reachable analysis are included to illustrate our approach.

  8. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  9. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Full Text Available Reliable information on current state parameters, obtained by processing measurements from the SCADA and WAMS data acquisition systems with state estimation (SE) methods, is a precondition for successful control of an electric power system (EPS). SCADA and WAMS systems, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure can detect erroneous measurements and is therefore a barrier that keeps distorted information from penetrating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce a wrong decision due to imperfect software algorithms and errors. In this study, we propose using a fault tree to analyze the consequences of failures and faults in SCADA and WAMS and in the SE procedure itself. Based on the analysis of the obtained measurement information and of the SE results, we determine the fault tolerance level of the state estimation PCS, which characterizes its reliability.

  10. Estimating impacts of a breakfast in the classroom program on school outcomes.

    Science.gov (United States)

    Anzman-Frasca, Stephanie; Djang, Holly Carmichael; Halmo, Megan M; Dolan, Peter R; Economos, Christina D

    2015-01-01

    Short-term impacts of breakfast consumption on diet quality and cognitive functioning have been reported, but more evidence is needed to draw causal inferences about long-term impacts of school breakfast on indicators of school engagement and academic achievement. The objective was to estimate the impact of a Breakfast in the Classroom (BIC) program on School Breakfast Program participation, school attendance, and academic achievement. This quasi-experimental study included a sample of 446 public elementary schools from a large, urban US school district that served predominantly low-income, racial/ethnic minority students. A total of 257 schools (57.6%) implemented a BIC program during the 2012-2013 academic year, whereas 189 (42.4%) did not. School- and grade-level data from 2012-2013 and grade-level achievement data from the prior year were collected from school district records across the elementary schools. Hypotheses that a BIC program would improve school breakfast participation at the school level, school attendance at the grade level (kindergarten through sixth grade), and academic achievement at the grade level (second through sixth grades) were tested using propensity score weights to adjust for demographic differences between the BIC and non-BIC schools. The BIC program was linked with increased breakfast participation during the academic year (F(10,414) = 136.90, P < .001). When performing attendance analyses in the subset of grade levels for which achievement data were available, results were mostly consistent, although there was a group × time interaction (F(10,1891) = 1.94, P = .04) such that differences between least squares means in the BIC vs non-BIC groups did not reach statistical significance at every month. There were no group differences in standardized test performance in math (57.9% in the BIC group vs 57.4% in the non-BIC group; F(1,1890) = 0.41, P = .52) or reading (44.9% in the BIC group vs 44.7% in the non-BIC group; F(1,1890) = 0.15, P = .70). Findings add to the evidence that BIC can increase participation in the School Breakfast Program.

  11. Reproductive Responses of Common Carp Cyprinus carpio in Cages to Influent of the Las Vegas Wash in Lake Mead, Nevada, from late Winter to early Spring

    Science.gov (United States)

    To investigate the potential for contaminants in Las Vegas Wash (LW) influent to produce effects indicative of endocrine disruption in vivo, adult male and female common carp were exposed in cages for 42-48 d at four sites and two reference locations in Lake Mead.

  12. Estimating the cost of saving electricity through U.S. utility customer-funded energy efficiency programs

    International Nuclear Information System (INIS)

    Hoffman, Ian M.; Goldman, Charles A.; Rybka, Gregory; Leventis, Greg; Schwartz, Lisa; Sanstad, Alan H.; Schiller, Steven

    2017-01-01

    The program administrator and total cost of saved energy allow comparison of the cost of efficiency across utilities, states, and program types, and can identify potential performance improvements. Comparing program administrator cost with the total cost of saved energy can indicate the degree to which programs leverage investment by participants. Based on reported total costs and savings information for U.S. utility efficiency programs from 2009 to 2013, we estimate the savings-weighted average total cost of saved electricity across 20 states at $0.046 per kilowatt-hour (kWh), comparing favorably with energy supply costs and retail rates. Programs targeted on the residential market averaged $0.030 per kWh compared to $0.053 per kWh for non-residential programs. Lighting programs, with an average total cost of $0.018 per kWh, drove lower savings costs in the residential market. We provide estimates for the most common program types and find that program administrators and participants on average are splitting the costs of efficiency in half. More consistent, standardized and complete reporting on efficiency programs is needed. Differing definitions and quantification of costs, savings and savings lifetimes pose challenges for comparing program results. Reducing these uncertainties could increase confidence in efficiency as a resource among planners and policymakers.
    Highlights:
    • The cost of saved energy allows comparisons among energy resource investments.
    • Findings from the most expansive collection yet of total energy efficiency program costs.
    • The weighted average total cost of saved electricity was $0.046 per kWh for 20 states in 2009-2013.
    • Averages in the residential and non-residential sectors were $0.030 and $0.053 per kWh, respectively.
    • Results strongly indicate the need for more consistent, reliable and complete reporting on efficiency programs.
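
    A common way to compute levelized cost-of-saved-energy figures like those quoted above is to annualize program costs with a capital recovery factor and divide by annual savings. The sketch below shows that standard calculation; the discount rate and example numbers are illustrative, not the study's inputs.

```python
def cost_of_saved_energy(total_cost_usd, annual_kwh_saved,
                         measure_life_yr, discount_rate=0.06):
    """Levelized cost of saved electricity ($/kWh):
       CSE = total_cost * CRF / annual_savings, with
       CRF = r(1+r)^n / ((1+r)^n - 1)."""
    r, n = discount_rate, measure_life_yr
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    return total_cost_usd * crf / annual_kwh_saved

# e.g., a $1M program saving 2 GWh/yr with a 12-year measure life:
print(round(cost_of_saved_energy(1_000_000, 2_000_000, 12), 3))  # ~$0.06/kWh
```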

  13. Investigations of the Effects of Synthetic Chemicals on the Endocrine System of Common Carp in Lake Mead, Nevada and Arizona

    Science.gov (United States)

    Rosen, Michael R.; Goodbred, Steven L.; Patiño, Reynaldo; Leiker, Thomas A.; Orsak, Erik

    2006-01-01

    Introduction: Lake Mead is the largest reservoir by volume in the United States and was created by the construction of the 221-meter-high Hoover Dam in 1935 at Black Canyon on the lower Colorado River between Nevada and Arizona (fig. 1). Inflows of water into the lake include three rivers, Colorado, Virgin, and Muddy, as well as Las Vegas Wash, which is now perennial because of discharges from municipal wastewater treatment plants (Covay and Leiker, 1998) and urban stormwater runoff. As the population within the Las Vegas Valley began to increase in the 1940s, the treated effluent volume also increased, and in 1993 it constituted about 96 percent of the annual discharge of Las Vegas Wash (Bevans and others, 1996). The mean flow of Las Vegas Wash into Las Vegas Bay from 1992 to 1998 was about 490,000 m³/d (Preissler and others, 1999) and in 2001 increased to 606,000 m³/d (U.S. Bureau of Reclamation, 2001). The nutrient concentration in most areas of the lake is low, but wastewater discharged into Las Vegas Bay has caused an increased level of nutrients and primary productivity (aquatic plant and algal production) in this area of the lake (LaBounty and Horn, 1997). A byproduct of this increase in productivity has been the establishment of an important recreational fishery in Las Vegas Bay. However, concentrations of chlorophyll a (a measure of algal biomass) have also increased (LaBounty and Horn, 1997). In the spring of 2001, parts of Lake Mead experienced massive algal blooms. In addition to nutrient loading by wastewater, the presence of numerous synthetic chemicals in water, bottom sediments, and fish tissue has also been reported (Bevans and others, 1996). Synthetic chemicals discharging into Las Vegas Bay and Lake Mead (fig. 1) originate from several sources that include surplus residential-irrigation water runoff, stormwater runoff, subsurface inflow, and tertiary treated sewage effluent discharging from three sewage-treatment plants. Chemicals detected…

  14. Development of an Accelerated Test Design for Predicting the Service Life of the Solar Array at Mead, Nebraska

    Science.gov (United States)

    Gaines, G. B.; Thomas, R. E.; Noel, G. T.; Shilliday, T. S.; Wood, V. E.; Carmichael, D. C.

    1979-01-01

    An accelerated life test is described which was developed to predict the life of the 25 kW photovoltaic array installed near Mead, Nebraska. A quantitative model for accelerating testing using multiple environmental stresses was used to develop the test design. The model accounts for the effects of thermal stress by a relation of the Arrhenius form. This relation was then corrected for the effects of nonthermal environmental stresses, such as relative humidity, atmospheric pollutants, and ultraviolet radiation. The correction factors for the nonthermal stresses included temperature-dependent exponents to account for the effects of interactions between thermal and nonthermal stresses on the rate of degradation of power output. The test conditions, measurements, and data analyses for the accelerated tests are presented. Constant-temperature, cyclic-temperature, and UV types of tests are specified, incorporating selected levels of relative humidity and chemical contamination and an imposed forward-bias current and static electric field.
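
    The quantitative model described above, an Arrhenius thermal term corrected by nonthermal stress factors, is commonly written in a Peck-like form. The sketch below shows that general shape only; the activation energy and humidity exponent are illustrative placeholders, not the Battelle test parameters, and the interaction terms of the original design are not reproduced.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_test_k, t_use_k, rh_test, rh_use,
                        ea_ev=0.7, n_rh=2.5):
    """Arrhenius thermal acceleration times a humidity power-law correction:
       AF = exp(Ea/k * (1/T_use - 1/T_test)) * (RH_test / RH_use)^n
    Ea (eV) and the exponent n are illustrative, not fitted values."""
    thermal = math.exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_test_k))
    humidity = (rh_test / rh_use) ** n_rh
    return thermal * humidity

# 85 degC / 85% RH chamber vs. 25 degC / 50% RH field exposure:
print(acceleration_factor(358.15, 298.15, 85.0, 50.0))
```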

  15. Estimation of an Examinee's Ability in the Web-Based Computerized Adaptive Testing Program IRT-CAT

    Directory of Open Access Journals (Sweden)

    Yoon-Hwan Lee

    2006-11-01

    Full Text Available We developed a program to estimate an examinee's ability in order to provide freely available access to a web-based computerized adaptive testing (CAT) program. We used PHP and JavaScript as the programming languages, PostgreSQL as the database management system on an Apache web server, and Linux as the operating system. A system which allows for user input and searching within inputted items and creates tests was constructed. We performed an ability estimation on each test based on a Rasch model and two- or three-parameter logistic models. Our system provides an algorithm for a web-based CAT, replacing previous personal computer-based ones, and makes it possible to estimate an examinee's ability immediately at the end of the test.
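
    A minimal version of the Rasch ability update at the heart of such a CAT system is shown below, using Newton-Raphson maximum likelihood. The item difficulties and responses are made up for illustration, and real CAT code must also handle all-correct or all-incorrect response patterns, for which the MLE diverges.

```python
import math

def rasch_ability(responses, difficulties, theta=0.0, iters=25):
    """Newton-Raphson MLE of ability theta under the Rasch model,
    where P(correct) = 1 / (1 + exp(-(theta - b))) for item difficulty b."""
    for _ in range(iters):
        p = [1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties]
        gradient = sum(u - pi for u, pi in zip(responses, p))
        information = sum(pi * (1.0 - pi) for pi in p)
        theta += gradient / information
    return theta

# Five hypothetical items of increasing difficulty:
print(rasch_ability([1, 1, 0, 1, 0], [-1.0, -0.5, 0.0, 0.5, 1.0]))
```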

  16. O conceito "socialização" caiu em desuso? Uma análise dos processos de socialização na infância com base em Georg Simmel e George H. Mead Is the "socialization" concept outdated? An analysis of the socialization processes in childhood according to Georg Simmel and George H. Mead

    Directory of Open Access Journals (Sweden)

    Tamara Grigorowitschs

    2008-04-01

    Full Text Available This paper presents an analysis of George H. Mead's and Georg Simmel's works on the concept of socialization processes in the light of questions raised within the field of childhood sociology. It approaches the development of Simmel's concept of socialization processes and establishes a correlation between the Simmelian conceptions and Mead's work on the development of the self, in an attempt to define the socialization processes in childhood. It is intended to demonstrate how Simmel's and Mead's works allow us to think about childhood as a specific period of the socialization processes, in which children play active roles in building their individual selves as well as the society and culture they are part of.

  17. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize computer programs for the probabilistic estimation of design earthquakes. In Part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to seismic hazard characterization of the Korean Peninsula. In Part II, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process. Earthquake records are then simulated using the estimated parameters. Finally, predictive equations constructed from the simulations are given in terms of magnitude and hypocentral distance.

  18. Development of Quadratic Programming Algorithm Based on Interior Point Method with Estimation Mechanism of Active Constraints

    Science.gov (United States)

    Hashimoto, Hiroyuki; Takaguchi, Yusuke; Nakamura, Shizuka

    Instability of the calculation process and growth in calculation time with increasing problem size remain the major issues to be solved before continuous optimization techniques can be applied to practical industrial systems. This paper proposes an enhanced quadratic programming algorithm based on the interior point method, aimed mainly at improving calculation stability. The proposed method has a dynamic estimation mechanism for active constraints on variables, which fixes variables getting close to their upper/lower limits and afterwards releases the fixed ones as needed during the optimization process. It can be considered an algorithm-level integration of the solution strategy of the active-set method into the interior point method framework. We describe numerical results on the commonly used benchmark problems called "CUTEr" to show the effectiveness of the proposed method. Furthermore, test results on a large-sized ELD problem (Economic Load Dispatching problems in electric power supply scheduling) are also described as a practical industrial application.

  19. Designing a monitoring program to estimate estuarine survival of anadromous salmon smolts: simulating the effect of sample design on inference

    Science.gov (United States)

    Romer, Jeremy D.; Gitelman, Alix I.; Clements, Shaun; Schreck, Carl B.

    2015-01-01

    A number of researchers have attempted to estimate salmonid smolt survival during outmigration through an estuary. However, it is currently unclear how the design of such studies influences the accuracy and precision of survival estimates. In this simulation study we consider four patterns of smolt survival probability in the estuary, and test the performance of several different sampling strategies for estimating estuarine survival assuming perfect detection. The four survival probability patterns each incorporate a systematic component (constant, linearly increasing, increasing and then decreasing, and two pulses) and a random component to reflect daily fluctuations in survival probability. Generally, spreading sampling effort (tagging) across the season resulted in more accurate estimates of survival. All sampling designs in this simulation tended to under-estimate the variation in the survival estimates because seasonal and daily variation in survival probability are not incorporated in the estimation procedure. This under-estimation results in poorer performance of estimates from larger samples. Thus, tagging more fish may not result in better estimates of survival if important components of variation are not accounted for. The results of our simulation incorporate survival probabilities and run distribution data from previous studies to help illustrate the tradeoffs among sampling strategies in terms of the number of tags needed and distribution of tagging effort. This information will assist researchers in developing improved monitoring programs and encourage discussion regarding issues that should be addressed prior to implementation of any telemetry-based monitoring plan. We believe implementation of an effective estuary survival monitoring program will strengthen the robustness of life cycle models used in recovery plans by providing missing data on where and how much mortality occurs in the riverine and estuarine portions of smolt migration. These data

  20. Cost Estimation of Software Development and the Implications for the Program Manager

    Science.gov (United States)

    1992-06-01

    Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive... function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and... Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome...

  1. Management and Data Management Plan for Remedial Investigation at Fort George G. Meade Landfill and Preliminary Assessment/Site Investigation at the Former Gaithersburg NIKE Control and Launch Areas

    National Research Council Canada - National Science Library

    Edwards, D

    1989-01-01

    Work assignments under this contract will include a Preliminary Assessment/Site Investigation at the former Gaithersburg NIKE Control and Launch Areas and a Remedial Investigation at the Fort Meade...

  2. Estimates of Intraclass Correlation Coefficients from Longitudinal Group-Randomized Trials of Adolescent HIV/STI/Pregnancy Prevention Programs

    Science.gov (United States)

    Glassman, Jill R.; Potter, Susan C.; Baumler, Elizabeth R.; Coyle, Karin K.

    2015-01-01

    Introduction: Group-randomized trials (GRTs) are one of the most rigorous methods for evaluating the effectiveness of group-based health risk prevention programs. Efficiently designing GRTs with a sample size that is sufficient for meeting the trial's power and precision goals while not wasting resources exceeding them requires estimates of the…

  3. iBEST: a program for burnup history estimation of spent fuels based on ORIGEN-S

    International Nuclear Information System (INIS)

    Kim, Do Yeon; Hong, Ser Gi; Ahn, Gil Hoon

    2015-01-01

    In this paper, we describe a computer program, iBEST (inverse Burnup ESTimator), that we developed to accurately estimate the burnup histories of spent nuclear fuels based on sample measurement data. The burnup history parameters include initial uranium enrichment, burnup, cooling time after discharge from reactor, and reactor type. The program uses algebraic equations derived using the simplified burnup chains of major actinides for initial estimations of burnup and uranium enrichment, and it uses the ORIGEN-S code to correct its initial estimations for improved accuracy. In addition, we newly developed a stable bisection method coupled with ORIGEN-S to correct burnup and enrichment values and implemented it in iBEST in order to fully take advantage of the new capabilities of ORIGEN-S for improving accuracy. The iBEST program was tested using several problems for verification and well-known realistic problems with measurement data from spent fuel samples from the Mihama-3 reactor for validation. The test results show that iBEST accurately estimates the burnup history parameters for the test problems and gives an acceptable level of accuracy for the realistic Mihama-3 problems
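
    The "stable bisection method coupled with ORIGEN-S" can be pictured as a root search on burnup in which each function evaluation is a depletion run. The sketch below substitutes a toy monotonic function for ORIGEN-S; the bracket, tolerance, and ratio function are all illustrative assumptions, not values from iBEST.

```python
def bisect_burnup(measured_ratio, simulated_ratio, lo=1.0, hi=80.0, tol=1e-3):
    """Find the burnup (e.g., in GWd/tU) at which a simulated isotope ratio
    matches the measured value. `simulated_ratio` stands in for a full
    ORIGEN-S depletion calculation and is assumed monotonic in burnup."""
    f = lambda b: simulated_ratio(b) - measured_ratio
    assert f(lo) * f(hi) < 0, "the root must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Toy stand-in: the ratio grows linearly with burnup
print(bisect_burnup(0.5, lambda b: b / 60.0))  # ~30.0
```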

  4. The Displacement Effect of Labour-Market Programs: Estimates from the MONASH Model

    OpenAIRE

    Peter B. Dixon; Maureen T. Rimmer

    2005-01-01

    A key question concerning labour-market programs is the extent to which they generate jobs for their target group at the expense of others. This effect is measured by displacement percentages. We describe a version of the MONASH model designed to quantify the effects of labour-market programs. Our simulation results suggests that: (1) labour-market programs can generate significant long-run increases in employment; (2) displacement percentages depend on how a labour-market program affects the...

  5. Writing women into medical history in the 1930s: Kate Campbell Hurd-Mead and "medical women" of the past and present.

    Science.gov (United States)

    Appel, Toby A

    2014-01-01

    Kate Campbell Hurd-Mead (1867–1941), a leader among second-generation women physicians in America, became a pioneer historian of women in medicine in the 1930s. The coalescence of events in her personal life, the declining status of women in medicine, and the growing significance of the new and relatively open field of history of medicine all contributed to this transformation in her career. While she endeavored to become part of the community of male physicians who wrote medical history, her primary identity remained that of a “medical woman.” For Hurd-Mead, the history of women in the past not only filled a vital gap in scholarship but served practical ends that she had earlier pursued by other means—those of inspiring and advancing the careers of women physicians of the present day, promoting organizations of women physicians, and advocating for equality of opportunity in the medical profession.

  6. Estimating the Counterfactual Impact of Conservation Programs on Land Cover Outcomes: The Role of Matching and Panel Regression Techniques.

    Science.gov (United States)

    Jones, Kelly W; Lewis, David J

    2015-01-01

    Deforestation and conversion of native habitats continues to be the leading driver of biodiversity and ecosystem service loss. A number of conservation policies and programs are implemented--from protected areas to payments for ecosystem services (PES)--to deter these losses. Currently, empirical evidence on whether these approaches stop or slow land cover change is lacking, but there is increasing interest in conducting rigorous, counterfactual impact evaluations, especially for many new conservation approaches, such as PES and REDD, which emphasize additionality. In addition, several new, globally available and free high-resolution remote sensing datasets have increased the ease of carrying out an impact evaluation on land cover change outcomes. While the number of conservation evaluations utilizing 'matching' to construct a valid control group is increasing, the majority of these studies use simple differences in means or linear cross-sectional regression to estimate the impact of the conservation program using this matched sample, with relatively few utilizing fixed effects panel methods--an alternative estimation method that relies on temporal variation in the data. In this paper we compare the advantages and limitations of (1) matching to construct the control group combined with differences in means and cross-sectional regression, which control for observable forms of bias in program evaluation, to (2) fixed effects panel methods, which control for observable and time-invariant unobservable forms of bias, with and without matching to create the control group. We then use these four approaches to estimate forest cover outcomes for two conservation programs: a PES program in Northeastern Ecuador and strict protected areas in European Russia. In the Russia case we find statistically significant differences across estimators--due to the presence of unobservable bias--that lead to differences in conclusions about effectiveness. The Ecuador case illustrates that
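
    The contrast between the two estimator families can be made concrete with synthetic data; the sketch below is illustrative only, not the paper's data or code. Enrollment is driven partly by a time-invariant unobservable u, so matching on the observable x leaves bias in place, while the two-period fixed effects estimator (first differencing) removes u entirely.

        # Synthetic two-period panel: treatment selection depends on an
        # unobservable u, mimicking the bias discussed above.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 2000
        x = rng.normal(size=n)                     # observable covariate
        u = rng.normal(size=n)                     # time-invariant unobservable
        treated = (u + 0.5 * rng.normal(size=n)) > 0
        y0 = 50 + 3 * x + u + rng.normal(size=n)                   # pre-period
        y1 = 50 + 3 * x + u - 2.0 * treated + rng.normal(size=n)   # effect = -2

        # (1) nearest-neighbour matching on x, post-period difference in means
        controls = np.where(~treated)[0]
        matches = np.array([controls[np.argmin(np.abs(x[controls] - x[i]))]
                            for i in np.where(treated)[0]])
        att_matching = y1[treated].mean() - y1[matches].mean()     # biased by u

        # (2) two-period fixed effects: first differences remove u (and x)
        dy = y1 - y0
        att_fe = dy[treated].mean() - dy[~treated].mean()          # close to -2
        print(f"matching: {att_matching:.2f}   fixed effects: {att_fe:.2f}")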

  7. Estimating the Counterfactual Impact of Conservation Programs on Land Cover Outcomes: The Role of Matching and Panel Regression Techniques.

    Directory of Open Access Journals (Sweden)

    Kelly W Jones

    Deforestation and conversion of native habitats continues to be the leading driver of biodiversity and ecosystem service loss. A number of conservation policies and programs are implemented--from protected areas to payments for ecosystem services (PES)--to deter these losses. Currently, empirical evidence on whether these approaches stop or slow land cover change is lacking, but there is increasing interest in conducting rigorous, counterfactual impact evaluations, especially for many new conservation approaches, such as PES and REDD, which emphasize additionality. In addition, several new, globally available and free high-resolution remote sensing datasets have increased the ease of carrying out an impact evaluation on land cover change outcomes. While the number of conservation evaluations utilizing 'matching' to construct a valid control group is increasing, the majority of these studies use simple differences in means or linear cross-sectional regression to estimate the impact of the conservation program using this matched sample, with relatively few utilizing fixed effects panel methods--an alternative estimation method that relies on temporal variation in the data. In this paper we compare the advantages and limitations of (1) matching to construct the control group combined with differences in means and cross-sectional regression, which control for observable forms of bias in program evaluation, to (2) fixed effects panel methods, which control for observable and time-invariant unobservable forms of bias, with and without matching to create the control group. We then use these four approaches to estimate forest cover outcomes for two conservation programs: a PES program in Northeastern Ecuador and strict protected areas in European Russia. In the Russia case we find statistically significant differences across estimators--due to the presence of unobservable bias--that lead to differences in conclusions about effectiveness. The Ecuador case

  8. Using the Estimating Supplies Program to Develop Materiel Solutions for the U.S. Air Force Aeromedical Evacuation In-Flight Kit (FFQDM)

    National Research Council Canada - National Science Library

    Hopkins, Curtis; Nix, Ralph; Pang, Gerry; Konoske, Paula

    2008-01-01

    ... NHRC's medical modeling tool, the Estimating Supplies Program (ESP), for the development and management of Air Force medical Allowance Standards as a baseline for standardization throughout the services...

  9. Designing programs to improve diets for maternal and child health: estimating costs and potential dietary impacts of nutrition-sensitive programs in Ethiopia, Nigeria, and India.

    Science.gov (United States)

    Masters, William A; Rosettie, Katherine L; Kranz, Sarah; Danaei, Goodarz; Webb, Patrick; Mozaffarian, Dariush

    2018-05-01

    Improving maternal and child nutrition in resource-poor settings requires effective use of limited resources, but priority-setting is constrained by limited information about program costs and impacts, especially for interventions designed to improve diet quality. This study utilized a mixed methods approach to identify, describe and estimate the potential costs and impacts on child dietary intake of 12 nutrition-sensitive programs in Ethiopia, Nigeria and India. These potential interventions included conditional livestock and cash transfers, media and education, complementary food processing and sales, household production and food pricing programs. Components and costs of each program were identified through a novel participatory process of expert regional consultation followed by validation and calibration from literature searches and comparison with actual budgets. Impacts on child diets were determined by estimating the magnitude of economic mechanisms for dietary change, drawing on comprehensive reviews of evaluations and effectiveness for similar programs and on demographic data for each country. Across the 12 programs, total cost per child reached (net present value, purchasing power parity adjusted) ranged very widely: from 0.58 to 2650 USD/year among five programs in Ethiopia; 2.62 to 1919 USD/year among four programs in Nigeria; and 27 to 586 USD/year among three programs in India. When impacts were assessed, the largest dietary improvements were for iron and zinc intakes from a complementary food production program in Ethiopia (increases of 17.7 mg iron/child/day and 7.4 mg zinc/child/day), vitamin A intake from a household animal and horticulture production program in Nigeria (335 RAE/child/day), and animal protein intake from a complementary food processing program in Nigeria (20.0 g/child/day). These results add substantial value to the limited literature on the costs and dietary impacts of nutrition-sensitive interventions targeting children in resource
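
    The headline metric, total cost per child reached in net present value terms, reduces to straightforward discounting. A minimal sketch with made-up figures (the cost stream, the number of children, and the 3% discount rate are all assumptions for illustration):

        # NPV cost per child reached; all inputs are hypothetical.
        def npv_cost_per_child(annual_costs, children_reached, rate=0.03):
            npv = sum(c / (1 + rate) ** t for t, c in enumerate(annual_costs))
            return npv / children_reached

        print(npv_cost_per_child([120_000, 100_000, 100_000],
                                 children_reached=4_000))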

  10. Use of Excel ion exchange equilibrium solver with WinGEMS to model and predict NPE distribution in the Mead/Westvaco Evandale, TX, hardwood bleach plant

    Science.gov (United States)

    Christopher Litvay; Alan Rudie; Peter Hart

    2003-01-01

    An Excel spreadsheet developed to solve the ion-exchange equilibrium in wood pulps has been linked by dynamic data exchange to WinGEMS and used to model the non-process elements in the hardwood bleach plant of the Mead/Westvaco Evandale mill. Pulp and filtrate samples were collected from the diffusion washers and final wash press of the bleach plant. A WinGEMS model of...

  11. Scoping Summary Report: Development of Lower Basin Shortage Guidelines and Coordinated Management Strategies for Lake Powell and Lake Mead, Particularly Under Low Reservoir Conditions

    OpenAIRE

    U.S. Department of the Interior, Bureau of Reclamation

    2006-01-01

    The Bureau of Reclamation (Reclamation) acting on behalf of the Secretary of the Department of the Interior (Secretary) proposes to take action to adopt specific Colorado River Lower Basin shortage guidelines and coordinated reservoir management strategies to address operations of Lake Powell and Lake Mead, particularly under low reservoir conditions. This proposed Action will provide a greater degree of certainty to all water users and managers in the Colorado River Basin by providing more d...

  12. Estimating the return on investment in disease management programs using a pre-post analysis.

    Science.gov (United States)

    Fetterolf, Donald; Wennberg, David; Devries, Andrea

    2004-01-01

    Disease management programs have become increasingly popular over the past 5-10 years. Recent increases in overall medical costs have precipitated new concerns about the cost-effectiveness of medical management programs, concerns that now extend to the directors of these programs. The initial success of the disease management movement is being challenged on the grounds that reported results stem from the application of faulty, if intuitive, methodologies. This paper discusses the use of "pre-post" methodology approaches in the analysis of disease management programs, and the areas where application of this approach can produce spurious results and incorrect financial outcome assessments. The paper includes a checklist of these items for use by operational staff working with the programs, and a comprehensive bibliography that addresses many of the issues discussed.
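
    The core pitfall the paper warns about, regression to the mean, is easy to reproduce. In the synthetic sketch below there is no program effect at all, yet selecting members on high baseline cost manufactures large apparent per-member savings (all dollar figures are invented):

        # Pre-post "savings" with zero true effect: regression to the mean.
        import numpy as np

        rng = np.random.default_rng(1)
        true_mean = 4000.0
        pre = true_mean + rng.normal(0, 2500, 10_000)    # baseline-year costs
        post = true_mean + rng.normal(0, 2500, 10_000)   # next year, no program

        enrolled = pre > 8000                 # enrollment driven by high pre cost
        savings = pre[enrolled].mean() - post[enrolled].mean()
        print(f"apparent per-member 'savings': ${savings:,.0f}")  # large, spurious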

  13. Assessment of multiple sources of anthropogenic and natural chemical inputs to a morphologically complex basin, Lake Mead, USA

    Science.gov (United States)

    Rosen, Michael R.; Van Metre, P.C.

    2010-01-01

    Lakes with complex morphologies and with different geologic and land-use characteristics in their sub-watersheds could have large differences in natural and anthropogenic chemical inputs to sub-basins in the lake. Lake Mead in southern Nevada and northern Arizona, USA, is one such lake. To assess variations in chemical histories from 1935 to 1998 for major sub-basins of Lake Mead, four sediment cores were taken from three different parts of the reservoir (two from Las Vegas Bay and one each from the Overton Arm and Virgin Basin) and analyzed for major and trace elements, radionuclides, and organic compounds. As expected, anthropogenic contaminant inputs are greatest to Las Vegas Bay, reflecting inputs from the Las Vegas urban area, although concentrations are low compared to sediment quality guidelines and to other USA lakes. One exception to this pattern was higher Hg in the Virgin Basin core. The Virgin Basin core is located in the main body of the lake (Colorado River channel) and is influenced by the hydrology of the Colorado River, which changed greatly with the completion of Glen Canyon Dam upstream in 1963. Major and trace elements in the core show pronounced shifts in the early 1960s and, in many cases, gradually return to concentrations more typical of pre-1960s levels by the 1980s and 1990s, after the filling of Lake Powell. The Overton Arm is the sub-basin least affected by anthropogenic contaminant inputs but has a complex 137Cs profile with a series of large peaks and valleys over the middle of the core, possibly reflecting fallout from nuclear tests in the 1950s at the Nevada Test Site. The 137Cs profile suggests a much greater sedimentation rate during testing, which we hypothesize results from greatly increased dust fall on the lake and the Virgin and Muddy River watersheds. The severe drought in the southwestern USA during the 1950s might also have played a role in variations in sedimentation rate in all of the cores. © 2009.

  14. Review of Evaluation, Measurement and Verification Approaches Used to Estimate the Load Impacts and Effectiveness of Energy Efficiency Programs

    Energy Technology Data Exchange (ETDEWEB)

    Messenger, Mike; Bharvirkar, Ranjit; Golemboski, Bill; Goldman, Charles A.; Schiller, Steven R.

    2010-04-14

    Public and private funding for end-use energy efficiency actions is expected to increase significantly in the United States over the next decade. For example, Barbose et al (2009) estimate that spending on ratepayer-funded energy efficiency programs in the U.S. could increase from $3.1 billion in 2008 to $7.5 and 12.4 billion by 2020 under their medium and high scenarios. This increase in spending could yield annual electric energy savings ranging from 0.58% to 0.93% of total U.S. retail sales in 2020, up from 0.34% of retail sales in 2008. Interest in and support for energy efficiency has broadened among national and state policymakers. Prominent examples include approximately $18 billion in new funding for energy efficiency programs (e.g., State Energy Program, Weatherization, and Energy Efficiency and Conservation Block Grants) in the 2009 American Recovery and Reinvestment Act (ARRA). Increased funding for energy efficiency should result in more benefits as well as more scrutiny of these results. As energy efficiency becomes a more prominent component of the U.S. national energy strategy and policies, assessing the effectiveness and energy saving impacts of energy efficiency programs is likely to become increasingly important for policymakers and private and public funders of efficiency actions. Thus, it is critical that evaluation, measurement, and verification (EM&V) is carried out effectively and efficiently, which implies that: (1) Effective program evaluation, measurement, and verification (EM&V) methodologies and tools are available to key stakeholders (e.g., regulatory agencies, program administrators, consumers, and evaluation consultants); and (2) Capacity (people and infrastructure resources) is available to conduct EM&V activities and report results in ways that support program improvement and provide data that reliably compares achieved results against goals and similar programs in other jurisdictions (benchmarking). The National Action Plan for Energy

  15. Estimating the returns to education from a non-stationary dynamic programming model

    DEFF Research Database (Denmark)

    Belzil, Christian; Hansen, Jörgen

    1997-01-01

    ability at school and in the labor market which can be estimated non-parametrically. The construction of the likelihood based on eduction choices and wage data enables us to estimate the wage returns to education as well as the effect of education on non-wage benefits and/or job quality...

  16. Estimating the Counterfactual Impact of Conservation Programs on Land Cover Outcomes: The Role of Matching and Panel Regression Techniques

    Science.gov (United States)

    Jones, Kelly W.; Lewis, David J.

    2015-01-01

    Deforestation and conversion of native habitats continues to be the leading driver of biodiversity and ecosystem service loss. A number of conservation policies and programs are implemented—from protected areas to payments for ecosystem services (PES)—to deter these losses. Currently, empirical evidence on whether these approaches stop or slow land cover change is lacking, but there is increasing interest in conducting rigorous, counterfactual impact evaluations, especially for many new conservation approaches, such as PES and REDD, which emphasize additionality. In addition, several new, globally available and free high-resolution remote sensing datasets have increased the ease of carrying out an impact evaluation on land cover change outcomes. While the number of conservation evaluations utilizing ‘matching’ to construct a valid control group is increasing, the majority of these studies use simple differences in means or linear cross-sectional regression to estimate the impact of the conservation program using this matched sample, with relatively few utilizing fixed effects panel methods—an alternative estimation method that relies on temporal variation in the data. In this paper we compare the advantages and limitations of (1) matching to construct the control group combined with differences in means and cross-sectional regression, which control for observable forms of bias in program evaluation, to (2) fixed effects panel methods, which control for observable and time-invariant unobservable forms of bias, with and without matching to create the control group. We then use these four approaches to estimate forest cover outcomes for two conservation programs: a PES program in Northeastern Ecuador and strict protected areas in European Russia. In the Russia case we find statistically significant differences across estimators—due to the presence of unobservable bias—that lead to differences in conclusions about effectiveness. The Ecuador case

  17. The Evaluation Market in Germany: Estimating Market Size for Evaluation of Political Programs

    Science.gov (United States)

    Lowenbein, Oded

    2008-01-01

    The United States has a long tradition in evaluation of political programs. In the 1930s and 1940s, programs were initiated to reduce unemployment and improve social security as part of the "New Deal." In the late 1960s, somewhat comparable to the U. S. at that time, Germany's new government started its own "New Deal."…

  18. An Estimation of the Effects of China's Priority Forestry Programs on Farmers' Income

    Science.gov (United States)

    Liu, Can; Lu, Jinzhi; Yin, Runsheng

    2010-03-01

    In the late 1990s, the Chinese government initiated some new programs of ecological restoration and resource development in its forest sector, consolidated other existing ones, and renamed them “Priority Forestry Programs,” or PFPs. They include the Natural Forest Protection Program (NFPP), the Sloping Land Conversion Program (SLCP), the Desertification Combating Program around Beijing and Tianjin (DCBT), the Shelterbelt Development Program (SBDP), and the Wildlife Conservation and Nature Reserve Development Program (WCNR). In addition to improving environmental and resource conditions, a frequently reiterated goal of these PFPs is to increase rural households’ income, which makes rural household income an important dimension of forest program evaluation. Thus, an interesting and important question is: How has implementing the PFPs affected farmers’ income and poverty status? This article addresses this question using a fixed-effects model and a panel dataset that covers 1968 households in four provinces over ten consecutive years (1995-2004). The empirical evidence indicates that the effects are mixed. The SLCP, the SBDP, and the NFPP have made positive impacts and, by far, the SLCP has the largest effect. But the WCNR and the DCBT have not yet had a pronounced overall effect, owing to their short time spans of execution, even though they may have exerted some influence at the margin. Notably, the impact of the WCNR, if any, is negative.

  19. A study on industrial accident rate forecasting and program development of estimated zero accident time in Korea.

    Science.gov (United States)

    Kim, Tae-gu; Kang, Young-sig; Lee, Hyung-won

    2011-01-01

    To begin a zero accident campaign for industry, the first requirement is to estimate the industrial accident rate and the zero accident time systematically. This paper considers the social and technical changes in the business environment after the beginning of the zero accident campaign, using quantitative time series analysis methods. These methods include the sum of squared errors (SSE), the regression analysis method (RAM), the exponential smoothing method (ESM), the double exponential smoothing method (DESM), the auto-regressive integrated moving average (ARIMA) model, and the proposed analytic function method (AFM). A program was developed to estimate the accident rate, the zero accident time, and the achievement probability of an efficient industrial environment. In this paper, MFC (Microsoft Foundation Class) in Visual Studio 2008 was used to develop the zero accident program. The results of this paper will provide major information for industrial accident prevention and be an important part of stimulating the zero accident campaign within all industrial environments.
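
    One of the listed methods, simple exponential smoothing, takes only a few lines. The series and the smoothing constant below are hypothetical stand-ins, not the paper's data:

        # Simple exponential smoothing of an annual accident-rate series.
        def exp_smooth(series, alpha=0.3):
            level = series[0]
            for y in series[1:]:
                level = alpha * y + (1 - alpha) * level
            return level          # doubles as the one-step-ahead forecast

        rates = [0.91, 0.88, 0.85, 0.81, 0.79, 0.76]   # hypothetical rates
        print(f"next-year forecast: {exp_smooth(rates):.3f}")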

  20. Exact solutions to traffic density estimation problems involving the Lighthill-Whitham-Richards traffic flow model using mixed integer programming

    KAUST Repository

    Canepa, Edward S.; Claudel, Christian G.

    2012-01-01

    This article presents a new mixed integer programming formulation of the traffic density estimation problem in highways modeled by the Lighthill-Whitham-Richards equation. We first present an equivalent formulation of the problem using a Hamilton-Jacobi equation. Then, using a semi-analytic formula, we show that the model constraints derived from the Hamilton-Jacobi equation reduce to linear constraints, albeit with unknown integers. We then pose the problem of estimating the density at the initial time, given incomplete and inaccurate traffic data, as a Mixed Integer Program. We then present a numerical implementation of the method using experimental flow and probe data obtained during the Mobile Century experiment. © 2012 IEEE.
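
    The paper's Hamilton-Jacobi-derived constraints are beyond a short sketch, but the Mixed Integer Program machinery itself can be illustrated with SciPy (version 1.9 or later); the tiny model below is generic and is not the traffic-density formulation:

        # A generic small MILP: minimize -x0 - 2*x1 s.t. x0 + x1 <= 4,
        # x0, x1 >= 0, with x1 restricted to integer values.
        import numpy as np
        from scipy.optimize import milp, LinearConstraint, Bounds

        c = np.array([-1.0, -2.0])
        constraints = LinearConstraint(np.array([[1.0, 1.0]]), ub=4.0)
        integrality = np.array([0, 1])        # second variable is integer
        res = milp(c, constraints=constraints, integrality=integrality,
                   bounds=Bounds(0, np.inf))
        print(res.x, res.fun)                 # optimal point and objective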

  1. Exact solutions to traffic density estimation problems involving the Lighthill-Whitham-Richards traffic flow model using mixed integer programming

    KAUST Repository

    Canepa, Edward S.

    2012-09-01

    This article presents a new mixed integer programming formulation of the traffic density estimation problem in highways modeled by the Lighthill-Whitham-Richards equation. We first present an equivalent formulation of the problem using a Hamilton-Jacobi equation. Then, using a semi-analytic formula, we show that the model constraints derived from the Hamilton-Jacobi equation reduce to linear constraints, albeit with unknown integers. We then pose the problem of estimating the density at the initial time, given incomplete and inaccurate traffic data, as a Mixed Integer Program. We then present a numerical implementation of the method using experimental flow and probe data obtained during the Mobile Century experiment. © 2012 IEEE.

  2. Analysis of safety information for nuclear power plants and development of source term estimation program

    International Nuclear Information System (INIS)

    Kim, Tae Woon; Choi, Seong Soo; Park, Jin Hee

    1999-12-01

    The current CARE (Computerized Advisory System for Radiological Emergency) at KINS (Korea Institute of Nuclear Safety) has no STES (Source Term Estimation System) linking SIDS (Safety Information Display System) with FADAS (Following Accident Dose Assessment System). In this study, therefore, an STES is under development. The STES estimates the source term based on the safety information provided by SIDS. The estimated source term is given to FADAS as an input for estimating the environmental effects of radiation. Through this first-year project, an STES for Kori 3 and 4 and Younggwang 1 and 2 has been developed. Since there is no CARE for the Wolsong (PHWR) plants yet, CARE for Wolsong is under construction. The safety parameters were selected, and the safety information display screens and the alarm logic for plant status changes were developed for Wolsong Unit 2 based on the design documents for CANDU plants.

  3. Methods for Estimating the Social Benefits of EPA Land Cleanup and Reuse Programs (2007)

    Science.gov (United States)

    The Office of Policy, Economics and Innovation’s National Center for Environmental Economics, and the Office of Solid Waste and Emergency Response’s Land Revitalization Office convened a workshop on risk assessment and benefit estimation methods in 2006.

  4. The risks of text and image - On Balinese Character (1942), by Gregory Bateson and Margaret Mead

    Directory of Open Access Journals (Sweden)

    Etienne Samain

    2000-11-01

    Balinese Character: A Photographic Analysis (1942) by Gregory Bateson and Margaret Mead is, without any doubt, a founding book of visual (photographic) anthropology. Often cited, it remains insufficiently explored. After a brief presentation of the overall organization of the work, this article raises a heuristic line of questioning and a reflection on the nature of the integrative use of image and text in the construction of anthropological discourse. To this end, we study three models of organization of the photographic plates of Balinese Character and their respective written commentaries, following opposite paths: from image to text, and from text to image. While both of these communicational media are as singular as they are complementary, their respective richness is not immune to other risks, which this research will reveal.

  5. Automated gamma knife radiosurgery treatment planning with image registration, data-mining, and Nelder-Mead simplex optimization

    International Nuclear Information System (INIS)

    Lee, Kuan J.; Barber, David C.; Walton, Lee

    2006-01-01

    Gamma knife treatments are usually planned manually, requiring much expertise and time. We describe a new, fully automatic method of treatment planning. The treatment volume to be planned is first compared with a database of past treatments to find volumes closely matching in size and shape. The treatment parameters of the closest matches are used as starting points for the new treatment plan. Further optimization is performed with the Nelder-Mead simplex method: the coordinates and weight of the isocenters are allowed to vary until a maximally conformal plan specific to the new treatment volume is found. The method was tested on a randomly selected set of 10 acoustic neuromas and 10 meningiomas. Typically, matching a new volume took under 30 seconds. The time for simplex optimization, on a 3 GHz Xeon processor, ranged from under a minute for small volumes to substantially longer for large volumes (>30 000 cubic mm, >20 isocenters). In 8/10 acoustic neuromas and 8/10 meningiomas, the automatic method found plans with conformation number equal or better than that of the manual plan. In 4/10 acoustic neuromas and 5/10 meningiomas, both overtreatment and undertreatment ratios were equal or better in automated plans. In conclusion, data-mining of past treatments can be used to derive starting parameters for treatment planning. These parameters can then be computer optimized to give good plans automatically.
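
    The optimization stage maps naturally onto SciPy's implementation of the same simplex algorithm. The sketch below uses a toy objective (a single isocenter pulled toward a hypothetical target centroid) in place of a real conformity score:

        # Nelder-Mead over toy plan parameters: isocenter xyz plus a weight.
        import numpy as np
        from scipy.optimize import minimize

        target = np.array([10.0, 20.0, 30.0])    # hypothetical target centroid

        def neg_conformity(params):
            """Toy stand-in for a conformity score: penalize distance from
            the target and deviation of the weight from 1 (illustration)."""
            xyz, weight = params[:3], params[3]
            return np.sum((xyz - target) ** 2) + (weight - 1.0) ** 2

        x0 = np.array([0.0, 0.0, 0.0, 0.5])      # starting plan parameters
        res = minimize(neg_conformity, x0, method="Nelder-Mead",
                       options={"xatol": 1e-4, "fatol": 1e-4})
        print(res.x)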

  6. Play, sport, children, and teaching: approximations with Mead's social psychology

    Directory of Open Access Journals (Sweden)

    Carlos Luiz Cardoso

    2014-06-01

    http://dx.doi.org/10.5007/2175-8042.2014v26n42p259 The article reflects on play and sport within the social psychology of the philosopher George H. Mead, as an orientation for school Physical Education. It first indicates the principles of 'social behaviorism' [society, person, and mind]. It then shows the 'emergence' of the 'generalized other' in the passage from play to sport. It goes on to highlight the 'renewal movement' of the 1980s and some reference points for teaching in Brazil. Finally, it reflects on the need to understand the development of the self in school physical education, emphasizing 'critical' conceptions for the didactic training of teachers.

  7. Importance of benthic production to fish populations in Lake Mead prior to the establishment of quagga mussels

    Science.gov (United States)

    Umek, John; Chandra, Sudeep; Rosen, Michael; Wittmann, Marion; Sullivan, Joe; Orsak, Erik

    2010-01-01

    Limnologists recently have developed an interest in quantifying benthic resource contributions to higher-level consumers. Much of this research focuses on natural lakes, with very little research in reservoirs. In this study, we provide a contemporary snapshot of the food web structure of Lake Mead to evaluate the contribution of benthic resources to fish consumers. In addition, we document the food available to fishes on soft sediments and changes to the invertebrate community over two time periods. Benthic invertebrate food availability for fishes is greater in Las Vegas Bay than in Overton Arm. Las Vegas Bay is dominated by oligochaetes, whose biomass increased with depth, while Overton Arm is dominated by chironomids, whose biomass did not change with depth. Diet and isotopic measurements indicate the fish community largely relies on benthic resources regardless of basin (Las Vegas Bay >80%; Overton Arm >92%); however, threadfin shad likely contribute more to largemouth and striped bass production in Overton Arm than in Las Vegas Bay. An analysis of two time periods, before and after quagga mussel establishment and during lake level declines, suggests there is no change in the density of benthic invertebrates in Boulder Basin, but there were greater abundances of select taxa in this basin by season and depth than in other basins. Given the potential for alteration resulting from the expansion of quagga mussels and the reliance of the fishery on benthic resources, future investigation of basin-specific benthic processes is recommended.

  8. Blockmodeling and the Estimation of Evolutionary Architectural Growth in Major Defense Acquisition Programs

    Science.gov (United States)

    2016-04-30

    Only fragments of the procedure survive in this record: a new subsystem X is attached to existing subsystems using attachment probabilities; for each interface established, a complexity weight (w_iX) is assigned under Connection Option A or B; and the cost of the resulting architecture is then estimated. Connection Option A does not condition interface complexities on the connected subsystems' positions of assignment (i.e., w_iX,l).

  9. Estimation of Rural Households’ Willingness to Accept Two PES Programs and Their Service Valuation in the Miyun Reservoir Catchment, China

    Directory of Open Access Journals (Sweden)

    Hao Li

    2018-01-01

    As the only surface water source for Beijing, the Miyun Reservoir and its catchment (MRC) are a focus for concern about the degradation of ecosystem services (ES) unless appropriate payments for ecosystem services (PES) are in place. This study used the contingent valuation method (CVM) to estimate the costs of two new PES programs, for agriculture and forestry, and to further calculate the economic value of ES in the MRC from the perspective of local rural households' willingness to accept (WTA). The results of a logit model including WTA and household- and village-level variables indicate that the local socio-economic context has complex effects on the WTA of rural households. In particular, the bid amount, location, and proportion of off-farm employment have significant positive effects on the local WTA. In contrast, the insignificance of the PES participation variable suggests that previous PES program experiences may negatively impact subsequent program participation. The mean WTA payments for the agriculture and forestry PES programs were estimated as 8531 and 8187 yuan/ha/year, respectively. These results consistently explain the differentiated opportunity costs of farmland and forest land. Meanwhile, the differentiated WTA values in Beijing vs. the surrounding Hebei Province follow the interest differences and development gaps between jurisdictions. Finally, the total economic value of ES in the MRC area was estimated at 11.1 billion yuan/year. The rational economic value of ES for the restoration priority areas reaches 515.2 million yuan/year. For the existing budget gap (299 million yuan/year), the study proposed that decision makers increase the water tariff by 0.08 yuan to raise the funds needed. The study also concluded that these results are not only financially and politically feasible but also cost-effective. This study has policy implications for improving implementation efficiency and providing quantified supports for PES
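
    The WTA estimation described here follows the standard dichotomous-choice CVM recipe: fit a logit of accept/reject on the bid and covariates, then invert it for the mean WTA. A sketch on synthetic data; the variable names, coefficients, and bid range are all hypothetical:

        # Dichotomous-choice CVM logit and mean-WTA inversion (synthetic).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 500
        bid = rng.uniform(2000, 15000, n)       # offered payment, yuan/ha/yr
        offfarm = rng.random(n)                 # share of off-farm employment
        latent = -3.0 + 0.0004 * bid + 1.5 * offfarm + rng.logistic(size=n)
        accept = (latent > 0).astype(int)       # accept when bid exceeds WTA

        X = sm.add_constant(np.column_stack([bid, offfarm]))
        fit = sm.Logit(accept, X).fit(disp=False)
        b0, b_bid, b_off = fit.params
        # mean (= median) WTA at average covariates for a linear-in-bid logit
        print(-(b0 + b_off * offfarm.mean()) / b_bid)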

  10. Estimation of genetic parameters for growth traits in a breeding program for rainbow trout (Oncorhynchus mykiss) in China.

    Science.gov (United States)

    Hu, G; Gu, W; Bai, Q L; Wang, B Q

    2013-04-26

    Genetic parameters and breeding values for growth traits were estimated in the first and, currently, the only family selective breeding program for rainbow trout (Oncorhynchus mykiss) in China. Genetic and phenotypic data were collected for growth traits from 75 full-sibling families with a 2-generation pedigree. Genetic parameters and breeding values for growth traits of rainbow trout were estimated using the derivative-free restricted maximum likelihood method. The goodness-of-fit of the models was tested using the Akaike and Bayesian information criteria. Genetic parameters and breeding values were estimated using the best-fit model for each trait. Heritability estimates for body weight and body length ranged from 0.20 to 0.45 and from 0.27 to 0.60, respectively, and the heritability of the condition factor was 0.34. Our results showed a moderate degree of heritability for growth traits in this breeding program and suggested that the genetic and phenotypic tendencies of body length, body weight, and condition factor were similar. Therefore, the selection of phenotypic values based on pedigree information was also suitable in this research population.

  11. A MATLAB program for estimation of unsaturated hydraulic soil parameters using an infiltrometer technique

    DEFF Research Database (Denmark)

    Mollerup, Mikkel; Hansen, Søren; Petersen, Carsten

    2008-01-01

    We combined an inverse routine for assessing the hydraulic soil parameters of the Campbell/Mualem model with the power series solution developed by Philip for describing one-dimensional vertical infiltration into a homogeneous soil. We based the estimation routine on a proposed measurement procedure. ... An independent measurement of the soil water content at saturation may reduce the uncertainty of estimated parameters. Response surfaces of the objective function were analysed. Scenarios for various soils and conditions, using numerically generated synthetic cumulative infiltration data with normally...
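
    Although the abstract is truncated, the estimation idea is a standard inverse fit. The sketch below fits Philip's two-term approximation I(t) = S*sqrt(t) + A*t to synthetic cumulative-infiltration data; it stands in for the full power-series solution used by the program, and the data and starting values are invented:

        # Least-squares fit of Philip's two-term infiltration equation.
        import numpy as np
        from scipy.optimize import curve_fit

        def philip(t, S, A):
            return S * np.sqrt(t) + A * t      # S: sorptivity, A: steady term

        t = np.array([60, 300, 600, 1200, 1800, 3600.0])        # seconds
        I_obs = np.array([0.41, 0.95, 1.40, 2.10, 2.70, 4.60])  # cm, synthetic
        (S, A), _ = curve_fit(philip, t, I_obs, p0=[0.05, 1e-3])
        print(f"S = {S:.4f} cm/s^0.5, A = {A:.2e} cm/s")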

  12. A dynamic programming approach for quickly estimating large network-based MEV models

    DEFF Research Database (Denmark)

    Mai, Tien; Frejinger, Emma; Fosgerau, Mogens

    2017-01-01

    We propose a way to estimate a family of static Multivariate Extreme Value (MEV) models with large choice sets in short computational time. The resulting model is also straightforward and fast to use for prediction. Following Daly and Bierlaire (2006), the correlation structure is defined by a ro... ... to converge (4.3 h on an Intel(R) 3.2 GHz machine using a non-parallelized code). We also show that our approach allows estimating a cross-nested logit model of 111 nests with a real data set of more than 100,000 observations in 14 h.

  13. Efficiency estimation of using phased program of caries prevention in children domiciled in Transcarpathian region

    Directory of Open Access Journals (Sweden)

    Klitynska Oksana V.

    2016-01-01

    Background: Caries is a pathological process that occurs in the hard tissues of the teeth after eruption and reduces quality of life through significant complications, especially in children. The extremely high incidence of dental caries among children living permanently in the Transcarpathian region requires a comprehensive prevention program. Aim of the study: To evaluate the efficiency of a phased program of caries prevention among children of different age groups domiciled in the Transcarpathian region. Material and Methods: Based on the examination of 346 children aged 3-8 years, among them 163 (46.9%) boys and 183 (53.1%) girls, a phased program of complex prophylaxis was created, covering the basic dental diseases in children living permanently under conditions of biogeochemical fluorine deficiency. The program included: hygienic education of preschool children and their parents; exogenous medicament prevention; early identification and treatment of caries using conventional methods according to treatment protocols; and endogenous non-medicament prevention with nutrition correction, which proved effective. Results: The caries prevention efficiency of the proposed scheme is 69.5% for children aged 5-7 (3-5 years); for the 8-10 age group (6-8 years), it is 66.9%. Conclusion: The main strategy of pediatric dental services in Ukraine should be built around the child population (aged up to 18 years) through national and regional programs for the primary prevention of the main dental diseases, with adequate financing in sufficient volume to preserve the nation's dental health for the next 20 years.

  14. Introducing adapted Nelder & Mead's downhill simplex method to a fully automated analysis of eclipsing binaries

    OpenAIRE

    Prsa, A.; Zwitter, T.

    2004-01-01

    Eclipsing binaries are extremely attractive objects because absolute physical parameters (masses, luminosities, radii) of both components may be determined from observations. Since most efforts to extract these parameters were based on dedicated observing programs, existing modeling code is based on interactivity. Gaia will make a revolutionary advance in the sheer number of observed eclipsing binaries, and new methods for automatic handling must be introduced and thoroughly tested. This paper foc...

  15. Evaluation of fission-product gases in program GAPCON series and FREG-3 to estimate the gap heat transfer coefficient

    International Nuclear Information System (INIS)

    Ohki, Naohisa; Harayama, Yasuo; Takeda, Tsuneo; Izumi, Fumio.

    1977-12-01

    In the safety evaluation of a fuel rod, estimation of the stored energy in the fuel rod is indispensable. For this estimation, the temperature distribution in the fuel rod is calculated. Most important in determining the temperature distribution is the gap heat transfer coefficient (gap conductance) between the pellet surface and the cladding inner surface. Under fuel rod operating conditions, the mixed gas in the gap is composed of He, Xe, and Kr. He is the initially sealed fill gas; Xe and Kr are fission-product gases, whose quantities depend on the fuel burn-up. In the GAPCON series of programs (GAPCON and GAPCON-THERMAL-1 and -2) and FREG-3, these quantities are given as a function of the irradiation time, power rating, and neutron flux in estimating the thermal conductivity of the mixed gas. The methods of calculating the quantities of Xe and Kr in these programs have been examined. Inputting the neutron flux, which influences the F.P. gas production rates, is better than determining it from the fuel-rod power rating. (auth.)

  16. Time-Varying Estimation of Crop Insurance Program in Altering North Dakota Farm Economic Structure

    OpenAIRE

    Coleman, Jane A.; Shaik, Saleem

    2009-01-01

    This study examines how federal farm policies, specifically crop insurance, have affected the farm economic structure of North Dakota’s agriculture sector. The system of derived input demand equations is estimated to quantify the changes in North Dakota farmers’ input use when they purchase crop insurance. Further, the cumulative rolling regression technique is applied to capture the varying effects of the farm policies over time. Empirical results from the system of input demand functions in...

  17. Estimation of Finite Population Mean in Multivariate Stratified Sampling under Cost Function Using Goal Programming

    Directory of Open Access Journals (Sweden)

    Atta Ullah

    2014-01-01

    In the practical use of a stratified random sampling scheme, the investigator faces the problem of selecting a sample that maximizes the precision of the estimated finite population mean under a cost constraint. The allocation of sample sizes becomes complicated when more than one characteristic is observed on each unit selected in the sample. In many real-life situations, a linear cost function of the sample size n_h is not a good approximation to the actual cost of a sample survey when the traveling cost between selected units in a stratum is significant. In this paper, the sample allocation problem in multivariate stratified random sampling with the proposed cost function is formulated as an integer nonlinear multiobjective mathematical program. A solution procedure is proposed using an extended lexicographic goal programming approach. A numerical example is presented to illustrate the computational details and to compare the efficiency of the proposed compromise allocation.
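
    The paper's extended lexicographic goal program is too involved to sketch faithfully. As a simplified stand-in, the snippet below solves a continuous relaxation instead: allocate stratum sample sizes n_h to minimize the variance of the estimated mean for a single characteristic under a linear cost budget (the stratum sizes, standard deviations, unit costs, and budget are all invented):

        # Continuous-relaxation allocation, not the integer goal program.
        import numpy as np
        from scipy.optimize import minimize

        N_h = np.array([400.0, 300.0, 300.0])   # stratum population sizes
        S_h = np.array([8.0, 5.0, 12.0])        # stratum std deviations
        c_h = np.array([4.0, 6.0, 10.0])        # per-unit sampling cost
        budget = 600.0
        W_h = N_h / N_h.sum()                   # stratum weights

        def variance(n):
            return np.sum(W_h**2 * S_h**2 * (1.0 / n - 1.0 / N_h))

        cons = ({"type": "ineq", "fun": lambda n: budget - c_h @ n},)
        res = minimize(variance, x0=np.full(3, 20.0), constraints=cons,
                       bounds=[(2.0, Nh) for Nh in N_h])
        print(np.round(res.x, 1))               # allocated sample sizes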

  18. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culvert

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.

  19. BATEMANATER: a computer program to estimate and bootstrap mating system variables based on Bateman's principles.

    Science.gov (United States)

    Jones, Adam G

    2015-11-01

    Bateman's principles continue to play a major role in the characterization of genetic mating systems in natural populations. The modern manifestations of Bateman's ideas include the opportunity for sexual selection (i.e. I(s) - the variance in relative mating success), the opportunity for selection (i.e. I - the variance in relative reproductive success) and the Bateman gradient (i.e. β(ss) - the slope of the least-squares regression of reproductive success on mating success). These variables serve as the foundation for one convenient approach for the quantification of mating systems. However, their estimation presents at least two challenges, which I address here with a new Windows-based computer software package called BATEMANATER. The first challenge is that confidence intervals for these variables are not easy to calculate. BATEMANATER solves this problem using a bootstrapping approach. The second, more serious, problem is that direct estimates of mating system variables from open populations will typically be biased if some potential progeny or adults are missing from the analysed sample. BATEMANATER addresses this problem using a maximum-likelihood approach to estimate mating system variables from incompletely sampled breeding populations. The current version of BATEMANATER addresses the problem for systems in which progeny can be collected in groups of half- or full-siblings, as would occur when eggs are laid in discrete masses or offspring occur in pregnant females. BATEMANATER has a user-friendly graphical interface and thus represents a new, convenient tool for the characterization and comparison of genetic mating systems. © 2015 John Wiley & Sons Ltd.
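
    The bootstrap half of what BATEMANATER does is straightforward to sketch: resample individuals with replacement, recompute the Bateman gradient (the least-squares slope of relative reproductive success on relative mating success), and read off percentile confidence bounds. The data below are synthetic, and this is not the package's maximum-likelihood algorithm for incompletely sampled populations:

        # Bootstrap CI for the Bateman gradient on synthetic data.
        import numpy as np

        rng = np.random.default_rng(3)
        mates = rng.poisson(2, 80)                   # mating success
        offspring = 3 * mates + rng.poisson(2, 80)   # reproductive success

        def bateman_gradient(m, r):
            m_rel, r_rel = m / m.mean(), r / r.mean()   # relative success
            return np.polyfit(m_rel, r_rel, 1)[0]       # least-squares slope

        boots = []
        for _ in range(2000):
            idx = rng.integers(0, len(mates), len(mates))
            boots.append(bateman_gradient(mates[idx], offspring[idx]))
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"beta_ss = {bateman_gradient(mates, offspring):.2f} "
              f"(95% CI {lo:.2f} to {hi:.2f})")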

  20. New horizontal global solar radiation estimation models for Turkey based on robust coplot supported genetic programming technique

    International Nuclear Information System (INIS)

    Demirhan, Haydar; Kayhan Atilgan, Yasemin

    2015-01-01

    Highlights: • Precise horizontal global solar radiation estimation models are proposed for Turkey. • Genetic programming technique is used to construct the models. • Robust coplot analysis is applied to reduce the impact of outlier observations. • Better estimation and prediction properties are observed for the models. - Abstract: Renewable energy sources have been attracting more and more attention of researchers due to the diminishing and harmful nature of fossil energy sources. Because of the importance of solar energy as a renewable energy source, an accurate determination of significant covariates and their relationships with the amount of global solar radiation reaching the Earth is a critical research problem. There are numerous meteorological and terrestrial covariates that can be used in the analysis of horizontal global solar radiation. Some of these covariates are highly correlated with each other. It is possible to find a large variety of linear or non-linear models to explain the amount of horizontal global solar radiation. However, models that explain the amount of global solar radiation with the smallest set of covariates should be obtained. In this study, use of the robust coplot technique to reduce the number of covariates before going forward with advanced modelling techniques is considered. After reducing the dimensionality of model space, yearly and monthly mean daily horizontal global solar radiation estimation models for Turkey are built by using the genetic programming technique. It is observed that application of robust coplot analysis is helpful for building precise models that explain the amount of global solar radiation with the minimum number of covariates without suffering from outlier observations and the multicollinearity problem. Consequently, over a dataset of Turkey, precise yearly and monthly mean daily global solar radiation estimation models are introduced using the model spaces obtained by robust coplot technique and

  1. The Role of Inflation and Price Escalation Adjustments in Properly Estimating Program Costs: F-35 Case Study

    Science.gov (United States)

    2016-03-01

    standard practice is to deflate costs to constant dollars (the dependent variable in the analogous regression) using a previously determined price ... (IDA Document D-5489, March 2016).

  2. Program Potential: Estimates of Federal Energy Cost Savings from Energy Efficient Procurement

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Margaret [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-09-17

    In 2011, energy used by federal buildings cost approximately $7 billion. Reducing federal energy use could help address several important national policy goals, including: (1) increased energy security; (2) lowered emissions of greenhouse gases and other air pollutants; (3) increased return on taxpayer dollars; and (4) increased private sector innovation in energy efficient technologies. This report estimates the impact of efficient product procurement on reducing the amount of wasted energy (and, therefore, wasted money) associated with federal buildings, as well as on reducing the needless greenhouse gas emissions associated with these buildings.

  3. Using Multiple and Logistic Regression to Estimate the Median Will-Cost and Probability of Cost and Schedule Overrun for Program Managers

    Science.gov (United States)

    2017-03-23

    ...not the other. We are able to give logistic regression models to program managers that identify several program characteristics for either ... considered acceptable. We recommend the use of our logistic models as a tool to manage a portfolio of programs in order to gain potential elusive
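
    The "median will-cost" half of the method is a quantile regression at q = 0.5; the logistic half follows the same pattern as the logit sketch earlier in these records. Below is a sketch on synthetic data, with hypothetical predictors and coefficients:

        # Median (q = 0.5) regression of cost growth on program traits.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 120
        duration = rng.uniform(3, 15, n)        # years in development
        baseline = rng.uniform(0.1, 30, n)      # baseline estimate, $B
        growth = (1.05 + 0.02 * duration - 0.004 * baseline
                  + 0.1 * rng.standard_t(3, n))  # heavy-tailed noise

        X = sm.add_constant(np.column_stack([duration, baseline]))
        median_fit = sm.QuantReg(growth, X).fit(q=0.5)
        print(median_fit.params)                # median cost-growth model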

  4. Deemed Savings Estimates for Legacy Air Conditioning and Water Heating Direct Load Control Programs in PJM Region

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles

    2007-03-01

    During 2005 and 2006, the PJM Interconnection (PJM) Load Analysis Subcommittee (LAS) examined ways to reduce the costs and improve the effectiveness of its existing measurement and verification (M&V) protocols for Direct Load Control (DLC) programs. The current M&V protocol requires that a PURPA-compliant Load Research study be conducted every five years for each Load-Serving Entity (LSE). The current M&V protocol is expensive to implement and administer, particularly for mature load control programs, some of which are marginally cost-effective. There was growing evidence that some LSEs were mothballing or dropping their DLC programs in lieu of incurring the expense associated with the M&V. This project had several objectives: (1) examine the potential for developing deemed savings estimates acceptable to PJM for legacy air conditioning and water heating DLC programs, and (2) explore the development of a collaborative, regional, consensus-based approach for conducting monitoring and verification of load reductions for emerging load management technologies for customers that do not have interval metering capability.

  5. Mapping anuran habitat suitability to estimate effects of grassland and wetland conservation programs

    Science.gov (United States)

    Mushet, David M.; Euliss, Ned H.; Stockwell, Craig A.

    2012-01-01

    The conversion of the Northern Great Plains of North America to a landscape favoring agricultural commodity production has negatively impacted wildlife habitats. To offset impacts, conservation programs have been implemented by the U.S. Department of Agriculture and other agencies to restore grassland and wetland habitat components. To evaluate effects of these efforts on anuran habitats, we used call survey data and environmental data in ecological niche factor analyses implemented through the program Biomapper to quantify habitat suitability for five anuran species within a 196 km2 study area. Our amphibian call surveys identified Northern Leopard Frogs (Lithobates pipiens), Wood Frogs (Lithobates sylvaticus), Boreal Chorus Frogs (Pseudacris maculata), Great Plains Toads (Anaxyrus cognatus), and Woodhouse’s Toads (Anaxyrus woodhousii) occurring within the study area. Habitat suitability maps developed for each species revealed differing patterns of suitable habitat among species. The most significant findings of our mapping effort were 1) the influence of deep-water overwintering wetlands on suitable habitat for all species encountered except the Boreal Chorus Frog; 2) the lack of overlap between areas of core habitat for both the Northern Leopard Frog and Wood Frog compared to the core habitat for both toad species; and 3) the importance of conservation programs in providing grassland components of Northern Leopard Frog and Wood Frog habitat. The differences in habitats suitable for the five species we studied in the Northern Great Plains, i.e., their ecological niches, highlight the importance of utilizing an ecosystem based approach that considers the varying needs of multiple species in the development of amphibian conservation and management plans.

  6. Public estimation of the program of the rehabilitation of the east Urals territory of radioactive contamination

    International Nuclear Information System (INIS)

    Ishutina, T.A.; Korobejnikova, T.A.; Pavlov, B.S.; Suslo, A.F.; Sharova, A.F.

    1996-01-01

    Public opinion in the East Urals territory of radioactive contamination, at the time a number of government acts on rehabilitation were adopted, may be considered transitional: from virtually complete neglect of the problem on the part of the government (1950-70) to publicity and the first practical steps toward developing and implementing rehabilitation policies (1990s). A primary goal of a program for such territories should be their overall revival on the basis of the present-day needs of the population.

  7. JuPOETs: a constrained multiobjective optimization approach to estimate biochemical model ensembles in the Julia programming language.

    Science.gov (United States)

    Bassen, David M; Vilkhovoy, Michael; Minot, Mason; Butcher, Jonathan T; Varner, Jeffrey D

    2017-01-25

    Ensemble modeling is a promising approach for obtaining robust predictions and coarse-grained population behavior in deterministic mathematical models. Ensemble approaches address model uncertainty by using parameter or model families instead of single best-fit parameters or fixed model structures. Parameter ensembles can be selected based upon simulation error, along with other criteria such as diversity or steady-state performance. Simulations using parameter ensembles can estimate confidence intervals on model variables and robustly constrain model predictions, despite having many poorly constrained parameters. In this software note, we present a multiobjective-based technique to estimate parameter or model ensembles, the Pareto Optimal Ensemble Technique in the Julia programming language (JuPOETs). JuPOETs integrates simulated annealing with Pareto optimality to estimate ensembles on or near the optimal tradeoff surface between competing training objectives. We demonstrate JuPOETs on a suite of multiobjective problems, including test functions with parameter bounds and system constraints, as well as on the identification of a proof-of-concept biochemical model with four conflicting training objectives. JuPOETs identified optimal or near-optimal solutions approximately six-fold faster than a corresponding implementation in Octave for the suite of test functions. For the proof-of-concept biochemical model, JuPOETs produced an ensemble of parameters that captured the mean of the training data across conflicting data sets, while simultaneously estimating parameter sets that performed well on each of the individual objective functions. JuPOETs is a promising approach for the estimation of parameter and model ensembles using multiobjective optimization. JuPOETs can be adapted to solve many problem types, including mixed binary and continuous variable types, bilevel optimization problems, and constrained problems without altering the base algorithm. JuPOETs is open
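
    JuPOETs itself is a Julia package; for consistency with the other sketches in these records, the Pareto-ranking idea at its core is illustrated in Python below: keep the candidate parameter sets that no other candidate dominates on every training objective. This is only the dominance test, not the simulated annealing search.

        # Pareto front filter over per-candidate objective errors.
        import numpy as np

        def pareto_front(errors):
            """errors: (n_candidates, n_objectives); smaller is better."""
            keep = []
            for i, e in enumerate(errors):
                dominated = any(np.all(f <= e) and np.any(f < e)
                                for j, f in enumerate(errors) if j != i)
                if not dominated:
                    keep.append(i)
            return keep

        errs = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0], [3.0, 3.0]])
        print(pareto_front(errs))   # [0, 1, 2]; (3, 3) is dominated by (2, 2)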

  8. Estimating the Effects of Astronaut Career Ionizing Radiation Dose Limits on Manned Interplanetary Flight Programs

    Science.gov (United States)

    Koontz, Steven L.; Rojdev, Kristina; Valle, Gerard D.; Zipay, John J.; Atwell, William S.

    2013-01-01

    The Hybrid Inflatable DSH, combined with electric propulsion and high-power solar-electric power systems, offers a near-TRL-now solution to the space radiation crew dose problem that is an inevitable aspect of long-term manned interplanetary flight. Spreading program development and launch costs over several years can lead to a spending plan that fits NASA's current and future budgetary limitations, enabling early manned interplanetary operations with space radiation dose control while biomedical research, nuclear-electric propulsion, and active shielding research and development proceed in parallel. Furthermore, future work should encompass laboratory validation of HZETRN calculations, as previous laboratory investigations have not considered large shielding thicknesses, and the calculations presented at these thicknesses are currently performed via extrapolation.

  9. A conceptual approach to the estimation of societal willingness-to-pay for nuclear safety programs

    International Nuclear Information System (INIS)

    Pandey, M.D.; Nathwani, J.S.

    2003-01-01

The design, refurbishment and future decommissioning of nuclear reactors are crucially concerned with reducing the risk of radiation exposure that can result in adverse health effects and potential loss of life. To address this concern, large financial investments have been made to ensure the safety of operating nuclear power plants worldwide. The efficacy of the expenditures incurred to provide safety must be judged against the safety benefit to be gained from such investments. We have developed an approach that provides a defendable basis for making that judgement. If the costs of risk reduction are disproportionate to the safety benefits derived, then the expenditures are not optimal; in essence, societal resources are being diverted away from other critical areas such as health care, education and social services that also enhance the quality of life. Thus, the allocation of society's resources devoted to nuclear safety must be continually appraised in light of competing needs, because there is a limit on the resources that any society can devote to extending life. The purpose of the paper is to present a simple and methodical approach to assessing the benefits of nuclear safety programs and regulations. The paper presents the Life-Quality Index (LQI) as a tool for the assessment of risk reduction initiatives that would support the public interest and enhance both safety and the quality of life. The LQI is formulated as a utility function consistent with the principles of rational decision analysis. The LQI is applied to quantify the societal willingness-to-pay (SWTP) for safety measures enacted to reduce the risk of potential exposures to ionising radiation. The proposed approach provides essential support to help improve the cost-benefit analysis of engineering safety programs and safety regulations.
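    For readers unfamiliar with the index, the LQI and the willingness-to-pay that follows from it are commonly stated as below. This is a hedged sketch of the form used in the Nathwani-Pandey literature; the symbols G, E and q are our notation, not quoted from this abstract.

    ```latex
    % Life-Quality Index: G = real GDP per person, E = life expectancy,
    % q = ratio of work time to leisure time (empirically roughly 0.2).
    L = G^{\,q} E
    % A safety measure buying a life-expectancy gain dE at per-person cost dG
    % is acceptable if it does not decrease L:
    \frac{dL}{L} = q\,\frac{dG}{G} + \frac{dE}{E} \ge 0
    \quad\Longrightarrow\quad
    \text{SWTP} \approx \frac{G}{q}\cdot\frac{dE}{E}
    ```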

  10. Estimating the Population Impact of a New Pediatric Influenza Vaccination Program in England Using Social Media Content.

    Science.gov (United States)

    Wagner, Moritz; Lampos, Vasileios; Yom-Tov, Elad; Pebody, Richard; Cox, Ingemar J

    2017-12-21

The rollout of a new childhood live attenuated influenza vaccine program was launched in England in 2013, which consisted of a national campaign for all 2 and 3 year olds and several pilot locations offering the vaccine to primary school-age children (4-11 years of age) during the influenza season. The 2014/2015 influenza season saw the national program extended to include additional pilot regions, some of which offered the vaccine to secondary school children (11-13 years of age) as well. We utilized social media content to obtain a complementary assessment of the population impact of the programs that were launched in England during the 2013/2014 and 2014/2015 flu seasons. The overall community-wide impact on transmission in pilot areas was estimated for the different age groups that were targeted for vaccination. A previously developed statistical framework was applied, which consisted of a nonlinear regression model that was trained to infer influenza-like illness (ILI) rates from Twitter posts originating in pilot (school-age vaccinated) and control (unvaccinated) areas. The control areas were then used to estimate ILI rates in pilot areas, had the intervention not taken place. These predictions were compared with their corresponding Twitter-based ILI estimates. Results suggest a reduction in ILI rates of 14% (1-25%) and 17% (2-30%) across all ages in only the primary school-age vaccine pilot areas during the 2013/2014 and 2014/2015 influenza seasons, respectively. No significant impact was observed in areas where two age cohorts of secondary school children were vaccinated. These findings corroborate independent assessments from traditional surveillance data, thereby supporting the ongoing rollout of the program to primary school-age children and providing evidence of the value of social media content as an additional syndromic surveillance tool.
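    The counterfactual comparison described above can be illustrated in a few lines. This sketch is not the authors' code: it uses a simple linear control-to-pilot map where the paper uses a nonlinear model, and all data arrays are invented for the example.

    ```python
    import numpy as np

    # Seasonal ILI rates (arbitrary units); pre-intervention seasons are used
    # to learn how pilot-area rates track control-area rates.
    pre_control  = np.array([1.2, 1.9, 2.8, 2.1, 1.4])
    pre_pilot    = np.array([1.3, 2.0, 3.0, 2.3, 1.5])
    post_control = np.array([1.1, 1.8, 2.6])   # intervention season, unvaccinated
    post_pilot   = np.array([0.9, 1.5, 2.2])   # intervention season, vaccinated

    # Fit a simple linear map control -> pilot on pre-intervention data.
    slope, intercept = np.polyfit(pre_control, pre_pilot, 1)
    expected_pilot = slope * post_control + intercept   # counterfactual estimate

    reduction = 1.0 - post_pilot.sum() / expected_pilot.sum()
    print(f"estimated ILI reduction attributable to the program: {reduction:.1%}")
    ```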

  11. Use of risk projection models to estimate mortality and incidence from radiation-induced breast cancer in screening programs

    International Nuclear Information System (INIS)

    Ramos, M; Ferrer, S; Villaescusa, J I; Verdu, G; Salas, M D; Cuevas, M D

    2005-01-01

The authors report on a method to calculate radiological risks, applicable to breast screening programs and other controlled medical exposures to ionizing radiation. In particular, it has been applied to make a risk assessment in the Valencian Breast Cancer Early Detection Program (VBCEDP) in Spain. This method is based on a parametric approach, through Markov processes, of hazard functions for radio-induced breast cancer incidence and mortality, with mean glandular breast dose, attained age and age-at-exposure as covariates. Excess relative risk functions of breast cancer mortality have been obtained from two different cohorts exposed to ionizing radiation, with different follow-up times: the Canadian Fluoroscopy Cohort Study (1950-1987) and the Life Span Study (1950-1985 and 1950-1990), whereas relative risk functions for incidence have been obtained from the Life Span Study (1958-1993), the Massachusetts tuberculosis cohorts (1926-1985 and 1970-1985), the New York post-partum mastitis patients (1930-1981) and the Swedish benign breast disease cohort (1958-1987). Relative risks from these cohorts have been transported to the target population undergoing screening in the Valencian Community, a region in Spain with about four and a half million inhabitants. The SCREENRISK software has been developed to estimate radiological detriments in breast screening. Some hypotheses corresponding to different screening conditions have been considered in order to estimate the total risk associated with a woman who takes part in all screening rounds. In the case of the VBCEDP, the total radio-induced risk probability for fatal breast cancer is in the range [5 × 10⁻⁶, 6 × 10⁻⁴], versus the natural rate of dying from breast cancer in the Valencian Community, which is 9.2 × 10⁻³. The results show that these indicators could be included in quality control tests and could be adequate for making comparisons between several screening programs.

  12. A Tri-National program for estimating the link between snow resources and hydrological droughts

    Directory of Open Access Journals (Sweden)

    M. Zappa

    2015-06-01

To evaluate how summer low flows and droughts are affected by the winter snowpack, a tri-national effort will analyse data from three catchments: Alpbach (Prealps, central Switzerland), Gudjaretis-Tskali (Little Caucasus, central Georgia), and Kamenice (Jizera Mountains, northern Czech Republic). Two GIS-based rainfall-runoff models will simulate over 10 years of runoff in streams based on rain and snowfall measurements and further meteorological variables. The models use information on the geographical settings of the catchments together with knowledge of the hydrological processes of runoff generation from rainfall, looking particularly at the relationship between spring snowmelt and summer droughts. These processes include snow accumulation and melt, evapotranspiration, and groundwater recharge in spring that contributes to summer runoff, and will be studied by means of the environmental isotopes ¹⁸O and ²H. Knowledge about the isotopic composition of the different water sources will allow identification of the flow paths and estimation of the residence time of snowmelt water in the subsurface and its contribution to the stream. The application of the models in different nested or neighbouring catchments will explore their potential for further development and allow better early prediction of low-flow periods in various mountainous zones across Europe. The paper presents the planned activities, including a first analysis of the already available datasets of environmental isotopes, discharge and snow water equivalent, and modelling experiments with these datasets.

  13. Estimation of a Decommissioning Program Considering the Reuse of Demolition Materials

    International Nuclear Information System (INIS)

    Eiichi Sakata; Sachio Ozaki; Michihiko Hironaga; Daisuke Ogane; Tatsuo Usui; Yutaka Kono

    2002-01-01

For decommissioning work in Japan, decontamination and dismantling, including safe storage, are executed during the earlier period of the total project, while reprocessing, disposal and recycling of dismantling wastes are practiced during the succeeding period. The expert system proposed in this paper furnishes motivation for decommissioning planners by correlating estimates between the earlier dismantling part and the succeeding recycling part of the total process. This paper presents a summarized configuration and an algorithm of the proposed model, and indicates the contents of the essential databases (D/B) to be prepared as well as additional data needed to embody an assumed recycling scenario. The proposed model provides useful outputs concerning commercial cost, required procedures and licenses, regional encouragement, obstacles to be surmounted, and so on. Combined with each other, these outputs can be used to explain the outline of the project both inside and outside a plant corporate entity. Simulated cases for concrete structures in non-controlled areas provide information about the feasibility of, and comparisons between, assumed recycling scenarios, taking account of quality requirements in relevant technical standards and ordinances and the capability of reprocessing facilities. (authors)

  14. Kinetics of selenium release in mine waste from the Meade Peak Phosphatic Shale, Phosphoria Formation, Wooley Valley, Idaho, USA

    Science.gov (United States)

    Stillings, Lisa L.; Amacher, Michael C.

    2010-01-01

Phosphorite from the Meade Peak Phosphatic Shale member of the Permian Phosphoria Formation has been mined in southeastern Idaho since 1906. Dumps of waste rock from mining operations contain high concentrations of Se which readily leach into nearby streams and wetlands. While the most common mineralogical residence of Se in the phosphatic shale is elemental Se, Se(0), Se is also an integral component of sulfide phases (pyrite, sphalerite, and vaesite-pyrite solid solution) in the waste rock. It may also be present as adsorbed selenate and/or selenite, FeSe₂, and organo-selenides. Se release from the waste rock has been observed in field and laboratory experiments. Release rates calculated from waste-rock dump and column leachate solutions describe the net, overall Se release from all of the possible sources of Se listed above. In field studies, Se concentration in seepage water (pH 7.4–7.8) from the Wooley Valley Unit 4 dump ranges from 3600 µg/L in May to 10 µg/L by September. Surface water flow, Q, from the seep also declines over the summer, from 2 L/s in May to 0.03 L/s in September, and the Se flux ([Se] × Q) approaches a steady state. Laboratory experiments were performed with the waste shale in packed-bed reactors; residence time varied from 0.09 to 400 h and outlet pH was ∼7.5. Here, Se concentration increased with increasing residence time, and release was modeled with a first-order reaction with k = 2.19 × 10⁻³ h⁻¹ (19.2 yr⁻¹). Rate constants reported here fall within an order of magnitude of reported rate constants for oxidation of Se(0) formed by bacterial precipitation. This similarity among rate constants from both field and laboratory studies, combined with the direct observation of Se(0) in waste shales of the Phosphoria Formation, suggests that oxidation of Se(0) may control the steady-state Se concentration in water draining the Wooley Valley waste dump.
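    One common first-order form consistent with the behavior described (outlet concentration rising with residence time toward a plateau) is C(t) = C_max (1 - exp(-kt)). The sketch below uses the reported rate constant; the plateau concentration C_max is an illustrative placeholder, not a value from the study.

    ```python
    import math

    k = 2.19e-3      # 1/h, first-order rate constant quoted in the abstract
    c_max = 100.0    # ug/L, hypothetical plateau concentration (assumed)

    # Residence times spanning the range tested in the packed-bed reactors.
    for t in (0.09, 1.0, 10.0, 100.0, 400.0):
        c = c_max * (1.0 - math.exp(-k * t))
        print(f"t = {t:7.2f} h  ->  Se = {c:6.2f} ug/L")
    ```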

  15. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    International Nuclear Information System (INIS)

    1978-05-01

The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced ''best estimate'' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom-flood and falling-film quench fronts; and consistent treatment of entire accident sequences, including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations

  16. Publication Bias Currently Makes an Accurate Estimate of the Benefits of Enrichment Programs Difficult: A Postmortem of Two Meta-Analyses Using Statistical Power Analysis

    Science.gov (United States)

    Warne, Russell T.

    2016-01-01

Recently, Kim (2016) published a meta-analysis on the effects of enrichment programs for gifted students. She found that these programs produced substantial effects for academic achievement (g = 0.96) and socioemotional outcomes (g = 0.55). However, given current theory and empirical research, these estimates of the benefits of enrichment programs…

  17. Estimating Costs and Benefits Associated with Evidence-Based Violence Prevention: Four Case Studies Based on the Fourth R Program

    Directory of Open Access Journals (Sweden)

    Claire V. Crooks

    2017-05-01

Teen violence in dating and peer relationships has huge costs to society in numerous areas, including health care, social services, the workforce and the justice system. Physical, psychological, and sexual abuse have long-lasting ramifications for the perpetrators as well as the victims, and for the families involved on both sides of that equation. An effective violence prevention program that is part of a school's curriculum is beneficial not only for teaching teenagers what is appropriate behaviour in a relationship, but also for helping them break the cycle of violence which may have begun at home with their own maltreatment as children. The Fourth R program is an efficacious violence prevention program that was developed in Ontario and has been implemented in schools throughout Canada and the U.S. Covering relationship dynamics common to dating violence as well as substance abuse, peer violence and unsafe sex, the program can be adapted to different cultures and to same-sex relationships. The program, which gets its name from the traditional 3Rs — reading, 'riting and 'rithmetic — offers schools the opportunity to provide effective programming for teens to reduce the likelihood that they will use violence in relationships as they move into adulthood. The federal government has estimated that the societal costs of relationship violence amount to more than $7 billion. These costs can continue to be incurred through the legal and health-care systems as the ripple effects of violence play out over the years, even after a relationship has ended. Other types of violence are also costly to society, and not just in terms of dollars, but in young lives diverted into criminal activity. Up to 15 per cent of youth who become involved with the justice system grow into serious adult offenders who develop lengthy criminal careers. Yet, research shows that if prevention programs such as the Fourth R can deter just one 14-year-old high-risk juvenile from

  18. Gene expression programming approach for the estimation of moisture ratio in herbal plants drying with vacuum heat pump dryer

    Science.gov (United States)

    Dikmen, Erkan; Ayaz, Mahir; Gül, Doğan; Şahin, Arzu Şencan

    2017-07-01

The determination of the drying behavior of herbal plants is a complex process. In this study, a gene expression programming (GEP) model was used to determine the drying behavior of herbal plants, namely fresh sweet basil, parsley and dill leaves. Time and drying temperature are the input parameters for the estimation of the moisture ratio of herbal plants. The results of the GEP model are compared with experimental drying data. Statistical measures, namely mean absolute percentage error, root-mean-squared error and R-square, are used to quantify the difference between values predicted by the GEP model and the values actually observed in the experimental study. It was found that the results of the GEP model and the experimental study are in reasonably good agreement. The results show that the GEP model can be considered an efficient modelling technique for the prediction of the moisture ratio of herbal plants.
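    The three statistics named in the abstract follow directly from their standard definitions. A minimal sketch; the observed and predicted moisture-ratio arrays are invented for illustration.

    ```python
    import numpy as np

    observed  = np.array([1.00, 0.82, 0.65, 0.51, 0.40, 0.31])  # measured moisture ratio
    predicted = np.array([0.98, 0.84, 0.63, 0.53, 0.38, 0.32])  # model output

    rmse = np.sqrt(np.mean((observed - predicted) ** 2))        # root-mean-squared error
    mape = np.mean(np.abs((observed - predicted) / observed)) * 100.0
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot                                  # coefficient of determination

    print(f"RMSE = {rmse:.4f}, MAPE = {mape:.2f}%, R^2 = {r2:.4f}")
    ```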

  19. The Role of Inflation and Price Escalation Adjustments in Properly Estimating Program Costs: F-35 Case Study

    Science.gov (United States)

    2016-04-30

Thirteenth Annual Acquisition Research Symposium, Thursday Sessions, Volume II. The Role of Inflation and Price Escalation Adjustments in Properly Estimating Program Costs: F-35 Case Study. Stanley Horowitz, Assistant Division... Graduate School of Engineering and Management, Air Force Institute of Technology. Cost and Price Collaboration: Venkat Rao, Professor, Defense...

  20. HYDRAULICS, MEADE COUNTY, USA

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — Recent developments in digital terrain and geospatial database management technology make it possible to protect this investment for existing and future projects to...

  1. FLOODPLAIN, MEADE COUNTY, SD

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Floodplain Mapping/Redelineation study deliverables depict and quantify the flood risks for the study area. The primary risk classifications used are the...

  2. MEAD retrospective analysis report

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Carstensen, J.; Frohn, L.M.

    2003-01-01

the bottom waters. Yet the cumulative atmospheric deposition is always larger than the marine deep-water flux. The mixing of nutrient-rich water from below the pycnocline into the euphotic zone is also a process of highly episodic character and provides sufficient nitrogen to the euphotic zone to sustain… larger algae blooms. The two nitrogen loading processes are correlated - mainly because both are to some extent related to the wind velocity - and the nitrogen input from both processes enables the build-up of algae blooms. Furthermore, the nitrogen supplied on a single day cannot sustain a bloom…

  3. A Computer Program Method for Estimation of Entrance Skin Dose for some Individuals Undergoing X-ray Imaging

    International Nuclear Information System (INIS)

    Taha, T.M.; Allehyani, S.

    2012-01-01

A computer program is described that depends on practical measurements of the entrance skin dose for patients undergoing radiological examinations. Physical parameters such as field size, half-value layer, backscatter factor, dose output, focal-film distance, focal-skin distance and normal operating conditions were taken into consideration in calculating the entrance skin dose. Entrance skin dose can be measured by techniques such as thermoluminescence dosimeters (TLDs) and ionization chambers. The TLD technique, characterized by high precision and reproducibility of dose measurement, was checked by addressing pre-readout annealing, group sorting and dose evaluation. Fifty TLD chips were annealed for 1 hour at 400 °C followed by 2 h at 100 °C after exposure to a constant dose from an X-ray generator. A 0.6 cc ionization chamber was located at the surface of a water chest phantom with dimensions of 40 cm × 40 cm × 20 cm and connected to a Farmer dosemeter. The entrance skin dose was calculated using the generated software by changing the physical parameters and using the measured output doses. The obtained results were compared with the reference levels of the International Atomic Energy Agency. The constructed computer program provides an easy and practical means of estimating skin dose even before exposure, and offers one of the simplest and cheapest techniques that can be employed in any entrance skin dose measurement.
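    A hedged sketch of how such a pre-exposure estimate can be assembled from the listed parameters. The functional form (tube output scaled by mAs, corrected by the inverse-square law from the calibration distance to the focus-skin distance, multiplied by a backscatter factor) is a standard textbook approach, not necessarily the exact scheme of this program; all numbers are hypothetical.

    ```python
    def entrance_skin_dose(output_mgy_per_mas, mas, d_cal_cm, fsd_cm, bsf):
        # Incident air kerma at the skin plane via inverse-square correction,
        # then backscatter applied to obtain dose at the skin surface (mGy).
        incident_air_kerma = output_mgy_per_mas * mas * (d_cal_cm / fsd_cm) ** 2
        return incident_air_kerma * bsf

    # Hypothetical chest-exam parameters, for illustration only.
    esd = entrance_skin_dose(output_mgy_per_mas=0.05, mas=20.0,
                             d_cal_cm=100.0, fsd_cm=90.0, bsf=1.35)
    print(f"estimated entrance skin dose: {esd:.2f} mGy")
    ```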

  4. Model to estimate radiation dose commitments to the world population from the atmospheric release of radionuclides (LWBR development program)

    International Nuclear Information System (INIS)

    Rider, J.L.; Beal, S.K.

    1978-02-01

    The equations developed for use in the LWBR environmental statement to estimate the dose commitment over a given time interval to a given organ of the population in the entire region affected by the atmospheric releases of a radionuclide are presented and may be used for any assessment of dose commitments in these regions. These equations define the dose commitments to the world resulting from a released radionuclide and each of its daughters and the sum of these dose commitments provides the total dose commitment accrued from the release of a given radionuclide. If more than one radionuclide is released from a facility, then the sum of the dose commitments from each released nuclide and from each daughter of each released nuclide is the total dose commitment to the world from that facility. Furthermore, if more than one facility is considered as part of an industry, then the sum of the dose commitments from the individual facilities represents the total world dose commitment associated with that industry. The actual solutions to these equations are carried out by the AIRWAY computer program. The writing of this computer program entailed defining in detail the specific representations of the various parameters such as scavenging coefficients, resuspension factors, deposition velocities, dose commitment conversion factors, and food uptake factors, in addition to providing specific numerical values for these types of parameters. The program permits the examination of more than one released nuclide at a time and performs the necessary summing to obtain the total dose commitment to the world accrued from the specified releases
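    The additivity described above (daughters summed into nuclides, nuclides into facilities, facilities into an industry) can be stated compactly. A hedged sketch, with indices of our own choosing rather than the report's notation:

    ```latex
    % Total world dose commitment for an industry: sum over facilities f,
    % released nuclides n, and each nuclide's decay daughters d
    % (d = 0 denoting the parent nuclide itself).
    D_{\mathrm{industry}} \;=\; \sum_{f}\,\sum_{n}\,\sum_{d \ge 0} D_{f,n,d}
    ```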

  5. Competing risks and the development of adaptive management plans for water resources: Field reconnaissance investigation of risks to fishes and other aquatic biota exposed to endocrine disrupting chemicals (edcs) in lake mead, Nevada USA

    Science.gov (United States)

    Linder, G.; Little, E.E.

    2009-01-01

    The analysis and characterization of competing risks for water resources rely on a wide spectrum of tools to evaluate hazards and risks associated with their management. For example, waters of the lower Colorado River stored in reservoirs such as Lake Mead present a wide range of competing risks related to water quantity and water quality. These risks are often interdependent and complicated by competing uses of source waters for sustaining biological resources and for supporting a range of agricultural, municipal, recreational, and industrial uses. USGS is currently conducting a series of interdisciplinary case-studies on water quality of Lake Mead and its source waters. In this case-study we examine selected constituents potentially entering the Lake Mead system, particularly endocrine disrupting chemicals (EDCs). Worldwide, a number of environmental EDCs have been identified that affect reproduction, development, and adaptive behaviors in a wide range of organisms. Many EDCs are minimally affected by current treatment technologies and occur in treated sewage effluents. Several EDCs have been detected in Lake Mead, and several substances have been identified that are of concern because of potential impacts to the aquatic biota, including the sport fishery of Lake Mead and endangered razorback suckers (Xyrauchen texanus) that occur in the Colorado River system. For example, altered biomarkers relevant to reproduction and thyroid function in fishes have been observed and may be predictive of impaired metabolism and development. Few studies, however, have addressed whether such EDC-induced responses observed in the field have an ecologically significant effect on the reproductive success of fishes. To identify potential linkages between EDCs and species of management concern, the risk analysis and characterization in this reconnaissance study focused on effects (and attendant uncertainties) that might be expressed by exposed populations. In addition, risk reduction

  6. Chaos control in solar fed DC-DC boost converter by optimal parameters using nelder-mead algorithm powered enhanced BFOA

    Science.gov (United States)

    Sudhakar, N.; Rajasekar, N.; Akhil, Saya; Jyotheeswara Reddy, K.

    2017-11-01

The boost converter is the most desirable DC-DC power converter for renewable energy applications because of its favorable continuous input-current characteristics. On the other hand, DC-DC converters, as practical nonlinear systems, are prone to several types of nonlinear phenomena, including bifurcation, quasiperiodicity, intermittency and chaos. These undesirable effects have to be controlled to maintain normal periodic operation of the converter and to ensure stability. This paper presents an effective solution to control chaos in a solar-fed DC-DC boost converter, since the converter experiences a wide range of input power variation, which leads to chaotic phenomena. Chaos control is achieved using optimal circuit parameters obtained through a Nelder-Mead enhanced Bacterial Foraging Optimization Algorithm. The optimization yields suitable parameters in minimal computational time. The results are compared with those of traditional methods, and the obtained results ensure operation of the converter within the controllable region.
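    The Nelder-Mead step of such a hybrid search can be illustrated with SciPy's implementation. This is only the simplex half of the approach (the paper wraps it inside an enhanced BFOA), and the objective below is a hypothetical smooth surrogate for an instability metric, not a circuit simulation.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def instability_metric(params):
        # Stand-in objective: penalize distance from a known stable operating
        # point in scaled (inductance, capacitance) coordinates. Hypothetical.
        l_scaled, c_scaled = params
        return (l_scaled - 1.5) ** 2 + (c_scaled - 0.8) ** 2

    result = minimize(instability_metric, x0=np.array([1.0, 1.0]),
                      method="Nelder-Mead", options={"xatol": 1e-6})
    print("optimal scaled (L, C):", result.x)
    ```

    In the paper's setting the metric would instead be computed from simulated converter waveforms, with the global BFOA layer guarding against the local minima that a bare simplex search can fall into.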

  7. Hydrogeology and sources of water to select springs in Black Canyon, south of Hoover Dam, Lake Mead National Recreation Area, Nevada and Arizona

    Science.gov (United States)

    Moran, Michael J.; Wilson, Jon W.; Beard, L. Sue

    2015-11-03

Springs in Black Canyon of the Colorado River, directly south of Hoover Dam in the Lake Mead National Recreation Area, Nevada and Arizona, are important hydrologic features that support a unique riparian ecosystem, including habitat for endangered species. Rapid population growth in areas near and surrounding Black Canyon has caused concern among resource managers that such growth could affect the discharge from these springs. The U.S. Geological Survey studied the springs in Black Canyon between January 2008 and May 2014. The purposes of this study were to provide a baseline of discharge and hydrochemical data from selected springs in Black Canyon and to better understand the sources of water to the springs.

  8. Improving estimates of the burden of severe acute malnutrition and predictions of caseload for programs treating severe acute malnutrition

    DEFF Research Database (Denmark)

    Bulti, Assaye; Briend, André; Dale, Nancy M

    2017-01-01

Background: The burden of severe acute malnutrition (SAM) is estimated using unadjusted prevalence estimates. SAM is an acute condition and many children with SAM will either recover or die within a few weeks. Estimating SAM burden using unadjusted prevalence estimates therefore results in significant underestimation of the true caseload.
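    A sketch of why the unadjusted figure falls short. Assuming (our notation, not the authors') a population N, point prevalence P, mean episode duration D and a program period t, incident cases arise at roughly P/D per unit time, so the period caseload is approximately:

    ```latex
    \text{caseload} \;\approx\;
      \underbrace{N P}_{\text{prevalent cases}}
      \;+\;
      \underbrace{N \tfrac{P}{D}\, t}_{\text{incident cases}}
      \;=\; N P \left( 1 + \frac{t}{D} \right)
    ```

    A prevalence survey captures only the first term; the shorter the episode duration relative to the program period, the larger the underestimation.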

  9. Linear Programming in the economic estimate of livestock-crop integration: application to a Brazilian dairy farm

    Directory of Open Access Journals (Sweden)

    Augusto Hauber Gameiro

    2016-04-01

A linear programming mathematical model was applied to a representative dairy farm located in Brazil. The results showed that optimization models are relevant tools to assist in the planning and management of agricultural production, as well as to assist in estimating potential gains from the use of integrated systems. Diversification was a necessary condition for economic viability. A total cost reduction potential of about 30% was revealed when a scenario of lower levels of diversification was contrasted with one of higher levels. Technical complementarities proved to be important sources of economies. The possibility of reusing nitrogen, phosphorus, and potassium present in animal waste could be increased to 167%, while water reuse could be increased up to 150%. In addition to economic gains, integrated systems bring benefits to the environment, especially with reference to the reuse of resources. The cost dilution of fixed production factors can help economies of scope to be achieved. However, this does not seem to have been the main source of these benefits. Still, the percentage of land use could increase up to 30.7% when the lowest and the highest diversification scenarios were compared. The labor coefficient could have a 4.3 percent increase. Diversification also leads to drastic transaction cost reductions.
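    The structure of such a farm LP can be shown on a toy instance. All coefficients below are hypothetical, chosen only to illustrate the crop-livestock coupling (cows require feed crop and pasture land); they are not taken from the study.

    ```python
    from scipy.optimize import linprog

    # Decision variables: x1 = hectares of feed crop, x2 = number of cows.
    c = [400.0, 1200.0]            # cost per hectare of crop, cost per cow

    A_ub = [
        [0.0, -1.0],               # milk quota: x2 >= 50  (written as -x2 <= -50)
        [1.0,  0.3],               # land: crop ha + 0.3 ha pasture per cow <= 40
        [-1.0, 0.2],               # feed link: each cow needs 0.2 ha of crop
    ]
    b_ub = [-50.0, 40.0, 0.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("crop hectares, cows:", res.x, " minimum cost:", res.fun)
    ```

    Integration enters through the third constraint: the optimal crop area is driven by the herd size, which is the kind of technical complementarity the abstract credits for the cost savings.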

  10. Computer programs for capital cost estimation, lifetime economic performance simulation, and computation of cost indexes for laser fusion and other advanced technology facilities

    International Nuclear Information System (INIS)

    Pendergrass, J.H.

    1978-01-01

    Three FORTRAN programs, CAPITAL, VENTURE, and INDEXER, have been developed to automate computations used in assessing the economic viability of proposed or conceptual laser fusion and other advanced-technology facilities, as well as conventional projects. The types of calculations performed by these programs are, respectively, capital cost estimation, lifetime economic performance simulation, and computation of cost indexes. The codes permit these three topics to be addressed with considerable sophistication commensurate with user requirements and available data

  11. A linear programming approach for estimating the structure of a sparse linear genetic network from transcript profiling data

    Directory of Open Access Journals (Sweden)

    Chandra Nagasuma R

    2009-02-01

Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks, graphs with many thousands of nodes where an undirected edge between two nodes does not indicate the direction of influence, and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a linear program (LP). A bound on the generalization error of this approach is given in terms of the leave-one-out error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold-standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first and/or second ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known
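    The regression step behind this kind of estimate is easy to sketch. The paper poses the LASSO as an explicit linear program; the snippet below uses scikit-learn's coordinate-descent Lasso as an equivalent penalized regression, regressing each gene's profile on all other genes and turning nonzero coefficients into undirected edges. Data and the penalty value are simulated assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n_samples, n_genes = 40, 10
    X = rng.normal(size=(n_samples, n_genes))
    # Plant one known dependency: gene 0 is driven by genes 1 and 2 plus noise.
    X[:, 0] = 0.8 * X[:, 1] - 0.5 * X[:, 2] + 0.1 * rng.normal(size=n_samples)

    edges = set()
    for g in range(n_genes):
        others = [j for j in range(n_genes) if j != g]
        coef = Lasso(alpha=0.1).fit(X[:, others], X[:, g]).coef_
        for j, w in zip(others, coef):
            if abs(w) > 1e-6:
                edges.add(frozenset((g, j)))   # undirected edge, direction unknown

    print(f"recovered {len(edges)} edges")
    ```

    Sparsity comes from the l1 penalty zeroing most coefficients, which is what keeps the approach tractable on graphs with thousands of nodes.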

  12. U.S. Air Force Operational Medicine: Using the Enterprise Estimating Supplies Program to Develop Materiel Solutions for the Aeromedical Evacuation In-Flight Kit (FFQDM)

    Science.gov (United States)

    2011-05-04

[Abstract fragment; residue from a table of ICD-9 diagnosis codes and case counts has been omitted (entries included acute peptic ulcer with perforation; migraine, unspecified; acute myocardial infarction of unspecified site).] …Enterprise Estimating Supplies Program (EESP), for the development and management of Air Force medical Allowance Standards as a baseline for

  13. Estimation of Shielding Thickness for a Prototype Department of Energy National Spent Nuclear Fuel Program Transport Cask

    Energy Technology Data Exchange (ETDEWEB)

    SANCHEZ,LAWRENCE C.; MCCONNELL,PAUL E.

    2000-07-01

Preliminary shielding calculations were performed for a prototype National Spent Nuclear Fuel Program (NSNFP) transport cask. This analysis is intended for use in the selection of the cask shield material type and a preliminary estimate of shielding thickness. The radiation source term was modeled as cobalt-60 with a radiation exposure strength of 100,000 R/hr. Cobalt-60 was chosen as a surrogate source because it simultaneously emits two high-energy gammas, 1.17 MeV and 1.33 MeV. This gamma spectrum is considered energetic enough to upper-bound the spectra of all the various spent nuclear fuel types currently expected to be shipped within the prototype cask. Point-kernel shielding calculations were performed for a wide range of shielding thicknesses of lead and depleted uranium. The computational results were compared to three shielding limits: a 200 mrem/hr dose rate limit at the cask surface, a 50 mR/hr exposure rate limit at one meter from the cask surface, and a 10 mrem/hr dose rate limit at two meters from the cask surface. The results obtained in this study indicated that a shielding thickness of 13 cm is required for depleted uranium and 21 cm for lead in order to satisfy all three shielding requirements without taking credit for stainless steel liners. The analysis also indicated that the required shielding thicknesses are strongly dependent upon the gamma energy spectrum of the radiation source term. This latter finding means that shielding material thickness, and hence cask weight, can be significantly reduced if the radiation source term can be shown to have a softer (lower energy) gamma spectrum than that of cobalt-60.
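    A back-of-envelope version of the point-kernel idea: the exposure rate falls off as exp(-μt) times a buildup factor B(t), and the required thickness is whatever drives the rate below the limit. The attenuation coefficient for lead near the Co-60 mean energy and the simple linear buildup form below are illustrative assumptions, not values from the report, and no inverse-square (distance) credit is taken, so the answer comes out thicker than the report's 21 cm.

    ```python
    import math

    mu_lead = 0.6          # 1/cm, approx. linear attenuation for ~1.25 MeV gammas
    source_rate = 100000.0  # R/h unshielded, per the abstract
    limit = 0.05            # R/h, i.e. the 50 mR/h exposure-rate limit

    t = 0.0
    # Step thickness up until attenuated, buildup-corrected rate meets the limit.
    while source_rate * (1.0 + mu_lead * t) * math.exp(-mu_lead * t) > limit:
        t += 0.01
    print(f"required lead thickness (no geometry credit): {t:.1f} cm")
    ```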

  14. Estimation of Pap-test coverage in an area with an organised screening program: challenges for survey methods

    Directory of Open Access Journals (Sweden)

    Raggi Patrizio

    2006-03-01

Background: The cytological screening programme of Viterbo has completed the second round of invitations to the entire target population (ages 25–64). From a public health perspective, it is important to know the Pap-test coverage rate and the use of opportunistic screening. The most commonly used study design is the survey, but the validity of self-reports and the assumptions made about non-respondents are often questioned. Methods: From the target population, 940 women were sampled and responded to a telephone interview about Pap-test utilisation. The answers were compared with the screening program registry, comparing the dates of Pap-tests reported by both sources. Sensitivity analyses were performed for coverage over a 36-month period according to various assumptions regarding non-respondents. Results: The response rate was 68%. The coverage over 36 months was 86.4% if we assume that non-respondents had the same coverage as respondents, 66% if we assume they were not covered at all, and 74.6% if we adjust for screening compliance in the non-respondents. The sensitivity and specificity of the question, "Have you ever had a Pap test with the screening programme?", were 84.5% and 82.2%, respectively. The test dates reported in the interview tended to be more recent than those reported in the registry, but 68% were within 12 months of each other. Conclusion: Surveys are useful tools for understanding the effectiveness of a screening programme, and women's self-report was sufficiently reliable in our setting, but the coverage estimates were strongly influenced by the assumptions we made regarding non-respondents.
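    The scenario logic in the Results can be sketched as a response-rate-weighted mix of respondent coverage and an assumed non-respondent coverage. Numbers below are illustrative; the paper's own estimates also fold in registry information about non-respondents, so this sketch is not expected to reproduce its exact figures.

    ```python
    resp_rate = 0.68   # survey response rate
    resp_cov = 0.864   # coverage among respondents (assumed for the sketch)

    scenarios = {
        "non-respondents like respondents": resp_cov,
        "non-respondents not covered":      0.0,
        "compliance-adjusted assumption":   0.45,   # hypothetical value
    }
    for label, nonresp_cov in scenarios.items():
        overall = resp_rate * resp_cov + (1.0 - resp_rate) * nonresp_cov
        print(f"{label:34s} -> overall coverage {overall:.1%}")
    ```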

  15. Free-Roaming Dog Population Estimation and Status of the Dog Population Management and Rabies Control Program in Dhaka City, Bangladesh

    Science.gov (United States)

    Tenzin, Tenzin; Ahmed, Rubaiya; Debnath, Nitish C.; Ahmed, Garba; Yamage, Mat

    2015-01-01

Beginning January 2012, a humane method of dog population management using a Catch-Neuter-Vaccinate-Release (CNVR) program was implemented in Dhaka City, Bangladesh, as part of the national rabies control program. To enable this program, the size and distribution of the free-roaming dog population needed to be estimated. We present the results of a dog population survey and a pilot assessment of the CNVR program coverage in Dhaka City. Free-roaming dog population surveys were undertaken in 18 wards of Dhaka City on consecutive days using mark-resight methods. Data were analyzed using Lincoln-Petersen index-Chapman correction methods. The CNVR program was assessed over two years (2012–2013), while the coverage of the CNVR program was assessed by estimating the proportion of dogs that were ear-notched (processed dogs) via dog population surveys. The free-roaming dog population was estimated to be 1,242 (95% CI: 1205–1278) in the 18 sampled wards and 18,585 dogs in Dhaka City (52 dogs/km²), with an estimated human-to-free-roaming-dog ratio of 828:1. During the two-year CNVR program, a total of 6,665 dogs (3,357 male and 3,308 female) were neutered and vaccinated against rabies in 29 of the 92 city wards. A pilot population survey indicated a mean CNVR coverage of 60.6% (range 19.2–79.3%), with only eight wards achieving > 70% coverage. Given that the coverage in many neighborhoods was below the WHO-recommended threshold level of 70% for rabies eradication, and since the CNVR program takes considerable time to implement throughout the entire Dhaka City area, a mass dog vaccination program in the non-CNVR coverage area is recommended to create herd immunity. The findings from this study are expected to guide dog population management and the rabies control program in Dhaka City and elsewhere in Bangladesh. PMID:25978406
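    The mark-resight estimator named above (Lincoln-Petersen with Chapman's small-sample correction) is a one-line formula. The counts below are invented for illustration: M dogs marked on day one, C dogs sighted on day two, R of the sighted dogs carrying marks.

    ```python
    def chapman_estimate(marked, sighted, resighted):
        # Chapman-corrected Lincoln-Petersen estimate of population size;
        # the +1 terms reduce bias when resight counts are small.
        return (marked + 1) * (sighted + 1) / (resighted + 1) - 1

    n_hat = chapman_estimate(marked=60, sighted=55, resighted=40)
    print(f"estimated free-roaming dogs in ward: {n_hat:.0f}")
    ```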

  16. IMFIT: A FAST, FLEXIBLE NEW PROGRAM FOR ASTRONOMICAL IMAGE FITTING

    Energy Technology Data Exchange (ETDEWEB)

Erwin, Peter [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstrasse, D-85748 Garching (Germany); Universitäts-Sternwarte München, Scheinerstrasse 1, D-81679 München (Germany)]

    2015-02-01

I describe a new, open-source astronomical image-fitting program called IMFIT, specialized for galaxies but potentially useful for other sources, which is fast, flexible, and highly extensible. A key characteristic of the program is an object-oriented design that allows new types of image components (two-dimensional surface-brightness functions) to be easily written and added to the program. Image functions provided with IMFIT include the usual suspects for galaxy decompositions (Sérsic, exponential, Gaussian), along with Core-Sérsic and broken-exponential profiles, elliptical rings, and three components that perform line-of-sight integration through three-dimensional luminosity-density models of disks and rings seen at arbitrary inclinations. Available minimization algorithms include Levenberg-Marquardt, Nelder-Mead simplex, and Differential Evolution, allowing trade-offs between speed and decreased sensitivity to local minima in the fit landscape. Minimization can be done using the standard χ² statistic (using either data or model values to estimate per-pixel Gaussian errors, or else user-supplied error images) or Poisson-based maximum-likelihood statistics; the latter approach is particularly appropriate for cases of Poisson data in the low-count regime. I show that fitting low-signal-to-noise ratio galaxy images using χ² minimization and individual-pixel Gaussian uncertainties can lead to significant biases in fitted parameter values, which are avoided if a Poisson-based statistic is used; this is true even when Gaussian read noise is present.
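    The bias mentioned in the last sentence can be demonstrated with a toy constant-level fit. For a constant model, data-weighted χ² minimization reduces to a weighted mean that down-weights high counts, while the Poisson maximum-likelihood estimate is simply the sample mean. This sketch is our illustration of the general statistical point, not IMFIT code.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    true_level = 2.0
    data = rng.poisson(true_level, size=10000).astype(float)

    # chi^2 with per-pixel Gaussian sigma^2 estimated from the data itself
    # (zero counts floored at 1): minimizing sum w_i (d_i - m)^2 gives a
    # weighted mean that is biased low for low-count Poisson data.
    weights = 1.0 / np.maximum(data, 1.0)
    chi2_fit = np.sum(weights * data) / np.sum(weights)

    # Poisson maximum likelihood for a constant model is the plain mean.
    poisson_fit = data.mean()

    print(f"true: {true_level:.3f}  chi^2 fit: {chi2_fit:.3f}  Poisson ML fit: {poisson_fit:.3f}")
    ```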

  17. Rapid determination of thermodynamic parameters from one-dimensional programmed-temperature gas chromatography for use in retention time prediction in comprehensive multidimensional chromatography.

    Science.gov (United States)

    McGinitie, Teague M; Ebrahimi-Najafabadi, Heshmatollah; Harynuk, James J

    2014-01-17

A new method for estimating the thermodynamic parameters ΔH(T₀), ΔS(T₀), and ΔC_P for use in thermodynamic modeling of GC×GC separations has been developed. The method is an alternative to the traditional isothermal separations required to fit a three-parameter thermodynamic model to retention data. Herein, a nonlinear optimization technique is used to estimate the parameters from a series of temperature-programmed separations using the Nelder-Mead simplex algorithm. With this method, the time required to obtain estimates of thermodynamic parameters for a series of analytes is significantly reduced. This new method allows for precise predictions of retention time, with the average error being only 0.2 s for 1D separations. Predictions for GC×GC separations were also in agreement with experimental measurements, having an average relative error of 0.37% for ¹tᵣ and 2.1% for ²tᵣ.

  18. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories

    Science.gov (United States)

    Dunn, H. J.

    1981-01-01

A computer program for performing frequency analysis of time-history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time-history data. The program interfaces with the Advanced Continuous Simulation Language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
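    The program's task is easy to reproduce in modern terms: integrate a Van der Pol oscillator, then estimate its power density spectrum with an FFT periodogram. The report's method also applies circular convolution for smoothing; that step is omitted in this sketch.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def van_der_pol(t, y, mu=1.0):
        x, v = y
        return [v, mu * (1.0 - x ** 2) * v - x]

    dt = 0.01
    t_eval = np.arange(0.0, 200.0, dt)
    sol = solve_ivp(van_der_pol, (0.0, 200.0), [2.0, 0.0],
                    t_eval=t_eval, rtol=1e-8)

    x = sol.y[0] - sol.y[0].mean()                     # remove DC component
    psd = np.abs(np.fft.rfft(x)) ** 2 * dt / len(x)    # one-sided periodogram
    freqs = np.fft.rfftfreq(len(x), d=dt)
    print(f"dominant frequency: {freqs[np.argmax(psd)]:.3f} Hz")
    ```

    For μ = 1 the limit cycle has a period near 6.7 time units, so the spectrum peaks near 0.15 Hz, with harmonics reflecting the non-sinusoidal waveform.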

  19. Mildly abnormal general movement quality in infants is associated with higher Mead acid and lower arachidonic acid and shows a U-shaped relation with the DHA/AA ratio.

    Science.gov (United States)

    van Goor, S A; Schaafsma, A; Erwich, J J H M; Dijck-Brouwer, D A J; Muskiet, F A J

    2010-01-01

We showed that docosahexaenoic acid (DHA) supplementation during pregnancy and lactation was associated with more mildly abnormal (MA) general movements (GMs) in the infants. Since this finding was unexpected and inter-individual DHA intakes are highly variable, we explored the relationship between GM quality and erythrocyte DHA, arachidonic acid (AA), DHA/AA and Mead acid in 57 infants of this trial. MA GMs were inversely related to AA, associated with Mead acid, and associated with DHA/AA in a U-shaped manner. These relationships may indicate dependence of newborn AA status on synthesis from linoleic acid. This becomes restricted during the intrauterine period by abundant de novo synthesis of oleic and Mead acids from glucose, consistent with reduced insulin sensitivity during the third trimester. The descending part of the U-shaped relation between MA GMs and DHA/AA probably indicates DHA shortage next to AA shortage. The ascending part may reflect a different developmental trajectory that is not necessarily unfavorable.

  20. Comparison of Gene Expression Programming with neuro-fuzzy and neural network computing techniques in estimating daily incoming solar radiation in the Basque Country (Northern Spain)

    International Nuclear Information System (INIS)

    Landeras, Gorka; López, José Javier; Kisi, Ozgur; Shiri, Jalal

    2012-01-01

Highlights: ► Solar radiation estimation based on Gene Expression Programming is unexplored. ► This approach is evaluated for the first time in this study. ► Other artificial intelligence models (ANN and ANFIS) are also included in the study. ► New alternatives for solar radiation estimation based on temperatures are provided. - Abstract: Surface incoming solar radiation is a key variable for many agricultural, meteorological and solar energy conversion related applications. In the absence of the required meteorological sensors for the detection of global solar radiation, it is necessary to estimate this variable. Temperature-based modeling procedures are reported in this study for estimating daily incoming solar radiation by using Gene Expression Programming (GEP) for the first time, and other artificial intelligence models such as Artificial Neural Networks (ANNs) and the Adaptive Neuro-Fuzzy Inference System (ANFIS). A comparison was also made among these techniques and traditional temperature-based global solar radiation estimation equations. Root mean square error (RMSE), mean absolute error (MAE), the RMSE-based skill score (SS_RMSE), the MAE-based skill score (SS_MAE), and the r² criterion of Nash and Sutcliffe were used to assess the models' performances. An ANN (a four-input multilayer perceptron with 10 neurons in the hidden layer) presented the best performance among the studied models (2.93 MJ m⁻² d⁻¹ of RMSE). The ability of the GEP approach to model global solar radiation based on daily atmospheric variables was found to be satisfactory.

  1. Sampling-based approaches to improve estimation of mortality among patient dropouts: experience from a large PEPFAR-funded program in Western Kenya.

    Directory of Open Access Journals (Sweden)

    Constantin T Yiannoutsos

Monitoring and evaluation (M&E) of HIV care and treatment programs is impacted by losses to follow-up (LTFU) in the patient population. The severity of this effect is undeniable, but its extent is unknown. Tracing all lost patients addresses this, but census methods are not feasible in programs involving rapid scale-up of HIV treatment in the developing world. Sampling-based approaches and statistical adjustment are the only scalable methods permitting accurate estimation of M&E indices. In a large antiretroviral therapy (ART) program in western Kenya, we assessed the impact of LTFU on estimating patient mortality among 8,977 adult clients, of whom 3,624 were LTFU. Overall, dropouts were more likely male (36.8% versus 33.7%; p = 0.003) and younger than non-dropouts (35.3 versus 35.7 years old; p = 0.020), with lower median CD4 count at enrollment (160 versus 189 cells/ml; p < 0.001) and WHO stage 3-4 disease (47.5% versus 41.1%; p < 0.001). Urban clinic clients were 75.0% of non-dropouts but 70.3% of dropouts (p < 0.001). Of the 3,624 dropouts, 1,143 were sought and 621 had their vital status ascertained. Statistical techniques were used to adjust mortality estimates based on information obtained from located LTFU patients. Observed mortality estimates one year after enrollment were 1.7% (95% CI 1.3%-2.0%), revised to 2.8% (2.3%-3.1%) when deaths discovered through outreach were added, and adjusted to 9.2% (7.8%-10.6%) and 9.9% (8.4%-11.5%) through statistical modeling, depending on the method used. The estimates 12 months after ART initiation were 1.7% (1.3%-2.2%), 3.4% (2.9%-4.0%), 10.5% (8.7%-12.3%) and 10.7% (8.9%-12.6%), respectively. Conclusions/Significance: Assessment of the impact of LTFU is critical in program M&E, as estimated mortality based on passive monitoring may underestimate true mortality by up to 80%. This bias can be ameliorated by tracing a sample of dropouts and statistically adjusting the mortality estimates to properly evaluate and guide large programs.
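    The core of the sampling-based correction is simple: extrapolate the mortality observed among traced dropouts to all dropouts and combine it with deaths seen under passive follow-up. The dropout and traced counts below come from the abstract; the death counts are hypothetical stand-ins for illustration.

    ```python
    n_total = 8977
    n_ltfu = 3624                      # dropouts, per the abstract
    n_active = n_total - n_ltfu        # clients under passive follow-up
    deaths_active = 90                 # hypothetical deaths observed passively
    n_traced, deaths_traced = 621, 50  # traced dropouts (621 per abstract); deaths assumed

    naive = deaths_active / n_total
    ltfu_rate = deaths_traced / n_traced            # mortality among traced sample
    adjusted = (deaths_active + ltfu_rate * n_ltfu) / n_total

    print(f"naive estimate: {naive:.1%}   sample-adjusted estimate: {adjusted:.1%}")
    ```

    Because dropouts die at a far higher rate than patients who stay in care, the adjusted figure can be several times the naive one, which is the bias the study quantifies.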

  2. Estimation of Epidemiological Effectiveness of the Program of Pharmaceutical Prevention of Influenza and ARVI «Antigripp» in Organized Children's Groups

    Directory of Open Access Journals (Sweden)

    I. B. Yakovlev

    2012-01-01

The authors estimated the epidemiological effectiveness of the program for prophylaxis of influenza and ARVI («Antigripp») in children 7–12 years old. Children received Arbidol (100 mg capsules, twice a week for 3 weeks) and Complivit Activ. As a result, there was a decline in the absolute and relative indicators of ARVI morbidity risk. The epidemiological effectiveness of the program during administration of the drugs was 56%, and the index of preventive efficacy was 2.3.

  3. Preliminary estimates of the total-system cost for the restructured program: An addendum to the May 1989 analysis of the total-system life cycle cost for the Civilian Radioactive Waste Management Program

    International Nuclear Information System (INIS)

    1990-12-01

The total-system life-cycle cost (TSLCC) analysis for the Department of Energy's (DOE) Civilian Radioactive Waste Management Program is an ongoing activity that helps determine whether the revenue-producing mechanism established by the Nuclear Waste Policy Act of 1982 - a fee levied on electricity generated and sold by commercial nuclear power plants - is sufficient to cover the cost of the program. This report provides cost estimates for the sixth annual evaluation of the adequacy of the fee. The costs contained in this report represent a preliminary analysis of the cost impacts associated with the Secretary of Energy's Report to Congress on Reassessment of the Civilian Radioactive Waste Management Program issued in November 1989. The major elements of the restructured program announced in this report which pertain to the program's life-cycle costs are: a prioritization of the scientific investigations program at the Yucca Mountain candidate site to focus on identification of potentially adverse conditions, a delay in the start of repository operations until 2010, the start of limited waste acceptance at the monitored retrievable storage (MRS) facility in 1998, and the start of waste acceptance at the full-capability MRS facility in 2000. Based on the restructured program, the total-system cost for the system with a repository at the candidate site at Yucca Mountain in Nevada, a facility for monitored retrievable storage (MRS), and a transportation system is estimated at $26 billion (expressed in constant 1988 dollars). In the event that a second repository is required and is authorized by the Congress, the total-system cost is estimated at $34 to $35 billion, depending on the quantity of spent fuel and high-level waste (HLW) requiring disposal. 17 figs., 17 tabs

  4. Program for searching for semiempirical parameters by the MNDO method

    International Nuclear Information System (INIS)

    Bliznyuk, A.A.; Voityuk, A.A.

    1987-01-01

The authors describe a program for optimizing atomic models constructed using the MNDO method, which varies not only the parameters but also allows simple changes in the calculation scheme. The target function incorporates properties such as formation enthalpies, dipole moments, ionization potentials, and geometrical parameters. The software used to minimize the target function is based on the Nelder-Mead simplex algorithm and on the Fletcher variable-metric method. The program is written in FORTRAN IV and implemented on the ES computer

  5. INTEGRAL ESTIMATE OF THE EFFECTIVENESS OF PERFORMANCE OF INDICES OF STATE TARGET PROGRAMS FOR THE PROTECTION OF THE NATURAL ENVIRONMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Oksana Senyshyn

    2017-12-01

This scientific article presents an integral estimate of the effectiveness of performance of the indices of state target programs for protection of the natural environment in Ukraine; specifically, the subject of the research is the quantitative indices of the State target program "Forests of Ukraine" for 2010–2015 and their estimation. Methodology: The methodological basis of the study is a system of indices for estimating the effectiveness and performance of state target programs for the protection of the natural environment, including an integrated index of financing of the program actions and indicators of co-financing. The author applies the integrated indicator of financing of program tasks and actions to assess the actual level of financing of the program from various sources over the entire period of implementation and to carry out a comparative analysis of financial support for various programs implemented at the expense of budgetary funds and other sources. The author uses the co-financing indicator to calculate the ratio of actual to planned attraction of funds from other sources (public borrowings, extrabudgetary funds) per 1 UAH of budget funds. Results: Proceeding from the analysis of the quantitative indices of the State target program "Forests of Ukraine" for 2010–2015, it was established that in all five years of activity the planned level of budget financing of the program was not achieved. In particular, in 2010–2011 the operations and tasks of the program were financed from the budget at 77%, and in 2014–2015 at 33% and 27%, respectively. During the entire period of implementation, the average annual rate of actual financing from all sources attained 147%, including 53% from the state budget and 206% from other sources of financing. The author has proved that the said indices of the performance of the Program

  6. A computer program integrating a multichannel analyzer with gamma analysis for the estimation of 226 Ra concentration in soil samples

    International Nuclear Information System (INIS)

    Wilson, J. E.

    1992-08-01

A new hardware/software system has been implemented using the existing three-regions-of-interest method for determining the concentration of ²²⁶Ra in soil samples for the Pollutant Assessment Group of the Oak Ridge National Laboratory. Consisting of a personal computer containing a multichannel analyzer, the system utilizes a new program combining the multichannel analyzer with a program that analyzes gamma-radiation spectra for ²²⁶Ra concentrations. This program uses a menu interface to minimize and simplify the tasks of system operation
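    A hedged sketch of what a three-regions-of-interest estimate typically involves: net counts in ROIs placed on radium-series gamma peaks are efficiency-corrected and combined into a single concentration. The abstract does not give the ORNL system's ROIs or calibration constants, so every number below is hypothetical.

    ```python
    def roi_net(gross_counts, background_counts):
        # Net counts in one region of interest, floored at zero.
        return max(gross_counts - background_counts, 0.0)

    # Per ROI: (gross counts, background counts, efficiency in cps per pCi/g).
    rois = [
        (5200.0, 1300.0, 0.011),
        (4100.0, 1100.0, 0.009),
        (6800.0, 1500.0, 0.014),
    ]
    live_time = 600.0  # s, counting time

    estimates = [roi_net(g, b) / live_time / eff for g, b, eff in rois]
    ra226 = sum(estimates) / len(estimates)   # average over the three ROIs
    print(f"estimated 226Ra concentration: {ra226:.1f} pCi/g")
    ```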

  7. Are endocrine and reproductive biomarkers altered in contaminant-exposed wild male Largemouth Bass (Micropterus salmoides) of Lake Mead, Nevada/Arizona, USA?

    Science.gov (United States)

    Goodbred, Steven L.; Patino, Reynaldo; Torres, Leticia; Echols, Kathy R.; Jenkins, Jill A.; Rosen, Michael R.; Orsak, Erik

    2015-01-01

    Male Largemouth Bass were sampled from two locations in Lake Mead (USA), a site influenced by treated municipal wastewater effluent and urban runoff (Las Vegas Bay), and a reference site (Overton Arm). Samples were collected in summer (July '07) and spring (March '08) to assess general health, endocrine and reproductive biomarkers, and compare contaminant body burdens by analyzing 252 organic chemicals. Sperm count and motility were measured in spring. Contaminants were detected at much higher frequencies and concentrations in fish from Las Vegas Bay than Overton Arm. Those with the highest concentrations included PCBs, DDTs, PBDEs, galaxolide, and methyl triclosan. Fish from Las Vegas Bay also had higher Fulton condition factor, hepatosomatic index, and hematocrit, and lower plasma 11-ketotestosterone concentration (KT). Gonadosomatic index (GSI) and sperm motility did not differ between sites, but sperm count was lower by nearly 50% in fish from Las Vegas Bay. A positive association between KT and GSI was identified, but this association was nonlinear. On average, maximal GSI was reached at sub-maximal KT concentrations. In conclusion, the higher concentration of contaminant body burdens coupled with reduced levels of KT and sperm count in fish from Las Vegas Bay suggest that male reproductive condition was influenced by contaminant exposures. Also, the nonlinear KT-GSI association provided a framework to understand why GSI was similar between male bass from both sites despite their large difference in KT, and also suggested the existence of post-gonadal growth functions of KT at high concentrations.

  8. Construction of Indonesian cultural thoughts in tafsir al-Azhar as Hamka’s teaching practice; text analysis using George Herbert Mead communication theory

    Directory of Open Access Journals (Sweden)

    Hamdi Putra Ahmad

    2018-01-01

    Full Text Available Tafsir al-Azhar is one of the many exegesis books written by Nusantara scholars. Authored by Hamka, it is among the exegetical works produced in the contemporary era and has attracted many researchers to analyze in depth the insights contained within it. Tafsir al-Azhar contains considerable information related to elements of Indonesian culture, but few readers are aware of this fact because it appears in relatively small portions. This is the main attraction of studying the work in depth, in order to understand the construction Hamka built around his Indonesian cultural thought in Tafsir al-Azhar. Using George Herbert Mead's theory of social communication, this research formulates the construction of Indonesian cultural thought that Hamka built into Tafsir al-Azhar and argues that the use of elements of Indonesian culture in interpreting the Qur'an is an effective strategy for teaching Qur'anic interpretation to Muslims in particular, and to the Indonesian people in general.

  9. The significance of the body and emotions for self-realization: a reconstruction of G. H. Mead's thinking [Kroppens och känslornas betydelse för självförverkligandet. En rekonstruktion av G H Meads tänkande]

    Directory of Open Access Journals (Sweden)

    Emma Engdahl

    2001-01-01

    Full Text Available Many social researchers mistakenly think that Mead had no interest in the body and the emotions of the human being; they seem to think that his social philosophy of the act is entirely about her mind. This is unfortunate, since his idea of the significance of the body and emotions for the emergence and development of the self is of great relevance to contemporary interest in the subject matter. First, this article presents Mead's contribution to the area, in particular his idea of emotions as a form of intersubjective corporeality. Second, to better fulfil Mead's own wish to transcend the Cartesian mind-body dualism, it reconstructs his idea of emotions by relating them not only to the social body of the human being but also to the structures of norms and values embodied in social life. In that way it becomes evident that not only the mind but also the social body and the emotions of the human being are of great significance for her self-realisation.

  10. Are endocrine and reproductive biomarkers altered in contaminant-exposed wild male Largemouth Bass (Micropterus salmoides) of Lake Mead, Nevada/Arizona, USA?

    Science.gov (United States)

    Goodbred, Steven L; Patiño, Reynaldo; Torres, Leticia; Echols, Kathy R; Jenkins, Jill A; Rosen, Michael R; Orsak, Erik

    2015-08-01

    Male Largemouth Bass were sampled from two locations in Lake Mead (USA), a site influenced by treated municipal wastewater effluent and urban runoff (Las Vegas Bay), and a reference site (Overton Arm). Samples were collected in summer (July '07) and spring (March '08) to assess general health, endocrine and reproductive biomarkers, and compare contaminant body burdens by analyzing 252 organic chemicals. Sperm count and motility were measured in spring. Contaminants were detected at much higher frequencies and concentrations in fish from Las Vegas Bay than Overton Arm. Those with the highest concentrations included PCBs, DDTs, PBDEs, galaxolide, and methyl triclosan. Fish from Las Vegas Bay also had higher Fulton condition factor, hepatosomatic index, and hematocrit, and lower plasma 11-ketotestosterone concentration (KT). Gonadosomatic index (GSI) and sperm motility did not differ between sites, but sperm count was lower by nearly 50% in fish from Las Vegas Bay. A positive association between KT and GSI was identified, but this association was nonlinear. On average, maximal GSI was reached at sub-maximal KT concentrations. In conclusion, the higher concentration of contaminant body burdens coupled with reduced levels of KT and sperm count in fish from Las Vegas Bay suggest that male reproductive condition was influenced by contaminant exposures. Also, the nonlinear KT-GSI association provided a framework to understand why GSI was similar between male bass from both sites despite their large difference in KT, and also suggested the existence of post-gonadal growth functions of KT at high concentrations. Published by Elsevier Inc.

  11. Sperm quality biomarkers complement reproductive and endocrine parameters in investigating environmental contaminants in common carp (Cyprinus carpio) from the Lake Mead National Recreation Area

    Science.gov (United States)

    Jenkins, Jill A.; Rosen, Michael R.; Dale, Rassa O.; Echols, Kathy R.; Torres, Leticia; Wieser, Carla M.; Kersten, Constance A.; Goodbred, Steven L.

    2018-01-01

    Lake Mead National Recreational Area (LMNRA) serves as critical habitat for several federally listed species and supplies water for municipal, domestic, and agricultural use in the Southwestern U.S. Contaminant sources and concentrations vary among the sub-basins within LMNRA. To investigate whether exposure to environmental contaminants is associated with alterations in male common carp (Cyprinus carpio) gamete quality and endocrine- and reproductive parameters, data were collected among sub-basins over 7 years (1999–2006). Endpoints included sperm quality parameters of motility, viability, mitochondrial membrane potential, count, morphology, and DNA fragmentation; plasma components were vitellogenin (VTG), 17ß-estradiol, 11-keto-testosterone, triiodothyronine, and thyroxine. Fish condition factor, gonadosomatic index, and gonadal histology parameters were also measured. Diminished biomarker effects were noted in 2006, and sub-basin differences were indicated by the irregular occurrences of contaminants and by several associations between chemicals (e.g., polychlorinated biphenyls, hexachlorobenzene, galaxolide, and methyl triclosan) and biomarkers (e.g., plasma thyroxine, sperm motility and DNA fragmentation). By 2006, sex steroid hormone and VTG levels decreased with subsequent reduced endocrine disrupting effects. The sperm quality bioassays developed and applied with carp complemented endocrine and reproductive data, and can be adapted for use with other species.

  12. A mathematical model, algorithm, and package of programs for simulation and prompt estimation of the atmospheric dispersion of radioactive pollutants

    International Nuclear Information System (INIS)

    Nikolaev, V.I.; Yatsko, S.N.

    1995-01-01

    A mathematical model and a package of programs are presented for simulating the atmospheric turbulent diffusion of contaminating impurities from land-based and other sources. Test calculations and investigations of the effects of various factors are carried out.

  13. Estimated cost savings associated with the transfer of office-administered specialty pharmaceuticals to a specialty pharmacy provider in a Medical Injectable Drug program.

    Science.gov (United States)

    Baldini, Christopher G; Culley, Eric J

    2011-01-01

    A large managed care organization (MCO) in western Pennsylvania initiated a Medical Injectable Drug (MID) program in 2002 that transferred a specific subset of specialty drugs from physician reimbursement under the traditional "buy-and-bill" model in the medical benefit to MCO purchase from a specialty pharmacy provider (SPP) that supplied physician offices with the MIDs. The MID program was initiated with 4 drugs in 2002 (palivizumab and 3 hyaluronate products/derivatives) growing to more than 50 drugs by 2007-2008. To (a) describe the MID program as a method to manage the cost and delivery of this subset of specialty drugs, and (b) estimate the MID program cost savings in 2007 and 2008 in an MCO with approximately 4.6 million members. Cost savings generated by the MID program were calculated by comparing the total actual expenditure (plan cost plus member cost) on medications included in the MID program for calendar years 2007 and 2008 with the total estimated expenditure that would have been paid to physicians during the same time period for the same medication if reimbursement had been made using HCPCS (J code) billing under the physician "buy-and-bill" reimbursement rates. For the approximately 50 drugs in the MID program in 2007 and 2008, the drug cost savings in 2007 were estimated to be $15.5 million (18.2%) or $290 per claim ($0.28 per member per month [PMPM]) and about $13 million (12.7%) or $201 per claim ($0.23 PMPM) in 2008. Although 28% of MID claims continued to be billed by physicians using J codes in 2007 and 22% in 2008, all claims for MIDs were limited to the SPP reimbursement rates. This MID program was associated with health plan cost savings of approximately $28.5 million over 2 years, achieved by the transfer of about 50 physician-administered injectable pharmaceuticals from reimbursement to physicians to reimbursement to a single SPP and payment of physician claims for MIDs at the SPP reimbursement rates.
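
    The reported per-claim and per-member-per-month figures follow from the totals by simple division. In the sketch below, the claim count is back-calculated from the abstract's $290/claim figure rather than taken from the study.

```python
def per_claim_and_pmpm(total_savings, claims, members, months=12):
    """Savings per claim and per member per month (PMPM)."""
    return total_savings / claims, total_savings / (members * months)

# 2007: $15.5M total savings; ~53,400 claims back-calculated from the
# reported $290/claim; 4.6 million members gives the reported $0.28 PMPM.
per_claim, pmpm = per_claim_and_pmpm(15_500_000, 53_400, 4_600_000)
print(f"2007: ${per_claim:.0f} per claim, ${pmpm:.2f} PMPM")
```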

  14. Integrated Status and Effectiveness Monitoring Program Population Estimates for Juvenile Salmonids in Nason Creek, WA ; 2008 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Matthew; Murdoch, Keely [Yakama Nation Fisheries Resource Management

    2009-07-20

    This report summarizes juvenile coho, spring Chinook, and steelhead salmon migration data collected at a 1.5 m diameter cone rotary fish trap on Nason Creek during 2008, providing abundance and freshwater productivity estimates. We used species enumeration at the trap and efficiency trials to describe emigration timing and to estimate the number of emigrants. Trapping began on March 2, 2008 and was suspended on December 11, 2008 when snow and ice accumulation prevented operation. During 2008, 0 brood year (BY) 2006 coho, 1 BY2007 coho, 906 BY2006 spring Chinook, 323 BY2007 fry Chinook, 2,077 BY2007 subyearling Chinook, 169 steelhead smolts, 414 steelhead fry and 2,390 steelhead parr were trapped. Mark-recapture trap efficiency trials were performed over a range of stream discharge stages. A total of 2,639 spring Chinook, 2,154 steelhead and 12 bull trout were implanted with Passive Integrated Transponder (PIT) tags. Most PIT tagged fish were used for trap efficiency trials. We were unable to identify a statistically significant relationship between stream discharge and trap efficiency; thus, pooled efficiency estimates specific to species and trap size/position were used to estimate the number of fish emigrating past the trap. We estimate that 5,259 (± 359; 95% CI) BY2006 Chinook, 16,816 (± 731; 95% CI) BY2007 Chinook, and 47,868 (± 3,780; 95% CI) steelhead parr and smolts emigrated from Nason Creek in 2008.
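
    The expansion behind these estimates is standard mark-recapture arithmetic: a pooled trap efficiency from marked releases, then catch divided by efficiency. The trial totals below are hypothetical, chosen so the output lands near the reported BY2006 Chinook estimate.

```python
def trap_efficiency(recaptured, released):
    """Pooled efficiency from mark-recapture trials."""
    return recaptured / released

def emigrant_estimate(catch, efficiency):
    """Expand the season's catch by the trap efficiency."""
    return catch / efficiency

# Hypothetical pooled trial totals; 87/500 = 0.174 makes the BY2006 spring
# Chinook catch of 906 expand to roughly the reported estimate of 5,259.
e = trap_efficiency(recaptured=87, released=500)
print(round(emigrant_estimate(catch=906, efficiency=e)))   # -> 5207
```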

  15. Estimating the Impacts of Direct Load Control Programs Using GridPIQ, a Web-Based Screening Tool

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Seemita; Thayer, Brandon L.; Barrett, Emily L.; Studarus, Karen E.

    2017-11-13

    In direct load control (DLC) programs, utilities can curtail the demand of participating loads to contractually agreed-upon levels during periods of critical peak load, thereby reducing stress on the system, generation cost, and required transmission and generation capacity. Participating customers receive financial incentives. The impacts of implementing DLC programs extend well beyond peak shaving. There may be a shift of load proportional to the interrupted load to the times before or after a DLC event, and different load shifts have different consequences. Tools that can quantify the impacts of such programs on load curves, peak demand, emissions, and fossil fuel costs are currently lacking. The Grid Project Impact Quantification (GridPIQ) screening tool includes a Direct Load Control module, which takes into account project-specific inputs as well as the larger system context in order to quantify the impacts of a given DLC program. This allows users (utilities, researchers, etc.) to test and compare different program specifications and their impacts.
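
    A toy illustration (not GridPIQ itself) of the load-shift behavior described: demand is clipped to the contractual cap during the event, and a fraction of the curtailed energy reappears in later hours. All numbers are invented.

```python
import numpy as np

def apply_dlc(load, event, payback, cap, payback_fraction):
    """Clip load to `cap` during the event hours and return a fraction of the
    curtailed energy, spread evenly, to the payback hours."""
    out = np.asarray(load, dtype=float).copy()
    curtailed = np.clip(out[event] - cap, 0.0, None).sum()
    out[event] = np.minimum(out[event], cap)
    pb = out[payback]
    out[payback] = pb + payback_fraction * curtailed / pb.size
    return out

hourly_mw = [60, 65, 80, 95, 100, 90, 70, 62]      # one evening of load, MW
print(apply_dlc(hourly_mw, event=slice(3, 6), payback=slice(6, 8),
                cap=85.0, payback_fraction=0.8))
```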

  16. Flood Catastrophe Model for Designing Optimal Flood Insurance Program: Estimating Location-Specific Premiums in the Netherlands.

    Science.gov (United States)

    Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A

    2017-01-01

    As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by a developed integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods with various recurrences, the ICRM uses stochastic optimization procedure, which relies on quantile-related risk functions of a systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of a Rotterdam area outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.

  17. Tree Canopy Light Interception Estimates in Almond and Walnut Orchards Using Ground, Low Flying Aircraft, and Satellite Based Methods to Improve Irrigation Scheduling Programs

    Science.gov (United States)

    Rosecrance, Richard C.; Johnson, Lee; Soderstrom, Dominic

    2016-01-01

    Canopy light interception is a main driver of water use and crop yield in almond and walnut production. Fractional green canopy cover (Fc) is a good indicator of light interception and can be estimated remotely from satellite using normalized difference vegetation index (NDVI) data. Satellite-based Fc estimates could be used to inform crop evapotranspiration models, and hence support improvements in irrigation evaluation and management capabilities. Satellite estimates of Fc in almond and walnut orchards, however, need to be verified before incorporating them into irrigation scheduling or other crop water management programs. In this study, Landsat-based NDVI and Fc from NASA's Satellite Irrigation Management Support (SIMS) were compared with four estimates of canopy cover: 1. light bar measurement, 2. in-situ and image-based dimensional tree-crown analyses, 3. high-resolution NDVI data from low flying aircraft, and 4. orchard photos obtained via Google Earth and processed by an ImageJ thresholding routine. Correlations between the various estimates are discussed.
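
    The satellite side of this comparison rests on a simple chain: NDVI from red and near-infrared reflectance, then a linear NDVI-to-Fc mapping. The sketch below shows that chain; the gain and offset are placeholders, not the SIMS calibration, and the reflectance values are made up.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def fractional_cover(ndvi_vals, gain=1.26, offset=-0.18):
    """Hypothetical linear NDVI-to-Fc mapping, clipped to [0, 1]."""
    return np.clip(gain * ndvi_vals + offset, 0.0, 1.0)

nir = np.array([0.42, 0.55, 0.61])   # assumed surface reflectances
red = np.array([0.12, 0.08, 0.06])
print(fractional_cover(ndvi(nir, red)))
```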

  18. First Order Estimates of Energy Requirements for Pollution Control. Interagency Energy-Environment Research and Development Program Report.

    Science.gov (United States)

    Barker, James L.; And Others

    This U.S. Environmental Protection Agency report presents estimates of the energy demand attributable to environmental control of pollution from stationary point sources. This class of pollution source includes powerplants, factories, refineries, municipal waste water treatment plants, etc., but excludes mobile sources such as trucks, and…

  19. Programming

    International Nuclear Information System (INIS)

    Jackson, M.A.

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, this model is elaborated to produce the required program outputs; third, the resulting program is transformed to run efficiently in the execution environment. The first two stages deal in network structures of sequential processes; only the third is concerned with procedure hierarchies. (orig.)

  20. Programming

    OpenAIRE

    Jackson, M A

    1982-01-01

    The programmer's task is often taken to be the construction of algorithms, expressed in hierarchical structures of procedures: this view underlies the majority of traditional programming languages, such as Fortran. A different view is appropriate to a wide class of problem, perhaps including some problems in High Energy Physics. The programmer's task is regarded as having three main stages: first, an explicit model is constructed of the reality with which the program is concerned; second, this model is elaborated to produce the required program outputs; third, the resulting program is transformed to run efficiently in the execution environment. The first two stages deal in network structures of sequential processes; only the third is concerned with procedure hierarchies.

  1. Flood Catastrophe Model for Designing Optimal Flood Insurance Program : Estimating Location-Specific Premiums in the Netherlands

    NARCIS (Netherlands)

    Ermolieva, T.; Filatova, Tatiana; Ermoliev, Y.; Obersteiner, M.; de Bruijn, K.M.; Jeuken, A.

    2017-01-01

    As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article

  2. Estimating the population impact of a new pediatric influenza vaccination program in England using social media content

    DEFF Research Database (Denmark)

    Wagner, Moritz; Lampos, Vasileios; Yom-Tov, Elad

    2017-01-01

    Background: The rollout of a new childhood live attenuated influenza vaccine program was launched in England in 2013, which consisted of a national campaign for all 2 and 3 year olds and several pilot locations offering the vaccine to primary school-age children (4-11 years of age) during the influenza season. The 2014/2015 influenza season saw the national program extended to include additional pilot regions, some of which offered the vaccine to secondary school children (11-13 years of age) as well. Objective: We utilized social media content to obtain a complementary assessment … school-age children and providing evidence of the value of social media content as an additional syndromic surveillance tool.

  3. Seismic architecture and lithofacies of turbidites in Lake Mead (Arizona and Nevada, U.S.A.), an analogue for topographically complex basins

    Science.gov (United States)

    Twichell, D.C.; Cross, V.A.; Hanson, A.D.; Buck, B.J.; Zybala, J.G.; Rudin, M.J.

    2005-01-01

    Turbidites, which have accumulated in Lake Mead since completion of the Hoover Dam in 1935, have been mapped using high-resolution seismic and coring techniques. This lake is an exceptional natural laboratory for studying fine-grained turbidite systems in complex topographic settings. The lake comprises four relatively broad basins separated by narrow canyons, and turbidity currents run the full length of the lake. The mean grain size of turbidites is mostly coarse silt, and the sand content decreases from 11-30% in beds in the easternmost basin nearest the source to 3-14% in the central basins to 1-2% in the most distal basin. Regionally, the seismic amplitude mimics the core results and decreases away from the source. The facies and morphology of the sediment surface varies between basins and suggests a regional progression from higher-energy and possibly channelized flows in the easternmost basin to unchannelized flows in the central two basins to unchannelized flows that are ponded by the Hoover Dam in the westernmost basin. At the local scale, turbidites are nearly flat-lying in the central two basins, but here the morphology of the basin walls strongly affects the distribution of facies. One of the two basins is relatively narrow, and in sinuous sections reflection amplitude increases toward the outsides of meanders. Where a narrow canyon debouches into a broad basin, reflection amplitude decreases radially away from the canyon mouth and forms a fan-like deposit. The fine-grained nature of the turbidites in the most distal basin and the fact that reflections drape the underlying pre-impoundment surface suggest ponding here. The progression from ponding in the most distal basin to possibly channelized flows in the most proximal basin shows in plan view a progression similar to the stratigraphic progression documented in several minibasins in the Gulf of Mexico. Copyright © 2005, SEPM (Society for Sedimentary Geology).

  4. Association between degradation of pharmaceuticals and endocrine-disrupting compounds and microbial communities along a treated wastewater effluent gradient in Lake Mead

    Science.gov (United States)

    Blunt, Susanna M.; Sackett, Joshua D.; Rosen, Michael R.; Benotti, Mark J.; Trenholm, Rebecca A.; Vanderford, Brett J.; Hedlund, Brian P.; Moser, Duane P.

    2018-01-01

    The role of microbial communities in the degradation of trace organic contaminants in the environment is little understood. In this study, the biotransformation potential of 27 pharmaceuticals and endocrine-disrupting compounds was examined in parallel with a characterization of the native microbial community in water samples from four sites variously impacted by urban run-off and wastewater discharge in Lake Mead, Nevada and Arizona, USA. Samples included relatively pristine Colorado River water at the upper end of the lake, nearly pure tertiary-treated municipal wastewater entering via the Las Vegas Wash, and waters of mixed influence (Las Vegas Bay and Boulder Basin), which represented a gradient of treated wastewater effluent impact. Microbial diversity analysis based on 16S rRNA gene censuses revealed the community at this site to be distinct from the less urban-impacted locations, although all sites were similar in overall diversity and richness. Similarly, Biolog EcoPlate assays demonstrated that the microbial community at Las Vegas Wash was the most metabolically versatile and active. Organic contaminants added as a mixture to laboratory microcosms were more rapidly and completely degraded in the most wastewater-impacted sites (Las Vegas Wash and Las Vegas Bay), with the majority exhibiting shorter half-lives than at the other sites or in a bacteriostatic control. Although the reasons for enhanced degradation capacity in the wastewater-impacted sites remain to be established, these data are consistent with the acclimatization of native microorganisms (either through changes in community structure or metabolic regulation) to effluent-derived trace contaminants. This study suggests that in urban, wastewater-impacted watersheds, prior exposure to organic contaminants fundamentally alters the structure and function of microbial communities, which in turn translates into greater potential for the natural attenuation of these compounds compared to more pristine

  5. Regional estimates of ecological services derived from U.S. Department of Agriculture conservation programs in the Mississippi Alluvial Valley

    Science.gov (United States)

    Faulkner, Stephen P.; Baldwin, Michael J.; Barrow, Wylie C.; Waddle, Hardin; Keeland, Bobby D.; Walls, Susan C.; James, Dale; Moorman, Tom

    2010-01-01

    The Mississippi Alluvial Valley (MAV) is the Nation's largest floodplain and this once predominantly forested ecosystem provided significant habitat for a diverse flora and fauna, sequestered carbon in trees and soil, and stored floodwater, sediments, and nutrients within the floodplain. This landscape has been substantially altered by the conversion of nearly 75% of the riparian forests, predominantly to agricultural cropland, with significant loss and degradation of important ecosystem services. Large-scale efforts have been employed to restore the forest and wetland resources and the U.S. Department of Agriculture (USDA) Wetlands Reserve Program (WRP) and Conservation Reserve Program (CRP) represent some of the most extensive restoration programs in the MAV. The objective of the WRP is to restore and protect the functions and values of wetlands in agricultural landscapes with an emphasis on habitat for migratory birds and wetland-dependent wildlife, protection and improvement of water quality, flood attenuation, ground water recharge, protection of native flora and fauna, and educational and scientific scholarship.

  6. EQUILGAS: Program to estimate temperatures and in situ two-phase conditions in geothermal reservoirs using three combined FT-HSH gas equilibria models

    Science.gov (United States)

    Barragán, Rosa María; Núñez, José; Arellano, Víctor Manuel; Nieva, David

    2016-03-01

    Exploration and exploitation of geothermal resources require the estimation of important physical characteristics of reservoirs, including temperatures, pressures and in situ two-phase conditions, in order to evaluate possible uses and/or investigate changes due to exploitation. As at relatively high temperatures (>150 °C) reservoir fluids usually attain chemical equilibrium in contact with hot rocks, different models based on the chemistry of fluids have been developed that allow deep conditions to be estimated. Currently, either in water-dominated or steam-dominated reservoirs, the chemistry of steam has been useful for working out reservoir conditions. In this context, three methods based on the Fischer-Tropsch (FT) and combined H2S-H2 (HSH) mineral-gas reactions have been developed for estimating temperatures and the quality of the in situ two-phase mixture prevailing in the reservoir. For these methods the mineral buffers considered to be controlling the H2S-H2 composition of fluids are as follows: the pyrite-magnetite buffer (FT-HSH1), the pyrite-hematite buffer (FT-HSH2), and the pyrite-pyrrhotite buffer (FT-HSH3). Currently, estimates of both temperature and steam fraction in the two-phase fluid are obtained from such models graphically, by using a blank diagram with a background theoretical solution as reference. Large errors can thus be involved, since the isotherms are highly nonlinear functions and reservoir steam fractions are read from a logarithmic scale. In order to facilitate the use of the three FT-HSH methods and minimize visual interpolation errors, the EQUILGAS program, which numerically solves the equations of the FT-HSH methods, was developed. In this work the FT-HSH methods and the EQUILGAS program are described. Illustrative examples for Mexican fields are also given in order to help users decide which method could be more suitable for each specific data set.
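
    Conceptually, EQUILGAS replaces the graphical interpolation with a simultaneous numerical solve of one FT residual and one HSH residual for the two unknowns, temperature T and steam fraction y. The sketch below shows the structure of such a solve; the residual expressions and gas-ratio inputs are placeholders, not the published FT-HSH formulations.

```python
from scipy.optimize import fsolve

def residuals(x, gas_ratios):
    """Two equilibrium residuals in the unknowns T (degC) and y (steam
    fraction). Placeholder functional forms, NOT the published equations."""
    T, y = x
    ft_obs, hsh_obs = gas_ratios
    r1 = ft_obs - (4.0 - 2500.0 / (T + 273.15) + 1.5 * y)    # mock FT residual
    r2 = hsh_obs - (1.0 - 1800.0 / (T + 273.15) + 0.8 * y)   # mock HSH residual
    return [r1, r2]

T, y = fsolve(residuals, x0=[250.0, 0.1], args=([0.2, -1.9],))
print(f"T = {T:.0f} degC, steam fraction y = {y:.2f}")
```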

  7. Program for shaping neutron microconstants for calculations by means of the Monte-Carlo method on the base of estimated data files (NEDAM)

    International Nuclear Information System (INIS)

    Zakharov, L.N.; Markovskij, D.V.; Frank-Kamenetskij, A.D.; Shatalov, G.E.

    1978-01-01

    The program shapes neutron microconstants for calculations by the Monte-Carlo method and is oriented toward detailed treatment of processes in the fast-neutron region. The initial information is files of evaluated data in the UKNDL format. The method combines a group approach to representing process probabilities and the anisotropy of elastic scattering with an individual description of the secondary-neutron spectra of non-elastic processes. The NEDAM program is written in the FORTRAN language for the BESM-6 computer and has the following characteristics: the initial evaluated-data file length is 20000 words, the multigroup constant file length is 8000 words, and the MARK array length is 1000 words. The calculation time for a single variant is 1-2 min.

  8. cloudPEST - A python module for cloud-computing deployment of PEST, a program for parameter estimation

    Science.gov (United States)

    Fienen, Michael N.; Kunicki, Thomas C.; Kester, Daniel E.

    2011-01-01

    This report documents cloudPEST, a Python module with functions to facilitate deployment of the model-independent parameter estimation code PEST on a cloud-computing environment. cloudPEST makes use of low-level, freely available command-line tools that interface with the Amazon Elastic Compute Cloud (EC2™) that are unlikely to change dramatically. This report describes the preliminary setup for both Python and EC2 tools and subsequently describes the functions themselves. The code and guidelines have been tested primarily on the Windows® operating system but are extensible to Linux®.

  9. Immunological indices of blood and interstitial fluid in estimation of a program of therapy of upper limb secondary edemas

    International Nuclear Information System (INIS)

    Kuz'mina, E.G.; Degtyareva, A.A.; Doroshenko, L.N.; Rogova, N.M.; Zorina, L.N.

    1990-01-01

    The efficacy of four programs of therapy for secondary upper-limb edema was compared among 83 patients. The methods were as follows: the traditional method (TM) including routine conservative therapy, acupuncture (AP), the He-Ne laser OKG-13, and a semiconductor laser, the latter two applied against a background of traditional therapy. A study was made of the time course of the extent of edema, total protein, immunoglobulins G, A and M, and circulating immune complexes (CIC) during therapy of such patients. Blood serum and interstitial fluid indices were compared. It was shown that the application of both lasers increased the efficacy of TM and AP.

  10. Model to estimate the local radiation doses to man from the atmospheric release of radionuclides (LWBR development program)

    International Nuclear Information System (INIS)

    Rider, J.L.; Beal, S.K.

    1977-04-01

    A model was developed to estimate the radiation dose commitments received by people in the vicinity of a facility that releases radionuclides into the atmosphere. This model considers dose commitments resulting from immersion in the plume, ingestion of contaminated food, inhalation of gaseous and suspended radioactivity, and exposure to ground deposits. The dose commitments from each of these pathways are explicitly considered for each radionuclide released into the atmosphere and for each daughter of each released nuclide. Using the release rate of only the parent radionuclide, the air and ground concentrations of each daughter are calculated for each position of interest. This is considered to be a significant improvement over other models, in which the concentrations of daughter radionuclides must be approximated by separate releases.

  11. The inverse Numerical Computer Program FLUX-BOT for estimating Vertical Water Fluxes from Temperature Time-Series.

    Science.gov (United States)

    Trauth, N.; Schmidt, C.; Munz, M.

    2016-12-01

    Heat as a natural tracer to quantify water fluxes between groundwater and surface water has evolved into a standard hydrological method. Typically, time series of temperatures in the surface water and in the sediment are observed and subsequently evaluated with a vertical 1D representation of heat transport by advection and dispersion. Several analytical solutions, as well as their implementations in user-friendly software, exist for estimating water fluxes from the observed temperatures. Analytical solutions can be easily implemented, but assumptions about the boundary conditions have to be made a priori, e.g., a sinusoidal upper temperature boundary. Numerical models offer more flexibility and can handle temperature data characterized by irregular variations, such as storm-event-induced temperature changes, which cannot readily be incorporated in analytical solutions. This also reduces the effort of data preprocessing, such as extracting the diurnal temperature variation. We developed software to estimate water FLUXes Based On Temperatures: FLUX-BOT. FLUX-BOT is a numerical code written in MATLAB which is intended to calculate vertical water fluxes in saturated sediments, based on the inversion of measured temperature time series observed at multiple depths. It applies a cell-centered Crank-Nicolson implicit finite difference scheme to solve the one-dimensional heat advection-conduction equation. Besides its core inverse numerical routines, FLUX-BOT includes functions for visualizing the results and for performing uncertainty analysis. We provide applications of FLUX-BOT to generic as well as measured temperature data to demonstrate its performance.
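
    The forward problem that FLUX-BOT inverts can be sketched compactly: a Crank-Nicolson step for one-dimensional heat transport with measured temperatures as Dirichlet boundaries. FLUX-BOT itself is MATLAB and cell-centered; this node-centered Python version, with invented parameter values, is only illustrative.

```python
import numpy as np

def cn_step(T, v, kappa, dz, dt, top, bottom):
    """One Crank-Nicolson step of dT/dt = kappa*d2T/dz2 - v*dT/dz, with
    Dirichlet boundary temperatures taken from the measured series."""
    n = T.size
    A = np.zeros((n, n))
    for i in range(1, n - 1):                    # interior finite differences
        A[i, i - 1] = kappa / dz**2 + v / (2 * dz)
        A[i, i] = -2.0 * kappa / dz**2
        A[i, i + 1] = kappa / dz**2 - v / (2 * dz)
    I = np.eye(n)
    lhs = I - 0.5 * dt * A
    rhs = (I + 0.5 * dt * A) @ T
    lhs[0, :] = 0.0; lhs[0, 0] = 1.0             # impose measured boundary
    lhs[-1, :] = 0.0; lhs[-1, -1] = 1.0          # temperatures at both ends
    rhs[0], rhs[-1] = top, bottom
    return np.linalg.solve(lhs, rhs)

z = np.linspace(0.0, 0.5, 26)             # m, depths of a sensor column
T0 = 15.0 + 5.0 * np.exp(-z / 0.1)        # initial temperature profile, degC
T1 = cn_step(T0, v=1e-5, kappa=1e-6, dz=z[1] - z[0], dt=600.0,
             top=20.5, bottom=15.0)
print(T1[:5].round(3))
```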

  12. Methods for estimating costs of transporting spent fuel and defense high-level radioactive waste for the civilian radioactive waste management program

    International Nuclear Information System (INIS)

    Darrough, M.E.; Lilly, M.J.

    1989-01-01

    The US Department of Energy (DOE), through the Office of Civilian Radioactive Waste Management, is planning and developing a transportation program for the shipment of spent fuel and defense high-level waste from current storage locations to the site of the mined geologic repository. In addition to its responsibility for providing a safe transportation system, the DOE will assure that the transportation program will function with the other system components to create an integrated waste management system. In meeting these objectives, the DOE will use private industry to the maximum extent practicable and in a manner that is cost effective. This paper discusses various methodologies used for estimating costs for the national radioactive waste transportation system. Estimating these transportation costs is a complex effort, as the high-level radioactive waste transportation system, itself, will be complex. Spent fuel and high-level waste will be transported from more than 100 nuclear power plants and defense sites across the continental US, using multiple transport modes (truck, rail, and barge/rail) and varying sizes and types of casks. Advance notification to corridor states will be given and scheduling will need to be coordinated with utilities, carriers, state and local officials, and the DOE waste acceptance facilities. Additionally, the waste forms will vary in terms of reactor type, size, weight, age, radioactivity, and temperature

  13. Design and implementation of estimation-based monitoring programs for flora and fauna: A case study on the Cherokee National Forest

    Science.gov (United States)

    Klimstra, J.D.; O'Connell, A.F.; Pistrang, M.J.; Lewis, L.M.; Herrig, J.A.; Sauer, J.R.

    2007-01-01

    Science-based monitoring of biological resources is important for a greater understanding of ecological systems and for assessment of the target population using theoretic-based management approaches. When selecting variables to monitor, managers first need to carefully consider their objectives, the geographic and temporal scale at which they will operate, and the effort needed to implement the program. Generally, monitoring can be divided into two categories: index and inferential. Although index monitoring is usually easier to implement, analysis of index data requires strong assumptions about consistency in detection rates over time and space, and parameters are often biased because they do not account for detectability and spatial variation. In most cases, individuals are not always available for detection during sampling periods, and the entire area of interest cannot be sampled. Conversely, inferential monitoring is more rigorous because it is based on nearly unbiased estimators of spatial distribution. Thus, we recommend that detectability and spatial variation be considered for all monitoring programs that intend to make inferences about the target population or the area of interest. Application of these techniques is especially important for the monitoring of Threatened and Endangered (T&E) species because it is critical to determine whether population size is increasing or decreasing with some level of certainty. Use of estimation-based methods and probability sampling will reduce many of the biases inherently associated with index data and provide meaningful information with respect to changes that occur in target populations. We incorporated inferential monitoring into protocols for T&E species spanning a wide range of taxa on the Cherokee National Forest in the Southern Appalachian Mountains. We review the various approaches employed for different taxa and discuss design issues, sampling strategies, data analysis, and the details of estimating detectability using site

  14. A Case-Control Study to Estimate the Impact of the Icelandic Population-Based Mammography Screening Program on Breast Cancer Death

    Energy Technology Data Exchange (ETDEWEB)

    Gabe, R.; Tryggvadottir, L.; Sigfusson, B.F.; Olafsdottir, G.H.; Sigurarsson, K. [Icelandic Cancer Society (Krabbameinsfelag Islands), Reykjavik (Iceland)]; Duffy, S.W. [Cancer Research UK, Centre for Epidemiology, Mathematics and Statistics, Wolfson Inst. of Preventive Medicine, London (United Kingdom)]

    2007-11-15

    Background: The Icelandic breast cancer screening program, initiated November 1987 in Reykjavik and covering the whole country from December 1989, comprises biennial invitation to mammography for women aged 40-69 years old. Purpose: To estimate the impact of mammography service screening in Iceland on deaths from breast cancer. Material and Methods: Cases were deaths from breast cancer from 1990 onwards in women aged 40 and over at diagnosis, during the period November 1987 to December 31, 2002. Age- and screening-area-matched, population-based controls were women who had also been invited to screening but were alive at the time their case died. Results: Using conditional logistic regression on the data from 226 cases and 902 controls, the odds ratio for the risk of death from breast cancer in those attending at least one screen compared to those never screened was 0.59 (95% CI 0.41-0.84). After adjustment for healthy-volunteer bias and screening-opportunity bias, the odds ratio was 0.65 (95% CI 0.39-1.09). Conclusion: These results indicate a 35-40% reduction in breast cancer deaths by attending the Icelandic breast cancer screening program. These results are consistent with the overall evidence from other observational evaluations of mammography-based programs.

  15. A Case-Control Study to Estimate the Impact of the Icelandic Population-Based Mammography Screening Program on Breast Cancer Death

    International Nuclear Information System (INIS)

    Gabe, R.; Tryggvadottir, L.; Sigfusson, B.F.; Olafsdottir, G.H.; Sigurarsson, K.; Duffy, S.W.

    2007-01-01

    Background: The Icelandic breast cancer screening program, initiated November 1987 in Reykjavik and covering the whole country from December 1989, comprises biennial invitation to mammography for women aged 40-69 years old. Purpose: To estimate the impact of mammography service screening in Iceland on deaths from breast cancer. Material and Methods: Cases were deaths from breast cancer from 1990 onwards in women aged 40 and over at diagnosis, during the period November 1987 to December 31, 2002. Age- and screening-area-matched, population-based controls were women who had also been invited to screening but were alive at the time their case died. Results: Using conditional logistic regression on the data from 226 cases and 902 controls, the odds ratio for the risk of death from breast cancer in those attending at least one screen compared to those never screened was 0.59 (95% CI 0.41-0.84). After adjustment for healthy-volunteer bias and screening-opportunity bias, the odds ratio was 0.65 (95% CI 0.39-1.09). Conclusion: These results indicate a 35-40% reduction in breast cancer deaths by attending the Icelandic breast cancer screening program. These results are consistent with the overall evidence from other observational evaluations of mammography-based programs

  16. Evaluating the Impact of Zimbabwe's Prevention of Mother-to-Child HIV Transmission Program: Population-Level Estimates of HIV-Free Infant Survival Pre-Option A.

    Science.gov (United States)

    Buzdugan, Raluca; McCoy, Sandra I; Watadzaushe, Constancia; Kang Dufour, Mi-Suk; Petersen, Maya; Dirawo, Jeffrey; Mushavi, Angela; Mujuru, Hilda Angela; Mahomva, Agnes; Musarandega, Reuben; Hakobyan, Anna; Mugurungi, Owen; Cowan, Frances M; Padian, Nancy S

    2015-01-01

    We estimated HIV-free infant survival and mother-to-child HIV transmission (MTCT) rates in Zimbabwe, some of the first community-based estimates from a UNAIDS priority country. In 2012 we surveyed mother-infant pairs residing in the catchment areas of 157 health facilities randomly selected from 5 of 10 provinces in Zimbabwe. Enrolled infants were born 9-18 months before the survey. We collected questionnaires, blood samples for HIV testing, and verbal autopsies for deceased mothers/infants. Estimates were assessed among i) all HIV-exposed infants, as part of an impact evaluation of Option A of the 2010 WHO guidelines (rolled out in Zimbabwe in 2011), and ii) the subgroup of infants unexposed to Option A. We compared province-level MTCT rates measured among women in the community with MTCT rates measured using program monitoring data from facilities serving those communities. Among 8568 women with known HIV serostatus, 1107 (12.9%) were HIV-infected. Among all HIV-exposed infants, HIV-free infant survival was 90.9% (95% confidence interval (CI): 88.7-92.7) and MTCT was 8.8% (95% CI: 6.9-11.1). Sixty-six percent of HIV-exposed infants were still breastfeeding. Among the 762 infants born before Option A was implemented, 90.5% (95% CI: 88.1-92.5) were alive and HIV-uninfected at 9-18 months of age, and 9.1% (95%CI: 7.1-11.7) were HIV-infected. In four provinces, the community-based MTCT rate was higher than the facility-based MTCT rate. In Harare, the community and facility-based rates were 6.0% and 9.1%, respectively. By 2012 Zimbabwe had made substantial progress towards the elimination of MTCT. Our HIV-free infant survival and MTCT estimates capture HIV transmissions during pregnancy, delivery and breastfeeding regardless of whether or not mothers accessed health services. These estimates also provide a baseline against which to measure the impact of Option A guidelines (and subsequently Option B+).

  17. Why invest in a national public health program for stroke? An example using Australian data to estimate the potential benefits and cost implications.

    Science.gov (United States)

    Cadilhac, Dominique A; Carter, Robert C; Thrift, Amanda G; Dewey, Helen M

    2007-10-01

    Stroke is the world's second leading cause of death in people aged over 60 years. Approximately 50,000 strokes occur annually in Australia, with numbers predicted to increase by about one third over 10 years. Our objectives were to assess the economic implications of a public health program for stroke by: (1) predicting what potential health gains and cost offsets could be achieved; and (2) determining the net level of annual investment that would offer value for money. Lifetime costs and outcomes were calculated for additional cases that would benefit if 'current practice' was feasibly improved, estimated for one indicative year using: (i) local epidemiological data, coverage rates and costs; and (ii) pooled effect sizes from systematic reviews. Interventions: blood pressure lowering; warfarin for atrial fibrillation; increased access to stroke units; intravenous thrombolysis and aspirin for ischemic events; and carotid endarterectomy. Value-for-money threshold: AUD$30,000/DALY recovered. Improved prevention and management could prevent about 27,000 (38%) strokes in 2015. In present terms (2004), about 85,000 DALYs and AUD$1.06 billion in lifetime cost offsets could be recovered. The net level of annual warranted investment was AUD$3.63 billion. Primary prevention, in particular blood pressure lowering, was most effective. A public health program for stroke is warranted.
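
    The warranted-investment figure can be reproduced, to within rounding, from the numbers in the abstract: the recovered DALYs valued at the threshold, plus the lifetime cost offsets.

```python
# Figures quoted in the abstract (present-value terms, 2004).
dalys_recovered = 85_000
threshold_aud_per_daly = 30_000
lifetime_cost_offsets_aud = 1.06e9

warranted = dalys_recovered * threshold_aud_per_daly + lifetime_cost_offsets_aud
print(f"AUD${warranted / 1e9:.2f} billion")   # ~3.61, vs AUD$3.63 billion reported
```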

  18. Analytical results and effective dose estimation of the operational Environmental Monitoring Program for the radioactive waste repository in Abadia de Goias from 1998 to 2008

    Energy Technology Data Exchange (ETDEWEB)

    Ribeiro, Edison, E-mail: edison@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil); Tauhata, Luiz, E-mail: tauhata@ird.gov.b [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Recreio dos Bandeirantes, Rio de Janeiro, RJ, CEP 22780-160 (Brazil); Eugenia dos Santos, Eliane, E-mail: esantos@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil); Silveira Correa, Rosangela da, E-mail: rcorrea@cnen.gov.b [Centro Regional de Ciencias Nucleares do Centro-Oeste, Comissao Nacional de Energia Nuclear- Br 060 km 174, 5-Abadia de Goias- Goias, CEP 75345-000 (Brazil)

    2011-02-15

    This paper presents the results of the Environmental Monitoring Program for the radioactive waste repository of Abadia de Goias, which originated from the accident of Goiania, conducted by the Regional Center of Nuclear Sciences (CRCN-CO) of the National Commission on Nuclear Energy (CNEN) from 1998 to 2008. The results relate to the determination of 137Cs activity per unit of mass or volume in samples of surface water, ground water, river-bottom sediments, soil and vegetation, and also to the estimation of the air-kerma rate for gamma exposure at the monitored site. In the operational phase of the Environmental Monitoring Program, the geometric means and standard deviations obtained for 137Cs activity per unit of mass or volume in the analyzed samples were (0.08 ± 1.16) Bq/L for surface and underground water, (0.22 ± 2.79) Bq/kg for soil, (0.19 ± 2.72) Bq/kg for sediment, and (0.19 ± 2.30) Bq/kg for vegetation. These results were similar to the values of the pre-operational Environmental Monitoring Program. With these data, effective doses were estimated for members of the public in the neighborhood of the waste repository, considering the main possible exposure pathways for this population group. The annual effective doses obtained from the analysis of these results were lower than 0.3 mSv/y, the limit established by CNEN for environmental impact on members of the public, indicating that the facility is operating safely, without any radiological impact on the surrounding environment. - Research highlights: • A stolen capsule of cesium-137 was opened in the city of Goiania, generating some 6000 tons of debris, which were stored in the repository area built for this purpose. • The activity of cesium-137 was measured in surface water, underground water, river-bottom sediments, soil, vegetation, and air, inside and around the repository area. • …
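
    The summary statistics quoted above are geometric means with geometric standard deviations, which reduce to ordinary statistics on the log-transformed activities. A sketch with hypothetical sample values:

```python
import numpy as np

def geo_stats(samples):
    """Geometric mean and geometric standard deviation of positive values."""
    logs = np.log(np.asarray(samples, dtype=float))
    return float(np.exp(logs.mean())), float(np.exp(logs.std(ddof=1)))

activity_bq_per_l = [0.07, 0.08, 0.09, 0.075, 0.095]   # hypothetical 137Cs samples
gm, gsd = geo_stats(activity_bq_per_l)
print(f"({gm:.2f} +/- {gsd:.2f}) Bq/L")   # gsd is a multiplicative factor
```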

  19. Estimating Supplies Program: Evaluation Report

    Science.gov (United States)

    2002-12-24

    Fragment of the report's patient-condition code table: 211 Scabies, all cases; 212 Pilonidal cyst/abscess requiring major excision; 213 Cyst/abscess, all cases including minor incision; … (osteitis); 802 Apical abscess/periapical abscess, a collection of purulent exudate around the area of the tooth that surrounds the root tip; 803 Avulsed …

  20. Potential for bias in using hybrids between common carp (Cyprinus carpio) and goldfish (Carassius auratus) in endocrine studies: a first report of hybrids in Lake Mead, Nevada, U.S.A

    Science.gov (United States)

    Goodbred, Steven L.; Patino, Reynaldo; Orsak, Erik; Sharma, Prakash; Ruessler, Shane

    2013-01-01

    During a 2008 study to assess endocrine and reproductive health of common carp (Cyprinus carpio) in Lake Mead, Nevada (U.S.A.), we identified two fish, one male and one female, as hybrids with goldfish (Carassius auratus) based on morphology, lateral line scale count, and lack of anterior barbels. Gross examination of the female hybrid's ovaries indicated the presence of vitellogenic ovarian follicles, whereas histological evaluation of the male hybrid's testes showed lobule-like structures with open lumens but without germ cells, suggesting it was sterile. Because common carp/goldfish hybrids are more susceptible to gonadal tumors and may have different endocrine profiles than common carp, researchers using common carp as a model for endocrine/reproductive studies should be aware of the possible presence of hybrids.

  1. Evaluating lake stratification and temporal trends by using near-continuous water-quality data from automated profiling systems for water years 2005-09, Lake Mead, Arizona and Nevada

    Science.gov (United States)

    Veley, Ronald J.; Moran, Michael J.

    2012-01-01

    The U.S. Geological Survey, in cooperation with the National Park Service and Southern Nevada Water Authority, collected near-continuous depth-dependent water-quality data at Lake Mead, Arizona and Nevada, as part of a multi-agency monitoring network maintained to provide resource managers with basic data and to gain a better understanding of the hydrodynamics of the lake. Water-quality data-collection stations on Lake Mead were located in shallow water (less than 20 meters) at Las Vegas Bay (Site 3) and Overton Arm, and in deep water (greater than 20 meters) near Sentinel Island and at Virgin and Temple Basins. At each station, near-continuous depth-dependent water-quality data were collected from October 2004 through September 2009. The data were collected by using automatic profiling systems equipped with multiparameter water-quality sondes. The sondes had sensors for temperature, specific conductance, dissolved oxygen, pH, turbidity, and depth. Data were collected every 6 hours at 2-meter depth intervals (for shallow-water stations) or 5-meter depth intervals (for deep-water stations) beginning at 1 meter below water surface. Data were analyzed to determine water-quality conditions related to stratification of the lake and temporal trends in water-quality parameters. Three water-quality parameters were the main focus of these analyses: temperature, specific conductance, and dissolved oxygen. Statistical temporal-trend analyses were performed for a single depth at shallow-water stations [Las Vegas Bay (Site 3) and Overton Arm] and for thermally-stratified lake layers at deep-water stations (Sentinel Island and Virgin Basin). The limited period of data collection at the Temple Basin station prevented the application of statistical trend analysis. During the summer months, thermal stratification was not observed at shallow-water stations, nor were major maxima or minima observed for specific-conductance or dissolved-oxygen profiles. A clearly-defined thermocline

  2. Department of the Army: FY 1999 Amended Budget Estimates, Army National Guard, Military Construction Program FY 1999, Justification Data Submission to Congress

    National Research Council Canada - National Science Library

    1998-01-01

    This document contains State List of Projects, Mission Listing, Language, Program and Finance Schedule, Object Classification Schedule, Special Program Considerations, Future Years Defense Plan Audit...

  3. Patterns of metal composition and biological condition and their association in male common carp across an environmental contaminant gradient in Lake Mead National Recreation Area, Nevada and Arizona, USA

    Science.gov (United States)

    Patino, R.; Rosen, Michael R.; Orsak, E.L.; Goodbred, S.L.; May, T.W.; Alvarez, David; Echols, K.R.; Wieser, C.M.; Ruessler, S.; Torres, L.

    2012-01-01

    There is a contaminant gradient in Lake Mead National Recreation Area (LMNRA) that is partly driven by municipal and industrial runoff and wastewater inputs via Las Vegas Wash (LVW). Adult male common carp (Cyprinus carpio; 10 fish/site) were collected from LVW, Las Vegas Bay (receiving LVW flow), Overton Arm (OA, upstream reference), and Willow Beach (WB, downstream) in March 2008. Discriminant function analysis was used to describe differences in metal concentrations and biological condition of fish collected from the four study sites, and canonical correlation analysis was used to evaluate the association between metal and biological traits. Metal concentrations were determined in whole-body extracts. Of 63 metals screened, those initially used in the statistical analysis were Ag, As, Ba, Cd, Co, Fe, Hg, Pb, Se, Zn. Biological variables analyzed included total length (TL), Fulton's condition factor, gonadosomatic index (GSI), hematocrit (Hct), and plasma estradiol-17β and 11-ketotestosterone (11kt) concentrations. Analysis of metal composition and biological condition both yielded strong discrimination of fish by site (respective canonical model, p < 0.0001). Compared to OA, pairwise Mahalanobis distances between group means were WB < LVB < LVW for metal concentrations and LVB < WB < LVW for biological traits. Respective primary drivers for these separations were Ag, As, Ba, Hg, Pb, Se and Zn; and TL, GSI, 11kt, and Hct. Canonical correlation analysis using the latter variable sets showed they are significantly associated (p < 0.0003), with As, Ba, Hg, and Zn, and TL, 11kt, and Hct being the primary contributors to the association. In conclusion, male carp collected along a contaminant gradient in LMNRA have distinct, collection site-dependent metal and morpho-physiological profiles that are significantly associated with each other. These associations suggest that fish health and reproductive condition (as measured by the biological variables evaluated in this

  4. Sea surface temperature estimates for the mid-Piacenzian Indian Ocean—Ocean Drilling Program sites 709, 716, 722, 754, 757, 758, and 763

    Science.gov (United States)

    Robinson, Marci M.; Dowsett, Harry J.; Stoll, Danielle K.

    2018-01-30

    Despite the wealth of global paleoclimate data available for the warm period in the middle of the Piacenzian Stage of the Pliocene Epoch (about 3.3 to 3.0 million years ago [Ma]; Dowsett and others, 2013, and references therein), the Indian Ocean has remained a region of sparse geographic coverage in terms of microfossil analysis. In an effort to characterize the surface Indian Ocean during this interval, we examined the planktic foraminifera from Ocean Drilling Program (ODP) sites 709, 716, 722, 754, 757, 758, and 763, encompassing a wide range of oceanographic conditions. We quantitatively analyzed the data for sea surface temperature (SST) estimation using both the modern analog technique (MAT) and a factor analytic transfer function. The data will contribute to the U.S. Geological Survey (USGS) Pliocene Research, Interpretation and Synoptic Mapping (PRISM) Project’s global SST reconstruction and climate model SST boundary condition for the mid-Piacenzian and will become part of the PRISM verification dataset designed to ground-truth Pliocene climate model simulations (Dowsett and others, 2013).
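
    The modern analog technique referenced above admits a compact sketch: dissimilarity between the fossil assemblage and modern core-top assemblages is measured by squared chord distance, and SST is taken as the mean over the k closest analogs. The assemblage and SST data below are synthetic, not PRISM data.

```python
import numpy as np

def squared_chord(a, b):
    """Squared chord distance between two relative-abundance assemblages."""
    return float(np.sum((np.sqrt(a) - np.sqrt(b)) ** 2))

def mat_sst(fossil, modern_assemblages, modern_sst, k=3):
    """Mean SST of the k modern analogs closest to the fossil assemblage."""
    d = [squared_chord(fossil, m) for m in modern_assemblages]
    nearest = np.argsort(d)[:k]
    return float(np.mean(np.asarray(modern_sst)[nearest]))

rng = np.random.default_rng(1)
modern = rng.dirichlet(np.ones(10), size=50)   # 50 core tops, 10 taxa
sst = rng.uniform(18, 30, size=50)             # paired modern SSTs, degC
fossil = rng.dirichlet(np.ones(10))            # one downcore sample
print(round(mat_sst(fossil, modern, sst), 1))
```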

  5. TRAC-PF1/MOD1: an advanced best-estimate computer program for pressurized water reactor thermal-hydraulic analysis

    International Nuclear Information System (INIS)

    Liles, D.R.; Mahaffy, J.H.

    1986-07-01

    The Los Alamos National Laboratory is developing the Transient Reactor Analysis Code (TRAC) to provide advanced best-estimate predictions of postulated accidents in light-water reactors. The TRAC-PF1/MOD1 program provides this capability for pressurized water reactors and for many thermal-hydraulic test facilities. The code features either a one- or a three-dimensional treatment of the pressure vessel and its associated internals, a two-fluid nonequilibrium hydrodynamics model with a noncondensable gas field and solute tracking, flow-regime-dependent constitutive equation treatment, optional reflood tracking capability for bottom-flood and falling-film quench fronts, and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The stability-enhancing two-step (SETS) numerical algorithm is used in the one-dimensional hydrodynamics and permits this portion of the fluid dynamics to violate the material Courant condition. This technique permits large time steps and, hence, reduced running time for slow transients.

  6. Asteroid mass estimation using Markov-chain Monte Carlo

    Science.gov (United States)

    Siltala, Lauri; Granvik, Mikael

    2017-11-01

    Estimates for asteroid masses are based on their gravitational perturbations on the orbits of other objects such as Mars, spacecraft, or other asteroids and/or their satellites. In the case of asteroid-asteroid perturbations, this leads to an inverse problem in at least 13 dimensions where the aim is to derive the mass of the perturbing asteroid(s) and six orbital elements for both the perturbing asteroid(s) and the test asteroid(s) based on astrometric observations. We have developed and implemented three different mass estimation algorithms utilizing asteroid-asteroid perturbations: the very rough 'marching' approximation, in which the asteroids' orbital elements are not fitted, thereby reducing the problem to a one-dimensional estimation of the mass, an implementation of the Nelder-Mead simplex method, and most significantly, a Markov-chain Monte Carlo (MCMC) approach. We describe each of these algorithms with particular focus on the MCMC algorithm, and present example results using both synthetic and real data. Our results agree with the published mass estimates, but suggest that the published uncertainties may be misleading as a consequence of using linearized mass-estimation methods. Finally, we discuss remaining challenges with the algorithms as well as future plans.
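
    As a toy illustration of the MCMC approach, the following is a minimal random-walk Metropolis sampler for a single mass parameter, in the spirit of the one-dimensional 'marching' approximation above (orbital elements held fixed); the Gaussian stand-in likelihood and all numbers are synthetic, not the authors' 13-dimensional formulation.

        # Minimal Metropolis sampler for a one-dimensional mass estimate.
        # The likelihood is a synthetic stand-in for an observed-minus-computed
        # astrometric residual model; everything here is hypothetical.
        import numpy as np

        rng = np.random.default_rng(1)
        true_mass = 1.2e-10                    # perturber mass, solar masses

        def log_likelihood(m):
            if m <= 0:
                return -np.inf                 # masses must be positive
            resid = (m - true_mass) / 2e-11    # stand-in residual model
            return -0.5 * resid ** 2

        chain, m = [], 1e-10                   # starting guess
        for _ in range(20000):
            prop = m + rng.normal(0, 2e-11)    # random-walk proposal
            if np.log(rng.random()) < log_likelihood(prop) - log_likelihood(m):
                m = prop                       # accept; otherwise keep m
            chain.append(m)

        burned = np.array(chain[5000:])        # discard burn-in
        print(f"mass = {burned.mean():.3e} +/- {burned.std():.1e}")

    The posterior spread from such a chain is what lets the authors question linearized uncertainty estimates: no Gaussian approximation of the posterior is assumed.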

  7. Estimating Delays in ASICs

    Science.gov (United States)

    Burke, Gary; Nesheiwat, Jeffrey; Su, Ling

    1994-01-01

    Verification is an important aspect of the process of designing an application-specific integrated circuit (ASIC). A design must not only be functionally accurate, but must also maintain correct timing. IFA, the Intelligent Front Annotation program, assists in verifying the timing of an ASIC early in the design process. This program speeds the design-and-verification cycle by estimating delays before layouts are completed. It is written in the C language.

  8. Estimating the cost-per-result of a national reflexed Cryptococcal antigenaemia screening program: Forecasting the impact of potential HIV guideline changes and treatment goals.

    Science.gov (United States)

    Cassim, Naseem; Coetzee, Lindi Marie; Schnippel, Kathryn; Glencross, Deborah Kim

    2017-01-01

    During 2016, the National Health Laboratory Service (NHLS) introduced laboratory-based reflexed Cryptococcal antigen (CrAg) screening to detect early Cryptococcal disease in immunosuppressed HIV+ patients with a confirmed CD4 count of 100 cells/μl or less. The aim of this study was to assess the cost-per-result of a national screening program across different tiers of laboratory service, with variable daily CrAg test volumes. The impact of potential ART treatment guideline and treatment target changes on CrAg volumes, platform choice and laboratory workflow are considered. CD4 data (counts of 100 cells/μl or less) were used to project screening volumes, and cost-per-result was calculated for four scenarios, including the existing service status quo (Scenario-I) and three other settings (Scenarios II-IV) which were based on information from recent antiretroviral (ART) guidelines, District Health Information System (DHIS) data and UNAIDS 90/90/90 HIV/AIDS treatment targets. Scenario-II forecast CD4 testing offered only to new ART initiates recorded at DHIS. Scenario-III projected all patients notified as HIV+, but not yet on ART (recorded at DHIS), and Scenario-IV forecast CrAg screening in 90% of estimated HIV+ patients across South Africa (also DHIS). Stata was used to assess daily CrAg volumes at the 5th, 10th, 25th, 50th, 75th, 90th and 95th percentiles across 52 CD4 laboratories. Daily volumes were used to determine technical effort/operator staff costs (% full-time equivalent) and cost-per-result for all scenarios. Daily volumes ranged between 3 and 64 samples for Scenario-I at the 5th and 95th percentile. Similarly, daily volume ranges of 1-12, 2-45 and 5-100 CrAg-directed samples were noted for Scenarios II, III and IV respectively. A cut-off of 30 CrAg tests per day defined use of either the LFA or EIA platform. LFA cost-per-result ranged from $8.24 to $5.44 and EIA cost-per-result between $5.58 and $4.88 across the range of test volumes. The technical effort across scenarios ranged from 3.2-27.6% depending on test volumes and

  9. Estimating the net benefit of a specialized return-to-work program for workers on short-term disability related to a mental disorder: an example exploring investment in collaborative care.

    Science.gov (United States)

    Dewa, Carolyn S; Hoch, Jeffrey S

    2014-06-01

    This article estimates the net benefit for a company incorporating a collaborative care model into its return-to-work program for workers on short-term disability related to a mental disorder. Employing a simple decision model, the net benefit and uncertainty were explored. The breakeven point occurs when the average short-term disability episode is reduced by at least 7 days. In addition, 85% of the time, benefits could outweigh costs. Model results and sensitivity analyses indicate that organizational benefits can be greater than the costs of incorporating a collaborative care model into a return-to-work program for workers on short-term disability related to a mental disorder. The results also demonstrate how the probability of a program's effectiveness and the magnitude of its effectiveness are key factors that determine whether the benefits of a program outweigh its costs.
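
    The decision-model arithmetic can be made concrete in a few lines; every input below is a hypothetical placeholder, not a figure from the study.

        # Back-of-envelope net-benefit sketch for a return-to-work program.
        # All inputs are hypothetical placeholders.
        daily_cost = 300.0      # employer cost per disability day ($)
        days_saved = 7          # reduction in average episode length (days)
        p_effective = 0.85      # probability the program achieves the reduction
        episodes = 50           # short-term disability episodes per year
        program_cost = 90000.0  # annual cost of the collaborative care model

        expected_benefit = p_effective * days_saved * daily_cost * episodes
        print(f"expected net benefit: ${expected_benefit - program_cost:,.0f}")

    At these placeholder values the program sits essentially at breakeven, illustrating the paper's point that both the probability and the magnitude of the program's effectiveness determine whether benefits outweigh costs.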

  10. On the q-Weibull distribution for reliability applications: An adaptive hybrid artificial bee colony algorithm for parameter estimation

    International Nuclear Information System (INIS)

    Xu, Meng; Droguett, Enrique López; Lins, Isis Didier; Chagas Moura, Márcio das

    2017-01-01

    The q-Weibull model is based on the Tsallis non-extensive entropy and is able to model various behaviors of the hazard rate function, including bathtub curves, by using a single set of parameters. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because of its complicated parameter estimation. In this work, the parameters of the q-Weibull are estimated by the maximum likelihood (ML) method. Due to the intricate system of nonlinear equations, derivative-based optimization methods may fail to converge. Thus, the heuristic optimization method of artificial bee colony (ABC) is used instead. To deal with the slow convergence of ABC, an adaptive hybrid ABC (AHABC) algorithm is proposed that dynamically combines the Nelder-Mead simplex search method with ABC for the ML estimation of the q-Weibull parameters. Interval estimates for the q-Weibull parameters, including confidence intervals based on ML asymptotic theory and on bootstrap methods, are also developed. The AHABC is validated via numerical experiments involving the q-Weibull ML for reliability applications, and results show that it produces faster and more accurate convergence when compared to ABC and similar approaches. The estimation procedure is applied to real reliability failure data characterized by a bathtub-shaped hazard rate. - Highlights: • Development of an Adaptive Hybrid ABC (AHABC) algorithm for the q-Weibull distribution. • AHABC combines the local Nelder-Mead simplex method with ABC to enhance local search. • AHABC efficiently finds the optimal solution for the q-Weibull ML problem. • AHABC outperforms ABC and self-adaptive hybrid ABC in accuracy and convergence speed. • Useful model for reliability data with non-monotonic hazard rate.
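
    A minimal sketch of the q-Weibull maximum-likelihood objective follows, assuming the common q-exponential density form f(t) = (2-q)(beta/eta)(t/eta)^(beta-1) [1-(1-q)(t/eta)^beta]^(1/(1-q)). For illustration it is minimized with SciPy's plain Nelder-Mead; the paper's AHABC instead wraps this kind of simplex search inside an artificial-bee-colony global search. Data and starting values are synthetic.

        # q-Weibull negative log-likelihood, minimized with Nelder-Mead.
        import numpy as np
        from scipy.optimize import minimize

        def qweibull_nll(params, t):
            q, beta, eta = params
            if not (q < 2.0 and beta > 0 and eta > 0) or q == 1.0:
                return np.inf                   # outside the parameter domain
            x = (t / eta) ** beta
            base = 1.0 - (1.0 - q) * x          # argument of the q-exponential
            if np.any(base <= 0):
                return np.inf                   # outside the support
            logf = (np.log(2.0 - q) + np.log(beta / eta)
                    + (beta - 1.0) * np.log(t / eta)
                    + np.log(base) / (1.0 - q))
            return -np.sum(logf)

        rng = np.random.default_rng(2)
        t = rng.weibull(1.5, size=500) * 100.0  # synthetic failure times
        fit = minimize(qweibull_nll, x0=[1.1, 1.0, 80.0], args=(t,),
                       method="Nelder-Mead")
        print(fit.x)                            # estimated (q, beta, eta)

    The constrained, possibly multimodal shape of this likelihood is why the authors pair local simplex moves with a global ABC search.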

  11. Estimating electricity storage power rating and discharge duration for utility transmission and distribution deferral: a study for the DOE energy storage program.

    Energy Technology Data Exchange (ETDEWEB)

    Eyer, James M. (Distributed Utility Associates, Livermore, CA); Butler, Paul Charles; Iannucci, Joseph J., Jr. (,.Distributed Utility Associates, Livermore, CA)

    2005-11-01

    This report describes a methodology for estimating the power and energy capacities for electricity energy storage systems that can be used to defer costly upgrades to fully overloaded, or nearly overloaded, transmission and distribution (T&D) nodes. This "sizing" methodology may be used to estimate the amount of storage needed so that T&D upgrades may be deferred for one year. The same methodology can also be used to estimate the characteristics of storage needed for subsequent years of deferral.
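
    An illustrative, back-of-envelope version of the sizing logic, under assumed numbers (equipment rating, peak load, growth rate, and a synthetic peak-day load shape, none taken from the report): the storage power rating is the maximum projected overload, and the energy capacity is the overload integrated over the peak day.

        # Hypothetical one-year-deferral sizing calculation.
        import numpy as np

        capacity_mw = 19.0                 # T&D equipment rating (assumed)
        peak_mw = 19.5                     # this year's peak load (assumed)
        growth = 0.025                     # annual peak-load growth (assumed)

        # Hourly load on the projected peak day one year out (synthetic shape).
        hours = np.arange(24)
        load = (peak_mw * (1 + growth)) * (
            0.7 + 0.3 * np.exp(-((hours - 17) / 3.0) ** 2))

        overload = np.clip(load - capacity_mw, 0.0, None)
        power_mw = overload.max()          # required storage power rating
        energy_mwh = overload.sum()        # required energy (1-hour steps)
        print(f"power: {power_mw:.2f} MW, energy: {energy_mwh:.2f} MWh")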

  12. Estimation of measurement variances

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    In the previous two sessions, it was assumed that the measurement error variances were known quantities when the variances of the safeguards indices were calculated. These known quantities are actually estimates based on historical data and on data generated by the measurement program. Session 34 discusses how measurement error parameters are estimated for different situations. The various error types are considered. The purpose of the session is to enable participants to: (1) estimate systematic error variances from standard data; (2) estimate random error variances from replicate measurement data; and (3) perform a simple analysis of variances to characterize the measurement error structure when biases vary over time.

  13. Mead acid (20:3n-9) and n-3 polyunsaturated fatty acids are not associated with risk of posterior longitudinal ligament ossification: results of a case-control study.

    Science.gov (United States)

    Hamazaki, Kei; Kawaguchi, Yoshiharu; Nakano, Masato; Yasuda, Taketoshi; Seki, Shoji; Hori, Takeshi; Hamazaki, Tomohito; Kimura, Tomoatsu

    2015-05-01

    Ossification of the posterior longitudinal ligament (OPLL) involves the replacement of ligamentous tissue with ectopic bone. Although genetics and heritability appear to be involved in the development of OPLL, its pathogenesis remains to be elucidated. Given previous findings that 5,8,11-eicosatrienoic acid [20:3n-9, Mead acid (MA)] has depressive effects on osteoblastic activity and anti-angiogenic effects, and that n-3 polyunsaturated fatty acids (PUFAs) have a preventive effect on heterotopic ossification, we hypothesized that both fatty acids would be involved in OPLL development. To examine the biological significance of these and other fatty acids in OPLL, we conducted this case-control study involving 106 patients with cervical OPLL and 109 age-matched controls. Fatty acid composition was determined from plasma samples by gas chromatography. Associations between fatty acid levels and incident OPLL were evaluated by logistic regression. Contrary to our expectations, we found no significant differences between patients and controls in the levels of MA or n-3 PUFAs (e.g., eicosapentaenoic acid and docosahexaenoic acid). Logistic regression analysis did not reveal any associations with OPLL risk for MA or n-3 PUFAs. In conclusion, no potential role was found for MA or n-3 PUFAs in ectopic bone formation in the spinal canal. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. National Coral Reef Monitoring Program: Stratified Random Surveys (StRS) of Reef Fish, including Benthic Estimate Data of the Mariana Archipelago in 2014 (NCEI Accession 0157596)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Data provided in this data set were collected as part of the NOAA Pacific Islands Fisheries Science Center (PIFSC), Coral Reef Ecosystem Program (CREP) led NCRMP...

  15. Parameter Estimation

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Heitzig, Martina; Cameron, Ian

    2011-01-01

    In this chapter the importance of parameter estimation in model development is illustrated through various applications related to reaction systems. In particular, rate constants in a reaction system are obtained through parameter estimation methods. These approaches often require the application of optimisation techniques coupled with dynamic solution of the underlying model. Linear and nonlinear approaches to parameter estimation are investigated. There is also the application of maximum likelihood principles in the estimation of parameters, as well as the use of orthogonal collocation to generate a set of algebraic equations as the basis for parameter estimation. These approaches are illustrated using estimations of kinetic constants from reaction system models.

  16. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    Science.gov (United States)

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and
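
    The estimator itself is compact enough to sketch directly: the slope is the median of all pairwise slopes, and the intercept forces the line through the medians of x and y. (KTRLine adds multisegment models and the nonparametric regression statistics listed above.)

        # Minimal Kendall-Theil robust line.
        import numpy as np

        def kendall_theil(x, y):
            slopes = [(y[j] - y[i]) / (x[j] - x[i])
                      for i in range(len(x)) for j in range(i + 1, len(x))
                      if x[j] != x[i]]
            slope = np.median(slopes)                  # median pairwise slope
            intercept = np.median(y) - slope * np.median(x)  # through medians
            return slope, intercept

        rng = np.random.default_rng(3)
        x = rng.uniform(0, 10, 50)
        y = 2.5 * x + 1.0 + rng.normal(0, 1, 50)
        y[:3] += 25        # gross outliers barely move the median-based fit
        print(kendall_theil(x, y))

    For single-segment fits, scipy.stats.theilslopes computes the same kind of median slope, with confidence limits, directly.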

  17. Estimating aboveground forest biomass carbon and fire consumption in the U.S. Utah High Plateaus using data from the Forest Inventory and Analysis program, Landsat, and LANDFIRE

    Science.gov (United States)

    Chen, Xuexia; Liu, Shuguang; Zhu, Zhiliang; Vogelmann, James E.; Li, Zhengpeng; Ohlen, Donald O.

    2011-01-01

    The concentrations of CO2 and other greenhouse gases in the atmosphere have been increasing and greatly affecting global climate and socio-economic systems. Actively growing forests are generally considered to be a major carbon sink, but forest wildfires lead to large releases of biomass carbon into the atmosphere. Aboveground forest biomass carbon (AFBC), an important ecological indicator, and fire-induced carbon emissions at regional scales are highly relevant to forest sustainable management and climate change. It is challenging to accurately estimate the spatial distribution of AFBC across large areas because of the spatial heterogeneity of forest cover types and canopy structure. In this study, Forest Inventory and Analysis (FIA) data, Landsat, and Landscape Fire and Resource Management Planning Tools Project (LANDFIRE) data were integrated in a regression tree model for estimating AFBC at a 30-m resolution in the Utah High Plateaus. AFBC was calculated from 225 FIA field plots and used as the dependent variable in the model. Of these plots, 10% were held out for model evaluation with stratified random sampling, and the other 90% were used as training data to develop the regression tree model. Independent variable layers included Landsat imagery and the derived spectral indicators, digital elevation model (DEM) data and derivatives, biophysical gradient data, existing vegetation cover type and vegetation structure. The cross-validation correlation coefficient (r value) was 0.81 for the training model. Independent validation using withheld plot data was similar, with an r value of 0.82. This validated regression tree model was applied to map AFBC in the Utah High Plateaus and then combined with burn severity information to estimate loss of AFBC in the Longston fire of Zion National Park in 2001. The final dataset represented 24 forest cover types for a 4 million ha forested area. We estimated a total of 353 Tg AFBC with an average of 87 MgC/ha in the Utah High Plateaus.
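
    A compact sketch of the fit-and-holdout workflow described above, with scikit-learn's regression tree and entirely synthetic "plot" data standing in for the FIA/Landsat/LANDFIRE stack:

        # Regression-tree AFBC model with a 10% holdout for evaluation.
        # Predictors and responses are synthetic stand-ins.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(10)
        n = 225                                          # FIA-like plot count
        X = np.column_stack([
            rng.uniform(0, 1, n),                        # spectral index
            rng.uniform(1500, 3400, n)])                 # elevation (m)
        afbc = 120 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 8, n)

        Xtr, Xte, ytr, yte = train_test_split(X, afbc, test_size=0.1,
                                              random_state=0)
        model = DecisionTreeRegressor(min_samples_leaf=5).fit(Xtr, ytr)
        print("holdout r:", np.corrcoef(model.predict(Xte), yte)[0, 1])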

  18. Department of the Navy FY 1990/FY 1991 Biennial Budget Estimates. Military Construction and Family Housing Program FY 1990. Justification Data Submitted to Congress

    Science.gov (United States)

    1989-01-01

    properly configured for optimum space use, are inadequate, overcrowded, and cannot accommodate all the children who need child care. Commercial child... F6-4 Domestic Leasing Fiscal Year Summary: FY 1988 - The domestic leasing program consists of 1,324 units requiring funding of $9,551.0. Funding in

  19. Comparison of Gene Expression Programming, Nonlinear Time Series and Artificial Neural Network in Estimating the River Daily Flow (Case Study: The Karun River)

    Directory of Open Access Journals (Sweden)

    R. Zamani

    2015-06-01

    Full Text Available Today, forecasting the daily flow of rivers is an important issue in hydrology and water resources, and the results of daily river flow modeling can be used in water resources management and in monitoring droughts and floods. In this study, given the importance of this issue, daily flow at the Armand hydrometric station on the Karun River was modeled over the interval 1981-2012 using nonlinear time series models and artificial intelligence methods (Artificial Neural Network and Gene Expression Programming). The basin upstream of the Armand station is one of the most important basins in the North Karun basin and includes four sub-basins (Vanak, Middle Karun, Beheshtabad and Kohrang). The results of this study show that the artificial intelligence models are superior to the nonlinear time series models in simulating daily flow in the Karun River. Moreover, comparison of the artificial intelligence models showed that Gene Expression Programming scored better on the evaluation criteria than the artificial neural network.

  1. On Continuous Distributions and Parameter Estimation in Probabilistic Logic Programs (Over continue verdelingen en het schatten van parameters in probabilistische logische programma's)

    OpenAIRE

    Gutmann, Bernd

    2011-01-01

    In the last decade remarkable progress has been made on combining statistical machine learning techniques, reasoning under uncertainty, and relational representations. The branch of Artificial Intelligence working on the synthesis of these three areas is known as statistical relational learning or probabilistic logic learning.ProbLog, one of the probabilistic frameworks developed, is an extension of the logic programming language Prolog with independent random variables that are defined by an...

  2. Models of Community-Based Hepatitis B Surface Antigen Screening Programs in the U.S. and Their Estimated Outcomes and Costs

    Science.gov (United States)

    Rein, David B.; Lesesne, Sarah B.; Smith, Bryce D.; Weinbaum, Cindy M.

    2011-01-01

    Objectives Information on the process and method of service delivery is sparse for hepatitis B surface antigen (HBsAg) testing, and no systematic study has evaluated the relative effectiveness or cost-effectiveness of different HBsAg screening models. To address this need, we compared five specific community-based screening programs. Methods We funded five HBsAg screening programs to collect information on their design, costs, and outcomes of participants during a six-month observation period. We categorized programs into four types of models. For each model, we calculated the number screened, the number screened as per Centers for Disease Control and Prevention (CDC) recommendations, and the cost per screening. Results The models varied by cost per person screened and total number of people screened, but they did not differ meaningfully in the proportion of people screened following CDC recommendations, the proportion of those screened who tested positive, or the proportion of those who newly tested positive. Conclusions Integrating screening into outpatient service settings is the most cost-effective method but may not reach all people needing to be screened. Future research should examine cost-effective methods that expand the reach of screening into communities in outpatient settings. PMID:21800750

  3. Global parameter estimation for thermodynamic models of transcriptional regulation.

    Science.gov (United States)

    Suleimenov, Yerzhan; Ay, Ahmet; Samee, Md Abul Hassan; Dresch, Jacqueline M; Sinha, Saurabh; Arnosti, David N

    2013-07-15

    Deciphering the mechanisms involved in gene regulation holds the key to understanding the control of central biological processes, including human disease, population variation, and the evolution of morphological innovations. New experimental techniques including whole genome sequencing and transcriptome analysis have enabled comprehensive modeling approaches to study gene regulation. In many cases, it is useful to be able to assign biological significance to the inferred model parameters, but such interpretation should take into account features that affect these parameters, including model construction and sensitivity, the type of fitness calculation, and the effectiveness of parameter estimation. This last point is often neglected, as estimation methods are often selected for historical reasons or for computational ease. Here, we compare the performance of two parameter estimation techniques broadly representative of local and global approaches, namely, a quasi-Newton/Nelder-Mead simplex (QN/NMS) method and a covariance matrix adaptation-evolutionary strategy (CMA-ES) method. The estimation methods were applied to a set of thermodynamic models of gene transcription applied to regulatory elements active in the Drosophila embryo. Measuring overall fit, the global CMA-ES method performed significantly better than the local QN/NMS method on high quality data sets, but this difference was negligible on lower quality data sets with increased noise or on data sets simplified by stringent thresholding. Our results suggest that the choice of parameter estimation technique for evaluation of gene expression models depends both on the quality of the data, the nature of the models, and the aims of the modeling effort. Copyright © 2013 Elsevier Inc. All rights reserved.
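
    To make the local-versus-global contrast concrete, here is a toy comparison on a deliberately multimodal objective, with SciPy's Nelder-Mead as the local method and differential evolution standing in for CMA-ES (which SciPy does not provide); the Rastrigin function plays the role of a rugged model-fitting landscape.

        # Local simplex search vs. a global evolutionary search.
        import numpy as np
        from scipy.optimize import minimize, differential_evolution

        def rastrigin(p):
            p = np.asarray(p)
            return 10 * p.size + np.sum(p ** 2 - 10 * np.cos(2 * np.pi * p))

        x0 = np.array([3.1, -2.7])                 # a deliberately poor start
        local = minimize(rastrigin, x0, method="Nelder-Mead")
        glob = differential_evolution(rastrigin,
                                      bounds=[(-5.12, 5.12)] * 2, seed=4)

        print("Nelder-Mead:", local.fun)  # typically a nearby local minimum
        print("global     :", glob.fun)   # ~0, the true optimum at (0, 0)

    This mirrors the paper's finding in caricature: the global method's advantage is largest when the fitness landscape is rugged.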

  4. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    Science.gov (United States)

    Bradley, D. Nathan

    2012-01-01

    The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data
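
    For orientation, the hydraulic core of the slope-area method is Manning's equation; a one-cross-section sketch in U.S. customary units with hypothetical survey values follows (SAC's full computation balances energy across several cross sections).

        # Manning's equation peak-discharge estimate for one cross section.
        def manning_discharge(area_ft2, wetted_perimeter_ft, slope, n):
            hydraulic_radius = area_ft2 / wetted_perimeter_ft
            return (1.486 / n) * area_ft2 * hydraulic_radius ** (2.0 / 3.0) \
                   * slope ** 0.5

        # HWM-derived water-surface slope, surveyed geometry, and estimated
        # roughness -- all hypothetical values:
        print(f"{manning_discharge(850.0, 120.0, 0.002, 0.035):,.0f} cfs")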

  5. Estimating the intra-cluster correlation coefficient for evaluating an educational intervention program to improve rabies awareness and dog bite prevention among children in Sikkim, India: A pilot study.

    Science.gov (United States)

    Auplish, Aashima; Clarke, Alison S; Van Zanten, Trent; Abel, Kate; Tham, Charmaine; Bhutia, Thinlay N; Wilks, Colin R; Stevenson, Mark A; Firestone, Simon M

    2017-05-01

    Educational initiatives targeting at-risk populations have long been recognized as a mainstay of ongoing rabies control efforts. Cluster-based studies are often utilized to assess levels of knowledge, attitudes and practices of a population in response to education campaigns. The design of cluster-based studies requires estimates of intra-cluster correlation coefficients obtained from previous studies. This study estimates the school-level intra-cluster correlation coefficient (ICC) for rabies knowledge change following an educational intervention program. A cross-sectional survey was conducted with 226 students from 7 schools in Sikkim, India, using cluster sampling. In order to assess knowledge uptake, rabies education sessions with pre- and post-session questionnaires were administered. Paired differences of proportions were estimated for questions answered correctly. A mixed effects logistic regression model was developed to estimate school-level and student-level ICCs and to test for associations between gender, age, school location and educational level. The school- and student-level ICCs for rabies knowledge and awareness were 0.04 (95% CI: 0.01, 0.19) and 0.05 (95% CI: 0.02, 0.09), respectively. These ICCs suggest that design effect multipliers of 5.45 schools and 1.05 students per school will be required when estimating sample sizes and designing future cluster randomized trials. There was a good baseline level of rabies knowledge (mean pre-session score 71%); however, key knowledge gaps were identified in understanding appropriate behavior around scared dogs, potential sources of rabies and how to correctly order post rabies exposure precaution steps. After adjusting for the effect of gender, age, school location and education level, school and individual post-session test scores improved by 19%, with similar performance amongst boys and girls attending schools in urban and rural regions. The proportion of participants that were able to correctly order post
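
    For planning context, the standard link between an ICC and the required inflation of a cluster sample is the design effect; for m observations per cluster and intra-cluster correlation \rho,

        \[ \mathrm{DE} = 1 + (m - 1)\,\rho . \]

    For example, the school-level \rho = 0.04 reported above with roughly 32 students per school (226 students across 7 schools) gives DE \approx 1 + 31(0.04) \approx 2.2, so the effective sample size is the nominal n divided by about 2.2. This is only the generic relationship; the abstract's quoted multipliers come from the authors' own design calculations.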

  6. Longitudinal change in estimated GFR among CKD patients: A 10-year follow-up study of an integrated kidney disease care program in Taiwan.

    Directory of Open Access Journals (Sweden)

    Ching-Wei Tsai

    Full Text Available This study examined the progression of chronic kidney disease (CKD) using the average annual decline in estimated GFR (eGFR) and its risk factors in a 10-year follow-up CKD cohort. In a prospective, observational cohort study, 4600 individuals who fulfilled the definition of CKD, with or without proteinuria, were followed for 10 years. The eGFR was estimated by the MDRD equation. Linear regression was used to estimate participants' annual decline rate in eGFR. We defined subjects with an annual eGFR decline rate <1 ml/min/1.73 m2 as non-progression and a decline rate over 3 ml/min/1.73 m2 as rapid progression. During the follow-up period, 2870 (62.4%) individuals had an annual eGFR decline rate greater than 1 ml/min/1.73 m2. The eGFR decline rate was slower in individuals with CKD diagnosed over the age of 60 years than in those with onset at a younger age. Compared to subjects with a decline rate <1 ml/min/1.73 m2/year, the odds ratio (OR) of developing rapid CKD progression for diabetes, proteinuria and late onset of CKD was 1.72 (95% CI: 1.48-2.00), 1.89 (1.63-2.20) and 0.68 (0.56-0.81), respectively. When the model was adjusted for the latest CKD stage, compared to those with CKD stage 1, patients with stage 4 and stage 5 had significantly higher risks of rapid progression (OR 5.17 (2.60-10.25) and 19.83 (10.05-39.10), respectively). However, such risk was not observed among patients with the latest CKD stage 2 and 3. The risk of incident ESRD was 17% higher for each 1 ml/min/1.73 m2 increase in annual decline rate. Not everyone with CKD develops ESRD after a 10-year follow-up. The absolute annual eGFR decline rate can help clinicians to better predict the progression of CKD. Individuals with a renal function decline rate over 3 ml/min/1.73 m2/year require intensive CKD care.

  7. Signal detection theory and vestibular perception: III. Estimating unbiased fit parameters for psychometric functions.

    Science.gov (United States)

    Chaudhuri, Shomesh E; Merfeld, Daniel M

    2013-03-01

    Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
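
    A minimal sketch of the kind of parametric maximum-likelihood fit discussed above: a cumulative Gaussian psychometric function fitted to synthetic binary responses with the Nelder-Mead simplex. The paper's bias-reduction technique modifies this likelihood; only the plain ML baseline is shown here.

        # Cumulative-Gaussian psychometric function fitted by ML.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(5)
        stim = rng.uniform(-4, 4, 300)                # stimulus intensities
        p_true = norm.cdf(stim, loc=0.5, scale=1.5)   # true bias and spread
        resp = rng.random(300) < p_true               # binary responses

        def nll(params):
            mu, sigma = params
            if sigma <= 0:
                return np.inf
            p = np.clip(norm.cdf(stim, mu, sigma), 1e-9, 1 - 1e-9)
            return -np.sum(resp * np.log(p) + (~resp) * np.log1p(-p))

        fit = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
        print(fit.x)     # recovered (mu, sigma); sigma is the "spread"

    With adaptively sampled (staircase) stimuli rather than the uniform stimuli above, this plain ML fit is where the spread-parameter bias the authors describe appears.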

  8. Installation restoration program: Hydrologic measurements with an estimated hydrologic budget for the Joliet Army Ammunition Plant, Joliet, Illinois. [Contains maps of monitoring well locations, topography and hydrologic basins

    Energy Technology Data Exchange (ETDEWEB)

    Diodato, D.M.; Cho, H.E.; Sundell, R.C.

    1991-07-01

    Hydrologic data were gathered from the 36.8-mi² Joliet Army Ammunition Plant (JAAP) located in Joliet, Illinois. Surface water levels were measured continuously, and groundwater levels were measured monthly. The resulting information was entered into a database that could be used as part of numerical flow model validation for the site. Deep sandstone aquifers supply much of the water in the JAAP region. These aquifers are successively overlain by confining shales and a dolomite aquifer of Silurian age. This last unit is unconformably overlain by Pleistocene glacial tills and outwash sand and gravel. Groundwater levels in the shallow glacial system fluctuate widely, with one well completed in an upland fluctuating more than 17 ft during the study period. The response to groundwater recharge in the underlying Silurian dolomite is slower. In the upland recharge areas, increased groundwater levels were observed; in the lowland discharge areas, groundwater levels decreased during the study period. The decreases are postulated to be a lag effect related to a 1988 drought. These observations show that fluid at the JAAP is not steady-state, either on a monthly or an annual basis. Hydrologic budgets were estimated for the two principal surface water basins at the JAAP site. These basins account for 70% of the facility's total land area. Meteorological data collected at a nearby dam show that total measured precipitation was 31.45 in. and total calculated evapotranspiration was 23.09 in. for the study period. The change in surface water storage was assumed to be zero for the annual budget for each basin. The change in groundwater storage was calculated to be 0.12 in. for the Grant Creek basin and 0.26 in. for the Prairie Creek basin. Runoff was 7.02 in. and 7.51 in. for the Grant Creek and Prairie Creek basins, respectively. The underflow to the deep hydrogeologic system in the Grant Creek basin was calculated to be negligible. 12 refs., 17 figs., 15 tabs.
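
    The annual figures above close a standard water-balance identity; with precipitation P, evapotranspiration ET, runoff R, groundwater-storage change \Delta S_{gw}, underflow U, and a closure residual \varepsilon,

        \[ P = ET + R + \Delta S_{gw} + U + \varepsilon . \]

    Substituting the reported Grant Creek values (P = 31.45 in., ET = 23.09 in., R = 7.02 in., \Delta S_{gw} = 0.12 in., U \approx 0) leaves \varepsilon \approx 1.22 in. unaccounted for by the reported terms.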

  9. Estimating the Costs of Preventive Interventions

    Science.gov (United States)

    Foster, E. Michael; Porter, Michele M.; Ayers, Tim S.; Kaplan, Debra L.; Sandler, Irwin

    2007-01-01

    The goal of this article is to improve the practice and reporting of cost estimates of prevention programs. It reviews the steps in estimating the costs of an intervention and the principles that should guide estimation. The authors then review prior efforts to estimate intervention costs using a sample of well-known but diverse studies. Finally,…

  10. Evaluating the Impact of Zimbabwe’s Prevention of Mother-to-Child HIV Transmission Program: Population-Level Estimates of HIV-Free Infant Survival Pre-Option A

    Science.gov (United States)

    Buzdugan, Raluca; McCoy, Sandra I.; Watadzaushe, Constancia; Kang Dufour, Mi-Suk; Petersen, Maya; Dirawo, Jeffrey; Mushavi, Angela; Mujuru, Hilda Angela; Mahomva, Agnes; Musarandega, Reuben; Hakobyan, Anna; Mugurungi, Owen; Cowan, Frances M.; Padian, Nancy S.

    2015-01-01

    Objective We estimated HIV-free infant survival and mother-to-child HIV transmission (MTCT) rates in Zimbabwe, some of the first community-based estimates from a UNAIDS priority country. Methods In 2012 we surveyed mother-infant pairs residing in the catchment areas of 157 health facilities randomly selected from 5 of 10 provinces in Zimbabwe. Enrolled infants were born 9–18 months before the survey. We collected questionnaires, blood samples for HIV testing, and verbal autopsies for deceased mothers/infants. Estimates were assessed among i) all HIV-exposed infants, as part of an impact evaluation of Option A of the 2010 WHO guidelines (rolled out in Zimbabwe in 2011), and ii) the subgroup of infants unexposed to Option A. We compared province-level MTCT rates measured among women in the community with MTCT rates measured using program monitoring data from facilities serving those communities. Findings Among 8568 women with known HIV serostatus, 1107 (12.9%) were HIV-infected. Among all HIV-exposed infants, HIV-free infant survival was 90.9% (95% confidence interval (CI): 88.7–92.7) and MTCT was 8.8% (95% CI: 6.9–11.1). Sixty-six percent of HIV-exposed infants were still breastfeeding. Among the 762 infants born before Option A was implemented, 90.5% (95% CI: 88.1–92.5) were alive and HIV-uninfected at 9–18 months of age, and 9.1% (95%CI: 7.1–11.7) were HIV-infected. In four provinces, the community-based MTCT rate was higher than the facility-based MTCT rate. In Harare, the community and facility-based rates were 6.0% and 9.1%, respectively. Conclusion By 2012 Zimbabwe had made substantial progress towards the elimination of MTCT. Our HIV-free infant survival and MTCT estimates capture HIV transmissions during pregnancy, delivery and breastfeeding regardless of whether or not mothers accessed health services. These estimates also provide a baseline against which to measure the impact of Option A guidelines (and subsequently Option B+). PMID:26248197

  11. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Directory of Open Access Journals (Sweden)

    Afnizanfaizal Abdullah

    Full Text Available The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.

  12. Comparison of Four Different Energy Balance Models for Estimating Evapotranspiration in the Midwestern United States

    Directory of Open Access Journals (Sweden)

    Ramesh K. Singh

    2015-12-01

    Full Text Available The development of different energy balance models has allowed users to choose a model based on its suitability in a region. We compared four commonly used models—the Mapping EvapoTranspiration at high Resolution with Internalized Calibration (METRIC) model, the Surface Energy Balance Algorithm for Land (SEBAL) model, the Surface Energy Balance System (SEBS) model, and the Operational Simplified Surface Energy Balance (SSEBop) model—using Landsat images to estimate evapotranspiration (ET) in the Midwestern United States. Validation of the models using three AmeriFlux cropland sites at Mead, Nebraska, showed that all four models captured the spatial and temporal variation of ET reasonably well, with an R2 of more than 0.81. Both the METRIC and SSEBop models showed a low root mean square error (<0.93 mm·day−1) and a high Nash–Sutcliffe coefficient of efficiency (>0.80), whereas the SEBAL and SEBS models resulted in relatively higher bias for estimating daily ET. The empirical equation of daily average net radiation used in the SEBAL and SEBS models for upscaling instantaneous ET to daily ET resulted in underestimation of daily ET, particularly when the daily average net radiation was more than 100 W·m−2. Estimated daily ET for both cropland and grassland had some degree of linearity with METRIC, SEBAL, and SEBS, but linearity was stronger for evaporative fraction. Thus, these ET models have strengths and limitations for applications in water resource management.

  13. Load estimation and assessment of land-based pollution for Quanzhou Bay and their relevance to the Total Quantity Control of Pollutants Discharged into the Sea (TQCPS) Program in China

    Science.gov (United States)

    Zhao, W. L.; Yang, S. Y.; Wang, J.; Xiao, J. M.; Lu, X. X.; Lin, J.; Huang, P.; Cai, M. G.

    2015-12-01

    The Total Quantity Control of Pollutants Discharged into the Sea (TQCPS) Program belongs to the Public Science and Technology Research Funds Projects of Ocean in China and was launched in 2008. As one of the most important and typical demonstration cases of the TQCPS Program, a full investigation of the land-based pollution discharged around Quanzhou Bay, China, was carried out, and the total inputs of three main environmental factors (NH3-N, TP, COD) were estimated and quantified for 2008 and 2012, respectively. Combined with the trend of seawater quality changes in Quanzhou Bay over the same periods, the effects of the program's implementation were then evaluated. On the whole, using the basic survey data and the export coefficient method, the total amounts of NH3-N, TP and COD discharged into the bay were estimated to be approximately 888.3, 130.6 and 14527.4 t/a in 2008, and 1518.6, 558.8 and 19986.7 t/a in 2012, respectively, where the percentage of the discharge from domestic sources (46.5% in 2008 and 45.2% in 2012) was generally higher than that from the other sources. Based on the characteristics of geography and administrative division, the land areas around the bay were divided into three parts: the south coast region (SCR), the west coast region (WCR), and the north coast region (NCR). The SCR and WCR accounted for 59.2 and 35.4% of the COD loads, and 49.2 and 48.0% of the NH3-N loads in 2008. The NCR contributed less of the industrial pollution, but most of the domestic pollution (54.1%), followed by 26.2% in the SCR in 2012. The contributions of the discharge from different land areas to the pollution of Quanzhou Bay were found to differ between 2008 and 2012. Due to the difference in the levels of economic development among these three areas, the discharge of pollutants from the north coast was much lower than that from the other two parts in 2008; however, following our suggestion of the moderation and optimization of the industrial distribution and the sewage

  14. Estimation of energy and macronutrient intake at home and in the kindergarten programs in preschool children

    Directory of Open Access Journals (Sweden)

    Juliana Rombaldi Bernardi

    2010-02-01

    Full Text Available OBJECTIVE: To estimate the energy and macronutrient intake at home and at all-day kindergarten programs in children aged 2 to 6 and to investigate differences in consumption and intake between children at public and private kindergartens. METHODS: This was a cross-sectional study of 362 preschool children from Caxias do Sul, Brazil. Nutritional status was assessed in terms of weight-to-height ratios. Foods consumed in the kindergarten were evaluated by weighing the actual foods eaten by the children, and home intakes were calculated from a food diary kept by parents or guardians. Statistical analyses were performed using the Mann-Whitney U test (p < 0.05). RESULTS: It was found that 28 children (7.7%) were overweight, 92 (25.4%) were at risk of becoming overweight and seven (1.9%) were classified as having wasting. Analysis of 24-hour nutritional intake demonstrated that 51.3% of the energy, 60.3% of the lipids and 51.6% of the proteins consumed by children were eaten at home, despite the children spending the whole day in the kindergarten programs. Preschool children at private kindergartens ate greater quantities of energy (p = 0.001), carbohydrates (p < 0.001), and lipids (p = 0.04) than did children at public kindergartens, but their total daily intakes were similar, irrespective of which type of

  15. Programs to estimate emissions from gasoline vehicles

    Directory of Open Access Journals (Sweden)

    José Ignacio Huertas

    2010-01-01

    Full Text Available Currently, most emission inventories for large urban centers are based on the emission factors recommended by the US EPA. Seeking to use this agency's MOBILE tool, environmental authorities concentrate their efforts on modifying those emission factors to take into account the particular characteristics of each city. However, there remains a need for a methodology, based on experimental data, that improves the precision of the emission inventories produced for large urban centers. To address this need, the present work proposes a methodology based on the information collected in inspection and maintenance (I/M) programs, in which 100% of the vehicle fleet is evaluated using the ASM 5015 and ASM 2525 test protocols or similar. Experimental work was carried out on a test track and on a chassis dynamometer to explore the possibility of implementing the proposed methodology for a vehicle operating under steady-state conditions. The experimental results show that the proposed methodology has the potential to be implemented for the constant-speed case.

  16. Estimating Utility

    DEFF Research Database (Denmark)

    Arndt, Channing; Simler, Kenneth R.

    2010-01-01

    A fundamental premise of absolute poverty lines is that they represent the same level of utility through time and space. Disturbingly, a series of recent studies in middle- and low-income economies show that even carefully derived poverty lines rarely satisfy this premise. This article proposes an information-theoretic approach to estimating cost-of-basic-needs (CBN) poverty lines that are utility consistent. Applications to date illustrate that utility-consistent poverty measurements derived from the proposed approach and those derived from current CBN best practices often differ substantially, with the current approach tending to systematically overestimate (underestimate) poverty in urban (rural) zones.

  17. A practical algorithm for distribution state estimation including renewable energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher [Electronic and Electrical Department, Shiraz University of Technology, Modares Blvd., P.O. 71555-313, Shiraz (Iran); Firouzi, Bahman Bahmani [Islamic Azad University Marvdasht Branch, Marvdasht (Iran)

    2009-11-15

    Renewable energy is energy that is in continuous supply over time. These kinds of energy sources are divided into five principal renewable sources of energy: the sun, the wind, flowing water, biomass and heat from within the earth. According to some studies carried out by the research institutes, about 25% of the new generation will be generated by Renewable Energy Sources (RESs) in the near future. Therefore, it is necessary to study the impact of RESs on the power systems, especially on the distribution networks. This paper presents a practical Distribution State Estimation (DSE) including RESs and some practical consideration. The proposed algorithm is based on the combination of Nelder-Mead simplex search and Particle Swarm Optimization (PSO) algorithms, called PSO-NM. The proposed algorithm can estimate load and RES output values by Weighted Least-Square (WLS) approach. Some practical considerations are var compensators, Voltage Regulators (VRs), Under Load Tap Changer (ULTC) transformer modeling, which usually have nonlinear and discrete characteristics, and unbalanced three-phase power flow equations. The comparison results with other evolutionary optimization algorithms such as original PSO, Honey Bee Mating Optimization (HBMO), Neural Networks (NNs), Ant Colony Optimization (ACO), and Genetic Algorithm (GA) for a test system demonstrate that PSO-NM is extremely effective and efficient for the DSE problems. (author)
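
    A minimal sketch of the PSO-plus-Nelder-Mead idea follows, assuming a toy two-state weighted-least-squares (WLS) objective as a stand-in for the paper's DSE formulation: a short particle swarm searches globally, then SciPy's simplex polishes the swarm's best point.

        # PSO global search followed by a Nelder-Mead polish on a WLS objective.
        import numpy as np
        from scipy.optimize import minimize

        z = np.array([1.02, 0.98, 2.05])      # "measurements" (hypothetical)
        w = np.array([100.0, 100.0, 50.0])    # measurement weights

        def h(x):                             # measurement model h(x)
            return np.array([x[0], x[0] * x[1], x[0] + x[1]])

        def wls(x):
            r = z - h(x)
            return np.sum(w * r * r)

        rng = np.random.default_rng(6)
        n, dim = 30, 2
        pos = rng.uniform(0, 2, (n, dim)); vel = np.zeros((n, dim))
        pbest = pos.copy(); pval = np.array([wls(p) for p in pos])
        for _ in range(100):
            g = pbest[pval.argmin()]          # swarm's best point so far
            vel = (0.7 * vel
                   + 1.5 * rng.random((n, dim)) * (pbest - pos)
                   + 1.5 * rng.random((n, dim)) * (g - pos))
            pos = pos + vel
            val = np.array([wls(p) for p in pos])
            better = val < pval
            pbest[better], pval[better] = pos[better], val[better]

        fit = minimize(wls, pbest[pval.argmin()], method="Nelder-Mead")
        print(fit.x, fit.fun)                 # polished state estimate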

  18. Small Area Income and Poverty Estimates (SAIPE): 2010 Highlights

    Science.gov (United States)

    US Census Bureau, 2011

    2011-01-01

    This document presents 2010 data from the Small Area Income and Poverty Estimates (SAIPE) program of the U.S. Census Bureau. The SAIPE program produces poverty estimates for the total population and median household income estimates annually for all counties and states. SAIPE data also produces single-year poverty estimates for the school-age…

  19. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
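
    A miniature version of the book's numerical-experiment idea: artificial series with a known trend are generated repeatedly, and a trend estimator (a centered moving average, chosen arbitrarily here) is scored against the truth.

        # Monte Carlo scoring of a trend estimator on artificial time series.
        import numpy as np

        rng = np.random.default_rng(8)
        n, runs, win = 500, 200, 41
        t = np.linspace(0, 1, n)
        trend = 3 * t ** 2 - t                   # known trend (arbitrary)

        def moving_average(y, w):
            pad = w // 2
            ypad = np.pad(y, pad, mode="edge")   # keep output length fixed
            return np.convolve(ypad, np.ones(w) / w, mode="valid")

        rmse = []
        for _ in range(runs):
            y = trend + rng.normal(0, 0.3, n)    # one artificial realization
            est = moving_average(y, win)
            rmse.append(np.sqrt(np.mean((est - trend) ** 2)))
        print(f"mean RMSE over {runs} runs: {np.mean(rmse):.4f}")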

  1. Estimating ISABELLE shielding requirements

    International Nuclear Information System (INIS)

    Stevens, A.J.; Thorndike, A.M.

    1976-01-01

    Estimates were made of the shielding thicknesses required at various points around the ISABELLE ring. Both hadron and muon requirements are considered. Radiation levels at the outside of the shield and at the BNL site boundary are kept at or below 1000 mrem/year and 5 mrem/year, respectively. Muon requirements are based on the Wang formula for pion spectra, and the hadron requirements on the hadron cascade program CYLKAZ of Ranft. A muon shield thickness of 77 meters of sand is indicated outside the ring in one area, and hadron shields equivalent to from 2.7 to 5.6 meters in thickness of sand above the ring. The suggested safety allowance would increase these values to 86 meters and 4.0 to 7.2 meters, respectively. There are many uncertainties in such estimates, but these last figures are considered to be rather conservative.

  2. An improved estimation and focusing scheme for vector velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Munk, Peter

    1999-01-01

    to reduce spatial velocity dispersion. Examples of different velocity vector conditions are shown using the Field II simulation program. A relative accuracy of 10.1% is obtained for the lateral velocity estimates for a parabolic velocity profile for a flow perpendicular to the ultrasound beam and a signal

  3. SPOT Program

    Science.gov (United States)

    Smith, Jason T.; Welsh, Sam J.; Farinetti, Antonio L.; Wegner, Tim; Blakeslee, James; Deboeck, Toni F.; Dyer, Daniel; Corley, Bryan M.; Ollivierre, Jarmaine; Kramer, Leonard

    2010-01-01

    A Spacecraft Position Optimal Tracking (SPOT) program was developed to process Global Positioning System (GPS) data, sent via telemetry from a spacecraft, to generate accurate navigation estimates of the vehicle position and velocity (state vector) using a Kalman filter. This program uses the GPS onboard receiver measurements to sequentially calculate the vehicle state vectors and provide this information to ground flight controllers. It is the first real-time ground-based shuttle navigation application using onboard sensors. The program is compact, portable, self-contained, and can run on a variety of UNIX or Linux computers. The program has a modular object-oriented design that supports application-specific plugins such as data corruption remediation pre-processing and remote graphics display. The Kalman filter is extensible to additional sensor types or force models. The Kalman filter design is also strong against data dropouts because it uses physical models for state and covariance propagation in the absence of data. The design of this program separates the functionalities of SPOT into six different executable processes. This allows for the individual processes to be connected in an a la carte manner, making the feature set and executable complexity of SPOT adaptable to the needs of the user. Also, these processes need not be executed on the same workstation. This allows for communications between SPOT processes executing on the same Local Area Network (LAN). Thus, SPOT can be executed in a distributed sense with the capability for a team of flight controllers to efficiently share the same trajectory information currently being computed by the program. SPOT is used in the Mission Control Center (MCC) for Space Shuttle Program (SSP) and International Space Station Program (ISSP) operations, and can also be used as a post-flight analysis tool. It is primarily used for situational awareness, and for contingency situations.
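
    A minimal constant-velocity Kalman filter in one dimension, sketching the predict/update cycle described above, including coasting on the dynamics model when a measurement drops out. This is a generic textbook filter, not the SPOT formulation.

        # 1-D position/velocity Kalman filter with simulated GPS dropouts.
        import numpy as np

        dt = 1.0
        F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (pos, vel)
        H = np.array([[1.0, 0.0]])              # we observe position only
        Q = 1e-4 * np.eye(2)                    # process noise
        R = np.array([[0.25]])                  # measurement noise

        x = np.array([[0.0], [0.0]])            # state estimate
        P = np.eye(2)                           # estimate covariance

        rng = np.random.default_rng(9)
        for k in range(20):
            x, P = F @ x, F @ P @ F.T + Q       # predict (always runs)
            z = 1.5 * dt * (k + 1) + rng.normal(0, 0.5)  # synthetic GPS fix
            if k % 5 != 4:                      # every 5th fix is "dropped"
                y = np.array([[z]]) - H @ x     # innovation
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
                x = x + K @ y
                P = (np.eye(2) - K @ H) @ P
        print(x.ravel())                        # position, velocity (~1.5)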

  4. Variance estimation for generalized Cavalieri estimators

    OpenAIRE

    Johanna Ziegel; Eva B. Vedel Jensen; Karl-Anton Dorph-Petersen

    2011-01-01

    The precision of stereological estimators based on systematic sampling is of great practical importance. This paper presents methods of data-based variance estimation for generalized Cavalieri estimators where errors in sampling positions may occur. Variance estimators are derived under perturbed systematic sampling, systematic sampling with cumulative errors and systematic sampling with random dropouts. Copyright 2011, Oxford University Press.

  5. Comparison of Classical and Robust Estimates of Threshold Auto-regression Parameters

    Directory of Open Access Journals (Sweden)

    V. B. Goryainov

    2017-01-01

    Full Text Available The study object is the first-order threshold auto-regression model with a single zero-located threshold. The model describes a stochastic temporal series with discrete time by means of a piecewise linear equation consisting of two linear classical first-order autoregressive equations. One of these equations is used to calculate the running value of the temporal series; the control variable that determines the choice between the two equations is the sign of the previous value of the same series. The first-order threshold autoregressive model with a single threshold depends on two real parameters that coincide with the coefficients of the piecewise linear threshold equation. These parameters are assumed to be unknown. The paper studies the least-squares estimate, the least-modules (least absolute deviations) estimate, and M-estimates of these parameters. The aim of the paper is a comparative study of the accuracy of these estimates for the main probability distributions of the updating process of the threshold autoregressive equation: the normal, contaminated normal, logistic, and double-exponential distributions, Student's distributions with different numbers of degrees of freedom, and a Cauchy distribution. As a measure of the accuracy of each estimate, its variance was chosen, measuring the scattering of the estimate around the estimated parameter. Of two estimates, the one with the smaller variance was considered the better; the variance was estimated by computer simulation. The least-modules estimate was computed by an iteratively reweighted least-squares method, and the M-estimates by the method of the deformable polyhedron (the Nelder-Mead method). To calculate the least-squares estimate, an explicit analytic expression was used. It turned out that the least-squares estimate is best only under a normal distribution of the updating process. For the logistic distribution and the Student's distribution with the
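
    A brief sketch of the kind of comparison described: simulate a TAR(1) series with heavy-tailed noise, then compute a least-squares estimate and an M-estimate by the Nelder-Mead method. The Huber-type loss and all parameter values are assumptions made for illustration, not the paper's exact setup.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)

        # Simulate a TAR(1) series: coefficient a applies when the previous
        # value is non-negative, coefficient b when it is negative.
        a_true, b_true, n = 0.5, -0.4, 500
        x = np.zeros(n)
        for t in range(1, n):
            coef = a_true if x[t - 1] >= 0 else b_true
            x[t] = coef * x[t - 1] + rng.standard_t(df=3)   # heavy-tailed noise

        def residuals(theta):
            a, b = theta
            pred = np.where(x[:-1] >= 0, a * x[:-1], b * x[:-1])
            return x[1:] - pred

        def huber_loss(theta, k=1.345):
            r = residuals(theta)
            return np.sum(np.where(np.abs(r) <= k,
                                   0.5 * r**2, k * (np.abs(r) - 0.5 * k)))

        ls = minimize(lambda th: np.sum(residuals(th) ** 2),
                      x0=[0.0, 0.0], method="Nelder-Mead")
        m = minimize(huber_loss, x0=[0.0, 0.0], method="Nelder-Mead")
        print("LS estimate:", ls.x, " M-estimate:", m.x)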

  6. 7 CFR 1435.301 - Annual estimates and quarterly re-estimates.

    Science.gov (United States)

    2010-01-01

    ... CORPORATION, DEPARTMENT OF AGRICULTURE LOANS, PURCHASES, AND OTHER OPERATIONS SUGAR PROGRAM Flexible Marketing..., estimates, and re-estimates in this subpart will use available USDA statistics and estimates of production, consumption, and stocks, taking into account, where appropriate, data supplied in reports submitted pursuant...

  7. The Psychology of Cost Estimating

    Science.gov (United States)

    Price, Andy

    2016-01-01

    Cost estimation for large (and even not so large) government programs is a challenge. The number and magnitude of cost overruns associated with large Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) programs highlight the difficulties in developing and promulgating accurate cost estimates. These overruns can be the result of inadequate technology readiness or requirements definition, the whims of politicians or government bureaucrats, or even failures of the cost estimating profession itself. However, there may be another reason for cost overruns that is right in front of us, but that we have only recently begun to grasp: the fact that cost estimators and their customers are human. The last 70+ years of research into human psychology and behavioral economics have yielded amazing findings about how we humans process and use information to make judgments and decisions. What these scientists have uncovered is surprising: humans are often irrational and illogical beings, making decisions based on factors such as emotion and perception rather than facts and data. These built-in biases in our thinking directly affect how we develop our cost estimates and how those cost estimates are used. We cost estimators can use this knowledge of biases to improve our cost estimates and also to improve how we communicate and work with our customers. By understanding how our customers think, and more importantly, why they think the way they do, we can have more productive relationships and greater influence. By using psychology to our advantage, we can more effectively help the decision maker and our organizations make fact-based decisions.

  8. Financing Competency Based Programs.

    Science.gov (United States)

    Daniel, Annette

    Literature on the background, causes, and current prevalence of competency based programs is synthesized in this report. According to one analysis of the actual and probable costs of minimum competency testing, estimated costs for test development, test administration, bureaucratic structures, and remedial programs for students who cannot pass the…

  9. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...
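
    A toy illustration of the idea, tied to no particular probabilistic programming language: the model is ordinary code (a prior sampler and a likelihood), and one generic inference routine, likelihood weighting, answers both queries named above, posterior inference and model-evidence estimation. The coin-flip model is an assumed example.

        import numpy as np

        rng = np.random.default_rng(1)

        def prior_sample():
            return rng.beta(1.0, 1.0)            # uniform prior on the coin bias

        def likelihood(theta, data):
            heads = data.sum()
            return theta**heads * (1 - theta)**(len(data) - heads)

        data = np.array([1, 1, 0, 1, 1, 0, 1, 1])

        # Generic inference by likelihood weighting (importance sampling).
        thetas = np.array([prior_sample() for _ in range(100_000)])
        weights = np.array([likelihood(t, data) for t in thetas])
        posterior_mean = np.sum(weights * thetas) / np.sum(weights)
        evidence = weights.mean()                # estimate of model evidence
        print(posterior_mean, evidence)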

  10. MINIMUM VARIANCE BETA ESTIMATION WITH DYNAMIC CONSTRAINTS,

    Science.gov (United States)

    developed (at AFETR) and is being used to isolate the primary error sources in the beta estimation task. This computer program is additionally used to...determine what success in beta estimation can be achieved with foreseeable instrumentation accuracies. Results are included that illustrate the effects on

  11. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    Science.gov (United States)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DoD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Scientific Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and K-nearest neighbor prediction models on the same data set.
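
    A minimal sketch of analogy-based estimation of the K-nearest-neighbor kind mentioned above: a new project's effort is predicted as the mean effort of its nearest historical analogies in a normalized feature space. The feature set (size in KLOC plus a complexity score) and all numbers are invented for illustration; they are not the NASA model or its data.

        import numpy as np

        X = np.array([[20.0, 3.0], [150.0, 5.0], [90.0, 4.0], [10.0, 2.0]])  # [KLOC, complexity]
        effort = np.array([80.0, 900.0, 450.0, 30.0])                        # person-months

        def knn_estimate(x_new, X, y, k=2):
            # Normalize features so no single scale dominates the distance.
            lo, hi = X.min(axis=0), X.max(axis=0)
            Xn = (X - lo) / (hi - lo)
            xn = (x_new - lo) / (hi - lo)
            d = np.sqrt(((Xn - xn) ** 2).sum(axis=1))   # distances to analogies
            nearest = np.argsort(d)[:k]                 # k closest projects
            return y[nearest].mean()

        print(knn_estimate(np.array([100.0, 4.0]), X, effort, k=2))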

  12. Estimating Equilibrium Effects of Job Search Assistance

    DEFF Research Database (Denmark)

    Gautier, Pieter; Muller, Paul; van der Klaauw, Bas

    that the nonparticipants in the experiment regions find jobs more slowly after the introduction of the activation program (relative to workers in other regions). We then estimate an equilibrium search model. This model shows that a large-scale rollout of the activation program decreases welfare, while a standard partial microeconometric cost-benefit analysis would conclude the opposite.

  13. Longitudinal Factor Score Estimation Using the Kalman Filter.

    Science.gov (United States)

    Oud, Johan H.; And Others

    1990-01-01

    How longitudinal factor score estimation--the estimation of the evolution of factor scores for individual examinees over time--can profit from the Kalman filter technique is described. The Kalman estimates change more cautiously over time, have lower estimation error variances, and reproduce the LISREL program latent state correlations more…

  14. National Coral Reef Monitoring Program: Stratified Random Surveys (StRS) of Reef Fish, including Benthic Estimate Data at Jarvis Island from 2016-05-16 to 2016-05-22 (NCEI Accession 0157594)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Surveys were conducted in the course of a reef fish survey cruise conducted by the NOAA Coral Reef Ecosystem Program (CREP) at the NOAA Pacific Islands Fisheries...

  15. Functional Programming

    OpenAIRE

    Chitil, Olaf

    2009-01-01

    Functional programming is a programming paradigm like object-oriented programming and logic programming. Functional programming comprises both a specific programming style and a class of programming languages that encourage and support this programming style. Functional programming enables the programmer to describe an algorithm on a high-level, in terms of the problem domain, without having to deal with machine-related details. A program is constructed from functions that only map inputs to ...

  16. Estimating Discount Rates

    Directory of Open Access Journals (Sweden)

    Laurence Booth

    2015-04-01

    Full Text Available Discount rates are essential to applied finance, especially in setting prices for regulated utilities and valuing the liabilities of insurance companies and defined benefit pension plans. This paper reviews the basic building blocks for estimating discount rates. It also examines market risk premiums, as well as what constitutes a benchmark fair or required rate of return, in the aftermath of the financial crisis and the U.S. Federal Reserve's bond-buying program. Some of the results are disconcerting. In Canada, utilities and pension regulators responded to the crash in different ways. Utilities regulators haven't passed on the full impact of low interest rates, so consumers face higher prices than they should, whereas pension regulators have done the opposite and forced some contributors to pay more. In both cases this is opposite to the desired effect of monetary policy, which is to stimulate aggregate demand. A comprehensive survey of global finance professionals carried out last year provides some clues as to where adjustments are needed. In the U.S., the average required equity market return was estimated at 8.0 per cent; Canada's is 7.40 per cent, due to the lower market risk premium and the lower risk-free rate. This paper adds a wealth of historic and survey data to conclude that the ideal base long-term interest rate used in risk premium models should be 4.0 per cent, producing an overall expected market return of 9.0-10.0 per cent. The same data indicate that allowed returns to utilities are currently too high, while the use of current bond yields in solvency valuations of pension plans and life insurers is unhelpful unless there is a realistic expectation that the plans will soon be terminated.
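
    As a quick arithmetic check of the figures quoted above, a risk-premium build-up adds a market risk premium to the suggested 4.0 per cent base long-term rate; the 5.5 per cent premium below is an assumption chosen only to show how the quoted 9-10 per cent band arises.

        # Risk-premium build-up: expected market return = base rate + premium.
        base_rate = 0.040            # suggested ideal base long-term rate
        market_risk_premium = 0.055  # assumed premium for illustration
        expected_market_return = base_rate + market_risk_premium
        print(f"{expected_market_return:.1%}")  # 9.5%, inside the 9-10% band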

  17. Outer planet probe cost estimates: First impressions

    Science.gov (United States)

    Niehoff, J.

    1974-01-01

    An examination was made of early estimates of outer planetary atmospheric probe cost by comparing the estimates with past planetary projects. Of particular interest is identification of project elements which are likely cost drivers for future probe missions. Data are divided into two parts: first, the description of a cost model developed by SAI for the Planetary Programs Office of NASA, and second, use of this model and its data base to evaluate estimates of probe costs. Several observations are offered in conclusion regarding the credibility of current estimates and specific areas of the outer planet probe concept most vulnerable to cost escalation.

  18. Load Estimation from Natural input Modal Analysis

    DEFF Research Database (Denmark)

    Aenlle, Manuel López; Brincker, Rune; Canteli, Alfonso Fernández

    2005-01-01

    One application of Natural Input Modal Analysis consists in estimating the unknown load acting on structures such as wind loads, wave loads, traffic loads, etc. In this paper, a procedure to determine loading from a truncated modal model, as well as the results of an experimental testing programme...... estimation. In the experimental program a small structure subjected to vibration was used to estimate the loading from the measurements and the experimental modal space. The modal parameters were estimated by Natural Input Modal Analysis and the scaling factors of the mode shapes obtained by the mass change...

  19. Department of the Navy Supporting Data for Fiscal Year 1984 Budget Estimates Descriptive Summaries Submitted to Congress January 1983. Research, Development, Test and Evaluation, Navy. Book 1. Technology Base, Advanced Technology Development, Strategic Programs.

    Science.gov (United States)

    1983-01-01

    battle groups--these materials will also be of value to the Subelement: 41, Title: Biological and Medical Sciences Program Element... in FY 1982. G. (U) WORK PERFORMED BY: In-House--Naval Surface Weapons Center, Dahlgren, VA; Eastern Space and Missile Center, Cocoa Beach, FL; Army

  20. Variance function estimation for immunoassays

    International Nuclear Information System (INIS)

    Raab, G.M.; Thompson, R.; McKenzie, I.

    1980-01-01

    A computer program is described which implements a recently described, modified likelihood method of determining an appropriate weighting function to use when fitting immunoassay dose-response curves. The relationship between the variance of the response and its mean value is assumed to have an exponential form, and the best fit to this model is determined from the within-set variability of many small sets of repeated measurements. The program estimates the parameter of the exponential function with its estimated standard error, and tests the fit of the experimental data to the proposed model. Output options include a list of the actual and fitted standard deviation of the set of responses, a plot of actual and fitted standard deviation against the mean response, and an ordered list of the 10 sets of data with the largest ratios of actual to fitted standard deviation. The program has been designed for a laboratory user without computing or statistical expertise. The test-of-fit has proved valuable for identifying outlying responses, which may be excluded from further analysis by being set to negative values in the input file. (Auth.)

  1. Department of the Navy Supporting Data for Fiscal Year 1984 Budget Estimates Descriptive Summaries Submitted to Congress January 1983. Research, Development, Test and Evaluation, Navy. Book 2. Tactical Programs

    Science.gov (United States)

    1983-01-01

    Production Release for the Army, Navy and Air Force is in FY 1991. Production of 602 aircraft for the Marine Corps and Navy will be completed in FY 1998 ...FY 1965 3. Operational Test and Evaluation (USMC) Second Quarter FY 1998 4. First USMC delivery Third Quarter FY 1991 5. USA/USAF/USN Delivery First... vehicular mounted ... power units. (U) Project C0075. Tactical Motor Transport Vehicles: This program is to provide the optimum mix of tactical motor

  2. Variable Kernel Density Estimation

    OpenAIRE

    Terrell, George R.; Scott, David W.

    1992-01-01

    We investigate some of the possibilities for improvement of univariate and multivariate kernel density estimates by varying the window over the domain of estimation, pointwise and globally. Two general approaches are to vary the window width by the point of estimation and by point of the sample observation. The first possibility is shown to be of little efficacy in one variable. In particular, nearest-neighbor estimators in all versions perform poorly in one and two dimensions, but begin to b...

  3. A computer program to estimate air temperature for the Northeast region of Brazil

    Directory of Open Access Journals (Sweden)

    Enilson P. Cavalcanti

    2006-03-01

    Full Text Available The objective of this research was to establish a model to estimate air temperature (Estima_T) as a function of geographical coordinates and Sea Surface Temperature Anomalies (SSTA). The monthly mean time series of air temperature (daily mean, minimum, and maximum) from 69 weather stations of Northeast Brazil (NEB) and SSTA of the Tropical Atlantic were analyzed. The Estima_T model proved able to reconstruct air temperature time series with reasonable precision for the whole NEB. The results showed statistically significant correlations at the 1% probability level between observed air temperatures and those estimated by the model across the entire study area.
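
    A minimal sketch of the kind of model the abstract describes: air temperature regressed on geographic coordinates, altitude, and an SSTA term by ordinary least squares. The coefficients, station coordinates, and noise levels below are synthetic illustrations, not Estima_T's actual formulation or data.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200
        lon = rng.uniform(-47, -35, n)         # synthetic station longitudes
        lat = rng.uniform(-18, -1, n)          # synthetic station latitudes
        alt = rng.uniform(0, 800, n)           # altitude, m
        sst_anom = rng.normal(0, 0.5, n)       # Tropical Atlantic SSTA, deg C
        temp = 28 - 0.006 * alt + 0.3 * lat + 0.8 * sst_anom + rng.normal(0, 0.5, n)

        A = np.column_stack([np.ones(n), lon, lat, alt, sst_anom])
        coef, *_ = np.linalg.lstsq(A, temp, rcond=None)   # OLS fit
        pred = A @ coef
        print(coef, np.corrcoef(temp, pred)[0, 1])  # observed-vs-estimated correlation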

  4. Fuel Burn Estimation Model

    Science.gov (United States)

    Chatterji, Gano

    2011-01-01

    Conclusions: The fuel estimation procedure was validated using flight test data. A good fuel model can be created if weight and fuel data are available. Error in the assumed takeoff weight results in a similar amount of error in the fuel estimate. Fuel estimation error bounds can be determined.

  5. Optimal fault signal estimation

    NARCIS (Netherlands)

    Stoorvogel, Antonie Arij; Niemann, H.H.; Saberi, A.; Sannuti, P.

    2002-01-01

    We consider here both fault identification and fault signal estimation. Regarding fault identification, we seek either exact or almost fault identification. On the other hand, regarding fault signal estimation, we seek either $H_2$ optimal, $H_2$ suboptimal, or $H_\infty$ suboptimal estimation. By

  6. An analysis of the uncertainty in temperature and density estimates from fitting model spectra to data. 1998 summer research program for high school juniors at the University of Rochester's Laboratory for Laser Energetics. Student research reports

    International Nuclear Information System (INIS)

    Schubmehl, M.

    1999-03-01

    Temperature and density histories of direct-drive laser fusion implosions are important to an understanding of the reaction's progress. Such measurements also document phenomena such as preheating of the core and improper compression that can interfere with the thermonuclear reaction. Model x-ray spectra from the non-LTE (local thermodynamic equilibrium) radiation transport post-processor for LILAC have recently been fitted to OMEGA data. The spectrum fitting code reads in a grid of model spectra and uses an iterative weighted least-squares algorithm to perform a fit to experimental data, based on user-input parameter estimates. The purpose of this research was to upgrade the fitting code to compute formal uncertainties on fitted quantities, and to provide temperature and density estimates with error bars. A standard error-analysis process was modified to compute these formal uncertainties from information about the random measurement error in the data. Preliminary tests of the code indicate that the variances it returns are both reasonable and useful
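
    A sketch of the general procedure, with a single Gaussian line standing in for the grid of model spectra: SciPy's curve_fit performs the iterative weighted least-squares fit, and the diagonal of the returned parameter covariance gives the formal one-sigma error bars. All signal parameters are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def model(x, amplitude, center, width):
            # Stand-in "model spectrum" for the fit.
            return amplitude * np.exp(-0.5 * ((x - center) / width) ** 2)

        rng = np.random.default_rng(4)
        x = np.linspace(0, 10, 100)
        sigma = 0.05 + 0.02 * rng.random(100)          # per-point measurement error
        y = model(x, 1.0, 5.0, 1.2) + sigma * rng.standard_normal(100)

        popt, pcov = curve_fit(model, x, y, p0=[0.8, 4.5, 1.0],
                               sigma=sigma, absolute_sigma=True)
        perr = np.sqrt(np.diag(pcov))                  # formal 1-sigma uncertainties
        for name, val, err in zip(["amplitude", "center", "width"], popt, perr):
            print(f"{name} = {val:.3f} +/- {err:.3f}")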

  7. Supplemental report on cost estimates

    International Nuclear Information System (INIS)

    1992-01-01

    The Office of Management and Budget (OMB) and the U.S. Army Corps of Engineers have completed an analysis of the Department of Energy's (DOE) Fiscal Year (FY) 1993 budget request for its Environmental Restoration and Waste Management (ERWM) program. The results were presented to an interagency review group (IAG) of senior Administration officials for their consideration in the budget process. This analysis included evaluations of the underlying legal requirements and cost estimates on which the ERWM budget request was based. The major conclusions are contained in a separate report entitled "Interagency Review of the Department of Energy Environmental Restoration and Waste Management Program." This Corps supplemental report provides greater detail on the cost analysis.

  8. A neural flow estimator

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Harald Holger; Bogason, Gudmundur; Bruun, Erik

    1995-01-01

    This paper proposes a new way to estimate the flow in a micromechanical flow channel. A neural network is used to estimate the delay of random temperature fluctuations induced in a fluid. The design and implementation of a hardware-efficient neural flow estimator are described. The system is implemented using the switched-current technique and is capable of estimating flow in the μl/s range. The neural estimator is built around a multiplierless neural network containing 96 synaptic weights, which are updated using the LMS1 algorithm. An experimental chip has been designed that operates at 5 V...
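
    A sketch of the underlying idea under simplifying assumptions: fluctuations observed upstream reappear downstream after a flow-dependent delay, and an FIR filter adapted with the LMS update concentrates its largest weight at that delay. The tap count, step size, and signals are illustrative, not the chip's actual design.

        import numpy as np

        rng = np.random.default_rng(5)
        n, true_delay, n_taps, mu = 5000, 7, 16, 0.01
        upstream = rng.standard_normal(n)              # random fluctuations
        downstream = np.roll(upstream, true_delay) + 0.1 * rng.standard_normal(n)

        w = np.zeros(n_taps)
        for t in range(n_taps, n):
            u = upstream[t - n_taps + 1:t + 1][::-1]   # lags 0..n_taps-1
            e = downstream[t] - w @ u                  # prediction error
            w += mu * e * u                            # LMS weight update

        print("estimated delay:", np.argmax(np.abs(w)))  # close to true_delay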

  9. Piecewise Loglinear Estimation of Efficient Production Surfaces

    OpenAIRE

    Rajiv D. Banker; Ajay Maindiratta

    1986-01-01

    Linear programming formulations for piecewise loglinear estimation of efficient production surfaces are derived from a set of basic properties postulated for the underlying production possibility sets. Unlike the piecewise linear model of Banker, Charnes, and Cooper (Banker R. D., A. Charnes, W. W. Cooper. 1984. Models for the estimation of technical and scale inefficiencies in data envelopment analysis. Management Sci. 30 (September) 1078--1092.), this approach permits the identification of ...

  10. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  11. Program management system manual

    International Nuclear Information System (INIS)

    1989-08-01

    OCRWM has developed a program management system (PMS) to assist in organizing, planning, directing and controlling the Civilian Radioactive Waste Management Program. A well defined management system is necessary because: (1) the Program is a complex technical undertaking with a large number of participants, (2) the disposal and storage facilities to be developed by the Program must be licensed by the Nuclear Regulatory Commission (NRC) and hence are subject to rigorous quality assurance (QA) requirements, (3) the legislation mandating the Program creates a dichotomy between demanding schedules of performance and a requirement for close and continuous consultation and cooperation with external entities, (4) the various elements of the Program must be managed as parts of an integrated waste management system, (5) the Program has an estimated total system life cycle cost of over $30 billion, and (6) the Program has a unique fiduciary responsibility to the owners and generators of the nuclear waste for controlling costs and minimizing the user fees paid into the Nuclear Waste Fund. This PMS Manual is designed and structured to facilitate strong, effective Program management by providing policies and requirements for organizing, planning, directing and controlling the major Program functions

  12. Preparing Science Teachers: Strong Emphasis on Science Content Course Work in a Master's Program in Education

    Science.gov (United States)

    Ajhar, Edward A.; Blackwell, E.; Quesada, D.

    2010-05-01

    In South Florida, science teacher preparation is often weak, as a shortage of science teachers often prompts administrators to assign teachers to science classes simply to cover classroom needs. This results in poor preparation of students for college science course work, which, in turn, causes the next generation of science teachers to be even weaker than the first. This cycle must be broken in order to prepare better students in the sciences. At St. Thomas University in Miami Gardens, Florida, our School of Science has teamed with our Institute for Education to create a program to alleviate this problem: A Master of Science in Education with a Concentration in Earth/Space Science. The Master's program consists of 36 total credits. Half the curriculum consists of traditional educational foundation and instructional leadership courses, while the other half is focused on Earth and Space Science content courses. The content area of 18 credits also provides a separate certificate program. Although traditional high school science education places a heavy emphasis on Earth Science, this program expands that emphasis to include the broader context of astronomy, astrophysics, astrobiology, planetary science, and the practice and philosophy of science. From this contextual basis the teacher is better prepared to educate and motivate middle and high school students in all areas of the physical sciences. Because hands-on experience is especially valuable to educators, our program uses materials and equipment including small optical telescopes (Galileoscopes), several 8-in and 14-in Celestron and Meade reflectors, and a Small Radio Telescope installed on site. (Partial funding provided by the US Department of Education through Minority Science and Engineering Improvement Program grant P120A050062.)

  13. Cost Estimating Handbook for Environmental Restoration

    International Nuclear Information System (INIS)

    1993-01-01

    Environmental restoration (ER) projects have presented the DOE and cost estimators with a number of properties that are not comparable to the normal estimating climate within DOE. These properties include: An entirely new set of specialized expressions and terminology. A higher than normal exposure to cost and schedule risk, as compared to most other DOE projects, due to changing regulations, public involvement, resource shortages, and scope of work. A higher than normal percentage of indirect costs to the total estimated cost, due primarily to record keeping, special training, liability, and indemnification. More than one estimate for a project, particularly in the assessment phase, in order to provide input into the evaluation of alternatives for the cleanup action. While some aspects of existing guidance for cost estimators will be applicable to environmental restoration projects, some components of the present guidelines will have to be modified to reflect the unique elements of these projects. The purpose of this Handbook is to assist cost estimators in the preparation of environmental restoration estimates for Environmental Restoration and Waste Management (EM) projects undertaken by DOE. The DOE has, in recent years, seen a significant increase in the number, size, and frequency of environmental restoration projects that must be costed by the various DOE offices. The coming years will show the EM program to be the largest non-weapons program undertaken by DOE. These projects create new and unique estimating requirements, since historical cost and estimating precedents are meager at best. It is anticipated that this Handbook will enhance the quality of cost data within DOE in several ways by providing: The basis for accurate, consistent, and traceable baselines. Sound methodologies, guidelines, and estimating formats. Sources of cost data/databases and estimating tools and techniques available to DOE cost professionals.

  14. Estimating aquifer transmissivity from specific capacity using MATLAB.

    Science.gov (United States)

    McLin, Stephen G

    2005-01-01

    Historically, specific capacity information has been used to calculate aquifer transmissivity when pumping test data are unavailable. This paper presents a simple computer program written in the MATLAB programming language that estimates transmissivity from specific capacity data while correcting for aquifer partial penetration and well efficiency. The program graphically plots transmissivity as a function of these factors so that the user can visually estimate their relative importance in a particular application. The program is compatible with any computer operating system running MATLAB, including Windows, Macintosh OS, Linux, and Unix. Two simple examples illustrate program usage.
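
    One common approach (not necessarily this program's exact algorithm, and omitting the partial-penetration and well-efficiency corrections) solves the Cooper-Jacob relation T = Q/(4*pi*s) * ln(2.25*T*t/(r^2*S)) by fixed-point iteration, since T appears on both sides. All parameter values below are illustrative.

        import math

        Q = 0.01     # pumping rate, m^3/s
        s = 2.0      # drawdown, m (specific capacity = Q/s)
        t = 86400.0  # pumping duration, s
        r = 0.1      # effective well radius, m
        S = 1e-4     # storativity (assumed)

        T = 1e-3                                   # initial guess, m^2/s
        for _ in range(50):
            T_new = Q / (4 * math.pi * s) * math.log(2.25 * T * t / (r**2 * S))
            if abs(T_new - T) < 1e-12:             # converged
                break
            T = T_new
        print(f"T = {T:.3e} m^2/s")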

  15. Cost Estimation and Control for Flight Systems

    Science.gov (United States)

    Hammond, Walter E.; Vanhook, Michael E. (Technical Monitor)

    2002-01-01

    Good program management practices, cost analysis, cost estimation, and cost control for aerospace flight systems are interrelated and depend upon each other. The best cost control process cannot overcome poor design or poor systems trades that lead to the wrong approach. The project needs robust Technical, Schedule, Cost, Risk, and Cost Risk practices before it can incorporate adequate Cost Control. Cost analysis both precedes and follows cost estimation -- the two are closely coupled with each other and with Risk analysis. Parametric cost estimating relationships and computerized models are most often used. NASA has learned some valuable lessons in controlling cost problems, and recommends use of a summary Project Manager's checklist as shown here.

  16. Methods for robustness programming

    NARCIS (Netherlands)

    Olieman, N.J.

    2008-01-01

    Robustness of an object is defined as the probability that the object will have properties as required. Robustness Programming (RP) is a mathematical approach for robustness estimation and robustness optimisation. An example, in the context of designing a food product, is finding the best composition.
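
    A minimal Monte Carlo sketch of robustness estimation as defined above; the "food product" requirement model, compositions, and tolerances are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(6)

        def meets_requirements(fat, sugar):
            # Hypothetical requirement: a viscosity proxy within spec limits.
            viscosity = 2.0 * fat + 1.5 * sugar + rng.normal(0, 0.2)
            return 4.0 <= viscosity <= 6.0

        design = {"fat": 1.2, "sugar": 1.8}        # candidate composition
        trials = 100_000
        hits = sum(
            meets_requirements(rng.normal(design["fat"], 0.1),
                               rng.normal(design["sugar"], 0.1))
            for _ in range(trials)
        )
        print("estimated robustness:", hits / trials)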

  17. Program specialization

    CERN Document Server

    Marlet, Renaud

    2013-01-01

    This book presents the principles and techniques of program specialization - a general method to make programs faster (and possibly smaller) when some inputs can be known in advance. As an illustration, it describes the architecture of Tempo, an offline program specializer for C that can also specialize code at runtime, and provides figures for concrete applications in various domains. Technical details address issues related to program analysis precision, value reification, incomplete program specialization, strategies to exploit specialized program, incremental specialization, and data speci

  18. Adjusting estimative prediction limits

    OpenAIRE

    Masao Ueki; Kaoru Fueda

    2007-01-01

    This note presents a direct adjustment of the estimative prediction limit to reduce the coverage error from a target value to third-order accuracy. The adjustment is asymptotically equivalent to those of Barndorff-Nielsen & Cox (1994, 1996) and Vidoni (1998). It has a simpler form with a plug-in estimator of the coverage probability of the estimative limit at the target value. Copyright 2007, Oxford University Press.

  19. Electrical estimating methods

    CERN Document Server

    Del Pico, Wayne J

    2014-01-01

    Simplify the estimating process with the latest data, materials, and practices Electrical Estimating Methods, Fourth Edition is a comprehensive guide to estimating electrical costs, with data provided by leading construction database RS Means. The book covers the materials and processes encountered by the modern contractor, and provides all the information professionals need to make the most precise estimate. The fourth edition has been updated to reflect the changing materials, techniques, and practices in the field, and provides the most recent Means cost data available. The complexity of el

  20. Health effects estimation for contaminated properties

    International Nuclear Information System (INIS)

    Marks, S.; Denham, D.H.; Cross, F.T.; Kennedy, W.E. Jr.

    1984-05-01

    As part of an overall remedial action program to evaluate the need for and institute actions designed to minimize health hazards from inactive tailings piles and from displaced tailings, methods for estimating health effects from tailings were developed and applied to the Salt Lake City area. 2 references, 2 tables

  1. Structural Estimation of Stock Market Participation Costs

    DEFF Research Database (Denmark)

    Khorunzhina, Natalia

    2013-01-01

    education programs can affect consumers' investment decisions. Using household data from the Panel Study of Income Dynamics, I estimate the magnitude of the participation cost, allowing for individual heterogeneity in it. The results show the average stock market participation cost is about 4–6% of labor...

  2. Maximum likely scale estimation

    DEFF Research Database (Denmark)

    Loog, Marco; Pedersen, Kim Steenstrup; Markussen, Bo

    2005-01-01

    A maximum likelihood local scale estimation principle is presented. An actual implementation of the estimation principle uses second order moments of multiple measurements at a fixed location in the image. These measurements consist of Gaussian derivatives possibly taken at several scales and/or ...

  3. Cost function estimation

    DEFF Research Database (Denmark)

    Andersen, C K; Andersen, K; Kragh-Sørensen, P

    2000-01-01

    on these criteria, a two-part model was chosen. In this model, the probability of incurring any costs was estimated using a logistic regression, while the level of the costs was estimated in the second part of the model. The choice of model had a substantial impact on the predicted health care costs, e...

  4. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.

    1992-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  5. Software cost estimation

    NARCIS (Netherlands)

    Heemstra, F.J.; Heemstra, F.J.

    1993-01-01

    The paper gives an overview of the state of the art of software cost estimation (SCE). The main questions to be answered in the paper are: (1) What are the reasons for overruns of budgets and planned durations? (2) What are the prerequisites for estimating? (3) How can software development effort be

  6. Coherence in quantum estimation

    Science.gov (United States)

    Giorda, Paolo; Allegra, Michele

    2018-01-01

    The geometry of quantum states provides a unifying framework for estimation processes based on quantum probes, and it establishes the ultimate bounds of the achievable precision. We show a relation between the statistical distance between infinitesimally close quantum states and the second-order variation of the coherence of the optimal measurement basis with respect to the state of the probe. In quantum phase estimation protocols, this leads us to propose coherence as the relevant resource that one has to engineer and control to optimize the estimation precision. Furthermore, the main object of the theory, i.e., the symmetric logarithmic derivative, in many cases allows one to identify a proper factorization of the whole Hilbert space into two subsystems. The factorization allows one to discuss the role of coherence versus correlations in estimation protocols; to show how certain estimation processes can be completely or effectively described within a single-qubit subsystem; and to derive lower bounds for the scaling of the estimation precision with the number of probes used. We illustrate how the framework works for both noiseless and noisy estimation procedures, in particular those based on multi-qubit GHZ states. Finally we succinctly analyze estimation protocols based on zero-temperature critical behavior. We identify the coherence that is at the heart of their efficiency, and we show how it exhibits the non-analyticities and scaling behavior proper to a large class of quantum phase transitions.

  7. Overconfidence in Interval Estimates

    Science.gov (United States)

    Soll, Jack B.; Klayman, Joshua

    2004-01-01

    Judges were asked to make numerical estimates (e.g., "In what year was the first flight of a hot air balloon?"). Judges provided high and low estimates such that they were X% sure that the correct answer lay between them. They exhibited substantial overconfidence: The correct answer fell inside their intervals much less than X% of the time. This…

  8. Adaptive Spectral Doppler Estimation

    DEFF Research Database (Denmark)

    Gran, Fredrik; Jakobsson, Andreas; Jensen, Jørgen Arendt

    2009-01-01

    In this paper, 2 adaptive spectral estimation techniques are analyzed for spectral Doppler ultrasound. The purpose is to minimize the observation window needed to estimate the spectrogram, to provide a better temporal resolution and gain more flexibility when designing the data acquisition sequence. The methods can also provide better quality of the estimated power spectral density (PSD) of the blood signal. Adaptive spectral estimation techniques are known to provide good spectral resolution and contrast even when the observation window is very short. The 2 adaptive techniques are tested and compared with the averaged periodogram (Welch's method). The blood power spectral Capon (BPC) method is based on a standard minimum variance technique adapted to account for both averaging over slow-time and depth. The blood amplitude and phase estimation technique (BAPES) is based on finding a set...
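
    For reference, the baseline the two adaptive estimators are compared against is Welch's averaged periodogram on a short observation window; the sampling rate, window sizes, and simulated Doppler tone below are illustrative assumptions.

        import numpy as np
        from scipy.signal import welch

        fs = 10_000.0                          # slow-time sampling frequency, Hz
        t = np.arange(128) / fs                # deliberately short window
        rng = np.random.default_rng(7)
        x = np.sin(2 * np.pi * 1_000 * t) + 0.5 * rng.standard_normal(t.size)

        f, psd = welch(x, fs=fs, nperseg=64, noverlap=32)
        print(f[np.argmax(psd)])               # peak near the 1 kHz Doppler shift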

  9. Optomechanical parameter estimation

    International Nuclear Information System (INIS)

    Ang, Shan Zheng; Tsang, Mankei; Harris, Glen I; Bowen, Warwick P

    2013-01-01

    We propose a statistical framework for the problem of parameter estimation from a noisy optomechanical system. The Cramér–Rao lower bound on the estimation errors in the long-time limit is derived and compared with the errors of radiometer and expectation–maximization (EM) algorithms in the estimation of the force noise power. When applied to experimental data, the EM estimator is found to have the lowest error and follow the Cramér–Rao bound most closely. Our analytic results are envisioned to be valuable to optomechanical experiment design, while the EM algorithm, with its ability to estimate most of the system parameters, is envisioned to be useful for optomechanical sensing, atomic magnetometry and fundamental tests of quantum mechanics. (paper)

  10. CHANNEL ESTIMATION TECHNIQUE

    DEFF Research Database (Denmark)

    2015-01-01

    A method includes determining a sequence of first coefficient estimates of a communication channel based on a sequence of pilots arranged according to a known pilot pattern and based on a receive signal, wherein the receive signal is based on the sequence of pilots transmitted over the communication channel. The method further includes determining a sequence of second coefficient estimates of the communication channel based on a decomposition of the first coefficient estimates in a dictionary matrix and a sparse vector of the second coefficient estimates, the dictionary matrix including filter characteristics of at least one known transceiver filter arranged in the communication channel.

  11. Visual Literacy: Implications for the Production of Children's Television Programs.

    Science.gov (United States)

    Amey, L. J.

    Visual literacy, the integration of seeing with other cognitive processes, is an essential tool of learning. To explain the relationship between the perceiver and the perceived, three types of theories can be brought to bear: introverted, extroverted, and transactional. Franklin Fearing, George Herbert Mead, Martin Buber, and other theorists have…

  12. Sharing Perspectives and Learning from One Another: Southern Paiutes, Scientists, and Policymakers in the Glen Canyon Dam Adaptive Management Program

    Science.gov (United States)

    Austin, D. E.; Bulletts, K.; Bulletts, C.

    2017-12-01

    The traditional lands of the Southern Paiute people in the United States are bounded by more than 600 miles of the Colorado River, from the Kaiparowits Plateau in the north to Blythe, California in the south. According to Southern Paiute traditional knowledge, Southern Paiutes were the first inhabitants of this region and are responsible for protecting and managing this land along with the water and all that is upon and within it. In 1963, the Bureau of Reclamation completed construction of Glen Canyon Dam on the Colorado River, and in 1972, the Glen Canyon National Recreation Area was established, encompassing Lake Powell above the Dam and a world-class trout fishery on the Colorado River between the Dam and Lees Ferry. Below Lees Ferry, on its way to Lake Mead and Hoover Dam, the Colorado River flows through Grand Canyon National Park and the Navajo and Hualapai reservations. U.S. federal law requires that Glen Canyon Dam be operated with minimal impact to the natural, recreational, and cultural resources of the region of the Colorado River that is potentially affected by flows from the Dam. The Grand Canyon Protection Act and the Environmental Impact Statement (EIS) for the Operation of Glen Canyon Dam established a program of long-term research and monitoring of the effects of the Dam on these resources. In 1991, three Southern Paiute tribes - the Kaibab Band of Paiute Indians, the Paiute Indian Tribe of Utah, and the San Juan Southern Paiute Tribe - agreed to participate in studies to identify cultural resources impacted by Glen Canyon Dam and to recommend strategies for their protection. In 1995, the EIS was completed and the transition to the Adaptive Management Program (AMP) called for in the Grand Canyon Protection Act was begun. At that time, Southern Paiute activities expanded to include assessing potential environmental and cultural impacts of the dam, developing monitoring procedures, and interacting with scientists, other tribal representatives, and

  13. Radiation risk estimation

    International Nuclear Information System (INIS)

    Schull, W.J.; Texas Univ., Houston, TX

    1992-01-01

    Estimation of the risk of cancer following exposure to ionizing radiation remains largely empirical, and models used to adduce risk incorporate few, if any, of the advances in molecular biology of the past decade or so. These facts compromise the estimation of risk where the epidemiological data are weakest, namely, at low doses and dose rates. Without a better understanding of the molecular and cellular events that ionizing radiation initiates or promotes, it seems unlikely that this situation will improve. Nor will the situation improve without further attention to the identification and quantitative estimation of the effects of those host and environmental factors that enhance or attenuate risk. (author)

  14. Radon Research Program, FY-1990

    International Nuclear Information System (INIS)

    1991-03-01

    The Department of Energy (DOE) Office of Health and Environmental Research (OHER) has established a Radon Research Program with the primary objectives of acquiring knowledge necessary to improve estimates of health risks associated with radon exposure and also to improve radon control. Through the Radon Research Program, OHER supports and coordinates the research activities of investigators at facilities all across the nation. From this research, significant advances are being made in our understanding of the health effects of radon. OHER publishes this annual report to provide information to interested researchers and the public about its research activities. This edition of the report summarizes the activities of program researchers during FY90. Chapter 2 of this report describes how risks associated with radon exposure are estimated, what assumptions are made in estimating radon risks for the general public, and how the uncertainties in these assumptions affect the risk estimates. Chapter 3 examines how OHER, through the Radon Research Program, is working to gather information for reducing the uncertainties and improving the risk estimates. Chapter 4 highlights some of the major findings of investigators participating in the Radon Research Program in the past year. And, finally, Chapter 5 discusses the direction in which the program is headed in the future. 20 figs

  15. Estimation of Jump Tails

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Todorov, Victor

    We propose a new and flexible non-parametric framework for estimating the jump tails of Itô semimartingale processes. The approach is based on a relatively simple-to-implement set of estimating equations associated with the compensator for the jump measure, or its "intensity", that only utilizes the weak assumption of regular variation in the jump tails, along with in-fill asymptotic arguments for uniquely identifying the "large" jumps from the data. The estimation allows for very general dynamic dependencies in the jump tails, and does not restrict the continuous part of the process and the temporal variation in the stochastic volatility. On implementing the new estimation procedure with actual high-frequency data for the S&P 500 aggregate market portfolio, we find strong evidence for richer and more complex dynamic dependencies in the jump tails than hitherto entertained in the literature.

  16. Bridged Race Population Estimates

    Data.gov (United States)

    U.S. Department of Health & Human Services — Population estimates from "bridging" the 31 race categories used in Census 2000, as specified in the 1997 Office of Management and Budget (OMB) race and ethnicity...

  17. Estimation of measurement variances

    International Nuclear Information System (INIS)

    Jaech, J.L.

    1984-01-01

    The estimation of measurement error parameters in safeguards systems is discussed. Both systematic and random errors are considered. A simple analysis of variances to characterize the measurement error structure with biases varying over time is presented

  18. APPLICATION OF THE WEIGHTED SPLINE ESTIMATOR

    Directory of Open Access Journals (Sweden)

    I Nyoman Budiantara

    2001-01-01

    Full Text Available We consider the nonparametric regression model: Zj = X(tj) + ej, j = 1,2,…,n, where X(tj) is the regression curve. The random errors ej are independently normally distributed with zero mean and variance s^2/bj, bj > 0. The estimate of X is obtained by minimizing a weighted penalized least squares criterion; the solution of this optimization is a weighted natural polynomial spline. Further, we give an application of the weighted spline estimator in nonparametric regression. Keywords: Weighted spline, Nonparametric regression, Penalized Least Square.
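
    A sketch of weighted spline smoothing for this model, using SciPy's penalized smoothing spline rather than the paper's exact natural-spline estimator: observations with larger bj (smaller variance) receive larger weights, and the smoothing level is a heuristic choice.

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        rng = np.random.default_rng(8)
        t = np.linspace(0, 1, 100)
        b = rng.uniform(0.5, 2.0, t.size)              # precision factors b_j > 0
        z = np.sin(2 * np.pi * t) + rng.normal(0, 0.3 / np.sqrt(b))

        # Weights proportional to 1/standard deviation, i.e. to sqrt(b_j).
        spline = UnivariateSpline(t, z, w=np.sqrt(b), s=t.size)
        print(spline(np.array([0.25, 0.5, 0.75])))     # estimated X at a few points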

  19. Fractional cointegration rank estimation

    DEFF Research Database (Denmark)

    Lasak, Katarzyna; Velasco, Carlos

    the parameters of the model under the null hypothesis of the cointegration rank r = 1, 2, ..., p-1. This step provides consistent estimates of the cointegration degree, the cointegration vectors, the speed of adjustment to the equilibrium parameters and the common trends. In the second step we carry out a sup-likelihood ratio test of no-cointegration on the estimated p - r common trends that are not cointegrated under the null. The cointegration degree is re-estimated in the second step to allow for new cointegration relationships with different memory. We augment the error correction model in the second step to control for stochastic trend estimation effects from the first step. The critical values of the tests proposed depend only on the number of common trends under the null, p - r, and on the interval of the cointegration degrees b allowed, but not on the true cointegration degree b0. Hence, no additional...

  20. Estimation of spectral kurtosis

    Science.gov (United States)

    Sutawanir

    2017-03-01

    Rolling bearings are among the most important elements in rotating machinery. Bearings frequently fall out of service for various reasons: heavy loads, unsuitable lubrication, ineffective sealing. Bearing faults may cause a decrease in performance. Analysis of bearing vibration signals has attracted attention in the field of monitoring and fault diagnosis, since bearing vibration signals give rich information for early detection of bearing failures. Spectral kurtosis, SK, is a parameter in the frequency domain indicating how the impulsiveness of a signal varies with frequency. Faults in rolling bearings give rise to a series of short impulse responses as the rolling elements strike faults, making SK potentially useful for determining frequency bands dominated by bearing fault signals. SK can provide a measure of the distance of the analyzed bearing from a healthy one, and provides information beyond that given by the power spectral density (psd). This paper aims to explore the estimation of spectral kurtosis using the short-time Fourier transform, known as the spectrogram. The estimation of SK is similar to the estimation of the psd; the estimator considered here is a model-free, plug-in estimator. Some numerical studies using simulations are discussed to support the methodology. The spectral kurtosis of some stationary signals is obtained analytically and used in the simulation study. Kurtosis in the time domain has been a popular tool for detecting non-normality; spectral kurtosis is its extension to the frequency domain. The relationship between time-domain and frequency-domain analysis is established through the power spectrum-autocovariance Fourier transform pair. The Fourier transform is the main tool for estimation in the frequency domain, and the power spectral density is estimated through the periodogram. In this paper, the short-time Fourier transform estimate of the spectral kurtosis is reviewed, and a bearing fault (inner ring and outer ring) is simulated. The bearing response, power spectrum, and spectral kurtosis are plotted to
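
    A sketch of the plug-in spectrogram estimator outlined above: compute the STFT, then for each frequency bin form the normalized fourth moment of the magnitudes across time. The simulated fault (a 3 kHz resonance excited by repetitive impulses) is an assumed stand-in for a bearing defect; an impulsive band shows elevated SK.

        import numpy as np
        from scipy.signal import stft

        rng = np.random.default_rng(9)
        fs, n = 20_000, 40_000
        x = 0.5 * rng.standard_normal(n)               # broadband background noise
        impacts = np.zeros(n)
        impacts[::400] = 5.0                           # repetitive fault impulses
        ir = np.exp(-np.arange(200) / 10.0) * np.sin(
            2 * np.pi * 3_000 * np.arange(200) / fs)   # decaying 3 kHz resonance
        x += np.convolve(impacts, ir, mode="same")

        f, t, Z = stft(x, fs=fs, nperseg=256)
        mag2 = np.abs(Z) ** 2
        # SK(f) = <|X|^4> / <|X|^2>^2 - 2 (complex-signal convention)
        sk = (mag2**2).mean(axis=1) / mag2.mean(axis=1) ** 2 - 2.0
        print(f[np.argmax(sk)])                        # band dominated by the fault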

  1. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf

  2. Ranking as parameter estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Guy, Tatiana Valentine

    2009-01-01

    Roč. 4, č. 2 (2009), s. 142-158 ISSN 1745-7645 R&D Projects: GA MŠk 2C06001; GA AV ČR 1ET100750401; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : ranking * Bayesian estimation * negotiation * modelling Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2009/AS/karny- ranking as parameter estimation.pdf

  3. Maximal combustion temperature estimation

    International Nuclear Information System (INIS)

    Golodova, E; Shchepakina, E

    2006-01-01

    This work is concerned with the phenomenon of delayed loss of stability and the estimation of the maximal temperature of safe combustion. Using the qualitative theory of singular perturbations and canard techniques we determine the maximal temperature on the trajectories located in the transition region between the slow combustion regime and the explosive one. This approach is used to estimate the maximal temperature of safe combustion in multi-phase combustion models

  4. The great environmental restoration cost estimating shootout: A blind test of three DOE cost estimating groups

    International Nuclear Information System (INIS)

    Klemen, Paul

    1992-01-01

    The cost of the Department of Energy's (DOE) Environmental Restoration (ER) Program has increased steadily over the last three years and, in the process, has drawn increasing scrutiny from Congress, the public, and government agencies such as the Office of Management and Budget and the General Accounting Office. Programmatic costs have been reviewed by many groups from within the DOE as well as from outside agencies. While cost may appear to be a universally applicable barometer of project conditions, it is actually a single-dimensional manifestation of a complex set of conditions. As such, variations in cost estimates can be caused by a variety of underlying factors such as changes in scope, schedule, performing organization, economic conditions, or regulatory environment. This paper will examine the subject of cost estimates by evaluating three different cost estimates prepared for a single project, including two estimates prepared by project proponents and another estimate prepared by a review team. The paper identifies the reasons for cost growth as measured by the different estimates and evaluates the ability of review estimates to measure the validity of costs. The comparative technique used to test the three cost estimates will identify the reasons for changes in the estimated cost over time and evaluate the ability of an independent review to correctly identify the reasons for cost growth and evaluate the reasonableness of the cost proposed by the project proponents. Recommendations are made for improved cost estimates and improved cost estimate reviews. Conclusions are reached regarding the differences in estimate results that can be attributed to differences in estimating techniques, the implications of these differences for decision makers, and circumstances that are unique to environmental cost estimating. (author)

  5. Space program management methods and tools

    CERN Document Server

    Spagnulo, Marcello; Balduccini, Mauro; Nasini, Federico

    2013-01-01

    Beginning with the basic elements that differentiate space programs from other management challenges, Space Program Management explains, through theory and examples of real programs from around the world, the philosophical and technical tools needed to successfully manage large, technically complex space programs in both the government and commercial environments. Chapters address systems and configuration management, the management of risk, estimation, measurement and control of both funding and the program schedule, and the structure of the aerospace industry worldwide.

  6. Implications of DOD Funds Execution Policy for Acquisition Program Management

    Science.gov (United States)

    2014-08-01

    package, the Automated Cost Estimating Integrated Tools (ACEIT). Using development cost estimation modeling techniques, the team also estimates...using the Automated Cost Estimating Integrated Tool (ACEIT)... An SQL database, known as the Program Financial Management System, currently used by the AH... ACEIT: Automated Cost Estimating Integrated Tools; AFOTEC: Air Force Operational Test and Evaluation Center; AFSOC: Air Force Special Operations Command

  7. Program History

    Science.gov (United States)

    Learn how the National Cancer Institute transitioned the former Cooperative Groups Program to the National Clinical Trials Network (NCTN) program. The NCTN gives funds and other support to cancer research organizations to conduct cancer clinical trials.

  8. Program auto

    International Nuclear Information System (INIS)

    Rawool-Sullivan, M.W.; Plagnol, E.

    1990-01-01

    The program AUTO was developed to be used in the analysis of dE vs E type spectra. This program is written in FORTRAN and calculates dE vs E lines in MeV. Provision is also made in the program to convert these lines from MeV to ADC channel numbers to facilitate comparison with the raw data from the experiments. Currently the output of this program can be plotted with the display program called VISU, but it can also be used independently of VISU, with little or no modification to the actual FORTRAN code. The program AUTO has many useful applications. In this article the program AUTO is described along with its applications.

  9. Pesticides Industry Sales and Usage 2006 and 2007 Market Estimates

    Science.gov (United States)

    These reports provide economic profile information on sectors producing and using pesticides covered by FIFRA mandated regulatory programs. The reports contain contemporary and historical data estimating values and amounts of active ingredients used here.

  10. Pesticides Industry Sales and Usage 2008 - 2012 Market Estimates

    Science.gov (United States)

    These reports provide economic profile information on sectors producing and using pesticides covered by FIFRA mandated regulatory programs. The reports contain contemporary and historical data estimating values and amounts of active ingredients used here.

  11. Federal Ballpark Estimator

    Data.gov (United States)

    Office of Personnel Management — The Federal Ballpark E$timate(R) was developed by the Employee Benefit Research Institute(R) and its American Savings Education Council(R) (ASEC(R)) program. It is...

  12. Quantifying Uncertainty in Soil Volume Estimates

    International Nuclear Information System (INIS)

    Roos, A.D.; Hays, D.C.; Johnson, R.L.; Durham, L.A.; Winters, M.

    2009-01-01

    Proper planning and design for remediating contaminated environmental media require an adequate understanding of the types of contaminants and the lateral and vertical extent of contamination. In the case of contaminated soils, this generally takes the form of volume estimates that are prepared as part of a Feasibility Study for Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites and/or as part of the remedial design. These estimates are typically single values representing what is believed to be the most likely volume of contaminated soil present at the site. These single-value estimates, however, do not convey the level of confidence associated with the estimates. Unfortunately, the experience has been that pre-remediation soil volume estimates often significantly underestimate the actual volume of contaminated soils that are encountered during the course of remediation. This underestimation has significant implications, both technically (e.g., inappropriate remedial designs) and programmatically (e.g., establishing technically defensible budget and schedule baselines). Argonne National Laboratory (Argonne) has developed a joint Bayesian/geostatistical methodology for estimating contaminated soil volumes based on sampling results, that also provides upper and lower probabilistic bounds on those volumes. This paper evaluates the performance of this method in a retrospective study that compares volume estimates derived using this technique with actual excavated soil volumes for select Formerly Utilized Sites Remedial Action Program (FUSRAP) Maywood properties that have completed remedial action by the U.S. Army Corps of Engineers (USACE) New York District. (authors)
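
    Although the abstract does not spell out the joint Bayesian/geostatistical machinery, the idea of attaching probabilistic bounds to a volume estimate can be sketched in a few lines. The Python fragment below is an illustrative stand-in, not Argonne's methodology: each grid cell's contamination probability gets a Beta posterior from made-up sample counts, and Monte Carlo draws yield percentile bounds on the total volume.

        import numpy as np

        # Toy probabilistic volume bounds (illustrative; not the Argonne method).
        rng = np.random.default_rng(0)
        cell_volume = 10.0                      # m^3 per grid cell (assumed)
        hits = np.array([3, 0, 1, 5, 0, 2])     # contaminated samples per cell (toy)
        n    = np.array([4, 3, 2, 5, 1, 4])     # total samples per cell (toy)

        # Beta(1 + hits, 1 + misses) posterior for each cell's contamination probability.
        draws = rng.beta(hits + 1, n - hits + 1, size=(10_000, hits.size))
        volumes = (draws * cell_volume).sum(axis=1)
        lo, mid, hi = np.percentile(volumes, [5, 50, 95])
        print(f"volume {mid:.0f} m^3, 90% interval ({lo:.0f}, {hi:.0f})")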

  13. Practical global oceanic state estimation

    Science.gov (United States)

    Wunsch, Carl; Heimbach, Patrick

    2007-06-01

    The problem of oceanographic state estimation, by means of an ocean general circulation model (GCM) and a multitude of observations, is described and contrasted with the meteorological process of data assimilation. In practice, all such methods reduce, on the computer, to forms of least-squares. The global oceanographic problem is at the present time focussed primarily on smoothing, rather than forecasting, and the data types are unlike meteorological ones. As formulated in the consortium Estimating the Circulation and Climate of the Ocean (ECCO), an automatic differentiation tool is used to calculate the so-called adjoint code of the GCM, and the method of Lagrange multipliers used to render the problem one of unconstrained least-squares minimization. Major problems today lie less with the numerical algorithms (least-squares problems can be solved by many means) than with the issues of data and model error. Results of ongoing calculations covering the period of the World Ocean Circulation Experiment, and including among other data, satellite altimetry from TOPEX/POSEIDON, Jason-1, ERS- 1/2, ENVISAT, and GFO, a global array of profiling floats from the Argo program, and satellite gravity data from the GRACE mission, suggest that the solutions are now useful for scientific purposes. Both methodology and applications are developing in a number of different directions.

  14. Single snapshot DOA estimation

    Science.gov (United States)

    Häcker, P.; Yang, B.

    2010-10-01

    In array signal processing, direction of arrival (DOA) estimation has been studied for decades. Many algorithms have been proposed and their performance has been studied thoroughly. Yet, most of these works are focused on the asymptotic case of a large number of snapshots. In automotive radar applications like driver assistance systems, however, only a small number of snapshots of the radar sensor array or, in the worst case, a single snapshot is available for DOA estimation. In this paper, we investigate and compare different DOA estimators with respect to their single snapshot performance. The main focus is on the estimation accuracy and the angular resolution in multi-target scenarios including difficult situations like correlated targets and large target power differences. We will show that some algorithms lose their ability to resolve targets or do not work properly at all. Other sophisticated algorithms do not show a superior performance as expected. It turns out that the deterministic maximum likelihood estimator is a good choice under these hard conditions.
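
    For one source and one snapshot, the deterministic maximum likelihood estimator the authors recommend reduces to a grid search over the normalized beamformer output. A minimal Python sketch under that assumption follows (uniform linear array with half-wavelength spacing; the array size, noise level, and grid are illustrative):

        import numpy as np

        def steering(theta, m, d=0.5):
            # ULA steering vector; element spacing d in wavelengths (assumed 0.5)
            return np.exp(2j * np.pi * d * np.arange(m) * np.sin(theta))

        def dml_single_snapshot(x, grid):
            # One-source DML for a single snapshot: maximize |a(th)^H x|^2 / ||a||^2
            m = x.size
            powers = [np.abs(steering(th, m).conj() @ x) ** 2 / m for th in grid]
            return grid[int(np.argmax(powers))]

        rng = np.random.default_rng(0)
        m, theta_true = 8, np.deg2rad(20.0)
        x = steering(theta_true, m) + 0.1 * (rng.standard_normal(m)
                                             + 1j * rng.standard_normal(m))
        grid = np.deg2rad(np.linspace(-90.0, 90.0, 1801))
        print(np.rad2deg(dml_single_snapshot(x, grid)))   # close to 20 degrees

    Resolving several correlated targets requires the joint multi-source search the paper evaluates; the one-source case above only conveys the cost-function idea.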

  15. FastChem: A computer program for efficient complex chemical equilibrium calculations in the neutral/ionized gas phase with applications to stellar and planetary atmospheres

    Science.gov (United States)

    Stock, Joachim W.; Kitzmann, Daniel; Patzer, A. Beate C.; Sedlmayr, Erwin

    2018-06-01

    For the calculation of complex neutral/ionized gas phase chemical equilibria, we present a semi-analytical versatile and efficient computer program, called FastChem. The applied method is based on the solution of a system of coupled nonlinear (and linear) algebraic equations, namely the law of mass action and the element conservation equations including charge balance, in many variables. Specifically, the system of equations is decomposed into a set of coupled nonlinear equations in one variable each, which are solved analytically whenever feasible to reduce computation time. Notably, the electron density is determined by using the method of Nelder and Mead at low temperatures. The program is written in object-oriented C++ which makes it easy to couple the code with other programs, although a stand-alone version is provided. FastChem can be used in parallel or sequentially and is available under the GNU General Public License version 3 at https://github.com/exoclime/FastChem together with several sample applications. The code has been successfully validated against previous studies and its convergence behavior has been tested even for extreme physical parameter ranges down to 100 K and up to 1000 bar. FastChem converges stably and robustly in even the most demanding chemical situations, which have sometimes posed extreme challenges for previous algorithms.
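
    FastChem itself is C++; as a toy illustration of the Nelder-Mead step mentioned above, the following Python sketch solves a one-species charge balance by minimizing the squared log-residual of the law of mass action with SciPy's Nelder-Mead. The constants K and N are invented, and the real code couples many such equations across elements.

        import numpy as np
        from scipy.optimize import minimize

        # Toy ionization equilibrium X <-> X+ + e-:
        #   n_X+ * n_e / n_X = K (mass action), n_X + n_X+ = N (element conservation),
        #   n_e = n_X+ (charge balance). K and N are illustrative numbers.
        K, N = 1e-4, 1.0

        def residual(log_ne):
            ne = 10.0 ** log_ne[0]
            n_ion = ne                 # charge balance
            n_neu = N - n_ion          # element conservation
            if n_neu <= 0.0:
                return 1e30
            return (np.log10(n_ion * ne / n_neu) - np.log10(K)) ** 2

        res = minimize(residual, x0=[-3.0], method="Nelder-Mead")
        print("n_e ~", 10.0 ** res.x[0])   # matches the root of ne^2/(N-ne) = K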

  16. G. H. Mead in the History of Sociological Ideas

    OpenAIRE

    Silva, Filipe Carreira da

    2006-01-01

    My aim is to discuss the history of the reception of George Herbert Mead’s ideas in sociology. After discussing the methodological debate between presentism and historicism, I address the interpretations of those responsible for Mead’s inclusion in the sociological canon: Herbert Blumer, Jürgen Habermas, and Hans Joas. In the concluding section, I assess these reconstructions of Mead’s thought and suggest an alternative more consistent with my initial methodological remarks. In particular, ...

  17. GEORGE HERBERT MEAD'S CONCEPTION OF SOCIAL MORALITY

    OpenAIRE

    Mustafa Kınağ

    2017-01-01

    According to George Herbert Mead, one of the classical American pragmatists, the self neither consists of pure mind nor involves an ontological bifurcation. Instead, he describes the self, as a purely analytical distinction, in terms of a subject self (the "I") and an object self (the "me"). However, we can conceive of neither the subject self without the object self, nor the object self without the subject self. The subject self does not show itself in our direct experience; it enters the field of cognition only after the act has taken place. It...

  18. Mead, Habermas, and Levinas: Cultivating Subjectivity in Education for Democracy

    Science.gov (United States)

    Zhao, Guoping

    2014-01-01

    For several decades education has struggled to find a way out of the entanglement of modernity, the premises and assumptions under which modern education has operated. According to Robin Usher and Richard Edwards, modern education, as the "dutiful child of the Enlightenment," has been "allotted a key role in the forming and shaping…

  19. 36 CFR 7.48 - Lake Mead National Recreation Area.

    Science.gov (United States)

    2010-07-01

    ... manufacturing of two-stroke engines. A person operating a personal watercraft that meets the EPA 2006 emission standards through the use of direct-injection two-stroke or four-stroke engines, or the equivalent thereof...

  20. Thermodynamic estimation: Ionic materials

    International Nuclear Information System (INIS)

    Glasser, Leslie

    2013-01-01

    Thermodynamics establishes equilibrium relations among thermodynamic parameters (“properties”) and delineates the effects of variation of the thermodynamic functions (typically temperature and pressure) on those parameters. However, classical thermodynamics does not provide values for the necessary thermodynamic properties, which must be established by extra-thermodynamic means such as experiment, theoretical calculation, or empirical estimation. While many values may be found in the numerous collected tables in the literature, these are necessarily incomplete because either the experimental measurements have not been made or the materials may be hypothetical. The current paper presents a number of simple and reliable estimation methods for thermodynamic properties, principally for ionic materials. The results may also be used as a check for obvious errors in published values. The estimation methods described are typically based on addition of properties of individual ions, or sums of properties of neutral ion groups (such as “double” salts, in the Simple Salt Approximation), or based upon correlations such as with formula unit volumes (Volume-Based Thermodynamics). - Graphical abstract: Thermodynamic properties of ionic materials may be readily estimated by summation of the properties of individual ions, by summation of the properties of ‘double salts’, and by correlation with formula volume. Such estimates may fill gaps in the literature, and may also be used as checks of published values. This simplicity arises from exploitation of the fact that repulsive energy terms are of short range and very similar across materials, while coulombic interactions provide a very large component of the attractive energy in ionic systems. - Highlights: • Estimation methods for thermodynamic properties of ionic materials are introduced. • Methods are based on summation of single ions, multiple salts, and correlations. • Heat capacity, entropy
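
    As a worked illustration of Volume-Based Thermodynamics, the sketch below estimates a standard entropy and a lattice energy from the formula-unit volume. The linear coefficients are the commonly quoted Jenkins/Glasser fit parameters; treat them as assumptions to verify against the original papers before serious use.

        # Volume-based thermodynamics (VBT) sketch with assumed fit coefficients.

        def entropy_vbt(vm_nm3, k=1360.0, c=15.0):
            """Standard entropy estimate (J K^-1 mol^-1) from Vm in nm^3."""
            return k * vm_nm3 + c

        def lattice_energy_vbt(vm_nm3, ionic_strength_factor=1.0,
                               alpha=117.3, beta=51.9):
            """Lattice energy estimate (kJ mol^-1): U = 2I(alpha/Vm^(1/3) + beta)."""
            return 2.0 * ionic_strength_factor * (alpha / vm_nm3 ** (1.0 / 3.0) + beta)

        # NaCl, Vm ~ 0.0447 nm^3 per formula unit:
        print(entropy_vbt(0.0447))         # ~76 vs tabulated ~72 J K^-1 mol^-1
        print(lattice_energy_vbt(0.0447))  # ~765 vs tabulated ~787 kJ mol^-1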

  1. Distribution load estimation - DLE

    Energy Technology Data Exchange (ETDEWEB)

    Seppaelae, A. [VTT Energy, Espoo (Finland)

    1996-12-31

    The load research project has produced statistical information in the form of load models to convert the figures of annual energy consumption to hourly load values. The reliability of load models is limited to a certain network because many local circumstances are different from utility to utility and time to time. Therefore there is a need to make improvements in the load models. Distribution load estimation (DLE) is the method developed here to improve load estimates from the load models. The method is also quite cheap to apply as it utilises information that is already available in SCADA systems
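
    The conversion step, and the kind of measurement-based correction DLE performs, can be caricatured in a few lines of Python. Everything here is invented for illustration (the normalized 8760-hour class profile, the customer energies, and the stand-in SCADA series); the actual method adjusts the load models statistically rather than by simple rescaling.

        import numpy as np

        rng = np.random.default_rng(1)
        profile = rng.random(8760)
        profile /= profile.sum()            # hypothetical class load model weights

        def hourly_load(annual_energy_kwh, profile):
            # annual energy (kWh) -> hourly series; kWh per 1 h step equals average kW
            return annual_energy_kwh * profile

        customers = [5000.0, 12000.0, 7300.0]                 # annual energies (made up)
        model_sum = sum(hourly_load(e, profile) for e in customers)
        scada = model_sum * (1 + 0.1 * rng.standard_normal(8760))  # stand-in measurement
        correction = scada / model_sum                        # hour-by-hour adjustment
        estimated = [hourly_load(e, profile) * correction for e in customers]
        print(estimated[0][:3])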

  2. Generalized estimating equations

    CERN Document Server

    Hardin, James W

    2002-01-01

    Although powerful and flexible, the method of generalized linear models (GLM) is limited in its ability to accurately deal with longitudinal and clustered data. Developed specifically to accommodate these data types, the method of Generalized Estimating Equations (GEE) extends the GLM algorithm to accommodate the correlated data encountered in health research, social science, biology, and other related fields.Generalized Estimating Equations provides the first complete treatment of GEE methodology in all of its variations. After introducing the subject and reviewing GLM, the authors examine th
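
    A minimal GEE fit is easy to demonstrate with statsmodels; the sketch below simulates clustered binary outcomes and fits a logit-link GEE with an exchangeable working correlation. The data and effect sizes are invented.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n_subj, n_rep = 50, 4
        subject = np.repeat(np.arange(n_subj), n_rep)
        x = rng.standard_normal(n_subj * n_rep)
        u = np.repeat(rng.standard_normal(n_subj), n_rep)   # shared cluster effect
        p = 1.0 / (1.0 + np.exp(-(0.8 * x + u)))
        y = (rng.random(x.size) < p).astype(int)
        df = pd.DataFrame({"y": y, "x": x, "subject": subject})

        model = sm.GEE.from_formula("y ~ x", groups="subject", data=df,
                                    family=sm.families.Binomial(),
                                    cov_struct=sm.cov_struct.Exchangeable())
        print(model.fit().summary())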

  3. Digital Quantum Estimation

    Science.gov (United States)

    Hassani, Majid; Macchiavello, Chiara; Maccone, Lorenzo

    2017-11-01

    Quantum metrology calculates the ultimate precision of all estimation strategies, measuring what is their root-mean-square error (RMSE) and their Fisher information. Here, instead, we ask how many bits of the parameter we can recover; namely, we derive an information-theoretic quantum metrology. In this setting, we redefine "Heisenberg bound" and "standard quantum limit" (the usual benchmarks in the quantum estimation theory) and show that the former can be attained only by sequential strategies or parallel strategies that employ entanglement among probes, whereas parallel-separable strategies are limited by the latter. We highlight the differences between this setting and the RMSE-based one.

  4. Distribution load estimation - DLE

    Energy Technology Data Exchange (ETDEWEB)

    Seppaelae, A [VTT Energy, Espoo (Finland)

    1997-12-31

    The load research project has produced statistical information in the form of load models to convert the figures of annual energy consumption to hourly load values. The reliability of load models is limited to a certain network because many local circumstances are different from utility to utility and time to time. Therefore there is a need to make improvements in the load models. Distribution load estimation (DLE) is the method developed here to improve load estimates from the load models. The method is also quite cheap to apply as it utilises information that is already available in SCADA systems

  5. Decision-Tree Program

    Science.gov (United States)

    Buntine, Wray

    1994-01-01

    IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
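
    IND itself is a C package; the scikit-learn sketch below only conveys the flavor of smoothed class-probability estimates at tree leaves, using a simple Laplace (add-one) correction rather than IND's actual Bayesian/MML machinery.

        import numpy as np
        from sklearn.datasets import load_iris
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_iris(return_X_y=True)
        tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

        leaf = tree.apply(X)                  # leaf id of every training sample
        k = np.unique(y).size

        def smoothed_proba(sample_leaf):
            # Laplace-smoothed class probabilities from the leaf's training counts
            counts = np.bincount(y[leaf == sample_leaf], minlength=k)
            return (counts + 1) / (counts.sum() + k)

        print(smoothed_proba(tree.apply(X[:1])[0]))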

  6. Detection probabilities for time-domain velocity estimation

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    1991-01-01

    programs, it is demonstrated that the probability of correct estimation depends on the signal-to-noise ratio, transducer bandwidth, number of A-lines and number of samples used in the correlation estimate. The influence of applying a stationary echo-canceler is explained. The echo canceling can be modeled...

  7. Organizational flexibility estimation

    OpenAIRE

    Komarynets, Sofia

    2013-01-01

    With the help of parametric estimation, an evaluation scale for organizational flexibility and its parameters was formed. Definite degrees of organizational flexibility and its parameters were determined for enterprises of the Lviv region. The enterprises were grouped under the existing scale, and specific recommendations for correcting the enterprises' behaviour were given.

  8. On Functional Calculus Estimates

    NARCIS (Netherlands)

    Schwenninger, F.L.

    2015-01-01

    This thesis presents various results within the field of operator theory that are formulated in estimates for functional calculi. Functional calculus is the general concept of defining operators of the form $f(A)$, where f is a function and $A$ is an operator, typically on a Banach space. Norm

  9. Estimation of vector velocity

    DEFF Research Database (Denmark)

    2000-01-01

    Using a pulsed ultrasound field, the two-dimensional velocity vector can be determined with the invention. The method uses a transversally modulated ultrasound field for probing the moving medium under investigation. A modified autocorrelation approach is used in the velocity estimation. The new...
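
    The transversally modulated field and the modified autocorrelation estimator are the invention's contribution; as background, the classical lag-one autocorrelation (Kasai) estimator that it modifies can be sketched as follows. The carrier frequency, pulse repetition frequency, and toy signal are illustrative, and this baseline recovers only the axial velocity component.

        import numpy as np

        def kasai_velocity(iq, f0=5e6, fprf=5e3, c=1540.0):
            # iq: complex slow-time samples at one depth
            r1 = np.mean(iq[1:] * np.conj(iq[:-1]))   # lag-one autocorrelation
            return c * fprf * np.angle(r1) / (4.0 * np.pi * f0)

        # toy check: a scatterer moving axially at 0.2 m/s
        v, f0, fprf, c = 0.2, 5e6, 5e3, 1540.0
        n = np.arange(32)
        iq = np.exp(1j * 4.0 * np.pi * f0 * v / c * n / fprf)
        print(kasai_velocity(iq))   # ~0.2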

  10. Quantifying IT estimation risks

    NARCIS (Netherlands)

    Kulk, G.P.; Peters, R.J.; Verhoef, C.

    2009-01-01

    A statistical method is proposed for quantifying the impact of factors that influence the quality of the estimation of costs for IT-enabled business projects. We call these factors risk drivers as they influence the risk of the misestimation of project costs. The method can effortlessly be

  11. Numerical Estimation in Preschoolers

    Science.gov (United States)

    Berteletti, Ilaria; Lucangeli, Daniela; Piazza, Manuela; Dehaene, Stanislas; Zorzi, Marco

    2010-01-01

    Children's sense of numbers before formal education is thought to rely on an approximate number system based on logarithmically compressed analog magnitudes that increases in resolution throughout childhood. School-age children performing a numerical estimation task have been shown to increasingly rely on a formally appropriate, linear…

  12. Estimating Gender Wage Gaps

    Science.gov (United States)

    McDonald, Judith A.; Thornton, Robert J.

    2011-01-01

    Course research projects that use easy-to-access real-world data and that generate findings with which undergraduate students can readily identify are hard to find. The authors describe a project that requires students to estimate the current female-male earnings gap for new college graduates. The project also enables students to see to what…

  13. Fast fundamental frequency estimation

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm; Jensen, Jesper Rindom

    2017-01-01

    Modelling signals as being periodic is common in many applications. Such periodic signals can be represented by a weighted sum of sinusoids with frequencies being an integer multiple of the fundamental frequency. Due to its widespread use, numerous methods have been proposed to estimate the funda...
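
    A brute-force harmonic-summation estimator shows the cost function that fast methods of this kind accelerate; the sketch below is not the paper's algorithm, and the FFT size, harmonic count, and search grid are arbitrary choices.

        import numpy as np

        def f0_harmonic_summation(x, fs, f0_grid, n_harm=5, nfft=4096):
            # pick the f0 whose integer harmonics collect the most FFT power
            spec = np.abs(np.fft.rfft(x, nfft)) ** 2
            freqs = np.fft.rfftfreq(nfft, 1.0 / fs)
            def cost(f0):
                idx = [np.argmin(np.abs(freqs - k * f0)) for k in range(1, n_harm + 1)]
                return spec[idx].sum()
            return f0_grid[int(np.argmax([cost(f) for f in f0_grid]))]

        fs = 8000.0
        t = np.arange(0, 0.1, 1.0 / fs)
        x = sum(np.sin(2 * np.pi * 200.0 * k * t) / k for k in range(1, 4))
        print(f0_harmonic_summation(x, fs, np.linspace(80.0, 400.0, 321)))  # ~200 Hz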

  14. On Gnostical Estimates

    Czech Academy of Sciences Publication Activity Database

    Fabián, Zdeněk

    2017-01-01

    Roč. 56, č. 2 (2017), s. 125-132 ISSN 0973-1377 Institutional support: RVO:67985807 Keywords : gnostic theory * statistics * robust estimates Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability http://www.ceser.in/ceserp/index.php/ijamas/article/view/4707

  15. Estimation of morbidity effects

    International Nuclear Information System (INIS)

    Ostro, B.

    1994-01-01

    Many researchers have related exposure to ambient air pollution to respiratory morbidity. To be included in this review and analysis, however, several criteria had to be met. First, a careful study design and a methodology that generated quantitative dose-response estimates were required. Therefore, there was a focus on time-series regression analyses relating daily incidence of morbidity to air pollution in a single city or metropolitan area. Studies that used weekly or monthly average concentrations or that involved particulate measurements in poorly characterized metropolitan areas (e.g., one monitor representing a large region) were not included in this review. Second, studies that minimized confounding and omitted variables were included. For example, research that compared two cities or regions and characterized them as 'high' and 'low' pollution areas was not included because of potential confounding by other factors in the respective areas. Third, concern for the effects of seasonality and weather had to be demonstrated. This could be accomplished by either stratifying and analyzing the data by season, by examining the independent effects of temperature and humidity, and/or by correcting the model for possible autocorrelation. A fourth criterion for study inclusion was that the study had to include a reasonably complete analysis of the data. Such analysis would include a careful exploration of the primary hypothesis as well as possible examination of the robustness and sensitivity of the results to alternative functional forms, specifications, and influential data points. When studies reported the results of these alternative analyses, the quantitative estimates that were judged as most representative of the overall findings were those that were summarized in this paper. Finally, for inclusion in the review of particulate matter, the study had to provide a measure of particle concentration that could be converted into PM10, particulate matter below 10

  16. Estimating fuel consumption during prescribed fires in Arkansas

    Science.gov (United States)

    Virginia L. McDaniel; James M. Guldin; Roger W. Perry

    2012-01-01

    While prescribed fire is essential to maintaining numerous plant communities, fine particles produced in smoke can impair human health and reduce visibility in scenic areas. The Arkansas Smoke Management Program was established to mitigate the impacts of smoke from prescribed fires. This program uses fuel loading and consumption estimates from standard fire-behavior...

  17. Comparison of Four Estimators under sampling without Replacement

    African Journals Online (AJOL)

    The results were obtained using a program written in Microsoft Visual C++ programming language. It was observed that the two-stage sampling under unequal probabilities without replacement is always better than the other three estimators considered. Keywords: Unequal probability sampling, two-stage sampling, ...

  18. Implementation of a Personal Computer Based Parameter Estimation Program

    Science.gov (United States)

    1992-03-01

    [OCR residue from the report documentation page; the recoverable fragment is a symbol nomenclature: L, M, N denote X-, Y-, Z-axis moment components; u, v, w denote X-, Y-, Z-axis velocity components; V denotes the velocity vector and its magnitude; further entries cover the turbulence scale, gust-induced vertical velocity, and accelerometer distances.]

  19. Estimation of relative effectiveness of phylogenetic programs by machine learning.

    Science.gov (United States)

    Krivozubov, Mikhail; Goebels, Florian; Spirin, Sergei

    2014-04-01

    Reconstruction of the phylogeny of a protein family from a sequence alignment can produce results of different quality. Our goal is to predict the quality of phylogeny reconstruction based on features that can be extracted from the input alignment. We used the Fitch-Margoliash (FM) method of phylogeny reconstruction and a random forest as the predictor. For training and testing the predictor, alignments of orthologous series (OS) were used, for which the result of phylogeny reconstruction can be evaluated by comparison with the trees of the corresponding organisms. Our results show that the quality of phylogeny reconstruction can be predicted with more than 80% precision. Also, we tried to predict which phylogeny reconstruction method, FM or UPGMA, is better for a particular alignment. With the feature set used, among alignments for which the obtained predictor predicts a better performance of UPGMA, 56% really give a better result with UPGMA. Taking into account that only 34% of alignments in our testing set perform better with UPGMA, this result shows a principal possibility of predicting the better phylogeny reconstruction method based on features of a sequence alignment.

  20. Evaluating and Estimating the WCET Criticality Metric

    DEFF Research Database (Denmark)

    Jordan, Alexander

    2014-01-01

    a programmer (or compiler) from targeting optimizations the right way. A possible resort is to use a metric that targets WCET and which can be efficiently computed for all code parts of a program. Similar to dynamic profiling techniques, which execute code with input that is typically expected...... for the application, based on WCET analysis we can indicate how critical a code fragment is, in relation to the worst-case bound. Computing such a metric on top of static analysis, incurs a certain overhead though, which increases with the complexity of the underlying WCET analysis. We present our approach...... to estimate the Criticality metric, by relaxing the precision of WCET analysis. Through this, we can reduce analysis time by orders of magnitude, while only introducing minor error. To evaluate our estimation approach and share our garnered experience using the metric, we evaluate real-time programs, which...

  1. Budget estimates: Fiscal years, 1990--1991

    International Nuclear Information System (INIS)

    1989-01-01

    The budget estimates for the NRC for fiscal year 1990 provide for obligations of $475,000,000, to be funded in total by two new appropriations---one is NRC's Salaries and Expenses appropriation for $472,100,000 and the other is NRC's Office of the Inspector General appropriation of $2,900,000. Of the funds appropriated to the NRC's Salaries and Expenses, $23,195,000 shall be derived from the Nuclear Waste Fund. The sum appropriated to the NRC's Salaries and Expenses shall be reduced by the amount of revenues received during fiscal year 1990 from licensing fees, inspection services, other services and collections, and from the Nuclear Waste Fund, excluding those moneys received for the cooperative nuclear safety research program, services rendered to foreign governments and international organizations, and the material and information access authorization programs, so as to result in a final fiscal year 1990 appropriation estimated at not more than $292,155,000

  2. Material Programming

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Tsaknaki, Vasiliki

    2017-01-01

    . Consequently we ask what the practice of programming and giving form to such materials would be like? How would we be able to familiarize ourselves with the dynamics of these materials and their different combinations of cause and effect? Which tools would we need and what would they look like? Will we program......, and color, but additionally being capable of sensing, actuating, and computing. Indeed, computers will not be things in and by themselves, but embedded into the materials that make up our surroundings. This also means that the way we interact with computers and the way we program them, will change...... these computational composites through external computers and then transfer the code them, or will the programming happen closer to the materials? In this feature we outline a new research program that floats between imagined futures and the development of a material programming practice....

  3. Measurement control program

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    A measurement control program for the model plant is described. The discussion includes the technical basis for such a program, the application of measurement control principles to each measurement, and the use of special experiments to estimate measurement error parameters for difficult-to-measure materials. The discussion also describes the statistical aspects of the program, and the documentation procedures used to record, maintain, and process the basic data. The purpose of the session is to enable participants to: (1) understand the criteria for this type of a measurement control program; (2) understand the kinds of physical standards required for the various measurement processes, e.g., weighing, analytical, NDA; (3) understand the need for and importance of a measurement control program; (4) understand the need for special experiments to provide an improved basis for the measurement of difficult-to-measure materials; (5) understand the general scope of the program's statistical aspects; and (6) understand the basis and scope of the documentation procedures

  4. Complementary programs for stochastic analysis of radionuclide transport

    International Nuclear Information System (INIS)

    Gomez Hernandez, J.J.

    1993-01-01

    These programs permit risk analysis using parametric and non-parametric techniques. The programs are presented in two groups: 1) variable estimation through indicator kriging and variable estimation by cokriging; 2) variable simulation with multi-Gaussian and non-Gaussian stochastic models. This report includes new programs for non-parametric geostatistics

  5. Effective Programming

    DEFF Research Database (Denmark)

    Frost, Jacob

    To investigate the use of VTLoE as a basis for formal derivation of functional programs with effects. As a part of the process, a number of issues central to effective formal programming are considered. In particular it is considered how to develop a proof system suitable for practical reasoning......, how to implement this system in the generic proof assistant Isabelle and finally how to apply the logic and the implementation to programming....

  6. Program Fullerene

    DEFF Research Database (Denmark)

    Wirz, Lukas; Peter, Schwerdtfeger,; Avery, James Emil

    2013-01-01

    Fullerene (Version 4.4), is a general purpose open-source program that can generate any fullerene isomer, perform topological and graph theoretical analysis, as well as calculate a number of physical and chemical properties. The program creates symmetric planar drawings of the fullerene graph, an......-Fowler, and Brinkmann-Fowler vertex insertions. The program is written in standard Fortran and C++, and can easily be installed on a Linux or UNIX environment....

  7. Programming F#

    CERN Document Server

    Smith, Chris

    2009-01-01

    Why learn F#? This multi-paradigm language not only offers you an enormous productivity boost through functional programming, it also lets you develop applications using your existing object-oriented and imperative programming skills. With Programming F#, you'll quickly discover the many advantages of Microsoft's new language, which includes access to all the great tools and libraries of the .NET platform. Learn how to reap the benefits of functional programming for your next project -- whether it's quantitative computing, large-scale data exploration, or even a pursuit of your own. With th

  8. PLC Programming

    International Nuclear Information System (INIS)

    Lee, Seong Jae; Wi, Seong Dong; Yoo, Jong Seon; Kim, Se Chan

    2001-02-01

    This book covers PLC programming for KGL-WIN, with a summary of PLCs; the performance and functions of PLCs, including the characteristics of KGL-WIN, the connection method with the PLC, and the basic performance of the K200S/K300S/K1000S; diagrams of input and output H/W; writing a project; starting the program; editing the program; on-line functions; debugging; and instructions covering control, timers and counters, data transmission, comparison, rotation and moving, system, data operation, data conversion, and application programs.

  9. Programming Interactivity

    CERN Document Server

    Noble, Joshua

    2009-01-01

    Make cool stuff. If you're a designer or artist without a lot of programming experience, this book will teach you to work with 2D and 3D graphics, sound, physical interaction, and electronic circuitry to create all sorts of interesting and compelling experiences -- online and off. Programming Interactivity explains programming and electrical engineering basics, and introduces three freely available tools created specifically for artists and designers: Processing, a Java-based programming language and environment for building projects on the desktop, Web, or mobile phones; Arduino, a system t

  10. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...

  11. Distribution load estimation (DLE)

    Energy Technology Data Exchange (ETDEWEB)

    Seppaelae, A; Lehtonen, M [VTT Energy, Espoo (Finland)

    1998-08-01

    The load research has produced customer class load models to convert the customers' annual energy consumption to hourly load values. The reliability of load models applied from a nation-wide sample is limited in any specific network because many local circumstances are different from utility to utility and time to time. Therefore there is a need to find improvements to the load models or, in general, improvements to the load estimates. In Distribution Load Estimation (DLE) the measurements from the network are utilized to improve the customer class load models. The results of DLE will be new load models that better correspond to the loading of the distribution network but are still close to the original load models obtained by load research. The principal data flow of DLE is presented

  12. Abundance estimation and conservation biology

    Science.gov (United States)

    Nichols, J.D.; MacKenzie, D.I.

    2004-01-01

    Abundance is the state variable of interest in most population–level ecological research and in most programs involving management and conservation of animal populations. Abundance is the single parameter of interest in capture–recapture models for closed populations (e.g., Darroch, 1958; Otis et al., 1978; Chao, 2001). The initial capture–recapture models developed for partially (Darroch, 1959) and completely (Jolly, 1965; Seber, 1965) open populations represented efforts to relax the restrictive assumption of population closure for the purpose of estimating abundance. Subsequent emphases in capture–recapture work were on survival rate estimation in the 1970’s and 1980’s (e.g., Burnham et al., 1987; Lebreton et al.,1992), and on movement estimation in the 1990’s (Brownie et al., 1993; Schwarz et al., 1993). However, from the mid–1990’s until the present time, capture–recapture investigators have expressed a renewed interest in abundance and related parameters (Pradel, 1996; Schwarz & Arnason, 1996; Schwarz, 2001). The focus of this session was abundance, and presentations covered topics ranging from estimation of abundance and rate of change in abundance, to inferences about the demographic processes underlying changes in abundance, to occupancy as a surrogate of abundance. The plenary paper by Link & Barker (2004) is provocative and very interesting, and it contains a number of important messages and suggestions. Link & Barker (2004) emphasize that the increasing complexity of capture–recapture models has resulted in large numbers of parameters and that a challenge to ecologists is to extract ecological signals from this complexity. They offer hierarchical models as a natural approach to inference in which traditional parameters are viewed as realizations of stochastic processes. These processes are governed by hyperparameters, and the inferential approach focuses on these hyperparameters. Link & Barker (2004) also suggest that our attention
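
    For a closed population with two capture occasions, the classical abundance estimate is the Lincoln-Petersen ratio; a sketch of Chapman's bias-corrected version follows, with invented capture counts.

        def chapman_abundance(n1, n2, m2):
            # n1 marked in session 1; n2 caught in session 2; m2 marked among them
            return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

        # 120 marked, 150 caught later of which 30 carried marks -> ~588 animals
        print(chapman_abundance(120, 150, 30))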

  13. Abundance estimation and Conservation Biology

    Directory of Open Access Journals (Sweden)

    Nichols, J. D.

    2004-06-01

    Full Text Available Abundance is the state variable of interest in most population–level ecological research and in most programs involving management and conservation of animal populations. Abundance is the single parameter of interest in capture–recapture models for closed populations (e.g., Darroch, 1958; Otis et al., 1978; Chao, 2001). The initial capture–recapture models developed for partially (Darroch, 1959) and completely (Jolly, 1965; Seber, 1965) open populations represented efforts to relax the restrictive assumption of population closure for the purpose of estimating abundance. Subsequent emphases in capture–recapture work were on survival rate estimation in the 1970’s and 1980’s (e.g., Burnham et al., 1987; Lebreton et al., 1992), and on movement estimation in the 1990’s (Brownie et al., 1993; Schwarz et al., 1993). However, from the mid–1990’s until the present time, capture–recapture investigators have expressed a renewed interest in abundance and related parameters (Pradel, 1996; Schwarz & Arnason, 1996; Schwarz, 2001). The focus of this session was abundance, and presentations covered topics ranging from estimation of abundance and rate of change in abundance, to inferences about the demographic processes underlying changes in abundance, to occupancy as a surrogate of abundance. The plenary paper by Link & Barker (2004) is provocative and very interesting, and it contains a number of important messages and suggestions. Link & Barker (2004) emphasize that the increasing complexity of capture–recapture models has resulted in large numbers of parameters and that a challenge to ecologists is to extract ecological signals from this complexity. They offer hierarchical models as a natural approach to inference in which traditional parameters are viewed as realizations of stochastic processes. These processes are governed by hyperparameters, and the inferential approach focuses on these hyperparameters. Link & Barker (2004) also suggest that

  14. Variance Function Estimation. Revision.

    Science.gov (United States)

    1987-03-01

    [OCR residue from the report documentation page and a garbled excerpt of the estimating equations; the recoverable fragment defines refined estimators, given arbitrary preliminary estimators of the model parameters, as solutions of the report's equation (4.1).]

  15. Estimating Risk Parameters

    OpenAIRE

    Aswath Damodaran

    1999-01-01

    Over the last three decades, the capital asset pricing model has occupied a central and often controversial place in most corporate finance analysts’ tool chests. The model requires three inputs to compute expected returns – a riskfree rate, a beta for an asset and an expected risk premium for the market portfolio (over and above the riskfree rate). Betas are estimated, by most practitioners, by regressing returns on an asset against a stock index, with the slope of the regression being the b...
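
    The regression described above is short to reproduce; the sketch below estimates beta as the covariance/variance slope of simulated asset returns against market returns (all numbers invented).

        import numpy as np

        def estimate_beta(asset_returns, market_returns):
            # slope of the return regression: cov(r_a, r_m) / var(r_m)
            cov = np.cov(asset_returns, market_returns)
            return cov[0, 1] / cov[1, 1]

        rng = np.random.default_rng(0)
        rm = rng.normal(0.005, 0.04, 260)                 # weekly market returns (toy)
        ra = 0.001 + 1.3 * rm + rng.normal(0.0, 0.02, 260)
        print(estimate_beta(ra, rm))                      # ~1.3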

  16. Estimating Venezuela's Latent Inflation

    OpenAIRE

    Juan Carlos Bencomo; Hugo J. Montesinos; Hugo M. Montesinos; Jose Roberto Rondo

    2011-01-01

    Percent variation of the consumer price index (CPI) is the inflation indicator most widely used. This indicator, however, has some drawbacks. In addition to measurement errors of the CPI, there is a problem of incongruence between the definition of inflation as a sustained and generalized increase of prices and the traditional measure associated with the CPI. We use data from 1991 to 2005 to estimate a complementary indicator for Venezuela, the highest inflation country in Latin America. Late...

  17. Chernobyl source term estimation

    International Nuclear Information System (INIS)

    Gudiksen, P.H.; Harvey, T.F.; Lange, R.

    1990-09-01

    The Chernobyl source term available for long-range transport was estimated by integration of radiological measurements with atmospheric dispersion modeling and by reactor core radionuclide inventory estimation in conjunction with WASH-1400 release fractions associated with specific chemical groups. The model simulations revealed that the radioactive cloud became segmented during the first day, with the lower section heading toward Scandinavia and the upper part heading in a southeasterly direction with subsequent transport across Asia to Japan, the North Pacific, and the west coast of North America. By optimizing the agreement between the observed cloud arrival times and duration of peak concentrations measured over Europe, Japan, Kuwait, and the US with the model predicted concentrations, it was possible to derive source term estimates for those radionuclides measured in airborne radioactivity. This was extended to radionuclides that were largely unmeasured in the environment by performing a reactor core radionuclide inventory analysis to obtain release fractions for the various chemical transport groups. These analyses indicated that essentially all of the noble gases, 60% of the radioiodines, 40% of the radiocesium, 10% of the tellurium and about 1% or less of the more refractory elements were released. These estimates are in excellent agreement with those obtained on the basis of worldwide deposition measurements. The Chernobyl source term was several orders of magnitude greater than those associated with the Windscale and TMI reactor accidents. However, the 137 Cs from the Chernobyl event is about 6% of that released by the US and USSR atmospheric nuclear weapon tests, while the 131 I and 90 Sr released by the Chernobyl accident was only about 0.1% of that released by the weapon tests. 13 refs., 2 figs., 7 tabs

  18. Estimating Corporate Yield Curves

    OpenAIRE

    Antionio Diaz; Frank Skinner

    2001-01-01

    This paper represents the first study of retail deposit spreads of UK financial institutions using stochastic interest rate modelling and the market comparable approach. By replicating quoted fixed deposit rates using the Black Derman and Toy (1990) stochastic interest rate model, we find that the spread between fixed and variable rates of interest can be modeled (and priced) using an interest rate swap analogy. We also find that we can estimate an individual bank deposit yield curve as a spr...

  19. Estimation of inspection effort

    International Nuclear Information System (INIS)

    Mullen, M.F.; Wincek, M.A.

    1979-06-01

    An overview of IAEA inspection activities is presented, and the problem of evaluating the effectiveness of an inspection is discussed. Two models are described - an effort model and an effectiveness model. The effort model breaks the IAEA's inspection effort into components; the amount of effort required for each component is estimated; and the total effort is determined by summing the effort for each component. The effectiveness model quantifies the effectiveness of inspections in terms of probabilities of detection and quantities of material to be detected, if diverted over a specific period. The method is applied to a 200 metric ton per year low-enriched uranium fuel fabrication facility. A description of the model plant is presented, a safeguards approach is outlined, and sampling plans are calculated. The required inspection effort is estimated and the results are compared to IAEA estimates. Some other applications of the method are discussed briefly. Examples are presented which demonstrate how the method might be useful in formulating guidelines for inspection planning and in establishing technical criteria for safeguards implementation

  20. Qualitative Robustness in Estimation

    Directory of Open Access Journals (Sweden)

    Mohammed Nasser

    2012-07-01

    Full Text Available Qualitative robustness, influence function, and breakdown point are three main concepts to judge an estimator from the viewpoint of robust estimation. It is important as well as interesting to study the relation among them. This article attempts to present the concept of qualitative robustness as forwarded by its first proponents and its later development. It illustrates the intricacies of qualitative robustness and its relation with consistency, and also tries to remove commonly believed misunderstandings about the relation between the influence function and qualitative robustness, citing some examples from the literature and providing a new counter-example. At the end it places a useful finite and a simulated version of the qualitative robustness index (QRI). In order to assess the performance of the proposed measures, we have compared fifteen estimators of the correlation coefficient using simulated as well as real data sets.

  1. Estimating directional epistasis

    Science.gov (United States)

    Le Rouzic, Arnaud

    2014-01-01

    Epistasis, i.e., the fact that gene effects depend on the genetic background, is a direct consequence of the complexity of genetic architectures. Despite this, most of the models used in evolutionary and quantitative genetics pay scant attention to genetic interactions. For instance, the traditional decomposition of genetic effects models epistasis as noise around the evolutionarily-relevant additive effects. Such an approach is only valid if it is assumed that there is no general pattern among interactions—a highly speculative scenario. Systematic interactions generate directional epistasis, which has major evolutionary consequences. In spite of its importance, directional epistasis is rarely measured or reported by quantitative geneticists, not only because its relevance is generally ignored, but also due to the lack of simple, operational, and accessible methods for its estimation. This paper describes conceptual and statistical tools that can be used to estimate directional epistasis from various kinds of data, including QTL mapping results, phenotype measurements in mutants, and artificial selection responses. As an illustration, I measured directional epistasis from a real-life example. I then discuss the interpretation of the estimates, showing how they can be used to draw meaningful biological inferences. PMID:25071828

  2. Adaptive Nonparametric Variance Estimation for a Ratio Estimator ...

    African Journals Online (AJOL)

    Kernel estimators for smooth curves require modifications when estimating near end points of the support, both for practical and asymptotic reasons. The construction of such boundary kernels as solutions of variational problem is a difficult exercise. For estimating the error variance of a ratio estimator, we suggest an ...

  3. 75 FR 44 - Temporary Suspension of the Population Estimates and Income Estimates Challenge Programs

    Science.gov (United States)

    2010-01-04

    ... final rule may be submitted to Dr. Enrique Lamas, Chief of the Population Division, through any of the...-mailed to: Enrique.Lamas@census.gov . Mail: Correspondence may be mailed to: Dr. Enrique Lamas, Chief...

  4. Computer Programs.

    Science.gov (United States)

    Anderson, Tiffoni

    This module provides information on development and use of a Material Safety Data Sheet (MSDS) software program that seeks to link literacy skills education, safety training, and human-centered design. Section 1 discusses the development of the software program that helps workers understand the MSDSs that accompany the chemicals with which they…

  5. BASIC Programming.

    Science.gov (United States)

    Jennings, Carol Ann

    Designed for use by both secondary- and postsecondary-level business teachers, this curriculum guide consists of 10 units of instructional materials dealing with Beginner's All-purpose Symbolic Instruction Code (BASIC) programming. Topics of the individual lessons are numbering BASIC programs and using the PRINT, END, and REM statements; system…

  6. Choreographic Programming

    DEFF Research Database (Denmark)

    Montesi, Fabrizio

    , as they offer a concise view of the message flows enacted by a system. For this reason, in the last decade choreographies have been used in the development of programming languages, giving rise to a programming paradigm that in this dissertation we refer to as Choreographic Programming. Recent studies show...... endpoint described in a choreography can then be automatically generated, ensuring that such implementations are safe by construction. However, current formal models for choreographies do not deal with critical aspects of distributed programming, such as asynchrony, mobility, modularity, and multiparty...... sessions; it remains thus unclear whether choreographies can still guarantee safety when dealing with such nontrivial features. This PhD dissertation argues for the suitability of choreographic programming as a paradigm for the development of safe distributed systems. We proceed by investigating its...

  7. Modeling and Parameter Estimation of a Small Wind Generation System

    Directory of Open Access Journals (Sweden)

    Carlos A. Ramírez Gómez

    2013-11-01

    Full Text Available The modeling and parameter estimation of a small wind generation system is presented in this paper. The system consists of a wind turbine, a permanent magnet synchronous generator, a three-phase rectifier, and a direct current load. In order to estimate the parameters, wind speed data were registered at a weather station located on the Fraternidad Campus at ITM. The wind speed data were applied to a reference model programmed in PSIM software. From that simulation, variables were registered to estimate the parameters. The wind generation system model together with the estimated parameters is an excellent representation of the detailed model, but the estimated model offers a higher flexibility than the model programmed in PSIM software.
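
    As a flavor of the parameter-estimation step, the sketch below fits a turbine power coefficient Cp by least squares from simulated wind-speed and power registrations. The air density, rotor radius, and data are assumptions; the actual work fits a full turbine/generator/rectifier model against a PSIM reference.

        import numpy as np

        RHO, RADIUS = 1.225, 1.5        # air density (kg/m^3), blade radius (m) - assumed
        AREA = np.pi * RADIUS ** 2

        def turbine_power(v, cp):
            # mechanical power: P = 0.5 * rho * A * Cp * v^3
            return 0.5 * RHO * AREA * cp * v ** 3

        rng = np.random.default_rng(0)
        v = rng.uniform(3.0, 12.0, 200)                       # measured wind speeds
        p = turbine_power(v, 0.4) * (1 + 0.05 * rng.standard_normal(200))
        basis = 0.5 * RHO * AREA * v ** 3
        cp_hat = (basis @ p) / (basis @ basis)                # one-parameter least squares
        print(cp_hat)                                         # ~0.4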

  8. PHAZE, Parametric Hazard Function Estimation

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: Phaze performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions. 2 - Methods: PHAZE assumes that the failures of a component follow a time-dependent (or non-homogenous) Poisson process and that the failure counts in non-overlapping time intervals are independent. Implicit in the independence property is the assumption that the component is restored to service immediately after any failure, with negligible repair time. The failures of one component are assumed to be independent of those of another component; a proportional hazards model is used. Data for a component are called time censored if the component is observed for a fixed time-period, or plant records covering a fixed time-period are examined, and the failure times are recorded. The number of these failures is random. Data are called failure censored if the component is kept in service until a predetermined number of failures has occurred, at which time the component is removed from service. In this case, the number of failures is fixed, but the end of the observation period equals the final failure time and is random. A typical PHAZE session consists of reading failure data from a file prepared previously, selecting one of the three models, and performing data analysis (i.e., performing the usual statistical inference about the parameters of the model, with special emphasis on the parameter(s) that determine whether the hazard function is increasing). The final goals of the inference are a point estimate
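
    For the Weibull (power-law) hazard with time-censored data, the maximum likelihood estimators have a closed form; the sketch below implements the standard Crow-AMSAA formulas, which match one of the three model families PHAZE covers but are not taken from PHAZE's code. The failure times are invented.

        import numpy as np

        def powerlaw_nhpp_mle(times, T):
            # NHPP intensity lambda(t) = a * b * t**(b - 1) observed on [0, T];
            # b > 1 indicates an increasing failure rate.
            t = np.asarray(times, dtype=float)
            n = t.size
            b_hat = n / np.sum(np.log(T / t))
            a_hat = n / T ** b_hat
            return a_hat, b_hat

        a, b = powerlaw_nhpp_mle([1.2, 3.7, 5.1, 6.0, 8.4, 9.3], T=10.0)
        print(a, b, "increasing" if b > 1 else "decreasing/constant")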

  9. CYPROS - Cybernetic Program Packages

    Directory of Open Access Journals (Sweden)

    Arne Tyssø

    1980-10-01

    Full Text Available CYPROS is an interactive program system consisting of a number of special purpose packages for simulation, identification, parameter estimation and control system design. The programming language is standard FORTRAN IV and the system is implemented on a medium size computer system (Nord-10). The system is interactive and program control is obtained by the use of numeric terminals. Output is rapidly examined by extensive use of video colour graphics. The subroutines included in the packages are designed and documented according to standardization rules given by the SCL (Scandinavian Control Library) organization. This simplifies the exchange of subroutines throughout the SCL system. Also, this makes the packages attractive for implementation by industrial users. In the simulation package, different integration methods are available and it can be easily used for off-line, as well as real time, simulation problems. The identification package consists of programs for single-input/single-output and multivariable problems. Both transfer function models and state space models can be handled. Optimal test signals can be designed. The control package consists of programs based on multivariable time domain and frequency domain methods for analysis and design. In addition, there is a package for matrix and time series manipulation. CYPROS has been applied successfully to industrial problems of various kinds, and parts of the system have already been implemented on different computers in industry. This paper will, in some detail, describe the use and the contents of the packages and some examples of application will be discussed.

  10. Piping research program plan

    International Nuclear Information System (INIS)

    1988-09-01

    This document presents the piping research program plan for the Structural and Seismic Engineering Branch and the Materials Engineering Branch of the Division of Engineering, Office of Nuclear Regulatory Research. The plan describes the research to be performed in the areas of piping design criteria, environmentally assisted cracking, pipe fracture, and leak detection and leak rate estimation. The piping research program addresses the regulatory issues regarding piping design and piping integrity facing the NRC today and in the foreseeable future. The plan discusses the regulatory issues and needs for the research; the objectives, key aspects, and schedule for each research project, or group of projects focusing on a specific topic; and, finally, the integration of the research areas into the regulatory process is described. The plan presents a snap-shot of the piping research program as it exists today. However, the program plan will change as the regulatory issues and needs change. Consequently, this document will be revised on a bi-annual basis to reflect the changes in the piping research program. (author)

  11. Estimation of Lung Ventilation

    Science.gov (United States)

    Ding, Kai; Cao, Kunlin; Du, Kaifang; Amelon, Ryan; Christensen, Gary E.; Raghavan, Madhavan; Reinhardt, Joseph M.

    Since the primary function of the lung is gas exchange, ventilation can be interpreted as an index of lung function in addition to perfusion. Injury and disease processes can alter lung function on a global and/or a local level. MDCT can be used to acquire multiple static breath-hold CT images of the lung taken at different lung volumes, or, with proper respiratory control, 4DCT images of the lung reconstructed at different respiratory phases. Image registration can be applied to these data to estimate a deformation field that transforms the lung from one volume configuration to the other. This deformation field can be analyzed to estimate local lung tissue expansion, calculate voxel-by-voxel intensity change, and make biomechanical measurements. The physiologic significance of the registration-based measures of respiratory function can be established by comparing them to more conventional measurements, such as nuclear medicine or contrast wash-in/wash-out studies with CT or MR. An important emerging application of these methods is the detection of pulmonary function change in subjects undergoing radiation therapy (RT) for lung cancer. During RT, treatment is commonly limited to sub-therapeutic doses due to unintended toxicity to normal lung tissue. Measurement of pulmonary function may be useful as a tool during RT planning, for tracking the progression of toxicity to nearby normal tissue during RT, and for evaluating the effectiveness of a treatment post-therapy. This chapter reviews the basic measures to estimate regional ventilation from image registration of CT images, their comparison to the existing gold standard, and their application in radiation therapy.
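
    A common registration-based measure of the local tissue expansion mentioned above is the determinant of the Jacobian of the estimated transformation: det J > 1 marks local expansion, det J < 1 local contraction. The sketch below is an illustrative Python implementation of that measure, not the chapter's code:

        import numpy as np

        def jacobian_determinant(disp, spacing=(1.0, 1.0, 1.0)):
            """Voxel-wise Jacobian determinant of a 3-D displacement field.

            disp has shape (3, Z, Y, X): the displacement, in the same units
            as `spacing`, mapping one lung volume onto the other.
            """
            grads = [np.gradient(disp[i], *spacing) for i in range(3)]  # du_i/dx_j
            J = np.empty(disp.shape[1:] + (3, 3))
            for i in range(3):
                for j in range(3):
                    # The transformation is identity plus displacement: J = I + grad(u)
                    J[..., i, j] = grads[i][j] + (1.0 if i == j else 0.0)
            return np.linalg.det(J)

        # Toy field: uniform 10% stretch along the first axis -> det J ~ 1.1
        disp = np.zeros((3, 8, 8, 8))
        disp[0] = 0.1 * np.arange(8)[:, None, None]
        print(jacobian_determinant(disp).mean())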

  12. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...
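
    The mechanics of such a calibration adjustment can be sketched numerically. The code below assumes CRRA utility and a quadratic scoring rule purely for illustration (the paper's structural model is not reproduced): risk aversion shades the optimal report toward 0.5, and the latent subjective probability is recovered by inverting that mapping:

        import numpy as np
        from scipy.optimize import minimize_scalar, brentq

        def optimal_report(p, rho, stake=10.0):
            """Report maximizing expected CRRA utility of a quadratic scoring rule."""
            def neg_eu(r):
                pay1 = stake * (1 - (1 - r) ** 2) + 1e-9  # payoff if the event occurs
                pay0 = stake * (1 - r ** 2) + 1e-9        # payoff if it does not
                u = lambda x: x ** (1 - rho) / (1 - rho)
                return -(p * u(pay1) + (1 - p) * u(pay0))
            return minimize_scalar(neg_eu, bounds=(1e-6, 1 - 1e-6), method="bounded").x

        # A risk-averse subject (rho = 0.6) who believes p = 0.8 reports less than 0.8
        observed = optimal_report(0.8, 0.6)
        print(observed)

        # Calibration adjustment: recover the latent belief from the observed report
        p_hat = brentq(lambda p: optimal_report(p, 0.6) - observed, 0.01, 0.99)
        print(p_hat)  # ~0.8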

  13. Estimating NHL Scoring Rates

    OpenAIRE

    Buttrey, Samuel E.; Washburn, Alan R.; Price, Wilson L.; Operations Research

    2011-01-01

    The article of record as published may be located at http://dx.doi.org/10.2202/1559-0410.1334 We propose a model to estimate the rates at which NHL teams score and yield goals. In the model, goals occur as if from a Poisson process whose rate depends on the two teams playing, the home-ice advantage, and the manpower (power-play, short-handed) situation. Data on all the games from the 2008-2009 season were downloaded and processed into a form suitable for the analysis. The model...
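
    The structure of such a model is easy to sketch. The code below is illustrative Python with made-up scores, not the article's implementation: each team's goals in a game are Poisson with a log-rate additive in an attack strength, a defence strength, and a home-ice term:

        import numpy as np
        from scipy.optimize import minimize

        teams = ["BOS", "DET", "MTL"]
        # (home, away, home goals, away goals) -- illustrative, not real results
        games = [(0, 1, 4, 2), (1, 2, 3, 3), (2, 0, 1, 5), (0, 2, 3, 1)]

        def neg_loglik(theta):
            atk, dfn, home = theta[:3], theta[3:6], theta[6]
            ll = 0.0
            for h, a, gh, ga in games:
                lam_h = np.exp(atk[h] - dfn[a] + home)  # home team's scoring rate
                lam_a = np.exp(atk[a] - dfn[h])         # away team's scoring rate
                ll += gh * np.log(lam_h) - lam_h + ga * np.log(lam_a) - lam_a
            # Tiny ridge pins the flat direction (adding a constant to all atk and
            # dfn leaves the rates unchanged); real fits use a normalization instead.
            return -ll + 1e-6 * theta @ theta

        res = minimize(neg_loglik, np.zeros(7), method="BFGS")
        print(dict(zip(teams, np.round(res.x[:3], 2))), "home =", round(res.x[6], 2))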

  14. Risk estimation and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Ferguson, R A.D.

    1982-10-01

    Risk assessment involves subjectivity, which makes objective decision making difficult in the nuclear power debate. The author reviews the process and uncertainties of estimating risks as well as the potential for misinterpretation and misuse. Risk data from a variety of aspects cannot be summed because the significance of different risks is not comparable. A method for including political, social, moral, psychological, and economic factors, environmental impacts, catastrophes, and benefits in the evaluation process could involve a broad base of lay and technical consultants, who would explain and argue their evaluation positions. 15 references. (DCK)

  15. Estimating Gear Teeth Stiffness

    DEFF Research Database (Denmark)

    Pedersen, Niels Leergaard

    2013-01-01

    The estimation of gear stiffness is important for determining the load distribution between the gear teeth when two sets of teeth are in contact. Two factors have a major influence on the stiffness: firstly the boundary condition through the gear rim size included in the stiffness calculation...... and secondly the size of the contact. In the FE calculation the true gear tooth root profile is applied. The meshing stiffnesses of gears are highly non-linear; it is, however, found that the stiffness of an individual tooth can be expressed in a linear form, assuming that the contact length is constant....
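
    The practical consequence of the last point can be shown with a toy calculation (the numbers are assumed, not the paper's): if each tooth behaves as a linear spring, the two contacting teeth of a pair deform in series, while simultaneously engaged pairs act in parallel, so the total meshing stiffness jumps as teeth enter and leave contact:

        def pair_stiffness(k_tooth1, k_tooth2):
            """Two contacting teeth deform in series."""
            return 1.0 / (1.0 / k_tooth1 + 1.0 / k_tooth2)

        k1 = k2 = 4.0e8  # N/m, assumed single-tooth stiffnesses
        single = pair_stiffness(k1, k2)       # one pair in contact
        double = 2 * pair_stiffness(k1, k2)   # two pairs engaged in parallel
        print(f"single-pair: {single:.2e} N/m, double-pair: {double:.2e} N/m")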

  16. Mixtures Estimation and Applications

    CERN Document Server

    Mengersen, Kerrie; Titterington, Mike

    2011-01-01

    This book uses the EM (expectation maximization) algorithm to simultaneously estimate the missing data and unknown parameter(s) associated with a data set. The parameters describe the component distributions of the mixture; the distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions along with MCMC computational methods, together with a range of detailed discussions covering the applications of the methods and features chapters from the leading experts on the subject
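
    For readers who have not met the algorithm, the sketch below shows the EM iteration for a two-component univariate Gaussian mixture (illustrative Python; the book treats the general case and the MCMC alternatives):

        import numpy as np

        rng = np.random.default_rng(1)
        x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])

        w, mu, sd = 0.5, np.array([-1.0, 1.0]), np.array([1.0, 1.0])
        pdf = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
        for _ in range(100):
            # E-step: posterior responsibility of component 1 for each point
            r1 = w * pdf(mu[0], sd[0])
            r2 = (1 - w) * pdf(mu[1], sd[1])
            g = r1 / (r1 + r2)
            # M-step: responsibility-weighted weight, means, standard deviations
            w = g.mean()
            mu = np.array([(g * x).sum() / g.sum(),
                           ((1 - g) * x).sum() / (1 - g).sum()])
            sd = np.array([np.sqrt((g * (x - mu[0]) ** 2).sum() / g.sum()),
                           np.sqrt(((1 - g) * (x - mu[1]) ** 2).sum() / (1 - g).sum())])
        print(w, mu, sd)  # approaches 0.6, (-2, 3), (1, 0.5)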

  17. Icobj Programming

    OpenAIRE

    Boussinot , Frédéric

    1996-01-01

    A simple and fully graphical programming method is presented, using a powerful means to combine behaviors. This programming is based on the notion of an «icobj», which has a behavioral aspect («object» part), a graphical aspect («icon» part), and an «animation» aspect. Icobj programming provides parallelism, broadcast event communication and migration through the network. An experimental system based on this approach is described in detail. Its implementation with reactive scripts is also pr...

  18. Programming Python

    CERN Document Server

    Lutz, Mark

    2011-01-01

    If you've mastered Python's fundamentals, you're ready to start using it to get real work done. Programming Python will show you how, with in-depth tutorials on the language's primary application domains: system administration, GUIs, and the Web. You'll also explore how Python is used in databases, networking, front-end scripting layers, text processing, and more. This book focuses on commonly used tools and libraries to give you a comprehensive understanding of Python's many roles in practical, real-world programming. You'll learn language syntax and programming techniques in a clear and co

  19. Cost estimating relationships for nuclear power plant operation and maintenance

    International Nuclear Information System (INIS)

    Bowers, H.I.; Fuller, L.C.; Myers, M.L.

    1987-11-01

    Revised cost estimating relationships for 1987 are presented for estimating annual nonfuel operation and maintenance (O and M) costs for light-water reactor (LWR) nuclear power plants, updating guidelines published in 1982. These cost estimating relationships are intended for use in long-range planning and in evaluations of the economics of nuclear energy for electric power generation. A listing of a computer program, LWROM, implementing the cost estimating relationships and written in advanced BASIC for IBM personal computers, is included.
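
    The general shape of a cost estimating relationship can be illustrated as follows; the functional form and every coefficient below are hypothetical placeholders, not the report's values (those are implemented in LWROM):

        def annual_om_cost(capacity_mwe, units=1, base=12.0, slope=0.9, scale_exp=0.4):
            """Annual nonfuel O&M cost, in millions of reference-year dollars,
            modelled as a fixed base plus a term scaling sublinearly with size."""
            return units * (base + slope * capacity_mwe ** scale_exp)

        # Illustrative use for a single 1100-MWe LWR
        print(f"${annual_om_cost(1100):.1f}M per year (hypothetical coefficients)")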

  20. Best-estimate analysis development for BWR systems

    International Nuclear Information System (INIS)

    Sutherland, W.A.; Alamgir, M.; Kalra, S.P.; Beckner, W.D.

    1986-01-01

    The Full Integral Simulation Test (FIST) Program is a three-pronged approach to the development of best-estimate analysis capability for BWR systems. An experimental program in the FIST BWR system simulator facility extends the LOCA data base and adds operational-transient data. An analytical method development program with the BWR-TRAC computer program extends the modeling of BWR-specific components and major interfacing systems, and improves numerical techniques to reduce computer running time. A method qualification program tests TRAC-B against experiments run in the FIST facility and extends the results to reactor system applications. With the completion and integration of these three activities, the objective of a best-estimate analysis capability has been achieved. (author)