WorldWideScience

Sample records for analysis approach leading

  1. A systems approach to risk management through leading safety indicators

    International Nuclear Information System (INIS)

    Leveson, Nancy

    2015-01-01

    The goal of leading indicators for safety is to identify the potential for an accident before it occurs. Past efforts have focused on identifying general leading indicators, such as maintenance backlog, that apply widely in an industry or even across industries. Other recommendations produce more system-specific leading indicators, but start from system hazard analysis and thus are limited by the causes considered by the traditional hazard analysis techniques. Most rely on quantitative metrics, often based on probabilistic risk assessments. This paper describes a new and different approach to identifying system-specific leading indicators and provides guidance in designing a risk management structure to generate, monitor and use the results. The approach is based on the STAMP (System-Theoretic Accident Model and Processes) model of accident causation and tools that have been designed to build on that model. STAMP extends current accident causality to include more complex causes than simply component failures and chains of failure events or deviations from operational expectations. It incorporates basic principles of systems thinking and is based on systems theory rather than traditional reliability theory. - Highlights: • Much effort has gone into developing leading indicators with only limited success. • A systems-theoretic, assumption-based approach may be more successful. • Leading indicators are warning signals of an assumption’s changing vulnerability. • Heuristic biases can be controlled by using plausibility rather than likelihood

  2. How lead consultants approach educational change in postgraduate medical education.

    Science.gov (United States)

    Fokkema, Joanne P I; Westerman, Michiel; Teunissen, Pim W; van der Lee, Nadine; Scherpbier, Albert J J A; van der Vleuten, Cees P M; Dörr, P Joep; Scheele, Fedde

    2012-04-01

    Consultants in charge of postgraduate medical education (PGME) in hospital departments ('lead consultants') are responsible for the implementation of educational change. Although difficulties in innovating in medical education are described in the literature, little is known about how lead consultants approach educational change. This study was conducted to explore lead consultants' approaches to educational change in specialty training and the factors influencing these approaches. From an interpretative constructivist perspective, we conducted a qualitative exploratory study using semi-structured interviews with a purposive sample of 16 lead consultants in the Netherlands between August 2010 and February 2011. The study design was based on the research questions and on notions from corporate business and social psychology about the roles of change managers. Interview transcripts were analysed thematically using template analysis. The lead consultants described change processes with different stages, including cause, development of content, and the execution and evaluation of change, and used individual change strategies consisting of elements such as ideas, intentions and behaviour. Communication is necessary for forming a strategy and implementing change, but the nature of communication is influenced by the strategy in use. Lead consultants differed in their degree of awareness of the strategies they used. Factors influencing approaches to change were: knowledge, ideas and beliefs about change; level of reflection; task interpretation; personal style; and department culture. Most lead consultants showed limited awareness of their own approaches to change. This can lead them to adopt a rigid approach, whereas the ability to adapt strategies to circumstances is considered important to effective change management. Interventions and research should be aimed at enhancing lead consultants' awareness of approaches to change in PGME.

  3. Real analysis a constructive approach

    CERN Document Server

    Bridger, Mark

    2012-01-01

    A unique approach to analysis that lets you apply mathematics across a range of subjects. This innovative text sets forth a thoroughly rigorous modern account of the theoretical underpinnings of calculus: continuity, differentiability, and convergence. Using a constructive approach, every proof of every result is direct and ultimately computationally verifiable. In particular, existence is never established by showing that the assumption of non-existence leads to a contradiction. The ultimate consequence of this method is that it makes sense, not just to math majors but also to students from a…

  4. GALA: Group Analysis Leads to Accuracy, a novel approach for solving the inverse problem in exploratory analysis of group MEG recordings

    Directory of Open Access Journals (Sweden)

    Vladimir eKozunov

    2015-04-01

    Although MEG/EEG signals are highly variable between subjects, they allow characterizing systematic changes of cortical activity in both space and time. Traditionally a two-step procedure is used. The first step is a transition from sensor to source space by means of solving an ill-posed inverse problem for each subject individually. The second is mapping of cortical regions consistently active across subjects. In practice the first step often leads to a set of active cortical regions whose location and timecourses display a great amount of interindividual variability, hindering the subsequent group analysis. We propose Group Analysis Leads to Accuracy (GALA), a solution that combines the two steps into one. GALA takes advantage of individual variations of cortical geometry and sensor locations. It exploits the ensuing variability in the electromagnetic forward model as a source of additional information. We assume that for different subjects functionally identical cortical regions are located in close proximity and partially overlap, and that their timecourses are correlated. This relaxed similarity constraint on the inverse solution can be expressed within a probabilistic framework, allowing for an iterative algorithm solving the inverse problem jointly for all subjects. A systematic simulation study showed that GALA, as compared with the standard min-norm approach, improves the accuracy of true activity recovery, when accuracy is assessed both in terms of spatial proximity of the estimated and true activations and correct specification of the spatial extent of the activated regions. This improvement, obtained without using any noise normalization techniques for either solution, was preserved for a wide range of between-subject variations in both spatial and temporal features of regional activation. The corresponding activation timecourses exhibit significantly higher similarity across subjects. Similar results were obtained for a real MEG dataset of face…
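
    For orientation, the standard minimum-norm baseline that GALA is benchmarked against can be sketched as a regularized linear inverse. The toy leadfield, noise level and regularization value below are illustrative assumptions, not the paper's MEG configuration.

```python
import numpy as np

# Minimal sketch of a minimum-norm inverse estimate (the baseline mentioned
# in the abstract); all sizes and values are illustrative assumptions.
rng = np.random.default_rng(0)
n_sensors, n_sources = 64, 500
L = rng.standard_normal((n_sensors, n_sources))        # toy leadfield matrix
s_true = np.zeros(n_sources)
s_true[100:105] = 1.0                                   # one small active patch
y = L @ s_true + 0.1 * rng.standard_normal(n_sensors)   # noisy sensor data

lam = 1.0                                               # assumed regularization
# s_hat = L^T (L L^T + lam*I)^(-1) y
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
print("peak estimated source index:", int(np.argmax(np.abs(s_hat))))
```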

  5. Approach for seismic risk analysis for CANDU plants in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B-S; Kim, T; Kang, S-K [Korea Power Engineering Co., Seoul (Korea, Republic of); Hong, S-Y; Roh, S-R [Korea Electric Power Corp., Taejon (Korea, Republic of). Research Centre

    1996-12-31

    A seismic risk analysis for CANDU type plants has never been performed. The study presented here suggested that the approach generally applied to LWR type plants could lead to unacceptable results if directly applied to CANDU plants. This paper presents a modified approach for the seismic risk analysis of CANDU plants. (author). 5 refs., 2 tabs., 2 figs.

  6. Leading change: a concept analysis.

    Science.gov (United States)

    Nelson-Brantley, Heather V; Ford, Debra J

    2017-04-01

    To report an analysis of the concept of leading change. Nurses have been called to lead change to advance the health of individuals, populations, and systems. Conceptual clarity about leading change in the context of nursing and healthcare systems provides an empirical direction for future research and theory development that can advance the science of leadership studies in nursing. Concept analysis. CINAHL, PubMed, PsycINFO, Psychology and Behavioral Sciences Collection, Health Business Elite and Business Source Premier databases were searched using the terms: leading change, transformation, reform, leadership and change. Literature published in English from 2001-2015 in the fields of nursing, medicine, organizational studies, business, education, psychology or sociology was included. Walker and Avant's method was used to identify descriptions, antecedents, consequences and empirical referents of the concept. Model, related and contrary cases were developed. Five defining attributes of leading change were identified: (a) individual and collective leadership; (b) operational support; (c) fostering relationships; (d) organizational learning; and (e) balance. Antecedents were external or internal driving forces and organizational readiness. The consequences of leading change included improved organizational performance and outcomes and a new organizational culture and values. A theoretical definition and conceptual model of leading change were developed. Future studies that use and test the model may contribute to the refinement of a middle-range theory to advance nursing leadership research and education. From this, empirically derived interventions that prepare and enable nurses to lead change to advance health may be realized. © 2016 John Wiley & Sons Ltd.

  7. Current lead thermal analysis code 'CURRENT'

    International Nuclear Information System (INIS)

    Yamaguchi, Masahito; Tada, Eisuke; Shimamoto, Susumu; Hata, Kenichiro.

    1985-08-01

    A large gas-cooled current lead with a capacity of more than 30 kA and 22 kV is required for the superconducting toroidal and poloidal coils for fusion applications. The current lead is used to carry electrical current from the power supply system at room temperature to the superconducting coil at 4 K. Accordingly, the thermal performance of the current lead is significantly important in determining the heat load requirements of the coil system at 4 K. The Japan Atomic Energy Research Institute (JAERI) has been developing large gas-cooled current leads with an optimum condition in which the heat load is around 1 W per kA at 4 K. In order to design a current lead with optimum thermal performance, JAERI developed a thermal analysis code named 'CURRENT' which can theoretically calculate the optimum geometric shape and cooling conditions of the current lead. The basic equations and the instruction manual of the analysis code are described in this report. (author)

  8. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    Science.gov (United States)

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Lead and Conduct Problems: A Meta-Analysis

    Science.gov (United States)

    Marcus, David K.; Fulton, Jessica J.; Clarke, Erin J.

    2010-01-01

    This meta-analysis examined the association between conduct problems and lead exposure. Nineteen studies on 8,561 children and adolescents were included. The average "r" across all 19 studies was 0.19 (p < 0.001), which is considered a medium effect size. Studies that assessed lead exposure using hair element analysis yielded…

  10. Measuring leading placental edge to internal cervical os: Transabdominal versus transvaginal approach

    DEFF Research Database (Denmark)

    Westerway, Susan Campbell; Hyett, Jon; Henning Pedersen, Lars

    2017-01-01

    We aimed to compare the value of transabdominal (TA) and transvaginal (TV) approaches for assessing the risk of a low-lying placenta. This involved a comparison of TA and TV measurements between the leading placental edge and the internal cervical os. We also assessed the intra-/interobserver variation for these measurements and the efficacy of TA measures in screening for a low placenta. Methodology: TA and TV measurements of the leading placental edge to the internal cervical os were performed on 369 consecutive pregnancies of 16-41 weeks' gestation. The difference (TA-TV) from … the area under the receiver operator characteristics (ROC) curve. Intra-/interobserver variations were also calculated. Results: Of the pregnancies, 278 had a leading placental edge that was visible with the TV approach. Differences (TA-TV) ranged from −50 mm to +57 mm. A Bland-Altman plot shows that TA…

  11. Blood Lead Toxicity Analysis of Multipurpose Canines and Military Working Dogs.

    Science.gov (United States)

    Reid, Paul; George, Clinton; Byrd, Christopher M; Miller, Laura; Lee, Stephen J; Motsinger-Reif, Alison; Breen, Matthew; Hayduk, Daniel W

    Special Operations Forces and their accompanying tactical multipurpose canines (MPCs) who are involved in repeated live-fire exercises and military operations have the potential for increased blood lead levels and toxicity due to aerosolized and environmental lead debris. Clinical lead-toxicity symptoms can mimic other medical disorders, rendering accurate diagnosis more challenging. The objective of this study was to examine baseline lead levels of MPCs exposed to indoor firing ranges compared with those of nontactical military working dogs (MWDs) with limited or no exposure to the same environment. In the second part of the study, results of a commercially available, human-blood lead testing system were compared with those of a benchtop inductively coupled plasma-mass spectrometry (ICP-MS) analysis technique. Blood samples from 18 MPCs were tested during routine clinical blood draws, and six samples from a canine group with limited exposure to environmental lead (nontactical MWDs) were tested for comparison. There was a high correlation between results of the commercial blood-testing system compared with ICP-MS when blood lead levels were higher than 4.0 µg/dL. Both testing methods recorded higher blood lead levels in the MPC blood samples than in those of the nontactical MWDs, although none of the MPC samples tested contained lead levels approaching those at which symptoms of lead toxicity have previously been reported in animals (i.e., 35 µg/dL).

  12. A functional genomics approach using metabolomics and in silico pathway analysis

    DEFF Research Database (Denmark)

    Förster, Jochen; Gombert, Andreas Karoly; Nielsen, Jens

    2002-01-01

    …analysis techniques and changes in the genotype will in many cases lead to different metabolite profiles. Here, a theoretical framework that may be applied to identify the function of orphan genes is presented. The approach is based on a combination of metabolome analysis combined with in silico pathway…

  13. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  14. Information-theoretic approach to lead-lag effect on financial markets

    Science.gov (United States)

    Fiedor, Paweł

    2014-08-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis are concentrated mostly on Pearson's correlation coefficient and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also associated with them. Under the Efficient-Market Hypothesis such relationships are not possible as all information is embedded in the prices, but in real markets we find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but also can lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level, but also for daily stock returns, which have usually been ignored.
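
    To illustrate the difference between the two dependence measures discussed here, a minimal sketch below computes Pearson's correlation and a binned (plug-in) mutual information estimate for a lagged pair of series. The synthetic data, lag and bin count are assumptions for illustration, not the NYSE 100 data or the paper's exact estimator.

```python
import numpy as np

def lagged_dependence(x, y, lag, bins=8):
    """Pearson correlation and binned mutual information between x(t) and y(t+lag)."""
    a, b = x[:-lag], y[lag:]
    r = np.corrcoef(a, b)[0, 1]
    # Mutual information from a 2-D histogram (plug-in estimator, in nats)
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))
    return r, mi

rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = np.roll(x, 1) ** 2 + 0.5 * rng.standard_normal(5000)   # nonlinear, lagged link
print(lagged_dependence(x, y, lag=1))   # near-zero r, clearly positive MI
```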

  15. Children's Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making.

    Science.gov (United States)

    Zartarian, Valerie; Xue, Jianping; Tornero-Velez, Rogelio; Brown, James

    2017-09-12

    Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)'s National Drinking Water Advisory Council (NDWAC) recommended establishment of a "health-based, household action level" for lead in drinking water based on children's exposure. The primary objective was to develop a coupled exposure-dose modeling approach that can be used to determine what drinking water lead concentrations keep children's blood lead levels (BLLs) below specified values, considering exposures from water, soil, dust, food, and air. Related objectives were to evaluate the coupled model estimates using real-world blood lead data, to quantify relative contributions by the various media, and to identify key model inputs. A modeling approach using the EPA's Stochastic Human Exposure and Dose Simulation (SHEDS)-Multimedia and Integrated Exposure Uptake and Biokinetic (IEUBK) models was developed using available data. This analysis for the U.S. population of young children probabilistically simulated multimedia exposures and estimated relative contributions of media to BLLs across all population percentiles for several age groups. Modeled BLLs compared well with nationally representative BLLs (0-23% relative error). Analyses revealed relative importance of soil and dust ingestion exposure pathways and associated Pb intake rates; water ingestion was also a main pathway, especially for infants. This methodology advances scientific understanding of the relationship between lead concentrations in drinking water and BLLs in children. It can guide national health-based benchmarks for lead and related community public health decisions. https://doi.org/10.1289/EHP1605.
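
    The bookkeeping behind the relative-contribution estimates can be illustrated with a toy aggregation of media-specific lead intakes. All concentrations and intake rates below are invented placeholders; this is not the SHEDS-Multimedia or IEUBK code, only a sketch of the kind of summation such models perform before the biokinetic step.

```python
# Toy aggregation of media-specific lead intakes for one young child.
# All values are illustrative assumptions, NOT SHEDS-Multimedia/IEUBK inputs.
media = {
    # medium: (concentration, intake rate)
    "water": (5.0, 0.8),     # ug/L  * L/day
    "soil":  (150.0, 0.05),  # ug/g  * g/day ingested
    "dust":  (100.0, 0.03),  # ug/g  * g/day ingested
    "food":  (2.0, 1.0),     # ug/day dietary intake (rate = 1, already daily)
    "air":   (0.01, 8.0),    # ug/m3 * m3/day inhaled
}
intakes = {m: c * ir for m, (c, ir) in media.items()}
total = sum(intakes.values())
print(f"total intake: {total:.2f} ug/day")
for m, v in sorted(intakes.items(), key=lambda kv: -kv[1]):
    print(f"  {m:5s} {v / total:5.1%}")
```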

  16. A Ligand-observed Mass Spectrometry Approach Integrated into the Fragment Based Lead Discovery Pipeline

    Science.gov (United States)

    Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing

    2015-01-01

    In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we presented a ligand-observed mass spectrometry approach to expand the toolkits and increase the flexibility of building a FBLD pipeline especially for tough targets. In this study, this approach was integrated into a FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of weak binding affinities of fragments which was in general consistent with SPR analysis. Five out of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181

  17. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    International Nuclear Information System (INIS)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.; Campbell, James A.; Lin, Yuehe

    2004-01-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.

  18. Mechanochemical synthesis of nanocrystalline lead selenide. Industrial approach

    Energy Technology Data Exchange (ETDEWEB)

    Achimovicova, Marcela; Balaz, Peter [Slovak Academy of Sciences, Kosice (Slovakia). Inst. of Geotechnics; Durisin, Juraj [Slovak Academy of Sciences, Kosice (Slovakia). Inst. of Materials Research; Daneu, Nina [Josef Stefan Institute, Ljubljana (Slovenia). Dept. for Nanostructured Materials; Kovac, Juraj; Satka, Alexander [Slovak Univ. of Technology and International Laser Centre, Bratislava (Slovakia). Dept. of Microelectronics; Feldhoff, Armin [Leibniz Univ. Hannover (Germany). Inst. fuer Physikalische Chemie und Elektrochemie; Gock, Eberhard [Technical Univ. Clausthal, Clausthal-Zellerfeld (Germany). Inst. of Mineral and Waste Processing and Dumping Technology

    2011-04-15

    Mechanochemical synthesis of lead selenide (PbSe) nanoparticles was performed by high-energy milling of lead and selenium powder in a laboratory planetary ball mill and in an industrial eccentric vibratory mill. Structural properties of the synthesized lead selenide were characterized using X-ray diffraction, which confirmed the crystalline nature of the PbSe nanoparticles. The average size of PbSe crystallites of 37 nm was calculated from X-ray diffraction data using the Williamson-Hall method. Particle size distribution analysis, specific surface area measurement, scanning electron microscopy and transmission electron microscopy were used to characterize the surface, mean particle size, and morphology of PbSe. Application of the industrial mill verified the possibility of synthesizing the narrow-bandgap semiconductor PbSe at ambient temperature and in a relatively short reaction time. (orig.)
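
    The Williamson-Hall estimate behind the 37 nm crystallite size can be sketched as a straight-line fit of beta*cos(theta) against 4*sin(theta); the peak positions and widths below are made-up illustrative values, not the paper's PbSe diffraction data.

```python
import numpy as np

# Williamson-Hall sketch: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta).
# Peak positions and FWHM values are illustrative assumptions.
K, lam = 0.9, 0.15406            # shape factor and Cu K-alpha wavelength (nm)
two_theta_deg = np.array([25.2, 29.2, 41.8, 49.5, 51.9])   # assumed peaks
fwhm_deg      = np.array([0.26, 0.27, 0.31, 0.34, 0.35])   # assumed widths

theta = np.deg2rad(two_theta_deg) / 2.0
beta = np.deg2rad(fwhm_deg)                  # widths must be in radians
x, y = 4.0 * np.sin(theta), beta * np.cos(theta)
strain, intercept = np.polyfit(x, y, 1)      # slope = microstrain eps
D = K * lam / intercept                      # crystallite size in nm
print(f"crystallite size ~{D:.0f} nm, microstrain ~{strain:.2e}")
```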

  19. Transvenous Lead Extraction via the Inferior Approach Using a Gooseneck Snare versus Simple Manual Traction.

    Science.gov (United States)

    Jo, Uk; Kim, Jun; Hwang, You-Mi; Lee, Ji-Hyun; Kim, Min-Su; Choi, Hyung-Oh; Lee, Woo-Seok; Kwon, Chang-Hee; Ko, Gi-Young; Yoon, Hyun-Ki; Nam, Gi-Byoung; Choi, Kee-Joon; Kim, You-Ho

    2016-03-01

    The number of patients with cardiac implantable electronic devices needing lead extraction is increasing for various reasons, including infections, vascular obstruction, and lead failure. We report our experience with transvenous extraction of pacemaker and defibrillator leads via the inferior approach of using a gooseneck snare as a first-line therapy and compare extraction using a gooseneck snare with extraction using simple manual traction. The study included 23 consecutive patients (43 leads) who underwent transvenous lead extraction using a gooseneck snare (group A) and 10 consecutive patients (17 leads) who underwent lead extraction using simple manual traction (group B). Patient characteristics, indications, and outcomes were analyzed and compared between the groups. The dwelling time of the leads was longer in group A (median, 121) than in group B (median, 56; p=0.000). No differences were noted in the overall procedural success rate (69.6% vs. 70%), clinical procedural success rate (82.6% vs. 90%), and lead clinical success rate (86% vs. 94.1%) between the groups. The procedural success rates according to lead type were 89.2% and 100% for pacing leads and 66.7% and 83.3% for defibrillator leads in groups A and B, respectively. Major complications were noted in 3 (mortality in 1) patients in group A and 2 patients in group B. Transvenous extraction of pacemaker leads via an inferior approach using a gooseneck snare was both safe and effective. However, stand-alone transvenous extraction of defibrillator leads using the inferior approach was suboptimal.

  20. Analysis of lead/acid battery life cycle factors: their impact on society and the lead industry

    Science.gov (United States)

    Robertson, J. G. S.; Wood, J. R.; Ralph, B.; Fenn, R.

    The underlying theme of this paper is that society, globally, is undergoing a fundamental conceptual shift in the way it views the environment and the role of industry within it. There are views in certain quarters that this could result in the virtual elimination of the lead industry's entire product range. Despite these threats, it is argued that the prospects for the lead industry appear to be relatively favourable in a number of respects. The industry's future depends to a significant degree, however, upon its ability to argue its case in a number of key areas. It is contended, therefore, that if appropriate strategies and means are promulgated, the prospects of the industry would appear to be relatively healthy. But, for this to happen with optimal effectiveness, a conceptual change will be necessary within the industry. New strategies and tools will have to be developed. These will require a significantly more integrated, holistically based and 'reflexive' approach than previously. The main elements of such an approach are outlined. With reference to the authors' ongoing research into automotive lead/acid starting lighting ignition (SLI) batteries, the paper shows how the technique of in-depth life cycle assessment (LCA), appropriately adapted to the needs of the industry, will provide a crucial role in this new approach. It also shows how it may be used as an internal design and assessment tool to identify those stages in the battery life cycle that give rise to the greatest environmental burdens, and to assess the effects of changes in the cycle to those burdens. It is argued that the development of this approach requires the serious and urgent attention of the whole of the lead industry. Also to make the LCA tool fully effective, it must be based on a 'live' database that is produced, maintained and continually updated by the industry.

  1. INAA, AAS, and lead isotope analysis of ancient lead anchors from the black SEA

    International Nuclear Information System (INIS)

    Kuleff, I.; Djingova, R.; Alexandrova, A.

    1995-01-01

    Lead stocks of wooden-lead anchors found along the Bulgarian Black Sea coast and typologically dated VI c. B.C. - III c. A.D. have been analyzed for chemical composition and lead isotope ratios by INAA, AAS and mass spectrometry. Using multivariate methods of analysis as well as simple bivariate plots, the lead used for production of the stocks was localized as originating from Laurion, Thassos, Troas, Chalkidike and the Rhodopes. In general, the chemical composition is not recommended to be used for provenance study of lead artefacts. Combining the results from this study with the existing typological classification, certain conclusions about the production and distribution of lead anchors in the Aegean region are made. (author). 22 refs., 3 figs., 4 tabs

  2. ANALYSIS OF AGRICULTURAL LEADING SUBSECTOR DISTRICT/CITIES IN BENGKULU PROVINCE

    Directory of Open Access Journals (Sweden)

    Agung Ridho Pratama

    2017-12-01

    The purpose of this study was to determine the agricultural leading subsectors in the districts/cities in Bengkulu Province and the conditions of the agricultural leading subsectors of districts/cities with Main Area status in Bengkulu Province before and after the expansion of the area. The analysis methods used were Location Quotient (LQ), Dynamic Location Quotient (DLQ), Shift-Share analysis (SS) and Overlay analysis. The study used secondary data, namely the Gross Regional Domestic Product (PDRB) of the districts/cities in Bengkulu Province and of Bengkulu Province from 2004 until 2014 based on constant basic prices. The results of this study showed that the agricultural leading subsectors based on the Overlay analysis (combining the three analysis methods) are the livestock subsector and the plantation crops subsector, especially in Bengkulu City and Kaur. In South Bengkulu, before the expansion of the area, the fisheries subsector was the only agricultural leading subsector; after the expansion, the agricultural leading subsectors increased to the fisheries subsector and the livestock subsector. In Rejang Lebong, the food crops subsector remains an agricultural leading subsector both before and after the expansion of the area. Meanwhile, in North Bengkulu, before the expansion of the area the livestock subsector was the agricultural leading subsector; after the expansion, the livestock subsector was replaced by the fisheries subsector.
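
    The Location Quotient at the core of this record's method compares a subsector's share of district GRDP with its share of provincial GRDP; a value above 1 flags a leading (base) subsector. The figures in the sketch below are invented placeholders, not the PDRB data used in the study.

```python
# Location Quotient sketch: LQ_i = (district share of subsector i) /
# (provincial share of subsector i); LQ > 1 marks a leading subsector.
# GRDP figures below are illustrative assumptions.
district = {"food crops": 420, "plantation": 610, "livestock": 180,
            "fisheries": 240, "forestry": 90}
province = {"food crops": 5200, "plantation": 4800, "livestock": 1300,
            "fisheries": 2100, "forestry": 900}

e_total, E_total = sum(district.values()), sum(province.values())
lq = {s: (district[s] / e_total) / (province[s] / E_total) for s in district}
leading = [s for s, v in sorted(lq.items(), key=lambda kv: -kv[1]) if v > 1]
print({s: round(v, 2) for s, v in lq.items()})
print("leading subsectors:", leading)
```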

  3. An exploratory study of lead recovery in lead-acid battery lifecycle in US market: An evidence-based approach

    International Nuclear Information System (INIS)

    Genaidy, A.M.; Sequeira, R.; Tolaymat, T.; Kohler, J.; Rinder, M.

    2008-01-01

    Background: This research examines lead recovery and recycling in lead-acid batteries (LAB), which account for 88% of US lead consumption. We explore strategies to maximize lead recovery and recycling in the LAB lifecycle. Currently, there is limited information on recycling rates for LAB in the published literature, and it is derived from a single source. Therefore, recycling efforts in the US have been unclear, making it difficult to determine the maximum opportunities for metal recovery and recycling in the face of significant demand for LAB, particularly in the auto industry. Objectives: The research utilizes an evidence-based approach to: (1) determine recycling rates for lead recovery in the LAB product lifecycle for the US market; and (2) quantify and identify opportunities where lead recovery and recycling can be improved. Methods: A comprehensive electronic search of the published literature was conducted to gather information on different LAB recycling models and the actual data used to calculate recycling rates based on the product lifecycle for the US market, in order to identify strategies for increasing lead recovery and recycling. Results: The electronic search yielded five models for calculating LAB recycling rates. The description of evidence was documented for each model. Furthermore, an integrated model was developed to identify and quantify the maximum opportunities for lead recovery and recycling. Results showed that recycling rates declined during the period spanning from 1999 to 2006. Opportunities were identified for recovery and recycling of lead in the LAB product lifecycle. Concluding remarks: One can deduce the following from the analyses undertaken in this report: (1) lead recovery and recycling was stable between 1999 and 2006; (2) lead consumption has increased at an annual rate of 2.25%, and thus the values derived in this study for opportunities dealing with lead recovery and recycling underestimate the amount of lead in scrap and waste generated; and (3) the…

  4. Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge

    Science.gov (United States)

    Yap, Keng C.

    2010-01-01

    This viewgraph presentation reviews Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge. The Wing Leading Edge Impact Detection System (WLE IDS) and the Impact Analysis Process are also described to monitor WLE debris threats. The contents include: 1) Risk Management via SHM; 2) Hardware Overview; 3) Instrumentation; 4) Sensor Configuration; 5) Debris Hazard Monitoring; 6) Ascent Response Summary; 7) Response Signal; 8) Distribution of Flight Indications; 9) Probabilistic Risk Analysis (PRA); 10) Model Correlation; 11) Impact Tests; 12) Wing Leading Edge Modeling; 13) Ascent Debris PRA Results; and 14) MM/OD PRA Results.

  5. Analysis of leading sector of Jambi City

    Directory of Open Access Journals (Sweden)

    Hardiani Hardiani

    2017-09-01

    This study aims to analyze the leading sectors in the city of Jambi. The main data used are the GDP data of Jambi City (2010 series) for the period 2012-2014. The analysis tools used are Location Quotient, Shift-Share, Klassen Typology and Overlay analysis. The results of the analysis found that of the 14 basic sectors in Jambi City (based on the LQ analysis), there are four priority sectors, namely electricity and gas procurement; building; wholesale and retail trade and car and motorcycle repair; and health services and social activities. Keywords: Location Quotient, Shift Share, Klassen Typology, Overlay Analysis
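
    The Shift-Share component of the toolkit used here decomposes a sector's regional growth into a provincial growth effect, a proportional (industry-mix) shift and a differential (competitive) shift. The base value and growth rates below are assumptions for illustration, not Jambi City GDP figures.

```python
# Shift-share sketch: split a sector's regional growth into a provincial
# growth component, a proportional (industry-mix) shift and a differential
# (competitive) shift. All inputs are illustrative assumptions.
def shift_share(e_region_0, g_region, g_prov_sector, g_prov_total):
    provincial = e_region_0 * g_prov_total
    mix        = e_region_0 * (g_prov_sector - g_prov_total)
    competitive = e_region_0 * (g_region - g_prov_sector)
    return provincial, mix, competitive

# e.g. one sector: base GRDP 240, grew 9% locally; the same sector grew 6%
# province-wide; total provincial GRDP grew 5%.
p, m, c = shift_share(240.0, 0.09, 0.06, 0.05)
print(f"provincial growth {p:.1f}, industry mix {m:.1f}, competitive shift {c:.1f}")
# A positive competitive (differential) shift marks a locally competitive sector.
```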

  6. The lead cooled fast reactor benchmark Brest-300: analysis with sensitivity method

    International Nuclear Information System (INIS)

    Smirnov, V.; Orlov, V.; Mourogov, A.; Lecarpentier, D.; Ivanova, T.

    2005-01-01

    The lead-cooled fast neutron reactor is one of the most interesting candidates for the development of atomic energy. BREST-300 is a 300 MWe lead-cooled fast reactor developed by NIKIET (Russia) with a deterministic safety approach which aims to exclude reactivity margins greater than the delayed neutron fraction. The development of innovative reactors (lead coolant, nitride fuel, etc.) and of fuel cycles with new constraints such as cycle closure or actinide burning requires new technologies and new nuclear data. In this connection, the tools and neutron data used for the calculational analysis of reactor characteristics require thorough validation. NIKIET developed a reactor benchmark suited to the testing of design-type calculational tools (including neutron data). In the frame of technical exchanges between NIKIET and EDF (France), the results of this benchmark calculation concerning the principal parameters of fuel evolution and safety parameters have been inter-compared in order to estimate the uncertainties and validate the codes for calculations of this new kind of reactor. Different codes and cross-section data have been used, and sensitivity studies have been performed to understand and quantify the sources of uncertainty. The comparison of results shows that the difference in the k_eff value between the ERANOS code with the ERALIB1 library and the reference is of the same order of magnitude as the delayed neutron fraction. On the other hand, the discrepancy is more than twice as large if the JEF2.2 library is used with ERANOS. Analysis of the discrepancies in the calculation results reveals that the main effect comes from differences in nuclear data, namely the U-238 and Pu-239 fission and capture cross sections and the lead inelastic cross sections.

  7. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    Science.gov (United States)

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven e.g. by machine learning methods. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest to apply the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in main and test analyses, because only this way possible confounds and unexpected properties can be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: A mismatch between counterbalancing (crossover designs) and cross-validation which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
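
    One of the checks the authors advocate, running the identical analysis on simulated null data, can be sketched as follows; the decoder, design sizes and cross-validation scheme are assumptions chosen for illustration rather than the paper's exact pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Run the *same* decoding analysis on simulated null data (random features,
# balanced labels) and check that accuracy stays at chance.
rng = np.random.default_rng(0)
n_trials, n_features = 200, 50
X_null = rng.standard_normal((n_trials, n_features))   # no signal by construction
y = np.tile([0, 1], n_trials // 2)                      # balanced two-class labels

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(LogisticRegression(max_iter=1000), X_null, y, cv=cv)
print(f"null-data accuracy: {acc.mean():.2f} (chance = 0.50)")
# Systematic above- or below-chance accuracy here would point to a confound in
# the design or cross-validation scheme rather than a real effect.
```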

  8. Comprehensive analysis of 5-aminolevulinic acid dehydrogenase (ALAD variants and renal cell carcinoma risk among individuals exposed to lead.

    Directory of Open Access Journals (Sweden)

    Dana M van Bemmel

    BACKGROUND: Epidemiologic studies are reporting associations between lead exposure and human cancers. A polymorphism in the 5-aminolevulinic acid dehydratase (ALAD) gene affects lead toxicokinetics and may modify the adverse effects of lead. METHODS: The objective of this study was to evaluate single-nucleotide polymorphisms (SNPs) tagging the ALAD region among renal cancer cases and controls to determine whether genetic variation alters the relationship between lead and renal cancer. Occupational exposure to lead and risk of cancer was examined in a case-control study of renal cell carcinoma (RCC). Comprehensive analysis of variation across the ALAD gene was assessed using a tagging SNP approach among 987 cases and 1298 controls. Occupational lead exposure was estimated using questionnaire-based exposure assessment and expert review. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using logistic regression. RESULTS: The adjusted risk associated with the ALAD variant rs8177796 (CT/TT) was increased (OR = 1.35, 95% CI = 1.05-1.73, p-value = 0.02) when compared to the major allele, regardless of lead exposure. Joint effects of lead and ALAD rs2761016 suggest an increased RCC risk for the homozygous wild-type and heterozygous alleles (GG: OR = 2.68, 95% CI = 1.17-6.12, p = 0.01; GA: OR = 1.79, 95% CI = 1.06-3.04), with an interaction approaching significance (p-int = 0.06). No significant modification in RCC risk was observed for the functional variant rs1800435 (K68N). Haplotype analysis identified a region associated with risk, supporting the tagging SNP results. CONCLUSION: A common genetic variation in ALAD may alter the risk of RCC overall and among individuals occupationally exposed to lead. Further work in larger exposed populations is warranted to determine if ALAD modifies RCC risk associated with lead exposure.
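
    The odds-ratio reporting in this record follows the usual logistic-regression recipe, sketched below on simulated genotype data; the carrier frequency and effect size are assumptions, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of OR and 95% CI from logistic regression: OR = exp(beta),
# CI = exp(beta +/- 1.96*SE). Simulated case/control data, not the study's.
rng = np.random.default_rng(0)
n = 2000
variant = rng.binomial(1, 0.3, n)              # carrier indicator (assumed freq.)
logit_p = -0.8 + 0.3 * variant                 # assumed true log-odds
case = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(variant.astype(float))
fit = sm.Logit(case, X).fit(disp=False)
beta, se = fit.params[1], fit.bse[1]
print(f"OR = {np.exp(beta):.2f}, 95% CI = "
      f"{np.exp(beta - 1.96 * se):.2f}-{np.exp(beta + 1.96 * se):.2f}")
```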

  9. Re-analysis of fatigue data for welded joints using the notch stress approach

    DEFF Research Database (Denmark)

    Pedersen, Mikkel Melters; Mouritsen, Ole Ø.; Hansen, Michael Rygaard

    2010-01-01

    Experimental fatigue data for welded joints have been collected and subjected to re-analysis using the notch stress approach according to IIW recommendations. This leads to an overview regarding the reliability of the approach, based on a large number of results (767 specimens). Evidently …-welded joints agree quite well with the FAT 225 curve; however, a reduction to FAT 200 is suggested in order to achieve approximately the same safety as observed in the nominal stress approach.
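
    For readers unfamiliar with the notation, a FAT class in the IIW notch stress approach is the characteristic stress range at two million cycles on an S-N line of slope m = 3, so the FAT 225 versus FAT 200 comparison can be sketched as below; the stress range used is an arbitrary assumption.

```python
# FAT class sketch: N = 2e6 * (FAT / delta_sigma)^m with m = 3.
# The effective notch stress range below is an illustrative assumption.
def cycles_to_failure(stress_range_mpa, fat_class, m=3.0, n_ref=2e6):
    return n_ref * (fat_class / stress_range_mpa) ** m

ds = 300.0   # assumed effective notch stress range in MPa
for fat in (225, 200):
    print(f"FAT {fat}: N ~ {cycles_to_failure(ds, fat):.2e} cycles")
# Moving from FAT 225 to FAT 200 cuts the allowed life by (200/225)^3 ~ 0.70.
```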

  10. Corrections to the leading eikonal amplitude for high-energy scattering and quasipotential approach

    International Nuclear Information System (INIS)

    Nguyen Suan Hani; Nguyen Duy Hung

    2003-12-01

    The asymptotic behaviour of the scattering amplitude for two scalar particles at high energy and fixed momentum transfers is reconsidered in quantum field theory. In the framework of the quasipotential approach and the modified perturbation theory, a systematic scheme for finding the leading eikonal scattering amplitude and its corrections is developed and constructed. The connection between the solutions obtained by the quasipotential and functional approaches is also discussed. (author)

  11. Tracing fetal and childhood exposure to lead using isotope analysis of deciduous teeth

    International Nuclear Information System (INIS)

    Shepherd, Thomas J.; Dirks, Wendy; Roberts, Nick M.W.; Patel, Jaiminkumar G.; Hodgson, Susan; Pless-Mulloli, Tanja; Walton, Pamela; Parrish, Randall R.

    2016-01-01

    … Our pilot study confirms that laser ablation Pb isotope analysis of deciduous teeth, when carried out in conjunction with histological analysis, permits a reconstruction of the timing, duration and source of exposure to Pb during early childhood. With further development, this approach has the potential to study larger cohorts and appraise environments where the levels of exposure to Pb are much higher. - Highlights: • Reconstructing a high resolution chronology of early childhood exposure to lead. • Combined laser ablation lead isotope – histological analysis of children's teeth. • Using dentine to recover information on the intensity, duration and source of lead. • Importance of industrial airborne lead pollution in a post-leaded petrol era.

  12. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  13. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Automatic key concept extraction from text is the main challenging task in information extraction, information retrieval, digital libraries, ontology learning, and text analysis. Statistical frequency and topical graph-based ranking are two potentially powerful and leading kinds of unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, in order to identify the major sources of error in these approaches. For the experimental analysis, we selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactical errors and semantic errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that the performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
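
    The frequency-based end of the spectrum evaluated here (TF-IDF) can be sketched in a few lines; the toy corpus and vectorizer settings are illustrative assumptions, not the datasets or configurations used in the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Minimal sketch of frequency-based (TF-IDF) key concept ranking: score
# candidate terms per document and keep the top-ranked ones.
docs = [
    "key concept extraction supports information retrieval and digital libraries",
    "topical graph based ranking builds a graph of candidate concepts",
    "frequency statistics such as tf idf rank candidate key concepts",
]
vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
tfidf = vec.fit_transform(docs)
terms = vec.get_feature_names_out()

for i in range(len(docs)):
    row = tfidf[i].toarray().ravel()
    top = row.argsort()[::-1][:3]          # three highest-scoring candidates
    print(f"doc {i}: {[terms[j] for j in top]}")
```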

  14. Lead isotope approach to the understanding of early Japanese bronze culture

    International Nuclear Information System (INIS)

    Mabuchi, H.; Hirao, Y.

    1985-01-01

    For several years, the authors have used lead isotope analysis to investigate extensively the provenance of ancient bronze or copper artifacts which had been excavated mainly from Japanese archaeological sites. The results have been published item by item in several relevant Japanese journals. This review is intended to give an account which will review the whole work relating early Japanese bronze culture to Chinese and Korean cultures through lead isotope study. (author)

  15. Relational Leading

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Rasmussen, Jørgen Gulddahl

    2015-01-01

    This first chapter presents the exploratory and curious approach to leading as relational processes – an approach that pervades the entire book. We explore leading from a perspective that emphasises the unpredictable challenges and triviality of everyday life, which we consider an interesting......, relevant and realistic way to examine leading. The chapter brings up a number of concepts and contexts as formulated by researchers within the field, and in this way seeks to construct a first understanding of relational leading....

  16. Tracing fetal and childhood exposure to lead using isotope analysis of deciduous teeth

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, Thomas J. [Centre for Oral Health Research, School of Dental Sciences, Newcastle University, Newcastle upon Tyne (United Kingdom); British Geological Survey, Keyworth, Nottingham (United Kingdom); Dirks, Wendy [Department of Anthropology, Durham University, Durham (United Kingdom); Roberts, Nick M.W. [NERC Isotope Geosciences Laboratory, British Geological Survey, Nottingham (United Kingdom); Patel, Jaiminkumar G. [Leeds Dental Institute, University Leeds, Leeds (United Kingdom); Hodgson, Susan [MRC-PHE Centre for Environment and Health, Department of Epidemiology and Biostatistics, Imperial College London (United Kingdom); Pless-Mulloli, Tanja [Institute of Health and Society, Newcastle University, Newcastle upon Tyne (United Kingdom); Walton, Pamela [Centre for Oral Health Research, School of Dental Sciences, Newcastle University, Newcastle upon Tyne (United Kingdom); Parrish, Randall R. [British Geological Survey, Keyworth, Nottingham (United Kingdom)

    2016-04-15

    …changes in the isotope composition of blood Pb. Our pilot study confirms that laser ablation Pb isotope analysis of deciduous teeth, when carried out in conjunction with histological analysis, permits a reconstruction of the timing, duration and source of exposure to Pb during early childhood. With further development, this approach has the potential to study larger cohorts and appraise environments where the levels of exposure to Pb are much higher. - Highlights: • Reconstructing a high resolution chronology of early childhood exposure to lead. • Combined laser ablation lead isotope – histological analysis of children's teeth. • Using dentine to recover information on the intensity, duration and source of lead. • Importance of industrial airborne lead pollution in a post-leaded petrol era.

  17. Childhood lead poisoning investigations: evaluating a portable instrument for testing soil lead.

    Science.gov (United States)

    Reames, Ginger; Lance, Larrie L

    2002-04-01

    The Childhood Lead Poisoning Prevention Branch of the California Department of Health Services evaluated a portable X-ray fluorescence (XRF) instrument for use as a soil lead-testing tool during environmental investigations of lead-poisoned children's homes. A Niton XRF was used to test soil at 119 sampling locations in the yards of 11 San Francisco Bay Area houses. Niton XRF readings were highly correlated with laboratory results and met the study criteria for an acceptable screening method. The data suggest that the most health-protective and time-efficient approach to testing for soil lead above regulatory levels is to take either surface readings or readings of a test cup of soil prepared by grinding with a mortar and pestle. The advantage of the test cup method is that the test cup with soil may be submitted to a laboratory for confirmatory analysis.

  18. Mixcore safety analysis approach used for introduction of Westinghouse fuel assemblies in Ukraine

    International Nuclear Information System (INIS)

    Abdullayev, A.; Baidullin, V.; Maryochin, A.; Sleptsov, S.; Kulish, G.

    2008-01-01

    Six Westinghouse Lead Test Assemblies (LTAs) were installed in 2005 and are currently operated in Unit 3 of the South Ukraine NPP (SUNPP) under the Ukraine Nuclear Fuel Qualification Project. At the early stages of LTA implementation in Ukraine, there was no experience of licensing new fuel types, which explains the need to develop approaches for the safety substantiation of LTAs. This presentation considers some approaches for performing safety analysis of the design basis Initiating Events (IE) for the LTA fuel cycles. These approaches are non-standard in terms of the established practices for obtaining the regulatory authorities' permission for core operation. The analysis was based on the results of the FA and reactor core thermal-hydraulic and nuclear design.

  19. Identification of sources of lead exposure in French children by lead isotope analysis: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Lucas Jean-Paul

    2011-08-01

    Background: The amount of lead in the environment has decreased significantly in recent years, and so has exposure. However, there is no known safe exposure level and, therefore, the exposure of children to lead, although low, remains a major public health issue. With the lower levels of exposure, it is becoming more difficult to identify lead sources, and new approaches may be required for preventive action. This study assessed the usefulness of lead isotope ratios for identifying sources of lead, using data from a nationwide sample of French children aged from six months to six years with blood lead levels ≥25 μg/L. Methods: Blood samples were taken from 125 children, representing about 600,000 French children; environmental samples were taken from their homes and personal information was collected. Lead isotope ratios were determined using quadrupole ICP-MS (inductively coupled plasma mass spectrometry), and the isotopic signatures of potential sources of exposure were matched with those of blood in order to identify the most likely sources. Results: In addition to the interpretation of lead concentrations, lead isotope ratios were potentially of use for 57% of children aged from six months to six years with blood lead levels ≥25 μg/L (7% of all children in France, about 332,000 children) with at least one potential source of lead and sufficiently well discriminated lead isotope ratios. Lead isotope ratios revealed a single suspected source of exposure for 32% of the subjects and were able to eliminate at least one unlikely source of exposure for 30% of the children. Conclusions: In France, lead isotope ratios could provide valuable additional information in about a third of routine environmental investigations.
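
    The source-matching idea can be illustrated with a toy comparison of blood and candidate-source isotope ratios within a fixed tolerance; the ratios and tolerances below are invented, and this is only a sketch of the matching logic, not the study's statistical procedure.

```python
# Toy isotope-ratio matching: a candidate source is retained if both of its
# ratios fall within an assumed tolerance of the child's blood signature.
# All numbers are illustrative assumptions.
blood = {"206/207": 1.154, "208/206": 2.102}
sources = {
    "paint flake":     {"206/207": 1.156, "208/206": 2.099},
    "tap water":       {"206/207": 1.098, "208/206": 2.145},
    "playground soil": {"206/207": 1.170, "208/206": 2.080},
}
tol = {"206/207": 0.004, "208/206": 0.010}   # assumed combined tolerance

for name, sig in sources.items():
    match = all(abs(sig[k] - blood[k]) <= tol[k] for k in blood)
    print(f"{name:16s} {'compatible' if match else 'excluded'}")
```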

  20. Systematic approach to scenario development using FEP analysis

    International Nuclear Information System (INIS)

    Bailey, L.E.E.; Lever, D.A.

    2001-01-01

    UK regulatory requirements require that the 'assessed radiological risk ... to a representative member of the potentially exposed group at greatest risk should be consistent with a risk target of 10^-6 per year' and that risks should be 'summed over all situations that could give rise to exposure to the group'. It is a further requirement that a repository performance assessment provides a 'comprehensive record of the judgements and assumptions on which the risk assessments are based'. In order to meet these requirements, Nirex, working with AEA Technology, has developed an approach to performance assessment based on the identification and analysis of features, events and processes (FEPs). The objectives of the approach are to provide a comprehensive, traceable and clear presentation of a performance assessment for a deep geological radioactive waste repository. The approach to scenario development is fundamental to the overall Nirex strategy for performance assessment, eventually leading to a repository safety case for regulatory submission. This paper outlines the main concepts of the approach, illustrated with examples of work undertaken by Nirex to demonstrate its practicality. Due to the current status of the Nirex repository programme, the approach has not yet been used to conduct a full performance assessment of a repository located at a specific site. (authors)

  1. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation...... is performed by pre-concentrating zinc on glassy carbon at -1400 mV(SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found...
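
    The linear peak-area-versus-concentration relationship reported here is, in effect, a least-squares calibration line; the sketch below fits one and inverts it for an unknown sample, with all calibration points being assumed values rather than data from the paper.

```python
import numpy as np

# Sketch of a linear stripping-analysis calibration: derivative-mode peak
# area versus concentration, fitted by least squares and inverted for an
# unknown. Calibration points are assumed values.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # ppm Zn (assumed standards)
area = np.array([0.02, 0.55, 1.08, 2.11, 4.25])   # peak areas (assumed)

slope, intercept = np.polyfit(conc, area, 1)
unknown_area = 1.60
unknown_conc = (unknown_area - intercept) / slope
print(f"slope {slope:.3f}, intercept {intercept:.3f}, "
      f"unknown ~{unknown_conc:.2f} ppm")
```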

  2. Novel approach of fragment-based lead discovery applied to renin inhibitors.

    Science.gov (United States)

    Tawada, Michiko; Suzuki, Shinkichi; Imaeda, Yasuhiro; Oki, Hideyuki; Snell, Gyorgy; Behnke, Craig A; Kondo, Mitsuyo; Tarui, Naoki; Tanaka, Toshimasa; Kuroita, Takanobu; Tomimoto, Masaki

    2016-11-15

    A novel approach was conducted for fragment-based lead discovery and applied to renin inhibitors. The biochemical screening of a fragment library against renin provided a hit fragment which showed a characteristic interaction pattern with the target protein. The hit fragment bound only to the S1, S3, and S3SP (S3 subpocket) sites without any interactions with the catalytic aspartate residues (Asp32 and Asp215 (pepsin numbering)). Prior to making chemical modifications to the hit fragment, we first identified its essential binding sites by utilizing the hit fragment's substructures. Second, we created a new and smaller scaffold, which better occupied the identified essential S3 and S3SP sites, by utilizing library synthesis with high-throughput chemistry. We then revisited the S1 site and efficiently explored good building blocks to attach to the scaffold via library synthesis. In the library syntheses, the binding modes of each pivotal compound were determined and confirmed by X-ray crystallography, and the library was strategically designed by a structure-based computational approach not only to obtain a more active compound but also to obtain an informative structure-activity relationship (SAR). As a result, we obtained a lead compound offering synthetic accessibility as well as improved in vitro ADMET profiles. The fragments and compounds possessing a characteristic interaction pattern provided new structural insights into renin's active site and the potential to create a new generation of renin inhibitors. In addition, we demonstrated that our FBDD strategy, integrating a highly sensitive biochemical assay, X-ray crystallography, high-throughput synthesis and in silico library design aimed at fragment morphing at the initial stage, was effective in elucidating a pocket profile and delivering a promising lead compound. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and further cause substantial drops in performance. On the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly/inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.
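
The record does not spell out how the different information sources are aggregated. A minimal sketch of one plausible fusion scheme, a weighted combination of normalized subjective, performance and physiological measures with hypothetical weights (not taken from the paper), is:

```python
import numpy as np

def engagement_index(subjective, performance, physiological, weights=(0.3, 0.3, 0.4)):
    """Fuse normalized measures (each scaled to [0, 1]) into a single
    engagement score. The weights are illustrative, not from the paper."""
    features = np.array([subjective, performance, physiological])
    w = np.array(weights)
    return float(np.dot(w, features) / w.sum())

# Example: moderate subjective rating, good performance, low physiological arousal
score = engagement_index(subjective=0.6, performance=0.8, physiological=0.3)
print(f"engagement score: {score:.2f}")  # values near 0 or 1 flag under-/overload
```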

  4. Comparison of two methods for blood lead analysis in cattle: graphite-furnace atomic absorption spectrometry and LeadCare(R) II system.

    Science.gov (United States)

    Bischoff, Karyn; Gaskill, Cynthia; Erb, Hollis N; Ebel, Joseph G; Hillebrandt, Joseph

    2010-09-01

    The current study compared the LeadCare(R) II test kit system with graphite-furnace atomic absorption spectrometry for blood lead (Pb) analysis in 56 cattle accidentally exposed to Pb in the field. Blood Pb concentrations were determined by LeadCare II within 4 hr of collection and after 72 hr of refrigeration. Blood Pb concentrations were also determined by atomic absorption spectrometry, and samples that were coagulated (n = 12) were homogenized before analysis. There was strong rank correlation (R² = 0.96) between atomic absorption and LeadCare II (within 4 hr of collection), and a conversion formula was determined for values within the observed range (3-91 mcg/dl, although few had values >40 mcg/dl). Median and mean blood Pb concentrations for atomic absorption were 7.7 and 15.9 mcg/dl, respectively; for LeadCare II, medians were 5.2 mcg/dl at 4 hr and 4.9 mcg/dl at 72 hr, and means were 12.4 and 11.7 mcg/dl, respectively. LeadCare II results at 4 hr strongly correlated with 72 hr results (R² = 0.96), but results at 72 hr were lower (P < …) than those by atomic absorption. Although there have been several articles that compared LeadCare with other analytical techniques, all were for the original system, not LeadCare II. The present study indicated that LeadCare II results correlated well with atomic absorption over a wide range of blood Pb concentrations and that refrigerating samples for up to 72 hr before LeadCare II analysis was acceptable for clinical purposes.
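
The rank correlation and conversion formula mentioned above can be reproduced with standard tools. A sketch with hypothetical paired blood-lead values (not the study's data), assuming SciPy is available:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical paired blood-lead values (mcg/dl); not the study's data.
aas = np.array([3.5, 6.0, 8.2, 12.4, 20.1, 35.7, 60.3, 91.0])       # atomic absorption
leadcare = np.array([2.8, 4.9, 6.5, 10.1, 16.8, 30.2, 52.4, 80.5])  # LeadCare II, <4 hr

# Rank (Spearman) correlation between the two methods
rho, p = spearmanr(aas, leadcare)
print(f"Spearman rho = {rho:.3f} (p = {p:.3g})")

# Simple linear conversion formula: AAS ~ a * LeadCare + b, valid only
# within the calibrated concentration range
a, b = np.polyfit(leadcare, aas, 1)
print(f"AAS estimate = {a:.2f} * LeadCareII + {b:.2f}")
```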

  5. Long-term dietary exposure to lead in young European children: Comparing a pan-European approach with a national exposure assessment

    DEFF Research Database (Denmark)

    Boon, P.E.; Te Biesebeek, J.D.; van Klaveren, J.D.

    2012-01-01

    Long-term dietary exposures to lead in young children were calculated by combining food consumption data of 11 European countries categorised using harmonised broad food categories with occurrence data on lead from different Member States (pan-European approach). The results of the assessment...... in children living in the Netherlands were compared with a long-term lead intake assessment in the same group using Dutch lead concentration data and linking the consumption and concentration data at the highest possible level of detail. Exposures obtained with the pan-European approach were higher than...... the national exposure calculations. For both assessments cereals contributed most to the exposure. The lower dietary exposure in the national study was due to the use of lower lead concentrations and a more optimal linkage of food consumption and concentration data. When a pan-European approach, using...

  6. Lead isotope ratio analysis of bullet samples by using quadrupole ICP-MS

    International Nuclear Information System (INIS)

    Tamura, Shu-ichi; Hokura, Akiko; Nakai, Izumi; Oishi, Masahiro

    2006-01-01

    The measurement conditions for the precise analysis of stable lead isotope ratios by using an ICP-MS equipped with a quadrupole mass spectrometer were studied in order to apply the technique to the forensic identification of bullet samples. The values of the relative standard deviation obtained for the ratios ²⁰⁸Pb/²⁰⁶Pb, ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁴Pb/²⁰⁶Pb were lower than 0.2% after optimization of the analytical conditions, including an optimum lead concentration of the sample solution of about 70 ppb and an integration time of 15 s per m/z. This method was applied to an analysis of lead in bullets for rifles and handguns; the stable isotope ratios of lead were found to be suitable for the identification of bullets. This study has demonstrated that the lead isotope ratio measured by using a quadrupole ICP-MS was useful for a practical analysis of bullet samples in forensic science. (author)
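
For readers unfamiliar with how the quoted RSD values are obtained, a short sketch with hypothetical replicate intensities (illustrative only, not data from the study):

```python
import numpy as np

# Hypothetical replicate isotope intensities (counts per second) for one
# bullet-lead solution; values are illustrative only.
counts = {
    "Pb204": np.array([1.02e4, 1.01e4, 1.03e4]),
    "Pb206": np.array([1.85e5, 1.84e5, 1.86e5]),
    "Pb207": np.array([1.58e5, 1.57e5, 1.59e5]),
    "Pb208": np.array([3.70e5, 3.68e5, 3.72e5]),
}

for iso in ("Pb208", "Pb207", "Pb204"):
    ratio = counts[iso] / counts["Pb206"]          # per-replicate ratio to 206Pb
    rsd = 100 * ratio.std(ddof=1) / ratio.mean()   # relative standard deviation (%)
    print(f"{iso}/Pb206 = {ratio.mean():.4f}  RSD = {rsd:.2f}%")
```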

  7. Childhood lead exposure in France: benefit estimation and partial cost-benefit analysis of lead hazard control

    Directory of Open Access Journals (Sweden)

    Zmirou-Navier Denis

    2011-05-01

    Full Text Available Abstract Background Lead exposure remains a public health concern due to its serious adverse effects, such as cognitive and behavioral impairment, children younger than six years of age being the most vulnerable population. In Europe, the lead-related economic impacts have not been examined in detail. We estimate the annual costs in France due to childhood exposure and, through a cost-benefit analysis (CBA), aim to assess the expected social and economic benefits of exposure abatement. Methods Monetary benefits were assessed in terms of avoided national costs. We used results from a 2008 survey on blood-lead (B-Pb) concentrations in French children aged one to six years. Given the absence of an established threshold concentration, we performed a sensitivity analysis assuming different hypothetical threshold values for toxicity above 15 μg/L, 24 μg/L and 100 μg/L. Adverse health outcomes of lead exposure were translated into social burden and economic costs based on data from the literature. Direct health benefits, social benefits and intangible avoided costs were included. Costs of pollutant exposure control were partially estimated with regard to lead-based paint decontamination in homes, investments aiming at reducing industrial lead emissions, and removal of all lead drinking-water pipes. Results The overall annual benefits for the three hypothetical threshold values in 2008 are €22.72 billion, €10.72 billion and €0.44 billion, respectively. Costs of abatement ranged from €0.9 billion to €2.95 billion/year. Finally, from a partial CBA of lead control in soils and dust, the estimates of total net benefits were €3.78 billion, €1.88 billion and €0.25 billion, respectively, for the three hypothesized B-Pb effect values. Conclusions Prevention of childhood lead exposure has a high social benefit, due to reduction of B-Pb concentrations to levels below 15 μg/L or 24 μg/L, respectively. Reducing only exposures

  8. 'Top-down' BACT analysis - Recommended approach and recent determinations

    International Nuclear Information System (INIS)

    Cochran, J.R.; Fagan, M.E.

    1991-01-01

    New EPA requirements for 'top-down' best available control technology (BACT) analyses have resulted in determinations that require more stringent control technologies. Accordingly, these permit decisions include nitrogen oxide (NOx), sulfur dioxide, and particulate emission limits significantly lower than applicable New Source Performance Standards. However, with careful consideration of acceptable site-specific impacts, obtaining a reasonable BACT determination is still possible. This paper presents a step-by-step approach for conducting a top-down BACT analysis, and summarizes important considerations that will lead to a more effective BACT analysis. In addition, recent permit decisions regarding NOx emission rate and control technology requirements for combined cycle combustion turbine and coal fueled power plants are summarized and examined to ascertain the basis for decisions. Guidance from this paper will help applicants in preparing an accurate and comprehensive BACT analysis for their proposed projects

  9. Effect of Wetting Agents and Approaching Anodes on Lead Migration in Electrokinetic Soil Remediation

    OpenAIRE

    Ng, Yee-Sern; Gupta, Bhaskar Sen; Hashim, Mohd Ali

    2015-01-01

    This is the presentation slides for my conference paper "Effect of Wetting Agents and Approaching Anodes on Lead Migration in Electrokinetic Soil Remediation", which was presented in 5th International Conference on Chemical Engineering and Applications, Taipei on 27 August 2014.

  10. Blood, urine, and hair kinetic analysis following an acute lead intoxication.

    Science.gov (United States)

    Ho, G; Keutgens, A; Schoofs, R; Kotolenko, S; Denooz, R; Charlier, C

    2011-01-01

    A case of lead exposure resulting from the accidental ingestion of a lead-containing solution is reported. Because chelation therapy with 2,3-dimercaptopropane sulfonate sodium and meso-2,3-dimercaptosuccinic acid was rapidly performed, the blood lead levels of this 51-year-old patient remained moderate (412.9 μg/L) and no clinical symptoms were observed. Numerous blood and urine samples were collected for kinetic analysis of lead elimination. Moreover, we report the first case in which hair samples were analyzed to determine the excretion level of lead after acute intoxication.

  11. A Public Health Approach to Addressing Lead

    Science.gov (United States)

    Describes EPA’s achievements in reducing childhood lead exposures and emphasizes the need to continue actions to further reduce lead exposures, especially in those communities where exposures remain high.

  12. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strong dielectric ceramic with piezoelectric and pyroelectric properties, and is most widely used as a piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), which is a typical electro-optical conversion element. Since their development, various electronic parts utilizing the piezoelectric characteristics have been put into practical use. The characteristics can be adjusted by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have their drawbacks; therefore, the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, an instrument that has developed remarkably in recent years. As a result, a method was established in which the specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  13. A hybrid approach for global sensitivity analysis

    International Nuclear Information System (INIS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2017-01-01

    Distribution based sensitivity analysis (DSA) computes sensitivity of the input random variables with respect to the change in distribution of output response. Although DSA is widely appreciated as the best tool for sensitivity analysis, the computational issue associated with this method prohibits its use for complex structures involving costly finite element analysis. For addressing this issue, this paper presents a method that couples polynomial correlated function expansion (PCFE) with DSA. PCFE is a fully equivalent operational model which integrates the concepts of analysis of variance decomposition, extended bases and homotopy algorithm. By integrating PCFE into DSA, it is possible to considerably alleviate the computational burden. Three examples are presented to demonstrate the performance of the proposed approach for sensitivity analysis. For all the problems, proposed approach yields excellent results with significantly reduced computational effort. The results obtained, to some extent, indicate that proposed approach can be utilized for sensitivity analysis of large scale structures. - Highlights: • A hybrid approach for global sensitivity analysis is proposed. • Proposed approach integrates PCFE within distribution based sensitivity analysis. • Proposed approach is highly efficient.
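
The PCFE surrogate itself is not reproduced here; as background, the sketch below shows a generic Monte Carlo estimate of a variance-based (first-order) sensitivity index, the kind of quantity such surrogates are meant to compute cheaply. It is not the paper's method, only an illustration of global sensitivity analysis:

```python
import numpy as np

def first_order_indices(model, n=100_000, d=3, bins=40, seed=0):
    """Crude Monte Carlo estimate of first-order sensitivity indices
    S_i = Var(E[Y|X_i]) / Var(Y) by binning each input into quantile bins.
    Generic variance-based sensitivity analysis, not the PCFE surrogate."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=(n, d))
    y = model(x)
    var_y = y.var()
    indices = []
    for i in range(d):
        edges = np.quantile(x[:, i], np.linspace(0, 1, bins + 1))
        which = np.clip(np.searchsorted(edges, x[:, i], side="right") - 1, 0, bins - 1)
        cond_means = np.array([y[which == b].mean() for b in range(bins)])
        indices.append(cond_means.var() / var_y)
    return indices

# Toy model: Y depends strongly on x1, weakly on x2, not at all on x3
model = lambda x: 5 * x[:, 0] + 0.5 * x[:, 1] ** 2
print([round(s, 3) for s in first_order_indices(model)])
```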

  14. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website in order to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise features such as text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features, and if the similarity is greater than the predefined threshold value then it is declared phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.
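
As an illustration of the threshold rule described above, a minimal sketch comparing numeric feature vectors with cosine similarity; the feature encoding and the threshold are assumptions for illustration, not taken from any particular technique in the survey:

```python
import numpy as np

def is_phishing(suspicious_vec, legitimate_vec, threshold=0.9):
    """Flag a suspicious page as phishing when its feature vector (e.g. counts
    of selected HTML tags, CSS properties, image hashes encoded numerically)
    is highly similar to a known legitimate page it does not belong to."""
    a, b = np.asarray(suspicious_vec, float), np.asarray(legitimate_vec, float)
    cosine = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return cosine >= threshold, cosine

# Toy feature vectors for a suspicious page and the legitimate page it imitates
flag, score = is_phishing([12, 3, 7, 1, 0, 5], [12, 3, 8, 1, 0, 5])
print(flag, round(score, 3))
```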

  15. The use of lead isotope analysis to identify potential sources of lead toxicosis in a juvenile bald eagle (Haliaeetus leucocephalus) with ventricular foreign bodies

    Science.gov (United States)

    Franzen-Klein, Dana; McRuer, David; Slabe, Vincent; Katzner, Todd

    2018-01-01

    A male juvenile bald eagle (Haliaeetus leucocephalus) was admitted to the Wildlife Center of Virginia with a left humeral fracture, a large quantity of anthropogenic debris in the ventriculus, a blood lead level of 0.616 ppm, and clinical signs consistent with chronic lead toxicosis. Because of the poor prognosis for recovery and release, the eagle was euthanatized. Lead isotope analysis was performed to identify potential anthropogenic sources of lead in this bird. The lead isotope ratios in the eagle's femur (0.8773), liver (0.8761), and kidneys (0.8686) were most closely related to lead paint (0.8925), leaded gasoline (0.8450), and zinc smelting (0.8240). The lead isotope ratios were dissimilar to lead ammunition (0.8179) and the anthropogenic debris in the ventriculus. This case report documents foreign body ingestion in a free-ranging bald eagle and demonstrates the clinical utility of lead isotope analysis to potentially identify or exclude anthropogenic sources of lead poisoning in wildlife patients.

  16. Lead isotopic compositions of environmental certified reference materials for an inter-laboratory comparison of lead isotope analysis

    International Nuclear Information System (INIS)

    Aung, Nyein Nyein; Uryu, Tsutomu; Yoshinaga, Jun

    2004-01-01

    Lead isotope ratios, viz. ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁸Pb/²⁰⁶Pb, of the commercially available certified reference materials (CRMs) issued in Japan are presented with the objective of providing a data set that will be useful for the quality assurance of analytical procedures, instrumental performance and method validation in laboratories involved in environmental lead isotope ratio analysis. The analytical method used in the present study was inductively coupled plasma quadrupole mass spectrometry (ICP-QMS), preceded by acid digestion and with/without chemical separation of lead from the matrix. The precision of the measurements, in terms of the relative standard deviation (RSD) of triplicate analyses, was 0.19% and 0.14% for ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁸Pb/²⁰⁶Pb, respectively. The trueness of the lead isotope ratio measurements of the present study was tested with a few CRMs that have been analyzed by other analytical methods and reported in the literature. The lead isotopic ratios of 18 environmental matrix CRMs (including 6 CRMs analyzed for our method validation) are presented and the distribution of their ratios is briefly discussed. (author)

  17. A modular approach to lead-cooled reactors modelling

    Energy Technology Data Exchange (ETDEWEB)

    Casamassima, V. [CESI RICERCA, via Rubattino 54, I-20134 Milano (Italy)], E-mail: casamassima@cesiricerca.it; Guagliardi, A. [CESI RICERCA, via Rubattino 54, I-20134 Milano (Italy)], E-mail: guagliardi@cesiricerca.it

    2008-06-15

    After an overview of the lego plant simulation tools (LegoPST), the paper gives some details about the ongoing LegoPST extension for modelling lead fast reactor plants. It refers to a simple mathematical model of the liquid lead channel dynamic process and shows the preliminary results of its application in dynamic simulation of the BREST 300 liquid lead steam generator. Steady state results agree with reference data [IAEA-TECDOC 1531, Fast Reactor Database, 2006 Update] both for water and lead.

  18. A modular approach to lead-cooled reactors modelling

    International Nuclear Information System (INIS)

    Casamassima, V.; Guagliardi, A.

    2008-01-01

    After an overview of the lego plant simulation tools (LegoPST), the paper gives some details about the ongoing LegoPST extension for modelling lead fast reactor plants. It refers to a simple mathematical model of the liquid lead channel dynamic process and shows the preliminary results of its application in dynamic simulation of the BREST 300 liquid lead steam generator. Steady state results agree with reference data [IAEA-TECDOC 1531, Fast Reactor Database, 2006 Update] both for water and lead

  19. Data fusion for QRS complex detection in multi-lead electrocardiogram recordings

    Science.gov (United States)

    Ledezma, Carlos A.; Perpiñan, Gilberto; Severeyn, Erika; Altuve, Miguel

    2015-12-01

    Heart diseases are the main cause of death worldwide. The first step in the diagnosis of these diseases is the analysis of the electrocardiographic (ECG) signal. In turn, the ECG analysis begins with the detection of the QRS complex, which is the waveform with the most energy in the cardiac cycle. Numerous methods have been proposed in the literature for QRS complex detection, but few authors have analyzed the possibility of taking advantage of the information redundancy present in multiple, simultaneously acquired ECG leads to produce accurate QRS detection. In our previous work we presented such an approach, proposing various data fusion techniques to combine the detections made by an algorithm on multiple ECG leads. In this paper we present further studies that show the advantages of this multi-lead detection approach, analyzing how many leads are necessary in order to observe an improvement in the detection performance. A well known QRS detection algorithm was used to test the fusion techniques on the St. Petersburg Institute of Cardiological Technics database. Results show improvement in the detection performance with as few as three leads, but the reliability of these results becomes interesting only after using seven or more leads. Results were evaluated using the detection error rate (DER). The multi-lead detection approach allows an improvement from DER = 3.04% to DER = 1.88%. Further work is needed to improve the detection performance by implementing additional fusion steps.
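
Neither the exact fusion rule nor the DER definition is spelled out in the record. The sketch below assumes one simple fusion rule (majority voting across leads within a time tolerance) and the usual convention DER = (FP + FN) / number of annotated beats; it is an illustration, not the paper's algorithm:

```python
import numpy as np

def fuse_detections(lead_detections, tolerance_ms=100, min_votes=None):
    """Fuse per-lead QRS detection times (ms) by majority vote: a fused beat is
    kept when at least min_votes leads report a detection within tolerance_ms
    of a candidate time. One possible fusion rule, for illustration only."""
    n_leads = len(lead_detections)
    min_votes = min_votes or (n_leads // 2 + 1)
    candidates = np.sort(np.concatenate(lead_detections))
    fused, last = [], -np.inf
    for t in candidates:
        if t - last < tolerance_ms:          # already covered by a fused beat
            continue
        votes = sum(np.any(np.abs(np.asarray(d) - t) <= tolerance_ms)
                    for d in lead_detections)
        if votes >= min_votes:
            fused.append(t)
            last = t
    return fused

def detection_error_rate(n_false_pos, n_false_neg, n_true_beats):
    """DER = (FP + FN) / number of annotated beats, expressed in percent."""
    return 100.0 * (n_false_pos + n_false_neg) / n_true_beats

leads = [[400, 1200, 2000], [405, 1198, 2010], [410, 1500, 2003]]
print(fuse_detections(leads))             # fused beats near 400, 1198, 2000 ms
print(detection_error_rate(2, 1, 100))    # 3.0 (% DER)
```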

  20. ELFR: The European Lead Fast Reactor. Design, Safety Approach and Safety Characteristics

    International Nuclear Information System (INIS)

    Alemberti, Alessandro

    2012-01-01

    • In the framework of the LEADER project, the safety approach for a Lead cooled fast reactor has been defined and, in particular, all the possible challenges to the main safety functions and their mechanisms have been specified, in order to better define the needed provisions. • On the basis of the above and taking into account the results of the safety analyses performed during previous project (ELSY), a reference configuration of the ELFR plant has been consolidated, by improving and updating the plant design features. In particular, the emerged safety concerns have been analyzed in the LEADER project and a new set of design options and safety provisions have been proposed. • The combination of favourable Lead coolant inherent characteristics and plant design features, specifically developed to face identified challenges, resulted in a very robust and forgiving design, even in very extreme conditions, as a Fukushima-like scenario

  1. Electrode alignment of transverse tripoles using a percutaneous triple-lead approach in spinal cord stimulation

    NARCIS (Netherlands)

    Sankarasubramanian, V.; Buitenweg, Jan R.; Holsheimer, J.; Veltink, Petrus H.

    The aim of this modeling study is to determine the influence of electrode alignment of transverse tripoles on the paresthesia coverage of the pain area in spinal cord stimulation, using a percutaneous triple-lead approach. Transverse tripoles, comprising a central cathode and two lateral anodes,

  2. Lead remediation and changes in human lead exposure: some physiological and biokinetic dimensions.

    Science.gov (United States)

    Mushak, Paul

    2003-02-15

    This paper presents a qualitative and quantitative analysis of the various aspects of lead remediation effectiveness with particular reference to human health risk assessment. One of the key elements of lead remediation efforts at such sites as those under the Superfund program deals with populations at elevated exposure and toxicity risk in the proximity of, or at, the site of remediation, especially remediation workers, workers at other tasks on sites that were remediated down to some action level of lead concentration in soils, and groups at risk in nearby communities. A second element has to do with how one measures or models lead exposure changes, with special reference to baseline and post-remediation conditions. Various biomarkers of lead exposure can be employed, but their use requires detailed knowledge of what the results obtained with each of them mean. The most commonly used approach is measurement of blood lead (Pb-B). Recognized limitations in the use of Pb-B have led to the use of predictive Pb exposure models, which are less vulnerable to the many behavioral, physiological, and environmental parameters that can distort isolated or 'single shot' Pb-B testings. A third aspect covered in this paper concerns various physiological factors that affect the methods by which one evaluates Pb remediation effectiveness. Finally, this article offers an integrated look at how lead remediation actions directed at one lead source or pathway affect the total lead exposure picture for human populations at elevated lead exposure and toxicity risk.

  3. Blood lead and preeclampsia: A meta-analysis and review of implications.

    Science.gov (United States)

    Poropat, Arthur E; Laidlaw, Mark A S; Lanphear, Bruce; Ball, Andrew; Mielke, Howard W

    2018-01-01

    Multiple cross-sectional studies suggest that there is an association between blood lead and preeclampsia. We performed a systematic review and meta-analysis to summarize information on the association between preeclampsia and lead poisoning. Searches of Medline, Web of Science, Scopus, Pubmed, Science Direct and ProQuest (dissertations and theses) identified 2089 reports, 46 of which were downloaded after reviewing the abstracts, and 11 studies were evaluated as meeting the selection criteria. Evaluation using the ROBINS-I template (Sterne et al., 2016) indicated moderate risk of bias in all studies. We found that blood lead concentrations were significantly and substantially associated with preeclampsia (k = 12; N = 6069; Cohen's d = 1.26; odds ratio = 9.81; odds ratio LCL = 8.01; odds ratio UCL = 12.02; p = 0.005). Eliminating one study produced a homogeneous meta-analysis and stronger estimates, despite the remaining studies coming from eight separate countries and having countervailing risks of bias. Blood lead concentrations in pregnant women are a major risk factor for preeclampsia, with an increase of 1 μg/dL associated with a 1.6% increase in likelihood of preeclampsia, which appears to be the strongest risk factor for preeclampsia yet reported. Pregnant women with historical lead exposure should routinely have blood lead concentrations tested, especially after mid-term. Women with concentrations higher than 5 μg/dL should be actively monitored for preeclampsia and be advised to take prophylactic calcium supplementation. All pregnant women should be advised to actively avoid lead exposure. Copyright © 2017 Elsevier Inc. All rights reserved.
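
The reported effect size and odds ratio are mutually consistent under the standard conversion between a standardized mean difference and an odds ratio, which assumes an underlying logistic distribution (ln(OR) = πd/√3). A quick check:

```python
import math

# Conversion between Cohen's d and an odds ratio assuming an underlying
# logistic distribution: ln(OR) = pi * d / sqrt(3)
d = 1.26
odds_ratio = math.exp(math.pi * d / math.sqrt(3))
print(f"odds ratio implied by d = {d}: {odds_ratio:.2f}")  # ~9.8, close to the reported 9.81
```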

  4. Liquidity indicator for the Croatian economy – Factor analysis approach

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2014-12-01

    Full Text Available Croatian business surveys (BS) are conducted in the manufacturing industry, retail trade and the construction sector. In all of these sectors, managers' assessments of liquidity are measured. The aim of the paper was to form a new composite liquidity indicator by including business survey liquidity measures from all three economic sectors of the Croatian economy mentioned above. In calculating the leading indicator, a factor analysis approach was used. However, this kind of indicator does not exist in Croatia or in any other European economy. Furthermore, the issue of the illiquidity of Croatian companies is highly neglected in the literature. The empirical analysis consists of two parts. In the first part the new liquidity indicator was formed using factor analysis: one factor, representing the new liquidity indicator (LI), was extracted from the three liquidity variables in the three economic sectors. In the second part, econometric models were applied in order to investigate the forecasting properties of the new business survey liquidity indicator when predicting the direction of changes in Croatian industrial production. The quarterly data used in the research covered the period from January 2000 to April 2013. Based on the econometric analysis, it can be concluded that the LI is a leading indicator of Croatia's industrial production with better forecasting properties than the standard liquidity indicators (formed in the manufacturing industry).
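
A minimal sketch of the single-factor extraction step, with synthetic sectoral series standing in for the Croatian business survey data and scikit-learn's FactorAnalysis standing in for whatever factor-analysis implementation the authors used:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Synthetic quarterly balance-of-answers series (%) for the three sectors'
# liquidity questions; the real inputs are the Croatian business survey data.
rng = np.random.default_rng(1)
common = rng.normal(size=54)                          # latent liquidity conditions
X = np.column_stack([
    10 + 8 * common + rng.normal(scale=2, size=54),   # manufacturing
    5 + 6 * common + rng.normal(scale=2, size=54),    # retail trade
    -2 + 7 * common + rng.normal(scale=2, size=54),   # construction
])

# Extract a single common factor: the composite liquidity indicator (LI)
fa = FactorAnalysis(n_components=1, random_state=0)
li = fa.fit_transform(X).ravel()
print("factor loadings:", fa.components_.ravel().round(2))
print("first few LI values:", li[:4].round(2))
```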

  5. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    Science.gov (United States)

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Isotopic analysis of bullet lead samples

    International Nuclear Information System (INIS)

    Sankar Das, M.; Venkatasubramanian, V.S.; Sreenivas, K.

    1976-01-01

    The possibility of using the isotopic composition of lead for the identification of bullet lead is investigated. Lead from several spent bullets was converted to lead sulphide and analysed for isotopic abundances using an MS-7 mass spectrometer. The abundances were measured as ratios; the variation for Pb 204 was too small to permit differentiation, while the range of variation of Pb 206 and Pb 207, and the better precision in their analyses, permitted differentiating samples from one another. The correlation among the samples examined has been pointed out. The method is complementary to the characterisation of bullet leads by their trace element composition. The possibility of using isotopically enriched lead for tagging bullet lead is pointed out. (author)

  7. Dual-acting of Hybrid Compounds - A New Dawn in the Discovery of Multi-target Drugs: Lead Generation Approaches.

    Science.gov (United States)

    Abdolmaleki, Azizeh; Ghasemi, Jahan B

    2017-01-01

    Finding high quality starting compounds is a critical job at the start of the lead generation stage of multi-target drug discovery (MTDD). Designing hybrid compounds as selective multi-target chemical entities is a challenge, an opportunity, and a new idea for acting better against specific multiple targets. A hybrid molecule is formed from two (or more) participating pharmacophore groups, so these new compounds often exhibit two or more activities, acting as multi-target drugs (mt-drugs), and may have superior safety or efficacy. Integrating a range of information with sophisticated new in silico, bioinformatics, structural biology and pharmacogenomics methods may be useful for the discovery, design and synthesis of new hybrid molecules. In this regard, many rational and screening approaches have been followed by medicinal chemists for lead generation in MTDD. Here, we review some popular lead generation approaches that have been used for designing multiple ligands (DMLs). This paper focuses on dual-acting chemical entities that incorporate parts of two drugs or bioactive compounds to compose hybrid molecules. It also presents some key concepts and the limitations/strengths of lead generation methods by comparing the combination framework method with screening approaches. In addition, a number of examples representing applications of hybrid molecules in drug discovery are included. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  8. Fragment-based virtual screening approach and molecular dynamics simulation studies for identification of BACE1 inhibitor leads.

    Science.gov (United States)

    Manoharan, Prabu; Ghoshal, Nanda

    2018-05-01

    Traditional structure-based virtual screening methods to identify drug-like small molecules for BACE1 have so far been unsuccessful. The location of BACE1, poor blood-brain barrier permeability and P-glycoprotein (Pgp) susceptibility of the inhibitors make it even more difficult. Fragment-based drug design is suitable for efficient optimization of initial hit molecules for a target like BACE1. We have developed a fragment-based virtual screening approach to identify/optimize fragment molecules as a starting point. This method combines the shape, electrostatic, and pharmacophoric features of known fragment molecules bound in protein complex crystal structures, and aims to identify both chemically and energetically feasible small fragment ligands that bind to the BACE1 active site. The two top-ranked fragment hits were subjected to a 53 ns MD simulation. Principal component analysis and free energy landscape analysis reveal that the new ligands show the characteristic features of established BACE1 inhibitors. The method employed in this study may serve the development of potential lead molecules for BACE1-directed Alzheimer's disease therapeutics.

  9. Corrosion by liquid lead and lead-bismuth: experimental results review and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jinsuo [Los Alamos National Laboratory

    2008-01-01

    Liquid metal technologies for liquid lead and lead-bismuth alloy are under wide investigation and development for advanced nuclear energy systems and waste transmutation systems. Material corrosion is one of the main issues that has recently been studied extensively in the development of liquid metal technology. This study reviews corrosion by liquid lead and lead-bismuth, including the corrosion mechanisms, corrosion inhibitors and the formation of the protective oxide layer. The available experimental data are analyzed by using a corrosion model in which oxidation and scale removal are coupled. Based on the model, the long-term behavior of steels in liquid lead and lead-bismuth is predictable. This report provides information for the selection of structural materials for typical nuclear reactor coolant systems in which liquid lead or lead-bismuth is selected as the heat transfer medium.

  10. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases....... Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual...

  11. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  12. A global analysis approach for investigating structural resilience in urban drainage systems.

    Science.gov (United States)

    Mugume, Seith N; Gomez, Diego E; Fu, Guangtao; Farmani, Raziyeh; Butler, David

    2015-09-15

    Building resilience in urban drainage systems requires consideration of a wide range of threats that contribute to urban flooding. Existing hydraulic reliability based approaches have focused on quantifying functional failure caused by extreme rainfall or increase in dry weather flows that lead to hydraulic overloading of the system. Such approaches however, do not fully explore the full system failure scenario space due to exclusion of crucial threats such as equipment malfunction, pipe collapse and blockage that can also lead to urban flooding. In this research, a new analytical approach based on global resilience analysis is investigated and applied to systematically evaluate the performance of an urban drainage system when subjected to a wide range of structural failure scenarios resulting from random cumulative link failure. Link failure envelopes, which represent the resulting loss of system functionality (impacts) are determined by computing the upper and lower limits of the simulation results for total flood volume (failure magnitude) and average flood duration (failure duration) at each link failure level. A new resilience index that combines the failure magnitude and duration into a single metric is applied to quantify system residual functionality at each considered link failure level. With this approach, resilience has been tested and characterised for an existing urban drainage system in Kampala city, Uganda. In addition, the effectiveness of potential adaptation strategies in enhancing its resilience to cumulative link failure has been tested. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
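
The record does not give the exact form of the resilience index. One plausible form, sketched below under that assumption, multiplies a normalized failure magnitude (flooded fraction of inflow) by a normalized failure duration and subtracts the product from one; the function and parameter names are illustrative:

```python
def resilience_index(flood_volume, total_inflow, mean_flood_duration, ref_duration):
    """One plausible resilience index combining failure magnitude and duration
    (a sketch, not necessarily the exact index used in the cited study):
    severity = (flooded fraction of inflow) x (normalized flood duration),
    and residual functionality Res = 1 - severity, bounded to [0, 1]."""
    magnitude = min(flood_volume / total_inflow, 1.0)
    duration = min(mean_flood_duration / ref_duration, 1.0)
    return 1.0 - magnitude * duration

# Example: 15% of inflow floods, and flooding lasts 40% of the reference duration
print(round(resilience_index(1500.0, 10000.0, 2.0, 5.0), 3))   # 0.94
```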

  13. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems—critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  14. Androgen receptor mutations associated with androgen insensitivity syndrome: a high content analysis approach leading to personalized medicine.

    Directory of Open Access Journals (Sweden)

    Adam T Szafran

    2009-12-01

    Full Text Available Androgen insensitivity syndrome (AIS) is a rare disease associated with inactivating mutations of AR that disrupt male sexual differentiation and cause a spectrum of phenotypic abnormalities having as a common denominator loss of reproductive viability. No established treatment exists for these conditions; however, there are sporadic reports of patients (or recapitulated mutations in cell lines) that respond to administration of supraphysiologic doses (or pulses) of testosterone or synthetic ligands. Here, we utilize a novel high content analysis (HCA) approach to study AR function at the single cell level in genital skin fibroblasts (GSF). We discuss in detail findings in GSF from three historical patients with AIS, which include identification of novel mechanisms of AR malfunction, and the potential ability to utilize HCA for personalized treatment of patients affected by this condition.

  15. Thermodynamic analysis of separating lead and antimony in chloride system

    Institute of Scientific and Technical Information of China (English)

    CHEN Jin-zhong; CAO Hua-zhen; LI Bo; YUAN Hai-jun; ZHENG Guo-qu; YANG Tian-zu

    2009-01-01

    In the chloride system, thermodynamic analysis is a useful guide for separating lead and antimony and for understanding the separation mechanism. An efficient and feasible way of separating lead and antimony was discussed. The relationships of [Pb2+][Cl-]^2 versus lg[Cl]T and E versus lg[Cl]T in the Pb-Sb-Cl-H2O system were studied, and the solubilities of lead chloride at different antimony concentrations were calculated based on the principle of simultaneous equilibrium. The results show that the insoluble salt PbCl2 will exist stably only in a certain concentration range of chloride ion. This concentration range expands slightly with increasing antimony concentration in the system, while it narrows as the acidity of the system increases. The solubility of Pb2+ in solution decreases with increasing antimony concentration in the system, whereas it increases with increasing total chlorine concentration. The concentration range of total chlorine for which the lead solubility is less than 0.005 mol/L increases monotonically.
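
The simultaneous-equilibrium calculation referred to above typically combines the solubility product of PbCl2 with the cumulative stability constants of the lead chloro-complexes. A generic formulation is sketched below; the constants must be taken from thermodynamic tables, and the antimony species are omitted for brevity:

```latex
% Generic simultaneous-equilibrium relations for the Pb-Cl subsystem
% (K_sp and the cumulative stability constants \beta_i come from tables):
\begin{align}
  K_{\mathrm{sp}} &= [\mathrm{Pb}^{2+}][\mathrm{Cl}^-]^2
      && \text{(when solid PbCl$_2$ is present)} \\
  [\mathrm{Pb}]_T &= [\mathrm{Pb}^{2+}]\Bigl(1 + \sum_{i=1}^{4}\beta_i[\mathrm{Cl}^-]^i\Bigr)
      && \text{(free ion plus PbCl$_i^{\,2-i}$ complexes)}
\end{align}
```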

  16. Goal-oriented failure analysis - a systems analysis approach to hazard identification

    International Nuclear Information System (INIS)

    Reeves, A.B.; Davies, J.; Foster, J.; Wells, G.L.

    1990-01-01

    Goal-Oriented Failure Analysis, GOFA, is a methodology which is being developed to identify and analyse the potential failure modes of a hazardous plant or process. The technique will adopt a structured top-down approach, with a particular failure goal being systematically analysed. A systems analysis approach is used, with the analysis being organised around a systems diagram of the plant or process under study. GOFA will also use checklists to supplement the analysis - these checklists will be prepared in advance of a group session and will help to guide the analysis and avoid unnecessary time being spent on identifying obvious failure modes or failing to identify certain hazards or failures. GOFA is being developed with the aim of providing a hazard identification methodology which is more efficient and stimulating than the conventional approach to HAZOP. The top-down approach should ensure that the analysis is more focused and the use of a systems diagram will help to pull the analysis together at an early stage whilst also helping to structure the sessions in a more stimulating way than the conventional techniques. GOFA will be, essentially, an extension of the HAZOP methodology. GOFA is currently being computerised using a knowledge-based systems approach for implementation. The Goldworks II expert systems development tool is being used. (author)

  17. Structural analysis of steam generator internals following feed water main steam line break: DLF approach

    International Nuclear Information System (INIS)

    Bhasin, Vivek; Kushwaha, H.S.; Mahajan, S.C.; Kakodkar, A.

    1993-01-01

    In order to evaluate the possible release of radioactivity in extreme events, some postulated accidents are analysed and studied during the design stage of a Steam Generator (SG). Among the various accidents postulated, the most important are the Feed Water Line Break (FWLB) and the Main Steam Line Break (MSLB). This report concerns the dynamic structural analysis of SG internals following FWLB/MSLB. The pressure/drag-force time histories considered correspond to the conditions leading to the accident of maximum potential. The SG internals were analysed using two approaches of structural dynamics. In the first approach, the simplified DLF (dynamic load factor) method was adopted; this method yields upper-bound values of stresses and deflections. In the second approach, time-history analysis by the mode superposition technique was adopted; this approach gives more realistic results. The structure was qualified as per ASME B and PV Code Sec III NB. It was concluded that in all the components except the perforated flow distribution plate, the stress values based on elastic analysis are within the limits specified by the ASME Code. In the case of the perforated flow distribution plate, during the MSLB transient the stress values based on elastic analysis are higher than the ASME Code limits; therefore, its limit load analysis had to be done. Finally, the collapse pressure evaluated using limit load analysis was shown to be within the limits of ASME B and PV Code Sec III NB. (author). 31 refs., 94 figs., 16 tabs
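
For context, the simplified DLF method rests on the classical result for an undamped single-degree-of-freedom system under a suddenly applied, sustained load, which bounds the dynamic amplification at 2 (stated here as general background, not as the report's specific load case):

```latex
\begin{equation}
  \mathrm{DLF}(t) = \frac{x(t)}{x_{\mathrm{static}}} = 1 - \cos\omega_n t,
  \qquad \max_t \,\mathrm{DLF}(t) = 2,
\end{equation}
% so peak elastic stresses can be estimated conservatively as the static-response
% stresses multiplied by a DLF of at most 2; finite load rise times reduce this factor.
```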

  18. Small-molecule inhibitor leads of ribosome-inactivating proteins developed using the doorstop approach.

    Directory of Open Access Journals (Sweden)

    Yuan-Ping Pang

    2011-03-01

    Full Text Available Ribosome-inactivating proteins (RIPs) are toxic because they bind to 28S rRNA and depurinate a specific adenine residue from the α-sarcin/ricin loop (SRL), thereby inhibiting protein synthesis. Shiga-like toxins (Stx1 and Stx2), produced by Escherichia coli, are RIPs that cause outbreaks of foodborne diseases with significant morbidity and mortality. Ricin, produced by the castor bean plant, is another RIP lethal to mammals. Currently, no US Food and Drug Administration-approved vaccines or therapeutics exist to protect against ricin, Shiga-like toxins, or other RIPs. Development of effective small-molecule RIP inhibitors as therapeutics is challenging because strong electrostatic interactions at the RIP•SRL interface make drug-like molecules ineffective in competing with the rRNA for binding to RIPs. Herein, we report small molecules that show up to 20% cell protection against ricin or Stx2 at a drug concentration of 300 nM. These molecules were discovered using the doorstop approach, a new approach to protein•polynucleotide inhibitors that identifies small molecules acting as doorstops to prevent an active-site residue of an RIP (e.g., Tyr80 of ricin or Tyr77 of Stx2) from adopting an active conformation, thereby blocking the function of the protein rather than competing for binding to the RIP. This work offers promising leads for developing RIP therapeutics. The results suggest that the doorstop approach might also be applicable in the development of other protein•polynucleotide inhibitors as antiviral agents, such as inhibitors of the Z-DNA binding proteins in poxviruses. This work also calls for careful chemical and biological characterization of drug leads obtained from chemical screens to avoid the identification of irrelevant chemical structures and to avoid the interference caused by direct interactions between the chemicals being screened and the luciferase reporter used in screening assays.

  19. The importance of hydration thermodynamics in fragment-to-lead optimization.

    Science.gov (United States)

    Ichihara, Osamu; Shimada, Yuzo; Yoshidome, Daisuke

    2014-12-01

    Using a computational approach to assess changes in solvation thermodynamics upon ligand binding, we investigated the effects of water molecules on the binding energetics of over 20 fragment hits and their corresponding optimized lead compounds. Binding activity and X-ray crystallographic data from published fragment-to-lead optimization studies on various therapeutically relevant targets were studied. The analysis reveals a distinct difference between the thermodynamic profile of water molecules displaced by fragment hits and those displaced by the corresponding optimized lead compounds. Specifically, fragment hits tend to displace water molecules with notably unfavorable excess entropies (configurationally constrained water molecules) relative to those displaced by the newly added moieties of the lead compound during the course of fragment-to-lead optimization. Herein we describe the details of this analysis with the goal of providing practical guidelines for exploiting thermodynamic signatures of binding site water molecules in the context of fragment-to-lead optimization. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Analysis of natural radionuclides and lead in foods and diets

    International Nuclear Information System (INIS)

    Bueno, Luciana

    1999-01-01

    The main purpose of the present study was to determine the lead-210, polonium-210 and lead concentrations in foods and diets. Consumption of food is generally the main route by which radionuclides can enter the human organism. The precision and accuracy of the methods developed were verified by the analysis of reference materials from the International Atomic Energy Agency (IAEA). The method for polonium-210 analysis consisted of sample dissolution using a microwave digester (open system) employing concentrated nitric acid and hydrogen peroxide, evaporation to near dryness, addition of hydrochloric acid, polonium deposition onto a silver disc for six hours and counting by alpha spectrometry. Lead was analysed by the atomic absorption technique. After sample dissolution in a microwave digester (using concentrated nitric acid and hydrogen peroxide) and dilution to 50 ml, 20 μl of the sample was injected into a pyrolytic graphite furnace atomic absorption spectrophotometer equipped with Zeeman background correction. The assessment of the contaminants in foods and diets allowed the intake of these elements to be estimated and, for the radionuclides, the radiation doses to which the selected individuals were exposed through food consumption were also evaluated. The effective dose for lead-210 from dietary intake ranged from 1.3 to 4.3 μSv/year, corresponding to 25% of that resulting from polonium-210 intake. The dose due to both natural radionuclides varied from 6.8 to 23.0 μSv/year. These values are in good agreement with literature data; they are below the value of 60 μSv estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR, 1993) and lower than the dose of 0.02 Sv, the limit established by ICRP (1980). The lead levels found in the majority of the Brazilian foods are in good agreement with the values published by CONAT and FAO/WHO. However, some foods such as beans, potatoes, papaya, apples and rice present levels above the values recommended by the Public

  1. EPA Leads the Way on Lead Exposure Science and Risk Management

    Science.gov (United States)

    EPA researchers have developed a modeling approach that improves our understanding of the relationship between lead concentrations of various sources (drinking water, soil and dust, food, and air) and children’s blood-lead levels.

  2. High frequency analysis of lead-lag relationships between financial markets

    NARCIS (Netherlands)

    de Jong, F.C.J.M.; Nijman, T.E.

    1995-01-01

    High frequency data are often observed at irregular intervals, which complicates the analysis of lead-lag relationships between financial markets. Frequently, estimators have been used that are based on observations at regular intervals, which are adapted to the irregular observations case by

  3. Approaches to data analysis of multiple-choice questions

    OpenAIRE

    Lin Ding; Robert Beichner

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  4. Quantum functional analysis non-coordinate approach

    CERN Document Server

    Helemskii, A Ya

    2010-01-01

    This book contains a systematic presentation of quantum functional analysis, a mathematical subject also known as operator space theory. Created in the 1980s, it nowadays is one of the most prominent areas of functional analysis, both as a field of active research and as a source of numerous important applications. The approach taken in this book differs significantly from the standard approach used in studying operator space theory. Instead of viewing "quantized coefficients" as matrices in a fixed basis, in this book they are interpreted as finite rank operators in a fixed Hilbert space. This allows the author to replace matrix computations with algebraic techniques of module theory and tensor products, thus achieving a more invariant approach to the subject. The book can be used by graduate students and research mathematicians interested in functional analysis and related areas of mathematics and mathematical physics. Prerequisites include standard courses in abstract algebra and functional analysis.

  5. Stability Analysis of a Model of Atherogenesis: An Energy Estimate Approach

    Directory of Open Access Journals (Sweden)

    A. I. Ibragimov

    2008-01-01

    Full Text Available Atherosclerosis is a disease of the vasculature that is characterized by chronic inflammation and the accumulation of lipids and apoptotic cells in the walls of large arteries. This disease results in plaque growth in an infected artery typically leading to occlusion of the artery. Atherosclerosis is the leading cause of human mortality in the US, much of Europe, and parts of Asia. In a previous work, we introduced a mathematical model of the biochemical aspects of the disease, in particular the inflammatory response of macrophages in the presence of chemoattractants and modified low density lipoproteins. Herein, we consider the onset of a lesion as resulting from an instability in an equilibrium configuration of cells and chemical species. We derive an appropriate norm by taking an energy estimate approach and present stability criteria. A bio-physical analysis of the mathematical results is presented.

  6. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work are to quantify the lead content in two types of canned chilli from three trademarks, to determine whether it lies within the maximum permissible level (2 ppm), to compare two trademarks sold in both glass-jar and canned presentations in order to determine the effect of the container on the final lead content, and to make a comparative study of the techniques used with respect to accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The samples were pretreated by calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the X-ray fluorescence analysis, after the ashes were solubilized, the lead was precipitated with PCDA (pyrrolidine carbodithioic acid, ammonium salt), filtered, and the dried filter paper was counted directly. The standards were prepared following the same procedure as the samples, using a lead Titrisol solution. For each technique the percent recovery was determined by spiking with a sufficient known amount. Calibration curves plotted for each technique showed that all three are linear over the established working range, and the percent recovery in all three cases exceeded ninety-five percent. A variance analysis showed that the lead content of the samples does not exceed 2 ppm, and that the lead content of canned chillis is higher than that of chillis in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those obtained with the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission spectrometry. (Author)
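
    The calibration and recovery calculations described above are routine; a minimal sketch under invented numbers (none of the concentrations or signals below are the paper's data) might look as follows.

        import numpy as np

        # Hypothetical calibration standards (ppm) and instrument response (absorbance)
        conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
        signal = np.array([0.002, 0.051, 0.103, 0.198, 0.405])

        # Least-squares straight line: signal = slope * conc + intercept
        slope, intercept = np.polyfit(conc, signal, 1)
        r = np.corrcoef(conc, signal)[0, 1]
        print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.4f}")

        def to_concentration(s):
            """Invert the calibration line to convert a sample signal to ppm."""
            return (s - intercept) / slope

        # Percent recovery from a spiked sample: add a known amount and compare
        measured_unspiked = to_concentration(0.120)
        measured_spiked = to_concentration(0.320)   # after adding a 2.0 ppm spike
        recovery = 100.0 * (measured_spiked - measured_unspiked) / 2.0
        print(f"recovery = {recovery:.1f}%")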

  7. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study the biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to the changes of biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
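
    As a concrete illustration of the local approach described above, the sketch below perturbs each parameter of a toy two-parameter model by a small relative step and reports normalized sensitivity coefficients. The model, parameter names and values are invented for illustration; they are not taken from the review.

        import numpy as np

        def model(params):
            """Toy steady-state output of a hypothetical two-parameter pathway."""
            k_on, k_off = params
            return k_on / (k_on + k_off)

        def local_sensitivities(f, params, rel_step=1e-3):
            """Normalized coefficients S_i = (p_i / y) * dy/dp_i via finite differences."""
            params = np.asarray(params, dtype=float)
            y0 = f(params)
            sens = np.zeros_like(params)
            for i, p in enumerate(params):
                dp = rel_step * p if p != 0 else rel_step
                shifted = params.copy()
                shifted[i] = p + dp
                sens[i] = (f(shifted) - y0) / dp * (p / y0)
            return sens

        print(local_sensitivities(model, [0.8, 0.2]))   # approximately [0.2, -0.2]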

  8. The quantitative analysis of data for magnetization of ferromagnet. Extended thermodynamic approach

    International Nuclear Information System (INIS)

    Bodryakov, V.Yu.; Bashkatov, A.N.

    2005-01-01

    A quantitative analysis of M(H,T) magnetization data for a gadolinium single crystal in the vicinity of the Curie point is carried out within the framework of an extended thermodynamic approach. It is established that the actually observed behaviour of the temperature dependences of the thermodynamic coefficients for gadolinium, even near the Curie point, differs sharply from that of Landau theory. The discrepancy revealed leads to the conclusion that the traditional concepts should be revised. The solution of the extended equation of magnetic state of a ferromagnet is found and the criteria for its stability are given [ru]

  9. Approaches to data analysis of multiple-choice questions

    Directory of Open Access Journals (Sweden)

    Lin Ding

    2009-09-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  10. Potentiometric stripping analysis of Cadmium and Lead in superficial waters

    International Nuclear Information System (INIS)

    Arias, Juan Miguel; Marciales Castiblanco, Clara

    2003-01-01

    This paper describes the implementation and validation of an analytical method for determining cadmium and lead in surface waters, a valuable tool for describing actual conditions and for the qualitative and quantitative control of hazardous heavy-metal discharges into water bodies. Tests were run to select the stripping potentiometry conditions, which as indicated by the results were: oxidant concentration in the sample 36.4 μg/L Hg2+, stirring frequency 2400 rpm, electrolysis time 80 s, electrolysis potential -950 mV and pH 2.0. Interference tests with Cu2+ and Fe2+ showed that copper concentrations larger than 150 μg/L and 500 μg/L negatively influence the analytical response for cadmium and lead, respectively, and that Fe3+ concentrations larger than 60 μg/L and 400 μg/L cause variations in the measured cadmium and lead contents, respectively. The linear concentration range for cadmium lies between 5 and 250 μg/L; for lead it goes from 10 to 250 μg/L. Precision, expressed as repeatability for both the system and the method, shows good reproducibility with coefficients of variation below 6%. Accuracy, assessed from recovery, is strongly influenced by the concentration level; standard addition is therefore recommended for lead and cadmium quantification. Analyses performed on surface waters from the Colombian Magdalena and Cauca rivers showed lead and cadmium contents below the detection limits.
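
    Because the abstract recommends standard addition for quantification, a minimal sketch of that calculation is given below: the sample concentration follows from the x-intercept of the signal versus added-concentration line. The numbers are invented placeholders, not the paper's data.

        import numpy as np

        # Hypothetical standard-addition data: added Pb (ug/L) and stripping signal (a.u.)
        added = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
        signal = np.array([11.8, 23.5, 35.6, 47.1, 59.3])

        slope, intercept = np.polyfit(added, signal, 1)

        # The fitted line crosses zero signal at -C_sample, so:
        c_sample = intercept / slope
        print(f"estimated sample concentration: {c_sample:.1f} ug/L")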

  11. A multitechnique approach for bullet characterization

    International Nuclear Information System (INIS)

    Sreenivas, K.; Venkatasubramanian, V.S.; Sankar Das, M.

    1978-01-01

    The possibility of using lead isotopic composition of bullet lead in conjunction with chemical composition (with respect to minor and trace elements) for the characterisation of bullet lead samples is demonstrated. Lead isotope analysis was done by using a mass spectrometer M.S. 702, while trace analyses were carried out by a combination of neutron activation analysis and atomic absorption spectrometry. It is pointed out that this multi-technique approach to the problem of bullet characterization can be gainfully employed in forensic investigations. (author)

  12. A life cycle analysis approach to D and D decision-making

    International Nuclear Information System (INIS)

    Yuracko, K.L.; Gresalfi, M.; Yerace, P.; Krstich, M.; Gerrick, D.

    1998-05-01

    This paper describes a life cycle analysis (LCA) approach that makes decontamination and decommissioning (D and D) of US Department of Energy facilities more efficient and more responsive to the concerns of society. Given the considerable complexity of D and D projects and their attendant environmental and health consequences, projects can no longer be designed on the basis of engineering and economic criteria alone. Using the LCA D and D approach, the evaluation of material disposition alternatives explicitly includes environmental impacts, health and safety impacts, socioeconomic impacts, and stakeholder attitudes, in addition to engineering and economic criteria. Multi-attribute decision analysis is used to take into consideration the uncertainties and value judgments that are an important part of all material disposition decisions. Use of the LCA D and D approach should lead to more appropriate selections of material disposition pathways and a decision-making process that is both understandable and defensible. The methodology and procedures of the LCA D and D approach are outlined and illustrated by an application of the approach at the Department of Energy's West Valley Demonstration Project. Specifically, LCA was used to aid decisions on the disposition of soil and concrete from the Tank Pad D and D Project. A decision tree and the Pollution Prevention/Waste Minimization Users Guide for Environmental Restoration Projects were used to identify possible alternatives for disposition of the soil and concrete. Eight alternatives encompassing source reduction, segregation, treatment, and disposal were defined for disposition of the soil; two alternatives were identified for disposition of the concrete. Preliminary results suggest that segregation and treatment are advantageous in the disposition of both the soil and the concrete. This and other recent applications illustrate the strength and ease of application of the LCA D and D approach.

  13. An Ethnografic Approach to Video Analysis

    DEFF Research Database (Denmark)

    Holck, Ulla

    2007-01-01

    The overall purpose of the ethnographic approach to video analysis is to become aware of implicit knowledge in those being observed, that is, knowledge that cannot be acquired through interviews. In music therapy this approach can be used to analyse patterns of interaction between client and therapist. After a short introduction to the ethnographic approach, the workshop participants will have a chance to try out the method, first through a common exercise and then applied to video recordings of music therapy with children with severe communicative limitations. Focus will be on patterns of interaction. (Published in: ... Methods, Techniques and Applications in Music Therapy for Music Therapy Clinicians, Educators, Researchers and Students. London: Jessica Kingsley.)

  14. Lead distribution in soils impacted by a secondary lead smelter: Experimental and modelling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Arnaud R., E-mail: arnaud.schneider@univ-reims.fr [GEGENAA, EA 3795, Université de Reims Champagne-Ardenne, SFR Condorcet FR CNRS 3417, 2 esplanade Roland Garros, 51100 Reims (France); Cancès, Benjamin; Ponthieu, Marie [GEGENAA, EA 3795, Université de Reims Champagne-Ardenne, SFR Condorcet FR CNRS 3417, 2 esplanade Roland Garros, 51100 Reims (France); Sobanska, Sophie [Laboratoire de Spectrochimie IR et Raman, UMR-CNRS 8516, Bât. C5 Université de Lille I, 59655 Villeneuve d' Ascq Cedex (France); Benedetti, Marc F. [Institut de Physique du Globe de Paris, Sorbonne Paris Cité, Université Paris Diderot, UMR 7154, CNRS, F-75005 Paris (France); Pourret, Olivier [HydrISE, Institut Polytechnique LaSalle Beauvais, FR-60000 Beauvais (France); Conreux, Alexandra; Calandra, Ivan; Martinet, Blandine; Morvan, Xavier; Gommeaux, Maxime; Marin, Béatrice [GEGENAA, EA 3795, Université de Reims Champagne-Ardenne, SFR Condorcet FR CNRS 3417, 2 esplanade Roland Garros, 51100 Reims (France)

    2016-10-15

    Smelting activities are one of the most common sources of trace elements in the environment. The aim of this study was to determine the lead distribution in upper horizons (0–5 and 5–10 cm) of acidic soils in the vicinity of a lead-acid battery recycling plant in northern France. The combination of chemical methods (sequential extractions), physical methods (Raman microspectroscopy and scanning electron microscopy with an energy dispersive spectrometer) and multi-surface complexation modelling enabled an assessment of the behaviour of Pb. Regardless of the studied soil, none of the Pb-bearing phases commonly identified in similarly polluted environments (e.g., anglesite) were observed. Lead was mainly associated with organic matter and manganese oxides. The association of Pb with these soil constituents can be interpreted as evidence of Pb redistribution in the studied soils following smelter particle deposition. - Highlights: • Lead behavior was studied in smelter impacted soils. • A combination of experimental methods and modelling was employed. • Pb was mainly associated with organic matter and to a lesser degree with Mn oxides. • Pb was redistributed in soils after smelter particle deposition.

  15. Lead distribution in soils impacted by a secondary lead smelter: Experimental and modelling approaches

    International Nuclear Information System (INIS)

    Schneider, Arnaud R.; Cancès, Benjamin; Ponthieu, Marie; Sobanska, Sophie; Benedetti, Marc F.; Pourret, Olivier; Conreux, Alexandra; Calandra, Ivan; Martinet, Blandine; Morvan, Xavier; Gommeaux, Maxime; Marin, Béatrice

    2016-01-01

    Smelting activities are one of the most common sources of trace elements in the environment. The aim of this study was to determine the lead distribution in upper horizons (0–5 and 5–10 cm) of acidic soils in the vicinity of a lead-acid battery recycling plant in northern France. The combination of chemical methods (sequential extractions), physical methods (Raman microspectroscopy and scanning electron microscopy with an energy dispersive spectrometer) and multi-surface complexation modelling enabled an assessment of the behaviour of Pb. Regardless of the studied soil, none of the Pb-bearing phases commonly identified in similarly polluted environments (e.g., anglesite) were observed. Lead was mainly associated with organic matter and manganese oxides. The association of Pb with these soil constituents can be interpreted as evidence of Pb redistribution in the studied soils following smelter particle deposition. - Highlights: • Lead behavior was studied in smelter impacted soils. • A combination of experimental methods and modelling was employed. • Pb was mainly associated with organic matter and to a lesser degree with Mn oxides. • Pb was redistributed in soils after smelter particle deposition.

  16. Oxygen concentration diffusion analysis of lead-bismuth-cooled, natural-circulation reactor

    International Nuclear Information System (INIS)

    Ito, Kei; Sakai, Takaaki

    2001-11-01

    The feasibility study on fast breeder reactors in Japan has been conducted at JNC and related organizations, with the Phase-I study finishing in March 2001. During the Phase-I activity, lead-bismuth eutectic was selected as one of the possible coolant options and a medium-scale plant cooled by a lead-bismuth natural-circulation flow was studied. On the other hand, it is known that the lead-bismuth eutectic is corrosive to structural materials, and it was found that controlling the oxygen concentration in the eutectic plays an important role in corrosion protection. In this report, we have developed a concentration diffusion analysis code (COCOA: COncentration COntrol Analysis code) in order to carry out the oxygen concentration control analysis. The code solves a two-dimensional concentration diffusion equation by the finite difference method and can simulate the reaction of oxygen and hydrogen. We verified the basic performance of the code and carried out an oxygen concentration diffusion analysis for the case of an oxygen increase during a refueling process in the natural-circulation reactor. In addition, the characteristics of the oxygen control system were discussed for different types of control system. It is concluded that the COCOA code can simulate the diffusion of oxygen concentration in the reactor. From the analysis of a medium-scale natural-circulation reactor, we show that both ON-OFF control and PID control can regulate the oxygen concentration well, provided an appropriate concentration measurement point is chosen. Even when a fault occurs in the oxygen emission or hydrogen emission system, the control characteristics degrade, but it is still possible to control the oxygen concentration. (author)
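
    A minimal sketch of the numerical core described above, an explicit finite-difference update of a two-dimensional concentration diffusion equation, is shown below. The grid, diffusivity, time step and initial condition are simplified placeholders and do not reproduce the COCOA code's actual geometry or its oxygen-hydrogen reaction model.

        import numpy as np

        def diffuse_step(c, D, dx, dt):
            """One explicit finite-difference step of dc/dt = D * laplacian(c).

            Requires dt <= dx**2 / (4 * D) for stability; boundary cells are held fixed.
            """
            lap = np.zeros_like(c)
            lap[1:-1, 1:-1] = (
                c[2:, 1:-1] + c[:-2, 1:-1] + c[1:-1, 2:] + c[1:-1, :-2]
                - 4.0 * c[1:-1, 1:-1]
            ) / dx**2
            return c + dt * D * lap

        # Illustrative run: a local oxygen excess (e.g. after refuelling) spreading out
        nx = ny = 50
        c = np.full((nx, ny), 1.0e-8)      # background oxygen concentration (arbitrary units)
        c[20:30, 20:30] = 1.0e-6           # locally elevated region
        D, dx = 1.0e-9, 0.01               # placeholder diffusivity (m^2/s) and grid step (m)
        dt = 0.2 * dx**2 / D               # satisfies the stability condition
        for _ in range(1000):
            c = diffuse_step(c, D, dx, dt)
        print(f"max = {c.max():.2e}, mean = {c.mean():.2e}")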

  17. An obstructive sleep apnea detection approach using kernel density classification based on single-lead electrocardiogram.

    Science.gov (United States)

    Chen, Lili; Zhang, Xi; Wang, Hui

    2015-05-01

    Obstructive sleep apnea (OSA) is a common sleep disorder that often remains undiagnosed, leading to an increased risk of developing cardiovascular diseases. The polysomnogram (PSG) is currently used as the gold standard for screening OSA. However, because it is time consuming, expensive and causes discomfort, alternative techniques based on a reduced set of physiological signals are proposed to solve this problem. This study proposes a convenient non-parametric kernel density-based approach for detection of OSA using single-lead electrocardiogram (ECG) recordings. Selected physiologically interpretable features are extracted from segmented RR intervals, which are obtained from ECG signals. These features are fed into the kernel density classifier to detect apnea events, and the bandwidths for the density of each class (normal or apnea) are automatically chosen through an iterative bandwidth selection algorithm. To validate the proposed approach, RR intervals are extracted from ECG signals of 35 subjects obtained from a sleep apnea database ( http://physionet.org/cgi-bin/atm/ATM ). The results indicate that the kernel density classifier, with two features for apnea event detection, achieves a mean accuracy of 82.07 %, with mean sensitivity of 83.23 % and mean specificity of 80.24 %. Compared with other existing methods, the proposed kernel density approach achieves comparably good performance while using fewer features, without significantly losing discriminant power, which indicates that it could be widely used for home-based screening or diagnosis of OSA.
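
    A simplified sketch of this kind of kernel density classification is given below: class-conditional densities are estimated from training feature vectors and a test segment is assigned to the class with the larger density-weighted posterior. It uses scipy's Gaussian KDE with its default bandwidth rule rather than the paper's iterative bandwidth selection, and the two features are random stand-ins for the RR-interval features.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)

        # Stand-in 2-D feature vectors (rows = ECG segments) for the two classes
        X_normal = rng.normal([0.9, 0.05], [0.10, 0.02], size=(200, 2))
        X_apnoea = rng.normal([1.1, 0.12], [0.15, 0.04], size=(200, 2))

        # Class-conditional densities; gaussian_kde expects shape (n_features, n_samples)
        kde_normal = gaussian_kde(X_normal.T)
        kde_apnoea = gaussian_kde(X_apnoea.T)
        prior_normal = prior_apnoea = 0.5

        def classify(x):
            """Assign the class whose density-weighted posterior is larger."""
            p_n = prior_normal * kde_normal(x.reshape(2, 1))[0]
            p_a = prior_apnoea * kde_apnoea(x.reshape(2, 1))[0]
            return "apnoea" if p_a > p_n else "normal"

        print(classify(np.array([1.15, 0.13])), classify(np.array([0.88, 0.04])))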

  18. Long range correlations, leading particle spectrum and correlations with leading particles

    International Nuclear Information System (INIS)

    Ilgenfritz, E.M.

    1976-05-01

    The unitary cluster emission model of de Groot and Ruijgrok is discussed as an approach to understanding leading particle behaviour. Consequences of the model concerning correlations between leading particles and particles produced in the central region are considered. No satisfactory agreement was found. Production of leading clusters is argued to be an essential feature of these correlations. (author)

  19. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  20. An equivalent dipole analysis of PZT ceramics and lead-free piezoelectric single crystals

    Science.gov (United States)

    Bell, Andrew J.

    2016-04-01

    The recently proposed Equivalent Dipole Model for describing the electromechanical properties of ionic solids in terms of 3 ions and 2 bonds has been applied to PZT ceramics and lead-free single crystal piezoelectric materials, providing analysis in terms of an effective ionic charge and the asymmetry of the interatomic force constants. For PZT it is shown that, as a function of composition across the morphotropic phase boundary, the dominant bond compliance peaks at 52% ZrO2. The stiffer of the two bonds shows little composition dependence with no anomaly at the phase boundary. The effective charge has a maximum value at 50% ZrO2, decreasing across the phase boundary region, but becoming constant in the rhombohedral phase. The single crystals confirm that both the asymmetry in the force constants and the magnitude of effective charge are equally important in determining the values of the piezoelectric charge coefficient and the electromechanical coupling coefficient. Both are apparently temperature dependent, increasing markedly on approaching the Curie temperature.

  1. New approach to the suction force at the leading edge of a profile with zero thickness

    NARCIS (Netherlands)

    Sparenberg, JA; de Jager, EM

    2004-01-01

    This paper considers the suction force at the leading edge of a profile with zero thickness in an incompressible and inviscid fluid flow. The theory is linear, and the approach to the suction force is from the innerside of the profile. It is shown that the suction force can be considered as an

  2. Postmortem analysis of encapsulation around long-term ventricular endocardial pacing leads.

    Science.gov (United States)

    Candinas, R; Duru, F; Schneider, J; Lüscher, T F; Stokes, K

    1999-02-01

    To analyze the site and thickness of encapsulation around ventricular endocardial pacing leads and the extent of tricuspid valve adhesion, from today's perspective, with implications for lead removal and sensor location. Gross cardiac postmortem analysis was performed in 11 cases (8 female and 3 male patients; mean age, 78+/-7 years). None of the patients had died because of pacemaker malfunction. The mean implant time was 61+/-60 months (range, 4 to 184). The observations ranged from encapsulation only at the tip of the pacing lead to complete encapsulation along the entire length of the pacing lead within the right ventricle. Substantial areas of adhesion at the tricuspid valve apparatus were noted in 7 of the 11 cases (64%). The firmly attached leads could be removed only by dissection, and in some cases, removal was possible only by damaging the associated structures. No specific optimal site for sensor placement could be identified along the ventricular portion of the pacing leads; however, the fibrotic response was relatively less prominent in the atrial chamber. Extensive encapsulation is present in most long-term pacemaker leads, which may complicate lead removal. The site and thickness of encapsulation seem to be highly variable. Tricuspid valve adhesion, which is usually underestimated, may be severe. In contrast to earlier reports, our study demonstrates that the extent of fibrotic encapsulation may not be related to the duration since lead implantation. Moreover, we noted no ideal encapsulation-free site for sensors on the ventricular portion of long-term pacing leads.

  3. Artistic image analysis using graph-based learning approaches.

    Science.gov (United States)

    Carneiro, Gustavo

    2013-08-01

    We introduce a new methodology for the problem of artistic image analysis, which among other tasks, involves the automatic identification of visual classes present in an art work. In this paper, we advocate the idea that artistic image analysis must explore a graph that captures the network of artistic influences by computing the similarities in terms of appearance and manual annotation. One of the novelties of our methodology is the proposed formulation that is a principled way of combining these two similarities in a single graph. Using this graph, we show that an efficient random walk algorithm based on an inverted label propagation formulation produces more accurate annotation and retrieval results compared with the following baseline algorithms: bag of visual words, label propagation, matrix completion, and structural learning. We also show that the proposed approach leads to a more efficient inference and training procedures. This experiment is run on a database containing 988 artistic images (with 49 visual classification problems divided into a multiclass problem with 27 classes and 48 binary problems), where we show the inference and training running times, and quantitative comparisons with respect to several retrieval and annotation performance measures.
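
    For orientation, the sketch below implements the standard label propagation baseline mentioned above (not the authors' inverted formulation or their combined appearance-and-annotation graph): labels diffuse over a normalized similarity matrix until unlabeled images acquire class scores. The tiny graph and labels are invented.

        import numpy as np

        def label_propagation(W, Y, alpha=0.9, iters=100):
            """Graph label propagation: F <- alpha * S @ F + (1 - alpha) * Y,
            where S is the symmetrically normalized similarity matrix W."""
            d = W.sum(axis=1)
            d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
            S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
            F = Y.astype(float).copy()
            for _ in range(iters):
                F = alpha * S @ F + (1.0 - alpha) * Y
            return F.argmax(axis=1)          # predicted class per image

        # Tiny illustrative graph: 5 images, 2 visual classes, 2 labelled examples
        W = np.array([[0, 1, 1, 0, 0],
                      [1, 0, 1, 0, 0],
                      [1, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 0]], dtype=float)
        Y = np.zeros((5, 2))
        Y[0, 0] = 1.0    # image 0 labelled as class 0
        Y[4, 1] = 1.0    # image 4 labelled as class 1
        print(label_propagation(W, Y))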

  4. [Causal analysis approaches in epidemiology].

    Science.gov (United States)

    Dumas, O; Siroux, V; Le Moual, N; Varraso, R

    2014-02-01

    Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains discussed. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equations models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the
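
    To make the counterfactual machinery concrete, below is a minimal sketch of inverse probability weighting, the building block of marginal structural models: each subject is weighted by the inverse of the probability of the exposure actually received, so that the weighted outcome contrast estimates the marginal effect. The simulated data, variable names and effect size are placeholders.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000
        confounder = rng.normal(size=n)                                      # L
        exposure = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.8 * confounder)))  # A depends on L
        outcome = 1.0 * exposure + 2.0 * confounder + rng.normal(size=n)     # true effect of A = 1

        # Propensity score model P(A = 1 | L) and inverse-probability weights
        ps_model = LogisticRegression().fit(confounder.reshape(-1, 1), exposure)
        p1 = ps_model.predict_proba(confounder.reshape(-1, 1))[:, 1]
        weights = np.where(exposure == 1, 1.0 / p1, 1.0 / (1.0 - p1))

        # Weighted contrast of means estimates the marginal (causal) effect of exposure
        treated = exposure == 1
        ipw_effect = (np.average(outcome[treated], weights=weights[treated])
                      - np.average(outcome[~treated], weights=weights[~treated]))
        naive_effect = outcome[treated].mean() - outcome[~treated].mean()
        print(f"naive: {naive_effect:.2f}  IPW-adjusted: {ipw_effect:.2f}  (true effect: 1.0)")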

  5. Approaches to Data Analysis of Multiple-Choice Questions

    Science.gov (United States)

    Ding, Lin; Beichner, Robert

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics…

  6. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    Science.gov (United States)

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

    Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and the uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it on the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that has actually been remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted using the proposed methodology.
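
    A schematic sketch of the final step of such a workflow is shown below: given an ensemble of conditional simulations of the pollutant concentration on remediation-unit supports, each realization is classified against the clean-up threshold, and the spread of the resulting volumes quantifies the uncertainty. The simulated ensemble, unit volume and threshold are random placeholders standing in for actual geostatistical simulations.

        import numpy as np

        rng = np.random.default_rng(42)

        n_units, n_realizations = 212, 500
        unit_volume = 25.0          # m^3 per remediation unit (placeholder)
        threshold = 300.0           # clean-up threshold, e.g. mg/kg Pb (placeholder)

        # Stand-in for conditional simulations of the mean Pb concentration per unit
        simulated = rng.lognormal(mean=5.3, sigma=0.6, size=(n_realizations, n_units))

        # Volume to remediate in each realization = units whose simulated mean exceeds threshold
        volumes = (simulated > threshold).sum(axis=1) * unit_volume

        print(f"expected volume: {volumes.mean():.0f} m^3")
        print(f"90% interval: {np.percentile(volumes, 5):.0f} - {np.percentile(volumes, 95):.0f} m^3")

        # Probability that each unit needs remediation (useful for assessing misclassification risk)
        p_exceed = (simulated > threshold).mean(axis=0)
        print("P(remediation needed), first 5 units:", np.round(p_exceed[:5], 2))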

  7. Leading edge analysis of transcriptomic changes during pseudorabies virus infection.

    Science.gov (United States)

    Fleming, Damarius S; Miller, Laura C

    2016-12-01

    Eight RNA samples taken from the tracheobronchial lymph nodes (TBLN) of pigs that were either infected or non-infected with a feral isolate of porcine pseudorabies virus (PRV) were used to investigate changes in gene expression related to the pathogen. The RNA was processed into fastq files for each library prior to being analyzed using Illumina Digital Gene Expression Tag Profiling sequences (DGETP) which were used as the downstream measure of differential expression. Analyzed tags consisted of 21 base pair sequences taken from time points 1, 3, 6, and 14 days post infection (dpi) that generated 1,927,547 unique tag sequences. Tag sequences were analyzed for differential transcript expression and gene set enrichment analysis (GSEA) to uncover transcriptomic changes related to PRV pathology progression. In conjunction with the DGETP and GSEA, the study also incorporated use of leading edge analysis to help link the TBLN transcriptome data to clinical progression of PRV at each of the sampled time points. The purpose of this manuscript is to provide useful background on applying the leading edge analysis to GSEA and expression data to help identify genes considered to be of high biological interest. The data in the form of fastq files have been uploaded to the NCBI Gene Expression Omnibus (GEO) (GSE74473) database.
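
    For readers unfamiliar with the technique, the sketch below shows how a leading-edge subset is typically extracted from a GSEA-style running enrichment score: the genes of the set that occur before the peak of the running score form the leading edge. The ranked gene list, metric and gene set are invented placeholders, and the weighting is a simplified version of the published GSEA statistic.

        import numpy as np

        def leading_edge(ranked_genes, scores, gene_set):
            """Return the leading-edge genes of gene_set for a ranked gene list.

            ranked_genes: genes ordered by decreasing differential-expression metric.
            scores: the corresponding (absolute) ranking metric values.
            """
            in_set = np.array([g in gene_set for g in ranked_genes])
            hit_weight = np.where(in_set, np.abs(scores), 0.0)
            hit_weight = hit_weight / hit_weight.sum()
            miss_weight = np.where(~in_set, 1.0 / (~in_set).sum(), 0.0)
            running = np.cumsum(hit_weight - miss_weight)   # running enrichment score
            peak = np.argmax(running)                       # assumes positive enrichment
            return [g for g, hit in zip(ranked_genes[: peak + 1], in_set) if hit]

        genes = np.array(["g%d" % i for i in range(1, 21)])
        metric = np.linspace(3.0, 0.1, 20)                  # ranked, decreasing
        gene_set = {"g2", "g3", "g7", "g15", "g19"}
        print(leading_edge(genes, metric, gene_set))        # e.g. ['g2', 'g3', 'g7']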

  8. Leading coordinate analysis of reaction pathways in proton chain transfer: Application to a two-proton transfer model for the green fluorescent protein

    International Nuclear Information System (INIS)

    Wang Sufan; Smith, Sean C.

    2006-01-01

    The 'leading coordinate' approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information

  9. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events, which leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
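
    A compact sketch of the MSE-plus-random-forest pipeline described above follows: each signal is coarse-grained at several scales, the sample entropy of each coarse-grained series forms the feature vector, and a random forest is trained on those features. The signals, labels, scales and the cap used when no template matches are found are illustrative assumptions, not the authors' settings.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def sample_entropy(x, m=2, r_factor=0.2):
            """Sample entropy SampEn(m, r) of a 1-D signal with r = r_factor * std(x)."""
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()
            def matched_pairs(mm):
                templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                dists = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                return ((dists <= r).sum() - len(templates)) / 2.0   # exclude self-matches
            b, a = matched_pairs(m), matched_pairs(m + 1)
            if a == 0 or b == 0:
                return np.log(len(x))        # cap when no template matches are found
            return -np.log(a / b)

        def multiscale_entropy(x, max_scale=5):
            """Coarse-grain the signal at scales 1..max_scale and compute SampEn of each."""
            x = np.asarray(x, dtype=float)
            mse = []
            for tau in range(1, max_scale + 1):
                n = (len(x) // tau) * tau
                coarse = x[:n].reshape(-1, tau).mean(axis=1)
                mse.append(sample_entropy(coarse))
            return np.array(mse)

        # Random stand-ins for per-patient audio-derived signals and apnoea labels
        rng = np.random.default_rng(0)
        X = np.vstack([multiscale_entropy(rng.normal(size=1000)) for _ in range(40)])
        y = rng.integers(0, 2, size=40)

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        print(clf.score(X, y))               # training accuracy of the toy example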

  10. Interstage Flammability Analysis Approach

    Science.gov (United States)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore potentially dangerous leaks of propellants could develop. The Interstage leaks analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls, bounding interfaces with the Upper and First Stages, and surrounding the J2-X engine was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in the bounding of flammability risk in supporting program hazard reviews.

  11. Systemic Analysis Approaches for Air Transportation

    Science.gov (United States)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  12. Leading edge analysis of transcriptomic changes during pseudorabies virus infection

    Directory of Open Access Journals (Sweden)

    Damarius S. Fleming

    2016-12-01

    Eight RNA samples taken from the tracheobronchial lymph nodes (TBLN) of pigs that were either infected or non-infected with a feral isolate of porcine pseudorabies virus (PRV) were used to investigate changes in gene expression related to the pathogen. The RNA was processed into fastq files for each library prior to being analyzed using Illumina Digital Gene Expression Tag Profiling sequences (DGETP) which were used as the downstream measure of differential expression. Analyzed tags consisted of 21 base pair sequences taken from time points 1, 3, 6, and 14 days post infection (dpi) that generated 1,927,547 unique tag sequences. Tag sequences were analyzed for differential transcript expression and gene set enrichment analysis (GSEA) to uncover transcriptomic changes related to PRV pathology progression. In conjunction with the DGETP and GSEA, the study also incorporated use of leading edge analysis to help link the TBLN transcriptome data to clinical progression of PRV at each of the sampled time points. The purpose of this manuscript is to provide useful background on applying the leading edge analysis to GSEA and expression data to help identify genes considered to be of high biological interest. The data in the form of fastq files have been uploaded to the NCBI Gene Expression Omnibus (GEO) (GSE74473) database.

  13. Multi-level approach for parametric roll analysis

    Science.gov (United States)

    Kim, Taeyoung; Kim, Yonghwan

    2011-03-01

    The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three kinds of computation methods are applied in the multi-level approach: GM variation, the impulse response function (IRF) method, and a Rankine panel method. The IRF and Rankine panel methods are based on a weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In computations of a parametric roll occurrence test in regular waves, the IRF and Rankine panel methods show similar tendencies. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion shows a slight difference. Nonlinear roll motion in bichromatic waves is also considered in this study. To demonstrate the unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation, and the instability criteria are well predicted from the theoretical stability analysis. Fourier analysis verifies that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.
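
    To illustrate the stability analysis mentioned above, the sketch below integrates a damped Mathieu-type roll equation, phi'' + 2*zeta*omega_n*phi' + omega_n^2*(1 + h*cos(omega_e*t))*phi = 0, and reports a growth factor that becomes large when the encounter frequency is near twice the roll natural frequency. The parameter values are illustrative, and the quasi-periodic (bichromatic) extension discussed in the paper is not included.

        import numpy as np
        from scipy.integrate import solve_ivp

        def roll_growth(omega_e, omega_n=0.4, zeta=0.02, h=0.3, t_end=600.0):
            """Integrate the damped Mathieu-type roll equation and return a growth factor."""
            def rhs(t, y):
                phi, dphi = y
                ddphi = (-2.0 * zeta * omega_n * dphi
                         - omega_n**2 * (1.0 + h * np.cos(omega_e * t)) * phi)
                return [dphi, ddphi]
            sol = solve_ivp(rhs, (0.0, t_end), [0.01, 0.0], max_step=0.1, rtol=1e-8)
            phi = np.abs(sol.y[0])
            half = len(phi) // 2
            return phi[half:].max() / phi[:half].max()   # >> 1 signals parametric growth

        # Parametric roll is expected near omega_e = 2 * omega_n (here 0.8 rad/s)
        for omega_e in (0.5, 0.7, 0.8, 0.9, 1.1):
            print(f"omega_e = {omega_e:.1f}  growth factor = {roll_growth(omega_e):.2f}")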

  14. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  15. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.
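
    One common element of such a mixture of techniques is Monte Carlo propagation of input uncertainty through a risk model; the sketch below propagates assumed distributions for concentration, intake and potency through a simple multiplicative dose-response model and summarizes the resulting risk distribution. The model form, parameter names and distributions are illustrative placeholders, not EPA's.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Assumed input distributions (illustrative placeholders)
        concentration = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)    # mg/L
        intake_rate = rng.normal(loc=2.0, scale=0.3, size=n).clip(min=0.5)    # L/day
        body_weight = rng.normal(loc=70.0, scale=10.0, size=n).clip(min=40.0) # kg
        slope_factor = rng.lognormal(mean=np.log(1e-3), sigma=0.4, size=n)    # (mg/kg/day)^-1

        dose = concentration * intake_rate / body_weight    # mg/kg/day
        risk = 1.0 - np.exp(-slope_factor * dose)           # lifetime excess risk

        for q in (5, 50, 95):
            print(f"P{q:02d} risk = {np.percentile(risk, q):.2e}")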

  16. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  17. Benefit-Risk Analysis for Decision-Making: An Approach.

    Science.gov (United States)

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  18. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    Science.gov (United States)

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of

  19. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus) and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidencing they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot possess a substantially greater lead poisoning risk compared to embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.

  20. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone's commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than with solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementary theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  1. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond the sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena relating language to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. On one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). On other dimensions, each approach holds its distinctive characteristics, contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on the strengths and weaknesses of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  2. Electrode alignment of transverse tripoles using a percutaneous triple-lead approach in spinal cord stimulation

    Science.gov (United States)

    Sankarasubramanian, V.; Buitenweg, J. R.; Holsheimer, J.; Veltink, P.

    2011-02-01

    The aim of this modeling study is to determine the influence of electrode alignment of transverse tripoles on the paresthesia coverage of the pain area in spinal cord stimulation, using a percutaneous triple-lead approach. Transverse tripoles, comprising a central cathode and two lateral anodes, were modeled on the low-thoracic vertebral region (T10-T12) using percutaneous triple-lead configurations, with the center lead on the spinal cord midline. The triple leads were oriented both aligned and staggered. In the staggered configuration, the anodes were offset either caudally (caudally staggered) or rostrally (rostrally staggered) with respect to the midline cathode. The transverse tripolar field steering with the aligned and staggered configurations enabled the estimation of dorsal column fiber thresholds (IDC) and dorsal root fiber thresholds (IDR) at various anodal current ratios. IDC and IDR were considerably higher for the aligned transverse tripoles as compared to the staggered transverse tripoles. The aligned transverse tripoles facilitated deeper penetration into the medial dorsal columns (DCs). The staggered transverse tripoles always enabled broad and bilateral DC activation, at the expense of mediolateral steerability. The largest DC recruited area was obtained with the rostrally staggered transverse tripole. Transverse tripolar geometries, using percutaneous leads, allow for selective targeting of either medial or lateral DC fibers, if and only if the transverse tripole is aligned. Steering of anodal currents between the lateral leads of the staggered transverse tripoles cannot target medially confined populations of DC fibers in the spinal cord. An aligned transverse tripolar configuration is strongly recommended, because of its ability to provide more post-operative flexibility than other configurations.

  3. X-ray radiometric analysis of lead and zinc concentrates using germanium radiation detector

    International Nuclear Information System (INIS)

    Vajgachev, A.A.; Mamysh, V.A.; Mil'chakov, V.I.; Shchekin, K.I.; Berezkin, V.V.

    1975-01-01

    The results of the determination of lead, zinc and iron in lead and zinc concentrates by the X-ray radiometric method using a germanium semiconductor detector are presented. In the experiments a 57Co source and a tritium-zirconium target were used; the activity of the 57Co source was 2 mc. The area of the germanium detector employed was 5g mm2 and its thickness 2.3 mm. In lead concentrates, zinc and iron were determined from the direct intensity of the K-series radiation. In the analysis of zinc concentrates the same recording and excitation conditions were used as for the lead concentrates, but the measurements were conducted in saturated layers. It is demonstrated that the use of germanium semiconductor detectors in combination with the suggested measurement procedures makes it possible to determine iron, zinc and lead in zinc and lead concentrates within permissible error.

  4. Comparative analysis of employment dynamics in leading and lagging rural regions of the EU, 1980-1997.

    NARCIS (Netherlands)

    Terluin, I.J.; Post, J.H.; Sjöström, Å.

    1999-01-01

    In this study a comparative analysis of factors hampering and encouraging the development of employment in 9 leading and 9 lagging regions in the EU during the 1980s and the first half of the 1990s is made. Derived from this comparative analysis, some lessons, which leading and lagging rural regions

  5. Computational Methods Used in Hit-to-Lead and Lead Optimization Stages of Structure-Based Drug Discovery.

    Science.gov (United States)

    Heifetz, Alexander; Southey, Michelle; Morao, Inaki; Townsend-Nicholson, Andrea; Bodkin, Mike J

    2018-01-01

    GPCR modeling approaches are widely used in the hit-to-lead (H2L) and lead optimization (LO) stages of drug discovery. The aims of these modeling approaches are to predict the 3D structures of the receptor-ligand complexes, to explore the key interactions between the receptor and the ligand and to utilize these insights in the design of new molecules with improved binding, selectivity or other pharmacological properties. In this book chapter, we present a brief survey of key computational approaches integrated with hierarchical GPCR modeling protocol (HGMP) used in hit-to-lead (H2L) and in lead optimization (LO) stages of structure-based drug discovery (SBDD). We outline the differences in modeling strategies used in H2L and LO of SBDD and illustrate how these tools have been applied in three drug discovery projects.

  6. An equivalent dipole analysis of PZT ceramics and lead-free piezoelectric single crystals

    Directory of Open Access Journals (Sweden)

    Andrew J. Bell

    2016-06-01

    Full Text Available The recently proposed Equivalent Dipole Model for describing the electromechanical properties of ionic solids in terms of 3 ions and 2 bonds has been applied to PZT ceramics and lead-free single crystal piezoelectric materials, providing analysis in terms of an effective ionic charge and the asymmetry of the interatomic force constants. For PZT it is shown that, as a function of composition across the morphotropic phase boundary, the dominant bond compliance peaks at 52% ZrO2. The stiffer of the two bonds shows little composition dependence with no anomaly at the phase boundary. The effective charge has a maximum value at 50% ZrO2, decreasing across the phase boundary region, but becoming constant in the rhombohedral phase. The single crystals confirm that both the asymmetry in the force constants and the magnitude of effective charge are equally important in determining the values of the piezoelectric charge coefficient and the electromechanical coupling coefficient. Both are apparently temperature dependent, increasing markedly on approaching the Curie temperature.

  7. Eco-Balance analysis of the disused lead-acid-batteries recycling technology

    Science.gov (United States)

    Kamińska, Ewa; Kamiński, Tomasz

    2017-10-01

    The article presents the results of an eco-balance analysis of the recycling process for disused lead-acid batteries. The technology under investigation also makes it possible to recover other materials, for example the polypropylene of the battery case, or to obtain crystalline sodium sulphate. The life cycle assessment was made using the ReCiPe and IMPACT 2002+ methods. The results are expressed as environmental points [Pt] and are reported in the impact categories specific to each method. 1 Mg of processed scrap was adopted as the functional unit. The analyses indicate that recycling can make the environmental impact of the technology less harmful: repeated use of lead means that its primary sources do not have to be exploited, and, similarly, the use of granules produced from polypropylene recovered from battery casings in the plastics industry brings environmental benefits. Given the widespread use of lead-acid batteries, attention should be paid to their proper utilization, particularly with respect to heavy metals such as lead. According to the calculations, the greatest environmental benefit from using lead from secondary sources in the production of new products was observed in the refining process.

  8. Eco-Balance analysis of the disused lead-acid-batteries recycling technology

    Directory of Open Access Journals (Sweden)

    Kamińska Ewa

    2017-01-01

    Full Text Available The article presents the results of an eco-balance analysis of the recycling process for disused lead-acid batteries. The technology under investigation also makes it possible to recover other materials, for example the polypropylene of the battery case, or to obtain crystalline sodium sulphate. The life cycle assessment was made using the ReCiPe and IMPACT 2002+ methods. The results are expressed as environmental points [Pt] and are reported in the impact categories specific to each method. 1 Mg of processed scrap was adopted as the functional unit. The analyses indicate that recycling can make the environmental impact of the technology less harmful: repeated use of lead means that its primary sources do not have to be exploited, and, similarly, the use of granules produced from polypropylene recovered from battery casings in the plastics industry brings environmental benefits. Given the widespread use of lead-acid batteries, attention should be paid to their proper utilization, particularly with respect to heavy metals such as lead. According to the calculations, the greatest environmental benefit from using lead from secondary sources in the production of new products was observed in the refining process.

  9. Real Analysis A Historical Approach

    CERN Document Server

    Stahl, Saul

    2011-01-01

    A provocative look at the tools and history of real analysis This new edition of Real Analysis: A Historical Approach continues to serve as an interesting read for students of analysis. Combining historical coverage with a superb introductory treatment, this book helps readers easily make the transition from concrete to abstract ideas. The book begins with an exciting sampling of classic and famous problems first posed by some of the greatest mathematicians of all time. Archimedes, Fermat, Newton, and Euler are each summoned in turn, illuminating the utility of infinite, power, and trigonome

  10. Development and characterisation of disposable gold electrodes, and their use for lead(II) analysis

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Mohd F. M. [Cranfield University, Cranfield Health, Silsoe (United Kingdom); Institute for Medical Research, Toxicology and Pharmacology Unit, Herbal Medicine Research Centre, Kuala Lumpur (Malaysia); Tothill, Ibtisam E. [Cranfield University, Cranfield Health, Silsoe (United Kingdom)

    2006-12-15

    There is an increasing need to assess the harmful effects of heavy-metal-ion pollution on the environment. The ability to detect and measure toxic contaminants on site using simple, cost-effective, and field-portable sensors is an important aspect of environmental protection and facilitates rapid decision making. A screen-printed gold sensor in a three-electrode configuration has been developed for the analysis of lead(II) by square-wave stripping voltammetry (SWSV). The working electrode was fabricated with gold ink deposited by use of thick-film technology. Conditions affecting the lead stripping response were characterised and optimized. Experimental data indicated that chloride ions are important in lead deposition and subsequent analysis with this type of sensor. Linear concentration ranges of 10-50 μg L⁻¹ and 25-300 μg L⁻¹, with detection limits of 2 μg L⁻¹ and 5.8 μg L⁻¹, were obtained for lead(II) for measurement times of four and two minutes, respectively. The electrodes can be reused up to 20 times after cleaning with 0.5 mol L⁻¹ sulfuric acid. Interference of other metals with the response to lead was also examined to optimize the sensor response for the analysis of environmental samples. The analytical utility of the sensor was demonstrated by applying the system to a variety of wastewater and soil sample extracts from polluted sites. The results are sufficient evidence of the feasibility of using these screen-printed gold electrodes for the determination of lead(II) in wastewater and soil extracts. For comparison purposes a mercury-film electrode and ICP-MS were used for validation. (orig.)

  11. Modern terrorism: concept and approach analysis

    OpenAIRE

    CHAIKA ALEXANDER VIKTOROVICH

    2015-01-01

    The problem of modern terrorism as a phenomenon of the counterculture environment is considered. An analysis of the concepts and approaches of foreign and domestic authors specializing in research on the problem of terrorism was conducted. Individual features of modern terrorism are identified and emphasized. The author draws conceptual conclusions on the basis of a dialectical approach to the study of the counterculture phenomenon of modern terrorism.

  12. Penetration of a magnetic field into superconducting lead and lead-indium alloys

    International Nuclear Information System (INIS)

    Egloff, C.; Raychaudhuri, A.K.; Rinderer, L.

    1983-01-01

    The temperature dependence of the magnetic field penetration depth of superconducting lead and lead-indium alloys has been studied over the temperature range between about 2 K and T_c. Data are analyzed in terms of the microscopic theory. The difficulties of a unique analysis of the penetration data are pointed out and a strategy for the analysis is discussed. The penetration depth at T = 0 K for pure lead is determined as 522 Å. This value, though higher than the previously accepted value for lead, is nevertheless consistent with the strong-coupling character of lead.

  13. An Inverse Modeling Approach to Investigate Past Lead Atmospheric Deposition in Southern Greenland

    Science.gov (United States)

    Massa, C.; Monna, F.; Bichet, V.; Gauthier, E.; Richard, H.

    2013-12-01

    The aim of this study is to model atmospheric lead pollution fluxes using two different paleoenvironmental records from southern Greenland covering the last 2000 years. Fifty-five sediment samples from the Lake Igaliku sequence (61°00.403'N, 45°26.494'W) were analyzed for their Pb and Al contents and for lead isotopic compositions. The second archive consists of a previously published dataset (Shotyk et al., 2003), including Zr and Pb concentrations and lead isotopic compositions, obtained from a minerogenic peat deposit located 16 km northwest of Lake Igaliku (61°08.314'N, 45°33.703'W). As natural background concentrations are high and obliterate most of the airborne anthropogenic lead, it is not possible to isolate this anthropogenic contribution through time with classical methods (i.e., Pb normalized to a lithogenic and conservative element). Moreover, the background 206Pb/207Pb ratio is rather noisy because of the wide geological heterogeneity of the sediment sources, which further complicates unambiguous detection of the lead pollution. To overcome these difficulties, an inverse modeling approach based on assumptions about past lead inputs was applied. This method consists of simulating a range of anthropogenic fluxes to determine the best match between measured and simulated data, both for Pb concentrations and for isotopic compositions. The model is validated by the coherence of the results obtained from the two independent datasets, which must reflect a similar pollution history. Although notable 206Pb/207Pb ratio shifts suggest that the first signs of anthropogenic inputs may have occurred in the 15th century, the signal-to-noise ratio was too low to significantly influence the sediment composition. Nevertheless, we were able to estimate that anthropogenic lead fluxes did not exceed 2700 μg m⁻² yr⁻¹, a maximum value recorded during the 1960s. The comparison with other records from the North Atlantic Islands reveals a spatial gradient most likely due
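
    A minimal Python sketch of the inverse-modeling search described above, assuming a simple two-end-member mixing between background and anthropogenic lead; the isotope ratios, concentrations and misfit weights are illustrative placeholders, not the study's values:

      # Illustrative sketch of an inverse-modeling search for an anthropogenic lead flux.
      # Assumes a simple two-end-member mixing model; all numbers are hypothetical.
      import numpy as np

      BACKGROUND_RATIO = 1.22      # assumed 206Pb/207Pb of local lithogenic lead
      ANTHROPOGENIC_RATIO = 1.17   # assumed 206Pb/207Pb of pollution-derived lead

      def predict(background_pb, anthro_pb):
          """Mix natural and anthropogenic lead; return total Pb and isotope ratio."""
          total = background_pb + anthro_pb
          ratio = (background_pb * BACKGROUND_RATIO +
                   anthro_pb * ANTHROPOGENIC_RATIO) / total
          return total, ratio

      # "Measured" sample (hypothetical): total Pb in ug/g and its isotope ratio
      measured_pb, measured_ratio = 24.0, 1.212
      background_pb = 22.0  # background estimated from a lithogenic normaliser (Al or Zr)

      # Scan candidate anthropogenic contributions and keep the best match
      candidates = np.linspace(0.0, 10.0, 1001)
      misfits = []
      for anthro in candidates:
          pb, ratio = predict(background_pb, anthro)
          # least-squares misfit over both observables, scaled by assumed uncertainties
          misfits.append(((pb - measured_pb) / 1.0) ** 2 +
                         ((ratio - measured_ratio) / 0.002) ** 2)
      best = candidates[int(np.argmin(misfits))]
      print(f"best-fitting anthropogenic Pb contribution: {best:.2f} ug/g")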

  14. The Environmental Burdens of Lead-Acid Batteries in China: Insights from an Integrated Material Flow Analysis and Life Cycle Assessment of Lead

    Directory of Open Access Journals (Sweden)

    Sha Chen

    2017-11-01

    Full Text Available Lead-acid batteries (LABs), a widely used energy storage technology in cars and electric vehicles, are becoming a serious problem because of their high environmental impact. In this study, an integrated method combining material flow analysis with life cycle assessment was developed to analyze the environmental emissions and burdens of lead in LABs. The environmental burdens from other materials in LABs were not included. The results indicated that the amount of primary lead used in LABs accounted for 77% of the total lead production in 2014 in China. The amount of lead discharged into the environment was 8.54 × 10⁵ tonnes, which came mainly from raw material extraction (57.2%). The largest environmental burden came from raw material extraction and processing, which accounted for 81.7% of the total environmental burden. For the environmental toxicity potential, human toxicity potential (cancer), human toxicity potential (non-cancer), water footprint and land use categories, more than 90% of the burden arose at this stage. Moreover, the environmental burdens from primary lead were much more serious than those from regenerated lead. On the basis of the results, practical measures and policies were proposed to reduce the lead emissions and environmental burdens of LABs in China, namely establishing an effective LAB recycling system, enlarging the market share of legal regenerated lead, regulating the production of regenerated lead, and avoiding long-distance transportation of waste LABs.

  15. The conjoint influence of home enriched environment and lead exposure on children's cognition and behaviour in a Mexican lead smelter community.

    Science.gov (United States)

    Moodie, Sue; Ialongo, Nick; López, Patricia; Rosado, Jorge; García-Vargas, Gonzalo; Ronquillo, Dolores; Kordas, Katarzyna

    2013-01-01

    A range of studies has been conducted on the detrimental effects of lead in mining and smelting communities. The neurocognitive and behavioural effects of lead on children are well known. This research characterized the conjoint influence of lead exposure and home enriched environment on neurocognitive function and behaviour for first-grade children living in a Mexican lead smelter community. Structural equation models were used for this analysis, with the latent outcome variables Cognition and Behaviour constructed from a battery of assessments administered to the first-grade children, their parents, and teachers. Structural equation modelling was used to describe complex relationships between exposure and health outcomes in a manner that permitted partition of both direct and indirect effects of the factors being measured. Home Environment (a latent variable constructed from information on the mother's education and her support of school work and extracurricular activities) and child blood lead concentration each had a significant main effect on Cognition and Behaviour. However, there were no statistically significant moderation relationships between lead and Home Environment on these latent outcomes. Home Environment had a significant indirect mediation effect between lead and both Cognition and Behaviour, with a moderate mediation effect with respect to lead effects on Behaviour (β=0.305) and a lower mediation effect on Cognition (β=0.184). The extent of home enrichment in this study was most highly related to the mother's support of schoolwork and slightly less to the mother's support of extracurricular activities or the mother's education. Further research may be able to develop approaches to support families in making changes within their home and child-rearing practices, or advocate for different approaches to support their child's behaviour, in order to reduce the impact of lead exposure on children's cognitive and behavioural outcomes.

  16. Leading healthcare in complexity.

    Science.gov (United States)

    Cohn, Jeffrey

    2014-12-01

    Healthcare institutions and providers are in complexity. Networks of interconnections from relationships and technology create conditions in which interdependencies and non-linear dynamics lead to surprising, unpredictable outcomes. Previous effective approaches to leadership, focusing on top-down bureaucratic methods, are no longer effective. Leading in complexity requires leaders to accept the complexity, create an adaptive space in which innovation and creativity can flourish and then integrate the successful practices that emerge into the formal organizational structure. Several methods for doing adaptive space work will be discussed. Readers will be able to contrast traditional leadership approaches with leading in complexity. They will learn new behaviours that are required of complexity leaders, along with challenges they will face, often from other leaders within the organization.

  17. An SQL-based approach to physics analysis

    International Nuclear Information System (INIS)

    Limper, Dr Maaike

    2014-01-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced 'ROOT-ntuple' files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.
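
    A minimal Python sketch of the idea of pushing an analysis selection into SQL rather than looping over ntuple entries; the in-memory sqlite3 database, table layout and cut values are hypothetical, not the openlab schema or the experiments' actual selections:

      # Sketch: a simple physics selection expressed as an SQL query so that the
      # cut runs inside the database engine (which can parallelise it) rather
      # than in a loop over ntuple entries. Table and column names are made up.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE muons (event_id INTEGER, pt REAL, eta REAL, charge INTEGER)")
      con.executemany("INSERT INTO muons VALUES (?, ?, ?, ?)",
                      [(1, 28.5, 0.4, 1), (1, 31.2, -1.1, -1),
                       (2, 12.0, 2.3, 1), (3, 45.9, 0.1, -1)])

      # The "analysis cut" is evaluated entirely by the database.
      rows = con.execute("""
          SELECT event_id, COUNT(*) AS n_good
          FROM muons
          WHERE pt > 20.0 AND ABS(eta) < 2.4
          GROUP BY event_id
          HAVING n_good >= 2
      """).fetchall()
      print(rows)   # events passing a di-muon style selection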

  18. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study

    Directory of Open Access Journals (Sweden)

    Tania Dehesh

    2015-01-01

    Full Text Available Background. The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. Methods. We evaluated the efficiency of four new approaches, including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazards model coefficients in a simulation study. Results. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches under all of the above settings was MMC ≥ EC ≥ CC ≥ ZC. Conclusion. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggest the use of the MMC procedure to overcome the lack of information needed for a complete covariance matrix of the coefficients.

  19. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study.

    Science.gov (United States)

    Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi

    2015-01-01

    The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least squares (MGLS) method as a multivariate meta-analysis approach. We evaluated the efficiency of the four new approaches, zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC), on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazards model coefficients in a simulation study. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches under all of the above settings was MMC ≥ EC ≥ CC ≥ ZC. This study highlights the advantages of MGLS meta-analysis over the UM approach. The results suggest the use of the MMC procedure to overcome the lack of information needed for a complete covariance matrix of the coefficients.
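
    A minimal Python sketch of the multivariate GLS pooling idea when only standard errors are reported and the within-study correlation must be assumed (the spirit of the ZC/CC-style approximations); the coefficient vectors, standard errors and correlation values are illustrative, not the paper's simulation setup:

      # GLS pooling of k-dimensional coefficient vectors across studies, with the
      # within-study covariance approximated from standard errors and an assumed
      # correlation rho. Illustrative only; not the authors' implementation.
      import numpy as np

      def pool(coefs, ses, rho=0.0):
          """coefs: (n_studies, k) estimates; ses: (n_studies, k) standard errors."""
          k = coefs.shape[1]
          corr = np.full((k, k), rho) + (1 - rho) * np.eye(k)   # assumed correlation
          w_sum = np.zeros((k, k))
          wb_sum = np.zeros(k)
          for b, se in zip(coefs, ses):
              cov = corr * np.outer(se, se)     # approximate within-study covariance
              w = np.linalg.inv(cov)            # study weight matrix
              w_sum += w
              wb_sum += w @ b
          pooled_cov = np.linalg.inv(w_sum)
          return pooled_cov @ wb_sum, pooled_cov

      coefs = np.array([[0.42, -0.10], [0.35, -0.05], [0.50, -0.12]])
      ses   = np.array([[0.10,  0.08], [0.12,  0.09], [0.09,  0.07]])
      beta_zc, _ = pool(coefs, ses, rho=0.0)   # "zero correlation" style assumption
      beta_cc, _ = pool(coefs, ses, rho=0.3)   # "common correlation" style assumption
      print(beta_zc, beta_cc)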

  20. Lead shielded cells for the spectrographic analysis of radioisotope solutions

    International Nuclear Information System (INIS)

    Roca, M.; Capdevila, C.; Cruz, F. de la

    1967-01-01

    Two lead shielded cells for the spectrochemical analysis of radioisotope samples are described. One of them is devoted to the evaporation of samples before excitation and the other contains a suitable spectrographic excitation stand for the copper spark technique. A special device allows the excitation cell to be displaced easily on wheels and rails, for accurate and reproducible positioning, as well as its replacement by a glove box for plutonium analysis. In order to guarantee safety, the room in which the spectrograph and the source are set up is separated from the active laboratory by a wall with a suitable window. (Author) 1 refs

  1. Dynamic risk analysis using bow-tie approach

    International Nuclear Information System (INIS)

    Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2012-01-01

    Accident probability estimation is a common and central step in all quantitative risk assessment methods. Among the many techniques available, the bow-tie model (BT) is very popular because it represents the complete accident scenario, including causes and consequences. However, it suffers from a static structure, limiting its application in real-time monitoring and probability updating, which are key factors in dynamic risk analysis. The present work is focused on using the BT approach in a dynamic environment in which the occurrence probability of accident consequences changes. In this method, on one hand, the failure probabilities of the primary events of the BT, leading to the top event, are developed using physical reliability models and are constantly revised as physical parameters (e.g., pressure, velocity, dimension, etc.) change. On the other hand, the failure probabilities of the safety barriers of the BT are periodically updated using Bayes' theorem as new information becomes available over time. Finally, the resulting updated BT is used to estimate the posterior probability of the consequences, which in turn results in an updated risk profile. - Highlights: ► A methodology is proposed to make the bow-tie method suitable for dynamic risk analysis. ► Physical reliability models are used to revise the top event. ► Bayes' theorem is used to update the probability of safety barriers. ► The number of accidents in sequential time intervals is used to form the likelihood function. ► The risk profile is updated for varying physical parameters and for different times.
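
    A minimal Python sketch of the barrier-updating step under the common assumption of a conjugate Beta prior with a binomial likelihood formed from failure/demand counts in successive intervals; the prior parameters and observation stream are illustrative, not the paper's case study:

      # Bayesian updating of a bow-tie safety barrier failure probability:
      # Beta prior + binomial likelihood per time interval (conjugate update).
      # All numbers below are placeholders.

      alpha, beta = 1.0, 19.0          # prior belief: mean failure probability = 0.05

      # (failures, demands) observed in successive time intervals
      observations = [(0, 10), (1, 12), (0, 8), (2, 15)]

      for t, (failures, demands) in enumerate(observations, start=1):
          alpha += failures                 # conjugate update of the Beta parameters
          beta += demands - failures
          posterior_mean = alpha / (alpha + beta)
          print(f"interval {t}: posterior failure probability = {posterior_mean:.3f}")

      # The updated barrier probabilities would then be propagated through the
      # consequence side of the bow-tie to refresh the risk profile.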

  2. SENSITIVITY ANALYSIS as a methodical approach to the development of design strategies for environmentally sustainable buildings

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring

    The research methodology applied in the project combines a literature study of descriptions of methodical approaches and built examples with a sensitivity analysis and a qualitative interview with two designers from a best-practice example of a practice that has achieved environmentally sustainable ... architecture, such as: ecological, green, bio-climatic, sustainable, passive, low-energy and environmental architecture. This PhD project sets out to gain a better understanding of environmentally sustainable architecture and the methodical approaches applied in the development of this type of architecture ... an increase in scientific and political awareness, which has led to an escalation in the number of research publications in the field, as well as legislative demands for the energy consumption of buildings. The publications in the field refer to many different approaches to environmentally sustainable

  3. Trace analysis of lead and cadmium in seafoods by differential pulse anodic stripping voltammetry

    International Nuclear Information System (INIS)

    Sumera, F.C.; Verceluz, F.P.; Kapauan, P.A.

    1979-01-01

    A method for the simultaneous determination of cadmium and lead in seafoods is described. The sample is dry ashed in a muffle furnace, raising the temperature gradually up to 500 °C. The ashed sample is treated with concentrated nitric acid, dried on a heating plate and returned to the muffle furnace for further heating. The treated ash is then dissolved in 1 N HCl; acetate buffer and citric acid are added and the pH is adjusted to 3.6-4. The resulting solution is analyzed for lead and cadmium by differential pulse anodic stripping voltammetry (DPASV) using a wax-impregnated graphite thin-film electrode. The average recoveries of 0.4 of cadmium and lead added to 5 fish samples were 97% and 99%, respectively. The standard deviations for lead and cadmium analysis on a homogenized shark sample were 6.7 ppb and 12.3 ppb, respectively, and the relative standard deviations were 21.0% and 15.5%, respectively. Studies on the instrumental parameters involved in the DPASV step of the analysis and on methods of measuring peak current signals were also made. (author)

  4. Lead user projects in practice

    DEFF Research Database (Denmark)

    Brem, Alexander; Gutstein, Adele

    2018-01-01

    Earlier research on the lead user method is focused on individual case studies and how the method was applied in a specific context. In this paper, we take a broader approach, analyzing a sample of 24 lead user projects, which included working with 188 lead users. These projects were analyzed...

  5. Comparing Machine Learning and Decision Making Approaches to Forecast Long Lead Monthly Rainfall: The City of Vancouver, Canada

    Directory of Open Access Journals (Sweden)

    Zahra Zahmatkesh

    2018-01-01

    Full Text Available Estimating maximum possible rainfall is of great value for flood prediction and protection, particularly for regions, such as Canada, where urban and fluvial floods from extreme rainfalls have been known to be a major concern. In this study, a methodology is proposed to forecast real-time rainfall (with one month lead time) using different numbers of spatial inputs with different orders of lags. For this purpose, two types of models are used. The first is a machine learning, data-driven model, which uses a set of hydrologic variables as inputs, and the second is an empirical-statistical model that employs a multi-criteria decision analysis method for rainfall forecasting. The data-driven model is built on Artificial Neural Networks (ANNs), while the multi-criteria decision analysis model uses the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) approach. A comprehensive set of spatially varying climate variables, including geopotential height, sea surface temperature, sea level pressure, humidity, temperature and pressure with different orders of lags, is collected to form input vectors for the forecast models. A feature selection method is then employed to identify the most appropriate predictors. Two sets of results from the developed models, i.e., the maximum daily rainfall in each month (RMAX) and the cumulative rainfall for each month (RCU), are considered as the target variables for forecasting. The results from both modeling approaches are compared using a number of evaluation criteria such as the Nash-Sutcliffe Efficiency (NSE). The proposed models are applied to rainfall forecasting for a coastal area in western Canada: Vancouver, British Columbia. The results indicate that although data-driven models such as ANNs work well for simulation purposes, the developed TOPSIS model considerably outperforms ANNs for rainfall forecasting. ANNs show acceptable simulation performance during the
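
    A minimal Python sketch of a standard TOPSIS ranking of candidate predictor states; the decision matrix, weights and benefit/cost directions are illustrative assumptions, not the paper's data:

      # Standard TOPSIS: normalise the decision matrix, weight it, and rank
      # alternatives by closeness to the ideal solution. Values are hypothetical.
      import numpy as np

      def topsis(matrix, weights, benefit):
          """matrix: (alternatives, criteria); weights sum to 1;
          benefit[j] is True if larger values of criterion j are better."""
          norm = matrix / np.sqrt((matrix ** 2).sum(axis=0))   # vector normalisation
          v = norm * weights
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))
          d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))
          return d_neg / (d_pos + d_neg)                       # closeness in [0, 1]

      # rows: candidate predictor states; columns: e.g. SST anomaly, humidity, pressure
      X = np.array([[0.8, 55.0, 1012.0],
                    [0.2, 70.0, 1008.0],
                    [0.5, 65.0, 1010.0]])
      scores = topsis(X, weights=np.array([0.4, 0.4, 0.2]),
                      benefit=np.array([True, True, False]))
      print(scores.argsort()[::-1])   # ranking of alternatives, best first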

  6. The use of lead isotopic abundances in trace uranium samples for nuclear forensics analysis

    International Nuclear Information System (INIS)

    Fahey, A.J.; Ritchie, N.W.M.; Newbury, D.E.; Small, J.A.

    2010-01-01

    Secondary ion mass spectrometry (SIMS), scanning electron microscopy (SEM) and X-ray analysis have been applied to the measurement of U-bearing particles with the intent of gleaning information concerning their history and/or origin. The lead isotopic abundances are definitive indicators that U-bearing particles have come from an ore body, even if they have undergone chemical processing. SEM images and X-ray analysis can add further information to the study that may point to the extent of chemical processing. The presence of 'common' lead that does not exhibit a radiogenic signature is clear evidence of anthropogenic origin. (author)

  7. Potentiometric stripping analysis of lead and cadmium leaching from dental prosthetic materials and teeth

    Directory of Open Access Journals (Sweden)

    GORAN M. NIKOLIC

    2004-07-01

    Full Text Available Potentiometric stripping analysis (PSA) was applied for the determination of lead and cadmium leaching from dental prosthetic materials and teeth. The soluble lead content in finished dental implants was found to be much lower than that of the individual components used for their preparation. Cadmium was not detected in dental implants and materials under the defined conditions. The soluble lead and cadmium content of teeth was slightly lower than the lead and cadmium content in whole teeth (w/w) reported by other researchers, except in the case of a tooth with a removed amalgam filling. The results of this work suggest that PSA may be a good method for lead and cadmium leaching studies in investigations of the biocompatibility of dental prosthetic materials.

  8. Lead in teeth from lead-dosed goats: Microdistribution and relationship to the cumulative lead dose

    International Nuclear Information System (INIS)

    Bellis, David J.; Hetter, Katherine M.; Jones, Joseph; Amarasiriwardena, Dula; Parsons, Patrick J.

    2008-01-01

    Teeth are commonly used as a biomarker of long-term lead exposure. There appear to be few data, however, on the content or distribution of lead in teeth where data on the specific lead intake (dose) are also available. This study describes the analysis of a convenience sample of teeth from animals that were dosed with lead for other purposes, i.e., a proficiency testing program for blood lead. The lead concentration of whole teeth obtained from 23 animals, as determined by atomic absorption spectrometry, varied from 0.6 to 80 μg g⁻¹. Linear regression of whole tooth lead (μg g⁻¹) on the cumulative lead dose received by the animal (g) yielded a slope of 1.2, with r² = 0.647. The highest lead concentrations were found in circumpulpal dentine. Linear regression of circumpulpal lead (μg g⁻¹) on cumulative lead dose (g) yielded a slope of 23, with r² = 0.961 (p = 0.0001). The data indicated that whole tooth lead, and especially circumpulpal lead, of dosed goats increased linearly with cumulative lead exposure. These data suggest that circumpulpal dentine is a better biomarker of cumulative lead exposure than is whole tooth lead, at least for lead-dosed goats.

  9. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Bruyère, D.; Ismaël, A.; Gallou, G.; Laperche, V.; Michel, K.; Canioni, L.; Bousquet, B.

    2014-01-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed the relative amounts of silicate, calcareous and ore matrices in the soils to be retrieved. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. A series of artificial neural networks was then applied to quantify lead in the soil samples. More precisely, two models were designed for classification purposes, according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed a relative error of prediction close to 20% to be reached, which is considered satisfactory for on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples
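
    A minimal Python sketch of the two-stage idea of a classification network followed by locally trained quantification models, using scikit-learn MLPs on synthetic placeholder data (not the authors' LIBS spectra, network architecture or training procedure):

      # Stage 1 classifies each spectrum into a matrix subset; stage 2 applies a
      # quantification network trained only on that subset. Data are synthetic.
      import numpy as np
      from sklearn.neural_network import MLPClassifier, MLPRegressor

      rng = np.random.default_rng(0)
      n, n_features = 300, 40                      # 40 spectral intensities per sample
      X = rng.normal(size=(n, n_features))
      matrix_label = rng.integers(0, 3, size=n)    # silicate / calcareous / ore
      pb = np.abs(rng.normal(100, 50, size=n))     # "true" Pb concentration (mg/kg)

      # Stage 1: classification network (matrix type)
      clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
      clf.fit(X, matrix_label)

      # Stage 2: one local quantification network per class
      regressors = {}
      for c in np.unique(matrix_label):
          idx = matrix_label == c
          reg = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
          reg.fit(X[idx], pb[idx])
          regressors[c] = reg

      # Prediction chains the two stages together
      predicted_class = clf.predict(X[:5])
      predicted_pb = [regressors[c].predict(x.reshape(1, -1))[0]
                      for c, x in zip(predicted_class, X[:5])]
      print(predicted_pb)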

  10. Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data

    Science.gov (United States)

    Lee, Sanggyun; Kim, Hyun-cheol; Im, Jungho

    2018-05-01

    We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which shows interannual variability of recent sea ice cover during 2011-2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.
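
    A minimal Python sketch of the waveform-mixture idea, unmixing a waveform into non-negative contributions of lead, sea-ice and ocean endmembers with scipy's NNLS; the endmember shapes and decision threshold are illustrative assumptions, not the paper's values:

      # Each waveform is modelled as a non-negative combination of reference
      # ("endmember") waveforms; the lead abundance drives the classification.
      import numpy as np
      from scipy.optimize import nnls

      bins = np.arange(128)
      lead_end  = np.exp(-0.5 * ((bins - 60) / 1.5) ** 2)    # specular, very peaky
      ice_end   = np.exp(-0.5 * ((bins - 60) / 8.0) ** 2)    # diffuse, broader
      ocean_end = np.exp(-0.5 * ((bins - 60) / 20.0) ** 2)   # broadest return
      E = np.column_stack([lead_end, ice_end, ocean_end])

      def lead_fraction(waveform):
          """Unmix one waveform into endmember abundances (non-negative)."""
          abundances, _ = nnls(E, waveform)
          return abundances[0] / abundances.sum()

      # Synthetic test waveform: 70 % lead-like + 30 % ice-like return, plus noise
      w = 0.7 * lead_end + 0.3 * ice_end + 0.01 * np.random.default_rng(1).random(128)
      is_lead = lead_fraction(w) > 0.5          # hypothetical decision threshold
      print(lead_fraction(w), is_lead)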

  11. Lead coolant test facility systems design, thermal hydraulic analysis and cost estimate

    Energy Technology Data Exchange (ETDEWEB)

    Khericha, Soli, E-mail: slk2@inel.gov [Battelle Energy Alliance, LLC, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Harvego, Edwin; Svoboda, John; Evans, Robert [Battelle Energy Alliance, LLC, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Dalling, Ryan [ExxonMobil Gas and Power Marketing, Houston, TX 77069 (United States)

    2012-01-15

    The Idaho National Laboratory prepared preliminary technical and functional requirements (T&FR), a thermal hydraulic design and a cost estimate for a lead coolant test facility. The purpose of this small scale facility is to simulate lead-cooled fast reactor (LFR) coolant flow in an open lattice geometry core using seven electrical rods and liquid lead or lead-bismuth eutectic coolant. Based on a review of current world lead or lead-bismuth test facilities and the research needs listed in the Generation IV Roadmap, five broad areas of requirements were identified: (1) develop and demonstrate the feasibility of a submerged heat exchanger; (2) develop and demonstrate open-lattice flow in an electrically heated core; (3) develop and demonstrate chemistry control; (4) demonstrate safe operation; and (5) provision for future testing. This paper discusses the preliminary design of the systems, the thermal hydraulic analysis, and a simplified cost estimate. The facility thermal hydraulic design is based on a maximum simulated core power of 420 kW using seven electrical heater rods, with an average linear heat generation rate of 300 W/cm. The core inlet temperature for liquid lead or Pb/Bi eutectic is 420 °C. The design includes approximately seventy-five data measurements such as pressure, temperature, and flow rates. The preliminary estimated cost of construction of the facility is $3.7M (in 2006 $). It is also estimated that the facility will require two years to be constructed and ready for operation.

  12. a Novel Approach for 3d Neighbourhood Analysis

    Science.gov (United States)

    Emamgholian, S.; Taleai, M.; Shojaei, D.

    2017-09-01

    Population growth and lack of land in urban areas have caused massive developments such as high rises and underground infrastructure. Land authorities in the international context recognize 3D cadastres as a solution to efficiently manage these developments in complex cities. Although a 2D cadastre does not efficiently register these developments, it is currently being used in many jurisdictions for registering land and property information; limitations in analysis and presentation are examples of its shortcomings. 3D neighbourhood analysis, by automatically finding 3D spaces, has become an issue of major interest in recent years. Whereas neighbourhood analysis has been a focus of research, the idea of 3D neighbourhood analysis has rarely been addressed in three-dimensional information systems (3D GIS) analysis. In this paper, a novel approach for 3D neighbourhood analysis is proposed, based on recording spatial and descriptive information of the apartment units and easements. This approach uses the coordinates of the subject apartment unit to find the neighbouring spaces. By considering a buffer around the edges of the unit, neighbouring spaces are accurately detected. This method was implemented in ESRI ArcScene and three case studies were defined to test the efficiency of the approach. The results show that spaces are accurately detected in various complex scenarios. The approach can also be applied to other applications, such as property management and disaster management, in order to find the affected apartments around a defined space.
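
    A minimal Python sketch of buffer-based 3D neighbour detection, approximating each apartment unit by an axis-aligned bounding box; the unit records and the buffer distance are hypothetical:

      # Grow the subject unit's bounding box by a buffer and report every other
      # unit whose box overlaps the buffered box (a stand-in for full geometry).
      from dataclasses import dataclass

      @dataclass
      class Unit:
          name: str
          min_xyz: tuple  # (x, y, z) lower corner
          max_xyz: tuple  # (x, y, z) upper corner

      def expanded(box_min, box_max, buffer):
          """Grow a bounding box by a buffer distance on every side."""
          return (tuple(c - buffer for c in box_min),
                  tuple(c + buffer for c in box_max))

      def overlaps(a_min, a_max, b_min, b_max):
          """Axis-aligned boxes overlap iff they overlap on every axis."""
          return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))

      def neighbours(subject, units, buffer=0.2):
          s_min, s_max = expanded(subject.min_xyz, subject.max_xyz, buffer)
          return [u.name for u in units
                  if u.name != subject.name and overlaps(s_min, s_max, u.min_xyz, u.max_xyz)]

      units = [
          Unit("Apt 101", (0, 0, 0), (10, 8, 3)),
          Unit("Apt 102", (10, 0, 0), (20, 8, 3)),    # shares a wall with Apt 101
          Unit("Apt 201", (0, 0, 3), (10, 8, 6)),     # directly above Apt 101
          Unit("Apt 305", (40, 40, 9), (50, 48, 12)), # far away
      ]
      print(neighbours(units[0], units))   # -> ['Apt 102', 'Apt 201']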

  13. A Clifford analysis approach to superspace

    International Nuclear Information System (INIS)

    Bie, H. de; Sommen, F.

    2007-01-01

    A new framework for studying superspace is given, based on methods from Clifford analysis. This leads to the introduction of both orthogonal and symplectic Clifford algebra generators, allowing for an easy and canonical introduction of a super-Dirac operator, a super-Laplace operator and the like. This framework is then used to define a super-Hodge coderivative, which, together with the exterior derivative, factorizes the Laplace operator. Finally both the cohomology of the exterior derivative and the homology of the Hodge operator on the level of polynomial-valued super-differential forms are studied. This leads to some interesting graphical representations and provides a better insight in the definition of the Berezin-integral

  14. Lead Farmers Approach in Disseminating Improved Tef Production ...

    African Journals Online (AJOL)

    Abate

    interview schedule and testing the tef variety on field plots of ten lead farmers ..... Product transporting from the farm ... partly explained by the fact that smallholder farmers cannot afford to purchase .... Agricultural Extension, Good Intentions.

  15. Analysis of microRNA profile of Anopheles sinensis by deep sequencing and bioinformatic approaches.

    Science.gov (United States)

    Feng, Xinyu; Zhou, Xiaojian; Zhou, Shuisen; Wang, Jingwen; Hu, Wei

    2018-03-12

    microRNAs (miRNAs) are small non-coding RNAs widely identified in many mosquitoes. They are reported to play important roles in development, differentiation and innate immunity. However, miRNAs in Anopheles sinensis, one of the Chinese malaria mosquitoes, remain largely unknown. We investigated the global miRNA expression profile of An. sinensis using Illumina Hiseq 2000 sequencing. Meanwhile, we applied a bioinformatic approach to identify potential miRNAs in An. sinensis. The identified miRNA profiles were compared and analyzed by two approaches. The selected miRNAs from the sequencing result and the bioinformatic approach were confirmed with qRT-PCR. Moreover, target prediction, GO annotation and pathway analysis were carried out to understand the role of miRNAs in An. sinensis. We identified 49 conserved miRNAs and 12 novel miRNAs by next-generation high-throughput sequencing technology. In contrast, 43 miRNAs were predicted by the bioinformatic approach, of which two were assigned as novel. Comparative analysis of miRNA profiles by two approaches showed that 21 miRNAs were shared between them. Twelve novel miRNAs did not match any known miRNAs of any organism, indicating that they are possibly species-specific. Forty miRNAs were found in many mosquito species, indicating that these miRNAs are evolutionally conserved and may have critical roles in the process of life. Both the selected known and novel miRNAs (asi-miR-281, asi-miR-184, asi-miR-14, asi-miR-nov5, asi-miR-nov4, asi-miR-9383, and asi-miR-2a) could be detected by quantitative real-time PCR (qRT-PCR) in the sequenced sample, and the expression patterns of these miRNAs measured by qRT-PCR were in concordance with the original miRNA sequencing data. The predicted targets for the known and the novel miRNAs covered many important biological roles and pathways indicating the diversity of miRNA functions. We also found 21 conserved miRNAs and eight counterparts of target immune pathway genes in An. sinensis

  16. Research approaches to the analysis of «man-production» relations

    Directory of Open Access Journals (Sweden)

    Liliya A. Otstavnova

    2014-01-01

    Full Text Available Objective: to identify and describe the research approaches used in the analysis of the relationship between humans and production. Methods: the methods of grouping and description and the historical-logical method were applied. Results: basing on the characteristics of the main approaches used in the analysis of «man-production» relations, and taking into account the focus of the research, it was established that applying institutional, systematic, quantitative, regulatory, legislative, structural, functional and integrated approaches allows equal attention to be paid to both production and man. Organic, humanistic, reproductive and situational approaches focus primarily on the man, while economic, procedural, structural and marketing approaches focus on production. The assignment of each approach to a particular group is justified. Scientific novelty: the author presents a classification of research approaches to the analysis of the «man-production» relations system, consisting of two subsystems. Each approach is given a detailed characteristic with respect to both man and production, which makes it possible to evaluate the applicability of these approaches and to increase the efficiency of research into this system. Practical value: it is manifested in the ability to optimize the use of research approaches to the analysis of the «man-production» relations system, to identify problems and ways to address them.

  17. Lead, arsenic, and copper content of crops grown on lead arsenate-treated and untreated soils

    Energy Technology Data Exchange (ETDEWEB)

    Chisholm, D

    1972-01-01

    Increased lead and arsenic concentrations in the surface soil (0-15 cm), resulting from applications of lead arsenate (PbHAsO4), increased both lead and arsenic levels in crops grown on treated plots. The lead levels in some crops approached or exceeded the Canadian residue tolerance of 2.0 ppm. Lead arsenate soil treatments did not affect copper absorption by crops. On areas such as old orchard land contaminated with lead arsenate residues it may be advisable to monitor crops, and also to determine the lead affinity and arsenic sensitivity of the plants to be grown.

  18. Regge analysis of diffractive and leading baryon structure functions from deep inelastic scattering

    International Nuclear Information System (INIS)

    Batista, M.; Covolan, R.J.M.; Montanha, J.

    2002-01-01

    In this paper we present a combined analysis of the H1 data on leading baryon and diffractive structure functions from DIS, which are handled as two components of the same semi-inclusive process. The available structure function data are analyzed in a series of fits in which three main exchanges are taken into account: the Pomeron, Reggeon, and pion. For each of these contributions, Regge factorization of the correspondent structure function is assumed. By this procedure, we extract information about the interface between the diffractive, Pomeron-dominated, region and the leading proton spectrum, which is mostly ruled by secondary exchanges. One of the main results is that the relative Reggeon contribution to the semi-inclusive structure function is much smaller than the one obtained from an analysis of the diffractive structure function alone

  19. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Thereafter, based on the statistical results of simulating the CPN models, key event paths are identified with a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, which is reliable and could be the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are credible.

  20. Progress in engineering high strain lead-free piezoelectric ceramics

    International Nuclear Information System (INIS)

    Leontsev, Serhiy O; Eitel, Richard E

    2010-01-01

    Environmental concerns are strongly driving the need to replace the lead-based piezoelectric materials currently employed as multilayer actuators. The current review describes both compositional and structural engineering approaches to achieve enhanced piezoelectric properties in lead-free materials. The review of the compositional engineering approach focuses on compositional tuning of the properties and phase behavior in three promising families of lead-free perovskite ferroelectrics: the titanate, alkaline niobate and bismuth perovskites and their solid solutions. The 'structural engineering' approaches focus instead on optimization of microstructural features including grain size, grain orientation or texture, ferroelectric domain size and electrical bias field as potential paths to induce large piezoelectric properties in lead-free piezoceramics. It is suggested that a combination of both compositional and novel structural engineering approaches will be required in order to realize viable lead-free alternatives to current lead-based materials for piezoelectric actuator applications. (topical review)

  1. Progress in engineering high strain lead-free piezoelectric ceramics

    Science.gov (United States)

    Leontsev, Serhiy O; Eitel, Richard E

    2010-01-01

    Environmental concerns are strongly driving the need to replace the lead-based piezoelectric materials currently employed as multilayer actuators. The current review describes both compositional and structural engineering approaches to achieve enhanced piezoelectric properties in lead-free materials. The review of the compositional engineering approach focuses on compositional tuning of the properties and phase behavior in three promising families of lead-free perovskite ferroelectrics: the titanate, alkaline niobate and bismuth perovskites and their solid solutions. The ‘structural engineering’ approaches focus instead on optimization of microstructural features including grain size, grain orientation or texture, ferroelectric domain size and electrical bias field as potential paths to induce large piezoelectric properties in lead-free piezoceramics. It is suggested that a combination of both compositional and novel structural engineering approaches will be required in order to realize viable lead-free alternatives to current lead-based materials for piezoelectric actuator applications. PMID:27877343

  2. A NOVEL APPROACH FOR 3D NEIGHBOURHOOD ANALYSIS

    Directory of Open Access Journals (Sweden)

    S. Emamgholian

    2017-09-01

    Full Text Available Population growth and lack of land in urban areas have caused massive developments such as high rises and underground infrastructure. Land authorities in the international context recognize 3D cadastres as a solution to efficiently manage these developments in complex cities. Although a 2D cadastre does not efficiently register these developments, it is currently being used in many jurisdictions for registering land and property information; limitations in analysis and presentation are examples of its shortcomings. 3D neighbourhood analysis, by automatically finding 3D spaces, has become an issue of major interest in recent years. Whereas neighbourhood analysis has been a focus of research, the idea of 3D neighbourhood analysis has rarely been addressed in three-dimensional information systems (3D GIS) analysis. In this paper, a novel approach for 3D neighbourhood analysis is proposed, based on recording spatial and descriptive information of the apartment units and easements. This approach uses the coordinates of the subject apartment unit to find the neighbouring spaces. By considering a buffer around the edges of the unit, neighbouring spaces are accurately detected. This method was implemented in ESRI ArcScene and three case studies were defined to test the efficiency of the approach. The results show that spaces are accurately detected in various complex scenarios. The approach can also be applied to other applications, such as property management and disaster management, in order to find the affected apartments around a defined space.

  3. Adaptation and implementation of the TRACE code for transient analysis in designs of lead cooled fast reactors

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF) as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of the design. One approach to this issue is to modify current computational codes developed for the simulation of Light Water Reactors so that they become applicable to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  4. Spectrum analysis in lead spectrometer for isotopic fissile assay in used fuel

    International Nuclear Information System (INIS)

    Lee, Y.D.; Park, C.J.; Kim, H.D.; Song, K.C.

    2014-01-01

    The LSDS system is under development for analyzing isotopic fissile content and is applicable in a hot cell for the pyro process. The fuel assay area and nuclear material composition were selected for simulation. The source mechanism for efficient neutron generation was also determined: neutrons are produced at a Ta target when it is hit by accelerated electrons. The parameters of the electron accelerator are being researched for cost effectiveness, easy maintenance, and compact size. The basic principle of LSDS is that each fissile isotope has its own fission structure below the unresolved resonance region. The source neutrons interact with the lead medium and produce a continuous neutron energy spectrum, which generates the dominant fission response of each fissile isotope. Therefore, spectrum analysis in the lead medium and in the fuel area is very important for the system to work. The energy spectrum with respect to slowing-down energy, and the energy resolution, were investigated in lead. A spectrum analysis was also done in the presence of the surrounding detectors. In particular, high resonance energies were considered. The spectrum was well behaved at each slowing-down energy and the energy resolution was sufficient to distinguish the fissions of the individual fissile isotopes. Additionally, LSDS is applicable to the optimum design of spent fuel storage and management. The isotopic fissile content assay will increase the transparency and credibility of spent fuel storage and its re-utilization, as demanded internationally. (author)
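
    A hedged Python sketch of the slowing-down time-to-energy relation commonly used for lead slowing-down spectrometers, E ≈ K/(t + t0)^2; the calibration constants below are placeholders, since they are specific to each spectrometer and are not given in the record above:

      # Map slowing-down time to mean neutron energy with the usual 1/(t+t0)^2 law.
      # K and t0 are spectrometer-specific calibration values; these are placeholders.
      K_KEV_US2 = 165.0   # placeholder calibration constant (keV * us^2)
      T0_US = 0.3         # placeholder time offset (us)

      def slowing_down_energy_kev(t_us):
          """Mean neutron energy (keV) at slowing-down time t (microseconds)."""
          return K_KEV_US2 / (t_us + T0_US) ** 2

      for t in (1.0, 5.0, 20.0, 100.0):
          print(f"t = {t:6.1f} us  ->  E ~ {slowing_down_energy_kev(t):8.3f} keV")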

  5. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
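
    A minimal Python sketch of propensity-score matching with a logistic propensity model and greedy nearest-neighbour pairing, on synthetic data (not the paper's heuristic example):

      # 1) fit a propensity model, 2) match each treated unit to the control with
      # the closest propensity score, 3) estimate the effect on the matched sample.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      n = 200
      x = rng.normal(size=(n, 3))                         # observed covariates
      treated = (rng.random(n) < 1 / (1 + np.exp(-x[:, 0]))).astype(int)
      outcome = 2.0 * treated + x[:, 0] + rng.normal(size=n)

      # Propensity scores from a logistic model of treatment on covariates
      ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

      # Greedy 1:1 nearest-neighbour matching on the propensity score
      treated_idx = np.where(treated == 1)[0]
      control_idx = np.where(treated == 0)[0]
      matches = [control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))]
                 for i in treated_idx]

      # Treatment effect estimate on the matched sample
      att = np.mean(outcome[treated_idx] - outcome[matches])
      print(f"matched estimate of the treatment effect: {att:.2f}")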

  6. ANALYSIS OF THE APPROACH TO THE ATLANTIC FOREST IN HIGH SCHOOL BIOLOGY TEXTBOOKS

    Directory of Open Access Journals (Sweden)

    Nicácio Oliveira Freitas

    2017-03-01

    Full Text Available Textbooks are the main teaching tool for students and teachers, and analyzing them makes it possible to point out several shortcomings in how their contents are approached. Thus, the objective of this work was to analyze the approach to the Atlantic Forest, considered one of the most degraded environments in the world. A total of seven high school biology textbooks were analyzed, following an evaluation script covering general information, biotic and abiotic factors, environmental conservation and anthropic action, each item being rated as satisfactory or unsatisfactory. In general, the Atlantic Forest theme was addressed by all of the assessed books, including in specific topics; however, some aspects, such as the use of images, were handled improperly, leading to misunderstandings about the Atlantic Forest. In addition, ecosystem dynamics, its components and the environmental impacts were not addressed satisfactorily in the majority of the works assessed, and many aspects of the theme presented problems in their treatment (currency, concepts, definitions and importance) and in their illustration of the current situation of the Atlantic Forest. Periodic revision of these textbooks is of great importance to ensure an education that allows students to analyze and understand the effects of their actions on the environment and to reflect on ways to mitigate them. Keywords: biology textbooks; ecosystem; content analysis.

  7. Structural development and kinetic analysis of PbTiO3 powders processed at low-temperature via new sol-gel approach

    Science.gov (United States)

    Bel-Hadj-Tahar, Radhouane; Abboud, Mohamed

    2018-04-01

    The synthesis of crystalline lead titanate powder by a generic low-temperature sol-gel approach is developed. Acetoin was added as a ligand, instead of the commonly used alkanolamines, to ensure total dissolution of the precursor compounds. The feasibility of the acetoin-Ti isopropoxide complex as a new precursor of PbTiO3 perovskite particles via the sol-gel method has been demonstrated. No excess lead was introduced. Nanometric PbTiO3 crystallites were formed at 400 °C under atmospheric pressure from titanium isopropoxide and lead acetate in alcoholic solution, with a remarkably low activation energy of the crystallization process of 90 kJ mol⁻¹. The powders show a tetragonal lattice and dendritic morphology. In addition to the effects of heat-treatment temperature, time, and atmosphere, the sol chemistry particularly influenced the phase composition, particle size, and particle morphology. The use of different ligands significantly modified the powder morphology. The extent of crystallization was quantitatively evaluated by differential thermal analysis and analyzed with the Johnson-Mehl-Avrami approach. The crystallization followed two rate regimes depending on the interval of the crystallized fraction.
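
    A minimal Python sketch of a Johnson-Mehl-Avrami fit of the crystallized fraction, x(t) = 1 - exp(-(k*t)^n); the data points are synthetic placeholders, not the DTA measurements reported above:

      # Fit the Avrami equation to crystallised-fraction data to recover the rate
      # constant k and the Avrami exponent n. Data below are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def jma(t, k, n):
          """Avrami equation for the crystallised fraction at time t."""
          return 1.0 - np.exp(-(k * t) ** n)

      # Hypothetical isothermal crystallised-fraction data (time in minutes)
      t = np.array([2, 5, 10, 15, 20, 30, 45, 60], dtype=float)
      x = np.array([0.05, 0.18, 0.45, 0.65, 0.78, 0.91, 0.97, 0.99])

      (k, n), _ = curve_fit(jma, t, x, p0=(0.1, 1.5))
      print(f"rate constant k = {k:.3f} 1/min, Avrami exponent n = {n:.2f}")

      # Repeating the fit at several temperatures and plotting ln k against 1/T
      # (an Arrhenius plot) gives an estimate of the activation energy.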

  8. Innovative approaches for improving maternal and newborn health--A landscape analysis.

    Science.gov (United States)

    Lunze, Karsten; Higgins-Steele, Ariel; Simen-Kapeu, Aline; Vesel, Linda; Kim, Julia; Dickson, Kim

    2015-12-17

    Essential interventions can improve maternal and newborn health (MNH) outcomes in low- and middle-income countries, but their implementation has been challenging. Innovative MNH approaches have the potential to accelerate progress and to lead to better health outcomes for women and newborns, but their added value to health systems remains incompletely understood. This study's aim was to analyze the landscape of innovative MNH approaches and the related published evidence. We conducted a systematic literature review and descriptive analysis based on the MNH continuum of care framework and the World Health Organization health system building blocks, analyzing the range and nature of currently published MNH approaches that are considered innovative. We used 11 databases (MedLine, Web of Science, CINAHL, Cochrane, Popline, BLDS, ELDIS, 3ie, CAB direct, WHO Global Health Library and WHOLIS) as data sources and extracted data according to our study protocol. Most innovative approaches in MNH are iterations of existing interventions, modified for contexts in which they had not been applied previously. Many aim at the direct organization and delivery of maternal and newborn health services or are primarily health workforce interventions. Innovative approaches also include health technologies, interventions based on community ownership and participation, and novel models of financing and policy making. Rigorous randomized trials to assess innovative MNH approaches are rare; most evaluations are smaller pilot studies. Few studies assessed intervention effects on health outcomes or focused on equity in health care delivery. Future implementation and evaluation efforts need to assess innovations' effects on health outcomes and provide evidence on potential for scale-up, considering cost, feasibility, appropriateness, and acceptability. Measuring equity is an important aspect to identify and target population groups at risk of service inequity. Innovative MNH interventions will need innovative

  9. Uranium-lead shielding for nuclear material transportation systems

    International Nuclear Information System (INIS)

    Lusk, E.C.; Miller, N.E.; Basham, S.J. Jr.

    1978-01-01

    The basis for the selection of shielding materials for spent fuel shipping containers is described, with comments on the favorable and unfavorable aspects of steel, lead, and depleted uranium. A concept for a new type of material made of depleted uranium and lead is described which capitalizes on the best cask shielding characteristics of both materials. This cask shielding is made by filling the shielding cavity with pieces of depleted uranium and then backfilling the interstitial voids with lead. The lead would be bonded to the uranium and also to the cask shells if desired. A shielding density approaching 80 percent of that of solid uranium could be achieved, while a density of 65 percent is readily obtainable. This material should overcome the problems of lead melting in the fire accident and of high thermal gradients at uranium-stainless steel interfaces, at a major reduction in cost compared with a solid uranium shielded cask. A development program is described to obtain information on the properties of the composite material to aid in design analysis and licensing and to define the fabrication techniques

  10. VOLUMETRIC LEAD ASSAY

    International Nuclear Information System (INIS)

    Ebadian, M.A.; Dua, S.K.; Roelant, David; Kumar, Sachin

    2001-01-01

    This report describes a system for handling and radioassay of lead, consisting of a robot, a conveyor, and a gamma spectrometer. The report also presents a cost-benefit analysis of options: radioassay and recycling lead vs. disposal as waste

  11. Common approach of risks analysis

    International Nuclear Information System (INIS)

    Noviello, L.; Naviglio, A.

    1996-01-01

    Although, following the resolutions of the High German Court, the protection level of human beings is an objective that can change over time, it is obviously an important point whenever there is a risk to the population. This is particularly true for industrial plants whose possible accidents could affect the population. Accident risk analysis indicates that there is no conceptual difference between the risks of a nuclear power plant and those of other industrial plants such as chemical plants, gas distribution systems and hydraulic dams. An analysis of the legislation prompted by the Seveso Directive on industrial risks gives some important indications that should always be followed. This work analyses in particular the legislative situation in different European countries and identifies some of its most important characteristics. Indeed, the situation differs from country to country, which is a later source of difficulties for nuclear power plants. To strengthen this reasoning, this paper presents some preliminary results of an analysis of a nuclear power plant following the approach used for other industrial plants. In conclusion, it will be necessary to re-examine the risk assessment approach for nuclear power plants, because the real protection level of human beings in a country is determined by the least regulated of the dangerous industrial plants existing in the surroundings. (O.M.)

  12. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and to provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements them in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and relieve them of the underlying reliability computations. Readers, including non-specialists, will find useful information for determining project-specific statistics of geotechnical properties and for performing probabilistic analyses of slope stability.

  13. Extraction of SelectSecure leads compared to conventional pacing leads in patients with congenital heart disease and congenital atrioventricular block.

    Science.gov (United States)

    Shepherd, Emma; Stuart, Graham; Martin, Rob; Walsh, Mark A

    2015-06-01

    SelectSecure™ pacing leads (Medtronic Inc) are increasingly being used in pediatric patients and adults with structural congenital heart disease. The 4Fr lead is ideal for patients who may require lifelong pacing and can be advantageous for patients with complex anatomy. The purpose of this study was to compare the extraction of SelectSecure leads with conventional (stylette-driven) pacing leads in patients with structural congenital heart disease and congenital atrioventricular block. The data on lead extractions from pediatric and adult congenital heart disease (ACHD) patients from August 2004 to July 2014 at Bristol Royal Hospital for Children and the Bristol Heart Institute were reviewed. Multivariable regression analysis was used to determine whether conventional pacing leads were associated with a more difficult extraction process. A total of 57 patients underwent pacemaker lead extractions (22 SelectSecure, 35 conventional). No deaths occurred. Mean age at the time of extraction was 17.6 ± 10.5 years, mean weight was 47 ± 18 kg, and mean lead age was 5.6 ± 2.6 years (range 1-11 years). Complex extraction (partial extraction/femoral extraction) was more common in patients with conventional pacing leads at univariate (P < .01) and multivariate (P = .04) levels. Lead age was also a significant predictor of complex extraction (P < .01). SelectSecure leads can be successfully extracted using techniques that are used for conventional pacing leads. They are less likely to be partially extracted and are less likely to require extraction using a femoral approach compared with conventional pacing leads. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  14. Arctic lead detection using a waveform mixture algorithm from CryoSat-2 data

    Directory of Open Access Journals (Sweden)

    S. Lee

    2018-05-01

    Full Text Available We propose a waveform mixture algorithm to detect leads from CryoSat-2 data, which is novel and different from the existing threshold-based lead detection methods. The waveform mixture algorithm adopts the concept of spectral mixture analysis, which is widely used in the field of hyperspectral image analysis. This lead detection method was evaluated with high-resolution (250 m) MODIS images and showed comparable and promising performance in detecting leads when compared to the previous methods. The robustness of the proposed approach also lies in the fact that it does not require the rescaling of parameters (i.e., stack standard deviation, stack skewness, stack kurtosis, pulse peakiness, and backscatter σ0), as it directly uses L1B waveform data, unlike the existing threshold-based methods. Monthly lead fraction maps were produced by the waveform mixture algorithm, which shows interannual variability of recent sea ice cover during 2011–2016, excluding the summer season (i.e., June to September). We also compared the lead fraction maps to other lead fraction maps generated from previously published data sets, resulting in similar spatiotemporal patterns.
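
    A minimal sketch of the spectral-mixture idea behind such an algorithm: an observed waveform is decomposed into non-negative fractions of reference ("endmember") waveforms and classified by the dominant fraction. The endmember matrix and bin count below are toy assumptions, not CryoSat-2 values.

```python
# Linear spectral unmixing of a waveform into non-negative endmember fractions;
# the 5-bin endmember matrix (lead, ice, ocean columns) is a toy assumption.
import numpy as np
from scipy.optimize import nnls

def unmix(waveform, endmembers):
    """endmembers: (n_bins, n_classes); returns normalized abundance fractions."""
    abundances, _ = nnls(endmembers, waveform)  # non-negative least squares
    total = abundances.sum()
    return abundances / total if total > 0 else abundances

E = np.array([[0.1, 0.3, 0.6],
              [0.9, 0.4, 0.5],
              [0.3, 0.8, 0.4],
              [0.1, 0.5, 0.3],
              [0.0, 0.2, 0.2]])            # columns: lead, sea ice, ocean
w = 0.7 * E[:, 0] + 0.3 * E[:, 1]          # a mostly "lead-like" waveform
fractions = unmix(w, E)
is_lead = fractions.argmax() == 0          # classify by the dominant fraction
print(fractions, is_lead)
```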

  15. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Gelsano, Flordeliza K.; Timing, Laurie D.

    2003-01-01

    This analysis of lead was performed on human hair samples from five different groups, namely scavengers from Payatas, Quezon City, tricycle drivers, car shop workers, paint factory workers, and students from the Polytechnic University of the Philippines. People from Nagcarlan, Laguna served as the ''base-line value'' or control group. The method applied was acid digestion using HNO3 and HClO4, after which the samples were analyzed by atomic absorption spectrophotometry. In terms of the lead found in hair, the scavengers from Payatas Q.C. showed the highest lead exposure among the samples tested. The lead concentrations were expressed in mg/L. (Authors)

  16. Discussion on safety analysis approach for sodium fast reactors

    International Nuclear Information System (INIS)

    Hong, Soon Joon; Choo, Yeon Joon; Suh, Nam Duk; Shin, Ahn Dong; Bae, Moo Hoon

    2012-01-01

    Utilization of nuclear energy is increasingly necessary, not only because of rising energy consumption but also because of controls on greenhouse gas emissions to limit global warming. To keep pace with such demands, advanced reactors are under development worldwide with the aims of greater economy and enhanced safety. Research and development on Generation IV (GEN IV) reactors is being further encouraged, in collaboration with other interested countries, through the Generation IV International Forum (GIF). The Sodium-cooled Fast Reactor (SFR) is a strong contender amongst the GEN IV reactor concepts. Korea also takes part in that program and plans to construct a demonstration SFR. SFR designs are also being developed as candidates for small modular reactors, for example PRISM (Power Reactor Innovative Small Module). Understanding of safety analysis approaches has also advanced, driven by increasingly comprehensive safety requirements. Reviewing the past development of licensing and safety bases for advanced reactors, such approaches have not been fully satisfactory, because the reference framework for licensing and safety analysis of advanced reactors has always been the one developed for water reactors; that framework is highly plant specific, and it is therefore not well matched to advanced reactors. Recently, as a result of considerable advances in probabilistic safety assessment (PSA), risk-informed approaches are increasingly applied together with some of the deterministic approaches used for water reactors. The technology neutral framework (TNF) can be regarded as the culmination of such risk-informed approaches, even though an intensive assessment of its applicability has not yet been accomplished. This study discusses viable safety analysis approaches for urgent application to the construction of a pool-type SFR. As discussed in

  17. Lead identification for the K-Ras protein: virtual screening and combinatorial fragment-based approaches

    Directory of Open Access Journals (Sweden)

    Pathan AAK

    2016-05-01

    Full Text Available Akbar Ali Khan Pathan,1,2,* Bhavana Panthi,3,* Zahid Khan,1 Purushotham Reddy Koppula,4–6 Mohammed Saud Alanazi,1 Sachchidanand,3 Narasimha Reddy Parine,1 Mukesh Chourasia3,* 1Genome Research Chair (GRC, Department of Biochemistry, College of Science, King Saud University, 2Integrated Gulf Biosystems, Riyadh, Kingdom of Saudi Arabia; 3Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, Hajipur, India; 4Department of Internal Medicine, School of Medicine, 5Harry S. Truman Memorial Veterans Affairs Hospital, 6Department of Radiology, School of Medicine, Columbia, MO, USA *These authors contributed equally to this work Objective: Kirsten rat sarcoma (K-Ras protein is a member of Ras family belonging to the small guanosine triphosphatases superfamily. The members of this family share a conserved structure and biochemical properties, acting as binary molecular switches. The guanosine triphosphate-bound active K-Ras interacts with a range of effectors, resulting in the stimulation of downstream signaling pathways regulating cell proliferation, differentiation, and apoptosis. Efforts to target K-Ras have been unsuccessful until now, placing it among high-value molecules against which developing a therapy would have an enormous impact. K-Ras transduces signals when it binds to guanosine triphosphate by directly binding to downstream effector proteins, but in case of guanosine diphosphate-bound conformation, these interactions get disrupted. Methods: In the present study, we targeted the nucleotide-binding site in the “on” and “off” state conformations of the K-Ras protein to find out suitable lead compounds. A structure-based virtual screening approach has been used to screen compounds from different databases, followed by a combinatorial fragment-based approach to design the apposite lead for the K-Ras protein. Results: Interestingly, the designed compounds exhibit a binding preference for the

  18. Analysis of stability and quench in HTS devices-New approaches

    International Nuclear Information System (INIS)

    Vysotsky, V.S.; Sytnikov, V.E.; Rakhmanov, A.L.; Ilyin, Y.

    2006-01-01

    R and D of HTS devices is in full swing: more magnets and devices of larger sizes are being developed. However, analysis of their stability and quench has remained old fashioned, based on determining a normal zone and analysing its appearance and propagation. Some peculiarities of HTS make this traditional approach, based on normal zone origination and propagation analysis, quite impractical and inconvenient for considering the stability and quench development of HTS devices. Novel approaches were developed that consider an HTS device as a cooled medium with non-linear parameters, with no mention of 'superconductivity' in the analysis. The approach has proved effective and convenient for analyzing stability and quench development in HTS devices. In this paper, the analysis of the difference between HTS and LTS quench, which depends on the index n and on a comparison of specific heats, is followed by short descriptions of the approach and by its consequences for HTS device design. A further development of the method is presented for the analysis of long HTS objects, where 'blow-up' regimes may occur. This is important for the design and analysis of HTS power cables operating under overload conditions.

  19. REDUCING LEAD TIME USING FUZZY LOGIC AT JOB SHOP

    Directory of Open Access Journals (Sweden)

    EMİN GÜNDOĞAR

    2000-06-01

    Full Text Available One problem encountered in job shop scheduling is that the minimum production size differs from machine to machine, which increases lead time. A new approach was developed to reduce lead time. In this approach, parts whose materials are in stock and whose orders arrive very frequently are assigned to machines so as to reduce lead time. Because there are many machines and orders, some problems can arise; in this paper, fuzzy logic is used to cope with them. The new approach was simulated on a job shop with 15 machines and 50 orders. Simulation results showed that the new approach reduced lead time by between 27.89% and 32.36%.

  20. Clarifying beliefs underlying hunter intentions to support a ban on lead shot

    Science.gov (United States)

    Schroeder, Susan A.; Fulton, David C.; Doncarlos, Kathy

    2016-01-01

    Shot from hunting adds toxic lead to environments worldwide. Existing lead shot regulations have been instituted with little understanding of hunter beliefs and attitudes. This study applied the Theory of Reasoned Action, using a multilevel, multivariate approach, to clarify how positive and negative beliefs relate to attitudes about a ban on lead shot. Structure coefficients and commonality analysis were employed to further examine relationships between beliefs and attitudes. Results suggest that while both positive and negative outcomes influence attitudes, positive outcomes were more influential for supporters and negative beliefs for opposers. Management may need to focus on the results from hunters who indicated that they would be unlikely to support a ban, as these hunters include those who may actively oppose additional efforts to regulate lead.

  1. Multiphased use of an X-MET 880 XRF to survey lead in soil at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Gianotto, D.F.; Anderson, I.R.

    1995-01-01

    An X-ray fluorescence spectrometer was used in a multiphased approach to analyze soil samples for lead contamination. The objectives of this investigation were to characterize the spatial distribution of lead contamination, identify two areas of surficial soil with elevated lead concentrations (hot-spots), and quantify subsurface soil contamination at the hot-spots to evaluate the vertical migration of lead. Phase I consisted of using non-site-specific standards to calibrate the XRF instrument to qualitatively and semi-quantitatively assess lead contamination (Type I XRF analysis). Phase III involved selecting soil samples for off-site SW-846 analysis and using the results to develop a calibration model based on site-specific calibration standards (SSCS). The XRF was used in Phase III to obtain quantitative results (Type II XRF analysis)
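
    The site-specific calibration step described above amounts to regressing confirmatory laboratory results against field XRF readings for the split samples and applying the fitted line to the remaining readings; the sketch below uses invented numbers, not the survey's data.

```python
# Site-specific calibration: regress confirmatory lab (SW-846) results on field
# XRF readings for the split samples, then convert the remaining field readings.
# All numbers are invented for illustration.
import numpy as np

xrf_field = np.array([120.0, 340.0, 560.0, 900.0, 1500.0])  # XRF Pb, mg/kg
lab_sw846 = np.array([100.0, 310.0, 600.0, 870.0, 1620.0])  # lab Pb, mg/kg

slope, intercept = np.polyfit(xrf_field, lab_sw846, deg=1)  # lab = slope*xrf + intercept

def calibrate(xrf_reading):
    """Convert a raw field XRF reading into a site-specific lead estimate."""
    return slope * xrf_reading + intercept

print(round(calibrate(750.0), 1))
```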

  2. Finite element analysis of vibration energy harvesting using lead-free piezoelectric materials: A comparative study

    Directory of Open Access Journals (Sweden)

    Anuruddh Kumar

    2014-06-01

    Full Text Available In this article, the performance of various piezoelectric materials is simulated for the unimorph cantilever-type piezoelectric energy harvester. The finite element method (FEM is used to model the piezolaminated unimorph cantilever structure. The first-order shear deformation theory (FSDT and linear piezoelectric theory are implemented in finite element simulations. The genetic algorithm (GA optimization approach is carried out to optimize the structural parameters of mechanical energy-based energy harvester for maximum power density and power output. The numerical simulation demonstrates the performance of lead-free piezoelectric materials in unimorph cantilever-based energy harvester. The lead-free piezoelectric material K0.5Na0.5NbO3-LiSbO3-CaTiO3 (2 wt.% has demonstrated maximum mean power and maximum mean power density for piezoelectric energy harvester in the ambient frequency range of 90–110 Hz. Overall, the lead-free piezoelectric materials of K0.5Na0.5NbO3-LiSbO3 (KNN-LS family have shown better performance than the conventional lead-based piezoelectric material lead zirconate titanate (PZT in the context of piezoelectric energy harvesting devices.

  3. Introduction to Real Analysis An Educational Approach

    CERN Document Server

    Bauldry, William C

    2011-01-01

    An accessible introduction to real analysis and its connection to elementary calculus Bridging the gap between the development and history of real analysis, Introduction to Real Analysis: An Educational Approach presents a comprehensive introduction to real analysis while also offering a survey of the field. With its balance of historical background, key calculus methods, and hands-on applications, this book provides readers with a solid foundation and fundamental understanding of real analysis. The book begins with an outline of basic calculus, including a close examination of problems illust

  4. Meta-Analysis for Sociology – A Measure-Driven Approach

    Science.gov (United States)

    Roelfs, David J.; Shor, Eran; Falzon, Louise; Davidson, Karina W.; Schwartz, Joseph E.

    2013-01-01

    Meta-analytic methods are becoming increasingly important in sociological research. In this article we present an approach for meta-analysis which is especially helpful for sociologists. Conventional approaches to meta-analysis often prioritize “concept-driven” literature searches. However, in disciplines with high theoretical diversity, such as sociology, this search approach might constrain the researcher’s ability to fully exploit the entire body of relevant work. We explicate a “measure-driven” approach, in which iterative searches and new computerized search techniques are used to increase the range of publications found (and thus the range of possible analyses) and to traverse time and disciplinary boundaries. We demonstrate this measure-driven search approach with two meta-analytic projects, examining the effects of various social variables on all-cause mortality. PMID:24163498
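
    Once a measure-driven search has assembled effect sizes and their variances, the pooling itself is routine; the sketch below shows DerSimonian-Laird random-effects pooling with made-up inputs rather than the projects' mortality estimates.

```python
# DerSimonian-Laird random-effects pooling of extracted effect sizes; the three
# effects and variances below are made-up inputs, not results from the projects.
import numpy as np

def dersimonian_laird(effects, variances):
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                            # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)         # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = 1.0 / (variances + tau2)                # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

print(dersimonian_laird([0.20, 0.35, 0.15], [0.010, 0.020, 0.015]))
```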

  5. Biosorption of lead phosphates by lead-tolerant bacteria as a mechanism for lead immobilization.

    Science.gov (United States)

    Rodríguez-Sánchez, Viridiana; Guzmán-Moreno, Jesús; Rodríguez-González, Vicente; Flores-de la Torre, Juan Armando; Ramírez-Santoyo, Rosa María; Vidales-Rodríguez, Luz Elena

    2017-08-01

    The study of metal-tolerant bacteria is important for the bioremediation of contaminated environments and the development of green technologies for material synthesis, due to their potential to transform toxic metal ions into less toxic compounds by mechanisms such as reduction, oxidation and/or sequestration. In this study, we report the isolation of seven lead-tolerant bacteria from a metal-contaminated site at Zacatecas, México. The bacteria were identified as members of the Staphylococcus and Bacillus genera by microscopic, biochemical and 16S rDNA analyses. The minimal inhibitory concentration of these isolates was established between 4.5 and 7.0 mM of Pb(NO3)2 in solid and 1.0-4.0 mM of Pb(NO3)2 in liquid media. A quantitative analysis of the lead associated with bacterial biomass in growing cultures revealed that the percentage of lead associated with the biomass was between 1 and 37% in the PbT isolates. A mechanism of complexation/biosorption of lead ions as inorganic phosphates (lead hydroxyapatite and pyromorphite) in the bacterial biomass was determined by Fourier transform infrared spectroscopy and X-ray diffraction analyses. Thus, the ability of the lead-tolerant isolates to transform lead ions into stable and highly insoluble lead minerals makes them potentially useful for the immobilization of lead in mining waste.

  6. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent, classical as well as robust, methods of estimation are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)

  7. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack of a clear threshold presents a challenge to the identification of an acceptable level of exposure. The benchmark dose (BMD) is defined as the dose that leads to a specific known loss. As an alternative to elusive thresholds, the BMD is being used increasingly by regulatory authorities. Using the pooled data ... yielding lower confidence limits (BMDLs) of about 0.1-1.0 μg/dL for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity.
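
    The benchmark-dose logic can be sketched as: fit a dose-response curve and solve for the dose producing the benchmark response (here, a one-point IQ loss). The log-linear form and coefficient below are illustrative assumptions only; the pooled analysis itself used covariate-adjusted models and confidence limits on the fitted curve to obtain the BMDLs.

```python
# Toy benchmark-dose calculation: assume a log-linear IQ loss curve and solve for
# the blood lead level giving a 1-point loss (BMR).  The functional form and the
# coefficient 2.7 are illustrative assumptions, not the pooled-analysis model.
import numpy as np
from scipy.optimize import brentq

def iq_loss(blood_lead_ugdl, beta=2.7):
    return beta * np.log(blood_lead_ugdl + 1.0)   # supra-linear dose-response

BMR = 1.0  # benchmark response: loss of one IQ point
bmd = brentq(lambda dose: iq_loss(dose) - BMR, 1e-6, 50.0)
print(f"BMD for a 1-point IQ loss ~ {bmd:.2f} ug/dL")
# A BMDL would be obtained from the lower confidence bound of the fitted curve.
```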

  8. Chemical behavior of residential lead in urban yards in the United States

    International Nuclear Information System (INIS)

    Elless, M.P.; Bray, C.A.; Blaylock, M.J.

    2007-01-01

    Long after federal regulations banned the use of lead-based paints and leaded gasoline, residential lead remains a persistent challenge. Soil lead is a significant contributor to this hazard and an improved understanding of physicochemical properties is likely to be useful for in situ abatement techniques such as phytoremediation and chemical stabilization. A laboratory characterization of high-lead soils collected from across the United States shows that the lead contaminants were concentrating in the silt and clay fractions, in the form of discrete particles of lead, as observed by scanning electron microscopy coupled with energy dispersive X-ray analysis. Soil lead varied widely in its solubility behavior as assessed by sequential and chelate extractions. Because site-specific factors (e.g., soil pH, texture, etc.) are believed to govern the solubility of the lead, understanding the variability in these characteristics at each site is necessary to optimize in situ remediation or abatement of these soils. - Site-specific solubility behavior of lead in soils has important implications for the selection of remediation approaches

  9. Lead user projects in practice

    DEFF Research Database (Denmark)

    Brem, Alexander; Gutstein, Adele

    2018-01-01

    Earlier research on the lead user method has focused on individual case studies and how the method was applied in a specific context. In this paper, we take a broader approach, analyzing a sample of 24 lead user projects, which included working with 188 lead users. These projects were analyzed... Moreover, crowdsourcing contests and netnography proved to be of significant value for the need, trend, and lead user identification phases. This paper concludes by discussing theoretical and practical implications, the limitations of this study, and recommendations for future studies.

  10. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    Energy Technology Data Exchange (ETDEWEB)

    Gelsano, Flordeliza K; Timing, Laurie D

    2003-02-17

    This analysis of lead was performed on human hair samples from five different groups, namely scavengers from Payatas, Quezon City, tricycle drivers, car shop workers, paint factory workers, and students from the Polytechnic University of the Philippines. People from Nagcarlan, Laguna served as the ''base-line value'' or control group. The method applied was acid digestion using HNO3 and HClO4, after which the samples were analyzed by atomic absorption spectrophotometry. In terms of the lead found in hair, the scavengers from Payatas Q.C. showed the highest lead exposure among the samples tested. The lead concentrations were expressed in mg/L. (Authors)

  11. Features of an emergency heat-conducting path in reactors about lead-bismuth and lead heat-carriers

    International Nuclear Information System (INIS)

    Beznosov, A.V.; Bokova, T.A.; Molodtsov, A.A.

    2006-01-01

    Reactor emergency heat removal systems should transfer heat from the surface of the reactor core fuel element claddings to the primary circuit and then to the environment. Three design approaches are suggested for emergency heat removal systems in lead-bismuth and lead cooled reactor circuits that take account of their peculiar features. Application of the discussed systems for emergency heat removal improves the safety of lead-bismuth and lead cooled reactor plants [ru

  12. Bulk diffusion and solubility of silver and nickel in lead, lead-silver and lead-nickel solid solutions

    International Nuclear Information System (INIS)

    Amenzou-Badrour, H.; Moya, G.; Bernardini, J.

    1988-01-01

    The results of a study of the solubility and bulk diffusion of ¹¹⁰Ag and ⁶³Ni in lead, lead-silver and lead-nickel solid solutions in the temperature range 220 to 88 °C are reported. Owing to the low solubility of silver and nickel in lead, Fick's solution corresponding to the boundary condition of a constant concentration of solute at the surface has been used. Depth profile concentration analysis suggests a fundamental difference between the diffusion mechanisms of silver and nickel. Since silver penetration profiles in pure lead give diffusion coefficients independent of the penetration depth and silver concentration, it is suggested that the slight decreases of silver diffusivity in lead-silver solid solutions have no significance. This implies that the interstitial silver atoms do not associate significantly with each other to form Ag-Ag dimers. In contrast, the different behavior of the ⁶³Ni depth profile concentration in pure lead and in saturated PbNi solid solutions agrees with a Ni-Ni interaction leading to the formation of less mobile dimers near the surface in pure lead

  13. Leading neutron production at HERA in the color dipole approach

    Directory of Open Access Journals (Sweden)

    Carvalho F.

    2016-01-01

    Full Text Available In this work we study leading neutron production in e + p → e + n + X collisions at high energies and calculate the Feynman xL distribution of these neutrons. The differential cross section is written in terms of the pion flux and of the photon-pion total cross section. We describe this process using the color dipole formalism and, assuming the validity of the additive quark model, we relate the dipole-pion cross section to the well-determined dipole-proton cross section. In this formalism we can estimate the impact of QCD dynamics at high energies as well as the contribution of gluon saturation effects to leading neutron production. With the parameters constrained by other phenomenological information, we are able to reproduce the basic features of the recently released H1 leading neutron spectra.

  14. Approaches to Enhance Sensemaking for Intelligence Analysis

    National Research Council Canada - National Science Library

    McBeth, Michael

    2002-01-01

    ..., and to apply persuasion skills to interact more productively with others. Each approach is explained from a sensemaking perspective and linked to Richard Heuer's Psychology of Intelligence Analysis...

  15. Spectral analysis of epicardial 60-lead electrograms in dogs with 4-week-old myocardial infarction.

    Science.gov (United States)

    Hosoya, Y; Ikeda, K; Komatsu, T; Yamaki, M; Kubota, I

    2001-01-01

    There have been few studies on the spectral analysis of multiple-lead epicardial electrograms in chronic myocardial infarction. Spectral analysis of multi-lead epicardial electrograms was performed in 6 sham-operated dogs (N group) and 8 dogs with 4-week-old myocardial infarction (MI group). Four weeks after ligation of the left anterior descending coronary artery, a fast Fourier transform was performed on 60-lead epicardial electrograms, and the inverse transform was then performed on 5 frequency ranges from 0 to 250 Hz. From QRS onset to QRS offset, the time integral of the unsigned value of the reconstructed waveform was calculated and displayed as AQRS maps. On the 0-25 Hz AQRS map, there was no significant difference between the 2 groups. In the frequency ranges of 25-250 Hz, the MI group had significantly smaller AQRS values than the N group, solely in the infarct zone. It was shown that high-frequency potentials (25-250 Hz) within the QRS complex were reduced in the infarct zone.
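
    The band-limited reconstruction can be outlined as: FFT each electrogram, zero everything outside one frequency band, inverse transform, and integrate the unsigned waveform over the QRS window to obtain one AQRS value per lead. The sampling rate and QRS window indices below are assumptions, not values from the study.

```python
# Band-limited AQRS computation for one lead: FFT, zero out-of-band components,
# inverse FFT, integrate |waveform| over the QRS window.  Sampling rate and QRS
# window indices are assumptions.
import numpy as np

FS = 1000.0  # Hz, assumed sampling rate

def aqrs(signal, band=(25.0, 250.0), qrs=(100, 220)):
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0  # keep one frequency band
    filtered = np.fft.irfft(spec, n=signal.size)
    onset, offset = qrs
    # Time integral of the unsigned reconstructed waveform over the QRS complex.
    return float(np.abs(filtered[onset:offset]).sum() / FS)

rng = np.random.default_rng(0)
toy_lead = rng.normal(size=600)   # 0.6 s of synthetic data for one lead
print(aqrs(toy_lead))             # repeating per lead gives a 60-lead AQRS map
```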

  16. In search of new lead compounds for trypanosomiasis drug design: A protein structure-based linked-fragment approach

    Science.gov (United States)

    Verlinde, Christophe L. M. J.; Rudenko, Gabrielle; Hol, Wim G. J.

    1992-04-01

    A modular method for pursuing structure-based inhibitor design in the framework of a design cycle is presented. The approach entails four stages: (1) a design pathway is defined in the three-dimensional structure of a target protein; (2) this pathway is divided into subregions; (3) complementary building blocks, also called fragments, are designed in each subregion; complementarity is defined in terms of shape, hydrophobicity, hydrogen bond properties and electrostatics; and (4) fragments from different subregions are linked into potential lead compounds. Stages (3) and (4) are qualitatively guided by force-field calculations. In addition, the designed fragments serve as entries for retrieving existing compounds from chemical databases. This linked-fragment approach has been applied in the design of potentially selective inhibitors of triosephosphate isomerase from Trypanosoma brucei, the causative agent of sleeping sickness.

  17. Specificities of reactor coolant pumps units with lead and lead-bismuth coolant

    International Nuclear Information System (INIS)

    Beznosov, A.V.; Anotonenkov, M.A.; Bokov, P.A.; Baranova, V.S.; Kustov, M.S.

    2009-01-01

    The results of an analysis of the impact of the specific properties of lead and lead-bismuth coolants on coolant flow features in the flow channels of the main and auxiliary circulating pumps are presented. It was demonstrated that cavitation cannot be initiated in the flow channels of vane pumps pumping lead and lead-bismuth coolants. Experimental results on the discontinuity of the heavy liquid metal coolant column are presented, and the conditions for gas cavitation initiation in the coolant flow are discussed. It was substantiated that traditional calculation methods for water and sodium coolant circulation pumps are not valid for lead and lead-bismuth coolant circulation pumps [ru

  18. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterize the examined defect in detail, and this is a design consideration for the range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the effect of divergent illumination and other geometrical factors. The difference between the measurement results can be attributed to these error factors. (Author)

  19. Gamma radiation shielding analysis of lead-flyash concretes

    International Nuclear Information System (INIS)

    Singh, Kanwaldeep; Singh, Sukhpal; Dhaliwal, A.S.; Singh, Gurmel

    2015-01-01

    Six samples of lead-flyash concrete were prepared with lead as an admixture, varying the flyash content (0%, 20%, 30%, 40%, 50% and 60% by weight) by replacing cement and keeping a constant w/c ratio. Different gamma radiation interaction parameters used in radiation shielding design were computed theoretically and measured experimentally at 662 keV, 1173 keV and 1332 keV gamma radiation energies using narrow-beam transmission geometry. The obtained results were compared with those of ordinary flyash concretes. The radiation exposure rate of the gamma radiation sources used was determined with and without the lead-flyash concretes. - Highlights: • Concrete samples with lead as an admixture were cast with flyash replacing 0%, 20%, 30%, 40%, 50% and 60% of the cement content (by weight). • Gamma radiation shielding parameters of the concretes were measured for different gamma ray sources. • The attenuation results of the lead-flyash concretes were compared with the results of ordinary flyash concretes
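
    For narrow-beam transmission geometry, the basic shielding parameter follows from I = I0·exp(−μx); the sketch below derives the linear attenuation coefficient and half-value layer from invented count rates, not the measurements reported here.

```python
# Linear attenuation coefficient and half-value layer from a narrow-beam
# transmission measurement, I = I0 * exp(-mu * x); the counts are invented.
import numpy as np

I0 = 12500.0   # counts without the concrete slab
I = 4300.0     # counts with the slab in the beam
x_cm = 5.0     # slab thickness, cm

mu = np.log(I0 / I) / x_cm        # linear attenuation coefficient, cm^-1
hvl = np.log(2.0) / mu            # half-value layer, cm
print(f"mu = {mu:.3f} cm^-1, HVL = {hvl:.2f} cm")
```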

  20. The core design of ALFRED, a demonstrator for the European lead-cooled reactors

    International Nuclear Information System (INIS)

    Grasso, G.; Petrovich, C.; Mattioli, D.; Artioli, C.; Sciora, P.; Gugiu, D.; Bandini, G.; Bubelis, E.; Mikityuk, K.

    2014-01-01

    Highlights: • The design for the lead fast reactor is conceived in a comprehensive approach. • Neutronic, thermal-hydraulic, and transient analyses show promising results. • The system is designed to withstand even design extension conditions accidents. • Activation products in lead, including polonium, are evaluated. - Abstract: The European Union has recently co-funded the LEADER (Lead-cooled European Advanced DEmonstration Reactor) project, in the frame of which the preliminary designs of an industrial size lead-cooled reactor (1500 MWth) and of its demonstrator reactor (300 MWth) were developed. The latter is called ALFRED (Advanced Lead-cooled Fast Reactor European Demonstrator) and its core, as designed and characterized in the project, is presented here. The core parameters have been fixed in a comprehensive approach taking into account the main technological constraints and goals of the system from the very beginning: the limiting temperature of the clad and of the fuel, the Pu enrichment, the achievement of a burn-up of 100 GWd/t, the respect of the integrity of the system even in design extension conditions (DEC). After the general core design has been fixed, it has been characterized from the neutronic point of view by two independent codes (MCNPX and ERANOS), whose results are compared. The power deposition and the reactivity coefficient calculations have been used respectively as input for the thermal-hydraulic analysis (TRACE, CFD and ANTEO codes) and for some preliminary transient calculations (RELAP, CATHARE and SIM-LFR codes). The results of the lead activation analysis are also presented (FISPACT code). Some issues of the core design are to be reviewed and improved, uncertainties are still to be evaluated, but the verifications performed so far confirm the promising safety features of the lead-cooled fast reactors

  1. The core design of ALFRED, a demonstrator for the European lead-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Grasso, G., E-mail: giacomo.grasso@enea.it [ENEA (Italian National Agency for New Technologies, Energy and Sustainable Economic Development), via Martiri di Monte Sole, 4, 40129 Bologna (Italy); Petrovich, C., E-mail: carlo.petrovich@enea.it [ENEA (Italian National Agency for New Technologies, Energy and Sustainable Economic Development), via Martiri di Monte Sole, 4, 40129 Bologna (Italy); Mattioli, D., E-mail: davide.mattioli@enea.it [ENEA (Italian National Agency for New Technologies, Energy and Sustainable Economic Development), via Martiri di Monte Sole, 4, 40129 Bologna (Italy); Artioli, C., E-mail: carlo.artioli@enea.it [ENEA (Italian National Agency for New Technologies, Energy and Sustainable Economic Development), via Martiri di Monte Sole, 4, 40129 Bologna (Italy); Sciora, P., E-mail: pierre.sciora@cea.fr [CEA (Alternative Energies and Atomic Energy Commission), DEN, DER, 13108 St Paul lez Durance (France); Gugiu, D., E-mail: daniela.gugiu@nuclear.ro [RATEN-ICN (Institute for Nuclear Research), Cod 115400 Mioveni, Str. Campului, 1, Jud. Arges (Romania); Bandini, G., E-mail: giacomino.bandini@enea.it [ENEA (Italian National Agency for New Technologies, Energy and Sustainable Economic Development), via Martiri di Monte Sole, 4, 40129 Bologna (Italy); Bubelis, E., E-mail: evaldas.bubelis@kit.edu [KIT (Karlsruhe Institute of Technology), Institute for Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Mikityuk, K., E-mail: konstantin.mikityuk@psi.ch [PSI (Paul Scherrer Institute), OHSA/D11, 5232 Villigen PSI (Switzerland)

    2014-10-15

    Highlights: • The design for the lead fast reactor is conceived in a comprehensive approach. • Neutronic, thermal-hydraulic, and transient analyses show promising results. • The system is designed to withstand even design extension conditions accidents. • Activation products in lead, including polonium, are evaluated. - Abstract: The European Union has recently co-funded the LEADER (Lead-cooled European Advanced DEmonstration Reactor) project, in the frame of which the preliminary designs of an industrial size lead-cooled reactor (1500 MWth) and of its demonstrator reactor (300 MWth) were developed. The latter is called ALFRED (Advanced Lead-cooled Fast Reactor European Demonstrator) and its core, as designed and characterized in the project, is presented here. The core parameters have been fixed in a comprehensive approach taking into account the main technological constraints and goals of the system from the very beginning: the limiting temperature of the clad and of the fuel, the Pu enrichment, the achievement of a burn-up of 100 GWd/t, the respect of the integrity of the system even in design extension conditions (DEC). After the general core design has been fixed, it has been characterized from the neutronic point of view by two independent codes (MCNPX and ERANOS), whose results are compared. The power deposition and the reactivity coefficient calculations have been used respectively as input for the thermal-hydraulic analysis (TRACE, CFD and ANTEO codes) and for some preliminary transient calculations (RELAP, CATHARE and SIM-LFR codes). The results of the lead activation analysis are also presented (FISPACT code). Some issues of the core design are to be reviewed and improved, uncertainties are still to be evaluated, but the verifications performed so far confirm the promising safety features of the lead-cooled fast reactors.

  2. The effect of non-aqueous solvents on spectrophotometric analysis of lead (II)

    International Nuclear Information System (INIS)

    Ramadan, A.A.; Bahbouh, M.; Kamuah, M.

    1992-01-01

    The effect of the following non-aqueous solvents: methanol, ethanol, propanol, iso-propanol, dimethylsulfoxide, dimethylformamide and acetonitrile, on the spectrophotometric analysis of lead (II) was studied. One absorption peak in the range 220-340 nm was observed. The values of the maximum wavelength (λmax) and the maximum molar absorptivity coefficient (εmax) vary with the above solvents and the concentration of HCl. The analytical curves, A = f(CPb²⁺), for the determination of lead (II) in the presence of 5 M HCl (in methanol) and 7 M HCl (in the other solvents) showed linear proportionality over the concentration range 2.5×10⁻⁵ - 2.0×10⁻⁴ M Pb²⁺. (author). 16 Refs., 4 figs., 2 Tabs.

  3. Multivariate calibration in Laser-Induced Breakdown Spectroscopy quantitative analysis: The dangers of a 'black box' approach and how to avoid them

    Science.gov (United States)

    Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.

    2018-06-01

    The introduction of multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement of the LIBS analytical performances, since a multivariate approach allows to exploit the redundancy of elemental information that are typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most diffused commercial and open source analytical programs; in most of the cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The reverse of the coin of the availability and ease of use of such packages is the (perceived) difficulty in assessing the reliability of the results obtained which often leads to the consideration of the multivariate algorithms as 'black boxes' whose inner mechanism is supposed to remain hidden to the user. In this paper, we will discuss the dangers of a 'black box' approach in LIBS multivariate analysis, and will discuss how to overcome them using the chemical-physical knowledge that is at the base of any LIBS quantitative analysis.
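
    One concrete way to avoid the 'black box' trap is to fit the multivariate calibration, cross-validate it, and then inspect which spectral channels actually drive it. The sketch below does this with a PLS regression on synthetic spectra; the specific algorithm, data generator and channel count are assumptions, not the paper's workflow.

```python
# Fit a PLS calibration on (synthetic) spectra, cross-validate it, and inspect
# which channels carry the weight -- the opposite of a "black box" workflow.
# The data generator and channel count are assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))                        # 40 spectra x 200 channels
y = 3.0 * X[:, 10] + rng.normal(scale=0.1, size=40)   # one "emission line" matters

pls = PLSRegression(n_components=3)
cv_r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
pls.fit(X, y)

# Channels with the largest |coefficients| should match physically meaningful
# emission lines; if not, the model is probably fitting noise or matrix effects.
top_channels = np.argsort(np.abs(pls.coef_.ravel()))[-5:]
print(round(cv_r2, 3), top_channels)
```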

  4. Kinetics of oil saponification by lead salts in ancient preparations of pharmaceutical lead plasters and painting lead mediums.

    Science.gov (United States)

    Cotte, M; Checroun, E; Susini, J; Dumas, P; Tchoreloff, P; Besnard, M; Walter, Ph

    2006-12-15

    Lead soaps can be found in archaeological cosmetics as well as in oil paintings, as product of interactions of lead salts with oil. In this context, a better understanding of the formation of lead soaps allows a follow-up of the historical evolution of preparation recipes and provides new insights into conservation conditions. First, ancient recipes of both pharmaceutical lead plasters and painting lead mediums, mixtures of oil and lead salts, were reconstructed. The ester saponification by lead salts is determined by the preparation parameters which were quantified by FT-IR spectrometry. In particular, ATR/FT-IR spectrometer was calibrated by the standard addition method to quantitatively follow the kinetics of this reaction. The influence of different parameters such as temperature, presence of water and choice of lead salts was assessed: the saponification is clearly accelerated by water and heating. This analysis provides chemical explanations to the historical evolution of cosmetic and painting preparation recipes.

  5. Electrical properties of a novel lead alkoxide precursor: Lead glycolate

    International Nuclear Information System (INIS)

    Tangboriboon, Nuchnapa; Pakdeewanishsukho, Kittikhun; Jamieson, Alexander; Sirivat, Anuvat; Wongkasemjit, Sujitra

    2006-01-01

    The reaction of lead acetate trihydrate Pb(CH3COO)2·3H2O and ethylene glycol, using triethylenetetramine (TETA) as a catalyst, provides in one step access to a polymer-like precursor of lead glycolate [-PbOCH2CH2O-]. On the basis of high-resolution mass spectroscopy, chemical composition analysis, FTIR, ¹³C solid-state NMR and TGA, the lead glycolate precursor can be identified as a trimer structure. The FTIR spectrum demonstrates the characteristics of lead glycolate; the peaks at 1086 and 1042 cm⁻¹ can be assigned to the C-O-Pb stretchings. The ¹³C solid-state NMR spectrum gives notably only one peak, at 68.639 ppm, belonging to the ethylene glycol ligand. The phase transformations of lead glycolate and lead acetate trihydrate to lead oxide, their microstructures, and their electrical properties were found to vary with increasing temperature. The lead glycolate precursor has superior electrical properties relative to those of lead acetate trihydrate, suggesting that the lead glycolate precursor can possibly be used as a starting material for producing electrical and semiconducting ceramics, viz. ferroelectric, anti-ferroelectric, and piezoelectric materials

  6. Preparing a Safety Analysis Report using the building block approach

    International Nuclear Information System (INIS)

    Herrington, C.C.

    1990-01-01

    The credibility of the applicant in a licensing proceeding is severely impacted by the quality of the license application, particularly the Safety Analysis Report. To ensure the highest possible credibility, the building block approach was devised to support the development of a quality Safety Analysis Report. The approach incorporates a comprehensive planning scheme that logically ties together all levels of the investigation and provides the direction necessary to prepare a superior Safety Analysis Report

  7. Frequency domain analysis and design of nonlinear systems based on Volterra series expansion a parametric characteristic approach

    CERN Document Server

    Jing, Xingjian

    2015-01-01

    This book is a systematic summary of some new advances in the area of nonlinear analysis and design in the frequency domain, focusing on the application oriented theory and methods based on the GFRF concept, which is mainly done by the author in the past 8 years. The main results are formulated uniformly with a parametric characteristic approach, which provides a convenient and novel insight into nonlinear influence on system output response in terms of characteristic parameters and thus facilitate nonlinear analysis and design in the frequency domain.  The book starts with a brief introduction to the background of nonlinear analysis in the frequency domain, followed by recursive algorithms for computation of GFRFs for different parametric models, and nonlinear output frequency properties. Thereafter the parametric characteristic analysis method is introduced, which leads to the new understanding and formulation of the GFRFs, and nonlinear characteristic output spectrum (nCOS) and the nCOS based analysis a...

  8. Three-stage method for interpretation of uranium-lead isotopic data

    International Nuclear Information System (INIS)

    Nejmark, L.A.; Ovchinnikova, G.V.; Levchenkov, O.A.

    1982-01-01

    A three-stage approach is considered for the interpretation of uranium-lead isotope ratios in natural systems whose development corresponds to three stages. In the framework of the three-stage model, two cases are discussed, differing in the character of the disturbance of the uranium-lead systems at the beginning of the third stage. The first case corresponds to uranium addition or lead subtraction, and the second to the addition of lead of unknown isotopic composition. The three-stage approach makes it possible, without correcting for the isotopic composition of the lead captured during crystallization, to calculate the beginning of the second and third stages of uranium-lead system development and to evaluate the parameters of the lead added to the system. Concrete examples of the interpretation of uranium-lead isotopic ratios in minerals and in whole-rock samples of both terrestrial and cosmic origin are considered. The possibilities and limitations of the three-stage approach are analyzed and directions for further development are outlined

  9. Optimal left ventricular lead position assessed with phase analysis on gated myocardial perfusion SPECT

    International Nuclear Information System (INIS)

    Boogers, Mark J.; Chen, Ji; Garcia, Ernest V.; Bommel, Rutger J. van; Borleffs, C.J.W.; Schalij, Martin J.; Wall, Ernst E. van der; Bax, Jeroen J.; Dibbets-Schneider, Petra; Hiel, Bernies van der; Younis, Imad Al

    2011-01-01

    The aim of the current study was to evaluate the relationship between the site of latest mechanical activation as assessed with gated myocardial perfusion SPECT (GMPS), left ventricular (LV) lead position and response to cardiac resynchronization therapy (CRT). The patient population consisted of consecutive patients with advanced heart failure in whom CRT was currently indicated. Before implantation, 2-D echocardiography and GMPS were performed. The echocardiography was performed to assess LV end-systolic volume (LVESV), LV end-diastolic volume (LVEDV) and LV ejection fraction (LVEF). The site of latest mechanical activation was assessed by phase analysis of GMPS studies and related to LV lead position on fluoroscopy. Echocardiography was repeated after 6 months of CRT. CRT response was defined as a decrease of ≥15% in LVESV. Enrolled in the study were 90 patients (72% men, 67±10 years) with advanced heart failure. In 52 patients (58%), the LV lead was positioned at the site of latest mechanical activation (concordant), and in 38 patients (42%) the LV lead was positioned outside the site of latest mechanical activation (discordant). CRT response was significantly more often documented in patients with a concordant LV lead position than in patients with a discordant LV lead position (79% vs. 26%, p<0.01). After 6 months, patients with a concordant LV lead position showed significant improvement in LVEF, LVESV and LVEDV (p<0.05), whereas patients with a discordant LV lead position showed no significant improvement in these variables. Patients with a concordant LV lead position showed significant improvement in LV volumes and LV systolic function, whereas patients with a discordant LV lead position showed no significant improvements. (orig.)

  10. Analysis of (n, 2n) multiplication in lead

    International Nuclear Information System (INIS)

    Segev, M.

    1984-01-01

    Lead is being considered as a possible amplifier of neutrons for fusion blankets. A simple one-group model of neutron multiplication in Pb is presented. Given the 14 MeV neutron cross section on Pb, the model predicts the multiplication. Given measured multiplications, the model enables the determination of the (n, 2n) and transport cross sections. Required for the model are P, the collision probability for source neutrons in the Pb body, and W, an average collision probability for non-virgin, non-degraded neutrons. In simple geometries, such as a source in the center of a spherical shell, P and an approximate W can be expressed analytically in terms of shell dimensions and the Pb transport cross section. The model was applied to Takahashi's measured multiplications in Pb shells in order to understand the apparent very high multiplicative power of Pb. The results of the analysis are not consistent with basic energy-balance and cross-section magnitude constraints in neutron interaction theory. (author)

  11. A model-based prognostic approach to predict interconnect failure using impedance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Dae Il; Yoon, Jeong Ah [Dept. of System Design and Control Engineering. Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2016-10-15

    The reliability of electronic assemblies is largely affected by the health of interconnects, such as solder joints, which provide mechanical, electrical and thermal connections between circuit components. During field lifecycle conditions, interconnects are often subjected to a DC open circuit, one of the most common interconnect failure modes, caused by cracking. An interconnect damaged by cracking is sometimes extremely hard to detect when it is part of a daisy-chain structure, neighboring other healthy interconnects that have not yet cracked. This cracked interconnect may seem to provide a good electrical contact due to the compressive load applied by the neighboring healthy interconnects, but it can cause occasional loss of electrical continuity under operational and environmental loading conditions in field applications. Thus, cracked interconnects can lead to the intermittent failure of electronic assemblies and eventually to permanent failure of the product or the system. This paper introduces a model-based prognostic approach to quantitatively detect and predict interconnect failure using impedance analysis and particle filtering. Impedance analysis was previously reported as a sensitive means of detecting incipient changes at the surface of interconnects, such as cracking, based on continuous monitoring of RF impedance. To predict the time to failure, particle filtering was used as a prognostic approach, with the Paris model addressing fatigue crack growth. To validate this approach, mechanical fatigue tests were conducted with continuous monitoring of RF impedance while the solder joints under test were degraded by fatigue cracking. The test results showed that the RF impedance consistently increased as the solder joints were degraded by the growth of cracks, and particle filtering predicted times to failure of the interconnects close to their actual times to failure, based on the early sensitivity of RF impedance.
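
    A heavily simplified version of the prognostic step: particles carrying Paris-law parameters are propagated, weighted against the measured impedance trend, and resampled. Every constant, the crack-to-impedance mapping and the measurement series below are fabricated for illustration, not values from the paper.

```python
# Simplified Paris-law particle filter: propagate crack-length particles, weight
# them against an impedance measurement, resample.  All constants, the
# crack-to-impedance mapping and the measurement series are fabricated.
import numpy as np

rng = np.random.default_rng(1)
N = 500
a = np.full(N, 0.1)                          # crack length particles, mm
C = 10.0 ** rng.normal(-10.5, 0.3, N)        # Paris coefficient (per particle)
m = rng.normal(3.0, 0.2, N)                  # Paris exponent (per particle)
dK = 8.0                                     # assumed stress-intensity range
CYCLES_PER_BLOCK = 1.0e6                     # assumed cycles between measurements

def impedance_shift(a_mm):
    return 0.05 * a_mm                       # hypothetical monotone mapping

for z in [0.006, 0.008, 0.011, 0.015]:       # fabricated RF-impedance trend
    a = a + C * dK ** m * CYCLES_PER_BLOCK   # Paris law: da/dN = C * (dK)^m
    w = np.exp(-0.5 * ((z - impedance_shift(a)) / 0.002) ** 2)
    w /= w.sum()
    keep = rng.choice(N, size=N, p=w)        # resample according to the weights
    a, C, m = a[keep], C[keep], m[keep]

print("median crack-length estimate:", round(float(np.median(a)), 3), "mm")
```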

  12. Delayed visual maturation is not from lead pollution

    International Nuclear Information System (INIS)

    Gulson, B.L.; Howarth, D.

    2000-01-01

    Delayed visual maturation, a term introduced by Illingworth (1961), was first described by Beauvieux in 1926. Patients with delayed visual maturation are found to be blind soon after birth and, if it is the only defect, the vision improves by the time the child is 6 months of age. Since first described, numerous cases have been reported but so far no cause has been found. The diagnosis of three infants with delayed visual maturation in a period of 19 months, in the lead mining community of Broken Hill, raised the possibility of a common cause. Two of the families lived in the most heavily polluted area of the town and were renovating their lead-contaminated houses while the wife was pregnant. The diagnosis of delayed visual maturation was made on the following characteristics: (1) poor visual attention at or soon after birth; (2) normal ocular examination; (3) visual response occurring by 5 to 6 months of age. Once the diagnosis was established, the parents were requested to retain any teeth the children lost for lead isotope and lead concentration analysis. Gulson and Wilson (1994) and Gulson (1996) demonstrated the use of the lead isotope technique, combined with the well-established histology of teeth, in evaluating in utero and early childhood lead exposure from slices of deciduous teeth. In this approach, analysis of the enamel provides evidence of in utero exposure. Analysis of dentine provides evidence of exposure during the early childhood years, when hand-to-mouth activity is usually an important contributor to lead body burden, and potentially up to the time of tooth loss. The lead isotope technique uses the four isotopes of lead. Three are the stable end products of radioactive decay of uranium and thorium: ²³⁸U to ²⁰⁶Pb, ²³⁵U to ²⁰⁷Pb, and ²³²Th to ²⁰⁸Pb. The abundance of the fourth, ²⁰⁴Pb, has been essentially constant since the Earth began and this isotope is commonly used as a reference isotope. Because three isotopes of lead are produced by

  13. The Analysis on Leading industries in Central Java Province

    Directory of Open Access Journals (Sweden)

    Setyani Irmawati

    2016-06-01

    Full Text Available The purpose of this research is to identify the types of industries that are leading industries in Central Java Province. The methods used are LQ (SLQ and DLQ) and Shift Share analysis. The results show that the leading industries in Central Java Province are the beverage industry, tobacco processing industry, textile industry, apparel industry, wood industry, printing industry, furniture industry and other processing industries. In the future, industrial development should not focus only on the leading industries but also on non-leading industries, so that the non-leading industries are not left behind.
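
    The static LQ part of the method is a simple ratio of shares. The sketch below computes SLQ values from hypothetical sectoral figures (not Central Java data) and flags sectors with LQ > 1 as candidate leading industries; the DLQ and Shift Share steps are omitted.

```python
def location_quotient(regional, national):
    """Static LQ per sector: (sector share in the region) / (sector share nationally)."""
    r_total, n_total = sum(regional.values()), sum(national.values())
    return {s: (regional[s] / r_total) / (national[s] / n_total) for s in regional}

# Hypothetical employment/output figures, not Central Java data
region = {"textile": 120, "beverage": 60, "machinery": 20}
nation = {"textile": 800, "beverage": 700, "machinery": 900}

lq = location_quotient(region, nation)
leading = [sector for sector, value in lq.items() if value > 1]
print(lq)        # {'textile': 1.8, 'beverage': ~1.03, 'machinery': ~0.27}
print(leading)   # ['textile', 'beverage']
```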

  14. Microscopic saw mark analysis: an empirical approach.

    Science.gov (United States)

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The present study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigating variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighed the variables by discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
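
    A minimal sketch of the statistical approach is given below, using scikit-learn's random forest on placeholder data (hypothetical features and random labels, so the printed numbers are meaningless); the out-of-bag error plays the role of the reported outcome error rate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Placeholder data: 58 marks, 3 hypothetical measurements each, 4 saw classes with random labels
X = rng.normal(size=(58, 3))
y = rng.integers(0, 4, size=58)

clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0).fit(X, y)
print("OOB outcome error rate:", 1.0 - clf.oob_score_)   # analogue of the reported error rate
print("Variable weights:", clf.feature_importances_)     # how the forest weighs each variable
print("Predicted class of a new mark:", clf.predict(rng.normal(size=(1, 3))))
```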

  15. Introduction to audio analysis a MATLAB approach

    CERN Document Server

    Giannakopoulos, Theodoros

    2014-01-01

    Introduction to Audio Analysis serves as a standalone introduction to audio analysis, providing theoretical background to many state-of-the-art techniques. It covers the essential theory necessary to develop audio engineering applications, but also uses programming techniques, notably MATLAB®, to take a more applied approach to the topic. Basic theory and reproducible experiments are combined to demonstrate theoretical concepts from a practical point of view and provide a solid foundation in the field of audio analysis. Audio feature extraction, audio classification, audio segmentation, au

  16. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  17. Extraction of lead from waste CRT funnel glass by generating lead sulfide - An approach for electronic waste management.

    Science.gov (United States)

    Hu, Biao; Hui, Wenlong

    2017-09-01

    Waste cathode ray tube (CRT) funnel glass is a key and difficult issue in waste electrical and electronic equipment (WEEE) disposal. In this paper, a novel and effective process for the detoxification and reutilization of waste CRT funnel glass was developed, based on generating a lead sulfide precipitate via a high-temperature melting process. The central step in this process was the generation of lead sulfide, which gathered at the bottom of the crucible and was then separated from the slag. Sodium carbonate was used as a flux and reaction agent, and sodium sulfide was used as a precipitating agent. The experimental results revealed that the lead sulfide recovery rate initially increased with increasing amounts of added sodium carbonate and sodium sulfide, temperature, and holding time, and then reached an equilibrium value. The maximum lead sulfide recovery rate was approximately 93%, at the optimum sodium carbonate level, sodium sulfide level, temperature, and holding time of 25%, 8%, 1200°C, and 2 h, respectively. The glass slag can be made into sodium and potassium silicate by hydrolysis in an environmentally friendly and economical process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Determination of lead in whole blood: Comparison of the LeadCare blood lead testing system with zeeman longitudinal electrothermal atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Pineau, A.; Viallefont, A.; Fauconneau, B.; Rafael, M.; Guillard, O.

    2002-01-01

    This study compares the efficiency of blood lead level analysis by graphite furnace atomic absorption spectrometry (GFAAS) and the portable LeadCare Blood lead testing system (LCS). Recoveries of two added lead concentrations of 22 and 42 μg/dL ranged from 102.4 to 105.5% for LCS and from 96.3 to 97.2% for GFAAS. Measurement of a certified sample (Certified Danish Whole Blood) at a blood lead concentration of 26.2 μg/dL gave within- and between-run coefficients of variation which were both approximately 8% by LCS and 2% by GFAAS. Comparison of the tested method (LCS) versus GFAAS from analysis of 76 samples of blood lead collected from workers in different industrial sectors showed imperfect overall correlation (r = 0.95). The LCS is quite suitable for screening purposes, but requires the use of non-frozen blood collected less than 24 h beforehand. Conservative threshold values should be applied when using the LCS for initial screening in the field. (orig.)
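
    The quantities reported above follow from simple definitions; the sketch below shows how a spike recovery and a within-run coefficient of variation would be computed, using made-up replicate values rather than the study's data.

```python
import statistics

def recovery_percent(measured_spiked, measured_unspiked, added):
    """Spike recovery: (measured increase after spiking) / (amount added) * 100."""
    return (measured_spiked - measured_unspiked) / added * 100.0

def cv_percent(replicates):
    """Coefficient of variation of replicate measurements, in percent."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

# Made-up example values (ug/dL), not the study's measurements
print(recovery_percent(measured_spiked=64.1, measured_unspiked=20.0, added=42.0))  # -> 105.0
print(cv_percent([24.1, 27.6, 28.3, 25.0, 26.8]))                                  # within-run CV
```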

  19. The measurement of the chemically mobile fraction of lead in soil using isotopic dilution analysis

    International Nuclear Information System (INIS)

    Kirchhoff, J.; Brand, J.; Schuettelkopf, H.

    1992-12-01

    The chemically available fraction of lead in eight soils, measured by isotopic dilution analysis using ²¹²Pb, ranged from 7 to 16% of the total lead content of the soil. The soluble fractions reached values of up to 63% of the total content in 1 M NH₄NO₃, 1 M MgCl₂ and 0.05 M DTPA solutions. Increasing the contact time between water and soil, raising the water-soil ratio from 1:1 to 5:1, and increasing the temperature of the soil-water suspension all raised the chemically available fraction in soil. Comparing various soil parameters with the mobile fraction of lead, only pH shows a significant correlation. The amphoteric character of lead causes a minimum of mobility at about pH 6; below this value lead is more mobile as Pb²⁺, while at pH values above 6 soluble hydroxy and humic acid complexes are formed. (orig.)
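
    The abstract does not give the calculation, but the generic isotope-dilution (E-value) estimate with a radiotracer such as ²¹²Pb follows from equating the specific activity measured in solution with that of the labile pool. The sketch below is that generic calculation with illustrative numbers, not the authors' protocol.

```python
def labile_lead_mg_per_kg(a_added_bq, c_solution_mg_per_l, a_solution_bq_per_l, soil_mass_kg):
    """Generic isotope-dilution (E-value) estimate of chemically available soil Pb.

    Assumes the added carrier-free 212Pb tracer fully equilibrates with the labile
    pool, so the specific activity measured in solution equals that of the pool:
        a_added / E_total = a_solution / c_solution   (the solution volume cancels)
    """
    specific_activity = a_solution_bq_per_l / c_solution_mg_per_l   # Bq per mg of labile Pb
    e_total_mg = a_added_bq / specific_activity                     # size of the labile pool, mg
    return e_total_mg / soil_mass_kg

# Illustrative numbers only
print(labile_lead_mg_per_kg(a_added_bq=5000.0, c_solution_mg_per_l=0.8,
                            a_solution_bq_per_l=400.0, soil_mass_kg=0.01))   # -> 1000 mg/kg
```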

  20. Identifying Lead Markets in the European Automotive Industry

    DEFF Research Database (Denmark)

    Cleff, Thomas; Grimpe, Christoph; Rammer, Christian

    2015-01-01

    This paper presents an indicator-based methodology to identify lead markets in the European automotive industry. The lead market approach tries to explain why certain countries are better positioned than others for developing and launching new products. While much research stresses the role of excellence in technology and interaction among users and producers, the lead market approach focuses on the role of demand characteristics. Based on the concept of innovation design, a lead market is defined as a country where customers prefer that design which subsequently becomes the globally dominant design for automobiles, and national markets differ considerably in their lead market potential. The German market is found to be most promising to serve as a lead market, while other European countries with a strong automotive tradition like France, Italy, the UK, and Sweden score lower. Our findings suggest that firms...

  1. Mercury-Free Analysis of Lead in Drinking Water by Anodic Stripping Square Wave Voltammetry

    Science.gov (United States)

    Wilburn, Jeremy P.; Brown, Kyle L.; Cliffel, David E.

    2007-01-01

    The analysis of drinking water for lead, which has well-known health effects, is presented as an instructive example for undergraduate chemistry students. It allows students to perform an experiment and to evaluate and monitor a risk factor and common hazard of everyday life.

  2. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  3. TOPICAL REVIEW: Progress in engineering high strain lead-free piezoelectric ceramics

    Science.gov (United States)

    Leontsev, Serhiy O.; Eitel, Richard E.

    2010-08-01

    Environmental concerns are strongly driving the need to replace the lead-based piezoelectric materials currently employed as multilayer actuators. The current review describes both compositional and structural engineering approaches to achieve enhanced piezoelectric properties in lead-free materials. The review of the compositional engineering approach focuses on compositional tuning of the properties and phase behavior in three promising families of lead-free perovskite ferroelectrics: the titanate, alkaline niobate and bismuth perovskites and their solid solutions. The 'structural engineering' approaches focus instead on optimization of microstructural features including grain size, grain orientation or texture, ferroelectric domain size and electrical bias field as potential paths to induce large piezoelectric properties in lead-free piezoceramics. It is suggested that a combination of both compositional and novel structural engineering approaches will be required in order to realize viable lead-free alternatives to current lead-based materials for piezoelectric actuator applications.

  4. Thermal-hydraulic analysis of an innovative decay heat removal system for lead-cooled fast reactors

    International Nuclear Information System (INIS)

    Giannetti, Fabio; Vitale Di Maio, Damiano; Naviglio, Antonio; Caruso, Gianfranco

    2016-01-01

    Highlights: • LOOP thermal-hydraulic transient analysis for lead-cooled fast reactors. • Passive decay heat removal system concept to avoid lead freezing. • Solution developed for the diversification of the decay heat removal functions. • RELAP5 vs. RELAP5-3D comparison for lead applications. - Abstract: Improvement of the safety requirements in GEN IV reactors calls for more reliable safety systems, among which the decay heat removal system (DHR) is one of the most important. Complying with the diversification criteria and based on purely passive and very reliable components, an additional DHR for the ALFRED reactor (Advanced Lead Fast Reactor European Demonstrator) has been proposed and its thermal-hydraulic performance analyzed. It consists of a coupling of two innovative subsystems: the radiative-based direct heat exchanger (DHX) and the pool heat exchanger (PHX). Preliminary thermal-hydraulic analyses, using the RELAP5 and RELAP5-3D© computer programs, have been carried out, showing that the whole system can safely operate in natural circulation for the long term. Sensitivity analyses for the emissivity of the DHX surfaces, the PHX water heat transfer coefficient (HTC) and the lead HTC have been carried out. In addition, the effects of the density variation uncertainty on the results have been analyzed and compared. This made it possible to assess the feasibility of the system and to evaluate the acceptable range of the studied parameters. A comparison of the results obtained with RELAP5 and RELAP5-3D© has been carried out, and an analysis of the differences between the two codes for lead is presented. The features of the innovative DHR make it possible to match the decay heat removal performance to the trend of the reactor decay heat power after shutdown, while minimizing the risk of lead freezing. This system, proposed for the diversification of the DHR in LFRs, could also be applicable to other pool-type liquid metal fast reactors.

  5. Thermal-hydraulic analysis of an innovative decay heat removal system for lead-cooled fast reactors

    Energy Technology Data Exchange (ETDEWEB)

    Giannetti, Fabio; Vitale Di Maio, Damiano; Naviglio, Antonio; Caruso, Gianfranco, E-mail: gianfranco.caruso@uniroma1.it

    2016-08-15

    Highlights: • LOOP thermal-hydraulic transient analysis for lead-cooled fast reactors. • Passive decay heat removal system concept to avoid lead freezing. • Solution developed for the diversification of the decay heat removal functions. • RELAP5 vs. RELAP5-3D comparison for lead applications. - Abstract: Improvement of the safety requirements in GEN IV reactors calls for more reliable safety systems, among which the decay heat removal system (DHR) is one of the most important. Complying with the diversification criteria and based on purely passive and very reliable components, an additional DHR for the ALFRED reactor (Advanced Lead Fast Reactor European Demonstrator) has been proposed and its thermal-hydraulic performance analyzed. It consists of a coupling of two innovative subsystems: the radiative-based direct heat exchanger (DHX) and the pool heat exchanger (PHX). Preliminary thermal-hydraulic analyses, using the RELAP5 and RELAP5-3D© computer programs, have been carried out, showing that the whole system can safely operate in natural circulation for the long term. Sensitivity analyses for the emissivity of the DHX surfaces, the PHX water heat transfer coefficient (HTC) and the lead HTC have been carried out. In addition, the effects of the density variation uncertainty on the results have been analyzed and compared. This made it possible to assess the feasibility of the system and to evaluate the acceptable range of the studied parameters. A comparison of the results obtained with RELAP5 and RELAP5-3D© has been carried out, and an analysis of the differences between the two codes for lead is presented. The features of the innovative DHR make it possible to match the decay heat removal performance to the trend of the reactor decay heat power after shutdown, while minimizing the risk of lead freezing. This system, proposed for the diversification of the DHR in LFRs, could also be applicable to other pool-type liquid metal fast reactors.

  6. Relational Perspectives on Leading

    DEFF Research Database (Denmark)

    Relational Perspectives on Leading discusses leadership from a relational and social constructionism perspective as practiced on an everyday basis between people. The book pursues a fast growing, practice-based approach - particularly within the Anglo-Saxon parts of the world - to organization...

  7. Physical and technical aspects of lead cooled fast reactors safety

    International Nuclear Information System (INIS)

    Orlov, V.V.; Smirnov, V.S.; Filin, A.I.

    2001-01-01

    The safety analysis of lead-cooled fast reactors has been performed for the well-developed concept of the BREST-OD-300 reactor. The most severe accidents have been considered. An ultimate design-basis accident has been defined as an event resulting from an external impact and involving a loss of leak-tightness of the lead circuit, loss of forced circulation of lead and loss of heat sink to the secondary circuit, failure of controls and of reactor scram with resultant insertion of the total reactivity margin, etc. It was assumed in the accident analysis that the only protective feature available for accident mitigation was reactivity feedback on the changes in the temperatures of the reactor core elements and in coolant flow rate, and in some cases also actuation of passive protections of threshold action in response to low flow rate and high coolant temperature at the core outlet. It should be noted that the majority of the analyzed accidents could be overcome even without initiation of the above protections. It has been demonstrated that a combination of the inherent properties of lead coolant, nitride fuel, and the physical and design features of fast reactors ensures the natural safety of BREST and makes it possible, on a deterministic basis, to avoid accidents associated with a significant release of radioactivity and requiring evacuation of people, for any credible initiating event or combination of events. (author)

  8. Overview of the use of ATHENA for thermal-hydraulic analysis of systems with lead-bismuth coolant

    International Nuclear Information System (INIS)

    Davis, C.B.; Shieh, A. S.

    2000-01-01

    The INEEL and MIT are investigating the suitability of lead-bismuth cooled fast reactors for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of the thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The ATHENA code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors. Two modifications were made to the code as a result of this review. Specifically, a correlation to represent heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work that is being performed with the code and plans for future analytical work.

  9. Overview of the Use of ATHENA for Thermal-Hydraulic Analysis of Systems with Lead-Bismuth Coolant

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Cliff Bybee; Shieh, Arthur Shan Luk

    2000-04-01

    The INEEL and MIT are investigating the suitability of lead-bismuth cooled fast reactors for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of the thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The ATHENA code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors. Two modifications were made to the code as a result of this review. Specifically, a correlation to represent heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work that is being performed with the code and plans for future analytical work.

  10. Leading with integrity: a qualitative research study.

    Science.gov (United States)

    Storr, Loma

    2004-01-01

    This research paper gives an account of a study into the relationship between leadership and integrity. It provides a critical analysis of the current literature on effective, successful and ethical leadership, particularly integrity. The purpose and aim of this paper is to build on current notions of leadership within the literature and to debate contemporary approaches, focussing specifically on practices within the UK National Health Service in the early 21st century. This leads to a discussion of the literature on ethical leadership theory, which includes public service values, ethical relationships and leading with integrity. A small study was undertaken consisting of 18 interviews with leaders and managers within a District General Hospital. Using the Repertory Grid technique and analysis, 15 themes emerged from the constructs elicited, which were compared to the literature on leadership and integrity and to other studies. As well as finding areas of overlap, a number of additional constructs were elicited, suggesting that effective leadership correlates with integrity and that the presence of integrity will improve organisational effectiveness. The study identified that perceptions of leadership character and behaviour are used to judge the effectiveness and integrity of a leader. However, the ethical implications and consequences of leaders' scope of power and influence, such as policy and strategy, are somewhat neglected and lacking in debate. The findings suggest that leaders are not judged according to the ethical nature of decision making or of leading and managing complex change; rather, the importance attached to integrity and ethical leadership correlated with higher levels of hierarchical status, and it is assumed by virtue of status and success that leaders lead with integrity. Finally, the findings of this study seem to suggest that nurse leadership capability is developing as a consequence of recent national investment.

  11. A Multimodal Data Analysis Approach for Targeted Drug Discovery Involving Topological Data Analysis (TDA).

    Science.gov (United States)

    Alagappan, Muthuraman; Jiang, Dadi; Denko, Nicholas; Koong, Albert C

    In silico drug discovery refers to a combination of computational techniques that augment our ability to discover drug compounds from compound libraries. Many such techniques exist, including virtual high-throughput screening (vHTS), high-throughput screening (HTS), and mechanisms for data storage and querying. However, these tools are presently often used independently of one another. In this chapter, we describe a new multimodal in silico technique for the hit identification and lead generation phases of traditional drug discovery. Our technique leverages the benefits of three independent methods (virtual high-throughput screening, high-throughput screening, and structural fingerprint analysis) by using a fourth technique called topological data analysis (TDA). We describe how a compound library can be independently tested with vHTS, HTS, and fingerprint analysis, and how the results can be transformed into a topological data analysis network to identify compounds from a diverse group of structural families. This process of using TDA or similar clustering methods to identify drug leads is advantageous because it provides a mechanism for choosing structurally diverse compounds while maintaining the unique advantages of already established techniques such as vHTS and HTS.
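
    As a simplified stand-in for the TDA step (the chapter uses a topological construction that is not reproduced here), the sketch below combines three per-compound scores into one feature vector and builds a k-nearest-neighbour similarity graph with networkx; connected components then group similar compounds, and picking one representative per component is one way to favour structural diversity among leads.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)

# Hypothetical per-compound scores: vHTS docking score, HTS activity, a fingerprint-derived value
scores = rng.normal(size=(50, 3))

def knn_graph(X, k=4):
    """Build an undirected graph linking each compound to its k nearest neighbours."""
    g = nx.Graph()
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
    for i in range(len(X)):
        for j in np.argsort(d[i])[1:k + 1]:                      # skip self (index 0)
            g.add_edge(i, int(j), weight=float(d[i, j]))
    return g

g = knn_graph(scores)
# Each connected component groups compounds with similar combined profiles
print([len(c) for c in nx.connected_components(g)])
```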

  12. Path analysis of risk factors leading to premature birth.

    Science.gov (United States)

    Fields, S J; Livshits, G; Sirotta, L; Merlob, P

    1996-01-01

    The present study tested whether various sociodemographic, anthropometric, behavioral, and medical/physiological factors act in a direct or indirect manner on the risk of prematurity, using path analysis on a sample of Israeli births. The path model shows that medical complications, primarily toxemia, chorioamnionitis, and a previous low birth weight delivery, act directly and significantly on the risk of prematurity, as do low maternal pregnancy weight gain and ethnicity. Other medical complications, including chronic hypertension, preeclampsia, and placental abruption, although significantly correlated with prematurity, act indirectly on prematurity through toxemia. The model further shows that the commonly accepted sociodemographic, anthropometric, and behavioral risk factors act by modifying the development of the medical complications that lead to prematurity, as opposed to having a direct effect on premature delivery. © 1996 Wiley-Liss, Inc.

  13. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network with the scope divided into three major categories: (1) integrity management, (2) operations, and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  14. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    Science.gov (United States)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disaster has been a subject of debate in disaster management, especially flood disaster. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management would help ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship between approach, decision maker, influence factor, result, and ethic in decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were studied based on the literature. Questionnaire surveys were administered among lead agencies on the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses were obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethic have a significant and direct effect on decision making during disaster. The results of this study show that decision making during disaster is an important element of disaster management and necessitates successful collaborative decision making. The measurement model is accepted for further analysis, known as Structural Equation Modeling (SEM), and can be assessed in future research.

  15. A geostatistical approach to the change-of-support problem and variable-support data fusion in spatial analysis

    Science.gov (United States)

    Wang, Jun; Wang, Yang; Zeng, Hui

    2016-01-01

    A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and prediction uncertainties, based on one or more measurements, while honoring those measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate these issues of spatial nonstationarity and to alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated and aggregated to multiple supports and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.
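
    A minimal linear-Gaussian version of the idea is sketched below: coarse-support observations are modelled as block averages (operator H) of a fine-support field with an assumed exponential prior covariance, and the kriging-style inverse update returns fine-support predictions plus their uncertainties while honoring the block data. The covariance model, aggregation operator and numbers are illustrative, not the paper's configuration.

```python
import numpy as np

n_fine, block = 20, 5                     # 20 fine cells, observed as 4 block averages

# Prior for the fine-support field: exponential covariance on a 1-D transect (assumed model)
x = np.arange(n_fine)
prior_cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 3.0)
prior_mean = np.zeros(n_fine)

# H aggregates fine cells into coarse-support (block-average) observations
H = np.kron(np.eye(n_fine // block), np.full((1, block), 1.0 / block))
R = 0.01 * np.eye(n_fine // block)        # observation-error covariance
y = np.array([0.4, 1.1, -0.3, 0.7])       # made-up block-average measurements

# Kriging-style (best linear unbiased) update: fine-support predictions honour the block data
S = H @ prior_cov @ H.T + R
K = prior_cov @ H.T @ np.linalg.inv(S)
post_mean = prior_mean + K @ (y - H @ prior_mean)
post_cov = prior_cov - K @ H @ prior_cov

print(post_mean)                          # fine-support predictions
print(np.sqrt(np.diag(post_cov)))         # per-cell prediction uncertainty
```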

  16. On a mass independent approach leading to planetary orbit discretization

    International Nuclear Information System (INIS)

    Oliveira Neto, Marcal de

    2007-01-01

    The present article discusses a possible fractal approach to understanding orbit configurations around a central force field in well-known systems of our infinitely small and infinitely large universes, based on quantum atomic models. This approach is supported by recent important theoretical investigations reported in the literature. An application presents a study involving the three-star system HD 188753 Cygni, in an approach similar to that employed in molecular quantum mechanics investigations

  17. Foreign Policy: Approaches, Levels Of Analysis, Dimensions

    OpenAIRE

    Nina Šoljan

    2012-01-01

    This paper provides an overview of key issues related to foreign policy and foreign policy theories in the wider context of political science. Discussing the origins and development of foreign policy analysis (FPA), as well as scholarly work produced over time, it argues that today FPA encompasses a variety of theoretical approaches, models and tools. These share the understanding that foreign policy outputs cannot be fully explained if analysis is confined to the systemic level. Furthermore,...

  18. Cardiac implantable electronic device lead extraction using the lead-locking device system: keeping it simple, safe, and inexpensive with mechanical tools and local anesthesia.

    Science.gov (United States)

    Manolis, Antonis S; Georgiopoulos, Georgios; Metaxa, Sofia; Koulouris, Spyridon; Tsiachris, Dimitris

    2017-10-01

    We have previously reported our successful approach to percutaneous cardiac implantable electronic device (CIED) lead extraction using inexpensive tools, which we have continued over the years. Herein we report the results of the systematic use of a unique stylet, the lead-locking device (LLD), which securely locks the entire lead lumen, aided by non-powered telescoping sheaths, in 54 patients to extract 98 CIED leads. This prospective observational clinical study included 38 men and 16 women aged 68.9±13.1 years undergoing lead extraction for device infection (n=46), lead malfunction (n=5), or prior to defibrillator implantation (n=3). Leads had been in place for 6.7±4.3 years. Infections were most commonly due to Staphylococcus species (n=40). There were 78 pacing leads (31 ventricular, 37 atrial, 4 VDD, and 6 coronary sinus leads) and 20 defibrillating leads. Using simple traction (6 leads) and the LLD stylets (92 leads) aided by telescoping sheaths (15 patients), 96 (98%) leads in 52 (96.3%) patients were successfully removed, with all but one lead removed via a subclavian approach; in 1 patient, the right femoral approach was also required. In 2 patients, distal fragments from one ventricular pacing lead and one defibrillating lead could not be removed. Finally, lead removal was completely (52/54, 96.3%) or partially (2/54, 3.7%) successful in 54 patients, for 96 of 98 leads (98%), without major complications. Percutaneous lead extraction can be successful with mechanical tools using the LLD locking stylet aided by non-powered telescoping sheaths in a simplified, safe, and inexpensive procedure under local anesthesia.

  19. A Frame Analysis Approach To Cross-Cultural Television Advertising

    OpenAIRE

    Noel M. Murray

    2011-01-01

    The role of visuals in advertising research is examined. An argument is developed to support a theory of frame analysis for cross-cultural television advertising. Frame analysis is explained and commercials from Japan and the Dominican Republic are used to illustrate application of the theory. It is hoped that frame analysis will supplement content analysis as a methodological approach to cross-cultural television advertising.

  20. Lead exposures from varnished floor refinishing.

    Science.gov (United States)

    Schirmer, Joseph; Havlena, Jeff; Jacobs, David E; Dixon, Sherry; Ikens, Robert

    2012-01-01

    We evaluated the presence of lead in varnish and the factors predicting lead exposure from floor refinishing, along with inexpensive dust suppression control methods. Lead in varnish, settled dust, and air were measured using XRF, laboratory analysis of scrape and wipe samples, and National Institute for Occupational Safety and Health (NIOSH) Method 7300, respectively, during refinishing (n = 35 homes). Data were analyzed using step-wise logistic regression. Compared with federal standards, no lead-in-varnish samples exceeded 1.0 mg/cm², but 52% exceeded 5000 ppm, and 70% of settled dust samples after refinishing exceeded 40 μg/ft². Refinishing pre-1930 dwellings or stairs predicted high lead dust on floors. Laboratory analysis of lead in varnish was significantly correlated with airborne lead (r = 0.23, p = 0.014). Adding dust collection bags to drum sanders and HEPA vacuums to edgers and buffers reduced mean floor lead dust by 8293 μg Pb/ft² and kept airborne lead exposures below 50 μg/m³. Refinishing varnished surfaces in older housing produces high but controllable lead exposures.

  1. Leading order relativistic chiral nucleon-nucleon interaction

    Science.gov (United States)

    Ren, Xiu-Lei; Li, Kai-Wen; Geng, Li-Sheng; Long, Bingwei; Ring, Peter; Meng, Jie

    2018-01-01

    Motivated by the successes of relativistic theories in studies of atomic/molecular and nuclear systems and the need for a relativistic chiral force in relativistic nuclear structure studies, we explore a new relativistic scheme to construct the nucleon-nucleon interaction in the framework of covariant chiral effective field theory. The chiral interaction is formulated up to leading order with covariant power counting and a Lorentz invariant chiral Lagrangian. We find that the relativistic scheme induces all six spin operators needed to describe the nuclear force. A detailed investigation of the partial wave potentials shows a better description of the ¹S₀ and ³P₀ phase shifts than the leading order Weinberg approach, and similar to that of the next-to-leading order Weinberg approach. For the other partial waves with angular momenta J ≥ 1, the relativistic results are almost the same as their leading order non-relativistic counterparts.

  2. Impact of right-ventricular apical pacing on the optimal left-ventricular lead positions measured by phase analysis of SPECT myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Hung, Guang-Uei; Huang, Jin-Long; Lin, Wan-Yu; Tsai, Shih-Chung; Wang, Kuo-Yang; Chen, Shih-Ann; Lloyd, Michael S.; Chen, Ji

    2014-01-01

    The use of SPECT phase analysis to optimize left-ventricular (LV) lead positions for cardiac resynchronization therapy (CRT) has previously been performed at baseline, but CRT works as simultaneous right-ventricular (RV) and LV pacing. The aim of this study was to assess the impact of RV apical (RVA) pacing on the optimal LV lead positions measured by SPECT phase analysis. This study prospectively enrolled 46 patients. Two SPECT myocardial perfusion scans were acquired, one under sinus rhythm with complete left bundle branch block and one under RVA pacing, following a single injection of 99mTc-sestamibi. LV dyssynchrony parameters and optimal LV lead positions were measured by the phase analysis technique and then compared between the two scans. The LV dyssynchrony parameters were significantly larger with RVA pacing than with sinus rhythm (p < 0.01). In 39 of the 46 patients, the optimal LV lead positions were the same between RVA pacing and sinus rhythm (kappa = 0.861). In 6 of the remaining 7 patients, the optimal LV lead positions were along the same radial direction, but RVA pacing shifted the optimal LV lead positions toward the base. The optimal LV lead positions measured by SPECT phase analysis were consistent regardless of whether the SPECT images were acquired under sinus rhythm or RVA pacing. In some patients, RVA pacing shifted the optimal LV lead positions toward the base. This study supports the use of baseline SPECT myocardial perfusion imaging to optimize LV lead positions to increase CRT efficacy. (orig.)

  3. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  4. Can the combined use of an ensemble based modelling approach and the analysis of measured meteorological trends lead to increased confidence in climate change impact assessments?

    Science.gov (United States)

    Gädeke, Anne; Koch, Hagen; Pohle, Ina; Grünewald, Uwe

    2014-05-01

    In river catchments that are heavily impacted by human activity, such as the Lusatian catchments of the Spree and Schwarze Elster (Germany), a robust assessment of the possible impacts of climate change on regional water resources is highly relevant for the development and implementation of suitable climate change adaptation strategies. Large uncertainties inherent in future climate projections may, however, reduce the willingness of regional stakeholders to develop and implement such strategies. This study provides an overview of different possibilities for considering uncertainties in climate change impact assessments by means of (1) an ensemble-based modelling approach and (2) the incorporation of measured and simulated meteorological trends. The ensemble-based modelling approach consists of the meteorological output of four climate downscaling approaches (DAs) (two dynamical and two statistical DAs, 113 realisations in total), which drive different model configurations of two conceptually different hydrological models (HBV-light and WaSiM-ETH). Three near-natural subcatchments of the Spree and Schwarze Elster river catchments serve as the study area. The objective of incorporating measured meteorological trends into the analysis was twofold: measured trends can (i) serve as a means to validate the results of the DAs and (ii) be regarded as a harbinger of the future direction of change. Moreover, regional stakeholders seem to have more trust in measurements than in modelling results. In order to evaluate the nature of the trends, both gradual changes (Mann-Kendall test) and step changes (Pettitt test) are considered, as well as temporal and spatial correlations in the data. The results of the ensemble-based modelling chain show that, depending on the type (dynamical or statistical) of DA used, opposing trends in precipitation, actual evapotranspiration and discharge are simulated in the scenario period (2031-2060). While the statistical DAs

  5. Data analysis with the DIANA meta-scheduling approach

    International Nuclear Information System (INIS)

    Anjum, A; McClatchey, R; Willers, I

    2008-01-01

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing the fault tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data intensive bulk scheduling, is network aware and follows a policy-centric meta-scheduling model. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present 'quality of service' related statistics for physics analysis through the application of a policy-centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads and is dynamic and adapts to the volatile nature of the resources

  6. A Chemoinformatics Approach to the Discovery of Lead-Like Molecules from Marine and Microbial Sources En Route to Antitumor and Antibiotic Drugs

    Science.gov (United States)

    Pereira, Florbela; Latino, Diogo A. R. S.; Gaudêncio, Susana P.

    2014-01-01

    The comprehensive information on small molecules and their biological activities in the PubChem database allows chemoinformatics researchers to access and make use of large-scale biological activity data to improve the precision of drug profiling. A Quantitative Structure-Activity Relationship (QSAR) classification approach was used for the prediction of active/inactive compounds with respect to overall biological activity, antitumor activity and antibiotic activity, using a data set of 1804 compounds from PubChem. Using the best classification models for antibiotic and antitumor activities, a data set of marine and microbial natural products from the AntiMarin database was screened: 57 and 16 new lead compounds were proposed for antibiotic and antitumor drug design, respectively. All compounds proposed by our approach are classified as non-antibiotic and non-antitumor compounds in the AntiMarin database. Recently, several of the lead-like compounds proposed by us were reported as being active in the literature. PMID:24473174
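
    A toy version of such a QSAR classification screen is sketched below using RDKit Morgan fingerprints and a scikit-learn classifier; the descriptors, classifier, SMILES strings and activity labels are placeholders and do not reproduce the models or data of the paper.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.linear_model import LogisticRegression

def morgan_fp(smiles, n_bits=1024):
    """Circular (Morgan) fingerprint as a 0/1 array; returns None for unparsable SMILES."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return None
    return np.array(AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits))

# Tiny toy training set with hypothetical activity labels (1 = active, 0 = inactive)
train_smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
labels = [0, 1, 1, 0]
X = np.vstack([morgan_fp(s) for s in train_smiles])

model = LogisticRegression(max_iter=1000).fit(X, labels)
query = np.vstack([morgan_fp("Cc1ccccc1O")])            # screen a new (hypothetical) compound
print(model.predict(query), model.predict_proba(query)[:, 1])
```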

  7. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such a failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, as it may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility

  8. Developing a New Approach for Arabic Morphological Analysis and Generation

    OpenAIRE

    Gridach, Mourad; Chenfour, Noureddine

    2011-01-01

    Arabic morphological analysis is one of the essential stages in Arabic Natural Language Processing. In this paper we present an approach for Arabic morphological analysis. This approach is based on Arabic morphological automaton (AMAUT). The proposed technique uses a morphological database realized using XMODEL language. Arabic morphology represents a special type of morphological systems because it is based on the concept of scheme to represent Arabic words. We use this concept to develop th...

  9. Lead in Hair and in Red Wine by Potentiometric Stripping Analysis: The University Students' Design.

    Science.gov (United States)

    Josephsen, Jens

    1985-01-01

    A new program for training upper secondary school chemistry teachers (SE 537 693) depends heavily on student project work. A project in which lead in hair and in red wine was examined by potentiometric stripping analysis is described and evaluated. (JN)

  10. A factorization approach to next-to-leading-power threshold logarithms

    Energy Technology Data Exchange (ETDEWEB)

    Bonocore, D. [Nikhef,Science Park 105, NL-1098 XG Amsterdam (Netherlands); Laenen, E. [Nikhef,Science Park 105, NL-1098 XG Amsterdam (Netherlands); ITFA, University of Amsterdam,Science Park 904, Amsterdam (Netherlands); ITF, Utrecht University,Leuvenlaan 4, Utrecht (Netherlands); Magnea, L. [Dipartimento di Fisica, Università di Torino and INFN, Sezione di Torino,Via P. Giuria 1, I-10125, Torino (Italy); Melville, S. [School of Physics and Astronomy, University of Glasgow,Glasgow, G12 8QQ (United Kingdom); Vernazza, L. [Higgs Centre for Theoretical Physics, School of Physics and Astronomy, University of Edinburgh,Edinburgh, EH9 3JZ, Scotland (United Kingdom); White, C.D. [School of Physics and Astronomy, University of Glasgow,Glasgow, G12 8QQ (United Kingdom)

    2015-06-03

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in the threshold variable. In this paper, we consider the extension of this factorization to include effects suppressed by a single power of the threshold variable. Building upon the Low-Burnett-Kroll-Del Duca (LBKD) theorem, we propose a decomposition of radiative amplitudes into universal building blocks, which contain all effects ultimately responsible for next-to-leading-power (NLP) threshold logarithms in hadronic cross sections for electroweak annihilation processes. In particular, we provide a NLO evaluation of the radiative jet function, responsible for the interference of next-to-soft and collinear effects in these cross sections. As a test, using our expression for the amplitude, we reproduce all abelian-like NLP threshold logarithms in the NNLO Drell-Yan cross section, including the interplay of real and virtual emissions. Our results are a significant step towards developing a generally applicable resummation formalism for NLP threshold effects, and illustrate the breakdown of next-to-soft theorems for gauge theory amplitudes at loop level.
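
    Schematically, in a generic notation rather than the paper's, the threshold expansion of a partonic cross section near z → 1 separates the leading-power plus-distributions (resummed by standard methods) from the next-to-leading-power logarithms addressed here:

```latex
\frac{d\hat{\sigma}}{dz} \;=\; \sum_{n}\alpha_s^{\,n}\Big[\, c_{n}\,\delta(1-z)
  \;+\; \sum_{m=0}^{2n-1} c_{nm}\left(\frac{\ln^{m}(1-z)}{1-z}\right)_{+}
  \;+\; \sum_{m=0}^{2n-1} d_{nm}\,\ln^{m}(1-z) \,\Big]
  \;+\; \mathcal{O}\big((1-z)\big),
```

    where the first two sets of terms are the leading-power contributions and the d_{nm} terms are the next-to-leading-power threshold logarithms.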

  11. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid in anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  12. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid in anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  13. Drag Coefficient of Water Droplets Approaching the Leading Edge of an Airfoil

    Science.gov (United States)

    Vargas, Mario; Sor, Suthyvann; Magarino, Adelaida Garcia

    2013-01-01

    This work presents results of an experimental study on droplet deformation and breakup near the leading edge of an airfoil. The experiment was conducted in the rotating rig test cell at the Instituto Nacional de Tecnica Aeroespacial (INTA) in Madrid, Spain. An airfoil model was placed at the end of the rotating arm and a monosize droplet generator produced droplets that fell from above, perpendicular to the path of the airfoil. The interaction between the droplets and the airfoil was captured with high speed imaging and allowed observation of droplet deformation and breakup as the droplet approached the airfoil near the stagnation line. Image processing software was used to measure the position of the droplet centroid, equivalent diameter, perimeter, area, and the major and minor axes of an ellipse superimposed over the deforming droplet. The horizontal and vertical displacement of each droplet against time was also measured, and the velocity, acceleration, Weber number, Bond number, Reynolds number, and the drag coefficients were calculated along the path of the droplet to the beginning of breakup. Results are presented and discussed for drag coefficients of droplets with diameters in the range of 300 to 1800 micrometers, and airfoil velocities of 50, 70 and 90 meters/second. The effect of droplet oscillation on the drag coefficient is discussed.
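
    The dimensionless groups mentioned above follow from standard definitions; the sketch below evaluates them for a droplet in an airstream and backs out a drag coefficient from a measured deceleration, using assumed sea-level air and water properties (the paper's conventions for the characteristic density and velocity may differ).

```python
import math

# Approximate sea-level air and water properties (assumed, not the paper's values)
RHO_AIR, MU_AIR = 1.2, 1.8e-5      # kg/m^3, Pa*s
RHO_W, SIGMA_W = 1000.0, 0.072     # kg/m^3, N/m
G = 9.81                           # m/s^2

def dimensionless_numbers(d, v_rel):
    """Weber, Bond and Reynolds numbers for a droplet of diameter d (m) in a stream v_rel (m/s)."""
    we = RHO_AIR * v_rel**2 * d / SIGMA_W
    bo = RHO_W * G * d**2 / SIGMA_W
    re = RHO_AIR * v_rel * d / MU_AIR
    return we, bo, re

def drag_coefficient(d, v_rel, accel):
    """Cd from the droplet's measured acceleration: m*a = 0.5*rho_air*v^2*Cd*(pi*d^2/4)."""
    mass = RHO_W * math.pi * d**3 / 6.0
    area = math.pi * d**2 / 4.0
    return mass * accel / (0.5 * RHO_AIR * v_rel**2 * area)

print(dimensionless_numbers(d=1.0e-3, v_rel=70.0))          # We ~ 82, Bo ~ 0.14, Re ~ 4700
print(drag_coefficient(d=1.0e-3, v_rel=70.0, accel=2000.0)) # Cd ~ 0.45 for these made-up inputs
```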

  14. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relating to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  15. Qualitative Analysis of Foundry Industry: A DMAIC Approach

    OpenAIRE

    Sehgala, Sumit; Kaushisha, Deepak; Rathia, Vijayesh

    2015-01-01

    The DMAIC approach is a business strategy used to improve business profitability and the efficiency of all operations to meet customer needs and expectations. In the present research work, an attempt has been made to apply the DMAIC (Define, Measure, Analyze, Improve, Control) approach. The emphasis was on reducing the defects (blow holes, misrun, slag inclusion, rough surface) occurring in sand castings by controlling the parameters with the DMAIC technique. The results achieved show ...

  16. Lead: Aspects of its ecology and environmental toxicity. [physiological effects of lead compound contamination of environment

    Science.gov (United States)

    Siegel, S. M.

    1973-01-01

    An analysis of lead toxicity in the Hawaiian environment was conducted. It was determined that lead enters the environment as an industrial contaminant resulting from the combustion of leaded gasoline. The amount of lead absorbed by the plants in various parts of the Hawaiian Islands is reported. The disposition of lead in the sediments of canals and yacht basins was investigated. The methods for conducting the surveys of lead content are described. Possible consequences of continued environmental pollution by burning leaded gasoline are discussed.

  17. Comparison of approaches for mobile document image analysis using server supported smartphones

    Science.gov (United States)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcome these limitations is performing resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource consuming process is the Optical Character Recognition (OCR) process, which is used to extract text in mobile phone captured images. In this study, our goal is to compare the in-phone and the remote server processing approaches for mobile document image analysis in order to explore their trade-offs. For the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote server approach overall outperforms the in-phone approach in terms of the selected speed and correct recognition metrics, if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote server approach performs better than the in-phone approach in terms of speed and acceptable correct recognition metrics.
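
    The trade-off described here can be captured with a back-of-the-envelope timing model; the sketch below is illustrative only (the function names, timings and bandwidth figures are assumptions, not measurements from the study).

      # Illustrative comparison of the two processing pipelines described above.

      def in_phone_time(t_ocr_phone, t_other):
          # Everything, including OCR, runs on the handset.
          return t_ocr_phone + t_other

      def remote_server_time(file_size_mb, bandwidth_mbps, t_ocr_server, t_other):
          # Network delay grows with file size; compression/downscaling reduces it.
          t_network = 8.0 * file_size_mb / bandwidth_mbps   # seconds
          return t_network + t_ocr_server + t_other

      if __name__ == "__main__":
          raw = remote_server_time(file_size_mb=2.5, bandwidth_mbps=5.0,
                                   t_ocr_server=1.2, t_other=0.5)
          compressed = remote_server_time(file_size_mb=0.4, bandwidth_mbps=5.0,
                                          t_ocr_server=1.2, t_other=0.5)
          local = in_phone_time(t_ocr_phone=6.0, t_other=0.5)
          print(f"in-phone: {local:.1f}s  remote(raw): {raw:.1f}s  "
                f"remote(compressed): {compressed:.1f}s")

    With these placeholder numbers the remote pipeline only wins once the image is compressed, which mirrors the paper's conclusion that the server-side gain must compensate for the added network delay.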

  18. Melatonin reduces lead levels in blood, brain and bone and increases lead excretion in rats subjected to subacute lead treatment.

    Science.gov (United States)

    Hernández-Plata, Everardo; Quiroz-Compeán, Fátima; Ramírez-Garcia, Gonzalo; Barrientos, Eunice Yáñez; Rodríguez-Morales, Nadia M; Flores, Alberto; Wrobel, Katarzina; Wrobel, Kazimierz; Méndez, Isabel; Díaz-Muñoz, Mauricio; Robles, Juvencio; Martínez-Alfaro, Minerva

    2015-03-04

    Melatonin, a hormone known for its effects on free radical scavenging and antioxidant activity, can reduce lead toxicity in vivo and in vitro. We examined the effects of melatonin on lead bio-distribution. Rats were intraperitoneally injected with lead acetate (10, 15 or 20 mg/kg/day) with or without melatonin (10 mg/kg/day) daily for 10 days. In rats intoxicated with the highest lead doses, those treated with melatonin had lower lead levels in blood and higher levels in urine and feces than those treated with lead alone, suggesting that melatonin increases lead excretion. To explore the mechanism underlying this effect, we first assessed whether lead/melatonin complexes were formed directly. Density functional theory (DFT) calculations showed that a lead/melatonin complex is energetically feasible; however, UV spectroscopy and NMR analysis showed no evidence of such complexes. Next, we examined the liver mRNA levels of metallothioneins (MT) 1 and 2. Melatonin cotreatment increased MT2 mRNA expression in the liver of rats that received the highest doses of lead. The potential effects of MTs on the tissue distribution and excretion of lead are not well understood. This is the first report to suggest that melatonin directly affects lead levels in organisms exposed to subacute lead intoxication. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Lead-resistant Providencia alcalifaciens strain 2EA bioprecipitates Pb+2 as lead phosphate.

    Science.gov (United States)

    Naik, M M; Khanolkar, D; Dubey, S K

    2013-02-01

    A lead-resistant bacterium isolated from soil contaminated with car battery waste was identified as Providencia alcalifaciens on the basis of biochemical characteristics, FAME profile and 16S rRNA sequencing, and designated strain 2EA. It resists lead nitrate up to 0.0014 mol l⁻¹ by precipitating soluble lead as an insoluble light brown solid. Scanning electron microscopy coupled with energy-dispersive X-ray spectrometric analysis (SEM-EDX) and X-ray diffraction spectroscopy (XRD) revealed the extracellular light brown precipitate to be the lead orthophosphate mineral Pb₉(PO₄)₆, catalysed by a phosphatase enzyme. This lead-resistant bacterial strain also demonstrated tolerance to high levels of cadmium and mercury along with multiple antibiotic resistance. Providencia alcalifaciens strain 2EA could be used for bioremediation of lead-contaminated environmental sites, as it can efficiently precipitate lead as lead phosphate. © 2012 The Society for Applied Microbiology.

  20. Assessment and Remediation of Lead Contamination in Senegal

    OpenAIRE

    Donald E. Jones, MS; Assane Diop, BS; Meredith Block, MPA; Alexander Smith-Jones, BS; Andrea Smith-Jones, MS

    2011-01-01

    Background. This paper describes the impact of the improper handling and disposal of used lead-acid batteries (ULABs). A specific case study is presented describing the field assessment and remediation of lead contamination in a community in Senegal where at least 18 children died from lead poisoning. Objectives. The assessment and remediation process utilized to address the Senegal lead contamination has been used as a model approach to solving used lead-acid battery (ULAB) contamination in other e...

  1. In-Service Inspection Approaches for Lead-Cooled Nuclear Reactors

    Science.gov (United States)

    2017-06-01

    heavily regulated and mature. For example, the Illinois Emergency Management Agency (IEMA) conducted 805 soil samples testing for radionuclides around ... radiation, and lead-cooled reactors are expected to have economic advantages compared to other nuclear coolant/moderator systems due to design ... their six nuclear reactors in 2015 (IEMA, 2016, 3). In addition, they currently have 1649 environmental dosimeters testing for gamma radiation

  2. Next-to-next-to-leading order QCD analysis of the revised CCFR data for xF3 structure function

    International Nuclear Information System (INIS)

    Kataev, A.L.; Kotikov, A.V.; Parente, G.; Sidorov, A.V.

    1997-01-01

    The results of the next-to-next-to-leading order QCD analysis of the recently revised experimental data of the CCFR collaboration for the xF3 structure function using the Jacobi polynomial expansion method are presented. The effects of the higher twist contributions are included in the fits following the infrared renormalon motivated model. It is stressed that at the next-to-next-to-leading order the results for the QCD scale parameter Λ^(4) in the MS-bar scheme turn out to be almost insensitive to the predictions of the infrared renormalon model. The outcomes of our analysis are compared to the ones obtained by the CCFR collaboration itself at the next-to-leading order. (author)

  3. Surface dust wipes are the best predictors of blood leads in young children with elevated blood lead levels

    Energy Technology Data Exchange (ETDEWEB)

    Gulson, Brian, E-mail: brian.gulson@mq.edu.au [Graduate School of the Environment, Macquarie University, North Ryde NSW 2109 (Australia); CSIRO Earth Science and Resource Engineering, North Ryde NSW 2113 (Australia); Anderson, Phil [Information and Statistics Group, Australian Institute of Health and Welfare, Canberra ACT 2601 (Australia); Faculty of Health, University of Canberra, Canberra ACT 2601 (Australia); Taylor, Alan [Department of Psychology, Macquarie University, Sydney NSW 2109 (Australia)

    2013-10-15

    Background: As part of the only national survey of lead in Australian children, which was undertaken in 1996, lead isotopic and lead concentration measurements were obtained from children from 24 dwellings whose blood lead levels were ≥15 µg/dL in an attempt to determine the source(s) of their elevated blood lead. Comparisons were made with data for six children with lower blood lead levels (<10 µg/dL). Methods: Thermal ionisation and isotope dilution mass spectrometry were used to determine high precision lead isotopic ratios (²⁰⁸Pb/²⁰⁶Pb, ²⁰⁷Pb/²⁰⁶Pb and ²⁰⁶Pb/²⁰⁴Pb) and lead concentrations in blood, dust from floor wipes, soil, drinking water and paint (where available). Evaluation of associations between blood and the environmental samples was based on the analysis of individual cases, and Pearson correlations and multiple regression analyses based on the whole dataset. Results and discussion: The correlations showed an association for isotopic ratios in blood and wipes (r=0.52, 95% CI 0.19–0.74), blood and soil (r=0.33, 95% CI −0.05–0.62), and blood and paint (r=0.56, 95% CI 0.09–0.83). The regression analyses indicated that the only statistically significant relationship for blood isotopic ratios was with dust wipes (B=0.65, 95% CI 0.35–0.95); there were no significant associations for lead concentrations in blood and environmental samples. There is a strong isotopic correlation of soils and house dust (r=0.53, 95% CI 0.20–0.75), indicative of a common source(s) for lead in soil and house dust. In contrast, as with the regression analyses, no such association is present for bulk lead concentrations (r=−0.003, 95% CI −0.37–0.36), the most common approach employed in source investigations. In evaluation of the isotopic results on a case by case basis, the strongest associations were for dust wipes and blood. -- Highlights: • Children with elevated blood lead ≥15 µg/dL compared with a group with <10

  4. Remediation of lead contaminated soil

    International Nuclear Information System (INIS)

    Urban, W.; Krishnamurthy, S.

    1992-01-01

    Lead contaminated soil in urban areas is of major concern because of the potential health risk to children. Many studies have established a direct correlation between lead in soil and elevated blood lead levels in children. In Minneapolis, Minnesota, Mielke et al. (1983) reported that 50% of the Hmong children with lead poisoning were in areas where soil lead levels were between 500 and 1000 micrograms per gram (ug/g), and 40% of the children suffering from lead poisoning lived in areas where soil lead levels exceeded 1000 ug/g. In urban areas, lead pollution in soil has come from many different sources. The sources include lead paint, lead batteries and automobile exhaust. Olson and Skogerbee (1975) found the following lead compounds in soils where the primary source of pollution was from automobiles: lead sulfate, lead oxide, lead dioxide, lead sulfide, and metallic lead. The primary form of lead found was lead sulfate. Lead sulfate, lead tetraoxide, white lead, and other forms of lead have been used in the manufacture of paints for houses. At present, two remediation techniques, solidification and Bureau of Mines fluosilicic acid leaching, are available for lead-contaminated sites. The objective of the present investigation at the Risk Reduction Engineering Laboratory (RREL), Edison, was to solubilize the lead species with appropriate reagents and then recover the contaminants by precipitation as lead sulfate, using environmentally acceptable methods. The apparatus used for mixing was a LabMaster mixer, with variable speed and a high-shear impeller. Previous work had used nitric acid for dissolving metallic lead. Owing to environmental concerns, it was decided to use acetic acid in the presence of oxygen. The theoretical justification for this approach is the favorable redox potential for the reaction between metallic lead, acetic acid, and gaseous oxygen.

  5. Lithium attenuates lead induced toxicity on mouse non-adherent bone marrow cells.

    Science.gov (United States)

    Banijamali, Mahsan; Rabbani-Chadegani, Azra; Shahhoseini, Maryam

    2016-07-01

    Lead is a poisonous heavy metal that occurs in all parts of the environment and causes serious health problems in humans. The aim of the present study was to investigate the possible protective effect of lithium against lead nitrate induced toxicity in non-adherent bone marrow stem cells. Trypan blue and MTT assays showed that exposure of the cells to different concentrations of lead nitrate decreased viability in a dose dependent manner, whereas pretreatment of the cells with lithium protected the cells against lead toxicity. Lead reduced the number and differentiation status of bone marrow-derived precursors when cultured in the presence of colony stimulating factor (CSF), while the effect was attenuated by lithium. The cells treated with lead nitrate exhibited cell shrinkage, DNA fragmentation and superoxide anion production, but lithium prevented these actions of lead. Moreover, apoptotic indexes such as PARP cleavage and the release of HMGB1 induced by lead were prevented by lithium, suggesting an anti-apoptotic effect of lithium. Immunoblot analysis of histone H3K9 acetylation indicated that lithium overcame the effect of lead on acetylation. In conclusion, lithium efficiently reduces lead toxicity, suggesting new insight into lithium action which may contribute to increased cell survival. It also provides a potentially new therapeutic strategy for lithium and a cost-effective approach to minimize the destructive effects of lead on bone marrow stem cells. Copyright © 2016 Elsevier GmbH. All rights reserved.

  6. Development of an airborne lead analysis kit and its application.

    Science.gov (United States)

    Kongtip, Pornpimol; Borisut, Pornchulee; Yoosook, Witaya; Osiri, Pramuk; Rojanavipart, Piangchan

    2010-11-01

    We developed a method to analyze airborne lead concentrations in the field. It is a modification of the colorimetric method using the reaction between 4-(2-pyridylazo)-resorcinol (PAR) and lead, with Cyanex 302 in an acid medium to reduce interfering metals. The lead concentration was detected with a photometer made in Thailand. The developed method uses an impinger containing 1% nitric acid solution as an absorbing agent to collect airborne lead at a flow rate of less than or equal to one liter/minute. Cyanex 302 solution in toluene was used to extract metals from the samples and 0.1 M nitric acid was used to extract just the lead. The lead solution was reacted with 0.5 ml of 0.03% PAR solution and 1 ml ammonium chloride buffer; the absorption of this solution was measured by a photometer. The results show the limit of detection (LOD) was 0.01 mg/l. The limit of quantification (LOQ) was 0.03 mg/l. The percent recovery for lead concentrations of 0.05-3.0 mg/l was 94.0 to 103.5%. The precision, expressed as %CV, ranged from 0.65 to 10.27%. Lead concentration in a lead smelting factory detected by this method was not significantly different from that detected by NIOSH method 7303, at a 95% confidence level.

  7. Circuit Board Analysis for Lead by Atomic Absorption Spectroscopy in a Course for Nonscience Majors

    Science.gov (United States)

    Weidenhammer, Jeffrey D.

    2007-01-01

    A circuit board analysis for lead by atomic absorption spectroscopy, used to measure lead content in a course for nonscience majors, is presented. The experiment can also be used to explain the potential environmental hazards of the unsafe disposal of used electronic equipment.

  8. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    Science.gov (United States)

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present real challenges in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors to correctly identify pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e
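
    The additive combination step mentioned above can be sketched in a few lines; this is a generic illustration of combining independent p-values with the additive method and the Central Limit Theorem, not the authors' R implementation.

      from math import sqrt
      from scipy.stats import norm

      def additive_combine(pvalues):
          """Combine independent p-values with the additive (sum) method.

          Under the null hypothesis each p-value is Uniform(0, 1); their sum has
          mean n/2 and variance n/12, so the Central Limit Theorem gives an
          approximately normal null distribution and hence a combined p-value.
          Small sums (strong, consistent signals) yield small combined p-values.
          """
          n = len(pvalues)
          z = (sum(pvalues) - n / 2.0) / sqrt(n / 12.0)
          return norm.cdf(z)

      # Example: three independent experiments testing the same pathway
      print(additive_combine([0.04, 0.11, 0.02]))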

  9. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  10. Approach to proliferation risk assessment based on multiple objective analysis framework

    International Nuclear Information System (INIS)

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  11. SOCIOLOGICAL UNDERSTANDING OF INTERNET: THEORETICAL APPROACHES TO THE NETWORK ANALYSIS

    Directory of Open Access Journals (Sweden)

    D. E. Dobrinskaya

    2016-01-01

    Full Text Available The network is an efficient way for contemporary sociologists to analyse social structure. It offers broad opportunities for detailed and fruitful research into different patterns of ties and social relations through quantitative analytical methods and the visualization of network models. The network metaphor is used as the most representative tool for describing a new type of society, one characterized by flexibility, decentralization and individualization. The network organizational form has become the dominant form in modern societies. The network is also used as a mode of inquiry. Three theoretical network approaches are most relevant to Internet research: social network analysis, "network society" theory and actor-network theory. Each theoretical approach has its own notion of the network, and their particular methodological and theoretical features contribute to Internet studies in different ways. The article presents a brief overview of these network approaches. This overview demonstrates the absence of a unified semantic space for the notion of "network". This fact, in turn, points out the need for a detailed analysis of these approaches to reveal their theoretical and empirical possibilities in application to Internet studies.

  12. Lead-oriented synthesis: Investigation of organolithium-mediated routes to 3-D scaffolds and 3-D shape analysis of a virtual lead-like library.

    Science.gov (United States)

    Lüthy, Monique; Wheldon, Mary C; Haji-Cheteh, Chehasnah; Atobe, Masakazu; Bond, Paul S; O'Brien, Peter; Hubbard, Roderick E; Fairlamb, Ian J S

    2015-06-01

    Synthetic routes to six 3-D scaffolds containing piperazine, pyrrolidine and piperidine cores have been developed. The synthetic methodology focused on the use of N-Boc α-lithiation-trapping chemistry. Notably, suitably protected and/or functionalised medicinal chemistry building blocks were synthesised via concise, connective methodology. This represents a rare example of lead-oriented synthesis. A virtual library of 190 compounds was then enumerated from the six scaffolds. Of these, 92 compounds (48%) fit the lead-like criteria of: (i) -1⩽AlogP⩽3; (ii) 14⩽number of heavy atoms⩽26; (iii) total polar surface area⩾50 Å². The 3-D shapes of the 190 compounds were analysed using a triangular plot of normalised principal moments of inertia (PMI). From this, 46 compounds were identified which had lead-like properties and possessed 3-D shapes in under-represented areas of pharmaceutical space. Thus, the PMI analysis of the 190 member virtual library showed that, whilst the scaffolds may appear on paper to be 3-D in shape, only 24% of the compounds actually had 3-D structures in the more interesting areas of 3-D drug space. Copyright © 2015 Elsevier Ltd. All rights reserved.
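
    The PMI shape analysis referred to above reduces each 3-D conformer to two normalised principal moments of inertia, which are then plotted on the rod-disc-sphere triangle. A minimal sketch follows, assuming 3-D coordinates are already available (in practice they would come from a conformer generator); the function and variable names are illustrative.

      import numpy as np

      def normalized_pmi(coords, masses=None):
          """Normalised principal moments of inertia (NPR1, NPR2) for a conformer.

          coords : (N, 3) array of atomic coordinates
          masses : optional (N,) array of atomic masses (unit masses if omitted)
          """
          coords = np.asarray(coords, dtype=float)
          m = np.ones(len(coords)) if masses is None else np.asarray(masses, float)
          r = coords - np.average(coords, axis=0, weights=m)  # centre-of-mass frame
          x, y, z = r[:, 0], r[:, 1], r[:, 2]
          # Inertia tensor
          I = np.array([
              [np.sum(m * (y**2 + z**2)), -np.sum(m * x * y), -np.sum(m * x * z)],
              [-np.sum(m * x * y), np.sum(m * (x**2 + z**2)), -np.sum(m * y * z)],
              [-np.sum(m * x * z), -np.sum(m * y * z), np.sum(m * (x**2 + y**2))],
          ])
          i1, i2, i3 = np.sort(np.linalg.eigvalsh(I))
          # Rod-like ~ (0, 1), disc-like ~ (0.5, 0.5), sphere-like ~ (1, 1)
          return i1 / i3, i2 / i3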

  13. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    Science.gov (United States)

    2013-03-01

    Air Force Institute of Technology thesis AFIT-ENP-13-M-02 by Anum Barki, BS (Department of the Air Force); approved by Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks. The thesis develops an inverse kinematic approach using Groebner basis theory applied to gait cycle analysis.

  14. Closed-loop, pilot/vehicle analysis of the approach and landing task

    Science.gov (United States)

    Anderson, M. R.; Schmidt, D. K.

    1986-01-01

    In the case of approach and landing, it is universally accepted that the pilot uses more than one vehicle response, or output, to close his control loops. Therefore, to model this task, a multi-loop analysis technique is required. The analysis problem has been in obtaining reasonable analytic estimates of the describing functions representing the pilot's loop compensation. Once these pilot describing functions are obtained, appropriate performance and workload metrics must then be developed for the landing task. The optimal control approach provides a powerful technique for obtaining the necessary describing functions, once the appropriate task objective is defined in terms of a quadratic objective function. An approach is presented through the use of a simple, reasonable objective function and model-based metrics to evaluate loop performance and pilot workload. The results of an analysis of the LAHOS (Landing and Approach of Higher Order Systems) study performed by R.E. Smith are also presented.

  15. An approach to multi-attribute utility analysis under parametric uncertainty

    International Nuclear Information System (INIS)

    Kelly, M.; Thorne, M.C.

    2001-01-01

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques are more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States.
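
    A Monte Carlo treatment of multi-attribute utility under parametric uncertainty, of the kind outlined above, can be sketched as follows; the options, attributes, weights and distributions are invented for illustration and are not taken from the remediation study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical remediation options scored on three uncertain attributes
      # (cost, dose reduction, disruption), each modelled as Normal(mean, sd).
      options = {
          "cap in place": {"cost": (10, 2), "dose": (0.6, 0.10), "disrupt": (0.2, 0.05)},
          "excavate":     {"cost": (25, 5), "dose": (0.9, 0.05), "disrupt": (0.6, 0.10)},
      }
      weights = {"cost": -0.3, "dose": 0.5, "disrupt": -0.2}   # negative = disutility

      def sampled_utility(attrs, n=10_000):
          """Monte Carlo distribution of the weighted multi-attribute utility."""
          u = np.zeros(n)
          for name, (mean, sd) in attrs.items():
              u += weights[name] * rng.normal(mean, sd, n)
          return u

      utils = {k: sampled_utility(v) for k, v in options.items()}
      for k, u in utils.items():
          print(f"{k:14s} mean utility {u.mean():6.2f}  "
                f"5-95% [{np.percentile(u, 5):.2f}, {np.percentile(u, 95):.2f}]")
      # Probability that one option is preferred, given the parametric uncertainty
      print("P(excavate preferred) =",
            np.mean(utils["excavate"] > utils["cap in place"]))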

  16. Lead reactor strategy economical analysis

    International Nuclear Information System (INIS)

    Ciotti, Marco

    2013-01-01

    Conclusions: • A first attempt to evaluate the LFR power plant electricity production cost has been performed; • The electricity price is similar to that of Gen III+ plants; • The estimation accuracy is probably low; • Possible cost reductions could arise from coolant characteristics that may improve safety and simplicity by design; • Accident perception, currently not acceptable to public opinion, may be changed with a low-potential-energy system (non-exploding coolant); • Improved sustainability could open the way to better public acceptance, depending on us. • Problems may arise in coupling a high-capital-cost, low-fuel-cost plant to a grid with a large amount of intermittent sources with priority dispatch. • Lead fast reactors can compete.

  17. Lead in rice: analysis of baseline lead levels in market and field collected rice grains.

    Science.gov (United States)

    Norton, Gareth J; Williams, Paul N; Adomako, Eureka E; Price, Adam H; Zhu, Yongguan; Zhao, Fang-Jie; McGrath, Steve; Deacon, Claire M; Villada, Antia; Sommella, Alessia; Lu, Ying; Ming, Lei; De Silva, P Mangala C S; Brammer, Hugh; Dasgupta, Tapash; Islam, M Rafiqul; Meharg, Andrew A

    2014-07-01

    In a large scale survey of rice grains from markets (13 countries) and fields (6 countries), a total of 1578 rice grain samples were analysed for lead. From the market collected samples, only 0.6% of the samples exceeded the Chinese and EU limit of 0.2 μg g⁻¹ lead in rice (when excluding samples collected from known contaminated/mine impacted regions). When evaluating the rice grain samples against the Food and Drug Administration's (FDA) provisional total tolerable intake (PTTI) values for children and pregnant women, it was found that only people consuming large quantities of rice were at risk of exceeding the PTTI from rice alone. Furthermore, 6 field experiments were conducted to evaluate the proportion of the variation in lead concentration in rice grains due to genetics. A total of 4 of the 6 field experiments had significant differences between genotypes, but when the genotypes common across all six field sites were assessed, only 4% of the variation was explained by genotype, with 9.5% and 11% of the variation explained by the environment and the genotype by environment interaction, respectively. Further work is needed to identify the sources of lead contamination in rice, with detailed information obtained on the locations and environments where the rice is sampled, so that specific risk assessments can be performed. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Ancient bronze coins from Mediterranean basin: LAMQS potentiality for lead isotopes comparative analysis with former mineral

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: Lorenzo.Torrisi@unime.it [Department of Physics Science - MIFT, Messina University, V.le F.S. d’Alcontres 31, 98166 S. Agata, Messina (Italy); Italiano, A. [INFN, Sezione di Catania, Gruppo collegato di Messina (Italy); Torrisi, A. [Institute of Optoelectronics, Military University of Technology, 2 Kaliskiego Str., 00-908 Warsaw (Poland)

    2016-11-30

    Highlights: • Surface and bulk compositional elements in ancient bronze coins were investigated using XRF analysis. • The stable lead isotopes ²⁰⁴Pb, ²⁰⁶Pb, ²⁰⁷Pb and ²⁰⁸Pb were measured in ancient coins with LAMQS analysis. • The lead ratios ²⁰⁸Pb/²⁰⁶Pb and ²⁰⁷Pb/²⁰⁶Pb, measured by LAMQS, were compared with the Brettscaife.net geological database relative to the minerals in different mines of the Mediterranean basin. • Bronze coins were correlated to possible ancient mining sites of minerals from which lead was extracted. - Abstract: Bronze coins coming from the area of the Mediterranean basin, dated to the II–X Cent. A.D., were analyzed using different physical analytical techniques. Characteristic X-ray fluorescence was used with electrons and photons in order to investigate the elemental composition of both the surface layers and the bulk. Moreover, quadrupole mass spectrometry coupled to laser ablation (the LAMQS technique) in high vacuum was used to analyse typical material compounds from surface contamination. Mass spectrometry, at high resolution and sensitivity, extended up to 300 amu, allowed measuring the ²⁰⁸Pb/²⁰⁶Pb and ²⁰⁷Pb/²⁰⁶Pb isotopic ratios in the coins. Quantitative relative analyses of these isotopic ratios identify the coin composition as a "fingerprint" depending on the mineral used to extract the lead. Isotopic ratios in coins can be compared to those of the possible minerals used to produce the bronze alloy. A comparison between the measured isotope ratios in the analyzed coins and the literature database, related to the minerals containing Pb as a function of their geological and geophysical extraction mine, is presented. The analysis, restricted to old coins and the mines of the Mediterranean basin, indicates a possible correlation between the coin compositions and the possible geological sites of the extracted mineral.

  19. An artificial neural network approach to laser-induced breakdown spectroscopy quantitative analysis

    International Nuclear Information System (INIS)

    D’Andrea, Eleonora; Pagnotta, Stefano; Grifoni, Emanuela; Lorenzetti, Giulia; Legnaioli, Stefano; Palleschi, Vincenzo; Lazzerini, Beatrice

    2014-01-01

    The usual approach to laser-induced breakdown spectroscopy (LIBS) quantitative analysis is based on the use of calibration curves, suitably built using appropriate reference standards. More recently, statistical methods relying on the principles of artificial neural networks (ANN) are increasingly used. However, ANN analysis is often used as a ‘black box’ system and the peculiarities of the LIBS spectra are not exploited fully. An a priori exploration of the raw data contained in the LIBS spectra, carried out by a neural network to learn which are the significant areas of the spectrum to be used by a subsequent neural network delegated to the calibration, is able to throw light upon important information initially unknown, although already contained within the spectrum. This communication demonstrates that an approach based on neural networks specially tailored for dealing with LIBS spectra would provide a viable, fast and robust method for LIBS quantitative analysis. This would allow the use of a relatively limited number of reference samples for the training of the network, with respect to the current approaches, and provide a fully automatable approach for the analysis of a large number of samples. - Highlights: • A methodological approach to neural network analysis of LIBS spectra is proposed. • The architecture of the network and the number of inputs are optimized. • The method is tested on bronze samples already analyzed using a calibration-free LIBS approach. • The results are validated, compared and discussed
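
    A rough sketch of the two-stage idea, selecting informative spectral regions first and calibrating on them afterwards, is given below on synthetic data; note that a univariate selector stands in for the first neural network to keep the example short, so this is an analogue of the approach rather than the authors' architecture, and all data are invented.

      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_regression
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline

      # Synthetic stand-in for LIBS spectra: 200 samples x 1024 channels, with a
      # few channels carrying the analyte signal (purely illustrative data).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 1024))
      conc = rng.uniform(0, 5, size=200)        # "concentration" of the analyte
      X[:, 100] += 3.0 * conc                   # emission line 1
      X[:, 480] += 1.5 * conc                   # emission line 2

      # Stage 1: pick the informative spectral regions; stage 2: neural calibration.
      model = make_pipeline(
          SelectKBest(f_regression, k=10),
          MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
      )
      model.fit(X[:150], conc[:150])
      print("R^2 on held-out spectra:", model.score(X[150:], conc[150:]))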

  20. Energy policy and externalities: the life cycle analysis approach

    International Nuclear Information System (INIS)

    Virdis, M.R.

    2002-01-01

    In the energy sector, getting the prices right is a prerequisite for market mechanisms to work effectively towards sustainable development. However, energy production and use create 'costs' external to traditional accounting practices, such as damages to human health and the environment resulting from residual emissions or risks associated with dependence on foreign suppliers. Energy market prices do not fully reflect those external costs. For example, the costs of climate change are not internalized and, therefore, consumers do not get the right price signals leading them to make choices that are optimised from a societal viewpoint. Economic theory has developed approaches to assessing and internalizing external costs that can be applied to the energy sector and, in principle, provide means to quantify and integrate relevant information in a comprehensive framework. The tools developed for addressing these issues are generally aimed at monetary valuation of impacts and damages and integration of the valued 'external costs' in the total cost of the product, e.g. electricity. The approach of Life Cycle Analysis (LCA) provides a conceptual framework for a detailed and comprehensive comparative evaluation of energy supply options. This paper offers a summary of the LCA methodology and an overview of some of its limitations. It then illustrates, through a few examples, how the methodology can be used to inform or correct policy making and to orient investment decisions. Difficulties and issues emerging at various stages in the application and use of LCA results are discussed, although in such a short note, it is impossible to address all issues related to LCA. Therefore, as part of the concluding section, some issues are left open - and areas in which further analytical work may be needed are described. (author)

  1. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis, thereby including transient interactions and trip uncertainties in the MDNBR probability density.

  2. Analysis of application of different approaches to secure safe drinking water

    Directory of Open Access Journals (Sweden)

    Pendić Zoran

    2017-01-01

    Full Text Available In this analysis, risk systems are taken to be systems within which risk-sensitive services are executed. The complex service of supplying the population with safe drinking water is considered such a risk-sensitive service. The World Health Organization (WHO) Guidelines for drinking water quality recommend the use of effective preventive, risk-based approaches to managing the safety and quality of drinking water. For example, the Food Safety Law of the Republic of Serbia stipulates mandatory application of the HACCP system in order to obtain safe drinking water. Different approaches to preventive risk-based management of the safety and quality of drinking water are applied nowadays. In this paper we consider the following approaches: the original Codex Alimentarius HACCP system and some of its modified versions; the international standard ISO 22000:2005 Food safety management systems - Requirements for any organization in the food chain; the Water Safety Plan (WSP) of the World Health Organization (WHO); and a generalized HACCP system. All of these approaches are based, to a greater or lesser extent, on the original Codex Alimentarius HACCP system. The paper gives a situation analysis (SWOT analysis) of the considered approaches.

  3. Adversarial risk analysis with incomplete information: a level-k approach.

    Science.gov (United States)

    Rothschild, Casey; McLay, Laura; Guikema, Seth

    2012-07-01

    This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
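
    A minimal sketch of level-k reasoning on a generic defend-attack payoff matrix (ignoring the partial-revelation feature of the article's model); the payoff values and function names are illustrative assumptions, not the authors' formulation.

      import numpy as np

      # Illustrative payoff matrices (rows: defender countermeasures,
      # columns: attacker targets); the numbers are made up for the sketch.
      defender_loss = np.array([[4.0, 1.0],
                                [1.5, 3.0]])
      attacker_gain = np.array([[4.0, 1.0],
                                [1.5, 3.0]])

      def level_k_attack(k):
          """Attacker's strategy under level-k reasoning.
          Level-0 attacks uniformly at random; level-k best-responds to the
          defender's best response to a level-(k-1) attacker."""
          if k == 0:
              n = attacker_gain.shape[1]
              return np.full(n, 1.0 / n)
          defend = level_k_defend(k)
          expected_gain = defend @ attacker_gain
          return np.eye(attacker_gain.shape[1])[np.argmax(expected_gain)]

      def level_k_defend(k):
          """Defender's pure best response to a level-(k-1) attacker."""
          attack = level_k_attack(k - 1)
          expected_loss = defender_loss @ attack
          return np.eye(defender_loss.shape[0])[np.argmin(expected_loss)]

      print("level-1 defender:", level_k_defend(1))
      print("level-2 attacker:", level_k_attack(2))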

  4. Marketing approaches for OTC analgesics in Bulgaria.

    Science.gov (United States)

    Petkova, Valentina; Valchanova, Velislava; Ibrahim, Adel; Nikolova, Irina; Benbasat, Niko; Dimitrov, Milen

    2014-03-04

    Marketing management includes the analysis of market opportunities, the selection of target markets, the planning, development and implementation of marketing strategies, and the monitoring and control of results. The objective of the present study was to analyse the marketing approaches applied to non-steroidal anti-inflammatory drugs (NSAIDs) in Bulgaria. A SWOT analysis (a planning method used to evaluate the strengths, weaknesses, opportunities, and threats) performed for one of the leading Bulgarian manufacturers highlighted the complex corporate strategy for stimulating the sales of NSAIDs. The study results show that the legislative framework in the country gives an opportunity to regulate the NSAID market so that incorrect marketing approaches, such as disloyal competition, are avoided.

  5. Fatigue in engineering structures. A three fold analysis approach

    International Nuclear Information System (INIS)

    Malik, Afzaal M.; Qureshi, Ejaz M.; Dar, Naeem Ullah; Khan, Iqbal

    2007-01-01

    The integrity of most engineering structures is influenced by the presence of cracks or crack-like defects. These structures can fail, even catastrophically, if a crack greater than a critically safe size exists. Although most optimally designed structures are initially free from critical cracks, sub-critical cracks can grow to failure under cyclic loading, a process called fatigue crack growth. It is nearly impractical to prevent sub-critical crack growth in engineering structures, particularly in crack-sensitive structures such as those in the nuclear, aerospace and aeronautical domains. However, it is essential to predict fatigue crack growth for these structures to preclude in-service failures causing loss of assets. The present research presents an automatic procedure for the prediction of fatigue crack growth in three-dimensional engineering structures and for the key data of fracture-mechanics-based design: the stress intensity factors. A threefold analysis procedure is adopted to investigate the effects of repetitive (cyclic) loadings on the fatigue life of different geometries of aluminum alloy 2219-O. A general purpose finite element (FE) code, ANSYS 8.0, is used to predict/estimate the fatigue life of the geometries. Computer codes utilizing the Green's function are developed to calculate the stress intensity factors. Another code, based on the superposition technique presented by Shivakumara and Foreman, is developed to calculate the fatigue crack growth rate and fatigue life (number of loading cycles) to validate the results, and finally full scale laboratory tests are conducted for comparison of the results. The close correlation between the results of the different techniques employed is a promising feature of the analysis approach for future work. (author)

  6. Experimental study of gas-cooled current leads for superconducting magnets

    International Nuclear Information System (INIS)

    Warren, R.P.

    1978-04-01

    Design details and experimental test results from several design variations of the gas-cooled, copper current leads used in conjunction with the superconducting dipole magnets for ESCAR (Experimental Superconducting Accelerator Ring) are reported. Thermal acoustic oscillations, which were experienced with an initial design, were eliminated in subsequent designs by a reduction of the hydraulic diameter. The occurrence of these oscillations is in general agreement with the stability analysis of Rott, but the observed gas flow dependence is not in agreement with some other recently reported results for leads operated with supercritical-phase coolant. An empirically determined correlation was obtained by plotting lead resistance vs. enthalpy gain of the coolant gas. The resulting family of curves can be reduced to a single line on a plot of effective resistivity vs. the product of current and cross-sectional area divided by the product of the square of the coolant mass flow and the lead length. This correlation, which should be applicable to other designs of copper current leads in which ideal heat transfer to the coolant gas is approached, predicts that the enthalpy gain of the coolant, and therefore the peak lead temperature, is proportional to the cube of the ratio of current to coolant mass flow.

  7. A new approach for reliability analysis with time-variant performance characteristics

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2013-01-01

    Reliability represents the safety level in industry practice and may vary over time due to time-variant operating conditions and component deterioration throughout a product life-cycle. Thus, the capability to perform time-variant reliability analysis is of vital importance in practical engineering applications. This paper presents a new approach, referred to as nested extreme response surface (NERS), that can efficiently tackle the time dependency issue in time-variant reliability analysis and enables such problems to be solved by easy integration with advanced time-independent tools. The key of the NERS approach is to build a nested response surface of time corresponding to the extreme value of the limit state function by employing a Kriging model. To obtain the data for the Kriging model, the efficient global optimization technique is integrated with NERS to extract the extreme time responses of the limit state function for any given system input. An adaptive response prediction and model maturation mechanism is developed based on mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-variant reliability analysis can be converted into a time-independent reliability analysis, and existing advanced reliability analysis methods can be used. Three case studies are used to demonstrate the efficiency and accuracy of the NERS approach.
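
    The nested idea, fitting a surrogate of the extreme-response time and then running a time-independent reliability check at that time, can be sketched with a toy limit state as below; the Gaussian process stands in for the Kriging model, and every function and number is an illustrative assumption rather than material from the paper.

      import numpy as np
      from scipy.optimize import minimize_scalar
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      # Toy time-variant limit state g(x, t); failure when g < 0.
      def g(x, t):
          return 3.0 + 0.5 * x - (1.0 + 0.2 * x) * np.sin(0.8 * t) - 0.1 * t

      def extreme_time(x, horizon=10.0):
          """Time in [0, horizon] at which g(x, t) is smallest (worst case)."""
          res = minimize_scalar(lambda t: g(x, t),
                                bounds=(0.0, horizon), method="bounded")
          return res.x

      # Nested response surface: surrogate of the extreme time vs the input x
      x_train = np.linspace(-2, 2, 12)
      t_star = np.array([extreme_time(x) for x in x_train])
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
      gp.fit(x_train.reshape(-1, 1), t_star)

      # Time-variant reliability reduced to a time-independent Monte Carlo check
      rng = np.random.default_rng(3)
      x_samples = rng.normal(0.0, 1.0, 50_000)
      t_pred = gp.predict(x_samples.reshape(-1, 1))
      pf = np.mean(g(x_samples, t_pred) < 0.0)
      print("estimated failure probability:", pf)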

  8. A Novel Approach for the Removal of Lead(II) Ion from Wastewater Using Mucilaginous Leaves of Diceriocaryum eriocarpum Plant

    Directory of Open Access Journals (Sweden)

    Joshua N. Edokpayi

    2015-10-01

    Full Text Available Lead(II) ion is a very toxic element known to cause detrimental effects on human health even at very low concentrations. An adsorbent prepared using mucilaginous leaves from the Diceriocaryum eriocarpum plant (DEP) was used for the adsorption of lead(II) ions from aqueous solution. Batch experiments were performed on simulated aqueous solutions under optimized conditions of adsorbent dosage, contact time, pH and initial lead(II) ion concentration at 298 K. The Langmuir isotherm model described the adsorption process more suitably than the Freundlich model, with linearized coefficients of 0.9661 and 0.9547, respectively. A pseudo-second order kinetic equation best described the kinetics of the reaction. Fourier transform infra-red analysis confirmed the presence of amino (–NH), carbonyl (–C=O) and hydroxyl (–OH) functional groups. Application of the prepared adsorbent to wastewater samples of 10 mg/L and 12 mg/L lead(II) ion concentration taken from a waste stabilization pond showed removal efficiencies of 95.8% and 96.4%, respectively. Furthermore, 0.1 M HCl was a better desorbing agent than 0.1 M NaOH and de-ionized water. The experimental data obtained demonstrated that mucilaginous leaves from DEP can be used as a suitable adsorbent for lead(II) ion removal from wastewater.
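
    Isotherm fits of the kind reported above are routinely done by non-linear least squares; the sketch below uses invented equilibrium data purely to illustrate fitting the Langmuir and Freundlich models and comparing their goodness of fit (none of the numbers come from the study).

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical equilibrium data for Pb(II) uptake on an adsorbent
      # (ce: residual concentration, mg/L; qe: uptake, mg/g).
      ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
      qe = np.array([4.8, 9.5, 14.8, 19.6, 23.1, 25.0])

      def langmuir(c, q_max, k_l):
          return q_max * k_l * c / (1.0 + k_l * c)

      def freundlich(c, k_f, n):
          return k_f * c ** (1.0 / n)

      (qm, kl), _ = curve_fit(langmuir, ce, qe, p0=[25.0, 0.1])
      (kf, n), _ = curve_fit(freundlich, ce, qe, p0=[5.0, 2.0])

      for name, model, params in [("Langmuir", langmuir, (qm, kl)),
                                  ("Freundlich", freundlich, (kf, n))]:
          resid = qe - model(ce, *params)
          r2 = 1.0 - np.sum(resid**2) / np.sum((qe - qe.mean())**2)
          print(f"{name:10s} params={tuple(round(p, 3) for p in params)}  R^2={r2:.4f}")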

  9. Practical approach on gas pipeline compression system availability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Sidney Pereira dos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Kurz, Rainer; Lubomirsky, Matvey [Solar Turbines, San Diego, CA (United States)

    2009-12-19

    Gas pipeline projects have traditionally been designed based on load factor and steady state flow. This approach exposes project sponsors to project sustainability risks due to potential losses of revenue and transportation contract penalties related to pipeline capacity shortage as a consequence of compressor unit unavailability. Such unavailability should be quantified in advance, during the design phase. This paper presents a case study and a methodology that highlight the practical benefits of applying Monte Carlo simulation to compression system availability analysis in conjunction with quantitative risk analysis and an economic feasibility study. The main project economic variables and their impacts on the project NPV (Net Present Value) are evaluated with their respective statistical distributions to quantify risk and support decision makers in adopting mitigating measures to guarantee competitiveness while protecting project sponsors from otherwise unpredictable risks. This practical approach is compared to the load factor approach and the results are presented and evaluated. (author)
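
    A toy version of the Monte Carlo availability argument is sketched below: unit availability follows from MTBF/MTTR, a capacity shortfall occurs when too few units are up, and the resulting loss distribution feeds the economic analysis. All figures are placeholders, not values from the paper.

      import numpy as np
      from math import comb

      # Illustrative station: three identical compressor units, two must run
      # to meet the contracted capacity.
      N_UNITS, N_REQUIRED = 3, 2
      MTBF_H, MTTR_H = 3000.0, 48.0            # hours
      HOURS_PER_YEAR = 8760
      LOSS_PER_SHORT_HOUR = 4.0e3              # lost revenue + contract penalty, $/h

      a = MTBF_H / (MTBF_H + MTTR_H)           # steady-state unit availability

      # Probability that fewer than N_REQUIRED units are up (independent units)
      p_short = sum(comb(N_UNITS, k) * a**k * (1 - a)**(N_UNITS - k)
                    for k in range(N_REQUIRED))

      rng = np.random.default_rng(7)
      # Monte Carlo over project years: shortfall hours per year ~ Binomial
      short_hours = rng.binomial(HOURS_PER_YEAR, p_short, size=100_000)
      annual_loss = short_hours * LOSS_PER_SHORT_HOUR

      print(f"unit availability     : {a:.4f}")
      print(f"P(capacity shortfall) : {p_short:.5f}")
      print(f"expected annual loss  : ${annual_loss.mean():,.0f}")
      print(f"95th percentile loss  : ${np.percentile(annual_loss, 95):,.0f}")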

  10. Understanding common risk analysis problems leads to better E and P decisions

    International Nuclear Information System (INIS)

    Smith, M.B.

    1994-01-01

    Many petroleum geologists, engineers and managers who have been introduced to petroleum risk analysis doubt that probability theory actually works in practice. Discovery probability estimates for exploration prospects always seem to be more optimistic than after-the-fact results. In general, probability estimates seem to be plucked from the air without any objective basis. Because of subtleties in probability theory, errors may result in applying risk analysis to real problems. Four examples have been selected to illustrate how misunderstandings in applying risk analysis may lead to incorrect decisions. Examples 1 and 2 show how falsely assuming statistical independence distorts probability calculations. Example 3 discusses problems with related variables when using the Monte Carlo method. Example 4 shows how subsurface data yield a probability value that is superior to a simple statistical estimate. The potential mistakes in the following examples would go unnoticed in analyses in most companies. Lack of objectivity and flawed theory would be blamed when the fault actually lies with incorrect application of basic probability principles.
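
    The independence pitfall behind Examples 1 and 2 is easy to reproduce numerically: when two success factors share a common control, multiplying their marginal probabilities misstates the joint probability. The simulation below is a generic illustration, not taken from the article's examples.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 1_000_000

      # Two success factors for a prospect (say, reservoir and seal) that share
      # a common geological control, so they are positively correlated.
      shared = rng.random(n)
      reservoir = (0.5 * shared + 0.5 * rng.random(n)) < 0.6
      seal      = (0.5 * shared + 0.5 * rng.random(n)) < 0.6

      p_wrong = reservoir.mean() * seal.mean()   # multiplying marginals assumes independence
      p_true  = (reservoir & seal).mean()        # actual joint success probability

      print(f"marginals             : {reservoir.mean():.3f}, {seal.mean():.3f}")
      print(f"assuming independence : {p_wrong:.3f}")
      print(f"actual (correlated)   : {p_true:.3f}")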

  11. Microbial genome analysis: the COG approach.

    Science.gov (United States)

    Galperin, Michael Y; Kristensen, David M; Makarova, Kira S; Wolf, Yuri I; Koonin, Eugene V

    2017-09-14

    For the past 20 years, the Clusters of Orthologous Genes (COG) database has been a popular tool for microbial genome annotation and comparative genomics. Initially created for the purpose of evolutionary classification of protein families, the COGs have been used, apart from straightforward functional annotation of sequenced genomes, for such tasks as (i) unification of genome annotation in groups of related organisms; (ii) identification of missing and/or undetected genes in complete microbial genomes; (iii) analysis of genomic neighborhoods, in many cases allowing prediction of novel functional systems; (iv) analysis of metabolic pathways and prediction of alternative forms of enzymes; (v) comparison of organisms by COG functional categories; and (vi) prioritization of targets for structural and functional characterization. Here we review the principles of the COG approach and discuss its key advantages and drawbacks in microbial genome analysis. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  12. A systems biology approach for pathway level analysis

    OpenAIRE

    Draghici, Sorin; Khatri, Purvesh; Tarca, Adi Laurentiu; Amin, Kashyap; Done, Arina; Voichita, Calin; Georgescu, Constantin; Romero, Roberto

    2007-01-01

    A common challenge in the analysis of genomics data is trying to understand the underlying phenomenon in the context of all complex interactions taking place on various signaling pathways. A statistical approach using various models is universally used to identify the most relevant pathways in a given experiment. Here, we show that the existing pathway analysis methods fail to take into consideration important biological aspects and may provide incorrect results in certain situations. By usin...

  13. Personalized translational epilepsy research - Novel approaches and future perspectives: Part I: Clinical and network analysis approaches.

    Science.gov (United States)

    Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. Copyright © 2017 Elsevier Inc

  14. Exclusive vector meson production with leading neutrons in a saturation model for the dipole amplitude in mixed space

    Science.gov (United States)

    Amaral, J. T.; Becker, V. M.

    2018-05-01

    We investigate ρ vector meson production in e p collisions at HERA with leading neutrons in the dipole formalism. The interaction of the dipole and the pion is described in a mixed-space approach, in which the dipole-pion scattering amplitude is given by the Marquet-Peschanski-Soyez saturation model, which is based on the traveling wave solutions of the nonlinear Balitsky-Kovchegov equation. We estimate the magnitude of the absorption effects and compare our results with a previous analysis of the same process in full coordinate space. In contrast with this approach, the present study leads to absorption K factors in the range of those predicted by previous theoretical studies on semi-inclusive processes.

  15. Nuclear microprobe analysis of lead profile in crocodile bones

    Energy Technology Data Exchange (ETDEWEB)

    Orlic, I. E-mail: ivo@ansto.gov.au; Siegele, R.; Hammerton, K.; Jeffree, R.A.; Cohen, D.D

    2003-09-01

    Elevated concentrations of lead were found in the bone and flesh of Australian free-ranging saltwater crocodiles (Crocodylus porosus). Lead shot was identified as a potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out whether elevated Pb concentrations remain in growth rings and whether the concentration is correlated with the blood levels recorded at the time. The results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as good correlation with the level of lead in blood. To investigate the influence of ion species on detection limits, measurements of the same sample were performed using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak to background ratios, detection limits and the overall 'quality' of the obtained spectra are compared and discussed.

  16. Correlations in particle production in proton-lead and lead-lead collisions at the LHC

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00361447

    In high-energy heavy-ion collisions at the Large Hadron Collider (LHC), a hot and dense state of matter called the Quark-Gluon Plasma (QGP) is formed. The initial collision geometry and the subsequent expansion during the QGP stage result in the correlations of produced particles, through which the properties of the QGP can be investigated. Two analyses based on the geometrical correlations of produced particles, one in proton-lead (p–Pb) collisions and the other in lead-lead (Pb–Pb) collisions, are presented in this thesis. The data analyzed in this thesis were collected with the ALICE detector at the LHC in p–Pb collisions at a nucleon–nucleon center-of-mass energy of 5.02 TeV, and Pb–Pb collisions at a nucleon–nucleon center-of-mass energy of 2.76 TeV. In the forward-central two-particle correlation analysis in p–Pb collisions, two-particle angular correlations between trigger particles in the forward pseudorapidity range (2.5 < |η| < 4.0) and associated particles in the central ran...

  17. Levels and source apportionment of children's lead exposure: could urinary lead be used to identify the levels and sources of children's lead pollution?

    Science.gov (United States)

    Cao, Suzhen; Duan, Xiaoli; Zhao, Xiuge; Wang, Beibei; Ma, Jin; Fan, Delong; Sun, Chengye; He, Bin; Wei, Fusheng; Jiang, Guibin

    2015-04-01

Lead is a highly toxic heavy metal, and its pollution and exposure risks are of widespread concern for human health. However, the collection of blood samples for use as an indicator of lead pollution is not always feasible in cohort or longitudinal studies, especially those involving children's health. To evaluate the potential use of urinary lead as an indicator of exposure levels and for source apportionment, lead concentrations and isotopic measurements (expressed as (207)Pb/(206)Pb, (208)Pb/(206)Pb and (204)Pb/(206)Pb), together with environmental media samples, were investigated and compared between blood and urine from children living in the vicinity of a typical coking plant and a lead-acid battery factory. The results showed that urinary lead might not be a suitable proxy for estimating blood lead levels. Fortunately, urinary lead isotopic measurements could be used as an alternative for identifying the sources of children's lead exposure, as they coincided well with the blood lead isotope ratio analysis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

Full Text Available A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of the different approaches are outlined, and an attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology when researching both Internet addiction and dependent behavior in general. In the author's view, the dialectical approach integrates the experience of research conducted within the socio-psychological framework and focuses on the observed inconsistencies in the phenomenon of Internet addiction – the compensatory nature of Internet activity, whereby people who turn to the Internet are in a dysfunctional life situation.

  19. Leading with "Emotional" Intelligence--Existential and Motivational Analysis in Leadership and Leadership Development

    Science.gov (United States)

    Mengel, Thomas

    2012-01-01

    This conceptual and practical paper is integrating the work of Viktor Frankl (1985) and Steven Reiss (2000, 2008) into a model of Existential and Motivational Analysis (EMotiAn). This integrated model and approach may provide scholars, educators, consultants and practitioners alike with an innovative and meaningful framework for leadership and…

  20. Lead dust in Broken Hill homes: effect of remediation on indoor lead levels.

    Science.gov (United States)

    Boreland, F; Lyle, D M

    2006-02-01

    This study was undertaken to determine whether home remediation effectively reduced indoor lead levels in Broken Hill, a long-established silver-lead-zinc mining town in outback Australia. A before-after study of the effect of home remediation on indoor lead levels was embedded into a randomized controlled trial of the effectiveness of remediation for reducing elevated blood lead levels in young children. Moist towelettes were used to measure lead loading (microg/m2) on internal windowsills and internal and entry floors of 98 homes; samples were collected before, immediately after, and 2, 4, 6, 8, and 10 months after remediation. Data were log(10) transformed for the analysis. Remediation reduced average indoor lead levels by approximately 50%, and lead levels remained low for the duration of the follow-up period (10 months). The greatest gains were made in homes with the highest initial lead levels; homes with low preremediation lead levels showed little or no benefit. Before remediation, homes located in areas with high soil lead levels or with "poor" dust proofing had higher lead levels than those in areas with lower soil lead levels or with "medium" or "good" dust proofing; these relative differences remained after remediation. There was no evidence that lead loading was reduced by an increased opportunity to become aware of lead issues. We conclude that remediation is an effective strategy for reducing the lead exposure of children living in homes with high indoor lead levels.

  1. Hybrid Approach of Aortic Diseases: Zone 1 Delivery and Volumetric Analysis on the Descending Aorta

    Directory of Open Access Journals (Sweden)

    José Augusto Duncan

Full Text Available Abstract Introduction: Conventional techniques for surgical correction of arch and descending aortic diseases remain high-risk procedures. Endovascular treatment of the abdominal and descending thoracic aorta carries lower surgical risk. The evolution of both techniques - open debranching of the arch and endovascular treatment of the descending aorta - may extend less invasive endovascular treatment to more extensive disease requiring a proximal landing zone in the arch. Objective: To evaluate descending thoracic aortic remodeling by means of volumetric analysis after a hybrid approach of aortic arch debranching and stenting of the descending aorta. Methods: Retrospective review of seven consecutive patients treated between September 2014 and August 2016 for diseases of the proximal descending aorta (aneurysms and dissections) by a hybrid approach to deliver the endograft at zone 1. Computed tomography angiography scans were analyzed using dedicated software to calculate descending thoracic aorta volumes pre- and postoperatively. Results: Follow-up was completed in 100% of patients with a median time of 321 days (range, 41-625 days). No deaths or permanent neurological complications were observed. There were no endoleaks or stent migrations. Freedom from reintervention was 100% at 300 days and 66% at 600 days. Median volume reduction was 45.5 cm3, representing a median volume shrinkage of 9.3%. Conclusion: The hybrid approach to arch and descending thoracic aorta diseases is feasible and leads to favorable aortic remodeling with significant volume reduction.

  2. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM allows the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure is often identified only by arrhythmic events and not by impedance abnormalities. The aim of this study was to compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals were followed by the RM center at Okayama University Hospital, and all transmitted data were analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, and not by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events occur. However, CIEDs often record lead failure merely as arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  3. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in particular situations such as screening, in situ measurement, process monitoring, and hostile environments. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high-level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element. Another is to use elemental emission intensities relative to the intensity of an internal standard element whose concentration in the specimen is already known. However, these methods are not applicable to unknown samples. In the present work, we introduce a new approach to LIBS quantitative analysis that uses the Hα (656.28 nm) emission line as an external standard.
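
    A minimal sketch of the two calibration strategies contrasted above (intensity ratio to an internal standard of known concentration versus ratio to a reference emission line used as an external standard). All intensities, sensitivity factors, and function names below are invented for illustration; they are not values from the cited work.

```python
# Hedged sketch of the two LIBS calibration strategies described above;
# all numbers and sensitivity factors are invented.

def conc_internal_standard(i_analyte, i_standard, c_standard, k_rel=1.0):
    """Analyte concentration from the intensity ratio to an internal standard
    element whose concentration c_standard in the specimen is already known.
    k_rel is the relative sensitivity factor of the two lines (from calibration)."""
    return k_rel * (i_analyte / i_standard) * c_standard

def conc_external_line(i_analyte, i_reference, k_cal):
    """Analyte concentration from the ratio to a reference line measured in the
    same spectrum (e.g. H-alpha at 656.28 nm), with k_cal obtained from a
    separate calibration."""
    return k_cal * (i_analyte / i_reference)

# Invented example intensities (counts) and calibration constants
print(conc_internal_standard(10_500, 42_000, c_standard=2.0, k_rel=1.3))  # wt%
print(conc_external_line(10_500, 88_000, k_cal=5.1))                      # wt%
```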

  4. Integrated Risk-Capability Analysis under Deep Uncertainty : An ESDMA Approach

    NARCIS (Netherlands)

    Pruyt, E.; Kwakkel, J.H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations

  5. Interactions of lead with sediments and meiofauna

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, D. (Queens Univ., Belfast); Maguire, C.

    1976-11-01

Harpacticoid copepods and Turbellaria appear to be the most sensitive faunal groups in surface sand meiofauna when subjected to contamination by lead; in subsurface sand, nematodes are found to be the most sensitive group. Simple laboratory attempts to assess lead partitioning in littoral sand gave variable results, and the problems and merits of such experimental approaches are discussed.

  6. LeadMine: a grammar and dictionary driven approach to entity recognition

    Science.gov (United States)

    2015-01-01

    Background Chemical entity recognition has traditionally been performed by machine learning approaches. Here we describe an approach using grammars and dictionaries. This approach has the advantage that the entities found can be directly related to a given grammar or dictionary, which allows the type of an entity to be known and, if an entity is misannotated, indicates which resource should be corrected. As recognition is driven by what is expected, if spelling errors occur, they can be corrected. Correcting such errors is highly useful when attempting to lookup an entity in a database or, in the case of chemical names, converting them to structures. Results Our system uses a mixture of expertly curated grammars and dictionaries, as well as dictionaries automatically derived from public resources. We show that the heuristics developed to filter our dictionary of trivial chemical names (from PubChem) yields a better performing dictionary than the previously published Jochem dictionary. Our final system performs post-processing steps to modify the boundaries of entities and to detect abbreviations. These steps are shown to significantly improve performance (2.6% and 4.0% F1-score respectively). Our complete system, with incremental post-BioCreative workshop improvements, achieves 89.9% precision and 85.4% recall (87.6% F1-score) on the CHEMDNER test set. Conclusions Grammar and dictionary approaches can produce results at least as good as the current state of the art in machine learning approaches. While machine learning approaches are commonly thought of as "black box" systems, our approach directly links the output entities to the input dictionaries and grammars. Our approach also allows correction of errors in detected entities, which can assist with entity resolution. PMID:25810776
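
    The dictionary-driven part of this approach can be illustrated with a toy longest-match scanner. This is only a sketch of the general idea, not the LeadMine implementation, and the tiny dictionary below is invented; the real system layers curated grammars, far larger dictionaries, spelling correction, and boundary/abbreviation post-processing on top of this kind of lookup.

```python
# Toy dictionary-driven entity recognition: scan the text for the longest
# case-insensitive match from a (tiny, invented) chemical dictionary.
chemical_dict = {"acetic acid", "acetone", "lead", "lead(ii) nitrate", "edta"}
max_len = max(len(term.split()) for term in chemical_dict)

def find_entities(text):
    tokens = text.split()
    hits, i = [], 0
    while i < len(tokens):
        match = None
        for span in range(min(max_len, len(tokens) - i), 0, -1):  # longest first
            candidate = " ".join(tokens[i:i + span]).lower().strip(".,;")
            if candidate in chemical_dict:
                match = (candidate, i, i + span)
                break
        if match:
            hits.append(match)
            i = match[2]          # resume scanning after the matched span
        else:
            i += 1
    return hits

print(find_entities("Soils were extracted with EDTA after lead(II) nitrate spiking."))
```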

  7. Maritime Load Dependent Lead Times - An Analysis

    DEFF Research Database (Denmark)

    Pahl, Julia; Voss, Stefan

    2017-01-01

in production. Inspired by supply chain planning systems, we analyze the current state of (collaborative) planning in the maritime transport chain with a focus on containers. Regarding the problem of congestion, we place particular emphasis on load-dependent lead times (LDLT), which are well studied in production....

  8. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

Dynamic reliability measures the reliability of an engineered system considering time-variant operating conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on the developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy
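
    The surrogate-plus-simulation idea at the core of DLAS (without the adaptive sampling loops or the confidence measure) can be sketched as follows: fit a Gaussian process to a small number of expensive evaluations of the extreme response over the time horizon, then run Monte Carlo simulation on the cheap surrogate. The toy response function, threshold, and sample sizes below are assumptions for illustration only.

```python
# Hedged sketch of the surrogate-based idea behind DLAS: a GP approximates the
# extreme response over time, and MCS on the surrogate estimates reliability.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def extreme_response(x):
    """Placeholder 'expensive' model: worst response over the time horizon."""
    t = np.linspace(0.0, 1.0, 50)
    return np.max(np.sin(3 * x) + 0.5 * x * t)   # toy time-variant response

# Small design of experiments (the expensive evaluations)
X_train = rng.uniform(-2, 2, size=(20, 1))
y_train = np.array([extreme_response(x[0]) for x in X_train])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_train, y_train)

# Monte Carlo simulation on the cheap surrogate
X_mcs = rng.normal(0.0, 1.0, size=(100_000, 1))
y_hat = gp.predict(X_mcs)
threshold = 1.5                      # assumed failure threshold
pf = np.mean(y_hat > threshold)      # probability of failure over the horizon
print(f"Estimated dynamic failure probability: {pf:.4f}")
```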

  9. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  10. Introduction to Safety Analysis Approach for Research Reactors

    International Nuclear Information System (INIS)

    Park, Suki

    2016-01-01

Research reactors vary widely in terms of thermal power, coolant, moderator, reflector, fuel, reactor tanks and pools, flow direction in the core, and the operating pressure and temperature of the cooling system. Around 110 research reactors have a thermal power greater than 1 MW. This paper introduces a general approach to safety analysis for research reactors and reports experience with the safety analysis of a 10 MW open-pool, open-tank research reactor with downward flow in the reactor core during normal operation. The general approach to safety analysis for research reactors is described and the design features of a typical open-pool, open-tank type reactor are discussed. The representative events expected in research reactors are investigated, and the reactor responses and thermal-hydraulic behavior for these events are presented and discussed. From the calculated minimum CHFR and maximum fuel temperature, it is ensured that the fuel is not damaged by a step reactivity insertion of 1.8 mk or by the failure of all primary pumps for a reactor with 10 MW thermal power and downward core flow

  11. Temporal stability of blood lead concentrations in adults exposed only to environmental lead

    Energy Technology Data Exchange (ETDEWEB)

    Delves, H T; Sherlock, J C; Quinn, M J

    1984-08-01

The temporal stability of blood lead concentrations in 21 healthy adults (14 men and 7 women) exposed only to environmental lead was assessed by analysis of 253 blood specimens collected serially over periods of 7 to 11 months. The women had lower blood lead concentrations (mean 8.5, range 7.4-10.8 micrograms/100 ml) than the men (mean 12.2, range 8.6-15.8 micrograms/100 ml). These are within the expected ranges for non-occupationally exposed persons. Blood lead concentrations in the serial specimens from both men and women changed very little over the study period, with standard deviations of less than 0.5 micrograms/100 ml for the majority of individual mean concentrations; for all but a few subjects the standard deviations were less than 0.8 micrograms/100 ml. Two subjects showed significant changes in blood lead concentrations during the study, and a temporary increase in oral lead intake was identified for one of them. In the absence of substantial changes in lead exposure, blood lead levels in adults are remarkably stable, and for environmental monitoring a single blood lead concentration is an excellent biological indicator.

  12. Biological characterization of lead-enhanced exopolysaccharide produced by a lead resistant Enterobacter cloacae strain P2B.

    Science.gov (United States)

    Naik, Milind Mohan; Pandey, Anju; Dubey, Santosh Kumar

    2012-09-01

A lead-resistant bacterial strain isolated from the effluent of a lead battery manufacturing company in Goa, India, has been identified as Enterobacter cloacae strain P2B based on morphological and biochemical characters, FAME profile and 16S rDNA sequence data. This bacterial strain could resist lead nitrate up to 1.6 mM. A significant increase in exopolysaccharide (EPS) production was observed, from 28 to 108 mg/L dry weight, when the strain was exposed to 1.6 mM lead nitrate in Tris-buffered minimal medium. Fourier-transform infrared spectroscopy of this EPS revealed the presence of several functional groups involved in metal binding, viz. carboxyl, hydroxyl and amide groups, along with glucuronic acid. Gas chromatography coupled with mass spectrometry analysis of alditol-acetate derivatives of acid-hydrolysed EPS produced in the presence of 1.6 mM lead nitrate demonstrated the presence of several neutral sugars such as rhamnose, arabinose, xylose, mannose, galactose and glucose, which contribute lead-binding hydroxyl groups. Scanning electron microscopy coupled with energy-dispersive X-ray spectrometric analysis of this lead-resistant strain exposed to 1.6 mM lead nitrate revealed mucous EPS surrounding the bacterial cells, which sequestered 17% lead (as weight %) extracellularly and protected the cells from the toxic effects of lead. This lead-resistant strain also showed multidrug resistance. These results contribute to a better understanding of the structure, function and environmental application of lead-enhanced EPSs produced by bacteria. This lead-enhanced biopolymer can play an important role in the bioremediation of several heavy metals, including lead.

  13. Determination of lead isotopic composition of airborne particulate matter by ICPMS: implications for lead atmospheric emissions in Canada

    International Nuclear Information System (INIS)

    Celo, V.; Dabek-Zlotorzynska, E.

    2009-01-01

Full text: Quadrupole ICPMS was used for determination of trace metal concentrations and lead isotopic composition in fine particulate matter (PM2.5) collected at selected sites within the Canadian National Air Pollution Surveillance network from February 2005 to February 2007. High enrichment factors indicated that lead is mostly of anthropogenic origin and, consequently, the lead isotopic composition is directly related to that of pollution sources. The 206Pb/207Pb and 208Pb/207Pb ratios were measured and the results were compared to the isotopic signatures of lead from different sources. Various approaches were used to assess the impact of relevant sources and the meteorological conditions on the occurrence and distribution of lead in Canadian atmospheric aerosols. (author)

  14. Test and Analysis Correlation of Foam Impact onto Space Shuttle Wing Leading Edge RCC Panel 8

    Science.gov (United States)

    Fasanella, Edwin L.; Lyle, Karen H.; Gabrys, Jonathan; Melis, Matthew; Carney, Kelly

    2004-01-01

    Soon after the Columbia Accident Investigation Board (CAIB) began their study of the space shuttle Columbia accident, "physics-based" analyses using LS-DYNA were applied to characterize the expected damage to the Reinforced Carbon-Carbon (RCC) leading edge from high-speed foam impacts. Forensic evidence quickly led CAIB investigators to concentrate on the left wing leading edge RCC panels. This paper will concentrate on the test of the left-wing RCC panel 8 conducted at Southwest Research Institute (SwRI) and the correlation with an LS-DYNA analysis. The successful correlation of the LS-DYNA model has resulted in the use of LS-DYNA as a predictive tool for characterizing the threshold of damage for impacts of various debris such as foam, ice, and ablators onto the RCC leading edge for shuttle return-to-flight.

  15. Investigation of the Study Characteristics Affecting Clinical Trial Quality Using the Protocol Deviations Leading to Exclusion of Subjects From the Per Protocol Set Data in Studies for New Drug Application: A Retrospective Analysis.

    Science.gov (United States)

    Kohara, Norihito; Kaneko, Masayuki; Narukawa, Mamoru

    2018-01-01

The concept of the risk-based approach has been introduced as an effort to secure the quality of clinical trials. In the risk-based approach, identification and evaluation of risk in advance are considered important. For recently completed clinical trials, we investigated the relationship between study characteristics and protocol deviations leading to the exclusion of subjects from the Per Protocol Set (PPS) efficacy analysis. New drugs approved in Japan in fiscal years 2014-2015 were targeted in the research. The reasons for excluding subjects from the PPS efficacy analysis were described in 102 trials out of 492 in the summaries of new drug application documents, which are publicly disclosed after a drug's regulatory approval. The author extracted these reasons along with the numbers of cases and the study characteristics of each clinical trial. Direct comparison, univariate regression analysis, and multivariate regression analysis were then carried out based on the exclusion rate. The study characteristics for which exclusion of subjects from the PPS efficacy analysis was frequently observed were multiregional clinical trials (study region); inhalant and external use (administration route); and Anti-infectives for systemic use, Respiratory system, Dermatologicals, and Nervous system (therapeutic drug under the Anatomical Therapeutic Chemical Classification). In the multivariate regression analysis, the clinical trial variables of inhalant, Respiratory system, or Dermatologicals were selected as study characteristics leading to a higher exclusion rate. The characteristics of clinical trials that are likely to cause protocol deviations affecting the efficacy analysis were thus suggested. Such studies should be considered for specific attention and priority observation in the trial protocol or its monitoring plan and execution, such as a clear description of inclusion/exclusion criteria in the protocol, development of training materials to site staff, and

  16. Error propagation in spatial modeling of public health data: a simulation approach using pediatric blood lead level data for Syracuse, New York.

    Science.gov (United States)

    Lee, Monghyeon; Chun, Yongwan; Griffith, Daniel A

    2018-04-01

    Lead poisoning produces serious health problems, which are worse when a victim is younger. The US government and society have tried to prevent lead poisoning, especially since the 1970s; however, lead exposure remains prevalent. Lead poisoning analyses frequently use georeferenced blood lead level data. Like other types of data, these spatial data may contain uncertainties, such as location and attribute measurement errors, which can propagate to analysis results. For this paper, simulation experiments are employed to investigate how selected uncertainties impact regression analyses of blood lead level data in Syracuse, New York. In these simulations, location error and attribute measurement error, as well as a combination of these two errors, are embedded into the original data, and then these data are aggregated into census block group and census tract polygons. These aggregated data are analyzed with regression techniques, and comparisons are reported between the regression coefficients and their standard errors for the error added simulation results and the original results. To account for spatial autocorrelation, the eigenvector spatial filtering method and spatial autoregressive specifications are utilized with linear and generalized linear models. Our findings confirm that location error has more of an impact on the differences than does attribute measurement error, and show that the combined error leads to the greatest deviations. Location error simulation results show that smaller administrative units experience more of a location error impact, and, interestingly, coefficients and standard errors deviate more from their true values for a variable with a low level of spatial autocorrelation. These results imply that uncertainty, especially location error, has a considerable impact on the reliability of spatial analysis results for public health data, and that the level of spatial autocorrelation in a variable also has an impact on modeling results.
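
    The simulation design described above can be sketched in a few lines: embed attribute measurement error and (crudely) location error into synthetic blood lead data, aggregate to areal units, and compare the fitted coefficient against the error-free benchmark. All variable names, error magnitudes, and the single covariate are invented; the study's actual models also include spatial-autocorrelation terms (eigenvector spatial filtering, spatial autoregressive specifications) that are omitted here.

```python
# Hedged, simplified sketch of the error-propagation experiment described above.
import numpy as np

rng = np.random.default_rng(42)
n_children, n_units = 5000, 50

unit = rng.integers(0, n_units, n_children)        # true areal unit of each child
poverty = rng.uniform(0, 1, n_units)               # invented unit-level covariate
true_bll = 2.0 + 3.0 * poverty[unit] + rng.normal(0, 1.0, n_children)

def fit_slope(bll, unit_ids):
    """Aggregate to unit means and regress mean BLL on the covariate (OLS)."""
    y = np.array([bll[unit_ids == u].mean() for u in range(n_units)])
    X = np.column_stack([np.ones(n_units), poverty])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# Error-free benchmark
print("true-data slope:      ", round(fit_slope(true_bll, unit), 3))

# Attribute measurement error: noisy BLL readings
noisy_bll = true_bll + rng.normal(0, 1.0, n_children)
print("attribute-error slope:", round(fit_slope(noisy_bll, unit), 3))

# Location error: a fraction of records geocoded to the wrong (random) unit
wrong = rng.random(n_children) < 0.2
unit_err = unit.copy()
unit_err[wrong] = rng.integers(0, n_units, wrong.sum())
print("location-error slope: ", round(fit_slope(true_bll, unit_err), 3))
```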

  17. Lead isotopes in soils near five historic American lead smelters and refineries

    International Nuclear Information System (INIS)

    Rabinowitz, Michael B.

    2005-01-01

    This survey of soil lead in the vicinity of old industrial sites examines how the stable isotope patterns vary among the sites according to the sources of the lead ore processed at each site. Lead smelters and refineries, which closed down decades ago, are the basis of this investigation. Samples were taken from near five old factory sites in Collinsville and Alton (Illinois), Ponderay (Idaho), East Chicago (Indiana) and Omaha (Nebraska). Historical records were searched for accounts of the sources of the lead. Lead concentrations were measured by atomic absorption flame spectrophotometry, and stable isotopic analysis was done by plasma ionization mass spectrometry. At every site visited, remnants of the old factories, in terms of soil lead pollution, could be found. In spite of potential complications of varying smelter feedstock sourced from mines of different geological age, it was possible to match the isotopic patterns in the soils with the documented sources of the ores. The Collinsville and Alton sites resembled Missouri lead. The Ponderay value was higher than major Bunker Hill, Idaho deposits, but closer to the minor, nearby Oreille County, Washington ores. Mostly Utah ore was used in East Chicago. The Omaha soil reflects lead from Mexico, Colorado and Montana

  18. Identification and comparison of lead white in 15th-century Gdansk panel paintings by neutron activation analysis

    International Nuclear Information System (INIS)

    Olszewska-Swietlik, J.; Panczyk, E.

    2005-01-01

Lead white has been used in painting since the Middle Ages, both as a priming ground and as a pigment. The purity of lead white is directly connected with the lead purification methods, which have undergone considerable changes throughout the centuries; marked progress in this respect was made in the 19th century. That is why determination of such elements as Ag, Hg, Zn, Cu, Co, Cr, Ba and Sb in lead white gives reliable information regarding the age of the painting in question. Analyses of samples of lead white taken from genuine 15th-century panel paintings representing the so-called Pomeranian school were carried out using the instrumental neutron activation analysis technique. A total of 32 elements were determined in these samples, and a comparison with the data for the Malopolska and Silesian schools was also performed. (author)

  19. Association of umbilical cord blood lead with neonatal behavior at varying levels of exposure

    Directory of Open Access Journals (Sweden)

    Mamtani Manju R

    2006-06-01

    Full Text Available Abstract Background In the light of the ongoing debate about lowering the cut-off for acceptable blood lead level to Methods Using Brazelton's Neonatal Behavioral Assessment Scale (NBAS, an epidemiological approach and robust statistical techniques like multivariate linear regression, logistic regression, Poisson regression and structural equations modeling analyses we estimated the simultaneous indirect effects of umbilical cord blood lead (CBL levels and other neonatal covariates on the NBAS clusters. Results We observed that when analyzed in all study subjects, the CBL levels independently and strongly influenced autonomic stability and abnormal reflexes clusters. However, when the analysis was restricted to neonates with CBL Conclusion Our results further endorse the need to be cognizant of the detrimental effects of blood lead on neonates even at a low-dose prenatal exposure.

  20. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

Full Text Available The aim of the present paper is to introduce how to analyse qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work and may be applied in various domains such as emergency services, military, transport, sport or industry. Researchers can make two types of methodological adaptation: within-method adaptations modify the way the interviews are conducted, and cross-method adaptations combine this method with other related methods. There are many descriptions of how to conduct the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches such as content analysis, grounded theory or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First, a decision chart showing the main decision points is drawn up and then an incident summary is made. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs. It

  1. Hybrid input-output approach to metal production and its application to the introduction of lead-free solders.

    Science.gov (United States)

    Nakamura, Shinichiro; Murakami, Shinsuke; Nakajima, Kenichi; Nagasaka, Tetsuya

    2008-05-15

The production process of metals such as copper, lead, and zinc is characterized by mutual interconnections and interdependence, as well as by the occurrence of a large number of byproducts, which include precious or rare metals such as gold, silver, bismuth, and indium. On the basis of the waste input-output (WIO) framework, we present a hybrid IO model that takes full account of the mutual interdependence among the metal production processes and of the interdependence between them and all the other production sectors of the economy. The combination of a comprehensive representation of the whole national economy and the introduction of process knowledge of metal production allows for a detailed analysis of different materials-use scenarios while taking full supply chain effects into account. For illustration, a hypothetical case study of the introduction of lead-free solder involving the production of silver as a byproduct of copper and lead smelting processes was developed and implemented using Japanese data. To meet the increased demand for the recovery and recycling of silver resources from end-of-life products, the final destination of metallic silver in terms of products and user categories was estimated, and the target components with the highest silver concentration were identified.
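
    The input-output backbone of such a hybrid model rests on the Leontief relation x = (I − A)⁻¹ y, where A holds the technical coefficients and y the final demand. The three-sector coefficient matrix, demand vector, and scenario shift below are invented purely to show the mechanics; the WIO model adds waste flows and process-level detail on top of this.

```python
# Minimal numerical sketch of the input-output core: x = A x + y, so
# x = (I - A)^{-1} y.  Sectors and numbers are invented for illustration.
import numpy as np

# Technical coefficients: column j gives inputs per unit output of sector j
# (assumed sectors: copper smelting, lead smelting, "rest of economy")
A = np.array([
    [0.05, 0.02, 0.01],
    [0.03, 0.06, 0.01],
    [0.20, 0.25, 0.30],
])
y = np.array([10.0, 5.0, 100.0])           # final demand per sector

x = np.linalg.solve(np.eye(3) - A, y)      # total sector outputs required
print("total sector outputs:", np.round(x, 2))

# A scenario analysis (e.g. switching to lead-free solder) amounts to
# changing A and/or y and recomputing the outputs.
y_scenario = y * np.array([1.10, 0.80, 1.00])   # assumed demand shift
x_scenario = np.linalg.solve(np.eye(3) - A, y_scenario)
print("scenario outputs:    ", np.round(x_scenario, 2))
```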

  2. STUDY OF INTERACTION BETWEEN LEAD AND GASTRIC MUCOSAL PROTEIN OF RATS WITH FORENSIC TOXICOLOGY APPROACH

    Directory of Open Access Journals (Sweden)

    Iwan Aflanie

    2017-09-01

Full Text Available Abstract: Recently, forensic toxicology has attracted considerable interest, especially in exposing phenomena associated with the law. Using a forensic toxicology approach, several cases of lead (Pb) poisoning have been revealed. The present study investigates the interaction between Pb and the amino acids of gastric mucosal proteins, especially cysteine and tyrosine residues. This research is a pure experimental study with a posttest control group design, divided into 4 groups with 6 rats (Rattus norvegicus) in each group. The treatment in each group was as follows: P0 was the control group given 2 ml of distilled water; P1 = administration of Pb 0.1 g/L; P2 = administration of Pb 1 mg/L; and P3 = administration of Pb 10 g/L, each for 4 weeks. According to the results, it can be concluded that the Pb-protein interaction tends toward Pb-cysteine binding rather than Pb-tyrosine binding

  3. Remediation of lead-contaminated soils

    International Nuclear Information System (INIS)

    Peters, R.W.; Shem, L.

    1992-01-01

Excavation and transport of soil contaminated with heavy metals have generally been the standard remediation technique for treatment of heavy-metal-contaminated soils. This approach is not a permanent solution; moreover, off-site shipment and disposal of contaminated soil involve high expense, liability, and appropriate regulatory approval. Recently, a number of other techniques have been investigated for treating such contaminated sites, including flotation, solidification/stabilization, vitrification, and chemical extraction. This paper reports the results of a laboratory investigation determining the efficiency of using chelating agents to extract lead from contaminated soils. Lead concentrations in the soils ranged from 500 to 10,000 mg/kg. Ethylenediaminetetraacetic acid (EDTA) and nitrilotriacetic acid (NTA) were examined for their potential extractive capabilities. Concentrations of the chelating agents ranged from 0.01 to 0.10 M. The pH of the suspensions in which the extractions were performed ranged from 4 to 12. Results showed that the removal of lead using NTA and water was pH-dependent, whereas the removal of lead using EDTA was pH-insensitive. Maximum removals of lead were 68.7%, 19.1%, and 7.3% using EDTA, NTA, and water, respectively (as compared with initial lead concentrations)

  4. Metabolic pathway analysis using a Nash equilibrium approach

    NARCIS (Netherlands)

    Lucia, Angelo; DiMaggio, Peter A.; Alonso-Martinez, Diego

    2018-01-01

    A novel approach to metabolic network analysis using a Nash Equilibrium (NE) formulation is proposed in which enzymes are considered players in a multi-player game. Each player has its own payoff function with the objective of minimizing the Gibbs free energy associated with the biochemical

  5. A Disciplined Architectural Approach to Scaling Data Analysis for Massive, Scientific Data

    Science.gov (United States)

    Crichton, D. J.; Braverman, A. J.; Cinquini, L.; Turmon, M.; Lee, H.; Law, E.

    2014-12-01

Data collections across remote sensing and ground-based instruments in astronomy, Earth science, and planetary science are outpacing scientists' ability to analyze them. Furthermore, the distribution, structure, and heterogeneity of the measurements themselves pose challenges that limit the scalability of data analysis using traditional approaches. Methods for developing science data processing pipelines, distributing scientific datasets, and performing analysis will require innovative approaches that integrate cyber-infrastructure, algorithms, and data into more systematic approaches that can more efficiently compute and reduce data, particularly distributed data. This requires the integration of computer science, machine learning, statistics and domain expertise to identify scalable architectures for data analysis. The size of data returned from Earth Science observing satellites and the magnitude of data from climate model output are predicted to grow into the tens of petabytes, challenging current data analysis paradigms. This same kind of growth is present in astronomy and planetary science data. One of the major challenges in data science and related disciplines is defining new approaches to scaling systems and analysis in order to increase scientific productivity and yield. Specific needs include: 1) identification of optimized system architectures for analyzing massive, distributed data sets; 2) algorithms for systematic analysis of massive data sets in distributed environments; and 3) the development of software infrastructures that are capable of performing massive, distributed data analysis across a comprehensive data science framework. NASA/JPL has begun an initiative in data science to address these challenges. Our goal is to evaluate how scientific productivity can be improved through optimized architectural topologies that identify how to deploy and manage the access, distribution, computation, and reduction of massive, distributed data, while

  6. Fast Potentiometric Analysis of Lead in Aqueous Medium under Competitive Conditions Using an Acridono-Crown Ether Neutral Ionophore

    Directory of Open Access Journals (Sweden)

    Ádám Golcs

    2018-05-01

Full Text Available Lead is a particularly toxic heavy metal that is present above acceptable levels in the water of many countries. This article describes a rapid potentiometric detection method for lead(II) ions using a polyvinyl chloride (PVC)-based ion-selective membrane electrode containing an acridono-crown ether ionophore. The electrochemical cell exhibits a Nernstian response for lead(II) ions in the concentration range of 10−4 to 10−2 M and can be used in the pH range of 4–7. The applicability of the sensor was verified by measuring a multicomponent aqueous sample. Under the given conditions, this electrode is suitable for the selective quantitative analysis of lead(II) ions in the presence of many additional metal ions.
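
    For a divalent ion such as Pb(II), an ideal Nernstian response corresponds to a slope of 2.303RT/(2F) ≈ 29.6 mV per decade of activity at 25 °C, which is what a calibration over the 10−4 to 10−2 M range should recover. The sketch below checks this arithmetic on synthetic calibration points; the standard cell potential is an assumed value, not data from the cited electrode.

```python
# Hedged sketch of the Nernstian-slope check for a Pb(II)-selective electrode;
# the standard potential and calibration points are invented.
import numpy as np

R, F, T = 8.314, 96485.0, 298.15
z = 2                                        # charge of Pb2+
nernst_slope = 2.303 * R * T / (z * F)       # volts per decade, ~0.0296 V

conc = np.logspace(-4, -2, 5)                # 1e-4 to 1e-2 M calibration points
E0 = 0.250                                   # assumed standard cell potential (V)
emf = E0 + nernst_slope * np.log10(conc)     # ideal electrode response

# Fit the calibration line and report the slope in mV per decade
slope, intercept = np.polyfit(np.log10(conc), emf, 1)
print(f"theoretical slope: {nernst_slope * 1e3:.1f} mV/decade")
print(f"fitted slope:      {slope * 1e3:.1f} mV/decade")
```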

  7. Pressurizer/Auxiliary Spray Piping Stress Analysis For Determination Of Lead Shielding Maximum Allowable Load

    International Nuclear Information System (INIS)

    Setjo, Renaningsih

    2000-01-01

Piping stress analysis has been carried out for the PZR/Auxiliary Spray Lines of Nuclear Power Plant AV Unit I (PWR type). The purpose of this analysis is to establish the maximum allowable load permitted when lead shielding is placed on the class 1 piping of the Pressurizer/Auxiliary Spray Lines (PZR/Aux.) of Reactor Coolant Loops 1 and 4 for NPP AV Unit one in modes 5 and 6 during an outage. The analysis is intended to reduce the radiation dose to the operator during the ISI (In-Service Inspection) period. The results show that the maximum allowable load for the 4-inch PZR/Auxiliary Spray Lines is 123 lbs/foot

  8. Study of the speciation of lead and zinc in industrial dusts and slags and in a contaminated soil: a spectroscopic approach

    International Nuclear Information System (INIS)

    Sobanska, Sophie

    1999-01-01

As the study of the physicochemical forms of metals in polluted soils is necessary to understand their mobilisation, and therefore to assess the risk they represent for the environment, the objective of this research thesis is to determine the speciation of lead and zinc in a soil contaminated by particles (dust and slag) released by a lead production plant. This determination is performed using a spectroscopic approach: optical microscopy, X-ray diffraction, scanning electron microscopy, transmission electron microscopy, electron microprobe analysis, and Raman micro-spectrometry. In order to understand the evolution of metal speciation and of metal propagation in soils, dust and slag produced by the industrial process have been sampled and morphologically characterized. Associations of metals with other compounds such as iron oxides and carbonates have been highlighted. The author shows that contact with the ground results in greater alteration of the particles and in metal mobilisation. She reports the study of lead and zinc localisation in various particles, and of the influence of changes in soil physicochemical conditions (pH decrease, reduction caused by soil clogging during humid periods) [fr]

  9. Anemia risk in relation to lead exposure in lead-related manufacturing

    Directory of Open Access Journals (Sweden)

    Nan-Hung Hsieh

    2017-05-01

Full Text Available Abstract Background Lead-exposed workers may suffer adverse health effects under the currently regulated blood lead (BPb) levels. However, a probabilistic assessment of lead exposure-associated anemia risk is lacking. The goal of this study was to examine the association between lead exposure and anemia risk among factory workers in Taiwan. Methods We first collated BPb and hematopoietic function indicator data from health examination records covering 533 male and 218 female lead-exposed workers between 2012 and 2014. We used benchmark dose (BMD) modeling to estimate the critical effect doses for detection of abnormal indicators. A risk-based probabilistic model was used to characterize the potential hazard of lead poisoning for job-specific workers by means of a hazard index (HI). We applied Bayesian decision analysis to determine whether the BMD could serve as a suitable BPb standard. Results Our results indicated that the HI for all lead-exposed workers was 0.78 (95% confidence interval: 0.50–1.26) with a risk occurrence probability of 11.1%. The risk of abnormal anemia indicators for male and female workers could be reduced by 67–77% and 86–95%, respectively, by adopting the suggested BPb standards of 25 and 15 μg/dL. Conclusions We conclude that cumulative exposure to lead in the workplace was significantly associated with anemia risk. This study suggests that the current BPb standard needs to be better understood for the protection of lead-exposed populations in different scenarios and to provide a novel standard for health management. Low-level lead exposure risk is an occupational and public health problem that deserves more attention.
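
    The hazard-index logic used above can be sketched as HI = exposure / benchmark dose, with the "risk occurrence probability" taken as P(HI > 1) under an assumed exposure distribution. The lognormal blood lead parameters and benchmark value below are invented for illustration and are not the study's estimates.

```python
# Hedged sketch of a probabilistic hazard-index calculation; all parameters
# are assumed values, not the study's data.
import numpy as np

rng = np.random.default_rng(1)

bmd = 25.0                                   # assumed benchmark blood lead (ug/dL)
# Assumed worker blood-lead distribution (geometric mean 12 ug/dL, GSD 1.8)
bpb = rng.lognormal(mean=np.log(12.0), sigma=np.log(1.8), size=100_000)

hi = bpb / bmd                               # hazard index per simulated worker
print("median hazard index:        ", round(np.median(hi), 2))
print("P(HI > 1), i.e. risk prob.: ", round(np.mean(hi > 1.0), 3))
```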

  10. Estimation of age in forensic medicine using multivariate approach to image analysis

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey V.; Belyaev, Ivan; Fominykh, Sergey

    2009-01-01

    approach based on statistical analysis of grey-level co-occurrence matrix, fractal analysis, wavelet transformation and Angle Measure Technique. Projection on latent structures regression was chosen for calibration and prediction. The method has been applied to 70 male and 63 female individuals aged from...... 21 to 93 and results were compared with traditional approach. Some important questions and problems have been raised....
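
    A hedged sketch of a texture-feature-plus-PLS pipeline of the kind described above: grey-level co-occurrence matrix (GLCM) statistics are extracted from images and regressed on age with projection on latent structures (PLS). The images here are simulated noise whose coarseness drifts with age, so the numbers mean nothing; only the shape of the pipeline is illustrative.

```python
# Toy GLCM-texture + PLS regression pipeline on simulated images.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # skimage >= 0.19 spelling
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

def glcm_features(img):
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy", "correlation")])

# Simulated "images" whose texture coarseness drifts with age
ages = rng.uniform(21, 93, 60)
X = []
for age in ages:
    base = rng.integers(0, 64, size=(64, 64))
    img = (base // max(1, int(age / 15))).astype(np.uint8)   # age-dependent texture
    X.append(glcm_features(img))
X = np.array(X)

pls = PLSRegression(n_components=3)
pls.fit(X, ages)
pred = pls.predict(X).ravel()
print("mean absolute error (resubstitution):", round(np.abs(pred - ages).mean(), 1))
```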

  11. Development of a novel kinetic model for the analysis of PAH biodegradation in the presence of lead and cadmium co-contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Deary, Michael E., E-mail: michael.deary@northumbria.ac.uk [Department of Geography,Faculty of Engineering and Environment, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom); Ekumankama, Chinedu C. [Department of Geography,Faculty of Engineering and Environment, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom); Cummings, Stephen P. [Faculty of Health and Life Sciences, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom)

    2016-04-15

    Highlights: • 40 week study of the biodegradation of 16 US EPA priority PAHs in a soil with high organic matter. • Effects of cadmium, lead and mercury co-contaminants studied. • Novel kinetic approach developed. • Biodegradation of lower molecular weight PAHs relatively unaffected by Cd or Pb. • Soil organic matter plays a key role in the PAH removal mechanism. - Abstract: We report on the results of a 40 week study in which the biodegradation of 16 US EPA polycyclic aromatic hydrocarbons (PAHs) was followed in microcosms containing soil of high organic carbon content (11%) in the presence and absence of lead and cadmium co-contaminants. The total spiked PAH concentration was 2166 mg/kg. Mercury amendment was also made to give an abiotic control. A novel kinetic model has been developed to explain the observed biphasic nature of PAH degradation. The model assumes that PAHs are distributed across soil phases of varying degrees of bioaccessibility. The results of the analysis suggest that overall percentage PAH loss is dependent on the respective rates at which the PAHs (a) are biodegraded by soil microorganisms in pore water and bioaccessible soil phases and (b) migrate from bioaccessible to non-bioaccessible soil phases. In addition, migration of PAHs to non-bioaccessible and non-Soxhlet-extractable soil phases associated with the humin pores gives rise to an apparent removal process. The presence of metal co-contaminants shows a concentration dependent inhibition of the biological degradation processes that results in a reduction in overall degradation. Lead appears to have a marginally greater inhibitory effect than cadmium.
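
    A biphasic loss model of the general kind described above can be written as a two-compartment first-order decay, with a fast (bioaccessible) fraction degrading at rate k1 and a slow fraction at rate k2, and fitted by nonlinear least squares. The sketch below uses synthetic 40-week data; the parameter values are invented, and the published model additionally tracks migration into non-bioaccessible and non-extractable phases.

```python
# Hedged sketch of a biphasic (two-compartment) first-order kinetic fit;
# the "data" and parameters are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def biphasic(t, f_fast, k1, k2, c0=2166.0):
    """Total PAH remaining at time t (mg/kg) for an initial spike c0."""
    return c0 * (f_fast * np.exp(-k1 * t) + (1 - f_fast) * np.exp(-k2 * t))

# Synthetic 40-week observations with multiplicative noise
t_obs = np.arange(0, 41, 4, dtype=float)
rng = np.random.default_rng(3)
c_obs = biphasic(t_obs, 0.6, 0.25, 0.01) * rng.normal(1.0, 0.03, t_obs.size)

popt, _ = curve_fit(biphasic, t_obs, c_obs, p0=[0.5, 0.1, 0.005],
                    bounds=([0, 0, 0], [1, 5, 1]))
print("fitted fast fraction, k1, k2:", np.round(popt, 3))
```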

  12. A novel exploratory chemometric approach to environmental monitoring by combining block clustering with Partial Least Square (PLS) analysis

    Science.gov (United States)

    2013-01-01

Background Given the serious threats posed to terrestrial ecosystems by industrial contamination, environmental monitoring is a standard procedure used for assessing the current status of an environment or trends in environmental parameters. Measurement of metal concentrations at different trophic levels, followed by their statistical analysis using exploratory multivariate methods, can provide meaningful information on the status of environmental quality. In this context, the present paper proposes a novel chemometric approach to standard statistical methods by combining block clustering with Partial Least Square (PLS) analysis to investigate the accumulation patterns of metals in anthropized terrestrial ecosystems. The present study focused on copper, zinc, manganese, iron, cobalt, cadmium, nickel, and lead transfer along a soil-plant-snail food chain, and the hepatopancreas of the Roman snail (Helix pomatia) was used as a biological end-point of metal accumulation. Results Block clustering delineates between the areas exposed to industrial and vehicular contamination. The toxic metals have similar distributions in the nettle leaves and snail hepatopancreas. PLS analysis showed that (1) zinc and copper concentrations at the lower trophic levels are the most important latent factors that contribute to metal accumulation in land snails; (2) cadmium and lead are the main determinants of the pollution pattern in areas exposed to industrial contamination; (3) at sites located near roads, lead is the most threatening metal for terrestrial ecosystems. Conclusion There were three major benefits of applying block clustering with PLS for processing the obtained data: firstly, it helped in grouping sites depending on the type of contamination; secondly, it was valuable for identifying the latent factors that contribute the most to metal accumulation in land snails; finally, it optimized the number and type of data that are best for monitoring the status of metallic contamination in terrestrial ecosystems.

  13. A novel exploratory chemometric approach to environmental monitoring by combining block clustering with Partial Least Square (PLS) analysis.

    Science.gov (United States)

    Nica, Dragos V; Bordean, Despina Maria; Pet, Ioan; Pet, Elena; Alda, Simion; Gergen, Iosif

    2013-08-30

Given the serious threats posed to terrestrial ecosystems by industrial contamination, environmental monitoring is a standard procedure used for assessing the current status of an environment or trends in environmental parameters. Measurement of metal concentrations at different trophic levels, followed by their statistical analysis using exploratory multivariate methods, can provide meaningful information on the status of environmental quality. In this context, the present paper proposes a novel chemometric approach to standard statistical methods by combining block clustering with Partial Least Square (PLS) analysis to investigate the accumulation patterns of metals in anthropized terrestrial ecosystems. The present study focused on copper, zinc, manganese, iron, cobalt, cadmium, nickel, and lead transfer along a soil-plant-snail food chain, and the hepatopancreas of the Roman snail (Helix pomatia) was used as a biological end-point of metal accumulation. Block clustering delineates between the areas exposed to industrial and vehicular contamination. The toxic metals have similar distributions in the nettle leaves and snail hepatopancreas. PLS analysis showed that (1) zinc and copper concentrations at the lower trophic levels are the most important latent factors that contribute to metal accumulation in land snails; (2) cadmium and lead are the main determinants of the pollution pattern in areas exposed to industrial contamination; (3) at sites located near roads, lead is the most threatening metal for terrestrial ecosystems. There were three major benefits of applying block clustering with PLS for processing the obtained data: firstly, it helped in grouping sites depending on the type of contamination; secondly, it was valuable for identifying the latent factors that contribute the most to metal accumulation in land snails; finally, it optimized the number and type of data that are best for monitoring the status of metallic contamination in terrestrial ecosystems.
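
    The block-clustering step can be illustrated with spectral co-clustering of a sites-by-metals concentration matrix, which groups sites and metals simultaneously. This uses scikit-learn's SpectralCoclustering as a stand-in for the block-clustering algorithm actually used in the paper, and the data are simulated; the PLS step would then be run on the resulting groups.

```python
# Toy co-clustering of a simulated (sites x metals) concentration matrix,
# standing in for the block-clustering step described above.
import numpy as np
from sklearn.cluster import SpectralCoclustering

rng = np.random.default_rng(7)
metals = ["Cu", "Zn", "Mn", "Fe", "Co", "Cd", "Ni", "Pb"]

# 30 sites: first 15 "industrial" (Cd and Pb elevated), last 15 "roadside" (Pb elevated)
X = rng.gamma(2.0, 1.0, size=(30, len(metals)))
X[:15, [5, 7]] *= 5.0      # industrial sites: Cd and Pb elevated
X[15:, 7] *= 3.0           # roadside sites: Pb elevated

model = SpectralCoclustering(n_clusters=2, random_state=0).fit(X)
print("site clusters :", model.row_labels_)
print("metal clusters:", dict(zip(metals, model.column_labels_)))
```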

  14. Analytical model and stability analysis of the leading edge spar of a passively morphing ornithopter wing.

    Science.gov (United States)

    Wissa, Aimy; Calogero, Joseph; Wereley, Norman; Hubbard, James E; Frecker, Mary

    2015-10-26

This paper presents the stability analysis of the leading edge spar of a flapping wing unmanned air vehicle with a compliant spine inserted in it. The compliant spine is a mechanism designed to be flexible during the upstroke and stiff during the downstroke. Inserting a variable stiffness mechanism into the leading edge spar affects its structural stability. The model for the spar-spine system was formulated in terms of the well-known Mathieu equation, in which the compliant spine was modeled as a torsional spring with a sinusoidal stiffness function. Experimental data were used to validate the model, and results show agreement within 11%. The structural stability of the leading edge spar-spine system was determined analytically and graphically using a phase plane plot and Strutt diagrams. Lastly, a torsional viscous damper was added to the leading edge spar-spine model to investigate the effect of damping on stability. Results show that for the undamped case the leading edge spar-spine response was stable and bounded; however, areas of instability appeared for a range of spine upstroke and downstroke stiffnesses. Results also show that there exists a damping ratio between 0.2 and 0.5 for which the leading edge spar-spine system is stable for all values of spine upstroke and downstroke stiffnesses.
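
    For reference, the damped Mathieu form alluded to above can be written generically as

    \[ \frac{d^{2}\theta}{d\tau^{2}} + 2\zeta\,\frac{d\theta}{d\tau} + \bigl[\delta + \epsilon\cos(2\tau)\bigr]\theta = 0, \]

    where δ reflects the mean torsional stiffness of the spine, ε the amplitude of its sinusoidal (upstroke/downstroke) stiffness variation over the flapping cycle, and ζ the damping ratio; a Strutt diagram maps the stable and unstable regions in the (δ, ε) plane. This is the standard textbook parameterization of the damped Mathieu equation, not necessarily the paper's exact non-dimensionalization.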

  15. A new approach for elasto-plastic finite strain analysis of cantilever ...

    Indian Academy of Sciences (India)

A new approach for elasto-plastic finite strain analysis of cantilever beams subjected to uniform bending moment ... Curvature; deflection curve; cantilever beam; elasto-plastic analysis; tapered beam subjected to tip moment; ...

  16. The economic consequences of elevated body-lead burdens in urban children

    International Nuclear Information System (INIS)

    Agree, M.D.

    1991-01-01

The following analysis develops the theory and implementation of the observed behavior technique in an altruistic setting, to assess the health benefits of reducing environmental lead exposure in urban children. Three models are presented which allow for endogenous body lead burden, risk of irreversible neurological damage, and Bayesian information. Conditions are derived under which the observed behavior technique can be modified to value the health consequences of exposure to a general class of persistent micropollutants (PMP's): the heavy metals. Benefit expressions reflect the tradeoff between parental wealth and child health when children are exposed to low-level doses of lead. The purpose is to derive exact measures of marginal welfare change associated with variations in child body lead burden, and to determine the conditions under which these measures will be functions of observable parameters. The analysis presents an entirely ex ante approach to the recovery of benefit estimates when PMP exposure involves risk of irreversible health damages. In doing so, an empirical estimate is also obtained for the parental value of child health information that is used in the revision of prior risk beliefs. Risk of chronic irreversible health effects in younger generations from environmental lead exposure may be experienced by a large share of the metropolitan population in the US. Given the large numbers of possible victims, the aggregate social value of avoiding this risk is an important policy issue. Moreover, the value of health risk information is potentially important to the use of an information program as a policy instrument in reducing health risk because it would enable the comparison of societal benefits from an information program to the cost of its implementation

  17. Gas cooled leads

    International Nuclear Information System (INIS)

    Shutt, R.P.; Rehak, M.L.; Hornik, K.E.

    1993-01-01

The intent of this paper is to cover as completely as possible and in sufficient detail the topics relevant to lead design. The first part identifies the problems associated with lead design, states the mathematical formulation, and shows the results of numerical and analytical solutions. The second part presents the results of a parametric study whose object is to determine the best choice for cooling method, material, and geometry. These findings are applied in a third part to the design of high-current leads whose end temperatures are determined from the surrounding equipment. It is found that cooling method or improved heat transfer are not critical once good heat exchange is established. The range 5 5 but extends over a large range of values. Mass flow needed to prevent thermal runaway varies linearly with current above a given threshold. Below that value, the mass flow is constant with current. Transient analysis shows no evidence of hysteresis. If cooling is interrupted, the mass flow needed to restore the lead to its initially cooled state grows exponentially with the time that the lead was left without cooling

  18. Principal coordinate analysis assisted chromatographic analysis of bacterial cell wall collection: A robust classification approach.

    Science.gov (United States)

    Kumar, Keshav; Cava, Felipe

    2018-04-10

    In the present work, principal coordinate analysis (PCoA) is introduced to develop a robust model to classify chromatographic data sets of peptidoglycan samples. PCoA captures the heterogeneity present in the data sets by using a dissimilarity matrix as input. Thus, in principle, it can capture even subtle differences in bacterial peptidoglycan composition and can provide a more robust and fast approach for classifying bacterial collections and identifying novel cell wall targets for further biological and clinical studies. The utility of the proposed approach is successfully demonstrated by analysing two different kinds of bacterial collections. The first set comprised peptidoglycan samples belonging to different subclasses of Alphaproteobacteria, whereas the second set, which is more intricate for chemometric analysis, consists of wild-type Vibrio cholerae and mutants with subtle differences in their peptidoglycan composition. The present work proposes a useful approach that can classify chromatographic data sets of peptidoglycan samples having subtle differences, and further suggests that PCoA can be a method of choice in any data analysis workflow.
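
    A minimal sketch of the PCoA step itself (classical Torgerson scaling of a dissimilarity matrix, plain NumPy) is shown below; the pairwise distances between chromatographic profiles are hypothetical, and this is not the authors' implementation.

```python
import numpy as np

def pcoa(D, n_axes=2):
    """Classical principal coordinate analysis (Torgerson scaling).

    D is a symmetric matrix of pairwise dissimilarities between samples
    (e.g. distances between peptidoglycan chromatographic profiles).
    Returns sample coordinates on the first n_axes principal coordinates.
    """
    n = D.shape[0]
    # Double-centre the squared dissimilarities: B = -0.5 * J D^2 J
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    # Eigendecomposition; keep only axes with positive eigenvalues
    eigvals, eigvecs = np.linalg.eigh(B)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 0
    coords = eigvecs[:, keep] * np.sqrt(eigvals[keep])
    return coords[:, :n_axes], eigvals[keep]

# Hypothetical dissimilarity matrix for four samples (two apparent groups)
D = np.array([[0.0, 0.2, 0.7, 0.8],
              [0.2, 0.0, 0.6, 0.7],
              [0.7, 0.6, 0.0, 0.1],
              [0.8, 0.7, 0.1, 0.0]])
coords, eigvals = pcoa(D)
print(coords)  # the two groups separate along the first principal coordinate
```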

  19. A Hybrid approach for aeroacoustic analysis of the engine exhaust system

    OpenAIRE

    Sathyanarayana, Y; Munjal, ML

    2000-01-01

    This paper presents a new hybrid approach for prediction of noise radiation from engine exhaust systems. It couples the time domain analysis of the engine and the frequency domain analysis of the muffler, and has the advantages of both. In this approach, cylinder/cavity is analyzed in the time domain to calculate the exhaust mass flux history at the exhaust valve by means of the method of characteristics, avoiding the tedious procedure of interpolation at every mesh point and solving a number...

  20. Multivariate analysis of 2-DE protein patterns - Practical approaches

    DEFF Research Database (Denmark)

    Jacobsen, Charlotte; Jacobsen, Susanne; Grove, H.

    2007-01-01

    Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two of the varieties were of strong baking quality and hard wheat kernel and two were of weak baking quality and soft kernel. Gliadins at different stages of grain development were analyzed by the application of multivariate data analysis on images of 2-DEs. Patterns related to the wheat varieties, harvest times ...

  1. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultural and food markets. We provide a framework that may help researchers to evaluate and improve structural models of market power. Starting with the specification of the approaches in question, we compare published empirical studies of market power with respect to the choice of the applied approach, functional forms, estimation methods and derived estimates of the degree of market power. Thereafter, we use our framework to evaluate several structural models based on PTA and GIM to measure oligopsony power in the Ukrainian dairy industry. The PTA-based results suggest that the estimated parameters ...

  2. Two-process approach to electron beam welding control

    International Nuclear Information System (INIS)

    Lastovirya, V.N.

    1987-01-01

    The analysis and synthesis of multi-dimensional welding control systems, which require the use of computers, should be conducted in the time domain. From the point of view of general control theory, two approaches to electron beam welding are possible: one-process and two-process. In the case of the two-process approach, the subprocesses of heat source formation and direct metal melting are separated. The two-process approach leads to a two-profile control system and provides complete controllability of electron beam welding within the frameworks of systems with lumped as well as distributed parameters. The choice of approach for a given problem is determined, first of all, by the degree of stability of the heat source during welding

  3. Analysis and testing of the DIII-D ohmic heating coil lead repair clamp

    International Nuclear Information System (INIS)

    Reis, E.E.; Anderson, P.M.; Chin, E.; Robinson, J.I.

    1997-11-01

    DIII-D has been operating for the last year with limited volt-second capabilities due to structural failure of a conductor lead to one of the ohmic heating (OH) solenoids. The conductor failure was due to poor epoxy impregnation of the overwrap of the lead pack, resulting in copper fatigue and a water leak. A number of structural analyses were performed to assist in determining the failure scenario and to evaluate various repair options. A fatigue stress analysis of the leads with a failed epoxy overwrap indicated crack initiation after 1,000 cycles at the maximum operating conditions. The failure occurred in a very inaccessible area which restricted design repair options to concepts which could be implemented remotely. Several design options were considered for repairing the lead so that it can sustain the loads for 7.5 Vs conditions at full toroidal field. A clamp, along with preloaded banding straps and shim bags, provides a system that guarantees that the stress at the crack location is always compressive and prevents further crack growth in the conductor. Due to the limited space available for the repair, it was necessary to design the clamp system to operate at the material yield stress. The primary components of the clamp system were verified by load tests prior to installation. The main body of the clamp contains a load cell and potentiometer for monitoring the load-deflection characteristics of the clamp and conductors during plasma operation. Strain gages provide redundant instrumentation. If required, the preload on the conductors can be increased remotely by a special wrench attached to the clamp assembly.

  4. Suspected lead toxicosis in a bald eagle

    Science.gov (United States)

    Jacobson, E.; Carpenter, J.W.; Novilla, M.

    1977-01-01

    An immature bald eagle (Haliaeetus leucocephalus) was submitted to the University of Maryland, College Park, for clinical examination. The bird was thin, had green watery feces, and was unable to maintain itself in upright posture. Following radiography, the bird went into respiratory distress and died. Numerous lead shot were recovered from the gizzard, and chemical analysis of liver and kidney tissue revealed 22.9 and 11.3 ppm lead, respectively. The clinical signs, necropsy findings, and chemical analysis of the eagle were compatible with lead toxicosis.

  5. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Introductory Statistical Inference and Regression Analysis Elementary Statistical Inference Regression Analysis Experiments, the Completely Randomized Design (CRD)-Classical and Regression Approaches Experiments Experiments to Compare Treatments Some Basic Ideas Requirements of a Good Experiment One-Way Experimental Layout or the CRD: Design and Analysis Analysis of Experimental Data (Fixed Effects Model) Expected Values for the Sums of Squares The Analysis of Variance (ANOVA) Table Follow-Up Analysis to Check fo

  6. A Routine 'Top-Down' Approach to Analysis of the Human Serum Proteome.

    Science.gov (United States)

    D'Silva, Arlene M; Hyett, Jon A; Coorssen, Jens R

    2017-06-06

    Serum provides a rich source of potential biomarker proteoforms. One of the major obstacles in analysing serum proteomes is detecting lower abundance proteins owing to the presence of hyper-abundant species (e.g., serum albumin and immunoglobulins). Although depletion methods have been used to address this, these can lead to the concomitant removal of non-targeted protein species, and thus raise issues of specificity, reproducibility, and the capacity for meaningful quantitative analyses. Altering the native stoichiometry of the proteome components may thus yield a more complex series of issues than dealing directly with the inherent complexity of the sample. Hence, here we targeted method refinements so as to ensure optimum resolution of serum proteomes via a top down two-dimensional gel electrophoresis (2DE) approach that enables the routine assessment of proteoforms and is fully compatible with subsequent mass spectrometric analyses. Testing included various fractionation and non-fractionation approaches. The data show that resolving 500 µg protein on 17 cm 3-10 non-linear immobilised pH gradient strips in the first dimension followed by second dimension resolution on 7-20% gradient gels with a combination of lithium dodecyl sulfate (LDS) and sodium dodecyl sulfate (SDS) detergents markedly improves the resolution and detection of proteoforms in serum. In addition, well established third dimension electrophoretic separations in combination with deep imaging further contributed to the best available resolution, detection, and thus quantitative top-down analysis of serum proteomes.

  7. Computational Approaches for Integrative Analysis of the Metabolome and Microbiome

    Directory of Open Access Journals (Sweden)

    Jasmine Chong

    2017-11-01

    Full Text Available The study of the microbiome, the totality of all microbes inhabiting the host or an environmental niche, has experienced exponential growth over the past few years. The microbiome contributes functional genes and metabolites, and is an important factor for maintaining health. In this context, metabolomics is increasingly applied to complement sequencing-based approaches (marker genes or shotgun metagenomics to enable resolution of microbiome-conferred functionalities associated with health. However, analyzing the resulting multi-omics data remains a significant challenge in current microbiome studies. In this review, we provide an overview of different computational approaches that have been used in recent years for integrative analysis of metabolome and microbiome data, ranging from statistical correlation analysis to metabolic network-based modeling approaches. Throughout the process, we strive to present a unified conceptual framework for multi-omics integration and interpretation, as well as point out potential future directions.

  8. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom’s Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but the available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite finding limited or no evidence for design principles. We describe the inherent challenges, in large-N comparative analysis, of coding complex and dynamically changing common pool resource systems for the presence or absence of design principles and of determining “success”. Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of absent design principles explained inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding ...

  9. Qualitative analysis of factors leading to clinical incidents.

    Science.gov (United States)

    Smith, Matthew D; Birch, Julian D; Renshaw, Mark; Ottewill, Melanie

    2013-01-01

    The purpose of this paper is to evaluate the common themes leading or contributing to clinical incidents in a UK teaching hospital. A root-cause analysis was conducted on patient safety incidents. Commonly occurring root causes and contributing factors were collected and correlated with incident timing and severity. In total, 65 root-cause analyses were reviewed, highlighting 202 factors implicated in the clinical incidents, from which 69 categories were identified. The 14 most commonly occurring causes (encountered in four incidents or more) were examined as either a key-root or a contributory cause. Incident timing was also analysed; common factors were encountered more frequently out of hours, occurring as contributory rather than key-root causes. In total, 14 commonly occurring factors were identified to direct interventions that could prevent many clinical incidents. From these, an "Organisational Safety Checklist" was developed to involve departmental-level clinicians in monitoring practice. This study demonstrates that comprehensively investigating incidents highlights common factors that can be addressed at a local level. Resilience against clinical incidents is low during out-of-hours periods, where factors such as lower staffing levels and poor service provision allow problems to escalate and become clinical incidents. This adds to the literature regarding out-of-hours care provision and should prove useful to those organising hospital services at departmental and management levels.

  10. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of the lead molding process. For this reason, a Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes, corrective actions and changes of production parameters, and showed how these methods, the level of their organization, and systematic and rigorous study affect the molding process parameters.

  11. The radiative decays $B \to V\gamma$ at next-to-leading order in QCD

    CERN Document Server

    Bosch, S W; Bosch, Stefan W.; Buchalla, Gerhard

    2002-01-01

    We provide a model-independent framework for the analysis of the radiative B-meson decays B -> K* gamma and B -> rho gamma. In particular, we give a systematic discussion of the various contributions to these exclusive processes based on the heavy-quark limit of QCD. We propose a novel factorization formula for the consistent treatment of B -> V gamma matrix elements involving charm (or up-quark) loops, which contribute at leading power in Lambda_QCD/m_B to the decay amplitude. Annihilation topologies are shown to be power suppressed. In some cases they are nevertheless calculable. The approach is similar to the framework of QCD factorization that has recently been formulated for two-body non-leptonic B decays. These results allow us, for the first time, to compute exclusive b -> s(d) gamma decays systematically beyond the leading logarithmic approximation. We present results for these decays complete to next-to-leading order in QCD and to leading order in the heavy-quark limit. Phenomenological implications ...

  12. ADVANCEMENTS IN TIME-SPECTRA ANALYSIS METHODS FOR LEAD SLOWING-DOWN SPECTROSCOPY

    International Nuclear Information System (INIS)

    Smith, Leon E.; Anderson, Kevin K.; Gesh, Christopher J.; Shaver, Mark W.

    2010-01-01

    Direct measurement of Pu in spent nuclear fuel remains a key challenge for safeguarding nuclear fuel cycles of today and tomorrow. Lead slowing-down spectroscopy (LSDS) is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic mass with an uncertainty lower than the approximately 10 percent typical of today's confirmatory assay methods. Pacific Northwest National Laboratory's (PNNL) previous work to assess the viability of LSDS for the assay of pressurized water reactor (PWR) assemblies indicated that the method could provide direct assay of Pu-239 and U-235 (and possibly Pu-240 and Pu-241) with uncertainties less than a few percent, assuming suitably efficient instrumentation, an intense pulsed neutron source, and improvements in the time-spectra analysis methods used to extract isotopic information from a complex LSDS signal. This previous simulation-based evaluation used relatively simple PWR fuel assembly definitions (e.g. constant burnup across the assembly) and a constant initial enrichment and cooling time. The time-spectra analysis method was founded on a preliminary analytical model of self-shielding intended to correct for assay-signal nonlinearities introduced by attenuation of the interrogating neutron flux within the assembly.

  13. Intelligent Systems Approaches to Product Sound Quality Analysis

    Science.gov (United States)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amicable framework for an intelligent systems approach
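
    For orientation, the Bradley-Terry paired-comparison baseline that the proposed adaptive jury framework is meant to improve on can be fitted by maximum likelihood in a few lines; the win counts between four product sounds below are hypothetical, and the sketch illustrates only the standard model, not the adaptive framework.

```python
import numpy as np
from scipy.optimize import minimize

# wins[i, j] = number of times sound i was preferred over sound j (hypothetical jury data)
wins = np.array([[0, 6, 7, 9],
                 [4, 0, 6, 8],
                 [3, 4, 0, 7],
                 [1, 2, 3, 0]])
n = wins.shape[0]

def neg_log_likelihood(theta):
    # Bradley-Terry: P(i beats j) = exp(theta_i) / (exp(theta_i) + exp(theta_j))
    t = np.concatenate(([0.0], theta))      # fix theta_0 = 0 for identifiability
    diff = t[:, None] - t[None, :]
    logp = -np.log1p(np.exp(-diff))         # log P(i beats j)
    return -np.sum(wins * logp)

res = minimize(neg_log_likelihood, np.zeros(n - 1), method="BFGS")
merit = np.concatenate(([0.0], res.x))
print("relative merit scores:", merit)      # higher score = more often preferred
```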

  14. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background: Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results: To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions: We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.

  15. A global sensitivity analysis approach for morphogenesis models.

    Science.gov (United States)

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
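
    A variance-based global sensitivity analysis of this kind can be prototyped in Python with the SALib package; in the sketch below a cheap analytic function stands in for the cellular Potts model, whose simulations are far more expensive, and the parameter names are placeholders.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Three hypothetical model parameters standing in for CPM cell properties
problem = {
    "num_vars": 3,
    "names": ["adhesion", "chemotaxis", "elongation"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

# Saltelli sampling of the parameter space
X = saltelli.sample(problem, 1024)

# Stand-in scalar output (e.g. a morphology measure extracted from the output images);
# a real study would run the morphogenesis model once per parameter set.
Y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2] + 0.1 * np.random.rand(X.shape[0])

Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])   # impact of single parameters
print("total-order indices:", Si["ST"])   # includes interaction effects
```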

  16. Points of convergence between functional and formal approaches to syntactic analysis

    DEFF Research Database (Denmark)

    Bjerre, Tavs; Engels, Eva; Jørgensen, Henrik

    2008-01-01

    respectively: The functional approach is represented by Paul Diderichsen's (1936, 1941, 1946, 1964) sætningsskema, ‘sentence model', and the formal approach is represented by analysis whose main features are common to the principles and parameters framework (Chomsky 1986) and the minimalist programme (Chomsky...

  17. The analysis of QT interval and repolarization morphology of the heart in chronic exposure to lead.

    Science.gov (United States)

    Kiełtucki, J; Dobrakowski, M; Pawlas, N; Średniawa, B; Boroń, M; Kasperczyk, S

    2017-10-01

    There are no common recommendations regarding electrocardiographic monitoring in occupationally exposed workers. Therefore, the present study was designed to investigate whether exposure to lead results in an increase of selected electrocardiography (ECG) pathologies, such as QT interval prolongation and repolarization disorders, in occupationally exposed workers. The study group included 180 workers occupationally exposed to lead compounds. The exposed group was divided, according to the median of the mean blood lead level (PbBmean) calculated from a series of measurements performed during a 5-year observation period (35 µg/dl), into two subgroups: low exposure (LE, PbBmean = 20.0-35.0 µg/dl) and high exposure (HE, PbBmean = 35.1-46.4 µg/dl). The control group consisted of 69 healthy workers without occupational exposure to lead. ECG evaluation included the analysis of heart rate (HR), QT interval and repolarization abnormalities. The mean QT interval was significantly greater in the exposed population than in the control group, by 2%. In the HE group, the mean QT interval was significantly greater than in the control group, by 4%, and significantly different from that noted in the LE group. Positive correlations between QT interval and lead exposure indices were also reported. In addition, there was a negative correlation between HR and blood lead level. An increased concentration of lead in the blood above 35 μg/dl is associated with QT interval prolongation, which may trigger arrhythmias when combined with other abnormalities, such as long QT syndrome. Therefore, electrocardiographic evaluation should be part of the routine monitoring of occupationally exposed populations.

  18. Natural phenomena risk analysis - an approach for the tritium facilities 5480.23 SAR natural phenomena hazards accident analysis

    International Nuclear Information System (INIS)

    Cappucci, A.J. Jr.; Joshi, J.R.; Long, T.A.; Taylor, R.P.

    1997-01-01

    A Tritium Facilities (TF) Safety Analysis Report (SAR) has been developed which is compliant with DOE Order 5480.23. The 5480.23 SAR upgrades and integrates the safety documentation for the TF into a single SAR for all of the tritium processing buildings. As part of the TF SAR effort, natural phenomena hazards (NPH) were analyzed. A cost effective strategy was developed using a team approach to take advantage of limited resources and budgets. During development of the Hazard and Accident Analysis for the 5480.23 SAR, a strategy was required to allow maximum use of existing analysis and to develop a cost effective graded approach for any new analysis in identifying and analyzing the bounding accidents for the TF. This approach was used to effectively identify and analyze NPH for the TF. The first part of the strategy consisted of evaluating the current SAR for the RTF to determine what NPH analysis could be used in the new combined 5480.23 SAR. The second part was to develop a method for identifying and analyzing NPH events for the older facilities which took advantage of engineering judgment, was cost effective, and followed a graded approach. The second part was especially challenging because of the lack of documented existing analysis considered adequate for the 5480.23 SAR and a limited budget for SAR development and preparation. This paper addresses the strategy for the older facilities

  19. A network approach for distinguishing ethical issues in research and development.

    Science.gov (United States)

    Zwart, Sjoerd D; van de Poel, Ibo; van Mil, Harald; Brumsen, Michiel

    2006-10-01

    In this paper we report on our experiences with using network analysis to discern and analyse ethical issues in research into, and the development of, a new wastewater treatment technology. Using network analysis, we preliminarily interpreted some of our observations in a Group Decision Room (GDR) session where we invited important stakeholders to think about the risks of this new technology. We show how a network approach is useful for understanding the observations, and suggests some relevant ethical issues. We argue that a network approach is also useful for ethical analysis of issues in other fields of research and development. The abandoning of the overarching rationality assumption, which is central to network approaches, does not have to lead to ethical relativism.

  20. Development of exploratory approach for scenario analysis in the performance assessment of geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Ishiguro, Katsuhiko; Umeki, Hiroyuki; Oyamada, Kiyoshi; Takase, Hiroyasu; Grindrod, Peter

    1998-01-01

    It becomes difficult to apply ordinary methods for scenario analysis as the number of processes and the complexity of their interrelations increase. To address this problem, an exploratory approach that can perform scenario analysis on a wider range of problems was developed. The approach includes ensemble runs of a mass transport model, developed as a generic and flexible model covering the effects of various processes on mass transport, and analysis of the sensitivity structure between the input and output spaces of the ensemble runs. The techniques of clustering and principal component analysis were applied in the approach. As a result of a test application, the applicability of the approach for identifying important processes from a large number of processes in a systematic and objective manner was confirmed. (author)
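
    A minimal sketch of the post-processing idea described here (principal component analysis and clustering over the combined input/output space of an ensemble) is shown below, with a toy stand-in for the mass transport model and arbitrary parameter choices.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Ensemble of inputs (e.g. sorption coefficient, flow rate, path length) -- toy values
inputs = rng.uniform(size=(200, 3))
# Toy stand-in for the mass transport model output (e.g. a peak release rate)
outputs = inputs[:, 0] * np.exp(-2.0 * inputs[:, 1]) + 0.05 * rng.normal(size=200)

# Combine input and output space and standardise
data = np.column_stack([inputs, outputs])
data = (data - data.mean(axis=0)) / data.std(axis=0)

# Principal component analysis reveals the dominant directions of variation
pcs = PCA(n_components=2).fit_transform(data)

# Clustering groups ensemble members with similar input/output behaviour
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
print(np.bincount(labels))  # size of each behavioural cluster
```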

  1. Effects of lead pollution on Ammonia parkinsoniana (foraminifera): ultrastructural and microanalytical approaches.

    Science.gov (United States)

    Frontalini, F; Curzi, D; Giordano, F M; Bernhard, J M; Falcieri, E; Coccioni, R

    2015-01-30

    The responses of Ammonia parkinsoniana (Foraminifera) exposed to different concentrations of lead (Pb) were evaluated at the cytological level. Foraminifera-bearing sediments were placed in mesocosms that were housed in aquaria each with seawater of a different lead concentration. On the basis of transmission electron microscopy and environmental scanning electron microscopy coupled with energy dispersive spectrometer analyses, it was possible to recognize numerous morphological differences between untreated (i.e., control) and treated (i.e., lead enrichment) specimens. In particular, higher concentrations of this pollutant led to numerical increase of lipid droplets characterized by a more electron-dense core, proliferation of residual bodies, a thickening of the organic lining, mitochondrial degeneration, autophagosome proliferation and the development of inorganic aggregates.  All these cytological modifications might be related to the pollutant-induced stress and some of them such as the thickening of organic lining might suggest a potential mechanism of protection adopted by foraminifera.

  2. Semiotic Approach to the Analysis of Children's Drawings

    Science.gov (United States)

    Turkcan, Burcin

    2013-01-01

    Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…

  3. Survival Analysis of Occipital Nerve Stimulator Leads Placed under Fluoroscopic Guidance with and without Ultrasonography.

    Science.gov (United States)

    Jones, James H; Brown, Alison; Moyse, Daniel; Qi, Wenjing; Roy, Lance

    2017-11-01

    Electrical stimulation of the greater occipital nerves is performed to treat pain secondary to chronic daily headaches and occipital neuralgia. The use of fluoroscopy alone to guide the surgical placement of electrodes near the greater occipital nerves disregards the impact of tissue planes on lead stability and stimulation efficacy. We hypothesized that occipital neurostimulator (ONS) leads placed with ultrasonography combined with fluoroscopy would demonstrate increased survival rates and times when compared to ONS leads placed with fluoroscopy alone. A 2-arm retrospective chart review. A single academic medical center. This retrospective chart review analyzed the procedure notes and demographic data of patients who underwent the permanent implant of an ONS lead between July 2012 and August 2015. Patient data included the diagnosis (reason for implant), smoking tobacco use, disability, and age. ONS lead data included the date of permanent implant, the imaging modality used during permanent implant (fluoroscopy with or without ultrasonography), and, if applicable, the date and reason for lead removal. A total of 21 patients (53 leads) were included for the review. Chi-squared tests, Fishers exact tests, 2-sample t-tests, and Wilcoxon rank-sum tests were used to compare fluoroscopy against combined fluoroscopy and ultrasonography as implant methods with respect to patient demographics. These tests were also used to evaluate the primary aim of this study, which was to compare the survival rates and times of ONS leads placed with combined ultrasonography and fluoroscopy versus those placed with fluoroscopy alone. Survival analysis was used to assess the effect of implant method, adjusted for patient demographics (age, smoking tobacco use, and disability), on the risk of lead explant. Data from 21 patients were collected, including a total of 53 ONS leads. There was no statistically significant difference in the lead survival rate or time, disability, or patient age
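
    The survival comparison described can be reproduced in outline with the lifelines package; the lead survival times, explant indicators and group labels below are hypothetical placeholders, not the study data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical lead-level data: months in place, whether the lead was explanted,
# and whether ultrasonography was combined with fluoroscopy at implant
df = pd.DataFrame({
    "months":     [3, 12, 24, 30, 6, 18, 27, 36, 9, 33],
    "explanted":  [1,  1,  0,  0, 1,  1,  0,  0, 1,  0],
    "ultrasound": [0,  0,  1,  1, 0,  0,  1,  1, 0,  1],
})

fluoro = df[df["ultrasound"] == 0]
combo = df[df["ultrasound"] == 1]

# Kaplan-Meier survival curve for one implant method
kmf = KaplanMeierFitter()
kmf.fit(fluoro["months"], event_observed=fluoro["explanted"], label="fluoroscopy only")
print(kmf.median_survival_time_)

# Log-rank test for a difference in lead survival between the two methods
result = logrank_test(fluoro["months"], combo["months"],
                      event_observed_A=fluoro["explanted"],
                      event_observed_B=combo["explanted"])
print(result.p_value)
```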

  4. Feed particle size evaluation: conventional approach versus digital holography based image analysis

    Directory of Open Access Journals (Sweden)

    Vittorio Dell’Orto

    2010-01-01

    Full Text Available The aim of this study was to evaluate the application of image analysis approach based on digital holography in defining particle size in comparison with the sieve shaker method (sieving method as reference method. For this purpose ground corn meal was analyzed by a sieve shaker Retsch VS 1000 and by image analysis approach based on digital holography. Particle size from digital holography were compared with results obtained by screen (sieving analysis for each of size classes by a cumulative distribution plot. Comparison between particle size values obtained by sieving method and image analysis indicated that values were comparable in term of particle size information, introducing a potential application for digital holography and image analysis in feed industry.

  5. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-01-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables for liquid lead drawn from experimental results. It then explains the process of developing a thermal-hydraulic model for the prototype ALFRED and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to simulating designs of lead-cooled fast reactors and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety studies. (Author)

  6. Multiscale approach to the physics of radiation damage with ions

    International Nuclear Information System (INIS)

    Surdutovich, E.; Solov'yov, A.

    2014-01-01

    The multiscale approach to the assessment of bio-damage resulting upon irradiation of biological media with ions is reviewed, explained and compared to other approaches. The processes of ion propagation in the medium concurrent with ionization and excitation of molecules, transport of secondary products, dynamics of the medium, and biological damage take place on a number of different temporal, spatial and energy scales. The multiscale approach, a physical phenomenon-based analysis of the scenario that leads to radiation damage, has been designed to consider all relevant effects on a variety of scales and develop an approach to the quantitative assessment of biological damage as a result of irradiation with ions. Presently, physical and chemical effects are included in the scenario while the biological effects such as DNA repair are only mentioned. This paper explains the scenario of radiation damage with ions, overviews its major parts, and applies the multiscale approach to different experimental conditions. On the basis of this experience, the recipe for application of the multiscale approach is formulated. The recipe leads to the calculation of relative biological effectiveness. (authors)

  7. The Existence Of Leading Islands Securing And The Border Areas Unitary State Of Indonesia An Analysis In Law Perspective

    Directory of Open Access Journals (Sweden)

    Nazali

    2015-08-01

    Full Text Available The research was carried out with the aim of discovering the existence of the securing of the foremost islands and the state border region of the Republic of Indonesia, reviewed from a legal perspective that is directly related to security and dispute resolution methods as well as the governance of the foremost islands and border region in Kalimantan bordering Malaysia. This study was conducted in Nunukan district and the surrounding provinces of Kalimantan. The research method used is normative legal analysis with a juridical and qualitative descriptive approach. The results showed that the securing of the foremost islands and border region, viewed from a legal perspective in accordance with Law No. 34 of 2004 regarding the Indonesian National Army, has not been implemented to the fullest to realize the security of the foremost islands and border region as the frontline of the Republic of Indonesia. The existing securing of the leading islands and the border region of the Republic of Indonesia still contains many weaknesses in terms of both governance and security.

  8. Influence of ROI selection on Resting Functional Connectivity: An Individualized Approach for Resting fMRI Analysis

    Directory of Open Access Journals (Sweden)

    William Seunghyun Sohn

    2015-08-01

    Full Text Available The differences in how our brains are connected are often thought to reflect the differences in our individual personalities and cognitive abilities. Individual differences in brain connectivity have long been recognized in the neuroscience community; however, this has yet to manifest itself in the methodology of resting state analysis. This is evident in that previous studies use the same regions of interest (ROIs) for all subjects. In this paper we demonstrate that the use of ROIs which are standardized across individuals leads to inaccurate calculations of functional connectivity. We also show that this problem can be addressed by taking an individualized approach using subject-specific ROIs. Finally we show that ROI selection can affect the way we interpret our data by showing different changes in functional connectivity with ageing.
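
    Resting-state functional connectivity between two regions reduces to the correlation of their mean time series, which makes the effect of ROI placement easy to illustrate; the sketch below uses synthetic data and hypothetical voxel indices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: (voxels, timepoints) for one subject
n_vox, n_t = 500, 200
network = rng.normal(size=n_t)              # shared resting-state fluctuation
data = rng.normal(size=(n_vox, n_t))
data[0:50] += network                       # seed region for this subject (voxels 0-49)
data[100:150] += network                    # this subject's target region (voxels 100-149)

def roi_timeseries(data, voxels):
    """Mean time series over the voxels of a region of interest."""
    return data[voxels].mean(axis=0)

seed = roi_timeseries(data, slice(0, 50))

# A standardized ROI uses the same voxels for every subject and may only partially
# overlap this subject's true target region; a subject-specific ROI matches it.
standard = roi_timeseries(data, slice(130, 180))   # 20/50 voxels overlap
specific = roi_timeseries(data, slice(100, 150))   # full overlap

print("standardized ROI connectivity:", np.corrcoef(seed, standard)[0, 1])
print("subject-specific ROI connectivity:", np.corrcoef(seed, specific)[0, 1])
```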

  9. A next-to-leading order QCD analysis of the spin structure function $g_1$

    CERN Document Server

    AUTHOR|(CDS)2067425; Arik, E; Badelek, B; Bardin, G; Baum, G; Berglund, P; Betev, L; Birsa, R; De Botton, N R; Bradamante, Franco; Bravar, A; Bressan, A; Bültmann, S; Burtin, E; Crabb, D; Cranshaw, J; Çuhadar-Dönszelmann, T; Dalla Torre, S; Van Dantzig, R; Derro, B R; Deshpande, A A; Dhawan, S K; Dulya, C M; Eichblatt, S; Fasching, D; Feinstein, F; Fernández, C; Forthmann, S; Frois, Bernard; Gallas, A; Garzón, J A; Gilly, H; Giorgi, M A; von Goeler, E; Görtz, S; Gracia, G; De Groot, N; Grosse-Perdekamp, M; Haft, K; Von Harrach, D; Hasegawa, T; Hautle, P; Hayashi, N; Heusch, C A; Horikawa, N; Hughes, V W; Igo, G; Ishimoto, S; Iwata, T; Kabuss, E M; Kageya, T; Karev, A G; Kessler, H J; Ketel, T; Kiryluk, J; Kiselev, Yu F; Krämer, Dietrich; Krivokhizhin, V G; Kröger, W; Kukhtin, V V; Kurek, K; Kyynäräinen, J; Lamanna, M; Landgraf, U; Le Goff, J M; Lehár, F; de Lesquen, A; Lichtenstadt, J; Litmaath, M; Magnon, A; Mallot, G K; Marie, F; Martin, A; Martino, J; Matsuda, T; Mayes, B W; McCarthy, J S; Medved, K S; Meyer, W T; Van Middelkoop, G; Miller, D; Miyachi, Y; Mori, K; Moromisato, J H; Nassalski, J P; Naumann, Lutz; Niinikoski, T O; Oberski, J; Ogawa, A; Ozben, C; Pereira, H; Perrot-Kunne, F; Peshekhonov, V D; Piegia, R; Pinsky, L; Platchkov, S K; Pló, M; Pose, D; Postma, H; Pretz, J; Puntaferro, R; Rädel, G; Rijllart, A; Reicherz, G; Roberts, J; Rodríguez, M; Rondio, Ewa; Sabo, I; Saborido, J; Sandacz, A; Savin, I A; Schiavon, R P; Schiller, A; Sichtermann, E P; Simeoni, F; Smirnov, G I; Staude, A; Steinmetz, A; Stiegler, U; Stuhrmann, H B; Szleper, M; Tessarotto, F; Thers, D; Tlaczala, W; Tripet, A; Ünel, G; Velasco, M; Vogt, J; Voss, Rüdiger; Whitten, C; Windmolders, R; Willumeit, R; Wislicki, W; Witzmann, A; Ylöstalo, J; Zanetti, A M; Zaremba, K; Zhao, J

    1998-01-01

    We present a next-to-leading order QCD analysis of the presently available data on the spin structure function $g_1$ including the final data from the Spin Muon Collaboration (SMC). We present results for the first moments of the proton, deuteron and neutron structure functions, and determine singlet and non-singlet parton distributions in two factorization schemes. We also test the Bjorken sum rule and find agreement with the theoretical prediction at the level of 10%.
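
    For orientation, the Bjorken sum rule tested here relates the non-singlet first moment of $g_1$ to the nucleon axial coupling; a standard next-to-leading order form (quoted for reference, not taken from the paper) is

```latex
\Gamma_1^{p}(Q^2) - \Gamma_1^{n}(Q^2)
  = \int_0^1 \left[\, g_1^{p}(x,Q^2) - g_1^{n}(x,Q^2) \,\right] \mathrm{d}x
  = \frac{1}{6}\left|\frac{g_A}{g_V}\right|
    \left[ 1 - \frac{\alpha_s(Q^2)}{\pi} + \mathcal{O}\!\left(\alpha_s^2\right) \right]
```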

  10. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
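
    The AHP weighting step at the core of such a workflow is a principal-eigenvector calculation on a pairwise comparison matrix; a minimal sketch (NumPy, with a hypothetical comparison matrix for three landslide factors) is shown below.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty scale) for three criteria,
# e.g. slope, lithology, land cover
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Criteria weights = normalised principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR < 0.1 is conventionally considered acceptable)
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random consistency index
print("weights:", weights.round(3), "CR:", round(ci / ri, 3))
```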

  11. A COMPARATIVE ANALYSIS OF ASEAN CURRENCIES USING A COPULA APPROACH AND A DYNAMIC COPULA APPROACH

    Directory of Open Access Journals (Sweden)

    CHUKIAT CHAIBOONSRI

    2012-12-01

    Full Text Available The ASEAN Economic Community (AEC) is being shaped and developed to become a single market and production base in 2015, moving towards regional economic integration (2009). These developments in international financial markets lead to some adverse costs for AEC country borrowers. The specific objective is to investigate the dependence measures and the co-movement among selected ASEAN currencies. A Copula Approach was used to examine dependence measures of the Thai Baht exchange rate against selected ASEAN currencies during the period 2008-2011. Also, a Dynamic Copula Approach was applied to investigate the co-movement of the Thai Baht exchange rate with selected ASEAN currencies during the same period. The results of the study based on the Pearson linear correlation coefficient confirmed that the Thai Baht exchange rate and each of the selected ASEAN currencies have a linear correlation during the period, with the exception of the Vietnam exchange rate. Furthermore, based on the empirical Copula Approach, the Thai Baht exchange rate had a dependence structure with each of the selected ASEAN currencies, including the Brunei, Singapore, Malaysia, Indonesia, Philippine, and Vietnam exchange rates, respectively. The results of the Dynamic Copula estimation indicated that the Thai Baht exchange rate had a co-movement with the selected ASEAN currencies. The research results provide information on the ASEAN financial market to all users, including the global financial market.
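
    One way to see the kind of dependence measure a copula-based analysis delivers is to estimate Kendall's tau on rank-transformed returns and map it to a Gaussian-copula correlation via rho = sin(pi*tau/2); the sketch below uses simulated series in place of the 2008-2011 exchange-rate data.

```python
import numpy as np
from scipy.stats import kendalltau, rankdata

rng = np.random.default_rng(42)

# Simulated daily log-returns standing in for THB and SGD exchange-rate series
n = 1000
common = rng.normal(size=n)
thb = 0.7 * common + 0.3 * rng.normal(size=n)
sgd = 0.7 * common + 0.3 * rng.normal(size=n)

# Probability-integral transform of the margins (empirical ranks -> pseudo-uniforms),
# the usual first step before fitting any parametric copula
u = rankdata(thb) / (n + 1)
v = rankdata(sgd) / (n + 1)

# Rank-based dependence, invariant under monotone transforms of the margins
tau, _ = kendalltau(u, v)

# For elliptical copulas (e.g. the Gaussian), Kendall's tau maps to the
# copula correlation parameter as rho = sin(pi * tau / 2)
rho_copula = np.sin(np.pi * tau / 2)
print("Kendall tau:", round(tau, 3), "implied Gaussian-copula rho:", round(rho_copula, 3))
```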

  12. A new approach for heparin standardization: combination of scanning UV spectroscopy, nuclear magnetic resonance and principal component analysis.

    Directory of Open Access Journals (Sweden)

    Marcelo A Lima

    Full Text Available The year 2007 was marked by widespread adverse clinical responses to heparin use, leading to a global recall of potentially affected heparin batches in 2008. Several analytical methods have since been developed to detect impurities in heparin preparations; however, many are costly and dependent on instrumentation with only limited accessibility. A method based on a simple UV-scanning assay, combined with principal component analysis (PCA), was developed to detect impurities, such as glycosaminoglycans, other complex polysaccharides and aromatic compounds, in heparin preparations. Results were confirmed by NMR spectroscopy. This approach provides an additional, sensitive tool to determine heparin purity and safety, even when NMR spectroscopy failed, requiring only standard laboratory equipment and computing facilities.
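
    A minimal sketch of the screening idea (UV scans as feature vectors, PCA scores to flag batches that deviate from the reference cluster) is given below; the spectra are synthetic placeholders, not heparin data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
wavelengths = np.arange(200, 401, 2)   # nm, illustrative scanning range

def synthetic_scan(contaminated=False):
    """Toy UV absorbance scan: a broad band plus, optionally, an aromatic-like peak."""
    base = np.exp(-((wavelengths - 210) / 25.0) ** 2)
    scan = base + 0.02 * rng.normal(size=wavelengths.size)
    if contaminated:
        scan += 0.3 * np.exp(-((wavelengths - 260) / 10.0) ** 2)
    return scan

# 20 reference-like batches and 3 batches carrying an extra absorbance feature
X = np.array([synthetic_scan() for _ in range(20)] +
             [synthetic_scan(contaminated=True) for _ in range(3)])

scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
# Batches lying far from the reference cluster on the score plot are flagged
# for follow-up (e.g. NMR confirmation)
print(scores.round(2))
```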

  13. Radiometric trace analysis of lead with diethyldithiocarbamate and 204Tl

    NARCIS (Netherlands)

    Erkelens, P.C. van

    1962-01-01

    Two methods for the determination of submicrogram amounts of lead are described. (A) Lead is selectively extracted with carbon tetrachloride from an alkaline solution containing excess diethyldithiocarbamate (DDC) and cyanide. Traces of DDC are back-extracted. The lead in the DDC complex is exchanged

  14. Occurrence and determinants of increases in blood lead levels in children shortly after lead hazard control activities

    International Nuclear Information System (INIS)

    Clark, Scott; Grote, JoAnn; Wilson, Jonathan; Succop, Paul; Chen Mei; Galke, Warren; McLaine, Pat

    2004-01-01

    This study is an examination of the effect of lead hazard control strategies on children's blood lead levels immediately after an intervention was conducted as part of the US Department of Housing and Urban Development's Lead-Based Paint Hazard Control Grant Program. Fourteen state and local government grantees participated in the evaluation. The findings indicated an overall average reduction in the blood lead levels of 869 children soon after the implementation of lead hazard controls. However, 9.3% of these children (n=81) had blood lead increases of 5 μg/dL or more. Data routinely collected as part of the evaluation, as well as additional information supplied by the individual programs, were used to determine potential reasons for these observed increases in blood lead. A logistic regression analysis indicated that three principal factors were associated with the blood lead increases: the number of exterior deteriorations present in the child's home (prior to intervention), the educational level of the female parent or guardian of the child, and the child's age. The statistical analysis did not find evidence that children living in households that either did not relocate or relocated for less than the full work period were significantly more likely to have a blood lead increase equal to or greater than 5 μg/dL than children living in households that fully relocated. Statistical analyses also did not reveal any single interior strategy to be more or less likely than others to be associated with a blood lead increase of 5 μg/dL or more
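
    A logistic regression of this form can be sketched with statsmodels; the covariates below are hypothetical stand-ins for the three factors reported (exterior deterioration count, guardian education, child age), and the simulated outcome is only for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300

# Hypothetical evaluation-style data set
df = pd.DataFrame({
    "ext_deterioration": rng.poisson(2, n),        # exterior deteriorations before intervention
    "guardian_education": rng.integers(8, 17, n),  # years of schooling
    "child_age": rng.uniform(0.5, 6.0, n),         # years
})

# Simulated binary outcome: blood lead increase of 5 ug/dL or more
logit = -0.5 + 0.4 * df["ext_deterioration"] - 0.1 * df["guardian_education"] - 0.2 * df["child_age"]
df["increase_5ug"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(df[["ext_deterioration", "guardian_education", "child_age"]])
model = sm.Logit(df["increase_5ug"], X).fit(disp=False)
print(model.summary())
print(np.exp(model.params))   # odds ratios for a blood lead increase of >= 5 ug/dL
```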

  15. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    Science.gov (United States)

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.

  16. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution in dependence of data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations ...

  17. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability Program (LWRS) [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving economics and reliability and sustaining the safety of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby causing a challenge to plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  18. Temperature control characteristics analysis of lead-cooled fast reactor with natural circulation

    International Nuclear Information System (INIS)

    Yang, Minghan; Song, Yong; Wang, Jianye; Xu, Peng; Zhang, Guangyu

    2016-01-01

    Highlights: • The LFR temperature control system is analyzed with a frequency domain method. • The temperature control compensator is designed according to the frequency analysis. • Dynamic simulation is performed with SIMULINK and RELAP5-HD. - Abstract: The Lead-cooled Fast Reactor (LFR) with natural circulation in the primary system is among the highlights of advanced nuclear reactor research, due to its great superiority in reactor safety and reliability. In this work, a transfer function matrix describing the coolant temperature dynamic process, obtained by Laplace transform of a one-dimensional system dynamic model, is developed in order to investigate the temperature control characteristics of the LFR. Based on the transfer function matrix, a closed-loop coolant temperature control system without a compensator is built. The frequency domain analysis indicates that the stability and steady-state behaviour of the temperature control system need to be improved. Accordingly, a temperature compensator based on proportional-integral control and feed-forward is designed. The dynamic simulation of the whole system with the temperature compensator for a core power step change is performed with SIMULINK and RELAP5-HD. The result shows that the temperature compensator can provide superior coolant temperature control capabilities in an LFR with natural circulation, owing to the efficiency of the frequency domain analysis method.
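
    The frequency-domain design step can be prototyped with scipy.signal; the first-order plant below is a toy stand-in for one channel of the coolant-temperature transfer function matrix, and the PI gains are illustrative, not the values used in the paper.

```python
import numpy as np
from scipy import signal

# Toy plant: one channel of the coolant temperature response, modelled as a
# first-order lag G(s) = K / (tau*s + 1)   (illustrative values)
K, tau = 2.0, 50.0
plant = ([K], [tau, 1.0])

# PI compensator C(s) = Kp + Ki/s = (Kp*s + Ki) / s   (illustrative gains)
Kp, Ki = 1.5, 0.05
comp = ([Kp, Ki], [1.0, 0.0])

# Open-loop transfer function L(s) = C(s) * G(s)
num = np.polymul(comp[0], plant[0])
den = np.polymul(comp[1], plant[1])
L = signal.TransferFunction(num, den)

# Frequency response, from which gain crossover and phase margin follow
w, mag, phase = signal.bode(L, w=np.logspace(-4, 1, 500))
gc = np.argmin(np.abs(mag))   # index closest to 0 dB
print("gain crossover (rad/s):", w[gc], "phase margin (deg):", 180 + phase[gc])
```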

  19. A Unit-Problem Investigation of Blunt Leading-Edge Separation Motivated by AVT-161 SACCON Research

    Science.gov (United States)

    Luckring, James M.; Boelens, Okko J.

    2011-01-01

    A research effort has been initiated to examine in more detail some of the challenging flow fields discovered from analysis of the SACCON configuration aerodynamics. This particular effort is oriented toward a diamond wing investigation specifically designed to isolate blunt leading-edge separation phenomena relevant to the SACCON investigations of the present workshop. The approach taken to design this new effort is reviewed along with the current status of the program.

  20. LEADING CHANGES IN ASSESSMENT USING AN EVIDENCE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    J. O. Macaulay

    2015-08-01

    Full Text Available Introduction and objectives: It has been widely accepted that assessment of learning is a critical component of education and that assessment drives/guides student learning through shaping study habits and student approaches to learning. However, although most academics would agree that assessment is a critical aspect of their roles as teachers, it is often an aspect of teaching that is regarded more as an additional task than as an integral component of the teaching/learning continuum. An additional impediment to high quality assessment is a non-evidence-based approach to the decision-making process. The overall aim of this project was to improve the quality of assessment in Biochemistry and Molecular Biology undergraduate education by promoting high quality assessment. Materials and methods: To do this we developed and trialled an audit tool for mapping assessment practices. The audit tool was designed to gather data on current assessment practices and to identify areas of good practice, in which assessment aligned with the learning objectives, and areas in need of improvement. This evidence base will then be used to drive change in assessment. Results and conclusions: Using the assessment mapping tool we have mapped the assessment regime in a Biochemistry and Molecular Biology major at Monash University. Criteria used included: assessment type, format, timing, assessors, provision of feedback, level of learning (Bloom's), and approaches taken to planning assessment. We have mapped assessment of content and the systematic development of higher order learning and skills progression throughout the program of study. The data have enabled us to examine the assessment at unit (course) level as well as the vertical development across the major. This information is now being used to inform a review of the units and the major.

  1. Statistical approach to partial equilibrium analysis

    Science.gov (United States)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named the willingness price, is highlighted and forms the basis of the whole theory. The supply and demand functions are formulated as the distributions of the corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of the excess demand function are analyzed and the necessary conditions for the existence and uniqueness of the equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.
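
    A small numerical sketch of the general formulation (an illustration only, not the authors' derivation): supply and demand are obtained from assumed willingness-price distributions, and the equilibrium price is located where the excess demand vanishes. The normal distributions and population sizes are placeholder assumptions.

```python
import numpy as np
from scipy import stats, optimize

# Illustrative willingness-price distributions (assumptions):
# a buyer purchases one unit if the market price is at or below her willingness price,
# a seller sells one unit if the market price is at or above his willingness price.
buyers = stats.norm(loc=60.0, scale=10.0)    # buyers' willingness prices
sellers = stats.norm(loc=40.0, scale=8.0)    # sellers' willingness prices
N_buyers, N_sellers = 1000, 1000             # totals of willing exchange

def demand(p):
    return N_buyers * buyers.sf(p)           # buyers with willingness price >= p

def supply(p):
    return N_sellers * sellers.cdf(p)        # sellers with willingness price <= p

def excess_demand(p):
    return demand(p) - supply(p)

# Equilibrium price: unique zero of the monotonically decreasing excess demand function
p_star = optimize.brentq(excess_demand, 1.0, 120.0)
print(f"equilibrium price ~ {p_star:.2f}, realized exchange ~ {demand(p_star):.0f} units")
```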

  2. A Comparison of Microeconomic and Macroeconomic Approaches to Deforestation Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Felardo

    2016-01-01

    Full Text Available The economics of deforestation has been explored in detail. Generally, the frame of analysis takes either a microeconomics or a macroeconomics approach. The microeconomics approach assumes that individual decision makers are responsible for deforestation as a result of utility-maximizing behavior and imperfect property-rights regimes. The macroeconomics approach explores nationwide trends thought to be associated with forest conversion. This paper investigates the relationship between these two approaches by empirically testing the determinants of deforestation using the same data set from Thailand. The theory behind both the microeconomics-based and macroeconomics-based approaches is developed and then tested statistically. The models were constructed using established theoretical frames developed in the literature. The results from both models show statistical significance consistent with prior results in the tropical deforestation literature. A comparison of the two approaches demonstrates that the macro approach is useful in identifying relevant aggregate trends in the deforestation process; the micro approach provides the opportunity to isolate the factors behind those trends, which are necessary for effective policy decisions.

  3. Adjoint-based global variance reduction approach for reactor analysis problems

    International Nuclear Information System (INIS)

    Zhang, Qiong; Abdel-Khalik, Hany S.

    2011-01-01

    A new variant of a hybrid Monte Carlo-Deterministic approach for simulating particle transport problems is presented and compared to the SCALE FW-CADIS approach. The new approach, denoted by the Subspace approach, optimizes the selection of the weight windows for reactor analysis problems where detailed properties of all fuel assemblies are required everywhere in the reactor core. Like the FW-CADIS approach, the Subspace approach utilizes importance maps obtained from deterministic adjoint models to derive automatic weight-window biasing. In contrast to FW-CADIS, the Subspace approach identifies the correlations between weight window maps to minimize the computational time required for global variance reduction, i.e., when the solution is required everywhere in the phase space. The correlations are employed to reduce the number of maps required to achieve the same level of variance reduction that would be obtained with single-response maps. Numerical experiments, serving as proof of principle, are presented to compare the Subspace and FW-CADIS approaches in terms of the global reduction in standard deviation. (author)
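
    The mechanics that both FW-CADIS and the Subspace approach rely on, splitting and Russian roulette against importance-derived weight windows, can be shown on a much smaller scale. The sketch below applies generic weight windows to a toy one-dimensional random walk with a deep-penetration tally; it is not the FW-CADIS or Subspace algorithm, and the geometry, event probabilities and assumed importance ratio are placeholders (a real code would derive the windows from a deterministic adjoint solution).

```python
import numpy as np

rng = np.random.default_rng(0)
N_CELLS = 10                          # slab cells; tally = weight leaking past the last cell
P_ABS, P_FWD, P_BWD = 0.3, 0.4, 0.3   # toy per-step event probabilities
N_SOURCE = 20_000
MAX_SPLIT = 5

# Weight-window centers from an assumed importance ratio of 2 between adjacent cells.
wc = 0.5 ** np.arange(N_CELLS)        # window per cell = [wc/2, 2*wc]

def apply_weight_window(cell, w, stack):
    """Split or roulette a particle against the window of its current cell."""
    c = wc[cell]
    if w > 2.0 * c:                               # split
        n = min(int(np.ceil(w / c)), MAX_SPLIT)
        for _ in range(n - 1):
            stack.append((cell, w / n))
        return w / n
    if w < 0.5 * c:                               # Russian roulette
        if rng.random() < w / c:
            return c                              # survivor carries the window-center weight
        return None                               # killed
    return w

leak_tally = 0.0
for _ in range(N_SOURCE):
    stack = [(0, 1.0)]
    while stack:
        cell, w = stack.pop()
        w = apply_weight_window(cell, w, stack)
        if w is None:
            continue
        u = rng.random()                          # transport one step of the toy random walk
        if u < P_ABS:
            continue                              # absorbed
        cell += 1 if u < P_ABS + P_FWD else -1
        if cell >= N_CELLS:
            leak_tally += w                       # scored: leaked past the slab
        elif cell >= 0:
            stack.append((cell, w))
        # cell < 0: escaped backwards, lost

print(f"estimated leakage per source particle: {leak_tally / N_SOURCE:.3e}")
```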

  4. New advances in the statistical parton distributions approach*

    Directory of Open Access Journals (Sweden)

    Soffer Jacques

    2016-01-01

    Full Text Available The quantum statistical parton distributions approach proposed more than a decade ago is revisited by considering a larger set of recent and accurate Deep Inelastic Scattering experimental results. It enables us to improve the description of the data by means of a new determination of the parton distributions. This global next-to-leading order QCD analysis leads to a good description of several structure functions, involving unpolarized parton distributions and helicity distributions, in terms of a rather small number of free parameters. Several serious challenging issues remain. The predictions of this theoretical approach will be tested for single-jet production and charge asymmetry in W± production in p̄p and pp collisions up to LHC energies, using recent data and also forthcoming experimental results.

  5. High Burden of Subclinical Lead Toxicity after Phase Out of Lead from Petroleum in Pakistan.

    Science.gov (United States)

    Majid, Hafsa; Khan, Aysha Habib; Khan, Nadeem Ullah; Siddiqui, Imran; Ghani, Farooq; Jafri, Lena

    2017-12-01

    To evaluate the frequency of subclinical lead toxicity. Cross-sectional study. Department of Pathology and Laboratory Medicine, The Aga Khan University Hospital, Karachi, from January 2011 to December 2014. Analysis of laboratory data for blood lead levels (BLL) was performed. Lead was tested by atomic absorption spectrometry. For all subjects, only initial test results were included, while the results of repeated testing were excluded. Exemption was sought from the institutional ethical review committee. BLL of 2-10 ug/dl in children and 10-70 ug/dl in adults were taken as subclinical lead toxicity. Amongst the total number of subjects tested (n=524), 26.5% (n=139) were children; the higher lead level [16.9 ug/dl (36.1-4)] and the lower level [4.2 ug/dl (6.8-2.6)] were both observed in children, most of whom had either subclinical (76%, n=106) or toxic (8%, n=11) lead levels. In adults, 55% (n=212) of subjects had desired lead levels, while 40% (n=154) and 4.99% (n=19) had subclinical and toxic lead levels, respectively. The presence of subclinical lead poisoning, even after the phase-out of leaded petroleum in Pakistan, is alarming, especially in children. A national population-based study to determine lead status, and targeted interventions to identify potential sources, are the need of the hour.

  6. An investigation into international business collaboration in higher education organisations: a case study of international partnerships in four UK leading universities

    OpenAIRE

    Ayoubi, R; Al-Habaibeh, A

    2006-01-01

    Purpose - The purpose of this paper is to develop a comparative analysis of the main objectives of international institutional partnerships in four leading UK universities. Based on the presented case studies, the paper outlines a model for the objectives and implementation of international partnerships. Design/methodology/approach - Using a multiple case study approach, the paper employs three sources of data: templates of international partnerships, actual agreements of international partnership...

  7. From the Cover: 7,8-Dihydroxyflavone Rescues Lead-Induced Impairment of Vesicular Release: A Novel Therapeutic Approach for Lead Intoxicated Children.

    Science.gov (United States)

    Zhang, Xiao-Lei; McGlothan, Jennifer L; Miry, Omid; Stansfield, Kirstie H; Loth, Meredith K; Stanton, Patric K; Guilarte, Tomás R

    2018-01-01

    Childhood lead (Pb2+) intoxication is a public health problem of global proportions. Lead exposure during development produces multiple effects on the central nervous system including impaired synapse formation, altered synaptic plasticity, and learning deficits. In primary hippocampal neurons in culture and hippocampal slices, Pb2+ exposure inhibits vesicular release and reduces the number of fast-releasing sites, an effect associated with Pb2+ inhibition of NMDA receptor-mediated trans-synaptic Brain-Derived Neurotrophic Factor (BDNF) signaling. The objective of this study was to determine if activation of TrkB, the cognate receptor for BDNF, would rescue Pb2+-induced impairments of vesicular release. Rats were chronically exposed to Pb2+ prenatally and postnatally until 50 days of age. This chronic Pb2+ exposure paradigm enhanced paired-pulse facilitation of synaptic potentials in Schaffer collateral-CA1 synapses in the hippocampus, a phenomenon indicative of reduced vesicular release probability. Decreased vesicular release probability was confirmed by both mean-variance analysis and direct 2-photon imaging of vesicular release from hippocampal slices of rats exposed to Pb2+ in vivo. We also found a Pb2+-induced impairment of calcium influx in Schaffer collateral-CA1 synaptic terminals. Intraperitoneal injection of Pb2+-exposed rats with the TrkB receptor agonist 7,8-dihydroxyflavone (5 mg/kg) for 14-15 days, starting at postnatal day 35, reversed all Pb2+-induced impairments of presynaptic transmitter release at Schaffer collateral-CA1 synapses. This study demonstrates for the first time that in vivo pharmacological activation of TrkB receptors by small molecules such as 7,8-dihydroxyflavone can reverse long-term effects of chronic Pb2+ exposure on presynaptic terminals, pointing to TrkB receptor activation as a promising therapeutic intervention in Pb2+-intoxicated children. © The Author 2017. Published by Oxford University Press on behalf of the Society of

  8. Effective Approach to Calculate Analysis Window in Infinite Discrete Gabor Transform

    Directory of Open Access Journals (Sweden)

    Rui Li

    2018-01-01

    Full Text Available The long-periodic/infinite discrete Gabor transform (DGT) is more effective than the periodic/finite one in many applications. In this paper, a fast and effective approach is presented to compute the Gabor analysis window for an arbitrary given synthesis window in the DGT of long-periodic/infinite sequences, in which a new orthogonality constraint between the analysis window and the synthesis window in the DGT for long-periodic/infinite sequences is derived and proved to be equivalent to the completeness condition of the long-periodic/infinite DGT. By using the property of the delta function, the original orthogonality can be expressed as a certain number of sets of linear equations in both the critical sampling case and the oversampling case, which can be calculated quickly and efficiently by the fast Fourier transform (FFT). The computational complexity of the proposed approach is analyzed and compared with that of the existing canonical algorithms. The numerical results indicate that the proposed approach is efficient and fast for computing the Gabor analysis window in both the critical sampling case and the oversampling case in comparison to existing algorithms.
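
    For the finite, periodic "painless" special case (window support no longer than the number of frequency channels M), the frame operator is diagonal and the canonical analysis (dual) window can be written down directly. The sketch below shows only that textbook special case with a brute-force reconstruction check; it is not the authors' algorithm for long-periodic/infinite sequences, and the normalization follows one common DGT convention.

```python
import numpy as np

def canonical_dual_window(g, a, M):
    """Painless-case canonical analysis (dual) window for a length-L periodic DGT
    with time shift a and M frequency channels (requires window support <= M)."""
    L = len(g)
    denom = np.zeros(L)
    for k in range(L // a):
        denom += np.abs(np.roll(g, k * a)) ** 2
    return g / (M * denom)

# Small numerical check of perfect reconstruction (toy sizes, oversampled lattice)
L, a, M = 48, 4, 8
n = np.arange(L)
g = np.zeros(L)
g[:M] = np.hanning(M)                  # synthesis window with support length M
gd = canonical_dual_window(g, a, M)

rng = np.random.default_rng(1)
f = rng.standard_normal(L)

# Analysis with the dual window, synthesis with g (direct, slow double loop)
f_rec = np.zeros(L, dtype=complex)
for k in range(L // a):
    for m in range(M):
        atom_syn = np.roll(g, k * a) * np.exp(2j * np.pi * m * n / M)
        atom_ana = np.roll(gd, k * a) * np.exp(2j * np.pi * m * n / M)
        f_rec += np.vdot(atom_ana, f) * atom_syn

print("max reconstruction error:", np.max(np.abs(f_rec.real - f)))
```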

  9. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    Science.gov (United States)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-01

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive-quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic-level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as the degree of rate control, has been hampered by the exorbitant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models, as we demonstrate using CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions, which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences to evaluate the sensitivity measure for lattice-based models. This allows for an efficient evaluation even in critical regions near a second-order phase transition, which are hitherto difficult to handle. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.
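
    The quantity being sampled here is easiest to see on a deterministic mean-field analogue. The sketch below uses a toy Langmuir-Hinshelwood CO oxidation model with made-up rate constants (not the lattice kMC model or the three-stage estimators of the paper) and computes a degree-of-rate-control-like sensitivity as the central finite difference of ln(TOF) with respect to ln(k_i) for each rate constant individually.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy mean-field Langmuir-Hinshelwood CO oxidation (illustrative rate constants)
p_CO, p_O2 = 1.0, 1.0
k0 = {"CO_ads": 1.0, "CO_des": 0.5, "O2_ads": 2.0, "O2_des": 0.1, "reaction": 5.0}

def turnover_frequency(k):
    """Steady-state coverages from the mean-field rate equations, then TOF."""
    def rhs(theta):
        th_co, th_o = theta
        th_free = 1.0 - th_co - th_o
        d_co = k["CO_ads"] * p_CO * th_free - k["CO_des"] * th_co \
               - k["reaction"] * th_co * th_o
        d_o = 2 * k["O2_ads"] * p_O2 * th_free**2 - 2 * k["O2_des"] * th_o**2 \
              - k["reaction"] * th_co * th_o
        return [d_co, d_o]
    th_co, th_o = fsolve(rhs, [0.3, 0.3])
    return k["reaction"] * th_co * th_o

def sensitivity(k, name, rel=0.01):
    """Central finite difference of ln(TOF) with respect to ln(k_name)."""
    kp, km = dict(k), dict(k)
    kp[name] *= (1 + rel)
    km[name] *= (1 - rel)
    return (np.log(turnover_frequency(kp)) - np.log(turnover_frequency(km))) \
           / (np.log(1 + rel) - np.log(1 - rel))

print(f"TOF = {turnover_frequency(k0):.4f}")
for name in k0:
    print(f"  X_RC({name}) = {sensitivity(k0, name):+.3f}")
```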

  10. Steady and dynamic states analysis of induction motor: FEA approach

    African Journals Online (AJOL)

    This paper deals with the steady-state and dynamic analysis of an induction motor using a finite element analysis (FEA) approach. The motor has aluminum rotor bars and is designed for direct-on-line operation at 50 Hz. A study of the losses occurring in the motor, performed at the operating frequency of 50 Hz, showed that stator ...

  11. Lead in the environment

    Science.gov (United States)

    Pattee, Oliver H.; Pain, Deborah J.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John

    2003-01-01

    Anthropogenic uses of lead have probably altered its availability and environmental distribution more than those of any other toxic element. Consequently, lead concentrations in many living organisms may be approaching thresholds of toxicity. Such thresholds are difficult to define, as they vary with the chemical and physical form of lead, the exposure regime, and the other elements present, and also vary both within and between species. The technological capability to accurately quantify low lead concentrations has increased over the last decade, and physiological and behavioral effects have been measured in wildlife with tissue lead concentrations below those previously considered safe for humans. Consequently, lead criteria for the protection of wildlife and human health are frequently under review, and 'thresholds' of lead toxicity are being reconsidered. Proposed lead criteria for the protection of natural resources have been reviewed by Eisler. Uptake of lead by plants is limited by its generally low availability in soils and sediments, and toxicity may be limited by storage mechanisms and its apparently limited translocation within most plants. Lead does not generally accumulate within the foliar parts of plants, which limits its transfer to higher trophic levels. Although lead may concentrate in plant and animal tissues, no evidence of biomagnification exists. Acid deposition onto surface waters and soils with low buffering capacity may influence the availability of lead for uptake by plants and animals, and this may merit investigation at susceptible sites. The biological significance of chronic low-level lead exposure to wildlife is sometimes difficult to quantify. Animals living in urban environments or near point sources of lead emission are inevitably subject to greater exposure to lead and enhanced risk of lead poisoning. Increasingly strict controls on lead emissions in many countries have reduced exposure to lead from some sources

  12. NASA Armstrong's Approach to Store Separation Analysis

    Science.gov (United States)

    Acuff, Chris; Bui, Trong

    2015-01-01

    The presentation will give an overview of NASA Armstrong's store separation capabilities and how they have been applied recently. The objective of the presentation is to brief Generation Orbit and other potential partners on NASA Armstrong's store separation capabilities. It will include discussions on the use of NAVSEP and Cart3D, as well as some Python scripting work to perform the analysis, and a short overview of this methodology applied to the Towed Glider Air Launch System. Collaboration with potential customers in this area could lead to funding for the further development of a store separation capability at NASA Armstrong, which would boost the portfolio of engineering expertise at the center.

  13. Phasing out lead from gasoline in Pakistan: a benefit cost analysis

    International Nuclear Information System (INIS)

    Martin, R.P.; Zaman, Q.U.

    1999-01-01

    Medical research has established a clear link between elevated blood lead levels and adverse health effects in humans, including the retardation of neurological development, hypertension, and cardiovascular ailments. Due to this, a large number of countries now restrict the sale of leaded gasoline. In contrast, only highly leaded gasoline is readily available in Pakistan, resulting in serious health concerns in certain areas. This paper presents the findings of a study to evaluate consumers' perceived benefits and actual costs of switching to unleaded gasoline in Pakistan. Policy implications are noted. The study indicates a concentration of adverse health effects in the major urban centers. Of special interest is the loss of approximately 2,5000 IQ points annually in Karachi and Lahore as a result of gasoline-linked lead exposure. Consumers' willingness to pay for the removal of lead from gasoline, as estimated using a contingent valuation technique, is shown to be positively related to both educational attainment and income. Once consumers are informed of the adverse health effects associated with lead exposure, their willingness to pay for a switch to unleaded gasoline far exceeds the costs incurred. This suggests that significant gains in social welfare may be obtained by phasing out lead from gasoline in Pakistan. The benefits are most pronounced in urban areas, while in rural villages and small cities the costs are likely to outweigh the benefits. A flexible program to restrict the sale of leaded gasoline in urban areas is thus recommended. (author)

  14. Minimal invasive epicardial lead implantation: optimizing cardiac resynchronization with a new mapping device for epicardial lead placement.

    Science.gov (United States)

    Maessen, J G; Phelps, B; Dekker, A L A J; Dijkman, B

    2004-05-01

    To optimize resynchronization in biventricular pacing with epicardial leads, mapping to determine the best pacing site is a prerequisite. A port-access surgical mapping technique was developed that allowed multiple pace site selection and reproducible lead evaluation and implantation. Pressure-volume loop analysis was used for real-time guidance in targeting epicardial lead placement. Even the smallest changes in lead position revealed significantly different functional results. Optimizing the pacing site with this technique allowed functional improvement of up to 40% versus random pace site selection.

  15. Leading Causes of Death among Asian American Subgroups (2003-2011).

    Directory of Open Access Journals (Sweden)

    Katherine G Hastings

    Full Text Available Our current understanding of Asian American mortality patterns has been distorted by the historical aggregation of diverse Asian subgroups on death certificates, masking important differences in the leading causes of death across subgroups. In this analysis, we aim to fill an important knowledge gap in Asian American health by reporting leading causes of mortality by disaggregated Asian American subgroups. We examined national mortality records for the six largest Asian subgroups (Asian Indian, Chinese, Filipino, Japanese, Korean, Vietnamese) and non-Hispanic Whites (NHWs) from 2003-2011, and ranked the leading causes of death. We calculated all-cause and cause-specific age-adjusted rates, temporal trends with annual percent changes, and rate ratios by race/ethnicity and sex. Rankings revealed that as an aggregated group, cancer was the leading cause of death for Asian Americans. When disaggregated, there was notable heterogeneity. Among women, cancer was the leading cause of death for every group except Asian Indians. In men, cancer was the leading cause of death among Chinese, Korean, and Vietnamese men, while heart disease was the leading cause of death among Asian Indians, Filipino and Japanese men. The proportion of death due to heart disease for Asian Indian males was nearly double that of cancer (31% vs. 18%). Temporal trends showed increased mortality of cancer and diabetes in Asian Indians and Vietnamese; increased stroke mortality in Asian Indians; increased suicide mortality in Koreans; and increased mortality from Alzheimer's disease for all racial/ethnic groups from 2003-2011. All-cause rate ratios revealed that overall mortality is lower in Asian Americans compared to NHWs. Our findings show heterogeneity in the leading causes of death among Asian American subgroups. Additional research should focus on culturally competent and cost-effective approaches to prevent and treat specific diseases among these growing diverse populations.

  16. Leading Causes of Death among Asian American Subgroups (2003-2011).

    Science.gov (United States)

    Hastings, Katherine G; Jose, Powell O; Kapphahn, Kristopher I; Frank, Ariel T H; Goldstein, Benjamin A; Thompson, Caroline A; Eggleston, Karen; Cullen, Mark R; Palaniappan, Latha P

    2015-01-01

    Our current understanding of Asian American mortality patterns has been distorted by the historical aggregation of diverse Asian subgroups on death certificates, masking important differences in the leading causes of death across subgroups. In this analysis, we aim to fill an important knowledge gap in Asian American health by reporting leading causes of mortality by disaggregated Asian American subgroups. We examined national mortality records for the six largest Asian subgroups (Asian Indian, Chinese, Filipino, Japanese, Korean, Vietnamese) and non-Hispanic Whites (NHWs) from 2003-2011, and ranked the leading causes of death. We calculated all-cause and cause-specific age-adjusted rates, temporal trends with annual percent changes, and rate ratios by race/ethnicity and sex. Rankings revealed that as an aggregated group, cancer was the leading cause of death for Asian Americans. When disaggregated, there was notable heterogeneity. Among women, cancer was the leading cause of death for every group except Asian Indians. In men, cancer was the leading cause of death among Chinese, Korean, and Vietnamese men, while heart disease was the leading cause of death among Asian Indians, Filipino and Japanese men. The proportion of death due to heart disease for Asian Indian males was nearly double that of cancer (31% vs. 18%). Temporal trends showed increased mortality of cancer and diabetes in Asian Indians and Vietnamese; increased stroke mortality in Asian Indians; increased suicide mortality in Koreans; and increased mortality from Alzheimer's disease for all racial/ethnic groups from 2003-2011. All-cause rate ratios revealed that overall mortality is lower in Asian Americans compared to NHWs. Our findings show heterogeneity in the leading causes of death among Asian American subgroups. Additional research should focus on culturally competent and cost-effective approaches to prevent and treat specific diseases among these growing diverse populations.

  17. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has generally been verified and confirmed through accident simulations using computer codes, because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was assured by the large safety margins resulting from the conservative assumptions and models applied in those simulations. More recently, best-estimate analyses based on realistic assumptions and models, supported by accumulated insights, have become possible; this reduces the safety margin apparent in the analysis results and increases the need to evaluate the reliability or uncertainty of those results. This paper introduces an approach to evaluate the uncertainty of accident simulations and their results. (Note: This research was done not at the Japan Nuclear Energy Safety Organization but at the Tokyo Institute of Technology.) (author)
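
    One widely used ingredient of uncertainty evaluation for best-estimate safety analysis (not necessarily the method of this particular paper) is Wilks' formula, which gives the number of code runs needed so that an extreme sampled output bounds a given population quantile with a given confidence. A minimal sketch:

```python
from math import comb

def wilks_confidence(n_runs, quantile=0.95, order=1):
    """Confidence that the `order`-th largest of n_runs i.i.d. outputs bounds the
    given population quantile (one-sided tolerance limit)."""
    gamma = quantile
    # P(Binomial(n_runs, 1 - gamma) >= order)
    return 1.0 - sum(comb(n_runs, j) * (1 - gamma) ** j * gamma ** (n_runs - j)
                     for j in range(order))

def wilks_sample_size(quantile=0.95, confidence=0.95, order=1):
    """Smallest number of code runs meeting the quantile/confidence target."""
    n = order
    while wilks_confidence(n, quantile, order) < confidence:
        n += 1
    return n

print(wilks_sample_size())          # 59 runs for a first-order 95%/95% statement
print(wilks_sample_size(order=2))   # 93 runs when the 2nd-largest value is used
```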

  18. Experimental design research approaches, perspectives, applications

    CERN Document Server

    Stanković, Tino; Štorga, Mario

    2016-01-01

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under...

  19. XRD, lead equivalent and UV-VIS properties study of Ce and Pr lead silicate glasses

    International Nuclear Information System (INIS)

    Alias, Nor Hayati; Abdullah, Wan Shafie Wan; Isa, Norriza Mohd; Isa, Muhammad Jamal Md; Zali, Nurazila Mat; Abdullah, Nuhaslinda Ee; Muhammad, Azali

    2014-01-01

    In this work, Cerium (Ce) and Praseodymium (Pr) containing lead silicate glasses were produced in two different proportions, low (0.2 wt%) and high (0.4 wt%). These types of glasses can satisfy the characteristics required for radiation shielding glasses while minimizing the lead content of the glass. The radiation shielding properties of the synthesized glasses are described in the form of a lead equivalent study. XRD and UV-VIS analyses were performed to observe the structural changes of the synthesized glasses after 1.5 Gy gamma radiation exposure.

  20. Thermal optimization of the helium-cooled power leads for the SSC

    International Nuclear Information System (INIS)

    Demko, J.A.; Schiesser, W.E.; Carcagno, R.; McAshan, M.; McConeghy, R.

    1992-01-01

    The optimum thermal design of the power leads for the Superconducting Super Collider (SSC) will minimize the amount of Carnot work (which is a combination of refrigeration and liquefaction work) required. This optimization can be accomplished by the judicious selection of lead length and diameter. Even though an optimum set of dimensions is found, the final design must satisfy other physical constraints such as maximum allowable heat leak and helium vapor mass flow rate. A set of corresponding lengths and diameters has been determined that meets these requirements for the helium vapor-cooled, spiral-fin power lead design of the SSC. Early efforts by McFee and Mallon investigated optimizing power leads for cryogenic applications with no convection cooling. Later designs utilized the boiled-off helium vapor to cool the lead. One notable design for currents up to several thousand amps is presented by Efferson based on a series of recommendations discussed by Deiness. Buyanov presents many theoretical models and design formulae but does not demonstrate an approach to thermally optimizing the design of a vapor-cooled lead. In this study, a detailed numerical thermal model of a power lead design for the SSC has been developed. It was adapted from the dynamic model developed by Schiesser. This model was used to determine the optimum dimensions that minimize the Carnot refrigeration and liquefaction work due to the leads. Since the SSC leads will be cooled by supercritical helium, the flow of vapor is regulated by a control valve. These leads include a superconducting portion at the cold end. All of the material properties in the model are functions of temperature, and for the helium are functions of pressure and temperature. No pressure drop calculations were performed as part of this analysis. The diameter that minimizes the Carnot work was determined for four different lengths at a design current of 6600 amps
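
    A useful benchmark when sizing such leads is the classical result for a purely conduction-cooled lead made of a material obeying the Wiedemann-Franz law: with the length-to-area ratio chosen optimally, the heat reaching the cold end per unit current depends only on the end temperatures and the Lorenz number. The sketch below evaluates that textbook bound together with an ideal-Carnot work figure; it is only a first-order sanity check under those simplifying assumptions, not the detailed vapor-cooled SSC model described above.

```python
import numpy as np

L0 = 2.45e-8                 # Lorenz number [W*Ohm/K^2], Wiedemann-Franz law k*rho = L0*T
T_hot, T_cold = 300.0, 4.2   # lead end temperatures [K]
I = 6600.0                   # design current [A], as quoted for the SSC lead above

# Minimum cold-end heat load per ampere for an optimally proportioned,
# purely conduction-cooled lead (no helium vapor cooling).
q_per_amp = np.sqrt(L0 * (T_hot**2 - T_cold**2))   # [W/A]
Q_cold = q_per_amp * I                              # [W] per lead

# Ideal (Carnot) refrigeration work needed to remove that heat at T_cold.
W_ideal = Q_cold * (T_hot - T_cold) / T_cold

print(f"optimum conduction-only heat leak ~ {1e3 * q_per_amp:.1f} mW/A "
      f"-> {Q_cold:.0f} W per lead at {I:.0f} A")
print(f"ideal Carnot work ~ {W_ideal / 1e3:.1f} kW per lead "
      f"(vapor cooling lowers both figures substantially)")
```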

  1. Fetal ECG extraction using independent component analysis by Jade approach

    Science.gov (United States)

    Giraldo-Guzmán, Jader; Contreras-Ortiz, Sonia H.; Lasprilla, Gloria Isabel Bautista; Kotas, Marian

    2017-11-01

    Fetal ECG monitoring is a useful method to assess fetal health and detect abnormal conditions. In this paper we propose an approach to extract the fetal ECG from abdominal and chest signals using independent component analysis based on the joint approximate diagonalization of eigenmatrices (JADE) approach. The JADE approach avoids redundancy, which reduces matrix dimension and computational cost. Signals were filtered with a high-pass filter to eliminate low-frequency noise. Several levels of decomposition were tested until the fetal ECG was recognized in one of the separated source outputs. The proposed method is fast and shows good performance.
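
    A compact way to see the source-separation step is the sketch below, which uses scikit-learn's FastICA as a stand-in for JADE (JADE itself is not part of scikit-learn) on synthetic maternal/fetal mixtures. The waveform model, mixing matrix, sampling rate and filter settings are all illustrative assumptions, not the signals or parameters of the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

fs = 500.0                                         # assumed sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)                       # 10 s of data
rng = np.random.default_rng(0)

def synthetic_ecg(heart_rate_bpm, width):
    """Crude train of Gaussian 'QRS' pulses, an illustrative stand-in for real ECG."""
    period = 60.0 / heart_rate_bpm
    phase = (t % period) - period / 2
    return np.exp(-(phase / width) ** 2)

maternal = synthetic_ecg(72, 0.02)
fetal = 0.3 * synthetic_ecg(140, 0.01)
noise = 0.05 * rng.standard_normal((2, t.size))

# Two "electrodes" (abdomen, chest) as linear mixtures of the two sources plus noise
A = np.array([[1.0, 0.6],
              [0.9, 0.1]])                          # assumed mixing matrix
X = A @ np.vstack([maternal, fetal]) + noise        # shape (2, n_samples)

# High-pass filter to suppress baseline wander before separation
b, a = butter(2, 1.0 / (fs / 2), btype="highpass")
X = filtfilt(b, a, X, axis=1)

# ICA separation (FastICA here; the paper uses JADE)
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X.T).T                  # separated sources, shape (2, n_samples)
print("separated source shapes:", sources.shape)
```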

  2. Anterior approach versus posterior approach for Pipkin I and II femoral head fractures: A systemic review and meta-analysis.

    Science.gov (United States)

    Wang, Chen-guang; Li, Yao-min; Zhang, Hua-feng; Li, Hui; Li, Zhi-jun

    2016-03-01

    We performed a meta-analysis, pooling the results from controlled clinical trials, to compare the efficiency of anterior and posterior surgical approaches to Pipkin I and II fractures of the femoral head. Potential academic articles were identified from the Cochrane Library, Medline (1966-2015.5), PubMed (1966-2015.5), Embase (1980-2015.5) and ScienceDirect (1966-2015.5) databases. Grey literature was identified from the references of the included articles. Data pooling and analysis were performed with RevMan software, version 5.1. Five case-control trials (CCTs) met the inclusion criteria. There were significant differences in the incidence of heterotopic ossification (HO) between the approaches, but no significant differences were found between the two groups regarding functional outcomes of the hip, general postoperative complications, osteonecrosis of the femoral head or post-traumatic arthritis. The present meta-analysis indicated that the posterior approach decreased the risk of heterotopic ossification compared with the anterior approach for the treatment of Pipkin I and II femoral head fractures. No other complications were related to the choice of anterior or posterior approach. Future high-quality randomized, controlled trials (RCTs) are needed to determine the optimal surgical approach and to predict other postoperative complications. Level of evidence: III. Copyright © 2016 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.

  3. A SURVEY ON DOCUMENT CLUSTERING APPROACH FOR COMPUTER FORENSIC ANALYSIS

    OpenAIRE

    Monika Raghuvanshi*, Rahul Patel

    2016-01-01

    In a forensic analysis, large numbers of files are examined. Much of the information is in unstructured format, so it is quite a difficult task for computer forensics to perform such an analysis. That is why performing forensic analysis of documents within a limited period of time requires a special approach such as document clustering. This paper reviews different document clustering algorithms and methodologies, for example K-means, K-medoid, single link, complete link, and average link, in accordance...
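
    As a concrete baseline for the kind of clustering the survey discusses, the sketch below groups a handful of toy "documents" with TF-IDF features and K-means using scikit-learn; the corpus and the number of clusters are made up purely for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = [                                   # placeholder corpus
    "invoice payment transfer bank account",
    "bank account wire transfer receipt",
    "meeting schedule project deadline agenda",
    "project agenda meeting notes deadline",
    "vacation photos beach trip summer",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(documents)         # sparse TF-IDF matrix

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

for doc, label in zip(documents, labels):
    print(label, "-", doc)
```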

  4. Progress and challenges in bipolar lead-acid battery development

    Science.gov (United States)

    Bullock, Kathryn R.

    1995-05-01

    Bipolar lead-acid batteries have higher power densities than any other aqueous battery system. Predicted specific powers based on models and prototypes range from 800 kW/kg for 100 ms discharge times to 1.6 kW/kg for 10 s. A 48 V automotive bipolar battery could have 2 1/2 times the cold cranking rate of a monopolar 12 V design in the same size. Problems which have precluded the development of commercial bipolar designs include the instability of substrate materials and enhanced side reactions. Design approaches include pseudo-bipolar configurations, as well as true bipolar designs in planar and tubular configurations. Substrate materials used include lead and lead alloys, carbons, conductive ceramics, and tin-oxide-coated glass fibers. These approaches are reviewed and evaluated.

  5. Personalization of models with many model parameters: an efficient sensitivity analysis approach.

    Science.gov (United States)

    Donders, W P; Huberts, W; van de Vosse, F N; Delhaas, T

    2015-10-01

    Uncertainty quantification and global sensitivity analysis are indispensable for patient-specific applications of models that enhance diagnosis or aid decision-making. Variance-based sensitivity analysis methods, which apportion each fraction of the output uncertainty (variance) to the effects of individual input parameters or their interactions, are considered the gold standard. The variance portions are called the Sobol sensitivity indices and can be estimated by a Monte Carlo (MC) approach (e.g., Saltelli's method [1]) or by employing a metamodel (e.g., the (generalized) polynomial chaos expansion (gPCE) [2, 3]). All these methods require a large number of model evaluations when estimating the Sobol sensitivity indices for models with many parameters [4]. To reduce the computational cost, we introduce a two-step approach. In the first step, a subset of important parameters is identified for each output of interest using the screening method of Morris [5]. In the second step, a quantitative variance-based sensitivity analysis is performed using gPCE. Efficient sampling strategies are introduced to minimize the number of model runs required to obtain the sensitivity indices for models considering multiple outputs. The approach is tested using a model that was developed for predicting post-operative flows after creation of a vascular access for renal failure patients. We compare the sensitivity indices obtained with the novel two-step approach with those obtained from a reference analysis that applies Saltelli's MC method. The two-step approach was found to yield accurate estimates of the sensitivity indices at two orders of magnitude lower computational cost. Copyright © 2015 John Wiley & Sons, Ltd.
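
    To make the variance-based part concrete, here is a self-contained Monte Carlo estimator of first-order Sobol indices, a plain Saltelli-style pick-and-freeze scheme applied to the standard Ishigami test function. It illustrates the reference MC estimator mentioned in the abstract, not the authors' two-step Morris/gPCE pipeline.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Standard Ishigami test function, inputs uniform on [-pi, pi]."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(0)
d, N = 3, 50_000

# Two independent input sample matrices A and B (pick-and-freeze scheme)
A = rng.uniform(-np.pi, np.pi, size=(N, d))
B = rng.uniform(-np.pi, np.pi, size=(N, d))
fA, fB = ishigami(A), ishigami(B)
var_total = np.var(np.concatenate([fA, fB]))

# First-order index S_i: replace column i of A by column i of B
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    fABi = ishigami(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var_total
    print(f"S_{i + 1} ~ {S_i:.3f}")      # analytic values ~0.314, 0.442, 0.000
```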

  6. Telemetry-assisted early detection of STEMI in patients with atypical symptoms by paramedic-performed 12-lead ECG with subsequent cardiological analysis.

    Science.gov (United States)

    Campo Dell' Orto, Marco; Hamm, Christian; Liebetrau, Christoph; Hempel, Dorothea; Merbs, Reinhold; Cuca, Colleen; Breitkreutz, Raoul

    2017-08-01

    ECG is an essential diagnostic tool in patients with acute coronary syndrome. We aimed to determine how many patients presenting with atypical symptoms of an acute myocardial infarction show ST-segment elevations on prehospital ECG. We also aimed to study the feasibility of telemetry-assisted prehospital ECG analysis. Between April 2010 and February 2011, consecutive emergency patients presenting with atypical symptoms such as nausea, vomiting, atypical chest pain, palpitations, hypertension, syncope, or dizziness were included in the study. After basic measures were completed, a 12-lead ECG was recorded and telemetrically transmitted to the cardiac center, where it was analyzed by attending physicians. Any identification of an ST-elevation myocardial infarction resulted in patient admission to the closest coronary angiography facility. A total of 313 emergency patients presented with the following symptoms: dyspnea, nausea, vomiting, dizziness/collapse, or acute hypertension. Thirty-four (11%) patients in this cohort were found to show ST-segment elevations on the 12-lead ECG. These patients were directly admitted to the closest coronary catheterization facility rather than the closest hospital. The time required for transmission and analysis of the ECG was 3.6±1.2 min. Telemetry-assisted 12-lead ECG analysis in a prehospital setting may lead to earlier detection of ST-elevation myocardial infarction in patients with atypical symptoms. Thus, a 12-lead ECG should be considered in all prehospital patients with both typical and atypical symptoms.

  7. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  8. Hydrogen generation comparison between lead-calcium and lead-antimony batteries in nuclear power plant

    International Nuclear Information System (INIS)

    Zhao Hongjun; Qi Suoni; Shen Yan; Li Jia

    2014-01-01

    Battery type selection is performed with the help of technical information supplied by vendors and according to relevant criteria. Analysis and comparison of the differences in hydrogen generation between two different lead-acid battery types are carried out through calculation. The analysis results may provide suggestions for battery type selection in nuclear power plants. (authors)
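
    The underlying calculation is essentially Faraday's law: float (or overcharge) current that goes into electrolysis liberates hydrogen at a fixed molar rate per cell. The sketch below evaluates the hydrogen generation rate for assumed float currents and cell counts; the current values are placeholders for illustration, not vendor data for either battery type.

```python
# Hydrogen generation from float current via Faraday's law (ideal-gas volume at 25 degC)
F = 96485.0                          # Faraday constant [C/mol]
R, T, P = 8.314, 298.15, 101325.0    # gas constant, 25 degC, 1 atm

def h2_rate_liters_per_hour(float_current_A, n_cells):
    """Conservative case: all float current drives electrolysis (2 electrons per H2)."""
    mol_h2_per_s = float_current_A / (2.0 * F) * n_cells
    m3_per_s = mol_h2_per_s * R * T / P
    return m3_per_s * 1000.0 * 3600.0            # litres per hour

# Placeholder float currents for two 60-cell strings (illustrative only):
for name, i_float in [("lead-calcium", 0.5), ("lead-antimony (aged)", 3.0)]:
    print(f"{name:22s}: {h2_rate_liters_per_hour(i_float, 60):.1f} L/h of H2")
```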

  9. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms a classical neural network trained with index-level lagged values and an NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
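
    To illustrate the "one predictor per indicator category" idea on a small scale, the sketch below trains one model on trend-type indicators and one on momentum-type indicators, then averages their predictions. It uses scikit-learn MLPs instead of the PSO-tuned networks of the paper and a synthetic price series instead of real index data; the window lengths and network sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
price = 100 + np.cumsum(rng.normal(0.05, 1.0, 600))     # synthetic price series

def sma(t, w):                        # simple moving average ending at index t
    return price[t - w + 1: t + 1].mean()

start = 30                            # longest look-back window
rows = range(start, len(price) - 1)
trend_X = np.array([[sma(t, 5), sma(t, 10), sma(t, 20)] for t in rows])
momentum_X = np.array([[price[t] - price[t - k] for k in (5, 10, 20)] for t in rows])
y = np.array([price[t + 1] for t in rows])              # next-step price

split = int(0.8 * len(y))
ensemble_pred = np.zeros(len(y) - split)
for X in (trend_X, momentum_X):                          # one network per indicator category
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                                       random_state=0))
    model.fit(X[:split], y[:split])
    ensemble_pred += model.predict(X[split:]) / 2.0

rmse = np.sqrt(np.mean((ensemble_pred - y[split:]) ** 2))
print(f"ensemble RMSE on held-out tail: {rmse:.2f}")
```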

  10. Helicopter Gas Turbine Engine Performance Analysis : A Multivariable Approach

    NARCIS (Netherlands)

    Arush, Ilan; Pavel, M.D.

    2017-01-01

    Helicopter performance relies heavily on the available output power of the engine(s) installed. A simplistic single-variable analysis approach is often used within the flight-testing community to reduce raw flight-test data in order to predict the available output power under different atmospheric

  11. The Use of a Modified Semantic Features Analysis Approach in Aphasia

    Science.gov (United States)

    Hashimoto, Naomi; Frome, Amber

    2011-01-01

    Several studies have reported improved naming using the semantic feature analysis (SFA) approach in individuals with aphasia. Whether the SFA can be modified and still produce naming improvements in aphasia is unknown. The present study was designed to address this question by using a modified version of the SFA approach. Three, rather than the…

  12. Estimation of the exchange current density and comparative analysis of morphology of electrochemically produced lead and zinc deposits

    Directory of Open Access Journals (Sweden)

    Nikolić Nebojša D.

    2017-01-01

    Full Text Available The processes of lead and zinc electrodeposition from very dilute electrolytes were compared by the analysis of polarization characteristics and by scanning electron microscopic (SEM) analysis of the morphology of the deposits obtained in the galvanostatic regime of electrolysis. The exchange current densities for lead and zinc were estimated by comparison of experimentally obtained polarization curves with simulated curves obtained for different ratios of the exchange current density to the limiting diffusion current density. Using this approach for the estimation of the exchange current density, it is shown that the exchange current density for Pb was more than 1300 times higher than that for Zn. In this way, it is confirmed that the Pb electrodeposition processes are considerably faster than the Zn electrodeposition processes. The difference in the rate of the electrochemical processes was confirmed by a comparison of the morphologies of lead and zinc deposits obtained at current densities corresponding to 0.25 and 0.50 of the limiting diffusion current densities. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. 172046]
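
    The comparison described above rests on simulating cathodic polarization curves under mixed activation-diffusion control for different j0/jL ratios. A minimal sketch of that simulation step is given below; it uses the generic mixed-control expression with an illustrative Tafel slope and illustrative ratios, not the authors' exact fitting procedure or values.

```python
import numpy as np

def polarization_curve(eta_mV, j0_over_jL, b_mV=120.0):
    """Normalized cathodic current density j/jL under mixed activation-diffusion control:
    1/j = 1/(j0 * 10**(eta/b)) + 1/jL  =>  j/jL = r*10**(eta/b) / (1 + r*10**(eta/b)),
    with r = j0/jL, eta the overpotential magnitude and b the Tafel slope."""
    kinetic = j0_over_jL * 10.0 ** (np.asarray(eta_mV) / b_mV)
    return kinetic / (1.0 + kinetic)

eta = np.array([50.0, 100.0, 200.0, 400.0])        # overpotential [mV]
for ratio, label in [(1e-1, "fast kinetics (Pb-like)"),
                     (1e-5, "slow kinetics (Zn-like)")]:
    print(f"{label:25s} j/jL at {eta} mV:", np.round(polarization_curve(eta, ratio), 4))
```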

  13. Generalized structured component analysis a component-based approach to structural equation modeling

    CERN Document Server

    Hwang, Heungsun

    2014-01-01

    Winner of the 2015 Sugiyama Meiko Award (Publication Award) of the Behaviormetric Society of Japan Developed by the authors, generalized structured component analysis is an alternative to two longstanding approaches to structural equation modeling: covariance structure analysis and partial least squares path modeling. Generalized structured component analysis allows researchers to evaluate the adequacy of a model as a whole, compare a model to alternative specifications, and conduct complex analyses in a straightforward manner. Generalized Structured Component Analysis: A Component-Based Approach to Structural Equation Modeling provides a detailed account of this novel statistical methodology and its various extensions. The authors present the theoretical underpinnings of generalized structured component analysis and demonstrate how it can be applied to various empirical examples. The book enables quantitative methodologists, applied researchers, and practitioners to grasp the basic concepts behind this new a...

  14. Library design practices for success in lead generation with small molecule libraries.

    Science.gov (United States)

    Goodnow, R A; Guba, W; Haap, W

    2003-11-01

    The generation of novel structures amenable to rapid and efficient lead optimization comprises an emerging strategy for success in modern drug discovery. Small molecule libraries of sufficient size and diversity to increase the chances of discovery of novel structures make the high throughput synthesis approach the method of choice for lead generation. Despite an industry trend toward smaller, more focused libraries, the need to generate novel lead structures makes larger libraries a necessary strategy. For libraries of several thousand or more members, solid phase synthesis approaches are the most suitable. While the technology and chemistry necessary for small molecule library synthesis continue to advance, success in lead generation requires rigorous consideration during the library design process to ensure the synthesis of molecules possessing the proper characteristics for subsequent lead optimization. Without proper selection of library templates and building blocks, solid phase synthesis methods often generate molecules which are too heavy, too lipophilic and too complex to be useful for lead optimization. The appropriate filtering of virtual library designs with multiple computational tools allows the generation of information-rich libraries within a drug-like molecular property space. An understanding of the hit-to-lead process provides a practical guide to molecular design characteristics. Examples of leads generated from library approaches also provide a benchmarking of successes as well as aspects for continued development of library design practices.
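
    A small example of the kind of computational filtering described above, here using RDKit (an assumption; the review does not name a specific toolkit) to discard virtual-library members that are too heavy, too lipophilic or too hydrogen-bond-donor rich for comfortable lead optimization. The SMILES entries and the cut-off values are illustrative, not the authors' design rules.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors

# Toy virtual-library members (SMILES chosen only for illustration)
library = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "greasy_decoy": "CCCCCCCCCCCCCCCCc1ccc(CCCCCCCC)cc1",
    "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
}

# Illustrative lead-like property window (intentionally tighter than drug-like rules)
MAX_MW, MAX_LOGP, MAX_HBD = 350.0, 3.5, 3

for name, smiles in library.items():
    mol = Chem.MolFromSmiles(smiles)
    mw = Descriptors.MolWt(mol)
    logp = Descriptors.MolLogP(mol)
    hbd = Descriptors.NumHDonors(mol)
    keep = mw <= MAX_MW and logp <= MAX_LOGP and hbd <= MAX_HBD
    print(f"{name:14s} MW={mw:6.1f} cLogP={logp:5.2f} HBD={hbd} -> "
          f"{'keep' if keep else 'reject'}")
```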

  15. Study on the systematic approach of Markov modeling for dependability analysis of complex fault-tolerant features with voting logics

    International Nuclear Information System (INIS)

    Son, Kwang Seop; Kim, Dong Hoon; Kim, Chang Hwoi; Kang, Hyun Gook

    2016-01-01

    The Markov analysis is a technique for modeling system state transitions and calculating the probability of reaching various system states. While it is a proper tool for modeling complex system designs involving timing, sequencing, repair, redundancy, and fault tolerance, as the complexity or size of the system increases, so does the number of states of interest, leading to difficulty in constructing and solving the Markov model. This paper introduces a systematic approach of Markov modeling to analyze the dependability of a complex fault-tolerant system. This method is based on the decomposition of the system into independent subsystem sets, and the system-level failure rate and the unavailability rate for the decomposed subsystems. A Markov model for the target system is easily constructed using the system-level failure and unavailability rates for the subsystems, which can be treated separately. This approach can decrease the number of states to consider simultaneously in the target system by building Markov models of the independent subsystems stage by stage, and results in an exact solution for the Markov model of the whole target system. To apply this method we construct a Markov model for the reactor protection system found in nuclear power plants, a system configured with four identical channels and various fault-tolerant architectures. The results show that the proposed method in this study treats the complex architecture of the system in an efficient manner using the merits of the Markov model, such as a time dependent analysis and a sequential process analysis. - Highlights: • Systematic approach of Markov modeling for system dependability analysis is proposed based on the independent subsystem set, its failure rate and unavailability rate. • As an application example, we construct the Markov model for the digital reactor protection system configured with four identical and independent channels, and various fault-tolerant architectures. • The
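
    To make the modeling step concrete, the sketch below builds the generator matrix of a very small Markov availability model, a 2-out-of-3 voting group of identical channels with independent repairs, and solves for the steady-state unavailability. It is a generic illustration of the technique, not the decomposed reactor protection system model of the paper, and the failure and repair rates are placeholders.

```python
import numpy as np

lam, mu = 1.0e-4, 1.0e-1     # per-channel failure and repair rates [1/h] (placeholders)
n = 3                        # identical channels, 2-out-of-3 voting

# States = number of failed channels (0..3); build the CTMC generator matrix Q.
Q = np.zeros((n + 1, n + 1))
for k in range(n + 1):
    if k < n:
        Q[k, k + 1] = (n - k) * lam      # one more channel fails
    if k > 0:
        Q[k, k - 1] = k * mu             # one failed channel is repaired
    Q[k, k] = -Q[k].sum()

# Steady-state distribution: pi @ Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(n + 1)])
b = np.append(np.zeros(n + 1), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

unavailability = pi[2:].sum()            # voting logic fails with 2 or more channels down
print("steady-state probabilities:", np.round(pi, 9))
print(f"2-out-of-3 unavailability ~ {unavailability:.3e}")
```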

  16. Design of an Actinide-Burning, Lead or Lead-Bismuth Cooled Reactor that Produces Low-Cost Electricity

    Energy Technology Data Exchange (ETDEWEB)

    Mac Donald, Philip Elsworth; Weaver, Kevan Dean; Davis, Cliff Bybee; MIT folks

    2000-07-01

    indirect cycle designs has investigated the effects of various parameters to increase electric production at full power. For the direct-contact reactor, major issues related to the direct-contact heat transfer rate and entrainment and carryover of liquid lead-bismuth to the turbine have been identified and analyzed. An economic analysis approach was also developed to determine the cost of electricity production in the lead-bismuth reactor. The approach will be formulated into a model and applied to develop scientific cost estimates for the different reactor designs and thus aid in the selection of the most economic option. In the area of lead-bismuth coolant activation, the radiological hazard was evaluated with particular emphasis on the direct-contact reactor. In this system, the lack of a physical barrier between the primary and secondary coolant favors the release of the alpha-emitter Po-210 and its transport throughout the plant. Modeling undertaken on the basis of the scarce information available in the literature confirmed the importance of this issue, as well as the need for experimental work to reduce the uncertainties on the basic characteristics of volatile polonium chemical forms.

  17. Occupational exposure and biological evaluation of lead in Iranian workers-a systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Kourosh Sayehmiri

    2016-09-01

    Full Text Available Introduction: Lead exposure is considered a global health problem. The irreparable harmful effects of this heavy metal on humans have been proven in various studies. Compared to the general population, workers in related industries are more exposed to lead. Several studies have investigated occupational lead exposure and its biological evaluation in Iran; however, there is no overall estimate. Thus, the present study was conducted to determine the occupational exposure to lead and its biological evaluation in Iranian workers, using systematic review and meta-analysis. Material and Method: This study was carried out based on information obtained from databases including Magiran, Iranmedex, SID, Medlib, Trials Register, Scopus, Pubmed, Science Direct, Cochran, Embase, Medline, Web of Science, Springer, Online Library Wiley, and Google Scholar from 1991 to 2016, using standard key words. All of the reviewed papers which met the inclusion criteria were evaluated. Data combination was performed according to a random-effects model using Stata software, version 11.1. Result: In the 34 qualified studies, the mean blood lead level (BLL) in Iranian workers was estimated at 42.8 µg/dl (95% CI: 35.15-50.49). The minimum and maximum BLL belonged to the western (28.348 µg/dl) and central (45.928 µg/dl) regions of Iran, respectively. Considering different occupations, the lowest mean value was reported in textile industry workers (12.3 µg/dl), while the highest value was for zinc-lead mine workers (72.6 µg/dl). The mean breathing-air lead level of Iranian workers, reported in 4 studies, was estimated at 0.23 mg/m3 (95% CI: 0.14-0.33). Conclusion: Given the high concentrations of lead in blood and breathing air, increased protective measures and frequent screening are recommended. Scheduled clinical and paraclinical examinations should also be performed for workers.
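
    The pooling step referred to (a random-effects model in Stata) is typically a DerSimonian-Laird inverse-variance combination. The sketch below applies that standard estimator to made-up study means and standard errors; the numbers are placeholders and not the 34 studies of this review.

```python
import numpy as np

# Placeholder study-level data: mean BLL [ug/dl] and its standard error per study
means = np.array([35.0, 48.5, 41.2, 52.3, 39.8, 45.1])
se = np.array([3.0, 4.5, 2.8, 5.0, 3.5, 4.2])

v = se ** 2
w_fixed = 1.0 / v
y_fixed = np.sum(w_fixed * means) / np.sum(w_fixed)

# DerSimonian-Laird estimate of the between-study variance tau^2
k = len(means)
Q = np.sum(w_fixed * (means - y_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - (k - 1)) / c)

# Random-effects pooled mean and 95% confidence interval
w_re = 1.0 / (v + tau2)
pooled = np.sum(w_re * means) / np.sum(w_re)
se_pooled = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2 = {tau2:.2f}")
print(f"pooled mean = {pooled:.1f} ug/dl "
      f"(95% CI {pooled - 1.96 * se_pooled:.1f} to {pooled + 1.96 * se_pooled:.1f})")
```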

  18. A Monte Carlo Library Least Square approach in the Neutron Inelastic-scattering and Thermal-capture Analysis (NISTA) process in bulk coal samples

    Science.gov (United States)

    Reyhancan, Iskender Atilla; Ebrahimi, Alborz; Çolak, Üner; Erduran, M. Nizamettin; Angin, Nergis

    2017-01-01

    A new Monte-Carlo Library Least Square (MCLLS) approach for treating the non-linear radiation analysis problem in Neutron Inelastic-scattering and Thermal-capture Analysis (NISTA) was developed. 14 MeV neutrons were produced by a neutron generator via the 3H(2H,n)4He reaction. The prompt gamma ray spectra from bulk samples of seven different materials were measured by a Bismuth Germanate (BGO) gamma detection system. Polyethylene was used as neutron moderator, along with iron and lead as neutron and gamma ray shielding, respectively. The gamma detection system was equipped with a list-mode data acquisition system which streams spectroscopy data directly to the computer, event by event. A GEANT4 simulation toolkit was used for generating the single-element libraries of all the elements of interest. These libraries were then used in a Linear Library Least Square (LLLS) approach to fit an unknown experimental sample spectrum with the calculated elemental libraries. GEANT4 simulation results were also used for the selection of the neutron shielding material.
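
    The library least-squares step amounts to expressing a measured spectrum as a non-negative linear combination of pre-computed single-element library spectra. The sketch below does this with synthetic Gaussian "libraries" and scipy's non-negative least squares; peak positions, channel counts and abundances are all illustrative, not GEANT4 output or the materials of the paper.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
channels = np.arange(1024)

def library_spectrum(peaks):
    """Synthetic single-element library: a few Gaussian peaks on the channel axis."""
    s = np.zeros(channels.size)
    for center, height, width in peaks:
        s += height * np.exp(-0.5 * ((channels - center) / width) ** 2)
    return s

# Hypothetical single-element libraries (peak positions are placeholders)
libraries = {
    "C": library_spectrum([(440, 1.0, 8)]),
    "H": library_spectrum([(220, 1.0, 8)]),
    "Si": library_spectrum([(175, 0.6, 8), (390, 1.0, 8)]),
    "Fe": library_spectrum([(85, 0.8, 8), (760, 1.0, 8)]),
}
L = np.column_stack(list(libraries.values()))        # (channels x elements) design matrix

# Simulated "measured" bulk-sample spectrum: known mixture plus counting noise
true_amounts = np.array([3.0, 1.5, 0.8, 0.2])
measured = L @ true_amounts + rng.normal(0.0, 0.05, size=channels.size)

# Linear library least-squares fit with non-negativity constraint
fitted, _ = nnls(L, measured)
for element, est, true in zip(libraries, fitted, true_amounts):
    print(f"{element:2s}: fitted {est:.2f}  (true {true:.2f})")
```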

  19. Performance of Lead-Free versus Lead-Based Hunting Ammunition in Ballistic Soap

    Science.gov (United States)

    Gremse, Felix; Krone, Oliver; Thamm, Mirko; Kiessling, Fabian; Tolba, René Hany; Rieger, Siegfried; Gremse, Carl

    2014-01-01

    Background Lead-free hunting bullets are an alternative to lead-containing bullets which cause health risks for humans and endangered scavenging raptors through lead ingestion. However, doubts concerning the effectiveness of lead-free hunting bullets hinder the wide-spread acceptance in the hunting and wildlife management community. Methods We performed terminal ballistic experiments under standardized conditions with ballistic soap as surrogate for game animal tissue to characterize dimensionally stable, partially fragmenting, and deforming lead-free bullets and one commonly used lead-containing bullet. The permanent cavities created in soap blocks are used as a measure for the potential wound damage. The soap blocks were imaged using computed tomography to assess the volume and shape of the cavity and the number of fragments. Shots were performed at different impact speeds, covering a realistic shooting range. Using 3D image segmentation, cavity volume, metal fragment count, deflection angle, and depth of maximum damage were determined. Shots were repeated to investigate the reproducibility of ballistic soap experiments. Results All bullets showed an increasing cavity volume with increasing deposited energy. The dimensionally stable and fragmenting lead-free bullets achieved a constant conversion ratio while the deforming copper and lead-containing bullets showed a ratio, which increases linearly with the total deposited energy. The lead-containing bullet created hundreds of fragments and significantly more fragments than the lead-free bullets. The deflection angle was significantly higher for the dimensionally stable bullet due to its tumbling behavior and was similarly low for the other bullets. The deforming bullets achieved higher reproducibility than the fragmenting and dimensionally stable bullets. Conclusion The deforming lead-free bullet closely resembled the deforming lead-containing bullet in terms of energy conversion, deflection angle, cavity shape

  20. Performance of lead-free versus lead-based hunting ammunition in ballistic soap.

    Directory of Open Access Journals (Sweden)

    Felix Gremse

    Full Text Available BACKGROUND: Lead-free hunting bullets are an alternative to lead-containing bullets which cause health risks for humans and endangered scavenging raptors through lead ingestion. However, doubts concerning the effectiveness of lead-free hunting bullets hinder the wide-spread acceptance in the hunting and wildlife management community. METHODS: We performed terminal ballistic experiments under standardized conditions with ballistic soap as surrogate for game animal tissue to characterize dimensionally stable, partially fragmenting, and deforming lead-free bullets and one commonly used lead-containing bullet. The permanent cavities created in soap blocks are used as a measure for the potential wound damage. The soap blocks were imaged using computed tomography to assess the volume and shape of the cavity and the number of fragments. Shots were performed at different impact speeds, covering a realistic shooting range. Using 3D image segmentation, cavity volume, metal fragment count, deflection angle, and depth of maximum damage were determined. Shots were repeated to investigate the reproducibility of ballistic soap experiments. RESULTS: All bullets showed an increasing cavity volume with increasing deposited energy. The dimensionally stable and fragmenting lead-free bullets achieved a constant conversion ratio while the deforming copper and lead-containing bullets showed a ratio, which increases linearly with the total deposited energy. The lead-containing bullet created hundreds of fragments and significantly more fragments than the lead-free bullets. The deflection angle was significantly higher for the dimensionally stable bullet due to its tumbling behavior and was similarly low for the other bullets. The deforming bullets achieved higher reproducibility than the fragmenting and dimensionally stable bullets. CONCLUSION: The deforming lead-free bullet closely resembled the deforming lead-containing bullet in terms of energy conversion

  1. Russian regulatory approaches to seismic design and seismic analysis of NPP piping

    International Nuclear Information System (INIS)

    Kaliberda, Y.V.

    2003-01-01

    The paper presents an overview of Russian regulatory approaches to seismic design and seismic analysis of NPP piping. The paper is focused on categorization and seismic analysis of nuclear power plant items (piping, equipment, supports, valves, but not building structures). The paper outlines the current seismic recommendations and corresponding methods, with examples of calculation models. The paper considers calculation results on the mechanisms of dynamic behavior and the problems of developing rational and economical approaches to seismic design and seismic protection. (author)

  2. Impacts of aerosol lead to natural ecosystems

    International Nuclear Information System (INIS)

    Murozumi, Masayo; Nakamura, Seiji; Yoshida, Katsumi

    1982-01-01

    Impacts of aerosol lead have changed the concentration and isotopic ratios of the element circulating in remote ecosystems in the Hidaka and Tarumae mountains. Concentrations of lead in successive 10-year ring veneers of Cercidiphyllum Japonica show that the amount of the element residing in the bark and sapwood layers has increased by a factor of 2 or more in comparison with that of the core part. The isotopic ratios of lead in the basement rocks and soils under the ecosystems converge to a certain narrow spot along the isochron line of the element, and distinguish their geochronological characteristics from other leads of different sources. In these ecosystems, however, the lead isotopic ratios of materials exposed to the atmosphere are similar to those of foreign and anthropogenic aerosol lead but are evidently dissimilar to those of the rocks and soils. Furthermore, the lead isotopic ratios in yearly ring veneers of Cercidiphyllum Japonica and Ostrya Japonica show a certain differentiation towards the bark from the core, i.e., an approach to those of anthropogenic aerosol lead from those of the basement rocks and soils, as listed in Table 7. The lead burden per hectare in these remote ecosystems has increased to 4 g by the impact of 2 g of aerosol lead. (author)

  3. PETRA - an Activity-based Approach to Travel Demand Analysis

    DEFF Research Database (Denmark)

    Fosgerau, Mogens

    2001-01-01

    This paper concerns the PETRA model developed by COWI in a project funded by the Danish Ministry of Transport, the Danish Transport Council and the Danish Energy Research Program. The model provides an alternative approach to activity based travel demand analysis that excludes the time dimension...

  4. Classification of lead white pigments using synchrotron radiation micro X-ray diffraction

    International Nuclear Information System (INIS)

    Welcomme, E.; Walter, P.; Menu, M.; Bleuet, P.; Hodeau, J.L.; Dooryhee, E.; Martinetto, P.

    2007-01-01

    Lead white pigment has been used and synthesised for cosmetic and artistic purposes since antiquity. Ancient texts describe the various recipes and preparation processes, as well as locations of production. In this study, we describe the results achieved on several paint samples taken from Matthias Gruenewald's works. Gruenewald, who was active between 1503 and 1524, was a major painter at the beginning of the German Renaissance. Thanks to X-ray diffraction analysis using synchrotron radiation, it is possible to associate the composition of the paint samples with the masters' ancient recipes. Different approaches were used, in reflection and transmission modes, directly on minute samples or on paint cross-sections embedded in resin. Characterisation of lead white pigments reveals variations in terms of composition, graininess and proportion of mineral phases. The present work highlights the presence of lead white in differentiable main composition groups, which could be specific to a period, a know-how or a geographical origin. In this way, we aim at understanding the choices and the trading of pigments used to realise paintings during the northern European Renaissance. (orig.)

  5. Classification of lead white pigments using synchrotron radiation micro X-ray diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Welcomme, E.; Walter, P.; Menu, M. [Centre de Recherche et de Restauration des Musees de France - CNRS UMR 171, Paris (France); Bleuet, P. [European Synchrotron Radiation Facility, BP 220, Grenoble Cedex (France); Hodeau, J.L.; Dooryhee, E.; Martinetto, P. [Institut Neel CNRS-UPR 503-1, 25, Av. des Martyrs, BP 166, Grenoble Cedex 9 (France)

    2007-12-15

    Lead white pigment has been used and synthesised for cosmetic and artistic purposes since antiquity. Ancient texts describe the various recipes and preparation processes, as well as locations of production. In this study, we describe the results achieved on several paint samples taken from Matthias Gruenewald's works. Gruenewald, who was active between 1503 and 1524, was a major painter at the beginning of the German Renaissance. Thanks to X-ray diffraction analysis using synchrotron radiation, it is possible to associate the composition of the paint samples with the masters' ancient recipes. Different approaches were used, in reflection and transmission modes, directly on minute samples or on paint cross-sections embedded in resin. Characterisation of lead white pigments reveals variations in terms of composition, graininess and proportion of mineral phases. The present work highlights the presence of lead white in differentiable main composition groups, which could be specific to a period, a know-how or a geographical origin. In this way, we aim at understanding the choices and the trading of pigments used to realise paintings during the northern European Renaissance. (orig.)

  6. Occupational exposures to solvents and lead as risk factors for Alzheimer's disease: A collaborative re-analysis of case-control studies

    NARCIS (Netherlands)

    A.B. Graves; C.M. van Duijn (Cornelia); V. Chandra; L. Fratiglioni (Laura); A. Heyman; A.F. Jorm; E. Kokmen (Emre); K. Kondo; J.A. Mortimer; W.A. Rocca; S.L. Shalat; H. Soininen; A. Hofman (Albert)

    1991-01-01

    A meta-analysis, involving the secondary analysis of original data from 11 case-control studies of Alzheimer's disease, is presented for occupational exposures to solvents and lead. Three studies had data on occupational exposure to solvents. Among cases, 21.3% were reported to have been

  7. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user's query. We used different theoretical retrieval models: probabilistic as InL2 (Divergence from Randomness model) and language model and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to related document network comprised of social links. We called Directed Graph of Documents (DGD) a network constructed with documents and social information provided from each one of them. Specifically, this work tackles the problem of book recommendation in the context of INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.

  8. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user's query. We used different theoretical retrieval models: probabilistic as InL2 (Divergence from Randomness model) and language model and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to related document network comprised of social links. We called Directed Graph of Documents (DGD) a network constructed with documents and social information provided from each one of them. Specifically, this work tackles the problem of book recommendation in the context of INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
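
    The Directed Graph of Documents and PageRank-based reranking described above can be sketched with a plain power-iteration PageRank over a small adjacency matrix. The toy graph, damping factor and document labels below are assumptions for illustration, not the authors' data or implementation.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=200):
    """Power-iteration PageRank on a dense adjacency matrix (row i lists outgoing links of node i)."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1, keepdims=True)
    # Rows with no outgoing links (dangling nodes) distribute their score uniformly
    transition = np.where(out_deg > 0, adj / np.where(out_deg == 0, 1, out_deg), 1.0 / n)
    rank = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        new_rank = (1 - damping) / n + damping * transition.T @ rank
        if np.abs(new_rank - rank).sum() < tol:
            break
        rank = new_rank
    return rank

# Toy "Directed Graph of Documents": edge i -> j means document i points (socially) to document j
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
scores = pagerank(adj)
print({f"doc{i}": round(s, 3) for i, s in enumerate(scores)})
```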

  9. An Analysis of the Factors Leading to Rising Credit Risk in the Zimbabwe Banking Sector

    Directory of Open Access Journals (Sweden)

    Maxwell Sandada

    2016-02-01

    Full Text Available The study sought to analyse the factors that lead to rising credit risk in the Zimbabwean banking sector. The objective was to ascertain the impact of macroeconomic, industry and bank-specific factors on rising credit risk in Zimbabwe. The study aimed at contributing to the credit risk management literature by providing evidence from a Sub-Saharan context. Anchored on the positivist quantitative research approach, a survey was carried out to gather the data, which were analysed using descriptive, correlation and regression analyses. The results revealed that the most significant factors leading to credit risk in the Zimbabwean banking sector were macroeconomic and bank-specific factors. The industry factors did not show a significant influence on the rising credit risk. The research findings of this study will be a valuable addition to the existing knowledge and provide a platform for further research on how the credit risk problems can be dealt with. While credit risk is known as one of the risks inherent to any banking institution, the alarming levels of credit risk in the Zimbabwean banking sector have motivated this current study to critically analyse the factors that have led to the high credit risk levels.

  10. Identification of the sources of metal (lead) contamination in drinking waters in north-eastern Tasmania using lead isotopic compositions.

    Science.gov (United States)

    Harvey, P J; Handley, H K; Taylor, M P

    2015-08-01

    This study utilises a range of scientific approaches, including lead isotopic compositions, to differentiate unknown sources of ongoing lead contamination of a drinking water supply in north-eastern Tasmania, Australia. Drinking water lead concentrations are elevated above the Australian Drinking Water Guideline (10 μg/L), reaching 540 μg/L in the supply network. Water lead isotopic compositions from the town of Pioneer ((208)Pb/(207)Pb 2.406, (206)Pb/(207)Pb 1.144 to (208)Pb/(207)Pb 2.360, (206)Pb/(207)Pb 1.094) and Ringarooma ((208)Pb/(207)Pb 2.398, (206)Pb/(207)Pb 1.117) are markedly different from the local bedrock ((208)Pb/(207)Pb 2.496, (206)Pb/(207)Pb 1.237). The data show that the lead in the local waters is sourced from a combination of dilapidated drinking water infrastructure, including lead jointed pipelines, end-of-life polyvinyl chloride pipes and household plumbing. Drinking water is being inadvertently contaminated by aging infrastructure, and it is an issue that warrants investigation to limit the burden of disease from lead exposure.
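
    A common first-pass way to read such isotope ratios is a two-end-member mixing estimate on a single ratio, here (206)Pb/(207)Pb, using the bedrock value and the most anthropogenic water value from the abstract as end-members. This is a generic, simplified illustration (a rigorous treatment would weight the mix by lead concentration); it is not the authors' method.

```python
def mixing_fraction(r_sample, r_end_a, r_end_b):
    """Fraction of end-member A implied by a two-end-member mix on one isotope ratio.
    Note: ratio mixing is only approximately linear; Pb concentrations are ignored here."""
    return (r_sample - r_end_b) / (r_end_a - r_end_b)

bedrock_206_207 = 1.237          # local bedrock value quoted in the abstract
infrastructure_206_207 = 1.094   # lowest water value reported; assumed anthropogenic end-member
pioneer_water = 1.144            # one Pioneer water sample

f_bedrock = mixing_fraction(pioneer_water, bedrock_206_207, infrastructure_206_207)
print(f"approximate bedrock-derived fraction: {f_bedrock:.2f}")   # ~0.35
```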

  11. Lead isotope profiling in dairy calves.

    Science.gov (United States)

    Buchweitz, John; McClure-Brinton, Kimberly; Zyskowski, Justin; Stensen, Lauren; Lehner, Andreas

    2015-03-01

    Lead (Pb) is a common cause of heavy metal poisonings in cattle. Sources of Pb on farms include crankcase oil, machinery grease, batteries, plumbing, and paint chips. Consequently, consumption of Pb from these sources may negatively impact animal health and Pb may be inadvertently introduced into the food supply. Therefore, the scope of poisoning incidents must be clearly assessed and sources of intoxication identified and strategies to mitigate exposure evaluated and implemented to prevent future exposures. Stable isotope analysis by inductively-coupled plasma mass spectrometry (ICP-MS) has proven itself of value in forensic investigations. We report on the extension of Pb stable isotope analysis to bovine tissues and profile comparisons with paint chips and soils collected from an affected dairy farm to elucidate the primary source. Pb occurs naturally as four stable isotopes: (204)Pb, (206)Pb, (207)Pb, and (208)Pb. Herein a case is reported to illustrate the use of (207)Pb/(206)Pb and (208)Pb/(206)Pb ratios to link environmental sources of exposure with tissues from a poisoned animal. Chemical Pb profiling provides a valuable tool for field investigative approaches to Pb poisoning in production agriculture and is applicable to subclinical exposures. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. A simplified approach for slope stability analysis of uncontrolled waste dumps.

    Science.gov (United States)

    Turer, Dilek; Turer, Ahmet

    2011-02-01

    Slope stability analysis of municipal solid waste has always been problematic because of the heterogeneous nature of the waste materials. The requirement for large testing equipment in order to obtain representative samples has identified the need for simplified approaches to obtain the unit weight and shear strength parameters of the waste. In the present study, two of the most recently published approaches for determining the unit weight and shear strength parameters of the waste have been incorporated into a slope stability analysis using the Bishop method to prepare slope stability charts. The slope stability charts were prepared for uncontrolled waste dumps having no liner and leachate collection systems with pore pressure ratios of 0, 0.1, 0.2, 0.3, 0.4 and 0.5, considering the most critical slip surface passing through the toe of the slope. As the proposed slope stability charts were prepared by considering the change in unit weight as a function of height, they reflect field conditions better than accepting a constant unit weight approach in the stability analysis. They also streamline the selection of slope or height as a function of the desired factor of safety.
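
    Bishop's simplified method, referenced above, determines the factor of safety iteratively because FS appears on both sides of the slice equilibrium equation. The sketch below implements that iteration for a circular slip surface with a pore pressure ratio r_u; the slice geometry and soil parameters are invented, and the height-dependent unit weight used for the published charts is omitted.

```python
import math

def bishop_fs(slices, c, phi_deg, r_u, tol=1e-6, max_iter=100):
    """Bishop's simplified method for a circular slip surface.
    slices : list of (W, alpha_deg, b) = slice weight [kN/m], base inclination [deg], width [m]
    c      : effective cohesion [kPa]; phi_deg: effective friction angle; r_u: pore pressure ratio."""
    tan_phi = math.tan(math.radians(phi_deg))
    fs = 1.0  # initial guess
    for _ in range(max_iter):
        num, den = 0.0, 0.0
        for W, alpha_deg, b in slices:
            a = math.radians(alpha_deg)
            m_alpha = math.cos(a) + math.sin(a) * tan_phi / fs
            num += (c * b + W * (1.0 - r_u) * tan_phi) / m_alpha   # u*b = r_u * W
            den += W * math.sin(a)
        fs_new = num / den
        if abs(fs_new - fs) < tol:
            return fs_new
        fs = fs_new
    return fs

# Invented slice data for a small waste-dump slope: (W, alpha, b)
slices = [(120, -5, 2), (260, 5, 2), (380, 15, 2), (420, 25, 2), (360, 35, 2), (200, 45, 2)]
print(f"FS ≈ {bishop_fs(slices, c=15.0, phi_deg=25.0, r_u=0.2):.2f}")
```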

  13. Analysis of optically variable devices using a photometric light-field approach

    Science.gov (United States)

    Soukup, Daniel; Štolc, Svorad; Huber-Mörk, Reinhold

    2015-03-01

    Diffractive Optically Variable Image Devices (DOVIDs), sometimes loosely referred to as holograms, are popular security features for protecting banknotes, ID cards, or other security documents. Inspection, authentication, as well as forensic analysis of these security features are still demanding tasks requiring special hardware tools and expert knowledge. Existing equipment for such analyses is based either on a microscopic analysis of the grating structure or a point-wise projection and recording of the diffraction patterns. We investigated approaches for an examination of DOVID security features based on sampling the Bidirectional Reflectance Distribution Function (BRDF) of DOVIDs using photometric stereo- and light-field-based methods. Our approach is demonstrated on the practical task of automated discrimination between genuine and counterfeited DOVIDs on banknotes. For this purpose, we propose a tailored feature descriptor which is robust against several expected sources of inaccuracy but still specific enough for the given task. The suggested approach is analyzed from both theoretical as well as practical viewpoints and w.r.t. analysis based on photometric stereo and light fields. We show that especially the photometric method provides a reliable and robust tool for revealing DOVID behavior and authenticity.

  14. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  15. Topology optimization of compliant adaptive wing leading edge with composite materials

    Directory of Open Access Journals (Sweden)

    Tong Xinxing

    2014-12-01

    Full Text Available An approach for designing the compliant adaptive wing leading edge with composite material is proposed based on topology optimization. Firstly, an equivalent constitutive relationship of laminated glass fiber reinforced epoxy composite plates has been built based on the symmetric laminated plate theory. Then, an optimization objective function of the compliant adaptive wing leading edge was used to minimize the least square error (LSE) between the deformed curve and the desired aerodynamic shape. After that, the topology structures of the wing leading edge for different glass fiber ply-orientations were obtained by using the solid isotropic material with penalization (SIMP) model and a sensitivity filtering technique. The desired aerodynamic shape of the compliant adaptive wing leading edge was obtained based on the proposed approach. The topology structures of the wing leading edge depend on the glass fiber ply-orientation. Finally, the corresponding morphing experiment of the compliant wing leading edge with composite materials was implemented, which verified the morphing capability of the topology structure and illustrated the feasibility for designing compliant wing leading edges. The present paper lays the basis of ply-orientation optimization for compliant adaptive wing leading edges in the unmanned aerial vehicle (UAV) field.
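
    The SIMP model named above penalizes intermediate element densities by interpolating the material stiffness with a power law; the optimization then minimizes the least-square error between the deformed and target leading-edge curves. The snippet below only sketches that interpolation, its sensitivity and the LSE objective with assumed values; the finite element solver and the laminate constitutive model are not reproduced.

```python
import numpy as np

def simp_modulus(x, E0=70e9, Emin=1e-3 * 70e9, p=3.0):
    """SIMP interpolation: effective stiffness of an element with pseudo-density x in [0, 1]."""
    return Emin + x**p * (E0 - Emin)

def simp_sensitivity(x, E0=70e9, Emin=1e-3 * 70e9, p=3.0):
    """d(stiffness)/d(density); the penalization p > 1 drives the optimizer toward 0/1 designs."""
    return p * x**(p - 1.0) * (E0 - Emin)

def lse_objective(deformed, target):
    """Least-square error between the deformed leading-edge curve and the desired shape."""
    deformed = np.asarray(deformed, dtype=float)
    target = np.asarray(target, dtype=float)
    return float(np.sum((deformed - target) ** 2))

densities = np.array([0.2, 0.5, 0.9])
print("penalized E [GPa]:", np.round(simp_modulus(densities) / 1e9, 2))
print("sensitivities [GPa]:", np.round(simp_sensitivity(densities) / 1e9, 2))
print("LSE example:", lse_objective([0.0, 1.2, 2.9], [0.0, 1.0, 3.0]))
```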

  16. Management of functional Sprint Fidelis leads at cardiac resynchronization therapy-defibrillator generator replacement: a novel option for preventing inappropriate shocks from lead failure in fragile patients with high risk of sudden death.

    Science.gov (United States)

    Zhu, Dennis W X; Chu, Matthew M; House, Chad M

    2017-12-01

    In patients with a functional Sprint Fidelis lead at generator replacement, the manufacturer recommended either continuing to use the existing lead or replacing it with a new lead. For those patients who continue to use a functional Fidelis lead, the risk of inappropriate shocks remains present if the lead fails in the future. We evaluated the feasibility of an alternative approach at the time of cardiac resynchronization therapy-defibrillator (CRT-D) generator replacement in patients with a functional bipolar left ventricular (LV) lead for prevention of inappropriate shocks from future Fidelis lead failure. During the procedure, the pace/sense IS-1 connection pin of the functional Fidelis lead was intentionally inserted into the LV port of the new CRT-D generator, while the existing bipolar LV lead IS-1 connection pin was inserted into the right ventricular (RV) pace/sense port. After such switching, the existing bipolar LV lead was used for functional LV pacing/sensing, while the Fidelis lead was used for functional RV pacing and high voltage shock only and could no longer be used for the purpose of sensing and detecting. This approach precluded oversensing and inappropriate shocks should the functional Fidelis lead fail in the future. Six fragile patients, who were not considered suitable candidates for lead replacement, underwent the alternative approach. During a follow-up of 35 ± 23 months, the CRT-D system functioned normally in five patients. The Fidelis lead fractured in one patient 7 months after generator replacement. The malfunction was detected promptly and the defective lead was replaced. No inappropriate detections or shocks were triggered. In CRT-D patients with a functional Fidelis lead and a bipolar LV lead, switching of the Fidelis lead pace/sense IS-1 pin with the bipolar LV lead IS-1 pin at generator replacement did not affect normal system function. This novel approach may be valuable in fragile patients with high risk of sudden death for

  17. Advanced approaches to failure mode and effect analysis (FMEA) applications

    Directory of Open Access Journals (Sweden)

    D. Vykydal

    2015-10-01

    Full Text Available The present paper explores advanced approaches to the FMEA method (Failure Mode and Effect Analysis) which take into account the costs associated with occurrence of failures during the manufacture of a product. Different approaches are demonstrated using an example FMEA application to production of drawn wire. Their purpose is to determine risk levels, while taking account of the above-mentioned costs. Finally, the resulting priority levels are compared for developing actions mitigating the risks.
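
    One way to make a cost-aware FMEA concrete is to report, next to the classical risk priority number RPN = severity x occurrence x detection, an expected failure cost per failure mode. The scoring scales, cost figures and failure modes below are invented for illustration and do not reproduce the paper's specific model.

```python
# Hedged sketch: classical RPN plus a cost-weighted view of the same risks.
failure_modes = [
    # (name, severity 1-10, occurrence 1-10, detection 1-10, est. cost per failure [EUR], failures/year)
    ("surface crack in drawn wire", 7, 4, 6, 1200.0, 3.0),
    ("diameter out of tolerance",   5, 6, 3,  400.0, 8.0),
    ("die wear / scoring",          6, 3, 5,  900.0, 2.0),
]

for name, s, o, d, cost, rate in failure_modes:
    rpn = s * o * d                  # classical risk priority number
    expected_cost = cost * rate      # cost-based measure of the same risk [EUR/year]
    print(f"{name:30s} RPN={rpn:4d}  expected cost ≈ {expected_cost:7.0f} EUR/year")
```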

  18. Selection of mode for the measurement of lead isotope ratios by inductively coupled plasma mass spectrometry and its application to milk powder analysis

    International Nuclear Information System (INIS)

    Dean, J.R.; Ebdon, L.; Massey, R.

    1987-01-01

    An investigation into the selection of the optimum mode for the measurement of isotope ratios in inductively coupled plasma mass spectrometry (ICP-MS) is reported, with particular reference to lead isotope ratios. Variation in the accuracy and precision achievable using the measurement modes of scanning and peak jumping is discussed. It is concluded that if sufficient sample and time are available, scanning gives the best accuracy and precision. Isotope dilution analysis (IDA) has been applied to the measurement of the lead content of two dried milk powders of Australian and European origin introduced as slurries into ICP-MS. Differences in the lead isotope ratios in the two milk powders were investigated and the total lead content determined by IDA. Isotope dilution analysis permitted accurate data to be obtained with an RSD of 4.2% for milk powder containing less than 30 ng/g of lead. The ICP-MS technique is confirmed as a useful tool for IDA. (author)
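
    Isotope dilution quantification rests on measuring one isotope ratio in the spiked sample and solving the blend equation for the sample-to-spike atom ratio. The sketch below uses the standard single-spike relation with invented spike abundances, spike amount and sample mass; the actual spike composition and measured ratios are not given in the abstract.

```python
def isotope_dilution_conc(r_blend, a_sample, b_sample, a_spike, b_spike,
                          n_spike_mol, sample_mass_g, molar_mass=207.2):
    """Single-spike isotope dilution: returns the analyte content of the sample in ng/g.
    r_blend     : measured ratio isotope A / isotope B in the spiked sample
    a_*, b_*    : atom fractions of isotopes A and B in the sample and in the spike
    n_spike_mol : moles of the element added with the spike"""
    # From r_blend = (Nx*Ax + Ns*As) / (Nx*Bx + Ns*Bs), solve for the atom ratio Nx/Ns
    n_ratio = (a_spike - r_blend * b_spike) / (r_blend * b_sample - a_sample)
    n_sample_mol = n_ratio * n_spike_mol
    return n_sample_mol * molar_mass * 1e9 / sample_mass_g   # ng of Pb per g of milk powder

# Invented illustration: A = (206)Pb, B = (208)Pb, spike enriched in (208)Pb
conc = isotope_dilution_conc(r_blend=0.20,
                             a_sample=0.241, b_sample=0.524,   # natural-like Pb abundances
                             a_spike=0.02,  b_spike=0.95,      # hypothetical enriched spike
                             n_spike_mol=2.0e-11, sample_mass_g=0.5)
print(f"Pb concentration ≈ {conc:.1f} ng/g")   # of the same order as the <30 ng/g quoted above
```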

  19. Impact analysis of leading sub sector on basic sector to regional income in Siak Regency, Riau Province

    Science.gov (United States)

    Astuti, P.; Nugraha, I.; Abdillah, F.

    2018-02-01

    Until now, Siak regency has been known only as an oil-producing regency in Riau province, but the vision of Siak regency's spatial planning for 2031 foresees a shift from petroleum towards other sectors such as agribusiness, agroindustry and tourism. The purpose of this study was to identify the base sectors, the leading subsectors and their shift characteristics, and to identify the development priority among the leading subsectors. The methods used in this research were Location Quotient (LQ), Shift Share, and Overlay analysis. Location Quotient (LQ) was used to identify the base sectors in Siak regency based on the PDRB (gross regional domestic product) documents. At constant 2000 prices, the base sector was mining and quarrying (2.25). Using PDRB at constant 2000 prices without the oil and gas sector, the base sector was agriculture, with an LQ of 2.45. The leading subsector within mining and quarrying was oil and gas (1.02), while the leading subsectors excluding oil and gas were plantations (1.48) and forestry (1.73). The overlay analysis showed that agriculture as a base sector, and plantations and forestry as leading subsectors, have positive values and are categorized as progressive and competitive. These leading subsectors therefore receive high priority for development.
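
    The Location Quotient used above compares a sector's share of the regional economy with its share of the reference (provincial or national) economy; LQ > 1 flags a base sector. A minimal calculation is sketched below with invented GDP figures, since the abstract reports only the resulting LQ values.

```python
def location_quotient(regional_sector, regional_total, reference_sector, reference_total):
    """LQ = (regional sector share) / (reference economy sector share); LQ > 1 => base sector."""
    return (regional_sector / regional_total) / (reference_sector / reference_total)

# Invented GDP figures (billion IDR), for illustration only
siak_gdp = {"mining_quarrying": 45.0, "agriculture": 12.0, "other": 23.0}
riau_gdp = {"mining_quarrying": 160.0, "agriculture": 95.0, "other": 545.0}

siak_total = sum(siak_gdp.values())
riau_total = sum(riau_gdp.values())
for sector in ("mining_quarrying", "agriculture"):
    lq = location_quotient(siak_gdp[sector], siak_total, riau_gdp[sector], riau_total)
    status = "base sector" if lq > 1 else "non-base"
    print(f"{sector:17s} LQ = {lq:.2f} ({status})")
```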

  20. Lead-Time Models Should Not Be Used to Estimate Overdiagnosis in Cancer Screening

    DEFF Research Database (Denmark)

    Zahl, Per-Henrik; Jørgensen, Karsten Juhl; Gøtzsche, Peter C

    2014-01-01

    Lead-time can mean two different things: Clinical lead-time is the lead-time for clinically relevant tumors; that is, those that are not overdiagnosed. Model-based lead-time is a theoretical construct where the time when the tumor would have caused symptoms is not limited by the person's death. It is the average time at which the diagnosis is brought forward for both clinically relevant and overdiagnosed cancers. When screening for breast cancer, clinical lead-time is about 1 year, while model-based lead-time varies from 2 to 7 years. There are two different methods to calculate overdiagnosis in cancer screening--the excess-incidence approach and the lead-time approach--that rely on two different lead-time definitions. Overdiagnosis when screening with mammography has varied from 0 to 75 %. We have explained that these differences are mainly caused by using different definitions and methods...

  1. Public-Private Partnerships in Lead Discovery: Overview and Case Studies.

    Science.gov (United States)

    Gottwald, Matthias; Becker, Andreas; Bahr, Inke; Mueller-Fahrnow, Anke

    2016-09-01

    The pharmaceutical industry is faced with significant challenges in its efforts to discover new drugs that address unmet medical needs. Safety concerns and lack of efficacy are the two main technical reasons for attrition. Improved early research tools including predictive in silico, in vitro, and in vivo models, as well as a deeper understanding of the disease biology, therefore have the potential to improve success rates. The combination of internal activities with external collaborations in line with the interests and needs of all partners is a successful approach to foster innovation and to meet the challenges. Collaboration can take place in different ways, depending on the requirements of the participants. In this review, the value of public-private partnership approaches will be discussed, using examples from the Innovative Medicines Initiative (IMI). These examples describe consortia approaches to develop tools and processes for improving target identification and validation, as well as lead identification and optimization. The project "Kinetics for Drug Discovery" (K4DD), focusing on the adoption of drug-target binding kinetics analysis in the drug discovery decision-making process, is described in more detail. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Measurement Of Lead Equivalent Thickness For Irradiation Room: An Analysis

    International Nuclear Information System (INIS)

    Mohd Khalid Matori; Azuhar Ripin; Husaini Salleh; Mohd Khairusalih Mohd Zin; Muhammad Jamal Muhd Isa; Mohd Faizal Abdul Rahman

    2014-01-01

    The Malaysian Ministry of Health (MOH) has established that an irradiation room must have a sufficient thickness of shielding to ensure that requirements for the radiation protection of patients, employees and the public are met. This paper presents a technique using an americium-241 source to test and verify the integrity of the shielding thickness, in terms of lead equivalent, of irradiation rooms at health clinics owned by the MOH. Results of measurements of 8 irradiation rooms conducted in 2014 were analyzed for this presentation. A technical comparison of the attenuation of gamma rays from the Am-241 source through the walls of the irradiation room and through pieces of lead was used to assess the lead equivalent thickness of the walls. Results showed that almost all the irradiation rooms tested meet the requirements of the Ministry of Health and are suitable for the installation of the intended diagnostic X-ray apparatus. Some specific positions, such as door knobs and locks and electrical plug sockets, were identified as potentially not meeting the required lead equivalent thickness and hence may contribute to higher radiation exposure to workers and the public. (author)
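
    The attenuation-comparison idea can be sketched numerically: measure the transmission of the 59.5 keV Am-241 line through the wall and convert it to a lead-equivalent thickness, either via an attenuation coefficient or, as in the survey, by direct comparison against lead sheets of known thickness. The coefficient and count rates below are assumptions for illustration, not survey data.

```python
import math

# Illustrative numbers only: attenuation of the 59.5 keV Am-241 gamma line
mu_pb = 5.0               # assumed linear attenuation coefficient of lead at ~60 keV [1/mm]
counts_open = 12000.0     # count rate with no shielding (assumed)
counts_wall = 2.0         # count rate measured through the wall (assumed)

transmission = counts_wall / counts_open
lead_equivalent_mm = -math.log(transmission) / mu_pb   # T = exp(-mu * x)  =>  x = -ln(T) / mu
print(f"lead equivalence ≈ {lead_equivalent_mm:.2f} mm Pb")

# In practice the survey compares the wall reading against readings taken through lead sheets
# of known thickness, which avoids relying on a tabulated attenuation coefficient.
```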

  3. Toxicological analysis of the risk of lead exposure in metal processing

    African Journals Online (AJOL)

    concentration and biological lead toxicity markers in blood and urine were performed for both exposed ... biological samples were determined by spectrophotometric methods. Results: The ..... Occupational lead exposure of storage battery.

  4. A QSAR approach for virtual screening of lead-like molecules en route to antitumor and antibiotic drugs from marine and microbial natural products

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-05-01

    Figure 1. The 15 unreported lead antibiotic MNPs and MbNPs from the AntiMarin database, obtained using the best Rfs antibiotic model with a probability of being antibiotic greater than or equal to 0.8. Figure 2. The 4 selected lead antitumor MNPs and MbNPs from the AntiMarin database, obtained using the best Rfs antitumor model with a probability of being antitumor greater than or equal to 0.8. The present work corroborates, on the one hand, the results of our previous work [6] and enables the presentation of a new set of possible lead-like bioactive compounds. Additionally, the usefulness of quantum-chemical descriptors in the discrimination of biologically active and inactive compounds is shown. The use of the εHOMO quantum-chemical descriptor in the discrimination of large-scale data sets of lead-like or drug-like compounds has never been reported. This approach results in the reduction, to a great extent, of the number of compounds used in real screens, and it reinforces the results of our previous work. Furthermore, besides the virtual screening, the computational methods can be very useful to build appropriate databases, allowing for effective shortcuts in NP extract dereplication procedures, which will certainly result in increasing the efficiency of drug discovery.

  5. A social network analysis of alcohol-impaired drivers in Maryland : an egocentric approach.

    Science.gov (United States)

    2011-04-01

    This study examined the personal, household, and social structural attributes of alcohol-impaired drivers in Maryland. The study used an egocentric approach of social network analysis. This approach concentrated on specific actors (alcohol-impaire...

  6. Linear mixed-effects modeling approach to FMRI group analysis.

    Science.gov (United States)

    Chen, Gang; Saad, Ziad S; Britton, Jennifer C; Pine, Daniel S; Cox, Robert W

    2013-06-01

    Conventional group analysis is usually performed with Student-type t-test, regression, or standard AN(C)OVA in which the variance-covariance matrix is presumed to have a simple structure. Some correction approaches are adopted when assumptions about the covariance structure are violated. However, as experiments are designed with different degrees of sophistication, these traditional methods can become cumbersome, or even be unable to handle the situation at hand. For example, most current FMRI software packages have difficulty analyzing the following scenarios at group level: (1) taking within-subject variability into account when there are effect estimates from multiple runs or sessions; (2) continuous explanatory variables (covariates) modeling in the presence of a within-subject (repeated measures) factor, multiple subject-grouping (between-subjects) factors, or the mixture of both; (3) subject-specific adjustments in covariate modeling; (4) group analysis with estimation of hemodynamic response (HDR) function by multiple basis functions; (5) various cases of missing data in longitudinal studies; and (6) group studies involving family members or twins. Here we present a linear mixed-effects modeling (LME) methodology that extends the conventional group analysis approach to analyze many complicated cases, including the six prototypes delineated above, whose analyses would be otherwise either difficult or unfeasible under traditional frameworks such as AN(C)OVA and general linear model (GLM). In addition, the strength of the LME framework lies in its flexibility to model and estimate the variance-covariance structures for both random effects and residuals. The intraclass correlation (ICC) values can be easily obtained with an LME model with crossed random effects, even in the presence of confounding fixed effects. The simulations of one prototypical scenario indicate that the LME modeling keeps a balance between the control for false positives and the sensitivity
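
    The ICC-from-LME idea can be illustrated outside any FMRI package with a generic random-intercept model: the ICC is the between-subject variance divided by the total variance. The sketch below fits such a model with statsmodels on synthetic data; it is a generic illustration under those assumptions, not the authors' implementation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_meas = 20, 8
subj_effect = rng.normal(0.0, 1.0, n_subj)            # between-subject variability (sd = 1.0)
rows = [(s, subj_effect[s] + rng.normal(0.0, 0.7))    # within-subject residual noise (sd = 0.7)
        for s in range(n_subj) for _ in range(n_meas)]
df = pd.DataFrame(rows, columns=["subject", "beta"])

# Random-intercept LME: beta ~ 1 + (1 | subject)
result = smf.mixedlm("beta ~ 1", df, groups=df["subject"]).fit()
var_between = float(result.cov_re.iloc[0, 0])   # random-intercept (between-subject) variance
var_within = float(result.scale)                # residual (within-subject) variance
icc = var_between / (var_between + var_within)
print(f"ICC ≈ {icc:.2f}")   # true value for this simulation is 1.0 / (1.0 + 0.49) ≈ 0.67
```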

  7. Data classification and MTBF prediction with a multivariate analysis approach

    International Nuclear Information System (INIS)

    Braglia, Marcello; Carmignani, Gionata; Frosolini, Marco; Zammori, Francesco

    2012-01-01

    The paper presents a multivariate statistical approach that supports the classification of mechanical components, subjected to specific operating conditions, in terms of the Mean Time Between Failure (MTBF). Assessing the influence of working conditions and/or environmental factors on the MTBF is a prerequisite for the development of an effective preventive maintenance plan. However, this task may be demanding and it is generally performed with ad-hoc experimental methods, lacking of statistical rigor. To solve this common problem, a step by step multivariate data classification technique is proposed. Specifically, a set of structured failure data are classified in a meaningful way by means of: (i) cluster analysis, (ii) multivariate analysis of variance, (iii) feature extraction and (iv) predictive discriminant analysis. This makes it possible not only to define the MTBF of the analyzed components, but also to identify the working parameters that explain most of the variability of the observed data. The approach is finally demonstrated on 126 centrifugal pumps installed in an oil refinery plant; obtained results demonstrate the quality of the final discrimination, in terms of data classification and failure prediction.
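
    A generic version of the cluster-then-discriminate workflow described above: group components into MTBF classes with k-means, then fit a predictive discriminant model on the operating-condition variables. The variables and data below are invented, and the intermediate MANOVA and feature-extraction steps of the paper are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_pumps = 126
# Invented operating-condition features: [temperature degC, head m, flow m3/h]
X = np.column_stack([rng.normal(80, 15, n_pumps),
                     rng.normal(120, 30, n_pumps),
                     rng.normal(200, 50, n_pumps)])
# Invented observed MTBF in hours, loosely dependent on temperature
mtbf = 20000 - 120 * X[:, 0] + rng.normal(0, 1500, n_pumps)

# Step 1: cluster the pumps into MTBF classes (e.g. low / medium / high)
mtbf_class = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(mtbf.reshape(-1, 1))

# Step 2: predictive discriminant analysis on the operating conditions
lda = LinearDiscriminantAnalysis().fit(X, mtbf_class)
print("resubstitution accuracy:", round(lda.score(X, mtbf_class), 2))
print("predicted MTBF class for a new pump:", lda.predict([[95.0, 130.0, 210.0]])[0])
```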

  8. Risk Analysis Approach to Rainwater Harvesting Systems

    Directory of Open Access Journals (Sweden)

    Nadia Ursino

    2016-08-01

    Full Text Available Urban rainwater reuse preserves water resources and promotes sustainable development in rapidly growing urban areas. The efficiency of a large number of urban water reuse systems, operating under different climate and demand conditions, is evaluated here on the basis of a new risk analysis approach. Results obtained by probability analysis (PA) indicate that maximum efficiency in low-demand scenarios is above 0.5, and a threshold distinguishing low- from high-demand scenarios indicates that in low-demand scenarios no significant improvement in performance may be attained by increasing the storage capacity of rainwater harvesting tanks. Threshold behaviour is displayed when tank storage capacity is designed to match both the average collected volume and the average reuse volume.
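
    The efficiency being evaluated (the fraction of demand met by harvested rainwater) is commonly obtained from a daily tank mass balance, which also makes the threshold behaviour with tank size visible. The roof area, runoff coefficient, demand, tank sizes and synthetic rainfall below are assumptions for illustration, not the paper's probability-analysis model.

```python
import numpy as np

rng = np.random.default_rng(42)

def rwh_efficiency(rain_mm, roof_area_m2=100.0, runoff_coeff=0.9,
                   demand_m3_per_day=0.3, tank_m3=5.0):
    """Daily yield-after-spillage mass balance; returns the fraction of demand supplied."""
    storage, supplied, demanded = 0.0, 0.0, 0.0
    for r in rain_mm:
        inflow = r / 1000.0 * roof_area_m2 * runoff_coeff        # m3 harvested today
        storage = min(storage + inflow, tank_m3)                 # spill anything above capacity
        release = min(storage, demand_m3_per_day)
        storage -= release
        supplied += release
        demanded += demand_m3_per_day
    return supplied / demanded

# Synthetic daily rainfall for one year: ~30% wet days with exponential depths [mm]
wet = rng.random(365) < 0.3
rain = np.where(wet, rng.exponential(8.0, 365), 0.0)

for tank in (1.0, 5.0, 20.0):
    eff = rwh_efficiency(rain, tank_m3=tank)
    print(f"tank {tank:5.1f} m3 -> water-saving efficiency ≈ {eff:.2f}")
```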

  9. EEG source space analysis of the supervised factor analytic approach for the classification of multi-directional arm movement

    Science.gov (United States)

    Shenoy Handiru, Vikram; Vinod, A. P.; Guan, Cuntai

    2017-08-01

    Objective. In electroencephalography (EEG)-based brain-computer interface (BCI) systems for motor control tasks the conventional practice is to decode motor intentions by using scalp EEG. However, scalp EEG only reveals certain limited information about the complex tasks of movement with a higher degree of freedom. Therefore, our objective is to investigate the effectiveness of source-space EEG in extracting relevant features that discriminate arm movement in multiple directions. Approach. We have proposed a novel feature extraction algorithm based on supervised factor analysis that models the data from source-space EEG. To this end, we computed the features from the source dipoles confined to Brodmann areas of interest (BA4a, BA4p and BA6). Further, we embedded class-wise labels of multi-direction (multi-class) source-space EEG to an unsupervised factor analysis to make it into a supervised learning method. Main Results. Our approach provided an average decoding accuracy of 71% for the classification of hand movement in four orthogonal directions, that is significantly higher (>10%) than the classification accuracy obtained using state-of-the-art spatial pattern features in sensor space. Also, the group analysis on the spectral characteristics of source-space EEG indicates that the slow cortical potentials from a set of cortical source dipoles reveal discriminative information regarding the movement parameter, direction. Significance. This study presents evidence that low-frequency components in the source space play an important role in movement kinematics, and thus it may lead to new strategies for BCI-based neurorehabilitation.

  10. Bioremoval of lead using Pennisetum purpureum augmented with Enterobacter cloacae-VITPASJ1: A pot culture approach.

    Science.gov (United States)

    Das, Anamika; Belgaonkar, Priyanka; Raman, Aditya S; Banu, Sofia; Osborne, Jabez W

    2017-06-01

    Lead is a toxic heavy metal discharged into the ecosystem from various industries. Biological remediation strategies have been effective in the bioremoval of lead. In our current study, a phytobacterial system using Pennisetum purpureum along with lead-resistant bacterium (LRB) was employed for the uptake of lead. The LRB was obtained from lead-contaminated sites. The isolate VITPASJ1 was found to be highly tolerant to lead and was identified as an effective plant growth-promoting bacterium. The 16S rRNA sequencing revealed VITPASJ1 to be the closest neighbour of Enterobacter cloacae. The lead-resistant gene pbrA in the plant and the bacterium were amplified using a specific primer. The uptake of lead was studied by phytoremediation and rhizoremediation set-ups where the soil was supplemented with various concentrations of lead (50, 100, 150 mg/kg). The plants were uprooted at regular intervals, and the translocation of lead into the plant was determined by atomic absorption spectroscopy. The root length, shoot height and chlorophyll content were found to be higher in the rhizoremediation set-up when compared to the phytoremediation set-up. The scanning electron microscopic micrographs gave a clear picture of increased tissue damage in the root and shoot of the phytoremediation set-up as compared to the rhizoremediation set-up with LRB.

  11. A systematic risk management approach employed on the CloudSat project

    Science.gov (United States)

    Basilio, R. R.; Plourde, K. S.; Lam, T.

    2000-01-01

    The CloudSat Project has developed a simplified approach for fault tree analysis and probabilistic risk assessment. A system-level fault tree has been constructed to identify credible fault scenarios and failure modes leading up to a potential failure to meet the nominal mission success criteria.

  12. Media deliberation on intra-EU migration. A qualitative approach to framing based on rhetorical analysis

    Directory of Open Access Journals (Sweden)

    Alexandru Cârlan

    2016-04-01

    Full Text Available In this paper we investigate how the model of deliberation proposed by Isabela and Norman Fairclough can be used for a better clarification and understanding of the framing processes in media – especially in opinion articles. We thus aim at integrating theoretical contributions from critical discourse analysis and argumentation theory with standard approaches to framing, originating in media studies. We emphasize how a rhetorical approach to framing can provide analytical insights into framing processes and complement the typical quantitative approaches with qualitative analysis based on textual reconstruction. Starting from an issue-specific approach to framing, we discuss a particular case of framing of intra-EU migration, analyzing four opinion articles selected from a larger corpus of Romanian, British and French media. We highlight, along our analysis, various methodological options and analytical difficulties inherent to such an approach.

  13. Event generation for next to leading order chargino production at the international linear collider

    Energy Technology Data Exchange (ETDEWEB)

    Robens, T.

    2006-10-15

    At the International Linear Collider (ILC), parameters of supersymmetry (SUSY) can be determined with an experimental accuracy matching the precision of next-to-leading order (NLO) and higher-order theoretical predictions. Therefore, these contributions need to be included in the analysis of the parameters. We present a Monte-Carlo event generator for simulating chargino pair production at the ILC at next-to-leading order in the electroweak couplings. We consider two approaches of including photon radiation. A strict fixed-order approach allows for comparison and consistency checks with published semianalytic results in the literature. A version with soft- and hard-collinear resummation of photon radiation, which combines photon resummation with the inclusion of the NLO matrix element for the production process, avoids negative event weights, so the program can simulate physical (unweighted) event samples. Photons are explicitly generated throughout the range where they can be experimentally resolved. In addition, it includes further higher-order corrections unaccounted for by the fixed-order method. Inspecting the dependence on the cutoffs separating the soft and collinear regions, we evaluate the systematic errors due to soft and collinear approximations for NLO and higher-order contributions. In the resummation approach, the residual uncertainty can be brought down to the per-mil level, coinciding with the expected statistical uncertainty at the ILC. We closely investigate the two-photon phase space for the resummation method. We present results for cross sections and event generation for both approaches. (orig.)

  14. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
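
    The weight-uncertainty propagation described above can be sketched as a weighted linear combination (WLC) of standardized criteria whose weights are redrawn in a Monte Carlo loop; the per-pixel mean and standard deviation of the resulting score then approximate the map's sensitivity to weight uncertainty. The criteria layers, weight distributions and number of runs below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
rows, cols, n_criteria, n_runs = 50, 50, 4, 500

# Invented standardized criteria layers in [0, 1] (e.g. slope, lithology, land use, distance to faults)
criteria = rng.random((n_criteria, rows, cols))

# AHP-style nominal weights with an assumed uncertainty (std dev) for the Monte Carlo simulation
w_mean = np.array([0.40, 0.25, 0.20, 0.15])
w_std = np.array([0.05, 0.04, 0.03, 0.03])

scores = np.empty((n_runs, rows, cols))
for k in range(n_runs):
    w = np.clip(rng.normal(w_mean, w_std), 0.0, None)
    w /= w.sum()                                   # weights must remain a convex combination
    scores[k] = np.tensordot(w, criteria, axes=1)  # weighted linear combination per pixel

susceptibility = scores.mean(axis=0)               # mean susceptibility map
uncertainty = scores.std(axis=0)                   # pixel-wise sensitivity to weight uncertainty
print("mean susceptibility range:", round(float(susceptibility.min()), 2),
      "-", round(float(susceptibility.max()), 2))
print("max pixel std dev due to weight uncertainty:", round(float(uncertainty.max()), 3))
```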

  15. An Approach for Impression Creep of Lead Free Microelectronic Solders

    Science.gov (United States)

    Anastasio, Onofrio A.

    2002-06-01

    Currently, the microelectronics industry is transitioning from lead-containing to lead-free solders in response to legislation in the EU and Japan. Before an alternative alloy can be designated as a replacement for the current Pb-Sn, extensive testing must be accomplished. One major characteristic of the alloy that must be considered is creep. Traditionally, creep testing requires numerous samples and a long time, which thwarts the generation of comprehensive creep databases for difficult-to-prepare samples such as microelectronic solder joints. However, a relatively new technique, impression creep, enables us to rapidly generate creep data. This test uses a cylindrical punch with a flat end to make an impression on the surface of a specimen under constant load. The steady state velocity of the indenter is found to have the same stress and temperature dependence as the conventional unidirectional creep test using bulk specimens. This thesis examines impression creep tests of eutectic Sn-Ag. A testing program and apparatus were developed and constructed based on a servo-hydraulic test frame. The apparatus is capable of a load resolution of 0.01N with a stability of plus/minus 0.1N, and a displacement resolution of 0.05 microns with a stability of plus/minus 0.1 microns. Samples of eutectic Sn-Ag solder were reflowed to develop the microstructure used in microelectronic packaging. Creep tests were conducted at various stresses and temperatures and showed that coarse microstructures creep more rapidly than the microstructures in the tested regime.
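
    Because the impression velocity carries the same stress and temperature dependence as conventional creep, such data are usually reduced with a power-law creep equation, v = A * sigma^n * exp(-Q/RT); taking logarithms turns the fit for the stress exponent n and the activation energy Q into linear least squares. The velocities below are synthetic, generated from assumed n and Q purely to show the regression; they are not the thesis data.

```python
import numpy as np

R = 8.314  # gas constant [J/(mol K)]

# Synthetic impression-creep data generated from assumed n = 6 and Q = 60 kJ/mol
rng = np.random.default_rng(3)
stress_mpa = np.array([20, 30, 40, 20, 30, 40, 20, 30, 40], dtype=float)
temp_k = np.array([298, 298, 298, 333, 333, 333, 373, 373, 373], dtype=float)
true_a, true_n, true_q = 1e-12, 6.0, 60e3
velocity = true_a * stress_mpa**true_n * np.exp(-true_q / (R * temp_k))
velocity *= np.exp(rng.normal(0, 0.05, velocity.size))     # small experimental scatter

# ln v = ln A + n ln(sigma) - Q / (R T)  ->  ordinary linear least squares
design = np.column_stack([np.ones_like(velocity), np.log(stress_mpa), 1.0 / (R * temp_k)])
coeffs, *_ = np.linalg.lstsq(design, np.log(velocity), rcond=None)
ln_a, n_fit, neg_q = coeffs
print(f"stress exponent n ≈ {n_fit:.2f}, activation energy Q ≈ {-neg_q / 1000:.1f} kJ/mol")
```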

  16. A comparison of portable XRF and ICP-OES analysis for lead on air filter samples from a lead ore concentrator mill and a lead-acid battery recycler.

    Science.gov (United States)

    Harper, Martin; Pacolay, Bruce; Hintz, Patrick; Andrew, Michael E

    2006-03-01

    Personal and area samples for airborne lead were taken at a lead mine concentrator mill, and at a lead-acid battery recycler. Lead is mined as its sulfidic ore, galena, which is often associated with zinc and silver. The ore typically is concentrated, and partially separated, on site by crushing and differential froth flotation of the ore minerals before being sent to a primary smelter. Besides lead, zinc and iron are also present in the airborne dusts, together with insignificant levels of copper and silver, and, in one area, manganese. The disposal of used lead-acid batteries presents environmental issues, and is also a waste of recoverable materials. Recycling operations allow for the recovery of lead, which can then be sold back to battery manufacturers to form a closed loop. At the recycling facility lead is the chief airborne metal, together with minor antimony and tin, but several other metals are generally present in much smaller quantities, including copper, chromium, manganese and cadmium. Samplers used in these studies included the closed-face 37 mm filter cassette (the current US standard method for lead sampling), the 37 mm GSP or "cone" sampler, the 25 mm Institute of Occupational Medicine (IOM) inhalable sampler, the 25 mm Button sampler, and the open-face 25 mm cassette. Mixed cellulose-ester filters were used in all samplers. The filters were analyzed after sampling for their content of the various metals, particularly lead, that could be analyzed by the specific portable X-ray fluorescence (XRF) analyzer under study, and then were extracted with acid and analyzed by inductively coupled plasma optical emission spectroscopy (ICP-OES). The 25 mm filters were analyzed using a single XRF reading, while three readings on different parts of the filter were taken from the 37 mm filters. For lead at the mine concentrate mill, all five samplers gave good correlations (r2 > 0.96) between the two analytical methods over the entire range of found lead mass

  17. Investigation into the structure of lead-borate glass

    International Nuclear Information System (INIS)

    Kurtsinovskaya, R.I.

    1976-01-01

    X-ray phase and IR analyses of lead borate glasses show that glasses containing from 12 to 45 mole % PbO consist of several phases. A comparison of X-ray diffraction data for lead borate and lead germanate glasses, which have two maxima on the diffraction patterns throughout the glass-formation region, shows that the microstructure of lead borate glasses is far more complex

  18. The Interpretative Phenomenological Analysis (IPA): A Guide to a Good Qualitative Research Approach

    Directory of Open Access Journals (Sweden)

    Abayomi Alase

    2017-04-01

    Full Text Available As a research methodology, qualitative research method infuses an added advantage to the exploratory capability that researchers need to explore and investigate their research studies. Qualitative methodology allows researchers to advance and apply their interpersonal and subjectivity skills to their research exploratory processes. However, in a study with an interpretative phenomenological analysis (IPA approach, the advantageous elements of the study quadruple because of the bonding relationship that the approach allows for the researchers to develop with their research participants. Furthermore, as a qualitative research approach, IPA gives researchers the best opportunity to understand the innermost deliberation of the ‘lived experiences’ of research participants. As an approach that is ‘participant-oriented’, interpretative phenomenological analysis approach allows the interviewees (research participants to express themselves and their ‘lived experience’ stories the way they see fit without any distortion and/or prosecution. Therefore, utilizing the IPA approach in a qualitative research study reiterates the fact that its main objective and essence are to explore the ‘lived experiences’ of the research participants and allow them to narrate the research findings through their ‘lived experiences’. As such, this paper discusses the historical background of phenomenology as both a theory and a qualitative research approach, an approach that has transitioned into an interpretative analytical tradition. Furthermore, as a resource tool to novice qualitative researchers, this paper provides a step-by-step comprehensive guide to help prepare and equip researchers with ways to utilize and apply the IPA approach in their qualitative research studies.  More importantly, this paper also provides an advanced in-depth analysis and usability application for the IPA approach in a qualitatively conducted research study. As such, this

  19. Surface analysis and depth profiling of corrosion products formed in lead pipes used to supply low alkalinity drinking water.

    Science.gov (United States)

    Davidson, C M; Peters, N J; Britton, A; Brady, L; Gardiner, P H E; Lewis, B D

    2004-01-01

    Modern analytical techniques have been applied to investigate the nature of lead pipe corrosion products formed in pH adjusted, orthophosphate-treated, low alkalinity water, under supply conditions. Depth profiling and surface analysis have been carried out on pipe samples obtained from the water distribution system in Glasgow, Scotland, UK. X-ray diffraction spectrometry identified basic lead carbonate, lead oxide and lead phosphate as the principal components. Scanning electron microscopy/energy-dispersive x-ray spectrometry revealed the crystalline structure within the corrosion product and also showed spatial correlations existed between calcium, iron, lead, oxygen and phosphorus. Elemental profiling, conducted by means of secondary ion mass spectrometry (SIMS) and secondary neutrals mass spectrometry (SNMS) indicated that the corrosion product was not uniform with depth. However, no clear stratification was apparent. Indeed, counts obtained for carbonate, phosphate and oxide were well correlated within the depth range probed by SIMS. SNMS showed relationships existed between carbon, calcium, iron, and phosphorus within the bulk of the scale, as well as at the surface. SIMS imaging confirmed the relationship between calcium and lead and suggested there might also be an association between chloride and phosphorus.

  20. Analysis of spent fuel assay with a lead slowing down spectrometer

    International Nuclear Information System (INIS)

    Gavron, A.; Smith, L. Eric; Ressler, Jennifer J.

    2009-01-01

    Assay of the fissile materials in spent fuel that are produced or depleted during the operation of a reactor is of paramount importance to nuclear materials accounting, verification of the reactor operation history, and criticality considerations for storage. In order to prevent future proliferation following the spread of nuclear energy, we must develop accurate methods to assay large quantities of nuclear fuels. We analyze the potential of using a Lead Slowing Down Spectrometer for assaying spent fuel. We conclude that it is possible to design a system that will provide around 1% statistical precision in the determination of the (239)Pu, (241)Pu and (235)U concentrations in a PWR spent-fuel assembly, for intermediate-to-high burnup levels, using commercial neutron sources and a system of (238)U threshold fission detectors. Pending further analysis of systematic errors, it is possible that missing pins can be detected, as can asymmetry in the fuel bundle. (author)

  1. Lead (Pb) isotopic fingerprinting and its applications in lead pollution studies in China: A review

    International Nuclear Information System (INIS)

    Cheng Hefa; Hu Yuanan

    2010-01-01

    As the most widely scattered toxic metal in the world, the sources of lead (Pb) observed in contamination investigation are often difficult to identify. This review presents an overview of the principles, analysis, and applications of Pb isotopic fingerprinting in tracing the origins and transport pathways of Pb in the environment. It also summarizes the history and current status of lead pollution in China, and illustrates the power of Pb isotopic fingerprinting with examples of its recent applications in investigating the effectiveness of leaded gasoline phase-out on atmospheric lead pollution, and the sources of Pb found in various environmental media (plants, sediments, and aquatic organisms) in China. The limitations of Pb isotopic fingerprinting technique are discussed and a perspective on its development is also presented. Further methodological developments and more widespread instrument availability are expected to make isotopic fingerprinting one of the key tools in lead pollution investigation. - This review presents an overview of the principles, applications, and limitations of Pb isotopic fingerprinting in lead pollution investigation, and provides a perspective on its future development.

  2. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach

    Science.gov (United States)

    Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-01

    The present work compares the dissimilarity and covariance based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.
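
    The contrast drawn above, eigen-decomposition of the covariance matrix (classical PCA) versus decomposition of a pairwise dissimilarity matrix (classical multidimensional scaling after double centring), can be illustrated on synthetic two-class data. The spectra and the Euclidean distance choice below are assumptions for illustration, not the TSFS measurements.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "spectra": two intrinsic classes (e.g. cumin vs non-cumin preparations)
n_per_class, n_vars = 20, 60
class_a = rng.normal(0.0, 1.0, (n_per_class, n_vars)) + np.linspace(0, 1, n_vars)
class_b = rng.normal(0.0, 1.0, (n_per_class, n_vars)) + np.linspace(1, 0, n_vars)
X = np.vstack([class_a, class_b])

# (1) Covariance-based decomposition: PCA scores from eigenvectors of the covariance matrix
Xc = X - X.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
pca_scores = Xc @ eigvec[:, ::-1][:, :2]           # top-2 principal component scores

# (2) Dissimilarity-based decomposition: classical MDS on squared Euclidean distances
sq_d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
n = sq_d.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ sq_d @ J                            # double-centred Gram matrix
mds_eigval, mds_eigvec = np.linalg.eigh(B)
mds_scores = mds_eigvec[:, ::-1][:, :2] * np.sqrt(np.maximum(mds_eigval[::-1][:2], 0.0))

print("PCA score range on PC1:", np.round([pca_scores[:, 0].min(), pca_scores[:, 0].max()], 1))
print("MDS score range on D1 :", np.round([mds_scores[:, 0].min(), mds_scores[:, 0].max()], 1))
```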

  3. Analysis of neighborhood behavior in lead optimization and array design.

    Science.gov (United States)

    Papadatos, George; Cooper, Anthony W J; Kadirkamanathan, Visakan; Macdonald, Simon J F; McLay, Iain M; Pickett, Stephen D; Pritchard, John M; Willett, Peter; Gillet, Valerie J

    2009-02-01

    Neighborhood behavior describes the extent to which small structural changes defined by a molecular descriptor are likely to lead to small property changes. This study evaluates two methods for the quantification of neighborhood behavior: the optimal diagonal method of Patterson et al. and the optimality criterion method of Horvath and Jeandenans. The methods are evaluated using twelve different types of fingerprint (both 2D and 3D) with screening data derived from several lead optimization projects at GlaxoSmithKline. The principal focus of the work is the design of chemical arrays during lead optimization, and the study hence considers not only biological activity but also important drug properties such as metabolic stability, permeability, and lipophilicity. Evidence is provided to suggest that the optimality criterion method may provide a better quantitative description of neighborhood behavior than the optimal diagonal method.
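
    As a rough illustration of the general idea (small descriptor distances should imply small property changes), the sketch below computes pairwise fingerprint distances and property differences and a crude neighborhood statistic. This is neither the optimal diagonal method nor the optimality criterion method; the fingerprints, activities, and cutoffs are invented.

```python
# Simplified neighborhood-behavior check (illustration only): do structurally
# similar pairs of molecules also show similar property values?
import numpy as np

rng = np.random.default_rng(1)
n_mols, n_bits = 100, 256
fps = rng.integers(0, 2, size=(n_mols, n_bits))      # mock binary fingerprints
activity = rng.normal(size=n_mols)                   # mock measured property

def tanimoto_distance(a, b):
    """1 - Tanimoto similarity between two binary fingerprints."""
    common = np.sum(a & b)
    return 1.0 - common / (np.sum(a) + np.sum(b) - common)

dist, dprop = [], []
for i in range(n_mols):
    for j in range(i + 1, n_mols):
        dist.append(tanimoto_distance(fps[i], fps[j]))
        dprop.append(abs(activity[i] - activity[j]))
dist, dprop = np.array(dist), np.array(dprop)

# Crude statistic: among structurally close pairs (distance below a chosen
# cutoff), what fraction also show a small property change?
close = dist < 0.4                                   # illustrative cutoff
good_neighbors = np.mean(dprop[close] < 0.5) if close.any() else float("nan")
print(f"fraction of close pairs with small property change: {good_neighbors:.2f}")
```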

  4. Lead (Pb) isotopic fingerprinting and its applications in lead pollution studies in China: a review.

    Science.gov (United States)

    Cheng, Hefa; Hu, Yuanan

    2010-05-01

    As the most widely scattered toxic metal in the world, the sources of lead (Pb) observed in contamination investigation are often difficult to identify. This review presents an overview of the principles, analysis, and applications of Pb isotopic fingerprinting in tracing the origins and transport pathways of Pb in the environment. It also summarizes the history and current status of lead pollution in China, and illustrates the power of Pb isotopic fingerprinting with examples of its recent applications in investigating the effectiveness of leaded gasoline phase-out on atmospheric lead pollution, and the sources of Pb found in various environmental media (plants, sediments, and aquatic organisms) in China. The limitations of Pb isotopic fingerprinting technique are discussed and a perspective on its development is also presented. Further methodological developments and more widespread instrument availability are expected to make isotopic fingerprinting one of the key tools in lead pollution investigation. Copyright 2009 Elsevier Ltd. All rights reserved.

  5. Lead isotopes reveal different sources of lead in balsamic and other vinegars

    International Nuclear Information System (INIS)

    Ndung'u, Kuria; Hibdon, Sharon; Veron, Alain; Flegal, A. Russell

    2011-01-01

    Fifty-eight brands of balsamic vinegars were analyzed for lead concentrations and isotopic compositions ( 204 Pb, 206 Pb, 207 Pb, and 208 Pb) to test the findings of a previous study indicating relatively high levels of lead contamination in some of those vinegars - more than two thirds (70%) of them exceeded California's State Maximum Level (34 μg/L) based on consumption rates ≥ 0.5 μg Pb per day. The lead isotopic fingerprints of all the vinegars with high lead concentrations were found to be primarily anthropogenic. This isotopic analysis unquestionably reveals multiple contamination sources, including atmospheric pollutant Pb and an unidentified contamination source likely occurring after grape harvest. Organically grown grape vinegars display the same Pb content and isotopic signatures as other vinegars, which implies that pesticides might not be a significant source of pollutant Pb in vinegars. A significant post-harvest contamination would be inherited from chemicals added during production and/or materials used during transport, processing or storage of these vinegars. This is consistent with the highest Pb levels being found in aged vinegars (112 ± 112 μg/L), in contrast to other vinegars (41.6 ± 28.9 μg/L), suggesting contamination during storage. It is, therefore, projected that lead levels in most vinegars, especially aged balsamic and wine vinegars, will decrease with improvements in their manufacture and storage processes in response to recent concerns about elevated levels of lead in some vinegars. - Highlights: → First extensive study on content and possible sources of lead in balsamic vinegars. → Half of the vinegars exceed California's State Maximum Level for human consumption. → Lead content in vinegars seems to be mainly post-harvest from industrial processes.

  6. Aspects of using a best-estimate approach for VVER safety analysis in reactivity initiated accidents

    Energy Technology Data Exchange (ETDEWEB)

    Ovdiienko, Iurii; Bilodid, Yevgen; Ieremenko, Maksym [State Scientific and Technical Centre on Nuclear and Radiation, Safety (SSTC N and RS), Kyiv (Ukraine); Loetsch, Thomas [TUEV SUED Industrie Service GmbH, Energie und Systeme, Muenchen (Germany)

    2016-09-15

    At present, Ukraine faces the problem of small margins to the acceptance criteria in connection with the implementation of a conservative approach for safety evaluations. The problem is particularly topical when conducting feasibility analyses of power up-rating for Ukrainian nuclear power plants. This situation requires the implementation of a best-estimate approach on the basis of an uncertainty analysis. For some kinds of accidents, such as the loss-of-coolant accident (LOCA), the best-estimate approach is more or less developed and established. However, for reactivity-initiated accident (RIA) analysis, application of the best-estimate method can be problematic. A regulatory document in Ukraine defines a nomenclature of neutronics calculations and so-called "generic safety parameters" which should be used as boundary conditions for all VVER-1000 (V-320) reactors in RIA analysis. In this paper, ideas for the uncertainty evaluation of generic safety parameters in RIA analysis, in connection with the use of the 3D neutron kinetics code DYN3D and the GRS SUSA approach, are presented.

  7. Feasibility of implementation of a "simplified, No-X-Ray, no-lead apron, two-catheter approach" for ablation of supraventricular arrhythmias in children and adults.

    Science.gov (United States)

    Stec, Sebastian; Śledź, Janusz; Mazij, Mariusz; Raś, Małgorzata; Ludwik, Bartosz; Chrabąszcz, Michał; Śledź, Arkadiusz; Banasik, Małgorzata; Bzymek, Magdalena; Młynarczyk, Krzysztof; Deutsch, Karol; Labus, Michał; Śpikowski, Jerzy; Szydłowski, Lesław

    2014-08-01

    Although the "near-zero-X-Ray" or "No-X-Ray" catheter ablation (CA) approach has been reported for treatment of various arrhythmias, few prospective studies have strictly used "No-X-Ray," simplified 2-catheter approaches for CA in patients with supraventricular tachycardia (SVT). We assessed the feasibility of a minimally invasive, nonfluoroscopic (MINI) CA approach in such patients. Data were obtained from a prospective multicenter CA registry of patients with regular SVTs. After femoral access, 2 catheters were used to create simple, 3D electroanatomic maps and to perform electrophysiologic studies. Medical staff did not use lead aprons after the first 10 MINI CA cases. A total of 188 patients (age, 45 ± 21 years; 17% 0.05), major complications (0% vs. 0%, P > 0.05) and acute (98% vs. 98%, P > 0.05) and long-term (93% vs. 94%, P > 0.05) success rates were similar in the "No-X-Ray" and control groups. Implementation of a strict "No-X-Ray, simplified 2-catheter" CA approach is safe and effective in majority of the patients with SVT. This modified approach for SVTs should be prospectively validated in a multicenter study. © 2014 Wiley Periodicals, Inc.

  8. Lead pollution in Islamabad

    International Nuclear Information System (INIS)

    Mohammad, D.; Khatoon, N.; Ishaque, M.; Ahmed, I.

    1997-01-01

    Lead pollution of urban areas emanating from vehicular exhaust alone is being labeled as one of the worst forms of environmental degradation attracting our attention for remediation. For a factual assessment, samples were collected from different areas of Islamabad. These samples consisted of tree scrapings/peelings, which were dried and ground before analysis for their lead content. The samples were digested with an acid mixture to remove the organic matter and analyzed using the GFAAS technique. A total of 81 samples have been analyzed. The results showed lead contents varying in the range of 8-474 μg/g, with 23 samples having a Pb content <50 μg/g (8.0-50.0 μg/g). Most of the samples also contained some growth consisting of bacterial, algal and fungal cells, and the results have been explained on the basis of Pb absorption by these cells. The procedure followed in this study is recommended for the evaluation of lead pollution in urban areas. (author)

  9. [Approaches to medical training among physicians who teach; analysis of two different educational strategies].

    Science.gov (United States)

    Loría-Castellanos, Jorge; Rivera-lbarra, Doris Beatriz; Márquez-Avila, Guadalupe

    2009-01-01

    To compare the outreach of a promotional educational strategy that focuses on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee. We compared the outreach of two different approaches to medical training. We administered a validated instrument that included 72 items analyzing statements used to measure educational tasks in the form of duplets through 3 indicators. A group of seven physicians who were actively participating in teaching activities was stratified according to teaching approach. One of the approaches was traditional and the other included a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences among the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of the subjects that had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in the 3 main indicators as compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more profitable approach when compared with traditional teaching methods.

  10. Direct numerical simulation and statistical analysis of turbulent convection in lead-bismuth

    Energy Technology Data Exchange (ETDEWEB)

    Otic, I.; Grotzbach, G. [Forschungszentrum Karlsruhe GmbH, Institut fuer Kern-und Energietechnik (Germany)

    2003-07-01

    Improved turbulent heat flux models are required to develop and analyze the reactor concept of a lead-bismuth cooled Accelerator-Driven System. Because of the specific properties of many liquid metals, there are still no sensors for accurate measurements of the high-frequency velocity fluctuations. Thus, the development of the turbulent heat transfer models required in our CFD (computational fluid dynamics) tools also needs data from direct numerical simulations of turbulent flows. We use new simulation results for the model problem of Rayleigh-Benard convection to show some peculiarities of the turbulent natural convection in lead-bismuth (Pr = 0.025). Simulations for this flow at sufficiently large turbulence levels became feasible only recently, because this flow requires the resolution of very small velocity scales together with the recording of long-wave structures for the slow changes in the convective temperature field. The results are analyzed regarding the principal convection and heat transfer features. They are also used to perform statistical analysis showing that the currently available modeling is indeed not adequate for these fluids. Based on the knowledge of the details of the statistical features of turbulence in this convection type and using the two-point correlation technique, a proposal for an improved statistical turbulence model is developed which is expected to account better for the peculiarities of the heat transfer in the turbulent convection in low Prandtl number fluids. (authors)

  11. Analysis of the laser-induced discoloration of lead white pigment

    International Nuclear Information System (INIS)

    Cooper, M.I.; Fowles, P.S.; Tang, C.C.

    2002-01-01

    The use of laser cleaning in artwork conservation is becoming increasingly important. An investigation into the effects of laser radiation on lead white pigment, considered to be historically the most important of all white pigments used in art, has been undertaken. Samples of pigment and pigment in a water-colour binding medium have been prepared and irradiated by laser radiation at 1064 nm (pulse duration 5-10 ns) at an average fluence of 0.3 J cm -2 . Irradiation under such conditions leads to the formation of an extremely thin discoloured layer. Synchrotron X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS) have been used to characterise the altered layer. Analytical evidence for the formation of elemental lead is presented for the first time and the effect of exposure of the altered layer to air and the effect of a binding medium on the process are discussed

  12. Direct expansion solar assisted heat pumps – A clean steady state approach for overall performance analysis

    International Nuclear Information System (INIS)

    Tagliafico, Luca A.; Scarpa, Federico; Valsuani, Federico

    2014-01-01

    using an optimal collector working temperature. - Highlights: •A new approach for the steady state analysis of solar assisted heat pumps is presented. •The model is based on the inverse Carnot cycle and does not use fluid properties. •The approach leads to an analytical steady state description of the system. •The model effectively describes the averaged behavior of the considered system. •The model appears suitable to be applied to embedded control systems

  13. Analysis of market competitive structure: A new methodological approach based on usage

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identify market competitive structure, applying the usage situation concept in positioning analysis. The dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. Results are validated with Discriminant Analysis. (Author) 23 refs

  14. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    Science.gov (United States)

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis aimed at detecting disease symptoms on processed images. First, a new method of filtering gallbladder contours from USG images is presented. A major stage in this filtration is to segment and section off the areas occupied by the organ; in most cases this procedure is based on filtration that plays a key role in the process of diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyze owing to the echogenic inconsistency of the structures under observation. This paper provides an inventive algorithm for the holistic extraction of gallbladder image contours, based on rank filtration as well as on the analysis of histogram sections of the examined organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms used to analyze the object of such diagnosis and to verify the occurrence of symptoms related to a given affliction. Usually the final stage is to make a diagnosis based on the detected symptoms; this last stage can be carried out through either dedicated expert systems or a more classic pattern analysis approach, such as using rules to determine the illness based on the detected symptoms. This paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ.

  15. Characterizing the structure and content of nurse handoffs: A Sequential Conversational Analysis approach.

    Science.gov (United States)

    Abraham, Joanna; Kannampallil, Thomas; Brenner, Corinne; Lopez, Karen D; Almoosa, Khalid F; Patel, Bela; Patel, Vimla L

    2016-02-01

    Effective communication during nurse handoffs is instrumental in ensuring safe and quality patient care. Much of the prior research on nurse handoffs has utilized retrospective methods such as interviews, surveys and questionnaires. While extremely useful, an in-depth understanding of the structure and content of conversations, and the inherent relationships within the content is paramount to designing effective nurse handoff interventions. In this paper, we present a methodological framework-Sequential Conversational Analysis (SCA)-a mixed-method approach that integrates qualitative conversational analysis with quantitative sequential pattern analysis. We describe the SCA approach and provide a detailed example as a proof of concept of its use for the analysis of nurse handoff communication in a medical intensive care unit. This novel approach allows us to characterize the conversational structure, clinical content, disruptions in the conversation, and the inherently phasic nature of nurse handoff communication. The characterization of communication patterns highlights the relationships underlying the verbal content of nurse handoffs with specific emphasis on: the interactive nature of conversation, relevance of role-based (incoming, outgoing) communication requirements, clinical content focus on critical patient-related events, and discussion of pending patient management tasks. We also discuss the applicability of the SCA approach as a method for providing in-depth understanding of the dynamics of communication in other settings and domains. Copyright © 2015 Elsevier Inc. All rights reserved.
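
    As a toy illustration of the quantitative half of such an approach (not the SCA framework itself), the sketch below counts transitions between qualitatively coded utterances to expose sequential structure. The code labels and the sequence are invented.

```python
# Toy sequential-pattern step: once each handoff utterance has been coded,
# count transitions between consecutive codes to reveal sequential structure.
from collections import Counter

# Invented sequence of codes assigned to consecutive utterances in one handoff
coded_turns = [
    "patient_summary", "critical_event", "question", "clarification",
    "critical_event", "pending_task", "question", "clarification",
    "pending_task", "acknowledgement",
]

transitions = Counter(zip(coded_turns, coded_turns[1:]))
total = sum(transitions.values())

for (src, dst), count in transitions.most_common():
    print(f"{src:>16} -> {dst:<16} {count} ({count / total:.0%})")
```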

  16. Discordant diagnoses obtained by different approaches in antithrombin mutation analysis

    DEFF Research Database (Denmark)

    Feddersen, Søren; Nybo, Mads

    2014-01-01

    OBJECTIVES: In hereditary antithrombin (AT) deficiency it is important to determine the underlying mutation since the future risk of thromboembolism varies considerably between mutations. DNA investigations are in general thought of as flawless and irrevocable, but the diagnostic approach can be critical. We therefore investigated mutation results in the AT gene, SERPINC1, with two different approaches. DESIGN AND METHODS: Sixteen patients referred to the Centre for Thrombosis and Haemostasis, Odense University Hospital, with biochemical indications of AT deficiency, but with a negative denaturing high-performance liquid chromatography (DHPLC) mutation screening (routine approach until recently) were included. As an alternative mutation analysis, direct sequencing of all exons and exon-intron boundaries without pre-selection by DHPLC was performed. RESULTS: Out of sixteen patients...

  17. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  18. Evaluation of a tungsten coil atomization-laser-induced fluorescence detection approach for trace elemental analysis

    International Nuclear Information System (INIS)

    Ezer, Muhsin; Elwood, Seth A.; Jones, Bradley T.; Simeonsson, Josef B.

    2006-01-01

    The analytical utility of a tungsten (W)-coil atomization-laser-induced fluorescence (LIF) approach has been evaluated for trace level measurements of elemental chromium (Cr), arsenic (As), selenium (Se), antimony (Sb), lead (Pb), tin (Sn), copper (Cu), thallium (Tl), indium (In), cadmium (Cd), zinc (Zn) and mercury (Hg). Measurements of As, Cr, In, Se, Sb, Pb, Tl, and Sn were performed by laser-induced fluorescence using a single dye laser operating near 460 nm whose output was converted by frequency doubling and stimulated Raman scattering to wavelengths ranging from 196 to 286 nm for atomic excitation. Absolute limits of detection (LODs) of 1, 0.3, 0.3, 0.2, 1, 6, 1, 0.2 and 0.8 pg and concentration LODs of 100, 30, 30, 20, 100, 600, 100, 20, and 80 pg/mL were achieved for As, Se, Sb, Sn, In, Cu, Cr, Pb and Tl, respectively. Determinations of Hg, Pb, Zn and Cd were performed using two-color excitation approaches and resulted in absolute LODs of 2, 30, 5 and 0.6 pg, respectively, and concentration LODs of 200, 3000, 500 and 60 pg/mL, respectively. The sensitivities achieved by the W-coil LIF approaches compare well with those reported by W-coil atomic absorption spectrometry, graphite furnace atomic absorption spectrometry, and graphite furnace electrothermal atomization-LIF approaches. The accuracy of the approach was verified through the analysis of a multielement reference solution containing Sb, Pb and Tl which each had certified performance acceptance limits of 19.6-20.4 μg/mL. The determined concentrations were 20.05 ± 2.60, 20.70 ± 2.27 and 20.60 ± 2.46 μg/mL, for Sb, Pb and Tl, respectively. The results demonstrate that W-coil LIF provides good analytical performance for trace analyses due to its high sensitivity, linearity, and capability to measure multiple elements using a single tunable laser and suggest that the development of portable W-coil LIF instrumentation using compact, solid-state lasers is feasible

  19. Evaluation of a tungsten coil atomization-laser-induced fluorescence detection approach for trace elemental analysis.

    Science.gov (United States)

    Ezer, Muhsin; Elwood, Seth A; Jones, Bradley T; Simeonsson, Josef B

    2006-06-30

    The analytical utility of a tungsten (W)-coil atomization-laser-induced fluorescence (LIF) approach has been evaluated for trace level measurements of elemental chromium (Cr), arsenic (As), selenium (Se), antimony (Sb), lead (Pb), tin (Sn), copper (Cu), thallium (Tl), indium (In), cadmium (Cd), zinc (Zn) and mercury (Hg). Measurements of As, Cr, In, Se, Sb, Pb, Tl, and Sn were performed by laser-induced fluorescence using a single dye laser operating near 460 nm whose output was converted by frequency doubling and stimulated Raman scattering to wavelengths ranging from 196 to 286 nm for atomic excitation. Absolute limits of detection (LODs) of 1, 0.3, 0.3, 0.2, 1, 6, 1, 0.2 and 0.8 pg and concentration LODs of 100, 30, 30, 20, 100, 600, 100, 20, and 80 pg/mL were achieved for As, Se, Sb, Sn, In, Cu, Cr, Pb and Tl, respectively. Determinations of Hg, Pb, Zn and Cd were performed using two-color excitation approaches and resulted in absolute LODs of 2, 30, 5 and 0.6 pg, respectively, and concentration LODs of 200, 3000, 500 and 60 pg/mL, respectively. The sensitivities achieved by the W-coil LIF approaches compare well with those reported by W-coil atomic absorption spectrometry, graphite furnace atomic absorption spectrometry, and graphite furnace electrothermal atomization-LIF approaches. The accuracy of the approach was verified through the analysis of a multielement reference solution containing Sb, Pb and Tl which each had certified performance acceptance limits of 19.6-20.4 microg/mL. The determined concentrations were 20.05+/-2.60, 20.70+/-2.27 and 20.60+/-2.46 microg/mL, for Sb, Pb and Tl, respectively. The results demonstrate that W-coil LIF provides good analytical performance for trace analyses due to its high sensitivity, linearity, and capability to measure multiple elements using a single tunable laser and suggest that the development of portable W-coil LIF instrumentation using compact, solid-state lasers is feasible.

  20. A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models

    Science.gov (United States)

    Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.

    2016-03-01

    Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method on three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF), to stimulate proliferation in order to determine if FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of involved fluorophores and their contribution were calculated with fewer initial assumptions when compared to multiexponential decay fitting. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that an increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrates that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models qualifying this method for monitoring basic cellular features and the effect of external factors.
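
    The following sketch shows the standard phasor transform on synthetic decays, not the authors' pipeline: for a decay I(t), the phasor coordinates at angular frequency w are g = ∫ I(t)cos(wt)dt / ∫ I(t)dt and s = ∫ I(t)sin(wt)dt / ∫ I(t)dt, and a single-exponential decay with lifetime tau satisfies g = 1/(1+(w·tau)^2), s = w·tau/(1+(w·tau)^2). The 80 MHz repetition rate, bin count, and lifetimes are assumptions for illustration.

```python
# Phasor transform of synthetic single-exponential FLIM decays.
import numpy as np

rep_rate = 80e6                      # assumed 80 MHz excitation
w = 2 * np.pi * rep_rate             # angular modulation frequency
t = np.linspace(0, 12.5e-9, 256)     # one laser period, 256 time bins

def phasor(decay, t, w):
    # uniform time bins, so the bin width cancels in the ratios
    g = np.sum(decay * np.cos(w * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(w * t)) / np.sum(decay)
    return g, s

for tau in (0.4e-9, 2.5e-9):                     # illustrative short and long lifetimes
    decay = np.exp(-t / tau)
    g, s = phasor(decay, t, w)
    # Analytical phasor of an ideal single-exponential decay (approximate check)
    g_th = 1.0 / (1.0 + (w * tau) ** 2)
    s_th = (w * tau) / (1.0 + (w * tau) ** 2)
    print(f"tau={tau*1e9:.1f} ns  numeric=({g:.3f},{s:.3f})  analytic=({g_th:.3f},{s_th:.3f})")
```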

  1. Enhancing the discussion of alternatives in EIA using principal component analysis leads to improved public involvement

    International Nuclear Information System (INIS)

    Kamijo, Tetsuya; Huang, Guangwei

    2017-01-01

    The purpose of this study is to show the effectiveness of principal component analysis (PCA) as a method of alternatives analysis useful for improving the discussion of alternatives and public involvement. This study examined public consultations by applying quantitative text analysis (QTA) to the minutes of meetings and showed a positive correlation between the discussion of alternatives and the sense of public involvement. The discussion of alternatives may improve public involvement. A table of multiple criteria analysis for alternatives with detailed scores may exclude the public from involvement due to the general public's limited capacity to understand the mathematical algorithm and to process too much information. PCA allowed the multiple criteria to be reduced to a small number of uncorrelated variables (principal components), displayed the merits and demerits of the alternatives, and potentially made the identification of preferable alternatives by the stakeholders easier. PCA is likely to enhance the discussion of alternatives and, as a result, lead to improved public involvement.
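
    The following is a hedged sketch of the kind of reduction described above: a small, invented table of alternatives scored against several criteria is compressed to two principal components so that alternatives can be compared on a simple two-dimensional display. The criteria names and scores are illustrative only.

```python
# Reduce a multi-criteria table of alternatives to two principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

criteria = ["cost", "resettlement", "air_quality", "noise", "biodiversity"]
alternatives = ["A", "B", "C", "D"]
scores = np.array([                 # illustrative scores only
    [3.0, 2.0, 4.0, 3.0, 2.0],
    [4.0, 4.0, 2.0, 2.0, 3.0],
    [2.0, 1.0, 5.0, 4.0, 1.0],
    [5.0, 3.0, 1.0, 1.0, 4.0],
])

pca = PCA(n_components=2)
components = pca.fit_transform(StandardScaler().fit_transform(scores))

for name, (pc1, pc2) in zip(alternatives, components):
    print(f"alternative {name}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
```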

  2. VAUD: A Visual Analysis Approach for Exploring Spatio-Temporal Urban Data.

    Science.gov (United States)

    Chen, Wei; Huang, Zhaosong; Wu, Feiran; Zhu, Minfeng; Guan, Huihua; Maciejewski, Ross

    2017-10-02

    Urban data is massive, heterogeneous, and spatio-temporal, posing a substantial challenge for visualization and analysis. In this paper, we design and implement a novel visual analytics approach, Visual Analyzer for Urban Data (VAUD), that supports the visualization, querying, and exploration of urban data. Our approach allows for cross-domain correlation from multiple data sources by leveraging spatial-temporal and social inter-connectedness features. Through our approach, the analyst is able to select, filter, and aggregate across multiple data sources and extract information that would be hidden in a single data subset. To illustrate the effectiveness of our approach, we provide case studies on a real urban dataset that contains the cyber-, physical-, and social information of 14 million citizens over 22 days.

  3. Interim analysis: A rational approach of decision making in clinical trial.

    Science.gov (United States)

    Kumar, Amal; Chakraborty, Bhaswat S

    2016-01-01

    Interim analysis of especially sizeable trials keeps the decision process free of conflict of interest while considering cost, resources, and meaningfulness of the project. Whenever necessary, such interim analysis can also call for potential termination or appropriate modification in sample size, study design, and even an early declaration of success. Given the extraordinary size and complexity today, this rational approach helps to analyze and predict the outcomes of a clinical trial that incorporate what is learned during the course of a study or a clinical development program. Such approach can also fill the gap by directing the resources toward relevant and optimized clinical trials between unmet medical needs and interventions being tested currently rather than fulfilling only business and profit goals.

  4. Interim analysis: A rational approach of decision making in clinical trial

    Directory of Open Access Journals (Sweden)

    Amal Kumar

    2016-01-01

    Full Text Available Interim analysis of especially sizeable trials keeps the decision process free of conflict of interest while considering cost, resources, and meaningfulness of the project. Whenever necessary, such interim analysis can also call for potential termination or appropriate modification in sample size, study design, and even an early declaration of success. Given the extraordinary size and complexity today, this rational approach helps to analyze and predict the outcomes of a clinical trial that incorporate what is learned during the course of a study or a clinical development program. Such approach can also fill the gap by directing the resources toward relevant and optimized clinical trials between unmet medical needs and interventions being tested currently rather than fulfilling only business and profit goals.

  5. In vivo x-ray fluorescence of bone lead in the study of human lead metabolism: Serum lead, whole blood lead, bone lead, and cumulative exposure

    International Nuclear Information System (INIS)

    Cake, K.M.; Chettle, D.R.; Webber, C.E.; Gordon, C.L.

    1995-01-01

    Traditionally, clinical studies of lead's effect on health have relied on blood lead levels to indicate lead exposure. However, this is unsatisfactory because blood lead levels have a half-life of approximately 5 weeks, and thus reflect recent exposure. Over 90% of the lead body burden is in bone, and it is thought to have a long residence time, thus implying that measurements of bone lead reflect cumulative exposure. So, measurements of bone lead are useful in understanding the long-term health effects of lead. Ahlgren reported the first noninvasive measurements of bone lead in humans, where γ-rays from 57 Co were used to excite the K series x-rays of lead. The lead detection system at McMaster University uses a 109 Cd source which is positioned at the center of the detector face (HPGe) and a near backscatter (∼160 degrees) geometry. This arrangement allows great flexibility, since one can sample lead in a range of different bone sites due to a robust normalization technique which eliminates the need to correct for bone geometry, thickness of overlying tissue, and other related factors. The effective radiation dose to an adult during an x-ray fluorescence bone lead measurement is extremely low, being 35 nSv. This paper addresses the issue of how bone, whole blood, and serum lead concentrations can be related in order to understand a person's lead exposure history

  6. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis

    Directory of Open Access Journals (Sweden)

    Ágatha Nogueira Previdelli

    2016-09-01

    Full Text Available The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents’ dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In the hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not present in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis, with low total scores for the Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing overall dietary habits, which will be important in order to drive public health programs and improve their efficiency to monitor and evaluate the dietary patterns of populations.

  7. Environmental monitoring near urban lead refineries by photon and neutron activation analysis

    International Nuclear Information System (INIS)

    Paciga, J.J.; Chattopadhyay, A.; Jervis, R.E.

    1974-01-01

    Photon activation has been used in conjunction with neutron activation for multielement determinations in airborne particulates, soil, and hair samples collected near two secondary lead refineries in Metropolitan Toronto. Particle size distributions of suspended particulates collected with a high-volume Andersen sampler are reported for Al, Sb, As, Br, Cl, Mn, Na, Pb, Ti and V. Increases in the concentrations of Pb, As and Sb associated with particles >3.3 μm in diameter on certain days near the refineries have resulted in localized contamination, as reflected in higher concentrations of these elements in soil. To assess Pb accumulation in local residents compared with control groups, approximately 250 hair samples were analyzed for Pb by photon activation analysis. Children living close to the refineries, especially boys, exhibit the most elevated levels: up to 20 times urban control values in some cases

  8. Bioinformatics approaches to single-cell analysis in developmental biology.

    Science.gov (United States)

    Yalcin, Dicle; Hakguder, Zeynep M; Otu, Hasan H

    2016-03-01

    Individual cells within the same population show various degrees of heterogeneity, which may be better handled with single-cell analysis to address biological and clinical questions. Single-cell analysis is especially important in developmental biology as subtle spatial and temporal differences in cells have significant associations with cell fate decisions during differentiation and with the description of a particular state of a cell exhibiting an aberrant phenotype. Biotechnological advances, especially in the area of microfluidics, have led to a robust, massively parallel and multi-dimensional capturing, sorting, and lysis of single-cells and amplification of related macromolecules, which have enabled the use of imaging and omics techniques on single cells. There have been improvements in computational single-cell image analysis in developmental biology regarding feature extraction, segmentation, image enhancement and machine learning, handling limitations of optical resolution to gain new perspectives from the raw microscopy images. Omics approaches, such as transcriptomics, genomics and epigenomics, targeting gene and small RNA expression, single nucleotide and structural variations and methylation and histone modifications, rely heavily on high-throughput sequencing technologies. Although there are well-established bioinformatics methods for analysis of sequence data, there are limited bioinformatics approaches which address experimental design, sample size considerations, amplification bias, normalization, differential expression, coverage, clustering and classification issues, specifically applied at the single-cell level. In this review, we summarize biological and technological advancements, discuss challenges faced in the aforementioned data acquisition and analysis issues and present future prospects for application of single-cell analyses to developmental biology. © The Author 2015. Published by Oxford University Press on behalf of the European

  9. ANALYSIS OF FUZZY QUEUES: PARAMETRIC PROGRAMMING APPROACH BASED ON RANDOMNESS - FUZZINESS CONSISTENCY PRINCIPLE

    OpenAIRE

    Dhruba Das; Hemanta K. Baruah

    2015-01-01

    In this article, based on Zadeh’s extension principle, we apply the parametric programming approach to construct the membership functions of the performance measures when the interarrival time and the service time are fuzzy numbers, following Baruah’s Randomness-Fuzziness Consistency Principle. The Randomness-Fuzziness Consistency Principle leads to defining a normal law of fuzziness using two different laws of randomness. In this article, two fuzzy queues FM...
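
    Below is a generic sketch of the alpha-cut / parametric-programming idea for a fuzzy M/M/1 queue with triangular fuzzy arrival and service rates: for each alpha, the bounds of the expected number in system L = rho/(1 - rho) are obtained from the extreme combinations of the alpha-cut intervals. The rates are invented, and this is not the paper's Randomness-Fuzziness Consistency formulation.

```python
# Alpha-cut bounds of the expected number in system for a fuzzy M/M/1 queue.
import numpy as np

def alpha_cut(tri, alpha):
    """Alpha-cut [lower, upper] of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return a + alpha * (b - a), c - alpha * (c - b)

arrival = (3.0, 4.0, 5.0)    # fuzzy arrival rate lambda (per hour), invented
service = (7.0, 8.0, 9.0)    # fuzzy service rate mu (per hour), invented

def L(lam, mu):
    rho = lam / mu
    return rho / (1.0 - rho)          # valid only while rho < 1

for alpha in np.linspace(0.0, 1.0, 5):
    lam_lo, lam_hi = alpha_cut(arrival, alpha)
    mu_lo, mu_hi = alpha_cut(service, alpha)
    # L increases with lambda and decreases with mu, so the interval endpoints are:
    L_lo = L(lam_lo, mu_hi)
    L_hi = L(lam_hi, mu_lo)
    print(f"alpha={alpha:.2f}: L in [{L_lo:.3f}, {L_hi:.3f}]")
```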

  10. Sampled-Data Control of Spacecraft Rendezvous with Discontinuous Lyapunov Approach

    Directory of Open Access Journals (Sweden)

    Zhuoshi Li

    2013-01-01

    Full Text Available This paper investigates the sampled-data stabilization problem of spacecraft relative positional holding with an improved Lyapunov function approach. The classical Clohessy-Wiltshire equation is adopted to describe the relative dynamic model. The relative position holding problem is converted into an output tracking control problem using sampling signals. A time-dependent discontinuous Lyapunov functional approach is developed, which leads to essentially less conservative results for the stability analysis and controller design of the corresponding closed-loop system. Sufficient conditions for the exponential stability analysis and the existence of the proposed controller are provided, respectively. Finally, a simulation result is presented to illustrate the effectiveness of the proposed control scheme.
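
    For orientation, the sketch below simulates sampled-data regulation of the in-plane Clohessy-Wiltshire dynamics: the continuous model is discretized under a zero-order hold and a discrete LQR gain is held constant between samples. This is only an illustration of the sampled-data setting, not the discontinuous-Lyapunov-functional design of the paper; the orbit rate, sampling period, and weights are invented.

```python
# Sampled-data relative-position holding on the in-plane Clohessy-Wiltshire model.
import numpy as np
from scipy.signal import cont2discrete
from scipy.linalg import solve_discrete_are

n = 0.0011                    # orbital mean motion (rad/s), roughly LEO
# State x = [x, y, vx, vy]; in-plane CW equations:
#   x'' = 3 n^2 x + 2 n y' + ux,   y'' = -2 n x' + uy
A = np.array([
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [3 * n ** 2, 0, 0, 2 * n],
    [0, 0, -2 * n, 0],
])
B = np.array([[0, 0], [0, 0], [1, 0], [0, 1]])

T = 10.0                      # sampling period (s)
Ad, Bd, *_ = cont2discrete((A, B, np.eye(4), np.zeros((4, 2))), dt=T)

Q = np.diag([1.0, 1.0, 10.0, 10.0])
R = 1e3 * np.eye(2)
P = solve_discrete_are(Ad, Bd, Q, R)
K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)   # u_k = -K x_k

x = np.array([100.0, -50.0, 0.0, 0.0])    # initial relative position offset (m)
for k in range(200):
    u = -K @ x                             # control held constant over each sample
    x = Ad @ x + Bd @ u
print("relative position after 200 samples:", np.round(x[:2], 3), "m")
```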

  11. A new analytical approach to understanding nanoscale lead-iron interactions in drinking water distribution systems.

    Science.gov (United States)

    Trueman, Benjamin F; Gagnon, Graham A

    2016-07-05

    High levels of iron in distributed drinking water often accompany elevated lead release from lead service lines and other plumbing. Lead-iron interactions in drinking water distribution systems are hypothesized to be the result of adsorption and transport of lead by iron oxide particles. This mechanism was explored using point-of-use drinking water samples characterized by size exclusion chromatography with UV and multi-element (ICP-MS) detection. In separations on two different stationary phases, high apparent molecular weight (>669 kDa) elution profiles for (56)Fe and (208)Pb were strongly correlated (average R(2)=0.96, N=73 samples representing 23 single-unit residences). Moreover, (56)Fe and (208)Pb peak areas exhibited an apparent linear dependence (R(2)=0.82), consistent with mobilization of lead via adsorption to colloidal particles rich in iron. A UV254 absorbance peak, coincident with high molecular weight (56)Fe and (208)Pb, implied that natural organic matter was interacting with the hypothesized colloidal species. High molecular weight UV254 peak areas were correlated with both (56)Fe and (208)Pb peak areas (R(2)=0.87 and 0.58, respectively). On average, 45% (std. dev. 10%) of total lead occurred in the size range 0.05-0.45 μm. Copyright © 2016 Elsevier B.V. All rights reserved.
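
    The sketch below reproduces, on purely synthetic data, the kind of statistics reported above: the correlation between high-molecular-weight 56Fe and 208Pb elution profiles within one sample, and a linear regression of Pb peak area on Fe peak area across samples. The chromatogram shapes, noise levels, and peak areas are invented, not measured values from the study.

```python
# Correlate synthetic Fe/Pb elution profiles and regress Pb peak area on Fe peak area.
import numpy as np
from scipy.stats import linregress, pearsonr

rng = np.random.default_rng(3)
t = np.linspace(0, 30, 600)                              # retention time (min)

def peak(t, center, width, height):
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

# One sample: a colloid peak appears in both traces at the same retention time
fe_trace = peak(t, 8.0, 0.6, 1.0) + rng.normal(0, 0.02, t.size)
pb_trace = 0.4 * peak(t, 8.0, 0.6, 1.0) + rng.normal(0, 0.02, t.size)
r, _ = pearsonr(fe_trace, pb_trace)
print(f"R^2 between Fe and Pb elution profiles (one sample): {r**2:.2f}")

# Across samples: Pb peak area versus Fe peak area
fe_areas = rng.uniform(5, 50, size=23)
pb_areas = 0.3 * fe_areas + rng.normal(0, 2, size=23)    # assumed linear dependence
fit = linregress(fe_areas, pb_areas)
print(f"Pb vs Fe peak areas: slope={fit.slope:.2f}, R^2={fit.rvalue**2:.2f}")
```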

  12. Error analysis of satellite attitude determination using a vision-based approach

    Science.gov (United States)

    Carozza, Ludovico; Bevilacqua, Alessandro

    2013-09-01

    Improvements in communication and processing technologies have opened the door to exploiting on-board cameras to compute objects' spatial attitude using only the visual information from sequences of remotely sensed images. The strategies and the algorithmic approach used to extract such information affect the estimation accuracy of the three-axis orientation of the object. This work presents a method for analyzing the most relevant error sources, including numerical ones, possible drift effects and their influence on the overall accuracy, with reference to vision-based approaches. The method in particular focuses on the analysis of the image registration algorithm, carried out through dedicated simulations. The overall accuracy has been assessed on a challenging case study, for which accuracy represents the fundamental requirement. In particular, attitude determination has been analyzed for small satellites, by comparing theoretical findings to metric results from simulations on realistic ground-truth data. Significant laboratory experiments, using a numerical control unit, have further confirmed the outcome. We believe that our analysis approach, as well as our findings in terms of error characterization, can be useful at proof-of-concept design and planning levels, since they emphasize the main sources of error for vision-based approaches employed for satellite attitude estimation. Nevertheless, the approach we present is also of general interest for related application domains which require an accurate estimation of three-dimensional orientation parameters (i.e., robotics, airborne stabilization).

  13. Raman spectroscopy an intensity approach

    CERN Document Server

    Guozhen, Wu

    2017-01-01

    This book summarizes the highlights of our work on the bond polarizability approach to Raman intensity analysis. The topics covered include surface enhanced Raman scattering, Raman excited virtual states and Raman optical activity (ROA). The first chapter briefly introduces the Raman effect in a succinct but clear way. Chapter 2 deals with normal mode analysis, a basic tool for our work. Chapter 3 introduces our proposed algorithm for the Raman intensity analysis. Chapter 4 introduces in depth the physical picture of Raman virtual states. Chapter 5 offers details so that the readers can have a comprehensive idea of Raman virtual states. Chapter 6 demonstrates how this bond polarizability algorithm is extended to ROA intensity analysis. Chapters 7 and 8 offer details on ROA, showing many findings on the ROA mechanism that were not known or neglected before. Chapter 9 introduces our proposed classical treatment of ROA which, as combined with the results from the bond polarizability analysis, leads to a com...
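
    As a reminder of the basic tool mentioned for Chapter 2, the sketch below performs a generic normal mode analysis: diagonalizing the mass-weighted Hessian to obtain vibrational frequencies and modes, shown for a one-dimensional diatomic with force constant k, where the single non-zero mode should satisfy omega = sqrt(k(1/m1 + 1/m2)). Units and values are arbitrary; this is not taken from the book.

```python
# Normal mode analysis of a 1-D diatomic via the mass-weighted Hessian.
import numpy as np

k = 1.0                      # force constant (arbitrary units)
m1, m2 = 1.0, 2.0            # atomic masses (arbitrary units)

H = np.array([[k, -k],       # Cartesian Hessian of a harmonic bond (1-D)
              [-k, k]])
M_inv_sqrt = np.diag([1 / np.sqrt(m1), 1 / np.sqrt(m2)])
Hmw = M_inv_sqrt @ H @ M_inv_sqrt          # mass-weighted Hessian

eigvals, modes = np.linalg.eigh(Hmw)
freqs = np.sqrt(np.clip(eigvals, 0, None)) # omega = sqrt(eigenvalue)

print("frequencies:", np.round(freqs, 4))  # one zero (translation) + one vibration
print("expected vibration:", round(np.sqrt(k * (1 / m1 + 1 / m2)), 4))
```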

  14. A non-parametric meta-analysis approach for combining independent microarray datasets: application using two microarray datasets pertaining to chronic allograft nephropathy

    Directory of Open Access Journals (Sweden)

    Archer Kellie J

    2008-02-01

    Full Text Available Abstract Background With the popularity of DNA microarray technology, multiple groups of researchers have studied the gene expression of similar biological conditions. Different methods have been developed to integrate the results from various microarray studies, though most of them rely on distributional assumptions, such as the t-statistic based, mixed-effects model, or Bayesian model methods. However, often the sample size for each individual microarray experiment is small. Therefore, in this paper we present a non-parametric meta-analysis approach for combining data from independent microarray studies, and illustrate its application on two independent Affymetrix GeneChip studies that compared the gene expression of biopsies from kidney transplant recipients with chronic allograft nephropathy (CAN to those with normal functioning allograft. Results The simulation study comparing the non-parametric meta-analysis approach to a commonly used t-statistic based approach shows that the non-parametric approach has better sensitivity and specificity. For the application on the two CAN studies, we identified 309 distinct genes that expressed differently in CAN. By applying Fisher's exact test to identify enriched KEGG pathways among those genes called differentially expressed, we found 6 KEGG pathways to be over-represented among the identified genes. We used the expression measurements of the identified genes as predictors to predict the class labels for 6 additional biopsy samples, and the predicted results all conformed to their pathologist diagnosed class labels. Conclusion We present a new approach for combining data from multiple independent microarray studies. This approach is non-parametric and does not rely on any distributional assumptions. The rationale behind the approach is logically intuitive and can be easily understood by researchers not having advanced training in statistics. Some of the identified genes and pathways have been
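
    The enrichment step described above can be illustrated with a small sketch of Fisher's exact test asking whether differentially expressed genes are over-represented in one KEGG pathway. Only the count of 309 differential genes comes from the abstract; the total gene count, pathway size, and overlap are assumptions for illustration.

```python
# Fisher's exact test for over-representation of differential genes in a pathway.
from scipy.stats import fisher_exact

total_genes = 12000          # genes measured on the array (assumed)
de_genes = 309               # differentially expressed genes (from the abstract)
pathway_size = 120           # genes annotated to the pathway (assumed)
de_in_pathway = 12           # differential genes falling in the pathway (assumed)

table = [
    [de_in_pathway, de_genes - de_in_pathway],
    [pathway_size - de_in_pathway,
     total_genes - de_genes - (pathway_size - de_in_pathway)],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.3g}")
```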

  15. Neutronic design for a 100 MWth small modular natural circulation lead or lead-alloy cooled fast reactor core

    International Nuclear Information System (INIS)

    Chen, C.; Chen, H.; Zhang, H.; Chen, Z.; Zeng, Q.

    2015-01-01

    The lead or lead-alloy cooled fast reactor, with good fuel breeding and nuclear waste transmutation capability as well as high safety and economy, has great potential for the development of fourth-generation nuclear energy systems. The small natural circulation reactor is an important technical route toward industrial applications of lead-cooled fast reactors, and has been chosen as one of the three reference technical solutions for lead or lead-alloy cooled fast reactors by the GIF lead-cooled fast reactor steering committee. The School of Nuclear Science and Technology of USTC has proposed a small 100 MWth natural circulation lead-cooled fast reactor concept called SNCLFR-100, based on realistic technology. This article describes the overall technical scheme of the SNCLFR-100 reactor and its core physics calculations and analysis. The results show that SNCLFR-100 has good neutronic and safety performance, and that the relevant design parameters meet the safety requirements, demonstrating its feasibility. (author)

  16. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence approaches constitute a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate the variety and the optimality, respectively, of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  17. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
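
    To illustrate the hidden Markov modeling step described above, the sketch below fits a two-state Gaussian HMM to a synthetic single-molecule intensity trace using the hmmlearn library rather than SMART itself; the trace, state means, and noise level are invented.

```python
# Fit a two-state Gaussian HMM to a synthetic single-molecule trace.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(7)

# Synthetic two-state trace (e.g., low/high signal) with Gaussian noise
true_states = np.repeat([0, 1, 0, 1, 0], 200)
means = np.array([0.2, 0.8])
trace = rng.normal(means[true_states], 0.08)

model = hmm.GaussianHMM(n_components=2, covariance_type="diag", n_iter=200,
                        random_state=0)
model.fit(trace.reshape(-1, 1))                 # observations must be 2-D
decoded = model.predict(trace.reshape(-1, 1))   # Viterbi state sequence

print("fitted state means:", np.round(model.means_.ravel(), 3))
print("fitted transition matrix:\n", np.round(model.transmat_, 3))
print("fraction of time in state 0:", np.mean(decoded == 0))
```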

  18. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
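
    The AHP step mentioned above can be sketched as follows: criterion weights are derived from a Saaty-style pairwise comparison matrix via its principal eigenvector, and a consistency ratio is checked. The criteria names and judgments below are invented, not those of the cited study.

```python
# AHP criterion weights from a pairwise comparison matrix (principal eigenvector).
import numpy as np

criteria = ["integration", "usability", "extensibility"]
# Reciprocal judgment matrix: A[i, j] = importance of criterion i relative to j
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)                   # consistency index
cr = ci / 0.58                                    # Saaty random index for n = 3

for name, w in zip(criteria, weights):
    print(f"{name:>13}: weight {w:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
```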

  19. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    Science.gov (United States)

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  20. Development of quantitative analysis for cadmium, lead and chromium in aluminum alloys by using x-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Yamashita, Satoshi; Kurusu, Kazuhiko; Kudou, Aiko

    2009-01-01

    A highly reliable quantitative analysis for cadmium, lead and chromium in aluminum alloys was developed. Standard samples were made by doping cadmium, lead and chromium into several aluminum alloys, and the compositions of the standard samples were determined by inductively coupled plasma optical emission spectrometry and a gravimetric method. The calibration curves obtained for these standard samples by WD-XRF and ED-XRF exhibited linear correlation. The slopes of the calibration curves for the Al-Cu alloy and the Al-Zn-Mg alloy were smaller than those of the other alloys because of the effect of coexistent elements. After correction with the α-coefficient method, all calibration curves agreed with each other. (author)