WorldWideScience

Sample records for analysis approach leading

  1. Safe job analysis in a lead refinery. A practical approach from the process side

    Energy Technology Data Exchange (ETDEWEB)

Esser, Knut; Meurer, Urban [BERZELIUS Stolberg GmbH, Stolberg (Germany)]

    2011-09-15

    In order to increase safety and to comply with legal requirements, Berzelius Stolberg decided in 2009 to update and change its approach to safe job analysis (SJA). The new approach takes detailed Standard Operating Procedures (SOPs), which were also updated during the process, as the basis for all subsequent documents. Together with supervisors and operators, all SOPs were broken down into single working steps, because the subsequent safe job analysis is only meaningful, and the risks only correctly identified, if the actual work is properly described. After the SOPs had been updated, a draft of each SJA was discussed by representatives from the refinery management, works council, safety officers and operators. For every identified risk, one or more measures to avoid it were agreed upon. An action plan was created for the technical and organisational measures. The behaviour-related measures were collected in a safety handbook, which forms the basis for future safety training of the operators. In addition to the safe job analysis, the SOPs are also the basis for training manuals and for FMEAs. All in all, the new approach to safe job analysis not only provides a way to increase safety systematically in accordance with OHSAS guidelines, but also satisfies all aspects of quality management. (orig.)

  2. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    International Nuclear Information System (INIS)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.; Campbell, James A.; Lin, Yuehe

    2004-01-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications to human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb), using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
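
    The quantitation step described above, a linear sensor response inverted through a calibration curve, can be sketched as follows. All concentration/current pairs, the function name, and the numbers are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical calibration data for a saliva-Pb sensor: concentrations (ppb)
# spanning the linear range reported in the abstract (1-2000 ppb) and
# made-up peak currents (uA) from square-wave anodic stripping voltammetry.
conc = np.array([1, 10, 100, 500, 1000, 2000], dtype=float)
current = np.array([0.02, 0.21, 2.0, 10.1, 19.8, 40.3])  # illustrative only

# Least-squares calibration line: current = slope * conc + intercept
slope, intercept = np.polyfit(conc, current, 1)

def pb_from_current(i_peak):
    """Invert the calibration to estimate saliva Pb (ppb) from a peak current."""
    return (i_peak - intercept) / slope

estimate = pb_from_current(5.0)  # unknown sample with 5.0 uA peak current
```

A real device would add blank subtraction and replicate measurements; this only shows the fit-and-invert logic.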

  3. Androgen receptor mutations associated with androgen insensitivity syndrome: a high content analysis approach leading to personalized medicine.

    Directory of Open Access Journals (Sweden)

    Adam T Szafran

    2009-12-01

    Androgen insensitivity syndrome (AIS) is a rare disease associated with inactivating mutations of AR that disrupt male sexual differentiation and cause a spectrum of phenotypic abnormalities having as a common denominator loss of reproductive viability. No established treatment exists for these conditions; however, there are sporadic reports of patients (or recapitulated mutations in cell lines) that respond to administration of supraphysiologic doses (or pulses) of testosterone or synthetic ligands. Here, we utilize a novel high content analysis (HCA) approach to study AR function at the single cell level in genital skin fibroblasts (GSF). We discuss in detail findings in GSF from three historical patients with AIS, which include identification of novel mechanisms of AR malfunction, and the potential ability to utilize HCA for personalized treatment of patients affected by this condition.

  4. GALA: group analysis leads to accuracy, a novel approach for solving the inverse problem in exploratory analysis of group MEG recordings.

    Science.gov (United States)

    Kozunov, Vladimir V; Ossadtchi, Alexei

    2015-01-01

    Although MEG/EEG signals are highly variable between subjects, they allow characterizing systematic changes of cortical activity in both space and time. Traditionally a two-step procedure is used. The first step is a transition from sensor to source space by means of solving an ill-posed inverse problem for each subject individually. The second is mapping of cortical regions consistently active across subjects. In practice the first step often leads to a set of active cortical regions whose location and timecourses display a great amount of interindividual variability, hindering the subsequent group analysis. We propose Group Analysis Leads to Accuracy (GALA), a solution that combines the two steps into one. GALA takes advantage of individual variations of cortical geometry and sensor locations. It exploits the ensuing variability in the electromagnetic forward model as a source of additional information. We assume that for different subjects functionally identical cortical regions are located in close proximity and partially overlap and their timecourses are correlated. This relaxed similarity constraint on the inverse solution can be expressed within a probabilistic framework, allowing for an iterative algorithm solving the inverse problem jointly for all subjects. A systematic simulation study showed that GALA, as compared with the standard min-norm approach, improves accuracy of true activity recovery, when accuracy is assessed both in terms of spatial proximity of the estimated and true activations and correct specification of the spatial extent of the activated regions. This improvement, obtained without using any noise normalization techniques for either solution, was preserved for a wide range of between-subject variations in both spatial and temporal features of regional activation. The corresponding activation timecourses exhibit significantly higher similarity across subjects. Similar results were obtained for a real MEG dataset of face-specific evoked responses.
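
    For readers unfamiliar with the baseline GALA is compared against, a minimal sketch of the standard minimum-norm (Tikhonov-regularized) inverse solution for a single subject follows. The leadfield, dimensions, noise level, and regularization value are arbitrary toy choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: 10 sensors, 50 candidate cortical sources. In MEG the
# leadfield L is subject-specific; here it is random for illustration.
n_sensors, n_sources = 10, 50
L = rng.standard_normal((n_sensors, n_sources))

# Simulate a single active source and slightly noisy sensor measurements.
s_true = np.zeros(n_sources)
s_true[17] = 1.0
m = L @ s_true + 0.01 * rng.standard_normal(n_sensors)

# Standard minimum-norm estimate (Tikhonov-regularized): the per-subject
# baseline that GALA is compared against in the abstract.
lam = 0.1
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), m)
```

Because the problem is underdetermined (50 sources, 10 sensors), the estimate smears activity across sources; GALA's contribution is to constrain such solutions jointly across subjects.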

  6. Leading change: a concept analysis.

    Science.gov (United States)

    Nelson-Brantley, Heather V; Ford, Debra J

    2017-04-01

    To report an analysis of the concept of leading change. Nurses have been called to lead change to advance the health of individuals, populations, and systems. Conceptual clarity about leading change in the context of nursing and healthcare systems provides an empirical direction for future research and theory development that can advance the science of leadership studies in nursing. Concept analysis. CINAHL, PubMed, PsycINFO, Psychology and Behavioral Sciences Collection, Health Business Elite and Business Source Premier databases were searched using the terms: leading change, transformation, reform, leadership and change. Literature published in English from 2001 to 2015 in the fields of nursing, medicine, organizational studies, business, education, psychology or sociology was included. Walker and Avant's method was used to identify descriptions, antecedents, consequences and empirical referents of the concept. Model, related and contrary cases were developed. Five defining attributes of leading change were identified: (a) individual and collective leadership; (b) operational support; (c) fostering relationships; (d) organizational learning; and (e) balance. Antecedents were external or internal driving forces and organizational readiness. The consequences of leading change included improved organizational performance and outcomes and new organizational culture and values. A theoretical definition and conceptual model of leading change were developed. Future studies that use and test the model may contribute to the refinement of a middle-range theory to advance nursing leadership research and education. From this, empirically derived interventions that prepare and enable nurses to lead change to advance health may be realized. © 2016 John Wiley & Sons Ltd.

  7. Lead farmers approach in disseminating improved tef production ...

    African Journals Online (AJOL)

    Lead farmers approach in disseminating improved tef production technologies. ... The results of the analysis showed that eight of the lead farmers were very satisfied with the new variety. High grain yield was obtained at the 10 on-farm sites due to the introduction of new variety. While the grain yields ranged from 0.7 and 2.9 ...

  8. Leading Millennials: An Approach That Works

    Science.gov (United States)

    2015-02-01

    not hesitate to share their opinions. Unlike us, millennials grew up in a world where communication technology made the world "flat." During their... Leading Millennials: An Approach That Works. Col S. Clinton Hinote, USAF; Col Timothy J. Sundvall, USAF. COMMENTARY: Our Air Force is full of millennials. The military "pyramid" force structure means that there will always be considerably more young people than old, and the millennia

  9. How lead consultants approach educational change in postgraduate medical education.

    Science.gov (United States)

    Fokkema, Joanne P I; Westerman, Michiel; Teunissen, Pim W; van der Lee, Nadine; Scherpbier, Albert J J A; van der Vleuten, Cees P M; Dörr, P Joep; Scheele, Fedde

    2012-04-01

    Consultants in charge of postgraduate medical education (PGME) in hospital departments ('lead consultants') are responsible for the implementation of educational change. Although difficulties in innovating in medical education are described in the literature, little is known about how lead consultants approach educational change. This study was conducted to explore lead consultants' approaches to educational change in specialty training and the factors influencing these approaches. From an interpretative constructivist perspective, we conducted a qualitative exploratory study using semi-structured interviews with a purposive sample of 16 lead consultants in the Netherlands between August 2010 and February 2011. The study design was based on the research questions and on notions from corporate business and social psychology about the roles of change managers. Interview transcripts were analysed thematically using template analysis. The lead consultants described change processes with different stages, including cause, development of content, and the execution and evaluation of change, and used individual change strategies consisting of elements such as ideas, intentions and behaviour. Communication is necessary to the forming of a strategy and the implementation of change, but the nature of communication is influenced by the strategy in use. Lead consultants differed in their degree of awareness of the strategies they used. Factors influencing approaches to change were: knowledge, ideas and beliefs about change; level of reflection; task interpretation; personal style; and department culture. Most lead consultants showed limited awareness of their own approaches to change. This can lead them to adopt a rigid approach, whereas the ability to adapt strategies to circumstances is considered important to effective change management. Interventions and research should be aimed at enhancing the awareness of lead consultants of approaches to change in PGME.

  10. A systems approach to risk management through leading safety indicators

    International Nuclear Information System (INIS)

    Leveson, Nancy

    2015-01-01

    The goal of leading indicators for safety is to identify the potential for an accident before it occurs. Past efforts have focused on identifying general leading indicators, such as maintenance backlog, that apply widely in an industry or even across industries. Other recommendations produce more system-specific leading indicators, but start from system hazard analysis and thus are limited by the causes considered by the traditional hazard analysis techniques. Most rely on quantitative metrics, often based on probabilistic risk assessments. This paper describes a new and different approach to identifying system-specific leading indicators and provides guidance in designing a risk management structure to generate, monitor and use the results. The approach is based on the STAMP (System-Theoretic Accident Model and Processes) model of accident causation and tools that have been designed to build on that model. STAMP extends current models of accident causality to include more complex causes than simply component failures and chains of failure events or deviations from operational expectations. It incorporates basic principles of systems thinking and is based on systems theory rather than traditional reliability theory. - Highlights: • Much effort has gone into developing leading indicators with only limited success. • A systems-theoretic, assumption-based approach may be more successful. • Leading indicators are warning signals of an assumption’s changing vulnerability. • Heuristic biases can be controlled by using plausibility rather than likelihood

  11. A Public Health Approach to Addressing Lead

    Science.gov (United States)

    Describes EPA’s achievements in reducing childhood lead exposures and emphasizes the need to continue actions to further reduce lead exposures, especially in those communities where exposures remain high.

  12. Lead reactor strategy economical analysis

    International Nuclear Information System (INIS)

    Ciotti, Marco

    2013-01-01

    Conclusions: • A first attempt to evaluate LFR power plant electricity production cost has been performed; • Electricity price is similar to that of Gen III+ plants; • The estimation accuracy is probably low; • Possible cost reductions could arise from coolant characteristics that may improve safety and simplicity by design; • Accident perception, currently not acceptable to public opinion, may be changed with a low-potential-energy system (non-exploding coolant); • Improved sustainability could lead to better public acceptance; • Problems may arise in coupling a high-capital-cost, low-fuel-cost plant to a grid with a large amount of intermittent sources with priority dispatch; • Lead fast reactors can compete

  13. Maritime Load Dependent Lead Times - An Analysis

    DEFF Research Database (Denmark)

    Pahl, Julia; Voss, Stefan

    2017-01-01

    steps and increased container lead times. Proposed solutions to fight congestion range from extending port capacities to process optimization of parts of the maritime supply chain. The potential that lies in information sharing and integrated planning using IT has received some attention, but mainly on the operational level concerning timely information sharing. Collaborative planning approaches for the maritime supply chain are scarce. The production industry has already implemented planning and information concepts. Problems related to the maritime supply chain have great similarities with those encountered in production. Inspired by supply chain planning systems, we analyze the current state of (collaborative) planning in the maritime transport chain with a focus on containers. Regarding the problem of congestion, we particularly emphasize load dependent lead times (LDLT), which are well studied in production.

  14. Lead Tap Sampling Approaches: What Do They Tell You

    Science.gov (United States)

    There is no single, universally applicable sampling approach for lead in drinking water. The appropriate type of sampling is dictated by the question being asked. There is no reason when a customer asks to have their home water tested to see if it's "safe" that they s...

  15. Lead

    Science.gov (United States)

    ... about the health effects of lead in drinking water. The law mandates no-lead products for drinking water after ...

  16. Lead distribution in soils impacted by a secondary lead smelter: Experimental and modelling approaches

    International Nuclear Information System (INIS)

    Schneider, Arnaud R.; Cancès, Benjamin; Ponthieu, Marie; Sobanska, Sophie; Benedetti, Marc F.; Pourret, Olivier; Conreux, Alexandra; Calandra, Ivan; Martinet, Blandine; Morvan, Xavier; Gommeaux, Maxime; Marin, Béatrice

    2016-01-01

    Smelting activities are one of the most common sources of trace elements in the environment. The aim of this study was to determine the lead distribution in upper horizons (0–5 and 5–10 cm) of acidic soils in the vicinity of a lead-acid battery recycling plant in northern France. The combination of chemical methods (sequential extractions), physical methods (Raman microspectroscopy and scanning electron microscopy with an energy dispersive spectrometer) and multi-surface complexation modelling enabled an assessment of the behaviour of Pb. Regardless of the studied soil, none of the Pb-bearing phases commonly identified in similarly polluted environments (e.g., anglesite) were observed. Lead was mainly associated with organic matter and manganese oxides. The association of Pb with these soil constituents can be interpreted as evidence of Pb redistribution in the studied soils following smelter particle deposition. - Highlights: • Lead behavior was studied in smelter impacted soils. • A combination of experimental methods and modelling was employed. • Pb was mainly associated with organic matter and to a lesser degree with Mn oxides. • Pb was redistributed in soils after smelter particle deposition.

  17. Lead and Conduct Problems: A Meta-Analysis

    Science.gov (United States)

    Marcus, David K.; Fulton, Jessica J.; Clarke, Erin J.

    2010-01-01

    This meta-analysis examined the association between conduct problems and lead exposure. Nineteen studies on 8,561 children and adolescents were included. The average "r" across all 19 studies was 0.19 (p less than 0.001), which is considered a medium effect size. Studies that assessed lead exposure using hair element analysis yielded…
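
    Pooling per-study correlations into an average r, as in this meta-analysis, is conventionally done via Fisher's r-to-z transform with inverse-variance weights. The sketch below uses invented study values, not the 19 studies actually analyzed.

```python
import numpy as np

# Hypothetical correlations and sample sizes from five made-up studies
# (the real meta-analysis pooled 19 studies, 8,561 participants).
r = np.array([0.10, 0.15, 0.22, 0.25, 0.18])
n = np.array([300, 150, 800, 120, 400])

# Fisher r-to-z transform, inverse-variance weights (n - 3), then
# back-transform the pooled z to the correlation scale.
z = np.arctanh(r)
w = n - 3
z_pooled = np.sum(w * z) / np.sum(w)
r_pooled = np.tanh(z_pooled)
```

With these toy inputs the pooled r lands near 0.19, a "medium" effect by the conventional benchmarks the abstract invokes; the actual pooled value depends entirely on the real study data.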

  18. Leading neutron production at HERA in the color dipole approach

    Directory of Open Access Journals (Sweden)

    Carvalho F.

    2016-01-01

    In this work we study leading neutron production in e + p → e + n + X collisions at high energies and calculate the Feynman x_L distribution of these neutrons. The differential cross section is written in terms of the pion flux and of the photon-pion total cross section. We describe this process using the color dipole formalism and, assuming the validity of the additive quark model, we relate the dipole-pion cross section with the well determined dipole-proton cross section. In this formalism we can estimate the impact of the QCD dynamics at high energies as well as the contribution of gluon saturation effects to leading neutron production. With the parameters constrained by other phenomenological information, we are able to reproduce the basic features of the recently released H1 leading neutron spectra.

  19. A Novel Mechanistic Approach to Identify New Antifungal Lead ...

    African Journals Online (AJOL)

    version 5.0.4, ChemAxon, Budapest, Hungary) application software. Due to the large number of selected compounds, besides the docking experiments, an alternative approach to eliminate unpromising compounds was established. Hence ...

  20. Predictive approaches to increase absorption of compounds during lead optimisation.

    Science.gov (United States)

    Valko, Klara; Butler, James; Eddershaw, Peter

    2013-10-01

    Complex physicochemical and biological processes influence the oral absorption of a drug molecule. Consideration of these processes is an important activity during the optimisation of potential candidate molecules. The authors review the applications of physicochemical and structural requirements for intestinal absorption. Furthermore, they provide examples of how to aid the lead optimisation process through improvement of solubility and permeability. The physicochemical requirements for absorption are solubility and permeability. Both are influenced by lipophilicity, but in opposite ways. The size of the molecule also affects both solubility and permeability. Several models can be used to estimate oral absorption from chemical structure or from measured physicochemical properties. Thus, the logD-cMR model, the 'golden triangle' model, Abraham solvation equations and absorption potential can be used as tools in the lead optimisation process. Measured values of solubility and permeability greatly improve the estimation of in vivo oral absorption of compounds. However, it is important to appreciate that predictions of oral absorption may be confounded by the involvement of active transporters in the gut, which may either increase (e.g., active uptake) or decrease (e.g., efflux) the absorption of drug molecules. To evaluate the first-pass metabolism, in vitro clearance measurements using liver microsomes can be used in physiologically based models for the estimation of bioavailability. The general tools discussed in this review are based on the physicochemical property assessment of compound libraries and they help design compounds that occupy desirable property space with an increased likelihood of good oral absorption.

  1. A Comparative Analysis of PID, Lead, Lag, Lead-Lag, and Cascaded Lead Controllers for a Drug Infusion System

    Science.gov (United States)

    Jadoon, Zuwwar Khan; Shakeel, Sobia; Saleem, Abeera; Shuja, Sana; ul-Hasan, Qadeer; Ali Riaz, Raja

    2017-01-01

    Goal: The aim of this paper is to conduct a comprehensive comparative analysis between five different controllers for a drug infusion system in total intravenous anesthesia (TIVA) administration. Methods: The proposed method models a dilution chamber with first-order exponential decay characteristics to represent the pharmacokinetic decay of a drug. The dilution chamber is integrated with five different control techniques, with a simulation-based comparative analysis performed between them. The design process is conducted using MATLAB SISOTOOL. Results: The findings show that each controller has its own merits and demerits. The results generated using MATLAB signify and confirm the effectiveness of the PI and cascaded lead controllers, with the cascaded lead controller as the best control technique to automate and control propofol delivery. Conclusion: In this paper, different control techniques for measurement-based feedback-controlled propofol delivery are confirmed with promising results. Significance: The comparative analysis showed that this drug infusion platform, merged with the proper control technique, will perform eminently in the field of total intravenous anesthesia. PMID:29312654
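
    The core model in this abstract, a dilution chamber with first-order exponential decay under feedback control, can be sketched with a simple Euler simulation of a PI loop. The decay rate, gains, setpoint, and time step below are assumed for illustration and are not the paper's values (the paper designs its controllers in MATLAB SISOTOOL).

```python
# Minimal sketch: drug concentration c in a dilution chamber decays as
# dc/dt = -k*c + u, where u is the infusion rate set by a PI controller
# driving c toward a target concentration. All constants are invented.
k = 0.5             # 1/min, assumed elimination (decay) rate
dt = 0.01           # min, Euler integration step
kp, ki = 2.0, 1.0   # assumed PI gains
target = 3.0        # ug/ml, desired concentration

c, integral = 0.0, 0.0
for _ in range(int(20 / dt)):            # simulate 20 minutes
    err = target - c
    integral += err * dt
    u = max(0.0, kp * err + ki * integral)   # infusion rate cannot be negative
    c += dt * (-k * c + u)                   # first-order decay plus infusion
```

With these gains the closed loop is stable and the concentration settles at the setpoint; the paper's point is that the choice of controller (PID, lead, lag, lead-lag, cascaded lead) changes how quickly and smoothly this settling happens.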

  2. An Approach for Impression Creep of Lead Free Microelectronic Solders

    Science.gov (United States)

    Anastasio, Onofrio A.

    2002-06-01

    Currently, the microelectronics industry is transitioning from lead-containing to lead-free solders in response to legislation in the EU and Japan. Before an alternative alloy can be designated as a replacement for current Pb-Sn, extensive testing must be accomplished. One major characteristic of the alloy that must be considered is creep. Traditionally, creep testing requires numerous samples and a long time, which thwarts the generation of comprehensive creep databases for difficult-to-prepare samples such as microelectronic solder joints. However, a relatively new technique, impression creep, enables us to rapidly generate creep data. This test uses a cylindrical punch with a flat end to make an impression on the surface of a specimen under constant load. The steady state velocity of the indenter is found to have the same stress and temperature dependence as the conventional unidirectional creep test using bulk specimens. This thesis examines impression creep tests of eutectic Sn-Ag. A testing program and apparatus were developed and constructed based on a servo-hydraulic test frame. The apparatus is capable of a load resolution of 0.01 N with a stability of plus/minus 0.1 N, and a displacement resolution of 0.05 microns with a stability of plus/minus 0.1 microns. Samples of eutectic Sn-Ag solder were reflowed to develop the microstructure used in microelectronic packaging. Creep tests were conducted at various stresses and temperatures and showed that coarse microstructures creep more rapidly than fine microstructures in the tested regime.
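
    The stress and temperature dependence mentioned above is conventionally captured by a Norton power law with an Arrhenius term. The sketch below uses placeholder constants (A, n, Q), not measured values for eutectic Sn-Ag.

```python
import math

# Steady-state (secondary) creep rate via a Norton power law with Arrhenius
# temperature dependence -- the stress/temperature form that impression
# creep shares with conventional creep tests. A, n and Q are illustrative
# placeholders, not fitted values for eutectic Sn-Ag.
A = 1e-4      # (1/s) / MPa^n, assumed pre-exponential constant
n = 5.0       # assumed stress exponent
Q = 60e3      # J/mol, assumed activation energy
R = 8.314     # J/(mol K), gas constant

def creep_rate(stress_mpa, temp_k):
    """Steady-state creep rate (1/s) at a given stress and temperature."""
    return A * stress_mpa**n * math.exp(-Q / (R * temp_k))
```

Fitting n and Q from impression tests at several stresses and temperatures is exactly what makes the technique a fast substitute for unidirectional creep tests.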

  3. LEADING CHANGES IN ASSESSMENT USING AN EVIDENCE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    J. O. Macaulay

    2015-08-01

    Introduction and objectives: It has been widely accepted that assessment of learning is a critical component of education and that assessment drives/guides student learning by shaping study habits and student approaches to learning. However, although most academics would agree that assessment is a critical aspect of their roles as teachers, it is often an aspect of teaching that is regarded as an additional task rather than an integral component of the teaching/learning continuum. An additional impediment to high quality assessment is a non-evidence-based approach to the decision making process. The overall aim of this project was to improve the quality of assessment in Biochemistry and Molecular Biology undergraduate education by promoting high quality assessment. Materials and methods: To do this we developed and trialled an audit tool for mapping assessment practices. The audit tool was designed to gather data on current assessment practices and identify areas of good practice in which assessment aligned with the learning objectives, as well as areas in need of improvement. This evidence base will then be used to drive change in assessment. Results and conclusions: Using the assessment mapping tool we have mapped the assessment regime in a Biochemistry and Molecular Biology major at Monash University. Criteria used included: assessment type, format, timing, assessors, provision of feedback, level of learning (Bloom's) and approaches taken to planning assessment. We have mapped assessment of content and the systematic development of higher order learning and skills progression throughout the program of study. The data have enabled us to examine the assessment at unit (course) level as well as the vertical development across the major. This information is now being used to inform a review of the units and the major.

  4. An exploratory study of lead recovery in lead-acid battery lifecycle in US market: An evidence-based approach

    International Nuclear Information System (INIS)

    Genaidy, A.M.; Sequeira, R.; Tolaymat, T.; Kohler, J.; Rinder, M.

    2008-01-01

    Background: This research examines lead recovery and recycling in lead-acid batteries (LAB), which account for 88% of US lead consumption. We explore strategies to maximize lead recovery and recycling in the LAB lifecycle. Currently, there is limited information on recycling rates for LAB in the published literature, and what exists is derived from a single source. As a result, the extent of recycling efforts in the US has been unclear, making it difficult to determine the maximum opportunities for metal recovery and recycling in the face of significant demand for LAB, particularly in the auto industry. Objectives: The research utilizes an evidence-based approach to: (1) determine recycling rates for lead recovery in the LAB product lifecycle for the US market; and (2) quantify and identify opportunities where lead recovery and recycling can be improved. Methods: A comprehensive electronic search of the published literature was conducted to gather information on different LAB recycling models and on actual data used to calculate recycling rates based on the product lifecycle for the US market, in order to identify strategies for increasing lead recovery and recycling. Results: The electronic search yielded five models for calculating LAB recycling rates. The description of evidence was documented for each model. Furthermore, an integrated model was developed to identify and quantify the maximum opportunities for lead recovery and recycling. Results showed that recycling rates declined during the period spanning from 1999 to 2006. Opportunities were identified for recovery and recycling of lead in the LAB product lifecycle. Concluding remarks: One can deduce the following from the analyses undertaken in this report: (1) lead recovery and recycling has been stable between 1999 and 2006; (2) lead consumption has increased at an annual rate of 2.25%; thus, the values derived in this study for opportunities dealing with lead recovery and recycling underestimate the amount of lead in scrap and waste generated; and (3) the
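
    The basic quantities in such lifecycle models, a recycling rate and the compounding effect of the reported 2.25% annual growth in consumption, can be illustrated as follows. The tonnage figures are invented; only the growth rate and the 1999-2006 period come from the abstract.

```python
# Illustrative recycling-rate arithmetic in the spirit of the abstract's
# lifecycle models. Tonnage values are made up for demonstration.
lead_in_spent_batteries = 1_200_000   # tonnes available for recovery (assumed)
lead_recovered = 1_020_000            # tonnes of secondary lead produced (assumed)

recycling_rate = lead_recovered / lead_in_spent_batteries

# Compounding the abstract's 2.25% annual growth over 1999-2006 shows how
# much more lead entered the lifecycle by the end of the study period.
consumption_growth = 1.0225 ** (2006 - 1999)
```

The growth factor (about 1.17, i.e. roughly 17% more lead consumed per year by 2006 than in 1999) is why the abstract notes its recovery-opportunity estimates are understated.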

  5. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian model Vista). The percentages of lead found in the two analyzed lots were 38.1% and 40.8%. The lead concentrations in the material under study were high, but the product’s packaging contained no information about these concentrations.

  6. [A proactive approach to risks: from responding to leading].

    Science.gov (United States)

    Zegers, Marieke; Hesselink, Gijs; Roes, Kit; Geense, Wytske; Wollersheim, Hub

    2015-01-01

    To give an overview of instruments for early detection of quality and safety risks for integrated risk management in hospitals. Systematic literature review and qualitative research. A review of literature in three databases (PubMed, CINAHL and Embase) was conducted to establish which instruments are known from academic literature. Articles were selected if the effectiveness and feasibility of the instrument for risk management had been evaluated. We also examined the references of the articles found, and searched for grey literature. Moreover, 19 experts from healthcare and other sectors were interviewed in order to verify which instruments are used in practice and to study the factors influencing implementation in hospitals. We found more than 60 instruments which we divided into 12 categories. Interviewees reported that a combination of instruments is required in order to assess all of the quality and safety risks, the main elements being: (a) the patient as source; (b) brainstorming sessions and consultation in networks, i.e. verbal exchange of risks between departments and organisations; (c) insight into the performance of individual healthcare professionals and teams; and (d) site visits. For instruments to work as effectively as possible, a culture is essential in which care providers recognise and discuss risks. There is also a need for a management system including all hospital risks, allowing an integrated, efficient approach to risk. There are several instruments for early detection of quality and safety risks for integrated risk management in hospitals. The predictive value of the instruments requires further investigation.

  7. An Approach for Long-lead Probabilistic Forecast of Droughts

    Science.gov (United States)

    Madadgar, S.; Moradkhani, H.

    2013-12-01

    A spatio-temporal analysis of historical droughts across the Gunnison River Basin in Colorado, USA, is performed, and the probability distribution of future droughts is obtained. The Standardized Runoff Index (SRI) is employed to analyze the drought status across the spatial extent of the basin. To apply SRI in drought forecasting, the Precipitation Runoff Modeling System (PRMS) is used to estimate the runoff generated in the spatial units of the basin. A recently developed multivariate forecast technique is then used to model the joint behavior between the correlated variables of accumulated runoff over the forecast and predicting periods. The probability of future droughts in the forecast season, given the observed drought in the last season, is evaluated by the conditional probabilities derived from the forecast model. Using the conditional probabilities of future droughts, the runoff variation over the basin with a particular chance of occurrence is obtained as well. The forecast model also provides the uncertainty bound of future runoff produced at each spatial unit across the basin. Our results indicate that the statistical method developed in this study is a useful procedure for probabilistic forecasting of droughts given the spatio-temporal characteristics of past droughts.
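    For orientation, standardized indices such as the SRI are computed by mapping accumulated runoff through a cumulative distribution onto standard-normal quantiles, so that negative values flag drought. The following is a minimal sketch under assumed conventions: it uses empirical Gringorten plotting positions rather than the fitted distribution typically used operationally, and `sri_empirical` is a hypothetical name, not from the paper.

```python
from statistics import NormalDist

def sri_empirical(runoff):
    """Rank-based Standardized Runoff Index sketch: each seasonal runoff
    total gets an empirical non-exceedance probability (Gringorten
    plotting position) that is mapped to a standard-normal quantile.
    Negative values indicate drier-than-median conditions."""
    n = len(runoff)
    order = sorted(range(n), key=lambda i: runoff[i])  # indices by rank
    z = [0.0] * n
    nd = NormalDist()
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)  # Gringorten plotting position
        z[i] = nd.inv_cdf(p)            # probability -> z-score
    return z
```

    Operational forecasting then conditions the distribution of next season's index on the current one; the sketch above covers only the index computation itself.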

  8. Statistical analysis of lead isotope data in provenance studies

    International Nuclear Information System (INIS)

    Reedy, C.L.

    1991-01-01

    This paper reports on tracing artifacts to ore sources, which is different from assigning ore samples to time epochs. Until now, archaeometrists working with lead isotopes have used the ratio methods developed by geochronologists. For provenance studies, however, the use of composition data (the fraction of each of the four isotopes) leads to fewer arbitrary choices, two standard types of plots (labelled ternary and canonical variable), and a consistent method of discriminant analysis for separating groups of samples from different sources.

  9. Workplace English: Approach and Analysis.

    Science.gov (United States)

    Prince, David

    1984-01-01

    Describes two approaches to teaching vocational English as a second language: (1) describing work activities in terms of processes and procedures and (2) describing work activities in terms of specific human behaviors. Suggests a goal analysis as an initial step before deciding which approach to take in any training project. (SED)

  10. Potentiometric stripping analysis of Cadmium and Lead in superficial waters

    International Nuclear Information System (INIS)

    Arias, Juan Miguel; Marciales Castiblanco, Clara

    2003-01-01

    This paper contains the implementation and validation of an analytical method for determining cadmium and lead in surface waters. This is a valuable tool for describing actual conditions and for qualitative and quantitative control of dangerous heavy-metal discharge in water bodies. Tests were run to select the stripping potentiometry conditions, which, as indicated by the results, were: sample oxidant concentration 36.4 μg/L Hg2+, stirring frequency 2400 rpm, electrolysis time 80 s, electrolysis potential -950 mV and pH 2.0. Interference tests with Cu2+ and Fe3+ showed that copper concentrations larger than 150 μg/L and 500 μg/L negatively influence the analytical response for cadmium and lead, respectively; [Fe3+] larger than 60 μg/L and 400 μg/L causes variations in the cadmium and lead contents read, respectively. The linear concentration range for cadmium lies between 5 and 250 μg/L; for lead, the range goes from 10 to 250 μg/L. Precision, expressed as repeatability for both system and method, exhibits good reproducibility, with variation coefficients below 6%. Accuracy, assessed from recovery, is strongly influenced by concentration level; standard addition is therefore recommended for lead and cadmium quantification. Analysis performed on surface waters from the Colombian Magdalena and Cauca rivers showed lead and cadmium contents below the detection limits.

  11. Bulk analysis of silver, lead and zinc in drill cores

    International Nuclear Information System (INIS)

    Ellis, W.K.; Sowerby, B.D.; Rainey, P.T.; Dekker, D.L.

    1987-01-01

    A bulk analyser is being developed to determine the silver, lead and zinc contents of unsplit drill cores. Laboratory work carried out at Lucas Heights Research Laboratories has shown that the most favourable bulk analysis techniques for silver, lead and zinc are neutron activation, gamma-ray scattering and neutron inelastic scattering, respectively. Three measurement stations would be needed; however, the same annular borecore holder would be used for all three measurements, so that the operator needs to load the sample only once. Results on a large number of crushed Hilton borecores show that lead can be determined to within 0.7 wt%. Preliminary results indicate that Ag and Zn can be determined to within about 20 ppm and 0.7 wt%, respectively. A bulk analyser based on the above techniques would analyse about 6 samples per hour.

  12. Transvenous Lead Extraction via the Inferior Approach Using a Gooseneck Snare versus Simple Manual Traction.

    Science.gov (United States)

    Jo, Uk; Kim, Jun; Hwang, You-Mi; Lee, Ji-Hyun; Kim, Min-Su; Choi, Hyung-Oh; Lee, Woo-Seok; Kwon, Chang-Hee; Ko, Gi-Young; Yoon, Hyun-Ki; Nam, Gi-Byoung; Choi, Kee-Joon; Kim, You-Ho

    2016-03-01

    The number of patients with cardiac implantable electronic devices needing lead extraction is increasing for various reasons, including infections, vascular obstruction, and lead failure. We report our experience with transvenous extraction of pacemaker and defibrillator leads via the inferior approach using a gooseneck snare as a first-line therapy, and compare extraction using a gooseneck snare with extraction using simple manual traction. The study included 23 consecutive patients (43 leads) who underwent transvenous lead extraction using a gooseneck snare (group A) and 10 consecutive patients (17 leads) who underwent lead extraction using simple manual traction (group B). Patient characteristics, indications, and outcomes were analyzed and compared between the groups. The dwelling time of the leads was longer in group A (median, 121) than in group B (median, 56; p<0.001). No differences were noted in the overall procedural success rate (69.6% vs. 70%), clinical procedural success rate (82.6% vs. 90%), and lead clinical success rate (86% vs. 94.1%) between the groups. The procedural success rates according to lead type were 89.2% and 100% for pacing leads and 66.7% and 83.3% for defibrillator leads in groups A and B, respectively. Major complications were noted in 3 patients (1 death) in group A and 2 patients in group B. Transvenous extraction of pacemaker leads via an inferior approach using a gooseneck snare was both safe and effective. However, stand-alone transvenous extraction of defibrillator leads using the inferior approach was suboptimal.

  13. Lead shielded cells for the spectrographic analysis of radioisotope solutions

    International Nuclear Information System (INIS)

    Roca, M.; Capdevila, C.; Cruz, F. de la

    1967-01-01

    Two lead shielded cells for the spectrochemical analysis of radioisotope samples are described. One of them is devoted to the evaporation of samples before excitation, and the other contains a suitable spectrographic excitation stand for the copper spark technique. A special device, running on wheels and rails, allows the excitation cell to be easily displaced for accurate and reproducible positioning, as well as its replacement by a glove box for plutonium analysis. To guarantee safety, the room in which the spectrograph and the source are set up is separated from the active laboratory by a wall with a suitable window. (Author) 1 refs

  14. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    Science.gov (United States)

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Real Analysis A Historical Approach

    CERN Document Server

    Stahl, Saul

    2011-01-01

    A provocative look at the tools and history of real analysis. This new edition of Real Analysis: A Historical Approach continues to serve as an interesting read for students of analysis. Combining historical coverage with a superb introductory treatment, this book helps readers easily make the transition from concrete to abstract ideas. The book begins with an exciting sampling of classic and famous problems first posed by some of the greatest mathematicians of all time. Archimedes, Fermat, Newton, and Euler are each summoned in turn, illuminating the utility of infinite, power, and trigonometric series.

  16. Corrosion by liquid lead and lead-bismuth: experimental results review and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jinsuo [Los Alamos National Laboratory

    2008-01-01

    Liquid metal technologies for liquid lead and lead-bismuth alloy are under wide investigation and development for advanced nuclear energy systems and waste transmutation systems. Material corrosion is one of the main issues studied recently in the development of liquid metal technology. This study reviews corrosion by liquid lead and lead-bismuth, including the corrosion mechanisms, corrosion inhibitors and the formation of the protective oxide layer. The available experimental data are analyzed using a corrosion model in which oxidation and scale removal are coupled. Based on the model, the long-term behavior of steels in liquid lead and lead-bismuth is predictable. This report provides information for the selection of structural materials for typical nuclear reactor coolant systems when liquid lead or lead-bismuth is selected as the heat transfer medium.

  17. Analysis of natural radionuclides and lead in foods and diets

    International Nuclear Information System (INIS)

    Bueno, Luciana

    1999-01-01

    The main purpose of the present study was to determine the lead-210, polonium-210 and lead concentrations in foods and diets. Consumption of food is generally the main route by which radionuclides enter the human organism. The precision and accuracy of the methods developed were verified by the analysis of reference materials from the International Atomic Energy Agency (IAEA). The method for polonium-210 analysis consisted of sample dissolution in a microwave digester (open system) employing concentrated nitric acid and hydrogen peroxide, evaporation almost to dryness, addition of hydrochloric acid, polonium deposition onto a silver disc for six hours and counting by alpha spectrometry. Lead was analysed by the atomic absorption technique. After sample dissolution in a microwave digester (using concentrated nitric acid and hydrogen peroxide) and dilution to 50 ml, 20 μl of the sample was injected into a pyrolytic graphite furnace atomic absorption spectrophotometer equipped with Zeeman background correction. The assessment of the contaminants in foods and diets allowed estimation of the intake of these elements, and for the radionuclides the radiation doses to which the selected individuals were exposed through food consumption were also evaluated. The effective dose for lead-210 from diet intake ranged from 1.3 to 4.3 μSv/year, corresponding to 25% of that resulting from polonium-210 intake. The dose due to both natural radionuclides varied from 6.8 to 23.0 μSv/year. These values are in good agreement with literature data: they are consistent with the 60 μSv estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR, 1993) and lower than the dose of 0.02 Sv, the limit established by ICRP (1980). The lead levels found in the majority of the Brazilian foods are in good agreement with the values published by CONAT and FAO/WHO. However, some foods such as bean, potato, papaya, apple and rice present levels above the values recommended by the Public

  18. Leading edge analysis of transcriptomic changes during pseudorabies virus infection

    Directory of Open Access Journals (Sweden)

    Damarius S. Fleming

    2016-12-01

    Eight RNA samples taken from the tracheobronchial lymph nodes (TBLN) of pigs that were either infected or non-infected with a feral isolate of porcine pseudorabies virus (PRV) were used to investigate changes in gene expression related to the pathogen. The RNA was processed into fastq files for each library prior to being analyzed using Illumina Digital Gene Expression Tag Profiling (DGETP) sequences, which were used as the downstream measure of differential expression. Analyzed tags consisted of 21-base-pair sequences taken from time points 1, 3, 6, and 14 days post infection (dpi) that generated 1,927,547 unique tag sequences. Tag sequences were analyzed for differential transcript expression and by gene set enrichment analysis (GSEA) to uncover transcriptomic changes related to PRV pathology progression. In conjunction with the DGETP and GSEA, the study also incorporated leading edge analysis to help link the TBLN transcriptome data to the clinical progression of PRV at each of the sampled time points. The purpose of this manuscript is to provide useful background on applying leading edge analysis to GSEA and expression data to help identify genes considered to be of high biological interest. The data, in the form of fastq files, have been uploaded to the NCBI Gene Expression Omnibus (GEO) database (GSE74473).

  19. A noninvasive isotopic approach to estimate the bone lead contribution to blood in children: implications for assessing the efficacy of lead abatement.

    Science.gov (United States)

    Gwiazda, Roberto; Campbell, Carla; Smith, Donald

    2005-01-01

    Lead hazard control measures to reduce children's exposure to household lead sources often result in only limited reductions in blood lead levels. This may be due to incomplete remediation of lead sources and/or to the remobilization of lead stores from bone, which may act as an endogenous lead source that buffers reductions in blood lead levels. Here we present a noninvasive isotopic approach to estimate the magnitude of the bone lead contribution to blood in children following household lead remediation. In this approach, lead isotopic ratios of a child's blood and 5-day fecal samples are determined before and after a household intervention aimed at reducing the child's lead intake. The bone lead contribution to blood is estimated from a system of mass balance equations of lead concentrations and isotopic compositions in blood at the different times of sample collection. The utility of this method is illustrated with three cases of children with blood lead levels in the range of 18-29 μg/dL. In all three cases, the release of lead from bone supported a substantial fraction of the measured blood lead level postintervention, up to 96% in one case. In general, the lead isotopic compositions of feces matched or were within the range of the lead isotopic compositions of the household dusts with lead loadings exceeding U.S. Environmental Protection Agency action levels. This isotopic agreement underscores the utility of lead isotopic measurements of feces to identify household sources of lead exposure. Results from this limited number of cases support the hypothesis that the release of bone lead into blood may substantially buffer the decrease in blood lead levels expected from the reduction in lead intake.
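    In its simplest two-endmember form, the mass-balance idea in this record reduces to a mixing calculation: post-intervention blood lead is apportioned between exogenous intake (with the isotopic signature measured in feces) and endogenous bone stores. A hypothetical sketch follows; the function name and the ratio values are illustrative, not taken from the study.

```python
def bone_fraction(r_blood, r_exogenous, r_bone):
    """Fraction of blood lead attributable to bone release, estimated
    from a single isotope ratio (e.g. 206Pb/207Pb) under a simple
    two-endmember mixing assumption:
        r_blood = f * r_bone + (1 - f) * r_exogenous  ->  solve for f."""
    return (r_blood - r_exogenous) / (r_bone - r_exogenous)

# If blood sits 60% of the way from the exogenous (fecal) signature
# toward the bone signature, 60% of blood lead is attributed to bone.
f = bone_fraction(r_blood=1.19, r_exogenous=1.16, r_bone=1.21)
```

    The study's actual system also balances lead concentrations across sampling times; this sketch isolates only the isotopic mixing step.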

  20. Analysis of actin FLAP dynamics in the leading lamella.

    Directory of Open Access Journals (Sweden)

    Igor R Kuznetsov

    2010-04-01

    The transport of labeled G-actin from the mid-lamella region to the leading edge in a highly motile malignant rat fibroblast line has been studied using fluorescence localization after photobleaching (FLAP), and the transit times recorded in these experiments were so fast that simple diffusion was deemed an insufficient explanation (see Zicha et al., Science, v. 300, pp. 142-145 [1]). We re-examine the Zicha FLAP experiments using a two-phase reactive interpenetrating flow formalism to model the cytoplasm and the transport dynamics of bleached and unbleached actin. By allowing an improved treatment of effects related to the retrograde flow of the cytoskeleton and of the geometry and finite thickness of the lamella, this new analysis reveals a mechanism that can realistically explain the timing and the amplitude of all the FLAP signals observed in [1] without invoking special transport modalities. We conclude that simple diffusion is sufficient to explain the observed transport rates, and that variations in the transport of labeled actin through the lamella are minor and not likely to be the cause of the observed physiological variations among different segments of the leading edge. We find that such variations in labeling can easily arise from differences and changes in the microscopic actin dynamics inside the edge compartment, and that the key dynamical parameter in this regard is the so-called "dilatation rate" (the velocity of cytoskeletal retrograde flow divided by a characteristic dimension of the edge compartment where rapid polymerization occurs). If our dilatation hypothesis is correct, the transient kinetics of bleached actin relocalization constitute a novel and very sensitive method for probing the cytoskeletal dynamics in leading edge micro-environments which are otherwise very difficult to directly interrogate.

  2. A new approach to evaluate the leading hadronic corrections to the muon g-2

    Directory of Open Access Journals (Sweden)

    C.M. Carloni Calame

    2015-06-01

    We propose a novel approach to determine the leading hadronic corrections to the muon g-2. It consists in a measurement of the effective electromagnetic coupling in the space-like region, extracted from Bhabha scattering data. We argue that this new method may become feasible at flavor factories, resulting in an alternative determination potentially competitive with the accuracy of the present results obtained with the dispersive approach via time-like data.
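    For context, the space-like evaluation referred to above is usually written (in the notation common to this literature; the paper's exact conventions may differ) as an integral of the hadronic shift of the running coupling at negative momentum transfer:

```latex
a_\mu^{\mathrm{HLO}} \;=\; \frac{\alpha}{\pi} \int_0^1 dx \,(1-x)\,
\Delta\alpha_{\mathrm{had}}\!\left[t(x)\right],
\qquad
t(x) \;=\; \frac{x^2 m_\mu^2}{x-1} \;<\; 0,
```

    so that measuring the effective coupling over the space-like region t < 0, e.g. from Bhabha scattering, determines the integrand directly.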

  3. Lead in rice: analysis of baseline lead levels in market and field collected rice grains.

    Science.gov (United States)

    Norton, Gareth J; Williams, Paul N; Adomako, Eureka E; Price, Adam H; Zhu, Yongguan; Zhao, Fang-Jie; McGrath, Steve; Deacon, Claire M; Villada, Antia; Sommella, Alessia; Lu, Ying; Ming, Lei; De Silva, P Mangala C S; Brammer, Hugh; Dasgupta, Tapash; Islam, M Rafiqul; Meharg, Andrew A

    2014-07-01

    In a large-scale survey of rice grains from markets (13 countries) and fields (6 countries), a total of 1578 rice grain samples were analysed for lead. Of the market-collected samples, only 0.6% exceeded the Chinese and EU limit of 0.2 μg g^-1 lead in rice (when excluding samples collected from known contaminated/mine-impacted regions). When evaluating the rice grain samples against the Food and Drug Administration's (FDA) provisional total tolerable intake (PTTI) values for children and pregnant women, it was found that only people consuming large quantities of rice were at risk of exceeding the PTTI from rice alone. Furthermore, 6 field experiments were conducted to evaluate the proportion of the variation in lead concentration in rice grains due to genetics. A total of 4 of the 6 field experiments showed significant differences between genotypes, but when the genotypes common across all six field sites were assessed, only 4% of the variation was explained by genotype, with 9.5% and 11% of the variation explained by the environment and the genotype-by-environment interaction, respectively. Further work is needed to identify the sources of lead contamination in rice, with detailed information obtained on the locations and environments where the rice is sampled, so that specific risk assessments can be performed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Measuring leading placental edge to internal cervical os: Transabdominal versus transvaginal approach

    DEFF Research Database (Denmark)

    Westerway, Susan Campbell; Hyett, Jon; Henning Pedersen, Lars

    2017-01-01

    We aimed to compare the value of transabdominal (TA) and transvaginal (TV) approaches for assessing the risk of a low-lying placenta. This involved a comparison of TA and TV measurements between the leading placental edge and the internal cervical os. We also assessed the intra-/interobserver variation for these measurements and the efficacy of TA measures in screening for a low placenta. Methodology: Transabdominal and TV measurements of the leading placental edge to the internal cervical os were performed on 369 consecutive pregnancies of 16-41 weeks' gestation. The difference (TA-TV) from ... the area under the receiver operator characteristics (ROC) curve. Intra-/interobserver variations were also calculated. Results: Of the pregnancies, 278 had a leading placental edge that was visible with the TV approach. Differences (TA-TV) ranged from -50 mm to +57 mm. A Bland-Altman plot shows that TA ...

  5. Lead isotope approach to the understanding of early Japanese bronze culture

    International Nuclear Information System (INIS)

    Mabuchi, H.; Hirao, Y.

    1985-01-01

    For several years, the authors have used lead isotope analysis to investigate extensively the provenance of ancient bronze and copper artifacts excavated mainly from Japanese archaeological sites. The results have been published item by item in several relevant Japanese journals. This review is intended to give an account of the whole body of work relating early Japanese bronze culture to Chinese and Korean cultures through lead isotope study. (author)

  6. Children's Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making.

    Science.gov (United States)

    Zartarian, Valerie; Xue, Jianping; Tornero-Velez, Rogelio; Brown, James

    2017-09-12

    Drinking water and other sources of lead are the subject of public health concern surrounding the Flint, Michigan, drinking water and East Chicago, Indiana, lead-in-soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)'s National Drinking Water Advisory Council (NDWAC) recommended establishment of a "health-based, household action level" for lead in drinking water based on children's exposure. The primary objective was to develop a coupled exposure-dose modeling approach that can be used to determine what drinking water lead concentrations keep children's blood lead levels (BLLs) below specified values, considering exposures from water, soil, dust, food, and air. Related objectives were to evaluate the coupled model estimates using real-world blood lead data, to quantify relative contributions by the various media, and to identify key model inputs. A modeling approach using the EPA's Stochastic Human Exposure and Dose Simulation (SHEDS)-Multimedia and Integrated Exposure Uptake and Biokinetic (IEUBK) models was developed using available data. This analysis for the U.S. population of young children probabilistically simulated multimedia exposures and estimated relative contributions of media to BLLs across all population percentiles for several age groups. Modeled BLLs compared well with nationally representative BLLs (0-23% relative error). Analyses revealed the relative importance of the soil and dust ingestion exposure pathways and associated Pb intake rates; water ingestion was also a main pathway, especially for infants. This methodology advances scientific understanding of the relationship between lead concentrations in drinking water and BLLs in children. It can guide national health-based benchmarks for lead and related community public health decisions. https://doi.org/10.1289/EHP1605.
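    The aggregation step at the heart of such multimedia modeling can be caricatured as a sum of medium-specific absorbed intakes. The sketch below is a deliberate simplification with illustrative names and numbers; the study itself couples the SHEDS-Multimedia exposure simulator to the IEUBK biokinetic model rather than using a closed-form sum.

```python
def daily_pb_uptake(media):
    """Total daily absorbed lead (µg/day): for each medium, multiply its
    lead concentration, the child's intake rate, and the fraction of
    ingested lead actually absorbed.

    media maps a medium name to (concentration, intake_rate, absorbed_fraction).
    """
    return sum(conc * rate * absorbed for conc, rate, absorbed in media.values())

# Illustrative values only (not the study's inputs):
uptake = daily_pb_uptake({
    "water": (10.0, 0.8, 0.5),    # µg/L,  L/day, fraction absorbed
    "soil":  (200.0, 0.05, 0.3),  # µg/g,  g/day, fraction absorbed
    "dust":  (100.0, 0.03, 0.3),  # µg/g,  g/day, fraction absorbed
})
```

    Varying one medium's concentration while holding the others fixed is the same thought experiment the study runs probabilistically to find water concentrations that keep BLLs below a target.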

  7. Qualitative analysis of factors leading to clinical incidents.

    Science.gov (United States)

    Smith, Matthew D; Birch, Julian D; Renshaw, Mark; Ottewill, Melanie

    2013-01-01

    The purpose of this paper is to evaluate the common themes leading or contributing to clinical incidents in a UK teaching hospital. A root-cause analysis was conducted on patient safety incidents. Commonly occurring root causes and contributing factors were collected and correlated with incident timing and severity. In total, 65 root-cause analyses were reviewed, highlighting 202 factors implicated in the clinical incidents, and 69 categories were identified. The 14 most commonly occurring causes (encountered in four incidents or more) were examined as key-root or contributory causes. Incident timing was also analysed; common factors were encountered more frequently out of hours, occurring as contributory rather than key-root causes. In total, 14 commonly occurring factors were identified to direct interventions that could prevent many clinical incidents. From these, an "Organisational Safety Checklist" was developed to involve departmental-level clinicians in monitoring practice. This study demonstrates that comprehensively investigating incidents highlights common factors that can be addressed at a local level. Resilience against clinical incidents is low during out-of-hours periods, when factors such as lower staffing levels and poor service provision allow problems to escalate and become clinical incidents. This adds to the literature regarding out-of-hours care provision and should prove useful to those organising hospital services at departmental and management levels.

  8. Inter-lead correlation analysis for automated detection of cable reversals in 12/16-lead ECG.

    Science.gov (United States)

    Jekova, Irena; Krasteva, Vessela; Leber, Remo; Schmid, Ramun; Twerenbold, Raphael; Müller, Christian; Reichlin, Tobias; Abächerli, Roger

    2016-10-01

    A crucial factor for proper electrocardiogram (ECG) interpretation is correct electrode placement in standard 12-lead ECG and extended 16-lead ECG for accurate diagnosis of acute myocardial infarctions. In the context of optimal patient care, we present and evaluate a new method for automated detection of reversals in peripheral and precordial (standard, right and posterior) leads, based on simple rules with inter-lead correlation dependencies. The algorithm for analysis of cable reversals relies on scoring of inter-lead correlations estimated over 4 s snapshots with time-coherent data from multiple ECG leads. Peripheral cable reversals are detected by assessment of nine correlation coefficients, comparing V6 to the limb leads (I, II, III, -I, -II, -III, -aVR, -aVL, -aVF). Precordial lead reversals are detected by analysis of the ECG pattern cross-correlation progression within the lead sets (V1-V6), (V4R, V3R, V3, V4), and (V4, V5, V6, V8, V9); disturbed progression identifies the swapped leads. A test set including 2239 ECGs from three independent sources, the public 12-lead PTB and CSE databases and a proprietary 16-lead database (Basel University Hospital), is used for algorithm validation, reporting specificity (Sp) and sensitivity (Se) as true negative and true positive detection of simulated lead swaps. Reversals of limb leads are detected with Se = 95.5-96.9%, rising to 100% when the right leg is involved in the reversal. Among all 15 possible pairwise reversals in standard precordial leads, adjacent lead reversals are detected with Se = 93.8% (V5-V6), 95.6% (V2-V3), 95.9% (V3-V4), 97.1% (V1-V2), and 97.8% (V4-V5), increasing to 97.8-99.8% for reversals of anatomically more distant electrodes. The pairwise reversals in the four extra precordial leads are detected with Se = 74.7% (right-sided V4R-V3R), 91.4% (posterior V8-V9), 93.7% (V4R-V9), and 97.7% (V4R-V8, V3R-V9, V3R-V8). A higher true negative rate is achieved with Sp > 99% (standard 12-lead ECG), 81.9% (V4R-V3R), 91
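    To make the correlation-scoring idea concrete, here is a toy sketch, not the published algorithm: the function names and the template logic are assumptions. A pair of leads is flagged as swapped when each recorded signal correlates better with the other lead's expected pattern than with its own.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def swap_score(lead_a, lead_b, template_a, template_b):
    """Positive when the recorded pair matches the swapped assignment
    better than the nominal one (i.e. a likely cable reversal)."""
    nominal = pearson(lead_a, template_a) + pearson(lead_b, template_b)
    swapped = pearson(lead_a, template_b) + pearson(lead_b, template_a)
    return swapped - nominal
```

    The published method extends this idea with rule-based scoring over many lead pairs and with the expected cross-correlation progression along the precordial chain.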

  9. Determination of the sources of copper and lead used for British bronze age metalwork by lead isotope analysis

    International Nuclear Information System (INIS)

    Rohl, B.M.

    1997-01-01

    This presentation highlights the results of the work carried out by the author during her doctoral research regarding the use of lead isotope analysis to investigate the source of copper and lead for the metalwork during the British Bronze Age. Over 450 new lead isotope analyses of ore samples from England and Wales were compared with published data from Britain, Ireland, France and Germany. In addition, more than 400 pieces of metalwork, representing all phases of the British Bronze Age, were analysed. Many of these pieces of metalwork had previously been analysed for their chemical and impurity content, and supplementary chemical analyses were made to investigate a possible chemical/lead isotope relationship. The ores show overlapping isotopic distributions, while the artefacts show intriguing shifts in the lead isotope signature, with a coherent pattern recognizable throughout the Bronze Age phases and regionally

  10. Gamma radiation shielding analysis of lead-flyash concretes

    International Nuclear Information System (INIS)

    Singh, Kanwaldeep; Singh, Sukhpal; Dhaliwal, A.S.; Singh, Gurmel

    2015-01-01

    Six samples of lead-flyash concrete were prepared with lead as an admixture and by varying flyash content – 0%, 20%, 30%, 40%, 50% and 60% (by weight) – by replacing cement and keeping a constant w/c ratio. Different gamma radiation interaction parameters used for radiation shielding design were computed theoretically and measured experimentally at 662 keV, 1173 keV and 1332 keV gamma radiation energy using narrow transmission geometry. The obtained results were compared with ordinary-flyash concretes. The radiation exposure rate of the gamma radiation sources used was determined with and without lead-flyash concretes. - Highlights: • Concrete samples with lead as admixture were cast with flyash replacing 0%, 20%, 30%, 40%, 50% and 60% of cement content (by weight). • Gamma radiation shielding parameters of concretes for different gamma ray sources were measured. • The attenuation results of lead-flyash concretes were compared with the results of ordinary flyash concretes

  11. Extraction of lead from waste CRT funnel glass by generating lead sulfide - An approach for electronic waste management.

    Science.gov (United States)

    Hu, Biao; Hui, Wenlong

    2017-09-01

    Waste cathode ray tube (CRT) funnel glass is one of the key difficulties in waste electrical and electronic equipment (WEEE) disposal. In this paper, a novel and effective process for the detoxification and reutilization of waste CRT funnel glass was developed by generating lead sulfide precipitate via a high-temperature melting process. The central function in this process was the generation of lead sulfide, which gathered at the bottom of the crucible and was then separated from the slag. Sodium carbonate was used as a flux and reaction agent, and sodium sulfide was used as a precipitating agent. The experimental results revealed that the lead sulfide recovery rate initially increased with an increase in the amount of added sodium carbonate, the amount of sodium sulfide, the temperature, and the holding time, and then reached an equilibrium value. The maximum lead sulfide recovery rate was approximately 93%, at the optimum sodium carbonate level, sodium sulfide level, temperature, and holding time of 25%, 8%, 1200°C, and 2 h, respectively. The glass slag can be made into sodium and potassium silicate by hydrolysis in an environmentally sound and economical process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Analysis of lead content in herbal preparations in Malaysia.

    Science.gov (United States)

    Ang, H H; Lee, E L; Matsumoto, K

    2003-08-01

    In Malaysia, the phase 3 registration for traditional medicines was implemented on 1 January 1992 under the Control of Drugs and Cosmetics Regulation 1984, emphasizing quality, efficacy and safety (including the detection of the presence of heavy metals) in all pharmaceutical dosage forms of traditional medicine preparations. Therefore, a total of 100 products in various pharmaceutical dosage forms of a herbal preparation were analysed for lead content using an atomic absorption spectrophotometer. Results showed that 8% (eight products) possessed 10.64-20.72 ppm of lead and therefore did not comply with the quality requirement for traditional medicines in Malaysia. One of these products, M-Tongkat Ali (10.64 +/- 0.37 ppm of lead), was in fact already registered with the DCA Malaysia. The rest, Sukarno Tongkat Ali, Eurycoma Madu, Super Pill Tongkat Ali, Force Pill Tongkat Ali, Tender Pill Tongkat Ali, Super Pill Tongkat Ali Plus and Great Pill Tongkat Ali Plus, had not been registered with the DCA Malaysia and exhibited 12.24-20.72 ppm of lead. Although this study showed that 92% of the products complied with the quality requirement for traditional medicines in Malaysia, they cannot be assumed safe from lead contamination because of batch-to-batch inconsistency.
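    The compliance screen described above amounts to a threshold check. A minimal sketch, assuming a 10 ppm lead limit (the abstract implies failures start at 10.64 ppm, but the exact regulatory limit is an assumption here); the product names and values are illustrative, not the study's data:

```python
# Assumed regulatory limit for lead in traditional medicines (ppm).
PB_LIMIT_PPM = 10.0

def check_compliance(measurements):
    """Split products into compliant and non-compliant lists by measured
    lead content (ppm), preserving the input order."""
    ok = [name for name, ppm in measurements.items() if ppm <= PB_LIMIT_PPM]
    bad = [name for name, ppm in measurements.items() if ppm > PB_LIMIT_PPM]
    return ok, bad

# Hypothetical atomic-absorption results for three products.
samples = {"product_A": 0.8, "product_B": 10.64, "product_C": 20.72}
ok, bad = check_compliance(samples)
print(bad)  # ['product_B', 'product_C']
```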

  13. Electrode alignment of transverse tripoles using a percutaneous triple-lead approach in spinal cord stimulation

    Science.gov (United States)

    Sankarasubramanian, V.; Buitenweg, J. R.; Holsheimer, J.; Veltink, P.

    2011-02-01

    The aim of this modeling study is to determine the influence of electrode alignment of transverse tripoles on the paresthesia coverage of the pain area in spinal cord stimulation, using a percutaneous triple-lead approach. Transverse tripoles, comprising a central cathode and two lateral anodes, were modeled on the low-thoracic vertebral region (T10-T12) using percutaneous triple-lead configurations, with the center lead on the spinal cord midline. The triple leads were oriented both aligned and staggered. In the staggered configuration, the anodes were offset either caudally (caudally staggered) or rostrally (rostrally staggered) with respect to the midline cathode. The transverse tripolar field steering with the aligned and staggered configurations enabled the estimation of dorsal column fiber thresholds (IDC) and dorsal root fiber thresholds (IDR) at various anodal current ratios. IDC and IDR were considerably higher for the aligned transverse tripoles as compared to the staggered transverse tripoles. The aligned transverse tripoles facilitated deeper penetration into the medial dorsal columns (DCs). The staggered transverse tripoles always enabled broad and bilateral DC activation, at the expense of mediolateral steerability. The largest DC recruited area was obtained with the rostrally staggered transverse tripole. Transverse tripolar geometries, using percutaneous leads, allow for selective targeting of either medial or lateral DC fibers, if and only if the transverse tripole is aligned. Steering of anodal currents between the lateral leads of the staggered transverse tripoles cannot target medially confined populations of DC fibers in the spinal cord. An aligned transverse tripolar configuration is strongly recommended, because of its ability to provide more post-operative flexibility than other configurations.

  14. An obstructive sleep apnea detection approach using kernel density classification based on single-lead electrocardiogram.

    Science.gov (United States)

    Chen, Lili; Zhang, Xi; Wang, Hui

    2015-05-01

    Obstructive sleep apnea (OSA) is a common sleep disorder that often remains undiagnosed, leading to an increased risk of developing cardiovascular diseases. Polysomnography (PSG) is currently used as the gold standard for screening OSA. However, because it is time consuming, expensive and causes discomfort, alternative techniques based on a reduced set of physiological signals are proposed to solve this problem. This study proposes a convenient non-parametric kernel density-based approach for detection of OSA using single-lead electrocardiogram (ECG) recordings. Selected physiologically interpretable features are extracted from segmented RR intervals, which are obtained from ECG signals. These features are fed into the kernel density classifier to detect apnea events, and bandwidths for the density of each class (normal or apnea) are automatically chosen through an iterative bandwidth selection algorithm. To validate the proposed approach, RR intervals are extracted from ECG signals of 35 subjects obtained from a sleep apnea database ( http://physionet.org/cgi-bin/atm/ATM ). The results indicate that the kernel density classifier, with two features for apnea event detection, achieves a mean accuracy of 82.07 %, with mean sensitivity of 83.23 % and mean specificity of 80.24 %. Compared with other existing methods, the proposed kernel density approach achieves a comparably good performance but uses fewer features without significantly losing discriminant power, which indicates that it could be widely used for home-based screening or diagnosis of OSA.
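    The core idea of a kernel density classifier is to estimate a density per class and assign a sample to the class under which it is most likely. A minimal one-dimensional sketch with fixed bandwidths (the paper uses multivariate RR-interval features and an iterative bandwidth selection algorithm; the feature values below are invented):

```python
import math

def kde(x, data, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    return sum(math.exp(-((x - d) / h) ** 2 / 2) for d in data) / (
        len(data) * h * math.sqrt(2 * math.pi))

def classify(x, normal, apnea, h_normal, h_apnea):
    """Assign x to whichever class density is higher (toy 1-D version)."""
    return "apnea" if kde(x, apnea, h_apnea) > kde(x, normal, h_normal) else "normal"

# Illustrative 1-D feature values (e.g. an RR-interval statistic): assumed.
normal_feats = [0.80, 0.85, 0.90, 0.95, 1.00]
apnea_feats = [1.20, 1.25, 1.30, 1.35, 1.40]
print(classify(0.88, normal_feats, apnea_feats, 0.05, 0.05))  # normal
print(classify(1.32, normal_feats, apnea_feats, 0.05, 0.05))  # apnea
```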

  15. Analysis and determination of mercury, cadmium and lead in ...

    African Journals Online (AJOL)

    The objective of this study is to determine mercury, cadmium and lead concentrations in 60 canned tuna fish samples produced and distributed in Iran after digestion by the standard methods of the Association of Official Analytical Chemists. Mercury contents in canned tuna fish were determined by cold vapor atomic ...

  16. Nuclear microprobe analysis of lead profile in crocodile bones

    Energy Technology Data Exchange (ETDEWEB)

    Orlic, I. E-mail: ivo@ansto.gov.au; Siegele, R.; Hammerton, K.; Jeffree, R.A.; Cohen, D.D

    2003-09-01

    Elevated concentrations of lead were found in Australian free-ranging saltwater crocodile (Crocodylus porosus) bone and flesh. Lead shot was identified as a potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out whether elevated Pb concentration remains in growth rings and whether the concentration is correlated with the blood levels recorded at the time. Results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as good correlation with the level of lead concentration in blood. To investigate the influence of ion species on detection limits, measurements of the same sample were performed using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak to background ratios, detection limits and the overall 'quality' of the obtained spectra are compared and discussed.

  17. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers) engage people in challenging undertakings (e.g., innovation) that require everyone’s commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementary theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  18. Approaches to Enhance Sensemaking for Intelligence Analysis

    National Research Council Canada - National Science Library

    McBeth, Michael

    2002-01-01

    ..., and to apply persuasion skills to interact more productively with others. Each approach is explained from a sensemaking perspective and linked to Richard Heuer's Psychology of Intelligence Analysis...

  19. Lead isotopic compositions of environmental certified reference materials for an inter-laboratory comparison of lead isotope analysis

    International Nuclear Information System (INIS)

    Aung, Nyein Nyein; Uryu, Tsutomu; Yoshinaga, Jun

    2004-01-01

    Lead isotope ratios, viz. 207Pb/206Pb and 208Pb/206Pb, of the commercially available certified reference materials (CRMs) issued in Japan are presented with the objective of providing a data set that will be useful for the quality assurance of analytical procedures, instrumental performance and method validation of the laboratories involved in environmental lead isotope ratio analysis. The analytical method used in the present study was inductively coupled plasma quadrupole mass spectrometry (ICP-QMS), preceded by acid digestion, with or without chemical separation of lead from the matrix. The precision of the measurements in terms of the relative standard deviation (RSD) of triplicate analyses was 0.19% and 0.14% for 207Pb/206Pb and 208Pb/206Pb, respectively. The trueness of the lead isotope ratio measurements of the present study was tested with a few CRMs that have been analyzed by other analytical methods and reported in the literature. The lead isotopic ratios of 18 environmental matrix CRMs (including 6 CRMs analyzed for our method validation) are presented and the distribution of their ratios is briefly discussed. (author)
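    The precision figure quoted above is the relative standard deviation of triplicate ratio measurements. A short sketch of that calculation; the triplicate values are invented for illustration:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample stdev over mean, times 100.
    This is the precision measure quoted for the triplicate
    207Pb/206Pb and 208Pb/206Pb analyses."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Illustrative triplicate 208Pb/206Pb measurements (values assumed).
triplicate = [2.44, 2.45, 2.46]
print(round(rsd_percent(triplicate), 2))  # 0.41
```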

  20. Analysis of composite wing structures with a morphing leading edge

    OpenAIRE

    Morishima, Ryoko

    2011-01-01

    One of the main challenges for the civil aviation industry is the reduction of its environmental impact. Over the past years, improvements in performance efficiency have been achieved by simplifying the design of the structural components and using composite materials to reduce the overall weight. These approaches however, are not sufficient to meet the current demanding requirements set for a 'greener' aircraft. Significant changes in drag reduction and fuel consumption can be obtained by...

  1. Small-molecule inhibitor leads of ribosome-inactivating proteins developed using the doorstop approach.

    Directory of Open Access Journals (Sweden)

    Yuan-Ping Pang

    2011-03-01

    Ribosome-inactivating proteins (RIPs) are toxic because they bind to 28S rRNA and depurinate a specific adenine residue from the α-sarcin/ricin loop (SRL), thereby inhibiting protein synthesis. Shiga-like toxins (Stx1 and Stx2), produced by Escherichia coli, are RIPs that cause outbreaks of foodborne diseases with significant morbidity and mortality. Ricin, produced by the castor bean plant, is another RIP lethal to mammals. Currently, no US Food and Drug Administration-approved vaccines or therapeutics exist to protect against ricin, Shiga-like toxins, or other RIPs. Development of effective small-molecule RIP inhibitors as therapeutics is challenging because strong electrostatic interactions at the RIP•SRL interface make drug-like molecules ineffective in competing with the rRNA for binding to RIPs. Herein, we report small molecules that show up to 20% cell protection against ricin or Stx2 at a drug concentration of 300 nM. These molecules were discovered using the doorstop approach, a new approach to protein•polynucleotide inhibitors that identifies small molecules acting as doorstops to prevent an active-site residue of an RIP (e.g., Tyr80 of ricin or Tyr77 of Stx2) from adopting an active conformation, thereby blocking the function of the protein, rather than as contenders in the competition for binding to the RIP. This work offers promising leads for developing RIP therapeutics. The results suggest that the doorstop approach might also be applicable in the development of other protein•polynucleotide inhibitors as antiviral agents, such as inhibitors of the Z-DNA binding proteins in poxviruses. This work also calls for careful chemical and biological characterization of drug leads obtained from chemical screens to avoid the identification of irrelevant chemical structures and to avoid the interference caused by direct interactions between the chemicals being screened and the luciferase reporter used in screening assays.

  2. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation...... is performed by pre-concentrating zinc on glassy carbon at -1400 mV(SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found...
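    The linear relationship between stripping peak area and concentration is what makes quantitation possible: an unknown concentration is read off a least-squares calibration line. A minimal sketch with invented calibration points (the actual PSA response values are not given in the abstract):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical calibration: derivative stripping peak area vs Zn2+ (ppm).
conc = [0.5, 1.0, 2.0, 4.0]
area = [1.1, 2.1, 4.1, 8.1]   # assumed linear response with a small offset
slope, intercept = linear_fit(conc, area)

# Read an unknown sample's concentration off the calibration line.
unknown_area = 5.1
print(round((unknown_area - intercept) / slope, 2))  # 2.5
```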

  3. An Inverse Modeling Approach to Investigate Past Lead Atmospheric Deposition in Southern Greenland

    Science.gov (United States)

    Massa, C.; Monna, F.; Bichet, V.; Gauthier, E.; Richard, H.

    2013-12-01

    The aim of this study is to model atmospheric pollution lead fluxes using two different paleoenvironmental records, covering the last 2000 years, located in southern Greenland. Fifty-five sediment samples from the Lake Igaliku sequence (61°00.403'N, 45°26.494'W) were analyzed for their Pb and Al contents and for lead isotopic compositions. The second archive consists of a previously published dataset (Shotyk et al., 2003), including Zr and Pb concentrations and lead isotopic compositions, obtained from a minerogenic peat deposit located 16 km northwest of Lake Igaliku (61°08.314'N, 45°33.703'W). As natural background concentrations are high and obliterate most of the airborne anthropogenic lead, it is not possible to isolate this anthropogenic contribution through time with classical methods (i.e. Pb normalized to a lithogenic and conservative element). Moreover, the background 206Pb/207Pb ratio is rather noisy because of the wide geological heterogeneity of sediment sources, which further complicated unambiguous detection of the lead pollution. To overcome these difficulties, an inverse modeling approach based on assumptions about past lead inputs was applied. This method consists of simulating a range of anthropogenic fluxes to determine the best match between measured and simulated data, both for Pb concentrations and isotopic compositions. The model is validated by the coherence of the results obtained from the two independent datasets, which must reflect a similar pollution history. Although notable 206Pb/207Pb ratio shifts suggest that the first signs of anthropogenic inputs may have occurred in the 15th century, the signal-to-noise ratio was too low to significantly influence the sediment composition. Nevertheless we were able to estimate that anthropogenic lead fluxes did not exceed 2700 μg m-2 yr-1, a maximum value recorded during the 1960s. The comparison with other records from the North Atlantic Islands reveals a spatial gradient most likely due
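    The inverse step (simulate a range of anthropogenic fluxes and keep the one that best matches the measurements) can be sketched as a grid search over a toy forward model. Everything here is an assumption for illustration: the real model also matches isotopic compositions, and the numbers are invented.

```python
def simulate(flux, background, signal):
    """Toy forward model: measured Pb = natural background plus a
    flux-scaled anthropogenic input history (assumed shape `signal`)."""
    return [background + flux * s for s in signal]

def best_flux(measured, background, signal, candidates):
    """Grid-search the candidate flux minimising the squared misfit
    between simulated and measured profiles."""
    def sse(flux):
        sim = simulate(flux, background, signal)
        return sum((m - p) ** 2 for m, p in zip(measured, sim))
    return min(candidates, key=sse)

signal = [0.0, 0.2, 0.5, 1.0, 0.7]         # assumed pollution-history shape
measured = simulate(1300.0, 40.0, signal)  # synthetic "observed" profile
candidates = [f * 100.0 for f in range(31)]  # 0..3000 ug m-2 yr-1 grid
print(best_flux(measured, 40.0, signal, candidates))  # 1300.0
```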

  4. Approaches to Sensitivity Analysis in MOLP

    OpenAIRE

    Sebastian Sitarz

    2014-01-01

    The paper presents two approaches to the sensitivity analysis in multi-objective linear programming (MOLP). The first one is the tolerance approach and the other one is the standard sensitivity analysis. We consider the perturbation of the objective function coefficients. In the tolerance method we simultaneously change all of the objective function coefficients. In the standard sensitivity analysis we change one objective function coefficient without changing the others. In the numerical exa...

  5. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  6. Medial parapatellar approach leads to internal rotation of tibial component in total knee arthroplasty.

    Science.gov (United States)

    Schiapparelli, Filippo-Franco; Amsler, Felix; Hirschmann, Michael T

    2017-05-30

    The purpose of this study was to investigate whether the type of approach [medial parapatellar approach (MPA) versus lateral parapatellar approach with tibial tubercle osteotomy (LPA)] influences rotation of the femoral and/or tibial component and leg axis in total knee arthroplasty (TKA). It was the hypothesis that the MPA leads to an internally rotated tibial TKA component. This study included 200 consecutive patients in whom TKA was performed using either a parapatellar medial (n = 162, MPA) or parapatellar lateral approach with tibial tubercle osteotomy (n = 38, LPA). All patients underwent clinical follow-up, standardized radiographs and computed tomography (CT). TKA component position and the whole leg axis were assessed on 3D reconstructed CT scans (sagittal, coronal and rotational). Mean values of TKA component position and whole leg alignment of both groups were compared using a t test. The tibial component was graded as internally rotated, in neutral rotation, or externally rotated (>6° ER). The femoral component was graded as internally rotated [>3° of internal rotation (IR)], neutral rotation (equal to or between -3° IR and 3° ER) and externally rotated (>3° ER). There was no significant difference in terms of whole leg axis after TKA between both groups (MPA: 0.2° valgus ± 3.4; LPA: 0.0° valgus ± 3.5). Mean tibial component rotation was 2.7° ER ± 6.1 (MPA) and 7.6° ER ± 5.4 (LPA). Patients in group LPA presented a significantly less internally rotated (LPA: 18.4%; MPA: 48.8%) and more externally rotated (LPA: 52.6%; MPA: 22.8%) tibial component. The type of approach (medial versus lateral) significantly influenced tibial TKA component rotation. It appears that a MPA tends to internally rotate the tibial TKA component and a LPA tends to externally rotate it. The anterior cortex should not be used as a landmark for tibial TKA component placement when using the lateral approach with tibial tubercle osteotomy. Retrospective comparative study, Level III.
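    The rotation grading used for the femoral component (internal if more than 3° IR, external if more than 3° ER, neutral otherwise) reduces to a simple threshold function. A sketch, with the sign convention (ER positive, IR negative) assumed, since the abstract does not state it:

```python
def grade_femoral_rotation(angle_deg):
    """Grade femoral component rotation per the abstract's thresholds:
    >3 deg internal rotation -> 'internal', >3 deg external -> 'external',
    otherwise 'neutral'. Convention assumed here: ER positive, IR negative."""
    if angle_deg < -3:
        return "internal"
    if angle_deg > 3:
        return "external"
    return "neutral"

print(grade_femoral_rotation(-4.0))  # internal
print(grade_femoral_rotation(1.5))   # neutral
print(grade_femoral_rotation(5.0))   # external
```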

  7. Classifying Enterprise Architecture Analysis Approaches

    Science.gov (United States)

    Buckl, Sabine; Matthes, Florian; Schweda, Christian M.

    Enterprise architecture (EA) management forms a commonly accepted means to enhance the alignment of business and IT, and to support the managed evolution of the enterprise. One major challenge of EA management is to provide decision support by analyzing as-is states of the architecture as well as assessing planned future states. Thus, different kinds of analysis regarding the EA exist, each relying on certain conditions and demands for models, methods, and techniques.

  8. A hybrid approach for global sensitivity analysis

    International Nuclear Information System (INIS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2017-01-01

    Distribution based sensitivity analysis (DSA) computes sensitivity of the input random variables with respect to the change in distribution of output response. Although DSA is widely appreciated as the best tool for sensitivity analysis, the computational issue associated with this method prohibits its use for complex structures involving costly finite element analysis. For addressing this issue, this paper presents a method that couples polynomial correlated function expansion (PCFE) with DSA. PCFE is a fully equivalent operational model which integrates the concepts of analysis of variance decomposition, extended bases and homotopy algorithm. By integrating PCFE into DSA, it is possible to considerably alleviate the computational burden. Three examples are presented to demonstrate the performance of the proposed approach for sensitivity analysis. For all the problems, proposed approach yields excellent results with significantly reduced computational effort. The results obtained, to some extent, indicate that proposed approach can be utilized for sensitivity analysis of large scale structures. - Highlights: • A hybrid approach for global sensitivity analysis is proposed. • Proposed approach integrates PCFE within distribution based sensitivity analysis. • Proposed approach is highly efficient.

  9. Lead identification for the K-Ras protein: virtual screening and combinatorial fragment-based approaches

    Directory of Open Access Journals (Sweden)

    Pathan AAK

    2016-05-01

    Akbar Ali Khan Pathan,1,2,* Bhavana Panthi,3,* Zahid Khan,1 Purushotham Reddy Koppula,4–6 Mohammed Saud Alanazi,1 Sachchidanand,3 Narasimha Reddy Parine,1 Mukesh Chourasia3,* 1Genome Research Chair (GRC), Department of Biochemistry, College of Science, King Saud University, 2Integrated Gulf Biosystems, Riyadh, Kingdom of Saudi Arabia; 3Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, Hajipur, India; 4Department of Internal Medicine, School of Medicine, 5Harry S. Truman Memorial Veterans Affairs Hospital, 6Department of Radiology, School of Medicine, Columbia, MO, USA *These authors contributed equally to this work Objective: Kirsten rat sarcoma (K-Ras) protein is a member of the Ras family, which belongs to the superfamily of small guanosine triphosphatases. The members of this family share a conserved structure and biochemical properties, acting as binary molecular switches. The guanosine triphosphate-bound active K-Ras interacts with a range of effectors, resulting in the stimulation of downstream signaling pathways regulating cell proliferation, differentiation, and apoptosis. Efforts to target K-Ras have been unsuccessful until now, placing it among high-value molecules against which developing a therapy would have an enormous impact. K-Ras transduces signals when it binds to guanosine triphosphate by directly binding to downstream effector proteins, but in the guanosine diphosphate-bound conformation these interactions are disrupted. Methods: In the present study, we targeted the nucleotide-binding site in the “on” and “off” state conformations of the K-Ras protein to find suitable lead compounds. A structure-based virtual screening approach has been used to screen compounds from different databases, followed by a combinatorial fragment-based approach to design the apposite lead for the K-Ras protein. Results: Interestingly, the designed compounds exhibit a binding preference for the

  10. Publication Trends in Thanatology: An Analysis of Leading Journals.

    Science.gov (United States)

    Wittkowski, Joachim; Doka, Kenneth J; Neimeyer, Robert A; Vallerga, Michael

    2015-01-01

    To identify important trends in thanatology as a discipline, the authors analyzed over 1,500 articles that appeared in Death Studies and Omega over a 20-year period, coding the category of articles (e.g., theory, application, empirical research), their content focus (e.g., bereavement, death attitudes, end-of-life), and for empirical studies, their methodology (e.g., quantitative, qualitative). In general, empirical research predominates in both journals, with quantitative methods outnumbering qualitative procedures 2 to 1 across the period studied, despite an uptick in the latter methods in recent years. Purely theoretical articles, in contrast, decline in frequency. Research on grief and bereavement is the most commonly occurring (and increasing) content focus of this work, with a declining but still substantial body of basic research addressing death attitudes. Suicidology is also well represented in the corpus of articles analyzed. In contrast, publications on topics such as death education, medical ethics, and end-of-life issues occur with lower frequency, in the latter instances likely due to the submission of such work to more specialized medical journals. Differences in emphasis of Death Studies and Omega are noted, and the analysis of publication patterns is interpreted with respect to overall trends in the discipline and the culture, yielding a broad depiction of the field and some predictions regarding its possible future.

  11. Tracing fetal and childhood exposure to lead using isotope analysis of deciduous teeth

    International Nuclear Information System (INIS)

    Shepherd, Thomas J.; Dirks, Wendy; Roberts, Nick M.W.; Patel, Jaiminkumar G.; Hodgson, Susan; Pless-Mulloli, Tanja; Walton, Pamela; Parrish, Randall R.

    2016-01-01

    study confirms that laser ablation Pb isotope analysis of deciduous teeth, when carried out in conjunction with histological analysis, permits a reconstruction of the timing, duration and source of exposure to Pb during early childhood. With further development, this approach has the potential to study larger cohorts and appraise environments where the levels of exposure to Pb are much higher. - Highlights: • Reconstructing a high resolution chronology of early childhood exposure to lead. • Combined laser ablation lead isotope – histological analysis of children's teeth. • Using dentine to recover information on the intensity, duration and source of lead. • Importance of industrial airborne lead pollution in a post-leaded petrol era.

  12. Toxicological analysis of the risk of lead exposure in metal processing

    African Journals Online (AJOL)

    Purpose: To evaluate toxicological risks for workers who are exposed to lead in their work environment. Methods: Since it is an important indicator of toxicological risk, a statistical analysis of lead concentration and biological lead toxicity markers in blood and urine were performed for both exposed and control groups.

  13. Comprehensive analysis of 5-aminolevulinic acid dehydrogenase (ALAD) variants and renal cell carcinoma risk among individuals exposed to lead.

    Science.gov (United States)

    van Bemmel, Dana M; Boffetta, Paolo; Liao, Linda M; Berndt, Sonja I; Menashe, Idan; Yeager, Meredith; Chanock, Stephen; Karami, Sara; Zaridze, David; Matteev, Vsevolod; Janout, Vladimir; Kollarova, Hellena; Bencko, Vladimir; Navratilova, Marie; Szeszenia-Dabrowska, Neonilia; Mates, Dana; Slamova, Alena; Rothman, Nathaniel; Han, Summer S; Rosenberg, Philip S; Brennan, Paul; Chow, Wong-Ho; Moore, Lee E

    2011-01-01

    Epidemiologic studies are reporting associations between lead exposure and human cancers. A polymorphism in the 5-aminolevulinic acid dehydratase (ALAD) gene affects lead toxicokinetics and may modify the adverse effects of lead. The objective of this study was to evaluate single-nucleotide polymorphisms (SNPs) tagging the ALAD region among renal cancer cases and controls to determine whether genetic variation alters the relationship between lead and renal cancer. Occupational exposure to lead and risk of cancer was examined in a case-control study of renal cell carcinoma (RCC). Variation across the ALAD gene was comprehensively assessed using a tagging SNP approach among 987 cases and 1298 controls. Occupational lead exposure was estimated using questionnaire-based exposure assessment and expert review. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using logistic regression. The adjusted risk associated with the ALAD variant rs8177796(CT/TT) was increased (OR = 1.35, 95%CI = 1.05-1.73, p-value = 0.02) when compared to the major allele, regardless of lead exposure. Joint effects of lead and ALAD rs2761016 suggest an increased RCC risk for the homozygous wild-type and heterozygous alleles ((GG)OR = 2.68, 95%CI = 1.17-6.12, p = 0.01; (GA)OR = 1.79, 95%CI = 1.06-3.04), with an interaction approaching significance (p(int) = 0.06). No significant modification in RCC risk was observed for the functional variant rs1800435(K68N). Haplotype analysis identified a region associated with risk, supporting the tagging SNP results. A common genetic variation in ALAD may alter the risk of RCC overall, and among individuals occupationally exposed to lead. Further work in larger exposed populations is warranted to determine if ALAD modifies RCC risk associated with lead exposure.
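    The odds ratios and Wald confidence intervals reported above can be illustrated with the textbook 2x2-table formulas. The counts below are invented for the worked example; the study itself used logistic regression with covariate adjustment, not a raw 2x2 table:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/80 exposed cases/controls, 10/90 unexposed.
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.25 0.99 5.09
```

    A CI that crosses 1 (as here) corresponds to a non-significant association, the situation described for the interaction term above.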

  14. Comprehensive analysis of 5-aminolevulinic acid dehydrogenase (ALAD) variants and renal cell carcinoma risk among individuals exposed to lead.

    Directory of Open Access Journals (Sweden)

    Dana M van Bemmel

    Full Text Available BACKGROUND: Epidemiologic studies are reporting associations between lead exposure and human cancers. A polymorphism in the 5-aminolevulinic acid dehydratase (ALAD) gene affects lead toxicokinetics and may modify the adverse effects of lead. METHODS: The objective of this study was to evaluate single-nucleotide polymorphisms (SNPs) tagging the ALAD region among renal cancer cases and controls to determine whether genetic variation alters the relationship between lead and renal cancer. Occupational exposure to lead and risk of cancer was examined in a case-control study of renal cell carcinoma (RCC). Comprehensive analysis of variation across the ALAD gene was assessed using a tagging SNP approach among 987 cases and 1298 controls. Occupational lead exposure was estimated using questionnaire-based exposure assessment and expert review. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using logistic regression. RESULTS: The adjusted risk associated with the ALAD variant rs8177796(CT/TT) was increased (OR = 1.35, 95%CI = 1.05-1.73, p-value = 0.02) when compared to the major allele, regardless of lead exposure. Joint effects of lead and ALAD rs2761016 suggest an increased RCC risk for the homozygous wild-type and heterozygous alleles ((GG)OR = 2.68, 95%CI = 1.17-6.12, p = 0.01; (GA)OR = 1.79, 95%CI = 1.06-3.04) with an interaction approaching significance (p(int) = 0.06). No significant modification in RCC risk was observed for the functional variant rs1800435(K68N). Haplotype analysis identified a region associated with risk supporting tagging SNP results. CONCLUSION: A common genetic variation in ALAD may alter the risk of RCC overall, and among individuals occupationally exposed to lead. Further work in larger exposed populations is warranted to determine if ALAD modifies RCC risk associated with lead exposure.

  15. Adaptation and implementation of the TRACE code for transient analysis of lead-cooled fast reactor designs

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2015-01-01

    The Lead-Cooled Fast Reactor (LFR) has been identified as one of the promising future reactor concepts in the technology roadmap of the Generation IV International Forum (GIF), as well as in the Deployment Strategy of the European Sustainable Nuclear Industrial Initiative (ESNII), both aiming at improved sustainability, enhanced safety, economic competitiveness, and proliferation resistance. This new nuclear reactor concept requires the development of computational tools to be applied in design and safety assessments to confirm the improved inherent and passive safety features of this design. One approach to this issue is to modify current computational codes developed for the simulation of Light Water Reactors so that they become applicable to the new designs. This paper reports on the modifications performed on the TRACE system code to make it applicable to LFR safety assessments. The capabilities of the modified code are demonstrated on a series of benchmark exercises performed against other safety analysis codes. (Author)

  16. Drag Coefficient of Water Droplets Approaching the Leading Edge of an Airfoil

    Science.gov (United States)

    Vargas, Mario; Sor, Suthyvann; Magarino, Adelaida Garcia

    2013-01-01

    This work presents results of an experimental study on droplet deformation and breakup near the leading edge of an airfoil. The experiment was conducted in the rotating rig test cell at the Instituto Nacional de Tecnica Aeroespacial (INTA) in Madrid, Spain. An airfoil model was placed at the end of the rotating arm and a monosize droplet generator produced droplets that fell from above, perpendicular to the path of the airfoil. The interaction between the droplets and the airfoil was captured with high speed imaging and allowed observation of droplet deformation and breakup as the droplet approached the airfoil near the stagnation line. Image processing software was used to measure the position of the droplet centroid, equivalent diameter, perimeter, area, and the major and minor axes of an ellipse superimposed over the deforming droplet. The horizontal and vertical displacement of each droplet against time was also measured, and the velocity, acceleration, Weber number, Bond number, Reynolds number, and the drag coefficients were calculated along the path of the droplet to the beginning of breakup. Results are presented and discussed for drag coefficients of droplets with diameters in the range of 300 to 1800 micrometers, and airfoil velocities of 50, 70 and 90 meters/second. The effect of droplet oscillation on the drag coefficient is discussed.

  17. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus) and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidence that they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot poses a substantially greater lead poisoning risk compared to embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.
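
The key inference step above, declaring isotope ratios "measurably indistinguishable", can be sketched as an agreement test within combined measurement uncertainty. The ratios, uncertainties, and acceptance rule below are hypothetical placeholders, not the measured condor data:

```python
def indistinguishable(r1, s1, r2, s2, k=2.0):
    """Treat two isotope-ratio measurements as indistinguishable when they
    agree within k combined standard uncertainties (assumed decision rule)."""
    return abs(r1 - r2) <= k * (s1 ** 2 + s2 ** 2) ** 0.5

# Hypothetical 207Pb/206Pb ratios for embedded shot from three birds
shots = [(0.8221, 0.0004), (0.8224, 0.0004), (0.8219, 0.0004)]
common_source_plausible = all(
    indistinguishable(*shots[i], *shots[j])
    for i in range(len(shots))
    for j in range(i + 1, len(shots))
)
print("common source plausible:", common_source_plausible)
```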

  18. Introduction to Real Analysis An Educational Approach

    CERN Document Server

    Bauldry, William C

    2011-01-01

    An accessible introduction to real analysis and its connection to elementary calculus Bridging the gap between the development and history of real analysis, Introduction to Real Analysis: An Educational Approach presents a comprehensive introduction to real analysis while also offering a survey of the field. With its balance of historical background, key calculus methods, and hands-on applications, this book provides readers with a solid foundation and fundamental understanding of real analysis. The book begins with an outline of basic calculus, including a close examination of problems illust

  19. Can pluralistic approaches based upon unknown languages enhance learner engagement and lead to active social inclusion?

    Science.gov (United States)

    Dahm, Rebecca

    2017-08-01

    One way to foster active social inclusion is to enable students to develop a positive attitude to "foreignness". Creating a situation where mainstream students are less wary of foreign languages and cultures, and where newcomers feel their linguistic background is being valued, provides favourable conditions for the inclusion of these newcomers in the classroom and in society. However, language classrooms in French schools rarely take any previously acquired linguistic knowledge into account, thus unconsciously contributing to the rift between multilingual learners (e.g. 1st- and 2nd-generation immigrant children, refugees, children of parents with different mother tongues) and French learners. Native French learners' first experience of learning another language is usually when English is added as a subject to their curriculum in primary school. In some schools in France, English lessons now include the simulation of multilingual situations, designed in particular for the French "quasi-monolingual" students to lose their fear of unknown languages and "foreignness" in general. But the overall aim is to help both groups of learners become aware of the positive impact of multilingualism on cognitive abilities. However, to achieve long-term effects, this awareness-raising needs to be accompanied by maximum engagement on the part of the students. This article explores an instructional strategy termed Pluralistic Approaches based upon Unknown Languages (PAUL), which was designed to develop learning strategies of quasi-monolingual students in particular and to increase learner engagement more generally. The results of a small-scale PAUL study discussed by the author seem to confirm an increase in learner engagement leading to an enhancement of learning outcomes. Moreover, PAUL seems indeed suitable for helping to prepare the ground for social inclusion.

  20. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent, classical as well as robust, methods of estimation are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)
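
The kind of parameterised profile fit with confidence intervals described above can be sketched with a simple polynomial model on synthetic data; the paper's actual parameterisations and robust estimators are more elaborate than this least-squares toy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "profile": values on a normalised radius grid with noise
r = np.linspace(0, 1, 50)
true = (1 - r**2) ** 1.5                 # a typical peaked profile shape
y = true + rng.normal(0, 0.02, r.size)

# Parameterise with a low-order polynomial and keep the parameter covariance
deg = 4
coef, cov = np.polyfit(r, y, deg, cov=True)
fit = np.polyval(coef, r)

# 1-sigma confidence band for the fitted profile via error propagation
V = np.vander(r, deg + 1)                # design matrix matching polyfit order
band = np.sqrt(np.einsum("ij,jk,ik->i", V, cov, V))

print(f"max |fit - true| = {np.max(np.abs(fit - true)):.3f}")
print(f"median 1-sigma band = {np.median(band):.3f}")
```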

  1. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack...

  2. Lead substances selection using GHS approach for the classification of mixtures: Case study of painting in the work environment.

    Science.gov (United States)

    Kaneko, Kazuhiro; Ishii, Satoko; Hosohara, Sachio; Hirata, Tsuyoshi; Masuda, Motoshi; Murasawa, Kaori; Yamada, Airi; Tadokoro, Takaaki; Hanzawa, Masahiko

    2017-08-01

    We developed a lead substances selection approach based on the concept of mixture classification of UN GHS for the purpose of efficient risk assessment of mixtures consisting of multiple components. Lead substances selection methods are being actively developed in Europe, but these methods are predicated on the regulations and information sources available within Europe and are therefore not readily applicable to countries outside Europe. In this study, the features of the GHS-based approach and the risk assessment results for outdoor painting work as a specific utilization example of the GHS-based approach were described. Comparison with the DPD+ method and the CCA method proposed in Europe revealed that the GHS-based approach resulted in the selection of the safest lead substances. The GHS method, like the DPD+ method, is a classification-based approach. We believe that a classification-based approach based on the GHS method can be an appropriate tool to efficiently implement risk assessment of mixtures for countries outside Europe. Some tools for business operators to conduct the management of chemicals using the GHS classification have been established in Japan. We plan to propose the GHS-based approach as a standardized assessment tool. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  3. Introduction to audio analysis a MATLAB approach

    CERN Document Server

    Giannakopoulos, Theodoros

    2014-01-01

    Introduction to Audio Analysis serves as a standalone introduction to audio analysis, providing theoretical background to many state-of-the-art techniques. It covers the essential theory necessary to develop audio engineering applications, but also uses programming techniques, notably MATLAB®, to take a more applied approach to the topic. Basic theory and reproducible experiments are combined to demonstrate theoretical concepts from a practical point of view and provide a solid foundation in the field of audio analysis. Audio feature extraction, audio classification, audio segmentation, au

  4. Comparing Machine Learning and Decision Making Approaches to Forecast Long Lead Monthly Rainfall: The City of Vancouver, Canada

    Directory of Open Access Journals (Sweden)

    Zahra Zahmatkesh

    2018-01-01

    Full Text Available Estimating maximum possible rainfall is of great value for flood prediction and protection, particularly for regions, such as Canada, where urban and fluvial floods from extreme rainfalls have been known to be a major concern. In this study, a methodology is proposed to forecast real-time rainfall (with one month lead time) using different numbers of spatial inputs with different orders of lags. For this purpose, two types of models are used. The first one is a machine learning data-driven model, which uses a set of hydrologic variables as inputs, and the second one is an empirical-statistical model that employs a multi-criteria decision analysis method for rainfall forecasting. The data-driven model is built based on Artificial Neural Networks (ANNs), while the developed multi-criteria decision analysis model uses the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) approach. A comprehensive set of spatially varying climate variables, including geopotential height, sea surface temperature, sea level pressure, humidity, temperature and pressure with different orders of lags, is collected to form input vectors for the forecast models. Then, a feature selection method is employed to identify the most appropriate predictors. Two sets of results from the developed models, i.e., maximum daily rainfall in each month (RMAX) and cumulative value of rainfall for each month (RCU), are considered as the target variables for forecast purposes. The results from both modeling approaches are compared using a number of evaluation criteria such as the Nash-Sutcliffe Efficiency (NSE). The proposed models are applied for rainfall forecasting for a coastal area in Western Canada: Vancouver, British Columbia. Results indicate that although data-driven models such as ANNs work well for simulation purposes, the developed TOPSIS model considerably outperforms ANNs for rainfall forecasting. ANNs show acceptable simulation performance during the
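
The TOPSIS step of the empirical-statistical model can be sketched as follows. The decision matrix, weights, and benefit/cost labels here are hypothetical, not the study's predictors:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.
    matrix:  (n_alternatives, n_criteria) decision matrix
    weights: criterion weights
    benefit: True where larger is better, False where smaller is better"""
    M = np.asarray(matrix, dtype=float)
    # Vector-normalise each criterion column, then apply the weights
    V = M / np.linalg.norm(M, axis=0) * np.asarray(weights)
    # Ideal (best) and anti-ideal (worst) points per criterion
    best = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)   # closeness: higher = better

# Hypothetical criterion readings for three candidate alternatives
scores = topsis(
    [[0.7, 30.0, 5.0],
     [0.9, 22.0, 9.0],
     [0.6, 35.0, 4.0]],
    weights=[0.5, 0.3, 0.2],
    benefit=[True, False, True],
)
print(scores.round(3), "best:", int(scores.argmax()))
```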

  5. Blood, urine, and hair kinetic analysis following an acute lead intoxication

    OpenAIRE

    KEUTGENS, Aurore; HO, Giang; SCHOOFS, Roland; KOTOLENKO, Svelana; DENOOZ, Raphael; CHARLIER, Corinne

    2011-01-01

    A case of lead exposure resulting from the accidental ingestion of a lead-containing solution is reported. Because of clinical management rapidly performed through chelation therapy by 2,3-dimercaptopropane sulfonate sodium and meso-2,3-dimercaptosuccinic acid, blood lead levels of this 51-year-old patient were moderate (412.9 μg/L) and no clinical symptoms were observed. Numerous blood and urine samples were collected for kinetic analysis of lead elimination. However, we report the first cas...

  6. A new analytical approach to understanding nanoscale lead-iron interactions in drinking water distribution systems.

    Science.gov (United States)

    Trueman, Benjamin F; Gagnon, Graham A

    2016-07-05

    High levels of iron in distributed drinking water often accompany elevated lead release from lead service lines and other plumbing. Lead-iron interactions in drinking water distribution systems are hypothesized to be the result of adsorption and transport of lead by iron oxide particles. This mechanism was explored using point-of-use drinking water samples characterized by size exclusion chromatography with UV and multi-element (ICP-MS) detection. In separations on two different stationary phases, high apparent molecular weight (>669 kDa) elution profiles for (56)Fe and (208)Pb were strongly correlated (average R(2)=0.96, N=73 samples representing 23 single-unit residences). Moreover, (56)Fe and (208)Pb peak areas exhibited an apparent linear dependence (R(2)=0.82), consistent with mobilization of lead via adsorption to colloidal particles rich in iron. A UV254 absorbance peak, coincident with high molecular weight (56)Fe and (208)Pb, implied that natural organic matter was interacting with the hypothesized colloidal species. High molecular weight UV254 peak areas were correlated with both (56)Fe and (208)Pb peak areas (R(2)=0.87 and 0.58, respectively). On average, 45% (std. dev. 10%) of total lead occurred in the size range 0.05-0.45 μm. Copyright © 2016 Elsevier B.V. All rights reserved.
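
The reported linear dependence between (56)Fe and (208)Pb peak areas is an ordinary least-squares fit summarised by R²; the computation can be sketched on synthetic peak-area data (illustrative numbers only, not the study's chromatograms):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical high-molecular-weight peak areas from 73 paired separations
fe = rng.uniform(1e4, 1e6, 73)                        # 56Fe peak areas
pb = 500 + 2e-3 * fe + rng.normal(0, 200, fe.size)    # 208Pb tracks Fe colloids

# Linear dependence of Pb on Fe, summarised by the coefficient of determination
slope, intercept = np.polyfit(fe, pb, 1)
pred = slope * fe + intercept
r2 = 1 - np.sum((pb - pred) ** 2) / np.sum((pb - pb.mean()) ** 2)
print(f"slope = {slope:.2e}, R^2 = {r2:.2f}")
```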

  7. Approach to uncertainty in risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  8. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and more recently in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
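
The correlation-coefficient-matrix idea can be sketched as: learn a baseline correlation matrix from normal traffic features, then flag windows whose correlation structure drifts from it. A minimal sketch with synthetic traffic features and an assumed threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

def corr_matrix(window):
    """Correlation coefficient matrix of traffic features (rows = samples)."""
    return np.corrcoef(window, rowvar=False)

# Baseline traffic: packet count and byte count move together
n = 200
packets = rng.normal(100, 10, n)
normal = np.column_stack([packets, packets * 500 + rng.normal(0, 500, n)])
baseline = corr_matrix(normal)

# Anomalous window: byte count decouples from packet count (e.g. a scan)
anomaly = np.column_stack([rng.normal(100, 10, n),
                           rng.normal(50000, 5000, n)])

def is_anomalous(window, threshold=0.5):
    """Flag a window when its correlation matrix drifts from the baseline."""
    return np.abs(corr_matrix(window) - baseline).max() > threshold

print(is_anomalous(normal), is_anomalous(anomaly))
```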

  9. The right side? under time pressure, approach motivation leads to right-oriented bias

    NARCIS (Netherlands)

    Roskes, Marieke; Sligte, Daniel; Shalvi, Shaul; De Dreu, Carsten K W

    2011-01-01

    Approach motivation, a focus on achieving positive outcomes, is related to relative left-hemispheric brain activation, which translates to a variety of right-oriented behavioral biases. In two studies, we found that approach-motivated individuals display a right-oriented bias, but only when they are

  10. A comparative analysis of the Brazilian and Norwegian Transfer Pricing Systems within the areas of thin capitalisation, interest and service regulations: Is there a difference in the approach to the arm's length principle under their domestic legislations that could lead to double taxation issues? If so, could this be solved by the Double Taxation Agreement between both countries?

    OpenAIRE

    Delgado, Fernanda Jose Cuadra

    2013-01-01

    As a whole, the current thesis presents a comparative analysis of the Norwegian and Brazilian TP domestic regulations, in the context of the Convention for the Avoidance of Double Taxation and the Prevention of Fiscal Evasion with Respect to Taxes on Income and Capital, ratified by both countries in 1981, and the OECD Transfer Pricing Guidelines. This will be achieved on the basis of the following structure: Part 1 will present an overview of the governmental approach to TP for Brazi...

  11. Environmental health risk assessment of ambient lead levels in Lisbon, Portugal: A full chain study approach

    DEFF Research Database (Denmark)

    Casimiro, E.; Philippe Ciffroy, P.; Serpa, P.

    2011-01-01

    The multi-causality interactions between environment and health are complex and call for an integrated multidisciplinary study approach. Emerging computational toxicology tools that link toxicology, chemistry, environmental sciences, biostatistics, and computer sciences are proving to be very use...

  12. Factors Leading to Success in Diversified Occupation: A Livelihood Analysis in India

    Science.gov (United States)

    Saha, Biswarup; Bahal, Ram

    2015-01-01

    Purpose: Livelihood diversification is a sound alternative for higher economic growth and its success or failure is conditioned by the interplay of a multitude of factors. The study of the profile of the farmers in which they operate is important to highlight the factors leading to success in diversified livelihoods. Design/Methodology/Approach: A…

  13. The Politics of Educational Policy Studies: A Preliminary Analysis of Leading Educational Policy Journal Publications

    Science.gov (United States)

    Hardy, Ian

    2009-01-01

    This paper argues that the content, analytical approaches and institutional affiliations of authors of articles published in the latest issues of two leading educational policy studies journals provide useful insights into the contested nature of educational policy studies. The paper draws upon a selection of articles published in 2007/08 issues…

  14. Discovering novel plant-derived drug leads for the treatment of HIV through an integrated approach

    CSIR Research Space (South Africa)

    Nthambeleni, R

    2010-09-01

    Full Text Available HIV/AIDS is now the leading cause of death in Sub-Saharan Africa and has moved up to fourth place among all causes of death worldwide. According to estimates from the UNAIDS 2009 report (UNAIDS 2009) on the global AIDS epidemic, around 33.4 million...

  15. An equivalent dipole analysis of PZT ceramics and lead-free piezoelectric single crystals

    Directory of Open Access Journals (Sweden)

    Andrew J. Bell

    2016-06-01

    Full Text Available The recently proposed Equivalent Dipole Model for describing the electromechanical properties of ionic solids in terms of 3 ions and 2 bonds has been applied to PZT ceramics and lead-free single crystal piezoelectric materials, providing analysis in terms of an effective ionic charge and the asymmetry of the interatomic force constants. For PZT it is shown that, as a function of composition across the morphotropic phase boundary, the dominant bond compliance peaks at 52% ZrO2. The stiffer of the two bonds shows little composition dependence with no anomaly at the phase boundary. The effective charge has a maximum value at 50% ZrO2, decreasing across the phase boundary region, but becoming constant in the rhombohedral phase. The single crystals confirm that both the asymmetry in the force constants and the magnitude of effective charge are equally important in determining the values of the piezoelectric charge coefficient and the electromechanical coupling coefficient. Both are apparently temperature dependent, increasing markedly on approaching the Curie temperature.

  16. NASA Armstrong's Approach to Store Separation Analysis

    Science.gov (United States)

    Acuff, Chris; Bui, Trong

    2015-01-01

    The presentation will give an overview of NASA Armstrong's store separation capabilities and how they have been applied recently. The objective of the presentation is to brief Generation Orbit and other potential partners on NASA Armstrong's store separation capabilities. It will include discussions on the use of NAVSEP and Cart3D, as well as some Python scripting work to perform the analysis, and a short overview of this methodology applied to the Towed Glider Air Launch System. Collaboration with potential customers in this area could lead to funding for the further development of a store separation capability at NASA Armstrong, which would boost the portfolio of engineering expertise at the center.

  17. Potentiometric stripping analysis of lead and cadmium leaching from dental prosthetic materials and teeth

    Directory of Open Access Journals (Sweden)

    GORAN M. NIKOLIC

    2004-07-01

    Full Text Available Potentiometric stripping analysis (PSA) was applied for the determination of lead and cadmium leaching from dental prosthetic materials and teeth. The soluble lead content in finished dental implants was found to be much lower than that of the individual components used for their preparation. Cadmium was not detected in dental implants and materials under the defined conditions. The soluble lead and cadmium content of teeth was slightly lower than the lead and cadmium content in whole teeth (w/w) reported by other researchers, except in the case of a tooth with a removed amalgam filling. The results of this work suggest that PSA may be a good method for lead and cadmium leaching studies for investigation of the biocompatibility of dental prosthetic materials.

  18. A factorization approach to next-to-leading-power threshold logarithms

    Energy Technology Data Exchange (ETDEWEB)

    Bonocore, D. [Nikhef,Science Park 105, NL-1098 XG Amsterdam (Netherlands); Laenen, E. [Nikhef,Science Park 105, NL-1098 XG Amsterdam (Netherlands); ITFA, University of Amsterdam,Science Park 904, Amsterdam (Netherlands); ITF, Utrecht University,Leuvenlaan 4, Utrecht (Netherlands); Magnea, L. [Dipartimento di Fisica, Università di Torino and INFN, Sezione di Torino,Via P. Giuria 1, I-10125, Torino (Italy); Melville, S. [School of Physics and Astronomy, University of Glasgow,Glasgow, G12 8QQ (United Kingdom); Vernazza, L. [Higgs Centre for Theoretical Physics, School of Physics and Astronomy, University of Edinburgh,Edinburgh, EH9 3JZ, Scotland (United Kingdom); White, C.D. [School of Physics and Astronomy, University of Glasgow,Glasgow, G12 8QQ (United Kingdom)

    2015-06-03

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in the threshold variable. In this paper, we consider the extension of this factorization to include effects suppressed by a single power of the threshold variable. Building upon the Low-Burnett-Kroll-Del Duca (LBKD) theorem, we propose a decomposition of radiative amplitudes into universal building blocks, which contain all effects ultimately responsible for next-to-leading-power (NLP) threshold logarithms in hadronic cross sections for electroweak annihilation processes. In particular, we provide a NLO evaluation of the radiative jet function, responsible for the interference of next-to-soft and collinear effects in these cross sections. As a test, using our expression for the amplitude, we reproduce all abelian-like NLP threshold logarithms in the NNLO Drell-Yan cross section, including the interplay of real and virtual emissions. Our results are a significant step towards developing a generally applicable resummation formalism for NLP threshold effects, and illustrate the breakdown of next-to-soft theorems for gauge theory amplitudes at loop level.
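
Schematically, near threshold (z → 1) a partonic cross section contains leading-power plus-distributions and next-to-leading-power ordinary logarithms; the generic structure, as commonly written in the resummation literature (not an equation quoted from this paper), is:

```latex
\frac{d\hat{\sigma}}{dz} \;\sim\; \sum_{n} \alpha_s^n
\left[
  \sum_{m=0}^{2n-1} c_{nm} \left( \frac{\ln^m(1-z)}{1-z} \right)_{+}
  \;+\; \sum_{m=0}^{2n-1} d_{nm} \ln^m(1-z)
  \;+\; \mathcal{O}\bigl((1-z)\bigr)
\right]
```

The first sum collects the leading-power terms handled by standard threshold resummation; the second contains the next-to-leading-power logarithms that the factorization approach above is designed to organise.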

  19. A noncombinatorial approach for efficient conjunction analysis

    Science.gov (United States)

    Mercurio, Michael

    Conjunction analysis is the study of possible collisions between objects in space, and is aimed at reducing the number of collisions between manmade objects and debris orbiting the Earth. Standard conjunction analysis requires computing the probability of collision between each and every pair of resident space objects, and is thus a combinatorial problem. Due to this computational burden, real-time conjunction analysis algorithms are infeasible. The main objective of this thesis is to automatically determine which objects should be chosen for a detailed analysis, significantly reducing the number of object pairs to be investigated. The heart of the approach lies in the efficient tree code algorithm. It has been found that these methods significantly reduce the computational cost to something more tractable, such as O(N log N), while obtaining results comparable to expensive brute-force methods. To account for probabilistic nearest neighbors, the Hellinger distance has been employed. Additionally, this research accounts for non-Gaussian uncertainties via Gaussian Mixture Models. The resulting probabilistic distance computation is effectively reduced to a linear programming problem. It has been found that the favorable computational efficiency of the tree-based approach is maintained, while the applicability of the proposed method is extended.
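
The pruning idea, avoiding the all-pairs comparison, can be illustrated with a one-axis sweep over sorted positions. This is a simplified stand-in for the thesis's tree code; the tree-based, Hellinger-distance, and Gaussian Mixture machinery is not reproduced here:

```python
import random

def candidate_pairs(objects, radius):
    """Sweep along one axis: objects further apart than `radius` on x can
    never be within `radius` in 3D, so most pairs are skipped outright.
    Sorting dominates, giving roughly O(N log N + k) behaviour for
    sparse clouds with k close pairs."""
    objs = sorted(objects, key=lambda p: p[0])
    pairs = []
    for i, a in enumerate(objs):
        for b in objs[i + 1:]:
            if b[0] - a[0] > radius:       # prune the rest of this sweep
                break
            if sum((x - y) ** 2 for x, y in zip(a, b)) <= radius ** 2:
                pairs.append((a, b))
    return pairs

random.seed(0)
cloud = [tuple(random.uniform(0, 1000) for _ in range(3)) for _ in range(2000)]
close = candidate_pairs(cloud, radius=20.0)
print(f"{len(close)} close pairs out of {2000 * 1999 // 2} possible")
```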

  20. Extractive waste management: A risk analysis approach.

    Science.gov (United States)

    Mehta, Neha; Dino, Giovanna Antonella; Ajmone-Marsan, Franco; Lasagna, Manuela; Romè, Chiara; De Luca, Domenico Antonio

    2018-05-01

    Abandoned mine sites continue to present serious environmental hazards because the heavy metals associated with extractive waste are continuously released into the environment, where they threaten human health and ecosystems. Remediating and securing extractive waste are complex, lengthy and costly processes. Thus, in most European countries, a site is considered for intervention when it poses a risk to human health and the surrounding environment. As a consequence, risk analysis presents a viable decisional approach towards the management of extractive waste. To evaluate the effects posed by extractive waste to human health and groundwater, a risk analysis approach was used for an abandoned nickel extraction site in Campello Monti in North Italy. This site is located in the Southern Italian Alps. The area consists of large and voluminous mafic rocks intruded by mantle peridotite. The mining activities in this area have generated extractive waste. A risk analysis of the site was performed using Risk Based Corrective Action (RBCA) guidelines, with the properties of the extractive waste and of the water taken as the properties of the environmental matrices. The results showed the presence of carcinogenic risk due to arsenic and risks to groundwater due to nickel. The results of the risk analysis form a basic understanding of the current situation at the site, which is affected by extractive waste. Copyright © 2017 Elsevier B.V. All rights reserved.
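
A hedged sketch of the RBCA-style screening arithmetic behind such a risk analysis. All input values below are illustrative assumptions, not data from the Campello Monti study: chronic daily intake (CDI) for incidental soil ingestion, multiplied by a slope factor to give a carcinogenic risk estimate.

```python
def chronic_daily_intake(c_soil, ingestion_rate, exp_freq, exp_dur,
                         body_weight, averaging_time):
    """CDI in mg/(kg*day): (C * IR * EF * ED) / (BW * AT),
    with IR converted from mg of soil to kg of soil."""
    return (c_soil * ingestion_rate * 1e-6 * exp_freq * exp_dur
            / (body_weight * averaging_time))

c_arsenic = 50.0        # mg/kg As in extractive waste (assumed)
ir = 100.0              # mg soil ingested per day
ef, ed = 250.0, 25.0    # days/year of exposure, years of exposure
bw = 70.0               # body weight, kg
at_carc = 70.0 * 365.0  # averaging time for carcinogens, days (lifetime)
slope_factor = 1.5      # oral slope factor for As, (mg/kg/day)^-1 (typical)

cdi = chronic_daily_intake(c_arsenic, ir, ef, ed, bw, at_carc)
cancer_risk = cdi * slope_factor
# A risk above the usual 1e-6 to 1e-5 screening range flags the site
# for intervention -- the decisional logic described in the abstract.
```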

  1. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases. Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual…

  2. [Biosorption of lead ions on dried waste beer yeast and the analysis by FTIR].

    Science.gov (United States)

    Dai, Qun-Wei; Dong, Fa-Qin; Zhang, Wei

    2009-07-01

    The biosorption of lead ions on dried waste beer yeast was investigated with respect to the adsorption conditions, and the biosorption mechanism was analyzed with AAS, SEM/EDS and FTIR instruments. The results show that the metal uptake value obtained was 47.6 mg x g(-1) and the adsorptive efficiency was above 90%. Under our experimental conditions, the biosorption of Pb2+ on dried waste beer yeast is a fast process. The biosorption quantity of Pb2+ on beer yeast cells was 47.6 mg x g(-1) and the adsorption efficiency was 91.6% in the first 30 min; by 90 min the metal uptake value had reached 48.8 mg x g(-1) and the adsorptive efficiency was above 94%. Cell cracking and breakage were seen after the biosorption of lead ions on beer yeast through SEM analysis, and the cytoplasts from yeast cells are likely responsible for the late-period biosorption of lead ions. EDS analysis also proved that lead ions were adsorbed on the yeast cells. FTIR analysis showed that the infrared spectrograms differ with pH and biosorption time; in particular, hydroxyl, carboxylate and amide groups changed markedly. Amylase and the amide groups of proteins are considered the main components participating in the chemical adsorption of lead ions on yeast cells. Consequently, dried waste beer yeast is an inexpensive, readily available adsorbent for metals and has an especially high adsorption capacity for lead ions.
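
The reported uptake and efficiency follow from a simple mass balance; a sketch in which the 30-min figures are back-calculated under assumed test conditions (the initial concentration and biosorbent dose below are our assumptions, not stated in the abstract).

```python
def uptake_mg_per_g(c0, ce, dose_g_per_l):
    """Biosorption uptake q = (C0 - Ce) * V / m, expressed per litre."""
    return (c0 - ce) / dose_g_per_l

def efficiency_pct(c0, ce):
    """Removal efficiency as a percentage of the initial concentration."""
    return 100.0 * (c0 - ce) / c0

# Assumed test conditions (illustrative only):
c0 = 100.0    # initial Pb2+ concentration, mg/L
ce = 8.4      # residual concentration after 30 min, mg/L
dose = 1.9244 # dried yeast dose, g/L

q = uptake_mg_per_g(c0, ce, dose)
eff = efficiency_pct(c0, ce)
# q ~ 47.6 mg/g and eff ~ 91.6 %, consistent with the reported 30-min values.
```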

  3. Practical Field Survey Approach with Handle Device for Lead Contamination Assessment in Kabwe, Zambia

    Science.gov (United States)

    Nakamura, S.; Hirose, K.; Takeda, T.; Uchida, Y.; Nakata, H.; Nakayama, S.; Ishizuka, M.; Yabe, J.; Ito, M.; Igarashi, T.

    2017-12-01

    An international joint research project for assessing lead soil contamination in Kabwe, Zambia was started by Zambian and Japanese scientists in 2008, and various scientific data and results have been obtained since then. Data sharing among researchers and government officials is necessary for a comprehensive understanding of the current lead contamination situation in Kabwe. As lead contamination seriously affects local communities, local community participation is important for solving the environmental issue in the near future. This study therefore aims to develop a GIS Data Integration System (GDIS), consisting of a GIS Data Sharing System (GDSS) as a web-GIS, FIELDNAUT, an Android app for field surveys, and an open-source SNS instance named Mastodon. GDIS will allow local communities to participate easily, and will support researchers in collecting and understanding day-to-day conditions and in visualizing the lead contamination status in Kabwe, Zambia. GDIS is developed with open-source programs. GDSS was simply designed and developed on one desktop PC (Hirose et al., 2015), although a common web-GIS requires many servers (Suresh et al., 2015). FIELDNAUT was developed in 2016. FIELDNAUT provides researchers with location plotting on satellite images and thematic maps, photo and text capture tagged with location, and data compilation and sharing through GDSS. Mastodon will be used as a new FIELDNAUT communication function between local communities and researchers; as an independent SNS instance, it allows a closed and secure communication system to be developed. With this function, local communities will share photos and texts about their daily lives and the situation around Kabwe via FIELDNAUT, and those data will be collected into GDSS. Researchers will provide their results as hazard maps to local communities through FIELDNAUT. GDIS, consisting of GDSS, FIELDNAUT and Mastodon, encourages local community participation and raises local communities' interest in their environmental issues.

  4. Ancillary Resistor Leads to Sparse Glitches: an Extra Approach to Avert Hacker Using Syndicate Browser Design

    OpenAIRE

    Pendlimarri, Devaki; Petlu, Paul Bharath Bhushan

    2012-01-01

    After the invention of the internet, most people all over the world have become fans of it because of its vast exploitation for information exchange, e-mail, e-commerce etc., which eases their daily lives. On the other side, perhaps equally or more, many people are also using it for the purpose of hacking the information being communicated, because the data/information communicated through the internet travels via unsecured networks. This gives breaches to the hac...

  5. Efficacy of ultrasound-guided axillary/subclavian venous approaches for pacemaker and defibrillator lead implantation: a randomized study.

    Science.gov (United States)

    Liccardo, Mattia; Nocerino, Pasquale; Gaia, Salzano; Ciardiello, Carmine

    2018-03-01

    Subclavian access is a reliable technique for lead insertion in pacemaker and defibrillator (ICD) implantation, but it is often accompanied by complications. The aim of this study was to compare the efficacy of the ultrasound-guided axillary approach with the subclavian method. This randomized comparative study was performed on 174 patients: as a first attempt, 116 patients underwent ultrasound-guided axillary access and 58 patients underwent the subclavian approach. A total of 364 leads were placed. Operators were trained in the ultrasound-guided vein access technique. Axillary access was successful in 69% of patients (32/46) in the training phase and, as a first attempt, in 91.4% of patients (106/116) in the randomized phase. When the axillary approach failed, we performed the following: subclavian access in 5.2% of patients (6/116), cephalic approach in 2.6% of patients (3/116), surgical method in 0.9% of patients (1/116). The subclavian technique was effective, as a first attempt, in 55 patients (94.8%). When subclavian access failed, the ultrasound-guided axillary approach was performed successfully in all three cases. During a mean follow-up of 18 ± 6 months, the number of lead complications was similar in the subclavian group compared to the axillary group (p = 0.664). As a first attempt, the ultrasound-guided axillary method showed a similarly high success rate to the subclavian approach and performed well when the first attempt in the subclavian group failed. Axillary access can be considered a safe and effective alternative to the conventional subclavian method for device implantation.

  6. The use of lead isotopic abundances in trace uranium samples for nuclear forensics analysis

    International Nuclear Information System (INIS)

    Fahey, A.J.; Ritchie, N.W.M.; Newbury, D.E.; Small, J.A.

    2010-01-01

    Secondary ion mass spectrometry (SIMS), scanning electron microscopy (SEM) and X-ray analysis have been applied to the measurement of U-bearing particles with the intent of gleaning information concerning their history and/or origin. The lead isotopic abundances are definitive indicators that U-bearing particles have come from an ore-body, even if they have undergone chemical processing. SEM images and X-ray analysis can add further information to the study that may point to the extent of chemical processing. The presence of 'common' lead that does not exhibit a radiogenic signature is clear evidence of anthropogenic origin. (author)

  7. Accelerated approach of discovering plant derived drug leads for treatment of TB

    CSIR Research Space (South Africa)

    Naidoo, D

    2010-06-01

    Full Text Available Re-discovery of known compounds and loss of activity in the course of the purification process is not uncommon, as the process may neglect interesting compounds with minor biological activity. These secondary active compounds, discarded during the traditional approach... with chemical and biological information, optimising the biological annotation of natural compounds contained within botanical extracts, preventing the repeated discovery of known compounds, thereby saving time and resources and accelerating structure elucidation...

  8. Interrupted time series analysis of children’s blood lead levels: A case study of lead hazard control program in Syracuse, New York

    Science.gov (United States)

    Shao, Liyang; Zhang, Lianjun; Zhen, Zhen

    2017-01-01

    Children’s blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected by local agencies reflect the local temporal trends of children’s blood lead levels (BLLs). However, analysis and modeling of long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children’s BLLs in the city of Syracuse, New York and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children’s BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive and moving average) models to measure the shift in average children’s blood lead level and to detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children’s BLLs in the city of Syracuse, NY. The average of children’s BLLs decreased significantly after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL from 1992 to 2011. The seasonal variation diminished over the past decade, though more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children’s blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children’s BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of residential lead abatement. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688
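
A minimal sketch of the interrupted-time-series idea on synthetic data. The authors fit ARMA models; here we use the simpler segmented-regression form of ITS (a level-shift dummy plus trend and seasonal terms), with numbers that only loosely echo the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 240                 # 20 years of monthly BLL surveillance (synthetic)
t = np.arange(n)
t0 = 120                # month the lead hazard control program starts

# Synthetic series: pre-intervention mean ~8.8 ug/dL, post ~3.9 ug/dL,
# plus a 12-month seasonal cycle and noise.
post = (t >= t0).astype(float)
y = (8.8 - 4.9 * post
     + 1.0 * np.sin(2 * np.pi * t / 12)
     + rng.normal(0, 0.5, n))

# Segmented regression:
# y = b0 + b1*t + b2*post + b3*(t - t0)*post + seasonal terms.
X = np.column_stack([
    np.ones(n), t, post, (t - t0) * post,
    np.sin(2 * np.pi * t / 12), np.cos(2 * np.pi * t / 12),
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_shift = beta[2]   # estimated immediate drop at the intervention
```

The fitted `level_shift` recovers the built-in drop of about −4.9 μg/dL.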

  9. The Environmental Burdens of Lead-Acid Batteries in China: Insights from an Integrated Material Flow Analysis and Life Cycle Assessment of Lead

    Directory of Open Access Journals (Sweden)

    Sha Chen

    2017-11-01

    Full Text Available Lead-acid batteries (LABs), a widely used energy storage equipment in cars and electric vehicles, are becoming a serious problem due to their high environmental impact. In this study, an integrated method, combining material flow analysis with life cycle assessment, was developed to analyze the environmental emissions and burdens of lead in LABs. The environmental burdens from other materials in LABs were not included. The results indicated that the amount of primary lead used in LABs accounted for 77% of the total lead production in 2014 in China. The amount of lead discharged into the environment was 8.54 × 10^5 tonnes, which was mainly from raw material extraction (57.2%). The largest environmental burden was from raw materials extraction and processing, which accounted for 81.7% of the total environmental burdens. The environmental burdens of the environmental toxicity potential, human toxicity potential-cancer, human toxicity potential-non-cancer, water footprint and land use accounted for more than 90% at this stage. Moreover, the environmental burdens from primary lead were much more serious than those from regenerated lead. On the basis of the results, practical measures and policies were proposed to reduce the lead emissions and environmental burdens of LABs in China, namely establishing an effective LAB recycling system, enlarging the market share of legal regenerated lead, regulating the production of regenerated lead, and avoiding long-distance transportation of waste LABs.

  10. A Fourier analysis approach for capillary polarimetry.

    Science.gov (United States)

    Markov, Dmitry A; Swinney, Kelly; Norville, Kristin; Lu, David; Bornhop, Darryl J

    2002-03-01

    A new method of fringe interrogation based on Fourier analysis was implemented and tested for a capillary polarimetry detector. It has significant advantages over the previously employed depth of modulation (DOM) approach, including speed and alignment insensitivity. The new and old methods were compared using a set of interference fringes typically used to facilitate nanoliter-volume polarimetric determinations. Polarimetric response was calculated with both methods over the range from 0 degrees to 180 degrees. The results were found to be in good agreement with Malus' Law and indicate that a fast Fourier transform (FFT) could be used for real-time capillary-scale polarimetry in a probe volume of 40 nL.
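
A sketch of why FFT fringe interrogation is fast and alignment-insensitive (our toy signal, not the detector's data): the fringe pattern's phase is read directly off the carrier-frequency bin of its Fourier transform.

```python
import numpy as np

n = 1024
x = np.arange(n)
carrier = 32          # fringe carrier frequency, cycles per window
true_phase = 0.7      # rad: the quantity a polarimetric shift changes

# Synthetic interference fringe: bias plus a cosine carrier with
# an unknown phase offset.
signal = 1.0 + 0.5 * np.cos(2 * np.pi * carrier * x / n + true_phase)

# Fourier interrogation: the phase of the carrier bin IS the fringe phase.
spectrum = np.fft.rfft(signal)
recovered = np.angle(spectrum[carrier])
```

For an integer number of fringe cycles the recovered phase matches the true one to machine precision; no fringe-by-fringe alignment is needed.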

  11. An Ethnografic Approach to Video Analysis

    DEFF Research Database (Denmark)

    Holck, Ulla

    2007-01-01

    After a short introduction to the ethnographic approach, the workshop participants will have a chance to try out the method, first through a common exercise and then applied to video recordings of music therapy with children with severe communicative limitations. Focus will be on patterns of interaction, followed by a discussion of their significance for the therapeutic interaction. Literature: Holck, U., Oldfield, A. and Plahl, C. (2005) Video Micro Analysis in Music Therapy Research, a Research Workshop. In: Aldridge, D., Fachner, J. & Erkkilä, J. (Eds) Many Faces of Music Therapy - Proceedings of the 6th...

  12. The Role of Consumer Confidence as a Leading Indicator on Stock Returns: A Markov Switching Approach

    Directory of Open Access Journals (Sweden)

    Koy AYBEN

    2017-04-01

    Full Text Available Investors’ psychological and emotional factors lead to irrationality in financial decision making and anomalies in prices. Investor sentiment and psychology help to elucidate phenomena in financial markets that cannot be explained by traditional theory. The aim of this study is two-fold: it investigates whether mutual regime-switching behavior exists between the consumer indices and the equity index, and examines their dynamics in response to each other in different regimes. This study applies the Markov regime switching model to monthly data from the BIST100 Return Index, Bloomberg Confidence Index, TUIK Confidence Index and Real Sector Confidence Index for the period between 2007:01 and 2016:06. The results indicate that even when the consumer indices give negative signals, the capital market still gains in normal periods of the economy; only in a recession or an expansion regime do the indices move in the same direction.
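
A minimal two-regime Hamilton filter, the core computation behind Markov switching estimation. The regime means, volatilities and transition matrix below are assumptions for illustration, not the paper's fitted parameters; the filter turns a return series into filtered regime probabilities.

```python
import math
import random

random.seed(2)

# Assumed two-regime model: regime 0 = expansion (positive mean return),
# regime 1 = recession (negative mean, higher volatility).
mu = (0.01, -0.02)
sigma = (0.03, 0.06)
P = ((0.95, 0.05),   # P[i][j] = Pr(next regime = j | current regime = i)
     (0.10, 0.90))

def normal_pdf(x, m, s):
    return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def hamilton_filter(returns):
    """Filtered Pr(regime 1 | data up to t) for a 2-state Gaussian model."""
    prob = [0.5, 0.5]                   # initial regime probabilities
    path = []
    for r in returns:
        # Predict, weight by the likelihood of the observation, normalize.
        pred = [sum(prob[i] * P[i][j] for i in range(2)) for j in range(2)]
        lik = [pred[j] * normal_pdf(r, mu[j], sigma[j]) for j in range(2)]
        total = sum(lik)
        prob = [l / total for l in lik]
        path.append(prob[1])            # Pr(recession | data so far)
    return path

# Simulate returns that switch regime halfway through the sample.
rets = ([random.gauss(mu[0], sigma[0]) for _ in range(100)]
        + [random.gauss(mu[1], sigma[1]) for _ in range(100)])
p_recession = hamilton_filter(rets)
```

The filtered recession probability is low in the first half of the sample and rises after the simulated regime change.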

  13. A gene stacking approach leads to engineered plants with highly increased galactan levels in Arabidopsis.

    Science.gov (United States)

    Gondolf, Vibe M; Stoppel, Rhea; Ebert, Berit; Rautengarten, Carsten; Liwanag, April Jm; Loqué, Dominique; Scheller, Henrik V

    2014-12-10

    Engineering of plants with a composition of lignocellulosic biomass that is more suitable for downstream processing is of high interest for next-generation biofuel production. Lignocellulosic biomass contains a high proportion of pentose residues, which are more difficult to convert into fuels than hexoses. Therefore, increasing the hexose/pentose ratio in biomass is one approach for biomass improvement. A genetic engineering approach was used to investigate whether the amount of pectic galactan can be specifically increased in cell walls of Arabidopsis fiber cells, which in turn could provide a potential source of readily fermentable galactose. First it was tested if overexpression of various plant UDP-glucose 4-epimerases (UGEs) could increase the availability of UDP-galactose and thereby increase the biosynthesis of galactan. Constitutive and tissue-specific expression of a poplar UGE and three Arabidopsis UGEs in Arabidopsis plants could not significantly increase the amount of cell wall bound galactose. We then investigated co-overexpression of AtUGE2 together with the β-1,4-galactan synthase GalS1. Co-overexpression of AtUGE2 and GalS1 led to over 80% increase in cell wall galactose levels in Arabidopsis stems, providing evidence that these proteins work synergistically. Furthermore, AtUGE2 and GalS1 overexpression in combination with overexpression of the NST1 master regulator for secondary cell wall biosynthesis resulted in increased thickness of fiber cell walls in addition to the high cell wall galactose levels. Immunofluorescence microscopy confirmed that the increased galactose was present as β-1,4-galactan in secondary cell walls. This approach clearly indicates that simultaneous overexpression of AtUGE2 and GalS1 increases the cell wall galactose to much higher levels than can be achieved by overexpressing either one of these proteins alone. 
    Moreover, the increased galactan content in fiber cells, while improving the biomass composition, had no impact…

  14. Lead coolant test facility systems design, thermal hydraulic analysis and cost estimate

    Energy Technology Data Exchange (ETDEWEB)

    Khericha, Soli, E-mail: slk2@inel.gov [Battelle Energy Alliance, LLC, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Harvego, Edwin; Svoboda, John; Evans, Robert [Battelle Energy Alliance, LLC, Idaho National Laboratory, Idaho Falls, ID 83415 (United States); Dalling, Ryan [ExxonMobil Gas and Power Marketing, Houston, TX 77069 (United States)

    2012-01-15

    The Idaho National Laboratory prepared preliminary technical and functional requirements (T and FR), a thermal hydraulic design and a cost estimate for a lead coolant test facility. The purpose of this small-scale facility is to simulate lead coolant fast reactor (LFR) coolant flow in an open-lattice geometry core using seven electrical rods and liquid lead or lead-bismuth eutectic coolant. Based on a review of current world lead or lead-bismuth test facilities and the research needs listed in the Generation IV Roadmap, five broad areas of requirements were identified, as listed below:
    - Develop and demonstrate the feasibility of a submerged heat exchanger.
    - Develop and demonstrate open-lattice flow in an electrically heated core.
    - Develop and demonstrate chemistry control.
    - Demonstrate safe operation.
    - Provision for future testing.
    This paper discusses the preliminary design of systems, the thermal hydraulic analysis, and the simplified cost estimate. The facility thermal hydraulic design is based on a maximum simulated core power of 420 kW using seven electrical heater rods, with an average linear heat generation rate of 300 W/cm. The core inlet temperature for liquid lead or Pb/Bi eutectic is 420 °C. The design includes approximately seventy-five data measurements such as pressure, temperature, and flow rates. The preliminary estimated cost of construction of the facility is $3.7M (in 2006 $). It is also estimated that the facility will require two years to be constructed and ready for operation.

  15. Physiotherapeutic approach to diagnosis: Gait analysis

    Directory of Open Access Journals (Sweden)

    Sandra Hincapié

    2010-12-01

    Full Text Available Physical therapy involves the study of human movement as a fundamental factor in human development. Because movement of the different body segments and changes between body positions enable interaction between people and the execution of the activities of daily life, physical therapists should be interested not only in intervention plans, but also in grounding such interventions in an adequate diagnosis of the health condition in question and of the movement of their subjects, as part of adequate professional performance. Consistent with this approach, this article attempts to highlight the importance of movement analysis as a fundamental element of the physical therapy diagnosis, through a review of the topic that includes an analysis of gait, from its kinematic components, as a pattern of movement essential to humans.

  16. Risk Analysis Approach to Rainwater Harvesting Systems

    Directory of Open Access Journals (Sweden)

    Nadia Ursino

    2016-08-01

    Full Text Available Urban rainwater reuse preserves water resources and promotes sustainable development in rapidly growing urban areas. The efficiency of a large number of urban water reuse systems, operating under different climate and demand conditions, is evaluated here on the basis of a new risk analysis approach. Results obtained by probability analysis (PA) indicate that maximum efficiency in low-demand scenarios is above 0.5, and a threshold distinguishing low-demand from high-demand scenarios indicates that in low-demand scenarios no significant improvement in performance may be attained by increasing the storage capacity of rainwater harvesting tanks. Threshold behaviour is displayed when tank storage capacity is designed to match both the average collected volume and the average reuse volume. The low-demand limit cannot be achieved under climate and operating conditions characterized by a disproportion between harvesting and demand volume.
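
A sketch of the daily tank balance behind such efficiency analyses (the yield-after-spillage scheme; rainfall and demand series below are synthetic assumptions): efficiency rises with tank size but saturates, which is the threshold behaviour described above.

```python
import random

random.seed(3)

def reuse_efficiency(tank_capacity, inflow, demand):
    """Fraction of demand met by the tank (yield-after-spillage balance)."""
    storage, supplied, demanded = 0.0, 0.0, 0.0
    for q_in, q_dem in zip(inflow, demand):
        storage = min(storage + q_in, tank_capacity)  # harvest, spill excess
        draw = min(storage, q_dem)                    # meet demand from tank
        storage -= draw
        supplied += draw
        demanded += q_dem
    return supplied / demanded

# Synthetic daily harvest (rain on ~30% of days) and constant demand,
# a low-demand scenario: demand is below the mean harvest of 12 units/day.
inflow = [random.expovariate(1 / 40.0) if random.random() < 0.3 else 0.0
          for _ in range(3650)]
demand = [8.0] * 3650

effs = [reuse_efficiency(v, inflow, demand) for v in (5, 20, 80, 320, 1280)]
```

With the same series, a larger tank can never do worse, so `effs` is non-decreasing; beyond some capacity the marginal gain becomes negligible.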

  17. Comparison of two methods for blood lead analysis in cattle: graphite-furnace atomic absorption spectrometry and LeadCare(R) II system.

    Science.gov (United States)

    Bischoff, Karyn; Gaskill, Cynthia; Erb, Hollis N; Ebel, Joseph G; Hillebrandt, Joseph

    2010-09-01

    The current study compared the LeadCare(R) II test kit system with graphite-furnace atomic absorption spectrometry for blood lead (Pb) analysis in 56 cattle accidentally exposed to Pb in the field. Blood Pb concentrations were determined by LeadCare II within 4 hr of collection and after 72 hr of refrigeration. Blood Pb concentrations were also determined by atomic absorption spectrometry, and samples that were coagulated (n = 12) were homogenized before analysis. There was strong rank correlation (R(2) = 0.96) between atomic absorption and LeadCare II (within 4 hr of collection), and a conversion formula was determined for values within the observed range (3-91 mcg/dl, although few had values >40 mcg/dl). Median and mean blood Pb concentrations for atomic absorption were 7.7 and 15.9 mcg/dl, respectively; for LeadCare II, medians were 5.2 mcg/dl at 4 hr and 4.9 mcg/dl at 72 hr, and means were 12.4 and 11.7, respectively. LeadCare II results at 4 hr strongly correlated with 72 hr results (R(2) = 0.96), but results at 72 hr were lower, and LeadCare II results were lower than those obtained by atomic absorption. Although several articles have compared LeadCare with other analytical techniques, all were for the original system, not LeadCare II. The present study indicated that LeadCare II results correlated well with atomic absorption over a wide range of blood Pb concentrations and that refrigerating samples for up to 72 hr before LeadCare II analysis was acceptable for clinical purposes.
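
A sketch of the method-comparison arithmetic behind this design: rank correlation between the field kit and the reference method, plus a least-squares conversion line. The paired data are synthetic (a kit that reads systematically low with noise), not the study's measurements.

```python
import random

random.seed(4)

def ranks(values):
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0.0] * len(values)
    for rank, idx in enumerate(order):
        r[idx] = float(rank)
    return r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Synthetic paired measurements over the 3-91 range, 56 animals.
aas = [random.uniform(3, 91) for _ in range(56)]    # reference method
kit = [0.75 * v + random.gauss(0, 2) for v in aas]  # kit reads ~25% low

rank_corr = pearson(ranks(aas), ranks(kit))  # Spearman as Pearson-of-ranks
slope, intercept = linear_fit(kit, aas)      # conversion: kit -> reference
```

Despite the kit's systematic bias, the rank correlation stays near 1, and the fitted line provides the conversion back to reference-method units.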

  18. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and further cause substantial drops in performance. At the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly/inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.

  19. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work are to quantify the lead content in two types of canned chilli from three trademarks, determining whether it is within the maximum permissible level (2 ppm); to compare, for two trademarks sold in both glass and canned presentation, the effect of the container on the final lead content; and to carry out a comparative study of the techniques with respect to accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The preliminary treatment of the samples was calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the analysis by X-ray fluorescence, after solubilizing the ashes, the lead is precipitated with PCDA (pyrrolidine carbodithioic ammonium acid), then filtered, and the filter paper is dried and counted directly. The standards are prepared following the same procedure as for the samples, using lead Titrisol solution. For each technique the recovery percent was determined by the addition of a known amount. Calibration curves were plotted for each technique, and all three were found to be linear in the established working range. The recovery percent in all three cases was above ninety-five percent. By means of a variance analysis it was determined that the lead content in the samples does not exceed two ppm, and that the lead content in canned chillis is higher than that in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those attained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)
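
A sketch of the two quality checks the abstract relies on: a linear calibration curve from standards, and spike recovery (known addition) expressed as a percentage. The absorbance readings are invented for illustration; only the 1.7 ppm figure echoes the abstract.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares line y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

standards = [0.0, 0.5, 1.0, 2.0, 4.0]           # Pb standards, ppm
readings = [0.002, 0.051, 0.101, 0.199, 0.402]  # instrument response (assumed)

slope, intercept = linear_fit(standards, readings)

def concentration(absorbance):
    """Invert the calibration curve: reading -> ppm Pb."""
    return (absorbance - intercept) / slope

sample = concentration(0.171)    # canned chilli reading -> ~1.7 ppm
spiked = concentration(0.271)    # same sample + a 1.0 ppm known addition
recovery_pct = 100.0 * (spiked - sample) / 1.0
# Recovery close to 100 % supports the >95 % recovery reported above.
```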

  20. Circuit Board Analysis for Lead by Atomic Absorption Spectroscopy in a Course for Nonscience Majors

    Science.gov (United States)

    Weidenhammer, Jeffrey D.

    2007-01-01

    A circuit board analysis for lead by atomic absorption spectroscopy, used to measure lead content in a course for nonscience majors, is presented. The experiment can also be used to explain the potential environmental hazards of unsafe disposal of used electronic equipment.

  1. Lead in Hair and in Red Wine by Potentiometric Stripping Analysis: The University Students' Design.

    Science.gov (United States)

    Josephsen, Jens

    1985-01-01

    A new program for training upper secondary school chemistry teachers (SE 537 693) depends heavily on student project work. A project in which lead in hair and in red wine was examined by potentiometric stripping analysis is described and evaluated. (JN)

  2. Mercury-Free Analysis of Lead in Drinking Water by Anodic Stripping Square Wave Voltammetry

    Science.gov (United States)

    Wilburn, Jeremy P.; Brown, Kyle L.; Cliffel, David E.

    2007-01-01

    The analysis of drinking water for lead, which has well-known health effects, is presented as an instructive example for undergraduate chemistry students. It allows the students to perform an experiment and to evaluate risk factors and common hazards of everyday life.

  3. Our On-Its-Head-and-In-Your-Dreams Approach Leads to Clean Energy

    Energy Technology Data Exchange (ETDEWEB)

    Kazmerski, Lawrence; Gwinner, Don; Hicks, Al

    2013-07-18

    Representing the Center for Inverse Design (CID), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of the CID is to revolutionize the discovery of new materials by design with tailored properties through the development and application of a novel inverse design approach powered by theory guiding experiment with an initial focus on solar energy conversion.

  4. Liquidity indicator for the Croatian economy – Factor analysis approach

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2014-12-01

    Full Text Available Croatian business surveys (BS) are conducted in the manufacturing industry, retail trade and the construction sector. In all of these sectors, managers’ assessments of liquidity are measured. The aim of the paper was to form a new composite liquidity indicator by including business survey liquidity measures from all three covered economic sectors of the Croatian economy mentioned above. In calculating the leading indicator, a factor analysis approach was used. An indicator of this kind does not exist in Croatia or in any other European economy. Furthermore, the issue of Croatian companies’ illiquidity is highly neglected in the literature. The empirical analysis consists of two parts. In the first part, the new liquidity indicator was formed using factor analysis: one factor, representing the new liquidity indicator (LI), was extracted out of the three liquidity variables in the three economic sectors. In the second part, econometric models were applied in order to investigate the forecasting properties of the new business survey liquidity indicator when predicting the direction of changes in Croatian industrial production. The quarterly data used in the research covered the period from January 2000 to April 2013. Based on the econometric analysis, it can be concluded that the LI is a leading indicator of Croatia’s industrial production with better forecasting properties than the standard liquidity indicators (formed in the manufacturing industry).
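
A sketch of the one-factor extraction step on synthetic sector series. As a stand-in for the paper's factor analysis, we take the first principal component of the three standardized series; the common-factor structure below is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 54  # quarterly observations, roughly 2000-2013

# Synthetic sector liquidity balances driven by one common factor,
# mimicking the manufacturing / retail / construction survey series.
common = rng.normal(size=n).cumsum()
sectors = np.column_stack([
    0.9 * common + rng.normal(0, 0.5, n),
    0.8 * common + rng.normal(0, 0.7, n),
    0.7 * common + rng.normal(0, 0.9, n),
])

# Standardize each series, then take the first principal component
# as the composite liquidity indicator (PCA standing in for FA here).
z = (sectors - sectors.mean(axis=0)) / sectors.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
li = z @ vt[0]
```

The extracted score tracks the common driver of the three series (up to an arbitrary sign), which is exactly what a composite survey indicator is meant to do.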

  5. Finite element analysis of vibration energy harvesting using lead-free piezoelectric materials: A comparative study

    Directory of Open Access Journals (Sweden)

    Anuruddh Kumar

    2014-06-01

Full Text Available In this article, the performance of various piezoelectric materials is simulated for a unimorph cantilever-type piezoelectric energy harvester. The finite element method (FEM) is used to model the piezolaminated unimorph cantilever structure. First-order shear deformation theory (FSDT) and linear piezoelectric theory are implemented in the finite element simulations. A genetic algorithm (GA) optimization approach is carried out to optimize the structural parameters of the energy harvester for maximum power density and power output. The numerical simulations demonstrate the performance of lead-free piezoelectric materials in a unimorph cantilever-based energy harvester. The lead-free piezoelectric material K0.5Na0.5NbO3-LiSbO3-CaTiO3 (2 wt.%) demonstrated the maximum mean power and maximum mean power density for the piezoelectric energy harvester in the ambient frequency range of 90-110 Hz. Overall, the lead-free piezoelectric materials of the K0.5Na0.5NbO3-LiSbO3 (KNN-LS) family have shown better performance than the conventional lead-based piezoelectric material lead zirconate titanate (PZT) in the context of piezoelectric energy-harvesting devices.
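As a rough back-of-the-envelope companion to the FEM model (a sketch only; the Euler-Bernoulli first-mode formula is standard, but the substrate geometry and material values below are assumed, not taken from the study), one can estimate where a uniform cantilever design resonates relative to the 90-110 Hz ambient band:

```python
import math

def cantilever_f1(E, I, rho, A, L):
    """First natural frequency (Hz) of a uniform Euler-Bernoulli cantilever:
    f1 = (beta1*L)^2 / (2*pi*L^2) * sqrt(E*I / (rho*A))."""
    beta1L = 1.8751                      # first-mode eigenvalue of the beam equation
    return (beta1L ** 2 / (2 * math.pi * L ** 2)) * math.sqrt(E * I / (rho * A))

# assumed illustrative steel-like substrate, 40 mm x 5 mm x 0.5 mm
E, rho = 200e9, 7800.0                   # Young's modulus (Pa), density (kg/m^3)
b, h, L = 5e-3, 0.5e-3, 40e-3            # width, thickness, length (m)
I, A = b * h ** 3 / 12, b * h            # second moment of area, cross-section
f1 = cantilever_f1(E, I, rho, A, L)      # here roughly 250 Hz, above the target band
```

A GA-style optimizer, as in the abstract, would adjust parameters such as L and h until a figure of merit (here, proximity of f1 to the ambient band) is maximized.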

  6. Blood lead and preeclampsia: A meta-analysis and review of implications.

    Science.gov (United States)

    Poropat, Arthur E; Laidlaw, Mark A S; Lanphear, Bruce; Ball, Andrew; Mielke, Howard W

    2018-01-01

Multiple cross-sectional studies suggest that there is an association between blood lead and preeclampsia. We performed a systematic review and meta-analysis to summarize information on the association between preeclampsia and lead poisoning. Searches of Medline, Web of Science, Scopus, Pubmed, Science Direct and ProQuest (dissertations and theses) identified 2089 reports, 46 of which were downloaded after reviewing the abstracts, and 11 studies were evaluated as meeting the selection criteria. Evaluation using the ROBINS-I template (Sterne et al., 2016) indicated moderate risk of bias in all studies. We found that blood lead concentrations were significantly and substantially associated with preeclampsia (k = 12; N = 6069; Cohen's d = 1.26; odds ratio = 9.81; odds ratio LCL = 8.01; odds ratio UCL = 12.02; p = 0.005). Eliminating one study produced a homogeneous meta-analysis and stronger estimates, despite the remaining studies coming from eight separate countries and having countervailing risks of bias. Blood lead concentrations in pregnant women are a major risk factor for preeclampsia, with an increase of 1 μg/dL associated with a 1.6% increase in likelihood of preeclampsia, which appears to be the strongest risk factor for preeclampsia yet reported. Pregnant women with historical lead exposure should routinely have blood lead concentrations tested, especially after mid-term. Women with concentrations higher than 5 μg/dL should be actively monitored for preeclampsia and be advised to take prophylactic calcium supplementation. All pregnant women should be advised to actively avoid lead exposure. Copyright © 2017 Elsevier Inc. All rights reserved.
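The pooled odds ratio and confidence limits above can be reproduced in form (not in value) by standard fixed-effect inverse-variance pooling on the log-OR scale; the per-study odds ratios and standard errors below are hypothetical, and the study's exact meta-analytic model is not specified here:

```python
import math

def pooled_odds_ratio(ors, ses):
    """Fixed-effect inverse-variance pooling of study odds ratios.
    ors: per-study odds ratios; ses: standard errors of log(OR)."""
    logs = [math.log(o) for o in ors]
    weights = [1.0 / se ** 2 for se in ses]        # weight = 1 / Var(log OR)
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    lcl = math.exp(pooled_log - 1.96 * pooled_se)  # 95% lower confidence limit
    ucl = math.exp(pooled_log + 1.96 * pooled_se)  # 95% upper confidence limit
    return math.exp(pooled_log), lcl, ucl

# hypothetical three-study example
or_hat, lcl, ucl = pooled_odds_ratio([8.5, 10.2, 11.0], [0.20, 0.25, 0.30])
```

The pooled estimate is pulled toward the more precisely measured (smaller-SE) studies, which is why the reported LCL/UCL pair brackets the point estimate asymmetrically on the OR scale.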

  7. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

Full Text Available Introduction It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may prevent one specific neuropathic condition. Methods In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information provided by different studies to construct a network-model of the SDH. We use Neuroids to simulate each neuron included in that model by adapting available experimental evidence. Results Simulations suggest that the maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons and, therefore, hyperalgesia may be elicited by suppression of the inhibitory tone at that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network-model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber mediated analgesia and the frequency-dependent increase of the neural response. Discussion By incorporating biophysical accuracy and newer experimental evidence, the SDH network-model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  8. Solution Synthesis Approach to Colloidal Cesium Lead Halide Perovskite Nanoplatelets with Monolayer-Level Thickness Control

    Science.gov (United States)

    2016-01-01

We report a colloidal synthesis approach to CsPbBr3 nanoplatelets (NPLs). The nucleation and growth of the platelets, which takes place at room temperature, is triggered by the injection of acetone into a mixture of precursors that would otherwise remain unreactive. The low growth temperature enables control of the platelet thickness, which can be precisely tuned from 3 to 5 monolayers. The strong two-dimensional confinement of the carriers at such small vertical sizes is responsible for a narrow PL, strong excitonic absorption, and a blue shift of the optical band gap by more than 0.47 eV compared to that of bulk CsPbBr3. We also show that the composition of the NPLs can be varied all the way to CsPbCl3 or CsPbI3 by anion exchange, with preservation of the size and shape of the starting particles. The blue fluorescent CsPbCl3 NPLs represent a new member of the scarcely populated group of blue-emitting colloidal nanocrystals. The exciton dynamics were found to be independent of the extent of 2D confinement in these platelets, and this was supported by band structure calculations. PMID:26726764

  9. Development and characterisation of disposable gold electrodes, and their use for lead(II) analysis

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Mohd F. M. [Cranfield University, Cranfield Health, Silsoe (United Kingdom); Institute for Medical Research, Toxicology and Pharmacology Unit, Herbal Medicine Research Centre, Kuala Lumpur (Malaysia); Tothill, Ibtisam E. [Cranfield University, Cranfield Health, Silsoe (United Kingdom)

    2006-12-15

There is an increasing need to assess the harmful effects of heavy-metal-ion pollution on the environment. The ability to detect and measure toxic contaminants on site using simple, cost-effective, and field-portable sensors is an important aspect of environmental protection and facilitates rapid decision making. A screen-printed gold sensor in a three-electrode configuration has been developed for analysis of lead(II) by square-wave stripping voltammetry (SWSV). The working electrode was fabricated with gold ink deposited by use of thick-film technology. Conditions affecting the lead stripping response were characterised and optimized. Experimental data indicated that chloride ions are important in lead deposition and subsequent analysis with this type of sensor. Linear concentration ranges of 10-50 μg/L and 25-300 μg/L with detection limits of 2 μg/L and 5.8 μg/L were obtained for lead(II) for measurement times of four and two minutes, respectively. The electrodes can be reused up to 20 times after cleaning with 0.5 mol/L sulfuric acid. Interference of other metals with the response to lead was also examined to optimize the sensor response for analysis of environmental samples. The analytical utility of the sensor was demonstrated by applying the system to a variety of wastewater and soil sample extracts from polluted sites. The results are sufficient evidence of the feasibility of using these screen-printed gold electrodes for the determination of lead(II) in wastewater and soil extracts. For comparison purposes a mercury-film electrode and ICP-MS were used for validation. (orig.)
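Detection limits of the kind quoted are conventionally estimated from the calibration slope via the 3σ criterion; a minimal sketch with illustrative numbers (not the paper's raw calibration data) shows the arithmetic:

```python
import numpy as np

# hypothetical SWSV calibration: concentration (ug/L) vs peak current (uA)
conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
current = np.array([0.52, 1.01, 1.55, 2.04, 2.49])

slope, intercept = np.polyfit(conc, current, 1)  # least-squares calibration line

blank_sd = 0.033          # assumed SD of blank replicate currents (uA)
lod = 3 * blank_sd / slope   # 3-sigma detection limit in concentration units
```

With these assumed values the limit of detection comes out near 2 μg/L, the same order as reported for the four-minute measurement.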

  10. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

… Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become "embedded" in a long process of reinterpretation by legal actors, and we demonstrate that the actual legal impact of Les Verts on the acquis is most visible in the area that was sidelined in the academic commentary. This implies that a leading case is a symbolic category, which might not always correspond to the actual role that the case plays in the Court's jurisprudence.

  11. A Community-Based Approach to Leading the Nation in Smart Energy Use

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2013-12-31

    Project Objectives The AEP Ohio gridSMART® Demonstration Project (Project) achieved the following objectives: • Built a secure, interoperable, and integrated smart grid infrastructure in northeast central Ohio that demonstrated the ability to maximize distribution system efficiency and reliability and consumer use of demand response programs that reduced energy consumption, peak demand, and fossil fuel emissions. • Actively attracted, educated, enlisted, and retained consumers in innovative business models that provided tools and information reducing consumption and peak demand. • Provided the U.S. Department of Energy (DOE) information to evaluate technologies and preferred smart grid business models to be extended nationally. Project Description Ohio Power Company (the surviving company of a merger with Columbus Southern Power Company), doing business as AEP Ohio (AEP Ohio), took a community-based approach and incorporated a full suite of advanced smart grid technologies for 110,000 consumers in an area selected for its concentration and diversity of distribution infrastructure and consumers. It was organized and aligned around: • Technology, implementation, and operations • Consumer and stakeholder acceptance • Data management and benefit assessment Combined, these functional areas served as the foundation of the Project to integrate commercially available products, innovative technologies, and new consumer products and services within a secure two-way communication network between the utility and consumers. The Project included Advanced Metering Infrastructure (AMI), Distribution Management System (DMS), Distribution Automation Circuit Reconfiguration (DACR), Volt VAR Optimization (VVO), and Consumer Programs (CP). These technologies were combined with two-way consumer communication and information sharing, demand response, dynamic pricing, and consumer products, such as plug-in electric vehicles and smart appliances. In addition, the Project

  12. A Novel Approach for the Removal of Lead(II) Ion from Wastewater Using Mucilaginous Leaves of Diceriocaryum eriocarpum Plant

    Directory of Open Access Journals (Sweden)

    Joshua N. Edokpayi

    2015-10-01

Full Text Available Lead(II) ion is a very toxic element known to cause detrimental effects to human health even at very low concentrations. An adsorbent prepared using mucilaginous leaves from the Diceriocaryum eriocarpum plant (DEP) was used for the adsorption of lead(II) ion from aqueous solution. Batch experiments were performed on simulated aqueous solutions under optimized conditions of adsorbent dosage, contact time, pH and initial lead(II) ion concentration at 298 K. The Langmuir isotherm model described the adsorption process more suitably than the Freundlich model, with linearized coefficients of 0.9661 and 0.9547, respectively. A pseudo-second-order kinetic equation best described the kinetics of the reaction. Fourier transform infrared analysis confirmed the presence of amino (–NH), carbonyl (–C=O) and hydroxyl (–OH) functional groups. Application of the prepared adsorbent to wastewater samples with lead(II) ion concentrations of 10 mg/L and 12 mg/L taken from a waste stabilization pond showed removal efficiencies of 95.8% and 96.4%, respectively. Furthermore, 0.1 M HCl was a better desorbing agent than 0.1 M NaOH and de-ionized water. The experimental data obtained demonstrated that mucilaginous leaves from DEP can be used as a suitable adsorbent for lead(II) ion removal from wastewater.
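The Langmuir/Freundlich comparison above rests on linearized least-squares fits; a sketch with illustrative equilibrium data (not the study's measurements) shows the procedure:

```python
import numpy as np

def linear_r2(x, y):
    """Fit y = slope*x + intercept and return slope, intercept, R^2."""
    slope, intercept = np.polyfit(x, y, 1)
    pred = slope * x + intercept
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# illustrative equilibrium data: Ce (mg/L), qe (mg/g), roughly Langmuir-shaped
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qe = np.array([4.8, 9.5, 14.0, 17.5, 19.0])

# linearized Langmuir: Ce/qe = Ce/qmax + 1/(KL*qmax)
slope_L, icept_L, r2_L = linear_r2(Ce, Ce / qe)
qmax, KL = 1.0 / slope_L, slope_L / icept_L

# linearized Freundlich: ln(qe) = ln(KF) + (1/n)*ln(Ce)
slope_F, icept_F, r2_F = linear_r2(np.log(Ce), np.log(qe))
KF, n = np.exp(icept_F), 1.0 / slope_F
```

Comparing the two R² values is exactly the "linearized coefficients" comparison reported in the abstract; for saturating data like these the Langmuir fit wins.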

  13. Transcriptional analysis of the leading region in F plasmid DNA transfer.

    Science.gov (United States)

    Cram, D; Ray, A; O'Gorman, L; Skurray, R

    1984-05-01

Transcriptional activity associated with the leading region (53.8-66.7F) in F DNA transfer has been shown by RNA-DNA hybridization studies to occur on the anterior segment extending from 59.4 to 66.7F. Promoter-probe analysis of cloned leading region segments detected two promoters within the transcribed portion of the leading region. The promoter active across the 64.7F EcoRI site on the transferred F strand was associated with the expression of two polypeptides, 6d and 13.5p, located between 64.7-66.6F. However, no definite role could be ascribed to the second promoter, operative through the 66.6F BglII site located in close proximity to oriT, the origin of transfer.

  14. Safer approaches and landings: A multivariate analysis of critical factors

    Science.gov (United States)

    Heinrich, Durwood J.

The approach-and-landing phases of flight represent 27% of mission time while resulting in 61% of the accidents and 39% of the fatalities. The landing phase itself represents only 1% of flight time but claims 45% of the accidents. Inadequate crew situation awareness (SA), crew resource management (CRM), and crew decision-making (DM) have been implicated in 51%, 63%, and 73% respectively of these accidents. The human factors constructs of SA, CRM, and DM were explored; a comprehensive definition of SA was proposed; and a "proactive defense" safety strategy was recommended. Data from a 1997 analysis of worldwide fatal accidents by the Flight Safety Foundation (FSF) Approach-and-Landing Accident Reduction (ALAR) Task Force were used to isolate crew- and weather-related causal factors that lead to approach-and-landing accidents (ALAs). Logistic regression and decision tree analysis were used on samples of NASA's Aviation Safety Reporting System (ASRS) incident records ("near misses") and the National Transportation Safety Board's (NTSB) accident reports to examine hypotheses regarding factors and factor combinations that can dramatically increase the opportunity for accidents. An effective scale of risk factors was introduced for use by crews to proactively counter safety-related error-chain situations.
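A toy version of the logistic-regression step can be sketched as follows (hypothetical binary factors and assumed effect sizes, not the ASRS/NTSB data; plain gradient ascent stands in for whatever fitting routine the study used):

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, steps=5000):
    """Unregularized logistic regression fit by gradient ascent on the
    log-likelihood; returns [intercept, coef_1, coef_2, ...]."""
    Xb = np.column_stack([np.ones(len(X)), X])    # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))         # predicted accident probability
        w += lr * Xb.T @ (y - p) / len(y)         # log-likelihood gradient step
    return w

# hypothetical binary factors per event: [inadequate_SA, adverse_weather]
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(400, 2)).astype(float)
logit = -2.0 + 1.5 * X[:, 0] + 1.0 * X[:, 1]      # assumed true effects
y = (rng.random(400) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w = fit_logistic(X, y)
odds_ratios = np.exp(w[1:])      # per-factor odds ratios for an accident
```

The fitted odds ratios quantify how much each factor multiplies the odds of an accident, which is the kind of "factor combination" evidence the study examined.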

  15. Approaches to data analysis of multiple-choice questions

    OpenAIRE

    Lin Ding; Robert Beichner

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  16. An Overview of Focal Approaches of Critical Discourse Analysis

    Science.gov (United States)

    Jahedi, Maryam; Abdullah, Faiz Sathi; Mukundan, Jayakaran

    2014-01-01

    This article aims to present detailed accounts of central approaches to Critical Discourse Analysis. It focuses on the work of three prominent scholars such as Fairclough's critical approach, Wodak's discourse-historical approach and Van Dijk's socio-cognitive approach. This study concludes that a combination of these three approaches can be…

  17. Trace analysis of lead and cadmium in seafoods by differential pulse anodic stripping voltammetry

    International Nuclear Information System (INIS)

    Sumera, F.C.; Verceluz, F.P.; Kapauan, P.A.

    1979-01-01

A method for the simultaneous determination of cadmium and lead in seafoods is described. The sample is dry-ashed in a muffle furnace, raising the temperature gradually up to 500 °C. The ashed sample is treated with concentrated nitric acid, dried on a heating plate and returned to the muffle furnace for further heating. The treated ash is then dissolved in 1 N HCl; acetate buffer and citric acid are added and the pH adjusted to 3.6-4. The resulting solution is analyzed for lead and cadmium by differential pulse anodic stripping voltammetry (DPASV) using a wax-impregnated graphite thin-film electrode. The average recoveries of 0.4 of cadmium and lead added to 5 fish samples were 97% and 99%, respectively. The standard deviations on a homogenized shark sample for lead and cadmium analysis were 6.7 ppb and 12.3 ppb, respectively, and the relative standard deviations were 21.0% and 15.5%, respectively. Studies on the instrumental parameters involved in the DPASV step of the analysis and on methods of measuring peak current signals were also made. (author)
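The recovery and precision figures quoted above come from standard formulas; a small sketch with hypothetical readings (not the paper's data) makes the arithmetic explicit:

```python
import statistics

def recovery_percent(spiked, unspiked, added):
    """Spike recovery: fraction of an added amount measured back, in %."""
    return 100.0 * (spiked - unspiked) / added

def relative_sd(values):
    """Relative standard deviation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# hypothetical spike-recovery experiment (same units for all three amounts)
rec = recovery_percent(spiked=0.496, unspiked=0.100, added=0.400)

# hypothetical replicate Pb readings on one homogenized sample, in ppb
rsd = relative_sd([58.0, 66.0, 72.0, 61.0, 63.0])
```

Recoveries near 100% and single-digit RSDs, as in this sketch, indicate the method is both accurate and reasonably precise at trace levels.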

  18. Nu-Way Snaps and Snap Leads: an Important Connection in the History of Behavior Analysis.

    Science.gov (United States)

    Escobar, Rogelio; Lattal, Kennon A

    2014-10-01

    Beginning in the early 1950s, the snap lead became an integral and ubiquitous component of the programming of electromechanical modules used in behavioral experiments. It was composed of a Nu-Way snap connector on either end of a colored electrical wire. Snap leads were used to connect the modules to one another, thereby creating the programs that controlled contingencies, arranged reinforcers, and recorded behavior in laboratory experiments. These snap leads populated operant conditioning laboratories from their inception until the turn of the twenty-first century. They allowed quick and flexible programming because of the ease with which they could be connected, stacked, and removed. Thus, the snap lead was integral to the research activity that constituted the experimental analysis of behavior for more than five decades. This review traces the history of the snap lead from the origins of the snap connector in Birmingham, England, in the late eighteenth century, through the use of snaps connected to wires during the Second World War, to its adoption in operant laboratories, and finally to its demise in the digital age.

  19. Eco-Balance analysis of the disused lead-acid-batteries recycling technology

    Directory of Open Access Journals (Sweden)

    Kamińska Ewa

    2017-01-01

Full Text Available The article presents the results of an eco-balance analysis of the disused lead-acid battery recycling process. The technology under test also offers the possibility to recover other materials, for example polypropylene from the battery case, or to obtain crystalline sodium sulphate. The life cycle assessment was made using the ReCiPe and IMPACT 2002+ methods. The results are expressed as environmental points [Pt] and are shown in the environmental categories specific to each method, grouped into impact categories. 1 Mg of processed scrap was adopted as the functional unit. The results of the analyses indicate that recycling processes can make the environmental impact of the technology less harmful. Reusing lead means that primary sources do not have to be exploited. Similarly, using polypropylene recovered from battery casings to produce granules for the plastics industry brings environmental benefits. Given the widespread use of lead-acid batteries, attention should be paid to their proper utilization, particularly with respect to heavy metals such as lead. According to the calculations, the highest environmental benefit from using lead from secondary sources in the production of new products was observed in the refining process.

  20. Eco-Balance analysis of the disused lead-acid-batteries recycling technology

    Science.gov (United States)

    Kamińska, Ewa; Kamiński, Tomasz

    2017-10-01

The article presents the results of an eco-balance analysis of the disused lead-acid battery recycling process. The technology under test also offers the possibility to recover other materials, for example polypropylene from the battery case, or to obtain crystalline sodium sulphate. The life cycle assessment was made using the ReCiPe and IMPACT 2002+ methods. The results are expressed as environmental points [Pt] and are shown in the environmental categories specific to each method, grouped into impact categories. 1 Mg of processed scrap was adopted as the functional unit. The results of the analyses indicate that recycling processes can make the environmental impact of the technology less harmful. Reusing lead means that primary sources do not have to be exploited. Similarly, using polypropylene recovered from battery casings to produce granules for the plastics industry brings environmental benefits. Given the widespread use of lead-acid batteries, attention should be paid to their proper utilization, particularly with respect to heavy metals such as lead. According to the calculations, the highest environmental benefit from using lead from secondary sources in the production of new products was observed in the refining process.

  1. A functional genomics approach using metabolomics and in silico pathway analysis

    DEFF Research Database (Denmark)

    Förster, Jochen; Gombert, Andreas Karoly; Nielsen, Jens

    2002-01-01

… analysis techniques, and changes in the genotype will in many cases lead to different metabolite profiles. Here, a theoretical framework that may be applied to identify the function of orphan genes is presented. The approach is based on a combination of metabolome analysis and in silico pathway …

  2. Leading order analysis of neutrino induced dimuon events in the CHORUS experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kayis-Topaksu, A.; Onenguet, G. [Cukurova University, Adana (Turkey); Dantzig, R. van; Jong, M. de; Oldeman, R.G.C. [NIKHEF, Amsterdam (Netherlands); Gueler, M.; Kama, S.; Koese, U.; Serin-Zeyrek, M.; Tolun, P. [METU, Ankara (Turkey); Catanesi, M.G.; Muciaccia, M.T. [Universita di Bari and INFN, Bari (Italy); Buelte, A.; Winter, K. [Humboldt Universitaet, Berlin (Germany); Van de Vyver, B.; Vilain, P.; Wilquet, G. [Inter-University Institute for High Energies (ULB-VUB), Brussels (Belgium); Saitta, B. [Universita di Cagliari and INFN, Cagliari (Italy); Di Capua, E. [Universita di Ferrara and INFN, Ferrara (Italy); Ogawa, S. [Toho University, Funabashi (Japan)] (and others)

    2008-07-21

We present a leading order QCD analysis of a sample of neutrino induced charged-current events with two muons in the final state originating in the lead-scintillating fibre calorimeter of the CHORUS detector. The results are based on a sample of 8910 neutrino and 430 antineutrino induced opposite-sign dimuon events collected during the exposure of the detector to the CERN Wide Band Neutrino Beam between 1995 and 1998. The analysis yields a value of the charm quark mass of m_c = (1.26 ± 0.16 ± 0.09) GeV/c² and a value of the ratio of the strange to non-strange sea in the nucleon of κ = 0.33 ± 0.05 ± 0.05, improving the results obtained in similar analyses by previous experiments.

  3. Leading with "Emotional" Intelligence--Existential and Motivational Analysis in Leadership and Leadership Development

    Science.gov (United States)

    Mengel, Thomas

    2012-01-01

    This conceptual and practical paper is integrating the work of Viktor Frankl (1985) and Steven Reiss (2000, 2008) into a model of Existential and Motivational Analysis (EMotiAn). This integrated model and approach may provide scholars, educators, consultants and practitioners alike with an innovative and meaningful framework for leadership and…

  4. A next-to-leading order QCD analysis of the spin structure function $g_1$

    CERN Document Server

    Adeva, B; Arik, E; Badelek, B; Bardin, G; Baum, G; Berglund, P; Betev, L; Birsa, R; De Botton, N R; Bradamante, Franco; Bravar, A; Bressan, A; Bültmann, S; Burtin, E; Crabb, D; Cranshaw, J; Çuhadar-Dönszelmann, T; Dalla Torre, S; Van Dantzig, R; Derro, B R; Deshpande, A A; Dhawan, S K; Dulya, C M; Eichblatt, S; Fasching, D; Feinstein, F; Fernández, C; Forthmann, S; Frois, Bernard; Gallas, A; Garzón, J A; Gilly, H; Giorgi, M A; von Goeler, E; Görtz, S; Gracia, G; De Groot, N; Grosse-Perdekamp, M; Haft, K; Von Harrach, D; Hasegawa, T; Hautle, P; Hayashi, N; Heusch, C A; Horikawa, N; Hughes, V W; Igo, G; Ishimoto, S; Iwata, T; Kabuss, E M; Kageya, T; Karev, A G; Kessler, H J; Ketel, T; Kiryluk, J; Kiselev, Yu F; Krämer, Dietrich; Krivokhizhin, V G; Kröger, W; Kukhtin, V V; Kurek, K; Kyynäräinen, J; Lamanna, M; Landgraf, U; Le Goff, J M; Lehár, F; de Lesquen, A; Lichtenstadt, J; Litmaath, M; Magnon, A; Mallot, G K; Marie, F; Martin, A; Martino, J; Matsuda, T; Mayes, B W; McCarthy, J S; Medved, K S; Meyer, W T; Van Middelkoop, G; Miller, D; Miyachi, Y; Mori, K; Moromisato, J H; Nassalski, J P; Naumann, Lutz; Niinikoski, T O; Oberski, J; Ogawa, A; Ozben, C; Pereira, H; Perrot-Kunne, F; Peshekhonov, V D; Piegia, R; Pinsky, L; Platchkov, S K; Pló, M; Pose, D; Postma, H; Pretz, J; Puntaferro, R; Rädel, G; Rijllart, A; Reicherz, G; Roberts, J; Rodríguez, M; Rondio, Ewa; Sabo, I; Saborido, J; Sandacz, A; Savin, I A; Schiavon, R P; Schiller, A; Sichtermann, E P; Simeoni, F; Smirnov, G I; Staude, A; Steinmetz, A; Stiegler, U; Stuhrmann, H B; Szleper, M; Tessarotto, F; Thers, D; Tlaczala, W; Tripet, A; Ünel, G; Velasco, M; Vogt, J; Voss, Rüdiger; Whitten, C; Windmolders, R; Willumeit, R; Wislicki, W; Witzmann, A; Ylöstalo, J; Zanetti, A M; Zaremba, K; Zhao, J

    1998-01-01

We present a next-to-leading order QCD analysis of the presently available data on the spin structure function $g_1$ including the final data from the Spin Muon Collaboration (SMC). We present results for the first moments of the proton, deuteron and neutron structure functions, and determine singlet and non-singlet parton distributions in two factorization schemes. We also test the Bjorken sum rule and find agreement with the theoretical prediction at the level of 10%.

  5. Solar wind conditions leading to efficient radiation belt electron acceleration: A superposed epoch analysis

    OpenAIRE

    Li, W; Thorne, RM; Bortnik, J; Baker, DN; Reeves, GD; Kanekal, SG; Spence, HE; Green, JC

    2015-01-01

    ©2015. American Geophysical Union. All Rights Reserved. Determining preferential solar wind conditions leading to efficient radiation belt electron acceleration is crucial for predicting radiation belt electron dynamics. Using Van Allen Probes electron observations ( > 1 MeV) from 2012 to 2015, we identify a number of efficient and inefficient acceleration events separately to perform a superposed epoch analysis of the corresponding solar wind parameters and geomagnetic indices. By directly c...
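The superposed epoch technique named above can be sketched as follows (entirely synthetic series and event times, purely illustrative of the epoch-averaging step, not the Van Allen Probes data):

```python
import numpy as np

def superposed_epoch(series, epochs, window):
    """Average a time series around a list of key times (epochs):
    the core operation of a superposed epoch analysis."""
    segments = [series[t - window: t + window + 1]
                for t in epochs
                if t - window >= 0 and t + window < len(series)]
    return np.mean(segments, axis=0)

# synthetic "solar wind parameter" with an assumed post-event enhancement
rng = np.random.default_rng(2)
x = rng.normal(size=1000)
events = [100, 300, 500, 700, 900]          # hypothetical acceleration-event times
for t in events:
    x[t: t + 20] += 2.0                     # enhancement after each event

profile = superposed_epoch(x, events, window=50)   # 101-point epoch average
```

Stacking many events averages away uncorrelated noise, so a systematic pre- or post-event signature (here, the injected step after time zero) stands out in the mean profile.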

  6. Multicriteria approach to data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Hélcio Vieira Junior

    2008-08-01

Full Text Available With the aim of making Data Envelopment Analysis (DEA) more acceptable to the managerial community, Weights Restrictions approaches were developed. They prevent DEA from discarding any data and give the Decision Maker (DM) some control over the method. The purpose of this paper is to propose a Weights Restrictions DEA model that incorporates the DM's preferences. To do so, we employed the MACBETH methodology as a tool to find the bounds of the weights to be used in a Weights Restrictions approach named Virtual Weights Restrictions. Our proposal achieved an outcome that correlates strongly with three widely used decision-aiding methodologies: ELECTRE III, SMART, and PROMETHEE I and II. In addition, our approach was able to combine the most significant outcomes of all three of these multicriteria decision-aiding methodologies into one single outcome.
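As a minimal illustration of the efficiency scores DEA produces (not the paper's MACBETH-based weight-restricted model, which requires a linear-programming solver), the single-input, single-output special case reduces to normalizing each unit's productivity ratio against the best performer:

```python
def dea_efficiency_single(inputs, outputs):
    """CCR efficiency scores for the single-input/single-output special case,
    where DEA reduces to each unit's output/input ratio divided by the
    best ratio in the sample. Scores lie in (0, 1]; 1.0 is efficient."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# three hypothetical decision-making units (DMUs)
eff = dea_efficiency_single(inputs=[4.0, 5.0, 8.0], outputs=[8.0, 12.5, 12.0])
```

With multiple inputs and outputs, each DMU instead chooses its own most favorable weights via an LP, and weights-restriction approaches like the one above constrain those weights to ranges the DM finds acceptable.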

  7. Metals and metalloids in atmospheric dust: Use of lead isotopic analysis for source apportionment

    Science.gov (United States)

    Felix Villar, Omar I.

Mining activities generate aerosol in a wide range of sizes; smelting activities produce mainly fine particles (<1 μm). The adverse effects of aerosols on human health depend mainly on two key characteristics: size and chemical composition. One of the main objectives of this research is to analyze the size distribution of contaminants in aerosol produced by mining operations. For this purpose, a Micro-Orifice Uniform Deposit Impactor (MOUDI) was utilized. Results from the MOUDI samples show higher concentrations of toxic elements like lead and arsenic in the fine fraction. Fine particles are more likely to be deposited in the deeper zones of the respiratory system; therefore, they are more dangerous than coarse particles, which can be filtered out in the upper respiratory system. Unfortunately, knowing the total concentration of contaminants does not give us enough information to identify the source of contamination. For this reason, lead isotopes have been introduced as fingerprints for source apportionment. Each source of lead has specific isotopic ratios; by knowing these ratios, sources can be identified. During this research, lead isotopic ratios were analyzed at different sites and for different aerosol sizes. From these analyses it can be concluded that lead isotopes are a powerful tool to identify sources of lead, and mitigation strategies could be developed if the source of contamination is well defined. Environmental conditions such as wind speed, wind direction, relative humidity and precipitation play an important role in the concentration of atmospheric dust. Dry environments with low relative humidity are ideal for the transport of aerosols. Results obtained from this research show the relationship between dust concentrations and meteorological parameters: dust concentrations are highly correlated with relative humidity and wind speed.
With all the data collected on site and the analysis of the meteorological parameters, models can be developed to predict …
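The source-apportionment idea, that each lead source carries a distinct isotopic signature, can be illustrated with a simple two-endmember mixing calculation (the ratios below are assumed for illustration; real work would use measured endmembers and weight by lead concentration):

```python
def source_fraction(sample_ratio, ratio_a, ratio_b):
    """Two-endmember linear mixing: fraction of lead attributable to
    source A, given a measured isotope ratio (e.g. 206Pb/207Pb) and the
    ratios of the two candidate sources A and B."""
    return (sample_ratio - ratio_b) / (ratio_a - ratio_b)

# assumed illustrative 206Pb/207Pb ratios: ore body A vs background soil B
f = source_fraction(sample_ratio=1.17, ratio_a=1.15, ratio_b=1.20)
```

A measured ratio lying between the two endmembers implies a mixture; here the assumed sample would be attributed roughly 60% to source A.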

  8. Oxygen concentration diffusion analysis of lead-bismuth-cooled, natural-circulation reactor

    International Nuclear Information System (INIS)

    Ito, Kei; Sakai, Takaaki

    2001-11-01

    The feasibility study on fast breeder reactors in Japan has been conducted at JNC and related organizations. The Phase-I study finished in March 2001. During the Phase-I activity, lead-bismuth eutectic coolant was selected as one of the possible coolant options, and a medium-scale plant cooled by lead-bismuth natural circulation was studied. On the other hand, it is known that lead-bismuth eutectic is corrosive to structural materials. It was found that oxygen concentration control in the eutectic plays an important role in corrosion protection. In this report, we have developed a concentration diffusion analysis code (COCOA: COncentration COntrol Analysis code) in order to carry out oxygen concentration control analysis. This code solves a two-dimensional concentration diffusion equation by the finite difference method, and can also simulate the reaction of oxygen and hydrogen. We verified the basic performance of the code and carried out oxygen concentration diffusion analysis for the case of an oxygen increase caused by a refueling process in the natural-circulation reactor. In addition, the characteristics of the oxygen control system were discussed for different types of control system. It is concluded that the COCOA code can simulate the diffusion of oxygen concentration in the reactor. Through the analysis of a natural-circulation medium-scale reactor, we show that both ON-OFF control and PID control can regulate the oxygen concentration well if an appropriate concentration measurement point is chosen. Furthermore, when a failure occurs in the oxygen or hydrogen emission system, the control characteristics degrade; it is still possible, however, to control the oxygen concentration in such a case. (author)
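The combination described here (finite-difference diffusion plus ON-OFF control at a chosen measurement point) can be illustrated with a minimal one-dimensional sketch. All grid sizes, diffusivities and control parameters below are hypothetical; the real COCOA code is two-dimensional and also models the oxygen-hydrogen reaction:

```python
import numpy as np

# 1D explicit finite-difference diffusion of oxygen concentration in the
# coolant, with an ON-OFF controller that emits oxygen at one node whenever
# the concentration at a separate measurement node is below target.
nx, D, dx, dt = 50, 1.0e-3, 0.1, 1.0
alpha = D * dt / dx**2            # explicit scheme stability needs alpha <= 0.5
assert alpha <= 0.5

c = np.full(nx, 0.005)            # initial oxygen concentration (arbitrary units)
target = 0.01                     # desired concentration at the sensor
inj_node, meas_node = 10, 40      # oxygen emission and measurement points
inj_rate = 2.0e-5                 # oxygen added per step while controller is ON

for _ in range(60000):
    c[0], c[-1] = c[1], c[-2]     # zero-flux (insulated) boundaries
    lap = np.zeros(nx)
    lap[1:-1] = c[2:] - 2.0 * c[1:-1] + c[:-2]
    c += alpha * lap              # diffusion update
    if c[meas_node] < target:     # ON-OFF control law
        c[inj_node] += inj_rate

print(f"concentration at sensor: {c[meas_node]:.4f} (target {target})")
```

The sensor concentration settles near the target with a small overshoot, illustrating why the abstract emphasizes choosing an appropriate measurement point: the farther the sensor is from the emission node, the longer the transport lag the controller must tolerate.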

  9. Ancient bronze coins from Mediterranean basin: LAMQS potentiality for lead isotopes comparative analysis with former mineral

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: Lorenzo.Torrisi@unime.it [Department of Physics Science - MIFT, Messina University, V.le F.S. d’Alcontres 31, 98166 S. Agata, Messina (Italy); Italiano, A. [INFN, Sezione di Catania, Gruppo collegato di Messina (Italy); Torrisi, A. [Institute of Optoelectronics, Military University of Technology, 2 Kaliskiego Str., 00-908 Warsaw (Poland)

    2016-11-30

    Highlights: • Surface and bulk compositional elements in ancient bronze coins were investigated using XRF analysis. • Lead stable isotopes {sup 204}Pb, {sup 206}Pb, {sup 207}Pb and {sup 208}Pb were measured in ancient coins with LAMQS analysis. • Lead ratios {sup 208}Pb/{sup 206}Pb and {sup 207}Pb/{sup 206}Pb, measured by LAMQS, were compared with the Brettscaife.net geological database relative to minerals from different mines of the Mediterranean basin. • Bronze coins were correlated to possible ancient mining sites of the minerals from which lead was extracted. - Abstract: Bronze coins coming from the area of the Mediterranean basin, dated to the II–X Cent. A.D., were analyzed using different physical analytical techniques. Characteristic X-ray fluorescence was used with electrons and photons in order to investigate the elemental composition of both the surface layers and the bulk. Moreover, quadrupole mass spectrometry coupled to laser ablation (the LAMQS technique) in high vacuum was used to analyse typical material compounds from surface contamination. Mass spectrometry at high resolution and sensitivity, extended up to 300 amu, allowed measurement of the {sup 208}Pb/{sup 206}Pb and {sup 207}Pb/{sup 206}Pb isotopic ratios in the coins. Quantitative relative analyses of these isotopic ratios identify the coin composition as a “fingerprint” depending on the mineral used to extract the lead. Isotopic ratios in coins can be compared to those of the possible minerals used to produce the bronze alloy. A comparison between the measured isotope ratios in the analyzed coins and the literature database, related to minerals containing Pb as a function of the geological and geophysical extraction mine, is presented. The analysis, restricted to old coins and the mines of the Mediterranean basin, indicates a possible correlation between the coin compositions and the possible geological sites of the extracted mineral.
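The "fingerprint" comparison of measured isotopic ratios against a mineral database reduces to a nearest-neighbour match in ratio space; the mine signatures below are invented placeholders, not values from the Brettscaife.net database:

```python
import math

# Candidate mining regions with hypothetical (208Pb/206Pb, 207Pb/206Pb) signatures
mines = {
    "Mine A": (2.085, 0.852),
    "Mine B": (2.060, 0.838),
    "Mine C": (2.105, 0.866),
}
coin = (2.083, 0.851)   # measured ratios for one coin (also invented)

def distance(a, b):
    # Euclidean distance in the two-dimensional isotope-ratio space
    return math.hypot(a[0] - b[0], a[1] - b[1])

best = min(mines, key=lambda m: distance(mines[m], coin))
print("closest isotopic match:", best)
```

Real comparisons would also weigh measurement uncertainty and the overlap of mine fields in ratio space, which is why the abstract speaks only of a *possible* correlation with extraction sites.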

  10. Global approach of emergency response, reflection analysis

    International Nuclear Information System (INIS)

    Velasco Garcia, E.; Garcia Ahumada, F.; Albaladejo Vidal, S.

    1998-01-01

    The emergency response management approach must be dealt with adequately within company strategy, since a badly managed emergency situation can adversely affect a company, not only in terms of assets, but also in terms of the negative impact on its credibility, profitability and image. There are three main supports for managing the response in an emergency situation: a) Diagnosis. b) Prognosis. c) Communications. To achieve these capabilities, co-ordination of different actions is necessary at the following levels: i. Facility operation (local level). ii. Facility property (national level). iii. Local authority (local level). iv. National authority (national level). Taking all the above into account, the following functions must be covered: a) Management: incorporating the communication, diagnosis and prognosis areas. b) Decision: incorporating communication and information means. c) Services: to facilitate the decision, as well as its execution. d) Analysis: to clarify the situations and make decisions easier. e) Documentation: to provide information for the analysts and decision makers. (Author)

  11. Prediction of new onset atrial fibrillation through P wave analysis in 12 lead ECG.

    Science.gov (United States)

    Yoshizawa, Tomoharu; Niwano, Shinichi; Niwano, Hiroe; Igarashi, Tazuru; Fujiishi, Tamami; Ishizue, Naruya; Oikawa, Jun; Satoh, Akira; Kurokawa, Sayaka; Hatakeyama, Yuko; Fukaya, Hidehira; Ako, Junya

    2014-01-01

    It is unknown whether 12-lead ECG can predict new-onset AF. In the present study, we identified patients with new onset AF from our digitally stored ECG database, and the P wave morphologies were analyzed in their preceding sinus rhythm recordings as the precursor state for AF. The P wave was analyzed in the most recent ECG recording of sinus rhythm preceding new onset AF within 12 months. The duration and amplitude of P waves were analyzed in 12 leads and compared between the 2 groups with the other clinical parameters. The study population consisted of 68 patients with new-onset AF and 68 age and sex-matched controls. Multivariate analysis revealed that the P wave amplitude in leads II and V1 (0.157 ± 0.056 versus 0.115 ± 0.057 mV, P = 0.032, and 0.146 ± 0.089 versus 0.095 ± 0.036 mV, P = 0.002) and P wave dispersion (56.9 ± 14.8 versus 33.5 ± 12.9 ms, P = 0.001) were significant independent factors for the prediction of new-onset AF. By using these factors, new-onset AF could be predicted with a sensitivity of 69.1% and specificity of 88.2%. P wave analysis is useful for predicting new onset AF.
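The reported sensitivity and specificity follow from counting how a multi-parameter rule classifies patients; the cut-offs and patient values below are synthetic, chosen only to mirror the structure of the study's predictors:

```python
# Each tuple: (P amplitude in V1 [mV], P-wave dispersion [ms], developed AF?)
patients = [
    (0.15, 58, True), (0.13, 55, True), (0.09, 40, True),
    (0.08, 30, False), (0.10, 35, False), (0.14, 36, False), (0.07, 28, False),
]

def predicts_af(amp_v1, dispersion):
    # Hypothetical cut-offs, not the thresholds derived in the study
    return amp_v1 > 0.12 and dispersion > 45

tp = sum(1 for a, d, af in patients if af and predicts_af(a, d))
fn = sum(1 for a, d, af in patients if af and not predicts_af(a, d))
tn = sum(1 for a, d, af in patients if not af and not predicts_af(a, d))
fp = sum(1 for a, d, af in patients if not af and predicts_af(a, d))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%}")
```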

  12. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website in order to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise features such as text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features, and if the similarity is greater than the predefined threshold value then it is declared to be phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of these approaches. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.
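The threshold decision at the core of these techniques can be sketched with a simple Jaccard similarity over extracted feature sets; the features and the 0.7 threshold below are invented for illustration:

```python
def jaccard(a, b):
    # Jaccard similarity of two feature sets: |intersection| / |union|
    return len(a & b) / len(a | b)

# Features extracted from the legitimate page and a suspicious look-alike
# (hypothetical tokens standing in for text, tags and style features)
legit_features = {"login", "password", "bank", "form", "img", "css-blue"}
suspicious_features = {"login", "password", "bank", "form", "img", "css-navy"}

THRESHOLD = 0.7   # hypothetical predefined similarity threshold
similarity = jaccard(legit_features, suspicious_features)
verdict = "phishing" if similarity >= THRESHOLD else "legitimate difference"
print(f"similarity={similarity:.2f} -> {verdict}")
```

A page this close to a known legitimate site, served from a different origin, is exactly the mimicry these detectors flag; production systems combine several such feature-level similarities rather than one set comparison.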

  13. Re-analysis of fatigue data for welded joints using the notch stress approach

    DEFF Research Database (Denmark)

    Pedersen, Mikkel Melters; Mouritsen, Ole Ø.; Hansen, Michael Rygaard

    2010-01-01

    Experimental fatigue data for welded joints have been collected and subjected to re-analysis using the notch stress approach according to IIW recommendations. This leads to an overview regarding the reliability of the approach, based on a large number of results (767 specimens). Evidently, the …-welded joints agree quite well with the FAT 225 curve; however, a reduction to FAT 200 is suggested in order to achieve approximately the same safety as observed in the nominal stress approach.
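For reference, a FAT class is the characteristic stress range (in MPa) at two million cycles, so with the usual S-N slope m = 3 the effect of reducing the curve from FAT 225 to FAT 200 can be quantified directly (the 300 MPa effective notch stress range below is a hypothetical example):

```python
def cycles_to_failure(stress_range_mpa, fat_class, m=3):
    # S-N curve anchored at the FAT class: N = 2e6 * (FAT / delta_sigma)^m
    return 2e6 * (fat_class / stress_range_mpa) ** m

# Predicted characteristic life at a 300 MPa effective notch stress range
n225 = cycles_to_failure(300, 225)
n200 = cycles_to_failure(300, 200)
print(f"FAT 225: {n225:,.0f} cycles; FAT 200: {n200:,.0f} cycles")
```

Lowering the class from 225 to 200 cuts the predicted characteristic life by the factor (200/225)^3, i.e. roughly 30%, which is the kind of added conservatism the re-analysis recommends.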

  14. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems: critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does massively multiplayer online gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  15. Matrix effects in the X-ray fluorescence analysis of zinc-lead ores

    International Nuclear Information System (INIS)

    Kirchmayer, M.

    1977-01-01

    In the present paper several mathematical procedures for overcoming matrix effects in the X-ray fluorescence analysis of zinc-lead ores varying widely in composition are examined. Some new intensity correction equations are derived using the theory of X-ray fluorescence. Experiments were carried out on a high-resolution Si(Li) spectrometer and also on a portable single-channel analyzer with a proportional counter, using 238 Pu and 55 Fe radioisotope sources. For the separation of overlapping spectral lines recorded with a proportional counter, the single edge filter method with pre-calibration is proposed. Multiple regression is applied for the calculation of influence coefficients and the estimation of errors of determinations. The present results show that the equations derived in this paper allow zinc, lead and iron in ores to be determined with relatively high accuracy. (author)
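One common family of intensity correction equations has the Lachance-Traill form C_i = K_i I_i (1 + Σ_j α_ij C_j), where the measured intensity of the analyte is corrected for absorption and enhancement by the other matrix elements. The sketch below shows only this arithmetic, with invented coefficients, not the equations actually derived in the paper:

```python
import numpy as np

def corrected_concentration(K, I, alphas, other_concs):
    # Lachance-Traill-type correction: C = K * I * (1 + sum_j alpha_j * C_j)
    return K * I * (1.0 + np.dot(alphas, other_concs))

K_zn = 0.010                     # calibration constant for Zn (hypothetical)
I_zn = 10.0                      # measured Zn fluorescence intensity (counts)
alphas = np.array([0.8, -0.3])   # influence coefficients for Pb, Fe (invented)
others = np.array([0.10, 0.20])  # Pb and Fe mass fractions in the ore

c_zn = corrected_concentration(K_zn, I_zn, alphas, others)
print(f"corrected Zn mass fraction: {c_zn:.3f}")
```

In practice the influence coefficients are themselves fitted, e.g. by the multiple regression mentioned in the abstract, because the matrix concentrations on the right-hand side are only known approximately.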

  16. A comparison of methods and materials for the analysis of leaded wipes.

    Science.gov (United States)

    Harper, Martin; Hallmark, Timothy S; Bartolucci, Alfred A

    2002-12-01

    The purposes of this study are: (1) to determine whether proficiency analytical test (PAT) materials from the American Industrial Hygiene Association can be used to provide quality data for portable X-ray fluorescence (XRF) analysis of lead in dust wipe surface samples; (2) to provide data to determine whether on-site analysis of field dust wipe samples by XRF and the laboratory method of inductively coupled plasma emission analysis (ICP) are comparable; and (3) to determine if differences exist between different wipe materials. Several wipes meet the ASTM E1792 performance requirements: a lead background level of less than 5 μg per wipe, a single-layer construction, recovery rates of 80-120% from spiked samples, remaining damp throughout the sampling procedure, and containing no aloe. The wipes used in this study were Pace Wipes, which are used for the PAT materials, and, for the field samples, Palintest Wipes, which were supplied by the instrument manufacturer, and Ghost Wipes, which are popular because they digest in hot, concentrated acid, so that chemical analysis is simplified. Twenty PAT wipe samples were obtained from four different proficiency test rounds. Surface wipe samples were taken at three different locations representing different industry types. All samples were analyzed using a portable XRF spectrometer and by ICP. Strong linear relationships were found between the analysis of wipe samples by ICP and by portable XRF. For the PAT samples, the results from the ICP and XRF analyses were not statistically equivalent, which indicates a bias in the ICP analysis. The bias was not excessive, since all ICP analyses fell within the acceptable range for the proficiency samples. The good correlation between the proficiency sample reference values and the XRF determinations is not surprising, considering that similar proficiency samples were used to calibrate the instrument response. 
Users of this portable XRF analyzer could enroll in the proficiency test program

  17. An Analysis Of Leading Character’s Conflict In Nicholas Sparks’ Novel The Notebook

    OpenAIRE

    Simamora, Agreny Melisa

    2014-01-01

    The title of this thesis is An Analysis of Leading Character’s Conflict in Nicholas Sparks’ The Notebook, which deals with Allie’s conflicts in her life. The conflict is divided into three kinds: Allie’s internal conflict, Allie’s external conflict with Noah (her boyfriend), and Allie’s external conflict with her parents. The conflict arises because her choices cannot be fulfilled: Allie’s parents do not agree with her decision because of their different status, as Allie comes from a rich fami...

  18. Lead test assembly irradiation and analysis Watts Bar Nuclear Plant, Tennessee and Hanford Site, Richland, Washington

    International Nuclear Information System (INIS)

    1997-07-01

    The U.S. Department of Energy (DOE) needs to confirm the viability of using a commercial light water reactor (CLWR) as a potential source for maintaining the nation's supply of tritium. The Proposed Action discussed in this environmental assessment is a limited scale confirmatory test that would provide DOE with information needed to assess that option. This document contains the environmental assessment results for the Lead test assembly irradiation and analysis for the Watts Bar Nuclear Plant, Tennessee, and the Hanford Site in Richland, Washington

  19. Next-to-leading order analysis of target mass corrections to structure functions and asymmetries

    International Nuclear Information System (INIS)

    Brady, L.T.; Accardi, A.; Hobbs, T.J.; Melnitchouk, W.

    2011-01-01

    We perform a comprehensive analysis of target mass corrections (TMCs) to spin-averaged structure functions and asymmetries at next-to-leading order. Several different prescriptions for TMCs are considered, including the operator product expansion, and various approximations to it, collinear factorization, and xi-scaling. We assess the impact of each of these on a number of observables, such as the neutron to proton F 2 structure function ratio, and parity-violating electron scattering asymmetries for protons and deuterons which are sensitive to gamma-Z interference effects. The corrections from higher order radiative and nuclear effects on the parity-violating deuteron asymmetry are also quantified.

  20. Quantitative analysis of trace lead in tin-base lead-free solder by laser-induced plasma spectroscopy in air at atmospheric pressure.

    Science.gov (United States)

    Chen, Baozhong; Kano, Hidenori; Kuzuya, Mikio

    2008-02-01

    A quantitative analysis of trace lead in tin-base lead-free solder was carried out with laser-induced plasma spectroscopy (LIPS). In order to evaluate the applicability of the technique for rapid in situ analytical purposes, measurements were performed in air at atmospheric pressure, and the emission characteristics of the plasma produced by a Q-switched Nd:YAG laser over a laser energy range of 10 - 90 mJ were investigated using time-resolved spectroscopy. The experimental results showed that the emission intensity of the analysis line (Pb I 405.78 nm) was maximized at a laser energy of around 30 mJ, and a time-resolved measurement of a spectrum with a delay time of 0.4 μs after the laser pulse was effective for reducing the background continuum. Based on the results, lead-free solder certified reference materials were analyzed for trace lead (concentration 174 - 1940 ppm), and a linear calibration curve was obtained with a detection limit of several tens of ppm.
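The calibration-curve and detection-limit arithmetic can be sketched as follows; the intensities and blank noise are synthetic, chosen so the 3-sigma detection limit lands in the "several tens of ppm" range reported:

```python
import numpy as np

# Reference concentrations mimic the 174-1940 ppm range in the abstract;
# line intensities (Pb I 405.78 nm, arbitrary units) are invented.
conc = np.array([174.0, 500.0, 1000.0, 1940.0])   # ppm Pb in standards
intensity = np.array([0.90, 2.48, 5.05, 9.60])    # background-corrected signal

slope, intercept = np.polyfit(conc, intensity, 1) # linear calibration curve
sigma_blank = 0.05                                # std dev of blank signal (a.u.)
lod = 3.0 * sigma_blank / slope                   # 3-sigma detection limit, ppm
print(f"slope = {slope:.4g} a.u./ppm, LOD ~ {lod:.0f} ppm")
```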

  1. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    Science.gov (United States)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    Operational quantitative precipitation forecasts (QPF) are provided routinely by weather services or hydrological authorities, particularly those responsible for densely populated regions of small catchments, such as those typically found in Mediterranean areas prone to flash-floods. Specific rainfall values are used as thresholds for issuing warning levels considering different time frameworks (mid-range, short-range, 24h, 1h, etc.), for example 100 mm in 24h or 60 mm in 1h. There is a clear need to determine how feasible a specific rainfall value is for a given lead-time, in particular for very short range forecasts or nowcasts typically obtained from weather radar observations (Pierce et al 2012). In this study we assess which specific nowcast lead-times can be provided for a number of heavy precipitation events (HPE) that affected Catalonia (NE Spain). The nowcasting system we employed generates QPFs through the extrapolation of rainfall fields observed with weather radar following a Lagrangian approach developed and tested successfully in previous studies (Berenguer et al. 2005, 2011). Then QPFs up to 3h are compared with two quality controlled observational data sets: weather radar quantitative precipitation estimates (QPE) and raingauge data. Several high-impact weather HPE were selected including the 7 September 2005 Llobregat Delta river tornado outbreak (Bech et al. 2007) or the 2 November 2008 supercell tornadic thunderstorms (Bech et al. 2011), both producing, among other effects, local flash floods. In these two events there were torrential rainfall rates (30' amounts exceeding 38.2 and 12.3 mm respectively) and 24h accumulation values above 100 mm. A number of verification scores are used to characterize the evolution of precipitation forecast quality with time, which typically presents a decreasing trend but shows a strong dependence on the selected rainfall threshold and integration period. For example considering correlation factors, 30
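The Lagrangian extrapolation at the heart of such nowcasting systems amounts to advecting the latest observed rainfall field along an estimated motion vector; the sketch below assumes a known, constant motion and ignores growth and decay, which is the core simplification of pure extrapolation:

```python
import numpy as np

# A synthetic radar rainfall field with one rain cell (mm/h)
field = np.zeros((20, 20))
field[5:8, 5:8] = 10.0

motion = (1, 2)  # assumed displacement per time step (rows, cols)

def nowcast(f, steps):
    # Lagrangian persistence: shift the field along the motion vector.
    # np.roll wraps at the edges, acceptable for this toy domain.
    return np.roll(f, (motion[0] * steps, motion[1] * steps), axis=(0, 1))

forecast_3 = nowcast(field, 3)  # forecast 3 steps ahead
print("cell now occupies rows", np.nonzero(forecast_3.max(axis=1))[0])
```

Verification then compares such advected fields against later radar QPE or raingauge observations, which is where the lead-time-dependent skill decay described above is measured.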

  2. Analysis of landing gear noise during approach

    NARCIS (Netherlands)

    Merino Martinez, R.; Snellen, M.; Simons, D.G.

    2016-01-01

    Airframe noise is becoming increasingly important during approach, in some cases even reaching higher noise levels than the engines. More people are affected due to low flight altitudes and fixed traffic routing associated with typical approaches. For most aircraft types, the landing gear system is a

  3. Approaches to Data Analysis of Multiple-Choice Questions

    Science.gov (United States)

    Ding, Lin; Beichner, Robert

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics…

  4. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  5. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    Full Text Available The aim of the paper is to highlight the significance of applying risk analysis and quality control methods for the improvement of the parameters of a lead molding process. For this purpose, a Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert, leading to a leak), regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes, corrective actions and a change of production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.

  6. Analysis and testing of the DIII-D ohmic heating coil lead repair clamp

    International Nuclear Information System (INIS)

    Reis, E.E.; Anderson, P.M.; Chin, E.; Robinson, J.I.

    1997-11-01

    DIII-D has been operating for the last year with limited volt-second capabilities due to the structural failure of a conductor lead to one of the ohmic heating (OH) solenoids. The conductor failure was due to poor epoxy impregnation of the overwrap of the lead pack, resulting in copper fatigue and a water leak. A number of structural analyses were performed to assist in determining the failure scenario and to evaluate various repair options. A fatigue stress analysis of the leads with a failed epoxy overwrap indicated crack initiation after 1,000 cycles at the maximum operating conditions. The failure occurred in a very inaccessible area, which restricted design repair options to concepts that could be implemented remotely. Several design options were considered for repairing the lead so that it can sustain the loads for 7.5 Vs conditions at full toroidal field. A clamp, along with preloaded banding straps and shim bags, provides a system that guarantees that the stress at the crack location is always compressive and prevents further crack growth in the conductor. Due to the limited space available for the repair, it was necessary to design the clamp system to operate at the material yield stress. The primary components of the clamp system were verified by load tests prior to installation. The main body of the clamp contains a load cell and a potentiometer for monitoring the load-deflection characteristics of the clamp and conductors during plasma operation. Strain gages provide redundant instrumentation. If required, the preload on the conductors can be increased remotely by a special wrench attached to the clamp assembly.

  7. Approaches to data analysis of multiple-choice questions

    Directory of Open Access Journals (Sweden)

    Lin Ding

    2009-09-01

    Full Text Available This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.
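Two of the classical test theory quantities used in such analyses (item difficulty, and item discrimination via the point-biserial correlation with the total score) can be computed directly; the score matrix below is synthetic:

```python
import numpy as np

# Rows = students, columns = items; 1 = correct answer (synthetic data)
scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

difficulty = scores.mean(axis=0)   # item difficulty: fraction answering correctly
total = scores.sum(axis=1)         # each student's total score

# Point-biserial discrimination: correlation of each item with the total score
discrimination = [np.corrcoef(scores[:, j], total)[0, 1]
                  for j in range(scores.shape[1])]
print("difficulty:    ", difficulty)
print("discrimination:", np.round(discrimination, 2))
```

Item response theory, factor analysis and the other approaches listed above start from this same response matrix but fit explicit statistical models to it.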

  8. Structural health monitoring of multi-spot welded joints using a lead zirconate titanate based active sensing approach

    Science.gov (United States)

    Yao, Ping; Kong, Qingzhao; Xu, Kai; Jiang, Tianyong; Huo, Lin-sheng; Song, Gangbing

    2016-01-01

    Failures of spot welded joints directly reduce the load capacity of adjacent structures. Due to their complexity and invisibility, real-time health monitoring of spot welded joints is still a challenge. In this paper, a lead zirconate titanate (PZT) based active sensing approach was proposed to monitor the structural health of multi-spot welded joints in real time. In the active sensing approach, one PZT transducer was used as an actuator to generate a guided stress wave, while another one, as a sensor, detected the wave response. Failure of a spot welded joint reduces the stress wave paths and attenuates the wave propagation energy from the actuator to the sensor. A total of four specimens made of dual phase steel with spot welds, including two specimens with 20 mm intervals of spot welded joints and two with 25 mm intervals, were designed and fabricated for this research. Under tensile tests, the spot welded joints successively failed, resulting in the PZT sensor reporting decreased received energy. The energy attenuations due to the failures of joints were clearly observed by the PZT sensor signal in both the time domain and frequency domain. In addition, a wavelet packet-based spot-weld failure indicator was developed to quantitatively evaluate the failure condition corresponding to the number of failed joints.
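The energy-attenuation idea behind the wavelet packet indicator can be sketched by comparing received-signal energy in frequency sub-bands before and after joint failures. An FFT band split is used here as a dependency-free stand-in for the wavelet packet decomposition, and the signals are synthetic:

```python
import numpy as np

def band_energies(signal, n_bands=8):
    # Split the power spectrum into contiguous sub-bands and sum each band's
    # energy (a simplified substitute for wavelet packet node energies).
    spec = np.abs(np.fft.rfft(signal)) ** 2
    return np.array([band.sum() for band in np.array_split(spec, n_bands)])

t = np.linspace(0, 1, 2000, endpoint=False)
healthy = np.sin(2 * np.pi * 150 * t) + 0.5 * np.sin(2 * np.pi * 400 * t)
damaged = 0.4 * healthy   # fewer intact wave paths -> attenuated received wave

e_h, e_d = band_energies(healthy), band_energies(damaged)
# Failure index: fraction of baseline received energy that has been lost
index = 1.0 - e_d.sum() / e_h.sum()
print(f"energy-based failure index: {index:.2f}")
```

The actual indicator in the paper is built on wavelet packet coefficients, which localize the energy in both time and frequency, but the monotonic link between failed joints and lost received energy is the same.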

  9. Reporting and analysis of trials using stratified randomisation in leading medical journals: review and reanalysis.

    Science.gov (United States)

    Kahan, Brennan C; Morris, Tim P

    2012-09-14

    To assess how often stratified randomisation is used, whether analysis adjusted for all balancing variables, and whether the method of randomisation was adequately reported, and to reanalyse a previously reported trial to assess the impact of ignoring balancing factors in the analysis. Review of published trials and reanalysis of a previously reported trial. Four leading general medical journals (BMJ, Journal of the American Medical Association, Lancet, and New England Journal of Medicine) and the second Multicenter Intrapleural Sepsis Trial (MIST2). 258 trials published in 2010 in the four journals. Cluster randomised, crossover, non-randomised, single arm, and phase I or II trials were excluded, as were trials reporting secondary analyses, interim analyses, or results that had been previously published in 2010. Whether the method of randomisation was adequately reported, how often balanced randomisation was used, and whether balancing factors were adjusted for in the analysis. Reanalysis of MIST2 showed that an unadjusted analysis led to larger P values and a loss of power. The review of published trials showed that balanced randomisation was common, with 163 trials (63%) using at least one balancing variable. The most common methods of balancing were stratified permuted blocks (n=85) and minimisation (n=27). The method of randomisation was unclear in 37% of trials. Most trials that balanced on centre or prognostic factors were not adequately analysed; only 26% of trials adjusted for all balancing factors in their primary analysis. Trials that did not adjust for balancing factors in their analysis were less likely to show a statistically significant result (unadjusted 57% v adjusted 78%, P=0.02). Balancing on centre or prognostic factors is common in trials but often poorly described, and the implications of balancing are poorly understood. 
Trialists should adjust their primary analysis for balancing factors to obtain correct P values and confidence intervals and
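The stratified permuted-block scheme the review found most common can be sketched as follows: within each stratum, allocations are issued in shuffled blocks so the arms stay balanced throughout recruitment (block size, strata and seeding below are illustrative):

```python
import random

def block_randomiser(block_size=4, seed=0):
    # Yields treatment allocations in shuffled blocks containing equal
    # numbers of each arm, so imbalance never exceeds half a block.
    rng = random.Random(seed)
    def gen():
        while True:
            block = ["A"] * (block_size // 2) + ["B"] * (block_size // 2)
            rng.shuffle(block)
            yield from block
    return gen()

# One independent randomisation list per stratum (centre x prognostic factor)
strata = {(centre, risk): block_randomiser(seed=hash((centre, risk)) % 10000)
          for centre in ["centre1", "centre2"] for risk in ["high", "low"]}

# Allocate 8 patients in one stratum: two complete blocks, perfectly balanced
allocs = [next(strata[("centre1", "high")]) for _ in range(8)]
print(allocs, "A:", allocs.count("A"), "B:", allocs.count("B"))
```

The review's point is that the balancing variables used here (centre, prognostic factor) must then also appear as covariates in the primary analysis.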

  10. Leading Antibacterial Laboratory Research by Integrating Conventional and Innovative Approaches: The Laboratory Center of the Antibacterial Resistance Leadership Group.

    Science.gov (United States)

    Manca, Claudia; Hill, Carol; Hujer, Andrea M; Patel, Robin; Evans, Scott R; Bonomo, Robert A; Kreiswirth, Barry N

    2017-03-15

    The Antibacterial Resistance Leadership Group (ARLG) Laboratory Center (LC) leads the evaluation, development, and implementation of laboratory-based research by providing scientific leadership and supporting standard/specialized laboratory services. The LC has developed a physical biorepository and a virtual biorepository. The physical biorepository contains bacterial isolates from ARLG-funded studies located in a centralized laboratory and they are available to ARLG investigators. The Web-based virtual biorepository strain catalogue includes well-characterized gram-positive and gram-negative bacterial strains published by ARLG investigators. The LC, in collaboration with the ARLG Leadership and Operations Center, developed procedures for review and approval of strain requests, guidance during the selection process, and for shipping strains from the distributing laboratories to the requesting investigators. ARLG strains and scientific and/or technical guidance have been provided to basic research laboratories and diagnostic companies for research and development, facilitating collaboration between diagnostic companies and the ARLG Master Protocol for Evaluating Multiple Infection Diagnostics (MASTERMIND) initiative for evaluation of multiple diagnostic devices from a single patient sampling event. In addition, the LC has completed several laboratory-based studies designed to help evaluate new rapid molecular diagnostics by developing, testing, and applying a MASTERMIND approach using purified bacterial strains. In collaboration with the ARLG's Statistical and Data Management Center (SDMC), the LC has developed novel analytical strategies that integrate microbiologic and genetic data for improved and accurate identification of antimicrobial resistance. These novel approaches will aid in the design of future ARLG studies and help correlate pathogenic markers with clinical outcomes. 
The LC's accomplishments are the result of a successful collaboration with the ARLG

  11. Long-term dietary exposure to lead in young European children: Comparing a pan-European approach with a national exposure assessment

    DEFF Research Database (Denmark)

    Boon, P.E.; Te Biesebeek, J.D.; van Klaveren, J.D.

    2012-01-01

    Long-term dietary exposures to lead in young children were calculated by combining food consumption data of 11 European countries, categorised using harmonised broad food categories, with occurrence data on lead from different Member States (pan-European approach). The results of the assessment in children living in the Netherlands were compared with a long-term lead intake assessment in the same group using Dutch lead concentration data and linking the consumption and concentration data at the highest possible level of detail. Exposures obtained with the pan-European approach were higher than the national exposure calculations. For both assessments cereals contributed most to the exposure. The lower dietary exposure in the national study was due to the use of lower lead concentrations and a more optimal linkage of food consumption and concentration data. When a pan-European approach, using…

  12. A new approach to regression analysis of censored competing-risks data.

    Science.gov (United States)

    Jin, Yuxue; Lai, Tze Leung

    2017-10-01

    An approximate likelihood approach is developed for regression analysis of censored competing-risks data. This approach models directly the cumulative incidence function, instead of the cause-specific hazard function, in terms of explanatory covariates under a proportional subdistribution hazards assumption. It uses a self-consistent iterative procedure to maximize an approximate semiparametric likelihood function, leading to an asymptotically normal and efficient estimator of the vector of regression parameters. Simulation studies demonstrate its advantages over previous methods.
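The abstract's central idea, modelling the cumulative incidence function (CIF) directly under a proportional subdistribution hazards assumption, can be illustrated with a minimal numpy sketch. The baseline CIF and coefficient below are hypothetical choices for illustration, not the authors' estimator.

```python
import numpy as np

def cumulative_incidence(t, x, beta, p=0.4):
    """CIF under a proportional subdistribution hazards model,
    specified directly (Fine-Gray style):
        1 - F1(t | x) = (1 - F1_0(t)) ** exp(x . beta)
    The baseline CIF F1_0(t) = p * (1 - exp(-t)) is hypothetical.
    """
    f1_baseline = p * (1.0 - np.exp(-t))
    return 1.0 - (1.0 - f1_baseline) ** np.exp(np.dot(x, beta))

# A covariate that raises the subdistribution hazard raises the CIF
# at every time point.
t = np.linspace(0.0, 5.0, 50)
low = cumulative_incidence(t, np.array([0.0]), np.array([0.5]))
high = cumulative_incidence(t, np.array([1.0]), np.array([0.5]))
```

Modelling the CIF itself, rather than the cause-specific hazard, is what lets covariate effects be read directly off the cumulative incidence scale.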

  13. A Bayesian Nonparametric Approach to Factor Analysis

    DEFF Research Database (Denmark)

    Piatek, Rémi; Papaspiliopoulos, Omiros

    2018-01-01

    This paper introduces a new approach for the inference of non-Gaussian factor models based on Bayesian nonparametric methods. It relaxes the usual normality assumption on the latent factors, widely used in practice, which is too restrictive in many settings. Our approach, on the contrary, does not impose any particular assumptions on the shape of the distribution of the factors, but still secures the basic requirements for the identification of the model. We design a new sampling scheme based on marginal data augmentation for the inference of mixtures of normals with location and scale restrictions. This approach is augmented by the use of a retrospective sampler, to allow for the inference of a constrained Dirichlet process mixture model for the distribution of the latent factors. We carry out a simulation study to illustrate the methodology and demonstrate its benefits. Our sampler is very...

  14. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. For a further, improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  15. Machine learning approaches in medical image analysis

    DEFF Research Database (Denmark)

    de Bruijne, Marleen

    2016-01-01

    Machine learning approaches are increasingly successful in image-based diagnosis, disease prognosis, and risk assessment. This paper highlights new research directions and discusses three main challenges related to machine learning in medical imaging: coping with variation in imaging protocols, learning from weak labels, and interpretation and evaluation of results.

  16. Shotgun approaches to gait analysis : insights & limitations

    NARCIS (Netherlands)

    Kaptein, Ronald G.; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J.; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait

  17. Understanding common risk analysis problems leads to better E and P decisions

    International Nuclear Information System (INIS)

    Smith, M.B.

    1994-01-01

    Many petroleum geologists, engineers and managers who have been introduced to petroleum risk analysis doubt that probability theory actually works in practice. Discovery probability estimates for exploration prospects always seem to be more optimistic than after-the-fact results. In general, probability estimates seem to be plucked from the air without any objective basis. Because of subtleties in probability theory, errors may result in applying risk analysis to real problems. Four examples have been selected to illustrate how misunderstandings in applying risk analysis may lead to incorrect decisions. Examples 1 and 2 show how falsely assuming statistical independence distorts probability calculations. Example 3 discusses problems with related variables when using the Monte Carlo method. Example 4 shows how subsurface data yields a probability value that is superior to a simple statistical estimate. The potential mistakes in these examples would go unnoticed in analyses at most companies. Lack of objectivity and flawed theory would be blamed when the fault actually lies with incorrect application of basic probability principles

  18. Temperature control characteristics analysis of lead-cooled fast reactor with natural circulation

    International Nuclear Information System (INIS)

    Yang, Minghan; Song, Yong; Wang, Jianye; Xu, Peng; Zhang, Guangyu

    2016-01-01

    Highlights: • The LFR temperature control system is analyzed with the frequency domain method. • The temperature control compensator is designed according to the frequency analysis. • Dynamic simulation is performed by SIMULINK and RELAP5-HD. - Abstract: The Lead-cooled Fast Reactor (LFR) with natural circulation in the primary system is among the highlights of advanced nuclear reactor research, due to its great superiority in reactor safety and reliability. In this work, a transfer function matrix describing the coolant temperature dynamic process, obtained by Laplace transform of the one-dimensional system dynamic model, is developed in order to investigate the temperature control characteristics of the LFR. Based on the transfer function matrix, a closed-loop coolant temperature control system without a compensator is built. The frequency domain analysis indicates that the stability and steady-state performance of the temperature control system need to be improved. Accordingly, a temperature compensator based on proportional-integral (PI) control and feed-forward is designed. The dynamic simulation of the whole system with the temperature compensator for a core power step change is performed with SIMULINK and RELAP5-HD. The result shows that the temperature compensator provides superior coolant temperature control capability in an LFR with natural circulation, demonstrating the efficiency of the frequency domain analysis method.
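The frequency-domain stability check described above can be sketched on a toy system. The first-order plant model and all gains below are assumptions for illustration, not the paper's LFR transfer function matrix.

```python
import numpy as np

# A minimal sketch (not the paper's RELAP5/SIMULINK model): a first-order
# coolant-temperature plant G(s) = K/(tau*s + 1) under a PI compensator
# C(s) = Kp + Ki/s.  All numbers are hypothetical.
K, tau = 2.0, 30.0      # plant gain and time constant (assumed)
Kp, Ki = 5.0, 0.5       # PI gains (assumed)

# Closed-loop characteristic polynomial of C(s)G(s)/(1 + C(s)G(s)):
#   s*(tau*s + 1) + K*(Kp*s + Ki) = tau*s^2 + (1 + K*Kp)*s + K*Ki
char_poly = [tau, 1.0 + K * Kp, K * Ki]
poles = np.roots(char_poly)

# Stability in the frequency domain: all closed-loop poles must lie in
# the left half-plane.
stable = np.all(poles.real < 0)
```

For a second-order characteristic polynomial, all-positive coefficients already guarantee left-half-plane poles, which is why any positive PI gains stabilize this particular toy plant.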

  19. Slurry analysis after lead collection on a sorbent and its determination by electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Baysal, Asli; Tokman, Nilgun [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469 Maslak-Istanbul (Turkey); Akman, Suleyman [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469 Maslak-Istanbul (Turkey)], E-mail: akmans@itu.edu.tr; Ozeroglu, Cemal [Istanbul University, Department of Chemistry, Faculty of Engineering, 34320 Avcilar-Istanbul (Turkey)

    2008-02-11

    In this study, in order to eliminate the drawbacks of the elution step and to reach higher enrichment factors, a novel preconcentration/separation technique for the slurry analysis of sorbent loaded with lead prior to its determination by electrothermal atomic absorption spectrometry was described. For this purpose, lead was first collected on ethylene glycol dimethacrylate methacrylic acid copolymer (EGDMA-MA) treated with ammonium pyrrolidine dithiocarbamate (APDC) by a conventional batch technique. After separation of the liquid phase, a slurry of the sorbent was prepared and directly pipetted into the graphite furnace of the atomic absorption spectrophotometer. Optimum conditions for quantitative sorption and preparation of the slurry were investigated. A 100-fold enrichment factor could easily be reached. The analyte element in certified sea-water and Bovine-liver samples was determined within the 95% confidence level. The proposed technique was fast and simple, and the risks of contamination and analyte loss were low. The detection limit (3δ) for Pb was 1.67 μg l⁻¹.

  20. Atmospheric deposition study in the area of Kardzhali lead-zinc plant based on moss analysis

    International Nuclear Information System (INIS)

    Hristozova, G.; Marinova, S.; Strelkova, L.P.; Goryajnova, Z.; Frontas'eva, M.V.; Stafilov, T.

    2015-01-01

    For the first time the moss biomonitoring technique was used to assess the environmental situation in the area affected by the lead-zinc plant as one of the most hazardous enterprises in Bulgaria. 77 Hypnum cupressiforme moss samples were collected in the Kardzhali municipality in the summer and autumn of 2011. The concentrations of a total of 47 elements were determined by means of instrumental epithermal neutron activation analysis (ENAA), atomic absorption spectrometry (AAS) and inductively coupled plasma-atomic emission spectrometry (ICP-AES). Multivariate statistics was applied to characterize the sources of elements detected in the samples. Four groups of elements were found. In comparison to the data averaged for the area outside of the town, the atmospheric deposition loads for the elements of industrial origin in Kardzhali, where the smelter chimney is located, were found to be much higher. Median levels of the measured concentrations of the most toxic metals (Pb, Zn, Cd, As, Cu, In, Sb) were extremely high in this hot spot when compared to the median Bulgarian cross-country data from the 2010-2011 European moss survey. GIS technology was used to produce element distribution maps illustrating deposition patterns of element pollutants in the study area. The results obtained contribute to the Bulgarian environmental research used to study and control the manufacturing processes of the lead-zinc plant in the town of Kardzhali.

  1. Modeling the effect of levothyroxine therapy on bone mass density in postmenopausal women: a different approach leads to new inference

    Science.gov (United States)

    Mohammadi, Babak; Haghpanah, Vahid; Tavangar, Seyed Mohammad; Larijani, Bagher

    2007-01-01

    Background The diagnosis, treatment and prevention of osteoporosis is a national health emergency. Osteoporosis quietly progresses without symptoms until late-stage complications occur. Older patients are more commonly at risk of fractures due to osteoporosis. The fracture risk increases when suppressive doses of levothyroxine are administered, especially in postmenopausal women. The question is: "When should bone mass density be tested in postmenopausal women after the initiation of suppressive levothyroxine therapy?" Standard guidelines for the prevention of osteoporosis suggest that follow-up be done in 1 to 2 years. We were interested in predicting the level of bone mass density in postmenopausal women after the initiation of suppressive levothyroxine therapy with a novel approach. Methods The study used data from the literature on the influence of exogenous thyroid hormones on bone mass density. Four cubic polynomial equations were obtained by curve fitting for Ward's triangle, trochanter, spine and femoral neck. The behaviors of the models were investigated by statistical and mathematical analyses. Results There are four points of inflexion on the graphs of the first derivatives of the equations with respect to time, at about 6, 5, 7 and 5 months. In other words, there is a maximum speed of bone loss around the 6th month after the start of suppressive L-thyroxine therapy in postmenopausal women. Conclusion It seems reasonable to check bone mass density at the 6th month of therapy. More research is needed to explain the cause and to confirm the clinical application of this phenomenon for osteoporosis, but such an approach can be used as a guide to future experimentation. The investigation of change over time may lead to more sophisticated decision making in a wide variety of clinical problems. PMID:17559682

  2. Modeling the effect of levothyroxine therapy on bone mass density in postmenopausal women: a different approach leads to new inference

    Directory of Open Access Journals (Sweden)

    Tavangar Seyed

    2007-06-01

    Full Text Available Abstract Background The diagnosis, treatment and prevention of osteoporosis is a national health emergency. Osteoporosis quietly progresses without symptoms until late-stage complications occur. Older patients are more commonly at risk of fractures due to osteoporosis. The fracture risk increases when suppressive doses of levothyroxine are administered, especially in postmenopausal women. The question is: "When should bone mass density be tested in postmenopausal women after the initiation of suppressive levothyroxine therapy?" Standard guidelines for the prevention of osteoporosis suggest that follow-up be done in 1 to 2 years. We were interested in predicting the level of bone mass density in postmenopausal women after the initiation of suppressive levothyroxine therapy with a novel approach. Methods The study used data from the literature on the influence of exogenous thyroid hormones on bone mass density. Four cubic polynomial equations were obtained by curve fitting for Ward's triangle, trochanter, spine and femoral neck. The behaviors of the models were investigated by statistical and mathematical analyses. Results There are four points of inflexion on the graphs of the first derivatives of the equations with respect to time, at about 6, 5, 7 and 5 months. In other words, there is a maximum speed of bone loss around the 6th month after the start of suppressive L-thyroxine therapy in postmenopausal women. Conclusion It seems reasonable to check bone mass density at the 6th month of therapy. More research is needed to explain the cause and to confirm the clinical application of this phenomenon for osteoporosis, but such an approach can be used as a guide to future experimentation. The investigation of change over time may lead to more sophisticated decision making in a wide variety of clinical problems.
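The curve-fitting procedure behind these two records can be sketched numerically: fit a cubic polynomial to bone-mass-density measurements over time, then solve for the time at which the second derivative vanishes, i.e. where the loss rate peaks. The data below are synthetic, constructed so that the fastest decline falls at month 6; they are not the study's data.

```python
import numpy as np

# Synthetic BMD curve (g/cm^2) with its fastest decline built in at month 6.
months = np.linspace(0.0, 12.0, 13)
bmd = 1.0 - 0.02 * months + 0.0005 * (months - 6.0) ** 3

# Fit BMD(t) = a t^3 + b t^2 + c t + d, then find where BMD''(t) = 6 a t + 2 b = 0.
a, b, c, d = np.polyfit(months, bmd, 3)
t_max_loss = -b / (3.0 * a)

print(round(t_max_loss, 1))   # -> 6.0
```

The inflection of the fitted cubic is exactly the point where the first derivative (the loss rate) is extremal, which is what motivates checking bone mass density around that month.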

  3. Environmental monitoring near urban lead refineries by photon and neutron activation analysis

    International Nuclear Information System (INIS)

    Paciga, J.J.; Chattopadhyay, A.; Jervis, R.E.

    1974-01-01

    Photon activation has been used in conjunction with neutron activation for multielement determinations in airborne particulates, soil, and hair samples collected near two secondary lead refineries in Metropolitan Toronto. Particle size distributions of suspended particulates collected with a high volume Andersen sampler are reported for Al, Sb, As, Br, Cl, Mn, Na, Pb, Ti and V. Increases in the concentrations of Pb, As and Sb associated with particles >3.3 μm in diameter on certain days near the refineries have resulted in localized contamination, as reflected in higher concentrations of these elements in soil. To assess Pb accumulation in local residents compared with control groups, approximately 250 hair samples were analyzed for Pb by photon activation analysis. Children living close to the refineries, especially boys, exhibit the most elevated levels: up to 20 times urban control values in some cases

  4. Representation of autism in leading newspapers in china: a content analysis.

    Science.gov (United States)

    Bie, Bijie; Tang, Lu

    2015-01-01

    The public's lack of understanding of, and misconceptions about, autism in China contribute to the underdiagnosis and undertreatment of the disorder and to the stigma associated with it. Mass media are the primary channel through which people learn about autism. This article examines how leading newspapers in China covered autism in the 10-year period of 2003 through 2012 through a framing analysis. It finds that while autism has received increased media attention, it is increasingly framed as a family problem: family members are cited or quoted more than any other sources, and the responsibility of dealing with autism is ultimately assigned to families. Autistic people are largely silenced unless they are autistic savants with special talents. The use of the scientific discourse and the human-interest discourse both decrease over time as a share of coverage, while other discourses, such as the public relations discourse, become more dominant.

  5. Analysis of spent fuel assay with a lead slowing down spectrometer

    International Nuclear Information System (INIS)

    Gavron, A.; Smith, L. Eric; Ressler, Jennifer J.

    2009-01-01

    Assay of the fissile materials in spent fuel that are produced or depleted during the operation of a reactor is of paramount importance to nuclear materials accounting, verification of the reactor operation history, and criticality considerations for storage. In order to prevent future proliferation following the spread of nuclear energy, we must develop accurate methods to assay large quantities of nuclear fuels. We analyze the potential of using a Lead Slowing Down Spectrometer for assaying spent fuel. We conclude that it is possible to design a system that will provide around 1% statistical precision in the determination of the 239Pu, 241Pu and 235U concentrations in a PWR spent-fuel assembly, for intermediate-to-high burnup levels, using commercial neutron sources and a system of 238U threshold fission detectors. Pending further analysis of systematic errors, it is possible that missing pins can be detected, as can asymmetry in the fuel bundle. (author)

  6. A Conceptual Approach to Recreation Habitat Analysis

    National Research Council Canada - National Science Library

    Hamilton, H. R

    1996-01-01

    .... The Habitat Evaluation Procedures (HEP) is a commonly used technique for assessing human impacts on the vigor of wildlife species, and serves as the model for the Recreation Habitat Analysis Method (RHAM...

  7. A taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)
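The hazard-function idea in this record can be illustrated with a small sketch. This is not the authors' Taylor-series formulation; it only shows, under assumed numbers, how a hazard expanded as a truncated polynomial in dose nests linear and quadratic dose responses, and how survival follows from the hazard.

```python
import numpy as np

# Hazard expanded as a truncated Taylor series in dose d (coefficients are
# hypothetical): h(d) = h0 * (1 + b1*d + b2*d^2).  "Linear" and "quadratic"
# dose responses are nested hypotheses (b2 = 0, or b1 = b2 = 0).
def hazard(d, h0=0.01, b1=0.5, b2=0.1):
    return h0 * (1.0 + b1 * d + b2 * d ** 2)

# With a hazard constant in time, survival is S(t; d) = exp(-h(d) * t).
def survival(t, d):
    return np.exp(-hazard(d) * t)

# Higher dose -> higher hazard -> lower survival at any fixed time.
s_low = survival(10.0, 0.0)
s_high = survival(10.0, 2.0)
```

Hypothesis tests of model adequacy then amount to asking whether higher-order Taylor coefficients such as b2 are statistically distinguishable from zero.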

  8. ADVANCEMENTS IN TIME-SPECTRA ANALYSIS METHODS FOR LEAD SLOWING-DOWN SPECTROSCOPY

    International Nuclear Information System (INIS)

    Smith, Leon E.; Anderson, Kevin K.; Gesh, Christopher J.; Shaver, Mark W.

    2010-01-01

    Direct measurement of Pu in spent nuclear fuel remains a key challenge for safeguarding the nuclear fuel cycles of today and tomorrow. Lead slowing-down spectroscopy (LSDS) is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic mass with an uncertainty lower than the approximately 10 percent typical of today's confirmatory assay methods. Pacific Northwest National Laboratory's (PNNL) previous work to assess the viability of LSDS for the assay of pressurized water reactor (PWR) assemblies indicated that the method could provide direct assay of Pu-239 and U-235 (and possibly Pu-240 and Pu-241) with uncertainties less than a few percent, assuming suitably efficient instrumentation, an intense pulsed neutron source, and improvements in the time-spectra analysis methods used to extract isotopic information from a complex LSDS signal. This previous simulation-based evaluation used relatively simple PWR fuel assembly definitions (e.g. constant burnup across the assembly) and a constant initial enrichment and cooling time. The time-spectra analysis method was founded on a preliminary analytical model of self-shielding intended to correct for assay-signal nonlinearities introduced by attenuation of the interrogating neutron flux within the assembly.

  9. Plant gravitropic signal transduction: A network analysis leads to gene discovery

    Science.gov (United States)

    Wyatt, Sarah

    Gravity plays a fundamental role in plant growth and development. Although a significant body of research has helped define the events of gravity perception, the role of the plant growth regulator auxin, and the mechanisms resulting in the gravity response, the events of signal transduction, those that link the biophysical action of perception to a biochemical signal that results in auxin redistribution and those that regulate the gravitropic effects on plant growth, remain, for the most part, a "black box." Using a cold effect, dubbed the gravity persistent signal (GPS) response, we developed a mutant screen to specifically identify components of the signal transduction pathway. Cloning of the GPS genes has identified new proteins involved in gravitropic signaling. We have further exploited the GPS response using a multi-faceted approach including gene expression microarrays, proteomics analysis, bioinformatics analysis, and continued mutant analysis to identify additional genes and physiological and biochemical processes. Gene expression data provided the foundation of a regulatory network for gravitropic signaling. Based on these gene expression data and related data sets/information from the literature and repositories, we constructed a gravitropic signaling network for Arabidopsis inflorescence stems. To generate the network, both a dynamic Bayesian network approach and a time-lagged correlation coefficient approach were used. The dynamic Bayesian network added existing information on protein-protein interactions, while the time-lagged correlation coefficient allowed incorporation of temporal regulation and thus could incorporate the time-course metric from the data set. Thus the methods complemented each other and provided us with a more comprehensive evaluation of connections. Each method generated a list of possible interactions associated with a statistical significance value. The two networks were then overlaid to generate a more rigorous, intersected
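The time-lagged correlation step described in this record can be sketched in a few lines: correlate one gene's expression series with another's shifted by a lag, and take the best-scoring lag as a hint of temporally ordered regulation. The gene series below are synthetic, not the Arabidopsis microarray data.

```python
import numpy as np

def lagged_corr(a, b, lag):
    """Pearson correlation between a[t] and b[t + lag] (lag >= 0)."""
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(0)
t = np.arange(40)
gene_a = np.sin(t / 4.0) + 0.05 * rng.standard_normal(t.size)
gene_b = np.roll(gene_a, 2)       # B tracks A with a 2-step delay (synthetic)
gene_b[:2] = gene_a[:2]           # patch the wrap-around from np.roll

# The lag with the highest correlation suggests the A -> B ordering.
best_lag = max(range(0, 6), key=lambda k: lagged_corr(gene_a, gene_b, k))
```

Scanning all gene pairs this way, keeping only lags whose correlation passes a significance cutoff, yields the kind of directed edge list that can then be intersected with a dynamic Bayesian network.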

  10. Preparing a Safety Analysis Report using the building block approach

    International Nuclear Information System (INIS)

    Herrington, C.C.

    1990-01-01

    The credibility of the applicant in a licensing proceeding is severely impacted by the quality of the license application, particularly the Safety Analysis Report. To ensure the highest possible credibility, the building block approach was devised to support the development of a quality Safety Analysis Report. The approach incorporates a comprehensive planning scheme that logically ties together all levels of the investigation and provides the direction necessary to prepare a superior Safety Analysis Report

  11. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Hyeongyu Lee

    2016-03-01

    Full Text Available Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data; to PXRF analysis data; to both ICP–AES and transformed PXRF analysis data, considering the correlation between the ICP–AES and PXRF analysis data; and co-kriging to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP–AES and PXRF analysis data.
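The "transformed PXRF" input named in the abstract comes from regressing the lower-accuracy PXRF readings against co-located ICP-AES values and applying the fitted line to every PXRF sample. A minimal numpy sketch of that calibration step, on synthetic data (the bias, noise, and concentration range are assumptions, not the Busan measurements):

```python
import numpy as np

rng = np.random.default_rng(1)
icp = rng.uniform(50.0, 500.0, 30)                    # ICP-AES Cu, mg/kg (synthetic)
pxrf = 0.8 * icp + 20.0 + rng.normal(0.0, 5.0, 30)    # biased, noisier PXRF readings

# Fit icp ~ slope * pxrf + intercept by ordinary least squares.
A = np.vstack([pxrf, np.ones_like(pxrf)]).T
slope, intercept = np.linalg.lstsq(A, icp, rcond=None)[0]

# Calibrated ("transformed") PXRF values, now on the ICP-AES scale.
pxrf_transformed = slope * pxrf + intercept
```

The transformed values can then be pooled with the ICP-AES data as input to ordinary kriging (e.g. with a geostatistics package such as PyKrige), which is the approach the study found most accurate.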

  12. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP-AES and Portable XRF Instruments: A Comparative Study.

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-03-30

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP-AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP-AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP-AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP-AES analysis data; to PXRF analysis data; to both ICP-AES and transformed PXRF analysis data, considering the correlation between the ICP-AES and PXRF analysis data; and co-kriging to both the ICP-AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP-AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP-AES and PXRF analysis data.

  13. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-01-01

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data; to PXRF analysis data; to both ICP–AES and transformed PXRF analysis data, considering the correlation between the ICP–AES and PXRF analysis data; and co-kriging to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP–AES and PXRF analysis data. PMID:27043594

  14. Tanzania: A Hierarchical Cluster Analysis Approach | Ngaruko ...

    African Journals Online (AJOL)

    Using survey data from Kibondo district, west Tanzania, we use hierarchical cluster analysis to classify borrower farmers according to their borrowing behaviour into four distinctive clusters. The appreciation of the existence of heterogeneous farmer clusters is vital in forging credit delivery policies that are not only ...
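The clustering step this record describes, hierarchical cluster analysis cut at four clusters, can be sketched with SciPy. The borrowing-behaviour features below are synthetic stand-ins, not the Kibondo survey data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic two-feature "borrowing behaviour" data: four well-separated
# groups of 20 farmers each (an illustrative assumption).
rng = np.random.default_rng(0)
centers = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], dtype=float)
features = np.vstack([c + 0.3 * rng.standard_normal((20, 2)) for c in centers])

# Agglomerative clustering with Ward linkage, then cut the dendrogram
# so that at most four clusters remain.
Z = linkage(features, method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")
n_clusters = len(set(labels))
```

Cutting the same dendrogram at different heights is what lets an analyst check whether four clusters is genuinely the natural grouping or an imposed one.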

  15. The fuzzy approach to statistical analysis

    NARCIS (Netherlands)

    Coppi, Renato; Gil, Maria A.; Kiers, Henk A. L.

    2006-01-01

    For the last decades, research studies have been developed in which a coalition of Fuzzy Sets Theory and Statistics has been established with different purposes. These namely are: (i) to introduce new data analysis problems in which the objective involves either fuzzy relationships or fuzzy terms;

  16. Concept Analysis of Spirituality: An Evolutionary Approach.

    Science.gov (United States)

    Weathers, Elizabeth; McCarthy, Geraldine; Coffey, Alice

    2016-04-01

    The aim of this article is to clarify the concept of spirituality for future nursing research. Previous concept analyses of spirituality have mostly reviewed the conceptual literature with little consideration of the empirical literature. The literature reviewed in prior concept analyses extends from 1972 to 2005, with no analysis conducted in the past 9 years. Rodgers' evolutionary framework was used to review both the theoretical and empirical literature pertaining to spirituality. Evolutionary concept analysis is a formal method of philosophical inquiry, in which papers are analyzed to identify attributes, antecedents, and consequences of the concept. Empirical and conceptual literature. Three defining attributes of spirituality were identified: connectedness, transcendence, and meaning in life. A conceptual definition of spirituality was proposed based on the findings. Also, four antecedents and five primary consequences of spirituality were identified. Spirituality is a complex concept. This concept analysis adds some clarification by proposing a definition of spirituality that is underpinned by both conceptual and empirical research. Furthermore, exemplars of spirituality, based on prior qualitative research, are presented to support the findings. Hence, the findings of this analysis could guide future nursing research on spirituality. © 2015 Wiley Periodicals, Inc.

  17. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

With the rising application of systems biology, sensitivity analysis methods have been widely applied to the study of biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights into how robust biological responses are with respect to changes in biological parameters, and into which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations in the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models, and the caveats in the interpretation of sensitivity analysis results.
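A minimal sketch of the local sensitivity analysis described above, using a toy production/degradation model rather than any model from the review; the normalized sensitivity S_p = (p/y)·(dy/dp) is approximated by forward finite differences:

```python
# Illustrative local sensitivity analysis (toy model, not from the review):
# perturb each parameter by a small relative step and record the
# normalized change in a model output.

def model_output(params):
    """Toy model: steady-state output of a simple production/degradation
    system, x_ss = k_syn / k_deg (a stand-in for a systems-biology model)."""
    return params["k_syn"] / params["k_deg"]

def local_sensitivities(params, rel_step=1e-4):
    """Normalized local sensitivity S_p = (p / y) * dy/dp via forward
    finite differences."""
    y0 = model_output(params)
    sens = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1 + rel_step)
        dy = model_output(perturbed) - y0
        sens[name] = (value / y0) * dy / (value * rel_step)
    return sens

params = {"k_syn": 2.0, "k_deg": 0.5}
print(local_sensitivities(params))  # k_syn ≈ +1, k_deg ≈ -1
```

For this toy model the sensitivities are known analytically (+1 and -1), which makes it easy to check the finite-difference approximation before applying the same recipe to a larger model.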

  18. Rapid lead isotope analysis of archaeological metals by multiple-collector inductively coupled plasma mass spectrometry

    DEFF Research Database (Denmark)

    Baker, J.A.; Stos, S.; Waight, Tod Earle

    2006-01-01

    Lead isotope ratios in archaeological silver and copper were determined by MC-ICPMS using laser ablation and bulk dissolution without lead purification. Laser ablation results on high-lead metals and bulk solution analyses on all samples agree within error of TIMS data, suggesting that problems...

  19. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultu…

  20. An artificial intelligence approach towards disturbance analysis

    International Nuclear Information System (INIS)

    Fiedler, U.; Lindner, A.; Baldeweg, F.; Klebau, J.

    1986-01-01

The scale and degree of sophistication of technological plants, e.g. nuclear power plants, have increased substantially during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in emergencies, the operator needs more advanced assistance in carrying out diagnosis and therapy control. The significance of introducing artificial intelligence (AI) methods in nuclear power technology is emphasized. The main features of the on-line disturbance analysis system SAAP-2, which is being developed for application to nuclear power plants, are reported. Problems related to man-machine communication are treated in more detail, because their solution will considerably influence end-user acceptance. (author)

  1. A cognitive-axiological approach to the chairman’s letter of the leading civil aircraft manufacturers

    Directory of Open Access Journals (Sweden)

    Mª Enriqueta Cortés de los Ríos

    2017-10-01

In this paper our aim is to look into the dominant axiological values in aeronautical discourse through the Chairman's letters included in the annual reports published in 2014 and 2015 by some leading civil transport aircraft manufacturers: Airbus, Boeing, Bombardier, Embraer and Textron Aviation. We analyse the positive qualities and the way in which they are introduced in this type of discourse through cognitive tools such as metaphor, metonymy and image schemas. This work is structured according to the Cognitive Theory of Metaphor and Metonymy (Lakoff & Johnson, 1980, 1999), extended to axiological semantics (Krzeszowski, 1997, 2004) and to specialized discourse (Adams & Cruz García, 2007; Cortés de los Ríos, 2010; Nicolae, 2010, among others). However, this specific approach to a technological manufacturing sector does not seem to have been carried out so far; consequently, this study tries both to bridge this gap and to show new evidence for the correct interpretation of this genre. The results show that the source domains regularly used to highlight the positive qualities of the new planes and their manufacturing companies are extremely diverse. Source domains such as LENS, JOURNEY, CONSTRUCTION, GAME and LIVING ORGANISM, among others, can be found in these corporate letters. However, only two metonymies (INSTITUTION FOR PEOPLE RESPONSIBLE and PART FOR THE WHOLE) are used to communicate positive attributes in this sector, and the number of relevant image schemas in the corpus (attribute, space, container and balance) is also limited, their occurrence occasional.

  2. Integrated micro-biochemical approach for phytoremediation of cadmium and lead contaminated soils using Gladiolus grandiflorus L cut flower.

    Science.gov (United States)

    Mani, Dinesh; Kumar, Chitranjan; Patel, Niraj Kumar

    2016-02-01

The potential of vermicompost, elemental sulphur, Thiobacillus thiooxidans and Pseudomonas putida for phytoremediation is well known individually, but their integrated application has not been explored so far. The present work considers these so-far-overlooked aspects by growing the ornamental plant Gladiolus grandiflorus L. in uncontaminated and sewage-contaminated soils (sulphur-deficient alluvial Entisols, pH 7.6-7.8) for phytoremediation of cadmium and lead in a pot experiment. Between vermicompost and elemental sulphur, the response of vermicompost was higher towards improvement of the biometric parameters of the plants, whereas the response of elemental sulphur was higher towards enhanced bioaccumulation of heavy metals in these soils. The integrated treatment (T7: vermicompost 6 g and elemental sulphur 0.5 g kg(-1) soil, with co-inoculation of the plant with T. thiooxidans and P. putida) was found superior in promoting root length, plant height and dry biomass of the plant. Treatment T7 caused enhanced accumulation of Cd up to 6.96 and 6.45 mg kg(-1) and Pb up to 22.6 and 19.9 mg kg(-1) in corm and shoot, respectively, in the contaminated soil. T7 showed a maximum remediation efficiency of 0.46% and 0.19%, a bioaccumulation factor of 2.92 and 1.21, and uptake of 6.75 and 21.4 mg kg(-1) dry biomass for Cd and Pb, respectively, in the contaminated soil. The integrated treatment T7 was significantly better than the individual treatments in promoting plant growth and enhancing phytoremediation. Hence, the authors conclude that vermicompost, elemental sulphur and microbial co-inoculation should be integrated for the enhanced clean-up of Cd- and Pb-contaminated soils. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Bioremoval of lead using Pennisetum purpureum augmented with Enterobacter cloacae-VITPASJ1: A pot culture approach.

    Science.gov (United States)

    Das, Anamika; Belgaonkar, Priyanka; Raman, Aditya S; Banu, Sofia; Osborne, Jabez W

    2017-06-01

    Lead is a toxic heavy metal discharged into the ecosystem from various industries. Biological remediation strategies have been effective in the bioremoval of lead. In our current study, a phytobacterial system using Pennisetum purpureum along with lead-resistant bacterium (LRB) was employed for the uptake of lead. The LRB was obtained from lead-contaminated sites. The isolate VITPASJ1 was found to be highly tolerant to lead and was identified as an effective plant growth-promoting bacterium. The 16S rRNA sequencing revealed VITPASJ1 to be the closest neighbour of Enterobacter cloacae. The lead-resistant gene pbrA in the plant and the bacterium were amplified using a specific primer. The uptake of lead was studied by phytoremediation and rhizoremediation set-ups where the soil was supplemented with various concentrations of lead (50, 100, 150 mg/kg). The plants were uprooted at regular intervals, and the translocation of lead into the plant was determined by atomic absorption spectroscopy. The root length, shoot height and chlorophyll content were found to be higher in the rhizoremediation set-up when compared to the phytoremediation set-up. The scanning electron microscopic micrographs gave a clear picture of increased tissue damage in the root and shoot of the phytoremediation set-up as compared to the rhizoremediation set-up with LRB.

  4. Energy policy and externalities: the life cycle analysis approach

    International Nuclear Information System (INIS)

    Virdis, M.R.

    2002-01-01

In the energy sector, getting the prices right is a prerequisite for market mechanisms to work effectively towards sustainable development. However, energy production and use create 'costs' external to traditional accounting practices, such as damages to human health and the environment resulting from residual emissions, or risks associated with dependence on foreign suppliers. Energy market prices do not fully reflect those external costs. For example, the costs of climate change are not internalized, and therefore consumers do not receive the price signals that would lead them to make choices that are optimised from a societal viewpoint. Economic theory has developed approaches to assessing and internalizing external costs that can be applied to the energy sector and, in principle, provide means to quantify and integrate relevant information in a comprehensive framework. The tools developed for addressing these issues are generally aimed at monetary valuation of impacts and damages and at integration of the valued 'external costs' into the total cost of the product, e.g. electricity. The approach of Life Cycle Analysis (LCA) provides a conceptual framework for a detailed and comprehensive comparative evaluation of energy supply options. This paper offers a summary of the LCA methodology and an overview of some of its limitations. It then illustrates, through a few examples, how the methodology can be used to inform or correct policy making and to orient investment decisions. Difficulties and issues emerging at various stages in the application and use of LCA results are discussed, although in such a short note it is impossible to address all issues related to LCA. Therefore, as part of the concluding section, some issues are left open, and areas in which further analytical work may be needed are described. (author)

  5. Collaborative Approach to Network Behavior Analysis

    Science.gov (United States)

    Rehak, Martin; Pechoucek, Michal; Grill, Martin; Bartos, Karel; Celeda, Pavel; Krmicek, Vojtech

Network Behavior Analysis techniques are designed to detect intrusions and other undesirable behavior in computer networks by analyzing traffic statistics. We present an efficient framework for the integration of anomaly detection algorithms working on identical input data. This framework is based on a high-speed network traffic acquisition subsystem and on trust modeling, a well-established set of techniques from the multi-agent systems field. Trust-based integration of algorithms results in classification with a lower error rate, especially in terms of false positives. The presented framework is suitable for both online and offline processing, and introduces a relatively low computational overhead compared to the deployment of isolated anomaly detection algorithms.
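The general idea of trust-weighted fusion of detector outputs can be sketched as follows. This is a hypothetical toy illustration, not the paper's actual framework: each detector's "trust" is nudged toward its agreement with the weighted consensus, so a detector that repeatedly fires alone loses influence:

```python
# Illustrative sketch (not the paper's framework): fuse anomaly scores
# from several detectors, weighting each detector by a "trust" value
# updated from its agreement with the weighted consensus.

def fuse(scores, trust):
    """Trust-weighted average of per-detector anomaly scores in [0, 1]."""
    total = sum(trust.values())
    return sum(trust[d] * s for d, s in scores.items()) / total

def update_trust(scores, trust, lr=0.1):
    """Move each detector's trust toward 1 minus its absolute deviation
    from the consensus score."""
    consensus = fuse(scores, trust)
    for d, s in scores.items():
        agreement = 1.0 - abs(s - consensus)
        trust[d] += lr * (agreement - trust[d])
    return consensus

trust = {"volume": 1.0, "entropy": 1.0, "flows": 1.0}
# The "flows" detector keeps firing alone (a likely false positive).
for _ in range(50):
    update_trust({"volume": 0.1, "entropy": 0.2, "flows": 0.9}, trust)
print(trust["flows"] < trust["volume"])  # the outlier detector loses trust
```

Down-weighting the persistently disagreeing detector is one simple way a trust model can lower the false-positive rate of the fused classification.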

  6. Proteomic analysis of the metabolic adaptation of the biocontrol agent Pseudozyma flocculosa leading to glycolipid production

    Directory of Open Access Journals (Sweden)

    Bélanger Richard R

    2010-02-01

The yeast-like epiphytic fungus Pseudozyma flocculosa is known to antagonize powdery mildew fungi through proliferation on their colonies, presumably preceded by the release of an antifungal glycolipid (flocculosin). In culture, P. flocculosa can be induced to produce flocculosin or not through manipulation of the culture medium nutrients. In order to characterize and understand the metabolic changes in P. flocculosa linked to glycolipid production, we conducted a 2-DE proteomic analysis and compared the proteomic profile of P. flocculosa growing under conditions favouring the development of the fungus (control) or conducive to flocculosin synthesis (stress). A large number of protein spots (771) were detected in protein extracts of the control treatment compared to only 435 matched protein spots in extracts of the stress cultures, which clearly suggests an important metabolic reorganization in slow-growing cells producing flocculosin. From the latter treatment, we were able to identify 21 protein spots that were either specific to the treatment or significantly up-regulated (2-fold increase). All of them were identified based on similarity between predicted ORFs of the newly sequenced genome of P. flocculosa and the available annotated sequences of Ustilago maydis. These proteins were associated with carbon and fatty acid metabolism, and also with the filamentous change of the fungus leading to flocculosin production. This first look into the proteome of P. flocculosa suggests that flocculosin synthesis is elicited in response to specific stress or limiting conditions.

  7. Lead-Bismuth Eutectic cooled experimental Accelerator Driven System. Windowless target unit thermal-hydraulic analysis

    International Nuclear Information System (INIS)

    Bianchi, F.; Ferri, R.; Moreau, V.

    2004-01-01

A main concern related to the peaceful use of nuclear energy is the safe management of nuclear waste, with particular attention to long-lived fission products. Increasing attention has recently been given to transmutation systems (Accelerator Driven Systems: ADS) able to 'burn' the actinides and some of the long-lived fission products (High-Level Waste: HLW), transforming them into short- or medium-lived wastes that can be managed and stored more easily in a geological disposal, with consequently easier acceptance by the population. An ADS consists of a subcritical core coupled with an accelerator by means of a target. This paper deals with the thermal-hydraulic analysis, performed with the STAR-CD and RELAP5 codes, of the windowless target unit of the Lead-Bismuth Eutectic (LBE) cooled experimental ADS (XADS), both to assess its behaviour during operational and accident sequences and to provide input data for the thermal-mechanical analyses. It also describes the modifications implemented in the codes used for the assessment of this kind of plant. (author)

  8. Quantitative Analysis of Lead in Tea Samples by Laser-Induced Breakdown Spectroscopy

    Science.gov (United States)

    Wang, J.; Shi, M.; Zheng, P.; Xue, S.

    2017-03-01

Laser-induced breakdown spectroscopy (LIBS) is applied under ambient atmosphere to compare the quantitative analysis performance for the toxic heavy metal lead (Pb) in Pu'er tea leaves of three calibration methods: the external standard method, the internal standard method, and the multiple linear regression method. The Pb I line at 405.78 nm is chosen as the analytical spectral line for the calibration. The linear correlation coefficients (R2) of the predicted concentrations versus the standard reference concentrations determined by the three methods are 0.97916, 0.98462, and 0.99647, respectively. The multiple linear regression method gives the best performance with respect to the average relative error (ARE = 2.69%), maximum relative error (MRE = 4.94%), average relative standard deviation (ARSD = 9.69%) and maximum relative standard deviation (MRSD = 24.44%) of the predicted concentrations of Pb in eight samples, compared to the other two methods. It is shown that the multiple linear regression method is more accurate and stable in predicting concentrations of Pb in Pu'er tea leaf samples.
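A multiple-linear-regression calibration of the kind compared above can be sketched as follows. The intensities and concentrations here are synthetic, not the paper's LIBS measurements, and NumPy is assumed available:

```python
# Illustrative multiple-linear-regression calibration (synthetic data):
# predict analyte concentration from several spectral line intensities
# by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])  # reference concentrations (a.u.)
# Synthetic intensities of two emission lines responding linearly to
# concentration, with small measurement noise.
X = np.column_stack([2.0 * conc, 0.5 * conc]) + rng.normal(scale=0.05, size=(5, 2))
A = np.column_stack([X, np.ones(len(conc))])  # design matrix with intercept
coefs, *_ = np.linalg.lstsq(A, conc, rcond=None)
pred = A @ coefs
r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
print(round(float(r2), 4))
```

With a new sample, `np.array([i1, i2, 1.0]) @ coefs` would give the predicted concentration; in practice the R2 reported in the abstract plays exactly this role of a goodness-of-calibration measure.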

  9. The mobility of Atlantic baric depressions leading to intense precipitation over Italy: a preliminary statistical analysis

    Directory of Open Access Journals (Sweden)

    N. Tartaglione

    2006-01-01

The speed of Atlantic surface depressions that occurred during the autumn and winter seasons and led to intense precipitation over Italy from 1951 to 2000 was investigated. Italy was divided into 5 regions, as documented in previous climatological studies (based on Principal Component Analysis). Intense precipitation events were selected on the basis of in situ rain gauge data and clustered according to the region that they hit. For each intense precipitation event we tried to identify an associated surface depression and we tracked it, within a large domain covering the Mediterranean and Atlantic regions, from its formation to cyclolysis in order to estimate its speed. Depression speeds were estimated with 6-h resolution and clustered into slow and non-slow classes by means of a threshold coinciding with the first quartile of the speed distribution, and depression centre speeds were associated with their positions. Slow speeds occurring over an area including Italy and the western Mediterranean basin showed frequencies higher than 25% for all the Italian regions but one. The probability of obtaining by chance the observed success rate of more than 25% was estimated by means of a binomial distribution. The statistical reliability of the result is confirmed for only one region. For Italy as a whole, results were confirmed at the 95% confidence level. The stability of the statistical inference with respect to errors in estimating depression speed and changes in the threshold for slow depressions was analysed and essentially confirmed the previous results.
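The binomial significance check described above can be sketched as follows; the event counts are hypothetical, since the abstract does not report them:

```python
# Sketch of the significance check above (illustrative counts, not the
# study's actual numbers): probability of seeing at least k "slow"
# depressions out of n by chance, when the null slow rate is 0.25
# (the first quartile of the speed distribution).
from math import comb

def binom_tail(n, k, p=0.25):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# e.g. 12 slow events out of 30 tracked depressions (hypothetical):
print(round(binom_tail(30, 12), 4))
```

A tail probability below 0.05 would correspond to the 95% confidence level mentioned in the abstract.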

  10. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

This is the first book to revisit geotechnical site characterization from a probabilistic point of view and to provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements them in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and relieve them of the details of the reliability computation algorithms. Readers will find useful information enabling a non-specialist to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.

  11. Sediment Analysis Using a Structured Programming Approach

    Directory of Open Access Journals (Sweden)

    Daniela Arias-Madrid

    2012-12-01

This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material, which seeks to identify very quickly the main features of a sediment and thus classify it fast and efficiently. For this purpose, the weight of each particle-size class is entered into the program and, using the method of moments, which is based on four equations representing the mean, standard deviation, skewness and kurtosis, the attributes of the sample are found in a few seconds. With the program these calculations are performed effectively and more accurately, also providing explanations of the results for features such as grain size, sorting, symmetry and origin, which helps to improve the study of sediments and, in general, the study of sedimentary rocks.
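The method of moments named above can be sketched as follows; the size classes (phi-scale midpoints) and weights here are hypothetical:

```python
# Sketch of the method of moments for sediment analysis: mean, standard
# deviation, skewness and kurtosis computed from the weight retained in
# each grain-size class (phi units). Classes and weights are illustrative.

def weighted_moments(midpoints, weights):
    """Weight-based moments over grain-size class midpoints."""
    total = sum(weights)
    mean = sum(m * w for m, w in zip(midpoints, weights)) / total
    var = sum(w * (m - mean) ** 2 for m, w in zip(midpoints, weights)) / total
    sd = var ** 0.5
    skew = sum(w * (m - mean) ** 3 for m, w in zip(midpoints, weights)) / (total * sd ** 3)
    kurt = sum(w * (m - mean) ** 4 for m, w in zip(midpoints, weights)) / (total * sd ** 4)
    return mean, sd, skew, kurt

phi_mid = [-1.0, 0.0, 1.0, 2.0, 3.0]    # class midpoints (phi)
grams   = [5.0, 20.0, 40.0, 25.0, 10.0]  # weight retained per class (g)
mean, sd, skew, kurt = weighted_moments(phi_mid, grams)
print(round(mean, 3), round(sd, 3))
```

In sedimentology the resulting mean, sorting (sd), skewness and kurtosis are then mapped to verbal classes (e.g. "well sorted", "fine skewed"); that interpretive step is what the described program automates.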

  12. Towards a More Holistic Stakeholder Analysis Approach

    DEFF Research Database (Denmark)

    Sedereviciute, Kristina; Valentini, Chiara

    2011-01-01

The study stems from the need to assess stakeholder presence beyond dyadic ties and from the difficulty of finding stakeholders in new environments (social media), where connectivity and relationships play a key role. Consequently, the combination of the Stakeholder Salience Model (SSM) and social network analysis (SNA) is proposed as a more holistic solution for stakeholder identification, including stakeholders from social media. A process of finding "unknown" but important stakeholders from social media is identified, incorporating content search and the principles of SNA. Stakeholders from social media are identified based on the dimensions of connectivity and the content shared. Accordingly, the study introduces four groups of important actors from social media: unconcerned lurkers, unconcerned influencers, concerned lurkers and concerned influencers, and integrates them into the existing Stakeholder Salience Model.

  13. Nu-Way Snaps and Snap Leads: an Important Connection in the History of Behavior Analysis

    OpenAIRE

    Escobar, Rogelio; Lattal, Kennon A.

    2014-01-01

    Beginning in the early 1950s, the snap lead became an integral and ubiquitous component of the programming of electromechanical modules used in behavioral experiments. It was composed of a Nu-Way snap connector on either end of a colored electrical wire. Snap leads were used to connect the modules to one another, thereby creating the programs that controlled contingencies, arranged reinforcers, and recorded behavior in laboratory experiments. These snap leads populated operant conditioning la...

  14. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  15. A global optimization approach to multi-polarity sentiment analysis.

    Science.gov (United States)

    Li, Xinmiao; Li, Jing; Wu, Yukeng

    2015-01-01

    Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with a higher-explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement. The improvement of the two-polarity sentiment analysis was the smallest. We conclude that the PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. 
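The particle-swarm core of an approach like the one described above can be sketched as follows. This is a toy illustration, not the PSOGO-Senti system: the 2-D search space merely stands in for the (feature dimension, SVM parameter) combination being optimized, and the objective is a stand-in for classification error:

```python
# Minimal particle swarm optimization sketch (toy objective): the swarm
# searches a 2-D space standing in for the (feature dimension, SVM
# parameter) combination optimized in a PSO-based sentiment pipeline.
import random

random.seed(1)

def objective(x, y):
    # Toy stand-in for classification error; minimum at (3, -2).
    return (x - 3) ** 2 + (y + 2) ** 2

n, iters = 20, 100
pos = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(n)]
vel = [[0.0, 0.0] for _ in range(n)]
pbest = [p[:] for p in pos]                      # per-particle best positions
gbest = min(pbest, key=lambda p: objective(*p))[:]  # global best position

for _ in range(iters):
    for i in range(n):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]                      # inertia
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
            pos[i][d] += vel[i][d]
        if objective(*pos[i]) < objective(*pbest[i]):
            pbest[i] = pos[i][:]
            if objective(*pos[i]) < objective(*gbest):
                gbest = pos[i][:]

print([round(v, 2) for v in gbest])  # near [3, -2]
```

In the full approach each objective evaluation would train and score an IG+SVM classifier, which is why a global optimizer that needs no gradients is attractive.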

  16. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  17. A novel approach for endocardial resynchronization therapy: Initial experience with transapical implantation of the left ventricular lead

    NARCIS (Netherlands)

    I. Kassai (Imre); A. Mihalcz (Attila); C. Foldesi (Csaba); A. Kardos (Attila); T. Szili-Torok (Tamas)

    2009-01-01

Background: Coronary sinus lead placement for transvenous left ventricular (LV) pacing in cardiac resynchronization therapy (CRT) has a significant failure rate at implant and a considerable dislocation rate during follow-up. For these patients epicardial pacing lead implantation is the

  18. Fatigue in engineering structures. A three fold analysis approach

    International Nuclear Information System (INIS)

    Malik, Afzaal M.; Qureshi, Ejaz M.; Dar, Naeem Ullah; Khan, Iqbal

    2007-01-01

The integrity of most engineering structures is influenced by the presence of cracks or crack-like defects. These structures fail, even catastrophically, if a crack greater than a critically safe size exists. Although most optimally designed structures are initially free from critical cracks, sub-critical cracks can lead to failure under cyclic loading, called fatigue crack growth. It is nearly impractical to prevent sub-critical crack growth in engineering structures, particularly in crack-sensitive structures such as those in the nuclear, aerospace and aeronautical domains. However, it is essential to predict fatigue crack growth for these structures in order to preclude in-service failures causing loss of assets. The present research presents an automatic procedure for the prediction of fatigue crack growth in three-dimensional engineering structures and of the key data for fracture-mechanics-based design: the stress intensity factors. A three-fold analysis procedure is adopted to investigate the effects of repetitive (cyclic) loading on the fatigue life of different geometries of aluminium alloy 2219-O. The general-purpose finite element (FE) code ANSYS 8.0 is used to predict/estimate the fatigue life of the geometries. Computer codes utilizing the Green's function are developed to calculate the stress intensity factors. Another code, based on the superposition technique presented by Shivakumara and Foreman, is developed to calculate the fatigue crack growth rate and fatigue life (number of loading cycles) to validate the results, and finally full-scale laboratory tests are conducted for the comparison of the results. The close correlation shown between the results of the different techniques employed is a promising feature of the analysis approach for future work. (author)
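The kind of fatigue-life prediction described above can be sketched by integrating a Paris-type crack-growth law, da/dN = C·(ΔK)^m with ΔK = Y·ΔS·sqrt(π·a). The abstract does not give the specific growth law or constants used, so the material constants and geometry factor below are illustrative, not the paper's values for aluminium alloy 2219-O:

```python
# Hedged sketch of fatigue-life estimation via a Paris-law integration.
# C, m and Y are illustrative placeholders, not measured 2219-O values.
from math import pi, sqrt

def cycles_to_failure(a0, ac, dS, C=1e-11, m=3.0, Y=1.12, steps=100000):
    """Load cycles to grow a crack from a0 to ac (metres) under a stress
    range dS (MPa), integrated with a simple fixed-step scheme."""
    a, N = a0, 0.0
    da = (ac - a0) / steps
    while a < ac:
        dK = Y * dS * sqrt(pi * a)   # stress intensity factor range (MPa*sqrt(m))
        N += da / (C * dK ** m)      # cycles spent growing by da
        a += da
    return N

# crack growing from 1 mm to 25 mm under a 100 MPa stress range:
print(f"{cycles_to_failure(1e-3, 25e-3, 100.0):.3e}")
```

Note the strong load dependence: with m = 3, doubling the stress range divides the predicted life by a factor of 8, which is why accurate stress intensity factors are the key data for this kind of design.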

  19. Sensitivity based reduced approaches for structural reliability analysis

    Indian Academy of Sciences (India)

    the system parameters and the natural frequencies. For these reasons a scientific and systematic approach is required to predict the probability of failure of a structure at the design stage. Probabilistic structural reliability analysis is one such approach. This can be implemented in conjunction with the stochastic finite element ...

  20. Nonlinear chaos-dynamical approach to analysis of atmospheric ...

    Indian Academy of Sciences (India)

*Corresponding author. E-mail: glushkovav@gmail.com. We present the theoretical foundations of an effective universal complex chaos-dynamical approach to the analysis and prediction of atmospheric radon 222Rn concentration …

  1. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  2. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

Polishing is one of the most critical manufacturing processes in the production of a metal part because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and time and energy consumption. Two different research lines are being developed: prediction models of the final surface quality parameters, and an analysis of the amount of material removed depending on the polishing parameters to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot. A tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters. The model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.

  3. Frequency Approach to Analysis of ESD Pulse

    Science.gov (United States)

    Baran, Janusz; Sroka, Jan

The paper concerns the calibration of generators for simulation of the Human-Metal Electrostatic Discharge (ESD) according to the IEC 61000-4-2 standard. It is shown that analysis of the ESD pulse in the frequency domain can indicate whether omitting frequency considerations in the calibration of ESD guns is acceptable. The calibration setup consists of a target (current converter), an attenuator, a coaxial cable and a wideband, single-shot oscilloscope. It is much easier to use only a low-frequency model of such a measurement path than to consider a high-frequency model. If, however, a high-frequency treatment is indispensable, then a frequency-dependent transfer impedance of the measurement path and an approximation of the oscilloscope frequency response with an infinite impulse response discrete-time filter are adequate tools. Comparison of the power spectral densities (PSD) of theoretical pulses, measured pulses and the measurement path noise gives a good criterion for specifying the minimal bandwidth of a setup required for reliable calibration of a given ESD gun. This paper is a résumé of previous papers by the authors, in which these issues were presented in detail.

  4. System Issues Leading to "Found-on-Floor" Incidents: A Multi-Incident Analysis.

    Science.gov (United States)

    Shaw, James; Bastawrous, Marina; Burns, Susan; McKay, Sandra

    2016-11-02

    Although attention to patient safety issues in the home care setting is growing, few studies have highlighted health system-level concerns that contribute to patient safety incidents in the home. Found-on-floor (FOF) incidents are a key patient safety issue that is unique to the home care setting and highlights a number of opportunities for system-level improvements to drive enhanced patient safety. We completed a multi-incident analysis of FOF incidents documented in the electronic record system of a home health care agency in Toronto, Canada, over the course of 1 year, between January 2012 and February 2013. Length of stay (LOS) was identified as the cross-cutting theme, illustrating the following 3 key issues: (1) in the short LOS group, a lack of information continuity led to missed fall risk information by home care professionals; (2) in the medium LOS group, a lack of personal support worker/carer training in fall prevention led to inadequate fall prevention activity; and (3) in the long LOS group, a lack of accountability policy at a system level led to a lack of fall risk assessment follow-up. Our study suggests that considering LOS in the home care sector helps expose key system-level issues enabling safety incidents such as FOF to occur. Our multi-incident analysis identified a number of opportunities for system-level changes that might improve fall prevention practice and reduce the likelihood of FOF incidents in the home. Specifically, investment in electronic health records that are functional across the continuum of care, further research and understanding of the training and skills of personal support workers, and enhanced incentives or more punitive approaches (depending on the circumstances) to ensure accountability in home safety will strengthen the home care sector and help prevent FOF incidents among older people.

  5. Radiometric trace analysis quantitative paper chromatography of lead with phosphate-32P

    NARCIS (Netherlands)

    Erkelens, P.C. van

    1961-01-01

    A method is described for the selective determination of lead in paper chromatograms, down to 1 μg (standard deviation 11%). After development and drying, the lead spot is sprayed with a Na2H32PO4 solution and dried. Excess reagent and alkaline earth phosphates are eluted with a borax-oxalate buffer.

  6. Phasing out lead from gasoline in Pakistan: a benefit cost analysis

    International Nuclear Information System (INIS)

    Martin, R.P.; Zaman, Q.U.

    1999-01-01

    Medical research has established a clear link between elevated blood lead levels and adverse health effects in humans, including the retardation of neurological development, hypertension, and cardiovascular ailments. Due to this, a large number of countries now restrict the sale of leaded gasoline. In contrast, only highly leaded gasoline is readily available in Pakistan, resulting in serious health concerns in certain areas. This paper presents the findings of a study to evaluate consumers' perceived benefits and the actual costs of switching to unleaded gasoline in Pakistan. Policy implications are noted. The study indicates a concentration of adverse health effects in the major urban centers. Of special interest is the loss of approximately 2,5000 IQ points annually in Karachi and Lahore as a result of gasoline-linked lead exposure. Consumers' willingness to pay for the removal of lead from gasoline, as estimated using a contingent valuation technique, is shown to be positively related to both educational attainment and income. Once consumers are informed of the adverse health effects associated with lead exposure, their willingness to pay for a switch to unleaded gasoline far exceeds the costs incurred. This suggests that significant gains in social welfare may be obtained by phasing out lead from gasoline in Pakistan. The benefits are most pronounced in urban areas, while in rural villages and small cities the costs are likely to outweigh the benefits. A flexible program restricting the sale of leaded gasoline in urban areas is thus recommended. (author)

  7. Children’s Lead Exposure: A Multimedia Modeling Analysis to Guide Public Health Decision-Making

    Science.gov (United States)

    BACKGROUND: Drinking water and other sources for lead are the subject of public health concerns around the Flint, Michigan, drinking water and East Chicago, Indiana, lead in soil crises. In 2015, the U.S. Environmental Protection Agency (EPA)’s National Drinking Water Advis...

  8. Functional Equivalence of Autistic Leading and Communicative Pointing: Analysis and Treatment.

    Science.gov (United States)

    Carr, Edward G.; Kemp, Duane C.

    1989-01-01

    Autistic leading in four autistic children, aged three to five, was treated by strengthening pointing as an alternative form of request. Following intervention, pointing gradually replaced leading, and stimulus generalization was observed. Results indicate that functional equivalence and response efficiency can be procedurally combined to…

  9. An Exploration of Students' Motivation to Lead: An Analysis by Race, Gender, and Student Leadership Behaviors

    Science.gov (United States)

    Rosch, David M.; Collier, Daniel; Thompson, Sara E.

    2015-01-01

    This exploratory study examined the motivation to lead of a random sample of 1,338 undergraduate students to determine the degree to which motivation to lead can predict leadership behaviors. Results suggested that students' internal self-identity as a leader positively predicted behavior, while their "social normative" motivation to…

  10. Thermo-fluid dynamics and corrosion analysis of a self cooled lead lithium blanket for the HiPER reactor

    Science.gov (United States)

    Juárez, R.; Zanzi, C.; Hernández, J.; Sanz, J.

    2015-09-01

    The HiPER reactor is the HiPER project phase devoted to power production. To reach a preliminary reactor design, tritium breeding schemes need to be adapted to the HiPER project technology selections: direct drive ignition, 150 MJ/shot at 10 Hz of power released through fusion reactions, and the dry first wall scheme. In this paper we address the main challenge of the HiPER EUROFER-based self-cooled lead lithium blanket, which is related to the corrosive behavior of Pb-15.7Li in contact with EUROFER. We evaluate the cooling and corrosion behavior of the so-called separated first wall blanket (SFWB) configuration by performing thermo-fluid dynamics simulations using a large eddy simulation approach. Despite the expected improvement over the integrated first wall blanket, we still find unsatisfactory cooling performance, expressed as a low outlet Pb-15.7Li temperature, plus too-high corrosion rates derived from locally high Pb-15.7Li temperature and velocity, which can mainly be attributed to the geometry of the channels. Nevertheless, the analysis allowed us to devise future modifications of the SFWB to overcome the limitations found with the present design.

  11. The Existence Of Leading Islands Securing And The Border Areas Unitary State Of Indonesia An Analysis In Law Perspective

    Directory of Open Access Journals (Sweden)

    Nazali

    2015-08-01

    Full Text Available Abstract The research was carried out with the aim of examining, from a legal perspective, the securing of the foremost islands and the state border region of the Republic of Indonesia, which is directly related to security arrangements, dispute resolution methods, and the governance of the foremost islands and the border region in Kalimantan bordering Malaysia. The study was conducted in Nunukan district and the surrounding provinces of Kalimantan. The method used is normative legal analysis of the data with a juridical and qualitative descriptive approach. The results showed that, from a legal perspective, the securing of the foremost islands and border region in accordance with Law No. 34 of 2004 regarding the Indonesian National Army has not been implemented to the fullest extent needed to secure the foremost islands and border region as the front line of the Republic of Indonesia. The securing of the leading islands and the border region of the Republic of Indonesia still contains many weaknesses, in terms of both governance and security.

  12. Prevalence of spontaneous Brugada ECG pattern recorded at standard intercostal leads: A meta-analysis.

    Science.gov (United States)

    Shi, Shaobo; Barajas-Martinez, Hector; Liu, Tao; Sun, Yaxun; Yang, Bo; Huang, Congxin; Hu, Dan

    2018-03-01

    Typical Brugada ECG pattern is the keystone in the diagnosis of Brugada syndrome. However, its exact prevalence remains unclear, especially in Asia. The present study was designed to systematically evaluate the prevalence of spontaneous Brugada ECG pattern recorded at standard leads. We searched Medline, Embase and the Chinese National Knowledge Infrastructure (CNKI) for studies of the prevalence of Brugada ECG pattern published between Jan 1, 2003, and September 1, 2016. Pooled prevalences of type 1 and type 2-3 Brugada ECG pattern were estimated in a random-effects model, and prevalence data were grouped by study characteristics. Meta-regression analyses were performed to explore the potential sources of heterogeneity, and sensitivity analyses were conducted to assess the effect of each study on the overall prevalence. Thirty-nine eligible studies involving 558,689 subjects were identified. Pooled prevalence of type 1 and type 2-3 Brugada ECG pattern was 0.03% (95%CI, 0.01%-0.06%) and 0.42% (95%CI, 0.28%-0.59%), respectively. Region, sample size, and year of publication were the main sources of heterogeneity. The prevalence of type 1 Brugada ECG pattern was higher in male, Asian, adult, patient, and febrile subjects, but the relation between fever and type 2-3 Brugada ECG pattern was not significant. Sensitivity analysis showed that no single study unduly affected the prevalence of type 1 or type 2-3 Brugada ECG pattern. Brugada ECG pattern is not rare, and is especially preponderant in adult Asian males and febrile subjects. Clinical screening and further examination for Brugada syndrome in the potential population need to be highlighted. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
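The pooling code behind such estimates is not given in the record. The following is a minimal, generic sketch of DerSimonian-Laird random-effects pooling of prevalences on the logit scale; the study counts are invented for illustration and are not data from this meta-analysis:

```python
import math

def pooled_prevalence(studies):
    """DerSimonian-Laird random-effects pooling of prevalences.

    studies -- list of (cases, sample_size) tuples; every study is
               assumed to report at least one case.
    Returns (back-transformed pooled prevalence, tau^2).
    """
    ys, vs = [], []
    for cases, n in studies:
        p = cases / n
        ys.append(math.log(p / (1 - p)))        # logit transform
        vs.append(1 / cases + 1 / (n - cases))  # approximate logit variance
    w = [1 / v for v in vs]
    y_fixed = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, ys))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(studies) - 1)) / c)  # between-study variance
    w_re = [1 / (v + tau2) for v in vs]
    y_re = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
    return 1 / (1 + math.exp(-y_re)), tau2

# Hypothetical studies: (subjects with the ECG pattern, subjects screened)
studies = [(12, 10000), (3, 2500), (40, 52000), (7, 8000)]
prev, tau2 = pooled_prevalence(studies)
print(f"pooled prevalence = {prev:.4%}, tau^2 = {tau2:.3f}")
```

The pooled logit is a weighted average of the study logits, so the back-transformed estimate always lies between the smallest and largest study prevalences.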

  13. Structural and electrochemical analysis of chemically synthesized microcubic architectured lead selenide thin films

    Science.gov (United States)

    Bhat, T. S.; Shinde, A. V.; Devan, R. S.; Teli, A. M.; Ma, Y. R.; Kim, J. H.; Patil, P. S.

    2018-01-01

    The present work deals with the synthesis of lead selenide (PbSe) thin films by a simple and cost-effective chemical bath deposition method with variation in deposition time. The structural, morphological, and electrochemical properties of the as-deposited thin films were examined using characterization techniques such as X-ray diffraction (XRD), field-emission scanning electron microscopy (FE-SEM), X-ray photoelectron spectroscopy (XPS), cyclic voltammetry (CV), galvanostatic charge-discharge and electrochemical impedance spectroscopy. XRD reveals formation of rock salt phase cubic structured PbSe. FE-SEM images show the formation of microcubic structured morphology. The existence of the PbSe is confirmed from the XPS analysis. On the other hand, CV curves show four reaction peaks corresponding to oxidation [PbSe and Pb(OH)2] and reduction [PbO2 and Pb(OH)2] at the surface of PbSe thin films. The PbSe:2 sample deposited for 80 min shows a maximum specific capacitance of 454 ± 5 F g-1 obtained at 0.25 mA cm-2 current density. A maximum energy density of 69 Wh kg-1 was shown by the PbSe:2 electrode with a power density of 1077 W kg-1. Furthermore, electrochemical impedance studies of the PbSe:2 thin film show 80 ± 3% cycling stability even after 500 CV cycles. Such results show the importance of microcubic structured PbSe thin film as an anode in supercapacitor devices.

  14. Analysis of sexual behaviour in male rabbits across successive tests leading to sexual exhaustion

    Directory of Open Access Journals (Sweden)

    Pedro Jimenez

    2012-04-01

    Full Text Available Various parameters of sexual behaviour were studied in ten male rabbits tested daily with sexually receptive females (ovariectomized, given estradiol benzoate s.c., 5 µg/day). The aim of this study was to analyse rabbit sexual behaviour during successive tests leading to sexual exhaustion. We allowed copulation ad libitum and determined whether sexual satiety was reached within a day and sexual exhaustion across several days. The pair was allowed to copulate freely until the male failed to show sexual interest in that female for 30 minutes. The female was then removed and replaced by another; this procedure was repeated using as many does as needed, until the male showed no interest in any female for 2 hours. Scent-marking (chinning) was also recorded, before and after the copulation test. This whole procedure was repeated daily until the male showed no sexual behaviour at all on a given day. Within a test, copulation ad libitum led to a gradual increase in the time interval between successive mounts and ejaculations, regardless of the day of testing. Such increments predicted that the buck was reaching sexual satiety. The "miss" rate (i.e., the proportion of mounts that did not culminate in ejaculation) significantly increased from a median of 25 on the first day to 55 on the last day of testing. The mean time to reach copulatory inactivity decreased from 4 hrs on the first day to 1 hr on the last day. The total number of ejaculations within a test decreased from an average of 22 to 6 (first vs last day, respectively), and the number of chin marks was reduced by 69% compared with pre-mating values, regardless of the day of testing. All bucks eventually stopped copulating after a variable number of days (range = 2-15 days). We concluded that, following copulation ad libitum with several females, male rabbits reach sexual satiety (i.e., they are unable to continue copulating on the same day) and, after several days, they also attain

  15. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, but not impedance abnormalities. To compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center in Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients have been followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic event 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none has experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  16. Application of multilevel analysis approach in management theory

    Directory of Open Access Journals (Sweden)

    S. Morteza Ghayour

    2013-12-01

    Full Text Available Any phenomenon can be considered and analyzed from different perspectives. In the multilevel theorists' view, the structure or structures of the studied phenomenon are used to consider or analyze it completely. The level of analysis indicates what a researcher or theorist intends to explain or justify, such as the individual, group, or organizational level, and how the findings are then generalized. Contrary to the multilevel approach, the conventional approach to theorizing considers either the micro level or the macro level; it cannot perform a simultaneous micro-macro analysis. A multilevel approach, characterized by an inter-level and multilevel organizational view of organizational phenomena, is an attempt to expand the boundaries of knowledge and provide a new plan. This study uses documentary research to analyze the multilevel approach to theorizing, multilevel models, and multilevel analysis.

  17. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Thereafter, based on the statistical results of the CPN model simulations, key event paths are identified using a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are worthy of trust.

  18. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Gelsano, Flordeliza K.; Timing, Laurie D.

    2003-01-01

    This study analyzed lead from human hair samples in five different groups, namely scavengers from Payatas, Quezon City; tricycle drivers; car shop workers; paint factory workers; and students from the Polytechnic University of the Philippines. People from Nagcarlan, Laguna served as a ''base-line value'' or control group. The method applied was acid digestion using HNO3 and HClO4, after which the samples were subjected to atomic absorption spectrophotometry. In terms of lead found in hair, the scavengers from Payatas, Q.C. showed the highest lead exposure among the samples tested. The results of the analysis of lead concentration were expressed in mg/L. (Authors)

  19. Overview of the use of ATHENA for thermal-hydraulic analysis of systems with lead-bismuth coolant

    International Nuclear Information System (INIS)

    Davis, C.B.; Shieh, A. S.

    2000-01-01

    The INEEL and MIT are investigating the suitability of lead-bismuth cooled fast reactors for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors, and two modifications were made as a result of this review. Specifically, a correlation representing heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work being performed with the code and plans for future analytical work.

  20. Hierarchical Approaches to the Analysis of Genetic Diversity in ...

    African Journals Online (AJOL)

    Hierarchical Approaches to the Analysis of Genetic Diversity in Plants: A Systematic Overview. ME Osawaru, MC Ogwu, RO Aiwansoba. Abstract. Hierarchical analysis highlights the nature of relationship between and among type samples as outlined by standard descriptors. It produces an output called dendrogram, which ...

  1. Steady and dynamic states analysis of induction motor: FEA approach

    African Journals Online (AJOL)

    This paper deals with the steady and dynamic states analysis of induction motor using finite element analysis (FEA) approach. The motor has aluminum rotor bars and is designed for direct-on-line operation at 50 Hz. A study of the losses occurring in the motor performed at operating frequency of 50Hz showed that stator ...

  2. Discussion on safety analysis approach for sodium fast reactors

    International Nuclear Information System (INIS)

    Hong, Soon Joon; Choo, Yeon Joon; Suh, Nam Duk; Shin, Ahn Dong; Bae, Moo Hoon

    2012-01-01

    Utilization of nuclear energy is increasingly necessary, not only because of growing energy consumption but also because of controls on greenhouse gas emissions against global warming. To keep step with such demands, advanced reactors are now under development worldwide with the aims of high economic performance and enhanced safety. Recently, further work has been encouraged on the research and development program for Generation IV (GEN IV) reactors, in collaboration with other interested countries through the Generation IV International Forum (GIF). The Sodium cooled Fast Reactor (SFR) is a strong contender among the GEN IV reactor concepts. Korea also takes part in that program and plans to construct a demonstration SFR. The SFR is also under development as a candidate small modular reactor, for example PRISM (Power Reactor Innovative Small Module). Understanding of safety analysis approaches has likewise advanced in response to increasingly comprehensive safety requirements. Reviewing the past development of licensing and safety bases for advanced reactors, such approaches have been largely unsatisfactory, because the reference framework for licensing and safety analysis of advanced reactors has always been the one developed for water reactors; that framework is very plant specific, and thus it fits the advanced reactors poorly. Recently, as a result of considerable advances in probabilistic safety assessment (PSA), risk-informed approaches are increasingly applied together with some of the deterministic approaches used for water reactors. The technology neutral framework (TNF) can be regarded as the culmination of such risk-informed approaches, even though an intensive assessment of its applicability has not yet been accomplished. This study discusses viable safety analysis approaches for urgent application to the construction of a pool type SFR.

  3. Direct determination approach for the multifractal detrending moving average analysis

    Science.gov (United States)

    Xu, Hai-Chuan; Gu, Gao-Feng; Zhou, Wei-Xing

    2017-11-01

    In the canonical framework, we propose an alternative approach for multifractal analysis based on the detrending moving average method (MF-DMA). We define a canonical measure such that the multifractal mass exponent τ(q) is related to the partition function and the multifractal spectrum f(α) can be directly determined. The performances of the direct determination approach and the traditional approach of the MF-DMA are compared based on three synthetic multifractal and monofractal measures generated from the one-dimensional p-model, the two-dimensional p-model, and fractional Brownian motions. We find that both approaches have comparable performances in unveiling the fractal and multifractal nature. In other words, without loss of accuracy, the multifractal spectrum f(α) can be directly determined using the new approach with less computation cost. We also apply the new MF-DMA approach to the volatility time series of stock prices and confirm the presence of multifractality.
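As an illustration of the underlying DMA machinery (not the authors' canonical-measure formulation), the sketch below estimates the scaling of a backward moving-average fluctuation function for an uncorrelated random walk, whose Hurst exponent should come out near 0.5; all window sizes and series lengths are arbitrary choices:

```python
import math
import random

def dma_fluctuation(profile, n):
    """RMS fluctuation of the profile around a backward moving average of window n."""
    ma, s = [], 0.0
    for i, v in enumerate(profile):
        s += v
        if i >= n:
            s -= profile[i - n]
        if i >= n - 1:
            ma.append(s / n)              # average of the last n points
    residuals = [profile[i + n - 1] - m for i, m in enumerate(ma)]
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

random.seed(7)
profile = [0.0]
for _ in range(20000):
    profile.append(profile[-1] + random.gauss(0.0, 1.0))  # uncorrelated increments

windows = [8, 16, 32, 64, 128]
log_n = [math.log(n) for n in windows]
log_f = [math.log(dma_fluctuation(profile, n)) for n in windows]

# Least-squares slope of log F(n) vs log n estimates the Hurst exponent.
mean_x = sum(log_n) / len(log_n)
mean_y = sum(log_f) / len(log_f)
hurst = (sum((x - mean_x) * (y - mean_y) for x, y in zip(log_n, log_f))
         / sum((x - mean_x) ** 2 for x in log_n))
print(f"estimated Hurst exponent: {hurst:.2f}")  # near 0.5 for white-noise increments
```

The multifractal generalization replaces the RMS average with q-th order moments of the residuals, from which τ(q) and f(α) follow.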

  4. Computer and Internet Addiction: Analysis and Classification of Approaches

    Directory of Open Access Journals (Sweden)

    Zaretskaya O.V.

    2017-08-01

    Full Text Available A theoretical analysis of modern research on the problem of computer and Internet addiction is carried out. The main features of the different approaches are outlined, and an attempt is made to systematize the research conducted and to classify the scientific approaches to the problem of Internet addiction. The author distinguishes nosological, cognitive-behavioral, socio-psychological and dialectical approaches. She justifies the need for an approach that corresponds to the essence, goals and tasks of social psychology in researching both Internet addiction and dependent behavior in general. In the author's opinion, the dialectical approach integrates the experience of research conducted within the socio-psychological framework and focuses on the observed inconsistencies in the phenomenon of Internet addiction, such as the compensatory nature of Internet activity, when people who immerse themselves in the Internet are in a dysfunctional life situation.

  5. Flavor effects on the electric dipole moments in supersymmetric theories: A beyond leading order analysis

    International Nuclear Information System (INIS)

    Hisano, Junji; Nagai, Minoru; Paradisi, Paride

    2009-01-01

    The standard model predictions for the hadronic and leptonic electric dipole moments (EDMs) are considerably far from the present experimental resolutions; thus, the EDMs represent very clean probes of new physics effects. Especially, within supersymmetric frameworks with flavor-violating soft terms, large and potentially visible effects to the EDMs are typically expected. In this work, we systematically evaluate the predictions for the EDMs at the beyond leading order. In fact, we show that beyond-leading-order contributions to the EDMs dominate over the leading-order effects in large regions of the supersymmetric parameter space. Hence, their inclusion in the evaluation of the EDMs is unavoidable. As an example, we show the relevance of beyond-leading-order effects to the EDMs for a supersymmetric SU(5) model with right-handed neutrinos.

  6. The Risk Factors of Child Lead Poisoning in China: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    You Li

    2016-03-01

    Full Text Available Background: To investigate the risk factors of child lead poisoning in China. Methods: A document retrieval was performed using MeSH (Medical Subject Headings) terms and key words. The Newcastle-Ottawa Scale (NOS) was used to assess the quality of the studies, and pooled odds ratios with 95% confidence intervals were used to identify the risk factors. We employed Review Manager 5.2 and Stata 10.0 to analyze the data. Heterogeneity was assessed by both the Chi-square and I2 tests, and publication bias was evaluated using a funnel plot and Egger's test. Results: Thirty-four articles reporting 13,587 lead-poisoned children met the inclusion criteria. Unhealthy lifestyle and behaviors, environmental pollution around the home, and potential for parents' occupational exposure to lead were risk factors of child lead poisoning in the pooled analyses. Our assessments yielded no severe publication biases. Conclusions: Seventeen risk factors are associated with child lead poisoning, which can be used to identify high-risk children. Health education and promotion campaigns should be designed in order to minimize or prevent child lead poisoning in China.

  7. Analysis of a Study of Lead Wheel Weight Deposition and Abrasion in New Jersey.

    Science.gov (United States)

    Root, Robert A

    This paper analyzes the implications for children's health of shortcomings in the methods and results of a study of lead in the environment, "Quantity of Lead Released to the Environment in New Jersey in the Form of Motor Vehicle Wheel Weights," by the New Jersey Department of Environmental Protection (Aucott and Caldarelli, Water, Air, & Soil Pollution, 223, 1743-1752, 2012). The study significantly understates the amount of lead deposited in New Jersey streets as 12 metric tons per year and incorrectly concludes that only 40 kg per year of the lead from wheel weights is abraded into small particles. The 2012 New Jersey Department of Environmental Protection (NJDEP) study misleads regulators and the public into believing that little toxic particulate lead from abraded wheel weights occurs on the streets of New Jersey and by implication that little occurs elsewhere in the United States, thus minimizing the potential health risk that lead wheel weights may have to our nation's children and indeed all of us.

  8. A genetic algorithm approach to routine gamma spectra analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlevaro, C M [Instituto de Física de Líquidos y Sistemas Biológicos, Calle 59 No 789, B1900BTE La Plata (Argentina); Wilkinson, M V [Autoridad Regulatoria Nuclear, Avda. del Libertador 8250, C1429BNP Buenos Aires (Argentina); Barrios, L A [Autoridad Regulatoria Nuclear, Avda. del Libertador 8250, C1429BNP Buenos Aires (Argentina)

    2008-01-15

    In this work we present an alternative method for performing routine gamma spectra analysis based on genetic algorithm techniques. The main idea is to search for patterns of single nuclide spectra obtained by simulation in a sample spectrum targeted for analysis. We show how this approach is applied to the analysis of simulated and real target spectra, and also to the study of interference resolution.
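The record does not include the algorithm's parameters, so the following is a toy reconstruction of the idea: a genetic algorithm searching for the mixture of simulated single-nuclide spectra (here, synthetic Gaussian photopeaks) that best reproduces a target spectrum. All spectra, weights, and GA settings are invented for illustration:

```python
import math
import random

random.seed(1)
CHANNELS = 64

def peak(center, width):
    """Toy single-nuclide spectrum: one Gaussian photopeak over the channels."""
    return [math.exp(-((c - center) ** 2) / (2 * width ** 2)) for c in range(CHANNELS)]

library = [peak(12, 2.0), peak(30, 3.0), peak(50, 2.5)]   # simulated nuclide spectra
true_weights = [0.6, 0.3, 0.1]                            # hidden "activities"
target = [sum(w * s[c] for w, s in zip(true_weights, library)) for c in range(CHANNELS)]

def sse(weights):
    """Sum of squared errors between the weighted library mix and the target."""
    return sum((sum(w * s[c] for w, s in zip(weights, library)) - target[c]) ** 2
               for c in range(CHANNELS))

POP, GENERATIONS, ELITE = 60, 300, 10
population = [[random.random() for _ in library] for _ in range(POP)]
for gen in range(GENERATIONS):
    elite = sorted(population, key=sse)[:ELITE]           # keep the fittest
    population = list(elite)
    sigma = 0.05 * (1 - gen / GENERATIONS) + 0.001        # shrinking mutation
    while len(population) < POP:
        a, b = random.sample(elite, 2)
        x = random.random()                               # blend crossover + mutation
        population.append([max(0.0, x * ai + (1 - x) * bi + random.gauss(0, sigma))
                           for ai, bi in zip(a, b)])

best = min(population, key=sse)
print("recovered weights:", [round(w, 3) for w in best], "SSE:", round(sse(best), 5))
```

Because the fit is linear in the weights, the error surface is convex and the elitist GA converges quickly; the same machinery extends to interference resolution when the library peaks overlap.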

  9. Fuzzy set theoretic approach to fault tree analysis

    African Journals Online (AJOL)

    In this approach, the probabilistic consideration of basic events is replaced by possibilities, thereby leading to fuzzy fault tree analysis. Triangular and trapezoidal fuzzy numbers are used to represent the failure possibility of basic events, since a system may have to go through different operating conditions during the design or testing phase. Thus the failure possibility of a basic ...
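A minimal sketch of how triangular fuzzy possibilities propagate through fault tree gates, assuming the common component-wise approximation (the exact product of triangular numbers is not itself triangular); the basic-event values and the tree are invented:

```python
# Triangular fuzzy possibility (lower, modal, upper) for each basic event.
# AND/OR gates use the standard component-wise approximation from fuzzy FTA.

def gate_and(x, y):
    """AND gate: possibilities multiply, component-wise."""
    return tuple(xi * yi for xi, yi in zip(x, y))

def gate_or(x, y):
    """OR gate: complement of the product of complements, component-wise."""
    return tuple(1 - (1 - xi) * (1 - yi) for xi, yi in zip(x, y))

# Hypothetical basic-event failure possibilities
pump_fails = (0.02, 0.04, 0.06)
valve_fails = (0.05, 0.10, 0.15)
power_loss = (0.01, 0.02, 0.03)

# Hypothetical top event: (pump AND valve) OR power loss
top = gate_or(gate_and(pump_fails, valve_fails), power_loss)
print("top-event possibility:", tuple(round(v, 5) for v in top))
```

Trapezoidal numbers work the same way with four components; different operating conditions can be handled by assigning a different fuzzy number to the same basic event per condition.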

  10. Fuzzy set theoretic approach to fault tree analysis | Tyagi ...

    African Journals Online (AJOL)

    Research in conventional fault tree analysis (FTA) is based mainly on failure probability of basic events, which uses classical probability distributions for the failure probability of basic events. In the present paper the probabilistic consideration of basic events is replaced by possibilities, thereby leading to fuzzy fault tree ...

  11. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base the safety analysis on desired DNB probability limits. This advanced approach uses response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis, thereby including transient interactions and trip uncertainties in the MDNBR probability density function.
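The LOFT response-surface coefficients are not reproduced in the record. As a generic illustration of the approach, input uncertainties can be pushed through a hypothetical second-order (quadratic) response surface by Monte Carlo sampling to obtain the MDNBR distribution and a limit-violation probability; every coefficient and the 1.3 limit below are made up:

```python
import random

random.seed(0)

def mdnbr_surface(x1, x2):
    """Hypothetical fitted 2nd-order response surface in two standardized inputs."""
    return (1.8 - 0.30 * x1 - 0.20 * x2
            + 0.02 * x1 * x1 + 0.015 * x2 * x2 + 0.01 * x1 * x2)

N = 100_000
samples = [mdnbr_surface(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
mean = sum(samples) / N
p_below_limit = sum(m < 1.3 for m in samples) / N   # illustrative DNB limit
print(f"mean MDNBR = {mean:.3f}, P(MDNBR < 1.3) = {p_below_limit:.3%}")
```

A 2nd-order analytic error propagation would instead expand the surface about the nominal point and carry the quadratic terms through the moments; the Monte Carlo above just makes the resulting probability density concrete.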

  12. LEADING WITH LEADING INDICATORS

    International Nuclear Information System (INIS)

    PREVETTE, S.S.

    2005-01-01

    This paper documents Fluor Hanford's use of Leading Indicators, management leadership, and statistical methodology in order to improve safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto Charts, and Systems Thinking, and their effect on management decisions and employee involvement, are discussed. Included are practical examples of choosing leading indicators. A statistically based, color-coded dashboard presentation methodology is provided. These tools, management theories and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and environmental protection and restoration at one of the nation's largest nuclear cleanup sites.
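As a minimal illustration of the Statistical Process Control element, an individuals (XmR) chart derives natural process limits from the average moving range, and points outside those limits signal a change worth investigating. The monthly counts below are invented, not Fluor Hanford data:

```python
# Monthly counts of a hypothetical leading indicator
# (e.g. near-miss reports) -- illustrative numbers only.
data = [12, 15, 11, 14, 13, 16, 12, 27, 14, 13]

mean = sum(data) / len(data)
mrs = [abs(b - a) for a, b in zip(data, data[1:])]   # moving ranges
mr_bar = sum(mrs) / len(mrs)

# Individuals (XmR) chart: natural process limits at mean +/- 2.66 * mR-bar.
ucl = mean + 2.66 * mr_bar
lcl = max(0.0, mean - 2.66 * mr_bar)
signals = [x for x in data if x > ucl or x < lcl]
print(round(ucl, 2), round(lcl, 2), signals)
```

Counts within the limits are treated as routine variation; only out-of-limit points trigger management action, which is how SPC keeps leadership from reacting to noise.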

  13. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    Science.gov (United States)

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  14. Analysis of the laser-induced discoloration of lead white pigment

    International Nuclear Information System (INIS)

    Cooper, M.I.; Fowles, P.S.; Tang, C.C.

    2002-01-01

    The use of laser cleaning in artwork conservation is becoming increasingly important. An investigation into the effects of laser radiation on lead white pigment, considered to be historically the most important of all white pigments used in art, has been undertaken. Samples of pigment and pigment in a water-colour binding medium have been prepared and irradiated by laser radiation at 1064 nm (pulse duration 5-10 ns) at an average fluence of 0.3 J cm⁻². Irradiation under such conditions leads to the formation of an extremely thin discoloured layer. Synchrotron X-ray diffraction (XRD) and X-ray photoelectron spectroscopy (XPS) have been used to characterise the altered layer. Analytical evidence for the formation of elemental lead is presented for the first time, and the effect of exposure of the altered layer to air and the effect of a binding medium on the process are discussed.

  15. A NOVEL APPROACH FOR 3D NEIGHBOURHOOD ANALYSIS

    Directory of Open Access Journals (Sweden)

    S. Emamgholian

    2017-09-01

    Full Text Available Population growth and lack of land in urban areas have caused massive developments such as high rises and underground infrastructures. Land authorities internationally recognize 3D cadastres as a solution to efficiently manage these developments in complex cities. Although a 2D cadastre does not efficiently register these developments, it is currently being used in many jurisdictions for registering land and property information. Limited capabilities for analysis and presentation are examples of such shortcomings. 3D neighbourhood analysis by automatically finding 3D spaces has become an issue of major interest in recent years. Whereas neighbourhood analysis has long been a focus of research, the idea of 3D neighbourhood analysis has rarely been addressed in 3-dimensional information system (3D GIS) analysis. In this paper, a novel approach for 3D neighbourhood analysis is proposed that records spatial and descriptive information of the apartment units and easements. This approach uses the coordinates of the subject apartment unit to find the neighbouring spaces. By considering a buffer around the edges of the unit, neighbouring spaces are accurately detected. This method was implemented in ESRI ArcScene and three case studies were defined to test the efficiency of this approach. The results show that spaces are accurately detected in various complex scenarios. This approach can also be applied to other applications such as property management and disaster management in order to find the affected apartments around a defined space.

  16. a Novel Approach for 3d Neighbourhood Analysis

    Science.gov (United States)

    Emamgholian, S.; Taleai, M.; Shojaei, D.

    2017-09-01

    Population growth and lack of land in urban areas have caused massive developments such as high rises and underground infrastructures. Land authorities internationally recognize 3D cadastres as a solution to efficiently manage these developments in complex cities. Although a 2D cadastre does not efficiently register these developments, it is currently being used in many jurisdictions for registering land and property information. Limited capabilities for analysis and presentation are examples of such shortcomings. 3D neighbourhood analysis by automatically finding 3D spaces has become an issue of major interest in recent years. Whereas neighbourhood analysis has long been a focus of research, the idea of 3D neighbourhood analysis has rarely been addressed in 3-dimensional information system (3D GIS) analysis. In this paper, a novel approach for 3D neighbourhood analysis is proposed that records spatial and descriptive information of the apartment units and easements. This approach uses the coordinates of the subject apartment unit to find the neighbouring spaces. By considering a buffer around the edges of the unit, neighbouring spaces are accurately detected. This method was implemented in ESRI ArcScene and three case studies were defined to test the efficiency of this approach. The results show that spaces are accurately detected in various complex scenarios. This approach can also be applied to other applications such as property management and disaster management in order to find the affected apartments around a defined space.
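The buffer-based neighbour detection described in these two records can be sketched with axis-aligned 3D boxes: expand the subject unit's bounding box by a small buffer and report every unit whose box intersects it. The unit geometries and buffer size below are hypothetical:

```python
# Each apartment unit as an axis-aligned 3D box:
# (xmin, ymin, zmin, xmax, ymax, zmax), coordinates in metres.
units = {
    "A": (0, 0, 0, 5, 5, 3),
    "B": (5, 0, 0, 10, 5, 3),    # shares a wall with A
    "C": (0, 0, 3, 5, 5, 6),     # directly above A
    "D": (20, 20, 0, 25, 25, 3)  # far away
}

def neighbours(subject, buffer=0.1):
    # Expand the subject's box by the buffer and test intersection
    # with every other unit's box.
    x0, y0, z0, x1, y1, z1 = units[subject]
    bx = (x0 - buffer, y0 - buffer, z0 - buffer,
          x1 + buffer, y1 + buffer, z1 + buffer)
    found = []
    for name, (u0, v0, w0, u1, v1, w1) in units.items():
        if name == subject:
            continue
        if (bx[0] <= u1 and bx[3] >= u0 and
                bx[1] <= v1 and bx[4] >= v0 and
                bx[2] <= w1 and bx[5] >= w0):
            found.append(name)
    return found

print(neighbours("A"))  # B (adjacent) and C (above), not D
```

Real apartment footprints are rarely perfect boxes, so a production version would test buffered solid geometries rather than bounding boxes, but the buffer-then-intersect logic is the same.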

  17. Selection of optimal recording sites for limited lead body surface potential mapping: A sequential selection based approach

    Directory of Open Access Journals (Sweden)

    McCullagh Paul J

    2006-02-01

    Full Text Available Abstract Background In this study we propose the development of a new algorithm for selecting optimal recording sites for limited lead body surface potential mapping. The proposed algorithm differs from previously reported methods in that it is based upon a simple and intuitive data driven technique that does not make any presumptions about deterministic characteristics of the data. It uses a forward selection based search technique to find the best combination of electrocardiographic leads. Methods The study was conducted using a dataset consisting of body surface potential maps (BSPM recorded from 116 subjects which included 59 normals and 57 subjects exhibiting evidence of old Myocardial Infarction (MI. The performance of the algorithm was evaluated using spatial RMS voltage error and correlation coefficient to compare original and reconstructed map frames. Results In all, three configurations of the algorithm were evaluated and it was concluded that there was little difference in the performance of the various configurations. In addition to observing the performance of the selection algorithm, several lead subsets of 32 electrodes as chosen by the various configurations of the algorithm were evaluated. The rationale for choosing this number of recording sites was to allow comparison with a previous study that used a different algorithm, where 32 leads were deemed to provide an acceptable level of reconstruction performance. Conclusion It was observed that although the lead configurations suggested in this study were not identical to that suggested in the previous work, the systems did bear similar characteristics in that recording sites were chosen with greatest density in the precordial region.
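A forward selection loop of the kind this algorithm uses can be sketched on synthetic map data: at each step, add the electrode that most reduces the RMS error of reconstructing the full map from the selected subset. The data sizes, the 3-source model, and the least-squares reconstruction are illustrative assumptions, not the study's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "BSPM" frames: 100 frames x 20 electrodes driven by 3 latent
# sources -- a toy stand-in for the study's 116-subject map data.
sources = rng.normal(size=(100, 3))
mixing = rng.normal(size=(3, 20))
maps = sources @ mixing

def rms_error(selected):
    # Least-squares reconstruction of all 20 channels from the subset.
    X = maps[:, selected]
    coef, *_ = np.linalg.lstsq(X, maps, rcond=None)
    return np.sqrt(np.mean((maps - X @ coef) ** 2))

chosen = []
while len(chosen) < 4:
    rest = [i for i in range(20) if i not in chosen]
    best = min(rest, key=lambda i: rms_error(chosen + [i]))  # greedy step
    chosen.append(best)

print(chosen, round(rms_error(chosen), 4))
```

Because the toy data have rank 3, the error collapses to essentially zero once three independent electrodes are chosen; on real maps the study instead tracked RMS voltage error and correlation down to an acceptable level at 32 leads.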

  18. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  19. Leading coordinate analysis of reaction pathways in proton chain transfer: Application to a two-proton transfer model for the green fluorescent protein

    International Nuclear Information System (INIS)

    Wang Sufan; Smith, Sean C.

    2006-01-01

    The 'leading coordinate' approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information

  20. Fetal ECG extraction using independent component analysis by Jade approach

    Science.gov (United States)

    Giraldo-Guzmán, Jader; Contreras-Ortiz, Sonia H.; Lasprilla, Gloria Isabel Bautista; Kotas, Marian

    2017-11-01

    Fetal ECG monitoring is a useful method to assess the fetus's health and detect abnormal conditions. In this paper we propose an approach to extract the fetal ECG from abdomen and chest signals using independent component analysis based on the joint approximate diagonalization of eigenmatrices (JADE) approach. The JADE approach avoids redundancy, which reduces matrix dimension and computational costs. Signals were filtered with a high pass filter to eliminate low frequency noise. Several levels of decomposition were tested until the fetal ECG was recognized in one of the separated source outputs. The proposed method shows fast and good performance.
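Implementing JADE itself is involved (it jointly diagonalizes a set of fourth-order cumulant matrices), but the separation idea can be illustrated with FOBI, a much simpler fourth-order ICA relative, on toy maternal/fetal waveforms. The sources, mixing matrix, and noise level below are invented; real recordings would be high-pass filtered first, as the abstract notes:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 2000)

# Toy sources: a smooth "maternal" sine and a spiky "fetal" train
# (illustrative stand-ins for real abdominal/chest recordings).
maternal = np.sin(2 * np.pi * 5 * t)
fetal = 0.3 * np.sin(2 * np.pi * 8.3 * t) ** 63   # odd power keeps sign
S = np.vstack([maternal, fetal])
A = np.array([[1.0, 0.6], [0.8, 0.2]])            # unknown mixing
X = A @ S + 0.01 * rng.normal(size=(2, t.size))   # observed channels

# FOBI: whiten, then eigendecompose the norm-weighted covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X                  # whitened data
C = (Z * (Z ** 2).sum(axis=0)) @ Z.T / Z.shape[1]  # E[||z||^2 z z^T]
W = np.linalg.eigh(C)[1]
Y = W.T @ Z                                       # estimated sources
print(Y.shape)
```

FOBI only separates sources with distinct kurtosis (here a sub-Gaussian sine versus super-Gaussian spikes), which is why JADE's joint diagonalization of several cumulant matrices is preferred in practice.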

  1. A global optimization approach to multi-polarity sentiment analysis.

    Directory of Open Access Journals (Sweden)

    Xinmiao Li

    Full Text Available Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with a higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement. The improvement of the two-polarity sentiment analysis was the smallest. We conclude that PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid
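The particle swarm optimization at the core of PSOGO-Senti can be sketched in a few lines. Here the cross-validated SVM accuracy over (feature dimension, C, gamma) is replaced by a simple quadratic stand-in objective with a known optimum, so the example stays self-contained:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in objective: in the paper this would be cross-validated SVM
# performance over feature-dimension and kernel parameters; here a bowl
# with optimum at (3, 0.5) keeps the sketch self-contained.
def objective(p):
    return (p[0] - 3.0) ** 2 + (p[1] - 0.5) ** 2

n, dim = 20, 2
pos = rng.uniform(-5, 5, size=(n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for it in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    # Standard PSO update: inertia + cognitive + social terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(np.round(gbest, 3))  # converges near the optimum (3, 0.5)
```

In the actual PSOGO-Senti setup each particle would encode a candidate feature dimension plus SVM parameters, and `objective` would run IG feature selection and a cross-validated SVM.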

  2. Comparative analysis of diagnostic 12-lead electrocardiography and 3-dimensional noninvasive mapping.

    Science.gov (United States)

    Leong, Kevin Ming Wei; Lim, Phang Boon; Kanagaratnam, Prapa

    2015-03-01

    The clinical utility of noninvasive electrocardiographic imaging has been demonstrated in a variety of conditions. It has recently been shown to have superior predictive accuracy and higher clinical value than validated 12-lead electrogram algorithms in the localization of arrhythmias arising from the ventricular outflow tract, and displays similar potential in other conditions. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. DELAMINATION AND XRF ANALYSIS OF NIST LEAD IN PAINT FILM STANDARDS

    Science.gov (United States)

    The objectives of this protocol were to remove the laminate coating from lead paint film standards acquired from NIST by means of surface heating. The average XRF value did not change after removal of the polymer coating suggesting that this protocol is satisfactory for renderin...

  4. Extending dynamic segmentation with lead generation : A latent class Markov analysis of financial product portfolios

    NARCIS (Netherlands)

    Paas, L.J.; Bijmolt, T.H.A.; Vermunt, J.K.

    2004-01-01

    A recent development in marketing research concerns the incorporation of dynamics in consumer segmentation. This paper extends the latent class Markov model, a suitable technique for conducting dynamic segmentation, in order to facilitate lead generation. We demonstrate the application of the latent

  5. Toxicological analysis of the risk of lead exposure in metal processing

    African Journals Online (AJOL)

    The aim of the present study is to evaluate toxicological risks for the workers who are exposed to lead in their work environment (NISSAL factory, Niš, Serbia; hereinafter "Nissal"). The values of this toxic metal and the biological markers (δ-aminolevulinic acid and coproporphyrin) in the blood and urine were.

  6. Method of analysis for the determination of lead and cadmium in fresh meat

    NARCIS (Netherlands)

    Ruig, de W.G.

    1980-01-01

    This report comprises the RIKILT results of an intercomparison on the determination of lead and cadmium in bovine liver and bovine kidney. The aim of this round robin was to check a wet ashing procedure followed by a flame AAS determination, as also described in EEC doc. 2266/VI/77. Special

  7. Basic analysis of sugar cane lead and cane fields of an AIC

    International Nuclear Information System (INIS)

    Diaz Rizo, O.; Saunders, M.; Herrerra, E.; Rodriguez, R.; Mendoza, A.; Meneses, N.; Griffith, J.; Mesa, S.; Zhuk, L.I.; Danilova, E.A.

    1991-01-01

    The concentrations of minor and trace elements in sugar cane leaf and soil samples from a Cuban sugar factory were determined by means of thermal reactor neutron activation analysis (NAA) and X-ray fluorescence analysis (XRFA). The samples were taken according to the methodology of the Sugar Ministry for leaf and soil analysis. The concentrations of 28 elements were determined. The concentration values obtained by NAA, XRFA and previous analyses are compared.

  8. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  9. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  10. Multivariate analysis of 2-DE protein patterns - Practical approaches

    DEFF Research Database (Denmark)

    Jacobsen, Charlotte; Jacobsen, Susanne; Grove, H.

    2007-01-01

    Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two of the varieties were of strong baking quality and hard wheat kernel and two were of weak baking quality and soft kernel. Gliadins at different stages of grain development were analyzed by the application of multivariate data analysis on images of 2-DEs. Patterns related to the wheat varieties, harvest times...

  11. Determination of lead in hair and its segmental analysis by solid sampling electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Baysal, Asli, E-mail: baysalas@itu.edu.t [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469, Maslak, Istanbul (Turkey); Akman, Suleyman, E-mail: akmans@itu.edu.t [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469, Maslak, Istanbul (Turkey)

    2010-04-15

    A rapid and practical solid sampling electrothermal atomic absorption spectrometric method was described for the determination of lead in scalp hair. Hair samples were washed once with acetone, thrice with distilled-deionized water and again once with acetone, then dried at 75 °C. Typically 0.05 to 1.0 mg of dried sample was inserted on the platforms of the solid sampling autosampler. The effects of pyrolysis temperature, atomization temperature, the amount of sample as well as addition of a modifier (Pd/Mg) and/or auxiliary digesting agents (hydrogen peroxide and nitric acid) and/or a surfactant (Triton X-100) on the recovery of lead were investigated. The limit of detection for lead (3σ, N = 10) was 0.3 ng/g. The addition of modifier, acids, oxidant and surfactant hardly improved the results. Due to the risk of contamination and relatively high blank values, the lead in hair was determined directly without adding any reagent(s). Finally, the method was applied to the segmental determination of lead concentrations in the hair of different persons, which is important for knowing when and how much a person was exposed to the analyte. For this purpose, 0.5 cm pieces were cut along one or a few close strands and analyzed by solid sampling.

  12. Determination of lead in hair and its segmental analysis by solid sampling electrothermal atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Baysal, Asli; Akman, Suleyman

    2010-01-01

    A rapid and practical solid sampling electrothermal atomic absorption spectrometric method was described for the determination of lead in scalp hair. Hair samples were washed once with acetone, thrice with distilled-deionized water and again once with acetone, then dried at 75 °C. Typically 0.05 to 1.0 mg of dried sample was inserted on the platforms of the solid sampling autosampler. The effects of pyrolysis temperature, atomization temperature, the amount of sample as well as addition of a modifier (Pd/Mg) and/or auxiliary digesting agents (hydrogen peroxide and nitric acid) and/or a surfactant (Triton X-100) on the recovery of lead were investigated. The limit of detection for lead (3σ, N = 10) was 0.3 ng/g. The addition of modifier, acids, oxidant and surfactant hardly improved the results. Due to the risk of contamination and relatively high blank values, the lead in hair was determined directly without adding any reagent(s). Finally, the method was applied to the segmental determination of lead concentrations in the hair of different persons, which is important for knowing when and how much a person was exposed to the analyte. For this purpose, 0.5 cm pieces were cut along one or a few close strands and analyzed by solid sampling.

  13. GameOn: A Game-Theoretic Approach to Digital Marketing and Online Lead Generation for Oligopoly Markets

    OpenAIRE

    Mota, Diogo Carvalho dos Santos

    2015-01-01

    The importance and role of digital marketing in today’s competitive world is rapidly increasing. The surge and rapid expansion of digital technologies and especially, the Internet has propelled a shift in the consumers’ habits and consequently in the strategies that firms must employ to attract the maximum number of consumers possible towards their products and/or services. Within these efforts, the online lead generation process is gaining steam and is currently an extremely important activi...

  14. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    Science.gov (United States)

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  15. Mixcore safety analysis approach used for introduction of Westinghouse fuel assemblies in Ukraine

    International Nuclear Information System (INIS)

    Abdullayev, A.; Baidullin, V.; Maryochin, A.; Sleptsov, S.; Kulish, G.

    2008-01-01

    Six Westinghouse Lead Test Assemblies (LTA) were installed in 2005 and are currently operated in Unit 3 of the South Ukraine NPP (SUNPP) under the Ukraine Nuclear Fuel Qualification Project. At the early stages of LTA implementation in Ukraine, there was no experience in licensing new fuel types, which explains the need to develop approaches for safety substantiation of the LTAs. This presentation considers some approaches to performing safety analysis of the design basis Initiating Events (IE) for the LTA fuel cycles. These approaches are non-standard in terms of the established practices for obtaining the regulatory authorities' permission for core operation. The analysis was based on the results of the FA and reactor core thermal hydraulic and nuclear design.

  16. Defibrillation lead placement using a transthoracic transatrial approach in a case without transvenous access due to lack of the right superior vena cava

    Directory of Open Access Journals (Sweden)

    Yosuke Otsuka

    2015-06-01

    Full Text Available A 65-year-old woman with a history of syncope was diagnosed with hypertrophic cardiomyopathy. She had previously undergone mastectomy of the left breast owing to breast cancer. Holter electrocardiogram (ECG) and monitor ECG revealed sick sinus syndrome (Type II) and non-sustained ventricular tachycardia. Sustained ventricular tachycardia and ventricular fibrillation were induced in an electrophysiological study. Although the patient was eligible for treatment with a dual chamber implantable cardioverter defibrillator (ICD), venography revealed lack of the right superior vena cava (R-SVC). Lead placement from the left subclavian vein would have increased the risk of lymphedema owing to the patient's mastectomy history. Consequently, the defibrillation lead was placed in the right ventricle by direct puncture of the right auricle through the tricuspid valve. The atrial lead was sutured to the atrial wall, and the postoperative course was unremarkable. Defibrillation lead placement using a transthoracic transatrial approach can be an alternative method in cases where a transvenous approach for lead placement is not feasible.

  17. Analysis of the state of poling of lead zirconate titanate (PZT) particles in a Zn-ionomer composite

    NARCIS (Netherlands)

    James, N.K.; Comyn, T.; Hall, D.; Daniel, L.; Kleppe, A.; Zwaag, S. van der; Groen, W.A.

    2016-01-01

    The poling behaviour of tetragonal lead zirconate titanate (PZT) piezoelectric ceramic particles in a weakly conductive ionomer polymer matrix is investigated using high energy synchrotron X-ray diffraction analysis. The poling efficiency, crystallographic texture and lattice strain of the PZT

  18. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Full Text Available Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond the sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena relating language to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. On one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). On other dimensions, each approach holds distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on the strengths and weaknesses of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  19. IN SEARCH OF NEW LEAD COMPOUNDS FOR TRYPANOSOMIASIS DRUG DESIGN - A PROTEIN STRUCTURE-BASED LINKED-FRAGMENT APPROACH

    NARCIS (Netherlands)

    VERLINDE, CLMJ; RUDENKO, G; HOL, WGJ

    A modular method for pursuing structure-based inhibitor design in the framework of a design cycle is presented. The approach entails four stages: (1) a design pathway is defined in the three-dimensional structure of a target protein; (2) this pathway is divided into subregions; (3) complementary

  20. Analysis of the equalizing holes resistance in fuel assembly spike for lead-based reactor

    International Nuclear Information System (INIS)

    Zhang, Guangyu; Jin, Ming; Wang, Jianye; Song, Yong

    2017-01-01

    Highlights: • A RELAP5 model for a 10 MWth lead-based reactor was built to study the hydrodynamic characteristics of the equalizing holes in the fuel assembly spike. • Different fuel assembly total blockage scenarios and different resistances for different fuel assemblies were examined. • The inherent safety characteristics of the lead-based reactor were improved by optimizing the configuration of equalizing holes in the fuel assembly spike. - Abstract: To avoid damage to the fuel rod cladding when a fuel assembly (FA) is totally blocked, a special configuration of the fuel assembly spike was designed with equalizing holes in the center region that let the coolant flow during total blockage of a FA. To study the hydrodynamic characteristics of the equalizing holes and identify an appropriate resistance, a RELAP5 model was developed for a 10 MWth lead-based reactor which used lead-bismuth as coolant. Several FA total blockage and partial core blockage scenarios were selected. The simulation results indicated that when all the FA spike equalizing holes had the same hydraulic resistance, only a narrow range of suitable equalizing-hole resistances could be chosen when a FA was blocked. However, in the two or more FA blockage scenarios, there were no appropriate resistances to meet the requirement. In addition, with FA spike equalizing holes of different resistances, a large range of suitable equalizing-hole resistances could be chosen. In particular, a series of suitable resistances was selected when the small power FA resistance was 1/2, 1/4, 1/8 of the large one. Under these circumstances, one, two or three FA blockages would not damage the core. This demonstrated that selecting a series of suitable hydraulic resistances for the equalizing holes could improve the safety characteristics of the reactor effectively.

  1. Optical Analysis of Iron-Doped Lead Sulfide Thin Films for Opto-Electronic Applications

    Science.gov (United States)

    Chidambara Kumar, K. N.; Khadeer Pasha, S. K.; Deshmukh, Kalim; Chidambaram, K.; Shakil Muhammad, G.

    Iron-doped lead sulfide thin films were deposited on glass substrates using the successive ionic layer adsorption and reaction (SILAR) method at room temperature. The X-ray diffraction pattern shows a well-formed crystalline thin film with a face-centered cubic structure along the preferential (1 1 1) orientation. The lattice constant was determined using Nelson-Riley plots, and the crystallite size was determined from the X-ray line broadening using the Scherrer formula. The morphology of the thin film was studied using a scanning electron microscope, and the optical properties were investigated using a UV-vis spectrophotometer. We observed an increase in the optical band gap from 2.45 to 3.03 eV after doping the lead sulfide thin film with iron. The cutoff wavelength lies in the visible region, and hence the grown thin films can be used for optoelectronic and sensor applications. The photoluminescence study shows emission at 500-720 nm. Vibrating sample magnetometer measurements confirmed that the lead sulfide thin film becomes a weakly ferromagnetic material after doping with iron.

  2. Bibliometric analysis of the volume and visibility of Saudi publications in leading anesthesia journals.

    Science.gov (United States)

    Mowafi, Hany A

    2012-01-01

    The quantity and quality of publications by a country indicate its contribution to scientific development. This study examines the volume and impact of Saudi anesthesia publications in leading anesthesia journals. Fifteen leading anesthesia journals were identified, and Saudi publications in these journals from 1991 to 2011 were searched in the PubMed and Web of Knowledge databases. For each article, the journal and time of publication, the type of article and the affiliation of the first author were analysed. The visibility of the publications was related to the number of citations and was analysed for the years 2000 to 2008. Data were compared with selected Arab countries. Two visibility indices were used: the first relates the average citations per Saudi article in the years following publication to the average global citations; the second relates the average citations per Saudi article in the two years following publication to the impact factor of the journal of publication. The h-index was used as a measure of both volume and visibility. Anesthesiologists from Saudi affiliations published 173 documents in the 15 leading anesthesia journals between 1991 and 2011, with a marked increase in the last 6 years. The journal Anesthesia and Analgesia published 24% of Saudi articles, and Saudi universities contributed 55% of Saudi publications. The visibility of the Saudi articles was 0.7 of the international figure. Saudi anesthesia publications have increased in recent years; although their visibility is below international figures, it compares favourably with Arab countries.

  3. Application of six sigma and AHP in analysis of variable lead time calibration process instrumentation

    Science.gov (United States)

    Rimantho, Dino; Rahman, Tomy Abdul; Cahyadi, Bambang; Tina Hernawati, S.

    2017-02-01

    Calibration of instrumentation equipment in the pharmaceutical industry is an important activity for determining the true value of a measurement. Preliminary studies indicated that lead time in the calibration process resulted in disruption of production and laboratory activities. This study aimed to analyze the causes of calibration lead time. Several methods were used: Six Sigma, to determine the capability of the equipment calibration process; brainstorming, Pareto diagrams, and fishbone diagrams, to identify and analyze the problems; and the Analytic Hierarchy Process (AHP), to create a hierarchical structure and prioritize the problems. The results showed a DPMO value of about 40769.23, equivalent to a sigma level of approximately 3.24σ for equipment calibration, which indicated the need for improvements in the calibration process. Problem-solving strategies for the calibration lead time were then determined, such as shortening the preventive maintenance schedule, increasing the number of calibrator instruments, and training personnel. Consistency tests on all pairwise comparison matrices showed consistency ratios (CR) below 0.1.
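
    The DPMO-to-sigma conversion quoted above can be reproduced directly. A minimal sketch; the 1.5σ long-term shift is the standard Six Sigma convention, assumed here rather than stated in the abstract:

```python
from statistics import NormalDist

def sigma_level(dpmo, shift=1.5):
    """Convert defects-per-million-opportunities to a Six Sigma level,
    applying the conventional 1.5-sigma long-term shift."""
    yield_fraction = 1.0 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

# A DPMO of ~40769 reproduces the ~3.24 sigma quoted in the abstract.
level = sigma_level(40769.23)
```

    As a cross-check, the textbook benchmark of 6210 DPMO corresponds to a 4σ process under the same convention.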

  4. Multi-level approach for parametric roll analysis

    Science.gov (United States)

    Kim, Taeyoung; Kim, Yonghwan

    2011-03-01

    The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three computation methods, GM variation, impulse response function (IRF), and the Rankine panel method, are applied for the multi-level approach. The IRF and Rankine panel methods are based on a weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In parametric roll occurrence tests in regular waves, the IRF and Rankine panel methods show a similar tendency. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion shows a slight difference. Nonlinear roll motion in bichromatic waves is also considered in this study. To prove the existence of unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation, and the instability criteria are well predicted by the theoretical stability analysis. Fourier analysis verifies that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.
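
    The link between parametric roll and the Mathieu equation can be illustrated numerically. The sketch below integrates the undamped Mathieu equation x'' + (δ + ε cos t)x = 0 with RK4; the parameter values are illustrative assumptions, with δ = 1/4 placing the excitation at twice the natural frequency, the principal parametric resonance condition the abstract refers to:

```python
import math

def mathieu_amplitude(delta, eps, t_end=200.0, dt=0.01):
    """Integrate x'' + (delta + eps*cos t) x = 0 with RK4 and return
    the largest |x| seen; unbounded growth signals parametric instability."""
    def deriv(t, x, v):
        return v, -(delta + eps * math.cos(t)) * x
    x, v, t = 1.0, 0.0, 0.0
    peak = abs(x)
    for _ in range(int(t_end / dt)):
        k1x, k1v = deriv(t, x, v)
        k2x, k2v = deriv(t + dt/2, x + dt/2*k1x, v + dt/2*k1v)
        k3x, k3v = deriv(t + dt/2, x + dt/2*k2x, v + dt/2*k2v)
        k4x, k4v = deriv(t + dt, x + dt*k3x, v + dt*k3v)
        x += dt/6 * (k1x + 2*k2x + 2*k3x + k4x)
        v += dt/6 * (k1v + 2*k2v + 2*k3v + k4v)
        t += dt
        peak = max(peak, abs(x))
    return peak

# delta = 1/4 puts the excitation at twice the natural frequency
# sqrt(delta) = 1/2: the classic principal parametric resonance.
unstable = mathieu_amplitude(0.25, 0.20)   # inside the instability tongue
stable = mathieu_amplitude(0.60, 0.20)     # between the tongues
```

    The resonant case grows by many orders of magnitude while the detuned case stays bounded, which is exactly the frequency-ratio criterion the GM variation approach exploits.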

  5. Multi-level approach for parametric roll analysis

    Directory of Open Access Journals (Sweden)

    Taeyoung Kim

    2011-03-01

    Full Text Available The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three computation methods, GM variation, impulse response function (IRF), and the Rankine panel method, are applied for the multi-level approach. The IRF and Rankine panel methods are based on a weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In parametric roll occurrence tests in regular waves, the IRF and Rankine panel methods show a similar tendency. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion shows a slight difference. Nonlinear roll motion in bichromatic waves is also considered in this study. To prove the existence of unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation, and the instability criteria are well predicted by the theoretical stability analysis. Fourier analysis verifies that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.

  6. Theoretical Approach Regarding the Bases of Success and Reasons that May Lead to Failure in Project Management

    OpenAIRE

    Larisa Loredana Dragolea; Denisa Cotîrlea Denisa

    2012-01-01

    The present paper deals with theoretical concepts regarding the reasons that may lead to a project's failure or success. Its primary aim is to help readers get a view of the conceptualizations of project management and its phases, while providing an opportunity to see the interplay of factors that may determine a project's success. The paper also covers key factors and elements that may assure good management of projects. The present su...

  7. Thermal-hydraulic analysis of an innovative decay heat removal system for lead-cooled fast reactors

    Energy Technology Data Exchange (ETDEWEB)

    Giannetti, Fabio; Vitale Di Maio, Damiano; Naviglio, Antonio; Caruso, Gianfranco, E-mail: gianfranco.caruso@uniroma1.it

    2016-08-15

    Highlights: • LOOP thermal-hydraulic transient analysis for lead-cooled fast reactors. • Passive decay heat removal system concept to avoid lead freezing. • Solution developed for the diversification of the decay heat removal functions. • RELAP5 vs. RELAP5-3D comparison for lead applications. - Abstract: Improved safety requirements in GEN IV reactors call for more reliable safety systems, among which the decay heat removal system (DHR) is one of the most important. Complying with the diversification criteria and based on purely passive and very reliable components, an additional DHR for the ALFRED reactor (Advanced Lead Fast Reactor European Demonstrator) has been proposed and its thermal-hydraulic performance analyzed. It consists of a coupling of two innovative subsystems: the radiation-based direct heat exchanger (DHX) and the pool heat exchanger (PHX). Preliminary thermal-hydraulic analyses using the RELAP5 and RELAP5-3D© computer programs have shown that the whole system can safely operate, in natural circulation, for the long term. Sensitivity analyses for the emissivity of the DHX surfaces, the PHX water heat transfer coefficient (HTC) and the lead HTC have been carried out. In addition, the effects of the density variation uncertainty on the results have been analyzed and compared. This made it possible to assess the feasibility of the system and to evaluate the acceptable range of the studied parameters. The results obtained with RELAP5 and RELAP5-3D© have been compared, and the differences between the two codes for lead are analyzed. The features of the innovative DHR allow the decay heat removal performance to match the trend of the reactor decay heat power after shutdown, while minimizing the risk of lead freezing. This system, proposed for the diversification of the DHR in LFRs, could be applicable to other pool-type liquid metal fast reactors.
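
    The sensitivity to DHX surface emissivity can be illustrated with the textbook net-exchange formula for two large parallel gray surfaces. The temperatures below are hypothetical round numbers for illustration, not values from the study:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_flux(t_hot, t_cold, eps_hot, eps_cold):
    """Net radiative flux (W/m^2) between two large parallel gray
    surfaces: q = sigma*(T1^4 - T2^4) / (1/e1 + 1/e2 - 1)."""
    return SIGMA * (t_hot**4 - t_cold**4) / (1/eps_hot + 1/eps_cold - 1)

# Hypothetical temperatures: lead-side wall at 673 K, sink wall at 473 K.
# Raising the emissivity of both surfaces monotonically raises the flux,
# which is why the DHX emissivity is a natural sensitivity parameter.
fluxes = [radiative_flux(673.0, 473.0, eps, eps) for eps in (0.3, 0.6, 0.9)]
```
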

  8. Impact analysis of leading sub sector on basic sector to regional income in Siak Regency, Riau Province

    Science.gov (United States)

    Astuti, P.; Nugraha, I.; Abdillah, F.

    2018-02-01

    Siak regency has so far been known mainly as an oil-producing regency in Riau province, but according to the vision of Siak regency's spatial planning for 2031 there is a shift from petroleum towards other sectors such as agribusiness, agroindustry and tourism. The purpose of this study was to identify the base sectors and leading subsectors, their shifts and characteristics, and the development priority among the leading subsectors. The methods used in this research were Location Quotient (LQ), Shift Share, and Overlay analysis. Based on the GRDP (PDRB) document at constant 2000 prices, the base sector in Siak regency was mining and quarrying (LQ = 2.25); using the GRDP at constant 2000 prices without the oil and gas sector, the base sector was agriculture (LQ = 2.45). The leading subsector including mining and quarrying was oil and gas (1.02), while without oil and gas the leading subsectors were plantations (1.48) and forestry (1.73). The overlay analysis showed that agriculture as a base sector, with plantations and forestry as leading subsectors, has positive values and is categorized as progressive and competitive. These leading subsectors therefore receive high priority for development.
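
    The Location Quotient used above is a simple share-of-shares ratio. A minimal sketch with hypothetical GRDP figures chosen so the ratio reproduces an LQ of 2.25:

```python
def location_quotient(regional_sector, regional_total,
                      national_sector, national_total):
    """LQ = (regional sector share) / (national sector share).
    LQ > 1 marks a base (export-oriented) sector for the region."""
    return (regional_sector / regional_total) / (national_sector / national_total)

# Hypothetical GRDP figures (currency units), for illustration only:
# the sector holds 45% of the regional economy but 20% nationally.
lq = location_quotient(450.0, 1000.0, 1800.0, 9000.0)
```
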

  9. Inference on the Univariate Frailty Model: A Bayesian Reference Analysis Approach

    Science.gov (United States)

    Tomazella, Vera Lucia D.; Martins, Camila Bertini; Bernardo, Jose Miguel

    2008-11-01

    In this work we present an approach involving objective Bayesian reference analysis for the frailty model with univariate survival time and sources of heterogeneity that are not captured by covariates. The derivation of the unconditional hazard and survival functions leads to the Lomax distribution, also known as the Pareto distribution of the second kind. This distribution has an important position in life testing for fitting data on business failures. Reference analysis, introduced by Bernardo (1979), produces a new solution for this problem. The results are illustrated with survival data analyzed in the literature and with simulated data.
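
    The claim that a gamma-distributed frailty acting on an exponential lifetime yields a Lomax marginal survival, S(t) = (1 + θt)^(−k), can be checked by simulation. A minimal Monte Carlo sketch (the parameter values are illustrative):

```python
import random

def frailty_survival(shape, scale, t, n=200_000, seed=7):
    """Monte Carlo estimate of P(T > t) when T is exponential with a
    rate equal to a Gamma(shape, scale) frailty Z; marginally T should
    follow a Lomax law with S(t) = (1 + scale*t) ** (-shape)."""
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n):
        z = rng.gammavariate(shape, scale)   # unobserved heterogeneity
        lifetime = rng.expovariate(z)        # conditional exponential lifetime
        if lifetime > t:
            survivors += 1
    return survivors / n

empirical = frailty_survival(2.0, 1.0, t=1.0)
lomax = (1.0 + 1.0 * 1.0) ** -2.0            # closed-form Lomax survival
```

    The agreement follows from the gamma Laplace transform: E[exp(−Zt)] = (1 + θt)^(−k), which is exactly the Lomax survival function.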

  10. Blurring the Inputs: A Natural Language Approach to Sensitivity Analysis

    Science.gov (United States)

    Kleb, William L.; Thompson, Richard A.; Johnston, Christopher O.

    2007-01-01

    To document model parameter uncertainties and to automate sensitivity analyses for numerical simulation codes, a natural-language-based method to specify tolerances has been developed. With this new method, uncertainties are expressed in a natural manner, i.e., as one would on an engineering drawing, namely, 5.25 +/- 0.01. This approach is robust and readily adapted to various application domains because it does not rely on parsing the particular structure of input file formats. Instead, tolerances of a standard format are added to existing fields within an input file. As a demonstration of the power of this simple, natural language approach, a Monte Carlo sensitivity analysis is performed for three disparate simulation codes: fluid dynamics (LAURA), radiation (HARA), and ablation (FIAT). Effort required to harness each code for sensitivity analysis was recorded to demonstrate the generality and flexibility of this new approach.
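
    A tolerance token such as 5.25 +/- 0.01 can be recognized and perturbed without parsing the surrounding file format, which is the essence of the approach described. A minimal sketch (the regex, the uniform sampling choice and the variable name are assumptions for illustration, not the LAURA/HARA/FIAT implementation):

```python
import random
import re

# Matches a value with an attached tolerance, e.g. "5.25 +/- 0.01".
TOL = re.compile(r'(-?\d+(?:\.\d+)?)\s*\+/-\s*(\d+(?:\.\d+)?)')

def perturb(text, rng=random):
    """Replace every 'value +/- tol' token with a random draw from the
    stated interval, leaving the rest of the input file untouched."""
    def draw(match):
        value, tol = float(match.group(1)), float(match.group(2))
        return repr(rng.uniform(value - tol, value + tol))
    return TOL.sub(draw, text)

# A hypothetical input-file line; only the tolerance token is rewritten.
sampled = perturb("wall_temperature = 5.25 +/- 0.01")
```

    Because the substitution targets only the tolerance pattern, the same routine works on any input format, which mirrors the format-independence claimed for the method.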

  11. Helicopter Gas Turbine Engine Performance Analysis : A Multivariable Approach

    NARCIS (Netherlands)

    Arush, Ilan; Pavel, M.D.

    2017-01-01

    Helicopter performance relies heavily on the available output power of the engine(s) installed. A simplistic single-variable analysis approach is often used within the flight-testing community to reduce raw flight-test data in order to predict the available output power under different atmospheric

  12. An Approach to Scenario Analysis, Generation and Evaluation

    NARCIS (Netherlands)

    Chen, Y.; Van Zuylen, H.J.

    2014-01-01

    This article presents an operation-oriented approach for traffic management scenario generation, analysis and evaluation. We start taking a few most applied scenarios from a traffic control centre, analysing each component and structure of the whole, and evaluating the impact of each component and

  13. Semiotic Approach to the Analysis of Children's Drawings

    Science.gov (United States)

    Turkcan, Burcin

    2013-01-01

    Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…

  14. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  15. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology in contrast to classical fracture mechanics can be used to predict the onset of tearing fracture, and the effects of geometry in tubular joints. Finite element analysis of T-joints plate geometries, and tubular joints has been done. The parameters of constraint, equivalent stress, plastic strain and ...

  16. A pathway-centric approach to rare variant association analysis

    Science.gov (United States)

    Richardson, Tom G; Timpson, Nicholas J; Campbell, Colin; Gaunt, Tom R

    2017-01-01

    Current endeavours in rare variant analysis are typically underpowered when investigating association signals from individual genes. We undertook an approach to rare variant analysis which utilises biological pathway information to analyse functionally relevant genes together. Conventional filtering approaches for rare variant analysis are based on variant consequence and are therefore confined to coding regions of the genome. Therefore, we undertook a novel approach to this process by obtaining functional annotations from the Combined Annotation Dependent Depletion (CADD) tool, which allowed potentially deleterious variants from intronic regions of genes to be incorporated into analyses. This work was undertaken using whole-genome sequencing data from the UK10K project. Rare variants from the KEGG pathway for arginine and proline metabolism were collectively associated with systolic blood pressure (P = 3.32 × 10−5) based on analyses using the optimal sequence kernel association test. Variants along this pathway also showed evidence of replication using imputed data from the Avon Longitudinal Study of Parents and Children cohort (P = 0.02). Subsequent analyses found that the strength of evidence diminished when analysing genes in this pathway individually, suggesting that they would have been overlooked in a conventional gene-based analysis. Future studies that adopt similar approaches to investigate polygenic effects should yield value in better understanding the genetic architecture of complex disease. PMID:27577545

  17. Grounded theory approach in sermon analysis of sermons on ...

    African Journals Online (AJOL)

    In this article, I am going to discuss the place of grounded theory in qualitative research and the application of Charmaz's approach to it in homiletics. The process of sermon analysis in its different phases will be discussed as well as the interaction of this bottom-up theory with existing homiletic theories in relation to the ...

  18. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    Science.gov (United States)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  19. Metabolic pathway analysis using a nash equilibrium approach

    NARCIS (Netherlands)

    Lucia, Angelo; DiMaggio, Peter A.; Alonso-Martinez, Diego

    2018-01-01

    A novel approach to metabolic network analysis using a Nash Equilibrium (NE) formulation is proposed in which enzymes are considered players in a multi-player game. Each player has its own payoff function with the objective of minimizing the Gibbs free energy associated with the biochemical

  20. Thermal analysis of selected tin-based lead-free solder alloys

    DEFF Research Database (Denmark)

    Palcut, Marián; Sopoušek, J.; Trnková, L.

    2009-01-01

    ) and thermodynamic calculations using the CALPHAD approach. The amount of the alloying elements in the materials was chosen to be close to the respective eutectic composition and the nominal compositions were the following: Sn-3.7Ag-0.7Cu, Sn-1.0Ag-0.5Cu-1Bi (in wt.%). Thermal effects during melting and solidifying...

  1. [Medical doctors driving technological innovation: questions about and innovation management approaches to incentive structures for lead users].

    Science.gov (United States)

    Bohnet-Joschko, Sabine; Kientzler, Fionn

    2010-01-01

    Management science defines user-generated innovations as open innovation and lead user innovation. The medical technology industry finds user-generated innovations profitable and even indispensable. Innovative medical doctors as lead users need medical technology innovations in order to improve patient care. Their motivation to innovate is mostly intrinsic. But innovations may also involve extrinsic motivators such as gain in reputation or monetary incentives. Medical doctors' innovative activities often take place in hospitals and are thus embedded into the hospital's organisational setting. Hospitals find it difficult to gain short-term profits from in-house generated innovations and sometimes hesitate to support them. Strategic investment in medical doctors' innovative activities may be profitable for hospitals in the long run if innovations provide first-mover competitive advantages. Industry co-operations with innovative medical doctors offer chances but also bear potential risks. Innovative ideas generated by expert users may result in even higher complexity of medical devices; this could cause mistakes when applied by less specialised users and thus affect patient safety. Innovations that yield benefits for patients, medical doctors, hospitals and the medical technology industry can be advanced by offering adequate support for knowledge transfer and co-operation models.

  2. The modelling of lead removal from water by deep eutectic solvents functionalized CNTs: artificial neural network (ANN) approach.

    Science.gov (United States)

    Fiyadh, Seef Saadi; AlSaadi, Mohammed Abdulhakim; AlOmar, Mohamed Khalid; Fayaed, Sabah Saadi; Hama, Ako R; Bee, Sharifah; El-Shafie, Ahmed

    2017-11-01

    The main challenge in simulating lead removal is the non-linearity of the relationships between the process parameters. Conventional modelling techniques usually treat this problem linearly. An alternative is an artificial neural network (ANN), which was selected here to reflect the non-linearity in the interactions among the variables. Synthesized deep eutectic solvents were used as a functionalizing agent with carbon nanotubes as adsorbents of Pb2+. The parameters varied in the adsorption study included pH (2.7 to 7), adsorbent dosage (5 to 20 mg), contact time (3 to 900 min) and Pb2+ initial concentration (3 to 60 mg/l). The system was fed and trained with 158 experimental runs carried out at laboratory scale. Two ANN types were designed in this work, feed-forward back-propagation and layer recurrent; the two were compared on their predictive proficiency in terms of the mean square error (MSE), root mean square error, relative root mean square error, mean absolute percentage error and determination coefficient (R2) on the testing dataset. The ANN model of lead removal was subjected to accuracy determination and achieved an R2 of 0.9956 with an MSE of 1.66 × 10−4. The maximum relative error was 14.93% for the feed-forward back-propagation neural network model.
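
    The error measures used to compare the two ANN variants are all simple functions of the test-set residuals. A minimal sketch with toy removal-efficiency numbers (the data are invented for illustration):

```python
import math

def regression_metrics(y_true, y_pred):
    """Compute the comparison metrics named in the abstract: MSE, RMSE,
    relative RMSE, MAPE (%) and the determination coefficient R2."""
    n = len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    mse = ss_res / n
    rmse = math.sqrt(mse)
    mean_t = sum(y_true) / n
    rrmse = rmse / mean_t
    mape = 100.0 / n * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    return {"MSE": mse, "RMSE": rmse, "RRMSE": rrmse, "MAPE": mape, "R2": r2}

# Toy removal-efficiency data (%), purely illustrative.
m = regression_metrics([90.0, 80.0, 70.0, 60.0], [89.0, 81.0, 69.0, 61.0])
```

    Reporting several of these together is useful because MSE penalizes large residuals while MAPE and RRMSE normalize the error, making models trained on different concentration ranges comparable.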

  3. A Diversified Recruitment Approach Incorporating Social Media Leads to Research Participation Among Young Adult-Aged Female Cancer Survivors.

    Science.gov (United States)

    Gorman, Jessica R; Roberts, Samantha C; Dominick, Sally A; Malcarne, Vanessa L; Dietz, Andrew C; Su, H Irene

    2014-06-01

    Purpose: Cancer survivors in their adolescent and young adult (AYA) years are an understudied population, possibly in part because of the high effort required to recruit them into research studies. The aim of this paper is to describe the specific recruitment strategies used in four studies recruiting AYA-aged female cancer survivors and to identify the highest yielding approaches. We also discuss challenges and recommendations. Methods: We recruited AYA-aged female cancer survivors for two studies conducted locally and two conducted nationally. Recruitment strategies included outreach and referral via: healthcare providers and clinics; social media and the internet; community and word of mouth; and a national fertility information hotline. We calculated the yield of each recruitment approach for the local and national studies by comparing the number that participated to the number of potential participants. Results: We recruited a total of 534 participants into four research studies. Seventy-one percent were diagnosed as young adults and 61% were within 3 years of their cancer diagnosis. The highest-yielding local recruitment strategy was healthcare provider and clinic referral. Nationally, social media and internet outreach yielded the highest rate of participation. Overall, internet-based recruitment resulted in the highest number and yield of participants. Conclusion: Our results suggest that outreach through social media and the internet are effective approaches to recruiting AYA-aged female cancer survivors. Forging collaborative relationships with survivor advocacy groups' members and healthcare providers also proved beneficial.

  4. Adaptation and implementation of the TRACE code for transient analysis on designs of cooled lead fast reactors

    International Nuclear Information System (INIS)

    Lazaro, A.; Ammirabile, L.; Martorell, S.

    2014-01-01

    The article describes the changes implemented in the TRACE code to include thermodynamic tables for liquid lead drawn from experimental results. It then explains the process of developing a thermal-hydraulic model for the ALFRED prototype and the analysis of a selection of representative transients conducted within the framework of international research projects. The study demonstrates the applicability of the TRACE code to simulating lead-cooled fast reactor designs and shows the high safety margins available in this technology to accommodate the most severe transients identified in its safety study. (Author)

  5. An approach to transdisciplinary analysis of Health Law complex

    Directory of Open Access Journals (Sweden)

    Mártin Marks Szinvelski

    2016-12-01

    Full Text Available The process of social evolution has an impact on the structure of social systems, either because new rights make processes more complex, or because the complexity-reducing function inherent in each system itself leads to increased complexity. The right to health is fertile ground for analyzing this dual movement of increasing and reducing complexity. In this article, the analysis is based on a vision that goes beyond the limits of a single science: the Social Health System is analyzed in view of the transdisciplinary impact on ensuring the right to health.

  6. A Chemoinformatics Approach to the Discovery of Lead-Like Molecules from Marine and Microbial Sources En Route to Antitumor and Antibiotic Drugs

    Science.gov (United States)

    Pereira, Florbela; Latino, Diogo A. R. S.; Gaudêncio, Susana P.

    2014-01-01

    The comprehensive information of small molecules and their biological activities in the PubChem database allows chemoinformatic researchers to access and make use of large-scale biological activity data to improve the precision of drug profiling. A Quantitative Structure–Activity Relationship approach, for classification, was used for the prediction of active/inactive compounds relatively to overall biological activity, antitumor and antibiotic activities using a data set of 1804 compounds from PubChem. Using the best classification models for antibiotic and antitumor activities a data set of marine and microbial natural products from the AntiMarin database were screened—57 and 16 new lead compounds for antibiotic and antitumor drug design were proposed, respectively. All compounds proposed by our approach are classified as non-antibiotic and non-antitumor compounds in the AntiMarin database. Recently several of the lead-like compounds proposed by us were reported as being active in the literature. PMID:24473174
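
    Similarity screening of the kind underlying such QSAR classification is often built on the Tanimoto coefficient between molecular fingerprints. A minimal sketch (the fingerprints, threshold and decision rule are invented illustrations, not the paper's trained models):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of
    'on' bit positions: 1.0 means identical, 0.0 means disjoint."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 1.0

def predict_active(query, actives, inactives, threshold=0.5):
    """Label a query 'active' if its best similarity to the known
    actives beats both a threshold and its best match among the
    inactives (a minimal similarity-based classifier)."""
    best_act = max(tanimoto(query, fp) for fp in actives)
    best_inact = max(tanimoto(query, fp) for fp in inactives)
    return best_act >= threshold and best_act > best_inact

# Toy fingerprints: the query shares 3 of 5 bits with an active compound.
actives = [{1, 2, 3, 4}, {2, 3, 4, 5}]
inactives = [{10, 11, 12}]
hit = predict_active({1, 2, 3, 9}, actives, inactives)
```
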

  7. Increase in Organization Effectiveness Using Voice Analysis: The System Approach

    Directory of Open Access Journals (Sweden)

    Lina Bartkienė

    2011-04-01

    Full Text Available The main purpose of this article is to analyze literature related to systems theory and to present a system for increasing organizational effectiveness using voice analysis. The concepts of the systems approach are analyzed, and the definition of a system, its components and its classification are discussed. Following the principles of systems theory, the system for increasing organizational effectiveness using voice analysis was designed. Each element is briefly discussed, i.e. processes influencing the employee, the environment, the voice analysis system, the expert system, and prime and final organizational effectiveness. In addition, the relations between these elements are identified. Article in Lithuanian

  8. Nurse faculty members' ego states: transactional analysis approach.

    Science.gov (United States)

    Keçeci, Ayla; Taşocak, Gülsün

    2009-10-01

    This study uses a Transactional Analysis Approach (TA) to investigate communication between faculty and students in nursing education. The research population was comprised of nurse faculty members (N=33) employed at a school of nursing and students (N=482) registered at the same school. The research sample was comprised of 26 faculty members and 325 students. Data collection was performed via questionnaires, focus group interviews and observation. Qualitative data were analyzed using descriptive analysis methods, and quantitative data were evaluated using the Mann-Whitney U test and the Pearson moment correlation coefficients technique. Using the Transactional Analysis Approach (TA), faculty members viewed themselves as an Adult and felt they used the Critical Parent ego state the least. Students also perceived that faculty members used the Adult ego state the most and used the Free Child ego state the least.

  9. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the elements of the method: the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT); the development of the associated timeline to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE); and an approach to quantification that is based on explanations of why the HFE might occur.

  10. Citation analysis: A social and dynamic approach to knowledge organization

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2013-01-01

    …other and thereby indicating kinds of relatedness and semantic distance. It is therefore important to view bibliometric techniques as a family of approaches to KO in order to illustrate their relative strengths and weaknesses. The subfield of bibliometrics concerned with citation analysis forms a distinct approach to KO which is characterized by its social, historical and dynamic nature, its close dependence on scholarly literature and its explicit kind of literary warrant. The two main methods, co-citation analysis and bibliographic coupling, represent different things and thus neither can be considered superior for all purposes. The main difference between traditional knowledge organization systems (KOSs) and maps based on citation analysis is that the first group represents intellectual KOSs, whereas the second represents social KOSs. For this reason bibliometric maps cannot be expected ever…

  11. Influence of Superconducting Leads Energy Gap on Electron Transport Through Double Quantum Dot by Markovian Quantum Master Equation Approach

    International Nuclear Information System (INIS)

    Afsaneh, E.; Yavari, H.

    2014-01-01

    The effect of a superconducting reservoir on the current-carrying transport of a double quantum dot in the Markovian regime is investigated. For this purpose, a quantum master equation at finite temperature is derived for the many-body density matrix of an open quantum system. The dynamics and the steady-state properties of the double quantum dot system for arbitrary bias are studied. We show how the populations and coherences of the system states are affected by superconducting leads. The energy parameter of the system contains essentially four contributions: the coupling between the dot system and the electrodes, the intra-dot coupling, the inter-dot coupling between the two quantum dots, and the superconducting gap. The effect of each energy contribution on the currents and coherences is examined. In addition, the effect of the energy gap is studied by considering the amplitude and lifetime of the coherences, in order to obtain more current through the system. (author)

  12. Analysis of Traffic Accidents Leading to Death Using Tripod Beta Method in Yazd, Iran

    Directory of Open Access Journals (Sweden)

    Mehrzad Ebrahemzadih

    2016-06-01

    Full Text Available This study sought the root causes of road accidents in order to prevent their occurrence. This descriptive-analytic retrospective study assessed 1,000 cases of road accidents leading to death during 2003-2013 using the Tripod Beta method. The latent problems, the contributing preconditions, and corrective strategies for preventing these accidents were determined. The findings revealed that violations of traffic safety rules, especially deliberate violations and risk-taking, decreased with increasing age. Among the superficial problems, illegal and impermissible driving speed accounted for 19.10%; among the preconditions, violation of safety rules accounted for 32.6%; and among the latent problems, financial constraints and time pressure in the design and manufacture of cars, together with the quality of city streets and roads, accounted for 20.1% of the leading causes of the accidents in this study.

  13. In vitro and in vivo approaches for the measurement of oral bioavailability of lead (Pb) in contaminated soils: A review

    Energy Technology Data Exchange (ETDEWEB)

    Zia, Munir Hussain, E-mail: MunirZia@gmail.com [Technical Services Department, Fauji Fertilizer Company Limited, Lahore (Pakistan); USDA-ARS, Environmental Management and By-products Utilization Laboratory, Bldg. 007, BARC-West, Beltsville, MD 20705-2350 (United States); Codling, Eton E. [USDA-ARS, Environmental Management and By-products Utilization Laboratory, Bldg. 007, BARC-West, Beltsville, MD 20705-2350 (United States); Scheckel, Kirk G. [US-Environmental Protection Agency, National Risk Management Research Laboratory Land Remediation and Pollution Control Division, 5995 Center Hill Avenue, Cincinnati, OH 45224-1702 (United States); Chaney, Rufus L. [USDA-ARS, Environmental Management and By-products Utilization Laboratory, Bldg. 007, BARC-West, Beltsville, MD 20705-2350 (United States)

    2011-10-15

    We reviewed the published evidence of lead (Pb) contamination of urban soils, soil Pb risk to children through hand-to-mouth activity, reduction of soil Pb bioavailability due to soil amendments, and methods to assess bioaccessibility which correlate with bioavailability of soil Pb. Feeding tests have shown that urban soils may have much lower Pb bioavailability than previously assumed. Hence bioavailability of soil Pb is the important measure for protection of public health, not total soil Pb. Chemical extraction tests (Pb bioaccessibility) have been developed which are well correlated with the results of bioavailability tests; application of these tests can save money and time compared with feeding tests. Recent findings have revealed that fractional bioaccessibility (bioaccessible compared to total) of Pb in urban soils is only 5-10% of total soil Pb, far lower than the 60% as bioavailable as food-Pb presumed by U.S.-EPA (30% absolute bioavailability used in IEUBK model). - Highlights:
    - Among direct exposure pathways for Pb in urban environments, inadvertent ingestion of soil is considered the major concern.
    - The concentration of lead in house dusts is significantly related to that in garden soil, and is highest at older homes.
    - In modeling risks from diet/water/soil Pb, US-EPA presumes that soil-Pb is 60% as bioavailable as other dietary Pb.
    - The Joplin study proved that the RBALP method seriously underestimated the ability of phosphate treatments to reduce soil Pb bioavailability.
    - The Zia et al. method has revealed that urban soils have only 5-10% bioaccessible Pb of total Pb level.
    Improved risk evaluation and recommendations for Pb-contaminated soils should be based on bioavailability-correlated bioaccessible soil Pb rather than total soil Pb.

  14. Occupational exposure and biological evaluation of lead in Iranian workers-a systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Kourosh Sayehmiri

    2016-09-01

    Full Text Available Introduction: Lead exposure is considered a global health problem. The irreparable harmful effects of this heavy metal on humans have been proven in various studies. Compared to the general population, workers in related industries are more exposed to lead. Several studies have investigated occupational lead exposure and its biological evaluation in Iran; however, there is no overall estimate. Thus, the present study was conducted to determine the occupational exposure to lead and its biological evaluation in Iranian workers, using systematic review and meta-analysis. Material and Method: This study was carried out based on information obtained from databases including Magiran, Iranmedex, SID, Medlib, Trials Register, Scopus, Pubmed, Science Direct, Cochran, Embase, Medline, Web of Science, Springer, Online Library Wiley, and Google Scholar from 1991 to 2016, using standard key words. All reviewed papers that met the inclusion criteria were evaluated. Data were combined according to a random-effects model using Stata software version 11.1. Result: In the 34 qualified studies, the mean blood lead level (BLL) in Iranian workers was estimated at 42.8 µg/dl (95% CI: 35.15-50.49). The minimum and maximum BLLs belonged to the western (28.348 µg/dl) and central (45.928 µg/dl) regions of Iran, respectively. Considering different occupations, the lowest mean value was reported in textile industry workers (12.3 µg/dl), while the highest was in zinc-lead mine workers (72.6 µg/dl). The mean lead level in the breathing air of Iranian workers, reported in 4 studies, was estimated at 0.23 mg/m3 (95% CI: 0.14-0.33). Conclusion: Given the high lead concentrations in blood and breathing air, it is recommended to increase protective measures and the frequency of screening. Scheduled clinical and paraclinical examinations should also be performed for workers.
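    A common implementation of the random-effects model used in such meta-analyses is the DerSimonian-Laird estimator; a minimal Python sketch follows (the study means and standard errors are illustrative assumptions, not the paper's data):

    ```python
    import math

    def dersimonian_laird(means, ses):
        """Pool study means via the DerSimonian-Laird random-effects model."""
        w = [1 / se ** 2 for se in ses]               # inverse-variance weights
        fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
        # Cochran's Q and the between-study variance tau^2
        q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(means) - 1)) / c)
        # random-effects weights fold tau^2 into each study's variance
        w_re = [1 / (se ** 2 + tau2) for se in ses]
        pooled = sum(wi * m for wi, m in zip(w_re, means)) / sum(w_re)
        se_p = math.sqrt(1 / sum(w_re))
        return pooled, (pooled - 1.96 * se_p, pooled + 1.96 * se_p)

    # hypothetical study means (µg/dl) and standard errors, not the paper's data
    mean_bll, ci95 = dersimonian_laird([38.0, 51.0, 41.5], [3.0, 4.5, 2.5])
    ```

    The between-study variance tau² widens the pooled confidence interval relative to a fixed-effect analysis, which is why heterogeneous occupational studies are usually combined this way.
    
    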

  15. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse the qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semistructured interview about a critical incident from work, and it may be applied in various domains such as emergency services, military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way of conducting the interviews, and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory, or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed in detail here. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First, a decision chart showing the main decision points is made, followed by the incident summary. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs.

  16. Following the clues to neuropathic pain. Distribution and other leads reveal the cause and the treatment approach.

    Science.gov (United States)

    Belgrade, M J

    1999-11-01

    Neuropathic pain can seem enigmatic at first because it can last indefinitely and often a cause is not evident. However, heightened awareness of typical characteristics, such as the following, makes identification fairly easy:
    - The presence of certain accompanying conditions (e.g., diabetes, HIV or herpes zoster infection, multiple sclerosis)
    - Pain described as shooting, stabbing, lancinating, burning, or searing
    - Pain worse at night
    - Pain following anatomic nerve distribution
    - Pain in a numb or insensate site
    - The presence of allodynia
    Neuropathic pain responds poorly to standard pain therapies and usually requires specialized medications (e.g., anticonvulsants, tricyclic antidepressants, opioid analgesics) for optimal control. Successful pain control is enhanced with use of a systematic approach consisting of disease modification, local or regional measures, and systemic therapy.

  17. Lead Us Not into Tanktation: A Simulation Modelling Approach to Gain Insights into Incentives for Sporting Teams to Tank

    Science.gov (United States)

    Tuck, Geoffrey N.; Whitten, Athol R.

    2013-01-01

    Annual draft systems are the principal method used by teams in major sporting leagues to recruit amateur players. These draft systems frequently take one of three forms: a lottery style draft, a weighted draft, or a reverse-order draft. Reverse-order drafts can create incentives for teams to deliberately under-perform, or tank, due to the perceived gain from obtaining quality players at higher draft picks. This paper uses a dynamic simulation model that captures the key components of a win-maximising sporting league, including the amateur player draft, draft choice error, player productivity, and between-team competition, to explore how competitive balance and incentives to under-perform vary according to league characteristics. We find reverse-order drafts can lead to some teams cycling between success and failure and to other teams being stuck in mid-ranking positions for extended periods of time. We also find that an incentive for teams to tank exists, but that this incentive decreases (i) as uncertainty in the ability to determine quality players in the draft increases, (ii) as the number of teams in the league reduces, (iii) as team size decreases, and (iv) as the number of teams adopting a tanking strategy increases. Simulation models can be used to explore complex stochastic dynamic systems such as sports leagues, where managers face difficult decisions regarding the structure of their league and the desire to maintain competitive balance. PMID:24312243
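    A toy version of such a league simulation, with a reverse-order draft and scouting (draft choice) error, can be sketched in Python; every parameter below is an illustrative assumption, not the authors' calibration:

    ```python
    import random

    def simulate_league(n_teams=8, seasons=40, scout_sd=0.0, seed=7):
        """Toy win-maximising league with a reverse-order amateur draft.

        Each season teams are ranked by (noisy) strength; the draft then
        hands the best *perceived* prospect to the worst team.  scout_sd
        models draft choice error.  Returns the champion of each season.
        """
        rng = random.Random(seed)
        strengths = [rng.uniform(0.4, 0.6) for _ in range(n_teams)]
        champions = []
        for _ in range(seasons):
            wins = [s + rng.gauss(0, 0.02) for s in strengths]
            order = sorted(range(n_teams), key=wins.__getitem__)  # worst first
            champions.append(order[-1])
            # prospect pool: true values, reshuffled by scouting error
            true_value = sorted((rng.uniform(0.0, 0.06) for _ in range(n_teams)),
                                reverse=True)
            perceived = sorted(range(n_teams),
                               key=lambda i: true_value[i] + rng.gauss(0, scout_sd),
                               reverse=True)
            for pick, team in enumerate(order):   # worst team picks first
                # roster turnover: old strength decays, drafted talent arrives
                strengths[team] = 0.9 * strengths[team] + true_value[perceived[pick]]
        return champions
    ```

    Tracking how often each team appears in `champions` as `scout_sd` grows gives a rough feel for the success-failure cycling and the weakening tanking incentive described above.
    
    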

  18. Error propagation in spatial modeling of public health data: a simulation approach using pediatric blood lead level data for Syracuse, New York.

    Science.gov (United States)

    Lee, Monghyeon; Chun, Yongwan; Griffith, Daniel A

    2018-04-01

    Lead poisoning produces serious health problems, which are worse when a victim is younger. The US government and society have tried to prevent lead poisoning, especially since the 1970s; however, lead exposure remains prevalent. Lead poisoning analyses frequently use georeferenced blood lead level data. Like other types of data, these spatial data may contain uncertainties, such as location and attribute measurement errors, which can propagate to analysis results. For this paper, simulation experiments are employed to investigate how selected uncertainties impact regression analyses of blood lead level data in Syracuse, New York. In these simulations, location error and attribute measurement error, as well as a combination of these two errors, are embedded into the original data, and then these data are aggregated into census block group and census tract polygons. These aggregated data are analyzed with regression techniques, and comparisons are reported between the regression coefficients and their standard errors for the error added simulation results and the original results. To account for spatial autocorrelation, the eigenvector spatial filtering method and spatial autoregressive specifications are utilized with linear and generalized linear models. Our findings confirm that location error has more of an impact on the differences than does attribute measurement error, and show that the combined error leads to the greatest deviations. Location error simulation results show that smaller administrative units experience more of a location error impact, and, interestingly, coefficients and standard errors deviate more from their true values for a variable with a low level of spatial autocorrelation. These results imply that uncertainty, especially location error, has a considerable impact on the reliability of spatial analysis results for public health data, and that the level of spatial autocorrelation in a variable also has an impact on modeling results.
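    The core simulation design, embedding attribute measurement error into the data and comparing the refitted regression coefficients against the error-free fit, can be sketched as follows (illustrative data and plain OLS, not the spatial specifications used in the paper):

    ```python
    import random

    def slope(xs, ys):
        """Ordinary least-squares slope of y regressed on x."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        return num / sum((a - mx) ** 2 for a in xs)

    def attribute_error_impact(xs, ys, sd_err, trials=200, seed=0):
        """Embed random attribute measurement error into x, refit the
        regression each time, and report the mean deviation of the slope
        from its error-free value."""
        rng = random.Random(seed)
        clean = slope(xs, ys)
        devs = [slope([v + rng.gauss(0, sd_err) for v in xs], ys) - clean
                for _ in range(trials)]
        return clean, sum(devs) / trials

    # illustrative data: a clean linear relationship y = 2x
    x = [i / 10 for i in range(100)]
    y = [2 * v for v in x]
    clean, mean_dev = attribute_error_impact(x, y, sd_err=2.0)
    # classical measurement error attenuates the slope, so mean_dev is negative
    ```

    The same harness could perturb coordinates before aggregation to mimic the paper's location-error experiments.
    
    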

  19. The ECE Culminating Design Experience: Analysis of ABET 2000 Compliance at Leading Academic Institutions

    Science.gov (United States)

    2006-05-01

    413: Monolithic Amplifier Circuits. Analysis and design of BJT and MOS multi-transistor amplifiers. Feedback theory and application to feedback... Overview of electronic properties of semiconductors. Metal-semiconductor contacts, pn junctions, bipolar transistors, and MOS field-effect transistors. Properties that are significant to device operation for integrated circuits. Silicon device fabrication technology. EE 140: Linear

  20. A Comprehensive Analysis of Breast Cancer News Coverage in Leading Media Outlets Focusing on Environmental Risks and Prevention

    OpenAIRE

    ATKIN, CHARLES K.; SMITH, SANDI W.; McFETERS, COURTNAY; FERGUSON, VANESSA

    2008-01-01

    Breast cancer has a high profile in the news media, which are a major source of information for cancer patients and the general public. To determine the nature of breast cancer news coverage available to audiences, particularly on the topics of environmental risks and prevention, this content analysis measured a broad array of dimensions in 231 stories appearing in nine leading newspapers, newsmagazines, and television networks in 2003 and 2004. One fourth of all stories reported on various r...

  1. Discordant diagnoses obtained by different approaches in antithrombin mutation analysis

    DEFF Research Database (Denmark)

    Feddersen, Søren; Nybo, Mads

    2014-01-01

    OBJECTIVES: In hereditary antithrombin (AT) deficiency it is important to determine the underlying mutation since the future risk of thromboembolism varies considerably between mutations. DNA investigations are in general thought of as flawless and irrevocable, but the diagnostic approach can be critical. We therefore investigated mutation results in the AT gene, SERPINC1, with two different approaches. DESIGN AND METHODS: Sixteen patients referred to the Centre for Thrombosis and Haemostasis, Odense University Hospital, with biochemical indications of AT deficiency, but with a negative denaturing high-performance liquid chromatography (DHPLC) mutation screening (routine approach until recently) were included. As an alternative mutation analysis, direct sequencing of all exons and exon-intron boundaries without pre-selection by DHPLC was performed. RESULTS: Out of sixteen patients...

  2. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.

  3. Multiobjective engineering design optimization problems: a sensitivity analysis approach

    Directory of Open Access Journals (Sweden)

    Oscar Brito Augusto

    2012-12-01

    Full Text Available This paper proposes two new approaches for the sensitivity analysis of multiobjective design optimization problems whose performance functions are highly susceptible to small variations in the design variables and/or design environment parameters. In both methods, the less sensitive design alternatives are preferred over others during the multiobjective optimization process. In the first approach, the designer chooses the design variables and/or parameters that cause uncertainties, associates a robustness index with each design alternative, and adds each index as an objective function in the optimization problem. In the second approach, the designer must know, a priori, the interval of variation in the design variables or in the design environment parameters, because the designer will be accepting the corresponding interval of variation in the objective functions. The second method does not require any probability distribution law for the uncontrollable variations. Finally, the authors give two illustrative examples to highlight the contributions of the paper.
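    A robustness index of the kind used in the first approach can be illustrated with a minimal sketch; the quadratic objective and the perturbation bounds are hypothetical:

    ```python
    import random

    def robustness_index(f, x, deltas, samples=200, seed=0):
        """Worst observed change in objective f under random perturbations
        of the design variables within +/- deltas; a smaller index marks a
        less sensitive design alternative."""
        rng = random.Random(seed)
        nominal = f(x)
        worst = 0.0
        for _ in range(samples):
            xp = [xi + rng.uniform(-d, d) for xi, d in zip(x, deltas)]
            worst = max(worst, abs(f(xp) - nominal))
        return worst

    # example objective: a simple quadratic; the design at the flat minimum
    # is far more robust than one on the steep flank
    f = lambda v: sum(vi * vi for vi in v)
    flat = robustness_index(f, [0.0, 0.0], [0.1, 0.1])
    steep = robustness_index(f, [10.0, 10.0], [0.1, 0.1])
    ```

    Adding such an index as an extra objective, as the paper proposes, lets the optimizer trade performance against sensitivity explicitly.
    
    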

  4. Relationship between mediation analysis and the structured life course approach

    Science.gov (United States)

    Howe, Laura D; Smith, Andrew D; Macdonald-Wallis, Corrie; Anderson, Emma L; Galobardes, Bruna; Lawlor, Debbie A; Ben-Shlomo, Yoav; Hardy, Rebecca; Cooper, Rachel; Tilling, Kate; Fraser, Abigail

    2016-01-01

    Abstract Many questions in life course epidemiology involve mediation and/or interaction because of the long latency period between exposures and outcomes. In this paper, we explore how mediation analysis (based on counterfactual theory and implemented using conventional regression approaches) links with a structured approach to selecting life course hypotheses. Using theory and simulated data, we show how the alternative life course hypotheses assessed in the structured life course approach correspond to different combinations of mediation and interaction parameters. For example, an early life critical period model corresponds to a direct effect of the early life exposure, but no indirect effect via the mediator and no interaction between the early life exposure and the mediator. We also compare these methods using an illustrative real-data example using data on parental occupational social class (early life exposure), own adult occupational social class (mediator) and physical capability (outcome). PMID:27681097
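    The correspondence between path coefficients and mediation parameters can be illustrated with simulated data, much as the paper does; the coefficients below are arbitrary. Under a no-interaction model M = aX + e1, Y = cX + bM + e2, the slope of Y on X recovers the total effect c + ab:

    ```python
    import random

    def slope(u, v):
        """Ordinary least-squares slope of v regressed on u."""
        mu, mv = sum(u) / len(u), sum(v) / len(v)
        return (sum((p - mu) * (q - mv) for p, q in zip(u, v))
                / sum((p - mu) ** 2 for p in u))

    def simulate_mediation(n=20000, a=0.5, b=0.4, c=0.3, seed=1):
        """Simulate exposure X, mediator M and outcome Y with no interaction:
        M = a*X + e1 and Y = c*X + b*M + e2, so the indirect effect is a*b,
        the direct effect is c, and the total effect is c + a*b."""
        rng = random.Random(seed)
        xs = [rng.gauss(0, 1) for _ in range(n)]
        ms = [a * x + rng.gauss(0, 1) for x in xs]
        ys = [c * x + b * m + rng.gauss(0, 1) for x, m in zip(xs, ms)]
        total = slope(xs, ys)    # recovers c + a*b (true value 0.5 here)
        path_a = slope(xs, ms)   # recovers a (true value 0.5 here)
        return total, path_a

    total_effect, a_hat = simulate_mediation()
    ```

    Setting c = 0 in this sketch reproduces a pure early-life critical period model (direct effect only), matching the correspondence described above.
    
    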

  5. Personalized translational epilepsy research - Novel approaches and future perspectives: Part I: Clinical and network analysis approaches.

    Science.gov (United States)

    Rosenow, Felix; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Bauer, Sebastian

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. Part I includes the clinical phenotyping and diagnostic methods, EEG network-analysis, biomarkers, and personalized treatment approaches. In Part II, experimental and translational approaches will be discussed (Bauer et al., 2017) [1]. 

  6. Uncertainties leading to the use of fuzzy risk analysis of hydrogen safety

    Energy Technology Data Exchange (ETDEWEB)

    Viola, J.; Turksen, I. B.; Venter, R. D. [University of Toronto, Dept. of Mechanical and Industrial Engineering, Toronto ON (Canada)

    2004-07-01

    Safety risks involved with the expanded use of hydrogen as a fuel are discussed. Since the designation of the level of risk is subjective, the sources and level of uncertainty and risk are also examined to determine the degree to which uncertainty and risk are functions of relevant fuel characteristics pertaining to safety, or of the methods of measurement, accuracy and relative importance of these characteristics. Based on the examination of these sources of uncertainties, an analytical procedure based on fuzzy logic is proposed as the most appropriate. Application of fuzzy risk analysis in a variety of situations ranging from economic and investment choices to project analysis and safety are shown to demonstrate the technique's usefulness in assessing the potential risk of using hydrogen as an alternative to fossil fuels. 13 refs., 1 tab., 1 fig.

  7. Cadmium, lead, mercury and arsenic in animal feed and feed materials - trend analysis of monitoring results.

    Science.gov (United States)

    Adamse, Paulien; Van der Fels-Klerx, H J Ine; de Jong, Jacob

    2017-08-01

    This study aimed to obtain insights into the presence of cadmium, lead, mercury and arsenic in feed materials and feed over time for the purpose of guiding national monitoring. Data from the Dutch feed monitoring programme and from representatives of the feed industry during the period 2007-13 were used. Data covered a variety of feed materials and compound feeds in the Netherlands. Trends in the percentage of samples that exceeded the maximum limit (ML) set by the European Commission, and trends in average, median and 90th percentile concentrations of each of these elements were investigated. Based on the results, monitoring should focus on feed material of mineral origin, feed material of marine origin, especially fish meal, seaweed and algae, as well as feed additives belonging to the functional groups of (1) trace elements (notably cupric sulphate, zinc oxide and manganese oxide for arsenic) and (2) binders and anti-caking agents. Mycotoxin binders are a new group of feed additives that also need attention. For complementary feed it is important to make a proper distinction between mineral and non-mineral feed (lower ML). Forage crops in general do not need high priority in monitoring programmes, although for arsenic grass meal still needs attention.

  8. Transient analysis for lead-bismuth-cooled accelerator-driven system proposed by JAEA

    International Nuclear Information System (INIS)

    Sugawara, T.; Nishihara, K.; Tsujimoto, K.

    2015-01-01

    An accelerator-driven system (ADS) is considered safer than conventional critical reactors because it is driven by an external neutron source in the subcritical state. In this study, transient analyses for the lead-bismuth-cooled ADS proposed by JAEA were performed using the SIMMER-III and RELAP5/mod3.2 codes to investigate the possibility of core damage. Three accidents were considered as typical of the ADS: the protected loss of heat sink, the protected overcooling, and the unprotected blockage accident. Through these calculations, it was confirmed that all results except for the protected loss of heat sink fulfilled the no-damage criteria. In the protected loss of heat sink, the cladding tube temperature reached its melting temperature after 18-21 hours, although the calculation condition was very conservative. These results have led to requirements to design a safety system for the ADS that decreases the frequencies of such accidents. (authors)

  9. Analysis for lead in undiluted whole blood by tantalum ribbon atomic absorption spectrophotometry.

    Science.gov (United States)

    Therrell, B L; Drosche, J M; Dziuk, T W

    1978-07-01

    We describe a modified tantalum ribbon atomic absorption procedure for determining lead in undiluted whole blood. An Instrumentation Laboratory (I.L.) Model 151 atomic absorption spectrophotometer equipped with an I.L. Model 355 Flameless Sampler was used. The Flameless Sampler was slightly modified to include three-cycle operation instead of the normal two cycles. This modified single-beam system, equipped with background correction, allows 5-microliter specimens of whole blood to be quickly and accurately analyzed. No sample preparation other than vortex mixing is involved, and method reliability has been demonstrated during an extended period of successful participation in proficiency testing studies conducted by the Center for Disease Control. This tantalum ribbon methodology has further been demonstrated to be effective both as a primary screening procedure and as a confirmatory procedure, when coupled with erythrocyte protoporphyrin determinations, in screening over 300 000 clients during a three-year period of use in the Early and Periodic Screening, Diagnosis and Treatment (EPSDT) Program in Texas.

  10. Analysis of the thorium inclusion in the fuel of a fast reactor cooled by lead

    International Nuclear Information System (INIS)

    Juarez M, L. C.; Francois L, J. L.

    2017-09-01

    In the present work, we first verified a model of the European Lead-cooled Fast Reactor (ELFR). The calculations were made with the Monte Carlo code Serpent 2.27 and the JEFF-3.1 cross-section library. For this verification, three neutronic parameters were compared: the evolution of the neutron multiplication factor, the Doppler constant, and the coolant void effect, obtaining good agreement with the reference values. Subsequently, the inclusion of thorium as a fertile material in the fuel was analyzed and the same neutronic parameters were compared with those of the original fuel. The evolution of criticality for the thorium fuel differs significantly from that of the original fuel (without thorium), mainly because of the breeding of the fissile isotope 233U. Therefore, a longer fuel cycle is possible, favoring the availability factor of the plant, without compromising the performance of the reactor, since both the Doppler constant and the coolant void effect show a tendency similar to that of the original fuel, being negative in both cases. (Author)

  11. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents connected by social links. We call this network, constructed from documents and the social information provided by each of them, a Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
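    A PageRank computation of the kind applied to such a document graph can be sketched with the standard power-iteration formulation (minimal Python over an adjacency-list graph; the tiny example graph is hypothetical):

    ```python
    def pagerank(graph, damping=0.85, iters=50):
        """Power-iteration PageRank over a directed graph given as
        {node: [outgoing links]}; dangling nodes spread rank uniformly."""
        nodes = list(graph)
        n = len(nodes)
        rank = {u: 1.0 / n for u in nodes}
        for _ in range(iters):
            new = {u: (1 - damping) / n for u in nodes}
            for u, outs in graph.items():
                if outs:
                    share = damping * rank[u] / len(outs)
                    for v in outs:
                        new[v] += share
                else:  # dangling node: distribute its rank over all nodes
                    for v in nodes:
                        new[v] += damping * rank[u] / n
            rank = new
        return rank

    # in a symmetric 3-cycle every document receives the same rank (1/3 each)
    ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a"]})
    ```

    In a reranking setting, these scores would be interpolated with the retrieval-model scores rather than used alone.
    
    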

  12. An approach to seismic analysis of a CANDU fuelling machine

    International Nuclear Information System (INIS)

    Grover, L.K.; Panesar, J.S.

    1984-01-01

    An approach to the seismic analysis of a CANDU fuelling machine (while unattached from the reactor face) and its support structure is presented in this paper. The response spectra method of analysis was used. The results obtained comprise the natural frequencies; the displacements, accelerations, and loads on the fuelling machine elements; and the loads at the base of the columns, the column wall supports, and the bridge structure. The analytical model incorporates a mathematical model of the reactor building itself, which results in the automatic inclusion of all interaction effects between the reactor building and the fuelling machine support system. A finite element model using beam, plate, and spring elements was prepared for the various components of the fuelling machine and the support structures.

  13. Exploratory analysis of spatial and temporal data a systematic approach

    CERN Document Server

    Andrienko, Natalia

    2006-01-01

    Exploratory data analysis (EDA) is about detecting and describing patterns, trends, and relations in data, motivated by certain purposes of investigation. As something relevant is detected in data, new questions arise, causing specific parts to be viewed in more detail. So EDA has a significant appeal: it involves hypothesis generation rather than mere hypothesis testing. The authors describe in detail and systemize approaches, techniques, and methods for exploring spatial and temporal data in particular. They start by developing a general view of data structures and characteristics and then build on top of this a general task typology, distinguishing between elementary and synoptic tasks. This typology is then applied to the description of existing approaches and technologies, resulting not just in recommendations for choosing methods but in a set of generic procedures for data exploration. Professionals practicing analysis will profit from tested solutions - illustrated in many examples - for reuse in the c...

  14. Scientific publications from Arab world in leading journals of Integrative and Complementary Medicine: a bibliometric analysis

    OpenAIRE

    Zyoud, Sa’ed H.; Al-Jabi, Samah W.; Sweileh, Waleed M.

    2015-01-01

Background: Bibliometric analysis is increasingly employed as a useful tool to assess the quantity and quality of research performance. The specific goal of the current study was to evaluate the performance of research output originating from the Arab world and published in international Integrative and Complementary Medicine (ICM) journals. Methods: Original scientific publications and reviews from the 22 Arab countries that were published in 22 international peer-reviewed ICM journals during all ...

  15. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
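The effect described above can be reproduced in a few lines. The sketch below is a minimal Bandt-Pompe-style estimator (an assumption: ties are symbolized by order of appearance via a stable sort, as the abstract says is common practice); coarsely digitizing a pseudorandom series introduces many equal values and pulls the normalized permutation entropy below that of the full-resolution series, mimicking spurious temporal structure:

```python
# Minimal normalized permutation entropy, with ties broken by order of
# appearance (Python's sort is stable). Illustrative only.
import math
import random
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized permutation entropy of sequence x for the given order."""
    patterns = Counter()
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # Stable sort: equal values keep their order of appearance.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))

random.seed(0)
fine = [random.random() for _ in range(5000)]   # high amplitude resolution
coarse = [round(v, 1) for v in fine]            # low resolution -> many ties
h_fine = permutation_entropy(fine)
h_coarse = permutation_entropy(coarse)
# h_coarse < h_fine: the ties bias the pattern distribution, so the
# digitized noise looks (spuriously) temporally structured.
```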

  16. A Gaussian Approximation Approach for Value of Information Analysis.

    Science.gov (United States)

    Jalal, Hawre; Alarid-Escudero, Fernando

    2018-02-01

    Most decisions are associated with uncertainty. Value of information (VOI) analysis quantifies the opportunity loss associated with choosing a suboptimal intervention based on current imperfect information. VOI can inform the value of collecting additional information, resource allocation, research prioritization, and future research designs. However, in practice, VOI remains underused due to many conceptual and computational challenges associated with its application. Expected value of sample information (EVSI) is rooted in Bayesian statistical decision theory and measures the value of information from a finite sample. The past few years have witnessed a dramatic growth in computationally efficient methods to calculate EVSI, including metamodeling. However, little research has been done to simplify the experimental data collection step inherent to all EVSI computations, especially for correlated model parameters. This article proposes a general Gaussian approximation (GA) of the traditional Bayesian updating approach based on the original work by Raiffa and Schlaifer to compute EVSI. The proposed approach uses a single probabilistic sensitivity analysis (PSA) data set and involves 2 steps: 1) a linear metamodel step to compute the EVSI on the preposterior distributions and 2) a GA step to compute the preposterior distribution of the parameters of interest. The proposed approach is efficient and can be applied for a wide range of data collection designs involving multiple non-Gaussian parameters and unbalanced study designs. Our approach is particularly useful when the parameters of an economic evaluation are correlated or interact.
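The Bayesian-updating step that a Gaussian approximation simplifies can be illustrated with the textbook conjugate-normal case. This is a toy sketch under the assumption of a normal prior and known sampling variance; the function names are ours, and it omits the linear metamodel and PSA machinery of the proposed approach:

```python
# Conjugate-normal updating behind a Gaussian approximation to
# preposterior analysis. Toy sketch, not the authors' EVSI pipeline.

def posterior_var(prior_var, data_var, n):
    """Posterior variance of the parameter after a study of size n
    (precisions add in the conjugate-normal model)."""
    return 1.0 / (1.0 / prior_var + n / data_var)

def preposterior_var(prior_var, data_var, n):
    """Variance of the *posterior mean*, viewed before the study runs:
    the prior uncertainty minus what will remain afterwards. It grows
    from 0 (n = 0) toward the prior variance (n -> infinity)."""
    return prior_var - posterior_var(prior_var, data_var, n)

# A larger study resolves more of the prior uncertainty:
v10 = preposterior_var(prior_var=4.0, data_var=1.0, n=10)
v100 = preposterior_var(prior_var=4.0, data_var=1.0, n=100)
```

Feeding this preposterior distribution of the parameters of interest into a linear metamodel of net benefit is, in spirit, what the two-step approach described above does.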

  17. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    Beichel, R.R.; Sonka, M.

    2006-01-01

This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  18. An algebraic approach to analysis of recursive and concurrent programs

    DEFF Research Database (Denmark)

    Terepeta, Michal Tomasz

    This thesis focuses on formal techniques based on static program analysis, model checking and abstract interpretation that offer means for reasoning about software, verification of its properties and discovering potential bugs. First, we investigate an algebraic approach to static analysis...... implementing those algorithms, which also provides a lot of flexibility with respect to, e.g., various constraints solvers. Finally, we describe one such experimental solver based on Newton’s method. It allows solving equation systems over abstract domains that were not accommodated by other solving techniques...

  19. Spectral Synthesis via Mean Field approach to Independent Component Analysis

    International Nuclear Information System (INIS)

    Hu, Ning; Su, Shan-Shan; Kong, Xu

    2016-01-01

We apply a new statistical analysis technique, the Mean Field approach to Independent Component Analysis (MF-ICA) in a Bayesian framework, to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few Independent Components (ICs), and the galaxy spectrum can be reconstructed by these ICs. Compared to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different methods are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) consistency test of parameters derived with galaxies from the Sloan Digital Sky Survey. We find that our MF-ICA method can not only fit the observed galaxy spectra efficiently, but can also accurately recover the physical parameters of galaxies. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find it can provide excellent fitting results for low signal-to-noise spectra. (paper)

  20. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

The application of laser interferometry in industrial non-destructive testing (NDT) and material characterization is becoming more prevalent, since the method provides non-contact, full-field inspection of the test object. Until recently, however, its application was limited to qualitative analysis; the current trend is to extend the method with quantitative analysis, which attempts to characterize the examined defect in detail across a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is raising the quality requirements on optical and analysis instruments. Nevertheless, very little attention is currently paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. Differences between the two measurement systems can be attributed to these error factors. (Author)

  1. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    OpenAIRE

    Chahinez Benkoussas; Patrice Bellot

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user's query. We used different theoretical retrieval models: probabilistic as InL2 (Divergence from Randomness model) and language model and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval ...

  2. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques. T

  3. Analysis of approaches to the concept of job satisfaction

    OpenAIRE

    Lysova E.

    2017-01-01

The article is devoted to the problem of a person's satisfaction with his professional activity. It examines different approaches to understanding the phenomenon of job satisfaction, analyzes their evolution in the works of domestic and Western scholars, and describes the strengths and weaknesses of the different interpretations of the concept of satisfaction. A person spends almost a third of his life at work. In the labour force, he reveals himself as a person, implements the physical and ment...

  4. A novel approach for system change pathway analysis

    OpenAIRE

    Walaa Ibrahim Gabr

    2016-01-01

    This paper is directed toward presenting a novel approach based on “consolidity charts” for the analysis of natural and man-made systems during their change pathway or course of life. The physical significance of the consolidity chart (region) is that it marks the boundary of all system interactive behavior resulting from all exhaustive internal and external influences. For instance, at a specific event state, the corresponding consolidity region describes all the plausible points of normaliz...

  5. Non-iterative geometric approach for inverse kinematics of redundant lead-module in a radiosurgical snake-like robot.

    Science.gov (United States)

    Omisore, Olatunji Mumini; Han, Shipeng; Ren, Lingxue; Zhang, Nannan; Ivanov, Kamen; Elazab, Ahmed; Wang, Lei

    2017-08-01

    Snake-like robot is an emerging form of serial-link manipulator with the morphologic design of biological snakes. The redundant robot can be used to assist medical experts in accessing internal organs with minimal or no invasion. Several snake-like robotic designs have been proposed for minimal invasive surgery, however, the few that were developed are yet to be fully explored for clinical procedures. This is due to lack of capability for full-fledged spatial navigation. In rare cases where such snake-like designs are spatially flexible, there exists no inverse kinematics (IK) solution with both precise control and fast response. In this study, we proposed a non-iterative geometric method for solving IK of lead-module of a snake-like robot designed for therapy or ablation of abdominal tumors. The proposed method is aimed at providing accurate and fast IK solution for given target points in the robot's workspace. n-1 virtual points (VPs) were geometrically computed and set as coordinates of intermediary joints in an n-link module. Suitable joint angles that can place the end-effector at given target points were then computed by vectorizing coordinates of the VPs, in addition to coordinates of the base point, target point, and tip of the first link in its default pose. The proposed method is applied to solve IK of two-link and redundant four-link modules. Both two-link and four-link modules were simulated with Robotics Toolbox in Matlab 8.3 (R2014a). Implementation result shows that the proposed method can solve IK of the spatially flexible robot with minimal error values. Furthermore, analyses of results from both modules show that the geometric method can reach 99.21 and 88.61% of points in their workspaces, respectively, with an error threshold of 1 mm. The proposed method is non-iterative and has a maximum execution time of 0.009 s. This paper focuses on solving IK problem of a spatially flexible robot which is part of a developmental project for abdominal
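For intuition about what a non-iterative (closed-form) geometric IK solution looks like, the sketch below solves the classic planar two-link case via the law of cosines. This is only an illustrative stand-in, not the paper's virtual-point method for n-link spatial modules:

```python
# Closed-form geometric IK for a planar two-link arm (elbow-down
# branch). Illustrative stand-in for non-iterative geometric IK.
import math

def two_link_ik(x, y, l1, l2):
    """Return one (theta1, theta2) placing the tip at (x, y),
    or None if the target is outside the annular workspace."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        return None  # target unreachable
    theta2 = math.acos(c2)                      # elbow-down solution
    k1 = l1 + l2 * c2
    k2 = l2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2

def forward(theta1, theta2, l1, l2):
    """Forward kinematics, used to verify the IK solution."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

sol = two_link_ik(1.2, 0.5, 1.0, 1.0)
```

Like the virtual-point method, this computes joint angles directly from geometry, with no iteration and hence a fixed, small execution time.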

  6. Liquid electrode plasma-optical emission spectrometry combined with solid-phase preconcentration for on-site analysis of lead.

    Science.gov (United States)

    Barua, Suman; Rahman, Ismail M M; Alam, Iftakharul; Miyaguchi, Maho; Sawai, Hikaru; Maki, Teruya; Hasegawa, Hiroshi

    2017-08-15

A relatively rapid and precise method is presented for the determination of lead in aqueous matrix. The method consists of analyte quantitation using the liquid electrode plasma-optical emission spectrometry (LEP-OES) coupled with selective separation/preconcentration by solid-phase extraction (SPE). The impact of operating variables on the retention of lead in SPEs such as pH, flow rate of the sample solution; type, volume, flow rate of the eluent; and matrix effects were investigated. Selective SPE-separation/preconcentration minimized the interfering effect due to manganese in solution and limitations in lead-detection in low-concentration samples by LEP-OES. The LEP-OES operating parameters such as the electrical conductivity of sample solution; applied voltage; on-time, off-time, pulse count for applied voltage; number of measurements; and matrix effects have also been optimized to obtain a distinct peak for the lead at λmax = 405.8 nm. The limit of detection (3σ) and the limit of quantification (10σ) for lead determination using the technique were found as 1.9 and 6.5 ng mL⁻¹, respectively. The precision, as relative standard deviation, was lower than 5% at 0.1 μg mL⁻¹ Pb, and the preconcentration factor was found to be 187. The proposed method was applied to the analysis of lead contents in the natural aqueous matrix (recovery rate: >95%). The method accuracy was verified using certified reference material of wastewaters: SPS-WW1 and ERM-CA713. The results from LEP-OES were in good agreement with inductively coupled plasma optical emission spectrometry measurements of the same samples. The application of the method is rapid (≤5 min, without preconcentration) with a reliable detection limit at trace levels. Copyright © 2017 Elsevier B.V. All rights reserved.
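The quoted 3σ and 10σ limits follow the common analytical convention of dividing a multiple of the blank standard deviation by the calibration slope. A small sketch, with hypothetical numbers (the abstract does not report σ or the slope):

```python
# LOD/LOQ under the common 3-sigma / 10-sigma convention (assumption:
# sigma_blank is the blank standard deviation, slope that of the
# calibration line; the numeric inputs below are hypothetical).

def detection_limits(sigma_blank, slope):
    """Return (LOD, LOQ) for a linear calibration."""
    lod = 3.0 * sigma_blank / slope
    loq = 10.0 * sigma_blank / slope
    return lod, loq

lod, loq = detection_limits(sigma_blank=0.012, slope=0.02)  # hypothetical
# Whatever the inputs, LOQ/LOD is fixed at 10/3 under this convention,
# consistent with the roughly 3.4x ratio of the reported 6.5 and 1.9 values.
```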

  7. Early phase drugs and biologicals clinical trials on worldwide leading causes of death: a descriptive analysis.

    Science.gov (United States)

    Dal-Ré, Rafael

    2011-06-01

    To describe the global effort targeting the major causes of mortality in terms of "open" early phase clinical trials with drugs and biologicals. Sixteen of the 20 leading causes of death were chosen; 9 of these were also amongst the top 10 causes of death in low-income countries. Studies were identified from the ClinicalTrials.gov database and included phase 1 and/or 2 "interventional" "open" trials, i.e. those recruiting or about to start recruitment. Trials were considered in terms of sponsorship [industry, universities and other organisations (UNO), and US federal agencies (NIH included)], genders and age groups included, and whether they were conducted with drugs and/or biologicals. The search was performed in March 2010. A total of 2,298 (824 phase 1; 1,474 phase 2) trials were retrieved. Of these, 67% were on trachea, bronchus, and lung cancers (25%); diabetes mellitus (15%); colon and rectum cancers (14%); and HIV/AIDS (12%). In contrast, only 4% were trials on diarrhoeal disease, nephrosis and nephritis, liver cirrhosis, and prematurity and low birth weight. UNO were the first source of funding. Fifty-two percent of phase 1 non-cancer trials were on healthy volunteers. Twenty-nine percent of all trials were co-funded. There were 4.6 times as many drug trials as those with biologicals. Only 7% were conducted with a combination of drugs and biologicals, the majority (78%) on cancers. Discrimination in terms of gender or age group was not observed. Four of the 16 diseases considered represented 2/3 of early phase trials. Cancers were a top priority for all sponsors. Increasing attention should be given to conditions with current and projected global high mortality rates that had few "open" early phase trials.

  8. Analysis of induced electrical currents from magnetic field coupling inside implantable neurostimulator leads

    Directory of Open Access Journals (Sweden)

    Seidman Seth J

    2011-10-01

Background: Over the last decade, the number of neurostimulator systems implanted in patients has been rapidly growing. Nearly 50,000 neurostimulators are implanted worldwide annually. The most common type of implantable neurostimulators is indicated for pain relief. At the same time, commercial use of other electromagnetic technologies is expanding, making electromagnetic interference (EMI) of neurostimulator function an issue of concern. Typically reported sources of neurostimulator EMI include security systems, metal detectors and wireless equipment. When near such sources, patients with implanted neurostimulators have reported adverse events such as shock, pain, and increased stimulation. In recent in vitro studies, radio frequency identification (RFID) technology has been shown to inhibit the stimulation pulse of an implantable neurostimulator system during low frequency exposure at close distances. This could potentially be due to induced electrical currents inside the implantable neurostimulator leads that are caused by magnetic field coupling from the low frequency identification system. Methods: To systematically address the concerns posed by EMI, we developed a test platform to assess the interference from coupled magnetic fields on implantable neurostimulator systems. To measure interference, we recorded the output of one implantable neurostimulator, programmed for best therapy threshold settings, when in close proximity to an operating low frequency RFID emitter. The output contained electrical potentials from the neurostimulator system and those induced by EMI from the RFID emitter. We also recorded the output of the same neurostimulator system programmed for best therapy threshold settings without RFID interference. Using the Spatially Extended Nonlinear Node (SENN) model, we compared threshold factors of spinal cord fiber excitation for both recorded outputs.
Results: The electric current induced by the low frequency RFID emitter

  9. Analysis of induced electrical currents from magnetic field coupling inside implantable neurostimulator leads.

    Science.gov (United States)

    Pantchenko, Oxana S; Seidman, Seth J; Guag, Joshua W

    2011-10-21

Over the last decade, the number of neurostimulator systems implanted in patients has been rapidly growing. Nearly 50,000 neurostimulators are implanted worldwide annually. The most common type of implantable neurostimulators is indicated for pain relief. At the same time, commercial use of other electromagnetic technologies is expanding, making electromagnetic interference (EMI) of neurostimulator function an issue of concern. Typically reported sources of neurostimulator EMI include security systems, metal detectors and wireless equipment. When near such sources, patients with implanted neurostimulators have reported adverse events such as shock, pain, and increased stimulation. In recent in vitro studies, radio frequency identification (RFID) technology has been shown to inhibit the stimulation pulse of an implantable neurostimulator system during low frequency exposure at close distances. This could potentially be due to induced electrical currents inside the implantable neurostimulator leads that are caused by magnetic field coupling from the low frequency identification system. To systematically address the concerns posed by EMI, we developed a test platform to assess the interference from coupled magnetic fields on implantable neurostimulator systems. To measure interference, we recorded the output of one implantable neurostimulator, programmed for best therapy threshold settings, when in close proximity to an operating low frequency RFID emitter. The output contained electrical potentials from the neurostimulator system and those induced by EMI from the RFID emitter. We also recorded the output of the same neurostimulator system programmed for best therapy threshold settings without RFID interference. Using the Spatially Extended Nonlinear Node (SENN) model, we compared threshold factors of spinal cord fiber excitation for both recorded outputs. The electric current induced by the low frequency RFID emitter was not significant enough to have a noticeable effect on

  10. Micro energy dispersive X-ray fluorescence analysis of polychrome lead-glazed Portuguese faiences

    International Nuclear Information System (INIS)

    Guilherme, A.; Pessanha, S.; Carvalho, M.L.; Santos, J.M.F. dos; Coroado, J.

    2010-01-01

Several glazed ceramic pieces, originally produced in Coimbra (Portugal), were submitted to elemental analysis with the aim of characterizing the pigment manufacture. Although the pieces were produced in Coimbra, their location changed over time for historical reasons. A recent exhibition in Coimbra brought together a great number of these pieces, and in situ micro energy dispersive X-ray fluorescence (μ-EDXRF) analyses were performed in order to obtain chemical and physical data on the manufacture of faiences in Coimbra. A non-commercial μ-EDXRF instrument for in situ analysis was employed in this work, carrying some important improvements over conventional ones, namely analyzing spot sizes of about 100 μm diameter. The combination of a capillary X-ray lens with a new generation of low power microfocus X-ray tube and a drift chamber detector enabled a portable micro-XRF unit with a lateral resolution of a few tens of μm. The portable system with polycapillary optics made it possible to distinguish nearby areas of different pigmentation, as well as the glaze itself. These first scientific results on the pigment analysis of the collection of faiences seem to point to a unique production center with its own techniques and raw materials. This conclusion arose from the identification of the blue pigments as containing Mn, Fe, Co and As, and of the yellows as a combination of Pb and Sb. A statistical treatment was used to reveal groups of similarities in the elemental profiles of the pigments.

  11. Chemical characterization of tin-lead glazed ceramics from Aragon (Spain) by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Inanez, J.G. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Barcelona Univ. (Spain). Facultat de Geografia i Historia; Speakman, R.J. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Buxeda i Garrigos, J. [Barcelona Univ. (Spain). Facultat de Geografia i Historia; Glascock, M.D. [Missouri Univ., Columbia, MO (United States). Research Reactor Center

    2010-07-01

    Majolica pottery was the most characteristic tableware produced in Spain during the Medieval and Renaissance periods. A study of the three main production centers in the historical region of Aragon during Middle Ages and Renaissance was conducted on a set of 71 samples. The samples were analyzed by instrumental neutron activation analysis (INAA), and the resulting data were interpreted using an array of multivariate statistical procedures. Our results show a clear discrimination among different production centers allowing a reliable provenance attribution of ceramic sherds from the Aragonese workshops. (orig.)

  12. Chemical characterization of tin-lead glazed ceramics from Aragon (Spain) by neutron activation analysis

    International Nuclear Information System (INIS)

    Inanez, J.G.; Barcelona Univ.; Speakman, R.J.; Buxeda i Garrigos, J.; Glascock, M.D.

    2010-01-01

    Majolica pottery was the most characteristic tableware produced in Spain during the Medieval and Renaissance periods. A study of the three main production centers in the historical region of Aragon during Middle Ages and Renaissance was conducted on a set of 71 samples. The samples were analyzed by instrumental neutron activation analysis (INAA), and the resulting data were interpreted using an array of multivariate statistical procedures. Our results show a clear discrimination among different production centers allowing a reliable provenance attribution of ceramic sherds from the Aragonese workshops. (orig.)

  13. Unifying Approach to Analytical Chemistry and Chemical Analysis: Problem-Oriented Role of Chemical Analysis.

    Science.gov (United States)

    Pardue, Harry L.; Woo, Jannie

    1984-01-01

    Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)

  14. Critical plane analysis of multiaxial fatigue experiments leading to White Etching Crack formation

    Science.gov (United States)

    Averbeck, S.; Kerscher, E.

    2017-05-01

    Various researchers have shown that rolling contact fatigue can be reproduced with cyclic compression-torsion experiments, with the load components either in-phase or out of phase. As reported previously, the authors used such experiments to reproduce the rolling contact fatigue phenomenon “White Etching Cracks” which can cause premature failures of rolling element bearings. It is characterized by subsurface crack initiation and propagation coupled with microstructural changes alongside the crack flanks. Surprisingly, only in-phase load superposition caused these microstructural changes to occur. This suggests that White Etching Crack formation is somehow linked to the multiaxial stress state in the specimens, as this was the only variable that changed between in-phase and out-of-phase testing. In this study, the multiaxial stress state in the two experiments is analysed and compared using different critical plane criteria. In contrast to common ways of characterizing the stress state, e.g. equivalent stress approaches, this class of criteria is explicitly designed for multiaxial stress states. Special attention is given to the Dang Van criterion, which has been used in a number of rolling contact fatigue studies.
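The phase effect discussed above can be illustrated with a toy Dang Van-style evaluation on a single plane. This is a deliberate simplification (signed shear instead of the mesoscopic shear amplitude, a single fixed plane instead of a search over all planes, and a hypothetical material constant `a_dv` and stress histories):

```python
# Toy Dang Van-type index on one candidate plane. The criterion reads
# max over t of tau(t) + a_dv * p_h(t) <= b_dv; here we only compute
# the left-hand side for in-phase vs. out-of-phase loading.

def dang_van_index(tau, p_hydro, a_dv):
    """Max over time of tau(t) + a_dv * p_h(t) on the plane."""
    return max(t + a_dv * p for t, p in zip(tau, p_hydro))

tau = [0.0, 100.0, 0.0, -100.0]    # shear history on the plane (MPa)
p_in = [0.0, 50.0, 0.0, -50.0]     # hydrostatic stress, in phase
p_out = [50.0, 0.0, -50.0, 0.0]    # hydrostatic stress, 90 deg out of phase
a_dv = 0.5                         # hypothetical material constant
idx_in = dang_van_index(tau, p_in, a_dv)
idx_out = dang_van_index(tau, p_out, a_dv)
# In-phase loading lets the shear and hydrostatic peaks coincide, so
# idx_in exceeds idx_out even though both load components are identical.
```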

  15. Mechanistic approach to the sodium leakage and fire analysis

    International Nuclear Information System (INIS)

    Yamaguchi, Akira; Muramatsu, Toshiharu; Ohira, Hiroaki; Ida, Masao

    1997-04-01

In December 1995, a thermocouple well broke and liquid sodium leaked out of the intermediate heat transport system of the prototype fast breeder reactor Monju. In the initiating process of the incident, liquid sodium flowed out through the hollow thermocouple well, nipple and connector. As a result, liquid sodium, following ignition and combustion, dropped from the connector and collided with the duct and grating placed below. The collision may have caused fragmentation and scattering of the sodium droplets, which finally piled up on the floor. This report deals with the development of computer programs for these phenomena based on a mechanistic approach. Numerical analyses are also made for fundamental sodium leakage and combustion phenomena, a sodium combustion experiment, and the Monju incident conditions. The contents of this report are listed below: (1) Analysis of the chemical reaction process based on the molecular orbital method, (2) Thermal-hydraulic analysis of sodium combustion experiment II performed in 1996 at the O-arai Engineering Center, PNC, (3) Thermal-hydraulic analysis of room A-446 of the Monju reactor when the sodium leakage took place, (4) Direct numerical simulation of a sodium droplet, (5) Sodium leakage and scattering analysis using a three-dimensional particle method, (6) Multi-dimensional combustion analysis and multi-point approximation combustion analysis code. Subsequent to the development of the programs, they are to be applied to the safety analysis of the fast breeder reactor. (author)

  16. A QSAR approach for virtual screening of lead-like molecules en route to antitumor and antibiotic drugs from marine and microbial natural products

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-05-01

    Figure 1. The 15 unreported lead antibiotic MNPs and MbNPs from the AntiMarin database, selected using the best Rfs antibiotic model with a probability of being antibiotic greater than or equal to 0.8. Figure 2. The 4 selected lead antitumor MNPs and MbNPs from the AntiMarin database, selected using the best Rfs antitumor model with a probability of being antitumor greater than or equal to 0.8. The present work corroborates, on the one hand, the results of our previous work6 and enables the presentation of a new set of possible lead-like bioactive compounds. Additionally, the usefulness of quantum-chemical descriptors in discriminating biologically active from inactive compounds is shown. The use of the εHOMO quantum-chemical descriptor in the discrimination of large-scale data sets of lead-like or drug-like compounds has never been reported. This approach greatly reduces the number of compounds used in real screens, and it reinforces the results of our previous work. Furthermore, besides virtual screening, computational methods can be very useful for building appropriate databases, allowing effective shortcuts in NP extract dereplication procedures, which will certainly increase the efficiency of drug discovery.

  17. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  18. Genotypic and environmental variation in cadmium, chromium, lead and copper in rice and approaches for reducing the accumulation

    International Nuclear Information System (INIS)

    Cao, Fangbin; Wang, Runfeng; Cheng, Wangda; Zeng, Fanrong; Ahmed, Imrul Mosaddek; Hu, Xinna; Zhang, Guoping; Wu, Feibo

    2014-01-01

    The field scale trials revealed significant genotypic and environmental differences in grain heavy metal (HM) concentrations of 158 newly developed rice varieties grown in twelve locations of Zhejiang province of China. Grain Pb and Cd contents in 5.3% and 0.4% of samples, respectively, were above the maximum permissible concentration (MPC); none of the samples had Cr/Cu exceeding the MPC. Stepwise multiple linear regression analysis estimated soil HM critical levels for safe rice production. Low grain HM accumulation cultivars such as Xiushui817, Jiayou08-1 and Chunyou689 were recommended as suitable cultivars for planting in slightly/moderately HM-contaminated soils. The alleviating regulator (AR) of (NH₄)₂SO₄ as N fertilizer coupled with foliar spray of a mixture containing glutathione (GSH), Si, Zn and Se significantly decreased grain Cd, Cr, Cu and Pb concentrations in rice grown in HM-contaminated fields with no effect on yield, indicating a promising measure for further reducing grain HM content to guarantee safe food production. - Highlights: • Field trials evaluated the situation of grain HM in main rice growing areas of Zhejiang. • A forecasting index system to predict rice grain HM concentration was achieved. • Hybrid rice holds higher grain Cd concentration than conventional cultivars. • Low grain HM accumulation rice cultivars were successfully identified. • Developed alleviating regulator which effectively reduced grain toxic HM

  19. Genotypic and environmental variation in cadmium, chromium, lead and copper in rice and approaches for reducing the accumulation.

    Science.gov (United States)

    Cao, Fangbin; Wang, Runfeng; Cheng, Wangda; Zeng, Fanrong; Ahmed, Imrul Mosaddek; Hu, Xinna; Zhang, Guoping; Wu, Feibo

    2014-10-15

    The field scale trials revealed significant genotypic and environmental differences in grain heavy metal (HM) concentrations of 158 newly developed rice varieties grown in twelve locations of Zhejiang province of China. Grain Pb and Cd contents in 5.3% and 0.4% of samples, respectively, were above the maximum permissible concentration (MPC); none of the samples had Cr/Cu exceeding the MPC. Stepwise multiple linear regression analysis estimated soil HM critical levels for safe rice production. Low grain HM accumulation cultivars such as Xiushui817, Jiayou08-1 and Chunyou689 were recommended as suitable cultivars for planting in slightly/moderately HM-contaminated soils. The alleviating regulator (AR) of (NH₄)₂SO₄ as N fertilizer coupled with foliar spray of a mixture containing glutathione (GSH), Si, Zn and Se significantly decreased grain Cd, Cr, Cu and Pb concentrations in rice grown in HM-contaminated fields with no effect on yield, indicating a promising measure for further reducing grain HM content to guarantee safe food production. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Launch Vehicle Abort Analysis for Failures Leading to Loss of Control

    Science.gov (United States)

    Hanson, John M.; Hill, Ashley D.; Beard, Bernard B.

    2013-01-01

    Launch vehicle ascent is a time of high risk for an onboard crew. There is a large fraction of possible failures for which time is of the essence and a successful abort is possible if the detection and action happen quickly enough. This paper focuses on abort determination based on data already available from the Guidance, Navigation, and Control system. This work is the result of failure analysis efforts performed during the Ares I launch vehicle development program. The two primary areas of focus are the derivation of abort triggers to ensure that abort occurs as quickly as possible when needed while false aborts are avoided, and the evaluation of success in aborting off the failing launch vehicle.

  1. Comparative Analysis for Polluted Agricultural Soils with Arsenic, Lead, and Mercury in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Yarto-Ramirez, Mario; Santos-Santos, Elvira; Gavilan-Garcia, Arturo; Castro-Diaz, Jose; Gavilan-Garcia, Irma Cruz; Rosiles, Rene; Suarez, Sara

    2004-03-31

    The use of mercury in Mexico has been associated with the mining industry of Zacatecas. This activity has polluted several areas currently used for agriculture. The main objective of this study was to investigate the heavy metal concentrations (Hg, As and Pb) in soil of Guadalupe, Zacatecas, in order to justify a further environmental risk assessment of the site. A 2 × 3 km grid was used for the sampling process and 20 soil samples were taken. The analysis was performed using EPA SW-846 method 3050B/6010B for arsenic and metals and EPA SW-846 method 7471A for total mercury. It was concluded that heavy metals are present in agricultural soils used for corn and bean farming. An environmental risk assessment and a bioavailability study are therefore required to determine whether there is a risk of heavy metal bioaccumulation in animals or human beings, or of metal leaching to aquifers.

  2. Decoupling of the leading contribution in the discrete BFKL analysis of high-precision HERA data

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, H. [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Lipatov, L.N. [St. Petersburg State University, St. Petersburg (Russian Federation); Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); Ross, D.A. [University of Southampton, School of Physics and Astronomy, Southampton (United Kingdom); Schulz, O. [Max Planck Institute for Physics, Munich (Germany)

    2017-11-15

    We analyse, in NLO, the physical properties of the discrete eigenvalue solution for the BFKL equation. We show that a set of eigenfunctions with positive eigenvalues, ω, together with a small contribution from a continuum of eigenfunctions with negative ω, provide an excellent description of high-precision HERA F{sub 2} data in the region, x < 0.001, Q{sup 2} > 6 GeV{sup 2}. The phases of the eigenfunctions can be obtained from a simple parametrisation of the pomeron spectrum, which has a natural motivation within BFKL. The data analysis shows that the first eigenfunction decouples completely or almost completely from the proton. This suggests that there exists an additional ground state, which is naturally saturated and may have the properties of the soft pomeron. (orig.)

  3. Biamperometric analysis of nonaqueous scandium solutions containing lanthanides, lead and thorium

    International Nuclear Information System (INIS)

    Gevorgyan, A.M.; Talipov, Sh.T.; Kostylev, V.S.; Khadeev, V.A.; Nadol'skij, M.Ya.

    1978-01-01

    The possibility of direct scandium titration in the presence of large quantities of rare earths was investigated, as well as the possibility of complexonometric determination of the sum of scandium and rare earths when jointly present in a non-aqueous acetic acid solution. The titration was carried out at an electrode voltage of 0.95 V, with a background electrolyte concentration of 0.2 M lithium perchlorate. Non-aqueous magnesium complexonate was used as the titrating reagent. Th and Pb complexonates are shown to be less stable than the Sc complexonate; consequently, Th and Pb ions do not interfere with the biamperometric titration of the Sc ion. Methods were developed for the analysis of a binary mixture containing scandium, a model alloy, and the mineral thortveitite. Well-reproducible and sufficiently precise results were obtained in all cases. Ions of Bi, Cu, Cd, Zn, In, Ga and Ti interfere with the determination.

  4. HemoVision: An automated and virtual approach to bloodstain pattern analysis.

    Science.gov (United States)

    Joris, Philip; Develter, Wim; Jenar, Els; Suetens, Paul; Vandermeulen, Dirk; Van de Voorde, Wim; Claes, Peter

    2015-06-01

    Bloodstain pattern analysis (BPA) is a subspecialty of forensic sciences, dealing with the analysis and interpretation of bloodstain patterns in crime scenes. The aim of BPA is uncovering new information about the actions that took place in a crime scene, potentially leading to a confirmation or refutation of a suspect's statement. A typical goal of BPA is to estimate the flight paths for a set of stains, followed by a directional analysis in order to estimate the area of origin for the stains. The traditional approach, referred to as stringing, consists of attaching a piece of string to each stain, and letting the string represent an approximation of the stain's flight path. Even though stringing has been used extensively, many (practical) downsides exist. We propose an automated and virtual approach, employing fiducial markers and digital images. By automatically reconstructing a single coordinate frame from several images, limited user input is required. Synthetic crime scenes were created and analysed in order to evaluate the approach. Results demonstrate the correct operation and practical advantages, suggesting that the proposed approach may become a valuable asset for practically analysing bloodstain spatter patterns. Accompanying software called HemoVision is currently provided as a demonstrator and will be further developed for practical use in forensic investigations. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
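
    A core computation behind stringing, and behind any virtual replacement for it, is recovering a stain's impact angle from the width-to-length ratio of its elliptical outline (alpha = asin(w/l)). The snippet below sketches this standard trigonometric relation; it illustrates the underlying geometry and is not part of the HemoVision software itself.

```python
import math

def impact_angle_deg(width_mm, length_mm):
    """Estimate a bloodstain's impact angle (in degrees) from the width
    and length of its elliptical outline: alpha = asin(width / length).
    This is the classic relation used when stringing a crime scene."""
    if not 0 < width_mm <= length_mm:
        raise ValueError("width must be positive and not exceed length")
    return math.degrees(math.asin(width_mm / length_mm))
```

    A circular stain (width equal to length) corresponds to a perpendicular, 90-degree impact, while increasingly elongated stains indicate shallower flight paths.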

  5. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.
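
    The repeatability figures quoted above (2.1% for Pb, 2.7% for Sr) are relative standard deviations of replicate measurements. A minimal sketch of that calculation follows; the replicate values used in the example are hypothetical.

```python
import statistics

def rsd_percent(replicates):
    """Relative standard deviation (%RSD): the sample standard deviation
    of replicate measurements expressed as a percentage of their mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

    For example, hypothetical replicate signals of 9, 10 and 11 counts have a mean of 10 and a sample standard deviation of 1, giving an RSD of 10%.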

  6. A graphical vector autoregressive modelling approach to the analysis of electronic diary data

    Directory of Open Access Journals (Sweden)

    Zipfel Stephan

    2010-04-01

    Abstract Background In recent years, electronic diaries have been increasingly used in medical research and practice to investigate patients' processes and fluctuations in symptoms over time. To model dynamic dependence structures and feedback mechanisms between symptom-relevant variables, a multivariate time series method has to be applied. Methods We propose to analyse the temporal interrelationships among the variables by a structural modelling approach based on graphical vector autoregressive (VAR) models. We give a comprehensive description of the underlying concepts and explain how the dependence structure can be recovered from electronic diary data by a search over suitable constrained (graphical) VAR models. Results The graphical VAR approach is applied to the electronic diary data of 35 obese patients with and without binge eating disorder (BED). The dynamic relationships between eating behaviour, depression, anxiety and eating control for the two subgroups are visualized in two path diagrams. Results show that the two subgroups of obese patients with and without BED are distinguishable by the temporal patterns which influence their respective eating behaviours. Conclusion The use of the graphical VAR approach for the analysis of electronic diary data leads to a deeper insight into patients' dynamics and dependence structures. An increasing use of this modelling approach could lead to a better understanding of complex psychological and physiological mechanisms in different areas of medical care and research.

  7. A global analysis approach for investigating structural resilience in urban drainage systems.

    Science.gov (United States)

    Mugume, Seith N; Gomez, Diego E; Fu, Guangtao; Farmani, Raziyeh; Butler, David

    2015-09-15

    Building resilience in urban drainage systems requires consideration of a wide range of threats that contribute to urban flooding. Existing hydraulic reliability based approaches have focused on quantifying functional failure caused by extreme rainfall or increases in dry weather flows that lead to hydraulic overloading of the system. Such approaches, however, do not fully explore the system failure scenario space, owing to the exclusion of crucial threats such as equipment malfunction, pipe collapse and blockage that can also lead to urban flooding. In this research, a new analytical approach based on global resilience analysis is investigated and applied to systematically evaluate the performance of an urban drainage system when subjected to a wide range of structural failure scenarios resulting from random cumulative link failure. Link failure envelopes, which represent the resulting loss of system functionality (impacts), are determined by computing the upper and lower limits of the simulation results for total flood volume (failure magnitude) and average flood duration (failure duration) at each link failure level. A new resilience index that combines the failure magnitude and duration into a single metric is applied to quantify system residual functionality at each considered link failure level. With this approach, resilience has been tested and characterised for an existing urban drainage system in Kampala city, Uganda. In addition, the effectiveness of potential adaptation strategies in enhancing its resilience to cumulative link failure has been tested. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
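
    The abstract describes a resilience index that folds failure magnitude (total flood volume) and failure duration (average flood duration) into a single metric. One plausible form of such an index is sketched below as an illustration of the idea only; it is not necessarily the authors' exact definition, and the normalizing quantities are assumptions.

```python
def resilience_index(flood_volume, total_inflow, flood_duration, total_duration):
    """Illustrative single-metric resilience index: Res = 1 - severity,
    where severity is the product of the normalized flood volume
    (failure magnitude) and the normalized average flood duration
    (failure duration). Res = 1 means full residual functionality."""
    magnitude = flood_volume / total_inflow
    duration = flood_duration / total_duration
    return 1.0 - magnitude * duration
```

    Under this form, an intact system with no flooding scores 1.0, while a system that converts all inflow to flooding for the entire simulation period scores 0.0.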

  8. Reduced self-control leads to disregard of an unfamiliar behavioral option: an experimental approach to the study of neuroenhancement.

    Science.gov (United States)

    Wolff, Wanja; Baumgarten, Franz; Brand, Ralf

    2013-12-06

    Neuroenhancement (NE), the use of psychoactive substances in order to enhance a healthy individual's cognitive functioning from a proficient to an even higher level, is prevalent in student populations. According to the strength model of self-control, people fail to self-regulate and fall back on their dominant behavioral response when finite self-control resources are depleted. An experiment was conducted to test the hypothesis that ego-depletion will prevent students who are unfamiliar with NE from trying it. 130 undergraduates, who denied having tried NE before (43% female, mean age = 22.76 ± 4.15 years), were randomly assigned to either an ego-depletion or a control condition. The dependent variable was taking an "energy-stick" (a legal nutritional supplement, containing low doses of caffeine, taurine and vitamin B), offered as a potential means of enhancing performance on the bogus concentration task that followed. Logistic regression analysis showed that ego-depleted participants were three times less likely to take the substance, OR = 0.37, p = .01. This experiment found that trying NE for the first time was more likely if an individual's cognitive capacities were not depleted. This means that mental exhaustion is not predictive of NE in students for whom NE is not the dominant response. Trying NE for the first time is therefore more likely to occur as a thoughtful attempt at self-regulation than as an automatic behavioral response in stressful situations. We therefore recommend targeting interventions at this inter-individual difference. Students without previous reinforcing NE experience should be provided with information about the possible negative health outcomes of NE. Reconfiguring structural aspects of the academic environment (e.g. lessening workloads) might help to deter current users.
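
    The reported effect (OR = 0.37) is an odds ratio from a logistic regression. The unadjusted version of such a ratio can be read directly off a 2 × 2 table, as sketched below; the counts are hypothetical, chosen only to show how an OR below 1 arises when the exposed (ego-depleted) group takes the substance less often than the control group.

```python
def odds_ratio(exposed_yes, exposed_no, control_yes, control_no):
    """Unadjusted odds ratio from a 2x2 table: the odds of the outcome
    in the exposed group divided by the odds in the control group."""
    return (exposed_yes / exposed_no) / (control_yes / control_no)

# Hypothetical counts: 10 of 60 ego-depleted participants take the
# energy-stick versus 20 of 60 controls.
or_depleted = odds_ratio(10, 50, 20, 40)  # odds 0.2 vs 0.5
```

    An OR below 1 indicates that the exposed group had lower odds of the outcome; a regression-based OR, like the one in the abstract, additionally adjusts for covariates.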

  9. A pattern recognition approach in X-ray fluorescence analysis

    Science.gov (United States)

    Yin, Lo I.; Trombka, Jacob I.; Seltzer, Stephen M.

    1989-05-01

    In many applications of X-ray fluorescence (XRF) analysis, quantitative information on the chemical components of the sample is not of primary concern. Instead, the XRF spectra are used to monitor changes in the composition among samples, or to select and classify samples with similar compositions. We propose in this paper that the use of pattern recognition technique in such applications may be more convenient than traditional quantitative analysis. The pattern recognition technique discussed here involves only one parameter, i.e., the normalized correlation coefficient and can be applied directly to raw data. Its computation is simple and fast, and can be easily carried out on a personal computer. The efficacy of this pattern recognition approach is illustrated with the analysis of experimental XRF spectra obtained from geological and alloy samples.
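
    The single parameter of the approach, the normalized correlation coefficient, reduces the comparison of two spectra to one number that is insensitive to overall intensity scale. A minimal sketch follows; the raw-data cosine-similarity form shown here is an assumption consistent with the abstract, not necessarily the authors' exact formula.

```python
import math

def normalized_correlation(spectrum_a, spectrum_b):
    """Normalized correlation coefficient between two spectra: the
    channel-by-channel dot product divided by the product of the
    spectra's Euclidean norms. Identical spectral shapes give 1.0
    regardless of overall intensity (e.g. acquisition time)."""
    dot = sum(a * b for a, b in zip(spectrum_a, spectrum_b))
    norm_a = math.sqrt(sum(a * a for a in spectrum_a))
    norm_b = math.sqrt(sum(b * b for b in spectrum_b))
    return dot / (norm_a * norm_b)
```

    Because a pure scale factor leaves the coefficient at exactly 1.0, samples of similar composition cluster together even when measured under different conditions, which is what makes the parameter useful for classification.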

  10. Lead exposure and fear-potentiated startle in the VA Normative Aging Study: a pilot study of a novel physiological approach to investigating neurotoxicant effects.

    Science.gov (United States)

    Grashow, Rachel; Miller, Mark W; McKinney, Ann; Nie, Linda H; Sparrow, David; Hu, Howard; Weisskopf, Marc G

    2013-01-01

    Physiologically-based indicators of neural plasticity in humans could provide mechanistic insights into toxicant actions on learning in the brain, and perhaps prove more objective and sensitive measures of such effects than other methods. We explored the association between lead exposure and classical conditioning of the acoustic startle reflex (ASR), a simple form of associative learning in the brain, in a population of elderly men. Fifty-one men from the VA Normative Aging Study with cumulative bone lead exposure measurements made with K-X-Ray-Fluorescence participated in a fear-conditioning protocol. The mean age of the men was 75.5 years (standard deviation [sd] = 5.9) and mean patella lead concentration was 22.7 μg/g bone (sd = 15.9). Baseline ASR eyeblink response decreased with age, but was not associated with subsequent conditioning. Among 37 men with valid responses at the end of the protocol, higher patella lead was associated with decreased awareness of the conditioning contingency (declarative learning; adjusted odds ratio [OR] per 20 μg/g patella lead = 0.91, 95% confidence interval [CI]: 0.84, 0.99, p = 0.03). Eyeblink conditioning (non-declarative learning) was 0.44 sd less (95% CI: -0.91, 0.02; p = 0.06) per 20 μg/g patella lead after adjustment. Each result was stronger when correcting for the interval between lead measurement and startle testing (awareness: OR = 0.88, 95% CI: 0.78, 0.99, p = 0.04; conditioning: -0.79 sd less, 95% CI: -1.56, 0.03, p = 0.04). This initial exploration suggests that lead exposure interferes with specific neural mechanisms of learning and offers the possibility that the ASR may provide a new approach to physiologically explore the effects of neurotoxicant exposures on neural mechanisms of learning in humans with a paradigm that is directly comparable to animal models. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Pencil lead scratches on steel surfaces as a substrate for LIBS analysis of dissolved salts in liquids

    Energy Technology Data Exchange (ETDEWEB)

    Jijon, D; Costa, C, E-mail: judijival@hotmail.com [Departamento de Fisica, Escuela Politecnica Nacional, Ladron de Guevara E11-256, Apartado 17-12-866, Quito (Ecuador)

    2011-01-01

    A new substrate for the quantitative analysis of salts dissolved in liquids with Laser-Induced Breakdown Spectroscopy (LIBS) is introduced for the first time. A steel surface scratched with HB pencil lead is shown to be a very efficient and sensitive substrate for the quantitative analysis of dissolved salts in liquids. In this work we demonstrate the analytical quality of this system with the analysis of the crystalline deposits formed by dried aqueous salt solutions. We focused on analytical parameters such as sensitivity and linearity for the salt cations in each case. Four salts were studied (Sr(NO{sub 3}){sub 2}, LiSO{sub 4}, RbCl and BaCl), at nine different concentrations each. To improve linearity and lower the overall error in the calibration curves, we introduce a novel outlier removal method that takes into account the homogeneity of the dry deposits on the analytical surface.
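
    Sensitivity and linearity, the figures of merit discussed above, come from least-squares calibration curves built from the signals at the nine concentrations of each salt. The sketch below shows the generic fit and its R² linearity measure; it is not the authors' homogeneity-based outlier removal method, and the example data are hypothetical.

```python
def linear_fit(x, y):
    """Ordinary least-squares calibration line y = m*x + c and its R^2.
    In a LIBS calibration, x would be the concentration, y the line
    intensity, m the sensitivity, and R^2 the linearity figure of merit."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    m = sxy / sxx
    c = mean_y - m * mean_x
    ss_res = sum((yi - (m * xi + c)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return m, c, 1.0 - ss_res / ss_tot
```

    An R² close to 1 indicates good linearity; an outlier removal step such as the one described in the abstract aims to improve exactly this figure before the calibration curve is used.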

  12. Numerical 3D analysis of cloud cavitation shedding frequency on a circular leading edge hydrofoil with a barotropic cavitation model

    Science.gov (United States)

    Blume, M.; Skoda, R.

    2015-12-01

    A compressible density-based time-explicit low Mach number consistent viscous flow solver is utilised in combination with a barotropic cavitation model for the analysis of cloud cavitation on a circular leading edge (CLE) hydrofoil. For 5° angle of attack, cloud structure and shedding frequency for different cavitation numbers are compared to experimental data. A strong grid sensitivity is found in particular for high cavitation numbers. On a fine grid, a very good agreement with validation data is achieved even without explicit turbulence model. The neglect of viscous effects as well as a two-dimensional set-up lead to a less realistic prediction of cloud structures and frequencies. Comparative simulations with the Sauer-Schnerr cavitation model and modified pre-factors of the mass transfer terms underestimate the measured shedding frequency.

  13. Comparative proteomic analysis of Typha angustifolia leaf under chromium, cadmium and lead stress

    International Nuclear Information System (INIS)

    Bah, Alieu Mohamed; Sun Hongyan; Chen Fei; Zhou Jing; Dai Huaxin; Zhang Guoping; Wu Feibo

    2010-01-01

    The present study investigated the Typha angustifolia leaf proteome in response to Cr, Cd and Pb stress. T. angustifolia plants 90 d (D90) and 130 d (D130) old were subjected to 1 mM Cr, Cd and Pb, and samples were collected 30 d after treatment. 2-DE coupled with MS (mass spectrometry) was used to analyze and identify Cr-, Cd- and Pb-responsive proteins. More than 1600 protein spots were reproducibly detected on each gel, of which 44, 46 and 66 spots in D90 samples and 33, 26 and 62 spots in D130 samples were differentially expressed under Cr, Cd and Pb relative to the control, respectively. Of these differentially expressed proteins, 3, 1 and 8 overlapped between D90 and D130, while 5, 8 and 5 had regulation factors above 3 in one of the D90 or D130 samples. A total of 22 up-regulated and 4 down-regulated proteins were identified using MS and database analysis. Cr induced expression of ATP synthase, the RuBisCO small subunit and coproporphyrinogen III oxidase; Cd induced the RuBisCO large subunit; Pb up-regulated the carbohydrate metabolic pathway enzyme fructokinase, and improved RuBisCO activase, the large subunit and Mg-protoporphyrin IX chelatase. Conversely, eIF4F was inhibited by Cr/Pb, while chloroplast FtsZ-like protein and GF14omega were impeded by Cd and Pb, respectively.

  14. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
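
    The global sensitivity analysis described above ranks parameters by how much of the output variance each one explains. The toy sketch below estimates first-order (Sobol-style) indices S_i = Var(E[Y|X_i]) / Var(Y) by brute-force Monte Carlo for a hypothetical two-parameter model; it illustrates only the principle and is far simpler than the workflow the authors apply to the cellular Potts model.

```python
import random

def first_order_indices(model, n_outer=200, n_inner=200, seed=1):
    """Monte Carlo estimate of first-order sensitivity indices
    S_i = Var(E[Y|X_i]) / Var(Y) for a model of two inputs drawn
    uniformly on [0, 1]. For each input, the conditional mean of the
    output is estimated with the other input resampled, and the
    variance of those conditional means is normalized by Var(Y)."""
    rng = random.Random(seed)
    ys = [model(rng.random(), rng.random()) for _ in range(n_outer * n_inner)]
    mean_y = sum(ys) / len(ys)
    var_y = sum((y - mean_y) ** 2 for y in ys) / len(ys)
    indices = []
    for fix_first in (True, False):
        cond_means = []
        for _ in range(n_outer):
            fixed = rng.random()  # value at which input i is held fixed
            vals = [model(fixed, rng.random()) if fix_first
                    else model(rng.random(), fixed) for _ in range(n_inner)]
            cond_means.append(sum(vals) / n_inner)
        m = sum(cond_means) / n_outer
        indices.append(sum((c - m) ** 2 for c in cond_means) / n_outer / var_y)
    return indices  # [S1, S2]
```

    For the toy model y = 4*x1 + x2 with uniform inputs, the analytic indices are 16/17 ≈ 0.94 and 1/17 ≈ 0.06, so the estimate correctly identifies x1 as the dominant parameter, which is the kind of ranking the workflow produces for morphogenesis models.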

  15. A global sensitivity analysis approach for morphogenesis models.

    Science.gov (United States)

    Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G

    2015-11-21

    Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.

  16. A Factor Analysis Approach for Clustering Patient Reported Outcomes.

    Science.gov (United States)

    Oh, Jung Hun; Thor, Maria; Olsson, Caroline; Skokic, Viktor; Jörnsten, Rebecka; Alsadius, David; Pettersson, Niclas; Steineck, Gunnar; Deasy, Joseph O

    2016-10-17

    In the field of radiation oncology, the use of extensive patient reported outcomes (PROs) is increasingly common to measure adverse side effects after radiotherapy in cancer patients. Factor analysis (FA) has the potential to identify an optimal number of latent factors (i.e., symptom groups). However, the ultimate goal of treatment response modeling is to understand the relationship between treatment variables such as radiation dose and the symptom groups resulting from FA. Hence, it is crucial to identify clinically relevant symptom groups and improved response variables from those symptom groups for a quantitative analysis. The goal of this study is to design a computational method for finding clinically relevant symptom groups from PROs and to test associations between symptom groups and radiation dose. We propose a novel approach where exploratory factor analysis (EFA) is followed by confirmatory factor analysis (CFA) to determine the relevant number of symptom groups. We also propose to use a combination of symptoms in a symptom group identified as a new response variable in linear regression analysis to investigate the relationship between the symptom group and dose-volume variables. We analyzed patient-reported gastrointestinal symptom profiles from 3 datasets of prostate cancer patients treated with radiotherapy. The final structural model of each dataset was validated using the other two datasets and compared to four other existing FA methods. Our systematic EFA-CFA approach provided clinically more relevant solutions than the other methods, resulting in new clinically relevant outcome variables that enabled a quantitative analysis. As a result, statistically significant correlations were found between some dose-volume variables of relevant anatomic structures and the symptom groups identified by FA. Our proposed method can aid in the process of understanding PROs and provide a basis for improving our understanding of radiation-induced side effects.

  17. Modular approach to analysis of chemically recuperated gas turbine cycles

    Energy Technology Data Exchange (ETDEWEB)

    Carcasci, C.; Facchini, B. [University of Florence 'Sergio Stecco' (Italy). Dept. of Energy Engineering; Harvey, S. [Chalmers Institute of Technology, Goeteborg (Sweden). Dept. of Heat and Power Technology

    1998-12-31

    Current research programmes such as the CAGT programme investigate the opportunity for advanced power generation cycles based on state-of-the-art aeroderivative gas turbine technology. Such cycles would be aimed primarily at intermediate duty applications. Compared to industrial gas turbines, aeroderivatives offer high simple cycle efficiency and the capability to start quickly and frequently without a significant maintenance cost penalty. A key element for high system performance is the development of improved heat recovery systems, leading to advanced cycles such as the humid air turbine (HAT) cycle, the chemically recuperated gas turbine (CRGT) cycle and the Kalina combined cycle. Screening studies conducted by research programmes such as the CAGT programme predict that, when used in combination with advanced technologies and components, such advanced cycles could theoretically reach net cycle efficiencies exceeding 60%. In this paper, the authors present the application of the modular approach to cycle simulation and performance prediction of CRGT cycles. The paper first presents the modular simulation code concept and the main characteristics of CRGT cycles. The paper next discusses the development of the methane-steam reformer unit model used for the simulations. The modular code is then used to compute performance characteristics of a simple CRGT cycle and a reheat CRGT cycle, both based on the General Electric LM6000 aeroderivative gas turbine. (author)

  18. Earthquake response analysis of RC bridges using simplified modeling approaches

    Science.gov (United States)

    Lee, Do Hyung; Kim, Dookie; Park, Taehyo

    2009-07-01

    In this paper, simplified modeling approaches describing the hysteretic behavior of reinforced concrete bridge piers are proposed. For this purpose, flexure-axial and shear-axial interaction models are developed and implemented into a nonlinear finite element analysis program. Comparative verifications for reinforced concrete columns show that the analytical predictions obtained with the new formulations correlate well with experimental results under various levels of axial force and for various section types. In addition, analytical correlation studies for the inelastic earthquake response of reinforced concrete bridge structures are carried out using the simplified modeling approaches. Relatively good agreement is observed between the results of the current modeling approach and those of more elaborate fiber models. It is thus encouraging that the present developments are capable of correctly identifying the contribution of the deformation mechanisms. Consequently, the present developments can be used as a simple yet effective tool for evaluating the deformation capacity of reinforced concrete columns in general and reinforced concrete bridge piers in particular.

  19. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used during the last two decades in the design phase to reduce a product’s environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can relate changes in design parameters to a product’s environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors, i.e., those with significant influence on a product’s environmental impacts, can be identified by analyzing the relationship between the environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (redesigning a product. A printed circuit board (PCB case study is conducted, in which eight design parameters are analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
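The core idea, identifying which design parameter an impact is most sensitive to, can be sketched with a one-at-a-time perturbation. The impact model, coefficients, and parameter names below are invented for illustration, not taken from the PCB study:

```python
# Minimal one-at-a-time (OAT) sensitivity sketch with a toy impact model.
def co2_impact(panel_area_m2, copper_g, solder_g, energy_kwh):
    """Hypothetical CO2-equivalent impact (kg), linear in each parameter."""
    return 12.0 * panel_area_m2 + 0.004 * copper_g + 0.002 * solder_g + 0.5 * energy_kwh

baseline = {"panel_area_m2": 0.05, "copper_g": 40.0, "solder_g": 10.0, "energy_kwh": 1.0}

def sensitivity(param, delta=0.10):
    """Relative change in impact for a +10% change in one design parameter."""
    base = co2_impact(**baseline)
    bumped = dict(baseline)
    bumped[param] *= 1 + delta
    return (co2_impact(**bumped) - base) / base

# Rank parameters by how strongly a +10% change moves the impact.
ranking = sorted(baseline, key=sensitivity, reverse=True)
print(ranking[0])  # the parameter the impact is most sensitive to
```

A real study would replace the toy model with LCA characterization factors and could use variance-based methods when parameters interact.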

  20. What leads Indians to participate in clinical trials? A meta-analysis of qualitative studies.

    Directory of Open Access Journals (Sweden)

    Jatin Y Shah

    Full Text Available BACKGROUND: With the globalization of clinical trials, large developing nations have substantially increased their participation in multi-site studies. This participation has raised ethical concerns, among them the fear that local customs, habits and culture are not respected while asking potential participants to take part in a study. This knowledge gap is particularly noticeable among Indian subjects: despite the large number of participants, little is known regarding what factors affect their willingness to participate in clinical trials. METHODS: We conducted a meta-analysis of all studies evaluating the factors and barriers, from the perspective of potential Indian participants, contributing to their participation in clinical trials. We searched both international and Indian-specific bibliographic databases, including PubMed, Cochrane, Openjgate, MedInd, Scirus and Medknow, also performing hand searches and communicating with authors to obtain additional references. We enrolled studies dealing exclusively with the participation of Indians in clinical trials. Data extraction was conducted by three researchers, with disagreements resolved by consensus. RESULTS: Six qualitative studies and one survey were found evaluating the main themes affecting the participation of Indian subjects. Personal health benefits, altruism, trust in physicians, a source of extra income, detailed knowledge, and methods for motivating participants emerged as facilitating factors, while mistrust of trial organizations, concerns about the efficacy and safety of trials, psychological reasons, trial burden, loss of confidentiality, dependency issues, and language emerged as barriers. CONCLUSION: We identified factors that facilitate, and barriers that have negative implications for, trial participation decisions of Indian subjects. Due consideration and weight should be given to these factors when planning future trials in India.

  1. Scientific publications from Arab world in leading journals of Integrative and Complementary Medicine: a bibliometric analysis.

    Science.gov (United States)

    Zyoud, Sa'ed H; Al-Jabi, Samah W; Sweileh, Waleed M

    2015-09-04

    Bibliometric analysis is increasingly employed as a useful tool to assess the quantity and quality of research performance. The specific goal of the current study was to evaluate the performance of research output originating from the Arab world and published in international Integrative and Complementary Medicine (ICM) journals. Original scientific publications and reviews from the 22 Arab countries that were published in 22 international peer-reviewed ICM journals during all previous years, up to December 31st, 2013, were screened using the Web of Science databases. Five hundred and ninety-one documents were retrieved from 19 ICM journals. The h-index of the set of papers under study was 47. The highest h-index was 27 for Morocco and 21 for Jordan, followed by 19 each for the Kingdom of Saudi Arabia (KSA) and Egypt; the lowest h-index was 1 for each of Comoros, Qatar, and the Syrian Arab Republic. No data related to ICM were published from Djibouti and Mauritania. After adjusting for economy and population power, Somalia (89), Morocco (32.5), Egypt (31.1), Yemen (21.4), and Palestine (21.2) had the highest research productivity. The total number of citations was 9,466, with an average of 16 citations per document. The study identified 262 (44.3 %) documents involving Arab-foreign country collaborations with 39 countries. Arab authors collaborated most with countries in Europe (24.2 %), followed by countries in the Asia-Pacific region (9.8 %). Scientific research output in the ICM field in the Arab world region is increasing. Most publications from the Arab world in the ICM field were driven by the societal use of medicinal plants and herbs. The search for new therapies from available low-cost medicinal plants in the Arab world has motivated many researchers in academia and the pharmaceutical industry. Further investigation is required to support these findings in a wider journal as well as to improve research output in the field of ICM from Arab world region by investing in more national and

  2. Lead Toxicity

    Science.gov (United States)

    ... time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or behavioral issues. • Lead also affects other parts ... 800-424-5323) • U.S. Environmental Protection Agency Lead Awareness Program http://www.epa.gov/lead • EPA publication ...

  3. A new response surface approach for structural reliability analysis

    Science.gov (United States)

    Thacker, B. H.; Wu, X.-T.

    1992-01-01

    This paper describes a new approach for computing structural reliability by post-processing previously computed probabilistic results for stress and strength. The objective is to provide an accurate method whereby probabilistic analyses for stress and strength functions can be performed independently and combined at a later time to compute the probability of failure. The method provides a capability for testing different strength measures without the need to re-compute the probabilistic stress response. The proposed approach takes full account of the basic random variables affecting both stress and strength, and of the failure region in the variable space identified during the separate stress/strength probabilistic analyses. A simple closed-form example and a more complex analysis of a turbine blade subject to creep rupture are used to illustrate the method.
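The underlying stress-strength formulation can be illustrated with a small sketch assuming independent normal distributions; the paper itself post-processes previously computed probabilistic results rather than re-sampling as done here, and the numbers are arbitrary:

```python
# Stress-strength reliability sketch: probability of failure P(stress > strength),
# with illustrative normal distributions (not from the paper).
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
mu_s, sd_s = 400.0, 40.0      # stress (e.g. MPa)
mu_r, sd_r = 550.0, 50.0      # strength

# Monte Carlo estimate from independent stress and strength samples.
n = 1_000_000
p_mc = np.mean(rng.normal(mu_s, sd_s, n) > rng.normal(mu_r, sd_r, n))

# Closed form for independent normals: S - R ~ Normal(mu_s - mu_r,
# sqrt(sd_s^2 + sd_r^2)), so P(failure) = Phi(z) with z below.
z = (mu_s - mu_r) / sqrt(sd_s**2 + sd_r**2)
p_exact = 0.5 * (1 + erf(z / sqrt(2)))
print(p_mc, round(p_exact, 5))
```

Swapping in a different strength distribution changes only the strength samples, mirroring the paper's point that strength measures can be tested without re-computing the stress response.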

  4. Sea level rise and the geoid: factor analysis approach

    Directory of Open Access Journals (Sweden)

    Alexey Sadovski

    2013-08-01

    Full Text Available Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence, so this rise is better characterized as relative sea level rise than as global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical approach using factor analysis of regional rates of sea level change. Unlike physical models and semi-empirical models that attempt to predict how much and how fast sea levels are changing, this methodology allows for a discussion of the factor(s) that statistically affect sea level rates of change, and seeks patterns to explain spatial correlations.
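The factor-extraction idea can be sketched on synthetic station data; the signals below are invented stand-ins for tide-gauge rate series, and the "eigenvalue greater than 1" retention rule is a common heuristic rather than the paper's specific procedure:

```python
# Sketch: how many common factors underlie correlated regional sea-level series?
import numpy as np

rng = np.random.default_rng(2)
n_years = 60
common = rng.normal(size=n_years)          # shared (e.g. eustatic) signal
subsidence = rng.normal(size=n_years)      # regional subsidence-like signal

# Six "stations": the first three dominated by the common factor,
# the last three by the regional one.
stations = np.column_stack(
    [common + 0.3 * rng.normal(size=n_years) for _ in range(3)]
    + [subsidence + 0.3 * rng.normal(size=n_years) for _ in range(3)]
)

# Eigenvalues of the station correlation matrix; the "eigenvalue > 1"
# rule of thumb suggests how many factors to retain.
corr = np.corrcoef(stations, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int(np.sum(eigvals > 1.0))
print(n_factors)
```

On real tide-gauge rates, the retained factors and their loadings would then be interpreted physically (global signal vs. local subsidence), as the abstract describes.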

  5. Data analysis with the DIANA meta-scheduling approach

    International Nuclear Information System (INIS)

    Anjum, A; McClatchey, R; Willers, I

    2008-01-01

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing the fault-tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data-intensive bulk scheduling, is network aware, and follows a policy-centric meta-scheduling approach. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present quality-of-service related statistics for physics analysis through the application of a policy-centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads, is dynamic, and adapts to the volatile nature of the resources

  6. Introduction to Safety Analysis Approach for Research Reactors

    International Nuclear Information System (INIS)

    Park, Suki

    2016-01-01

    Research reactors vary widely in terms of thermal power, coolant, moderator, reflector, fuel, reactor tank and pool, flow direction in the core, and the operating pressure and temperature of the cooling system. Around 110 research reactors have a thermal power greater than 1 MW. This paper introduces a general approach to safety analysis for research reactors and presents the experience of safety analysis for a 10 MW open-pool, open-tank research reactor with downward flow in the reactor core during normal operation. The general approach to safety analysis for research reactors is described and the design features of a typical open-pool and open-tank type reactor are discussed. The representative events expected in research reactors are investigated. The reactor responses and the thermal-hydraulic behavior in these events are presented and discussed. From the minimum CHFR and the maximum fuel temperature calculated, it is ensured that the fuel is not damaged under a step reactivity insertion of 1.8 mk or a failure of all primary pumps for this reactor with 10 MW thermal power and downward core flow

  7. Bioinformatics approaches to single-cell analysis in developmental biology.

    Science.gov (United States)

    Yalcin, Dicle; Hakguder, Zeynep M; Otu, Hasan H

    2016-03-01

    Individual cells within the same population show various degrees of heterogeneity, which may be better handled with single-cell analysis to address biological and clinical questions. Single-cell analysis is especially important in developmental biology as subtle spatial and temporal differences in cells have significant associations with cell fate decisions during differentiation and with the description of a particular state of a cell exhibiting an aberrant phenotype. Biotechnological advances, especially in the area of microfluidics, have led to a robust, massively parallel and multi-dimensional capturing, sorting, and lysis of single-cells and amplification of related macromolecules, which have enabled the use of imaging and omics techniques on single cells. There have been improvements in computational single-cell image analysis in developmental biology regarding feature extraction, segmentation, image enhancement and machine learning, handling limitations of optical resolution to gain new perspectives from the raw microscopy images. Omics approaches, such as transcriptomics, genomics and epigenomics, targeting gene and small RNA expression, single nucleotide and structural variations and methylation and histone modifications, rely heavily on high-throughput sequencing technologies. Although there are well-established bioinformatics methods for analysis of sequence data, there are limited bioinformatics approaches which address experimental design, sample size considerations, amplification bias, normalization, differential expression, coverage, clustering and classification issues, specifically applied at the single-cell level. In this review, we summarize biological and technological advancements, discuss challenges faced in the aforementioned data acquisition and analysis issues and present future prospects for application of single-cell analyses to developmental biology. © The Author 2015. Published by Oxford University Press on behalf of the European

  8. Stability Analysis of a Model of Atherogenesis: An Energy Estimate Approach

    Directory of Open Access Journals (Sweden)

    A. I. Ibragimov

    2008-01-01

    Full Text Available Atherosclerosis is a disease of the vasculature that is characterized by chronic inflammation and the accumulation of lipids and apoptotic cells in the walls of large arteries. The disease results in plaque growth in an affected artery, typically leading to occlusion of the artery. Atherosclerosis is the leading cause of human mortality in the US, much of Europe, and parts of Asia. In a previous work, we introduced a mathematical model of the biochemical aspects of the disease, in particular the inflammatory response of macrophages in the presence of chemoattractants and modified low density lipoproteins. Herein, we consider the onset of a lesion as resulting from an instability in an equilibrium configuration of cells and chemical species. We derive an appropriate norm by taking an energy estimate approach and present stability criteria. A bio-physical analysis of the mathematical results is presented.

  9. Lead preconcentration in synthetic samples with triton x-114 in the cloud point extraction and analysis by atomic absorption (EAAF)

    International Nuclear Information System (INIS)

    Zegarra Pisconti, Marixa; Cjuno Huanca, Jesus

    2015-01-01

    A methodology was developed for the preconcentration of lead in water samples to which dithizone, previously dissolved in the nonionic surfactant Triton X-114, was added as a complexing agent until formation of the critical micelle concentration and attainment of the cloud point temperature. Centrifugation of the system gave a precipitate with high concentrations of Pb(II), which was measured by flame atomic absorption spectroscopy (EAAF). The method proved feasible as a method of preconcentration and analysis of Pb in aqueous samples with concentrations below 1 ppm. Several parameters were evaluated, yielding a recovery of 89.8%. (author)

  10. Intelligent Systems Approaches to Product Sound Quality Analysis

    Science.gov (United States)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. 
It will also provide a framework more amenable to an intelligent systems approach

  11. A comparison of portable XRF and ICP-OES analysis for lead on air filter samples from a lead ore concentrator mill and a lead-acid battery recycler.

    Science.gov (United States)

    Harper, Martin; Pacolay, Bruce; Hintz, Patrick; Andrew, Michael E

    2006-03-01

    Personal and area samples for airborne lead were taken at a lead mine concentrator mill, and at a lead-acid battery recycler. Lead is mined as its sulfidic ore, galena, which is often associated with zinc and silver. The ore typically is concentrated, and partially separated, on site by crushing and differential froth flotation of the ore minerals before being sent to a primary smelter. Besides lead, zinc and iron are also present in the airborne dusts, together with insignificant levels of copper and silver, and, in one area, manganese. The disposal of used lead-acid batteries presents environmental issues, and is also a waste of recoverable materials. Recycling operations allow for the recovery of lead, which can then be sold back to battery manufacturers to form a closed loop. At the recycling facility lead is the chief airborne metal, together with minor antimony and tin, but several other metals are generally present in much smaller quantities, including copper, chromium, manganese and cadmium. Samplers used in these studies included the closed-face 37 mm filter cassette (the current US standard method for lead sampling), the 37 mm GSP or "cone" sampler, the 25 mm Institute of Occupational Medicine (IOM) inhalable sampler, the 25 mm Button sampler, and the open-face 25 mm cassette. Mixed cellulose-ester filters were used in all samplers. The filters were analyzed after sampling for their content of the various metals, particularly lead, that could be analyzed by the specific portable X-ray fluorescence (XRF) analyzer under study, and then were extracted with acid and analyzed by inductively coupled plasma optical emission spectroscopy (ICP-OES). The 25 mm filters were analyzed using a single XRF reading, while three readings on different parts of the filter were taken from the 37 mm filters. For lead at the mine concentrate mill, all five samplers gave good correlations (r2 > 0.96) between the two analytical methods over the entire range of found lead mass
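The method-comparison statistic reported here, r² between portable XRF readings and ICP-OES reference values, can be sketched with ordinary least squares on synthetic paired measurements (the data below are invented; only the r² > 0.96 benchmark comes from the study):

```python
# Sketch of an XRF vs. ICP-OES method comparison on synthetic lead masses.
import numpy as np

rng = np.random.default_rng(3)
icp_ug = rng.uniform(5, 500, size=40)               # ICP-OES lead mass (ug), reference
xrf_ug = 0.95 * icp_ug + rng.normal(0, 8, size=40)  # XRF with slight bias and noise

# Ordinary least squares fit and coefficient of determination.
slope, intercept = np.polyfit(icp_ug, xrf_ug, 1)
pred = slope * icp_ug + intercept
ss_res = np.sum((xrf_ug - pred) ** 2)
ss_tot = np.sum((xrf_ug - xrf_ug.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(slope, 2), round(r2, 3))
```

A slope near 1 with small intercept indicates the field method tracks the laboratory method across the mass range, which is the criterion behind the study's sampler-by-sampler comparisons.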

  12. System-synergetic approach to the analysis of Waldorf school

    Directory of Open Access Journals (Sweden)

    Ionova E.N.

    2012-03-01

    Full Text Available The article considers the basic aspects of using a system-synergetic approach to the analysis of the Waldorf school as an example of the embodiment of the synergetic paradigm at the different levels of organization and activity of an educational establishment: the functioning of the school as a social institution; the philosophical comprehension of the essence of upbringing and education; the psychological grounding of the processes of human development and self-development; the content of Waldorf education and the forms and methods by which students master it; and the pedagogical influence of the teacher's personality on the child.

  13. A MANAGERIAL AND COST ACCOUNTING APPROACH OF CUSTOMER PROFITABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    CARDOS Ildiko Reka

    2010-07-01

    Full Text Available In recent years many organizations have realized that market orientation is essential to their success. By satisfying the needs of customers and offering them products and services that meet their desires and demands, customer loyalty can increase profitability in the long term. After analyzing the existing journal literature in this field, we would like to emphasize that managerial accounting, cost calculation methods and techniques, and the analysis of costs provide relevant information for analyzing customer profitability. We pay special attention to cost systems. An activity-based costing approach takes customer profitability analysis to new levels of accuracy and usefulness, providing the basis for creating, communicating and delivering value to customers.

  14. Analysis of patient diaries in Danish ICUs: a narrative approach

    DEFF Research Database (Denmark)

    Egerod, Ingrid; Christensen, Doris

    2009-01-01

    OBJECTIVES: The objective was to describe the structure and content of patient diaries written for critically ill patients in Danish intensive care units (ICUs). BACKGROUND: Critical illness is associated with physical and psychological aftermath including cognitive impairment and post-traumatic stress. Patient diaries written in the intensive care unit are used to help ICU-survivors come to terms with their illness. RESEARCH METHODOLOGY: The study had a qualitative, descriptive and explorative design, using a narrative approach of analysis. Data were analysed on several levels: extra-case level...

  15. Equivalence of ADM Hamiltonian and Effective Field Theory approaches at next-to-next-to-leading order spin1-spin2 coupling of binary inspirals

    International Nuclear Information System (INIS)

    Levi, Michele; Steinhoff, Jan

    2014-01-01

    The next-to-next-to-leading order spin1-spin2 potential for an inspiralling binary, which is essential for accuracy to fourth post-Newtonian order if both components in the binary are spinning rapidly, has recently been derived independently via the ADM Hamiltonian and the Effective Field Theory approaches, using different gauges and variables. Here we show the complete physical equivalence of the two results, thereby providing the first proof of the equivalence of the ADM Hamiltonian and the Effective Field Theory approaches at next-to-next-to-leading order with the inclusion of spins. The main difficulty in the spinning sectors, which also prescribes the manner in which the comparison of the two results is tackled here, is the existence of redundant unphysical spin degrees of freedom, associated with the spin gauge choice of a point within the extended spinning object for its representative worldline. After gauge fixing and eliminating the unphysical degrees of freedom of the spin and its conjugate at the level of the action, we arrive at curved spacetime generalizations of the Newton-Wigner variables in closed form, which can also be used to obtain further Hamiltonians based on an Effective Field Theory formulation and computation. Finally, we make use of our validated result to provide gauge-invariant relations among the binding energy, angular momentum, and orbital frequency of an inspiralling binary with generic compact spinning components to fourth post-Newtonian order, including all sectors known to date

  16. Study of the speciation of lead and zinc in industrial dusts and slags and in a contaminated soil: a spectroscopic approach

    International Nuclear Information System (INIS)

    Sobanska, Sophie

    1999-01-01

    Since the study of the physicochemical forms of metals in polluted soils is necessary to understand their mobilisation, and therefore to assess the risk they represent for the environment, the objective of this research thesis is to determine the speciation of lead and zinc in a soil contaminated by particles (dust and slag) released by a lead production plant. This determination is performed using a spectroscopic approach: optical microscopy, X-ray diffraction, scanning electron microscopy, transmission electron microscopy, electron microprobe analysis, and Raman micro-spectrometry. In order to understand the evolution of the speciation of the metals and of their propagation in soils, dust and slag produced by the industrial process were sampled and morphologically characterized. Associations of the metals with other compounds such as iron oxides and carbonates have been highlighted. The author shows that contact with the ground results in greater alteration of the particles and in metal mobilisation. She reports the study of lead and zinc localisation in various particles, and of the influence of changes in soil physicochemical conditions (pH decrease, reduction by soil clogging during humid periods) [fr

  17. Analysis of dijet events in diffractive ep interactions with tagged leading proton at the H1 experiment

    International Nuclear Information System (INIS)

    Polifka, Richard

    2011-08-01

    An inclusive dijet production in diffractive deep-inelastic scattering is measured. The diffractive selection is based on tagging of the leading proton in the Forward Proton Spectrometer. The statistics of events obtained during the HERA II running period (integrated luminosity of 156.7 pb⁻¹) enables the measurement of jet final states with a leading proton for the first time. The data cover the phase space of x_P and 4 ≤ Q² ≤ 110 GeV². The dijet data are compared with next-to-leading order predictions of quantum chromodynamics (QCD). The phase space of diffractive dijets in this analysis is larger by a factor of 3 in x_P than in previous measurements. The QCD predictions based on DGLAP parton evolution describe the measured data well, even in a non-DGLAP enriched phase space where one of the jets goes into the region close to the direction of the outgoing proton. The measured single-differential cross sections are compared to several Monte Carlo models with different treatments of the diffractive exchange implemented. (orig.)

  18. The Peltier driven frequency domain approach in thermal analysis.

    Science.gov (United States)

    De Marchi, Andrea; Giaretto, Valter

    2014-10-01

    The merits of Frequency Domain analysis as a tool for thermal system characterization are discussed, and the complex thermal impedance approach is illustrated. Pure AC thermal flux generation with negligible DC component is possible with a Peltier device, differently from other existing methods in which a significant DC component is intrinsically attached to the generated AC flux. Such technique is named here Peltier Driven Frequency Domain (PDFD). As a necessary prerequisite, a novel one-dimensional analytical model for an asymmetrically loaded Peltier device is developed, which is general enough to be useful in most practical situations as a design tool for measurement systems and as a key for the interpretation of experimental results. Impedance analysis is possible with Peltier devices by the inbuilt Seebeck effect differential thermometer, and is used in the paper for an experimental validation of the analytical model. Suggestions are then given for possible applications of PDFD, including the determination of thermal properties of materials.
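The complex thermal impedance concept can be sketched for the simplest case, a lumped first-order (RC) thermal model with Z(ω) = R / (1 + jωRC); the R and C values below are arbitrary illustrations, not taken from the paper:

```python
# Complex thermal impedance of a lumped first-order thermal system.
import numpy as np

R = 5.0     # thermal resistance, K/W (illustrative)
C = 2.0     # thermal capacitance, J/K (illustrative)
freqs_hz = np.logspace(-3, 1, 5)
w = 2 * np.pi * freqs_hz

# Z(w) = R / (1 + j*w*R*C): magnitude -> R at low frequency,
# rolls off with phase approaching -90 degrees at high frequency.
Z = R / (1 + 1j * w * R * C)
for f, z in zip(freqs_hz, Z):
    print(f"{f:8.3f} Hz  |Z| = {abs(z):6.3f} K/W  phase = {np.degrees(np.angle(z)):7.2f} deg")
```

Fitting measured magnitude and phase curves to such a model (or to a higher-order network) is what yields the thermal parameters in a frequency-domain characterization.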

  19. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful
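The κ-exponential at the core of κ-generalized statistics can be sketched directly. The survival function below follows the form exp_κ(-βx^α) from the κ-statistics literature, with illustrative (not fitted) parameter values:

```python
# Sketch of the kappa-exponential: exp_k(x) = (sqrt(1 + k^2 x^2) + k*x)^(1/k),
# which reduces to exp(x) as k -> 0 and gives power-law tails for k > 0.
import numpy as np

def exp_kappa(x, kappa):
    x = np.asarray(x, dtype=float)
    if kappa == 0:
        return np.exp(x)
    return (np.sqrt(1 + kappa**2 * x**2) + kappa * x) ** (1 / kappa)

def survival(x, alpha=2.0, beta=1.0, kappa=0.5):
    """Complementary CDF of the kappa-generalized model: exp_k(-beta*x^alpha).
    alpha, beta, kappa here are illustrative, not estimates from income data."""
    return exp_kappa(-beta * np.asarray(x, dtype=float) ** alpha, kappa)

# Small-kappa limit recovers the ordinary (Weibull-type) exponential tail.
print(exp_kappa(-1.0, 1e-6), np.exp(-1.0))
print(survival(np.array([0.5, 1.0, 3.0])))
```

For large incomes the survival function decays like a Pareto power law with exponent α/κ, which is why one functional form can span both the middle and the upper tail of the distribution.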

  20. Cognitive approaches for patterns analysis and security applications

    Science.gov (United States)

    Ogiela, Marek R.; Ogiela, Lidia

    2017-08-01

    This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or encryption process. They also allow the use of crypto-biometric solutions to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, together with a novel application of such systems for visual secret sharing. Visual shares for the divided information can be created with a threshold procedure, which may depend on personal abilities to recognize image details visible in the divided images.
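As a hedged illustration of the share/reconstruct idea only, here is a minimal (2, 2) XOR-based secret-sharing sketch for a binary "image" row; the cognitive, meaning-dependent scheme proposed in the paper is substantially richer than this:

```python
# Minimal (2, 2) XOR secret sharing: either share alone is uniformly random,
# while XOR-ing both shares reconstructs the secret exactly.
import secrets

def make_shares(bits):
    """Split a bit list into two shares; each alone reveals nothing."""
    share1 = [secrets.randbelow(2) for _ in bits]
    share2 = [b ^ s for b, s in zip(bits, share1)]
    return share1, share2

def reconstruct(share1, share2):
    """Recover the secret by bitwise XOR of the two shares."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]   # a tiny "image" row
s1, s2 = make_shares(secret)
print(reconstruct(s1, s2) == secret)  # True
```

Classical visual cryptography replaces the XOR with subpixel patterns so that stacking printed transparencies performs the reconstruction optically; the paper's contribution layers semantic, cognition-dependent criteria on top of such schemes.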

  1. An integrated sampling and analysis approach for improved biodiversity monitoring.

    Science.gov (United States)

    DeWan, Amielle A; Zipkin, Elise F

    2010-05-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
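
    The community-level hierarchical model itself is beyond a snippet, but the basic single-species occupancy likelihood it builds on (constant occupancy ψ and detection probability p, with repeated visits per site) fits in a few lines. The detection histories below are invented; the point is that the corrected estimate of ψ exceeds the naive proportion of sites with detections, because imperfect detection is accounted for.

```python
import math
from itertools import product

def log_lik(psi, p, histories):
    """Log-likelihood of site detection histories under the basic
    single-species occupancy model (constant psi and p)."""
    ll = 0.0
    for h in histories:
        J, d = len(h), sum(h)
        if d > 0:
            # site occupied and detected d times out of J visits
            ll += math.log(psi * p**d * (1 - p) ** (J - d))
        else:
            # never detected: occupied-but-missed, or truly unoccupied
            ll += math.log(psi * (1 - p) ** J + (1 - psi))
    return ll

def fit(histories, grid=99):
    """Crude grid-search MLE for (psi, p) on a regular grid in (0, 1)."""
    vals = [(i + 1) / (grid + 1) for i in range(grid)]
    return max(product(vals, vals), key=lambda t: log_lik(t[0], t[1], histories))

# Toy data: 6 sites x 3 visits; species detected at 4 of 6 sites.
histories = [(1, 0, 1), (0, 1, 0), (1, 1, 1), (0, 0, 0), (1, 0, 0), (0, 0, 0)]
psi_hat, p_hat = fit(histories)
```

    The naive occupancy estimate here is 4/6 ≈ 0.67; the MLE corrects upward (≈ 0.75) because with p ≈ 0.5 some occupied sites plausibly went undetected in all three visits.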

  2. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.

  3. Common components analysis: An adapted approach for evaluating programs.

    Science.gov (United States)

    Morgan, Nicole R; Davis, Kelly D; Richardson, Cameron; Perkins, Daniel F

    2018-04-01

    Common Components Analysis (CCA) summarizes the results of program evaluations that utilize randomized controlled trials and have demonstrated effectiveness in improving their intended outcome(s) into their key elements. This area of research has integrated and modified the existing CCA approach to provide a means of evaluating components of programs without a solid evidence-base, across a variety of target outcomes. This adapted CCA approach (a) captures a variety of similar program characteristics to increase the quality of the comparison within components; (b) identifies components from four primary areas (i.e., content, process, barrier reduction, and sustainability) within specific programming domains (e.g., vocation, social); and (c) proposes future directions to test the extent to which the common components are associated with changes in intended program outcomes (e.g., employment, job retention). The purpose of this paper is to discuss the feasibility of this adapted CCA approach. To illustrate the utility of this technique, researchers used CCA with two popular employment programs that target successful Veteran reintegration but have limited program evaluation - Hire Heroes USA and Hire Our Heroes. This adapted CCA could be applied to longitudinal research designs to identify all utilized programs and the most promising components of these programs as they relate to changes in outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. A Distributed Flocking Approach for Information Stream Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Intelligence analysts are currently overwhelmed with the amount of information streams generated every day. There is a lack of comprehensive tools that can analyze information streams in real time. Document clustering analysis plays an important role in improving the accuracy of information retrieval. However, most clustering technologies can only be applied to static document collections because they normally require a large amount of computational resources and a long time to obtain accurate results. It is very difficult to cluster dynamically changing text information streams on an individual computer. Our early research resulted in a dynamic reactive flock clustering algorithm which can continually refine the clustering result and quickly react to changes in document contents. This characteristic makes the algorithm suitable for cluster analysis of dynamically changing document information, such as text information streams. Because of the decentralized character of this algorithm, a distributed approach is a very natural way to increase its clustering speed. In this paper, we present a distributed multi-agent flocking approach for text information stream clustering and discuss the decentralized architectures and communication schemes for load balancing and status information synchronization in this approach.
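
    As a rough illustration of the idea (a toy sketch, not the ORNL algorithm), the code below moves document "agents" on a plane: pairs whose term vectors are similar attract, dissimilar pairs repel, so similar documents flock together as the iteration proceeds. Vectors and positions are invented.

```python
def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def step(pos, vecs, sim_threshold=0.5, rate=0.1):
    """One flocking step: similar agents attract, dissimilar repel."""
    new = []
    for i, (xi, yi) in enumerate(pos):
        dx = dy = 0.0
        for j, (xj, yj) in enumerate(pos):
            if i == j:
                continue
            sign = 1.0 if cosine(vecs[i], vecs[j]) > sim_threshold else -1.0
            dx += sign * (xj - xi)
            dy += sign * (yj - yi)
        new.append((xi + rate * dx, yi + rate * dy))
    return new

# Term vectors: docs 0 and 1 share vocabulary, doc 2 does not.
vecs = [(1, 1, 0), (1, 0.8, 0), (0, 0, 1)]
pos = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
for _ in range(20):
    pos = step(pos, vecs)
```

    After the iterations, documents 0 and 1 have drifted together while document 2 has been pushed away, which is the cluster-forming behavior the decentralized version parallelizes.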

  5. Development of a CT-guided standard approach for tined lead implantation at the sacral nerve root S3 in minipigs for chronic neuromodulation

    Science.gov (United States)

    Foditsch, Elena Esra; Zimmermann, Reinhold

    2016-01-01

    Purpose The aim of this study was to develop a controlled approach for sacral neuromodulation (SNM) to improve both nerve targeting and tined lead placement, for which a new computed tomography (CT)-guided implantation technique was analyzed in minipigs. Materials and methods This study included five female, adult Göttingen minipigs. In deep sedoanalgesia, the minipigs were placed in an extended prone position. Commercially available SNM materials were used (needle, introduction sheath, and quadripolar tined lead electrode). Gross anatomy was displayed by CT, and the nerves were bilaterally identified. The optimal angles to puncture the S3 foramen, the resulting access path, and the site for the skin incision were defined subsequently. The needle puncture and the tined lead placement were followed by successive CT scans/3D-reconstruction images. Once proper CT-guided placement of the needle and electrode was established, response to functional stimuli was intraoperatively checked to verify correct positioning. Results Successful bilateral tined lead implantation was performed in four out of five minipigs. Implantation was different from the clinical situation because the puncture was done from the contralateral side at a 30° angle to the midline and 60° horizontal angle to ensure both passage through the foramen and nerve access. Surgery time was 50–150 minutes. Stimulation response comprised a twitch of the perianal musculature and tail rotation to the contralateral side. Conclusion We have established a new, minimally invasive, highly standardized, CT-guided SNM electrode implantation technique. Functional outcomes are clearly defined and reproducible. All procedures can be performed without complications. Future chronic stimulation studies in minipigs can thereby be conducted using a controlled and highly standardized protocol. PMID:27730097

  6. An Approach for Economic Analysis of Intermodal Transportation

    Directory of Open Access Journals (Sweden)

    Bahri Sahin

    2014-01-01

    Full Text Available A different intermodal transportation model based on cost analysis considering technical, economical, and operational parameters is presented. The model consists of such intermodal modes as sea-road, sea-railway, road-railway, and multimode of sea-road-railway. A case study of cargo transportation has been carried out by using the suggested model. Then, the single road transportation mode has been compared to intermodal modes in terms of transportation costs. This comparison takes into account the external costs of intermodal transportation. The research reveals that, in the short distance transportation, single transportation modes always tend to be advantageous. As the transportation distance gets longer, intermodal transportation advantages begin to be effective on the costs. In addition, the proposed method in this study leads to determining the fleet size and capacity for transportation and the appropriate transportation mode.
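
    The break-even behavior described above can be illustrated with a minimal cost model: single-mode road has no transfer cost but a higher per-km rate, while intermodal pays a fixed transshipment charge plus a lower per-km rate. All rates and the transfer charge below are invented numbers, not values from the study.

```python
ROAD_RATE = 1.00        # cost per km, road only (assumed)
INTERMODAL_RATE = 0.60  # cost per km, intermodal haul (assumed)
TRANSFER_COST = 300.0   # fixed transshipment cost per trip (assumed)

def road_cost(km):
    return ROAD_RATE * km

def intermodal_cost(km):
    return TRANSFER_COST + INTERMODAL_RATE * km

def break_even_km():
    # road_rate * d = transfer + intermodal_rate * d  =>  d = transfer / diff
    return TRANSFER_COST / (ROAD_RATE - INTERMODAL_RATE)
```

    With these numbers the break-even distance is 750 km: below it the single road mode is cheaper, above it the intermodal chain wins, matching the qualitative finding of the paper.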

  7. An Approach for Economic Analysis of Intermodal Transportation

    Science.gov (United States)

    Sahin, Bahri; Ust, Yasin; Guneri, Ali Fuat; Gulsun, Bahadir; Turan, Eda

    2014-01-01

    A different intermodal transportation model based on cost analysis considering technical, economical, and operational parameters is presented. The model consists of such intermodal modes as sea-road, sea-railway, road-railway, and multimode of sea-road-railway. A case study of cargo transportation has been carried out by using the suggested model. Then, the single road transportation mode has been compared to intermodal modes in terms of transportation costs. This comparison takes into account the external costs of intermodal transportation. The research reveals that, in the short distance transportation, single transportation modes always tend to be advantageous. As the transportation distance gets longer, intermodal transportation advantages begin to be effective on the costs. In addition, the proposed method in this study leads to determining the fleet size and capacity for transportation and the appropriate transportation mode. PMID:25152919

  8. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    Full Text Available To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs). Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.
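
    The gap computation itself is simple: subtract the actual-status score from the importance score for each factor and rank the differences. The scores below are invented on a 1-5 scale for illustration, not the survey's values.

```python
# (importance, actual status) per critical success factor, 1-5 scale
factors = {
    "management support":      (4.8, 3.1),
    "appropriate supervision": (4.6, 3.2),
    "sufficient resources":    (4.5, 3.4),
    "effective enforcement":   (4.5, 3.3),
    "teamwork":                (4.4, 3.5),
    "safety training":         (4.2, 4.0),
}

# gap = importance - actual; largest gaps are the priority problems
gaps = {name: imp - act for name, (imp, act) in factors.items()}
priorities = sorted(gaps, key=gaps.get, reverse=True)
```

    Factors with near-zero gaps are already at satisfactory levels; the head of the ranked list identifies where raising actual status would pay off most.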

  9. VOLUMETRIC LEAD ASSAY

    International Nuclear Information System (INIS)

    Ebadian, M.A.; Dua, S.K.; Roelant, David; Kumar, Sachin

    2001-01-01

    This report describes a system for handling and radioassay of lead, consisting of a robot, a conveyor, and a gamma spectrometer. The report also presents a cost-benefit analysis of options: radioassay and recycling lead vs. disposal as waste

  10. A Big Data Analysis Approach for Rail Failure Risk Assessment.

    Science.gov (United States)

    Jamshidi, Ali; Faghih-Roohi, Shahrzad; Hajizadeh, Siamak; Núñez, Alfredo; Babuska, Robert; Dollevoet, Rolf; Li, Zili; De Schutter, Bart

    2017-08-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use them to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  11. A variational approach to the analysis of dissipative electromechanical systems.

    Directory of Open Access Journals (Sweden)

    Andrew Allison

    Full Text Available We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package.
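
    The paper's own construction is not reproduced here, but a classic example of a strictly variational treatment of dissipation, with no Rayleigh function, is the explicitly time-dependent (Caldirola-Kanai) Lagrangian for the damped harmonic oscillator:

```latex
L(x,\dot{x},t) = e^{\gamma t}\left(\tfrac{1}{2}m\dot{x}^{2} - \tfrac{1}{2}kx^{2}\right),
\qquad
\frac{d}{dt}\frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x}
 = e^{\gamma t}\left(m\ddot{x} + m\gamma\dot{x} + kx\right) = 0 .
```

    The Euler-Lagrange equation thus reproduces the dissipative equation of motion m x'' + m γ x' + k x = 0 from a variational principle alone, which is the kind of result the generalised-potential approach above systematizes for lumped electrical and electromechanical elements.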

  12. A complex analysis approach to the motion of uniform vortices

    Science.gov (United States)

    Riccardi, Giorgio

    2018-02-01

    A new mathematical approach to kinematics and dynamics of planar uniform vortices in an incompressible inviscid fluid is presented. It is based on an integral relation between the Schwarz function of the vortex boundary and the induced velocity. This relation is firstly used for investigating the kinematics of a vortex having its Schwarz function with two simple poles in a transformed plane. The vortex boundary is the image of the unit circle through the conformal map obtained by conjugating its Schwarz function. The resulting analysis is based on geometric and algebraic properties of that map. Moreover, it is shown that the steady configurations of a uniform vortex, possibly in presence of point vortices, can be also investigated by means of the integral relation. The vortex equilibria are divided in two classes, depending on the behavior of the velocity on the boundary, measured in a reference system rotating with this curve. If it vanishes, the analysis is rather simple. However, vortices having nonvanishing relative velocity are also investigated, in presence of a polygonal symmetry. In order to study the vortex dynamics, the definition of Schwarz function is then extended to a Lagrangian framework. This Lagrangian Schwarz function solves a nonlinear integrodifferential Cauchy problem, that is transformed into a singular integral equation. Its analytical solution is here approached in terms of successive approximations. The self-induced dynamics, as well as the interactions with a point vortex, or between two uniform vortices are analyzed.

  13. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (Divergence from Randomness) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network composed of social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrate that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
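
    A minimal version of the PageRank step used in such graph-based reranking is plain power iteration over the document graph. The toy edges below are hypothetical stand-ins for social links between books, not data from the INEX track.

```python
def pagerank(links, n, damping=0.85, iters=100):
    """Power-iteration PageRank; links is a list of (src, dst) edges
    over nodes 0..n-1."""
    out = [0] * n
    for s, _ in links:
        out[s] += 1
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for s, d in links:
            new[d] += damping * rank[s] / out[s]
        # dangling nodes: spread their rank uniformly
        for i in range(n):
            if out[i] == 0:
                for j in range(n):
                    new[j] += damping * rank[i] / n
        rank = new
    return rank

# Document 2 is linked from every other document and should rank highest.
edges = [(0, 2), (1, 2), (3, 2), (2, 0)]
r = pagerank(edges, 4)
```

    In the reranking setting, these scores are then interpolated with the retrieval-model scores rather than used alone.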

  14. Phenylketonuria mutation analysis in Northern Ireland: A rapid stepwise approach

    Energy Technology Data Exchange (ETDEWEB)

    Zschocke, J.; Graham, C.A.; Nevin, N.C. [Queen's Univ., Belfast (United Kingdom)] [and others]

    1995-12-01

    We present a multistep approach for the rapid analysis of phenylketonuria (PKU) mutations. In the first step, three common mutations and a polymorphic short tandem repeat (STR) system are rapidly analyzed with a fluorescent multiplex assay. In the second step, minihaplotypes combining STR and VNTR data are used to determine rare mutations likely to be present in an investigated patient, which are then confirmed by restriction enzyme analysis. The remaining mutations are analyzed with denaturant gradient-gel electrophoresis and sequencing. The first two steps together identify both mutations in 90%-95% of PKU patients, and results can be obtained within 2 d. We have investigated 121 Northern Irish families with hyperphenylalaninemia, including virtually all patients born since 1972, and have found 34 different mutations on 241 of the 242 mutant alleles. Three mutations (R408W, I65T, and F39L) account for 57.5% of mutations, while 14 mutations occur with a frequency of 1%-6%. The present analysis system is efficient and inexpensive and is particularly well suited to routine mutation analysis in a diagnostic setting. 19 refs., 5 tabs.

  15. A general aerodynamic approach to the problem of decaying or growing vibrations of thin, flexible wings with supersonic leading and trailing edges and no side edges

    Science.gov (United States)

    Warner, R. W.

    1975-01-01

    Indicial aerodynamic influence coefficients were evaluated from potential theory for a thin, flexible wing with supersonic leading and trailing edges only. The analysis is based on the use of small surface areas in which the downwash is assumed uniform. Within this limitation, the results are exact except for the restriction of linearized theory. The areas are not restricted either to square boxes or Mach boxes. A given area may be any rectangle or square which may or may not be cut by the Mach forecone, and any area can be used anywhere in the forecone without loss of accuracy.

  16. A full subtraction approach for finite element method based source analysis using constrained Delaunay tetrahedralisation.

    Science.gov (United States)

    Drechsler, F; Wolters, C H; Dierkes, T; Si, H; Grasedyck, L

    2009-07-15

    A mathematical dipole is widely used as a model for the primary current source in electroencephalography (EEG) source analysis. In the governing Poisson-type differential equation, the dipole leads to a singularity on the right-hand side, which has to be treated specifically. In this paper, we will present a full subtraction approach where the total potential is divided into a singularity and a correction potential. The singularity potential is due to a dipole in an infinite region of homogeneous conductivity. The correction potential is computed using the finite element (FE) method. Special care is taken in order to evaluate the right-hand side integral appropriately with the objective of achieving highest possible convergence order for linear basis functions. Our new approach allows the construction of transfer matrices for fast computation of the inverse problem for anisotropic volume conductors. A constrained Delaunay tetrahedralisation (CDT) approach is used for the generation of high-quality FE meshes. We validate the new approach in a four-layer sphere model with a highly conductive cerebrospinal fluid (CSF) and an anisotropic skull compartment. For radial and tangential sources with eccentricities up to 1 mm below the CSF compartment, we achieve a maximal relative error of 0.71% in a CDT-FE model with 360 k nodes which is not locally refined around the source singularity and therefore useful for arbitrary dipole locations. The combination of the full subtraction approach with the high quality CDT meshes leads to accuracies that, to the best of the author's knowledge, have not yet been presented before.
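
    In the standard subtraction formulation (symbols as commonly used in the EEG forward-problem literature, not copied verbatim from the paper), the conductivity is assumed homogeneous, σ ≡ σ∞, in a neighborhood of the source so that the singularity potential can be written in closed form:

```latex
\varphi = \varphi^{\infty} + \varphi^{\mathrm{corr}}, \qquad
\sigma^{\infty}\,\Delta\varphi^{\infty} = \nabla\cdot\mathbf{j}^{p}
\quad \text{in } \mathbb{R}^{3},
```

    and the correction potential then satisfies a regular problem that the finite element method can handle:

```latex
\nabla\cdot\bigl(\sigma\,\nabla\varphi^{\mathrm{corr}}\bigr)
 = -\,\nabla\cdot\bigl((\sigma-\sigma^{\infty})\,\nabla\varphi^{\infty}\bigr)
 \ \text{in } \Omega, \qquad
\sigma\,\partial_{n}\varphi^{\mathrm{corr}}
 = -\,\sigma\,\partial_{n}\varphi^{\infty} \ \text{on } \partial\Omega .
```

    Because σ - σ∞ vanishes around the dipole position, the right-hand side is nonsingular, which is what permits the high convergence order for linear basis functions discussed above.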

  17. A life cycle analysis approach to D and D decision-making

    International Nuclear Information System (INIS)

    Yuracko, K.L.; Gresalfi, M.; Yerace, P.; Krstich, M.; Gerrick, D.

    1998-05-01

    This paper describes a life cycle analysis (LCA) approach that makes decontamination and decommissioning (D and D) of US Department of Energy facilities more efficient and more responsive to the concerns of society. With the considerable complexity of D and D projects and their attendant environmental and health consequences, projects can no longer be designed based on engineering and economic criteria alone. Using the LCA D and D approach, the evaluation of material disposition alternatives explicitly includes environmental impacts, health and safety impacts, socioeconomic impacts, and stakeholder attitudes -- in addition to engineering and economic criteria. Multi-attribute decision analysis is used to take into consideration the uncertainties and value judgments that are an important part of all material disposition decisions. Use of the LCA D and D approach should lead to more appropriate selections of material disposition pathways and a decision-making process that is both understandable and defensible. The methodology and procedures of the LCA D and D approach are outlined and illustrated by an application of the approach at the Department of Energy's West Valley Demonstration Project. Specifically, LCA was used to aid decisions on disposition of soil and concrete from the Tank Pad D and D Project. A decision tree and the Pollution Prevention/Waste Minimization Users Guide for Environmental Restoration Projects were used to identify possible alternatives for disposition of the soil and concrete. Eight alternatives encompassing source reduction, segregation, treatment, and disposal were defined for disposition of the soil; two alternatives were identified for disposition of the concrete. Preliminary results suggest that segregation and treatment are advantageous in the disposition of both the soil and the concrete. This and other recent applications illustrate the strength and ease of application of the LCA D and D approach

  18. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components as compared to the standard principal component analysis (PCA) with sparse loadings in conjunction with Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
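
    The fault-detection step can be sketched with ordinary PCA (the paper's modified sparse variant is not reproduced here): project each scan's metric vector onto the leading principal components and score it with Hotelling's T², the sum of squared scores scaled by the component variances. The data below are synthetic, with one deliberately shifted "faulty" sample.

```python
import numpy as np

def hotelling_t2(X, n_components=2):
    """Hotelling T^2 score per row of X using standard PCA."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)            # eigenvalues ascending
    idx = np.argsort(evals)[::-1][:n_components]  # top components
    scores = Xc @ evecs[:, idx]                   # PCA scores
    return (scores**2 / evals[idx]).sum(axis=1)   # T^2 per sample

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))   # 50 scans x 5 image-quality metrics
X[0] += 8.0                    # one faulty scan, far from the rest
t2 = hotelling_t2(X)
```

    Samples whose T² exceeds a control limit (e.g. an F-distribution quantile) are flagged as out of family; here the shifted scan dominates the scores.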

  19. An Evaluation on Factors Influencing Decision making for Malaysia Disaster Management: The Confirmatory Factor Analysis Approach

    Science.gov (United States)

    Zubir, S. N. A.; Thiruchelvam, S.; Mustapha, K. N. M.; Che Muda, Z.; Ghazali, A.; Hakimie, H.

    2017-12-01

    For the past few years, natural disaster has been the subject of debate in disaster management, especially flood disaster. Each year, natural disasters result in significant loss of life, destruction of homes and public infrastructure, and economic hardship. Hence, effective and efficient flood disaster management helps ensure that life-saving efforts are not futile. The aim of this article is to examine the relationship of approach, decision maker, influence factor, result, and ethic to decision making for flood disaster management in Malaysia. The key elements of decision making in disaster management were studied based on the literature. Questionnaire surveys were administered among lead agencies at the East Coast of Malaysia in the states of Kelantan and Pahang. A total of 307 valid responses had been obtained for further analysis. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were carried out to analyse the measurement model involved in the study. The CFA for the second-order reflective and first-order reflective measurement model indicates that approach, decision maker, influence factor, result, and ethic have a significant and direct effect on decision making during disaster. The results from this study showed that decision-making during disaster is an important element for disaster management to necessitate successful collaborative decision making. The measurement model is accepted to proceed with further analysis known as Structural Equation Modeling (SEM) and can be assessed for the future research.

  20. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen...

  1. Network Analysis: A Novel Approach to Understand Suicidal Behaviour

    Directory of Open Access Journals (Sweden)

    Derek de Beurs

    2017-02-01

    Full Text Available Although suicide is a major public health issue worldwide, we understand little of the onset and development of suicidal behaviour. Suicidal behaviour is argued to be the end result of the complex interaction between psychological, social and biological factors. Epidemiological studies resulted in a range of risk factors for suicidal behaviour, but we do not yet understand how their interaction increases the risk for suicidal behaviour. A new approach called network analysis can help us better understand this process as it allows us to visualize and quantify the complex association between many different symptoms or risk factors. A network analysis of data containing information on suicidal patients can help us understand how risk factors interact and how their interaction is related to suicidal thoughts and behaviour. A network perspective has been successfully applied to the field of depression and psychosis, but not yet to the field of suicidology. In this theoretical article, I will introduce the concept of network analysis to the field of suicide prevention, and offer directions for future applications and studies.
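
    A toy version of such a network: nodes are risk factors, edges are pairwise correlations above a cutoff, and node degree indicates how connected a factor is. Variable names and data are invented for illustration; real analyses typically use regularized partial correlations rather than raw correlations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Synthetic risk-factor data: three interrelated factors, one unrelated.
hopelessness = rng.normal(size=n)
insomnia = 0.7 * hopelessness + 0.3 * rng.normal(size=n)
ideation = 0.6 * hopelessness + 0.4 * rng.normal(size=n)
exercise = rng.normal(size=n)  # unrelated node

data = np.column_stack([hopelessness, insomnia, ideation, exercise])
corr = np.corrcoef(data, rowvar=False)
# Adjacency: absolute correlation above a cutoff, no self-loops.
adj = (np.abs(corr) > 0.3) & ~np.eye(4, dtype=bool)
degree = adj.sum(axis=1)
```

    In this sketch the three interrelated factors form a connected cluster while the unrelated node stays isolated; in a real analysis, centrality of a node such as ideation within the cluster is what the network perspective quantifies.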

  2. Symmetric or asymmetric oil prices? A meta-analysis approach

    International Nuclear Information System (INIS)

    Perdiguero-García, Jordi

    2013-01-01

    The analysis of price asymmetries in the gasoline market is one of the most widely studied in energy economics. However, the great variation in the outcomes reported makes the drawing of any definitive conclusions difficult. Given this situation, a meta-analysis serves as an excellent tool to discover which characteristics of the various markets analyzed, and which specific features of these studies, might account for these differences. In adopting such an approach, this paper shows how the particular segment of the industry analyzed, the characteristics of the data, the years under review, the type of publication and the introduction of control variables might explain this heterogeneity in results. The paper concludes on these grounds that increased competition may significantly reduce the possibility of occurrence of asymmetric behavior. These results should therefore be taken into consideration in future studies of asymmetries in the oil industry. - Highlights: ► I study price asymmetries in the gasoline industry through a meta-analysis regression. ► The asymmetries are produced mainly in the retail market. ► The asymmetries are less frequent when we analyze recent cases. ► There may be some degree of publication bias. ► The level of competition may explain the patterns of asymmetry
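
    The basic computation underlying any such meta-analysis, before regressing on study characteristics, is inverse-variance pooling of the reported effects. The effect sizes and variances below are invented for illustration, not taken from the studies surveyed.

```python
def pool_fixed(effects, variances):
    """Fixed-effect meta-analysis: inverse-variance weighted mean."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    var_pooled = 1.0 / sum(weights)
    return pooled, var_pooled

effects = [0.30, 0.10, 0.25]    # e.g. estimated asymmetry coefficients
variances = [0.01, 0.04, 0.02]  # their sampling variances
est, var = pool_fixed(effects, variances)
```

    Precise studies get large weights, so the pooled estimate sits closest to them, and the pooled variance is always smaller than any single study's; a meta-regression then asks whether deviations from this pooled value are explained by market segment, data frequency, publication type, and so on.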

  3. Analysis of C/E results of fission rate ratio measurements in several fast lead VENUS-F cores

    Directory of Open Access Journals (Sweden)

    Kochetkov Anatoly

    2017-01-01

    During the GUINEVERE FP6 European project (2006–2011), the zero-power VENUS water-moderated reactor was modified into VENUS-F, a mock-up of a lead-cooled fast spectrum system with solid components that can be operated in both critical and subcritical mode. The Fast Reactor Experiments for hybrid Applications (FREYA) FP7 project was launched in 2011 to support the designs of the MYRRHA Accelerator Driven System (ADS) and the ALFRED Lead Fast Reactor (LFR). Three VENUS-F critical core configurations simulating the complex MYRRHA core design, and one configuration devoted to the LFR ALFRED core conditions, were investigated in 2015. The MYRRHA-related cores simulated design peculiarities step by step, such as the BeO reflector and in-pile sections. For all of these cores the fuel assemblies were of a simple design consisting of 30% enriched metallic uranium, lead rodlets to simulate the coolant and Al2O3 rodlets to simulate the oxide fuel. Fission rate ratios of minor actinides such as Np-237 and Am-241, as well as of Pu-239, Pu-240, Pu-242 and U-238, to U-235 were measured in these VENUS-F critical assemblies with small fission chambers in specially designed locations, to determine the spectral indices under the different neutron spectrum conditions. The measurements have been analyzed using advanced computational tools, including deterministic and stochastic codes, and different nuclear data sets such as JEFF-3.1, JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0. The analysis of the C/E discrepancies will help to improve the nuclear data in the specific energy region of fast neutron reactor spectra.

  4. Analysis of C/E results of fission rate ratio measurements in several fast lead VENUS-F cores

    Science.gov (United States)

    Kochetkov, Anatoly; Krása, Antonín; Baeten, Peter; Vittiglio, Guido; Wagemans, Jan; Bécares, Vicente; Bianchini, Giancarlo; Fabrizio, Valentina; Carta, Mario; Firpo, Gabriele; Fridman, Emil; Sarotto, Massimo

    2017-09-01

    During the GUINEVERE FP6 European project (2006-2011), the zero-power VENUS water-moderated reactor was modified into VENUS-F, a mock-up of a lead-cooled fast spectrum system with solid components that can be operated in both critical and subcritical mode. The Fast Reactor Experiments for hybrid Applications (FREYA) FP7 project was launched in 2011 to support the designs of the MYRRHA Accelerator Driven System (ADS) and the ALFRED Lead Fast Reactor (LFR). Three VENUS-F critical core configurations simulating the complex MYRRHA core design, and one configuration devoted to the LFR ALFRED core conditions, were investigated in 2015. The MYRRHA-related cores simulated design peculiarities step by step, such as the BeO reflector and in-pile sections. For all of these cores the fuel assemblies were of a simple design consisting of 30% enriched metallic uranium, lead rodlets to simulate the coolant and Al2O3 rodlets to simulate the oxide fuel. Fission rate ratios of minor actinides such as Np-237 and Am-241, as well as of Pu-239, Pu-240, Pu-242 and U-238, to U-235 were measured in these VENUS-F critical assemblies with small fission chambers in specially designed locations, to determine the spectral indices under the different neutron spectrum conditions. The measurements have been analyzed using advanced computational tools, including deterministic and stochastic codes, and different nuclear data sets such as JEFF-3.1, JEFF-3.2, ENDF/B-VII.1 and JENDL-4.0. The analysis of the C/E discrepancies will help to improve the nuclear data in the specific energy region of fast neutron reactor spectra.

  5. Single-molecule analysis of lead(II)-binding aptamer conformational changes in an α-hemolysin nanopore, and sensitive detection of lead(II)

    International Nuclear Information System (INIS)

    Wang, Hai-Yan; Song, Ze-Yang; Zhang, Hui-Sheng; Chen, Si-Ping

    2016-01-01

    The α-hemolysin (αHL) nanopore is capable of analyzing DNA duplexes and DNA aptamers as they are electrophoretically driven into the vestibule from the cis entrance. The current study describes the competitive interaction, induced by Pb2+, that changes the secondary structure of a DNA duplex in asymmetrical electrolyte solution. The DNA duplex formed by the partially complementary DNA and the DNA aptamer sequence produced unzipping blockages with a dwell unzipping time of 2.84 ± 0.7 ms. Through cation-DNA interaction with Pb2+, the DNA duplex unwinds and then forms a Pb2+-stabilized DNA aptamer, which is captured and unfolded in the vestibule. The pore conductance was reduced to 54% and 94%, with a mean dwell unfolding time of 165 ± 12 ms. The competitive behavior between Pb2+ and single-stranded DNA was further utilized to detect Pb2+ in solution with a detection limit of 0.5 nM. This nanopore platform also provides a powerful tool for studying cation-DNA interactions in DNA aptamer conformational changes. Thus, the results drawn from these studies provide insights into the application of the α-hemolysin nanopore as a molecular sieve for different DNA secondary structures in future nanopore analysis. (author)

  6. An integrated approach to grey relational analysis, analytic hierarchy process and data envelopment analysis

    OpenAIRE

    Pakkar, Mohammad Sadegh

    2017-01-01

    Purpose: This paper aims to propose an integration of the analytic hierarchy process (AHP) and data envelopment analysis (DEA) methods in a multiattribute grey relational analysis (GRA) methodology in which the attribute weights are completely unknown and the attribute values take the form of fuzzy numbers. Design/methodology/approach: This research has been organized to proceed along the following steps: computing the grey relational coefficients for alternatives with respect to each attribu...
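The grey relational analysis (GRA) step named in this abstract can be sketched concretely: each alternative is compared attribute-by-attribute to an ideal reference sequence after min-max normalisation, and grey relational coefficients are averaged into a grade. This is a generic textbook-style illustration (benefit-type attributes, equal weights, the conventional distinguishing coefficient ρ = 0.5), not the paper's AHP/DEA-integrated method.

```python
# Grey relational grade sketch: higher grade = closer to the ideal reference.
def grey_relational_grades(alternatives, rho=0.5):
    """alternatives: rows of benefit-type attribute values (larger = better)."""
    cols = list(zip(*alternatives))
    # min-max normalisation per attribute column
    norm = [[(v - min(c)) / (max(c) - min(c)) for v, c in zip(row, cols)]
            for row in alternatives]
    # deviation from the ideal reference sequence (all ones after normalising)
    deltas = [[abs(1.0 - v) for v in row] for row in norm]
    dmin = min(min(r) for r in deltas)
    dmax = max(max(r) for r in deltas)
    coeff = [[(dmin + rho * dmax) / (d + rho * dmax) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]   # equal attribute weights

grades = grey_relational_grades([[3.0, 7.0], [5.0, 5.0], [4.0, 9.0]])
best = grades.index(max(grades))
```

In the paper's integrated approach, the equal weights used here would instead come from AHP, with DEA handling the unknown-weight case.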

  7. Factors influencing crime rates: an econometric analysis approach

    Science.gov (United States)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence, performed with similar quantitative analyses in mind, regarding the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state-space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. In order to define the model variables dynamically, we use statistical analysis of crime records and of records on social and economic conditions and policing characteristics (like police force and policing results - crime arrests), to determine their influence as independent variables on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving-average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the social and economic environment's conditions during previous years.

  8. A numerical approach for the analysis of deformable journal bearings

    Directory of Open Access Journals (Sweden)

    D. Benasciutti

    2012-07-01

    This paper presents a numerical approach for the analysis of hydrodynamic radial journal bearings. The effect of shaft and housing elastic deformation on the pressure distribution within the oil film is investigated. An iterative algorithm that couples the Reynolds equation with a plane finite element structural model is solved. Temperature and pressure effects on viscosity are also included through the Vogel-Barus model. The deformed lubrication gap and the overall stress state are calculated. Numerical results are presented with reference to a typical journal bearing configuration at two different inlet oil temperatures. The results show the great influence of the elastic deformation of the bearing components on the oil pressure distribution, compared with results for ideally rigid components obtained with the Raimondi and Boyd solution.

  9. Vulnerability survival analysis: a novel approach to vulnerability management

    Science.gov (United States)

    Farris, Katheryn A.; Sullivan, John; Cybenko, George

    2017-05-01

    Computer security vulnerabilities span large enterprise networks and have to be mitigated by security engineers on a routine basis. Presently, security engineers assess their "risk posture" by quantifying the number of vulnerabilities with a high Common Vulnerability Scoring System (CVSS) score. Yet little to no attention is given to the length of time for which vulnerabilities persist and survive on the network. In this paper, we review a novel approach to quantifying the length of time a vulnerability persists on the network, its time-to-death, and predictors of lower vulnerability survival rates. Our contribution is unique in that we apply the Cox proportional hazards regression model to real data from an operational IT environment. This paper provides a mathematical overview of the theory behind survival analysis methods, a description of our vulnerability data, and an interpretation of the results.
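To make the survival-analysis framing concrete: before fitting a Cox model, one typically plots the Kaplan-Meier survival curve of vulnerability lifetimes, where a still-open vulnerability at the end of observation is a censored observation. The sketch below is that estimator with invented remediation times, not the paper's Cox regression or data.

```python
# Kaplan-Meier estimator sketch for vulnerability "time-to-death".
def kaplan_meier(durations, events):
    """durations: observed times; events: 1 = patched (death), 0 = censored.
    Returns [(time, S(t))] at each distinct event time."""
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        deaths = sum(1 for d, e in zip(durations, events) if d == t and e)
        n_at_risk = sum(1 for d in durations if d >= t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
    return curve

# Hypothetical data: days until each vulnerability was patched;
# event = 0 means it was still open when observation ended (censored).
days   = [5, 5, 12, 20, 20, 30, 45]
events = [1, 1, 1,  1,  0,  1,  0]
curve = kaplan_meier(days, events)
```

The Cox model used in the paper extends this by relating the hazard to covariates (e.g., CVSS score, asset type) rather than estimating a single pooled curve.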

  10. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked on in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis-of-variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  11. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  12. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangement and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  13. Strategic Technology Investment Analysis: An Integrated System Approach

    Science.gov (United States)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results are satisfying the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  14. A Novel Synchronization-Based Approach for Functional Connectivity Analysis

    Directory of Open Access Journals (Sweden)

    Angela Lombardi

    2017-01-01

    Complex network analysis has become a gold standard for investigating functional connectivity in the human brain. Popular approaches for quantifying functional coupling between fMRI time series are linear zero-lag correlation methods; however, they may reveal only partial aspects of the functional links between brain areas. In this work, we propose a novel approach for assessing functional coupling between fMRI time series and constructing functional brain networks. A phase space framework is used to map couples of signals, exploiting their cross recurrence plots (CRPs) to compare the trajectories of the interacting systems. A synchronization metric is extracted from the CRP to assess the coupling behavior of the time series. Since the functional communities of a healthy population are expected to be highly consistent for the same task, we defined functional networks from task-related fMRI data of a cohort of healthy subjects and applied a modularity algorithm in order to determine the community structures of the networks. The within-group similarity of communities is evaluated to verify whether the new metric is robust enough against noise. The synchronization metric is also compared with Pearson's correlation coefficient, and the detected communities seem to better reflect the functional brain organization during the specific task.
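The phase-space idea behind recurrence-based coupling can be illustrated with a much-simplified proxy: delay-embed two signals and count the fraction of time points at which their embedded states fall in the same neighbourhood (joint recurrence). This is not the authors' CRP-derived metric; the embedding dimension, delay, radius eps, and the toy sine signals are all illustrative assumptions.

```python
# Simplified phase-space synchronization sketch via joint recurrence.
import math

def embed(x, dim=2, delay=1):
    """Delay-embed a 1-D signal into dim-dimensional state vectors."""
    return [x[i:i + (dim - 1) * delay + 1:delay]
            for i in range(len(x) - (dim - 1) * delay)]

def sync_metric(x, y, dim=2, delay=1, eps=0.2):
    """Fraction of time points at which the two embedded trajectories
    visit the same phase-space neighbourhood (Chebyshev radius eps)."""
    ex, ey = embed(x, dim, delay), embed(y, dim, delay)
    hits = sum(1 for a, b in zip(ex, ey)
               if max(abs(u - v) for u, v in zip(a, b)) <= eps)
    return hits / len(ex)

t = [0.2 * i for i in range(60)]
s1 = [math.sin(v) for v in t]
s2 = [math.sin(v + 0.1) for v in t]         # nearly synchronous copy
s3 = [math.sin(2.7 * v + 1.3) for v in t]   # unrelated rhythm
coupled   = sync_metric(s1, s2)
uncoupled = sync_metric(s1, s3)
```

A full CRP compares all pairs of time points across the two trajectories, not just simultaneous ones, and extracts richer diagonal-line statistics than this fraction.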

  15. What's the state of energy studies research?: A content analysis of three leading journals from 1999 to 2008

    International Nuclear Information System (INIS)

    D'Agostino, Anthony Louis; Sovacool, Benjamin K.; Trott, Kirsten; Ramos, Catherine Regalado; Saleem, Saleena; Ong, Yanchun

    2011-01-01

    We present the results of a content analysis conducted on 2502 papers written by 5318 authors published between 1999 and 2008 in three leading energy studies journals: Energy Policy, The Energy Journal, and The Electricity Journal. Our study finds that authors were most likely to be male, based in North America, possess a background in science or engineering, and affiliated with a university or research institute. Articles were likely to be written by authors working within disciplinary boundaries and using research methods from an economics/engineering background. The US was the most written about country among papers that adopted a country focus and electricity was the most frequently discussed energy source. Energy markets and public policy instruments were the most popular focus areas. According to these findings, we identify five thematic areas whose further investigation could enhance the energy studies field and increase the policy-relevance of contemporary research.

  16. Neutronic analysis of the European reference design of the water cooled lithium lead blanket for a DEMOnstration reactor

    International Nuclear Information System (INIS)

    Petrizzi, L.

    1994-01-01

    Water-cooled lithium-lead blankets, using liquid Pb-17Li eutectic both as breeder and as neutron multiplier material, and martensitic steel as structural material, represent one of the four families under development in the European DEMO blanket programme. Two concepts were proposed, both reaching tritium breeding self-sufficiency: the 'box-shaped' and the 'cylindrical modules'. To this scope a new concept has also been defined: the 'single box'. A neutronic analysis of the 'single box' is presented. A full 3-D model including the whole assembly and many of the reactor details (divertors, holes, gaps) has been defined, together with a 3-D neutron source. A tritium breeding ratio (TBR) value of 1.19 confirms the tritium breeding self-sufficiency of the design. Selected power densities, calculated for the different materials and zones, are presented here. Some considerations on the shielding capability with respect to the toroidal field coil system are also presented. (author) 10 refs.; 3 figs.; 3 tabs

  17. Impact of right-ventricular apical pacing on the optimal left-ventricular lead positions measured by phase analysis of SPECT myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Hung, Guang-Uei; Huang, Jin-Long; Lin, Wan-Yu; Tsai, Shih-Chung; Wang, Kuo-Yang; Chen, Shih-Ann; Lloyd, Michael S.; Chen, Ji

    2014-01-01

    The use of SPECT phase analysis to optimize left-ventricular (LV) lead positions for cardiac resynchronization therapy (CRT) has been performed at baseline, but CRT works as simultaneous right-ventricular (RV) and LV pacing. The aim of this study was to assess the impact of RV apical (RVA) pacing on optimal LV lead positions measured by SPECT phase analysis. This study prospectively enrolled 46 patients. Two SPECT myocardial perfusion scans were acquired under sinus rhythm with complete left bundle branch block and under RVA pacing, respectively, following a single injection of 99mTc-sestamibi. LV dyssynchrony parameters and optimal LV lead positions were measured by the phase analysis technique and then compared between the two scans. The LV dyssynchrony parameters were significantly larger with RVA pacing than with sinus rhythm (p < 0.01). In 39 of the 46 patients, the optimal LV lead positions were the same under RVA pacing and sinus rhythm (kappa = 0.861). In 6 of the remaining 7 patients, the optimal LV lead positions were along the same radial direction, but RVA pacing shifted the optimal LV lead positions toward the base. The optimal LV lead positions measured by SPECT phase analysis were consistent whether the SPECT images were acquired under sinus rhythm or RVA pacing. In some patients, RVA pacing shifted the optimal LV lead positions toward the base. This study supports the use of baseline SPECT myocardial perfusion imaging to optimize LV lead positions to increase CRT efficacy. (orig.)

  18. A scoping review of statistical approaches to the analysis of multiple health-related behaviours.

    Science.gov (United States)

    McAloney, Kareena; Graham, Hilary; Law, Catherine; Platt, Lucinda

    2013-06-01

    Smoking, diet, exercise, and alcohol are leading causes of chronic disease and premature death, and many people engage in two or more of these behaviours concurrently. This paper identifies the statistical approaches used to investigate multiple behavioural risk factors. A scoping review of papers published in English from 2000 to 2011 was conducted; the papers related to concurrent participation in at least two of the behaviours. Statistical approaches were recorded and categorised. Across 50 papers, two distinct approaches were identified. Co-occurrence analyses focused on concurrent but independent behaviours, represented by the prevalence of behavioural combinations and/or by summing behaviours into risk indexes. Clustering analyses investigated underlying associations between the concurrent behaviours, with clustering identified by divergences between observed and expected prevalence of combinations or through identification of latent or unobservable clusters. Co-occurrence was more frequently reported, but the use of clustering techniques and, in particular, cluster-analytic and latent variable techniques increased across the study period. The two approaches investigate concurrent participation in multiple health behaviours but differ in conceptualisation and analysis. Despite these differences, inconsistency in the terminology describing the study of multiple health behaviours was apparent, with the potential to influence understandings of concurrent health behaviours in policy and practice. Copyright © 2013 Elsevier Inc. All rights reserved.
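The "divergence between observed and expected prevalence" test for clustering described above reduces to an observed/expected (O/E) ratio: if two behaviours were independent, the expected prevalence of the combination is the product of the individual prevalences, and O/E > 1 suggests clustering. The records below are invented to illustrate the computation.

```python
# Observed/expected prevalence ratio sketch for two health behaviours.
def observed_expected_ratio(records, a, b):
    """records: list of dicts of 0/1 behaviour indicators."""
    n = len(records)
    observed = sum(1 for r in records if r[a] and r[b]) / n
    expected = (sum(r[a] for r in records) / n) * \
               (sum(r[b] for r in records) / n)
    return observed / expected

people = [
    {"smokes": 1, "drinks": 1}, {"smokes": 1, "drinks": 1},
    {"smokes": 1, "drinks": 0}, {"smokes": 0, "drinks": 0},
    {"smokes": 0, "drinks": 0}, {"smokes": 0, "drinks": 1},
]
ratio = observed_expected_ratio(people, "smokes", "drinks")
# ratio > 1 indicates the pair co-occurs more often than independence predicts
```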

  19. Hybrid Approach of Aortic Diseases: Zone 1 Delivery and Volumetric Analysis on the Descending Aorta.

    Science.gov (United States)

    Duncan, José Augusto; Dias, Ricardo Ribeiro; Dinato, Fabrício José; Fernandes, Fábio; Ramirez, Félix José Álvares; Mady, Charles; Jatene, Fabio Biscegli

    2017-01-01

    Conventional techniques for the surgical correction of arch and descending aortic diseases remain high-risk procedures. Endovascular treatments of the abdominal and descending thoracic aorta have lower surgical risk. The evolution of both techniques - open debranching of the arch and an endovascular approach to the descending aorta - may extend a less invasive endovascular treatment to a more extensive disease requiring a proximal landing zone in the arch. To evaluate descending thoracic aortic remodeling by means of volumetric analysis after a hybrid approach of aortic arch debranching and stenting of the descending aorta. Retrospective review of seven consecutive patients treated between September 2014 and August 2016 for diseases of the proximal descending aorta (aneurysms and dissections) by a hybrid approach to deliver the endograft at zone 1. Computed tomography angiographies were analyzed using specific software to calculate descending thoracic aorta volumes pre- and postoperatively. Follow-up was completed in 100% of patients, with a median time of 321 days (range, 41-625 days). No deaths or permanent neurological complications were observed. There were no endoleaks or stent migrations. Freedom from reintervention was 100% at 300 days and 66% at 600 days. Median volume reduction was 45.5 cm3, representing a median volume shrinkage of 9.3%. The hybrid approach to arch and descending thoracic aorta diseases is feasible and leads to favorable aortic remodeling with significant volume reduction.

  20. Hybrid Approach of Aortic Diseases: Zone 1 Delivery and Volumetric Analysis on the Descending Aorta

    Directory of Open Access Journals (Sweden)

    José Augusto Duncan

    Introduction: Conventional techniques for the surgical correction of arch and descending aortic diseases remain high-risk procedures. Endovascular treatments of the abdominal and descending thoracic aorta have lower surgical risk. The evolution of both techniques - open debranching of the arch and an endovascular approach to the descending aorta - may extend a less invasive endovascular treatment to a more extensive disease requiring a proximal landing zone in the arch. Objective: To evaluate descending thoracic aortic remodeling by means of volumetric analysis after a hybrid approach of aortic arch debranching and stenting of the descending aorta. Methods: Retrospective review of seven consecutive patients treated between September 2014 and August 2016 for diseases of the proximal descending aorta (aneurysms and dissections) by a hybrid approach to deliver the endograft at zone 1. Computed tomography angiographies were analyzed using specific software to calculate descending thoracic aorta volumes pre- and postoperatively. Results: Follow-up was completed in 100% of patients, with a median time of 321 days (range, 41-625 days). No deaths or permanent neurological complications were observed. There were no endoleaks or stent migrations. Freedom from reintervention was 100% at 300 days and 66% at 600 days. Median volume reduction was 45.5 cm3, representing a median volume shrinkage of 9.3%. Conclusion: The hybrid approach to arch and descending thoracic aorta diseases is feasible and leads to favorable aortic remodeling with significant volume reduction.

  1. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    Science.gov (United States)

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of

  2. Development of a novel kinetic model for the analysis of PAH biodegradation in the presence of lead and cadmium co-contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Deary, Michael E., E-mail: michael.deary@northumbria.ac.uk [Department of Geography,Faculty of Engineering and Environment, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom); Ekumankama, Chinedu C. [Department of Geography,Faculty of Engineering and Environment, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom); Cummings, Stephen P. [Faculty of Health and Life Sciences, Northumbria University, Ellison Building, Newcastle upon Tyne NE1 8ST (United Kingdom)

    2016-04-15

    Highlights: • 40-week study of the biodegradation of the 16 US EPA priority PAHs in a soil with high organic matter. • Effects of cadmium, lead and mercury co-contaminants studied. • Novel kinetic approach developed. • Biodegradation of lower molecular weight PAHs relatively unaffected by Cd or Pb. • Soil organic matter plays a key role in the PAH removal mechanism. - Abstract: We report on the results of a 40-week study in which the biodegradation of the 16 US EPA polycyclic aromatic hydrocarbons (PAHs) was followed in microcosms containing soil of high organic carbon content (11%) in the presence and absence of lead and cadmium co-contaminants. The total spiked PAH concentration was 2166 mg/kg. A mercury amendment was also made to give an abiotic control. A novel kinetic model has been developed to explain the observed biphasic nature of PAH degradation. The model assumes that PAHs are distributed across soil phases of varying degrees of bioaccessibility. The results of the analysis suggest that the overall percentage PAH loss depends on the respective rates at which the PAHs (a) are biodegraded by soil microorganisms in pore water and bioaccessible soil phases and (b) migrate from bioaccessible to non-bioaccessible soil phases. In addition, migration of PAHs to non-bioaccessible and non-Soxhlet-extractable soil phases associated with the humin pores gives rise to an apparent removal process. The presence of metal co-contaminants shows a concentration-dependent inhibition of the biological degradation processes that results in a reduction in overall degradation. Lead appears to have a marginally greater inhibitory effect than cadmium.
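The biphasic behaviour described in the abstract can be captured by a minimal two-pool first-order sketch: PAH in a bioaccessible pool is simultaneously biodegraded (kdeg) and transferred to a non-bioaccessible pool (kmig), producing a fast initial loss that levels off. This is an illustration of the general idea only; the rate constants, initial concentration, and time point below are invented, not the paper's fitted values.

```python
# Two-pool first-order kinetic sketch of biphasic PAH loss (invented rates).
import math

def biphasic(c0, kdeg, kmig, t):
    """Return (bioaccessible, sequestered) PAH concentrations at time t,
    with parallel first-order biodegradation (kdeg) and migration (kmig)."""
    k = kdeg + kmig
    accessible  = c0 * math.exp(-k * t)
    sequestered = c0 * (kmig / k) * (1.0 - math.exp(-k * t))
    return accessible, sequestered

c0, kdeg, kmig = 100.0, 0.15, 0.05        # mg/kg and 1/week, illustrative
acc, seq = biphasic(c0, kdeg, kmig, 40)   # after 40 weeks
total_remaining = acc + seq               # sequestered PAH is not degraded
```

At long times the remaining fraction approaches kmig/(kdeg + kmig), which is how sequestration caps the overall percentage loss in this kind of model.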

  3. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly given the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events. This leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
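Multiscale entropy, the feature the authors feed to a random forest, coarse-grains a signal at several scales and computes sample entropy at each scale; regular signals score low and irregular ones high. The sketch below is a generic textbook-style implementation: the tolerance r, embedding length m, scales, and toy signals are illustrative choices, not the paper's settings.

```python
# Multiscale entropy (MSE) sketch: coarse-grain, then sample entropy per scale.
import math
import random

def coarse_grain(x, scale):
    """Non-overlapping window averages at the given scale."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=0.2):
    """-ln(A/B), where B and A count template matches of length m and m+1."""
    def matches(mm):
        tpl = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(1 for i in range(len(tpl)) for j in range(i + 1, len(tpl))
                   if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)

def mse(x, scales=(1, 2, 3)):
    return [sample_entropy(coarse_grain(x, s)) for s in scales]

random.seed(0)
regular = [0.0, 1.0] * 100                        # perfectly periodic
noisy = [random.random() for _ in range(200)]     # irregular
mse_regular, mse_noisy = mse(regular), mse(noisy)
```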

  4. Investigation of the leading and subleading high-energy behavior of hadron-hadron total cross sections using a best-fit analysis of hadronic scattering data

    Science.gov (United States)

    Giordano, M.; Meggiolaro, E.; Silva, P. V. R. G.

    2017-08-01

    In the present investigation we study the leading and subleading high-energy behavior of hadron-hadron total cross sections using a best-fit analysis of hadronic scattering data. The parametrization used for the hadron-hadron total cross sections at high energy is inspired by recent results obtained by Giordano and Meggiolaro [J. High Energy Phys. 03 (2014) 002, 10.1007/JHEP03(2014)002] using a nonperturbative approach in the framework of QCD, and it reads σ_tot ∼ B ln²(s) + C ln(s) ln(ln(s)). We critically investigate whether B and C can be obtained by means of best fits to data for proton-proton and antiproton-proton scattering, including recent data obtained at the LHC, and also to data for other meson-baryon and baryon-baryon scattering processes. In particular, following the above-mentioned nonperturbative QCD approach, we also consider fits where the parameters B and C are set to B = κB_th and C = κC_th, where B_th and C_th are universal quantities related to the QCD stable spectrum, while κ (treated as an extra free parameter) is related to the asymptotic value of the ratio σ_el/σ_tot. Different possible scenarios are then considered and compared.
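
    The parametrization σ_tot ∼ B ln²(s) + C ln(s) ln(ln(s)) is linear in B and C, so the best-fit step can be sketched as ordinary least squares. The energy grid, "true" parameter values, and noise level below are invented for illustration; they are not the paper's fitted values, and a real fit would weight points by their experimental uncertainties.

    ```python
    import numpy as np

    # Synthetic best fit of sigma_tot(s) = B ln^2(s) + C ln(s) ln(ln(s)).
    rng = np.random.default_rng(1)
    s = np.logspace(2, 8, 40)                    # "energies" s, illustrative units
    B_true, C_true = 0.25, -0.10                 # invented parameter values
    L = np.log(s)
    X = np.column_stack([L**2, L * np.log(L)])   # model is linear in (B, C)
    sigma = X @ np.array([B_true, C_true]) + rng.normal(0.0, 0.01, s.size)
    (B_fit, C_fit), *_ = np.linalg.lstsq(X, sigma, rcond=None)
    ```

    The free-fit (B_fit, C_fit) could then be compared against the constrained choice B = κB_th, C = κC_th, as the paper does.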

  5. Proteomic analysis of serum of workers occupationally exposed to arsenic, cadmium, and lead for biomarker research: A preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Kossowska, Barbara, E-mail: barbara@immchem.am.wroc.pl [Department of Chemistry and Immunochemistry, Wroclaw Medical University, Bujwida 44a, 50-345 Wroclaw (Poland); Dudka, Ilona, E-mail: ilona.dudka@pwr.wroc.pl [Medicinal Chemistry and Microbiology Group, Department of Chemistry, Wroclaw University of Technology, Wybrzeze Wyspianskiego 27, 50-370 Wroclaw (Poland); Bugla-Ploskonska, Gabriela, E-mail: gabriela.bugla-ploskonska@microb.uni.wroc.pl [Department of Microbiology, Institute of Genetics and Microbiology, University of Wroclaw, Przybyszewskiego 63/77, 51-148 Wroclaw (Poland); Szymanska-Chabowska, Anna, E-mail: aszyman@mp.pl [Department of Internal and Occupational Medicine, Wroclaw Medical University, Wybrzeze L. Pasteura 4, 50-367 Wroclaw (Poland); Doroszkiewicz, Wlodzimierz, E-mail: wlodzimierz.doroszkiewicz@microb.uni.wroc.pl [Department of Microbiology, Institute of Genetics and Microbiology, University of Wroclaw, Przybyszewskiego 63/77, 51-148 Wroclaw (Poland); Gancarz, Roman, E-mail: roman.gancarz@pwr.wroc.pl [Medicinal Chemistry and Microbiology Group, Department of Chemistry, Wroclaw University of Technology, Wybrzeze Wyspianskiego 27, 50-370 Wroclaw (Poland); Andrzejak, Ryszard, E-mail: ryszard@chzaw.am.wroc.pl [Department of Internal and Occupational Medicine, Wroclaw Medical University, Wybrzeze L. Pasteura 4, 50-367 Wroclaw (Poland); Antonowicz-Juchniewicz, Jolanta, E-mail: jola@chzaw.am.wroc.pl [Department of Internal and Occupational Medicine, Wroclaw Medical University, Wybrzeze L. Pasteura 4, 50-367 Wroclaw (Poland)

    2010-10-15

    The main factor of environmental contamination is the presence of the heavy metals lead, cadmium, and arsenic. The aim of serum protein profile analysis of people chronically exposed to heavy metals is to find protein markers of early pathological changes. The study was conducted in a group of 389 healthy men working in a copper foundry and 45 age-matched non-exposed healthy men. Toxicological test samples included whole blood, serum, and urine. Thirty-seven clinical parameters were measured. Based on the parameter values of the healthy volunteers, the centroid in 37-dimensional space was calculated. The individuals in the metal-exposed and control groups were ordered based on the Euclidean distance from the centroid defined by the first component according to Principal Component Analysis (PCA). Serum samples of two individuals, one from the control and one from the metal-exposed group, were chosen for proteomic analysis. In optimized conditions of two-dimensional gel electrophoresis (2-DE), two protein maps were obtained representing both groups. Twenty-eight corresponding protein spots from both protein maps were chosen and identified based on PDQuest analysis and the SWISS-2DPAGE database. From a panel of six proteins with differences in expression greater than a factor of two, three potential markers with the highest differences were selected: hemoglobin-spot 26 (pI 7.05, Mw 10.53), unidentified protein-spot 27 (pI 6.73, Mw 10.17), and unidentified protein-spot 25 (pI 5.75, Mw 12.07). Further studies are required to confirm the results obtained so far. The identified proteins could serve as potential markers of preclinical changes and could in the future be included in the biomonitoring of people exposed to heavy metals.
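
    The centroid-and-distance ordering described above can be sketched in a few lines: z-score each parameter against the control group, then rank subjects by Euclidean distance from the control centroid. The group sizes match the study (45 controls, 389 exposed, 37 parameters), but the data and the 0.5 shift in the exposed group are synthetic.

    ```python
    import numpy as np

    # Synthetic clinical-parameter matrices; real data would replace these.
    rng = np.random.default_rng(0)
    controls = rng.normal(0.0, 1.0, (45, 37))
    exposed = rng.normal(0.5, 1.0, (389, 37))

    # z-score every parameter against the control group, then measure each
    # subject's Euclidean distance from the control centroid.
    mu, sd = controls.mean(axis=0), controls.std(axis=0)
    centroid = ((controls - mu) / sd).mean(axis=0)        # ~0 by construction
    dist_exposed = np.linalg.norm((exposed - mu) / sd - centroid, axis=1)
    dist_controls = np.linalg.norm((controls - mu) / sd - centroid, axis=1)
    order = np.argsort(dist_exposed)   # exposed subjects, most control-like first
    ```

    The study instead measured distance along the first PCA component; the sketch above uses the full 37-dimensional distance for brevity.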

  6. Proteomic analysis of serum of workers occupationally exposed to arsenic, cadmium, and lead for biomarker research: A preliminary study

    International Nuclear Information System (INIS)

    Kossowska, Barbara; Dudka, Ilona; Bugla-Ploskonska, Gabriela; Szymanska-Chabowska, Anna; Doroszkiewicz, Wlodzimierz; Gancarz, Roman; Andrzejak, Ryszard; Antonowicz-Juchniewicz, Jolanta

    2010-01-01

    The main factor of environmental contamination is the presence of the heavy metals lead, cadmium, and arsenic. The aim of serum protein profile analysis of people chronically exposed to heavy metals is to find protein markers of early pathological changes. The study was conducted in a group of 389 healthy men working in a copper foundry and 45 age-matched non-exposed healthy men. Toxicological test samples included whole blood, serum, and urine. Thirty-seven clinical parameters were measured. Based on the parameter values of the healthy volunteers, the centroid in 37-dimensional space was calculated. The individuals in the metal-exposed and control groups were ordered based on the Euclidean distance from the centroid defined by the first component according to Principal Component Analysis (PCA). Serum samples of two individuals, one from the control and one from the metal-exposed group, were chosen for proteomic analysis. In optimized conditions of two-dimensional gel electrophoresis (2-DE), two protein maps were obtained representing both groups. Twenty-eight corresponding protein spots from both protein maps were chosen and identified based on PDQuest analysis and the SWISS-2DPAGE database. From a panel of six proteins with differences in expression greater than a factor of two, three potential markers with the highest differences were selected: hemoglobin-spot 26 (pI 7.05, Mw 10.53), unidentified protein-spot 27 (pI 6.73, Mw 10.17), and unidentified protein-spot 25 (pI 5.75, Mw 12.07). Further studies are required to confirm the results obtained so far. The identified proteins could serve as potential markers of preclinical changes and could in the future be included in the biomonitoring of people exposed to heavy metals.

  7. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Tewfik Ahmed H

    2006-01-01

    Full Text Available Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.
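
    As a small illustration of one bicluster type from the abstract, a submatrix has "coherent values" under an additive model a_ij = μ + r_i + c_j, which can be tested by checking that the residual after removing row and column means vanishes. This is a generic check using basic linear algebra, not the authors' algorithm.

    ```python
    import numpy as np

    def is_coherent_bicluster(A, tol=1e-8):
        """True if every entry fits the additive model a_ij = mu + r_i + c_j,
        i.e. the residual after removing row and column means vanishes."""
        A = np.asarray(A, dtype=float)
        resid = (A - A.mean(axis=1, keepdims=True)
                   - A.mean(axis=0, keepdims=True) + A.mean())
        return float(np.abs(resid).max()) <= tol

    rows = np.array([[0.0], [1.0], [3.0]])
    cols = np.array([[0.0, 2.0, 5.0]])
    coherent = rows + cols                          # additive -> coherent bicluster
    noisy = coherent + np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]])
    ```

    Constant-value biclusters, and those constant on rows or columns, are special cases of the same additive model.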

  8. Network approaches to the functional analysis of microbial proteins.

    Science.gov (United States)

    Hallinan, J S; James, K; Wipat, A

    2011-01-01

    Large amounts of detailed biological data have been generated over the past few decades. Much of these data are freely available in over 1000 online databases: an enticing but frustrating resource for microbiologists interested in a systems-level view of the structure and function of microbial cells. The frustration engendered by the need to trawl manually through hundreds of databases in order to accumulate information about a gene, protein, pathway, or organism of interest can be alleviated by the use of computational data integration to generate network views of the system of interest. Biological networks can be constructed from a single type of data, such as protein-protein binding information, or from data generated by multiple experimental approaches. In an integrated network, nodes usually represent genes or gene products, while edges represent some form of interaction between the nodes. Edges between nodes may be weighted to represent the probability that the edge exists in vivo. Networks may also be enriched with ontological annotations, facilitating both visual browsing and computational analysis via web service interfaces. In this review, we describe the construction and analysis of both single-data-source and integrated networks, and their application to the inference of protein function in microbes. Copyright © 2011 Elsevier Ltd. All rights reserved.
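
    The idea of inferring protein function from a weighted interaction network can be sketched as simple guilt-by-association: an unannotated node inherits the annotation with the largest total edge weight among its neighbours. The gene names, annotations, and weights below are invented for illustration.

    ```python
    # Tiny undirected weighted interaction network (hypothetical).
    edges = {
        ("yfiA", "rpsB"): 0.9,
        ("yfiA", "rpoA"): 0.4,
        ("yfiA", "ftsZ"): 0.2,
    }
    annotation = {"rpsB": "translation", "rpoA": "transcription", "ftsZ": "division"}

    def predict_function(node):
        """Return the neighbour annotation with the largest summed edge weight."""
        votes = {}
        for (a, b), w in edges.items():
            if node in (a, b):
                label = annotation.get(b if a == node else a)
                if label is not None:
                    votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get) if votes else None
    ```

    Real integrated networks add evidence-specific weighting and ontology-aware label propagation, but the neighbour-vote principle is the same.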

  9. A Tox21 Approach to Altered Epigenetic Landscapes: Assessing Epigenetic Toxicity Pathways Leading to Altered Gene Expression and Oncogenic Transformation In Vitro

    Directory of Open Access Journals (Sweden)

    Craig L. Parfett

    2017-06-01

    Full Text Available An emerging vision for toxicity testing in the 21st century foresees in vitro assays assuming the leading role in testing for chemical hazards, including testing for carcinogenicity. Toxicity will be determined by monitoring key steps in functionally validated molecular pathways, using tests designed to reveal chemically-induced perturbations that lead to adverse phenotypic endpoints in cultured human cells. Risk assessments would subsequently be derived from the causal in vitro endpoints and concentration vs. effect data extrapolated to human in vivo concentrations. Much direct experimental evidence now shows that disruption of epigenetic processes by chemicals is a carcinogenic mode of action that leads to altered gene functions playing causal roles in cancer initiation and progression. In assessing chemical safety, it would therefore be advantageous to consider an emerging class of carcinogens, the epigenotoxicants, with the ability to change chromatin and/or DNA marks by direct or indirect effects on the activities of enzymes (writers, erasers/editors, remodelers and readers) that convey the epigenetic information. Evidence is reviewed supporting a strategy for in vitro hazard identification of carcinogens that induce toxicity through disturbance of functional epigenetic pathways in human somatic cells, leading to inactivated tumour suppressor genes and carcinogenesis. In the context of human cell transformation models, these in vitro pathway measurements ensure high biological relevance to the apical endpoint of cancer. Four causal mechanisms participating in pathways to persistent epigenetic gene silencing were considered: covalent histone modification, nucleosome remodeling, non-coding RNA interaction and DNA methylation. Within these four interacting mechanisms, 25 epigenetic toxicity pathway components (SET1, MLL1, KDM5, G9A, SUV39H1, SETDB1, EZH2, JMJD3, CBX7, CBX8, BMI, SUZ12, HP1, MPP8, DNMT1, DNMT3A, DNMT3B, TET1, MeCP2, SETDB2, BAZ2

  10. A Tox21 Approach to Altered Epigenetic Landscapes: Assessing Epigenetic Toxicity Pathways Leading to Altered Gene Expression and Oncogenic Transformation In Vitro.

    Science.gov (United States)

    Parfett, Craig L; Desaulniers, Daniel

    2017-06-01

    An emerging vision for toxicity testing in the 21st century foresees in vitro assays assuming the leading role in testing for chemical hazards, including testing for carcinogenicity. Toxicity will be determined by monitoring key steps in functionally validated molecular pathways, using tests designed to reveal chemically-induced perturbations that lead to adverse phenotypic endpoints in cultured human cells. Risk assessments would subsequently be derived from the causal in vitro endpoints and concentration vs. effect data extrapolated to human in vivo concentrations. Much direct experimental evidence now shows that disruption of epigenetic processes by chemicals is a carcinogenic mode of action that leads to altered gene functions playing causal roles in cancer initiation and progression. In assessing chemical safety, it would therefore be advantageous to consider an emerging class of carcinogens, the epigenotoxicants, with the ability to change chromatin and/or DNA marks by direct or indirect effects on the activities of enzymes (writers, erasers/editors, remodelers and readers) that convey the epigenetic information. Evidence is reviewed supporting a strategy for in vitro hazard identification of carcinogens that induce toxicity through disturbance of functional epigenetic pathways in human somatic cells, leading to inactivated tumour suppressor genes and carcinogenesis. In the context of human cell transformation models, these in vitro pathway measurements ensure high biological relevance to the apical endpoint of cancer. Four causal mechanisms participating in pathways to persistent epigenetic gene silencing were considered: covalent histone modification, nucleosome remodeling, non-coding RNA interaction and DNA methylation. Within these four interacting mechanisms, 25 epigenetic toxicity pathway components (SET1, MLL1, KDM5, G9A, SUV39H1, SETDB1, EZH2, JMJD3, CBX7, CBX8, BMI, SUZ12, HP1, MPP8, DNMT1, DNMT3A, DNMT3B, TET1, MeCP2, SETDB2, BAZ2A, UHRF1, CTCF

  11. A novel approach for system change pathway analysis

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2016-03-01

    Full Text Available This paper is directed toward presenting a novel approach based on "consolidity charts" for the analysis of natural and man-made systems during their change pathway or course of life. The physical significance of the consolidity chart (region) is that it marks the boundary of all system interactive behavior resulting from all exhaustive internal and external influences. For instance, at a specific event state, the corresponding consolidity region describes all the plausible points of normalized input–output (fuzzy or non-fuzzy) interactions. These charts are developed at each event step for zone scaling of system parameter changes due to affected events or varying environments "on and above" their normal operation or set points, following the "time driven-event driven-parameters change" paradigm. Examples of the consolidity trajectory movement in the regions or pattern centers of the proposed charts are developed for various consolidity classes, showing situations of change pathways from the unconsolidated form to the consolidated one and vice versa. It is shown that region comparisons are based on the geometric shape properties of the consolidity regions. Moreover, it is illustrated that the centerlines connecting consolidity regions during the change pathway can follow certain trajectories, designated as "consolidity pathway trajectories", which may assume various forms, including zigzagging patterns, depending on the consecutive affecting influences. Implementation procedures are elaborated for the consolidity chart analysis of four real-life case studies during their conventional and unconventional change pathways, describing: (i) the drug concentration production problem, (ii) the prey–predator population problem, (iii) the spread of infectious disease problem and (iv) the HIV/AIDS epidemic problem. These solved case studies lucidly demonstrate the applicability and effectiveness of the suggested approach.

  12. Three-dimensional thermal-structural analysis of a swept cowl leading edge subjected to skewed shock-shock interference heating

    Science.gov (United States)

    Polesky, Sandra P.; Dechaumphai, Pramote; Glass, Christopher E.; Pandey, Ajay K.

    1990-01-01

    A three-dimensional flux-based thermal analysis method has been developed and its capability is demonstrated by predicting the transient nonlinear temperature response of a swept cowl leading edge subjected to intense three-dimensional aerodynamic heating. The predicted temperature response from the transient thermal analysis is used in a linear elastic structural analysis to determine thermal stresses. Predicted thermal stresses are compared with those obtained from a two-dimensional analysis which represents conditions along the chord where maximum heating occurs. Results indicate a need for a three-dimensional analysis to predict accurately the leading edge thermal stress response.

  13. Nonlinear Dynamic Analysis of RC Shear Walls using Damage Mechanics Approach Considering Bond-Slip Effects

    Directory of Open Access Journals (Sweden)

    N. Davoodi

    2015-07-01

    Full Text Available In this research, nonlinear dynamic analysis of a concrete shear wall using a new nonlinear model based on a damage mechanics approach and considering bond-slip effects is presented. The nonlinear behavior of concrete is modeled by a rotational smeared crack model using a damage mechanics approach. The proposed model considers the major characteristics of concrete subjected to two- and three-dimensional loading conditions: pre-softening behavior, softening initiation criteria and fracture energy conservation. The model was verified against available numerical tests before being used in the current analysis. Reinforcement is modeled by a bilinear relationship using two models: a discrete truss steel element and a smeared model. In the discrete model, bond-slip effects between concrete and rebar are captured using the bond-link element concept. Based on the presented algorithms and methodology, an FEM code was developed in FORTRAN. The validity of the proposed models and numerical algorithms was checked against available experimental results. Finally, numerical simulation of the CAMUS I and CAMUS III reinforced concrete shear walls was carried out. Comparison of the deduced results confirms the validity of the proposed models. The obtained results, both in the expected displacements and crack profiles for the walls, show good accuracy with respect to the experimental results. Also, using the discrete truss element model rather than the smeared steel model leads to a 7% improvement in the accuracy of the maximum displacement response.

  14. ANALYSIS, SELECTION AND RANKING OF FOREIGN MARKETS. A COMPREHENSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    LIVIU NEAMŢU

    2013-12-01

    Full Text Available Choosing the appropriate markets for growth and development is essential for a company that wishes to expand its business through international economic exchanges. In this case, however, foreign market research alone is not sufficient, even though it is an important part of the decision process and an indispensable condition for achieving the firm's objectives. In marketing on the national market, the market is already defined and requires no more than prospection and segmentation; on the international market, by contrast, the research process must be preceded by a selection of markets and their classification. Companies with this intention know little or nothing about the conditions offered by one new market or another. Therefore, they must go, step by step, through a complex, multilevel analysis process composed of the selection and ranking of markets, followed by proper research through exploration and segmentation, which can lead to choosing the most profitable markets. In this regard, within this study, we propose a multi-criteria model for the selection and ranking of international development markets, allowing companies access to those markets which comply with the company's development strategy.
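
    A multi-criteria selection-and-ranking step of the kind proposed can be sketched as a weighted-sum score. The criteria, weights (negative for cost-type criteria), and normalized market values below are all invented, since the abstract does not list the model's actual criteria.

    ```python
    # Hypothetical criteria weights; negative weight = criterion to minimize.
    criteria_weights = {"market_size": 0.4, "growth": 0.3, "risk": -0.2, "entry_cost": -0.1}
    markets = {
        "Market A": {"market_size": 0.8, "growth": 0.6, "risk": 0.3, "entry_cost": 0.5},
        "Market B": {"market_size": 0.5, "growth": 0.9, "risk": 0.6, "entry_cost": 0.2},
        "Market C": {"market_size": 0.3, "growth": 0.2, "risk": 0.1, "entry_cost": 0.1},
    }

    def score(name):
        """Weighted sum over criteria; higher means a more attractive market."""
        return sum(w * markets[name][c] for c, w in criteria_weights.items())

    ranking = sorted(markets, key=score, reverse=True)   # best market first
    ```

    The top-ranked markets would then go forward to the detailed exploration and segmentation stage described above.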

  15. Assessment of lead pollution in topsoils of a southern Italy area: Analysis of urban and peri-urban environment.

    Science.gov (United States)

    Guagliardi, Ilaria; Cicchella, Domenico; De Rosa, Rosanna; Buttafuoco, Gabriele

    2015-07-01

    Exposure to lead (Pb) may adversely affect human health. Mapping soil Pb contents is essential to obtain a quantitative estimate of the potential risk of Pb contamination. The main aim of this paper was to determine the soil Pb concentrations in the urban and peri-urban area of Cosenza-Rende, to map their spatial distribution, and to assess the probability that soil Pb concentration exceeds a critical threshold that might cause concern for human health. Samples were collected at 149 locations from residual and non-residual topsoil in gardens, parks, flower-beds, and agricultural fields. The fine earth fraction of the soil samples was analyzed by X-ray fluorescence spectrometry. Stochastic images generated by sequential Gaussian simulation were jointly combined to calculate the probability of exceeding the critical threshold, which could be used to delineate the potentially risky areas. Results showed areas in which Pb concentration values were higher than the Italian regulatory values. These polluted areas were quite large and could likely create a significant health risk for human beings and vegetation in the near future. The results demonstrated that the proposed approach can be used to study soil contamination, produce geochemical maps, and identify hot-spot areas of soil Pb concentration. Copyright © 2015. Published by Elsevier B.V.
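
    The probability-of-exceedance step can be illustrated without reproducing a full sequential Gaussian simulation (which would honour the 149 conditioning samples and a fitted variogram): given a stack of stochastic realizations, the exceedance probability at each location is simply the fraction of realizations above the threshold. All numbers below are invented.

    ```python
    import numpy as np

    # Stand-in for SGS output: lognormal "realizations" of Pb concentration.
    rng = np.random.default_rng(42)
    n_locations, n_realizations = 149, 200
    realizations = rng.lognormal(mean=4.0, sigma=0.6,
                                 size=(n_realizations, n_locations))
    threshold = 100.0                                    # illustrative limit, mg/kg
    p_exceed = (realizations > threshold).mean(axis=0)   # per-location probability
    risky = np.flatnonzero(p_exceed > 0.8)               # flag high-probability spots
    ```

    Mapping p_exceed over the sampling grid gives the risk map used to delineate potentially risky areas.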

  16. An analysis of the impact of LHC Run I proton-lead data on nuclear parton densities

    Energy Technology Data Exchange (ETDEWEB)

    Armesto, Nestor; Penin, Jose Manuel; Salgado, Carlos A.; Zurita, Pia [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Galicia (Spain); Paukkunen, Hannu [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Galicia (Spain); University of Jyvaeskylae, Department of Physics, P.O. Box 35, Jyvaeskylae (Finland); University of Helsinki, Helsinki Institute of Physics, P.O. Box 64, Helsinki (Finland)

    2016-04-15

    We report on an analysis of the impact of available experimental data on hard processes in proton-lead collisions during Run I at the Large Hadron Collider on nuclear modifications of parton distribution functions. Our analysis is restricted to the EPS09 and DSSZ global fits. The measurements that we consider comprise production of massive gauge bosons, jets, charged hadrons and pions. This is the first time that a study of nuclear PDFs has included such a number of different observables. The goal of the paper is twofold: (i) checking the description of the data by nPDFs, as well as the relevance of these nuclear effects, in a quantitative manner; (ii) testing the constraining power of these data in eventual global fits, for which we use the Bayesian reweighting technique. We find an overall good, even too good, description of the data, indicating that more constraining power would require better control over the systematic uncertainties and/or the proper proton-proton reference from LHC Run II. Some of the observables, however, show sizeable tension with specific choices of proton and nuclear PDFs. We also comment on the corresponding improvements as regards the theoretical treatment. (orig.)
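
    The Bayesian reweighting technique mentioned above can be sketched in its simplest form: each PDF replica receives a weight proportional to exp(−χ²/2) against the new data (one simple weighting choice from the reweighting literature), and observables are recomputed as weighted averages, so the data update the ensemble without a refit. The replica spread and the measurement below are synthetic.

    ```python
    import numpy as np

    # Synthetic replica predictions for a single observable.
    rng = np.random.default_rng(3)
    n_rep = 1000
    theory = rng.normal(1.0, 0.1, n_rep)   # replica predictions
    data, err = 0.95, 0.05                 # invented measurement and uncertainty

    chi2 = ((theory - data) / err) ** 2
    w = np.exp(-0.5 * chi2)                # weight each replica by its agreement
    w /= w.sum()
    reweighted = float(np.sum(w * theory))         # posterior mean of the observable
    n_eff = float(np.exp(-np.sum(w * np.log(w))))  # effective number of replicas
    ```

    A small n_eff relative to n_rep signals that the new data carry strong constraining power (or tension) with respect to the prior ensemble.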

  17. Toxicokinetics of bone lead.

    Science.gov (United States)

    Rabinowitz, M B

    1991-02-01

    This article discusses bone as a source of lead to the rest of the body and as a record of past lead exposure. Bone lead levels generally increase with age at rates dependent on the skeletal site and lead exposure. After occupational exposure, the slow decline in blood lead, a 5- to 19-year half-life, reflects the long skeletal half-life. Repeated measurements of bone lead demonstrate the slow elimination of lead from bone. Stable isotope ratios have revealed many details of skeletal uptake and subsequent release. The bulk turnover rates for compact bone are about 2% per year and 8% for spine. Turnover activity varies with age and health. Even though lead approximates calcium, radium, strontium, barium, fluorine, and other bone seekers, the rates for each are different. A simple, two-pool (bone and blood) kinetic model is presented with proposed numerical values for the changes in blood lead levels that occur with changes in turnover rates. Two approaches are offered to further quantify lead turnover. One involves a study of subjects with known past exposure. Changes in the ratio of blood lead to bone lead with time would reflect the course of bone lead availability. Also, stable isotopes and subjects who move from one geographical area to another offer opportunities. Sequential isotope measurements would indicate how much of the lead in blood is from current exposure or bone stores, distinct from changes in absorption or excretion.
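
    The two-pool (blood and bone) kinetic model described can be sketched as two coupled linear rate equations integrated with forward-Euler steps. The rate constants and initial burdens below are invented for illustration; only the structure (fast blood turnover, slow bone release) follows the article.

    ```python
    # Invented rate constants, per day.
    k_blood_bone = 0.01    # uptake from blood into bone
    k_bone_blood = 0.0002  # slow release from bone back to blood
    k_excrete = 0.02       # excretion from blood

    blood, bone = 100.0, 1000.0   # arbitrary initial lead burdens
    dt, days = 1.0, 5 * 365
    blood_hist = []
    for _ in range(int(days / dt)):
        d_blood = k_bone_blood * bone - (k_blood_bone + k_excrete) * blood
        d_bone = k_blood_bone * blood - k_bone_blood * bone
        blood += d_blood * dt
        bone += d_bone * dt
        blood_hist.append(blood)
    ```

    After the initial fast decline, blood lead is sustained almost entirely by the slow bone-release term, which is why the post-exposure blood half-life tracks the long skeletal half-life.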

  18. Does a higher glycemic level lead to a higher rate of dental implant failure?: A meta-analysis.

    Science.gov (United States)

    Shi, Quan; Xu, Juan; Huo, Na; Cai, Chuan; Liu, Hongchen

    2016-11-01

    Owing to limited evidence, it is unclear whether diabetes that is not well controlled would lead to a higher rate of dental implant failure. The authors of this meta-analysis evaluated whether the failure rate for patients with diabetes that was not well controlled was higher than the failure rate for patients with well-controlled diabetes. The authors searched PubMed, the Cochrane Library, and ClinicalTrials.gov without limitations for studies whose investigators compared the dental implant failure rates between patients with well-controlled diabetes and diabetes that was not well controlled. The authors pooled the relative risk (RR) and 95% confidence interval (CI) values to estimate the relative effect of the glycemic level on dental implant failures. The authors used a subgroup analysis to identify the association between the implant failure rate and the stage at which the failure occurred. The authors included 7 studies in this meta-analysis, including a total of 252 patients and 587 dental implants. The results of the pooled analysis did not indicate a direct association between the glycemic level in patients with diabetes and the dental implant failure rate (RR, 0.620; 95% CI, 0.225-1.705). The pooled RR in the subgroup of patients who experienced early implant failure was 0.817 (95% CI, 0.096-6.927), whereas in the subgroup of patients who experienced late implant failure, the pooled RR was 0.572 (95% CI, 0.206-1.586). On the basis of the evidence, the results of this meta-analysis failed to show a difference in the failure rates for dental implants between patients with well-controlled diabetes and patients with diabetes that was not well controlled. However, considering the limitations associated with this meta-analysis, the authors determined that future studies that are well designed and provide adequate controls for confounding factors are required. Copyright © 2016 American Dental Association. Published by Elsevier Inc. All rights reserved.
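
    The pooled relative-risk computation behind such a meta-analysis can be sketched with textbook fixed-effect inverse-variance pooling on the log scale; the abstract does not specify the authors' exact model (they report a pooled RR of 0.620, 95% CI 0.225 to 1.705, across 7 studies), so this is a generic illustration.

    ```python
    import math

    def pool_rr(studies):
        """Fixed-effect inverse-variance pooling of relative risks.
        studies: (rr, ci_low, ci_high) triples; 95% CIs, log-symmetric."""
        num = den = 0.0
        for rr, lo, hi in studies:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE backed out of CI
            w = 1.0 / se ** 2
            num += w * math.log(rr)
            den += w
        log_pooled, se_pooled = num / den, math.sqrt(1.0 / den)
        return (math.exp(log_pooled),
                (math.exp(log_pooled - 1.96 * se_pooled),
                 math.exp(log_pooled + 1.96 * se_pooled)))
    ```

    Pooling two identical studies returns the same RR with a narrower CI, which is the expected behaviour of inverse-variance weighting.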

  19. Findings From 12-lead Electrocardiography That Predict Circulatory Shock From Pulmonary Embolism: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Shopp, Jacob D; Stewart, Lauren K; Emmett, Thomas W; Kline, Jeffrey A

    2015-10-01

    higher in smaller studies. Patients who were outcome-negative had a significantly lower mean ± SD Daniel score (2.6 ± 1.5) than patients with hemodynamic collapse (5.9 ± 3.9; p = 0.039, ANOVA with Dunnett's post hoc), but not patients with all-cause 30-day mortality (4.9 ± 3.3; p = 0.12). This systematic review and meta-analysis revealed 10 studies, including 3,007 patients with acute PE, that demonstrate that six findings of RV strain on 12-lead ECG (heart rate > 100 beats/min, S1Q3T3, complete RBBB, inverted T waves in V1-V4, ST elevation in aVR, and atrial fibrillation) are associated with increased risk of circulatory shock and death. © 2015 by the Society for Academic Emergency Medicine.

  20. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    hybridization method (FISH). This approach was used to evaluate the contribution of each gram-negative bacteria group. No significant difference between the main soil gram-negative bacterial groups (phyla Proteobacteria and Bacteroidetes) was found under either aerobic or anaerobic conditions in the chernozem topsoil. Thus soil gram-negative bacteria, as a common group of microorganisms, play an important ecological role in natural polymer degradation. Another approach, using a cascade filtration technique to estimate bacterial population density in chernozem, was compared to the classical method of fluorescent microscopy. Quantification of soil bacteria by cascade filtration uses filters of different pore diameters and a fixed volume of filtered soil suspension. Compared with the classical fluorescent microscopy method, the filtration modification quantified more bacterial cells. Thus biomass estimates of soil bacteria obtained by classical fluorescent microscopy alone could be underestimates, and combining it with the cascade filtration technique helps avoid this potential experimental error. Thereby, the combination and comparison of several modifications of fluorescent microscopy established during this research provided complementary approaches to quantifying soil bacteria and analysing the ecological roles of soil microorganisms.

  1. Image-guided left ventricular lead placement in cardiac resynchronization therapy for patients with heart failure: a meta-analysis.

    Science.gov (United States)

    Jin, Yan; Zhang, Qi; Mao, Jia-Liang; He, Ben

    2015-05-10

    Heart failure (HF) is a debilitating condition that affects millions of people worldwide. One means of treating HF is cardiac resynchronization therapy (CRT). Recently, several studies have examined the use of echocardiography (ECHO) in the optimization of left ventricular (LV) lead placement to increase the response to CRT. The objective of this study was to synthesize the available data on the comparative efficacy of image-guided and standard CRT. We searched the PubMed, Cochrane, Embase, and ISI Web of Knowledge databases through April 2014 with the following combinations of search terms: left ventricular lead placement, cardiac resynchronization therapy, image-guided, and echocardiography-guided. Studies meeting all of the inclusion criteria and none of the exclusion criteria were eligible for inclusion. The primary outcome measures were CRT response rate, change in LV ejection fraction (LVEF), and change in LV end-systolic volume (LVESV). Secondary outcomes included the rates of all-cause mortality and HF-related hospitalization. Our search identified 103 articles, 3 of which were included in the analysis. In total, 270 patients were randomized to image-guided CRT and 241 to standard CRT. The pooled estimates showed a significant benefit for image-guided CRT (CRT response: OR, 2.098; 95% CI, 1.432-3.072; LVEF: difference in means, 3.457; 95% CI, 1.910-5.005; LVESV: difference in means, -20.36; 95% CI, -27.819 to -12.902). Image-guided CRT produced significantly better clinical outcomes than standard CRT. Additional trials are warranted to validate the use of imaging in the prospective optimization of CRT.

  2. A Multimodal Data Analysis Approach for Targeted Drug Discovery Involving Topological Data Analysis (TDA).

    Science.gov (United States)

    Alagappan, Muthuraman; Jiang, Dadi; Denko, Nicholas; Koong, Albert C

    In silico drug discovery refers to a combination of computational techniques that augment our ability to discover drug compounds from compound libraries. Many such techniques exist, including virtual high-throughput screening (vHTS), high-throughput screening (HTS), and mechanisms for data storage and querying. However, at present these tools are often used independently of one another. In this chapter, we describe a new multimodal in silico technique for the hit identification and lead generation phases of traditional drug discovery. Our technique leverages the benefits of three independent methods (virtual high-throughput screening, high-throughput screening, and structural fingerprint analysis) by using a fourth technique called topological data analysis (TDA). We describe how a compound library can be independently tested with vHTS, HTS, and fingerprint analysis, and how the results can be transformed into a topological data analysis network to identify compounds from a diverse group of structural families. This process of using TDA or similar clustering methods to identify drug leads is advantageous because it provides a mechanism for choosing structurally diverse compounds while maintaining the unique advantages of already established techniques such as vHTS and HTS.

  3. A Discussion of Water Pollution in the United States and Mexico; with High School Laboratory Activities for Analysis of Lead, Atrazine, and Nitrate.

    Science.gov (United States)

    Kelter, Paul B.; Grundman, Julie; Hage, David S.; Carr, James D.; Castro-Acuna, Carlos Mauricio

    1997-01-01

    Presents discussions on sources, health impacts, methods of analysis as well as lengthy discussions of lead, nitrates, and atrazine as related to water pollution and the interdisciplinary nature of the modern chemistry curriculum. (DKM)

  4. Protein precipitation of diluted samples in SDS-containing buffer with acetone leads to higher protein recovery and reproducibility in comparison with TCA/acetone approach.

    Science.gov (United States)

    Santa, Cátia; Anjo, Sandra I; Manadas, Bruno

    2016-07-01

    Proteomic approaches are extremely valuable in many fields of research, and mass spectrometry methods have gained increasing interest, especially because of their ability to perform quantitative analysis. Nonetheless, sample preparation prior to mass spectrometry analysis is of the utmost importance. In this work, two protein precipitation approaches widely used for cleaning and concentrating protein samples were tested and compared on very dilute samples solubilized in a strong buffer (containing SDS). The amount of protein recovered after acetone and after TCA/acetone precipitation was assessed, and the protein identification and relative quantification yields by SWATH-MS were compared with the results from the same sample without precipitation. From this study it was possible to conclude that, for dilute samples in denaturing buffers, cold acetone precipitation is more favourable than TCA/acetone in terms of reproducibility of protein recovery and the number of identified and quantified proteins. Furthermore, the reproducibility of the relative quantification of proteins is even higher in samples precipitated with acetone than in the original sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Determination of lead and cadmium concentration limits in agricultural soil and municipal solid waste compost through an approach of zero tolerance to food contamination.

    Science.gov (United States)

    Saha, Jayanta Kumar; Panwar, N R; Singh, M V

    2010-09-01

    Cadmium and lead are important environmental pollutants with high toxicity to animals and humans. Soils, though they have considerable metal-immobilizing capability, can contaminate the food chain via the plants grown on them when metal build-up is large. The present experiment was carried out with the objective of quantifying the limits of Pb and Cd loading in soil that prevent food chain contamination beyond background concentration levels. Two separate sets of pot experiments were carried out for the two heavy metals, with graded application doses of Pb at 0.4-150 mg/kg and Cd at 0.02-20 mg/kg to an acidic, light-textured alluvial soil. A spinach crop was grown for 50 days on these treated soils after a stabilization period of 2 months. The upper limit of the background concentration level (C(ul)) of each metal was calculated through a statistical approach from the heavy metal concentrations in leaves of spinach grown in farmers' fields. The Pb and Cd concentration limits in soil were then calculated by dividing C(ul) by the uptake response slope obtained from the pot experiment. Cumulative loading limits (the concentration limit in soil minus the content in uncontaminated soil) for the experimental soil were estimated to be 170 kg Pb/ha and 0.8 kg Cd/ha. Based on certain assumptions about application rates and the computed cumulative loading limits, maximum permissible Pb and Cd concentrations in municipal solid waste (MSW) compost were proposed as 170 mg Pb/kg and 0.8 mg Cd/kg, respectively. In view of these limiting values, about 56% and 47% of the MSW compost samples from different cities were found to contain Pb and Cd, respectively, in the safe range.
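    The chain of arithmetic described in the abstract (crop background limit divided by uptake slope, minus the uncontaminated-soil content, converted to an areal loading) can be sketched as follows. All numeric inputs here are illustrative placeholders, not the study's measured values, and the plough-layer soil mass is an assumed conversion factor.

```python
# Assumed soil mass of a ~15 cm plough layer per hectare (illustrative).
PLOUGH_LAYER_KG_PER_HA = 2.24e6

def soil_limit_mg_per_kg(c_ul, uptake_slope):
    """Soil concentration limit: the upper background limit in the crop
    (C_ul, mg/kg leaf) divided by the uptake response slope
    ((mg/kg leaf) per (mg/kg soil)) from the pot experiment."""
    return c_ul / uptake_slope

def cumulative_loading_kg_per_ha(limit_mg_kg, background_mg_kg):
    """Loading limit: concentration limit minus the uncontaminated-soil
    content, converted from mg/kg soil to kg/ha of added metal."""
    return (limit_mg_kg - background_mg_kg) * PLOUGH_LAYER_KG_PER_HA * 1e-6

# hypothetical values for Pb
pb_limit = soil_limit_mg_per_kg(c_ul=8.0, uptake_slope=0.09)
pb_loading = cumulative_loading_kg_per_ha(pb_limit, background_mg_kg=13.0)
```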

  6. Contribution of Italy to the activities on intercomparison of analysis methods for seismically isolated nuclear structures: Finite element analysis of lead rubber bearings

    International Nuclear Information System (INIS)

    Dusi, A.; Forni, M.; Martelli, A.

    1998-01-01

    This paper presents a summary of the results of nonlinear Finite Element (FE) analyses carried out by ENEL-Ricerca, Hydraulic and Structural Centre, and ENEA-ERG-SIEC-SISM on Lead Rubber Bearings (LRBs). The activities were carried out in the framework of the four-year Coordinated Research Programme (CRP) of the International Atomic Energy Agency (IAEA) on 'Intercomparison of Analysis Methods for Seismically Isolated Nuclear Structures'. The bearing Finite Element Models (FEMs) are validated through comparisons of the numerical results with experimental test data. The reliability of FEMs for simulating the behaviour of rubber bearings is presented and discussed. (author)

  7. Saving the world by teaching behavior analysis: A behavioral systems approach

    OpenAIRE

    Malott, Richard W.; Vunovich, Pamela L.; Boettcher, William; Groeger, Corina

    1995-01-01

    This article presents a behavioral systems approach to organizational design and applies that approach to the teaching of behavior analysis. This systems approach consists of three components: goal-directed systems design, behavioral systems engineering, and performance management. This systems approach is applied to the Education Board and Teaching Behavior Analysis Special Interest Group of the Association for Behavior Analysis, with a conclusion that we need to emphasize the recruitment of...

  8. On the analysis of line profile variations: A statistical approach

    International Nuclear Information System (INIS)

    McCandliss, S.R.

    1988-01-01

    This study is concerned with the empirical characterization of the line profile variations (LPV) which occur in many Of and Wolf-Rayet stars. The goal of the analysis is to gain insight into the physical mechanisms producing the variations. The analytic approach uses a statistical method to quantify the significance of the LPV and to identify those regions in the line profile which are undergoing statistically significant variations. Line positions and flux variations are then measured and subjected to temporal and correlative analysis. Previous studies of LPV have for the most part been restricted to observations of a single line. Important information concerning the range and amplitude of the physical mechanisms involved can be obtained by simultaneously observing spectral features formed over a range of depths in the extended mass-losing atmospheres of massive, luminous stars. Time series of one Wolf-Rayet and two Of stars, with nearly complete spectral coverage from 3940 angstrom to 6610 angstrom and with a spectral resolution of R = 10,000, are analyzed here. These three stars exhibit a wide range of both spectral and temporal line profile variations. The HeII Pickering lines of HD 191765 show a monotonic increase in the peak rms variation amplitude with lines formed at progressively larger radii in the Wolf-Rayet star wind. Two time scales of variation have been identified in this star: a less-than-one-day variation associated with small-scale flickering in the peaks of the line profiles and a greater-than-one-day variation associated with large-scale asymmetric changes in the overall line profile shapes. However, no convincing periodic phenomena are evident at those periods which are well sampled in this time series.

  9. A Comparison of the Approaches to Customer Experience Analysis

    Directory of Open Access Journals (Sweden)

    Havíř David

    2017-08-01

    Nowadays, customer experience is receiving much attention in the scientific and managerial communities. Scholars and practitioners state that customer experience is the next area of competition. For a long time, there has been a call for a uniform, accurate definition, a specification of its components, and the development of customer experience frameworks. As the topic is new, there has been considerable fragmentation. The question is whether this fragmentation is still present and how it can be addressed. The aim of this paper is to summarize research on customer experience analysis and to explore and compare the dimensions describing customer experience listed in seven conceptual models with findings from 17 research projects on customer experience conducted after the year 2010. The purpose is to summarize recent knowledge, obtain the most comprehensive view of customer experience and its possible decomposition, and reveal possible relationships between the dimensions. Based on a review of the available literature, the paper juxtaposes several approaches to customer experience analysis and compares their results to find similarities and differences among them. In the first step, the dimensions and factors of customer experience were extracted from the seven models and compared with each other, resulting in a set of dimensions and factors. In the next step, customer experience factors and dimensions were extracted from 17 practical research papers on customer experience. Finally, based on their descriptions and the similarities found, the dimensions and factors were grouped together, as this grouping and the creation of a new universal set of dimensions might solve the fragmentation issue.

  10. Lead poisoning

    Science.gov (United States)

    ... help if this information is not immediately available. Poison Control If someone has severe symptoms from possible ... be caused by lead poisoning, call your local poison control center. Your local poison center can be ...

  11. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Kim, I.S.; Lofgren, E.V.

    1989-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations from a risk perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequencies can be caused by configurations, and a number of such configurations are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small, and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and practical considerations such as adequate repair times and/or options to transfer to low-risk configurations. Alternative types of criteria are discussed that are not so restrictive as to result in unnecessary plant shutdowns, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety. 3 refs., 7 figs., 2 tabs
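    The distinction the abstract draws between a configuration's core-melt frequency and its core-melt probability contribution can be illustrated with a small sketch: a configuration may raise the instantaneous frequency sharply, yet contribute little probability if it is entered rarely and repaired quickly. All numbers below are hypothetical, not taken from the two plant studies.

```python
def config_risk(r_config, r_baseline, occurrences_per_year, mean_duration_h):
    """Return (increase in core-melt frequency while in the configuration,
    annual core-melt probability contribution of the configuration).

    Frequencies are per year; duration is in hours."""
    delta = r_config - r_baseline                       # frequency increase (1/yr)
    frac_of_year = occurrences_per_year * mean_duration_h / 8760.0
    return delta, delta * frac_of_year                  # probability ~ rate x time

delta, p_contrib = config_risk(
    r_config=1e-3,            # hypothetical CMF with two components out (1/yr)
    r_baseline=1e-5,          # hypothetical baseline CMF (1/yr)
    occurrences_per_year=2,   # configuration entered twice a year
    mean_duration_h=12,       # repaired within 12 hours on average
)
```

    Here the frequency rises by two orders of magnitude, but the annual probability contribution stays below 1e-5 because the configuration persists for only about 24 hours per year.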

  12. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Lofgren, E.V.; Vesely, W.E.

    1990-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk-perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also the practical considerations such as adequate repair times and/or options to transfer to low risk configurations. Alternative types of criteria are discussed that are not overly restrictive to result in unnecessary plant shutdown, but rather motivates effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety

  13. Reliability analysis with linguistic data: An evidential network approach

    International Nuclear Information System (INIS)

    Zhang, Xiaoge; Mahadevan, Sankaran; Deng, Xinyang

    2017-01-01

    In practical applications of reliability assessment of a system in-service, information about the condition of a system and its components is often available in text form, e.g., inspection reports. Estimation of the system reliability from such text-based records becomes a challenging problem. In this paper, we propose a four-step framework to deal with this problem. In the first step, we construct an evidential network with the consideration of available knowledge and data. Secondly, we train a Naive Bayes text classification algorithm based on the past records. By using the trained Naive Bayes algorithm to classify the new records, we build interval basic probability assignments (BPA) for each new record available in text form. Thirdly, we combine the interval BPAs of multiple new records using an evidence combination approach based on evidence theory. Finally, we propagate the interval BPA through the evidential network constructed earlier to obtain the system reliability. Two numerical examples are used to demonstrate the efficiency of the proposed method. We illustrate the effectiveness of the proposed method by comparing with Monte Carlo Simulation (MCS) results. - Highlights: • We model reliability analysis with linguistic data using evidential network. • Two examples are used to demonstrate the efficiency of the proposed method. • We compare the results with Monte Carlo Simulation (MCS).
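    The evidence-combination step in the third stage of the framework can be illustrated with Dempster's rule of combination. The sketch below combines two point-valued BPAs over a two-state component frame; the paper's interval BPAs and network propagation are not modeled here, and all masses are hypothetical.

```python
def dempster(m1, m2):
    """Dempster's rule: combine two BPAs given as {frozenset: mass} dicts
    over the same frame of discernment."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc       # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # normalize by the non-conflicting mass
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

W, F = frozenset({"working"}), frozenset({"failed"})
THETA = W | F                             # full frame (ignorance)
m1 = {W: 0.7, THETA: 0.3}                 # record 1: component likely working
m2 = {W: 0.6, F: 0.1, THETA: 0.3}         # record 2: mostly agrees
m12 = dempster(m1, m2)
```

    Combining the two records concentrates more mass on "working" than either source alone, which is the intended reinforcement effect of the rule.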

  14. Analysis of opioid consumption in clinical trials: a simulation based analysis of power of four approaches

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Nyberg, Joakim; Kreilgaard, Mads

    2017-01-01

    Inconsistent trial design and analysis is a key reason that few advances in postoperative pain management have been made from clinical trials analyzing opioid consumption data. This study aimed to compare four different approaches to analyzing opioid consumption data. A repeated time-to-event (RTTE) model in NONMEM was used to simulate clinical trials of morphine consumption with and without a hypothetical adjuvant analgesic in doses equivalent to a 15-62% reduction in morphine consumption. Trials were simulated with durations of 24-96 h. Monte Carlo simulation and re-estimation were performed… of potency was obtained with an RTTE model accounting for both morphine effects and time-varying covariates on opioid consumption. An RTTE analysis approach proved better suited for demonstrating the efficacy of opioid-sparing analgesics than traditional statistical tests, as a lower sample size was required due…

  15. Assessment of the effects of the Japanese shift to lead-free solders and its impact on material substitution and environmental emissions by a dynamic material flow analysis

    International Nuclear Information System (INIS)

    Fuse, Masaaki; Tsunemi, Kiyotaka

    2012-01-01

    Lead-free electronics have been extensively studied, whereas their adoption by society and their impact on material substitution and environmental emissions are not well understood. Through a material flow analysis (MFA), this paper explores the life cycle flows of solder-containing metals in Japan, which leads the world in the shift to lead-free solders in electronics. The results indicate that the shift has been progressing rapidly for a decade, and that substitutes for lead in solders, which include silver and copper, are still in the early life cycle stages. The results also show, however, that such substitution slows down during the late life cycle stages owing to long electronic product lifespans. This deceleration of material substitution in the solder life cycle may not only preclude a reduction in lead emissions to air but also accelerate an increase in silver emissions to air and water. As an effective measure against ongoing lead emissions, our scenario analysis suggests an aggressive recycling program for printed circuit boards that utilizes an existing recycling scheme. -- Highlights: ► We model the life cycle flows of solder-containing metals in Japan. ► The Japanese shift to lead-free solders has progressed rapidly for a decade. ► Substitution for lead in solders slows down during the late life cycle stages. ► The deceleration of substitution precludes a reduction in lead emissions to air.

  16. Selection of mode for the measurement of lead isotope ratios by inductively coupled plasma mass spectrometry and its application to milk powder analysis

    International Nuclear Information System (INIS)

    Dean, J.R.; Ebdon, L.; Massey, R.

    1987-01-01

    An investigation into the selection of the optimum mode for the measurement of isotope ratios in inductively coupled plasma mass spectrometry (ICP-MS) is reported, with particular reference to lead isotope ratios. Variations in the accuracy and precision achievable using the scanning and peak-jumping measurement modes are discussed. It is concluded that, if sufficient sample and time are available, scanning gives the best accuracy and precision. Isotope dilution analysis (IDA) has been applied to the measurement of the lead content of two dried milk powders, of Australian and European origin, introduced as slurries into the ICP-MS. Differences in the lead isotope ratios of the two milk powders were investigated and the total lead content was determined by IDA. Isotope dilution analysis permitted accurate data to be obtained with an RSD of 4.2% for milk powder containing less than 30 ng g-1 of lead. The ICP-MS technique is confirmed as a useful tool for IDA. (author)
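    The core of isotope dilution analysis is a single mass-balance relation: a spike enriched in one isotope is added to the sample, and the measured isotope ratio of the blend yields the amount of analyte. The sketch below shows the textbook single-spike form of that relation with illustrative numbers, not the paper's milk-powder data.

```python
def ida_moles(n_spike, r_spike, r_sample, r_blend):
    """Moles of the reference isotope contributed by the sample.

    n_spike : moles of the reference isotope in the added spike
    r_*     : isotope-amount ratios (other isotope / reference isotope)
              of the pure spike, the pure sample, and the measured blend."""
    return n_spike * (r_spike - r_blend) / (r_blend - r_sample)

# forward check: mix a known amount of sample into the spike,
# compute the blend ratio, then recover the sample amount from it
n_spike, r_spike, r_sample, n_sample = 1.0e-9, 0.01, 2.0, 0.5e-9
r_blend = (r_sample * n_sample + r_spike * n_spike) / (n_sample + n_spike)
recovered = ida_moles(n_spike, r_spike, r_sample, r_blend)
```

    Because only ratios are measured, the result is insensitive to signal drift and partial analyte loss after spike equilibration, which is why IDA is regarded as a high-accuracy method.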

  17. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. 
The total amount of direct damage was estimated at 8 913 000 €; indirect damage, by contrast, ranged considerably

  18. Visual Analytics approach for Lightning data analysis and cell nowcasting

    Science.gov (United States)

    Peters, Stefan; Meng, Liqiu; Betz, Hans-Dieter

    2013-04-01

    Thunderstorms and their ground effects, such as flash floods, hail, lightning, strong wind and tornadoes, are responsible for most weather damages (Bonelli & Marcacci 2008). It is therefore essential to understand, identify, track and predict lightning cells. An important aspect for decision makers is an appropriate visualization of weather analysis results, including the representation of dynamic lightning cells. This work focuses on the visual analysis of lightning data and lightning cell nowcasting, which aim at detecting and understanding the spatio-temporal patterns of moving thunderstorms. Each lightning stroke is described by its 3D coordinates and exact occurrence time. The three-dimensionally resolved total lightning data used in our experiment are provided by the European lightning detection network LINET (Betz et al. 2009). In all previous works, lightning point data, detected lightning cells and derived cell tracks have been visualized in 2D: lightning cells are displayed as 2D convex hulls, with or without the underlying lightning point data. Owing to recent improvements in lightning detection and accuracy, there is a growing demand, in particular among decision makers, for multidimensional and interactive visualization. In a first step, lightning cells are identified and tracked. Then an interactive graphical user interface (GUI) is developed to investigate the dynamics of the lightning cells: e.g. changes of cell density, location and extension, as well as merging and splitting behavior, in 3D over time. In particular, a space-time cube approach is highlighted along with statistical analysis. Furthermore, lightning cell nowcasting is conducted and visualized. The idea is to predict the following cell features for the next 10-60 minutes: location, centre, extension, density, area, volume, lifetime and cell feature probabilities. The main focus is a suitable interactive visualization of the predicted features within the GUI.
The developed visual

  19. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; on the contrary, indirect

  20. Glucuronidated Flavonoids in Neurological Protection: Structural Analysis and Approaches for Chemical and Biological Synthesis.

    Science.gov (United States)

    Docampo, Maite; Olubu, Adiji; Wang, Xiaoqiang; Pasinetti, Giulio; Dixon, Richard A

    2017-09-06

    Both plant and mammalian cells express glucuronosyltransferases that catalyze the glucuronidation of polyphenols such as flavonoids and other small molecules. Oral administration of select polyphenolic compounds leads to the accumulation of the corresponding glucuronidated metabolites at μM and sub-μM concentrations in the brain, associated with the amelioration of a range of neurological symptoms. Determining the mechanisms whereby botanical extracts impact cognitive wellbeing and psychological resiliency will require investigation of the modes of action of the brain-targeted metabolites. Unfortunately, many of these compounds are not commercially available. This article describes the latest approaches for the analysis and synthesis of glucuronidated flavonoids. Synthetic schemes include standard organic synthesis, semisynthesis, enzymatic synthesis, and synthetic biology approaches utilizing heterologous enzymes in microbial platform organisms.

  1. The "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology.

    Science.gov (United States)

    Slade, Louise; Levine, Harry

    2018-04-13

    This article reviews the application of the "Food Polymer Science" approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology. The areas of patents and patented technologies reviewed here include: (a) soft-from-the-freezer ice creams and freezer-storage-stable frozen bread dough products, based on "cryostabilization technology" of frozen foods, utilizing commercial starch hydrolysis products (SHPs); (b) glassy-matrix encapsulation technology for flavors and other volatiles, based on structure-function relationships for commercial SHPs; (c) production of stabilized whole-grain wheat flours for biscuit products, based on the application of "solvent retention capacity" technology to develop flours with reduced damaged starch; (d) production of improved-quality, low-moisture cookies and crackers, based on pentosanase enzyme technology; (e) production of "baked-not-fried," chip-like, starch-based snack products, based on the use of commercial modified-starch ingredients with selected functionality; (f) accelerated staling of a starch-based food product from baked bread crumb, based on the kinetics of starch retrogradation, treated as a crystallization process for a partially crystalline glassy polymer system; and (g) a process for producing an enzyme-resistant starch, for use as a reduced-calorie flour replacer in a wide range of grain-based food products, including cookies, extruded expanded snacks, and breakfast cereals.

  2. A comprehensive analysis of breast cancer news coverage in leading media outlets focusing on environmental risks and prevention.

    Science.gov (United States)

    Atkin, Charles K; Smith, Sandi W; McFeters, Courtnay; Ferguson, Vanessa

    2008-01-01

    Breast cancer has a high profile in the news media, which are a major source of information for cancer patients and the general public. To determine the nature of breast cancer news coverage available to audiences, particularly on the topics of environmental risks and prevention, this content analysis measured a broad array of dimensions in 231 stories appearing in nine leading newspapers, newsmagazines, and television networks in 2003 and 2004. One fourth of all stories reported on various risks such as hormone replacement therapy (HRT) use. Very few items specifically addressed risks related to controllable lifestyle practices such as prepubertal obesity or chemical contaminants in the environment. About one third of the stories included prevention content, primarily focusing narrowly on use of pharmaceutical products. Little information described risk reduction via other individual preventive behaviors (e.g., diet, exercise, and smoking), parental protective measures, or collective actions to combat contamination sites. The more traditional categories of prevalence, detection, and treatment were featured in one third, one quarter, and two fifths of the news items, respectively. There were twice as many stories featuring personal narratives as statistical figures, and two thirds of all the news items cited expert medical professionals, researchers, or organizations. Implications of these findings and directions for future research are addressed.

  3. Measurement and Analysis Plan for Investigation of Spent-Fuel Assay Using Lead Slowing-Down Spectroscopy

    International Nuclear Information System (INIS)

    Smith, Leon E.; Haas, Derek A.; Gavron, Victor A.; Imel, G.R.; Ressler, Jennifer J.; Bowyer, Sonya M.; Danon, Y.; Beller, D.

    2009-01-01

    Under funding from the Department of Energy Office of Nuclear Energy's Materials, Protection, Accounting, and Control for Transmutation (MPACT) program (formerly the Advanced Fuel Cycle Initiative Safeguards Campaign), Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory (LANL) are collaborating to study the viability of lead slowing-down spectroscopy (LSDS) for spent-fuel assay. Based on the results of previous simulation studies conducted by PNNL and LANL to estimate potential LSDS performance, a more comprehensive study of LSDS viability has been defined. That study includes benchmarking measurements, development and testing of key enabling instrumentation, and continued study of time-spectra analysis methods. This report satisfies the requirements for a PNNL/LANL deliverable that describes the objectives, plans and contributing organizations for a comprehensive three-year study of LSDS for spent-fuel assay. This deliverable was generated largely during the LSDS workshop held on August 25-26, 2009 at Rensselaer Polytechnic Institute (RPI). The workshop itself was a prominent milestone in the FY09 MPACT project and is also described within this report.

  4. Metabolic analysis of knee synovial fluid as a potential diagnostic approach for osteoarthritis.

    Science.gov (United States)

    Mickiewicz, Beata; Kelly, Jordan J; Ludwig, Taryn E; Weljie, Aalim M; Wiley, J Preston; Schmidt, Tannin A; Vogel, Hans J

    2015-11-01

    Osteoarthritis (OA) is a leading cause of chronic joint pain in the older human population. Diagnosis of OA at an earlier stage may enable the development of new treatments to one day effectively modify the progression and prognosis of the disease. In this work, we explore whether an integrated metabolomics approach could be utilized for the diagnosis of OA. Synovial fluid (SF) samples were collected from symptomatic chronic knee OA patients and normal human cadaveric knee joints. The samples were analyzed using ¹H nuclear magnetic resonance (NMR) spectroscopy and gas chromatography-mass spectrometry (GC-MS) followed by multivariate statistical analysis. Based on the metabolic profiles, we were able to distinguish OA patients from the controls and validate the statistical models. Moreover, we have integrated the ¹H NMR and GC-MS results and we found that 11 metabolites were statistically important for the separation between OA and normal SF. Additionally, statistical analysis showed an excellent predictive ability of the constructed metabolomics model (area under the receiver operating characteristic curve = 1.0). Our findings indicate that metabolomics might serve as a promising approach for the diagnosis and prognosis of degenerative changes in the knee joint and should be further validated in clinical settings. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.
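The reported area under the receiver operating characteristic curve can be reproduced from a classifier's decision scores via the rank interpretation of the AUC. A minimal sketch, with invented labels and scores standing in for the OA/control separation (none of these numbers come from the study):

```python
import numpy as np

# Illustrative only: the AUC equals the probability that a randomly chosen
# positive (OA) score outranks a randomly chosen negative (control) score.
# Labels and scores below are invented, not the study's data.
labels = np.array([1, 1, 1, 1, 0, 0, 0, 0])   # 1 = OA, 0 = control
scores = np.array([0.9, 0.8, 0.75, 0.6, 0.4, 0.3, 0.2, 0.1])

pos = scores[labels == 1]
neg = scores[labels == 0]
# pairwise comparison of every positive against every negative,
# counting ties as half
auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()
# perfectly separated scores give auc == 1.0, matching the reported value
```

Any score set in which every OA sample outranks every control reproduces the paper's AUC of 1.0.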

  5. Overview of the FEP analysis approach to model development

    International Nuclear Information System (INIS)

    Bailey, L.

    1998-01-01

    This report heads a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A five stage approach has been adopted, which provides a systematic framework for addressing uncertainty and for the documentation of all modelling decisions and assumptions. The five stages are as follows: Stage 1: FEP Analysis - compilation and structuring of a FEP database; Stage 2: Scenario and Conceptual Model Development; Stage 3: Mathematical Model Development; Stage 4: Software Development; Stage 5: Confidence Building. This report describes the development and structuring of a FEP database as a Master Directed Diagram (MDD) and explains how this may be used to identify different modelling scenarios, based upon the identification of scenario-defining FEPs. The methodology describes how the possible evolution of a repository system can be addressed in terms of a base scenario, a broad and reasonable representation of the 'natural' evolution of the system, and a number of variant scenarios, representing the effects of probabilistic events and processes. The MDD has been used to identify conceptual models to represent the base scenario, and the interactions between these conceptual models have been systematically reviewed using a matrix diagram technique. This has led to the identification of modelling requirements for the base scenario, against which existing assessment software capabilities have been reviewed. A mechanism for combining probabilistic scenario-defining FEPs to construct multi-FEP variant scenarios has been proposed and trialled using the concept of a 'timeline', a defined sequence of events, from which consequences can be assessed. An iterative approach, based on conservative modelling principles, has been proposed for the evaluation of

  6. Social Sustainability Assessment across Provinces in China: An Analysis of Combining Intermediate Approach with Data Envelopment Analysis (DEA) Window Analysis

    Directory of Open Access Journals (Sweden)

    Aizhen Zhang

    2018-03-01

    Full Text Available There are two categories (i.e., radial and non-radial) in conventional DEA (Data Envelopment Analysis). Recently, the intermediate approach was put forward as a new third category. Because the intermediate approach is newly proposed, related studies are still quite limited. This study contributes to the DEA literature by putting forward an analytical framework that combines the intermediate approach and DEA window analysis, along with the concepts of natural and managerial disposability. Such a combination is quite meaningful, and the new approach has three important features. To the best of our knowledge, this type of research has never been investigated in the existing studies. As an application, the approach is used to evaluate the performance of provinces in China from 2007 to 2014. Furthermore, this study develops a series of performance indices from different perspectives. The study identifies three important findings. Firstly, eco-technology advancements can achieve economic prosperity and environmental protection simultaneously, and thus should become a new direction of climate policies. Secondly, considerable differences exist in the indices that evaluate the performance of various provinces and pollutants; sufficient attention should therefore be given to the provinces and the pollutants with poor performance. Finally, the Chinese government should promote efficiency improvement through "catching up" for provinces with poor performance in the short term, and the central government should reduce regional disparity in order to promote social sustainability in the long term.

  7. Ecotoxicology: Lead

    Science.gov (United States)

    Scheuhammer, A.M.; Beyer, W.N.; Schmitt, C.J.; Jorgensen, Sven Erik; Fath, Brian D.

    2008-01-01

    Lead (Pb) is a naturally occurring metallic element; trace concentrations are found in all environmental media and in all living things. However, certain human activities, especially base metal mining and smelting; combustion of leaded gasoline; the use of Pb in hunting, target shooting, and recreational angling; the use of Pb-based paints; and the uncontrolled disposal of Pb-containing products such as old vehicle batteries and electronic devices have resulted in increased environmental levels of Pb, and have created risks for Pb exposure and toxicity in invertebrates, fish, and wildlife in some ecosystems.

  8. A different approach to X-ray stress analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ogilvie, Robert E. [Massachusetts Institute of Technology, Room 13-5065, 77 Massachusetts Ave., Cambridge, MA (United States)], E-mail: bobogil@mit.edu

    2007-07-15

    A different approach to X-ray stress analysis has been developed. At the outset, it must be noted that the material to be analyzed is assumed homogeneous and isotropic. If a sphere with radius r within a specimen is subjected to a state of stress, the sphere is deformed into an ellipsoid. The semi-axes of the ellipsoid have the values (r + ε_x), (r + ε_y), and (r + ε_z), which are replaced by d_x, d_y, and d_z, or for the cubic case, a_x, a_y, and a_z. In this technique, at a particular φ angle, the two-theta position of a high-angle (hkl) peak is determined at ψ angles of 0, 15, 30, and 45°. These measurements are repeated for 3 to 6 φ angles in steps of 30°. The d_φψ or a_φψ values are then determined from the peak positions. The data are then fitted to the general quadratic equation for an ellipsoid by the method of least squares. From the coefficients of the quadratic equation, the angle between the laboratory and the specimen coordinates (the direction of the principal stress) can be determined. Applying the general rotation-of-axes equations to the quadratic, the equation of the ellipse in the x-y plane is determined. The a_x, a_y, and a_z values for the principal axes of the lattice-parameter ellipsoid are then evaluated. It is then possible to determine the unstressed a_0 value from Hooke's law using a_x, a_y, and a_z. The magnitude of the principal strains/stresses is then determined.
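The least-squares step can be illustrated as follows, under the simplifying assumption that the lattice parameter measured along a unit direction n equals the quadratic form n·M·n of a symmetric matrix M whose eigenvalues are the principal values; the angles match the abstract's measurement scheme, but the lattice values and the measurement model are synthetic, not the author's data:

```python
import numpy as np

# Synthetic illustration of the least-squares ellipsoid fit; the principal
# lattice parameters below are invented, and the measurement model
# a(n) = n.M.n is a simplification of the paper's quadric fit.
a_true = np.diag([3.615, 3.612, 3.608])   # hypothetical principal values (angstroms)

def direction(phi, psi):
    # unit measurement vector for azimuth phi and tilt psi (radians)
    return np.array([np.sin(psi) * np.cos(phi),
                     np.sin(psi) * np.sin(phi),
                     np.cos(psi)])

rows, vals = [], []
for phi_deg in (0, 30, 60, 90, 120, 150):    # 6 phi angles in 30 degree steps
    for psi_deg in (0, 15, 30, 45):          # psi angles as in the abstract
        n = direction(np.radians(phi_deg), np.radians(psi_deg))
        # six independent coefficients of the symmetric quadric
        rows.append([n[0]**2, n[1]**2, n[2]**2,
                     2 * n[0] * n[1], 2 * n[0] * n[2], 2 * n[1] * n[2]])
        vals.append(n @ a_true @ n)          # "measured" a along n

# least-squares fit of the quadric coefficients to the measurements
c, *_ = np.linalg.lstsq(np.array(rows), np.array(vals), rcond=None)
M = np.array([[c[0], c[3], c[4]],
              [c[3], c[1], c[5]],
              [c[4], c[5], c[2]]])

# the eigen-decomposition yields the principal values (and, via the
# eigenvectors, the orientation of the principal axes)
principal = np.sort(np.linalg.eigvalsh(M))[::-1]
```

With noise-free synthetic data the fit recovers the three principal values exactly; with real peak positions the same machinery returns the best-fit ellipsoid.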

  9. Analysis of lead concentration in forager stingless bees Trigona sp. (Hymenoptera: Apidae) and propolis at Cilutung and Maribaya, West Java

    Science.gov (United States)

    Safira, Nabila; Anggraeni, Tjandra

    2015-09-01

    Several studies had shown that lead (Pb) in the environment could accumulate in bees, which in turn could affect the quality of the resulting products. In this study, forager stingless bees (Trigona sp.) and their product (propolis) were collected from a stingless bee apiculture operation with two apiary sites distinguished by their environmental settings: the site at Cilutung lay in a forest region, while the site at Maribaya was located beside a main road. The objective of this study was to determine the extent of lead concentration in propolis originating from both apiary sites and to establish the correlation between lead concentration in propolis and lead level in forager stingless bees. Forager bee and propolis samples originated from 50 bee colonies (Cilutung) and 44 bee colonies (Maribaya). They were analyzed using AAS-GF (Atomic Absorption Spectrometry-Graphite Furnace) to determine the level of lead concentration. The results showed that the average level of lead in propolis originating from Cilutung (298.08±73.71 ppb) was lower than that in propolis originating from Maribaya (330.64±156.34 ppb), but the difference was not significant (p>0.05). There was likewise no significant difference (p>0.05) between the average levels of lead in forager bees originating from Cilutung (118.08±30.46 ppb) and Maribaya (128.82±39.66 ppb). In conclusion, the average level of lead concentration in propolis at both sites exceeded the maximum permitted standard of lead for food in Indonesia, and there was no correlation between lead concentration in propolis and in forager stingless bees.

  10. Analysis of lead concentration in forager stingless bees Trigona sp. (Hymenoptera: Apidae) and propolis at Cilutung and Maribaya, West Java

    Energy Technology Data Exchange (ETDEWEB)

    Safira, Nabila, E-mail: safira.nabila@ymail.com; Anggraeni, Tjandra, E-mail: tjandra@sith.itb.ac.id [School of Life Science and Technology, Institut Teknologi Bandung – Jalan Ganesha 10, Bandung (Indonesia)

    2015-09-30

    Several studies had shown that lead (Pb) in the environment could accumulate in bees, which in turn could affect the quality of the resulting products. In this study, forager stingless bees (Trigona sp.) and their product (propolis) were collected from a stingless bee apiculture operation with two apiary sites distinguished by their environmental settings: the site at Cilutung lay in a forest region, while the site at Maribaya was located beside a main road. The objective of this study was to determine the extent of lead concentration in propolis originating from both apiary sites and to establish the correlation between lead concentration in propolis and lead level in forager stingless bees. Forager bee and propolis samples originated from 50 bee colonies (Cilutung) and 44 bee colonies (Maribaya). They were analyzed using AAS-GF (Atomic Absorption Spectrometry–Graphite Furnace) to determine the level of lead concentration. The results showed that the average level of lead in propolis originating from Cilutung (298.08±73.71 ppb) was lower than that in propolis originating from Maribaya (330.64±156.34 ppb), but the difference was not significant (p>0.05). There was likewise no significant difference (p>0.05) between the average levels of lead in forager bees originating from Cilutung (118.08±30.46 ppb) and Maribaya (128.82±39.66 ppb). In conclusion, the average level of lead concentration in propolis at both sites exceeded the maximum permitted standard of lead for food in Indonesia, and there was no correlation between lead concentration in propolis and in forager stingless bees.

  11. Competencies in Higher Education: A Critical Analysis from the Capabilities Approach

    Science.gov (United States)

    Lozano, J. Felix; Boni, Alejandra; Peris, Jordi; Hueso, Andres

    2012-01-01

    With the creation of the European Higher Education Area, universities are undergoing a significant transformation that is leading towards a new teaching and learning paradigm. The competencies approach has a key role in this process. But we believe that the competence approach has a number of limitations and weaknesses that can be overcome and…

  12. A social network analysis of alcohol-impaired drivers in Maryland : an egocentric approach.

    Science.gov (United States)

    2011-04-01

    This study examined the personal, household, and social structural attributes of alcohol-impaired drivers in Maryland. The study used an egocentric approach of social network analysis. This approach concentrated on specific actors (alcohol-impaire...

  13. Lead Emissions and Population Vulnerability in the Detroit (Michigan, USA) Metropolitan Area, 2006–2013: A Spatial and Temporal Analysis

    Directory of Open Access Journals (Sweden)

    Heather Moody

    2017-11-01

    Full Text Available Objective: The purpose of this research is to geographically model airborne lead emission concentrations and total lead deposition in the Detroit Metropolitan Area (DMA) from 2006 to 2013. Further, this study characterizes the racial and socioeconomic composition of recipient neighborhoods and estimates the potential for IQ (Intelligence Quotient) loss of children residing there. Methods: Lead emissions were modeled from emitting facilities in the DMA using AERMOD (American Meteorological Society/Environmental Protection Agency Regulatory Model). Multilevel modeling was used to estimate local racial residential segregation, controlling for poverty. Global Moran's I bivariate spatial autocorrelation statistics were used to assess modeled emissions with increasing segregation. Results: Lead-emitting facilities were primarily located in, and moving to, highly black-segregated neighborhoods regardless of poverty levels, a phenomenon known as environmental injustice. The findings from this research showed three years of elevated airborne emission concentrations in these neighborhoods to equate to a predicted 1.0- to 3.0-point reduction in IQ for children living there. Across the DMA there are many areas where annual lead deposition was substantially higher than recommended for aquatic (rivers, lakes, etc.) and terrestrial (forests, dunes, etc.) ecosystems. These lead levels result in decreased reproductive and growth rates in plants and animals, and neurological deficits in vertebrates. Conclusions: This lead-hazard and neighborhood context assessment will inform future childhood lead exposure studies and potential health consequences in the DMA.
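A global bivariate Moran's I of the kind used to assess whether modeled emissions co-cluster spatially with segregation scores can be sketched as below; the adjacency weights and the emission/segregation values are tiny synthetic illustrations, not the study's data:

```python
import numpy as np

def bivariate_morans_I(x, y, W):
    # x, y: variables observed at n locations; W: n x n spatial weights
    # matrix with zero diagonal. Positive I means high values of x sit
    # next to high values of y (spatial co-clustering).
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    n = len(x)
    return (n / W.sum()) * (zx @ W @ zy) / (zx @ zx)

# 4 hypothetical locations on a line with rook adjacency
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
emissions   = np.array([10.0, 8.0, 3.0, 1.0])   # invented emission levels
segregation = np.array([0.9, 0.7, 0.3, 0.1])    # invented segregation scores

I_val = bivariate_morans_I(emissions, segregation, W)
```

In this toy layout high emissions neighbor high segregation, so I_val comes out positive, the pattern the study reports.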

  14. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber γ-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  15. Leading men

    DEFF Research Database (Denmark)

    Bekker-Nielsen, Tønnes

    2016-01-01

    Through a systematic comparison of c. 50 careers leading to the koinarchate or high priesthood of Asia, Bithynia, Galatia, Lycia, Macedonia and coastal Pontus, as described in funeral or honorary inscriptions of individual koinarchs, it is possible to identify common denominators but also disting...

  16. Real time analysis of lead-containing atmospheric particles in Beijing during springtime by single particle aerosol mass spectrometry.

    Science.gov (United States)

    Ma, Li; Li, Mei; Huang, Zhengxu; Li, Lei; Gao, Wei; Nian, Huiqing; Zou, Lilin; Fu, Zhong; Gao, Jian; Chai, Fahe; Zhou, Zhen

    2016-07-01

    Using a single particle aerosol mass spectrometer (SPAMS), the chemical composition and size distributions of lead (Pb)-containing particles with diameters from 0.1 μm to 2.0 μm in Beijing were analyzed in the spring of 2011 during clear, hazy, and dusty days. Based on the mass spectral features of the particles, cluster analysis was applied to the Pb-containing particles, and six major classes were acquired, consisting of K-rich, carbonaceous, Fe-rich, dust, Pb-rich, and Cl-rich particles. Pb-containing particles accounted for 4.2-5.3%, 21.8-22.7%, and 3.2% of the total particle number during clear, hazy, and dusty days, respectively. K-rich particles were the major contributors to Pb-containing particles, varying from 30.8% to 82.1% of the total number of Pb-containing particles, lowest during dusty days and highest during hazy days. The results reflect that the chemical composition and amount of Pb-containing particles have been affected by meteorological conditions as well as by emissions from natural and anthropogenic sources. K-rich and carbonaceous particles could be mainly assigned to emissions from coal combustion; other classes of Pb-containing particles may be associated with metallurgical processes, coal combustion, dust, and waste incineration. In addition, Pb-containing particles during dusty days were studied for the first time by SPAMS. This method could provide a powerful tool for monitoring and controlling Pb pollution in real time. Copyright © 2016 Elsevier Ltd. All rights reserved.
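The abstract does not name the clustering algorithm used on the mass spectra; as an illustration only, a minimal k-means pass over synthetic "spectra" shows how particles separate into classes by their dominant peaks (the 10-channel vectors, channel indices, and intensities are all invented stand-ins for SPAMS data):

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=20):
    # Minimal k-means: deterministic seeding for the sketch, then
    # alternate nearest-center assignment and center recomputation.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# two synthetic particle classes with different dominant peak channels
class_a = rng.normal(0.0, 0.1, (30, 10)); class_a[:, 2] += 5.0  # "K-rich"-like peak
class_b = rng.normal(0.0, 0.1, (30, 10)); class_b[:, 7] += 5.0  # "carbonaceous"-like peak
X = np.vstack([class_a, class_b])

labels, centers = kmeans(X, 2)   # particles fall into two clean clusters
```

Real SPAMS pipelines often use adaptive-resonance variants rather than plain k-means, but the grouping-by-spectral-signature idea is the same.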

  17. A microbeam grazing-incidence approach to L-shell x-ray fluorescence measurements of lead concentration in bone and soft tissue phantoms.

    Science.gov (United States)

    Gherase, Mihai Raul; Al-Hamdani, Summer

    2018-02-06

    L-shell x-ray fluorescence (LXRF) is a non-invasive approach to lead (Pb) concentration measurements in human bone. The first studies were published in the early 1980s. In the same period the K-shell x-ray fluorescence (KXRF) method using a Cd-109 radionuclide source was developed and later improved and refined. Lower sensitivity and calibration difficulties associated with the LXRF method led to KXRF becoming the most widely adopted method for in vivo human bone Pb studies. In the present study, a microbeam-based grazing-incidence approach to Pb LXRF measurements was investigated. The microbeam, produced by an integrated x-ray tube and polycapillary x-ray lens (PXL) unit, was used to excite cylindrical plaster-of-Paris (PoP) bone phantoms doped with Pb in seven concentrations: 0, 8, 16, 26, 34, 59, and 74 µg/g. Two cylindrical-shell soft tissue phantoms, 1 mm and 3 mm thick, were made out of polyoxymethylene (POM) plastic, yielding three bone-soft tissue phantom sets corresponding to POM thicknesses of 0, 1, and 3 mm. Each phantom was placed between the microbeam and the detector; its position was controlled using a positioning stage. Small steps (0.1-0.5 mm) and short 30 s x-ray spectrum acquisitions were used to find the optimal phantom position according to the maximum observed Sr Kα peak height. At the optimal geometry, five 180 s x-ray spectra were acquired for each phantom set. Calibration lines were obtained using the fitted peak heights of the two observed Pb Lα and Pb Lβ peaks. The lowest detection limit (DL) values for the three phantom sets were (2.9±0.2), (4.9±0.3), and (23±3) µg/g, respectively. The order of magnitude of the absorbed radiation dose in the POM plastic for the 180 s irradiation was estimated to be <1 mGy. The results are superior to a recently published LXRF phantom study and show promise for future designs of in vivo LXRF measurements. Creative Commons Attribution license.
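Detection-limit figures of this kind come from calibration lines of fitted peak height versus concentration; a common XRF convention takes DL as three times the scatter about the line divided by the calibration sensitivity. A sketch with invented peak heights for the study's seven phantom concentrations (both the counts and the 3σ/slope convention are assumptions, not stated in the abstract):

```python
import numpy as np

# Hypothetical calibration data: Pb L-alpha peak heights (counts) for the
# seven phantom concentrations (ug/g) used in the study; counts are invented.
conc = np.array([0, 8, 16, 26, 34, 59, 74], dtype=float)
peak = np.array([12, 95, 180, 283, 370, 630, 785], dtype=float)

# linear calibration: peak height = slope * concentration + intercept
slope, intercept = np.polyfit(conc, peak, 1)

# scatter of the measurements about the calibration line (2 fitted params)
residuals = peak - (slope * conc + intercept)
sigma = residuals.std(ddof=2)

# common convention: detection limit = 3 * sigma / sensitivity
dl = 3 * sigma / slope
```

Tighter scatter or a steeper calibration line (higher sensitivity) both push the DL down, which is why the thicker soft-tissue overlayers in the study raise it.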

  18. Frequency domain analysis and design of nonlinear systems based on Volterra series expansion: a parametric characteristic approach

    CERN Document Server

    Jing, Xingjian

    2015-01-01

    This book is a systematic summary of some new advances in the area of nonlinear analysis and design in the frequency domain, focusing on application-oriented theory and methods based on the GFRF concept, mainly developed by the author over the past 8 years. The main results are formulated uniformly with a parametric characteristic approach, which provides a convenient and novel insight into nonlinear influence on system output response in terms of characteristic parameters and thus facilitates nonlinear analysis and design in the frequency domain. The book starts with a brief introduction to the background of nonlinear analysis in the frequency domain, followed by recursive algorithms for computation of GFRFs for different parametric models, and nonlinear output frequency properties. Thereafter the parametric characteristic analysis method is introduced, which leads to the new understanding and formulation of the GFRFs, and nonlinear characteristic output spectrum (nCOS) and the nCOS based analysis a...

  19. A model-based prognostic approach to predict interconnect failure using impedance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Dae Il; Yoon, Jeong Ah [Dept. of System Design and Control Engineering. Ulsan National Institute of Science and Technology, Ulsan (Korea, Republic of)

    2016-10-15

    The reliability of electronic assemblies is largely affected by the health of interconnects, such as solder joints, which provide mechanical, electrical, and thermal connections between circuit components. During field lifecycle conditions, interconnects are often subjected to a DC open circuit, one of the most common interconnect failure modes, due to cracking. An interconnect damaged by cracking is sometimes extremely hard to detect when it is part of a daisy-chain structure, neighboring other healthy interconnects that have not yet cracked. A cracked interconnect may seem to provide a good electrical contact due to the compressive load applied by the neighboring healthy interconnects, but it can cause occasional loss of electrical continuity under operational and environmental loading conditions in field applications. Thus, cracked interconnects can lead to the intermittent failure of electronic assemblies and eventually to permanent failure of the product or the system. This paper introduces a model-based prognostic approach to quantitatively detect and predict interconnect failure using impedance analysis and particle filtering. Impedance analysis was previously reported as a sensitive means of detecting incipient changes at the surface of interconnects, such as cracking, based on continuous monitoring of RF impedance. To predict the time to failure, particle filtering was used as a prognostic approach, with the Paris model addressing fatigue crack growth. To validate this approach, mechanical fatigue tests were conducted with continuous monitoring of RF impedance while the solder joints under test were degraded by fatigue cracking. The test results showed that the RF impedance consistently increased as the solder joints degraded due to crack growth, and particle filtering predicted the time to failure of the interconnects in close agreement with their actual times to failure, based on the early sensitivity of RF impedance.
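The prognostic idea, a particle filter whose state propagation follows the Paris crack-growth law, can be sketched as follows; the stress range, material constants, noise level, and observation schedule are all invented for illustration and do not come from the paper, which infers crack size from RF impedance rather than observing it directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Paris-law fatigue crack growth: da/dN = C * (dK)^m, dK = ds * sqrt(pi * a).
# All numeric values below are illustrative, not the paper's.
m, ds = 3.0, 100.0            # Paris exponent, stress range (MPa)
C_true, a0 = 2e-9, 0.002      # "true" Paris coefficient, initial crack size (m)

def grow(a, C, cycles=100):
    # propagate crack size through a block of load cycles (vectorized over arrays)
    for _ in range(cycles):
        a = a + C * (ds * np.sqrt(np.pi * a)) ** m
    return a

# simulate noisy crack-size observations (2% relative noise), standing in
# for crack sizes inferred from RF impedance measurements
a, obs = a0, []
for _ in range(20):
    a = grow(a, C_true)
    obs.append(a * (1 + 0.02 * rng.standard_normal()))

# bootstrap particle filter over the unknown Paris coefficient C
N = 1000
C_part = rng.uniform(0.5e-9, 3e-9, N)   # prior on C
a_part = np.full(N, a0)
w = np.full(N, 1.0 / N)

for z in obs:
    a_part = grow(a_part, C_part)                          # predict
    w *= np.exp(-0.5 * ((z - a_part) / (0.02 * z)) ** 2) + 1e-300  # update
    w /= w.sum()
    if 1.0 / (w ** 2).sum() < N / 2:                       # resample on low ESS
        idx = rng.choice(N, N, p=w)
        C_part, a_part = C_part[idx], a_part[idx]
        w = np.full(N, 1.0 / N)

C_est = np.average(C_part, weights=w)
```

With C estimated, remaining useful life follows by propagating the surviving particles forward until the predicted crack size reaches a failure threshold, which is how a time-to-failure distribution rather than a point estimate is obtained.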

  20. Determination of the minor and trace elements in Biriniwa's tin pyrite and ornamental lead/zinc ore using neutron activation analysis

    Directory of Open Access Journals (Sweden)

    A.O. Adebayo

    2002-12-01

    Full Text Available Preliminary results of the analysis of two common decorative/ornamental minerals, analysed for minor and trace elements with the neutron activation analysis technique, are discussed. The samples of interest were the Biriniwa tin pyrite, which the local indigenes use to paint their huts, and the ornamental lead which women use nation-wide to adorn their eyelashes. These samples were irradiated, along with the certified reference sample CANMET-BLI, with thermal neutrons at the Julich Reactor Centre, Julich, Germany. The prominent elements determined in the ornamental lead included zinc (35.8%), iron (6.15%), and Na, Sb, Cd, Hg, Ag and Co at trace levels (μg g-1). The tin pyrite sample was found to contain traces of Na, K, As, Br, Sb, Fe, La, Nd, Sm and Ce as the prominent impurities. Lead and tin, the major elements of the lead/zinc ore and tin pyrite samples, respectively, were determined by classical methods.