WorldWideScience

Sample records for analysis approach leading

  1. A SOLIDS ANALYSIS APPROACH INCORPORATING ARGON-ION MILLING TO COPPER AND LEAD PIPE SCALE ANALYSIS

    Science.gov (United States)

    Corrosion of copper and lead plumbing materials in water is complex and has been the subject of a number of studies (Lucey 1967; Edwards et al. 1994a; Edwards et al. 1994b; Duthil et al. 1996; Harrison et al. 2004). Solids analysis is one of the most convenient and nfo...

  2. Safe job analysis in a lead refinery. A practical approach from the process side

    Energy Technology Data Exchange (ETDEWEB)

    Esser, Knut; Meurer, Urban [BERZELIUS Stolberg GmbH, Stolberg (Germany)

    2011-09-15

    In order to increase safety and to comply with legal requirements, Berzelius Stolberg decided in 2009 to update and change its approach to safe job analysis (SJA). The new approach takes detailed Standard Operating Procedures (SOPs), which were also updated as part of the effort, as the basis for all subsequent documents. Together with supervisors and operators, all SOPs were structured into single working steps, because the safe job analysis performed afterwards is only meaningful, and the risks correctly identified, if the actual work is properly described. After updating the SOPs, a draft of each SJA was discussed by representatives of the refinery management, the works council, safety officers and operators. For every identified risk, one or more measures to avoid it were agreed. For the technical and organisational measures an action plan was created; the behaviour-related measures were collected in a safety handbook, which forms the basis for future safety training of the operators. In addition to the safe job analysis, the SOPs also serve as the basis for training manuals and for FMEAs. All in all, the new approach to safe job analysis not only provides a way to increase safety systematically in line with OHSAS guidelines, but also satisfies all aspects of quality management. (orig.)

  3. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    International Nuclear Information System (INIS)

    There is a need to develop approaches for assessing the risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications for human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry in readily obtainable body fluids such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb) using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on the measured saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that, once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
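
    The linear calibration mentioned above (1-2000 ppb) is the step that converts a stripping peak current into a saliva Pb concentration. A minimal sketch of such a calibration is given below; the standards, peak currents and sensitivity are hypothetical illustrations, not values from the study.

```python
import numpy as np

# Hypothetical calibration data: Pb standards (ppb) vs. SWASV peak current (uA).
# A linear response over 1-2000 ppb is assumed, as reported in the abstract.
standards_ppb = np.array([1, 10, 50, 200, 500, 1000, 2000], dtype=float)
peak_current_uA = np.array([0.02, 0.21, 1.0, 4.1, 10.2, 19.8, 40.5])

# Least-squares straight line: i = slope * c + intercept
slope, intercept = np.polyfit(standards_ppb, peak_current_uA, deg=1)

def pb_concentration(peak_uA: float) -> float:
    """Convert a measured stripping peak current (uA) into a Pb concentration (ppb)."""
    return (peak_uA - intercept) / slope

print(f"sensitivity: {slope:.4f} uA/ppb")
print(f"sample with a 2.5 uA peak -> about {pb_concentration(2.5):.0f} ppb Pb")
```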

  4. Androgen receptor mutations associated with androgen insensitivity syndrome: a high content analysis approach leading to personalized medicine.

    Directory of Open Access Journals (Sweden)

    Adam T Szafran

    Androgen insensitivity syndrome (AIS) is a rare disease associated with inactivating mutations of AR that disrupt male sexual differentiation and cause a spectrum of phenotypic abnormalities whose common denominator is loss of reproductive viability. No established treatment exists for these conditions; however, there are sporadic reports of patients (or recapitulated mutations in cell lines) that respond to administration of supraphysiologic doses (or pulses) of testosterone or synthetic ligands. Here, we utilize a novel high content analysis (HCA) approach to study AR function at the single cell level in genital skin fibroblasts (GSF). We discuss in detail findings in GSF from three historical patients with AIS, which include identification of novel mechanisms of AR malfunction, and the potential ability to utilize HCA for personalized treatment of patients affected by this condition.

  5. A systems approach to risk management through leading safety indicators

    International Nuclear Information System (INIS)

    The goal of leading indicators for safety is to identify the potential for an accident before it occurs. Past efforts have focused on identifying general leading indicators, such as maintenance backlog, that apply widely in an industry or even across industries. Other recommendations produce more system-specific leading indicators, but start from system hazard analysis and thus are limited by the causes considered by the traditional hazard analysis techniques. Most rely on quantitative metrics, often based on probabilistic risk assessments. This paper describes a new and different approach to identifying system-specific leading indicators and provides guidance in designing a risk management structure to generate, monitor and use the results. The approach is based on the STAMP (System-Theoretic Accident Model and Processes) model of accident causation and tools that have been designed to build on that model. STAMP extends current accident causality to include more complex causes than simply component failures and chains of failure events or deviations from operational expectations. It incorporates basic principles of systems thinking and is based on systems theory rather than traditional reliability theory. - Highlights: • Much effort has gone into developing leading indicators with only limited success. • A systems-theoretic, assumption-based approach may be more successful. • Leading indicators are warning signals of an assumption’s changing vulnerability. • Heuristic biases can be controlled by using plausibility rather than likelihood

  6. Lead isotopic analysis within a multiproxy approach to trace pottery sources. The example of White Slip II sherds from Late Bronze Age sites in Cyprus and Syria

    International Nuclear Information System (INIS)

    Lead isotope analyses were carried out on fragments of White Slip II ware, a Late Bronze Age Cypriote pottery ware, and on raw materials possibly used for their production. The sherds originate from three Late Bronze Age sites (Hala Sultan Tekke and Sanidha in Cyprus and Minet el-Beida in Syria) and the clays come from the surroundings of Sanidha, a production site for White Slip ware. X-ray fluorescence (XRF) and a Principal Component Analysis (PCA) are combined with Pb isotope analyses to further investigate the effectiveness of the latter method within a multiproxy approach to pottery provenance studies. The pottery sherds from the three sites are compared with one another and with potential raw materials. Additional X-ray diffraction (XRD) analyses and scanning electron microscopy (SEM) with energy dispersive X-ray detection (EDX) were performed on selected sherds and clays. This work confirms that the clay source used for pottery production in Sanidha derives from local weathered gabbro. It also shows that different origins can be proposed for White Slip II ware sherds from Hala Sultan Tekke and Minet el-Beida and that clays were prepared prior to White Slip II ware production. Finally, it confirms the effectiveness of Pb isotopes in tracing pottery provenance, not only by comparing sherd assemblages but also by comparing sherds to potential raw materials.

  7. Next-to-next-to-leading order QCD analysis of spin-dependent parton distribution functions and their uncertainties: Jacobi polynomials approach

    Science.gov (United States)

    Taghavi-Shahri, F.; Khanpour, Hamzeh; Atashbar Tehrani, S.; Alizadeh Yazdi, Z.

    2016-06-01

    We present a first QCD analysis of next-to-next-to-leading-order (NNLO) contributions to the spin-dependent parton distribution functions (polarized PDFs) in the nucleon and their uncertainties using the Jacobi polynomial approach. Having the NNLO contributions of the quark-quark and gluon-quark splitting functions in perturbative QCD [Nucl. Phys. B889, 351 (2014)], one can obtain the evolution of longitudinally polarized parton densities of hadrons up to NNLO accuracy in QCD. Very large sets of recent and up-to-date experimental data on the spin structure functions of the proton g1p, neutron g1n, and deuteron g1d have been used in this analysis. The predictions of the NNLO calculations for the polarized parton distribution functions as well as for the proton, neutron and deuteron polarized structure functions are compared with the corresponding results of the NLO approximation. We obtain a mutually consistent set of polarized PDFs by including most of the available experimental data, including the recent high-precision measurements from the COMPASS16 experiment [Phys. Lett. B 753, 18 (2016)]. We have performed a careful estimation of the uncertainties in the polarized PDFs originating from the experimental errors, using the most common and practical method, the Hessian method. The proton, neutron and deuteron structure functions and also their first moments, Γp,n,d, are in good agreement with the experimental data at small and large momentum fractions x. We discuss how our knowledge of the spin-dependent structure functions can be improved at small and large values of x by the recent COMPASS16 measurements at CERN, the PHENIX and STAR measurements at RHIC, and future proposed colliders such as the Electron-Ion Collider.
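
    For orientation, the Jacobi polynomial technique referred to above typically expands a polarized structure function in a set of Jacobi polynomials that are orthogonal with weight x^beta (1-x)^alpha; a schematic form (notation assumed here, not quoted from the paper) is:

```latex
% Schematic Jacobi-polynomial expansion of a polarized structure function
x g_1(x,Q^2) \simeq x^{\beta}(1-x)^{\alpha}
   \sum_{n=0}^{N_{\max}} a_n(Q^2)\,\Theta_n^{\alpha,\beta}(x),
\qquad
a_n(Q^2) = \int_0^1 dx\; x g_1(x,Q^2)\,\Theta_n^{\alpha,\beta}(x),
```

    where the orthogonality relation \int_0^1 dx\, x^{\beta}(1-x)^{\alpha}\,\Theta_k^{\alpha,\beta}(x)\,\Theta_l^{\alpha,\beta}(x) = \delta_{kl} allows the QCD-evolved structure function to be reconstructed in x space from a finite number of terms N_max.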

  8. Lead reactor strategy economical analysis

    International Nuclear Information System (INIS)

    Conclusions: • A first attempt to evaluate the LFR power plant electricity production cost has been performed. • The electricity price is similar to that of Gen III+ plants. • The estimation accuracy is probably low. • Possible cost reductions could arise from coolant characteristics that may improve safety and simplicity by design. • The perception of accidents, not acceptable to public opinion, may be changed with a low-potential-energy system (non-exploding coolant). • An improvement in sustainability could open the way to better public acceptance, depending on us. • Problems may arise in coupling a high-capital-cost, low-fuel-cost plant to a grid with a large amount of intermittent sources with priority dispatch. • Lead fast reactors can compete

  9. Learning to lead: a literary approach

    OpenAIRE

    Hafford - Letchfield, Trish; Harper, William

    2011-01-01

    'Leading and Developing Public Service' is an accredited Higher Education programme provided by the School of Health and Social Sciences and is aimed at those working in health, social care, community and public services. The acquisition of skills and knowledge in leadership and management has been cited as key to delivering the UK Government's vision of quality services. These claims highlight the relationship between 'effective leadership' and transforming services. How students engage with lea...

  10. Ancient iron and lead isotope analysis

    International Nuclear Information System (INIS)

    Full text: Little work has been published to date on the subject of lead isotope analysis of ancient iron artefacts, and that which has suffers from a lack of understanding of the nature of ancient iron and of the behaviour of lead in relation to iron oxides. This paper examines data from a lead isotope study of 12th-10th Century B.C.E. iron artefacts from Israel and Palestine, and of iron ores from these and surrounding areas, focusing on the issues of iron corrosion and lead contamination. The data show that experimentally produced bloomery iron contains very little lead (less than 0.1 ppm), with most lead in the ore being reduced in the smelting process and lost to the slag. This low quantity of lead raises the question of contamination in samples which have been corroding whilst buried, in this case for 3000 years. It is proposed that useful lead isotope data may be obtained where analysis of hydrated iron oxides in particular is avoided, as they commonly make up the outer layers of recovered ancient iron objects, formed in direct association with the surrounding soil and rock. Lead contamination of these porous oxides is a constantly observed feature of the material, and the affinity of lead for such oxides is well documented. Where there exists uncorroded iron (a rare event), or where there exists a core of magnetite beneath the outer hydrated oxide layers, it may be possible to obtain useful lead isotope data which reflect the isotopic composition of the metal as it emerged from the furnace in antiquity. A magnetic separation process and washing in cold 7M HCl are proposed as means of removing contaminated hydrated iron oxides from this more useful material in the laboratory, prior to lead isotope analysis

  11. Current lead thermal analysis code 'CURRENT'

    International Nuclear Information System (INIS)

    Large gas-cooled current leads with capacities of more than 30 kA and 22 kV are required for the superconducting toroidal and poloidal coils of fusion applications. The current lead carries electrical current from the power supply system at room temperature to the superconducting coil at 4 K. Accordingly, the thermal performance of the current lead is significantly important in determining the heat load requirements of the coil system at 4 K. The Japan Atomic Energy Research Institute (JAERI) has been developing large gas-cooled current leads optimized so that the heat load is around 1 W per kA at 4 K. In order to design current leads with optimum thermal performance, JAERI developed a thermal analysis code named 'CURRENT' which can theoretically calculate the optimum geometric shape and cooling conditions of the current lead. The basic equations and the instruction manual of the analysis code are described in this report. (author)
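
    For context, the steady-state design problem such a code solves is usually a one-dimensional balance of conduction, Joule heating and convective cooling by the boil-off gas along the lead; a commonly used schematic form, in the limit of perfect heat exchange between conductor and coolant (not quoted from this report), is:

```latex
% Schematic 1-D steady-state energy balance for a gas-cooled current lead
\frac{d}{dz}\!\left(k(T)\,A\,\frac{dT}{dz}\right)
  + \frac{\rho(T)\,I^{2}}{A}
  - \dot{m}\,c_{p}\,\frac{dT}{dz} \;=\; 0,
```

    where k(T) and rho(T) are the thermal conductivity and electrical resistivity of the conductor, A its cross-section, I the transport current and m-dot the helium mass flow; optimizing A and m-dot against the boundary temperatures is what yields heat loads of the order of 1 W per kA at 4 K.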

  12. Economic Analysis of Leading Logistics Companies' Stocks

    OpenAIRE

    Chyška, Pavel

    2013-01-01

    This bachelor thesis focuses on the analysis of leading logistics companies' stocks. The literature review defines logistics as a rapidly evolving discipline whose roots reach far back in history. Logistics is also recognized as an important force that helps to drive the global economy. Supply chains and outsourcing are also discussed, since they are closely connected to logistics. The stock market and important indexes are evaluated along with different types of stock charts in the literature review, a...

  13. Real analysis a constructive approach

    CERN Document Server

    Bridger, Mark

    2012-01-01

    A unique approach to analysis that lets you apply mathematics across a range of subjects. This innovative text sets forth a thoroughly rigorous modern account of the theoretical underpinnings of calculus: continuity, differentiability, and convergence. Using a constructive approach, every proof of every result is direct and ultimately computationally verifiable. In particular, existence is never established by showing that the assumption of non-existence leads to a contradiction. The ultimate consequence of this method is that it makes sense, not just to math majors but also to students from a

  14. Mechanochemical synthesis of nanocrystalline lead selenide. Industrial approach

    Energy Technology Data Exchange (ETDEWEB)

    Achimovicova, Marcela; Balaz, Peter [Slovak Academy of Sciences, Kosice (Slovakia). Inst. of Geotechnics; Durisin, Juraj [Slovak Academy of Sciences, Kosice (Slovakia). Inst. of Materials Research; Daneu, Nina [Josef Stefan Institute, Ljubljana (Slovenia). Dept. for Nanostructured Materials; Kovac, Juraj; Satka, Alexander [Slovak Univ. of Technology and International Laser Centre, Bratislava (Slovakia). Dept. of Microelectronics; Feldhoff, Armin [Leibniz Univ. Hannover (Germany). Inst. fuer Physikalische Chemie und Elektrochemie; Gock, Eberhard [Technical Univ. Clausthal, Clausthal-Zellerfeld (Germany). Inst. of Mineral and Waste Processing and Dumping Technology

    2011-04-15

    Mechanochemical synthesis of lead selenide (PbSe) nanoparticles was performed by high-energy milling of lead and selenium powders in a laboratory planetary ball mill and in an industrial eccentric vibratory mill. The structural properties of the synthesized lead selenide were characterized by X-ray diffraction, which confirmed the crystalline nature of the PbSe nanoparticles. An average PbSe crystallite size of 37 nm was calculated from the X-ray diffraction data using the Williamson-Hall method. Particle size distribution analysis, specific surface area measurement, scanning electron microscopy and transmission electron microscopy were used to characterize the surface, mean particle size, and morphology of the PbSe. The application of an industrial mill verified the possibility of synthesizing the narrow-bandgap semiconductor PbSe at ambient temperature and in a relatively short reaction time. (orig.)
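
    The 37 nm crystallite size quoted above comes from a Williamson-Hall treatment of the XRD line broadening, beta*cos(theta) = K*lambda/D + 4*epsilon*sin(theta). A minimal sketch of that fit is shown below; the peak positions and breadths are invented for illustration and are not the paper's data.

```python
import numpy as np

# Williamson-Hall: beta*cos(theta) = K*lambda/D + 4*eps*sin(theta)
# Hypothetical peak list: 2-theta (deg) and integral breadth beta (rad).
wavelength_nm = 0.15406            # Cu K-alpha radiation, a common assumption
K = 0.9                            # Scherrer shape factor
two_theta_deg = np.array([25.2, 29.1, 41.7, 49.3, 51.7])
beta_rad = np.array([0.0042, 0.0045, 0.0051, 0.0056, 0.0058])

theta = np.radians(two_theta_deg) / 2.0
x = 4.0 * np.sin(theta)            # abscissa: 4 sin(theta)
y = beta_rad * np.cos(theta)       # ordinate: beta cos(theta)

strain, intercept = np.polyfit(x, y, deg=1)     # slope = microstrain epsilon
crystallite_size_nm = K * wavelength_nm / intercept

print(f"crystallite size D ~ {crystallite_size_nm:.0f} nm, microstrain ~ {strain:.1e}")
```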

  15. Isotopic analysis of bullet lead samples

    International Nuclear Information System (INIS)

    The possibility of using the isotopic composition of lead for the identification of bullet lead is investigated. Lead from several spent bullets was converted to lead sulphide and analysed for isotopic abundances using an MS-7 mass spectrometer. The abundances were measured relative to a reference isotope; the variation in Pb204 was too small to permit differentiation, while the range of variation of Pb206 and Pb207, together with the better precision of their analyses, permitted differentiating samples from one another. The correlation among the samples examined is pointed out. The method is complementary to the characterisation of bullet leads by trace element composition. The possibility of using isotopically enriched lead for tagging bullet lead is pointed out. (author)

  16. A Thermostructural Analysis of a Diboride Composite Leading Edge

    Science.gov (United States)

    Kowalski, Tom; Buesking, Kent; Kolodziej, Paul; Bull, Jeff

    1996-01-01

    In an effort to support the design of zirconium diboride composite leading edges for hypersonic vehicles, a finite element model (FEM) of a prototype leading edge was created and finite element analysis (FEA) was employed to assess its thermal and structural response to aerothermal boundary conditions. Unidirectional material properties for the structural components of the leading edge, a continuous fiber-reinforced diboride composite, were computed with COSTAR. These properties agree well with those experimentally measured. To verify the analytical approach taken with COSMOS/M, an independent FEA of one of the leading edge assembly components was also done with COSTAR. Good agreement was obtained between the two codes. Both showed that a unidirectional lay-up had the best margin of safety for a simple loading case. Both located the maximum stress in the same region and ply, and the magnitudes agreed within 4 percent. Trajectory-based aerothermal heating was then applied to the leading edge assembly FEM created with COSMOS/M to determine the steady-state temperature response, displacement, stresses, and contact forces due to thermal expansion and thermal strains. Results show that the leading edge stagnation line temperature reached 4700 F. The computed failure index for the laminated composite components peaks at 4.2, located at the bolt flange in layer 2 of the side bracket. The temperature gradient in the tip causes a compressive stress of 279 ksi along its width and substantial tensile stresses within its depth.

  17. A meta-analysis to correlate lead bioavailability and bioaccessibility and predict lead bioavailability.

    Science.gov (United States)

    Dong, Zhaomin; Yan, Kaihong; Liu, Yanju; Naidu, Ravi; Duan, Luchun; Wijayawardena, Ayanka; Semple, Kirk T; Rahman, Mohammad Mahmudur

    2016-01-01

    Defining precise clean-up goals for lead (Pb) contaminated sites requires site-specific relative bioavailability (RBA) data. While in vivo measurement is reliable but resource-intensive, in vitro approaches promise to provide high-throughput RBA predictions. One challenge in using in vitro bioaccessibility (BAc) to predict in vivo RBA is how to minimize the heterogeneities in the in vivo-in vitro correlations (IVIVCs) stemming from the various biomarkers (kidney, blood, liver, urinary and femur), in vitro approaches and studies. In this study, 252 paired RBA-BAc data were retrieved from 9 publications, and a Bayesian hierarchical model was implemented to address these random effects. A generic linear model (RBA (%) = (0.87±0.16)×BAc + (4.70±2.47)) for the IVIVCs was identified. While the differences in the IVIVCs among the in vitro approaches were significant, the differences among biomarkers were relatively small. The established IVIVCs were then applied to predict Pb RBA, yielding an overall Pb RBA estimate of 0.49±0.25. In particular, the RBA for residential land was the highest (0.58±0.19), followed by house dust (0.46±0.20) and mining/smelting soils (0.45±0.31). This is a new attempt to, firstly, use a meta-analysis to correlate Pb RBA and BAc and, secondly, estimate Pb RBA in relation to soil types. PMID:27104671
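
    Given the reported generic model RBA (%) = (0.87 +/- 0.16) x BAc + (4.70 +/- 2.47), a minimal sketch of applying it to new bioaccessibility measurements, with a crude Monte Carlo propagation of the parameter uncertainties, might look like the following; the BAc inputs are hypothetical and parameter independence is assumed.

```python
import numpy as np

# Generic IVIVC reported in the abstract: RBA(%) = (0.87 +/- 0.16) * BAc + (4.70 +/- 2.47)
SLOPE, SLOPE_SD = 0.87, 0.16
INTERCEPT, INTERCEPT_SD = 4.70, 2.47

def predict_rba(bac_percent: float, n_draws: int = 10_000, seed: int = 0):
    """Propagate slope/intercept uncertainty by Monte Carlo (independence assumed)."""
    rng = np.random.default_rng(seed)
    slope = rng.normal(SLOPE, SLOPE_SD, n_draws)
    intercept = rng.normal(INTERCEPT, INTERCEPT_SD, n_draws)
    rba = slope * bac_percent + intercept
    return rba.mean(), rba.std()

# Hypothetical bioaccessibility measurements (%) for three soil samples.
for bac in (20.0, 45.0, 70.0):
    mean, sd = predict_rba(bac)
    print(f"BAc = {bac:4.1f}%  ->  predicted RBA = {mean:5.1f} +/- {sd:4.1f} %")
```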

  18. Lead pressure loss in the heat exchanger of the ELSY fast lead-cooled reactor by CFD approach

    International Nuclear Information System (INIS)

    In the frame of the ELSY (European Lead-cooled SYstem) design proposal for a fast lead-cooled reactor, which should comply with the goals of Generation IV nuclear power plants, the focus is set on exploiting the possible advantages offered by lead technology in comparison with lead-bismuth eutectic (LBE). Lead is less expensive, less corrosive and has a lower radiological emissivity than LBE. The ELSY project aims at demonstrating the feasibility of a lead fast reactor for energy generation and at identifying solutions for a simple but safe system. In order to properly dimension the reactor and to allow the flow of lead in natural circulation, as required by the nuclear accident scenarios, knowledge of the lead pressure losses through each component is mandatory. The present paper discusses the pressure loss through the new innovative design proposed for the ELSY spiral heat exchanger (HX). The lack of experimental data for lead flows through heat exchangers, as well as the novelty of the HX design, motivated an approach based on CFD (Computational Fluid Dynamics) analysis. We employed the commercial tool ANSYS CFX and successfully validated the program against theoretical predictions of the pressure loss through perforated plates and pipe bundles. The ELSY HX has a cylindrical design with uniformly perforated double inner and double outer walls, as described in [4]. The flow of lead represents the primary circuit, while supercritical water is planned for the secondary circuit of the reactor. The perforations in the walls and in the corresponding companion shells are displaced in a staggered way. About 200 tubes, arranged vertically in a staggered pattern, are planned for the secondary circuit of one HX. A detailed complete model is not feasible at the current stage of the design, due to the complex geometry, whose reference elements range between the 10^-3 m and 1 m scales. Therefore, unit slice models consisting of

  19. Lead and Conduct Problems: A Meta-Analysis

    Science.gov (United States)

    Marcus, David K.; Fulton, Jessica J.; Clarke, Erin J.

    2010-01-01

    This meta-analysis examined the association between conduct problems and lead exposure. Nineteen studies on 8,561 children and adolescents were included. The average "r" across all 19 studies was 0.19 (p less than 0.001), which is considered a medium effect size. Studies that assessed lead exposure using hair element analysis yielded considerably…

  20. Leading neutron production at HERA in the color dipole approach

    Directory of Open Access Journals (Sweden)

    Carvalho F.

    2016-01-01

    In this work we study leading neutron production in e + p → e + n + X collisions at high energies and calculate the Feynman xL distribution of these neutrons. The differential cross section is written in terms of the pion flux and of the photon-pion total cross section. We describe this process using the color dipole formalism and, assuming the validity of the additive quark model, we relate the dipole-pion cross section to the well-determined dipole-proton cross section. In this formalism we can estimate the impact of QCD dynamics at high energies as well as the contribution of gluon saturation effects to leading neutron production. With the parameters constrained by other phenomenological information, we are able to reproduce the basic features of the recently released H1 leading neutron spectra.
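
    The factorization described above ("pion flux times photon-pion cross section") has the generic one-pion-exchange form sketched below; the notation is schematic and not quoted from the paper.

```latex
% Schematic pion-flux factorization for leading neutron production
\frac{d\sigma(\gamma^{*} p \to X n)}{dx_{L}\,dt}
  \;\simeq\; f_{\pi/p}(x_{L},t)\; \sigma_{\gamma^{*}\pi}\!\left(\hat{W}^{2}\right),
\qquad
\hat{W}^{2} \approx (1-x_{L})\,W^{2},
```

    where x_L is the fraction of the proton momentum carried by the neutron, t the squared momentum transfer, f_{pi/p} the flux associated with the p -> n pi^+ splitting, and sigma_{gamma*pi} the photon-pion cross section, which is the object modelled with the colour dipole formalism in the paper.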

  1. A Network Approach to Preparing Underrepresented Students: The LEAD Model

    Science.gov (United States)

    Siegel, David J.

    2008-01-01

    Results are reported from an empirical study of an interorganizational collaboration to prepare underrepresented students for elite postsecondary education and beyond. The LEAD (Leadership Education and Development) Program in Business is an initiative involving twelve U.S. universities, nearly forty multinational corporations, a federal…

  2. Lead

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-12-31

    This is one of a series of reports on industrial pollutants made by the Expert Panel on Air Quality Standards to advise the United Kingdom Government on air quality standards. It describes the main sources of lead exposure, including the relative contributions of lead in the air and lead in the diet, and the methods by which lead is measured in air. The Panel also considers the airborne concentrations recorded to date in the United Kingdom, the ways in which lead is handled by the body, and its toxic effects on people. The dominant source of airborne lead is petrol combustion. Other sources include coal combustion, the production of non-ferrous metals, and waste treatment and disposal. The justification for an air quality standard for lead is set down. The Panel recommends an air quality standard for lead in the United Kingdom of 0.25 μg/m³ measured as an annual average. This is intended to protect young children, the group most vulnerable to impairment of brain function. 17 refs., 3 figs., 2 tabs.

  3. Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge

    Science.gov (United States)

    Yap, Keng C.

    2010-01-01

    This viewgraph presentation reviews Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge. The Wing Leading Edge Impact Detection System (WLE IDS) and the Impact Analysis Process are also described to monitor WLE debris threats. The contents include: 1) Risk Management via SHM; 2) Hardware Overview; 3) Instrumentation; 4) Sensor Configuration; 5) Debris Hazard Monitoring; 6) Ascent Response Summary; 7) Response Signal; 8) Distribution of Flight Indications; 9) Probabilistic Risk Analysis (PRA); 10) Model Correlation; 11) Impact Tests; 12) Wing Leading Edge Modeling; 13) Ascent Debris PRA Results; and 14) MM/OD PRA Results.

  4. Climate Change Management Approaches of Cities: A Comparative Study Between Globally Leading and Turkish Metropolitan Cities

    Directory of Open Access Journals (Sweden)

    Solmaz Filiz Karabag

    2011-05-01

    Many studies have focused on climate change policies and action at the national level, but few have studied policies and action at the city level, especially cities in emerging economies. To address this gap, the present study analyzes the management strategies that globally leading cities have developed to address climate change and related issues and compares them with the city strategies of one rapidly urbanizing emerging economy, Turkey. In the analysis, the strategic plans of five leading global cities are compared with those of sixteen Turkish cities. While the leading global cities have specific managerial approaches to mitigate climate change, none of the Turkish cities exhibits any comprehensive approach. Furthermore, while leading global cities modify urban services to reduce greenhouse gas (GHG) emissions, few Turkish cities adjust any services to address this challenge. Some Turkish cities propose an increased use of renewable energy sources and modifications to their transportation systems, but the focus in these plans is the current daily needs of their inhabitants. The findings of this study suggest several climate change strategies both for Turkish cities and for cities in other developing countries.

  5. LEADING CHANGES IN ASSESSMENT USING AN EVIDENCE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    J. O. Macaulay

    2015-08-01

    Introduction and objectives: It has been widely accepted that assessment of learning is a critical component of education and that assessment drives/guides student learning by shaping study habits and student approaches to learning. However, although most academics would agree that assessment is a critical aspect of their role as teachers, it is often regarded as an additional task rather than an integral component of the teaching/learning continuum. An additional impediment to high quality assessment is a non-evidence-based approach to decision making. The overall aim of this project was to improve the quality of assessment in Biochemistry and Molecular Biology undergraduate education by promoting high quality assessment. Materials and methods: To do this we developed and trialled an audit tool for mapping assessment practices. The audit tool was designed to gather data on current assessment practices and identify areas of good practice, in which assessment aligned with the learning objectives, and areas in need of improvement. This evidence base will then be used to drive change in assessment. Results and conclusions: Using the assessment mapping tool we have mapped the assessment regime in a Biochemistry and Molecular Biology major at Monash University. Criteria used included assessment type, format, timing, assessors, provision of feedback, level of learning (Bloom's), and approaches taken to planning assessment. We have mapped the assessment of content and the systematic development of higher order learning and skills progression throughout the program of study. The data have enabled us to examine assessment at the unit (course) level as well as its vertical development across the major. This information is now being used to inform a review of the units and the major.

  6. Effects of Uncertainties in Lead Cross Section Data in Analysis of Lead Cooled and Reflected Reactors

    International Nuclear Information System (INIS)

    There are numerous uncertainties in the analysis of innovative reactor designs, arising from approximations used in the solution of the transport equation, and in nuclear data processing and cross section library generation. This paper describes: the problems encountered in the analysis of lead cooled and reflected reactors; the new cross section data libraries developed to overcome these problems; and applications of these new data libraries to the Encapsulated Nuclear Heat Source (ENHS) core benchmark analysis. The ENHS is a new lead-bismuth or lead cooled novel reactor concept that is fuelled with a metallic alloy of Pu, U and Zr, and is designed to operate for 20 effective full power years without refuelling and with a very small burnup reactivity swing. The computational tool benchmarked is MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code with MCNP data libraries based on the newest evaluations. (author)

  7. The Quantum Approach Leading from Evolutionary to Exhaustive Optimization

    Directory of Open Access Journals (Sweden)

    Giuseppe Martinelli

    2012-01-01

    Bio-inspired algorithms mimic natural mechanisms that govern the macroscopic world in order to optimize actual performances of vital importance. Neural and neurofuzzy networks, genetic, swarm-intelligence and other evolutionary algorithms are well-known results of this imitation. A completely different situation characterizes the microscopic world governed by quantum mechanics: all the possible solutions exist simultaneously in superposition and the problem is to extract the optimal one. In this case, basic mechanisms of quantum mechanics, i.e. superposition and entanglement, are necessary to mimic nature. Following the latter approach, this paper proposes a quantum architecture for determining the maximum/minimum in a set of positive integers, which is a basic problem related to optimization. The proposed architecture is based on a suitable nonlinear quantum operator and solves the problem by an exhaustive search. This is illustrated in detail for a typical NP-complete problem.

  8. A noninvasive isotopic approach to estimate the bone lead contribution to blood in children: Implications for assessing the efficacy of lead abatement

    OpenAIRE

    Gwiazda, Roberto H; C. Campbell; Smith, D.

    2005-01-01

    Lead hazard control measures to reduce children's exposure to household lead sources often result in only limited reductions in blood lead levels. This may be due to incomplete remediation of lead sources and/or to the remobilization of lead stores from bone, which may act as an endogenous lead source that buffers reductions in blood lead levels. Here we present a noninvasive isotopic approach to estimate the magnitude of the bone lead contribution to blood in children following household lea...

  9. Real Analysis A Historical Approach

    CERN Document Server

    Stahl, Saul

    2011-01-01

    A provocative look at the tools and history of real analysis. This new edition of Real Analysis: A Historical Approach continues to serve as an interesting read for students of analysis. Combining historical coverage with a superb introductory treatment, this book helps readers easily make the transition from concrete to abstract ideas. The book begins with an exciting sampling of classic and famous problems first posed by some of the greatest mathematicians of all time. Archimedes, Fermat, Newton, and Euler are each summoned in turn, illuminating the utility of infinite, power, and trigonome

  10. Petri Net approach for a Lead-cooled Fast Reactor startup design

    International Nuclear Information System (INIS)

    A preliminary approach to the development of the startup mode design for the Advanced Lead Fast Reactor European Demonstrator (ALFRED), currently under development within the European FP7 LEADER (Lead-cooled European Advanced Demonstration Reactor) Project, has been undertaken. Reactor startup is the operational mode in which all the operating systems of the plant are brought from the cold shutdown condition to the full power operating status, close to power-frequency control. In this phase the working conditions change radically, and it is fundamental that the various control actions be properly coordinated. These aspects deserve particular attention in a new generation reactor, whose management is not fully defined and for which the control strategy has yet to be finalized. In developing the ALFRED reactor control system, it is therefore necessary to provide an adequate formalization of the sequence of control actions to be performed. For this purpose, a Petri net approach has been employed in this work, since it is a useful formalism for modelling and analysing discrete event systems and allows identification of the events coming from the plant that enable the switches among the several feedback controllers. In the first part of the paper, the issues characterizing the startup mode are described and some solutions to bring the reactor to full power mode (fulfilling the technological constraints) are provided. Finally, the results of the simulations related to the ALFRED output variables and control variables are discussed. (author)
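
    As a reminder of the formalism, a Petri net is a set of places holding tokens and transitions that fire when all their input places are marked, which is how plant events can be made to enable controller switches. A toy sketch follows; the places and transitions are hypothetical and are not the ALFRED startup model.

```python
# Minimal Petri net sketch: places hold tokens, a transition fires only when
# every one of its input places is marked (hypothetical startup-like example).
marking = {"cold_shutdown": 1, "primary_pump_ready": 0, "power_mode": 0}

transitions = {
    "start_pumps": {"inputs": ["cold_shutdown"], "outputs": ["primary_pump_ready"]},
    "raise_power": {"inputs": ["primary_pump_ready"], "outputs": ["power_mode"]},
}

def enabled(name: str) -> bool:
    return all(marking[p] > 0 for p in transitions[name]["inputs"])

def fire(name: str) -> None:
    """Consume one token per input place and produce one per output place."""
    assert enabled(name), f"transition {name} is not enabled"
    for p in transitions[name]["inputs"]:
        marking[p] -= 1
    for p in transitions[name]["outputs"]:
        marking[p] += 1

fire("start_pumps")   # a plant event enables the next controller switch
fire("raise_power")
print(marking)        # {'cold_shutdown': 0, 'primary_pump_ready': 0, 'power_mode': 1}
```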

  11. Analysis of the leading tourism journals 1999-2008

    OpenAIRE

    Wickham, Mark; Dunn, Alison

    2011-01-01

    The purpose of this article is to examine the research themes, method and outcome trends that have been published in the three leading tourism journals from 1999 to 2008. This study builds upon previous research relating to tourism publications throughout the 1980s and 1990s, but includes analysis of adopted methodology and practical versus theoretical implications. This study involved a content analysis of 1584 articles published between 1999 and 2008 in the three most prominent tourism j...

  12. Neutron Activation Analysis of Lead Halide Pollution Aerosols

    International Nuclear Information System (INIS)

    Iodine, bromine and chlorine have been determined by neutron activation analysis in atmospheric samples of both natural and pollution origin, and a comparison of the two sources provides the basis of a technique, described in this paper, for determining the composition and possible source of lead halide pollution aerosols. The activation analysis procedure consists of reactor neutron irradiation of aqueous samples and comparators for 20 min, followed by radiochemical separation of iodine, bromine and chlorine and automatic counting of the beta radioactivity from solid silver halide sources. Determination of lead by anodic stripping voltammetry (inverse polarography) consists of deposition of Pb++ from the solution onto a composite paraffin-impregnated graphite and mercury electrode at -1.00 V versus the standard calomel electrode, followed by stripping as the potential is increased continuously. A significant question of public health interest in the air chemistry of lead is the source of the lead. Ethyl fluid, a mixture of organic lead, bromine and chlorine compounds, burns to form inorganic lead halide particles with Cl/Pb = 0.34 and Br/Pb = 0.39 by weight. In Cambridge, Massachusetts, analyses of cascade impactor aerosols were compared with similarly collected samples from the unpolluted air of Hawaii. The pollution bromine component ranged from 0.4 to 0.1 or less of the lead concentration, indicating in most cases either automotive lead with a bromine deficiency or a mixture of lead from automotive and other sources. In Fairbanks, Alaska, during winter, atmospheric conditions favour high local concentrations of air pollutants. Aerosols collected on Millipore filters show that pollution chlorine averages very nearly the value predicted from the observed lead and the known composition of ethyl fluid, and an automotive source for both chlorine and lead is strongly indicated. Pollution bromine, however, was less than predicted, and the bromine deficiency was about
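
    The comparison in this record rests on simple ratio arithmetic: given a measured airborne lead concentration, the halide levels expected if all of the lead came from ethyl fluid follow from the quoted mass ratios. A sketch is given below; the measured concentrations are hypothetical.

```python
# Mass ratios for inorganic lead halides from ethyl fluid combustion, as quoted above.
CL_TO_PB = 0.34   # chlorine / lead, by weight
BR_TO_PB = 0.39   # bromine / lead, by weight

measured_pb_ug_m3 = 1.2    # hypothetical airborne lead concentration (ug/m3)
measured_br_ug_m3 = 0.25   # hypothetical measured bromine (ug/m3)

expected_cl = CL_TO_PB * measured_pb_ug_m3
expected_br = BR_TO_PB * measured_pb_ug_m3

# A bromine deficit relative to the expected value suggests either bromine loss
# from the aerosol or a non-automotive contribution to the lead.
print(f"expected Cl: {expected_cl:.2f} ug/m3, expected Br: {expected_br:.2f} ug/m3")
print(f"bromine deficit: {expected_br - measured_br_ug_m3:.2f} ug/m3")
```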

  13. Evaluation of the dynamic impacts of customer centered lead time reduction improvements on customer-oriented and financial performance: a hybrid approach of system dynamics and queuing network analysis

    OpenAIRE

    Alp, Arda; Reiner, Gerald

    2014-01-01

    Motivated by the strategic importance of reduced lead times in today’s competitive business environment, this doctoral dissertation analyzes the dynamic impacts of lead time reduction (LTR) improvements on customer satisfaction and related financial performance metrics. The core thesis is centered on development of an integrated dynamic performance measurement framework which covers operational, customer-oriented and financial performance dependencies over time. The framework is demonstrated ...

  14. Content Analysis Customer Education on Websites of Leading Financial Companies

    OpenAIRE

    Hoang, Hong

    2008-01-01

    This research examines how the web sites of leading financial companies, chosen from Fortune's 2008 list of the world's top 500 companies, implement customer education. Using content analysis, this research focuses on five main categories in order to evaluate the customer education programs on the web sites of these samples. The five categories consist of identification of target customers, features of customer education, information content, method...

  15. Lead shielded cells for the spectrographic analysis of radioisotope solutions

    International Nuclear Information System (INIS)

    Two lead shielded cells for the spectrochemical analysis of radioisotope samples are described. One is devoted to the evaporation of samples before excitation and the other contains a suitable spectrographic excitation stand for the copper spark technique. A special device allows the excitation cell to be easily displaced on wheels and rails for accurate and reproducible positioning, as well as its replacement by a glove box for plutonium analysis. In order to guarantee safety, the room in which the spectrograph and the source are set up is separated from the active laboratory by a wall with a suitable window. (Author) 1 refs

  16. Analysis of natural radionuclides and lead in foods and diets

    International Nuclear Information System (INIS)

    The main purpose of the present study was to determine the lead-210, polonium-210 and lead concentrations in foods and diets. Consumption of food is generally the main route by which radionuclides enter the human organism. The precision and accuracy of the methods developed were verified by the analysis of reference materials from the International Atomic Energy Agency (IAEA). The method for polonium-210 analysis consisted of sample dissolution in a microwave digester (open system) employing concentrated nitric acid and hydrogen peroxide, evaporation to near dryness, addition of hydrochloric acid, polonium deposition onto a silver disc for six hours and counting by alpha spectrometry. Lead was analysed by atomic absorption spectrometry. After sample dissolution in a microwave digester (using concentrated nitric acid and hydrogen peroxide) and dilution to 50 ml, 20 μl of the sample was injected into a pyrolytic graphite furnace atomic absorption spectrophotometer equipped with Zeeman background correction. The assessment of these contaminants in foods and diets allowed the intakes of the elements to be estimated and, for the radionuclides, the radiation doses to which the selected individuals were exposed through food consumption to be evaluated. The effective dose due to lead-210 intake from the diets ranged from 1.3 to 4.3 μSv/year, corresponding to 25% of that resulting from polonium-210 intake. The dose due to both natural radionuclides varied from 6.8 to 23.0 μSv/year. These values are in good agreement with literature data and with the value of 60 μSv estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR, 1993), and are lower than the 0.02 Sv limit established by the ICRP (1980). The lead levels found in the majority of the Brazilian foods are in good agreement with the values published by CONAT and FAO/WHO. However, some foods such as beans, potatoes, papayas, apples and rice present levels above the values recommended by the Public

  17. Corrosion by liquid lead and lead-bismuth: experimental results review and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jinsuo [Los Alamos National Laboratory

    2008-01-01

    Liquid metal technologies for liquid lead and lead-bismuth alloy are under wide investigation and development for advanced nuclear energy systems and waste transmutation systems. Material corrosion is one of the main issues studied recently in the development of liquid metal technology. This study reviews corrosion by liquid lead and lead-bismuth, including the corrosion mechanisms, corrosion inhibitors and the formation of the protective oxide layer. The available experimental data are analyzed using a corrosion model in which oxidation and scale removal are coupled. Based on the model, the long-term behavior of steels in liquid lead and lead-bismuth is predictable. This report provides information for the selection of structural materials for typical nuclear reactor coolant systems that use liquid lead or lead-bismuth as the heat transfer medium.
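
    A corrosion model in which oxide growth and scale removal are coupled is typically written as a paralinear rate law; a schematic form (symbols assumed here, not quoted from the report) is:

```latex
% Schematic paralinear law: parabolic oxide growth coupled with linear scale removal
\frac{d\delta}{dt} \;=\; \frac{K_{p}}{2\,\delta} \;-\; K_{r},
\qquad
\delta_{\infty} \;=\; \frac{K_{p}}{2\,K_{r}},
```

    where delta is the oxide-layer thickness, K_p a parabolic oxidation constant and K_r the rate at which the scale is removed by the flowing lead or lead-bismuth; the approach to the steady thickness delta_infinity is what makes long-term predictions of the kind mentioned above possible.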

  18. Information-theoretic approach to lead-lag effect on financial markets

    OpenAIRE

    Paweł Fiedor

    2014-01-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships between financial instruments to the analysis of more meaningful asynchronous relationships. Both of these analyses concentrate only on Pearson's correlation coefficient and thus on the intraday lead-lag relationships associated with it. Under the Efficient Market Hypothesis such relationships are not possible, as all information is embedded in the prices. In this paper we analyse lead-lag relationships of f...
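
    The contrast drawn above, between Pearson-correlation-based lead-lag analysis and an information-theoretic one, can be illustrated by estimating a lagged dependence both ways on synthetic return series. The histogram-based mutual information below is purely illustrative and is not the estimator used in the paper.

```python
import numpy as np

def lagged_pearson(x, y, lag):
    """Pearson correlation between x(t) and y(t + lag)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

def lagged_mutual_info(x, y, lag, bins=16):
    """Plug-in (histogram) estimate of mutual information between x(t) and y(t + lag), in bits."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic example: instrument y follows instrument x with a 2-step lag plus noise.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = np.roll(x, 2) + 0.5 * rng.standard_normal(5000)

for lag in range(4):
    print(lag, round(lagged_pearson(x, y, lag), 3), round(lagged_mutual_info(x, y, lag), 3))
```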

  19. Uncertainties in analysis of innovative lead-cooled fast reactors

    International Nuclear Information System (INIS)

    There are numerous uncertainties in the analysis of innovative reactor designs, arising from approximations used in the solution of the transport equation, and in nuclear data processing and cross section library generation. This paper describes the problems encountered in the analysis of the Encapsulated Nuclear Heat Source (ENHS) core benchmark and the new cross section libraries developed to overcome these problems. The ENHS is a new lead-bismuth or lead cooled novel reactor concept that is fuelled with a metallic alloy of Pu, U and Zr, and is designed to operate for 20 effective full power years without refuelling and with a very small burnup reactivity swing. The computational tools benchmarked include MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code with MCNP data libraries based on ENDF/B-VI evaluations, and KWO2, a coupled KENO-V.a and ORIGEN2.1 code with an ENDF/B-V.2 based 238-group library. Uncertainties in the cross sections of lead were found to be particularly large and deserve careful evaluation. (author)

  20. Corrections to the leading eikonal amplitude for high-energy scattering and quasipotential approach

    International Nuclear Information System (INIS)

    The asymptotic behaviour of the scattering amplitude for two scalar particles at high energy and fixed momentum transfer is reconsidered in quantum field theory. In the framework of the quasipotential approach and the modified perturbation theory, a systematic scheme for finding the leading eikonal scattering amplitude and its corrections is developed. The connection between the solutions obtained by the quasipotential and functional approaches is also discussed. (author)
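
    For reference, the leading eikonal amplitude that the corrections are built on has the standard impact-parameter representation sketched below; normalization conventions vary between papers and the notation here is not quoted from this one.

```latex
% Schematic eikonal representation of the high-energy, fixed-t amplitude
T(s,t) \;\simeq\; 2is \int d^{2}b \; e^{\,i\,\mathbf{q}\cdot\mathbf{b}}
  \left[ 1 - e^{\,i\chi(s,\mathbf{b})} \right],
\qquad t = -\mathbf{q}^{2},
```

    where chi(s,b) is the eikonal phase generated by the quasipotential; the corrections discussed in the abstract modify this leading form.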

  1. A new approach to evaluate the leading hadronic corrections to the muon g-2

    Directory of Open Access Journals (Sweden)

    C.M. Carloni Calame

    2015-06-01

    We propose a novel approach to determine the leading hadronic corrections to the muon g-2. It consists in a measurement of the effective electromagnetic coupling in the space-like region extracted from Bhabha scattering data. We argue that this new method may become feasible at flavor factories, resulting in an alternative determination potentially competitive with the accuracy of the present results obtained with the dispersive approach via time-like data.
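
    The proposal relies on the space-like representation of the leading-order hadronic contribution, which in the form usually quoted (conventions may differ slightly from the paper) reads:

```latex
% Space-like (t < 0) representation of the leading-order hadronic contribution
a_{\mu}^{\mathrm{HLO}} \;=\; \frac{\alpha}{\pi}\int_{0}^{1} dx\,(1-x)\,
  \Delta\alpha_{\mathrm{had}}\!\left[t(x)\right],
\qquad
t(x) \;=\; -\,\frac{x^{2} m_{\mu}^{2}}{1-x} \;<\; 0,
```

    where Delta-alpha_had(t) is the hadronic contribution to the running of the electromagnetic coupling in the space-like region, the quantity the authors propose to extract from Bhabha scattering data.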

  2. Evaluation of Lead Release in a Simulated Lead-Free Premise Plumbing System Using a Sequential Sampling Approach.

    Science.gov (United States)

    Ng, Ding-Quan; Lin, Yi-Pin

    2016-01-01

    In this pilot study, a modified sampling protocol was evaluated for the detection of lead contamination and locating the source of lead release in a simulated premise plumbing system with one-, three- and seven-day stagnation for a total period of 475 days. Copper pipes, stainless steel taps and brass fittings were used to assemble the "lead-free" system. Sequential sampling using 100 mL was used to detect lead contamination while that using 50 mL was used to locate the lead source. Elevated lead levels, far exceeding the World Health Organization (WHO) guideline value of 10 µg·L(-1), persisted for as long as five months in the system. "Lead-free" brass fittings were identified as the source of lead contamination. Physical disturbances, such as renovation works, could cause short-term spikes in lead release. Orthophosphate was able to suppress total lead levels below 10 µg·L(-1), but caused "blue water" problems. When orthophosphate addition was ceased, total lead levels began to spike within one week, implying that a continuous supply of orthophosphate was required to control total lead levels. Occasional total lead spikes were observed in one-day stagnation samples throughout the course of the experiments. PMID:26927154

  3. Evaluation of Lead Release in a Simulated Lead-Free Premise Plumbing System Using a Sequential Sampling Approach

    Science.gov (United States)

    Ng, Ding-Quan; Lin, Yi-Pin

    2016-01-01

    In this pilot study, a modified sampling protocol was evaluated for the detection of lead contamination and locating the source of lead release in a simulated premise plumbing system with one-, three- and seven-day stagnation for a total period of 475 days. Copper pipes, stainless steel taps and brass fittings were used to assemble the “lead-free” system. Sequential sampling using 100 mL was used to detect lead contamination while that using 50 mL was used to locate the lead source. Elevated lead levels, far exceeding the World Health Organization (WHO) guideline value of 10 µg·L−1, persisted for as long as five months in the system. “Lead-free” brass fittings were identified as the source of lead contamination. Physical disturbances, such as renovation works, could cause short-term spikes in lead release. Orthophosphate was able to suppress total lead levels below 10 µg·L−1, but caused “blue water” problems. When orthophosphate addition was ceased, total lead levels began to spike within one week, implying that a continuous supply of orthophosphate was required to control total lead levels. Occasional total lead spikes were observed in one-day stagnation samples throughout the course of the experiments. PMID:26927154

  4. Early results from a systems approach to improving the performance and lifetime of lead acid batteries

    Science.gov (United States)

    Kellaway, M. J.; Jennings, P.; Stone, D.; Crowe, E.; Cooper, A.

    Lead acid batteries offer important advantages in respect of unit cost and ease of recycling. They also have good power and low temperature performance. However, for hybrid electric vehicle (HEV) duty with their extreme rates and continuous PSoC operation, improvements are required to significantly extend service life. The Reliable Highly Optimised Lead Acid Battery (RHOLAB) project is taking a radical approach to the design of a lead acid HEV battery pack to address this issue, taking a systems approach to produce a complete pack that is attractive to vehicle manufacturers. This paper describes the project at an intermediate stage where some testing has been completed and the construction of the complete pack system is well under way.

  5. Detection and cellular localization of lead by electron probe analysis in the diagnosis of suspected lead poisoning in rhesus monkeys

    International Nuclear Information System (INIS)

    Lead poisoning of unknown source was diagnosed histologically in 2 rhesus monkeys (Macaca mulatta) by finding acid-fast intranuclear inclusion bodies in the epithelial cells of renal cortical tubules. The presence of lead in the inclusions was determined by scanning electron microscopy/energy dispersive x-ray analysis using sections from paraffin embedded tissues. This observation indicates the usefulness of this technique for the detection and cellular localization of lead in tissues, even from archival material

  6. Identification of sources of lead exposure in French children by lead isotope analysis: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Lucas Jean-Paul

    2011-08-01

    Background: The amount of lead in the environment has decreased significantly in recent years, and so has exposure. However, there is no known safe exposure level and, therefore, the exposure of children to lead, although low, remains a major public health issue. With the lower levels of exposure, it is becoming more difficult to identify lead sources, and new approaches may be required for preventive action. This study assessed the usefulness of lead isotope ratios for identifying sources of lead, using data from a nationwide sample of French children aged from six months to six years with blood lead levels ≥25 μg/L. Methods: Blood samples were taken from 125 children, representing about 600,000 French children; environmental samples were taken from their homes and personal information was collected. Lead isotope ratios were determined using quadrupole ICP-MS (inductively coupled plasma mass spectrometry) and the isotopic signatures of potential sources of exposure were matched with those of blood in order to identify the most likely sources. Results: In addition to the interpretation of lead concentrations, lead isotope ratios were potentially of use for 57% of children aged from six months to six years with blood lead levels ≥25 μg/L (7% of all children in France, about 332,000 children) with at least one potential source of lead and sufficiently well discriminated lead isotope ratios. Lead isotope ratios revealed a single suspected source of exposure for 32% of the subjects and were able to eliminate at least one unlikely source of exposure for 30% of the children. Conclusions: In France, lead isotope ratios could provide valuable additional information in about a third of routine environmental investigations.
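
    Matching a child's blood isotopic signature against candidate household sources, as described above, amounts to a nearest-signature comparison in isotope-ratio space. The toy sketch below illustrates the idea; the ratio values are invented and a real investigation would compare differences against measurement uncertainties.

```python
import numpy as np

# Toy isotope signatures: (206Pb/207Pb, 208Pb/206Pb) pairs -- invented values.
blood = np.array([1.154, 2.102])
candidate_sources = {
    "old paint":  np.array([1.155, 2.100]),
    "tap water":  np.array([1.170, 2.085]),
    "house dust": np.array([1.148, 2.110]),
}

# Rank candidate sources by distance to the blood signature; sources far outside
# the measurement uncertainty could be ruled out, close ones flagged as suspects.
ranked = sorted(candidate_sources.items(), key=lambda kv: np.linalg.norm(kv[1] - blood))
for name, ratios in ranked:
    print(f"{name:10s} distance = {np.linalg.norm(ratios - blood):.4f}")
```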

  7. Forecasting UK Real Estate Cycle Phases With Leading Indicators: A Probit Approach

    OpenAIRE

    Kyrstaloyianni, A.; Matysiak, George; S Tsolacos

    2004-01-01

    This paper examines the significance of widely used leading indicators of the UK economy for predicting the cyclical pattern of commercial real estate performance. The analysis uses monthly capital value data for UK industrials, offices and retail from the Investment Property Databank (IPD). Prospective economic indicators are drawn from three sources, namely the series used by the US Conference Board to construct their UK leading indicator and the series deployed by two private organisations...

  8. A Core Design Approach Aimed at the Sustainability and Intrinsic Safety of the European Lead-Cooled Fast Reactor

    International Nuclear Information System (INIS)

    Among the Generation-IV fast reactor technologies, a Lead-cooled Fast Reactor concept is currently under development in Europe as a potential candidate for deployment to meet the long-term objectives of European energy policies. Within the Lead-cooled European Advanced DEmonstration Reactor (LEADER) project, co-financed by the European Union within the 7th EURATOM Framework Programme, the conceptual design of the reference Generation-IV European LFR (ELFR) industrial plant was developed, benefiting from and further optimizing the concept put forward during the ELSY 6th EURATOM Framework Programme project. In order to embed the safety and sustainability goals in the design in the most effective way, an innovative, dedicated design approach was developed and applied to the design of the ELFR fuel pins, fuel assemblies and core. This new approach, together with the main analysis results supporting the design of the reference ELFR configuration, is presented and discussed in detail. (author)

  9. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation is...

  10. Strangeness $S=-1$ hyperon-nucleon scattering at leading order in the covariant Weinberg's approach

    CERN Document Server

    Li, Kai-Wen; Geng, Li-Sheng

    2016-01-01

    Inspired by the success of covariant baryon chiral perturbation theory in the one baryon sector and in the heavy-light systems, we explore the relevance of relativistic effects in the construction of the strangeness $S=-1$ hyperon-nucleon interaction using chiral perturbation theory. Due to the non-perturbative nature of the hyperon-nucleon interaction, we follow the covariant Weinberg's approach recently proposed by Epelbaum and Gegelia to sum the leading order chiral potential using the Kadyshevsky equation (Epelbaum, 2012) in this exploratory work. By fitting the five low-energy constants to available experimental data, we find that the cutoff dependence is mitigated compared with the results obtained in the Weinberg's approach for both partial wave phase shifts and the description of experimental data. Nevertheless, at leading order, the description of experimental data remains quantitatively similar. We discuss in detail the cutoff dependence of the partial wave phase shifts and cross sections in the Wei...

  11. A derivative-based approach for the leading order hadronic contribution to $g_\\mu-2$

    CERN Document Server

    Gregory, Eric B

    2015-01-01

    We describe a lattice approach to calculating the leading-order hadronic contribution to the anomalous magnetic moment of the muon. We employ lattice momentum derivatives, in both the spatial and temporal directions, to determine the hadronic vacuum polarization scalar at low momenta and construct a smooth, integrable function in this momentum region. The method is tested on one hex-smeared Wilson-quark lattice ensemble with physical pion masses.

  12. Nuclear shadowing in deep inelastic scattering on nuclei: leading twist versus eikonal approaches

    International Nuclear Information System (INIS)

    We use several diverse parameterizations of diffractive parton distributions, extracted in leading twist QCD analyses of the HERA diffractive deep inelastic scattering (DIS) data, to make predictions for leading twist nuclear shadowing of nuclear quark and gluon distributions in DIS on nuclei. We find that the HERA diffractive data are sufficiently precise to allow us to predict large nuclear shadowing for gluons and quarks, unambiguously. We performed detailed studies of nuclear shadowing for up and charm sea quarks and gluons within several scenarios of shadowing and diffractive slopes, as well as at central impact parameters. We compare these leading twist results with those obtained from the eikonal approach to nuclear shadowing (which is based on a very different space-time picture) and observe sharply contrasting predictions for the size and Q2-dependence of nuclear shadowing. The most striking differences arise for the interaction of small dipoles with nuclei, in particular for the longitudinal structure function FLA. (author)

  13. STRUCTURAL ANALYSIS OF ANNEALED LEAD PHTHALOCYANINE THIN FILMS

    Directory of Open Access Journals (Sweden)

    P.Kalugasalam

    2010-07-01

    Full Text Available Thin films of Lead Phthalocyanine (PbPc) on glass substrates were prepared by vacuum deposition. The thicknesses of the films on glass substrates were 150 nm, 300 nm and 450 nm, and a 150 nm film was also deposited on a KCl substrate. The 450 nm sample was annealed at 323 K and 373 K. Diffraction, which may involve X-rays or electrons, is one of the most powerful methods for studying the structure of materials. The relative ease and convenience, large diffraction angle, representation of the average crystalline lattice throughout the film and simultaneous display of the diffraction pattern from the film make the XRD method a successful analytical technique for the study of thin films. The samples were analysed by X-ray diffraction (XRD) to obtain the structural analysis of the PbPc thin films. The XRD patterns show that the thicker PbPc films contain triclinic (T) grains along with the monoclinic (M) form of PbPc. When the sample is annealed at 323 K and 373 K, the film shows peaks that are assigned to the triclinic phase.

  14. Gamma radiation shielding analysis of lead-flyash concretes

    International Nuclear Information System (INIS)

    Six samples of lead-flyash concrete were prepared with lead as an admixture and by varying flyash content – 0%, 20%, 30%, 40%, 50% and 60% (by weight) by replacing cement and keeping constant w/c ratio. Different gamma radiation interaction parameters used for radiation shielding design were computed theoretically and measured experimentally at 662 keV, 1173 keV and 1332 keV gamma radiation energy using narrow transmission geometry. The obtained results were compared with ordinary-flyash concretes. The radiation exposure rate of gamma radiation sources used was determined with and without lead-flyash concretes. - Highlights: • Concrete samples with lead as admixture were casted with flyash replacing 0%, 20%, 30%, 40%, 50% and 60% of cement content (by weight). • Gamma radiation shielding parameters of concretes for different gamma ray sources were measured. • The attenuation results of lead-flyash concretes were compared with the results of ordinary flyash concretes
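    The shielding parameters above follow from narrow-beam transmission via the Beer-Lambert law, I = I0·exp(-μx). As a minimal illustration only (the count rates and slab thickness below are invented, not values from the study), the linear attenuation coefficient and half-value layer can be recovered from a single transmission measurement:

```python
import numpy as np

def attenuation_parameters(I0, I, thickness_cm):
    """Estimate narrow-beam gamma attenuation parameters.

    I0, I        -- count rates without and with the absorber (same units)
    thickness_cm -- absorber thickness in cm
    Returns (mu, hvl): linear attenuation coefficient (1/cm) and
    half-value layer (cm), from the Beer-Lambert law I = I0 * exp(-mu * x).
    """
    mu = np.log(I0 / I) / thickness_cm
    hvl = np.log(2.0) / mu
    return mu, hvl

# Hypothetical transmission measurement at 662 keV for a 5 cm concrete slab.
mu, hvl = attenuation_parameters(I0=12000.0, I=7100.0, thickness_cm=5.0)
print(f"mu = {mu:.4f} 1/cm, HVL = {hvl:.2f} cm")
```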

  15. Electrode alignment of transverse tripoles using a percutaneous triple-lead approach in spinal cord stimulation

    Science.gov (United States)

    Sankarasubramanian, V.; Buitenweg, J. R.; Holsheimer, J.; Veltink, P.

    2011-02-01

    The aim of this modeling study is to determine the influence of electrode alignment of transverse tripoles on the paresthesia coverage of the pain area in spinal cord stimulation, using a percutaneous triple-lead approach. Transverse tripoles, comprising a central cathode and two lateral anodes, were modeled on the low-thoracic vertebral region (T10-T12) using percutaneous triple-lead configurations, with the center lead on the spinal cord midline. The triple leads were oriented both aligned and staggered. In the staggered configuration, the anodes were offset either caudally (caudally staggered) or rostrally (rostrally staggered) with respect to the midline cathode. The transverse tripolar field steering with the aligned and staggered configurations enabled the estimation of dorsal column fiber thresholds (IDC) and dorsal root fiber thresholds (IDR) at various anodal current ratios. IDC and IDR were considerably higher for the aligned transverse tripoles as compared to the staggered transverse tripoles. The aligned transverse tripoles facilitated deeper penetration into the medial dorsal columns (DCs). The staggered transverse tripoles always enabled broad and bilateral DC activation, at the expense of mediolateral steerability. The largest DC recruited area was obtained with the rostrally staggered transverse tripole. Transverse tripolar geometries, using percutaneous leads, allow for selective targeting of either medial or lateral DC fibers, if and only if the transverse tripole is aligned. Steering of anodal currents between the lateral leads of the staggered transverse tripoles cannot target medially confined populations of DC fibers in the spinal cord. An aligned transverse tripolar configuration is strongly recommended, because of its ability to provide more post-operative flexibility than other configurations.

  16. Approaches to folyl polyglutamate analysis

    OpenAIRE

    Yang, Yingying

    2013-01-01

    The literature review presented the effects of the polyglutamate chain on the biological and nutritional properties of folates and the main methods used for folate assays, with a special emphasis on the approaches to studying intact polyglutamates. A brief introduction regarding safety aspects of folate fortification was also given. The aim of this study was to develop a UPLC-FLR/PDA method for simultaneous determination of polyglutamyl folate vitamers. Chromatographic conditions were opt...

  17. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    Full Text Available This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone’s commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementing theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  18. Effects of Uncertainties in Lead Cross Section Data in Monte Carlo Analysis of lead Cooled and reflected Reactors

    International Nuclear Information System (INIS)

    This paper describes the problems encountered in the analysis of the Encapsulated Nuclear Heat Source (ENHS) core benchmark and the new cross-section libraries developed to overcome these problems. The ENHS is a novel lead-bismuth- or lead-cooled reactor concept that is fuelled with a metallic alloy of Pu, U and Zr, and is designed to operate for 20 effective full power years without re-fuelling and with a very small burn-up reactivity swing. There are numerous uncertainties in the prediction of core parameters of this and other innovative reactor designs, arising from approximations used in the solution of the transport equation, in nuclear data processing and in cross-section library generation. In this paper we analyze the effects of uncertainties in lead cross-section data from several versions of ENDF, JENDL and JEFF for lead-cooled and reflected computational benchmarks. (author)

  19. Nuclear microprobe analysis of lead profile in crocodile bones

    International Nuclear Information System (INIS)

    Elevated concentrations of lead were found in Australian free ranging saltwater crocodile (Crocodylus porosus) bone and flesh. Lead shots were found as potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out if elevated Pb concentration remains in growth rings and if the concentration is correlated with the blood levels recorded at the time. Results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as good correlation with the level of lead concentration in blood. To investigate influence of ion species on detection limits measurements of the same sample were performed by using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak to background ratios, detection limits and the overall 'quality' of obtained spectra are compared and discussed

  20. Nuclear microprobe analysis of lead profile in crocodile bones

    Science.gov (United States)

    Orlic, I.; Siegele, R.; Hammerton, K.; Jeffree, R. A.; Cohen, D. D.

    2003-09-01

    Elevated concentrations of lead were found in Australian free ranging saltwater crocodile ( Crocodylus porosus) bone and flesh. Lead shots were found as potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out if elevated Pb concentration remains in growth rings and if the concentration is correlated with the blood levels recorded at the time. Results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as good correlation with the level of lead concentration in blood. To investigate influence of ion species on detection limits measurements of the same sample were performed by using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak to background ratios, detection limits and the overall 'quality' of obtained spectra are compared and discussed.

  1. Information-theoretic approach to lead-lag effect on financial markets

    Science.gov (United States)

    Fiedor, Paweł

    2014-08-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis are concentrated mostly on Pearson's correlation coefficient and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also associated with them. Under the Efficient-Market Hypothesis such relationships are not possible as all information is embedded in the prices, but in real markets we find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but also can lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level, but also for daily stock returns, which have usually been ignored.
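    A rough sketch of the kind of procedure described, assuming a simple histogram estimator of mutual information and synthetic return series (this is not the author's code, and the bin count is an arbitrary choice):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of mutual information (in nats)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def lagged_mi(returns_a, returns_b, lag):
    """MI between series A and series B shifted forward by `lag` steps."""
    if lag == 0:
        return mutual_information(returns_a, returns_b)
    return mutual_information(returns_a[:-lag], returns_b[lag:])

# Synthetic example: B follows A with a one-step lag plus noise.
rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = 0.6 * np.roll(a, 1) + rng.normal(scale=0.8, size=5000)
for lag in range(4):
    print(lag, round(lagged_mi(a, b, lag), 4))   # MI peaks at lag 1
```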

  2. Discovering lead-free perovskite solar materials with a split-anion approach

    Science.gov (United States)

    Sun, Yi-Yang; Shi, Jian; Lian, Jie; Gao, Weiwei; Agiorgousis, Michael L.; Zhang, Peihong; Zhang, Shengbai

    2016-03-01

    Organic-inorganic hybrid perovskite solar materials, being low-cost and high-performance, are promising for large-scale deployment of the photovoltaic technology. A key challenge that remains to be addressed is the toxicity of these materials since the high-efficiency solar cells are made of lead-containing materials, in particular, CH3NH3PbI3. Here, based on first-principles calculation, we search for lead-free perovskite materials based on the split-anion approach, where we replace Pb with non-toxic elements while introducing dual anions (i.e., splitting the anion sites) that preserve the charge neutrality. We show that CH3NH3BiSeI2 and CH3NH3BiSI2 exhibit improved band gaps and optical absorption over CH3NH3PbI3. The split-anion approach could also be applied to pure inorganic perovskites, significantly enlarging the pool of candidate materials in the design of low-cost, high-performance and environmentally-friendly perovskite solar materials.

  3. Contaminant source apportionment by PIMMS lead isotope analysis and SEM-image analysis.

    Science.gov (United States)

    McGill, R A; Pearce, J M; Fortey, N J; Watt, J; Ault, L; Parrish, R R

    2003-03-01

    By combining scanning electron microscopy (SEM) image analysis and laser ablation plasma ionisation multi-collector mass spectrometry (LA-PIMMS), high precision lead isotope analyses can be obtained from individual metal-rich particles. Soils from Wolverhampton and Nottingham were sampled on the basis of high Pb concentrations or brownfield location. Pressed powder pellets of each were rastered by LA-PIMMS to obtain a bulk Pb-isotope signature. The results plot along an apparent mixing line between the major sources of lead contamination in the UK, that is UK ore deposits and alkyl-lead from petrol additives (Australian ore). Two particularly lead-rich soils were chosen to investigate the lead distribution and isotope variability between size and density fractions. The fine-grained and low-density fractions contained most of the lead and have Pb-isotope ratios comparable with the bulk soils. By contrast, the small, lead-enriched denser fractions contained only a minor proportion of the total lead but Pb-isotope signatures indicating relative enrichment in one or other of the end-members from the mixing line. Further characterisation of individual Pb-rich grains is in progress. PMID:12901075

  4. Approaches to Sensitivity Analysis in MOLP

    Directory of Open Access Journals (Sweden)

    Sebastian Sitarz

    2014-02-01

    Full Text Available The paper presents two approaches to the sensitivity analysis in multi-objective linear programming (MOLP). The first one is the tolerance approach and the other one is the standard sensitivity analysis. We consider the perturbation of the objective function coefficients. In the tolerance method we simultaneously change all of the objective function coefficients. In the standard sensitivity analysis we change one objective function coefficient without changing the others. In the numerical example we compare the results obtained by using these two different approaches.

  5. Classifying Enterprise Architecture Analysis Approaches

    Science.gov (United States)

    Buckl, Sabine; Matthes, Florian; Schweda, Christian M.

    Enterprise architecture (EA) management forms a commonly accepted means to enhance the alignment of business and IT, and to support the managed evolution of the enterprise. One major challenge of EA management is to provide decision support by analyzing as-is states of the architecture as well as assessing planned future states. Thus, different kinds of analysis regarding the EA exist, each relying on certain conditions and demands for models, methods, and techniques.

  6. Hybrid approaches for sentiment analysis

    OpenAIRE

    Wiegand, Michael

    2011-01-01

    Sentiment Analysis is the task of extracting and classifying opinionated content in natural language texts. Common subtasks are the distinction between opinionated and factual texts, the classification of polarity in opinionated texts, and the extraction of the participating entities of an opinion(-event), i.e. the source from which an opinion emanates and the target towards which it is directed. With the emerging Web 2.0 which describes the shift towards a highly user-interactive communicati...

  7. The Nd Break-Up Process in Leading Order in a Three-Dimensional Approach

    CERN Document Server

    Fachuddin, I; Glöckle, W; Elster, Ch.

    2003-01-01

    A three-dimensional approach based on momentum vectors as variables for solving the three nucleon Faddeev equation in first order is presented. The nucleon-deuteron break-up amplitude is evaluated in leading order in the NN T-matrix, which is also generated directly in three dimensions avoiding a summation of partial wave contributions. A comparison of semi-exclusive observables in the $d(p,n)pp$ reaction calculated in this scheme with those generated by a traditional partial wave expansion shows perfect agreement at lower energies. At about 200 MeV nucleon laboratory energies deviations in the peak of the cross section appear, which may indicate that special care is required in a partial wave approach for energies at and higher than 200 MeV. The role of higher order rescattering processes beyond the leading order in the NN T-matrix is investigated with the result, that at 200 MeV rescattering still provides important contributions to the cross section and certain spin observables. The influence of a relativi...

  8. Nuclear microprobe analysis of lead profile in crocodile bones

    Energy Technology Data Exchange (ETDEWEB)

    Orlic, I. E-mail: ivo@ansto.gov.au; Siegele, R.; Hammerton, K.; Jeffree, R.A.; Cohen, D.D

    2003-09-01

    Elevated concentrations of lead were found in Australian free ranging saltwater crocodile (Crocodylus porosus) bone and flesh. Lead shots were found as potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out if elevated Pb concentration remains in growth rings and if the concentration is correlated with the blood levels recorded at the time. Results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as good correlation with the level of lead concentration in blood. To investigate influence of ion species on detection limits measurements of the same sample were performed by using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak to background ratios, detection limits and the overall 'quality' of obtained spectra are compared and discussed.

  9. Lead identification for the K-Ras protein: virtual screening and combinatorial fragment-based approaches

    Science.gov (United States)

    Pathan, Akbar Ali Khan; Panthi, Bhavana; Khan, Zahid; Koppula, Purushotham Reddy; Alanazi, Mohammed Saud; Sachchidanand; Parine, Narasimha Reddy; Chourasia, Mukesh

    2016-01-01

    Objective Kirsten rat sarcoma (K-Ras) protein is a member of Ras family belonging to the small guanosine triphosphatases superfamily. The members of this family share a conserved structure and biochemical properties, acting as binary molecular switches. The guanosine triphosphate-bound active K-Ras interacts with a range of effectors, resulting in the stimulation of downstream signaling pathways regulating cell proliferation, differentiation, and apoptosis. Efforts to target K-Ras have been unsuccessful until now, placing it among high-value molecules against which developing a therapy would have an enormous impact. K-Ras transduces signals when it binds to guanosine triphosphate by directly binding to downstream effector proteins, but in case of guanosine diphosphate-bound conformation, these interactions get disrupted. Methods In the present study, we targeted the nucleotide-binding site in the “on” and “off” state conformations of the K-Ras protein to find out suitable lead compounds. A structure-based virtual screening approach has been used to screen compounds from different databases, followed by a combinatorial fragment-based approach to design the apposite lead for the K-Ras protein. Results Interestingly, the designed compounds exhibit a binding preference for the “off” state over “on” state conformation of K-Ras protein. Moreover, the designed compounds’ interactions are similar to guanosine diphosphate and, thus, could presumably act as a potential lead for K-Ras. The predicted drug-likeness properties of these compounds suggest that these compounds follow the Lipinski’s rule of five and have tolerable absorption, distribution, metabolism, excretion and toxicity values. Conclusion Thus, through the current study, we propose targeting only “off” state conformations as a promising strategy for the design of reversible inhibitors to pharmacologically inhibit distinct conformations of K-Ras protein.
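    The drug-likeness screen mentioned above (Lipinski's rule of five) is easy to reproduce. The sketch below assumes the RDKit toolkit and uses aspirin's SMILES purely as a placeholder molecule; it is not the authors' screening pipeline:

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def lipinski_violations(smiles):
    """Count Lipinski rule-of-five violations for a molecule given as SMILES."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse SMILES: {smiles}")
    rules = [
        Descriptors.MolWt(mol) > 500,        # molecular weight
        Descriptors.MolLogP(mol) > 5,        # lipophilicity (cLogP)
        Lipinski.NumHDonors(mol) > 5,        # H-bond donors
        Lipinski.NumHAcceptors(mol) > 10,    # H-bond acceptors
    ]
    return sum(rules)

# Placeholder molecule (aspirin); a screening run would loop over a compound database.
print(lipinski_violations("CC(=O)Oc1ccccc1C(=O)O"))  # -> 0 violations
```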

  10. Differences between IC Analysis and TG Approach

    Institute of Scientific and Technical Information of China (English)

    张美玲

    2014-01-01

    Structuralism and the generative approach are two representative syntactic theories, which study language from different perspectives. They employ different methodologies, i.e. immediate constituent (IC) analysis and the transformational-generative (TG) approach, to carry out syntactic analysis. In this paper, these two methods are applied to the analysis of some sentences for further discussion and comparison. The analysis of examples shows that both methods have their merits and inadequacies. To some extent, the TG method can help IC analysis solve some problems. However, TG grammar is by no means complete and perfect. Improvements are needed to reach its ultimate goal of producing a universal grammar for all human languages.

  11. MODERN TERRORISM: CONCEPT AND APPROACH ANALYSIS

    OpenAIRE

    CHAIKA ALEXANDER VIKTOROVICH

    2015-01-01

    The problem of modern terrorism as a manifestation of a counterculture environment is considered. The concepts and approaches of foreign and domestic authors specializing in terrorism research are analysed. Individual features of modern terrorism are identified and emphasized. The author draws conceptual conclusions on the basis of a dialectical approach to the study of modern terrorism as a counterculture phenomenon.

  12. Document Analysis by Crosscount Approach

    Institute of Scientific and Technical Information of China (English)

    王海琴; 戴汝为

    1998-01-01

    In this paper a new feature for document analysis, called crosscount, is introduced. The crosscount feature is a function of the white line segments that start on the edge of document images. It reflects not only the contour of the image, but also the periodicity of white lines (background) and text lines in the document images. In complex printed-page layouts there are different blocks, such as textual, graphical and tabular ones. Of these blocks, the textual ones have the most obvious periodicity, with their homogeneous white lines arranged regularly. This important property of textual blocks can be extracted by crosscount functions. Here the document layouts are classified into three classes on the basis of their physical structures. Then the definition and properties of the crosscount function are described. According to the classification of document layouts, the application of this new feature to the analysis and understanding of different types of document images is discussed.
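    To make the idea concrete, the sketch below implements one plausible reading of a crosscount-style feature (an assumption for illustration, not the authors' exact definition): for each scan line of a binarized page, count the maximal white runs, giving a profile whose periodicity separates text blocks from blank regions:

```python
import numpy as np

def crosscount_profile(binary_image):
    """For each row of a binarized page (1 = white background, 0 = ink),
    count the number of maximal white runs; text rows alternate ink and
    background and therefore yield higher, roughly periodic counts.
    This is only an approximation of the crosscount idea, not the
    authors' exact definition."""
    img = np.asarray(binary_image, dtype=np.int8)
    # A white run starts where a pixel is white and its left neighbour is not
    # (or it is the first pixel of the row, i.e. a run starting on the edge).
    left = np.pad(img[:, :-1], ((0, 0), (1, 0)), constant_values=0)
    run_starts = (img == 1) & (left == 0)
    return run_starts.sum(axis=1)

# Tiny synthetic page: two "text-like" rows and one blank row.
page = np.array([
    [1, 0, 1, 1, 0, 1],   # text-like row -> 3 white runs
    [1, 1, 1, 1, 1, 1],   # blank row     -> 1 white run
    [0, 1, 0, 1, 0, 1],   # text-like row -> 3 white runs
])
print(crosscount_profile(page))  # [3 1 3]
```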

  13. ECG Signal Analysis: Different Approaches

    Directory of Open Access Journals (Sweden)

    S. Thulasi Prasad

    2014-01-01

    Full Text Available In recent years scientists and engineers have faced several challenges in solving biomedical problems, making digital signal processing an essential and effective pedagogical approach to the problem of detecting selected arrhythmia conditions from a patient’s electrocardiograph (ECG) signals. The detection of the QRS complex has many clinical applications, as it marks the beginning of the left ventricular contraction. Many possible heart malfunctions, such as cardiac arrhythmias, transient ischemic episodes and silent myocardial ischemia or failures, develop slowly and are best caught by monitoring the ECG signal in real time during normal activity. Introducing an efficient method for arrhythmia detection can be very useful for a better conceptual understanding of signal processing. In this paper, we discuss two methods to clean an ECG signal corrupted by noise and to extract the parameters required for detecting an arrhythmia condition: the Hilbert transform method and the filter bank method. These methods involve filtering techniques and algorithms for finding peaks and valleys, local maxima and minima, etc., to determine R peaks, R-R intervals and QRS complexes.
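    As an illustrative sketch of a Hilbert-transform-style R-peak detector (assuming SciPy, a 360 Hz sampling rate and thresholds chosen only for demonstration; this is not the specific method evaluated in the paper):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def detect_r_peaks(ecg, fs):
    """Rough R-peak detection: band-pass around the QRS band, take the
    Hilbert envelope of the differentiated signal, then pick peaks."""
    # Band-pass 5-15 Hz, the band where QRS energy is concentrated.
    b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    envelope = np.abs(hilbert(np.gradient(filtered)))
    threshold = 0.5 * envelope.max()
    peaks, _ = find_peaks(envelope, height=threshold, distance=int(0.25 * fs))
    return peaks

# Synthetic ECG-like signal: one spike per second on a noisy baseline at 360 Hz.
fs = 360
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * np.random.randn(t.size)
ecg[::fs] += 1.0
r_peaks = detect_r_peaks(ecg, fs)
rr_intervals = np.diff(r_peaks) / fs   # seconds between successive R peaks
print(len(r_peaks), rr_intervals[:3])
```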

  14. A supply sided analysis of leading MOOC platforms and universities

    Directory of Open Access Journals (Sweden)

    Georg Peters

    2016-03-01

    Full Text Available Investing in education is generally considered a promising strategy to fight poverty and increase prosperity. This applies to all levels of an economy, reaching from individuals to local communities and countries, and has a global perspective as well. However, high-quality education is often costly and not available anytime, anywhere. Therefore, any promising concept that might help to democratize education is worth pursuing, in the sense that it makes education accessible to everybody without restrictions. The characteristics attributed to MOOCs (Massive Open Online Courses) promise to contribute to this objective. Hence, our objective is to analyse MOOC as it currently operates. Obviously, there is a huge demand for free high-quality education anytime, anywhere, but a shortage on the supply side. We therefore concentrate on supply-side effects and study MOOC platforms as well as content providers, particularly universities. We focus our research on some of the leading platforms and universities worldwide. Relative to their size, Australia and the Netherlands are very active players in the MOOC sector. Germany is lagging behind, and leading universities in the UK seem virtually to refrain from offering MOOCs. Our research also shows the leading role of US universities and platform providers.

  15. Lead isotopic compositions of environmental certified reference materials for an inter-laboratory comparison of lead isotope analysis

    International Nuclear Information System (INIS)

    Lead isotope ratios, viz. 207Pb/206Pb and 208Pb/206Pb, of the commercially available certified reference materials (CRMs) issued in Japan are presented with the objective of providing a data set that will be useful for the quality assurance of analytical procedures, instrumental performance and method validation in laboratories involved in environmental lead isotope ratio analysis. The analytical method used in the present study was inductively coupled plasma quadrupole mass spectrometry (ICP-QMS), preceded by acid digestion and performed with and without chemical separation of lead from the matrix. The precision of the measurements, in terms of the relative standard deviation (RSD) of triplicate analyses, was 0.19% and 0.14% for 207Pb/206Pb and 208Pb/206Pb, respectively. The trueness of the lead isotope ratio measurements of the present study was tested with a few CRMs that have been analyzed by other analytical methods and reported in the literature. The lead isotope ratios of 18 environmental matrix CRMs (including 6 CRMs analyzed for our method validation) are presented and the distribution of their ratios is briefly discussed. (author)
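    A minimal sketch of the ratio and precision bookkeeping described above (the triplicate ICP-QMS intensities are invented numbers, not CRM data):

```python
import numpy as np

def isotope_ratios(i206, i207, i208):
    """Return (207Pb/206Pb, 208Pb/206Pb) from measured intensities."""
    return i207 / i206, i208 / i206

def relative_std(values):
    """Relative standard deviation (%) of replicate measurements."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical triplicate ICP-QMS intensities (arbitrary units).
runs = [
    (100000.0, 85100.0, 208900.0),
    (100400.0, 85420.0, 209600.0),
    (99800.0,  84950.0, 208300.0),
]
r207, r208 = zip(*(isotope_ratios(*run) for run in runs))
print("207/206:", np.mean(r207), "RSD %:", relative_std(r207))
print("208/206:", np.mean(r208), "RSD %:", relative_std(r208))
```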

  16. Analysis of Residential System Strategies Targeting Least-Cost Solutions Leading to Net Zero Energy Homes: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, R.; Christensen, C.; Horowitz, S.

    2006-04-01

    The U. S. Department of Energy's Building America residential systems research project uses an analysis-based system research approach to identify research priorities, identify technology gaps and opportunities, establish a consistent basis to track research progress, and identify system solutions that are most likely to succeed as the initial targets for residential system research projects. This report describes the analysis approach used by the program to determine the most cost-effective pathways to achieve whole-house energy-savings goals. This report also provides an overview of design/technology strategies leading to net zero energy buildings as the basis for analysis of future residential system performance.

  17. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation is...... performed by pre-concentrating zinc on glassy carbon at -1400 mV(SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found in the...

  18. From the ephemeral to the enduring: how approach-oriented mindsets lead to greater status.

    Science.gov (United States)

    Kilduff, Gavin J; Galinsky, Adam D

    2013-11-01

    We propose that the psychological states individuals bring into newly formed groups can produce meaningful differences in status attainment. Three experiments explored whether experimentally created approach-oriented mindsets affected status attainment in groups, both immediately and over time. We predicted that approach-oriented states would lead to greater status attainment by increasing proactive behavior. Furthermore, we hypothesized that these status gains would persist longitudinally, days after the original mindsets had dissipated, due to the self-reinforcing behavioral cycles the approach-oriented states initiated. In Experiment 1, individuals primed with a promotion focus achieved higher status in their newly formed groups, and this was mediated by proactive behavior as rated by themselves and their teammates. Experiment 2 was a longitudinal experiment and revealed that individuals primed with power achieved higher status, both immediately following the prime and when the groups were reassembled 2 days later to work on new tasks. These effects were mediated by independent coders' ratings of proactive behavior during the first few minutes of group interaction. Experiment 3 was another longitudinal experiment and revealed that priming happiness led to greater status as well as greater acquisition of material resources. Importantly, these immediate and longitudinal effects were independent of the effects of a number of stable dispositional traits. Our results establish that approach-oriented psychological states affect status attainment, over and above the more stable characteristics emphasized in prior research, and provide the most direct test yet of the self-reinforcing nature of status hierarchies. These findings depict a dynamic view of status organization in which the same group may organize itself differently depending on members' incoming psychological states. PMID:23895266

  19. Analysis and Data Mining of Lead-Zinc Ore Data

    OpenAIRE

    Zanev, Vladimir; Topalov, Stanislav; Christov, Veselin

    2013-01-01

    This paper presents the results of our data mining study of Pb-Zn (lead-zinc) ore assay records from a mine enterprise in Bulgaria. We examined the dataset, cleaned outliers, visualized the data, and created dataset statistics. A Pb-Zn cluster data mining model was created for segmentation and prediction of Pb-Zn ore assay data. The Pb-Zn cluster data model consists of five clusters and DMX queries. We analyzed the Pb-Zn cluster content, size, structure, and characteristics. The set of the DM...

  20. Environmental health risk assessment of ambient lead levels in Lisbon, Portugal: A full chain study approach

    DEFF Research Database (Denmark)

    Casimiro, E.; Philippe Ciffroy, P.; Serpa, P.;

    2011-01-01

    useful for integrated full-chain human health risk assessments. In this study we use a newly developed computational tool – the 2FUN player to conduct a full-chain assessment combining measured ambient air lead concentrations with multi-media modelling and PBPK simulations to estimate the health risks...... calculate the Pb levels in the various body systems. Our results showed a low health risk from Pb exposures. It also identified that ingestion of leafy vegetables (i.e. lettuce, cabbage, and spinach) and fruits contribute the most to total Pb blood levels. This full chain assessment approach of the 2FUN...... player is likely to be very useful for local health risk assessment studies (i.e. EIA and SEA studies)....

  1. The right side? Under time pressure, approach motivation leads to right-oriented bias.

    Science.gov (United States)

    Roskes, Marieke; Sligte, Daniel; Shalvi, Shaul; De Dreu, Carsten K W

    2011-11-01

    Approach motivation, a focus on achieving positive outcomes, is related to relative left-hemispheric brain activation, which translates to a variety of right-oriented behavioral biases. In two studies, we found that approach-motivated individuals display a right-oriented bias, but only when they are forced to act quickly. In a task in which they had to divide lines into two equal parts, approach-motivated individuals bisected the line at a point farther to the right than avoidance-motivated individuals did, but only when they worked under high time pressure. In our analysis of all Fédération Internationale de Football Association (FIFA) World Cup penalty shoot-outs, we found that goalkeepers were two times more likely to dive to the right than to the left when their team was behind, a situation that we conjecture induces approach motivation. Because penalty takers shot toward the two sides of the goal equally often, the goalkeepers' right-oriented bias was dysfunctional, allowing more goals to be scored. Directional biases may facilitate group coordination but prove maladaptive in individual settings and interpersonal competition. PMID:22006059

  2. Introduction to Real Analysis An Educational Approach

    CERN Document Server

    Bauldry, William C

    2011-01-01

    An accessible introduction to real analysis and its connection to elementary calculus Bridging the gap between the development and history of real analysis, Introduction to Real Analysis: An Educational Approach presents a comprehensive introduction to real analysis while also offering a survey of the field. With its balance of historical background, key calculus methods, and hands-on applications, this book provides readers with a solid foundation and fundamental understanding of real analysis. The book begins with an outline of basic calculus, including a close examination of problems illust

  3. Lead identification for the K-Ras protein: virtual screening and combinatorial fragment-based approaches

    Directory of Open Access Journals (Sweden)

    Pathan AAK

    2016-05-01

    Full Text Available Akbar Ali Khan Pathan,1,2,* Bhavana Panthi,3,* Zahid Khan,1 Purushotham Reddy Koppula,4–6 Mohammed Saud Alanazi,1 Sachchidanand,3 Narasimha Reddy Parine,1 Mukesh Chourasia3,* 1Genome Research Chair (GRC, Department of Biochemistry, College of Science, King Saud University, 2Integrated Gulf Biosystems, Riyadh, Kingdom of Saudi Arabia; 3Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, Hajipur, India; 4Department of Internal Medicine, School of Medicine, 5Harry S. Truman Memorial Veterans Affairs Hospital, 6Department of Radiology, School of Medicine, Columbia, MO, USA *These authors contributed equally to this work Objective: Kirsten rat sarcoma (K-Ras protein is a member of Ras family belonging to the small guanosine triphosphatases superfamily. The members of this family share a conserved structure and biochemical properties, acting as binary molecular switches. The guanosine triphosphate-bound active K-Ras interacts with a range of effectors, resulting in the stimulation of downstream signaling pathways regulating cell proliferation, differentiation, and apoptosis. Efforts to target K-Ras have been unsuccessful until now, placing it among high-value molecules against which developing a therapy would have an enormous impact. K-Ras transduces signals when it binds to guanosine triphosphate by directly binding to downstream effector proteins, but in case of guanosine diphosphate-bound conformation, these interactions get disrupted. Methods: In the present study, we targeted the nucleotide-binding site in the “on” and “off” state conformations of the K-Ras protein to find out suitable lead compounds. A structure-based virtual screening approach has been used to screen compounds from different databases, followed by a combinatorial fragment-based approach to design the apposite lead for the K-Ras protein. Results: Interestingly, the designed compounds exhibit a binding preference for the

  4. Analysis of proton-lead data via re-weighting

    Science.gov (United States)

    Zurita, P.

    2015-03-01

    The recent proton-lead run at the LHC will provide new information on the partonic behaviour within the nuclear medium. At LHC energies the dominant contribution comes from gluon-initiated processes, that is, from the least well constrained parton density. Therefore it is important to profit from any information that new data can provide us. A time-saving alternative to performing a global fit is the use of Bayesian inference, a powerful tool to assess the impact of data on a set of PDFs independently of the original fitters. In this work we apply the Bayesian re-weighting technique to analyze pseudo data for LHC kinematics in Drell-Yan and hadro-production processes. A set of Monte Carlo replicas for EPS09 is released in a public code for general use.
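    For orientation, the re-weighting step commonly applied to Monte Carlo PDF replicas assigns each replica $k$ a weight built from its $\chi^2_k$ against the $n$ new data points. A frequently quoted form from the PDF re-weighting literature (given here as background from memory, not taken from this contribution) is

    $$ w_k = \frac{(\chi^2_k)^{(n-1)/2}\, e^{-\chi^2_k/2}}{\frac{1}{N}\sum_{j=1}^{N} (\chi^2_j)^{(n-1)/2}\, e^{-\chi^2_j/2}}, \qquad \langle \mathcal{O} \rangle_{\mathrm{new}} = \frac{1}{N}\sum_{k=1}^{N} w_k\, \mathcal{O}[f_k], $$

    where $N$ is the number of replicas and $\mathcal{O}[f_k]$ is the observable computed with replica $k$.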

  5. The lead cooled fast reactor benchmark Brest-300: analysis with sensitivity method

    International Nuclear Information System (INIS)

    The lead-cooled fast reactor is one of the most interesting candidates for the development of atomic energy. BREST-300 is a 300 MWe lead-cooled fast reactor developed by NIKIET (Russia) with a deterministic safety approach which aims to exclude reactivity margins greater than the delayed neutron fraction. The development of innovative reactors (lead coolant, nitride fuel...) and of fuel cycles with new constraints such as cycle closure or actinide burning requires new technologies and new nuclear data. In this connection, the tools and neutron data used for the calculational analysis of reactor characteristics require thorough validation. NIKIET developed a reactor benchmark suited to the validation of design-type calculational tools (including neutron data). In the frame of technical exchanges between NIKIET and EDF (France), results of this benchmark calculation concerning the principal parameters of fuel evolution and the safety parameters have been inter-compared, in order to estimate the uncertainties and validate the codes for calculations of this new kind of reactor. Different codes and cross-section data have been used, and sensitivity studies have been performed to understand and quantify the sources of uncertainty. The comparison of results shows that the difference in keff between the ERANOS code with the ERALIB1 library and the reference is of the same order of magnitude as the delayed neutron fraction. On the other hand, the discrepancy is more than twice as large if the JEF2.2 library is used with ERANOS. Analysis of the discrepancies in the calculation results reveals that the main effect comes from differences in nuclear data, namely the U238 and Pu239 fission and capture cross sections and the lead inelastic cross sections.

  6. Shotgun approaches to gait analysis: insights & limitations

    OpenAIRE

    Kaptein, Ronald G; Wezenberg, Daphne; IJmker, Trienke; Houdijk, Han; Beek, Peter J; Lamoth, Claudine J. C.; Daffertshofer, Andreas

    2014-01-01

    Background: Identifying features for gait classification is a formidable problem. The number of candidate measures is legion. This calls for proper, objective criteria when ranking their relevance. Methods: Following a shotgun approach we determined a plenitude of kinematic and physiological gait measures and ranked their relevance using conventional analysis of variance (ANOVA) supplemented by logistic and partial least squares (PLS) regressions. We illustrated this approach using data from ...

  7. Index analysis approach theory at work

    CERN Document Server

    Lowen, R

    2015-01-01

    A featured review of the AMS describes the author’s earlier work in the field of approach spaces as, ‘A landmark in the history of general topology’. In this book, the author has expanded this study further and taken it in a new and exciting direction.   The number of conceptually and technically different systems which characterize approach spaces is increased and moreover their uniform counterpart, uniform gauge spaces, is put into the picture. An extensive study of completions, both for approach spaces and for uniform gauge spaces, as well as compactifications for approach spaces is performed. A paradigm shift is created by the new concept of index analysis.   Making use of the rich intrinsic quantitative information present in approach structures, a technique is developed whereby indices are defined that measure the extent to which properties hold, and theorems become inequalities involving indices; therefore vastly extending the realm of applicability of many classical results. The theory is the...

  8. Algebraic analysis approach for multibody problems

    International Nuclear Information System (INIS)

    Here we propose an algebraic analysis approach for multibody Coulomb interactions. The momentum transfer cross section calculated by the algebraic approximation is close to the exact one. The CPU time required for the algebraic approximation is only about 20 min using a personal computer, whereas the exact analysis requires 15 h to integrate the entire set of multibody equations of motion, in which all the field particles are at rest. (author)

  9. Practical Approach to Fragility Analysis of Bridges

    Directory of Open Access Journals (Sweden)

    Yasamin Rafie Nazari

    2012-12-01

    Full Text Available Damage during past earthquakes reveals the seismic vulnerability of bridge structures and the necessity of a probabilistic approach to the seismic performance evaluation of bridges and its interpretation in terms of decision variables such as repair cost, downtime and loss of life. This procedure involves hazard analysis, structural analysis, damage analysis and loss analysis. The purpose of the present study is to review the different methods developed to derive fragility curves for damage analysis of bridges and to demonstrate a simple procedure for fragility analysis, using a Microsoft Office Excel worksheet, to obtain the probability of a predefined damage level occurring at different levels of the seismic demand parameter. The input of this procedure is the intensity of ground motion and the output is an appropriate estimate of the expected damage. Different observed damage states of bridges are discussed and compared with practical definitions of damage states. Different methods of fragility analysis are discussed and a practical step-by-step example is illustrated.
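    The fragility curves themselves are commonly taken as lognormal in the intensity measure. A minimal sketch (the median and dispersion values are placeholders, and SciPy stands in for the paper's Excel workflow):

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """P(damage state is reached or exceeded | intensity measure = im),
    assuming the usual lognormal fragility model:
        P = Phi( ln(im / median) / beta )
    median -- IM value at 50% probability of exceedance
    beta   -- lognormal dispersion
    """
    return norm.cdf(np.log(np.asarray(im) / median) / beta)

# Placeholder parameters for a hypothetical "moderate damage" state,
# with peak ground acceleration (g) as the intensity measure.
pga = np.linspace(0.05, 1.5, 6)
print(np.round(fragility(pga, median=0.45, beta=0.6), 3))
```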

  10. The conformal approach to asymptotic analysis

    CERN Document Server

    Nicolas, Jean-Philippe

    2015-01-01

    This essay was written as an extended version of a talk given at a conference in Strasbourg on "Riemann, Einstein and geometry", organized by Athanase Papadopoulos in September 2014. Its aim is to present Roger Penrose's approach to asymptotic analysis in general relativity, which is based on conformal geometric techniques, focusing on historical and recent aspects of two specialized topics: conformal scattering and peeling.

  11. An Ethnografic Approach to Video Analysis

    DEFF Research Database (Denmark)

    Holck, Ulla

    2007-01-01

    European Music Therapy Congress, June 16-20, 2004 Jyväskylä, Finland. P. 1094-1110. eBook available at MusicTherapyToday.com Vol.6. Issue 4 (November 2005). Holck, U. (2007). An Ethnographic Descriptive Approach to Video Micro Analysis. In: T. Wosch & T. Wigram (Eds.) Microanalysis in music therapy...

  12. A Mellin transform approach to wavelet analysis

    Science.gov (United States)

    Alotta, Gioacchino; Di Paola, Mario; Failla, Giuseppe

    2015-11-01

    The paper proposes a fractional calculus approach to continuous wavelet analysis. Upon introducing a Mellin transform expression of the mother wavelet, it is shown that the wavelet transform of an arbitrary function f(t) can be given a fractional representation involving a suitable number of Riesz integrals of f(t), and corresponding fractional moments of the mother wavelet. This result serves as a basis for an original approach to wavelet analysis of linear systems under arbitrary excitations. In particular, using the proposed fractional representation for the wavelet transform of the excitation, it is found that the wavelet transform of the response can readily be computed by a Mellin transform expression, with fractional moments obtained from a set of algebraic equations whose coefficient matrix applies for any scale a of the wavelet transform. The robustness and computational efficiency of the proposed approach are demonstrated in the paper.
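    For orientation, the two transforms being combined are the standard continuous wavelet transform and the Mellin transform; their textbook definitions (not the specific fractional representation derived in the paper) are

    $$ W_f(a,b) = \frac{1}{\sqrt{a}} \int_{-\infty}^{\infty} f(t)\, \psi^{*}\!\left(\frac{t-b}{a}\right) dt, \qquad \mathcal{M}[\psi](s) = \int_{0}^{\infty} \psi(t)\, t^{\,s-1}\, dt . $$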

  13. Drag Coefficient of Water Droplets Approaching the Leading Edge of an Airfoil

    Science.gov (United States)

    Vargas, Mario; Sor, Suthyvann; Magarino, Adelaida Garcia

    2013-01-01

    This work presents results of an experimental study on droplet deformation and breakup near the leading edge of an airfoil. The experiment was conducted in the rotating rig test cell at the Instituto Nacional de Tecnica Aeroespacial (INTA) in Madrid, Spain. An airfoil model was placed at the end of the rotating arm and a monosize droplet generator produced droplets that fell from above, perpendicular to the path of the airfoil. The interaction between the droplets and the airfoil was captured with high speed imaging and allowed observation of droplet deformation and breakup as the droplet approached the airfoil near the stagnation line. Image processing software was used to measure the position of the droplet centroid, equivalent diameter, perimeter, area, and the major and minor axes of an ellipse superimposed over the deforming droplet. The horizontal and vertical displacement of each droplet against time was also measured, and the velocity, acceleration, Weber number, Bond number, Reynolds number, and the drag coefficients were calculated along the path of the droplet to the beginning of breakup. Results are presented and discussed for drag coefficients of droplets with diameters in the range of 300 to 1800 micrometers, and airfoil velocities of 50, 70 and 90 meters/second. The effect of droplet oscillation on the drag coefficient is discussed.
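    For reference, the dimensionless groups mentioned above follow directly from the measured droplet diameter, relative velocity and acceleration. The sketch below uses approximate room-temperature properties of water and air and invented droplet values, not the paper's data; inferring the drag force as mass times measured acceleration is likewise an assumption of this illustration:

```python
import math

# Assumed fluid properties (room temperature, approximate values).
RHO_AIR = 1.2        # kg/m^3
RHO_WATER = 998.0    # kg/m^3
MU_AIR = 1.8e-5      # Pa s
SIGMA_WATER = 0.072  # N/m
G = 9.81             # m/s^2

def droplet_numbers(diameter_m, rel_velocity, acceleration):
    """Weber, Bond and Reynolds numbers plus drag coefficient for a droplet
    in an air stream; the drag force is taken as mass * measured acceleration."""
    area = math.pi * diameter_m**2 / 4.0
    mass = RHO_WATER * math.pi * diameter_m**3 / 6.0
    we = RHO_AIR * rel_velocity**2 * diameter_m / SIGMA_WATER
    bo = (RHO_WATER - RHO_AIR) * G * diameter_m**2 / SIGMA_WATER
    re = RHO_AIR * rel_velocity * diameter_m / MU_AIR
    cd = mass * acceleration / (0.5 * RHO_AIR * rel_velocity**2 * area)
    return we, bo, re, cd

# Invented example: 1 mm droplet, 70 m/s relative velocity, 2000 m/s^2 acceleration.
print([round(x, 3) for x in droplet_numbers(1e-3, 70.0, 2000.0)])
```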

  14. Comparative analysis of using natural and radiogenic lead as heat-transfer agent in fast reactors

    Science.gov (United States)

    Laas, R. A.; Gizbrekht, R. V.; Komarov, P. A.; Nesterov, V. N.

    2016-06-01

    Fast reactors with lead coolant have several advantages over their analogues. Performance can be further improved by replacing lead of natural isotopic composition with radiogenic lead. Two main issues then need to be addressed: both the induced radioactivity in the coolant and the effective neutron multiplication factor in the core will change and need to be estimated. To address these issues, an analysis of the scheme of nuclear transformations in the lead heat-transfer agent under irradiation was carried out. The induced radioactivity of radiogenic and natural lead has been studied, and it is shown that the replacement of the lead affects the multiplication factor in a specific way. The use of radiogenic lead can therefore significantly affect reactor operation.

  15. On the analysis of lead-time disturbances in production and inventory control models

    OpenAIRE

    Spiegler, VLM; Naim, MM; Syntetos, A

    2015-01-01

    Changes in the lead-time can lead to supply chain inefficiencies and risks. In this paper, we investigate the effects of lead-time disturbances on the output responses of a production and inventory control model. When the control system is adapted for lead-time disturbance analysis, the resulting model becomes nonlinear. Hence nonlinear control theory, in combination with simulation, is used to analyse the impact of lead-time changes on the transient and steady state respo...
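    A toy discrete-time illustration of the effect under study: a generic order-up-to inventory simulation with a lead-time step change (assumed for illustration only, not the production and inventory control model analysed in the paper):

```python
import numpy as np

def simulate(periods=60, demand=10.0, target_inventory=50.0, lead_time_change_at=30):
    """Order-up-to policy with an order pipeline; the lead time steps from 2 to 4
    periods halfway through, producing a transient in inventory and orders."""
    inventory = target_inventory
    pipeline = []                      # list of (arrival_period, quantity)
    inv_hist, order_hist = [], []
    for t in range(periods):
        lead_time = 2 if t < lead_time_change_at else 4
        # Receive whatever arrives this period, then satisfy demand.
        arrived = sum(q for (arr, q) in pipeline if arr == t)
        pipeline = [(arr, q) for (arr, q) in pipeline if arr > t]
        inventory += arrived - demand
        # Order up to the target on-hand plus on-order position.
        on_order = sum(q for (_, q) in pipeline)
        order = max(0.0, target_inventory + lead_time * demand - inventory - on_order)
        pipeline.append((t + lead_time, order))
        inv_hist.append(inventory)
        order_hist.append(order)
    return np.array(inv_hist), np.array(order_hist)

inv, orders = simulate()
print("min inventory after lead-time change:", inv[30:].min())  # transient dip
```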

  16. Introduction to audio analysis a MATLAB approach

    CERN Document Server

    Giannakopoulos, Theodoros

    2014-01-01

    Introduction to Audio Analysis serves as a standalone introduction to audio analysis, providing theoretical background to many state-of-the-art techniques. It covers the essential theory necessary to develop audio engineering applications, but also uses programming techniques, notably MATLAB®, to take a more applied approach to the topic. Basic theory and reproducible experiments are combined to demonstrate theoretical concepts from a practical point of view and provide a solid foundation in the field of audio analysis. Audio feature extraction, audio classification, audio segmentation, au

  17. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    International Nuclear Information System (INIS)

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus) and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidencing they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot possess a substantially greater lead poisoning risk compared to embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot

  18. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus) and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidencing they had been shot. No information connecting these illegal shooting events existed and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories re-constructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot possess a substantially greater lead poisoning risk compared to embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.

  19. Multivariate analysis: A statistical approach for computations

    Science.gov (United States)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and more recently in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and of correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content; the problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis provides an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
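
    As a hedged sketch of the correlation-coefficient-matrix idea for network anomaly detection mentioned above (the traffic features, numbers and scoring rule are assumptions, not taken from the paper), one can compare each window's correlation matrix against a baseline:

      import numpy as np

      rng = np.random.default_rng(0)

      def traffic(n, syn_rate=0.1):
          """Hypothetical per-window features: packets, bytes, flows, SYN packets."""
          packets = rng.normal(100, 10, n)
          bytes_ = 800 * packets + rng.normal(0, 500, n)
          flows = 0.5 * packets + rng.normal(0, 2, n)
          syn = syn_rate * packets + rng.normal(0, 1, n)
          return np.column_stack([packets, bytes_, flows, syn])

      baseline = np.corrcoef(traffic(500), rowvar=False)

      def anomaly_score(window):
          """Deviation of the window's correlation matrix from the baseline."""
          return np.linalg.norm(np.corrcoef(window, rowvar=False) - baseline)

      normal_window = traffic(200)
      scan_window = traffic(200)
      scan_window[:, 3] = rng.normal(200, 50, 200)   # a scan/flood decouples SYN counts

      print("normal window score:", round(anomaly_score(normal_window), 3))
      print("scan window score  :", round(anomaly_score(scan_window), 3))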

  20. Approach to uncertainty in risk analysis

    International Nuclear Information System (INIS)

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented
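
    As a hedged illustration of one ingredient such a framework can draw on (not the ORP approach itself; the model and distributions below are invented), Monte Carlo propagation turns assumed parameter uncertainties into percentiles a decision-maker can act on:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Toy exposure model: dose = concentration * intake / body weight; risk = slope * dose.
      conc = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)      # mg/L
      intake = rng.normal(2.0, 0.3, size=n).clip(min=0.5)            # L/day
      body_weight = rng.normal(70.0, 10.0, size=n).clip(min=40.0)    # kg
      slope = rng.triangular(1e-3, 5e-3, 1e-2, size=n)               # risk per mg/kg-day

      risk = slope * conc * intake / body_weight
      for p in (5, 50, 95):
          print(f"P{p:02d} risk estimate: {np.percentile(risk, p):.2e}")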

  1. A new approach to the evaluation and selection of leading indicators

    OpenAIRE

    Gottschling, Andreas; Trimbur, Thomas

    1998-01-01

    Leading indicators are typical constructs used in macroeconomics to guide decision making in several areas of economic activity, including policy formation and long term investment. Researchers often evaluate and select leading indicators on a seemingly ad hoc basis involving OLS regression, which does not take into account the fact that perhaps the most important property of a good leading indicator lies in its ability to anticipate the turning points of the time series of interest. We propo...

  2. ANALYSIS APPROACHES TO EVALUATION OF INFORMATION PROTECTION

    Directory of Open Access Journals (Sweden)

    Zyuzin A. S.

    2015-03-01

    Full Text Available The article addresses the pressing problem of assessing the security of information systems and the importance of obtaining objective, quantitative assessment results. The author proposes creating a comprehensive information security system based on a systems approach, to be applied at each stage of the information system's life cycle. On the basis of this approach, the author formulates a general scheme for assessing the information security of an information system, as well as principles for choosing the assessment method. The existing quantitative assessment methods based on object-oriented methods of systems analysis are reviewed, together with the objectivity of the estimates obtained with this approach. The analysis carried out reveals serious shortcomings of the modern techniques used for assessing information systems' security, which leads to the need to create a scientific and methodological apparatus that increases the objectivity and completeness of information security assessment by formalizing expert data. The applicability of this approach for rapidly obtaining a quantitative information security assessment under changing security threats and during the operation and development of the information system is considered. The problem of automated information systems' security assessment is stated, and a general technique for the protection of information in systems of this type is formulated

  3. A Clifford analysis approach to superspace

    International Nuclear Information System (INIS)

    A new framework for studying superspace is given, based on methods from Clifford analysis. This leads to the introduction of both orthogonal and symplectic Clifford algebra generators, allowing for an easy and canonical introduction of a super-Dirac operator, a super-Laplace operator and the like. This framework is then used to define a super-Hodge coderivative, which, together with the exterior derivative, factorizes the Laplace operator. Finally both the cohomology of the exterior derivative and the homology of the Hodge operator on the level of polynomial-valued super-differential forms are studied. This leads to some interesting graphical representations and provides a better insight in the definition of the Berezin-integral

  4. Primary Dentition Analysis: Exploring a Hidden Approach

    Science.gov (United States)

    Vanjari, Kalasandhya; Kamatham, Rekhalakshmi; Gaddam, Kumar Raja

    2016-01-01

    ABSTRACT Background: Accurate prediction of the mesiodistal widths (MDWs) of canines and premolars in children with primary dentition facilitates interception of malocclusion at an early age. The Boston University (BU) approach is one such method, based on primary teeth, for predicting canine and premolar dimensions. Aim: To predict the canine and premolar dimensions in the contemporary population using the BU approach and compare the results with the values obtained using the Tanaka-Johnston (T/J) approach. Design: Children in the age range of 7-11 years with all permanent mandibular incisors and primary maxillary and mandibular canines and first molars present were included in the study. Those with interproximal caries or restorations, abnormalities in shape or size, or a history of orthodontic treatment were excluded. Impressions of both arches were made using irreversible hydrocolloid and poured with dental stone. The MDWs of the required teeth were measured on the models using an electronic digital vernier caliper, from which the widths of permanent canines and premolars were predicted using both the T/J and BU approaches. Results: A statistically significant (p = 0.00) positive correlation (r = 0.52-0.55) was observed between the T/J and BU approaches. When analyzed by gender, a statistically significant (p = 0.00) strong positive correlation (r = 0.72-0.77) was observed among girls, whereas boys showed a statistically nonsignificant weak positive correlation (r = 0.17-0.42). Conclusion: The Boston University approach can be further studied prospectively to establish it as a prediction method for permanent tooth dimensions in children in the primary dentition stage. How to cite this article: Nuvvula S, Vanjari K, Kamatham R, Gaddam KR. Primary Dentition Analysis: Exploring a Hidden Approach. Int J Clin Pediatr Dent 2016;9(1):1-4. PMID:27274146
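
    For reference, the Tanaka-Johnston prediction used as the comparison standard can be written in a few lines (a standard textbook form, not code from this paper); the BU regression coefficients are not given in the abstract, so they are not reproduced here.

      def tanaka_johnston(sum_mand_incisors_mm: float) -> dict:
          """Predict the combined mesiodistal width (mm) of the permanent canine and
          two premolars in one quadrant from the four permanent mandibular incisors."""
          half = sum_mand_incisors_mm / 2.0
          return {"maxillary": half + 11.0, "mandibular": half + 10.5}

      # Example: mandibular incisors summing to 23.0 mm.
      print(tanaka_johnston(23.0))   # {'maxillary': 22.5, 'mandibular': 22.0}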

  5. A factorization approach to next-to-leading-power threshold logarithms

    NARCIS (Netherlands)

    Bonocore, D.; Laenen, E.; Magnea, L.; Melville, S.; Vernazza, L.; White, C. D.

    2015-01-01

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in the threshold variable.

  6. A new analytical approach to understanding nanoscale lead-iron interactions in drinking water distribution systems.

    Science.gov (United States)

    Trueman, Benjamin F; Gagnon, Graham A

    2016-07-01

    High levels of iron in distributed drinking water often accompany elevated lead release from lead service lines and other plumbing. Lead-iron interactions in drinking water distribution systems are hypothesized to be the result of adsorption and transport of lead by iron oxide particles. This mechanism was explored using point-of-use drinking water samples characterized by size exclusion chromatography with UV and multi-element (ICP-MS) detection. In separations on two different stationary phases, high apparent molecular weight (>669 kDa) elution profiles for 56Fe and 208Pb were strongly correlated (average R² = 0.96, N = 73 samples representing 23 single-unit residences). Moreover, 56Fe and 208Pb peak areas exhibited an apparent linear dependence (R² = 0.82), consistent with mobilization of lead via adsorption to colloidal particles rich in iron. A UV254 absorbance peak, coincident with high molecular weight 56Fe and 208Pb, implied that natural organic matter was interacting with the hypothesized colloidal species. High molecular weight UV254 peak areas were correlated with both 56Fe and 208Pb peak areas (R² = 0.87 and 0.58, respectively). On average, 45% (std. dev. 10%) of total lead occurred in the size range 0.05-0.45 µm. PMID:26971028
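
    The reported linear dependence of 208Pb on 56Fe peak areas amounts to an ordinary least-squares fit across samples; the sketch below reproduces the idea with simulated peak areas (slope and noise invented, not the study's data).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      fe_peak = rng.lognormal(mean=4.0, sigma=0.6, size=73)           # 56Fe HMW peak areas
      pb_peak = 0.02 * fe_peak + rng.normal(0, 0.05, size=73)         # 208Pb tracks Fe + noise

      fit = stats.linregress(fe_peak, pb_peak)
      print(f"slope = {fit.slope:.4f}, R^2 = {fit.rvalue**2:.2f}")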

  7. The octave approach to EEG analysis.

    Science.gov (United States)

    Stassen, H H

    1991-10-01

    A "tonal" approach to EEG spectral analysis is presented which is compatible with the concept of physical octaves, thus providing a constant resolution of partial tones over the full frequency range inherent to human brain waves, rather than for equidistant frequency steps in the spectral domain. The specific advantages of the tonal approach, however, mainly pay off in the field of EEG sleep analysis where the interesting information is predominantly located in the lower octaves. In such cases the proposed method reveals a fine structure which displays regular maxima possessing typical properties of "overtones" within the three octaves 1-2 Hz, 2-4 Hz and 4-8 Hz. Accordingly, spectral patterns derived from tonal spectral analyses are particularly suited to measure the fine gradations of mutual differences between individual EEG sleep patterns and will therefore allow a more efficient investigation of the genetically determined proportion of sleep EEGs. On the other hand, we also tested the efficiency of tonal spectral analyses on the basis of our 5-year follow-up data of 30 healthy volunteers. It turned out that 28 persons (93.3%) could be uniquely recognized after five years by means of their EEG spectral patterns. Hence, tonal spectral analysis proved to be a powerful tool also in cases where the main EEG information is typically located in the medium octave 8-16 Hz. PMID:1762585

  8. Risk Analysis Approach to Rainwater Harvesting Systems

    Directory of Open Access Journals (Sweden)

    Nadia Ursino

    2016-08-01

    Full Text Available Urban rainwater reuse preserves water resources and promotes sustainable development in rapidly growing urban areas. The efficiency of a large number of urban water reuse systems, operating under different climate and demand conditions, is evaluated here on the basis of a new risk analysis approach. Results obtained by probability analysis (PA) indicate that maximum efficiency in low-demand scenarios is above 0.5, and a threshold distinguishing low-demand from high-demand scenarios indicates that in low-demand scenarios no significant improvement in performance may be attained by increasing the storage capacity of rainwater harvesting tanks. Threshold behaviour is displayed when tank storage capacity is designed to match both the average collected volume and the average reuse volume. The low demand limit cannot be achieved under climate and operating conditions characterized by a disproportion between harvesting and demand volume.
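
    A toy yield-after-spillage water balance (daily inflow, constant demand, variable tank size) illustrates why efficiency saturates with storage capacity in low-demand scenarios; all volumes below are invented and this is not the probability analysis used in the paper.

      import numpy as np

      def reuse_efficiency(inflow, demand, tank_capacity):
          """Fraction of demand met by harvested rainwater (yield-after-spillage rule)."""
          storage, supplied = 0.0, 0.0
          for q_in, d in zip(inflow, demand):
              storage = min(storage + q_in, tank_capacity)   # collect, spill any excess
              release = min(storage, d)                      # meet as much demand as possible
              storage -= release
              supplied += release
          return supplied / np.sum(demand)

      rng = np.random.default_rng(5)
      daily_inflow = rng.exponential(scale=40.0, size=365) * (rng.random(365) < 0.3)  # litres
      daily_demand = np.full(365, 60.0)                                               # litres

      for capacity in (200, 500, 1000, 2000):
          print(capacity, "L tank ->", round(reuse_efficiency(daily_inflow, daily_demand, capacity), 2))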

  9. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce; Grandjean, Philippe

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack of a...... clear threshold presents a challenge to the identification of an acceptable level of exposure. The benchmark dose (BMD) is defined as the dose that leads to a specific known loss. As an alternative to elusive thresholds, the BMD is being used increasingly by regulatory authorities. Using the pooled data...... fitting models yielding lower confidence limits (BMDLs) of about 0.1-1.0 μ g/dL for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity....
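
    A heavily simplified benchmark-dose sketch (simulated blood-lead/IQ data and a single log-linear fit, rather than the pooled data and models of the study) shows how a BMD and a crude BMDL for a one-IQ-point loss fall out of a fitted dose-response:

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(3)
      pb = rng.uniform(1, 30, size=300)                      # blood lead, ug/dL (simulated)
      iq = 100 - 2.7 * np.log(pb + 1) + rng.normal(0, 5, size=300)

      def model(dose, b0, b1):
          """Log-linear dose-response, echoing the supra-linear low-dose shape."""
          return b0 - b1 * np.log(dose + 1)

      (b0, b1), cov = curve_fit(model, pb, iq, p0=(100.0, 1.0))
      b1_se = np.sqrt(cov[1, 1])

      def bmd(slope, loss=1.0):
          """Dose at which the fitted curve predicts the benchmark loss (1 IQ point)."""
          return np.exp(loss / slope) - 1

      print("BMD :", round(bmd(b1), 2))
      print("BMDL:", round(bmd(b1 + 1.645 * b1_se), 2))   # crude lower bound via upper slope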

  10. [Analysis of washing efficiency and change in lead speciation in lead-contaminated soil of a battery factory].

    Science.gov (United States)

    Ren, Bei; Huang, Jin-lou; Miao, Ming-sheng

    2013-09-01

    Lead-contaminated soil with different pollution loads in a lead battery factory in the southwest of China was chosen as the research object, the lead content and speciation were analyzed, and different washing agents were screened. The lead washing efficiency and lead speciation were analyzed under different pH conditions, and soil of different particle sizes was washed for different durations to determine the best washing time. The results showed that the soil of sites A and B in the factory was severely contaminated, with lead concentrations reaching 15,703.22 mg·kg⁻¹ and 1747.78 mg·kg⁻¹, respectively, and the proportion of active-state lead was relatively high, while the residual state accounted for only 17.32%, 11.64%, 14.6% and 10.2%. EDTA and hydrochloric acid showed the best extraction effect among the 5 washing agents tested, which included EDTA, hydrochloric acid, citric acid, rhamnolipid and SDS. Cleaning under acidic conditions could not only effectively extract the total amount of lead but also effectively reduce the environmental risk of active-state lead; pH 4-7 was suggested as the most appropriate condition. The cleaning effect on coarse sand and fine sand was good, while washing of the powdery clay requires further process improvement, with the optimal washing time determined as 240 min. PMID:24289026

  11. Relational Leading

    DEFF Research Database (Denmark)

    Larsen, Mette Vinther; Rasmussen, Jørgen Gulddahl

    2015-01-01

    This first chapter presents the exploratory and curious approach to leading as relational processes – an approach that pervades the entire book. We explore leading from a perspective that emphasises the unpredictable challenges and triviality of everyday life, which we consider an interesting......, relevant and realistic way to examine leading. The chapter brings up a number of concepts and contexts as formulated by researchers within the field, and in this way seeks to construct a first understanding of relational leading....

  12. Dynamic Re-order Point Inventory Control with Lead-Time Uncertainty: Analysis and Empirical Investigation

    OpenAIRE

    Babai, Mohamed Zied; Syntetos, Aris A; Dallery, Yves; Nikolopoulos, Kostantinos

    2009-01-01

    Abstract A new forecast-based dynamic inventory control approach is discussed in this paper. In this approach, forecasts and forecast uncertainties are assumed to be exogenous data known in advance at each period over a fixed horizon. The control parameters are derived by using a sequential procedure. The merits of this approach as compared to the classical one are presented. We focus on a single-stage and single-item inventory system with non-stationary demand and lead-time uncert...
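
    The paper's sequential derivation of the control parameters is not spelled out in the abstract, so the sketch below shows only a generic per-period re-order point built from exogenous forecasts, forecast uncertainty and lead-time uncertainty (the classical normal approximation for demand over a stochastic lead time, not the authors' procedure).

      import numpy as np
      from scipy import stats

      def dynamic_reorder_point(forecast, forecast_sd, lead_time_mean, lead_time_sd,
                                service_level=0.95):
          """Per-period re-order point: mean demand over the lead time plus safety stock.

          Demand over an uncertain lead time is approximated as normal with
          mean L*f_t and variance L*sd_t^2 + f_t^2*sd_L^2 (textbook approximation).
          """
          z = stats.norm.ppf(service_level)
          f = np.asarray(forecast, dtype=float)
          s = np.asarray(forecast_sd, dtype=float)
          mean_dlt = lead_time_mean * f
          sd_dlt = np.sqrt(lead_time_mean * s**2 + (f * lead_time_sd)**2)
          return mean_dlt + z * sd_dlt

      # Hypothetical forecasts for six periods of non-stationary demand.
      rop = dynamic_reorder_point(forecast=[20, 22, 30, 28, 35, 40],
                                  forecast_sd=[4, 4, 6, 5, 7, 8],
                                  lead_time_mean=2.0, lead_time_sd=0.5)
      print(np.round(rop, 1))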

  13. Factors Leading to Success in Diversified Occupation: A Livelihood Analysis in India

    Science.gov (United States)

    Saha, Biswarup; Bahal, Ram

    2015-01-01

    Purpose: Livelihood diversification is a sound alternative for higher economic growth and its success or failure is conditioned by the interplay of a multitude of factors. The study of the profile of the farmers in which they operate is important to highlight the factors leading to success in diversified livelihoods. Design/Methodology/Approach: A…

  14. Tourism Destinations Network Analysis, Social Network Analysis Approach

    OpenAIRE

    2015-01-01

    The tourism industry is becoming one of the world's largest economical sources, and is expected to become the world's first industry by 2020. Previous studies have focused on several aspects of this industry including sociology, geography, tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aiming at studying the structural characteristics of tourism networks. More speci...

  15. Leading Public Housing Organisation in a Problematic Situation: a critical soft systems methodology approach

    OpenAIRE

    Staadt, Jurgen

    2014-01-01

    The challenges ahead such as climate change and social injustice require governments and their public organisations to be adaptive and open to learning. This necessitates the adoption of new ways of thinking so as to cope with complexity, dynamics as well as behavioural aspects. The leading public housing organisation used in this single case study is connected with disciplines such as transport for example which suggests the adoption of a systems thinking approac...

  16. A factorization approach to next-to-leading-power threshold logarithms

    Science.gov (United States)

    Bonocore, D.; Laenen, E.; Magnea, L.; Melville, S.; Vernazza, L.; White, C. D.

    2015-06-01

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in the threshold variable. In this paper, we consider the extension of this factorization to include effects suppressed by a single power of the threshold variable. Building upon the Low-Burnett-Kroll-Del Duca (LBKD) theorem, we propose a decomposition of radiative amplitudes into universal building blocks, which contain all effects ultimately responsible for next-to-leading-power (NLP) threshold logarithms in hadronic cross sections for electroweak annihilation processes. In particular, we provide a NLO evaluation of the radiative jet function, responsible for the interference of next-to-soft and collinear effects in these cross sections. As a test, using our expression for the amplitude, we reproduce all abelian-like NLP threshold logarithms in the NNLO Drell-Yan cross section, including the interplay of real and virtual emissions. Our results are a significant step towards developing a generally applicable resummation formalism for NLP threshold effects, and illustrate the breakdown of next-to-soft theorems for gauge theory amplitudes at loop level.

  17. Fostering the Capacity for Distributed Leadership: A Post-Heroic Approach to Leading School Improvement

    Science.gov (United States)

    Klar, Hans W.; Huggins, Kristin Shawn; Hammonds, Hattie L.; Buskey, Frederick C.

    2016-01-01

    Principals are being encouraged to distribute leadership to increase schools' organizational capacities, and enhance student growth and learning. Extant research on distributed leadership practices provides an emerging basis for adopting such approaches. Yet, relatively less attention has been paid to examining the principal's role in fostering…

  18. Approaches to data analysis of multiple-choice questions

    Science.gov (United States)

    Ding, Lin; Beichner, Robert

    2009-12-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.
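
    As a small worked example of the first listed approach (classical test theory; the data below are random, so discriminations come out near zero), item difficulty and corrected item-total discrimination can be computed directly from a 0/1 score matrix:

      import numpy as np

      def item_statistics(responses):
          """Classical test theory: item difficulty and corrected item-total correlation.

          `responses` is an (n_students, n_items) array of 0/1 scored answers."""
          responses = np.asarray(responses, dtype=float)
          total = responses.sum(axis=1)
          difficulty = responses.mean(axis=0)                 # proportion answering correctly
          discrimination = np.array([
              np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
              for j in range(responses.shape[1])
          ])
          return difficulty, discrimination

      rng = np.random.default_rng(7)
      scores = (rng.random((100, 5)) < [0.9, 0.7, 0.5, 0.3, 0.6]).astype(int)
      difficulty, discrimination = item_statistics(scores)
      print("difficulty    :", np.round(difficulty, 2))
      print("discrimination:", np.round(discrimination, 2))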

  19. Potentiometric stripping analysis of lead and cadmium leaching from dental prosthetic materials and teeth

    Directory of Open Access Journals (Sweden)

    GORAN M. NIKOLIC

    2004-07-01

    Full Text Available Potentiometric stripping analysis (PSA) was applied for the determination of lead and cadmium leaching from dental prosthetic materials and teeth. The soluble lead content in finished dental implants was found to be much lower than that of the individual components used for their preparation. Cadmium was not detected in dental implants and materials under the defined conditions. The soluble lead and cadmium content of teeth was slightly lower than the lead and cadmium content in whole teeth (w/w) reported by other researchers, except in the case of a tooth with removed amalgam filling. The results of this work suggest that PSA may be a good method for lead and cadmium leaching studies for investigation of the biocompatibility of dental prosthetic materials.

  20. Random matrix approach to categorical data analysis

    Science.gov (United States)

    Patil, Aashay; Santhanam, M. S.

    2015-09-01

    Correlation and similarity measures are widely used in all the areas of sciences and social sciences. Often the variables are not numbers but are instead qualitative descriptors called categorical data. We define and study similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings, and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow random matrix predictions with the dominant eigenvalue being an exception. We demonstrate this approach by applying it to the data for Indian general elections and sea level pressures in the North Atlantic ocean.
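
    One plausible reading of the similarity-matrix construction (simple matching between categorical variables; the paper's exact definition may differ, and the data below are random categories rather than election or pressure data) makes the dominant-eigenvalue behaviour easy to see:

      import numpy as np

      def similarity_matrix(categorical):
          """Pairwise similarity between columns: fraction of rows with the same category."""
          categorical = np.asarray(categorical)
          n_cols = categorical.shape[1]
          sim = np.empty((n_cols, n_cols))
          for i in range(n_cols):
              for j in range(n_cols):
                  sim[i, j] = np.mean(categorical[:, i] == categorical[:, j])
          return sim

      rng = np.random.default_rng(11)
      data = rng.integers(0, 5, size=(200, 20))     # 200 observations, 20 categorical variables

      eigvals = np.linalg.eigvalsh(similarity_matrix(data))
      print("dominant eigenvalue:", round(eigvals[-1], 2))     # stands out from the bulk
      print("bulk of spectrum   :", np.round(eigvals[:-1], 2))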

  1. Multihapten approach leading to a sensitive ELISA with broad cross-reactivity to microcystins and nodularin

    OpenAIRE

    Samdal, Ingunn Anita; Ballot, Andreas; Løvberg, Kjersti Eriksen; Miles, Christopher Owen

    2014-01-01

    Microcystins (MCs) are a group of biotoxins (>150) produced by cyanobacteria, with a worldwide distribution. MCs are hepatotoxic, and acute exposure causes severe liver damage in humans and animals. Rapid and cheap methods of analysis are therefore required to protect people and livestock, especially in developing countries. To include as many MCs as possible in a single analysis, we developed a new competitive ELISA. Ovine polyclonal antibodies were raised using a...

  2. Ancillary Resistor leads to Sparse Glitches: an Extra Approach to Avert Hacker using Syndicate Browser Design

    OpenAIRE

    Devaki Pendlimarri; Paul Bharath Bhushan Petlu

    2012-01-01

    After the invention of internet most of the people all over the world have become a fan of it because of its vast exploitation for information exchange, e-mail, e-commerce etc. for their easy leading of life. On the other side, may be equally or less/more, many people are also using it for the purpose of hacking the information which is being communicated. Because, the data/information that is being communicated through the internet is via an unsecured networks. This gives breaches to the hac...

  3. Statistical approach to partial equilibrium analysis

    Science.gov (United States)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named willingness price, is highlighted and constitutes the whole theory. The supply and demand functions are formulated as the distributions of corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of excess demand function are analyzed and the necessary conditions for the existence and uniqueness of equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  4. Environmental health risk assessment of ambient lead levels in Lisbon, Portugal: A full chain study approach

    DEFF Research Database (Denmark)

    Casimiro, E.; Philippe Ciffroy, P.; Serpa, P.; Johansson, E.; Legind, Charlotte Nielsen; Brochot, C.

    The multi-causality interactions between environment and health are complex and call for an integrated multidisciplinary study approach. Emerging computational toxicology tools that link toxicology, chemistry, environmental sciences, biostatistics, and computer sciences are proving to be very...... then used to calculate the Pb concentration in various biota (leafy vegetables, root vegetables, grain, potatoes, and fruits) produced in the area as well as the amount of Pb a typical adult would inhale and ingest during this ten-year assessment period. The PBPK model of the 2FUN player was used to...... calculate the Pb levels in the various body systems. Our results showed a low health risk from Pb exposures. It also identified that ingestion of leafy vegetables (i.e. lettuce, cabbage, and spinach) and fruits contribute the most to total Pb blood levels. This full chain assessment approach of the 2FUN...

  5. Enhanced electrokinetic remediation of lead-contaminated soil by complexing agents and approaching anodes.

    Science.gov (United States)

    Zhang, Tao; Zou, Hua; Ji, Minhui; Li, Xiaolin; Li, Liqiao; Tang, Tang

    2014-02-01

    Optimizing process parameters that affect the remediation time and power consumption can improve the treatment efficiency of the electrokinetic remediation as well as determine the cost of a remediation action. Lab-scale electrokinetic remediation of Pb-contaminated soils was investigated for the effect of complexant ethylenediaminetetraacetic acid (EDTA) and acetic acid and approaching anode on the removal efficiency of Pb. When EDTA was added to the catholyte, EDTA dissolved insoluble Pb in soils to form soluble Pb-EDTA complexes, increasing Pb mobility and accordingly removal efficiency. The removal efficiency was enhanced from 47.8 to 61.5 % when the EDTA concentration was increased from 0.1 to 0.2 M, showing that EDTA played an important role in remediation. And the migration rate of Pb was increased to 72.3 % when both EDTA and acetic acid were used in the catholyte. The "approaching anode electrokinetic remediation" process in the presence of both EDTA and acetic acid had a higher Pb-removal efficiency with an average efficiency of 83.8 %. The efficiency of electrokinetic remediation was closely related to Pb speciation. Exchangeable and carbonate-bounded Pb were likely the forms which could be removed. All results indicate that the approaching anode method in the presence of EDTA and acetic acid is an advisable choice for electrokinetic remediation of Pb-contaminated soil. PMID:24203258

  6. Effects of evacuation assistant’s leading behavior on the evacuation efficiency: Information transmission approach

    Science.gov (United States)

    Wang, Xiao-Lu; Guo, Wei; Zheng, Xiao-Ping

    2015-07-01

    Evacuation assistants are expected to spread the escape route information and lead evacuees toward the exit as quickly as possible. Their leading behavior influences the evacuees’ movement directly, which is confirmed to be a decisive factor of the evacuation efficiency. The transmission process of escape information and its effect on the evacuees’ movement are accurately represented by the proposed extended dynamic communication field model. For evacuation assistants and evacuees, the respective sensitivity parameters of the static floor field (SFF) are fully discussed. The simulation results indicate that an appropriate value of the assistants’ sensitivity parameter is associated with the maximum number of evacuees, and optimal combinations of the two sensitivity parameters were found to reach the highest evacuation efficiency. There also exists an optimal value for the evacuation assistants’ information transmission radius. Project supported by the National Basic Research Program of China (Grant No. 2011CB706900), the National Natural Science Foundation of China (Grant Nos. 71225007 and 71203006), the National Key Technology Research and Development Program of the Ministry of Science and Technology of China (Grant No. 2012BAK13B06), the Humanities and Social Sciences Project of the Ministry of Education of China (Grant Nos. 10YJA630221 and 12YJCZH023), and the Beijing Philosophy and Social Sciences Planning Project of the Twelfth Five-Year Plan, China (Grant Nos. 12JGC090 and 12JGC098).

  7. Re-analysis of fatigue data for welded joints using the notch stress approach

    DEFF Research Database (Denmark)

    Pedersen, Mikkel Melters; Mouritsen, Ole Ø.; Hansen, Michael Rygaard;

    2010-01-01

    Experimental fatigue data for welded joints have been collected and subjected to re-analysis using the notch stress approach according to IIW recommendations. This leads to an overview regarding the reliability of the approach, based on a large number of results (767 specimens). Evidently, there......-welded joints agree quite well with the FAT 225 curve; however a reduction to FAT 200 is suggested in order to achieve approximately the same safety as observed in the nominal stress approach....

  8. Foreign Policy: Approaches, Levels Of Analysis, Dimensions

    Directory of Open Access Journals (Sweden)

    Nina Šoljan

    2012-01-01

    Full Text Available This paper provides an overview of key issues related to foreign policy and foreign policy theories in the wider context of political science. Discussing the origins and development of foreign policy analysis (FPA, as well as scholarly work produced over time, it argues that today FPA encompasses a variety of theoretical approaches, models and tools. These share the understanding that foreign policy outputs cannot be fully explained if analysis is confined to the systemic level. Furthermore, this paper conceptualizes foreign policy by comparing it to other types of policy. Although during the Cold War period foreign policy was equated with foreign security policy, in today’s world, security policy is only one dimension. Foreign policy’s scope has expanded to cover other issues such as trade, human rights and the environment. The growing number of domestic, international and transnational issues, stakeholders and inputs into the policy making process have made the formation and conduct of a coherent foreign policy increasingly challenging.

  9. The QCD analysis of xF_3 structure function based on the analytic approach

    OpenAIRE

    Sidorov, A. V.; Solovtsova, O. P.

    2013-01-01

    We apply analytic perturbation theory to the QCD analysis of the xF_3 structure function data of the CCFR collaboration. We use different approaches for the leading order Q^2 evolution of the xF_3 structure function and compare the extracted values of the parameter Lambda_QCD and the shape of the higher twist contribution. Our consideration is based on the Jacobi polynomial expansion method of the unpolarized structure function. The analysis shows that the analytic approach provides reasonable...

  10. Control approach to the load frequency regulation of a Generation IV Lead-cooled Fast Reactor

    International Nuclear Information System (INIS)

    Highlights: • Dedicated control strategy for adjusting the electrical power according to the grid requirements. • Decoupling of the Balance of Plant from the reactor primary circuit thanks to effective feedback regulators. • Primary frequency regulation and islanding simulations assessed with an object-oriented model. - Abstract: One of the most pressing issues in the study of the power generation and distribution is the characterization of the grid behavior, whether a relevant fraction of the connected power plants relies on Renewable Energy Sources. Indeed, because of the discontinuous power supply and the limited presence of energy accumulators, concerning power imbalances may take place on the grid. The power plants ensuring high reliability performance should be ready to feed the loads when the Renewable Energy Sources are not available. In order to ensure the grid stability and the sustainability of nuclear energy, the possibility of operating Generation-IV nuclear reactors in a flexible way should be considered, i.e., the Nuclear Power Plants should adjust the mechanical power produced so as to comply with the sudden grid frequency variations. In the present work, this opportunity is assessed for the Lead-cooled Fast Reactors, adopting the Advanced Lead Fast Reactor European Demonstrator (ALFRED) as a representative of Lead-cooled Fast Reactor technology. For this reactor concept, because of the large thermal inertia that characterizes the system, the adoption of the “reactor-follows-turbine” scheme (currently employed in the Pressurized Water Reactors) is not feasible. An alternative solution is proposed, i.e., the set-point for the thermal power produced in the core is kept constant at the nominal value (or slowly variable), and the set-point for the mechanical power available to the alternator is adjusted according to the load demands. In order to assess the performance of the developed control scheme, two case studies are simulated. In the

  11. A new approach to evaluate natural zeolite ability to sorb lead (Pb) from aqueous solutions

    Science.gov (United States)

    Drosos, Evangelos I. P.; Karapanagioti, Hrissi K.

    2013-04-01

    Lead (Pb) is a hazardous pollutant commonly found in aquatic ecosystems. Among several methods available, the addition of sorbent amendments to soils or sediments is attractive, since its application is relatively simple, while it can also be cost effective when a low cost and re-usable sorbent is used; e.g. natural zeolites. Zeolites are crystalline aluminosilicates with a three-dimensional structure composed of a set of cavities occupied by large ions and water molecules. Zeolites can accommodate a wide variety of cations, such as Na+, K+, Ca2+, Mg2+, which are rather loosely held and can readily be exchanged for others in an aqueous solution. Natural zeolites are capable of removing cations, such as lead, from aqueous solutions by ion exchange. There is a wide variation in the cation exchange capacity (CEC) of natural zeolites because of the different nature of various zeolites cage structures, natural structural defects, adsorbed ions, and their associated gangue minerals. Naturally occurring zeolites are rarely pure and are contaminated to varying degrees by other minerals, such as clays and feldspars, metals, quartz, or other zeolites as well. These impurities affect the CEC even for samples originated from the same region but from a different source. CEC of the material increases with decreasing impurity content. Potentially exchangeable ions in such impurities do not necessarily participate in ion exchange mechanism, while, in some cases, impurities may additionally block the access to active sites. For zeoliferous rocks having the same percentage of a zeolitic phase, the CEC increases with decreasing Si/Al ratio, as the more Si ions are substituted by Al ions, the more negative the valence of the matrix becomes. Sodium seems to be the most effective exchangeable ion for lead. On the contrary, it is unlikely that the potassium content of the zeolite would be substituted. A pretreatment with high concentration solutions of Na, such as 2 M NaCl, can

  12. Lead isotope ratio analysis of bullet samples by using quadrupole ICP-MS

    International Nuclear Information System (INIS)

    The measurement conditions for the precise analysis of the lead stable isotope ratio by using an ICP-MS equipped with a quadrupole mass spectrometer were studied in order to apply the technique to the forensic identification of bullet samples. The values of the relative standard deviation obtained for the ratios 208Pb/206Pb, 207Pb/206Pb and 204Pb/206Pb were lower than 0.2% after optimization of the analytical conditions, including an optimum lead concentration of the sample solution of about 70 ppb and an integration time of 15 s per m/z. This method was applied to an analysis of lead in bullets for rifles and handguns; a stable isotope ratio of lead was found to be suitable for the identification of bullets. This study has demonstrated that the lead isotope ratio measured by using a quadrupole ICP-MS was useful for a practical analysis of bullet samples in forensic science. (author)

  13. Cadmium and lead interaction with diatom surfaces: A combined thermodynamic and kinetic approach

    Science.gov (United States)

    Gélabert, A.; Pokrovsky, O. S.; Schott, J.; Boudou, A.; Feurtet-Mazel, A.

    2007-08-01

    This work is devoted to the physico-chemical study of cadmium and lead interaction with diatom-water interfaces for two marine planktonic (Thalassiosira weissflogii, TW; Skeletonema costatum, SC) and two freshwater periphytic species (Achnanthidium minutissimum, AMIN; Navicula minima, NMIN) by combining adsorption measurements with surface complexation modeling. Adsorption kinetics was studied as a function of pH and initial metal concentration in sodium nitrate solution and in culture media. Kinetic data were consistent with a two-step mechanism in which the loss of a water molecule from the inner coordination sphere of the metal is rate limiting. Reversible adsorption experiments, with 3 h of exposure to metal, were performed as a function of pH (2-9), metal concentration in solution (10^-9-10^-3 M), and ionic strength (10^-3-1.0 M). While the shape of the pH-dependent adsorption edge is similar among all four diatom species, the constant-pH adsorption isotherm and maximal binding capacities differ. Measurements of electrophoretic mobilities (μ) revealed negative surface potential for the AMIN diatom; however, the absolute value of μ decreases with increase of [Pb2+]aq, suggesting metal adsorption on negative surface sites. These observations allowed us to construct a surface complexation model (SCM) for cadmium and lead binding by diatom surfaces that postulates the Constant Capacitance of the electric double layer and considers Cd and Pb complexation with mainly carboxylic and, partially, silanol groups. In the full range of investigated Cd concentration, the SCM is able to describe the concentration of adsorbed metal as a function of [Cd2+]aq without implying the presence of high-affinity, low-abundance sites that are typically used to model the metal interactions with natural multi-component organic substances. At the same time, the fast initial Cd reaction requires the presence of "highly reactive sites" whose concentration represents only 2.5-3% of the

  14. Our On-Its-Head-and-In-Your-Dreams Approach Leads to Clean Energy

    Energy Technology Data Exchange (ETDEWEB)

    Kazmerski, Lawrence; Gwinner, Don; Hicks, Al

    2013-07-18

    Representing the Center for Inverse Design (CID), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of the CID is to revolutionize the discovery of new materials by design with tailored properties through the development and application of a novel inverse design approach powered by theory guiding experiment with an initial focus on solar energy conversion.

  15. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    The methods of approach developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, which is divided into three parts. Part 1 relates the study of automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This approach was implemented by developing a simulation program that made it possible to establish the proper selection algorithms in order to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16, 32 or 64 grey-level scale, and is processed by a pattern recognition program isolating the chromosomes and investigating their characteristic features (arm tips, centromere areas), in order to get measurements equivalent to the lengths of the four arms. Part 3 studies a program of automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images unpaired by the program, which are of special interest for the biologist. (author)

  16. Tourism Destinations Network Analysis, Social Network Analysis Approach

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Full Text Available The tourism industry is becoming one of the world's largest economical sources, and is expected to become the world's first industry by 2020. Previous studies have focused on several aspects of this industry including sociology, geography, tourism management and development, but have paid less attention to analytical and quantitative approaches. This study introduces some network analysis techniques and measures aiming at studying the structural characteristics of tourism networks. More specifically, it presents a methodology to analyze tourism destinations network. We apply the methodology to analyze Mazandaran's tourism destination network, one of the most famous tourism areas of Iran.

  17. A meta-analysis of studies investigating the effects of lead exposure on nerve conduction

    Energy Technology Data Exchange (ETDEWEB)

    Krieg, Edward F.; Chrislip, David W.; Brightwell, W.S. [National Institute for Occupational Safety and Health, Robert A. Taft Laboratories, Cincinnati, OH (United States)

    2008-08-15

    Group means from nerve conduction studies of persons exposed to lead were used in a meta-analysis. Differences between the control and exposed groups, and the slopes between nerve conduction measurements and log10 blood lead concentrations were estimated using mixed models. Conduction velocity was reduced in the median, ulnar, and radial nerves in the arm, and in the deep peroneal nerve in the leg. Distal latencies of the median, ulnar, and deep peroneal nerves were longer. No changes in the amplitudes of compound muscle or nerve action potentials were detected. The lowest concentration at which a relationship with blood lead could be detected was 33.0 µg/dl for the nerve conduction velocity of the median sensory nerve. Lead may reduce nerve conduction velocity by acting directly on peripheral nerves or by acting indirectly, for example, on the kidney or liver. (orig.)
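
    A toy version of the pooled regression (weighted least squares of conduction velocity on log10 blood lead, standing in for the mixed models actually used; group means and weights below are invented) looks like this:

      import numpy as np

      blood_pb = np.array([12.0, 25.0, 33.0, 41.0, 55.0, 62.0])   # group mean blood lead, ug/dL
      ncv = np.array([57.8, 56.9, 56.1, 55.4, 54.2, 53.9])        # group mean conduction velocity, m/s
      weights = np.array([40.0, 25.0, 30.0, 20.0, 15.0, 18.0])    # e.g. group sizes

      X = np.column_stack([np.ones_like(blood_pb), np.log10(blood_pb)])
      W = np.diag(weights)
      beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ ncv)
      print(f"slope: {beta[1]:.2f} m/s per tenfold increase in blood lead")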

  18. Near-crater discoloration of white lead in wall paintings during laser induced breakdown spectroscopy analysis

    Science.gov (United States)

    Bruder, R.; L'Hermite, D.; Semerok, A.; Salmon, L.; Detalle, V.

    2007-12-01

    During Laser-Induced Breakdown Spectroscopy (LIBS) analysis of white lead pigment (basic lead carbonate, 2PbCO3·Pb(OH)2), used in wall paintings of historical interest, a yellow-brown discoloration has been observed around the crater. This phenomenon faded after a few days exposure under ambient atmosphere. It was established that the mechanism of this discoloration consists in lead oxides (PbO) formation. It was verified by further experiments under argon atmosphere that recombination of lead with oxygen in the plasma plume produces the oxides, which settle around the crater and induce this discoloration. The impact of discoloration on the artwork's aesthetic aspect and the role of atmosphere on discoloration attenuation are discussed. The mechanism is studied on three other pigments (malachite, Prussian blue and ultramarine blue) and threshold for discoloration occurrence is estimated.

  19. Thermodynamic properties of lead at high temperatures and high pressures-mean-field potential approach

    International Nuclear Information System (INIS)

    In the present paper, we report theoretical calculations for the thermodynamic properties of lead (Pb) at high temperatures and pressures. We use the mean-field potential (MFP) model proposed recently by Wang and Li (Phys. Rev. B 62 (2000) 196) for evaluating the vibrational contribution of the lattice ion to the total free energy. The MFP seen by the lattice ion is constructed, for the first time, in terms of the total energy-volume relation using the local pseudopotential due to Fiolhais et al. (Phys. Rev. B 51 (1995) 14001). We have calculated static compression, shock-wave compression, thermal expansion (βP), isothermal and adiabatic bulk moduli (BT and BS), internal energy, specific heats (CV and CP), thermodynamic Gruneisen parameter (γth), anharmonic contribution to the specific heat and temperature along shock Hugoniot. The results are satisfactorily comparable with those generated through first-principles methods, other theoretical methods and with experiments. We demonstrate that in comparison with other theoretical models, the present model has the advantages of computational simplicity and physical transparency

  20. Lead toxicity to Lemna minor predicted using a metal speciation chemistry approach.

    Science.gov (United States)

    Antunes, Paula M C; Kreager, Nancy J

    2014-10-01

    In the present study, predictive measures for Pb toxicity to Lemna minor were developed from bioassays with 7 surface waters having varied chemistries (0.5-12.5 mg/L dissolved organic carbon, pH of 5.4-8.3, and water hardness of 8-266 mg/L CaCO3). As expected based on water quality, 10%, 20%, and 50% inhibitory concentration (IC10, IC20, and IC50, respectively) values expressed as percent net root elongation (%NRE) varied widely (e.g., IC20s ranging from 306 nM to >6920 nM total dissolved Pb), with unbounded values limited by Pb solubility. In considering chemical speciation, %NRE variability was better explained when both Pb hydroxides and the free lead ion were defined as bioavailable (i.e., f_OH) and colloidal Fe(III)(OH)3 precipitates were permitted to form and sorb metals (using FeOx as the binding phase). Although cause and effect could not be established because of covariance with alkalinity (p = 0.08), water hardness correlated strongly (r² = 0.998, p minor and highlight the importance of chemical speciation in Pb-based risk assessments for aquatic macrophytes. PMID:25044009

  1. Ancillary Resistor leads to Sparse Glitches: an Extra Approach to Avert Hacker using Syndicate Browser Design

    Directory of Open Access Journals (Sweden)

    Devaki Pendlimarri

    2012-01-01

    Full Text Available After the invention of internet most of the people all over the world have become a fan of it because of its vast exploitation for information exchange, e-mail, e-commerce etc. for their easy leading of life. On the other side, may be equally or less/more, many people are also using it for the purpose of hacking the information which is being communicated. Because, the data/information that is being communicated through the internet is via an unsecured networks. This gives breaches to the hacker who is known as the man-in-the-middle to hack the data/information. In this paper, we describe some novel methodologies to prevent the hacker in hacking the data/information. The web browser design is being  carried out in our R&D lab and we have found that the novel methodology has given solution to prevent the man-in-the-middle from several attacks.

  2. Mercury-Free Analysis of Lead in Drinking Water by Anodic Stripping Square Wave Voltammetry

    Science.gov (United States)

    Wilburn, Jeremy P.; Brown, Kyle L.; Cliffel, David E.

    2007-01-01

    The analysis of drinking water for lead, which has well-known health effects, is presented as an instructive example for undergraduate chemistry students. It allows the students to perform an experiment and evaluate the results in order to monitor a risk factor and common hazard of everyday life.
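
    A minimal standard-addition calculation of the kind such a lab exercise might include (all currents and concentrations below are invented, and the article's exact protocol is not reproduced):

      import numpy as np

      # Added Pb (ppb) vs anodic stripping peak current (arbitrary units) for one sample.
      added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
      peak = np.array([0.42, 0.71, 1.01, 1.58, 2.77])

      slope, intercept = np.polyfit(added, peak, 1)
      concentration = intercept / slope          # magnitude of the x-intercept
      print(f"estimated Pb in the sample: {concentration:.1f} ppb")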

  3. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual

  4. Lead in Hair and in Red Wine by Potentiometric Stripping Analysis: The University Students' Design.

    Science.gov (United States)

    Josephsen, Jens

    1985-01-01

    A new program for training upper secondary school chemistry teachers (SE 537 693) depends heavily on student project work. A project in which lead in hair and in red wine was examined by potentiometric stripping analysis is described and evaluated. (JN)

  5. An Analysis of the Factors Leading to Rising Credit Risk in the Zimbabwe Banking Sector

    OpenAIRE

    Maxwell Sandada; Agness Kanhukamwe

    2016-01-01

    The study sought to analyse the factors that lead to rising credit risk in the Zimbabwean banking sector. The objective was to ascertain the impact of macroeconomic, industry and bank-specific factors on rising credit risk in Zimbabwe. The study aimed to contribute to the credit risk management literature by providing evidence from a Sub-Saharan context. Anchored in the positivist quantitative research approach, a survey was carried out to gather the data, which were analysed using ...

  6. X-ray Structure Refinements and Strain Analysis of Substituted Cubic Lead Pyrochlores Pb

    Energy Technology Data Exchange (ETDEWEB)

    Nalini, G.; Somashekar, R.; Guru Row T. N.

    2001-01-01

    The phase diagrams in the PbO-Nb2O5 system and the PbO-Ta2O5 system depict a pyrochlore structure at certain molar ratios. Compositions Pb2Nb1.51Pb0.49O6.30 (1), Pb2Ta1.4Pb0.6O6.21 (2), and Pb2Ta1.25Pb0.75O6.57 (3), belonging to this family, are refined in the cubic space group Fd-3m (Z = 8; lattice parameter a = 10.762(1), 10.744(1), 10.757(5) Å, respectively) using the Rietveld refinement approach. The analyses suggest that the B-site is partially occupied by Pb, leading to the general formula Pb2(M2-yPby)O7-δ (0.0 < y < 0.8; M = Nb or Ta). There is an overall broadening observed in the X-ray peak widths in 1, 2, and 3 compared to the Pb-deficient parent phases. It is observed that the X-ray peak widths of 2 are broad, while 3 displays narrow peak widths. It is found via strain analysis that the line broadening observed correlates with the strain in the lattice.

  7. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    The objectives of this work were to quantify the lead content in two types of canned chilli from three brands and determine whether it is within the maximum permissible level (2 ppm), to compare two brands sold in both glass and can presentations in order to determine the effect of the container on the final lead content, and to carry out a comparative study of the techniques in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and x-ray fluorescence. The preliminary treatment of the samples was calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the analysis by x-ray fluorescence, after solubilizing the ashes, the lead is precipitated with PCDA (pyrrolidine carbodithioic acid, ammonium salt) and filtered; the filter paper is dried and counted directly. The standards were prepared following the same procedure as for the samples, using a lead Titrisol solution. For each technique the percent recovery was determined by the addition of a known amount. Calibration curves were plotted for each technique, and all three were found to be linear in the established working range. The percent recovery was above ninety five percent in all three cases. By means of a variance analysis it was determined that the lead content in the samples does not exceed 2 ppm, and that the lead content in canned chillis is higher than that in glass containers (1.7 and 0.4 ppm, respectively). The x-ray fluorescence results differ from those obtained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)

  8. European Lead-cooled SYstem core design: an approach towards sustainability

    International Nuclear Information System (INIS)

    This paper deals with the neutronic design of ELSY (the European Lead-cooled SYstem), a 600 MWe Fast Reactor developed within the 6th EURATOM Framework Programme. The overall core layout, characterized by open square Fuel Assemblies in a rectangular staggered lattice configuration mostly defined by complying with mechanical and seismic constraints, has been optimized in order to obtain a flat power/Fuel Assembly distribution (maximum-to-average ratio: 1.2). The power-to-flow ratio is locally adjusted by changing the fissile (Plutonium) enrichment at different radial positions in the core. Three independent scram systems have been introduced in order to achieve the required reliability for reactor shutdown and safety: eight traditional concept Control Rod assemblies together with two sets of sparse control 'Finger' Absorber Rods, small B4C rods that can be inserted, in principle, in the centre of each FA. One of the two finger absorber systems includes a motorized subset devoted to the regulation of the criticality swing during the cycle: their number can be limited indeed since the small reactivity swing (some hundreds pcm) due to the about unitary breeding ratio. Such an innovative solution can also be positioned in order to maintain an optimal power flattening during the fuel cycle. The core design of ELSY has been organized aiming also at showing that it is possible to realize an 'adiabatic' reactor, i.e. a reactor self-sustainable in Plutonium and burning its own generated Minor Actinides. This complies with the sustainability goal of Generation IV systems: for the implementation of a closed fuel cycle the forthcoming reactors would have to base their operation upon the net 'conversion' of either Natural or Depleted Uranium into Fission Products only. (author)

  9. Global approach of emergency response, reflection analysis

    International Nuclear Information System (INIS)

    The emergency response management approach must be dealt with adequately within company strategy, since a badly managed emergency situation can adversely affect a company, not only in terms of assets, but also through the negative impact on its credibility, profitability and image. Three main pillars support the management of the response in an emergency situation: a) Diagnosis, b) Prognosis, c) Communications. To reach these capabilities, co-ordination of different actions is necessary at the following levels: i. Facility Operation implies Local level. ii. Facility Property implies National level. iii. Local Authority implies Local level. iv. National Authority implies National level. Taking all of the above into account, the following functions must be covered: a) Management: incorporating the communication, diagnosis and prognosis areas. b) Decision: incorporating communication and information means. c) Services: to facilitate the decision, as well as its execution. d) Analysis: to clarify the situations on which decisions are based. e) Documentation: to gather the information needed by the analysts and decision makers. (Author)

  10. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems—critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  11. A Community-Based Approach to Leading the Nation in Smart Energy Use

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2013-12-31

    Project Objectives The AEP Ohio gridSMART® Demonstration Project (Project) achieved the following objectives: • Built a secure, interoperable, and integrated smart grid infrastructure in northeast central Ohio that demonstrated the ability to maximize distribution system efficiency and reliability and consumer use of demand response programs that reduced energy consumption, peak demand, and fossil fuel emissions. • Actively attracted, educated, enlisted, and retained consumers in innovative business models that provided tools and information reducing consumption and peak demand. • Provided the U.S. Department of Energy (DOE) information to evaluate technologies and preferred smart grid business models to be extended nationally. Project Description Ohio Power Company (the surviving company of a merger with Columbus Southern Power Company), doing business as AEP Ohio (AEP Ohio), took a community-based approach and incorporated a full suite of advanced smart grid technologies for 110,000 consumers in an area selected for its concentration and diversity of distribution infrastructure and consumers. It was organized and aligned around: • Technology, implementation, and operations • Consumer and stakeholder acceptance • Data management and benefit assessment Combined, these functional areas served as the foundation of the Project to integrate commercially available products, innovative technologies, and new consumer products and services within a secure two-way communication network between the utility and consumers. The Project included Advanced Metering Infrastructure (AMI), Distribution Management System (DMS), Distribution Automation Circuit Reconfiguration (DACR), Volt VAR Optimization (VVO), and Consumer Programs (CP). These technologies were combined with two-way consumer communication and information sharing, demand response, dynamic pricing, and consumer products, such as plug-in electric vehicles and smart appliances. In addition, the Project

  12. A Novel Approach for the Removal of Lead(II Ion from Wastewater Using Mucilaginous Leaves of Diceriocaryum eriocarpum Plant

    Directory of Open Access Journals (Sweden)

    Joshua N. Edokpayi

    2015-10-01

    Full Text Available Lead(II) ion is a very toxic element known to cause detrimental effects to human health even at very low concentrations. An adsorbent prepared using mucilaginous leaves from the Diceriocaryum eriocarpum plant (DEP) was used for the adsorption of lead(II) ions from aqueous solution. Batch experiments were performed on simulated aqueous solutions under optimized conditions of adsorbent dosage, contact time, pH and initial lead(II) ion concentration at 298 K. The Langmuir isotherm model described the adsorption process more suitably than the Freundlich model, with linearized coefficients of 0.9661 and 0.9547, respectively. A pseudo-second order kinetic equation best described the kinetics of the reaction. Fourier transform infra-red analysis confirmed the presence of amino (–NH), carbonyl (–C=O) and hydroxyl (–OH) functional groups. Application of the prepared adsorbent to wastewater samples of 10 mg/L and 12 mg/L lead(II) ion concentration taken from a waste stabilization pond showed removal efficiencies of 95.8% and 96.4%, respectively. Furthermore, 0.1 M HCl was a better desorbing agent than 0.1 M NaOH and de-ionized water. The experimental data obtained demonstrated that mucilaginous leaves from DEP can be used as a suitable adsorbent for lead(II) ion removal from wastewater.
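
    For readers unfamiliar with the isotherm comparison above, the linearized Langmuir form (Ce/qe = Ce/qmax + 1/(KL·qmax)) and Freundlich form (ln qe = ln KF + (1/n) ln Ce) can each be fitted by simple linear regression and compared through their coefficients of determination. The sketch below uses invented equilibrium data, not the study's measurements:

    ```python
    import numpy as np

    # Hypothetical equilibrium data: Ce (mg/L) and qe (mg/g) for Pb(II) adsorption.
    Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
    qe = np.array([4.8, 9.5, 14.2, 18.1, 20.3])

    # Linearized Langmuir: Ce/qe = Ce/qmax + 1/(KL*qmax)
    slope_L, intercept_L = np.polyfit(Ce, Ce / qe, 1)
    qmax = 1.0 / slope_L
    KL = 1.0 / (intercept_L * qmax)
    r2_L = np.corrcoef(Ce, Ce / qe)[0, 1] ** 2

    # Linearized Freundlich: ln(qe) = ln(KF) + (1/n)*ln(Ce)
    slope_F, intercept_F = np.polyfit(np.log(Ce), np.log(qe), 1)
    KF, n = np.exp(intercept_F), 1.0 / slope_F
    r2_F = np.corrcoef(np.log(Ce), np.log(qe))[0, 1] ** 2

    print(f"Langmuir:   qmax={qmax:.1f} mg/g, KL={KL:.3f} L/mg, r^2={r2_L:.4f}")
    print(f"Freundlich: KF={KF:.2f}, n={n:.2f}, r^2={r2_F:.4f}")
    ```

    The model with the higher linearized coefficient is taken as the better descriptor, which is how the abstract arrives at the Langmuir model (0.9661 versus 0.9547).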

  13. Finite element analysis of vibration energy harvesting using lead-free piezoelectric materials: A comparative study

    Directory of Open Access Journals (Sweden)

    Anuruddh Kumar

    2014-06-01

    Full Text Available In this article, the performance of various piezoelectric materials is simulated for the unimorph cantilever-type piezoelectric energy harvester. The finite element method (FEM) is used to model the piezolaminated unimorph cantilever structure. The first-order shear deformation theory (FSDT) and linear piezoelectric theory are implemented in the finite element simulations. The genetic algorithm (GA) optimization approach is carried out to optimize the structural parameters of the mechanical energy-based energy harvester for maximum power density and power output. The numerical simulation demonstrates the performance of lead-free piezoelectric materials in the unimorph cantilever-based energy harvester. The lead-free piezoelectric material K0.5Na0.5NbO3-LiSbO3-CaTiO3 (2 wt.%) has demonstrated maximum mean power and maximum mean power density for the piezoelectric energy harvester in the ambient frequency range of 90–110 Hz. Overall, the lead-free piezoelectric materials of the K0.5Na0.5NbO3-LiSbO3 (KNN-LS) family have shown better performance than the conventional lead-based piezoelectric material lead zirconate titanate (PZT) in the context of piezoelectric energy harvesting devices.

  14. Three Approaches to Data Analysis Test Theory, Rough Sets and Logical Analysis of Data

    CERN Document Server

    Chikalov, Igor; Lozina, Irina; Moshkov, Mikhail; Nguyen, Hung Son; Skowron, Andrzej; Zielosko, Beata

    2013-01-01

    In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), with first publications in 1955 and 1958; Rough Sets, founded by Zdzisław I. Pawlak (1926-2006), with first publications in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), with first publications in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have a limited knowledge about the results and methods developed in the other two. On the other hand, each of the approaches shows some originality and we believe that the exchange of knowledge can stimulate further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.

  15. A User Requirements Analysis Approach Based on Business Processes

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes at the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  16. An integrated experimental-modeling approach to study the acid leaching behavior of lead from sub-micrometer lead silicate glass particles

    International Nuclear Information System (INIS)

    Highlights: • Generation of particles by laser ablation of lead silicate glass. • Collection of particles on filters and continuous acid leaching and ICP-MS monitoring. • Fitting of the lead leaching profile to a mathematical intraparticle diffusion model. • Extraction of individual leaching profiles for selected mono-dispersed size fractions. • Leaching kinetics is based on ion-exchange and correlated with particle size. -- Abstract: This work focuses on the development of a procedure to study the mechanism of leaching of lead from sub-micrometer lead glass particles using 0.3 mol l−1 HNO3 as a leachant. Glass particles with an effective size distribution range from 0.05 to 1.4 μm were generated by laser ablation (213 nm Nd:YAG laser) and collected on an inline 0.2 μm syringe filter. Subsequently, the glass particles on the filter were subjected to online leaching and continuous monitoring of lead (Pb-208) in the leachate by quadrupole ICP-MS. The lead leaching profile, aided by the particle size distribution information from cascade impaction, was numerically fitted to a mathematical model based on the glass intraparticle diffusion, liquid film distribution and thermodynamic glass-leachant distribution equilibrium. The findings of the modeling show that the rate-limiting step of leaching is the migration of lead from the core to the surface of the glass particle by an ion-exchange mechanism, governed by the apparent intraparticle lead diffusivity in glass which was calculated to be 3.1 × 10−18 m2 s−1. Lead leaching is illustrated in the form of graphs and animations of intraparticle lead release (in time and intraparticle position) from particles with sizes of 0.1 and 0.3 μm
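
    For orientation, the simplest limiting case of such intraparticle diffusion (a sphere of radius r with constant surface conditions and apparent diffusivity D, ignoring the liquid-film and distribution-equilibrium terms of the full model described above) gives the classical Crank expression for the leached fraction F(t):

    ```latex
    F(t) = 1 - \frac{6}{\pi^2}\sum_{n=1}^{\infty}\frac{1}{n^2}
           \exp\!\left(-\frac{n^2\pi^2 D t}{r^2}\right)
    ```

    In this picture the characteristic leaching time scales as r^2/(\pi^2 D), so with an apparent diffusivity of the order of 3.1 × 10−18 m2 s−1 the smaller particles release their lead markedly faster, consistent with the correlation between leaching kinetics and particle size noted in the highlights.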

  17. Analysis of fiber-optic lead-silicate glasses by nuclear-physics methods

    International Nuclear Information System (INIS)

    An elemental analysis of the near-surface layers of original and reduced lead-silicate glasses is performed by means of nondestructive testing methods using accelerated charged particles. The concentration profiles of hydrogen (to a depth of 0.4 μm) and lead (to a depth of 3.5 μm) are measured in 15 pairs of samples of different melts. Based on the resulting data, conclusions are reached with regard to the concentration profile of lead, reduced in the thermohydrogen treatment process to a depth of 0.4 μm. These results show how the quantities that characterize the surface composition of the samples and the volume of the glass depend on the original amount of hydrogen within the glass upon completion of the melt. 15 refs., 5 figs

  18. Analysis of radial basis function interpolation approach

    Institute of Scientific and Technical Information of China (English)

    Zou You-Long; Hu Fa-Long; Zhou Can-Can; Li Chao-Liu; Dunn Keh-Jim

    2013-01-01

    The radial basis function (RBF) interpolation approach proposed by Freedman is used to solve inverse problems encountered in well-logging and other petrophysical issues. The approach predicts petrophysical properties in the laboratory on the basis of physical rock datasets, which include the formation factor, viscosity, permeability, and molecular composition. However, this approach does not consider the effect of the spatial distribution of the calibration data on the interpolation result. This study proposes a new RBF interpolation approach, based on Freedman's RBF interpolation approach, in which the unit basis functions are uniformly populated in the space domain. The inverse results of the two approaches are comparatively analyzed using our datasets. We determine that although the interpolation effects of the two approaches are equivalent, the new approach is more flexible and beneficial for reducing the number of basis functions when the database is large, resulting in a simpler interpolation function expression. However, the predictions for the central data are not sufficiently accurate when the data clusters are far apart.
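
    As a rough sketch of RBF interpolation of the kind discussed above, with Gaussian basis functions whose centres are uniformly populated over the domain rather than placed on the calibration points (the data, kernel width and centre count below are illustrative choices, not Freedman's actual formulation):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 1-D calibration data: rock measurement x and target property y.
    x_data = rng.uniform(0.0, 10.0, 30)
    y_data = np.sin(x_data) + 0.1 * rng.normal(size=30)

    # Uniformly populate the basis-function centres over the space domain.
    centres = np.linspace(0.0, 10.0, 15)
    width = 1.0   # illustrative kernel width

    def design(x):
        """Gaussian RBF design matrix: one column per centre."""
        x = np.asarray(x, dtype=float)
        return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

    # Least-squares weights for the basis functions.
    weights, *_ = np.linalg.lstsq(design(x_data), y_data, rcond=None)

    def predict(x_new):
        return design(x_new) @ weights

    print(predict([2.5, 7.0]))
    ```

    Because the centres are fixed on a grid, their number does not grow with the size of the calibration database, which is the flexibility advantage the abstract points to.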

  19. Isotope ratio analysis of lead in biological materials by inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Inductively coupled plasma mass spectrometry (ICP-MS) allowed 0.2-0.3% imprecision (1 sigma) in 204Pb/206Pb, 207Pb/206Pb, and 208Pb/206Pb measurements at the 20-100 ppb level, which was precise enough to detect some of the isotopic variations observed in nature. Mass discrimination could be corrected to within ±0.5% of the true value by periodic analysis of a standard reference material of known lead isotopic composition. As a separation method for lead in human bone, which contains enormous amounts of calcium and phosphorus, anion exchange of the Pb-Br complex was found to be effective. Lead isotope ratios in bone, measured by ICP-MS after separation, were consistent with those measured by thermal ionization mass spectrometry. Hair matrix did not have any influence on the accuracy and precision of the analysis; a digested sample could be directly analyzed and this offered rapid sample throughput. Preliminary data on lead isotope ratios in bone and hair from prehistoric and contemporary Japanese are presented. (author)
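
    Mass discrimination corrections of the kind mentioned above are commonly applied with a per-mass-unit bias factor determined from a reference material of certified isotopic composition; the linear law below is a generic illustration (exponential or power laws are also in common use), not necessarily the exact correction applied in this work:

    ```latex
    R_{\text{true}} = R_{\text{meas}}\,(1 + \varepsilon\,\Delta m),
    \qquad
    \varepsilon = \frac{R_{\text{cert}}/R_{\text{std,meas}} - 1}{\Delta m}
    ```

    Here Δm is the mass difference of the isotope pair, R_cert the certified ratio and R_std,meas the ratio measured for the standard; periodic re-measurement of the standard, as described in the abstract, keeps ε up to date as instrument conditions drift.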

  20. The measurement of the chemically mobile fraction of lead in soil using isotopic dilution analysis

    International Nuclear Information System (INIS)

    The chemically available fraction of lead in eight soils, measured by isotopic dilution analysis using 212Pb, ranged from 7 to 16% of the total lead content of the soil. The soluble fractions reached values of up to 63% of the total content in 1 M NH4NO3, 1 M MgCl2 and 0.05 M DTPA solutions. Increasing the contact time between water and soil, increasing the water-soil ratio from 1:1 to 5:1 and increasing the temperature of the soil-water suspension raised the chemically available fraction in soil. Comparing various soil parameters with the mobile fraction of lead, only pH shows a significant correlation. The amphoteric character of lead causes a minimum of mobility at about pH 6; pH values below 6 are responsible for the higher mobility of lead as Pb2+, while at pH values above 6 soluble hydroxy and humic acid complexes are formed. (orig.)

  1. An integrated microfluidic chip with 40 MHz lead-free transducer for fluid analysis

    International Nuclear Information System (INIS)

    The design, fabrication, and evaluation of a high-frequency transducer made from lead-free piezoceramic for the application of microfluidic analysis are described. Barium strontium zirconate titanate [(Ba0.95Sr0.05)(Zr0.05Ti0.95)O3, abbreviated as BSZT] ceramic has been chosen as the active element of the transducer. The center frequency and bandwidth of this high-frequency ultrasound transducer have been measured to be 43 MHz and 56.1%, respectively. The transducer was integrated into a microfluidic channel and used to measure the sound velocity and attenuation of the liquid flowing in the channel. Results suggest that lead-free high-frequency transducers could be used for in situ analysis of the properties of the fluid flowing through the microfluidic system.

  2. Leading order analysis of neutrino induced dimuon events in the CHORUS experiment

    International Nuclear Information System (INIS)

    We present a leading order QCD analysis of a sample of neutrino induced charged-current events with two muons in the final state originating in the lead-scintillating fibre calorimeter of the CHORUS detector. The results are based on a sample of 8910 neutrino and 430 antineutrino induced opposite-sign dimuon events collected during the exposure of the detector to the CERN Wide Band Neutrino Beam between 1995 and 1998. The analysis yields a value of the charm quark mass of mc=(1.26±0.16±0.09)GeV/c2 and a value of the ratio of the strange to non-strange sea in the nucleon of κ=0.33±0.05±0.05, improving the results obtained in similar analyses by previous experiments

  3. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and has caused billions of dollars in damages each year. A large volume of new malware samples are discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in order to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  4. Inventory policies analysis under demand patterns and lead times constraints in a real supply chain

    OpenAIRE

    De Sensi, Giuseppe; Longo, Francesco; Mirabelli, Giovanni

    2008-01-01

    Abstract This paper reports a study on a real three-echelon supply chain operating in the beverage sector. The authors, starting from the actual supply chain configuration, propose a detailed study of the inventory systems. The test of a comprehensive set of different operative scenarios, in terms of customers' demand intensity, customers' demand variability and lead times, becomes a powerful tool for inventory systems analysis along the supply chain. The main objective is the comp...

  5. Analysis of Arsenic, Lead and Mercury in Farming Areas with Mining Contaminated Soils at Zacatecas, Mexico

    OpenAIRE

    Elvira Santos-Santos; Mario Yarto-Ramírez; Irma Gavilán-García; José Castro-Díaz; Arturo Gavilán-García; René Rosiles; Sara Suárez; Tania López-Villegas

    2006-01-01

    This study was conducted in order to identify the concentration of heavy metals (As, Pb, Hg) in contaminated soil in the county of Guadalupe, Zacatecas, in order to provide elements for decision making by the authorities and to identify exposure routes and further research activities to evaluate and reduce environmental and health risks at the site. Analyses were performed using EPA methods SW 846: 3050B/6010B for arsenic and lead (FLAAS), and EPA SW 846: 7471A for total mercury (CVAAS). Compared...

  6. Analysis of Leading Cities in Central Europe: Control of Regional Economy

    OpenAIRE

    Csomós György (1974-) (geographer)

    2011-01-01

    Nowadays, one of the characteristic orientations in social science studies focusing on cities is the ranking of cities, as well as the definition of the world's leading cities (world cities, global cities) on the basis of various criteria. Central European countries are given only a minor role in this research, particularly in comparison with German cities and their considerable economic performance. This analysis compares the large cities of Austria, Germany and the countries of the Vise...

  7. Polyphase Order Analysis Based on Convolutional Approach

    OpenAIRE

    M. Drutarovsky

    1999-01-01

    The condition of rotating machines can be determined by measuring of periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppress spectral smearing even with variable rotational ...

  8. Approaches to Workflow Analysis in Healthcare Settings

    OpenAIRE

    Sheehan, Barbara; Bakken, Suzanne

    2012-01-01

    Attention to workflow is an important component of a comprehensive approach to designing usable information systems. In healthcare, inattention to workflow is associated with poorly accepted systems and unforeseen effects of use. How best to examine workflow for the purpose of system design is in itself the subject of scientific inquiry. Several disciplines offer approaches to the study of workflow that can be tailored to meet the needs of systems designers in healthcare settings. This paper ...

  9. Production efficiency evaluation: analysis of approaches

    Directory of Open Access Journals (Sweden)

    Yuliya Litkovets

    2013-11-01

    Full Text Available The most common approaches to effectiveness evaluation in both national and foreign management practice are reviewed. The system of generalized and partial efficiency evaluation indicators that is widespread in domestic management practice is grounded from the standpoint of a resource-cost approach. Given the market requirements imposed on businesses to ensure competitiveness, it is necessary to take into account the requirements of all contractors, which are reflected in financial indicators.

  10. Thermal analysis of selected tin-based lead-free solder alloys

    DEFF Research Database (Denmark)

    Palcut, Marián; Sopoušek, J.; Trnková, L.; Hodúlová, E.; Szewczyková, B.; Ožvold, M.; Turňa, M.; Janovec, J.

    2009-01-01

    The Sn-Ag-Cu alloys have favourable solderability and wetting properties and are, therefore, being considered as potential lead-free solder materials. In the present study, tin-based Sn-Ag-Cu and Sn-Ag-Cu-Bi alloys were studied in detail by differential scanning calorimetry (DSC) and ... thermodynamic calculations using the CALPHAD approach. The amount of the alloying elements in the materials was chosen to be close to the respective eutectic composition and the nominal compositions were the following: Sn-3.7Ag-0.7Cu, Sn-1.0Ag-0.5Cu-1Bi (in wt.%). Thermal effects during melting and solidifying ... simulated using the Thermo-Calc software package. This approach enabled us to obtain the enthalpy of cooling for each alloy and to compare its temperature derivative with the experimental DSC curves. ...

  11. The Repurposing of Old Drugs or Unsuccessful Lead Compounds by in Silico Approaches: New Advances and Perspectives.

    Science.gov (United States)

    Martorana, Annamaria; Perricone, Ugo; Lauria, Antonino

    2016-01-01

    Do you have a compound in your lab that was not successful against its designed target, or a drug that is no longer attractive? Drug repurposing represents the right way to reconsider them. It can be defined as the modern, rational counterpart of the traditional methods adopted in drug discovery, based on knowledge, insight and luck, otherwise known as serendipity. This repurposing approach can be applied both in silico and in the wet lab. In this review we report the molecular modeling facilities that can be of huge support in the repurposing of drugs and/or unsuccessful lead compounds. In the last decades, different methods were proposed to help scientists in drug design and in drug repurposing. The steps strongly depend on the approach applied: it could be a ligand- or a structure-based method, correlated with the use of specific means. These processes, starting from a compound with potential therapeutic properties and a sizeable number of passed toxicity tests, can successfully speed up the very slow development of a molecule from bench to market. Herein, we discuss the facilities available to date, classifying them by methods and types. We report a series of databases, ligand- and structure-based stand-alone software, and web-based tools which are freely accessible to the scientific community. This review does not claim to be exhaustive, but can be of interest to help in drug repurposing through in silico methods, as a valuable tool for the medicinal chemistry community. PMID:26881716

  12. Comparison of different approaches for lifetime prediction of electrochemical systems - Using lead-acid batteries as example

    Energy Technology Data Exchange (ETDEWEB)

    Sauer, Dirk Uwe [Electrochemical Energy Conversion and Storage Systems Group, Institute for Power Electronics and Electrical Drives (ISEA), RWTH Aachen University, Jaegerstrasse 17/19, D-52066 Aachen (Germany); Wenzl, Heinz [Electrical Energy Storage Group, Institute of Electrical Power Engineering, Clausthal University of Technology and Beratung fuer Batterien und Energietechnik, Am Bergwaeldchen 27, D-37520 Osterode (Germany)

    2008-02-01

    Different approaches for lifetime prediction for electrochemical energy storage devices are discussed with respect to their general concepts. Examples for their implementation and advantages and disadvantages are given. The models are based on: (a) physical and chemical processes and their interaction as regards ageing effects; (b) weighting of the Ah throughput whenever the operating conditions deviate from the standard conditions used for determining the lifetime under laboratory conditions; (c) an event-oriented concept from mechanical engineering (Woehler curves) which is based on a pattern recognition approach to identify severe operating conditions. Examples and details are explained for lead-acid batteries. The approaches can be applied to other electrochemical technologies including fuel cells. However, it is beyond the scope of this paper, to describe the models in all mathematical details. The models are used in system design and identification of appropriate operating strategies and therefore they must have high computational speed to allow for a comparison of a large number of system variations. (author)
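
    The weighted Ah-throughput concept (approach (b) above) can be sketched as follows: each increment of charge throughput is multiplied by a stress factor that grows when the operating conditions deviate from the reference conditions under which the datasheet lifetime was determined. The factors and functional forms below are invented placeholders, not the authors' calibrated model:

    ```python
    def weighted_ah_throughput(steps, dod_ref=0.5, temp_ref=25.0):
        """Accumulate weighted Ah throughput from (ah, depth_of_discharge, temp_C) steps.

        Each step's Ah is scaled by illustrative stress factors; end of life is
        reached when the weighted total equals the Ah turnover guaranteed under
        reference conditions.
        """
        total = 0.0
        for ah, dod, temp in steps:
            f_dod = (dod / dod_ref) ** 0.5               # deeper cycles age faster (placeholder)
            f_temp = 2.0 ** ((temp - temp_ref) / 10.0)   # Arrhenius-like doubling per 10 K
            total += ah * f_dod * f_temp
        return total

    # Example: three operating intervals described by (Ah, DOD, temperature in C).
    history = [(120.0, 0.3, 20.0), (80.0, 0.8, 30.0), (200.0, 0.5, 25.0)]
    print(f"weighted Ah throughput: {weighted_ah_throughput(history):.1f} Ah")
    ```

    The attraction of this class of model, as the abstract notes, is computational speed: evaluating such a running sum over a long operating profile is cheap enough to compare many system design variants.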

  13. Oxygen concentration diffusion analysis of lead-bismuth-cooled, natural-circulation reactor

    International Nuclear Information System (INIS)

    The feasibility study on fast breeder reactors in Japan has been conducted at JNC and related organizations. The Phase-I study finished in March 2001. During the Phase-I activity, lead-bismuth eutectic coolant was selected as one of the possible coolant options and a medium-scale plant cooled by a lead-bismuth natural circulation flow was studied. On the other hand, lead-bismuth eutectic is known to be corrosive to structural materials, and it was found that oxygen concentration control in the eutectic plays an important role in corrosion protection. In this report, we have developed a concentration diffusion analysis code (COCOA: COncentration COntrol Analysis code) in order to carry out the oxygen concentration control analysis. The code solves a two-dimensional concentration diffusion equation by the finite difference method and can simulate the reaction of oxygen and hydrogen. We verified the basic performance of the code and carried out an oxygen concentration diffusion analysis for the case of an oxygen increase caused by a refueling process in the natural circulation reactor. In addition, the characteristics of the oxygen control system were discussed for different types of control system. It is concluded that the COCOA code can simulate the diffusion of oxygen concentration in the reactor. From the analysis of a natural-circulation medium-scale reactor, we show that ON-OFF control and PID control can control the oxygen concentration well, provided an appropriate concentration measurement point is chosen. Moreover, even when a failure occurs in the oxygen or hydrogen emission system, the control characteristics degrade but it is still possible to control the oxygen concentration. (author)
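
    A minimal sketch of the kind of two-dimensional finite-difference concentration diffusion step such a code solves is given below (uniform grid, explicit time stepping, illustrative diffusivity, grid spacing and boundary handling; this is not the actual JNC implementation and omits the oxygen-hydrogen reaction term):

    ```python
    import numpy as np

    def diffuse_step(c, D, dx, dt, c_boundary):
        """One explicit finite-difference step of dc/dt = D*(d2c/dx2 + d2c/dy2)."""
        lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
               np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4.0 * c) / dx**2
        c_new = c + D * dt * lap
        # Dirichlet boundaries: concentration held at the bulk/controlled value.
        c_new[0, :] = c_new[-1, :] = c_new[:, 0] = c_new[:, -1] = c_boundary
        return c_new

    # Illustrative parameters only (not the reactor geometry or a real diffusivity).
    nx, D, dx = 50, 1.0e-9, 0.02            # grid points, m^2/s, m
    dt = 0.2 * dx**2 / D                    # respect the explicit stability limit D*dt/dx^2 <= 1/4
    background = 1.0e-6
    c = np.full((nx, nx), background)       # uniform initial oxygen concentration
    c[20:30, 20:30] = 5.0e-6                # local oxygen excess, e.g. after refuelling

    for _ in range(1000):
        c = diffuse_step(c, D, dx, dt, background)
    print(f"peak oxygen concentration after diffusion: {c.max():.2e}")
    ```

    An ON-OFF or PID controller of the kind discussed in the abstract would then read the concentration at a chosen measurement point of this field and adjust the oxygen or hydrogen supply accordingly.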

  14. Applied Systems Analysis: A Genetic Approach

    OpenAIRE

    Majone, G.

    1980-01-01

    The International Institute for Applied Systems Analysis is preparing a "Handbook of Systems Analysis," which will appear in three volumes: Volume 1, "Overview," is aimed at a widely varied audience of producers and users of systems analysis; Volume 2, "Methods," is aimed at systems analysts who need basic knowledge of methods in which they are not expert; the volume contains introductory overviews of such methods; Volume 3, "Cases," contains descriptions of actual systems analyses that illus...

  15. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well in order to interpret deterministic safety criteria in quantitative terms. For a further, improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  16. Cadmium and lead residue control in a hazard analysis and critical control point (HACCP) environment.

    Science.gov (United States)

    Pagan-Rodríguez, Doritza; O'Keefe, Margaret; Deyrup, Cindy; Zervos, Penny; Walker, Harry; Thaler, Alice

    2007-02-21

    In 2003-2004, the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) conducted an exploratory assessment to determine the occurrence and levels of cadmium and lead in randomly collected samples of kidney, liver, and muscle tissues of mature chickens, boars/stags, dairy cows, and heifers. The data generated in the study were qualitatively compared to data that FSIS gathered in a 1985-1986 study in order to identify trends in the levels of cadmium and lead in meat and poultry products. The exploratory assessment was necessary to verify that Hazard Analysis and Critical Control Point plans and efforts to control exposure to these heavy metals are effective and result in products that meet U.S. export requirements. A comparison of data from the two FSIS studies suggests that the incidence and levels of cadmium and lead in different slaughter classes have remained stable since the first study was conducted in 1985-1986. This study was conducted to fulfill the FSIS mandate to ensure that meat, poultry, and egg products entering commerce in the United States are free of adulterants, including elevated levels of environmental contaminants such as cadmium and lead. PMID:17249686

  17. Analysis of lead accumulated in vetiver grass using the x-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    This research was conducted to study the feasibility of field application of the X-ray fluorescence (XRF) technique in environmental analysis. The measurements were conducted with a calibration standardization technique and an internal standardization technique, for comparison and optimization. The research analyzed the concentration of lead accumulated in the shoots and roots of vetiver grass grown in lead mine tailings using both XRF techniques. Vetiver was planted on two different tailings concentrations: 50% and 100%. Every 30 days, both concentration treatments were amended with chemical fertilizer (C-treatment), organic fertilizer (O-treatment) and no fertilizer (N-treatment). Vetiver was harvested 120 days after planting. The results show that organic or chemical fertilizer could improve the growth of vetiver growing on all lead tailings concentrations. Vetiver planted on the 100% Pb tailings concentration and amended with chemical fertilizer showed the highest uptake, 182.7 mg. In the analysis section, the quantitative results for Pb showed no significant difference between the two XRF techniques or the results from atomic absorption spectroscopy (AAS).

  18. Analysis and Evaluation of Organizational Change Approaches

    Directory of Open Access Journals (Sweden)

    Yuan Liu

    2009-11-01

    Full Text Available Organizational change is the trend for further development, and explaining it has been the enduring quest of scholars in many disciplines. Prescriptive approaches and emergent approaches are the two main types of models for organizational change. The 'Seven S Framework' of Peters and his colleague shows the interrelationships between different aspects of corporate strategy. Mintzberg developed his rational concept of an organisation as composed of five segments and uses his model flexibly to develop five different configurations of structure.

  19. [Partial lease squares approach to functional analysis].

    Science.gov (United States)

    Preda, C

    2006-01-01

    We extend the partial least squares (PLS) approach to functional data, represented in our models by sample paths of a stochastic process with continuous time. Due to the infinite dimension, when functional data are used as a predictor for linear regression and classification models, the estimation problem is an ill-posed one. In this context, PLS offers a simple and efficient alternative to the methods based on the principal components of the stochastic process. We compare the results given by the PLS approach and other linear models using several datasets from the economic, industrial and medical fields. PMID:17124795
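
    A hedged sketch of the basic idea: once the sample paths are discretised on a time grid they form a wide, highly collinear predictor matrix, which PLS handles gracefully. The example below simply uses scikit-learn's PLSRegression on synthetic paths rather than the functional PLS machinery developed in the paper:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)

    # Hypothetical functional data: 100 sample paths observed at 200 time points.
    t = np.linspace(0.0, 1.0, 200)
    X = np.array([np.sin(2 * np.pi * (t + rng.uniform())) + 0.1 * rng.normal(size=t.size)
                  for _ in range(100)])
    y = X[:, :50].mean(axis=1) + 0.05 * rng.normal(size=100)   # scalar response

    # PLS extracts a few latent components from the many correlated "time" columns
    # that would make an ordinary least-squares fit ill-posed.
    pls = PLSRegression(n_components=3)
    pls.fit(X, y)
    print("R^2 on training paths:", round(pls.score(X, y), 3))
    ```

    The number of components plays the role of the regularisation parameter, analogous to the number of principal components in the PCA-based alternative mentioned in the abstract.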

  20. Sonochemical synthesis of two new nano lead(II) coordination polymers: Evaluation of structural transformation via mechanochemical approach.

    Science.gov (United States)

    Aboutorabi, Leila; Morsali, Ali

    2016-09-01

    Two new lead(II) mixed-ligand coordination polymers, [Pb(PNO)(SCN)]n (1) and [Pb(PNO)(N3)]n (2), (HPNO=picolinic acid N-oxide) were synthesized by a sonochemical method and characterized by scanning electron microscopy, X-ray powder diffraction, IR spectroscopy and elemental analysis. Compounds 1 and 2 were structurally characterized by single crystal X-ray diffraction. The thermal behavior of 1 and 2 were studied by thermal gravimetric analysis. Structural transformations of compounds 1 and 2 were evaluated through anion-replacement processes by mechanochemical method. Moreover, the effect of sonication conditions including time, concentrations of initial reagents and power of irradiation were evaluated on size and morphology of compounds 1 and 2. PMID:27150742

  1. Polyphase Order Analysis Based on Convolutional Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-06-01

    Full Text Available The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling algorithm, which is the most complex part of the complete spectral order algorithm.
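
    The core of order analysis, resampling the vibration signal at equal increments of shaft angle before taking a spectrum, can be sketched as below (synthetic signal and speed profile; plain interpolation-based resampling and a standard FFT rather than the optimised polyphase/chirp-FFT scheme proposed in the paper):

    ```python
    import numpy as np

    fs = 10_000.0                                # Hz, constant-rate acquisition
    t = np.arange(0, 2.0, 1.0 / fs)

    # Synthetic run-up: rotational speed ramps from 20 Hz to 40 Hz.
    f_rot = 20.0 + 10.0 * t
    phase = 2 * np.pi * np.cumsum(f_rot) / fs            # shaft angle in radians
    signal = np.sin(3 * phase) + 0.5 * np.sin(5 * phase)  # 3rd and 5th orders

    # Resample at equal angle increments (order-domain sampling).
    samples_per_rev = 64
    theta_uniform = np.arange(0.0, phase[-1], 2 * np.pi / samples_per_rev)
    sig_angle = np.interp(theta_uniform, phase, signal)

    # Spectrum of the angle-domain signal: peaks appear at integer orders,
    # unsmeared by the changing speed.
    spec = np.abs(np.fft.rfft(sig_angle * np.hanning(sig_angle.size)))
    orders = np.fft.rfftfreq(sig_angle.size, d=1.0 / samples_per_rev)
    print("dominant orders:", orders[np.argsort(spec)[-2:]])
    ```

    Running the same FFT on the original time-domain samples would smear the 3rd and 5th order components across a band of frequencies, which is exactly the effect order analysis removes.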

  2. Sentiment Analysis Using Hybrid Approach: A Survey

    Directory of Open Access Journals (Sweden)

    Chauhan Ashish P

    2015-01-01

    Full Text Available Sentiment analysis is the process of identifying people's attitudes and emotional states from language. The main objective is realized by identifying a set of potential features in the review and extracting opinion expressions about those features by exploiting their associations. Opinion mining, also known as sentiment analysis, plays an important role in this process. It is the study of emotions, i.e. sentiments and expressions, that are stated in natural language. Natural language techniques are applied to extract emotions from unstructured data. There are several techniques which can be used to analyse this type of data. Here, we categorize these techniques broadly as "supervised learning", "unsupervised learning" and "hybrid techniques". The objective of this paper is to provide an overview of sentiment analysis, its challenges and a comparative analysis of its techniques in the field of Natural Language Processing.

  3. Numerical analysis on collection efficiency of the cold trap filter unit of liquid lithium lead loop

    International Nuclear Information System (INIS)

    Liquid LiPb (lithium-lead) purification technology is one of the key technologies for liquid LiPb breeder blankets for fusion reactors. A cold trap is commonly used in lithium-lead-loop purification devices. Since the collection efficiency of the cold trap filter unit is difficult to measure on-line, the discrete phase model (DPM) was used to simulate the impurity trapping efficiency, on the premise of considering the impurity crystallization rate. When the velocity increased beyond a certain value, the growth of the collection efficiency slowed. The analysis results are useful for the optimization design of the cold trap filter unit and for determining the liquid metal flow velocity through the cold trap filter unit. (authors)

  4. Uncertainties leading to the use of fuzzy risk analysis of hydrogen safety

    International Nuclear Information System (INIS)

    An important issue involved with the expanded use of hydrogen as a fuel concerns the related safety risks that would be incurred by society. Hydrogen is generally considered a high risk or relatively dangerous fuel. However, assigning this designation is subjective and arguments can be leveled both for and against this position. These arguments stem from uncertainties in the meaning of words used to describe both the sources and levels of risk. Even more rudimentary is the definition of the concept of risk. Other sources of uncertainty in a safety analysis are found to derive from choosing relevant fuel characteristics pertaining to safety, as well as from the measurement methods, accuracy and relative importance of these characteristics. A review of the sources of these uncertainties leads to the proposal that an analysis based on fuzzy logic would be appropriate. Fuzzy analysis is a relatively new area of study, with Zadeh's 1965 seminal paper regarded as the establishment of this field. Since that time a proliferation of studies has been conducted in a wide field of applications. A variety of methods of fuzzy risk analysis have been applied to areas ranging from economic and investment choices to project analysis and safety, each method attempting to take advantage of the uncertainty and imprecision in the analysis. (author)

  5. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    OpenAIRE

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GI...

  6. Preparing a Safety Analysis Report using the building block approach

    International Nuclear Information System (INIS)

    The credibility of the applicant in a licensing proceeding is severely impacted by the quality of the license application, particularly the Safety Analysis Report. To ensure the highest possible credibility, the building block approach was devised to support the development of a quality Safety Analysis Report. The approach incorporates a comprehensive planning scheme that logically ties together all levels of the investigation and provides the direction necessary to prepare a superior Safety Analysis Report

  7. A taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)

  8. Computability and Analysis, a Historical Approach

    OpenAIRE

    Brattka, Vasco

    2016-01-01

    The history of computability theory and the history of analysis are surprisingly intertwined since the beginning of the twentieth century. For one, Émile Borel discussed his ideas on computable real number functions in his introduction to measure theory. On the other hand, Alan Turing had computable real numbers in mind when he introduced his now famous machine model. Here we want to focus on a particular aspect of computability and analysis, namely on computability properties of theorem...

  9. Structural health monitoring of multi-spot welded joints using a lead zirconate titanate based active sensing approach

    Science.gov (United States)

    Yao, Ping; Kong, Qingzhao; Xu, Kai; Jiang, Tianyong; Huo, Lin-sheng; Song, Gangbing

    2016-01-01

    Failures of spot welded joints directly reduce the load capacity of adjacent structures. Due to their complexity and invisibility, real-time health monitoring of spot welded joints is still a challenge. In this paper, a lead zirconate titanate (PZT) based active sensing approach was proposed to monitor the structural health of multi-spot welded joints in real time. In the active sensing approach, one PZT transducer was used as an actuator to generate a guided stress wave, while another one, as a sensor, detected the wave response. Failure of a spot welded joint reduces the stress wave paths and attenuates the wave propagation energy from the actuator to the sensor. A total of four specimens made of dual phase steel with spot welds, including two specimens with 20 mm intervals of spot welded joints and two with 25 mm intervals, were designed and fabricated for this research. Under tensile tests, the spot welded joints successively failed, resulting in the PZT sensor reporting decreased received energy. The energy attenuations due to the failures of joints were clearly observed by the PZT sensor signal in both the time domain and frequency domain. In addition, a wavelet packet-based spot-weld failure indicator was developed to quantitatively evaluate the failure condition corresponding to the number of failed joints.
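
    A rough sketch of a wavelet packet-based energy indicator of the kind described above, using PyWavelets on the received stress-wave signal; the wavelet, decomposition level and simple energy-ratio definition below are illustrative choices, not the authors' published indicator:

    ```python
    import numpy as np
    import pywt

    def wavelet_packet_energy(signal, wavelet="db4", level=4):
        """Total energy of the terminal wavelet-packet nodes of a 1-D signal."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        return sum(float(np.sum(np.square(node.data)))
                   for node in wp.get_level(level, order="natural"))

    def failure_indicator(baseline_signal, current_signal):
        """1 - E_current/E_baseline: grows toward 1 as spot welds fail and less
        wave energy reaches the PZT sensor."""
        e0 = wavelet_packet_energy(baseline_signal)
        e1 = wavelet_packet_energy(current_signal)
        return 1.0 - e1 / e0

    # Synthetic example: the 'damaged' response is simply an attenuated baseline burst.
    t = np.linspace(0.0, 1e-3, 2048)
    healthy = np.sin(2 * np.pi * 50e3 * t) * np.exp(-3000 * t)
    damaged = 0.6 * healthy
    print(f"failure indicator: {failure_indicator(healthy, damaged):.2f}")
    ```

    In the experiments described above, each successive joint failure removes wave paths between actuator and sensor, so such an indicator steps upward toward 1 as the number of failed welds increases.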

  10. Approaches Towards the Minimisation of Toxicity in Chemical Solution Deposition Processes of Lead-Based Ferroelectric Thin Films

    Science.gov (United States)

    Bretos, Iñigo; Calzada, M. Lourdes

    The ever-growing environmental awareness in our lives has also been extended to the electroceramics field during the past decades. Despite the strong regulations that have come up (RoHS directive), a number of scientists work on ferroelectric thin film ceramics containing lead. Although the use of these materials in piezoelectric devices is exempt from the RoHS directive, successful ways of decreasing toxic load must be considered a crucial challenge. Within this framework, a few significant advances are presented here, based on different Chemical Solution Deposition strategies. Firstly, the UV sol-gel photoannealing technique (Photochemical Solution Deposition) avoids the volatilisation of hazardous lead from lead-based ferroelectric films, usually observed at conventional annealing temperatures. The key point of this approach lies in the photo-excitation of a few organic components in the gel film. There is also a subsequent annealing of the photo-activated film at temperatures low enough to prevent lead volatilisation, but allowing crystallisation of the pure perovskite phase. Ozonolysis of the films is also promoted when UV-irradiation is carried out in an oxygen atmosphere. This is known to improve electrical response. By this method, nominally stoichiometric solution (i.e., a solution without PbO-excess) derived films with reliable properties, and free of compositional gradients, may be prepared at temperatures as low as 450°C. A PtxPb interlayer between the ferroelectric film and the Pt silicon substrate is observed in the heterostructure of the low-temperature processed films. This is when lead excesses are present in their microstructure. The influence of this interface on the compositional depth profile of the films will be discussed. We will evaluate the feasibility of the UV sol-gel photoannealing technique in fabricating functional films while fulfilling environmental and technological aspects (like integration with silicon IC technology). The second

  11. Next-to leading order analysis of target mass corrections to structure functions and asymmetries

    Energy Technology Data Exchange (ETDEWEB)

    L. T. Brady, A. Accardi, T. J. Hobbs, W. Melnitchouk

    2011-10-01

    We perform a comprehensive analysis of target mass corrections (TMCs) to spin-averaged structure functions and asymmetries at next-to-leading order. Several different prescriptions for TMCs are considered, including the operator product expansion, and various approximations to it, collinear factorization, and xi-scaling. We assess the impact of each of these on a number of observables, such as the neutron to proton F{sub 2} structure function ratio, and parity-violating electron scattering asymmetries for protons and deuterons which are sensitive to gamma-Z interference effects. The corrections from higher order radiative and nuclear effects on the parity-violating deuteron asymmetry are also quantified.
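
    For orientation, the xi-scaling prescription mentioned above is usually formulated in terms of the Nachtmann variable, which replaces Bjorken x when the target mass M is kept finite:

    ```latex
    \xi = \frac{2x}{1 + \sqrt{1 + 4x^2 M^2/Q^2}}
    ```

    so that ξ → x as M²/Q² → 0, and target mass corrections are largest at large x and low Q², the region most relevant to the structure function ratios and parity-violating asymmetries analysed in the paper.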

  12. Lead test assembly irradiation and analysis Watts Bar Nuclear Plant, Tennessee and Hanford Site, Richland, Washington

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    The U.S. Department of Energy (DOE) needs to confirm the viability of using a commercial light water reactor (CLWR) as a potential source for maintaining the nation's supply of tritium. The Proposed Action discussed in this environmental assessment is a limited scale confirmatory test that would provide DOE with information needed to assess that option. This document contains the environmental assessment results for the Lead test assembly irradiation and analysis for the Watts Bar Nuclear Plant, Tennessee, and the Hanford Site in Richland, Washington.

  13. Structural characterization of lead sulfide thin films by means of X-ray line profile analysis

    Indian Academy of Sciences (India)

    N Choudhury; B K Sarma

    2009-02-01

    X-ray diffraction patterns of chemically deposited lead sulphide thin films have been recorded and X-ray line profile analysis studies have been carried out. The lattice parameter, crystallite size, average internal stress and microstrain in the films are calculated and correlated with the molarities of the solutions. Both size and strain are found to contribute towards the broadening of the X-ray diffraction lines. The values of the crystallite size are found to lie within the range 22–33 nm and the values of strain within the range 1.0 × 10-3–2.5 × 10-3.
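
    The statement that both size and strain contribute to the line broadening is commonly quantified with a Williamson-Hall plot; the standard relation is reproduced below for reference (β is the line breadth after instrumental correction, θ the Bragg angle, λ the wavelength, D the crystallite size, ε the microstrain, K ≈ 0.9), although the exact separation procedure used by the authors is not specified in the abstract:

    ```latex
    \beta \cos\theta = \frac{K\lambda}{D} + 4\,\varepsilon \sin\theta
    ```

    A linear fit of β cosθ against sinθ then separates the size contribution (intercept) from the strain contribution (slope).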

  14. Lead test assembly irradiation and analysis Watts Bar Nuclear Plant, Tennessee and Hanford Site, Richland, Washington

    International Nuclear Information System (INIS)

    The U.S. Department of Energy (DOE) needs to confirm the viability of using a commercial light water reactor (CLWR) as a potential source for maintaining the nation's supply of tritium. The Proposed Action discussed in this environmental assessment is a limited scale confirmatory test that would provide DOE with information needed to assess that option. This document contains the environmental assessment results for the Lead test assembly irradiation and analysis for the Watts Bar Nuclear Plant, Tennessee, and the Hanford Site in Richland, Washington

  15. Comparative analysis of employment dynamics in leading and lagging rural regions of the EU, 1980-1997.

    NARCIS (Netherlands)

    Terluin, I.J.; Post, J.H.; Sjöström, Å.

    1999-01-01

    In this study a comparative analysis of factors hampering and encouraging the development of employment in 9 leading and 9 lagging regions in the EU during the 1980s and the first half of the 1990s is made. Derived from this comparative analysis, some lessons, which leading and lagging rural regions

  16. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    Science.gov (United States)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    Operational quantitative precipitation forecasts (QPF) are provided routinely by weather services or hydrological authorities, particularly those responsible for densely populated regions of small catchments, such as those typically found in Mediterranean areas prone to flash-floods. Specific rainfall values are used as thresholds for issuing warning levels considering different time frameworks (mid-range, short-range, 24h, 1h, etc.), for example 100 mm in 24h or 60 mm in 1h. There is a clear need to determine how feasible a specific rainfall value is for a given lead-time, in particular for very short range forecasts or nowcasts typically obtained from weather radar observations (Pierce et al 2012). In this study we assess which specific nowcast lead-times can be provided for a number of heavy precipitation events (HPE) that affected Catalonia (NE Spain). The nowcasting system we employed generates QPFs through the extrapolation of rainfall fields observed with weather radar following a Lagrangian approach developed and tested successfully in previous studies (Berenguer et al. 2005, 2011). Then QPFs up to 3h are compared with two quality controlled observational data sets: weather radar quantitative precipitation estimates (QPE) and raingauge data. Several high-impact weather HPE were selected, including the 7 September 2005 Llobregat Delta river tornado outbreak (Bech et al. 2007) and the 2 November 2008 supercell tornadic thunderstorms (Bech et al. 2011), both producing, among other effects, local flash floods. In these two events there were torrential rainfall rates (30' amounts exceeding 38.2 and 12.3 mm respectively) and 24h accumulation values above 100 mm. A number of verification scores are used to characterize the evolution of precipitation forecast quality with time, which typically presents a decreasing trend but shows a strong dependence on the selected rainfall threshold and integration period. For example considering correlation factors, 30

  17. Orthogonal simple component analysis: A new, exploratory approach

    OpenAIRE

    Anaya-Izquierdo, Karim; Critchley, Frank; Vines, Karen

    2011-01-01

    Combining principles with pragmatism, a new approach and accompanying algorithm are presented to a longstanding problem in applied statistics: the interpretation of principal components. Following Rousson and Gasser [53 (2004) 539–555], the ultimate goal is not to propose a method that leads automatically to a unique solution, but rather to develop tools for assisting the user in his or her choice of an interpretable solution. Accordingly, our approach is essentially exploratory. Call...

  18. Unified statistical approach to cortical thickness analysis.

    Science.gov (United States)

    Chung, Moo K; Robbins, Steve; Evans, Alan C

    2005-01-01

    This paper presents a unified image processing and analysis framework for cortical thickness in characterizing a clinical population. The emphasis is placed on the development of the data smoothing and analysis framework. The human brain cortex is a highly convoluted surface. Due to the convoluted non-Euclidean surface geometry, data smoothing and analysis on the cortex are inherently difficult. When measurements lie on a curved surface, it is natural to assign kernel smoothing weights based on the geodesic distance along the surface rather than the Euclidean distance. We present a new data smoothing framework that addresses this problem implicitly, without actually computing the geodesic distance, and present its statistical properties. Afterwards, the statistical inference is based on random field theory-based multiple comparison correction. As an illustration, we have applied the method in detecting the regions of abnormal cortical thickness in 16 high functioning autistic children. PMID:17354731

  19. Detection of insincere grips: multivariate analysis approach

    OpenAIRE

    Dasari, B.D.; Leung, K.F.

    2004-01-01

    BACKGROUND Smith and Chengalur were successful in using the grip test and the cut-off criterion method to detect fake grips in healthy subjects and subjects with hand injuries. The purpose of this study is to test whether methods other than the cut-off method, i.e. discriminant analysis and logistic analysis methods, can be used to provide a more accurate detection of fake grips, with the use of the sustained grip test. METHOD Two groups of subjects were recruited. Group one consisted of 40 healthy subjects ...

  20. Study of Object Oriented Analysis and Design Approach

    Directory of Open Access Journals (Sweden)

    Sunil K. Pandey

    2011-01-01

    Full Text Available Problem statement: Object and component technologies, rapidly maturing branches of information technology, have become pervasive elements of systems development, especially in the recently popular Internet applications, leading to increased complexity and, at the same time, a broader range of applications. Approach: This needs to be understood in order to maximize its benefits and applications with consistent results. However, mainstream Object Oriented Systems Development (OOSD), consisting of Object Oriented Analysis and Design (OOAD) and Object-Oriented Programming (OOP), has a history of difficulties and is still struggling to gain prevalent acceptance. Results: A number of studies and experiments conducted by experts and researchers in the past provide a solid base for taking up this study and looking into the various intricacies present. These focused efforts laid the basis for one segment of people to form the opinion that "technology adoption is mostly the result of marketing forces, not scientific evidence", whereas another segment believes that object technology is "still long on hype and short on results ...". The gurus of OOSD continue to tout its vast superiority over conventional systems development, even to the extent of developing a unified software development process. Conclusion: The advocates of OOSD claim many advantages, including easier modeling, increased code reuse, higher system quality and easier maintenance. It is well understood that analysis and design are extremely critical aspects of successful systems development, especially in the case of OOSD. As the development of any successful information system must begin with a well-conceived and implemented analysis and design, this study focuses on the most recent empirical evidence on the pros and cons of OOAD.

  1. A behavioural approach to remittances analysis

    OpenAIRE

    Meyer, Wiebke; Mollers, Judith; Buchenrieder, Gertrud

    2012-01-01

    This paper approaches the migrant’s motivation to remit from a new, behavioural perspective. We apply the well-established Theory of Planned Behaviour (TPB) using a structural equation model for the first time for this specific research question. Our micro-dataset stems from a 2009/10 survey, covering Albanian migrants from Kosovo living in Germany as well as their home-country households. More than 90% of Kosovar migrants living in Germany remit. However, little is known about their underlyi...

  2. Multilevel index decomposition analysis: Approaches and application

    International Nuclear Information System (INIS)

    With the growing interest in using the technique of index decomposition analysis (IDA) in energy and energy-related emission studies, such as to analyze the impacts of activity structure change or to track economy-wide energy efficiency trends, the conventional single-level IDA may not be able to meet certain needs in policy analysis. In this paper, some limitations of single-level IDA studies which can be addressed through applying multilevel decomposition analysis are discussed. We then introduce and compare two multilevel decomposition procedures, which are referred to as the multilevel-parallel (M-P) model and the multilevel-hierarchical (M-H) model. The former uses a similar decomposition procedure as in the single-level IDA, while the latter uses a stepwise decomposition procedure. Since the stepwise decomposition procedure is new in the IDA literature, the applicability of the popular IDA methods in the M-H model is discussed and cases where modifications are needed are explained. Numerical examples and application studies using the energy consumption data of the US and China are presented. - Highlights: • We discuss the limitations of single-level decomposition in IDA applied to energy study. • We introduce two multilevel decomposition models, study their features and discuss how they can address the limitations. • To extend from single-level to multilevel analysis, necessary modifications to some popular IDA methods are discussed. • We further discuss the practical significance of the multilevel models and present examples and cases to illustrate
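
    As a minimal illustration of the decomposition idea discussed above, the sketch below performs an additive LMDI-I decomposition of a change in energy consumption into activity and intensity effects for a two-sector example. The numbers and the two-factor structure are illustrative assumptions; the paper's multilevel-parallel and multilevel-hierarchical procedures are not reproduced here.

    ```python
    import math

    def logarithmic_mean(a, b):
        """Logarithmic mean L(a, b) used as the LMDI weight."""
        return a if math.isclose(a, b) else (a - b) / (math.log(a) - math.log(b))

    def lmdi_additive(q0, q1, i0, i1):
        """Additive LMDI-I decomposition of the change in energy use E = sum_k Q_k * I_k.

        q0, q1 : sector activity levels in period 0 and period 1
        i0, i1 : sector energy intensities in period 0 and period 1
        Returns (activity effect, intensity effect); the two effects sum to E1 - E0.
        """
        act = inten = 0.0
        for Q0, Q1, I0, I1 in zip(q0, q1, i0, i1):
            w = logarithmic_mean(Q1 * I1, Q0 * I0)
            act += w * math.log(Q1 / Q0)
            inten += w * math.log(I1 / I0)
        return act, inten

    # Illustrative two-sector example (values are hypothetical).
    activity_0, activity_1 = [100.0, 50.0], [120.0, 60.0]
    intensity_0, intensity_1 = [2.0, 4.0], [1.8, 3.9]
    d_act, d_int = lmdi_additive(activity_0, activity_1, intensity_0, intensity_1)
    print(f"activity effect = {d_act:.2f}, intensity effect = {d_int:.2f}")
    ```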

  3. A Performance Approach to Job Analysis.

    Science.gov (United States)

    Folsom, Al

    2001-01-01

    Discussion of performance technology and training evaluation focuses on a job analysis process in the Coast Guard. Topics include problems with low survey response rates; costs; the need for appropriate software; discussions with stakeholders and subject matter experts; and maximizing worthy performance. (LRW)

  4. Nonsmooth analysis approach to Isaac's equation

    Directory of Open Access Journals (Sweden)

    Leszek S. Zaremba

    1993-01-01

    Full Text Available We study Isaacs' equation (∗) w_t(t,x) + H(t,x,w_x(t,x)) = 0, where H is a highly nonlinear function; its “natural” solution is the value W(t,x) of a suitable differential game. It has been felt that even though W_x(t,x) may be a discontinuous function or may not exist everywhere, W(t,x) is a solution of (∗) in some generalized sense. Several attempts have been made to overcome this difficulty, including viscosity solution approaches, where the continuity of a prospective solution, or even slightly less than that, is required rather than the existence of the gradient W_x(t,x). Using ideas from a very recent paper of Subbotin, we offer here an approach which, requiring literally no regularity assumptions from prospective solutions of (∗), provides existence results. To prove the uniqueness of solutions to (∗), we make some lower- and upper-semicontinuity assumptions on a terminal set Γ. We conclude by establishing a close relationship between the results presented on Isaacs' equation and differential games theory.

  5. Analysis and testing of the DIII-D ohmic heating coil lead repair clamp

    International Nuclear Information System (INIS)

    DIII-D has been operating for the last year with limited volt-second capabilities due to structural failure of a conductor lead to one of the ohmic heating (OH) solenoids. The conductor failure was due to poor epoxy impregnation of the overwrap of the lead pack, resulting in copper fatigue and a water leak. A number of structural analyses were performed to assist in determining the failure scenario and to evaluate various repair options. A fatigue stress analysis of the leads with a failed epoxy overwrap indicated crack initiation after 1,000 cycles at the maximum operating conditions. The failure occurred in a very inaccessible area which restricted design repair options to concepts which could be implemented remotely. Several design options were considered for repairing the lead so that it can sustain the loads for 7.5 Vs conditions at full toroidal field. A clamp, along with preloaded banding straps and shim bags, provides a system that guarantees that the stress at the crack location is always compressive and prevents further crack growth in the conductor. Due to the limited space available for the repair, it was necessary to design the clamp system to operate at the material yield stress. The primary components of the clamp system were verified by load tests prior to installation. The main body of the clamp contains a load cell and potentiometer for monitoring the load-deflection characteristics of the clamp and conductors during plasma operation. Strain gages provide redundant instrumentation. If required, the preload on the conductors can be increased remotely by a special wrench attached to the clamp assembly.

  6. Analysis of Lead and Cadmium Contents in Local Vegetables in Surat Thani, Thailand

    Directory of Open Access Journals (Sweden)

    Nipaporn MEEPUN

    2014-06-01

    Full Text Available Two toxic heavy metals, cadmium (Cd(II)) and lead (Pb(II)), in samples of local vegetables were analyzed by graphite furnace atomic absorption spectroscopy (GFAAS). Pak-Leang (Gnetum gnemon Linn.), Pak-Waen (Marsilea crenata Presl.), Mun-Poo (Glochidion littorale Blume Baill.) and Chamuang (Garcinia cowa Roxb.) were obtained from fresh markets in 4 districts, namely Muang, Phunphin, Kanchanadit and Ban Na Doem, Surat Thani province. The samples were prepared by a mixed acid digestion procedure in order to extract the heavy metals. From the GFAAS analysis of the sample solutions, the average lead contents were as follows: 0.10 ± 0.11 mg kg-1 in Pak-Leang, 0.04 ± 0.07 mg kg-1 in Pak-Waen, 0.14 ± 0.17 mg kg-1 in Mun-Poo and 0.02 ± 0.05 mg kg-1 in Chamuang. The results indicated that the lead concentrations in these local vegetables were under the maximum allowable level according to the standard of the Ministry of Public Health, Thailand. On the other hand, the analysis of cadmium found that 3 vegetables, Pak-Waen (0.48 ± 0.27 mg kg-1), Mun-Poo (0.78 ± 0.72 mg kg-1) and Chamuang (0.34 ± 0.27 mg kg-1), were contaminated with cadmium at average levels higher than the maximum allowable levels of the standards of Australia-New Zealand, Codex, China and the European Union. The assessment of heavy metals indicated that these accumulated quantities in edible plants could provide valuable evidence for public concern and research-based food safety.

  7. An Analysis of the Factors Leading to Rising Credit Risk in the Zimbabwe Banking Sector

    Directory of Open Access Journals (Sweden)

    Maxwell Sandada

    2016-02-01

    Full Text Available The study sought to analyse the factors that lead to rising credit risk in the Zimbabwean banking sector. The objective was to ascertain the impact of macroeconomic, industry and bank-specific factors on rising credit risk in Zimbabwe. The study aimed at contributing to the credit risk management literature by providing evidence from a Sub-Saharan context. Anchored in the positivist quantitative research approach, a survey was carried out to gather the data, which were analysed using descriptive, correlation and regression analyses. The results revealed that the most significant factors leading to credit risk in the Zimbabwean banking sector were macroeconomic and bank-specific factors. The industry factors did not show a significant influence on rising credit risk. The findings of this study will be a valuable addition to the existing knowledge and provide a platform for further research on how credit risk problems can be dealt with. While credit risk is known as one of the risks inherent to any banking institution, the alarming levels of credit risk in the Zimbabwean banking sector motivated this study to critically analyse the factors that have led to the high credit risk levels.

  8. An artificial intelligence approach towards disturbance analysis

    International Nuclear Information System (INIS)

    The scale and degree of sophistication of technological plants, e.g. nuclear power plants, have increased substantially during the last decades. Conventional disturbance analysis systems have proved to work successfully in well-known situations. But in cases of emergencies, the operator needs more advanced assistance in realizing diagnosis and therapy control. The significance of introducing artificial intelligence (AI) methods in nuclear power technology is emphasized. The main features of the on-line disturbance analysis system SAAP-2, which is being developed for application to nuclear power plants, are reported. Problems related to man-machine communication are discussed in more detail, because their solution will influence end-user acceptance considerably. (author)

  9. Sediment Analysis Using a Structured Programming Approach

    OpenAIRE

    Daniela Arias-Madrid; Oscar A. López-Paz; Jovani A. Jiménez-Builes

    2012-01-01

    This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material which seeks to identify very quickly the main features that occur in a sediment and thus classify them fast and efficiently. For this purpose, the weight of each particle-size class is entered in the program and, using the method of Moments, which is based on four equations representing the mean, standard deviation, skewness and kurtosis, the attributes of the samp...

  10. A different approach to hair analysis

    International Nuclear Information System (INIS)

    Hair samples from Canada, China, India and New Zealand were analyzed by neutron activation analysis. Comparison of the percent manganese in the alkali-soluble fraction of hair with the total manganese concentrations shows that, within groups, the percent manganese concentration is relatively constant whereas the overall concentrations are not. For multiple sclerosis patients from Canada and New Zealand, highly significant differences were observed between controls and patients for % Mn in the alkali-soluble fraction. (author) 6 refs.; 3 figs.; 2 tabs

  11. Testability Analysis Approach For Reactive Systems

    Directory of Open Access Journals (Sweden)

    Nguyen Thanh Binh

    2011-11-01

    Full Text Available Reactive systems are often designed as two parts: computation and control. The computation part is modeled by operator diagrams, while the control part is modeled by transition-based models. In this paper, we concentrate on analyzing the testability of the control part using transition-based models. We first transform transition-based models into Markov chains by augmenting them with probability information. Then, testability measures are derived from the Markov chains as estimates of the testing effort required to reach state coverage and path coverage. The approach is applied to a case study and the obtained measures are compared to the testing effort required by a test generation tool. The results show some interesting perspectives.
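
    The sketch below gives a much-simplified stand-in for such a testability measure: a Monte Carlo estimate of how many fixed-length random test runs are needed, on average, to cover every state of a Markov chain. The transition matrix, run length and the measure itself are illustrative assumptions, not the measures proposed in the paper.

    ```python
    import numpy as np

    def expected_tests_for_state_coverage(P, start=0, run_length=10, n_sims=2000, seed=1):
        """Monte Carlo estimate of the average number of test runs (random walks of
        fixed length through the chain) needed before every state has been visited."""
        rng = np.random.default_rng(seed)
        n = P.shape[0]
        totals = []
        for _ in range(n_sims):
            covered, runs = set(), 0
            while len(covered) < n:
                runs += 1
                state = start
                covered.add(state)
                for _ in range(run_length):
                    state = rng.choice(n, p=P[state])
                    covered.add(state)
            totals.append(runs)
        return float(np.mean(totals))

    # Hypothetical 3-state control model with transition probabilities.
    P = np.array([[0.1, 0.8, 0.1],
                  [0.3, 0.2, 0.5],
                  [0.6, 0.2, 0.2]])
    print("expected test runs for full state coverage:", expected_tests_for_state_coverage(P))
    ```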

  12. Energy policy and externalities: the life cycle analysis approach

    International Nuclear Information System (INIS)

    In the energy sector, getting the prices right is a prerequisite for market mechanisms to work effectively towards sustainable development. However, energy production and use creates 'costs' external to traditional accounting practices, such as damages to human health and the environment resulting from residual emissions or risks associated with dependence on foreign suppliers. Energy market prices do not fully reflect those external costs. For example, the costs of climate change are not internalized and, therefore, consumers do not get the right price signals leading them to make choices that are optimised from a societal viewpoint. Economic theory has developed approaches to assessing and internalizing external costs that can be applied to the energy sector and, in principle, provide means to quantify and integrate relevant information in a comprehensive framework. The tools developed for addressing these issues are generally aimed at monetary valuation of impacts and damages and integration of the valued 'external costs' in total cost of the product, e.g. electricity. The approach of Life Cycle Analysis (LCA) provides a conceptual framework for a detailed and comprehensive comparative evaluation of energy supply options. This paper offers a summary of the LCA methodology and an overview of some of its limitations. It then illustrates, through a few examples, how the methodology can be used to inform or correct policy making and to orient investment decisions. Difficulties and issues emerging at various stages in the application and use of LCA results are discussed, although in such a short note, it is impossible to address all issues related to LCA. Therefore, as part of the concluding section, some issues are left open - and areas in which further analytical work may be needed are described. (author)

  13. Modeling the effect of levothyroxine therapy on bone mass density in postmenopausal women: a different approach leads to new inference

    Directory of Open Access Journals (Sweden)

    Tavangar Seyed

    2007-06-01

    Full Text Available Abstract Background The diagnosis, treatment and prevention of osteoporosis are a national health emergency. Osteoporosis quietly progresses without symptoms until late stage complications occur. Older patients are more commonly at risk of fractures due to osteoporosis. The fracture risk increases when suppressive doses of levothyroxine are administered, especially in postmenopausal women. The question is: "When should bone mass density be tested in postmenopausal women after the initiation of suppressive levothyroxine therapy?". Standard guidelines for the prevention of osteoporosis suggest that follow-up be done in 1 to 2 years. We were interested in predicting the level of bone mass density in postmenopausal women after the initiation of suppressive levothyroxine therapy with a novel approach. Methods The study used data from the literature on the influence of exogenous thyroid hormones on bone mass density. Four cubic polynomial equations were obtained by curve fitting for Ward's triangle, trochanter, spine and femoral neck. The behaviors of the models were investigated by statistical and mathematical analyses. Results There are four points of inflexion on the graphs of the first derivatives of the equations with respect to time at about 6, 5, 7 and 5 months. In other words, there is a maximum speed of bone loss around the 6th month after the start of suppressive L-thyroxine therapy in post-menopausal women. Conclusion It seems reasonable to check bone mass density at the 6th month of therapy. More research is needed to explain the cause and to confirm the clinical application of this phenomenon for osteoporosis, but such an approach can be used as a guide to future experimentation. The investigation of change over time may lead to more sophisticated decision making in a wide variety of clinical problems.
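
    A minimal sketch of the curve-fitting step described above: fit a cubic polynomial to bone mass density versus months of therapy and locate the time at which the rate of bone loss is greatest (where the second derivative of the fitted cubic vanishes). The data points are hypothetical, not the literature values used by the authors.

    ```python
    import numpy as np

    # Hypothetical bone mass density (g/cm^2) at months 0..24 after starting therapy.
    months = np.array([0, 3, 6, 9, 12, 18, 24], dtype=float)
    bmd = np.array([0.820, 0.812, 0.798, 0.788, 0.782, 0.776, 0.773])

    # Fit a cubic polynomial BMD(t) = a*t^3 + b*t^2 + c*t + d, as in the modelling approach.
    coeffs = np.polyfit(months, bmd, deg=3)
    p = np.poly1d(coeffs)

    # The rate of bone loss is p'(t); its extremum (maximum speed of loss) is where p''(t) = 0.
    t_max_loss = p.deriv(2).roots[0].real
    print("cubic coefficients (a, b, c, d):", np.round(coeffs, 6))
    print(f"maximum rate of bone loss at roughly {t_max_loss:.1f} months after starting therapy")
    ```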

  14. Simulation Approach to Mission Risk and Reliability Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  15. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  16. A Comparison of Microeconomic and Macroeconomic Approaches to Deforestation Analysis

    OpenAIRE

    Jeff Felardo

    2016-01-01

    The economics of deforestation has been explored in detail. Generally, the frame of analysis takes either a microeconomics or macroeconomics approach. The microeconomics approach assumes that individual decision makers are responsible for deforestation as a result of utility maximizing behavior and imperfect property right regimes. The macroeconomics approach explores nationwide trends thought to be associated with forest conversion. This paper investigates the relationship between these two ...

  17. Multivariate analysis of 2-DE protein patterns - Practical approaches

    DEFF Research Database (Denmark)

    Jacobsen, Charlotte; Jacobsen, Susanne; Grove, H.;

    2007-01-01

    Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two... ...although different subsets of protein spots were selected. The explorative approach of using multivariate data analysis and variable selection in the analyses of 2-DEs seems to be promising as a fast, reliable and convenient way of screening and transforming many gel images into spot quantities.

  18. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework on analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo

  19. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael;

    2016-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultu...

  20. Developing a New Approach for Arabic Morphological Analysis and Generation

    OpenAIRE

    Gridach, Mourad; Chenfour, Noureddine

    2011-01-01

    Arabic morphological analysis is one of the essential stages in Arabic Natural Language Processing. In this paper we present an approach for Arabic morphological analysis. This approach is based on Arabic morphological automaton (AMAUT). The proposed technique uses a morphological database realized using XMODEL language. Arabic morphology represents a special type of morphological systems because it is based on the concept of scheme to represent Arabic words. We use this concept to develop th...

  1. A Multivariate Statistical Approach to the Analysis of Rural Development

    OpenAIRE

    Mazzocchi, Mario; Montresor, Elisa

    2000-01-01

    The aim of this work is to contribute to the definition of an analytical approach for evaluating the dynamics in progress in the agricultural and rural development at a territorial level. For this purpose principal components analysis and cluster analysis were applied and the different methodological approaches reviewed. A two-stage method is also proposed. This could provide the analytical tools to simplify and interpret the results of the territorial analyses, also in order to supply a flex...

  2. System analysis using multitracer approaches. Chapter 9

    International Nuclear Information System (INIS)

    The present chapter offers a qualitative or semi-quantitative step towards a synthesis of the information that the different tracers provide. It also offers some criteria that can be used to help assess the reliability of selected tracer data in evaluating tracer model ages from measurements of concentrations of multiple environmental tracers in the system. In most of the literature on isotope hydrology, the term ‘apparent age’ is used instead of ‘tracer model age’, and within this chapter the two terms are considered synonymous. This approach to testing the reliability of tracer data is only a first step and performs a black-and-white selection of the tracer data. Some consistency tests are performed, and for data passing these tests there is at least no obvious reason to believe that the tracer model age is not a valid description. For data that fail these tests, it is obvious that a straightforward calculation of tracer model ages will not give the intended result.

  3. Effects of Uncertainties in Lead Cross Section Data in Monte Carlo Analysis of the RBE C-M Lead-Bismuth Cooled Benchmark

    International Nuclear Information System (INIS)

    This paper describes the problems encountered in the analysis of the RBEC-M, a lead-bismuth cooled fast reactor with a high level of primary coolant natural circulation and a gas lift system in the primary circuit. The RBEC-M lead-bismuth cooled fast reactor benchmark was suggested under an IAEA CRP on 'Development of Small Reactors without On-site Refuelling'. The computational tools MOCUP, a coupling of the MCNP-4C and ORIGEN2.1 utility codes with MCNP data libraries based on ENDF/B-VI evaluations, and TRITON, a coupling of the KENO-V.a and ORIGEN2.1 codes with an ENDF/B-V.2 based 238-group library, were used to simulate this benchmark. There are numerous uncertainties in the prediction of core parameters of these reactor designs, arising from approximations used in the solution of the transport equation, in nuclear data processing and in cross-section library generation. In this paper we analysed the effects of uncertainties in lead cross-section data from several versions of the ENDF, JENDL and JEFF evaluations. The uncertainties in the cross sections of lead were found to be particularly large and deserve careful evaluation. (author)

  4. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-01-01

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis such as its costly operation and lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data, PXRF analysis data, both ICP–AES and transformed PXRF analysis data by considering the correlation between the ICP–AES and PXRF analysis data, and co-kriging to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP–AES and PXRF analysis data. PMID:27043594
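
    A bare-bones sketch of the ordinary kriging step referred to above, written directly in NumPy with a spherical semivariogram; the sample coordinates, concentrations and variogram parameters are hypothetical, and the study's PXRF-to-ICP–AES transformation and co-kriging variants are not reproduced here.

    ```python
    import numpy as np

    def spherical_gamma(h, nugget, sill, rng_a):
        """Spherical semivariogram gamma(h); gamma(0) = 0 by definition."""
        h = np.asarray(h, dtype=float)
        g = np.where(h < rng_a,
                     nugget + (sill - nugget) * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3),
                     sill)
        return np.where(h > 0.0, g, 0.0)

    def ordinary_kriging(xy, z, xy0, **vario):
        """Ordinary kriging prediction at location xy0 from samples at xy with values z."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # sample-sample distances
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = spherical_gamma(d, **vario)                      # semivariances between samples
        A[n, n] = 0.0                                                # Lagrange-multiplier block
        b = np.ones(n + 1)
        b[:n] = spherical_gamma(np.linalg.norm(xy - xy0, axis=1), **vario)
        w = np.linalg.solve(A, b)                                    # kriging weights + multiplier
        return float(w[:n] @ z)

    # Hypothetical soil-sample coordinates (m) and copper concentrations (mg/kg).
    xy = np.array([[0, 0], [100, 20], [40, 90], [150, 140], [80, 160]], dtype=float)
    cu = np.array([55.0, 210.0, 120.0, 340.0, 180.0])
    estimate = ordinary_kriging(xy, cu, np.array([90.0, 80.0]), nugget=10.0, sill=9000.0, rng_a=200.0)
    print(f"kriged Cu estimate at (90, 80): {estimate:.1f} mg/kg")
    ```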

  5. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP-AES and Portable XRF Instruments: A Comparative Study.

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-04-01

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP-AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP-AES analysis such as its costly operation and lengthy period required for analysis. To overcome this limitation, this study used both ICP-AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP-AES analysis data, PXRF analysis data, both ICP-AES and transformed PXRF analysis data by considering the correlation between the ICP-AES and PXRF analysis data, and co-kriging to both the ICP-AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP-AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP-AES and PXRF analysis data. PMID:27043594

  6. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Hyeongyu Lee

    2016-03-01

    Full Text Available Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis such as its costly operation and lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data, PXRF analysis data, both ICP–AES and transformed PXRF analysis data by considering the correlation between the ICP–AES and PXRF analysis data, and co-kriging to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 sampling points used for validation. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach that incorporates the advantageous aspects of both ICP–AES and PXRF analysis data.

  7. Fatigue in engineering structures. A three fold analysis approach

    International Nuclear Information System (INIS)

    The integrity of most engineering structures is influenced by the presence of cracks or crack-like defects. These structures fail, even catastrophically, if a crack greater than a critically safe size exists. Although most optimally designed structures are initially free from critical cracks, sub-critical cracks can lead to failures under cyclic loading through fatigue crack growth. It is nearly impractical to prevent sub-critical crack growth in engineering structures, particularly in crack-sensitive structures such as those in the nuclear, aerospace and aeronautical domains. However, it is essential to predict the fatigue crack growth for these structures to preclude in-service failures causing loss of assets. The present research presents an automatic procedure for the prediction of fatigue crack growth in three-dimensional engineering structures and of the key data for fracture mechanics based design: the stress intensity factors. A threefold analysis procedure is adopted to investigate the effects of repetitive (cyclic) loading on the fatigue life of different geometries of aluminum alloy 2219-O. The general purpose Finite Element (FE) code ANSYS-8.0 is used to predict/estimate the fatigue life of the geometries. Computer codes utilizing the Green's function are developed to calculate the stress intensity factors. Another code, based on the superposition technique presented by Shivakumara and Foreman, is developed to calculate the fatigue crack growth rate and fatigue life (number of loading cycles) and to validate the results, and finally full-scale laboratory tests are conducted for comparison of the results. The close correlation between the results of the different techniques is a promising feature of the analysis approach for future work. (author)
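
    As a simple illustration of fracture-mechanics-based life prediction, the sketch below integrates a Paris-type crack-growth law for a centre crack in a wide plate, where the stress intensity range is dK = delta_sigma * sqrt(pi * a). The material constants and geometry are illustrative, not the 2219-O data or the Forman-type model used in the study.

    ```python
    import math

    def fatigue_life_paris(a0, a_crit, delta_sigma, C, m, da=1e-5):
        """Number of load cycles to grow a crack from a0 to a_crit under a Paris law.

        da/dN = C * (dK)^m with dK = delta_sigma * sqrt(pi * a)  (centre crack, wide plate).
        Integrated numerically with a fixed crack-length step `da` (metres).
        """
        cycles, a = 0.0, a0
        while a < a_crit:
            delta_K = delta_sigma * math.sqrt(math.pi * a)   # stress intensity range, MPa*sqrt(m)
            cycles += da / (C * delta_K ** m)                # dN = da / (C * dK^m)
            a += da
        return cycles

    # Illustrative values (not the alloy 2219-O data from the study).
    N = fatigue_life_paris(a0=1e-3, a_crit=20e-3, delta_sigma=100.0, C=1e-11, m=3.0)
    print(f"predicted fatigue life: about {N:,.0f} cycles")
    ```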

  8. Sediment Analysis Using a Structured Programming Approach

    Directory of Open Access Journals (Sweden)

    Daniela Arias-Madrid

    2012-12-01

    Full Text Available This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material which seeks to identify very quickly the main features that occur in a sediment and thus classify them fast and efficiently. For this purpose, the weight of each particle-size class is entered in the program and, using the method of Moments, which is based on four equations representing the mean, standard deviation, skewness and kurtosis, the attributes of the sample are found in a few seconds. With the program these calculations are performed effectively and more accurately, and explanations of the results for features such as grain size, sorting, symmetry and origin are also obtained, which helps to improve the study of sediments and, in general, the study of sedimentary rocks.
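
    The method of Moments referred to above can be sketched directly from the weight percentages per size class; the sieve-class midpoints and percentages below are hypothetical.

    ```python
    import numpy as np

    def moment_statistics(phi_midpoints, weight_percent):
        """Method-of-Moments grain-size statistics from a sieve analysis.

        phi_midpoints  : midpoint of each size class in phi units
        weight_percent : weight percentage retained in each class (sums to ~100)
        Returns mean, standard deviation (sorting), skewness and kurtosis.
        """
        phi = np.asarray(phi_midpoints, dtype=float)
        f = np.asarray(weight_percent, dtype=float)
        mean = np.sum(f * phi) / 100.0
        sd = np.sqrt(np.sum(f * (phi - mean) ** 2) / 100.0)
        skew = np.sum(f * (phi - mean) ** 3) / (100.0 * sd ** 3)
        kurt = np.sum(f * (phi - mean) ** 4) / (100.0 * sd ** 4)
        return mean, sd, skew, kurt

    # Hypothetical sieve data: class midpoints (phi) and weight percentages.
    midpoints = [-1.0, 0.0, 1.0, 2.0, 3.0, 4.0]
    weights = [5.0, 15.0, 35.0, 25.0, 15.0, 5.0]
    mean, sorting, skewness, kurtosis = moment_statistics(midpoints, weights)
    print(f"mean={mean:.2f} phi, sorting={sorting:.2f}, skewness={skewness:.2f}, kurtosis={kurtosis:.2f}")
    ```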

  9. Towards a More Holistic Stakeholder Analysis Approach

    DEFF Research Database (Denmark)

    Sedereviciute, Kristina; Valentini, Chiara

    2011-01-01

    This paper proposes a conceptual direction for how organizations could map their stakeholders in a more holistic way. The study suggests that stakeholder theory is useful in identifying and prioritizing the stakeholders an organization is already aware of. However, the theory is argued to be ineffective in finding stakeholders in new environments (social media), where connectivity and relationships play a key role. The argument stems from the need to assess stakeholder presence beyond dyadic ties. Consequently, the Stakeholder Salience Model (SSM) is combined with social network analysis (SNA), and important actors are identified based on the dimensions of connectivity and the content shared. Accordingly, the study introduces four groups of important actors from social media: unconcerned lurkers, unconcerned influencers, concerned lurkers and concerned influencers, and integrates them into the existing Stakeholder Salience Model.

  10. Analysis of Spent Fuel Assay With a Lead Slowing Down Spectrometer

    International Nuclear Information System (INIS)

    Assay of the fissile materials in spent fuel that are produced or depleted during the operation of a reactor is of paramount importance to nuclear materials accounting, verification of the reactor operation history, as well as for criticality considerations for storage. In order to prevent future proliferation following the spread of nuclear energy, we must develop accurate methods to assay large quantities of nuclear fuels. We describe the analysis of an experimental technology to assay spent PWR fuel bundles using a Lead Slowing Down Spectrometer (LSDS). An LSDS is a large block of lead, typically 1 m3, in which neutrons from an injected fast-neutron pulse collide, scatter and are slowed down. As time evolves, the average neutron energy decreases. One can use the response of a threshold fission detector (e.g., 238U) to fast neutrons produced by fission, as a function of time, to assay the major fissile isotopes in the bundle. We have performed a detailed analysis of the performance of the LSDS for spent fuel assay, using a combination of Monte Carlo and analytic techniques. Monte Carlo alone cannot provide sufficient statistical precision for the required analysis in a reasonable computation time. Therefore, we have augmented these calculations with an analytical model. Using this model, we analyze the precision that can be obtained with a measurement duration of a few hours, using a high-power neutron generator that is commercially available. For an assay duration of 2-4 hours, one can determine the concentrations of 239Pu, 241Pu and 235U in the bundle to about 1%. This assumes medium to high burnup of the fuel. To do so would require a high-power neutron generator, currently commercially available, combined with threshold fission detectors using 238U or, possibly, 232Th. Such measurements would be important for nuclear materials accounting (especially verification of reactor operation declarations), as well as spent fuel storage and disposal (e.g., criticality

  11. AN EXPLORATORY ANALYSIS ON HALF-HOURLY ELECTRICITY LOAD PATTERNS LEADING TO HIGHER PERFORMANCES IN NEU

    Directory of Open Access Journals (Sweden)

    K.A.D. Deshani

    2014-05-01

    Full Text Available Accurate prediction of electricity demand can bring extensive benefits to any country, as the forecasted values help the relevant authorities to take decisions regarding electricity generation, transmission and distribution appropriately. The literature reveals that, when compared to conventional time series techniques, improved artificial intelligence approaches provide better prediction accuracies. However, the accuracy of predictions using intelligent approaches like neural networks is strongly influenced by the correct selection of inputs and the number of neuro-forecasters used for prediction. Deshani, Hansen, Attygalle, & Karunarathne (2014) suggested that a cluster analysis could be performed to group similar day types, which contributes towards selecting a better set of neuro-forecasters in neural networks. Their cluster analysis was based on the daily total electricity demands, as their target was to predict the daily total demands using neural networks. However, predicting half-hourly demand seems more appropriate due to the considerable changes of electricity demand observed during a particular day. As such, clusters are identified considering half-hourly data within the daily load distribution curves. Thus, this paper is an improvement on Deshani et al. (2014), illustrating how the half-hourly demand distribution within a day is incorporated when selecting the inputs for the neuro-forecasters.
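
    A minimal sketch of the clustering step described above: group daily curves of 48 half-hourly demands with k-means so that each cluster of day types could feed its own neuro-forecaster. The synthetic load profiles are illustrative, not the demand data used in the paper.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    hh = np.arange(48)  # 48 half-hourly periods per day

    # Synthetic daily load curves (MW): a peaked "weekday" shape and a flatter "weekend" shape.
    weekday = 900 + 350 * np.exp(-((hh - 38) / 6.0) ** 2) + 150 * np.exp(-((hh - 16) / 5.0) ** 2)
    weekend = 800 + 200 * np.exp(-((hh - 40) / 8.0) ** 2)
    days = np.vstack([weekday + rng.normal(0, 25, 48) for _ in range(22)] +
                     [weekend + rng.normal(0, 25, 48) for _ in range(8)])

    # Cluster the half-hourly profiles; each cluster would get its own neuro-forecaster.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(days)
    print("days per cluster:", np.bincount(km.labels_))
    print("peak demand of each cluster centre (MW):", km.cluster_centers_.max(axis=1).round(0))
    ```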

  12. Spatial analysis on impacts of mining activities leading to flood disaster in the Erai watershed, India

    Energy Technology Data Exchange (ETDEWEB)

    Katpatal, Y.B.; Patil, S.A. [Visvesvaraya National Institute of Technology, Nagpur (India). Dept. of Civil Engineering

    2010-05-15

    Decisions related to mine management, especially pertaining to dumped material, might lead to several environmental hazards including flood risks in mining areas. Excavation and mine dumps are dominant factors of land use/land cover change in the Erai River watershed of Chandrapur district in Maharashtra, India. Identification and quantification of the extent of mining activities is important for assessing how this change in land use/land cover affects ecosystem components such as aesthetics, biodiversity and mitigation of floods in the Erai watershed. The present study utilizes satellite data of Landsat TM (1989), IRS LISS-3 (1999, 2007) and CARTOSAT (2007) to study the extent of surface mines and management of mine overburden (OB) dumps of Hindustan Lalpeth coal mines, Chandrapur, India. Image processing techniques in conjunction with GIS have been used to visualize the flood scenario, the reasons for floods and area under impact. The study indicates that the development of the mine OB dump within the river channel on both the sides has been responsible for the 2006 flood within the region. Further increase in OB dump heights may result in the risk of floods of greater potential during heavy rainfall in the future. The study presents a spatial analysis to assess the impacts of OB dumps in the recent flood in the area. The study also spatially represents the area under impact leading to a disastrous situation due to floods. The study also suggests the probable measures that must be adopted to avoid such situations in future in the mining areas.

  13. Text Analysis: A Functional Linguistic Approach of News Introduction

    Institute of Scientific and Technical Information of China (English)

    刘锦凤

    2009-01-01

    The past several decades have witnessed a phenomenal growth in interest in text analysis, in which different kinds of approaches have been studied and applied. This paper aims at analyzing the introduction of a chosen CNN News item from a functional linguistic approach, which is mainly realized through cohesive means and textual information. The study shows that in written text, well-organized semantic cohesive means and textual information are of great significance for readers to follow the movement of an idea from one sentence to another. Therefore, the functional approach plays a momentous role in the analysis of a text.

  14. A Project Risk Ranking Approach Based on Set Pair Analysis

    Institute of Scientific and Technical Information of China (English)

    Gao Feng; Chen Yingwu

    2006-01-01

    Set Pair Analysis (SPA) is a new methodology to describe and process system uncertainty. It differs from stochastic and fuzzy methods in reasoning and operation, and it has been applied in many areas recently. In this paper, the application of SPA in risk ranking is presented, which includes a review of risk ranking, an introduction to the Connecting Degree (CD), which plays a key role in SPA, the arithmetic and Tendency Grade (TG) of CDs, and a proposed risk ranking approach. Finally, a case analysis is presented to illustrate the reasonableness of this approach. This approach is found to be very convenient to operate, while the ranking result is more comprehensible.
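
    A much-simplified sketch of the connection degree at the heart of SPA is given below: criterion scores are partitioned into identity, discrepancy and contrary degrees so that mu = a + b*i + c*j. The scoring scale, thresholds and example risks are hypothetical, and the paper's arithmetic and Tendency Grade of CDs are not reproduced.

    ```python
    def connection_degree(scores, low, high):
        """Connection degree of Set Pair Analysis for a list of criterion scores.

        Scores >= high count towards identity (a), scores <= low towards contrariness (c),
        and everything in between towards discrepancy (b); a + b + c = 1.
        """
        n = len(scores)
        a = sum(s >= high for s in scores) / n   # identity degree
        c = sum(s <= low for s in scores) / n    # contrary degree
        b = 1.0 - a - c                          # discrepancy degree
        return a, b, c

    # Hypothetical risk-criterion scores (0-10) for two project risks.
    for name, scores in [("schedule risk", [8, 9, 4, 7, 6]), ("supplier risk", [3, 2, 6, 4, 5])]:
        a, b, c = connection_degree(scores, low=3, high=7)
        print(f"{name}: mu = {a:.2f} + {b:.2f}i + {c:.2f}j")
    ```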

  15. Analysis of spent fuel assemblies using a lead slowing down spectrometer

    International Nuclear Information System (INIS)

    We analyze the potential of using Lead Slowing Down Spectrometer technology for assaying spent fuel. This initial study demonstrates that it may be feasible to design a system that will provide approximately 1% statistical precision in the determination of the 239Pu concentration in a pressurized water reactor spent-fuel assembly, for intermediate-to-high burnup levels, using commercial neutron sources, and an array of ultra-high-purity 238U threshold fission detectors. LSDS technology can also determine the concentration of 241Pu and 235U. There is indication that missing pins can be detected, as can asymmetry in the fuel bundle. The analytical model used to perform the viability assessment is described, as are the systematic effects that were not incorporated in this analysis, but could significantly degrade actual performance. These results provide the justification and impetus for the initiation of followup studies that will incorporate the complete suite of effects that impact the accuracy of LSDS measurements.

  16. Analysis of spent fuel assay with a lead slowing down spectrometer

    International Nuclear Information System (INIS)

    Assay of fissile materials in spent fuel that are produced or depleted during the operation of a reactor, is of paramount importance to nuclear materials accounting, verification of the reactor operation history, as well as for criticality considerations for storage. In order to prevent future proliferation following the spread of nuclear energy, we must develop accurate methods to assay large quantities of nuclear fuels. We analyze the potential of using a Lead Slowing Down Spectrometer for assaying spent fuel. We conclude that it is possible to design a system that will provide around 1% statistical precision in the determination of the 239Pu, 241Pu and 235U concentrations in a PWR spent-fuel assembly, for intermediate-to-high burnup levels, using commercial neutron sources, and a system of 238U threshold fission detectors. Pending further analysis of systematic errors, it is possible that missing pins can be detected, as can asymmetry in the fuel bundle.

  17. Analysis of spent fuel assay with a lead slowing down spectrometer

    International Nuclear Information System (INIS)

    Assay of fissile materials in spent fuel that are produced or depleted during the operation of a reactor, is of paramount importance to nuclear materials accounting, verification of the reactor operation history, as well as for criticality considerations for storage. In order to prevent future proliferation following the spread of nuclear energy, we must develop accurate methods to assay large quantities of nuclear fuels. We analyze the potential of using a Lead Slowing Down Spectrometer for assaying spent fuel. We conclude that it is possible to design a system that will provide around 1% statistical precision in the determination of the 239Pu, 241Pu and 235U concentrations in a PWR spent-fuel assembly, for intermediate-to-high burnup levels, using commercial neutron sources, and a system of 238U threshold fission detectors. Pending further analysis of systematic errors, it is possible that missing pins can be detected, as can asymmetry in the fuel bundle. (author)

  18. Correction and analysis of lead content in soil by laser-induced breakdown spectroscopy

    Institute of Scientific and Technical Information of China (English)

    Chengli Xie; Jidong Lu; Pengyan Li; Jie Li; Zhaoxiang Lin

    2009-01-01

    Laser-induced breakdown spectroscopy is used to analyze the lead content in soils. The analyzed spectral line profile is fitted by a Lorentzian function to determine the background and the full-width at half-maximum (FWHM) of the spectral line. A self-absorption correction model based on the information on spectral broadening is introduced to calculate the true value of the spectral line intensity, which relates to the elemental concentration. The results show that the background intensity obtained by spectral profile fitting is very effective and important because it removes the interference of spectral broadening, and a better precision of the calibration analysis is acquired by correcting for the self-absorption effect.
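
    The Lorentzian profile fitting described above can be sketched with scipy.optimize.curve_fit; the synthetic Pb line, its wavelength and the noise level are illustrative, and the self-absorption correction itself is not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian(x, amplitude, center, fwhm, background):
        """Lorentzian line profile on a constant background."""
        return background + amplitude * (0.5 * fwhm) ** 2 / ((x - center) ** 2 + (0.5 * fwhm) ** 2)

    # Synthetic emission line near 405.78 nm with noise (illustrative, not measured data).
    rng = np.random.default_rng(3)
    wl = np.linspace(405.3, 406.3, 200)
    counts = lorentzian(wl, 1200.0, 405.78, 0.08, 300.0) + rng.normal(0, 20, wl.size)

    # Fit the profile; p0 is a rough initial guess for amplitude, centre, FWHM and background.
    popt, _ = curve_fit(lorentzian, wl, counts, p0=[1000.0, 405.8, 0.1, 250.0])
    amp, cen, fwhm, bkg = popt
    print(f"peak={amp:.0f} counts, centre={cen:.3f} nm, FWHM={fwhm:.3f} nm, background={bkg:.0f} counts")
    ```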

  19. Environmental monitoring near urban lead refineries by photon and neutron activation analysis

    International Nuclear Information System (INIS)

    Photon activation has been used in conjunction with neutron activation for multielement determinations in airborne particulates, soil and hair samples collected near two secondary lead refineries in Metropolitan Toronto. Particle size distributions of suspended particulates collected with a high-volume Andersen sampler are reported for Al, Sb, As, Br, Cl, Mn, Na, Pb, Ti and V. Increases in the concentrations of Pb, As and Sb associated with particles >3.3 μm in diameter on certain days near the refineries have resulted in localized contamination, as reflected in higher concentrations of these elements in soil. To assess Pb accumulation in local residents compared with control groups, approximately 250 hair samples were analyzed for Pb by photon activation analysis. Children living close to the refineries, especially boys, exhibit the most elevated levels: up to 20 times urban control values in some cases.

  20. Representation of autism in leading newspapers in china: a content analysis.

    Science.gov (United States)

    Bie, Bijie; Tang, Lu

    2015-01-01

    The public's lack of understanding and misconceptions about autism in China contribute to the underdiagnosis and undertreatment of the disorder and the stigma associated with it. Mass media are the primary channel through which people learn about autism. This article examines how leading newspapers in China covered autism over the 10-year period from 2003 through 2012, using a framing analysis. It finds that while autism has received increased media attention, it is increasingly framed as a family problem: family members are cited or quoted more than any other sources, and the responsibility of dealing with autism is ultimately assigned to families. Autistic people are largely silenced unless they are autistic savants with special talents. The use of the scientific discourse and the human-interest discourse both decrease in percentage over time, while the use of other discourses, such as the public relations discourse, becomes more dominant. PMID:25074820

  1. Analysis of ECG Using Filter Bank Approach

    Directory of Open Access Journals (Sweden)

    S. Thulasi Prasad

    2014-01-01

    Full Text Available In recent years scientists and engineers have been facing several problems in the biomedical field, and Digital Signal Processing is solving many of those problems easily and effectively. Signal processing of the ECG is very useful in detecting selected arrhythmia conditions from a patient's electrocardiograph (ECG) signals. In this paper we performed analysis of noisy ECG by filtering the 50 Hz power line interference using an adaptive LMS notch filter. This is very meaningful in the measurement of biomedical events, particularly when the recorded ECG signal is very weak. The basic ECG has a frequency range from 5 Hz to 100 Hz. It becomes difficult for the specialist to diagnose diseases if artifacts are present in the ECG signal. Methods of noise reduction have a decisive influence on the performance of all electrocardiographic (ECG) signal processing systems. After removing the 50/60 Hz powerline interference, the ECG is lowpass filtered with a digital FIR filter. We designed a filter bank to separate the frequency ranges of the ECG signal and enhance the occurrences of QRS complexes. Later the positions of R-peaks are identified and plotted. The result shows the ECG signal before and after filtering, with their frequency spectra, which clearly indicate the reduction of the power line interference in the ECG signal, and a filtered ECG with identified R-peaks.
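
    A rough sketch of the processing chain described above: a fixed IIR notch at 50 Hz (standing in for the adaptive LMS notch used in the paper), an FIR bandpass as one band of a simple filter bank to emphasise the QRS complexes, and R-peak picking. The synthetic ECG, sampling rate and filter settings are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import signal

    fs = 360.0                                   # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)

    # Crude synthetic ECG: narrow Gaussian "R waves" every 0.8 s plus 50 Hz interference and noise.
    ecg = sum(np.exp(-((t - beat) / 0.012) ** 2) for beat in np.arange(0.5, 10, 0.8))
    noisy = ecg + 0.3 * np.sin(2 * np.pi * 50 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

    # 1) Fixed IIR notch at 50 Hz (the paper uses an adaptive LMS notch instead).
    b_notch, a_notch = signal.iirnotch(w0=50.0, Q=30.0, fs=fs)
    denoised = signal.filtfilt(b_notch, a_notch, noisy)

    # 2) FIR bandpass (5-20 Hz) to emphasise the QRS complexes, one band of a simple filter bank.
    fir = signal.firwin(numtaps=201, cutoff=[5.0, 20.0], pass_zero=False, fs=fs)
    qrs_band = signal.filtfilt(fir, [1.0], denoised)

    # 3) R-peak detection on the band-limited signal.
    peaks, _ = signal.find_peaks(qrs_band, distance=int(0.4 * fs), height=0.5 * qrs_band.max())
    print(f"detected {len(peaks)} R-peaks; mean RR interval = {np.mean(np.diff(peaks)) / fs:.2f} s")
    ```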

  2. Robust approach to ocular fundus image analysis

    Science.gov (United States)

    Tascini, Guido; Passerini, Giorgio; Puliti, Paolo; Zingaretti, Primo

    1993-07-01

    The analysis of morphological and structural modifications of retinal blood vessels plays an important role both in establishing the presence of some systemic diseases, such as hypertension and diabetes, and in studying their course. The paper describes a robust set of techniques developed to quantitatively evaluate morphometric aspects of the ocular fundus vascular and micro-vascular network. The following are defined: (1) the concept of 'Local Direction of a vessel' (LD); (2) a special form of edge detection, named Signed Edge Detection (SED), which uses LD to choose the convolution kernel in the edge detection process and is able to distinguish between the left and the right vessel edge; (3) an iterative tracking (IT) method. The developed techniques use both LD and SED intensively in: (a) the automatic detection of the number, position and size of blood vessels departing from the optical papilla; (b) the tracking of the body and edges of the vessels; (c) the recognition of vessel branches and crossings; (d) the extraction of a set of features such as blood vessel length and average diameter, artery and arteriole tortuosity, crossing position and angle between two vessels. The algorithms, implemented in the C language, have an execution time that depends on the complexity of the currently processed vascular network.

  3. [System approach and system analysis in dietology].

    Science.gov (United States)

    Samsonov, M A

    2004-01-01

    The article analyses the use of two variants of the dietotherapy auto-program: the numeric system and the basic system. They belong to the same kind of structure but differ in their functioning principle. The numeric system is built upon the nosological principle, taking into consideration the clinicopathogenetic features of the disease. The basic diets are built upon the metabolic principle, the point of which is the adaptation of the chemical content, alimentary and food value of the diet to the concrete mechanism of the metabolic disturbance. At the same time, the metabolic conveyor is considered as a system organization of separate functional systems that are in permanent dynamic interaction with each other. This organization is combined on the principle of auto-regulation and is set on the correction and recovery of disturbed homeostasis as a whole. The selection of the practical use of the mentioned principles of diets is a right of the specialist dietitian. The auto-program of diet building should help him in that and simplify the organization of dietotherapy. PMID:15049149

  4. Developing a New Approach for Arabic Morphological Analysis and Generation

    CERN Document Server

    Gridach, Mourad

    2011-01-01

    Arabic morphological analysis is one of the essential stages in Arabic Natural Language Processing. In this paper we present an approach for Arabic morphological analysis. This approach is based on Arabic morphological automaton (AMAUT). The proposed technique uses a morphological database realized using XMODEL language. Arabic morphology represents a special type of morphological systems because it is based on the concept of scheme to represent Arabic words. We use this concept to develop the Arabic morphological automata. The proposed approach has development standardization aspect. It can be exploited by NLP applications such as syntactic and semantic analysis, information retrieval, machine translation and orthographical correction. The proposed approach is compared with Xerox Arabic Analyzer and Smrz Arabic Analyzer.

  5. Terminal Performance of Lead-Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video

    CERN Document Server

    Courtney, Elijah; Andrusiv, Lubov; Courtney, Michael

    2016-01-01

    Due to concerns about environmental and industrial hazards of lead, a number of military, law enforcement, and wildlife management agencies are giving careful consideration to lead-free ammunition. The goal of lead-free bullets is to gain the advantages of reduced lead use in the environment while maintaining equal or better terminal performance. Accepting reduced terminal performance would foolishly risk the lives of military and law enforcement personnel. This paper uses the established technique of studying bullet impacts in ballistic gelatin to characterize the terminal performance of eight commercial off-the-shelf lead-free handgun bullets for comparison with earlier analysis of jacketed lead bullets. Peak retarding force and energy deposit in calibrated ballistic gelatin are quantified using high speed video. The temporary stretch cavities and permanent wound cavities are also characterized. Two factors tend to reduce the terminal performance of these lead-free projectiles compared to similar jacketed ...
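
    The retarding-force analysis referred to above can be sketched by differentiating digitised bullet position twice and applying F = m*a; the positions below are generated from an assumed exponential velocity decay, so the bullet mass, impact speed, frame rate and printed results are illustrative, not the paper's measurements.

    ```python
    import numpy as np

    # Synthetic "digitised" bullet positions in gelatin (not measured data):
    # assume v(t) = v0 * exp(-k t), so x(t) = v0/k * (1 - exp(-k t)).
    frame_rate = 20000.0                      # frames per second of the high-speed camera
    mass = 8.1e-3                             # bullet mass, kg (about 125 grain)
    v0, k = 350.0, 1000.0                     # impact speed (m/s) and decay constant (1/s)
    t = np.arange(0, 0.003, 1 / frame_rate)
    x = v0 / k * (1 - np.exp(-k * t))

    # Differentiate position twice to get velocity and deceleration, then F = m * a.
    v = np.gradient(x, t)
    a = np.gradient(v, t)
    force = -mass * a                         # retarding force, N (positive = opposing motion)

    print(f"peak retarding force: about {force.max():,.0f} N")
    print(f"energy deposited: about {0.5 * mass * (v0**2 - v[-1]**2):,.0f} J over {x[-1]*100:.1f} cm")
    ```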

  6. Low-level lead exposure and the IQ of children. A meta-analysis of modern studies

    Energy Technology Data Exchange (ETDEWEB)

    Needleman, H.L.; Gatsonis, C.A. (Univ. of Pittsburgh, PA (USA))

    1990-02-02

    We identified 24 modern studies of childhood exposures to lead in relation to IQ. From this population, 12 that employed multiple regression analysis with IQ as the dependent variable and lead as the main effect and that controlled for nonlead covariates were selected for a quantitative, integrated review or meta-analysis. The studies were grouped according to type of tissue analyzed for lead. There were 7 blood and 5 tooth lead studies. Within each group, we obtained joint P values by two different methods and average effect sizes as measured by the partial correlation coefficients. We also investigated the sensitivity of the results to any single study. The sample sizes ranged from 75 to 724. The sign of the regression coefficient for lead was negative in 11 of 12 studies. The negative partial r's for lead ranged from -.27 to -.003. The power to find an effect was limited, below 0.6 in 7 of 12 studies. The joint P values for the blood lead studies were less than .0001 for both methods of analysis (95% confidence interval for group partial r, -.15 ± .05), while for the tooth lead studies they were .0005 and .004, respectively (95% confidence interval for group partial r, -.08 ± .05). The hypothesis that lead impairs children's IQ at low dose is strongly supported by this quantitative review. The effect is robust to the impact of any single study.
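
    The pooling steps named above (joint P values and average effect sizes) can be sketched as below; the per-study p-values, correlations and sample sizes are hypothetical placeholders, not the 12 studies analysed by the authors, and Stouffer's Z is only one of several ways of obtaining a joint P value.

    ```python
    import numpy as np
    from scipy import stats

    def stouffer_combined_p(p_values, one_sided=True):
        """Combine one-sided p-values with Stouffer's Z method."""
        z = stats.norm.isf(np.asarray(p_values))        # convert each p to a Z score
        z_combined = z.sum() / np.sqrt(len(p_values))
        p = stats.norm.sf(z_combined)
        return p if one_sided else 2 * p

    def weighted_mean_partial_r(partial_r, n):
        """Sample-size-weighted average of per-study partial correlations."""
        r, n = np.asarray(partial_r), np.asarray(n)
        return float(np.sum(n * r) / np.sum(n))

    # Hypothetical per-study inputs (NOT the studies reviewed in the abstract above).
    p_vals = [0.03, 0.10, 0.004, 0.20, 0.05]
    partial_r = [-0.15, -0.08, -0.22, -0.05, -0.12]
    sample_n = [150, 200, 120, 300, 180]

    print(f"combined one-sided P = {stouffer_combined_p(p_vals):.4f}")
    print(f"weighted mean partial r = {weighted_mean_partial_r(partial_r, sample_n):.3f}")
    ```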

  7. Integrated micro-biochemical approach for phytoremediation of cadmium and lead contaminated soils using Gladiolus grandiflorus L cut flower.

    Science.gov (United States)

    Mani, Dinesh; Kumar, Chitranjan; Patel, Niraj Kumar

    2016-02-01

    The potential of vermicompost, elemental sulphur, Thiobacillus thiooxidans and Pseudomonas putida for phytoremediation is well known individually, but their integrated application has not been explored so far. The present work highlights so far overlooked aspects of their integrated treatment by growing the ornamental plant Gladiolus grandiflorus L. in uncontaminated and sewage-contaminated soils (sulphur-deficient alluvial Entisols, pH 7.6-7.8) for phytoremediation of cadmium and lead in a pot experiment. Between vermicompost and elemental sulphur, the response of vermicompost was higher towards improvement in the biometric parameters of plants, whereas the response of elemental sulphur was higher towards enhanced bioaccumulation of heavy metals in the soils. The integrated treatment (T7: vermicompost 6 g and elemental sulphur 0.5 g kg(-1) soil, and co-inoculation of the plant with T. thiooxidans and P. putida) was found superior in promoting root length, plant height and dry biomass of the plant. The treatment T7 caused enhanced accumulation of Cd up to 6.96 and 6.45 mg kg(-1) and Pb up to 22.6 and 19.9 mg kg(-1) in corm and shoot, respectively, in the contaminated soil. T7 showed a maximum remediation efficiency of 0.46% and 0.19%, a bioaccumulation factor of 2.92 and 1.21, and an uptake of 6.75 and 21.4 mg kg(-1) dry biomass for Cd and Pb, respectively, in the contaminated soil. The integrated treatment T7 was found significant over the individual treatments in promoting plant growth and enhancing phytoremediation. Hence, the authors conclude that vermicompost, elemental sulphur and microbial co-inoculation should be integrated for the enhanced clean-up of Cd- and Pb-contaminated soils. PMID:26615479

  8. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    International Nuclear Information System (INIS)

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of 16O is problematic due to lack of correlation between total and elastic reactions

  9. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Energy Technology Data Exchange (ETDEWEB)

    Vanhanen, R., E-mail: risto.vanhanen@aalto.fi

    2015-03-15

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of 16O is problematic due to lack of correlation between total and elastic reactions.
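
    The propagation step described above is the standard first-order "sandwich rule"; the toy sketch below illustrates it with an invented sensitivity vector and covariance matrix rather than the ENDF/B-VII.1 data used in the paper.

      # First-order ("sandwich rule") propagation: var(R) ≈ S^T C S for a response R
      # with relative sensitivities S and relative parameter covariance C.
      # The numbers are toy values, not ENDF/B-VII.1 covariance data.
      import numpy as np

      S = np.array([0.8, -0.3, 0.1])            # dR/R per dp/p for three parameters (assumed)
      C = np.array([[0.04, 0.01, 0.00],         # relative covariance matrix (assumed)
                    [0.01, 0.09, 0.00],
                    [0.00, 0.00, 0.01]])

      rel_var = S @ C @ S
      print(f"relative uncertainty in the response ≈ {np.sqrt(rel_var):.3f}")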

  10. ADVANCEMENTS IN TIME-SPECTRA ANALYSIS METHODS FOR LEAD SLOWING-DOWN SPECTROSCOPY

    International Nuclear Information System (INIS)

    Direct measurement of Pu in spent nuclear fuel remains a key challenge for safeguarding nuclear fuel cycles of today and tomorrow. Lead slowing-down spectroscopy (LSDS) is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic mass with an uncertainty lower than the approximately 10 percent typical of today's confirmatory assay methods. Pacific Northwest National Laboratory's (PNNL) previous work to assess the viability of LSDS for the assay of pressurized water reactor (PWR) assemblies indicated that the method could provide direct assay of Pu-239 and U-235 (and possibly Pu-240 and Pu-241) with uncertainties less than a few percent, assuming suitably efficient instrumentation, an intense pulsed neutron source, and improvements in the time-spectra analysis methods used to extract isotopic information from a complex LSDS signal. This previous simulation-based evaluation used relatively simple PWR fuel assembly definitions (e.g. constant burnup across the assembly) and a constant initial enrichment and cooling time. The time-spectra analysis method was founded on a preliminary analytical model of self-shielding intended to correct for assay-signal nonlinearities introduced by attenuation of the interrogating neutron flux within the assembly.

  11. An approach to the economic analysis of water supply projects

    OpenAIRE

    Lovei, Laszlo

    1992-01-01

    Development economists are increasingly concerned about the correct approach to economic analysis of projects. By looking for a compromise between theory (which identifies ideals) and practice (which deals within the bounds of time and resource constraints), Lovei focuses on potential guidelines for economic appraisals of water supply projects. He summarizes theory and the current World Bank guidelines on the economic analysis of water supply projects; reviews the method of economic analysis ...

  12. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis, thereby including transient interactions and trip uncertainties in the MDNBR probability density
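
    A hedged sketch of the general idea: fit a second-order response surface to code-calculated MDNBR values and push input uncertainties through it to estimate a DNB probability. The original work used an analytic 2nd-order error propagation; Monte Carlo sampling through the fitted surface is used below only as a simpler stand-in, and all numbers are synthetic.

      # Synthetic illustration: quadratic (2nd-order) response surface for MDNBR in
      # two normalized inputs, then input uncertainties pushed through the fitted
      # surface by Monte Carlo sampling (a stand-in for the analytic propagation).
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-1, 1, size=(50, 2))                           # "code calculations"
      mdnbr = 1.6 - 0.25 * x[:, 0] + 0.10 * x[:, 1] + 0.05 * x[:, 0]**2 + rng.normal(0, 0.01, 50)

      def quad_design(z):
          """Design matrix for a full second-order surface in two variables."""
          return np.column_stack([np.ones(len(z)), z[:, 0], z[:, 1],
                                  z[:, 0]**2, z[:, 1]**2, z[:, 0] * z[:, 1]])

      coef, *_ = np.linalg.lstsq(quad_design(x), mdnbr, rcond=None)

      u = rng.normal(0.0, 0.3, size=(100_000, 2))                    # assumed input uncertainties
      samples = quad_design(u) @ coef
      print(f"P(MDNBR < 1.3) ≈ {np.mean(samples < 1.3):.4f}")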

  13. Preliminary optimization analysis of the radiation shielding of the China Lead-based Research Reactor

    International Nuclear Information System (INIS)

    Accelerator Driven subcritical System (ADS) is recognized as an efficient nuclear waste transmutation device. Supported by the Strategic Priority Research Program of 'the Future Advanced Nuclear Fission Energy-ADS transmutation system', the China LEAd-based Research Reactor (CLEAR-I) is proposed. As the CLEAR-I design progresses, the radiation shielding for CLEAR-I is updated and optimized step by step to meet new shielding requirements. Employing the modeling program MCAM and the calculation system VisualBUS developed by the FDS Team, the shielding capability was verified using the Monte Carlo method. The results show that the fast neutron flux for components in the reactor vessel is under the limit and that the neutron radiation to mechanisms in the containing room has been kept as low as possible. After shutdown for 7 days, the dose rate in most areas of the containing room is lower than 100 μSv/hr, allowing hands-on operation. Replacement of components such as the spallation target in the containing room is possible. (author)

  14. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study

    OpenAIRE

    Tania Dehesh; Najaf Zare; Seyyed Mohammad Taghi Ayatollahi

    2015-01-01

    Background. Univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least square (MGLS) method as a multivariate meta-analysis approach. Me...

  15. An econometric analysis of the lead-lag relationship between India's NSE Nifty and its derivative contracts

    OpenAIRE

    Sathya Swaroop Debasish

    2009-01-01

    Purpose – The purpose of this paper is to examine the lead-lag relationships between the National Stock Exchange (NSE) Nifty stock market index (in India) and its related futures and options contracts, and also the interrelation between the derivatives markets. Design/methodology/approach – The paper uses serial correlation of return series and autoregressive moving average (ARMA) model for studying the lead-lag relationship between hourly returns on the NSE Nifty index and its derivatives co...
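
    A minimal sketch of a lead-lag check of the kind described, correlating one return series against lagged versions of the other; the series below are simulated stand-ins for the NSE Nifty spot and futures returns, and the paper's ARMA modelling step is not reproduced.

      # Simulated stand-ins for spot and futures hourly returns; the spot series is
      # built to lag the futures series by two periods so the diagnostic has
      # something to find.
      import numpy as np

      rng = np.random.default_rng(1)
      futures = rng.normal(0, 1, 1000)
      spot = 0.6 * np.roll(futures, 2) + 0.8 * rng.normal(0, 1, 1000)

      def lagged_corr(x, y, k):
          """Correlation of x[t] with y[t-k]; a peak at k > 0 means y leads x."""
          if k > 0:
              return np.corrcoef(x[k:], y[:-k])[0, 1]
          if k < 0:
              return np.corrcoef(x[:k], y[-k:])[0, 1]
          return np.corrcoef(x, y)[0, 1]

      for k in range(-3, 4):
          print(f"k = {k:+d}: corr(spot[t], futures[t-k]) = {lagged_corr(spot, futures, k):+.3f}")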

  16. The analysis of the effective lead-in helps optimize the English class

    Institute of Scientific and Technical Information of China (English)

    张蕾

    2014-01-01

    Lead-in, the first part of a class, guides students into the lesson; it seems to be the simplest part and takes the least time of the whole class, but an effective lead-in helps optimize the English class. Lead-in methods for English classes in junior high school teaching are various. This thesis lists several effective and practical ways and some lead-in problems that need to be avoided.

  17. Rapid lead isotope analysis of archaeological metals by multiple-collector inductively coupled plasma mass spectrometry

    DEFF Research Database (Denmark)

    Baker, J.A.; Stos, S.; Waight, Tod Earle

    2006-01-01

    Lead isotope ratios in archaeological silver and copper were determined by MC-ICPMS using laser ablation and bulk dissolution without lead purification. Laser ablation results on high-lead metals and bulk solution analyses on all samples agree within error of TIMS data, suggesting that problems f...

  18. Spectral Synthesis via Mean Field approach Independent Component Analysis

    CERN Document Server

    Hu, Ning; Kong, Xu

    2015-01-01

    In this paper, we apply a new statistical analysis technique, Mean Field approach to Bayesian Independent Component Analysis (MF-ICA), on galaxy spectral analysis. This algorithm can compress the stellar spectral library into a few Independent Components (ICs), and galaxy spectrum can be reconstructed by these ICs. Comparing to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, MF-ICA approach offers a large improvement in the efficiency. To check the reliability of this spectral analysis method, three different methods are used: (1) parameter-recover for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) consistency test of parameters from the Sloan Digital Sky Survey galaxies. We find that our MF-ICA method not only can fit the observed galaxy spectra efficiently, but also can recover the physical parameters of galaxies accurately. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find...
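
    A hedged illustration of the compress-and-reconstruct idea: scikit-learn's FastICA is substituted for the mean-field Bayesian ICA used in the paper, and the "library" is a few synthetic curves rather than a real stellar spectral library.

      # FastICA substituted for MF-ICA; synthetic curves substituted for a stellar
      # library.  Only the compress-and-reconstruct workflow is illustrated.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)
      wave = np.linspace(3800.0, 9200.0, 500)                        # wavelength grid (Angstrom)
      library = np.array([np.exp(-0.5 * ((wave - c) / w)**2) + 0.05 * rng.random(wave.size)
                          for c, w in [(4500, 300), (5500, 500), (6600, 200), (8500, 700)]])

      # Treat wavelengths as samples so the recovered sources are IC spectra.
      ics = FastICA(n_components=3, random_state=0, max_iter=2000).fit_transform(library.T)

      # Reconstruct a synthetic "galaxy" spectrum as a combination of ICs plus an offset.
      galaxy = 0.7 * library[0] + 0.3 * library[2]
      A = np.column_stack([ics, np.ones(len(wave))])
      coeffs, *_ = np.linalg.lstsq(A, galaxy, rcond=None)
      print("reconstruction rms:", np.sqrt(np.mean((A @ coeffs - galaxy)**2)))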

  19. In silico-screening approaches for lead generation: identification of novel allosteric modulators of human-erythrocyte pyruvate kinase.

    Science.gov (United States)

    Tripathi, Ashutosh; Safo, Martin K

    2012-01-01

    Identification of allosteric binding site modulators has gained increased attention lately for their potential to be developed as selective agents with a novel chemotype, targeting perhaps a new and unique binding site with probably fewer side effects. Erythrocyte pyruvate kinase (R-PK) is an important glycolytic enzyme that can be pharmacologically modulated through its allosteric effectors for the treatment of hemolytic anemia, sickle-cell anemia, hypoxia-related diseases, and other disorders arising from erythrocyte PK malfunction. An in-silico screening approach was applied to identify novel allosteric modulators of pyruvate kinase. A small-molecule database of the National Cancer Institute (NCI) was virtually screened based on a structure/ligand-based pharmacophore. The virtual screening campaign led to the identification of several compounds with similar pharmacophoric features as fructose-1,6-bisphosphate (FBP), the natural allosteric activator of the kinase. The compounds were subsequently docked into the FBP-binding site using the programs FlexX and GOLD, and their interactions with the protein were analyzed with the energy-scoring function of HINT. Seven promising candidates were obtained from the NCI and subjected to kinetics analysis, which revealed both activators and inhibitors of the R-isozyme of PK (R-PK). PMID:22052500

  20. The Differential Approach to Demand Analysis and the Rotterdam Model.

    OpenAIRE

    Barnett, William A.; Serletis, Apostolos

    2009-01-01

    This paper presents the differential approach to applied demand analysis. The demand systems of this approach are general, having coefficients which are not necessarily constant. We consider the Rotterdam parameterization of differential demand systems and derive the absolute and relative price versions of the Rotterdam model, due to Theil (1965) and Barten (1966). We address estimation issues and point out that, unlike most parametric and semi-nonparametric demand systems, the Rotterdam mo...

  1. Assumed mode approach to fast reactor core seismic analysis

    International Nuclear Information System (INIS)

    The need for a time history approach, rather than a response spectrum approach, to the seismic analysis of fast breeder reactor core structures is described. The use of a Rayleigh-Ritz/Assumed Mode formalism for developing mathematical models of reactor cores is presented. Various factors including structural nonlinearity, fluid inertia, and impact which necessitate abandonment of response spectrum methods are discussed. The use of the assumed mode formalism is described in some detail as it applies to reactor core seismic analysis. To illustrate the use of this formal approach to mathematical modeling, a sample reactor problem with increasing complexities of modeling is presented. Finally, several problem areas--fluid inertia, fluid damping, Coulomb friction, impact, and modal choice--are discussed with emphasis on research needs for use in fast reactor seismic analysis

  2. Analysis of the zone approach for plutonium facilities

    International Nuclear Information System (INIS)

    In order to examine the effect of different inspection strategies on inspection effort, an analysis was carried out of the zone approach for the international safeguards verifications of a model nuclear fuel cycle. The fuel cycle includes the fabrication of mixed-oxide fresh fuel for nine light-water reactors and one experimental breeder reactor and the subsequent reprocessing of the spent fuel. There are thus two zones to be considered, a plutonium zone and an irradiated fuel zone. The zone approach entails many fewer verifications of nuclear material flows between different material balance areas (facilities) than the facility-oriented approach, and it requires an annual simultaneous physical inventory verification (PIV) and monthly simultaneous interim inventory verifications for timeliness at all the facilities. Therefore, the zone approach yields snapshots of the disposition of the nuclear materials at the time of the simultaneous inventory verifications, but less verified information than a facility-oriented approach encompassing frequent flow verification

  3. Structural analysis of steam generator internals following feed water main steam line break: DLF approach

    International Nuclear Information System (INIS)

    In order to evaluate the possible release of radioactivity in extreme events, some postulated accidents are analysed and studied during the design stage of the Steam Generator (SG). Among the various accidents postulated, the most important are the Feed Water Line Break (FWLB) and the Main Steam Line Break (MSLB). This report concerns the dynamic structural analysis of SG internals following FWLB/MSLB. The pressure/drag-force time histories considered correspond to the conditions leading to the accident of maximum potential. The SG internals were analysed using two approaches of structural dynamics. In the first approach, the simplified DLF method was adopted. This method yields upper-bound values of stresses and deflections. In the second approach, a time-history analysis by the Mode Superposition Technique was adopted. This approach gives more realistic results. The structure was qualified as per ASME B and PV Code Sec III NB. It was concluded that in all the components except the perforated flow distribution plate, the stress values based on elastic analysis are within the limits specified by the ASME Code. For the perforated flow distribution plate, the stress values based on elastic analysis during the MSLB transient are higher than the ASME Code limits. Therefore, its limit load analysis had to be done. Finally, the collapse pressure evaluated using limit load analysis was shown to be within the limits of ASME B and PV Code Sec III NB. (author). 31 refs., 94 figs., 16 tabs
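
    A minimal sketch of why the DLF method bounds the time-history result, using a single-degree-of-freedom stand-in: for an undamped oscillator under a suddenly applied load the dynamic load factor tends to 2. Parameters are illustrative, not the SG internals'.

      # Single-degree-of-freedom stand-in: integrate m*x'' + k*x = f0 (step load)
      # and compare the peak dynamic deflection with the static deflection f0/k.
      # For an undamped step load the dynamic load factor tends to 2.
      import numpy as np

      m, k = 10.0, 4.0e4                 # mass (kg) and stiffness (N/m), assumed
      f0 = 1.0e3                         # suddenly applied load (N), assumed
      dt = 1.0e-4
      steps = 10000                      # covers several natural periods

      x, v = 0.0, 0.0
      x_hist = np.empty(steps)
      for i in range(steps):             # semi-implicit (symplectic) Euler integration
          a = (f0 - k * x) / m
          v += a * dt
          x += v * dt
          x_hist[i] = x

      dlf = x_hist.max() / (f0 / k)
      print(f"numerical DLF for a step load ≈ {dlf:.3f} (theoretical value: 2.0)")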

  4. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    Science.gov (United States)

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. PMID:26042437

  5. Multivariate geometry as an approach to algal community analysis

    Science.gov (United States)

    Allen, T.F.H.; Skagen, S.

    1973-01-01

    Multivariate analyses are put in the context of more usual approaches to phycological investigations. The intuitive common-sense involved in methods of ordination, classification and discrimination are emphasised by simple geometric accounts which avoid jargon and matrix algebra. Warnings are given that artifacts result from technique abuses by the naive or over-enthusiastic. An analysis of a simple periphyton data set is presented as an example of the approach. Suggestions are made as to situations in phycological investigations, where the techniques could be appropriate. The discipline is reprimanded for its neglect of the multivariate approach.

  6. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  7. Computerized two-lead resting ECG analysis for the detection of coronary artery stenosis

    Directory of Open Access Journals (Sweden)

    Eberhard Grube, Andreas Bootsveld, Seyrani Yuecel, Joseph T. Shen, Michael Imhoff

    2007-01-01

    Background: Resting electrocardiogram (ECG) shows limited sensitivity and specificity for the detection of coronary artery disease (CAD). Several methods exist to enhance sensitivity and specificity of resting ECG for diagnosis of CAD, but such methods are not better than a specialist's judgement. We compared a new computer-enhanced, resting ECG analysis device, 3DMP, to coronary angiography to evaluate the device's accuracy in detecting hemodynamically relevant CAD. Methods: A convenience sample of 423 patients without prior coronary revascularization was evaluated with 3DMP before coronary angiography. 3DMP's sensitivity and specificity in detecting hemodynamically relevant coronary stenosis as diagnosed with coronary angiography were calculated as well as odds ratios for the 3DMP severity score and coronary artery disease risk factors. Results: 3DMP identified 179 of 201 patients with hemodynamically relevant stenosis (sensitivity 89.1%, specificity 81.1%). The positive and negative predictive values for identification of coronary stenosis as diagnosed in coronary angiograms were 79% and 90% respectively. CAD risk factors in a logistic regression model had markedly lower predictive power for the presence of coronary stenosis in patients than did the 3DMP severity score (odds ratio 3.35 [2.24-5.01] vs. 34.87 [20.00-60.79]). Logistic regression combining severity score with risk factors did not add significantly to the prediction quality (odds ratio 36.73 [20.92-64.51]). Conclusions: 3DMP's computer-based, mathematically derived analysis of resting two-lead ECG data provides detection of hemodynamically relevant CAD with high sensitivity and specificity that appears to be at least as good as those reported for other resting and/or stress ECG methods currently used in clinical practice.
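
    A hedged sketch of the two kinds of statistics quoted above (sensitivity/specificity against the angiography outcome, and odds ratios from logistic regression), run on simulated data rather than the 423-patient cohort.

      # Simulated cohort: a continuous "severity score" and an aggregate risk factor
      # drive a binary angiography outcome; odds ratios and sensitivity/specificity
      # are then computed the usual way.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 400
      severity = rng.normal(0, 1, n)
      risk = rng.normal(0, 1, n)
      p = 1.0 / (1.0 + np.exp(-(1.8 * severity + 0.4 * risk - 0.2)))
      cad = rng.binomial(1, p)                           # 1 = relevant stenosis on angiography

      X = np.column_stack([severity, risk])
      model = LogisticRegression().fit(X, cad)
      print("odds ratios (per 1 SD):", np.round(np.exp(model.coef_[0]), 2))

      pred = model.predict(X)
      tp = np.sum((pred == 1) & (cad == 1)); fn = np.sum((pred == 0) & (cad == 1))
      tn = np.sum((pred == 0) & (cad == 0)); fp = np.sum((pred == 1) & (cad == 0))
      print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")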

  8. The mobility of Atlantic baric depressions leading to intense precipitation over Italy: a preliminary statistical analysis

    Directory of Open Access Journals (Sweden)

    N. Tartaglione

    2006-01-01

    The speed of Atlantic surface depressions that occurred during the autumn and winter seasons and led to intense precipitation over Italy from 1951 to 2000 was investigated. Italy was divided into 5 regions as documented in previous climatological studies (based on Principal Component Analysis). Intense precipitation events were selected on the basis of in situ rain gauge data and clustered according to the region that they hit. For each intense precipitation event we tried to identify an associated surface depression and we tracked it, within a large domain covering the Mediterranean and Atlantic regions, from its formation to cyclolysis in order to estimate its speed. 'Depression speeds' were estimated with 6-h resolution and clustered into slow and non-slow classes by means of a threshold coinciding with the first quartile of the speed distribution, and depression centre speeds were associated with their positions. Slow speeds occurring over an area including Italy and the western Mediterranean basin showed frequencies higher than 25% for all the Italian regions but one. The probability of obtaining by chance the observed more-than-25% success rate was estimated by means of a binomial distribution. The statistical reliability of the result is confirmed for only one region. For Italy as a whole, results were confirmed at the 95% confidence level. Stability of the statistical inference, with respect to errors in estimating depression speed and changes in the threshold for slow depressions, was analysed and essentially confirmed the previous results.
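
    A minimal sketch of the binomial significance check described: the probability of seeing at least the observed number of slow-class depressions if slow speeds occurred with probability 0.25. The event counts below are invented.

      # One-sided binomial P value: probability of at least n_slow slow-class
      # depressions out of n_events if slow speeds occur with probability 0.25.
      # Event counts are invented, not the study's.
      from scipy.stats import binom

      n_events = 40     # intense-precipitation events for one region (assumed)
      n_slow = 16       # events whose depression fell in the slow class (assumed)

      p_value = binom.sf(n_slow - 1, n_events, 0.25)     # P(X >= n_slow) under p = 0.25
      print(f"observed rate = {n_slow / n_events:.2f}, one-sided P = {p_value:.3f}")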

  9. Raman analysis of ferroelectric switching in niobium-doped lead zirconate titanate thin films

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, P. [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile); Ramos-Moore, E., E-mail: evramos@fis.puc.cl [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile); Guitar, M.A. [Functional Materials, Materials Science Department, Saarland University, Saarbrücken D-66123 (Germany); Cabrera, A.L. [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile)

    2014-04-01

    Characteristic Raman vibration modes of niobium-doped lead zirconate titanate (PNZT) are studied as a function of ferroelectric domain switching. The microstructure of PNZT is characterized by scanning electron microscopy and X-ray diffraction. Ferroelectric switching is achieved by applying voltages between the top (Au) and bottom (Pt) electrodes, while acquiring the Raman spectra in situ. Vibrational active modes associated with paraelectric and ferroelectric phases are identified after measuring above and below the ferroelectric Curie temperature, respectively. Changes in the relative intensities of the Raman peaks are observed as a function of the switching voltage. The peak area associated with the ferroelectric modes is analyzed as a function of the applied voltage within one ferroelectric polarization loop, showing local maxima around the coercive voltage. This behavior can be understood in terms of the correlation between vibrational and structural properties, since ferroelectric switching modifies the interaction between the body-centered atom (Zr, Ti or Nb) and the Pb–O lattice. - Highlights: • Electric fields induce structural distortions on ferroelectric perovskites. • Ferroelectric capacitor was fabricated to perform hysteresis loops. • Raman analysis was performed in situ during ferroelectric switching. • Raman modes show hysteresis and inflections around the coercive voltages. • Data can be understood in terms of vibrational–structural correlations.

  10. Raman analysis of ferroelectric switching in niobium-doped lead zirconate titanate thin films

    International Nuclear Information System (INIS)

    Characteristic Raman vibration modes of niobium-doped lead zirconate titanate (PNZT) are studied as a function of ferroelectric domain switching. The microstructure of PNZT is characterized by scanning electron microscopy and X-ray diffraction. Ferroelectric switching is achieved by applying voltages between the top (Au) and bottom (Pt) electrodes, while acquiring the Raman spectra in situ. Vibrational active modes associated with paraelectric and ferroelectric phases are identified after measuring above and below the ferroelectric Curie temperature, respectively. Changes in the relative intensities of the Raman peaks are observed as a function of the switching voltage. The peak area associated with the ferroelectric modes is analyzed as a function of the applied voltage within one ferroelectric polarization loop, showing local maxima around the coercive voltage. This behavior can be understood in terms of the correlation between vibrational and structural properties, since ferroelectric switching modifies the interaction between the body-centered atom (Zr, Ti or Nb) and the Pb–O lattice. - Highlights: • Electric fields induce structural distortions on ferroelectric perovskites. • Ferroelectric capacitor was fabricated to perform hysteresis loops. • Raman analysis was performed in situ during ferroelectric switching. • Raman modes show hysteresis and inflections around the coercive voltages. • Data can be understood in terms of vibrational–structural correlations

  11. Determination of exposure to lead of subjects from southwestern Poland by human hair analysis.

    Science.gov (United States)

    Michalak, Izabela; Wołowiec, Paulina; Chojnacka, Katarzyna

    2014-04-01

    The aim of the present work was to investigate the exposure to lead from various sources by investigation of the mineral composition of human scalp hair. The research was carried out on hair sampled from 267 young adults living in Wrocław (southwest Poland). The effect of the place of residence, diet, and lifestyle on lead content in hair was examined by inductively coupled plasma optical emission spectrometry (ICP-OES). Lead was determined at the wavelength 220.353 nm. These outcomes were reached by linking the results of lead level in hair with the results of a questionnaire survey. The mean lead level in hair of the whole examined population was 2.01 ± 2.10 mg kg(-1). Lead can enter the human body mainly by inhalation and gastrointestinal absorption. It was found that consuming cheese, fish, and lettuce caused an increased level of lead in hair. On the other hand, drinking milk, tea, coffee, or lemon resulted in a decreased content of lead in hair. Additional sources of exposure to lead could be cigarette smoking, proximity to traffic roads, wall painting, and amalgam fillings. Based on the results, it can be concluded that exposure to lead can occur mainly through eating habits and environmental exposure. PMID:24346348

  12. Inventory Modelling for a Manufacturer of Sweets: An Evaluation of an Adjusted Compound Renewal Approach for B-Items With A Relative Short Production Lead Time

    OpenAIRE

    Heuts, R.M.J.; Luijten, M.L.J.

    1999-01-01

    In this paper we are especially interested in how to optimize the production/inventory control for a manufacturer of sweets, under the following circumstances: short production lead times in combination with an intermittent demand pattern for the so-called B-taste items. As a compound renewal approach appeared appropriate to control inventory/production for A-taste items, we formulated and tested an adjusted compound renewal approach for B-taste items, because a certain condition was not satisfi...

  13. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Holding a wide range of phenomena about language with society, culture and thought, discourse analysis contains various approaches: speech act, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its different domain to discourse. For one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation on how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). For other dimensions, each approach holds its distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis - conversation analysis and speech act theory - and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on their strengths and weaknesses in the essence of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  14. Data Warehouse Requirements Analysis Framework: Business-Object Based Approach

    Directory of Open Access Journals (Sweden)

    Anirban Sarkar

    2012-01-01

    Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system. The requirements analysis specifications are used as the prime input for the construction of the conceptual-level multidimensional data model. This paper proposes a Business Object based requirements analysis framework for DW systems which is supported by an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into high-level design components of a graph-semantic based, conceptual-level, object-oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business-process-driven approach and finally refines the requirements in further detail to map them into the conceptual-level DW design model using either a Demand-driven or a Mixed-driven approach for DW requirements analysis

  15. Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY12 Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, Jonathan A.; Anderson, Kevin K.; Casella, Andrew M.; Siciliano, Edward R.; Warren, Glen A.

    2012-09-28

    Executive Summary Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel with an uncertainty considerably lower than the approximately 10% typical of today’s confirmatory methods. This document is a progress report for FY2012 PNNL analysis and algorithm development. Progress made by PNNL in FY2012 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel assemblies. PNNL further refined the semi-empirical model developed in FY2011 based on singular value decomposition (SVD) to numerically account for the effects of self-shielding. The average uncertainty in the Pu mass across the NGSI-64 fuel assemblies was shown to be less than 3% using only six calibration assemblies with a 2% uncertainty in the isotopic masses. When calibrated against the six NGSI-64 fuel assemblies, the algorithm was able to determine the total Pu mass within <2% uncertainty for the 27 diversion cases also developed under NGSI. Two purely empirical algorithms were developed that do not require the use of Pu isotopic fission chambers. The semi-empirical and purely empirical algorithms were successfully tested using MCNPX simulations as well as applied to experimental data measured by RPI using their LSDS. The algorithms were able to describe the 235U masses of the RPI measurements with an average uncertainty of 2.3%. Analyses were conducted that provided valuable insight with regard to design requirements (e
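
    A hedged sketch of an SVD-based calibration of the generic kind the report describes: project spectra onto a few leading singular vectors and regress known Pu masses on the resulting coefficients. This is not PNNL's actual algorithm, and the time spectra below are synthetic.

      # Generic SVD-based calibration on synthetic "time spectra": project onto a
      # few leading right singular vectors and regress known Pu masses on the
      # projection coefficients, then apply the fit to a new spectrum.
      import numpy as np

      rng = np.random.default_rng(4)
      n_cal, n_bins = 6, 200
      pu_mass = rng.uniform(2.0, 6.0, n_cal)             # kg, calibration assemblies (assumed)
      t = np.linspace(0.0, 1.0, n_bins)
      spectra = np.array([m * np.exp(-5 * t) + 0.3 * m**2 * np.exp(-20 * t)
                          + rng.normal(0, 0.01, n_bins) for m in pu_mass])

      U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
      k = 3
      scores = spectra @ Vt[:k].T                        # coefficients on the leading modes
      A = np.column_stack([scores, np.ones(n_cal)])
      beta, *_ = np.linalg.lstsq(A, pu_mass, rcond=None) # calibration fit

      m_true = 4.2                                       # "new" assembly (assumed)
      new = m_true * np.exp(-5 * t) + 0.3 * m_true**2 * np.exp(-20 * t)
      m_est = np.concatenate([new @ Vt[:k].T, [1.0]]) @ beta
      print(f"true Pu mass = {m_true:.2f} kg, estimated = {m_est:.2f} kg")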

  16. A Comparison of Microeconomic and Macroeconomic Approaches to Deforestation Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Felardo

    2016-01-01

    The economics of deforestation has been explored in detail. Generally, the frame of analysis takes either a microeconomics or macroeconomics approach. The microeconomics approach assumes that individual decision makers are responsible for deforestation as a result of utility maximizing behavior and imperfect property right regimes. The macroeconomics approach explores nationwide trends thought to be associated with forest conversion. This paper investigates the relationship between these two approaches by empirically testing the determinants of deforestation using the same data set from Thailand. The theory for both the microeconomics-based and macroeconomics-based approaches is developed and then tested statistically. The models were constructed using established theoretical frames developed in the literature. The results from both models show statistical significance consistent with prior results in the tropical deforestation literature. A comparison of the two approaches demonstrates that the macro approach is useful in identifying relevant aggregate trends in the deforestation process; the micro approach provides the opportunity to isolate factors of those trends which are necessary for effective policy decisions.

  17. An integrated approach to supply chain risk analysis

    OpenAIRE

    Marco, Alberto; Cagliano, Anna Corinna; Rafele, Carlo; Grimaldi, Sabrina

    2012-01-01

    Despite the increasing attention that supply chain risk management is receiving by both researchers and practitioners, companies still lack a risk culture. Moreover, risk management approaches are either too general or require pieces of information not regularly recorded by organisations. This work develops a risk identification and analysis methodology that integrates widely adopted supply chain and risk management tools. In particular, process analysis is performed by means of the standard ...

  18. Thermo-fluid dynamics and corrosion analysis of a self cooled lead lithium blanket for the HiPER reactor

    Science.gov (United States)

    Juárez, R.; Zanzi, C.; Hernández, J.; Sanz, J.

    2015-09-01

    The HiPER reactor is the HiPER project phase devoted to power production. To reach a preliminary reactor design, tritium breeding schemes need to be adapted to the HiPER project technology selection: direct drive ignition, 150 MJ/shot × 10 Hz of power released through fusion reactions, and the dry first wall scheme. In this paper we address the main challenge of the HiPER EUROFER-based self cooled lead lithium blanket, which is related to the corrosive behavior of Pb-15.7Li in contact with EUROFER. We evaluate the cooling and corrosion behavior of the so-called separated first wall blanket (SFWB) configuration by performing thermo-fluid dynamics simulations using a large eddy simulation approach. Despite the expected improvement over the integrated first wall blanket, we still find an unsatisfactory cooling performance, expressed as a low outlet Pb-15.7Li temperature plus too-high corrosion rates derived from locally high Pb-15.7Li temperature and velocity, which can mainly be attributed to the geometry of the channels. Nevertheless, the analysis allowed us to devise future modifications of the SFWB to overcome the limitations found with the present design.

  19. The Existence Of Leading Islands Securing And The Border Areas Unitary State Of Indonesia An Analysis In Law Perspective

    Directory of Open Access Journals (Sweden)

    Nazali

    2015-08-01

    Abstract The research was carried out with the aim of examining the securing of the foremost islands and the state border region of the Republic of Indonesia from a legal perspective, which is directly related to the existence of security and dispute-resolution methods as well as the governance of the foremost islands and the border region in Kalimantan bordering Malaysia. The study was conducted in Nunukan district and the surrounding provinces of Kalimantan. The research method used is normative legal analysis with a juridical and qualitative descriptive approach. The results showed that the securing of the foremost islands and border region, viewed from a legal perspective in accordance with Law No. 34 of 2004 regarding the Indonesian National Army, has not been implemented to the fullest to realize the security of the foremost islands and border region as the front line of the Republic of Indonesia. The securing of the leading islands and the border region of the Republic of Indonesia still contains many weaknesses in terms of both governance and security.

  20. Temperature Distribution Analysis of JAERI 60 kA HTS Lead

    Institute of Scientific and Technical Information of China (English)

    FUYoukun; T.Isono

    2003-01-01

    High temperature superconductor (HTS) current leads have an advantage in reducing the electric power consumption of the refrigerator for a large-current superconducting magnet system such as a fusion device. A fusion device requires more than 20 pairs of large current leads, each with a current capacity of about 60 kA. A conventional 60 kA current lead needs 100 kW of electric power for refrigeration, and a 2/3 reduction is achievable by applying an HTS current lead.

  1. Thermal-hydraulic analysis for the lead-bismuth eutectic cooled reactor. System analysis by MSG-COPD code

    International Nuclear Information System (INIS)

    The feasibility study for fast breeder reactors (FBRs), including related nuclear fuel cycle systems, was started in the 1999 fiscal year by the Japan Nuclear Cycle Development Institute (JNC). Phase 1 studies were finished at the end of March, 2000. Various options for FBR plant systems were studied, and the concept of Lead-Bismuth Eutectic (LBE) cooled FBRs was selected as one of these options. In the United States, the LBE cooled reactor has been examined under Generation IV. Plant dynamics analyses of two types of LBE-cooled reactors, a forced-circulation type designed by JNC and a natural-circulation type designed by the University of California, Berkeley, have been performed to understand the basic thermal-hydraulic characteristics of the reactors. As a result of the analysis of the JNC forced-circulation reactor, it has been clarified that hot coolant remains in the upper plenum due to thermal stratification in the case of a manual trip, and that the pump coast-down characteristics influence the core-exit peak temperature in the case of a loss-of-power condition. In addition, as a result of the analysis of the natural-circulation reactor, the flow-redistribution effect caused by buoyancy in ductless core channels has been evaluated for the candidate core channels. (author)

  2. Evaluation of factors influencing child abuse leading to oro-facial lesions in Isfahan, Iran: A qualitative approach

    OpenAIRE

    Firoozeh Nilchian; Seyed Ebrahim Jabbarifar; Navid Khalighinejad; Leyli Sadri; Alireza Saeidi; Leila Arbab

    2012-01-01

    Background: Since child abuse and neglect are serious conditions which can potentially lead to inappropriate dental health, we conducted this qualitative study to define the factors influencing child abuse and neglect, which lead to oro-facial lesions. Materials and Methods: Qualitative semi-structured interviews were conducted by social services employees. Purposive sampling was used to recruit participants to capture a range of experiences such as the physical abuse, sexual abuse, role ...

  3. Causal Comparative Analysis: Comprehensive Literacy Approach or the Traditional Reading Approach

    Science.gov (United States)

    Fuda, Jessica Ann

    2009-01-01

    A comparative analysis study examining the difference in reading achievement between students in the Comprehensive Literacy Program and students in the Traditional Basal Reading Approach was conducted. Implementation of the Comprehensive Literacy Program was an effort to lessen the achievement gap between proficient and low-progressing students.…

  4. Modeling Vocabulary Loss——Approach leading to a comprehensive analysis of vocabulary attrition?

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Ⅰ. Introduction The article the author has chosen, entitled Modeling Vocabulary Loss (Applied Linguistics, 2004), was written by Prof. Paul Meara of the University of Wales Swansea. The reason it has been chosen here is definitely not because of the tentative move

  5. Why does electron sharing lead to covalent bonding? A variational analysis.

    Science.gov (United States)

    Ruedenberg, Klaus; Schmidt, Michael W

    2007-01-15

    Ground state energy differences between related systems can be elucidated by a comparative variational analysis of the energy functional, in which the concepts of variational kinetic pressure and variational electrostatic potential pull are found useful. This approach is applied to the formation of the bond in the hydrogen molecule ion. A highly accurate wavefunction is shown to be the superposition of two quasiatomic orbitals, each of which consists to 94% of the respective atomic 1s orbital, the remaining 6% deformation being 73% spherical and 27% nonspherical in character. The spherical deformation can be recovered to 99.9% by scaling the 1s orbital. These results quantify the conceptual metamorphosis of the free-atom wavefunction into the molecular wavefunction by orbital sharing, orbital contraction, and orbital polarization. Starting with the 1s orbital on one atom as the initial trial function, the value of the energy functional of the molecule at the equilibrium distance is stepwise lowered along several sequences of wavefunction modifications, whose energies monotonically decrease to the ground state energy of H2+. The contributions of sharing, contraction and polarization to the overall lowering of the energy functional and their kinetic and potential components exhibit a consistent pattern that can be related to the wavefunction changes on the basis of physical reasoning, including the virial theorem. It is found that orbital sharing lowers the variational kinetic energy pressure and that this is the essential cause of covalent bonding in this molecule. PMID:17143869
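
    The "orbital contraction by scaling" argument rests on the textbook scaling relation sketched below (not a reproduction of the paper's own derivation): for a scaled trial function the kinetic energy scales quadratically and the Coulomb potential energy linearly in the scale factor, and the variational minimum automatically satisfies the virial theorem.

      % Standard scaling relations for a scaled trial function
      % \psi_\alpha(\mathbf{r}) = \alpha^{3/2}\,\psi(\alpha\mathbf{r}):
      \begin{align}
        E(\alpha) &= \alpha^{2}\,T[\psi] + \alpha\,V[\psi],\\
        \left.\frac{dE}{d\alpha}\right|_{\alpha^{*}} = 0
          \;&\Longrightarrow\; \alpha^{*} = -\frac{V[\psi]}{2\,T[\psi]},\\
        2\,T(\alpha^{*}) &= -V(\alpha^{*})
          \quad\text{(virial theorem at the variational minimum)}.
      \end{align}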

  6. A Morphogenetic Design Approach with Embedded Structural Analysis

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Kirkegaard, Poul Henning; Holst, Malene Kirstine

    The present paper explores a morphogenetic design approach with embedded structural analysis for architectural design. A material system based on a combined space truss and membrane system has been derived as a growth system with inspiration from natural growth of plants. The structural system is...

  7. Practical approach on gas pipeline compression system availability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Sidney Pereira dos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Kurz, Rainer; Lubomirsky, Matvey [Solar Turbines, San Diego, CA (United States)

    2009-12-19

    Gas pipeline projects traditionally have been designed based on load factor and steady state flow. This approach exposes project sponsors to project sustainability risks due to potential losses of revenue and transportation contract penalties related to pipeline capacity shortage as a consequence of compressor unit unavailability. Such unavailability should be quantified beforehand, during the design phase. This paper presents a case study and a methodology that highlight the practical benefits of applying Monte Carlo simulation to compression system availability analysis in conjunction with quantitative risk analysis and an economic feasibility study. The project's main economic variables and their impacts on the project NPV (Net Present Value) are evaluated with their respective statistical distributions to quantify risk and support decision makers in adopting mitigating measures to guarantee competitiveness while protecting project sponsors from otherwise unpredictable risks. This practical approach is compared to the load factor approach and the results are presented and evaluated. (author)

  8. A divergent synthetic approach to diverse molecular scaffolds: assessment of lead-likeness using LLAMA, an open-access computational tool.

    Science.gov (United States)

    Colomer, Ignacio; Empson, Christopher J; Craven, Philip; Owen, Zachary; Doveston, Richard G; Churcher, Ian; Marsden, Stephen P; Nelson, Adam

    2016-06-01

    Complementary cyclisation reactions of hex-2-ene-1,6-diamine derivatives were exploited in the synthesis of alternative molecular scaffolds. The value of the synthetic approach was analysed using LLAMA, an open-access computational tool for assessing the lead-likeness and novelty of molecular scaffolds. PMID:27145833

  9. An analysis of lead (Pb) from human hair samples (20-40 years of age) by atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Lead was analysed in human hair samples from five different groups, namely scavengers from Payatas, Quezon City, tricycle drivers, car shop workers, paint factory workers, and students from the Polytechnic University of the Philippines. People from Nagcarlan, Laguna served as a 'baseline value' or control group. The method applied was acid digestion using HNO3 and HClO4, after which the samples were analysed with an atomic absorption spectrophotometer. In terms of lead found in hair, the scavengers from Payatas, Q.C. showed the highest exposure to lead among the groups tested. The results of the lead concentration analysis are expressed in mg/L. (Authors)

  10. Vapor cooled lead and stacks thermal performance and design analysis by finite difference techniques

    International Nuclear Information System (INIS)

    Investigation of the combined thermal performance of the stacks and vapor-cooled leads for the Mirror Fusion Test Facility-B (MFTF-B) demonstrates considerable interdependency. For instance, the heat transfer to the vapor-cooled lead (VCL) from warm bus heaters, environmental enclosure, and stack is a significant additional heat load to the joule heating in the leads, proportionately higher for the lower current leads that have fewer current-carrying, counter-flow coolant copper tubes. Consequently, the specific coolant flow (G/sec-kA-lead pair) increases as the lead current decreases. The definition of this interdependency and the definition of necessary thermal management have required an integrated thermal model for the entire stack/VCL assemblies. Computer simulations based on finite difference thermal analyses computed all the heat interchanges of the six different stack/VCL configurations. These computer simulations verified that the heat load of the stacks beneficially alters the lead temperature profile to provide added stability against thermal runaway. Significant energy is transferred through low density foam filler in the stack from warm ambient sources to the vapor-cooled leads
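
    A minimal one-dimensional finite-difference sketch of a vapor-cooled lead: steady conduction along the conductor with Joule heating and a lumped distributed cooling term standing in for the counter-flow vapor. Constant properties and all numerical values are assumptions, not the MFTF-B design data.

      # Steady 1D heat balance per unit length, solved by Jacobi iteration:
      #   k*A*(T[i-1] - 2*T[i] + T[i+1])/dx^2 + I^2*rho/A - h*(T[i] - T_gas[i]) = 0
      # All properties constant and all numbers illustrative.
      import numpy as np

      L, n = 1.0, 101                    # lead length (m) and grid points
      dx = L / (n - 1)
      area = 5.0e-4                      # conductor cross-section (m^2), assumed
      k_cu = 400.0                       # thermal conductivity (W/m K), assumed constant
      rho = 2.0e-9                       # electrical resistivity (ohm m), assumed constant
      current = 10.0e3                   # lead current (A), assumed
      h = 200.0                          # lumped vapor-cooling coefficient (W/m K), assumed

      T = np.linspace(4.5, 300.0, n)     # boundaries: 4.5 K cold end, 300 K warm end
      T_gas = np.linspace(4.5, 250.0, n) # prescribed coolant-vapor temperature profile (assumed)
      joule = current**2 * rho / area    # Joule heating per metre of lead (W/m)

      for _ in range(20000):             # Jacobi sweeps; end temperatures stay fixed
          T[1:-1] = (k_cu * area * (T[:-2] + T[2:]) / dx**2 + joule + h * T_gas[1:-1]) \
                    / (2.0 * k_cu * area / dx**2 + h)

      q_cold = k_cu * area * (T[1] - T[0]) / dx
      print(f"heat conducted into the cold end ≈ {q_cold:.1f} W")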

  11. An Exploration of Students' Motivation to Lead: An Analysis by Race, Gender, and Student Leadership Behaviors

    Science.gov (United States)

    Rosch, David M.; Collier, Daniel; Thompson, Sara E.

    2015-01-01

    This exploratory study examined the motivation to lead of a random sample of 1,338 undergraduate students to determine the degree to which motivation to lead can predict leadership behaviors. Results suggested that students' internal self-identity as a leader positively predicted behavior, while their "social normative" motivation to…

  12. Overview of the use of ATHENA for thermal-hydraulic analysis of systems with lead-bismuth coolant

    International Nuclear Information System (INIS)

    The INEEL and MIT are investigating the suitability of lead-bismuth cooled fast reactors for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The ATHENA code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors. Two modifications were made to the code as a result of this review. Specifically, a correlation to represent heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work that is being performed with the code and plans for future analytical work

  13. Low-level lead exposure and children's IQ: A meta-analysis and search for a threshold

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, J. (Harvard School of Public Health, Boston, MA (United States))

    1994-04-01

    To assess the strength of the association between blood lead and children's IQ, a meta-analysis of the studies examining the relationship in school age children was performed. Emphasis was given to the size of the effect, since that allows comparisons that are informative about potential confounding and effect modifiers. Sensitivity analyses were also performed. A highly significant association was found between lead exposure and children's IQ (P < 0.001). An increase in blood lead from 10 to 20 µg/dl was associated with a decrease of 2.6 IQ points in the meta-analysis. This result was robust to inclusion or exclusion of the strongest individual studies and to relaxing the age requirements (school age children) of the meta-analysis. Adding eight studies with effect estimates of 0 would still leave a significant association with blood lead (P < 0.01). There was no evidence that the effect was limited to disadvantaged children and there was a suggestion of the opposite. The studies with mean blood lead levels of 15 µg/dl or lower in their sample had higher estimated blood lead slopes, suggesting that a threshold at 10 µg/dl is implausible. The study with the lowest mean blood lead level was examined using nonparametric smoothing. It showed no evidence of a threshold down to blood lead concentrations of 1 µg/dl. Lead interferes with GABAergic and dopaminergic neurotransmission. It has been shown to bind to the NMDA receptor and inhibit long-term potentiation in the hippocampal region of the brain. Moreover, experimental studies have demonstrated that blood levels of 10 µg/dl interfere with a broad range of cognitive function in primates. Given this support, these associations in humans should be considered causal. 32 refs., 4 figs., 1 tab.
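
    A minimal fixed-effect meta-analysis sketch of the pooling step: per-study blood-lead slopes combined by inverse-variance weighting. The slopes and standard errors below are invented placeholders, not the reviewed studies' values.

      # Fixed-effect pooling of per-study slopes (IQ points per 10 ug/dl increase in
      # blood lead) by inverse-variance weighting.  Slopes and standard errors are
      # invented placeholders.
      import numpy as np
      from scipy import stats

      slope = np.array([-2.0, -3.1, -1.5, -2.8, -2.4])   # per-study slopes (assumed)
      se = np.array([1.0, 1.2, 0.9, 1.5, 1.1])           # per-study standard errors (assumed)

      w = 1.0 / se**2
      pooled = np.sum(w * slope) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      p = 2.0 * stats.norm.sf(abs(pooled / pooled_se))
      print(f"pooled slope = {pooled:.2f} ± {1.96 * pooled_se:.2f} IQ points, P = {p:.4f}")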

  14. A functional genomics approach using metabolomics and in silico pathway analysis

    DEFF Research Database (Denmark)

    Förster, Jochen; Gombert, Andreas Karoly; Nielsen, Jens

    2002-01-01

    In the field of functional genomics increasing effort is being undertaken to analyze the function of orphan genes using metabolome data. Improved analytical equipment allows screening simultaneously for a high number of metabolites. Such metabolite profiles are analyzed using multivariate data analysis techniques and changes in the genotype will in many cases lead to different metabolite profiles. Here, a theoretical framework that may be applied to identify the function of orphan genes is presented. The approach is based on a combination of metabolome analysis combined with in silico pathway analysis. Pathway analysis may be carried out using convex analysis and a change in the active pathway structure of deletion mutants expressed in a different metabolite profile may disclose the function or the functional class of an orphan gene. The concept is illustrated using a simplified model for...
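
    A hedged sketch of the linear-algebra core of such pathway analysis: steady-state flux distributions lie in the null space of the stoichiometric matrix (S v = 0). Full convex analysis (e.g. elementary flux modes) additionally imposes irreversibility constraints; only the null-space step is shown, on an invented toy network.

      # Steady-state flux vectors v satisfy S v = 0 for the stoichiometric matrix S;
      # the null space gives the admissible flux modes (irreversibility constraints,
      # needed for full elementary-mode analysis, are not imposed here).
      import numpy as np
      from scipy.linalg import null_space

      # Toy network: rows = internal metabolites A, B, C; columns = reactions r1..r5 (assumed)
      S = np.array([
          [ 1, -1,  0, -1,  0],   # A: produced by r1, consumed by r2 and r4
          [ 0,  1, -1,  0,  0],   # B: produced by r2, consumed by r3
          [ 0,  0,  0,  1, -1],   # C: produced by r4, consumed by r5
      ])

      N = null_space(S)
      print("null-space dimension:", N.shape[1])
      print(np.round(N, 3))       # basis of steady-state flux distributions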

  15. Analysis of sexual behaviour in male rabbits across successive tests leading to sexual exhaustion

    Directory of Open Access Journals (Sweden)

    Pedro Jimenez

    2012-04-01

    Various parameters of sexual behaviour were studied in ten male rabbits daily tested with sexually receptive females (ovariectomized, given estradiol benzoate s.c. 5 µg/day). The aim of this study was to analyse rabbit sexual behaviour during successive tests leading to sexual exhaustion. We allowed copulation ad libitum and determined if sexual satiety was reached within a day and sexual exhaustion across several days. The pair was allowed to copulate freely until the male failed to show sexual interest in that female for 30 minutes. The female was then removed and replaced by another; this procedure was repeated using as many does as needed, until the male showed no interest in any female for 2 hours. Scent-marking (chinning) was also recorded, before and after the copulation test. This whole procedure was repeated daily until the male showed no sexual behaviour at all on a given day. Within a test, copulation ad libitum led to a gradual increase in the time interval between successive mounts and ejaculations, regardless of the day of testing. Such increments predicted that the buck was reaching sexual satiety. The “miss” rate (i.e., the proportion of mounts that did not culminate in ejaculation) significantly increased from a median of 25 on the first day to 55 on the last day of testing. The mean time to reach copulatory inactivity decreased from 4 hrs on the first day to 1 hr on the last day. The total number of ejaculations within a test decreased from an average of 22 to 6 (first vs last day, respectively) and the number of chin marks was reduced by 69% compared with pre-mating values, regardless of the day of testing. All bucks eventually stopped copulating after a variable number of days (range = 2-15 days). We concluded that, following copulation ad libitum with several females, male rabbits reach sexual satiety (i.e., they are unable to continue copulating on the same day) and, after several days, they also attain

  16. Estimation of bioaccumulation of lead in the aquatic plants using 14 MeV neutron activation analysis

    International Nuclear Information System (INIS)

    Three aquatic plants, water hyacinth, Hydrilla and Pithophora, were exposed to different concentrations of lead, and the accumulation of lead in these plants over different exposure periods was studied using the 14 MeV neutron activation analysis technique (with a flux of approximately 2×10⁸ n cm⁻² s⁻¹). The lead uptake in these plants was estimated by measuring the gamma activity due to 207mPb (T = 0.8 sec) produced by 14 MeV neutrons. The possibility of using these plants for waste water treatment is discussed. (author)

  17. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  18. Analysis of the stability of native oxide films at liquid lead/metal interfaces

    International Nuclear Information System (INIS)

    The interface between liquid lead and different metallic solids (pure metals: Al, Fe and Ni, and T91 steel) was investigated below 400 deg C under ultrahigh vacuum (UHV) by wetting experiments. The aim was to check the physical stability of native oxide films grown at the surface of the substrates, along a contact with liquid lead. Two types of metallic substrates were used: i) conventional bulk polycrystals, and ii) nanocrystalline films obtained by e-beam evaporation under UHV. The actual contact between liquid lead and the solid substrates was achieved by preparing lead drops in-situ. Wetting experiments were performed using sessile drop and/or liquid bridge methods. Fresh solid surfaces and former liquid/solid interfaces can be explored by squeezing and stretching a liquid lead bridge formed between two parallel and horizontal substrates. It is shown that the contact with liquid lead produces the detachment of the native oxide films grown on the metallic solids. It is concluded that if oxide coatings are needed to protect a metallic solid from attack by liquid lead, they should be self-renewable. (authors)

  19. Determination of exposure to lead of subjects from southwestern Poland by human hair analysis

    OpenAIRE

    Michalak, Izabela; Wołowiec, Paulina; Chojnacka, Katarzyna

    2013-01-01

    The aim of the present work was to investigate the exposure to lead from various sources by investigation of mineral composition of human scalp hair. The research was carried out on hair sampled from 267 young adults living in Wrocław (southwest Poland). The effect of the place of residence, diet, and lifestyle on lead content in hair was examined by inductively coupled plasma optical emission spectrometry (ICP-OES). Lead was determined at the wavelength 220.353 nm. These outcomes were reache...

  20. Solid state NMR as a new approach for the structural characterization of rare-earth doped lead lanthanum zirconate titanate laser ceramics

    International Nuclear Information System (INIS)

    To facilitate the design of laser host materials with optimized emission properties, detailed structural information at the atomic level is essential, regarding the local bonding environment of the active ions (distribution over distinct lattice sites) and their extent of local clustering as well as their population distribution over separate micro- or nano-phases. The present study explores the potential of solid state NMR spectroscopy to provide such understanding for rare-earth doped lead lanthanum zirconate titanate (PLZT) ceramics. As the NMR signals of the paramagnetic dopant species cannot be observed directly, two complementary approaches are utilized: (1) direct observation of diamagnetic mimics using 45Sc NMR and (2) study of the paramagnetic interaction of the constituent host lattice nuclei with the rare-earth dopant, using 207Pb NMR lineshape analysis. 45Sc MAS NMR spectra of scandium-doped PLZT samples unambiguously reveal scandium to be six-coordinated, suggesting that this rare-earth ion substitutes in the B site. Static 207Pb spin echo NMR spectra of a series of Tm-doped PLZT samples reveal a clear influence of paramagnetic rare-earth dopant concentration on the NMR lineshape. In the latter case high-fidelity spectra can be obtained by spin echo mapping under systematic incrementing of the excitation frequency, benefiting from the signal-to-noise enhancement afforded by spin echo train Fourier transforms. Consistent with XRD data, the 207Pb NMR lineshape analysis suggests that statistical incorporation into the PLZT lattice occurs at dopant levels of up to 1 wt.% Tm3+, while at higher levels the solubility limit is reached. (author)

  1. A new approach for the analysis of functionally graded beams

    Directory of Open Access Journals (Sweden)

    M. Mirzababaee

    2006-04-01

    Full Text Available Purpose: It is the intention of the present study to develop a new beam theory for the analysis of functionally graded composite beams, in order to overcome the shortcomings present in the existing beam theories. Design/methodology/approach: Within the displacement field of a first-order shear deformation theory and by using the Hamilton principle, the governing equations of motion are obtained for both the new and the existing beam theories. The beams are assumed to have an isotropic, two-constituent material distribution through the thickness. Findings: It is found that the procedure used is simple and straightforward, and similar to the one used in the development of shear deformation plate and shell theories. It is analytically shown that the new approach yields identical results to those obtained by using the existing first-order shear deformation theory. Research limitations/implications: The new approach can be adopted in developing higher-order shear deformation and layerwise theories. It is believed that the new approach has advantages with respect to the existing beam theories, especially for developing beam layerwise theories. Practical implications: The new shear deformation beam theory can be used to develop a new beam element for the analysis of practical composite beam structures. Originality/value: The paper introduces an approach to develop a new theory for modeling composite beams. The resulting equations of motion may be solved analytically or by using the finite element method.

  2. An Efficient Soft Set-Based Approach for Conflict Analysis

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations and military operations. Many mathematical formal models have been proposed to handle conflict situations, and one of the most popular is rough set theory, which has been used successfully because of its ability to handle the vagueness of the conflict data set. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we elaborate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory. PMID:26928627
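
    A soft set can be viewed as a map from parameters to the sets of objects that satisfy them, and the co-occurrence of two parameters is then simply the overlap of those sets. The sketch below illustrates that underlying set operation on an invented voting-style example; it is not the authors' algorithm or data, and the `conflict_degree` normalisation is our own simplification.

```python
# A toy soft set over six "agents": each parameter (an issue voted for)
# maps to the set of agents approving it. Invented data for illustration.
soft_set = {
    "issue_a": {"u1", "u2", "u3"},
    "issue_b": {"u2", "u3", "u5"},
    "issue_c": {"u4", "u6"},
}

def cooccurrence(param_x, param_y, soft):
    """Number of objects in which both parameters occur together."""
    return len(soft[param_x] & soft[param_y])

def conflict_degree(param_x, param_y, soft, universe):
    """Fraction of the universe supporting both parameters (0 = disjoint camps)."""
    return cooccurrence(param_x, param_y, soft) / len(universe)

universe = set().union(*soft_set.values())
for x in soft_set:
    for y in soft_set:
        if x < y:
            print(x, y, "co-occurrence:", cooccurrence(x, y, soft_set),
                  "degree:", round(conflict_degree(x, y, soft_set, universe), 2))
```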

  4. Holistic approach to analysis of medical data: vulvar cancer.

    Science.gov (United States)

    Buković, D; Rudan, I; Ivanisević, M; Sostarić, S; Rubala, D

    1997-06-01

    This paper continues a series of studies introducing a holistic approach to the analysis of clinical data. Besides the information regarding his/her disease, each hospitalized cancer patient also provides a variety of data regarding his/her psychological, cultural, social, economic, genetic, constitutional and medical background. The aim of this study was to introduce a holistic approach to the analysis of medical data, in this case clinical data regarding cancer of the vulva. Such an approach requires the collection of data regarding different aspects of the cancer patients and, once a satisfactory sample size is obtained (which should be at least five times greater than the number of examined patient characteristics), the performance of factor analysis. In this study, the authors processed data on 25 characteristics of all 755 vulvar cancer patients treated between 1938 and 1990 at the Department for Gynecological Oncology of the University Hospital for Gynecology and Obstetrics, Zagreb, Croatia. In the factor analysis, the principal components were rotated after the initial extraction (the authors recommend the use of oblimin rotation) in order to obtain a better basis for interpretation of the results. The next step in this approach was the stepwise exclusion of the characteristics with the smallest communality according to the Kaiser-Meyer-Olkin criteria, retaining the characteristics and components with the most significant impact on the explained system variance. When the number of principal components and of initially analyzed characteristics was reduced to 3-4 and 7-10, respectively, the ultimate interpretations and conclusions were made. This approach outlined some clusters of correlations between medical data which are difficult to identify using other statistical procedures, primarily the impacts of various socioeconomic and hereditary-constitutional variables on overall survival. PMID:9225511

  5. Lead Pipe Scale Analysis Using Broad-Beam Argon Ion Milling to Elucidate Drinking Water Corrosion

    Science.gov (United States)

    Herein, we compared the characterization of lead pipe scale removed from a drinking water distribution system using two different cross section methods (conventional polishing and argon ion beam etching). The pipe scale solids were analyzed using scanning electron microscopy (SEM...

  6. Microscopic and electrochemical characterization of lead film electrode applied in adsorptive stripping analysis

    International Nuclear Information System (INIS)

    Lead film electrodes (PbFEs) deposited in situ on glassy carbon or carbon paste supports have recently found application in adsorptive stripping voltammetric determination of inorganic ions and organic substances. In this work, the PbFE, prepared in ammonia buffer solutions, was investigated using scanning electron microscopy, atomic force microscopy and various voltammetric techniques. The microscopic images of the lead films deposited on the glassy carbon substrate showed a considerable variability in microstructure and compactness of the deposited layer depending on the selected experimental conditions, such as the concentration of Pb(II) species, the nucleation and deposition potential, and the time applied. The catalytic adsorptive systems of cobalt and nickel in a solution containing 0.1 ammonia buffer, 2.5 × 10⁻⁵ M nioxime and 0.25 M NaNO2 were employed to investigate the electrochemical characteristics and utility of the in situ prepared lead films. The optimal parameters, i.e. the lead concentration in the solution, the procedure of film removal, and the time and potential of lead nucleation and film deposition for the adsorptive determination of metal traces, were selected, resulting in the very good reproducibility (RSD = 4.2% for 35 scans) of recorded signals. The voltammetric utility of the lead film electrode was compared to that of glassy carbon, mercury film and bismuth film electrodes, and was subsequently evaluated as superior.

  7. Lead in atmospheric precipitation: Analysis of atmospheric precipitation pollution monitoring data for location “Kamenički vis”, Serbia

    Directory of Open Access Journals (Sweden)

    Ćosović Aleksandar R.

    2013-01-01

    Full Text Available In this paper an overview is given of the data collected during the monitoring of lead content in atmospheric precipitation at the GAW/EMEP (Global Atmosphere Watch/European Monitoring and Evaluation Programme) station “Kamenicki Vis”, Serbia, from 2000 to 2010. Annual arithmetic mean concentrations, weighted arithmetic mean concentrations, and medians of weekly samples are presented. The data obtained were compared with the results of the analysis of atmospheric precipitation collected at the experimental EMEP station “Zeleno brdo”, Serbia, and discussed in the scope of European average levels of lead content in precipitation and air. A significant increase of the average annual lead content in precipitation was observed in 2003 and 2007. The observed peaks cannot be seen in the average European trends, which leads to the conclusion that the recorded increases are characteristic of the local region. In order to further discuss the nature and direction of possible sources of the detected lead pollution, a short analysis of lead emission data was performed. An effort was made to gather data from the countries lying in the directions from which the dominant winds blow, as well as for Serbia. For this purpose, total national emissions from the LRTAP (Long-range Transboundary Air Pollution) Convention emission inventory report and the EMEP emission inventory were used, as well as data published by the relevant national authorities. According to these emission levels, the majority of the surrounding countries could not have contributed much to the recorded increases of lead content in precipitation. However, several possible sources were revealed. In all the studied countries emission levels dropped steadily during the analyzed period, whereas only for Serbia was a different trend observed. The presented data lead to the conclusion that the recorded increase of lead content in precipitation in 2003 probably originates from trans-boundary contributions, while the increases in 2007 and onwards may come from Serbia’s own emissions. [Projekat Ministarstva nauke

  8. A New Approach to Pointer Analysis for Assignments

    Institute of Scientific and Technical Information of China (English)

    HUANG Bo; ZANG Binyu; LI Jing; ZHU Chuanqi

    2001-01-01

    Pointer analysis is a technique to identify at compile-time the potential values of the pointer expressions in a program, which promises significant benefits for optimizing and parallelizing compilers. In this paper, a new approach to pointer analysis for assignments is presented. In this approach, assignments are classified into three categories: pointer assignments, structure (union) assignments and normal assignments, which do not affect the points-to information. Pointer analyses for these three kinds of assignments respectively make up the integrated algorithm. When analyzing a pointer assignment, a new method called expression expansion is used to calculate both the left targets and the right targets. The integration of recursive data structure analysis into pointer analysis is a significant originality of this paper, which unifies the pointer analysis for heap variables and the pointer analysis for stack variables. This algorithm is implemented in Agassiz, an analyzing tool for C programs developed by the Institute of Parallel Processing, Fudan University. Its accuracy and effectiveness are illustrated by experimental data.
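
    The paper's expression-expansion method and assignment classification are not reproduced here. As a generic illustration of what a pointer analysis computes for the pointer-assignment category, the sketch below runs a minimal inclusion-based (Andersen-style) points-to solver on an invented constraint set; it is a stand-in, not the Agassiz algorithm.

```python
from collections import defaultdict

# Toy C fragment (pointer assignments only):
#   p = &a;  q = &b;  r = &p;  p = q;  *r = &c;
addr_of = [("p", "a"), ("q", "b"), ("r", "p")]   # x = &y
copies  = [("p", "q")]                            # x = y
stores  = [("r", "c")]                            # *x = &y

pts = defaultdict(set)            # points-to sets
for x, y in addr_of:              # base facts
    pts[x].add(y)

changed = True
while changed:                    # fixpoint iteration over inclusion constraints
    changed = False
    # x = y  :  pts(x) must include pts(y)
    for x, y in copies:
        before = len(pts[x])
        pts[x] |= pts[y]
        changed |= len(pts[x]) != before
    # *x = &y :  for every v that x may point to, pts(v) must include {y}
    for x, y in stores:
        for v in list(pts[x]):
            before = len(pts[v])
            pts[v].add(y)
            changed |= len(pts[v]) != before

for var in sorted(pts):
    print(var, "->", sorted(pts[var]))
```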

  9. Leading coordinate analysis of reaction pathways in proton chain transfer: Application to a two-proton transfer model for the green fluorescent protein

    International Nuclear Information System (INIS)

    The 'leading coordinate' approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information

  10. Analysing the process leading to cooperation or refusal using call record data: A multilevel multinomial modelling approach

    OpenAIRE

    D'Arrigo, Julia; Gabriele B. Durrant; STEELE, FIONA

    2011-01-01

    In recent years, survey agencies have started to collect detailed call record data, including information on the timing and outcome of each interviewer call to a household. In interview-based household surveys, effective interviewer calling behaviours are critical in achieving cooperation and reducing the likelihood of refusal. This paper aims to analyze interviewer call record data to inform the process leading to cooperation or refusal in face-to-face surveys. Of particular interest are the ...

  11. Discordant diagnoses obtained by different approaches in antithrombin mutation analysis

    DEFF Research Database (Denmark)

    Feddersen, Søren; Nybo, Mads

    OBJECTIVES: In hereditary antithrombin (AT) deficiency it is important to determine the underlying mutation since the future risk of thromboembolism varies considerably between mutations. DNA investigations are in general thought of as flawless and irrevocable, but the diagnostic approach can be critical. We therefore investigated mutation results in the AT gene, SERPINC1, with two different approaches. DESIGN AND METHODS: Sixteen patients referred to the Centre for Thrombosis and Haemostasis, Odense University Hospital, with biochemical indications of AT deficiency, but with a negative denaturing high-performance liquid chromatography (DHPLC) mutation screening (routine approach until recently) were included. As an alternative mutation analysis, direct sequencing of all exons and exon-intron boundaries without pre-selection by DHPLC was performed. RESULTS: Out of sixteen patients with a

  12. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Full Text Available Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, their convergence is difficult to analyze. In this paper, a computational model is built up for a class of computational intelligence approaches represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, namely the variation rate and the progress rate, are then defined to indicate the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
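
    To make the "canonical form" concrete, the sketch below is a plain particle swarm optimizer minimising a sphere function; the inertia and acceleration constants are common textbook choices and the example does not reproduce the paper's convergence indices or proofs.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return np.sum(x * x, axis=-1)

n_particles, dim, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration constants

x = rng.uniform(-5, 5, (n_particles, dim))   # positions
v = np.zeros_like(x)                          # velocities
pbest = x.copy()                              # personal bests
pbest_val = sphere(x)
gbest = pbest[np.argmin(pbest_val)]           # global best

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    val = sphere(x)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best value found:", sphere(gbest))
```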

  13. Theorizing the Process of Coping with Sexual Disorders Leading to Marital Conflicts based on Grounded Theory Approach

    Directory of Open Access Journals (Sweden)

    M.Alikhani*

    2014-12-01

    Full Text Available The present study aimed to theorize the process of coping with sexual disorders leading to marital conflicts. The process of coping with sexual disorders leading to marital conflicts was examined in 12 couples based on grounded theory. The focus of the study was on the period from the onset of symptoms up to the start of treatment. Data were collected through semi-structured interviews and were analyzed through constant comparison. Problem-solving skill was recognized as the main variable in the process of coping with sexual disorders leading to marital conflicts. The main variable consisted of two levels, ‘single-couple’ and ‘interactional’, and five main categories: recognizing sexual disorder symptoms, personal assessment, self-attempt, feeling of threat, and consulting with others, which ultimately led to seeking help, consultation, and treatment. The preliminary individual decision to decrease the symptoms resulted in self-treatment, which consequently deferred the start of treatment. Age, gender, education level, socio-economic status and prior knowledge of the disorders affected the time at which people made their decisions. Individuals with sexual disorders defer the start of treatment, and this can bring a family to separation. Couples should take pre-marriage counseling sessions in order to make the decision for treatment at the right time when faced with sexual disorders.

  14. Efficient Multidisciplinary Analysis Procedure Using Multi-Level Parallelization Approach

    Science.gov (United States)

    Byun, Chansup; Hatay, Ferhat; Farhangnia, Mehrdad; Guruswamy, Guru; VanDalsem, William R. (Technical Monitor)

    1997-01-01

    implement the proposed approach. The communication data structure required for the proposed approach will be studied in detail. This work will demonstrate the feasibility of using multi-level parallelization approach in multidisciplinary analysis applications.

  15. Applications of Crown Ether Cross-Linked Chitosan for the Analysis of Lead and Cadmium in Environmental Water Samples

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A new type of crown ether cross-linked chitosan was synthesized by the reaction of chitosan with 4,4'-dibromodibenzo-18-crown-6 (Br-DBC). Its structure was characterized by FT-IR and NMR, and its adsorption behaviour towards lead and cadmium in environmental water samples, determined by FAAS, was studied. In addition, the optimal analysis conditions were discussed and the adsorption mechanism was explained. With an enrichment factor above 100, recoveries of 94%-106% for both elements, detection limits of 0.5 μg·L⁻¹ for lead and 0.04 μg·L⁻¹ for cadmium, and relative standard deviations of 3.1% and 2.8%, respectively, the new method was successfully applied to the determination of lead and cadmium in environmental water samples. The method is fast and simple, and it greatly enhances the capability of FAAS for the determination of lead and cadmium.

  16. Externalities and energy policy: the life cycle analysis approach

    International Nuclear Information System (INIS)

    Getting the prices right is a prerequisite for energy market mechanisms to work effectively towards the development of sustainable energy mixes. External costs of energy have been recognised and assessed in many studies, and the life cycle analysis (LCA) approach provides a conceptual framework for a detailed and comprehensive, comparative evaluation of alternative technology options. Despite this, results from analytical work on externalities and LCA studies are seldom used in policy making. The International Energy Agency (IEA) and the OECD Nuclear Energy Agency (NEA) organised a workshop on 'Externalities and Energy Policy: The Life Cycle Analysis Approach' to bring together policy makers and experts from governmental agencies and the industry to discuss key issues regarding the role and limitations of external cost evaluations and LCA results. The presentations and discussions reported in these proceedings will be of interest to senior analysts, policy makers and other stakeholders concerned with the sustainable development of the energy sector. (author)

  17. Accounting for Errors in Model Analysis Theory: A Numerical Approach

    Science.gov (United States)

    Sommer, Steven R.; Lindell, Rebecca S.

    2004-09-01

    By studying the patterns of a group of individuals' responses to a series of multiple-choice questions, researchers can utilize Model Analysis Theory to create a probability distribution of mental models for a student population. The eigenanalysis of this distribution yields information about which mental models the students possess, as well as how consistently they utilize said mental models. Although the theory considers the probabilistic distribution to be fundamental, there exist opportunities for random errors to occur. In this paper we will discuss a numerical approach for mathematically accounting for these random errors. As an example of this methodology, an analysis of data obtained from the Lunar Phases Concept Inventory will be presented. Limitations and applicability of this numerical approach will be discussed.
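
    In the usual formulation of Model Analysis, each student's responses are turned into a normalized model-state vector and the class "density matrix" built from those vectors is eigenanalyzed. The sketch below shows that construction on invented response counts; the paper's treatment of random errors is not reproduced, this is only the baseline quantity such errors would perturb.

```python
import numpy as np

# Invented data: each row gives, for one student, how many of 10 questions
# were answered using model 1, model 2, or a residual "other" model.
counts = np.array([
    [7, 2, 1],
    [5, 4, 1],
    [2, 7, 1],
    [8, 1, 1],
    [3, 5, 2],
])
m = counts.sum(axis=1, keepdims=True)          # questions per student

# Per-student model-state vectors (amplitudes are square roots of fractions).
u = np.sqrt(counts / m)

# Class density matrix and its eigenanalysis.
D = (u[:, :, None] * u[:, None, :]).mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(D)

# Largest eigenvalue indicates how consistently the class sits in its dominant model mix.
order = np.argsort(eigvals)[::-1]
print("eigenvalues:", np.round(eigvals[order], 3))
print("dominant model state:", np.round(eigvecs[:, order[0]], 3))
```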

  18. Computer vision approaches to medical image analysis. Revised papers

    International Nuclear Information System (INIS)

    This book constitutes the thoroughly refereed post-proceedings of the international workshop Computer Vision Approaches to Medical Image Analysis, CVAMIA 2006, held in Graz, Austria, in May 2006 as a satellite event of the 9th European Conference on Computer Vision, ECCV 2006. The 10 revised full papers and 11 revised poster papers presented together with 1 invited talk were carefully reviewed and selected from 38 submissions. The papers are organized in topical sections on clinical applications, image registration, image segmentation and analysis, and the poster session. (orig.)

  19. Systematic approach to analysis of lympho- and hemopoietic postradiation defects

    International Nuclear Information System (INIS)

    Human intrathymic precursors are characterized by a mitogenic reaction to thymic hormone, the absence of cortical and medullary thymic membrane markers, and the expression of common T-cell markers. The 1.062 g/cm3 fraction of peripheral white blood cells contains cells with analogous characteristics. The cell fraction of blood from men who participated in the clean-up (amelioration) work after the Chernobyl accident was analysed. Defects of T-cell and hemopoietic differentiation in children who migrated from the Chernobyl area to the Moscow Region were more substantial than in the 'ameliorators' and correlated with a drop in the level of thymic serum activity. The research substantiates the approach to the analysis of radiation-induced defects in the early stages of lympho- and hemopoiesis.

  20. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    Science.gov (United States)

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasonic image processing and analysis for the detection of disease symptoms on processed images. First, the paper presents a new method of filtering gallbladder contours from USG images. A major stage in this filtration is to segment and section off the areas occupied by the organ. In most cases this procedure is based on filtration, which plays a key role in the process of diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyse owing to the echogenic inconsistency of the structures under observation. This paper provides an inventive algorithm for the holistic extraction of gallbladder image contours. The algorithm is based on rank filtration, as well as on the analysis of histogram sections of the tested organs. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms used to analyze the object of such diagnosis and to verify the occurrence of symptoms related to a given affection. Usually the final stage is to make a diagnosis based on the detected symptoms. This last stage can be carried out either through dedicated expert systems or through a more classical pattern analysis approach, such as using rules to determine the illness based on the detected symptoms. This paper discusses pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ. PMID:19124224

  1. Spectral Synthesis via Mean Field approach to Independent Component Analysis

    International Nuclear Information System (INIS)

    We apply a new statistical analysis technique, the Mean Field approach to Independent Component Analysis (MF-ICA) in a Bayesian framework, to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few Independent Components (ICs), and the galaxy spectrum can be reconstructed by these ICs. Compared to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different methods are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) consistency test of parameters derived with galaxies from the Sloan Digital Sky Survey. We find that our MF-ICA method can not only fit the observed galaxy spectra efficiently, but can also accurately recover the physical parameters of galaxies. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find it can provide excellent fitting results for low signal-to-noise spectra. (paper)
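
    MF-ICA itself is not available in common libraries, so the sketch below uses scikit-learn's standard FastICA as a stand-in to show the compress-and-reconstruct idea on synthetic spectra; the library, component shapes and sizes are all invented for illustration and do not represent the paper's pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic "stellar library": 200 spectra of 500 wavelength bins, built as
# random positive mixtures of 4 hidden emission-like components plus noise.
wave = np.linspace(0.0, 1.0, 500)
components = np.vstack([
    np.exp(-0.5 * ((wave - c) / 0.05) ** 2) + 0.2   # Gaussian bumps on a floor
    for c in (0.2, 0.4, 0.6, 0.8)
])
weights = rng.random((200, 4))
library = weights @ components + 0.01 * rng.normal(size=(200, 500))

# Compress the library into a few independent components (ICs).
ica = FastICA(n_components=4, random_state=0, max_iter=1000)
mixing_coeffs = ica.fit_transform(library)      # per-spectrum IC amplitudes
ics = ica.components_                            # the ICs themselves

# Reconstruct one spectrum from the ICs and check the residual.
recon = ica.inverse_transform(mixing_coeffs)
residual = np.abs(recon[0] - library[0]).max()
print(f"{ics.shape[0]} ICs, max reconstruction residual = {residual:.4f}")
```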

  2. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is towards quantitative analysis, which attempts to characterize in detail the defect examined and must be designed for a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is raising the requirements on the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the divergence of the illumination and other geometrical factors. The difference between the measurements can be associated with these error factors. (Author)

  3. Spectral Synthesis via Mean Field approach to Independent Component Analysis

    Science.gov (United States)

    Hu, Ning; Su, Shan-Shan; Kong, Xu

    2016-03-01

    We apply a new statistical analysis technique, the Mean Field approach to Independent Component Analysis (MF-ICA) in a Bayesian framework, to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few Independent Components (ICs), and the galaxy spectrum can be reconstructed by these ICs. Compared to other algorithms which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different methods are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) consistency test of parameters derived with galaxies from the Sloan Digital Sky Survey. We find that our MF-ICA method can not only fit the observed galaxy spectra efficiently, but can also accurately recover the physical parameters of galaxies. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find it can provide excellent fitting results for low signal-to-noise spectra.

  4. A Safety Analysis Approach to Clinical Workflows: Application and Evaluation

    Directory of Open Access Journals (Sweden)

    Lamis Al-Qora’n

    2014-11-01

    Full Text Available Clinical workflows are safety critical workflows as they have the potential to cause harm or death to patients. Their safety needs to be considered as early as possible in the development process. Effective safety analysis methods are required to ensure the safety of these high-risk workflows, because errors that may happen through routine workflow could propagate within the workflow to result in harmful failures of the system’s output. This paper shows how to apply an approach for safety analysis of clinical workflows to analyse the safety of the workflow within a radiology department and evaluates the approach in terms of usability and benefits. The outcomes of using this approach include identification of the root causes of hazardous workflow failures that may put patients’ lives at risk. We show that the approach is applicable to this area of healthcare and is able to present added value through the detailed information on possible failures, of both their causes and effects; therefore, it has the potential to improve the safety of radiology and other clinical workflows.

  5. Alternative Approaches for Rating INDCs: a Comparative Analysis

    OpenAIRE

    Davide, Marinella; Vesco, Paola

    2016-01-01

    The “Intended nationally determined contributions” (INDCs) communicated by both developing and developed countries represent a crucial element of the Paris agreement. This paper aims at analysing the INDCs submitted by Parties, through the different tools and approaches proposed by the research community. In particular, our analysis looks at the different ways to assess the effectiveness of the proposed emission reduction pledges, both in terms of aggregate and national efforts. However, we a...

  6. The threat nets approach to information system security risk analysis

    OpenAIRE

    Mirembe, Drake

    2015-01-01

    The growing demand for healthcare services is motivating hospitals to strengthen outpatient case management using information systems in order to serve more patients using the available resources. Though the use of information systems in outpatient case management raises patient data security concerns, it was established that the current approaches to information systems risk analysis do not provide logical recipes for quantifying threat impact and determining the cost-effectiveness of risk m...

  7. A DST-based approach for construction project risk analysis

    OpenAIRE

    Taroun, A; J-B Yang

    2013-01-01

    Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was innovatively used to tackle the problem of lacking sufficient information through enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk im...
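
    DST's machinery is not spelled out in the abstract; as a reminder of the core operation involved, the sketch below implements Dempster's rule of combination for two mass functions over a small frame of risk levels. The expert assessments are invented and the risk-cost scaling described in the paper is not reproduced.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for mass functions with frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Frame of discernment: possible risk levels for one project risk factor.
LOW, MED, HIGH = "low", "medium", "high"

# Two (invented) expert assessments expressed as basic probability assignments;
# incomplete assessments appear as mass on sets of more than one level.
expert1 = {frozenset({HIGH}): 0.6, frozenset({MED, HIGH}): 0.3,
           frozenset({LOW, MED, HIGH}): 0.1}
expert2 = {frozenset({MED}): 0.5, frozenset({MED, HIGH}): 0.4,
           frozenset({LOW, MED, HIGH}): 0.1}

for focal, mass in sorted(combine(expert1, expert2).items(), key=lambda kv: -kv[1]):
    print(set(focal), round(mass, 3))
```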

  8. ProCAT: a data analysis approach for protein microarrays

    OpenAIRE

    Zhu, Xiaowei; Gerstein, Mark; Snyder, Michael

    2006-01-01

    Protein microarrays provide a versatile method for the analysis of many protein biochemical activities. Existing DNA microarray analytical methods do not translate to protein microarrays due to differences between the technologies. Here we report a new approach, ProCAT, which corrects for background bias and spatial artifacts, identifies significant signals, filters nonspecific spots, and normalizes the resulting signal to protein abundance. ProCAT provides a powerful and flexible new approac...

  9. Probabilistic data flow analysis: a linear equational approach

    OpenAIRE

    Alessandra Di Pierro; Herbert Wiklicky

    2013-01-01

    Speculative optimisation relies on the estimation of the probabilities that certain properties of the control flow are fulfilled. Concrete or estimated branch probabilities can be used for searching and constructing advantageous speculative and bookkeeping transformations. We present a probabilistic extension of the classical equational approach to data-flow analysis that can be used to this purpose. More precisely, we show how the probabilistic information introduced in a control flow graph ...
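
    The linear-equational style can be made concrete on a tiny control-flow graph: branch probabilities enter a matrix and the quantity of interest solves a linear system. The graph, probabilities and property (expected node visit counts) below are invented; this is a generic sketch of the equational formulation, not the authors' semantics.

```python
import numpy as np

# Tiny CFG: 0=entry, 1=loop head, 2=loop body, 3=exit. Edge probabilities are
# invented branch estimates; the exit node has no successors.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],   # entry -> loop head
    [0.0, 0.0, 0.8, 0.2],   # loop head -> body (0.8) or exit (0.2)
    [0.0, 1.0, 0.0, 0.0],   # body -> back to loop head
    [0.0, 0.0, 0.0, 0.0],   # exit
])

# Expected node visit counts n satisfy  n = e + P^T n, with e the entry vector,
# so they are obtained by solving the linear system (I - P^T) n = e.
e = np.array([1.0, 0.0, 0.0, 0.0])
n = np.linalg.solve(np.eye(4) - P.T, e)
print("expected visits per node:", np.round(n, 2))
```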

  10. A Key Event Path Analysis Approach for Integrated Systems

    OpenAIRE

    Jingjing Liao

    2012-01-01

    By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules concluded from integrated system architecture descriptions, the corresponding PESGs are constructed from the colored Petri Net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Whereafter based on the statistic results after the simulation of CPN m...

  11. Inventory management for the health sector: ABC analysis approach

    OpenAIRE

    Nabais, Joana Isabel Baptista

    2010-01-01

    A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics This project aims to analyse a hospital’s inventory management and make suggestions to improve its practices, with special attention on ABC analysis as an optimization tool for the inventory management, control and storage. Other cost reductions approaches are studied in order to contribute for the accurate management of clinical consumption...
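
    ABC analysis ranks items by annual consumption value and assigns classes by cumulative share of that value. A minimal sketch is shown below; the 80/15/5 thresholds are the conventional ones and the hospital item data are invented, not taken from the project.

```python
# Invented hospital inventory: (item, annual usage, unit cost).
items = [
    ("gloves",      120000, 0.08),
    ("syringes",     80000, 0.15),
    ("iv_sets",      20000, 1.90),
    ("sutures",       6000, 4.50),
    ("catheters",     3000, 12.0),
    ("stents",          90, 900.0),
]

# Annual consumption value per item, sorted in descending order.
valued = sorted(((name, qty * cost) for name, qty, cost in items),
                key=lambda x: -x[1])
total = sum(v for _, v in valued)

cumulative = 0.0
for name, value in valued:
    cumulative += value
    share = cumulative / total
    klass = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
    print(f"{name:10s} value={value:10.2f} cum={share:6.1%} class {klass}")
```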

  12. An Ultrasound Image Despeckling Approach Based on Principle Component Analysis

    OpenAIRE

    Jawad F. Al-Asad; Ali M. Reza; Udomchai Techavipoo

    2014-01-01

    An approach based on principle component analysis (PCA) to filter out multiplicative noise from ultrasound images is presented in this paper. An image with speckle noise is segmented into small dyadic lengths, depending on the original size of the image, and the global covariance matrix is found. A projection matrix is then formed by selecting the maximum eigenvectors of the global covariance matrix. This projection matrix is used to filter speckle noise by projecting each segment into the si...

  13. MULTI-OBJECTIVE APPROACH TO THE ANALYSIS OF BUSINESS RISKS

    OpenAIRE

    Bashkatova V. S.; Bashkatov V. V.

    2015-01-01

    Business risk is an integral part of the economic activities of any organization. Currently, a timely risk assessment of entrepreneurial activity, aimed at preventing further negative consequences, is a compulsory element and one of the main criteria of the normal and stable operation of an economic entity. In this article, we present a multi-criteria approach to the analysis of business risks. The basic signs of increasing entrepreneurial risk are formulated. On the example of the organizat...

  14. Basic analysis of sugar cane lead and cane fields of an AIC

    International Nuclear Information System (INIS)

    The concentrations of minor and trace elements in sugar cane leaf and soil samples from a Cuban sugar factory were determined by means of thermal reactor neutron activation analysis (NAA) and X-ray fluorescence analysis (XRFA). The samples were taken according to the methodology of the Sugar Ministry for leaf and soil analysis. The concentrations of 28 elements were determined. The concentration values obtained by NAA, XRFA and previous analyses are compared.

  15. Simulation analysis of minimum bending radius for lead frame copper alloys

    OpenAIRE

    Su, Juanhua; Shuguo, Jia; Fengzhang, Ren

    2013-01-01

    Copper alloys have many excellent properties, which makes them important lead frame materials for integrated circuits. The minimum bending radius of three different copper alloys (Cu-Fe-P, Cu-Ni-Si, Cu-Cr-Sn-Zn) used as lead frame materials was analyzed using the finite element method. Tensile tests of the three materials were performed to obtain the yield stress, ultimate strength and other parameters. The strain-hardening exponent n and normal anisotropy index r of the materials were ob...

  16. Generating function approach to reliability analysis of structural systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The generating function approach is an important tool for performance assessment of multi-state systems. Aiming at the strength reliability analysis of structural systems, the generating function approach is introduced and developed here. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which are used to describe the probability distributions of strength (resistance), stress (load) and fatigue life, and by defining composition operators of generating functions and the corresponding performance structure functions. When composition operators are executed, computational costs can be reduced by a large margin by collecting like terms. The results of theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures, owing to its unified form, compact expression, ease of computer implementation and high generality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and act as a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
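
    The composition step described above, combining generating functions through an operator and collecting like terms, can be illustrated with a small universal-generating-function example. The strength and load distributions are invented and the min/comparison operators are a simple stand-in for the paper's performance structure functions.

```python
from collections import defaultdict

def compose(u1, u2, op):
    """Compose two generating functions {performance: probability} with operator `op`,
    collecting like terms so the result stays compact."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Invented discrete distributions (kN): strengths of two series elements and the load.
strength_1 = {40: 0.2, 50: 0.5, 60: 0.3}
strength_2 = {45: 0.4, 55: 0.6}
load       = {30: 0.7, 48: 0.2, 65: 0.1}

# Series system: the weakest element governs the system strength.
system_strength = compose(strength_1, strength_2, min)

# Reliability = P(strength >= load), obtained with a comparison operator.
survival = compose(system_strength, load, lambda s, l: s >= l)
print("system strength distribution:", system_strength)
print("reliability:", round(survival.get(True, 0.0), 4))
```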

  17. LEADING WITH LEADING INDICATORS

    International Nuclear Information System (INIS)

    This paper documents Fluor Hanford's use of Leading Indicators, management leadership, and statistical methodology in order to improve safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto Charts, and Systems Thinking and their effect on management decisions and employee involvement are discussed. Included are practical examples of choosing leading indicators. A statistically based color coded dashboard presentation system methodology is provided. These tools, management theories and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and environmental protection and restoration at one of the nation's largest nuclear cleanup sites

  18. LEADING WITH LEADING INDICATORS

    Energy Technology Data Exchange (ETDEWEB)

    PREVETTE, S.S.

    2005-01-27

    This paper documents Fluor Hanford's use of Leading Indicators, management leadership, and statistical methodology in order to improve safe performance of work. By applying these methods, Fluor Hanford achieved a significant reduction in injury rates in 2003 and 2004, and the improvement continues today. The integration of data, leadership, and teamwork pays off with improved safety performance and credibility with the customer. The use of Statistical Process Control, Pareto Charts, and Systems Thinking and their effect on management decisions and employee involvement are discussed. Included are practical examples of choosing leading indicators. A statistically based color coded dashboard presentation system methodology is provided. These tools, management theories and methods, coupled with involved leadership and employee efforts, directly led to significant improvements in worker safety and health, and environmental protection and restoration at one of the nation's largest nuclear cleanup sites.

  19. The analysis of modifying effect of gold, silver and iron citrates on embryotoxicity of lead acetate in experiment

    Directory of Open Access Journals (Sweden)

    Shatornaya V.F.

    2014-03-01

    Full Text Available Background. Metals and their nanoforms are widely used in modern medicine and veterinary practice, as antimicrobial bandages and films in surgery (silver) and as agents for the targeted delivery of medicines in oncology (gold). At the same time their impact on embryogenesis and the reproductive system is still poorly understood. Objective. The purpose of this experimental work was to investigate the possible modifying effect of iron, gold and silver citrates on the toxicity of low doses of lead acetate to the reproductive function and embryogenesis of rats. Methods. 40 female rats were subdivided into 5 groups: 1st – administration of lead acetate; 2nd – lead acetate + gold citrate; 3rd – lead acetate + silver citrate; 4th – lead acetate + iron citrate; 5th – control. Solutions were administered to pregnant rats through a catheter once a day, daily from the 1st to the 19th day of pregnancy. Results. Introduction of ultra-low doses of lead acetate to pregnant female rats caused embryotoxicity; it resulted in a significant decrease in the number of live fetuses (17%) and of corpora lutea in the ovaries. The combined administration of low doses of lead acetate + metal citrates resulted in an increased number of corpora lutea of pregnancy and percentage of live fetuses, due to a decrease in overall and pre-implantation embryonic mortality compared with the 1st experimental group, at almost the same fetal weight. Conclusion. The results of the experiment have shown that the administration of gold, iron and silver citrates in combination with lead acetate prevents the negative impact of the latter on the reproductive system and on the processes of embryonic development. Citation: Shatornaya VF, Kaplunenko VG, Chekman IS, Garets VI, Beletskaya EN, Nefedova EA, Onul NM. [The analysis of modifying effect of gold, silver and iron citrates on embryotoxicity of lead acetate in experiment]. Morphologia. 2014;8(1):99-103. Russian.

  20. ANALYSIS OF LEAD IN CANDLE PARTICULATE EMISSIONS BY XRF USING UNIQUANT 4

    Science.gov (United States)

    As part of an extensive program to study the small combustion sources of indoor fine particulate matter (PM), candles with lead-core wicks were burned in a 46-L glass flow- through chamber. The particulate emissions with aerodynamic diameters <10 micrometers (PM10) were captured ...

  1. ANALYSIS AND IMPROVEMENT OF LEAD TIME FOR JOB SHOP UNDER MIXED PRODUCTION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    CHE Jianguo; HE Zhen; EDWARB M Knod

    2006-01-01

    Firstly an overview of the potential impact on work-in-process (WIP) and lead time is provided when transfer lot sizes are undifferentiated from processing lot sizes. Simple performance examples are compared to those from a shop with one-piece transfer lots. Next, a mathematical programming model for minimizing lead time in the mixed-model job shop is presented, in which one-piece transfer lots are used. Key factors affecting lead time are found by analyzing the sum of the longest setup time of individual items among the shared processes (SLST) and the longest processing time of individual items among processes (LPT). And lead time can be minimized by cutting down the SLST and LPT. Reduction of the SLST is described as a traveling salesman problem (TSP), and the minimum of the SLST is solved through job shop scheduling. Removing the bottleneck and leveling the production line optimize the LPT. If the number of items produced is small, the routings are relatively short, and items and facilities are changed infrequently, the optimal schedule will remain valid. Finally a brief example serves to illustrate the method.

  2. Method of analysis for the determination of lead and cadmium in fresh meat

    NARCIS (Netherlands)

    Ruig, de W.G.

    1980-01-01

    This report comprises the results obtained by the RIKILT in an intercomparison on the determination of lead and cadmium in bovine liver and bovine kidney. The aim of this round robin was to check a wet ashing procedure followed by a flame AAS determination, as described in EEC doc. 2266/VI/77. Special att

  3. [Medical doctors driving technological innovation: questions about and innovation management approaches to incentive structures for lead users].

    Science.gov (United States)

    Bohnet-Joschko, Sabine; Kientzler, Fionn

    2010-01-01

    Management science defines user-generated innovations as open innovation and lead user innovation. The medical technology industry finds user-generated innovations profitable and even indispensable. Innovative medical doctors as lead users need medical technology innovations in order to improve patient care. Their motivation to innovate is mostly intrinsic. But innovations may also involve extrinsic motivators such as gain in reputation or monetary incentives. Medical doctors' innovative activities often take place in hospitals and are thus embedded into the hospital's organisational setting. Hospitals find it difficult to gain short-term profits from in-house generated innovations and sometimes hesitate to support them. Strategic investment in medical doctors' innovative activities may be profitable for hospitals in the long run if innovations provide first-mover competitive advantages. Industry co-operations with innovative medical doctors offer chances but also bear potential risks. Innovative ideas generated by expert users may result in even higher complexity of medical devices; this could cause mistakes when applied by less specialised users and thus affect patient safety. Innovations that yield benefits for patients, medical doctors, hospitals and the medical technology industry can be advanced by offering adequate support for knowledge transfer and co-operation models. PMID:21147434

  4. Structural insights on identification of potential lead compounds targeting WbpP in Vibrio vulnificus through structure-based approaches.

    Science.gov (United States)

    Sasikala, Dakshinamurthy; Jeyakanthan, Jeyaraman; Srinivasan, Pappu

    2016-10-01

    WbpP, encoding a UDP-GlcNAc C4 epimerase, is responsible for the activation of a virulence factor in the marine pathogen Vibrio vulnificus (V. vulnificus) and is linked to many aquatic diseases, thus making it a potential therapeutic target. A few compounds, including several natural products and synthetic compounds, have been reported to target Vibrio sp., but specific inhibitors targeting WbpP are unavailable. Here, we performed structure-based virtual screening using chemical libraries such as Binding, TOSLab and Maybridge to identify small-molecule inhibitors of WbpP with better drug-like properties. The lack of structural information required the structure to be modelled, and a stable protein structure was obtained through 30 ns of MD simulation. Druggable regions were targeted for new lead compounds, and our screening protocol provides fast docking of the entire small-molecule library with screening criteria of ADME/Lipinski filter/docking, followed by re-docking of the top hits using a method that incorporates both ligand and protein flexibility. The docking conformations of the lead molecules display strong H-bond interactions with the key residues Gly101, Ser102, Val195, Tyr165, Arg298, Val209, Ser142, Arg233 and Gln200. Subsequently, the top-ranking compounds were prioritized using molecular dynamics simulation-based conformation and stability studies. Our study suggests that the proposed compounds may serve as a starting point for the rational design of novel therapeutic agents. PMID:26795501

  5. Engineering approach for medium modeling in piping dynamic analysis

    International Nuclear Information System (INIS)

    Two approaches to the problem of dynamic interaction between pipe and medium are compared in this paper: 1) the first treats the medium as mass rigidly connected to the nodes of the pipe finite-element model; 2) in the second, the medium is modeled by a finite-element system of rod elements. In this case the basic fluid-structure interaction (FSI) effects are taken into account. The main techniques for FE modeling of some pipeline elements are presented in the paper. The second approach can be implemented with general-purpose FE programs. A model of a feed water pipeline of a VVER-440 type NPP has been developed to study how FSI affects the pipeline response. The results of the analyses, which allow estimation of the inaccuracy arising from neglecting the medium dynamics, are as follows: 1. calculation of eigenfrequencies and mode shapes; 2. seismic analysis using the response-spectrum method; 3. accidental blast impact assessment with the use of time-history analysis; 4. operating vibration assessment on the basis of harmonic analysis. It has become apparent that the way the medium is modeled has an essential influence on the dynamic behavior of pipelines. (author)

  6. A graphical vector autoregressive modelling approach to the analysis of electronic diary data

    Directory of Open Access Journals (Sweden)

    Zipfel Stephan

    2010-04-01

    Full Text Available Abstract Background In recent years, electronic diaries are increasingly used in medical research and practice to investigate patients' processes and fluctuations in symptoms over time. To model dynamic dependence structures and feedback mechanisms between symptom-relevant variables, a multivariate time series method has to be applied. Methods We propose to analyse the temporal interrelationships among the variables by a structural modelling approach based on graphical vector autoregressive (VAR models. We give a comprehensive description of the underlying concepts and explain how the dependence structure can be recovered from electronic diary data by a search over suitable constrained (graphical VAR models. Results The graphical VAR approach is applied to the electronic diary data of 35 obese patients with and without binge eating disorder (BED. The dynamic relationships for the two subgroups between eating behaviour, depression, anxiety and eating control are visualized in two path diagrams. Results show that the two subgroups of obese patients with and without BED are distinguishable by the temporal patterns which influence their respective eating behaviours. Conclusion The use of the graphical VAR approach for the analysis of electronic diary data leads to a deeper insight into patient's dynamics and dependence structures. An increasing use of this modelling approach could lead to a better understanding of complex psychological and physiological mechanisms in different areas of medical care and research.
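
    The graphical model search itself is not reproduced here. As the kind of building block such an analysis starts from, the sketch below fits an ordinary vector autoregression with statsmodels to a simulated diary-like series; the variable names, lag structure and noise levels are invented for illustration only.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(42)

# Simulated daily diary series for one patient: eating control, depression, anxiety.
T = 120
data = np.zeros((T, 3))
for t in range(1, T):
    # simple lag-1 dependencies plus noise, purely for illustration
    data[t, 0] = 0.5 * data[t - 1, 0] - 0.3 * data[t - 1, 1] + rng.normal(scale=0.5)
    data[t, 1] = 0.6 * data[t - 1, 1] + 0.2 * data[t - 1, 2] + rng.normal(scale=0.5)
    data[t, 2] = 0.4 * data[t - 1, 2] + rng.normal(scale=0.5)

diary = pd.DataFrame(data, columns=["eating_control", "depression", "anxiety"])

# Fit a VAR, letting an information criterion choose the lag order.
results = VAR(diary).fit(maxlags=3, ic="aic")

print("selected lag order:", results.k_ar)
print(results.params.round(2))   # lagged coefficients: candidate edges of a path diagram
```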

  7. Experience with a General Gamma-Ray Isotopic Analysis Approach

    Energy Technology Data Exchange (ETDEWEB)

    Ruhter, W D

    2003-06-18

    The gamma-ray data analysis methodology originally developed for the MGA code to determine the relative detection efficiency curve may also be used to determine the relative amounts of the isotopes being measured. This analysis approach is based on the fact that the intensity of any given gamma ray from a sample is determined by the amount of the emitting isotope present in the sample, the emission probability for the gamma ray being measured, the sample self-attenuation, the attenuation due to absorbers between the sample and detector, and the detector efficiency. An equation can be written that describes a measured gamma-ray peak intensity in terms of these parameters. By selecting appropriate gamma-ray peaks from the isotopes of interest, we can solve a set of equations for the values of the parameters in any particular measurement, including the relative amounts of the selected isotopes. The equations representing the peak intensities are very nonlinear and require an iterative least squares method to solve. We have developed software to ensure that during the iterative process the parameters stay within their appropriate ranges and converge properly in solving the set of equations under various measurement conditions. We have utilized and reported on this approach to determine the plutonium isotopic abundances in samples enriched in Pu-238 and the U-235 enrichment of uranium samples in thick-walled containers. Recently, we have used this approach to determine the plutonium isotopic abundances of plutonium samples in thick-walled containers. We will report on this most recent application, and how this general approach can be adapted quickly to any isotopic analysis problem.
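
    The bounded, iterative least-squares step described above can be sketched generically with SciPy. The peak model below (relative amount times emission probability times a two-parameter relative-efficiency term) is a simplified stand-in used only for illustration, not the MGA parameterization, and every number in it is invented.

```python
# Generic sketch: solve nonlinear peak-intensity equations for relative isotope
# amounts with bounded iterative least squares. All values are invented.
import numpy as np
from scipy.optimize import least_squares

energies = np.array([129.3, 148.6, 208.0, 413.7])           # keV, illustrative peaks
emission_prob = np.array([6.3e-5, 1.9e-3, 5.2e-4, 1.5e-5])  # illustrative
measured = np.array([1.2e3, 3.4e4, 9.8e3, 2.1e2])           # counts, illustrative
peak_isotope = np.array([0, 1, 1, 0])                       # which isotope emits each peak (assumed)

def residuals(params):
    amounts = params[:2]                                    # relative isotope amounts
    c0, c1 = params[2:]                                     # relative-efficiency curve terms
    rel_eff = np.exp(c0 + c1 * np.log(energies))            # simplistic efficiency model
    predicted = amounts[peak_isotope] * emission_prob * rel_eff
    return predicted - measured

fit = least_squares(residuals, x0=[1.0, 1.0, 10.0, -1.0],
                    bounds=([0.0, 0.0, -np.inf, -np.inf], np.inf))
print(fit.x[:2])   # fitted relative amounts, kept within their physical (non-negative) range
```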

  8. Integrative transcriptome analysis reveals dysregulation of canonical cancer molecular pathways in placenta leading to preeclampsia

    OpenAIRE

    Moslehi, Roxana; Mills, James L; Signore, Caroline; Kumar, Anil; Ambroggio, Xavier; Dzutsev, Amiran

    2013-01-01

    We previously suggested links between specific XPD mutations in the fetal genome and the risk of placental maldevelopment and preeclampsia, possibly due to impairment of Transcription Factor (TF)IIH-mediated functions in placenta. To identify the underlying mechanisms, we conducted the current integrative analysis of several relevant transcriptome data sources. Our meta-analysis revealed downregulation of TFIIH subunits in preeclamptic placentas. Our overall integrative analysis suggested tha...

  9. Surface analysis and depth profiling of corrosion products formed in lead pipes used to supply low alkalinity drinking water.

    Science.gov (United States)

    Davidson, C M; Peters, N J; Britton, A; Brady, L; Gardiner, P H E; Lewis, B D

    2004-01-01

    Modern analytical techniques have been applied to investigate the nature of lead pipe corrosion products formed in pH adjusted, orthophosphate-treated, low alkalinity water, under supply conditions. Depth profiling and surface analysis have been carried out on pipe samples obtained from the water distribution system in Glasgow, Scotland, UK. X-ray diffraction spectrometry identified basic lead carbonate, lead oxide and lead phosphate as the principal components. Scanning electron microscopy/energy-dispersive x-ray spectrometry revealed the crystalline structure within the corrosion product and also showed that spatial correlations existed between calcium, iron, lead, oxygen and phosphorus. Elemental profiling, conducted by means of secondary ion mass spectrometry (SIMS) and secondary neutrals mass spectrometry (SNMS), indicated that the corrosion product was not uniform with depth. However, no clear stratification was apparent. Indeed, counts obtained for carbonate, phosphate and oxide were well correlated within the depth range probed by SIMS. SNMS showed that relationships existed between carbon, calcium, iron, and phosphorus within the bulk of the scale, as well as at the surface. SIMS imaging confirmed the relationship between calcium and lead and suggested there might also be an association between chloride and phosphorus. PMID:14982163

  10. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France); Bruyère, D. [BRGM, Service Métrologie, Monitoring et Analyse, 3 av. C. Guillemin, B.P 36009, 45060 Orléans Cedex (France); Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Laperche, V.; Michel, K. [BRGM, Service Métrologie, Monitoring et Analyse, 3 av. C. Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux.fr [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France)

    2014-07-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed retrieving the relative amounts of silicate, calcareous and ore matrices in soils. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. Then a series of artificial neural networks were applied to quantify lead in soil samples. More precisely, two models were designed for classification purposes according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed reaching a relative error of prediction close to 20%, considered as satisfying in the case of on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples.
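
    The classify-then-quantify scheme described here can be sketched with scikit-learn multilayer perceptrons; the feature matrix below merely stands in for LIBS spectral intensities, the class labels and concentrations are simulated, and no attempt is made to reproduce the published models.

```python
# Sketch of the two-stage ANN scheme: classify the sample first, then apply a
# local quantitative model. All data below are simulated placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier, MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 50))                  # stand-in for LIBS spectral features
matrix_class = rng.integers(0, 3, size=300)     # silicate / calcareous / ore (assumed labels)
pb_ppm = rng.uniform(10, 2000, size=300)        # placeholder Pb concentrations

classifier = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
classifier.fit(X, matrix_class)

# One local quantitative model per matrix class
regressors = {c: MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
                 .fit(X[matrix_class == c], pb_ppm[matrix_class == c])
              for c in np.unique(matrix_class)}

new_spectra = X[:5]
predicted_class = classifier.predict(new_spectra)
predicted_pb = [regressors[c].predict(x.reshape(1, -1))[0]
                for c, x in zip(predicted_class, new_spectra)]
print(predicted_pb)
```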

  11. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed retrieving the relative amounts of silicate, calcareous and ore matrices in soils. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. Then a series of artificial neural networks were applied to quantify lead in soil samples. More precisely, two models were designed for classification purposes according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed reaching a relative error of prediction close to 20%, considered as satisfying in the case of on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples

  12. Sensitivity analysis approach to multibody systems described by natural coordinates

    Science.gov (United States)

    Li, Xiufeng; Wang, Yabin

    2014-03-01

    The classical natural coordinate modeling method, which removes the Euler angles and Euler parameters from the governing equations, is particularly suitable for the sensitivity analysis and optimization of multibody systems. However, the formulation involves so many rules for choosing the generalized coordinates that it hinders the implementation of modeling automation. A first order direct sensitivity analysis approach to multibody systems formulated with novel natural coordinates is presented. Firstly, a new selection method for natural coordinates is developed. The method introduces 12 coordinates to describe the position and orientation of a spatial object. On the basis of the proposed natural coordinates, rigid constraint conditions, the basic constraint elements as well as the initial conditions for the governing equations are derived. Considering the characteristics of the governing equations, the newly proposed generalized-α integration method is used and the corresponding algorithm flowchart is discussed. The objective function, the detailed analysis process of first order direct sensitivity analysis and the related solving strategy are provided based on the previous modeling system. Finally, in order to verify the validity and accuracy of the method presented, the sensitivity analyses of a planar spinner-slider mechanism and a spatial crank-slider mechanism are conducted. The test results agree well with those of the finite difference method, and the maximum absolute deviation of the results is less than 3%. The proposed approach is not only convenient for automatic modeling, but also helpful for reducing the complexity of sensitivity analysis, which provides a practical and effective way to obtain sensitivities for the optimization problems of multibody systems.

  13. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
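
    A Sobol-type global sensitivity workflow of this kind can be sketched with the SALib package; the parameter names below are placeholders for the CPM parameters, and the toy function stands in for the image-producing morphogenesis simulation.

```python
# Sketch of a Sobol global sensitivity analysis. The toy function stands in for
# the morphogenesis model; parameter names and bounds are placeholders.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["adhesion", "chemotaxis", "contact_inhibition"],
    "bounds": [[0.0, 1.0], [0.0, 1.0], [0.0, 1.0]],
}

param_values = saltelli.sample(problem, 1024)            # N * (2D + 2) parameter sets
Y = np.array([p[0] + 2.0 * p[1] + 0.5 * p[0] * p[2]      # placeholder scalar output
              for p in param_values])

Si = sobol.analyze(problem, Y)
print(Si["S1"])   # first-order effect of each single parameter
print(Si["ST"])   # total effect, including interactions with other parameters
```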

  14. Geographic analysis of health risks of pediatric lead exposure: a golden opportunity to promote healthy neighborhoods.

    Science.gov (United States)

    Oyana, Tonny J; Margai, Florence M

    2007-01-01

    In this retrospective study, the authors investigated pediatric blood lead levels (BLLs) at 2 threshold levels in neighborhoods across the US city of Chicago, examining geographic associations with demographic risk factors and housing characteristics, using data from large-scale childhood BLL screening records from 1997 through 2003. They used logistic regression and geostatistical methods to assess disease dynamics and probability of elevated BLLs. The results showed a significant decline of elevated BLLs, with levels measured at ≥10 µg/dL decreasing by 74%, compared with a 40% decrease for the lower levels (6-9 µg/dL). The Westside and Southside neighborhoods, with a high concentration of minority populations, had the highest prevalence rates, which were significantly associated with living in pre-1950 housing units. The findings provided insights for lead prevention, implications for lowering the threshold limit, and suggestions for the urgent task of developing healthy neighborhoods. PMID:18316267
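
    The kind of logistic model referred to above, relating the odds of an elevated BLL to housing age and neighborhood composition, can be sketched as follows; the data are simulated and the covariates are illustrative, not the Chicago screening records.

```python
# Sketch: logistic regression of elevated blood lead on housing age and a
# demographic covariate. Data are simulated; coefficients are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1000
pre1950_housing = rng.integers(0, 2, size=n)              # 1 if pre-1950 housing unit
pct_minority = rng.uniform(0, 100, size=n)                # neighborhood composition (%)
X = np.column_stack([pre1950_housing, pct_minority])

logit = -3.0 + 1.2 * pre1950_housing + 0.01 * pct_minority
elevated_bll = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # simulated outcome

model = LogisticRegression().fit(X, elevated_bll)
print(np.exp(model.coef_))      # odds ratios for each risk factor
print(model.intercept_)
```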

  15. Design and performance of lead systems for the analysis of atrial signal components in the ECG

    OpenAIRE

    Ihara, Zenichi

    2006-01-01

    For over a century, electrocardiology has been observing human cardiac activity through recordings of electrocardiograms (ECG). The potential differences derived from the nine electrodes of the standard 12-lead ECG, placed at their designated positions, are the expression of electric dynamics of which the heart is the source. According to well-defined protocols and established criteria of diagnosis, the signals of the electrocardiogram are used as indicators of cardiac pathology. However, of ...

  16. Design and performance of lead systems for the analysis of atrial signal components in the ECG

    OpenAIRE

    Ihara, Zenichi; Vesin, Jean-Marc

    2007-01-01

    For over a century, electrocardiology has been observing human cardiac activity through recordings of electrocardiograms (ECG). The potential differences derived from the nine electrodes of the standard 12-lead ECG, placed at their designated positions, are the expression of electric dynamics of which the heart is the source. According to well-defined protocols and established criteria of diagnosis, the signals of the electrocardiogram are used as indicators of cardiac pathology. However, of ...

  17. Modeling and Analysis of Tritium Transport in Multi-Region Lead-Lithium Liquid Metal Blankets

    OpenAIRE

    Zhang, Hongjie

    2014-01-01

    It is critical to be able to predict tritium transport in lead-lithium liquid metal (LM) blankets with great accuracy to provide information for fusion reactor safety and economy analyses. However, tritium transport processes are complex and affected by multiple physics such as magnetohydrodynamic (MHD) flow, yet there is no single computer code capable of simulating these phenomena inclusively. Thus the objectives of this research are: 1) to develop mathematical models and computational code...

  18. The Stacked Leading Indicators Dynamic Factor Model: A Sensitivity Analysis of Forecast Accuracy using Bootstrapping

    OpenAIRE

    Daniel Grenouilleau

    2006-01-01

    The paper introduces an approximate dynamic factor model based on the extraction of principal components from a very large number of leading indicators stacked at various lags. The model is designed to produce short-term forecasts that are computed with the EM algorithm implemented with the first few eigenvectors ordered by descending eigenvalues. A cross-sectional bootstrap experiment is used to shed light on the sensitivity of the factor model to factor selection and to sampling uncertainty...

  19. Extending dynamic segmentation with lead generation: A latent class Markov analysis of financial product portfolios

    OpenAIRE

    Paas, L.J.; Bijmolt, T.H.A.; Vermunt, J.K.

    2004-01-01

    A recent development in marketing research concerns the incorporation of dynamics in consumer segmentation. This paper extends the latent class Markov model, a suitable technique for conducting dynamic segmentation, in order to facilitate lead generation. We demonstrate the application of the latent Markov model for these purposes using a database containing information on the ownership of twelve financial products and demographics for explaining (changes in) consumer product portfolios. Data we...

  20. Fuel cladding integrity analysis during beam trip transients for China lead-based demonstration reactor

    International Nuclear Information System (INIS)

    Highlights: • The effect of beam trips on an Accelerator Driven sub-critical System (ADS) remains a critical issue for ADS reactor technology. • The CFD model of a fuel pin of the China Lead-based Demonstration Reactor (CLEAR-III) was established. • The thermal hydraulic behavior of the fuel pin during the beam trip transient of CLEAR-III was studied. • The thermal stress variation of the fuel cladding during the beam trip transient of CLEAR-III was evaluated. • Results reveal that the effect of beam trips on the fuel cladding is so small that it can be neglected. - Abstract: Frequent beam trips as experienced in the existing high-power proton accelerators may cause thermal fatigue in Accelerator-Driven System (ADS) components, which may lead to degradation of their structural integrity and reduction of their lifetime. In this paper, we focus on the strength and integrity of the fuel cladding during the beam trip transients of the China Lead-based Demonstration Reactor (CLEAR-III). Typical frequent beam trips and fuel burn-up are addressed to investigate the acceptable beam trip frequency limitation. Correspondingly, the variations in temperature and thermal stress of the fuel cladding are simulated with the ANSYS code. Besides, the behavior of the cladding material T91 under irradiation, creep and Lead Bismuth Eutectic (LBE) corrosion conditions has been discussed. It shows that beam trips have little influence on the cladding integrity and that the acceptable beam trip frequency of the fuel cladding, within 10 s of beam trip time duration, is more than 2.5 × 10^5 times per year; consequently, CLEAR-III’s fuel claddings are expected to have a good resistance to the thermal–mechanical effects induced by beam trips

  1. An Empirical Analysis of Lead-Lag Relationship among Various Financial Markets

    OpenAIRE

    Xiaoli Wang

    2015-01-01

    The efficient-market hypothesis (EMH) asserts that financial markets are "informationally efficient", i.e. all relevant information will be fully and immediately reflected in a security's market price. Other researchers, however, have disputed the efficient-market hypothesis both empirically and theoretically. In this paper, we contribute to the discussions of market efficiency by empirically testing the lead-lag relationship among various financial markets. If markets are efficient in process...

  2. In vitro and in vivo approaches for the measurement of oral bioavailability of lead (Pb) in contaminated soils: A review

    International Nuclear Information System (INIS)

    We reviewed the published evidence of lead (Pb) contamination of urban soils, soil Pb risk to children through hand-to-mouth activity, reduction of soil Pb bioavailability due to soil amendments, and methods to assess bioaccessibility which correlate with bioavailability of soil Pb. Feeding tests have shown that urban soils may have much lower Pb bioavailability than previously assumed. Hence bioavailability of soil Pb is the important measure for protection of public health, not total soil Pb. Chemical extraction tests (Pb bioaccessibility) have been developed which are well correlated with the results of bioavailability tests; application of these tests can save money and time compared with feeding tests. Recent findings have revealed that fractional bioaccessibility (bioaccessible compared to total) of Pb in urban soils is only 5-10% of total soil Pb, far lower than the 60% as bioavailable as food-Pb presumed by U.S.-EPA (30% absolute bioavailability used in IEUBK model). - Highlights: → Among direct exposure pathways for Pb in urban environments, inadvertent ingestion of soil is considered the major concern. → The concentration of lead in house dusts is significantly related to that in garden soil, and is highest at older homes. → In modeling risks from diet/water/soil Pb, US-EPA presumes that soil-Pb is 60% as bioavailable as other dietary Pb. → Joplin study proved that RBALP method seriously underestimated the ability of phosphate treatments to reduce soil Pb bioavailability. → Zia et al. method has revealed that urban soils have only 5-10% bioaccessible Pb of total Pb level. - Improved risk evaluation and recommendations for Pb contaminated soils should be based on bioavailability-correlated bioaccessible soil Pb rather than total soil Pb.

  3. A deliberate practice approach to teaching phylogenetic analysis.

    Science.gov (United States)

    Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  4. Thermal analysis of selected tin-based lead-free solder alloys

    DEFF Research Database (Denmark)

    Palcut, Marián; Sopoušek, J.; Trnková, L.;

    2009-01-01

    thermodynamic calculations using the CALPHAD approach. The amount of the alloying elements in the materials was chosen to be close to the respective eutectic composition and the nominal compositions were the following: Sn-3.7Ag-0.7Cu, Sn-1.0Ag-0.5Cu-1Bi (in wt.%). Thermal effects during melting and solidifying...

  5. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Over the last two decades, life cycle assessment (LCA) has been widely used in the design phase to reduce a product’s environmental impacts through the whole product life cycle (PLC). The traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need to develop an approach which can reflect how changes in the design parameters affect the product’s environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have a significant influence on the product’s environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.

  6. A practical approach to fire hazard analysis for offshore structures

    International Nuclear Information System (INIS)

    Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow for cost effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario or design basis fire approach calculates the consequence of a select number of credible fire scenarios, determines the potential impact on the platform process equipment, structural members, egress routes, safety systems, and determines the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented will include discussions from the recent June 2002 International Workshop for Fire Loading and Response

  7. Data analysis with the DIANA meta-scheduling approach

    International Nuclear Information System (INIS)

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis being faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing fault tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data intensive bulk scheduling, is network aware and follows a policy-centric meta-scheduling approach. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present 'quality of service' related statistics for physics analysis through the application of a policy centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads and is dynamic and adapts to the volatile nature of the resources

  8. Bioinformatics approaches to single-cell analysis in developmental biology.

    Science.gov (United States)

    Yalcin, Dicle; Hakguder, Zeynep M; Otu, Hasan H

    2016-03-01

    Individual cells within the same population show various degrees of heterogeneity, which may be better handled with single-cell analysis to address biological and clinical questions. Single-cell analysis is especially important in developmental biology as subtle spatial and temporal differences in cells have significant associations with cell fate decisions during differentiation and with the description of a particular state of a cell exhibiting an aberrant phenotype. Biotechnological advances, especially in the area of microfluidics, have led to a robust, massively parallel and multi-dimensional capturing, sorting, and lysis of single-cells and amplification of related macromolecules, which have enabled the use of imaging and omics techniques on single cells. There have been improvements in computational single-cell image analysis in developmental biology regarding feature extraction, segmentation, image enhancement and machine learning, handling limitations of optical resolution to gain new perspectives from the raw microscopy images. Omics approaches, such as transcriptomics, genomics and epigenomics, targeting gene and small RNA expression, single nucleotide and structural variations and methylation and histone modifications, rely heavily on high-throughput sequencing technologies. Although there are well-established bioinformatics methods for analysis of sequence data, there are limited bioinformatics approaches which address experimental design, sample size considerations, amplification bias, normalization, differential expression, coverage, clustering and classification issues, specifically applied at the single-cell level. In this review, we summarize biological and technological advancements, discuss challenges faced in the aforementioned data acquisition and analysis issues and present future prospects for application of single-cell analyses to developmental biology. PMID:26358759

  9. A Practical Approach to the Investigation of an rSr' Pattern in Leads V1-V2.

    Science.gov (United States)

    Koppikar, Sahil; Barbosa-Barros, Raimundo; Baranchuk, Adrian

    2015-12-01

    The differential diagnosis of an rSr' pattern in leads V1-V2 on electrocardiogram is a frequently encountered entity in clinical cardiology. This finding often presents itself in asymptomatic and healthy individuals. The causes might vary from benign and nonpathological, to severe and life-threatening diseases, such as Brugada syndrome or arrhythmogenic right ventricular dysplasia. Workup of these patients involves a history and physical examination to screen for underlying cardiac disease and potential triggers. Routine investigation involves blood work and a thorough electrocardiographic examination. Echocardiography has a role in evaluating patients in whom structural heart disease is suspected. Pulmonary testing using computed tomography can be conducted if right ventricular enlargement is identified. More advanced testing is not warranted if these initial investigations are reassuring. Referral to an arrhythmia specialist should be considered for patients in whom this finding is due to Brugada syndrome, arrhythmogenic right ventricular dysplasia, or Wolff-Parkinson-White syndrome. We propose a clinical and electrocardiographic algorithm that will assist clinicians in narrowing their differential diagnosis. PMID:26143139

  10. Lead us not into tanktation: a simulation modelling approach to gain insights into incentives for sporting teams to tank.

    Directory of Open Access Journals (Sweden)

    Geoffrey N Tuck

    Full Text Available Annual draft systems are the principal method used by teams in major sporting leagues to recruit amateur players. These draft systems frequently take one of three forms: a lottery style draft, a weighted draft, or a reverse-order draft. Reverse-order drafts can create incentives for teams to deliberately under-perform, or tank, due to the perceived gain from obtaining quality players at higher draft picks. This paper uses a dynamic simulation model that captures the key components of a win-maximising sporting league, including the amateur player draft, draft choice error, player productivity, and between-team competition, to explore how competitive balance and incentives to under-perform vary according to league characteristics. We find reverse-order drafts can lead to some teams cycling between success and failure and to other teams being stuck in mid-ranking positions for extended periods of time. We also find that an incentive for teams to tank exists, but that this incentive decreases (i) as uncertainty in the ability to determine quality players in the draft increases, (ii) as the number of teams in the league decreases, (iii) as team size decreases, and (iv) as the number of teams adopting a tanking strategy increases. Simulation models can be used to explore complex stochastic dynamic systems such as sports leagues, where managers face difficult decisions regarding the structure of their league and the desire to maintain competitive balance.
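
    A stripped-down league simulation showing the mechanism described above (reverse-order draft, noisy prospect evaluation, decaying team strength) might look like the sketch below; it is a toy illustration under invented parameters, not the authors' model.

```python
# Toy reverse-order-draft league: weak teams pick first, draft evaluation is
# noisy, and team strength decays between seasons. Not the authors' model.
import random

random.seed(0)
N_TEAMS, N_SEASONS, DRAFT_NOISE = 8, 30, 0.5
strength = [random.uniform(0.8, 1.2) for _ in range(N_TEAMS)]

for _ in range(N_SEASONS):
    # Round-robin season: win probability proportional to relative strength
    wins = [0] * N_TEAMS
    for i in range(N_TEAMS):
        for j in range(i + 1, N_TEAMS):
            p_i = strength[i] / (strength[i] + strength[j])
            wins[i if random.random() < p_i else j] += 1
    # Reverse-order draft: worst record picks first from a ranked prospect pool
    order = sorted(range(N_TEAMS), key=lambda t: wins[t])
    prospects = sorted((random.uniform(0.0, 0.3) for _ in range(N_TEAMS)), reverse=True)
    for pick, team in enumerate(order):
        realised = prospects[pick] * (1.0 + random.gauss(0.0, DRAFT_NOISE))  # draft choice error
        strength[team] = 0.9 * strength[team] + max(realised, 0.0)           # decay + new talent

print([round(s, 2) for s in strength])
```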

  11. Immobilization of lead in a Korean military shooting range soil using eggshell waste: an integrated mechanistic approach.

    Science.gov (United States)

    Ahmad, Mahtab; Hashimoto, Yohey; Moon, Deok Hyun; Lee, Sang Soo; Ok, Yong Sik

    2012-03-30

    This study evaluated the effectiveness of eggshell and calcined eggshell on lead (Pb) immobilization in a shooting range soil. Destructive and non-destructive analytical techniques were employed to determine the mechanism of Pb immobilization. The 5% additions of eggshell and calcined eggshell significantly decreased the TCLP-Pb concentration by 68.8% due mainly to increasing soil pH. Eggshell and calcined-eggshell amendments decreased the exchangeable Pb fraction to ≈ 1% of the total Pb in the soil, while the carbonate-associated Pb fraction was increased to 40.0-47.1% at >15% application rates. The thermodynamic modeling on Pb speciation in the soil solution predicted the precipitation of Pb-hydroxide [Pb(OH)2] in soils amended with eggshell and calcined eggshell. The SEM-EDS, XAFS and elemental dot mapping revealed that Pb in soil amended with calcined eggshell was associated with Si and Ca, and may be immobilized by entrapping into calcium-silicate-hydrate. Comparatively, in the soil amended with eggshell, Pb was immobilized via formation of Pb-hydroxide or lanarkite [Pb2O(SO4)]. Applications of amendments increased activities of alkaline phosphatase up to 3.7 times greater than in the control soil. The use of eggshell amendments may have potential as an integrated remediation strategy that enables Pb immobilization and soil biological restoration in shooting range soils. PMID:22309654

  12. Determination of lead, cadmium and thallium by neutron activation analysis in environmental samples

    International Nuclear Information System (INIS)

    A radiochemical procedure for simultaneous determination of lead (203Pb), thallium (202Tl) and cadmium (115Cd→115mIn) after fast neutron activation, based on ion-exchange separation from bromide medium and additional purification steps for Pb and Tl is described. Radioactive tracers 210Pb and 109Cd were used for determination of the chemical yields of Pb and Cd; for Tl it was determined gravimetrically. Two standard reference materials, BCR CRM No. 146 Sewage Sludge and NIST SRM 1633a Coal Fly Ash were analyzed and satisfactory agreement with certified values was obtained. (author) 17 refs.; 3 tabs.; 3 schemes

  13. Numerical Analysis of Lead-Bismuth-Water Direct Contact Boiling Heat Transfer

    Science.gov (United States)

    Yamada, Yumi; Takahashi, Minoru

    Direct contact boiling heat transfer of sub-cooled water with lead-bismuth eutectic (Pb-Bi) was investigated to evaluate the performance of steam generation by direct contact of feed water with the primary Pb-Bi coolant in the upper plenum above the core of a Pb-Bi-cooled direct contact boiling water fast reactor. An analytical two-fluid model was developed to estimate the heat transfer numerically. Numerical results were compared with experimental ones for verification of the model. The overall volumetric heat transfer coefficient was calculated from the heat exchange rate in the chimney. It was confirmed that the calculated results agreed well with the experimental results.

  14. Analysis of Base-Apex Lead Electrocardiogram in Clinically Healthy Kermani Sheep

    OpenAIRE

    Tajik, Javad; Amir Saeed SAMIMI; Shojaeepour, Saeedeh; JARAKANI, Sajad

    2016-01-01

    The normal electrocardiographic (ECG) parameters in the base-apex lead were evaluated in 40 clinically healthy Kermani sheep, and compared between sexes and three age groups. The heart rate varied from 83-192 beats/min with a mean of 128.9±4.7 beats/min (Mean±SEM). Sinus arrhythmia was the only observed cardiac dysrhythmia on the ECG traces diagnosed in 45% of animals. No significant difference was found in heart rate, amplitude, and duration of ECG waves and intervals between two sexes. Neve...

  15. Kinematic Accuracy Analysis of Lead Screw W Insertion Mechanism with Flexibility

    Science.gov (United States)

    He, Hu; Zhang, Lei; Kong, Jiayuan

    According to the actual requirements of W insertion, a variable lead screw W insertion mechanism was designed, the motion characteristics of the mechanism were analyzed, and a kinematics simulation was carried out with MATLAB. The mechanism precision was analyzed with the analytical method and the error coefficient curve of each component in the mechanism was obtained. Dynamics simulations of the rigid mechanism and of the mechanism with flexibility at different speeds were conducted with ADAMS; furthermore, the real-time elastic deformation of the flexible connecting rod was obtained. When the influence of the elastic connecting rod is taken into account, the output motion error and the elastic deformation of the components increase with the speed of the loom.

  16. Intelligent Systems Approaches to Product Sound Quality Analysis

    Science.gov (United States)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way in which they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation by using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate the consumer preference as it relates to a product sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool that is used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated in an objective manner without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is the Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches are gaining popularity. This dissertation will review publicly available published literature and present additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired comparison approach to sound quality analysis. This research will propose a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on sound pairings that are most interesting and is expected to address some of the restrictions required by the Bradley-Terry model. It will also provide a more amicable framework for an intelligent systems approach

  17. Analysis of four Brazilian seismic areas using a nonextensive approach

    Science.gov (United States)

    Scherrer, T. M.; França, G. S.; Silva, R.; de Freitas, D. B.; Vilar, C. S.

    2015-02-01

    We analyse four seismic areas in Brazil using a nonextensive model and the data from the Brazilian Seismic Bulletin between 1720 and 2013. Two of those regions are contrasting zones, while the other two are dominated by seismically active faults. We notice that intraplate seismic zones present q-values similar to other fault zones, but the adjustment in contrast areas results in higher values for this parameter. The results reveal that the nonextensive approach adjusts robustly also in the case of intraplate earthquakes, showing that the Tsallis formalism is unquestionably a powerful tool for the analysis of this phenomenon.

  18. A MANAGERIAL AND COST ACCOUNTING APPROACH OF CUSTOMER PROFITABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    CARDOS Ildiko Reka

    2010-07-01

    Full Text Available In recent years many organizations have realized that market orientation is essential to their success. By satisfying the needs of customers and offering them products and services which meet their desires and demands, customer loyalty can increase profitability in the long term. After analyzing the existing journal literature in this field we would like to emphasize that managerial accounting, cost calculation methods and techniques, and the analysis of costs provide relevant information when analyzing customer profitability. We pay special attention to cost systems. An activity-based costing approach takes customer profitability to new levels of accuracy and usefulness, and provides the basis for creating, communicating and delivering value to the customers.

  19. Analysis of hand radiographs with neural network approach

    International Nuclear Information System (INIS)

    Early radiographic detection of osseous abnormalities is important for diagnosis of conditions such as osteoporosis, arthritis, or renal osteodystrophy due to hyperparathyroidism. The authors of this paper are developing computerized methods to analyze the degree of severity of these abnormalities seen in hand radiographs. The hand radiographs are digitized with a high-resolution laser digitizer to maintain the fine features of early osseous changes. To perform the analysis efficiently and automatically, they employ a two-step approach. A data-compression technique is first applied to the image, and a neural network identifies the region of interest (ROI) on the compressed image

  20. Mechanical and Numerical Analysis Concerning Compressive Properties of Tin-Lead Open-Cell Foams

    Science.gov (United States)

    Belhadj, Abd-Elmouneïm; Gavrus, Adinel; Bernard, Fabrice; Azzaz, Mohammed

    2015-10-01

    The design of new or innovative materials has to meet two essential criteria: increased mechanical performance and minimization of mass. This dual requirement leads to interest in the study of various classes of metallic foams. The present research focuses on open-cell tin-lead foams manufactured by a replication process using a NaCl preform. A mechanical press equipped with a load cell and a local extensometer, operated at a controlled deformation rate, is used. Experimental tests were carried out in order to study the influence of both cell size and relative density on the mechanical behavior during compression and to analyze the resulting variation in properties within a new framework. This study has three main sections, which start with the manufacturing description and mechanical characterization of the proposed metallic foams, followed by the understanding and modeling of their response to a compression load via a Gibson-Ashby model, a Féret law, a proposed simple Avrami model, and a generalized Avrami model. Finally, a numerical simulation of the compression of the Sn-Pb foams, analyzing the variation of relative density with plastic strain, is presented.
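
    For context, the Gibson-Ashby scaling commonly used for the plastic-collapse (plateau) stress of open-cell foams can be evaluated as below; the prefactor of about 0.3 and the assumed Sn-Pb yield stress are textbook-style values used for illustration, not results from this study.

```python
# Gibson-Ashby estimate of the plateau (plastic-collapse) stress of an open-cell
# foam: sigma_pl ~ 0.3 * sigma_ys * relative_density**1.5. Inputs are illustrative.
def gibson_ashby_plateau_stress(sigma_ys_mpa, relative_density, prefactor=0.3):
    return prefactor * sigma_ys_mpa * relative_density ** 1.5

SIGMA_YS_SNPB_MPA = 30.0                     # assumed yield stress of the solid Sn-Pb alloy
for rel_density in (0.05, 0.10, 0.20):
    sigma_pl = gibson_ashby_plateau_stress(SIGMA_YS_SNPB_MPA, rel_density)
    print(f"relative density {rel_density:.2f} -> plateau stress {sigma_pl:.2f} MPa")
```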

  1. Preliminary Analysis of Natural Circulation Characteristics of China Lead-based Research Reactor (CLEAR-I)

    International Nuclear Information System (INIS)

    A large science project named “Advanced Nuclear Fission Energy-ADS Transmutation System” has been launched by the Chinese Academy of Sciences. The China LEAd-based Research Reactor (CLEAR-I), a verification facility, will be developed in the first phase of the ADS program. In CLEAR-I, liquid lead-bismuth eutectic (PbBi) was chosen as the coolant of the primary cooling system, which is driven by natural circulation since the power is at a low level of about 10 MWt. The natural circulation of the primary coolant system couples the thermal-hydraulic behavior of the core, the upper and lower plenums of the reactor, and the reactor internals. Therefore, understanding the natural circulation flow capability and characteristics of CLEAR-I is very important and can significantly improve the safety features of the reactor. In this work, the natural circulation characteristics and flow distribution at steady state in the primary system of CLEAR-I were preliminarily evaluated and discussed. A numerical simulation was conducted using a CFD code. The flow characteristics were illustrated and discussed. The temperature and pressure distributions were evaluated. (author)

  2. Region-of-influence approach to a frequency analysis of heavy precipitation in Slovakia

    Directory of Open Access Journals (Sweden)

    L. Gaál

    2007-07-01

    Full Text Available The L-moment-based regionalization approach developed by Hosking and Wallis (1997) is a frequently used tool in regional frequency modeling of heavy precipitation events. The method consists of the delineation of homogeneous pooling groups with a fixed structure, which may, however, lead to undesirable step-like changes in growth curves and design value estimates in the case of a transition from one pooling group to another. Unlike the standard methodology, the region-of-influence (ROI) approach does not make use of groups of sites (regions) with a fixed structure; instead, each site has its own "region", i.e. a group of sites that are sufficiently similar to the site of interest. The aim of the study is to develop a version of the ROI approach, which was originally proposed in order to overcome inconsistencies involved in flood frequency analysis, for the modeling of probabilities of heavy precipitation amounts. Various settings of the distance metric and pooled weighting factors are evaluated, and a comparison with the standard regional frequency analysis over the area of Slovakia is performed. The advantages of the ROI approach are assessed by means of simulation studies. It is demonstrated that almost any setting of parameters of the ROI method yields estimates of growth curves and design values at individual sites that are superior to the standard regional and at-site estimates.

  3. Natural and anthropogenic lead in soils and vegetables around Guiyang city, southwest China: A Pb isotopic approach

    Energy Technology Data Exchange (ETDEWEB)

    Li, Fei-Li [College of Biological and Environmental Engineering, Zhejiang University of Technology, Hangzhou 310032 (China); Liu, Cong-Qiang, E-mail: liucongqiang@vip.skleg.cn [State Key Laboratory of Environmental Geochemistry, Institute of Geochemistry, Chinese Academy of Sciences, Guiyang 550002 (China); Yang, Yuan-Gen [State Key Laboratory of Environmental Geochemistry, Institute of Geochemistry, Chinese Academy of Sciences, Guiyang 550002 (China); Bi, Xiang-Yang [State Key Laboratory of Biogeology and Environmental Geology, China University of Geosciences, Wuhan 430074 (China); Liu, Tao-Ze; Zhao, Zhi-Qi [State Key Laboratory of Environmental Geochemistry, Institute of Geochemistry, Chinese Academy of Sciences, Guiyang 550002 (China)

    2012-08-01

    Soils, vegetables and rainwaters from three vegetable production bases in the Guiyang area, southwest China, were analyzed for Pb concentrations and isotope compositions to trace its sources in the vegetables and soils. Lead isotopic compositions were not distinguishable between yellow soils and calcareous soils, but distinguishable among sampling sites. The highest 207Pb/206Pb and 208Pb/206Pb ratios were found for rainwaters (0.8547-0.8593 and 2.098-2.109, respectively), and the lowest for soils (0.7173-0.8246 and 1.766-2.048, respectively). The 207Pb/206Pb and 208Pb/206Pb ratios increased in vegetables in the order of roots < stems < leaves < fruits. Plots of the 207Pb/206Pb ratios versus the 208Pb/206Pb ratios from all samples formed a straight line and supported a binary end-member mixing model for Pb in vegetables. Using deep soils and rainwaters as geogenic and anthropogenic end members in the mixing model, it was estimated that atmospheric Pb contributed 30-77% to total Pb for vegetable roots, 43-71% for stems, 72-85% for leaves, and 90% for capsicum fruits, whereas 10-70% of Pb in all vegetable parts was derived from soils. This research supports that heavy metal contamination in vegetables can result mainly from atmospheric deposition, and Pb isotope technique is useful for tracing the sources of Pb contamination in vegetables.
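
    The binary end-member apportionment used here reduces, to first order, to a ratio balance between the geogenic and atmospheric end members; the short function below shows that calculation with end-member 207Pb/206Pb values representative of the ranges quoted above and a hypothetical sample ratio.

```python
# First-order two-end-member mixing on the 207Pb/206Pb ratio: fraction of Pb from
# the atmospheric (rainwater) end member. End-member values are representative of
# the ranges in the abstract; the leaf ratio is hypothetical.
def atmospheric_fraction(r_sample, r_geogenic, r_atmospheric):
    return (r_sample - r_geogenic) / (r_atmospheric - r_geogenic)

r_soil = 0.800     # representative geogenic 207Pb/206Pb
r_rain = 0.857     # representative rainwater 207Pb/206Pb
r_leaf = 0.845     # hypothetical vegetable-leaf ratio

print(round(atmospheric_fraction(r_leaf, r_soil, r_rain), 2))   # about 0.79 atmospheric
```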

  4. Immobilization of lead in a Korean military shooting range soil using eggshell waste: An integrated mechanistic approach

    International Nuclear Information System (INIS)

    Highlights: ► Eggshell and calcined eggshell immobilized Pb in the shooting range soil. ► Calcined eggshell was more effective on Pb immobilization compared to eggshell. ► Exchangeable Pb fractions were transformed to carbonate bound fractions. ► Calcined eggshell stabilized Pb by enwrapping into calcium silicate hydrate. ► Soil Pb toxicity can be reduced by applying eggshell and calcined eggshell. - Abstract: This study evaluated the effectiveness of eggshell and calcined eggshell on lead (Pb) immobilization in a shooting range soil. Destructive and non-destructive analytical techniques were employed to determine the mechanism of Pb immobilization. The 5% additions of eggshell and calcined eggshell significantly decreased the TCLP-Pb concentration by 68.8% due mainly to increasing soil pH. Eggshell and calcined-eggshell amendments decreased the exchangeable Pb fraction to ∼1% of the total Pb in the soil, while the carbonate-associated Pb fraction was increased to 40.0–47.1% at >15% application rates. The thermodynamic modeling on Pb speciation in the soil solution predicted the precipitation of Pb-hydroxide [Pb(OH)2] in soils amended with eggshell and calcined eggshell. The SEM-EDS, XAFS and elemental dot mapping revealed that Pb in soil amended with calcined eggshell was associated with Si and Ca, and may be immobilized by entrapping into calcium-silicate-hydrate. Comparatively, in the soil amended with eggshell, Pb was immobilized via formation of Pb-hydroxide or lanarkite [Pb2O(SO4)]. Applications of amendments increased activities of alkaline phosphatase up to 3.7 times greater than in the control soil. The use of eggshell amendments may have potential as an integrated remediation strategy that enables Pb immobilization and soil biological restoration in shooting range soils.

  5. Standard error in the Jacobson and Truax Reliable Change Index: the "classical approach" leads to poor estimates.

    Science.gov (United States)

    Temkin, Nancy R

    2004-10-01

    Different authors have used different estimates of variability in the denominator of the Reliable Change Index (RCI). Maassen attempts to clarify some of the differences and the assumptions underlying them. In particular he compares the 'classical' approach using an estimate S(Ed) supposedly based on measurement error alone with an estimate S(Diff) based on the variability of observed differences in a population that should have no true change. Maassen concludes that not only is S(Ed) based on classical theory, but it properly estimates variability due to measurement error and practice effect while S(Diff) overestimates variability by accounting twice for the variability due to practice. Simulations show Maassen to be wrong on both counts. With an error rate nominally set to 10%, RCI estimates using S(Diff) wrongly declare change in 10.4% and 9.4% of simulated cases without true change while estimates using S(Ed) wrongly declare change in 17.5% and 12.3% of the simulated cases (p practice effects, S(Ed) estimates the variability of change due to measurement error to be .34, when the true variability due to measurement error was .014. Neuropsychologists should not use S(Ed) in the denominator of the RCI. PMID:15637781
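
    For reference, the original Jacobson-Truax construction of the index can be written out directly; the estimators debated in this record differ precisely in how the denominator is formed, and the sketch below (with invented example numbers) implements only the textbook definition, not Maassen's or Temkin's calculations.

```python
# Reliable Change Index with the original Jacobson-Truax denominator:
# SEM = SD1 * sqrt(1 - r_xx);  S_diff = sqrt(2) * SEM;  RCI = (x2 - x1) / S_diff.
# Example numbers are invented.
import math

def rci_jacobson_truax(x1, x2, sd_baseline, test_retest_r):
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)   # standard error of measurement
    s_diff = math.sqrt(2.0) * sem                        # SD of the difference score
    return (x2 - x1) / s_diff

print(round(rci_jacobson_truax(x1=25.0, x2=32.0, sd_baseline=7.0, test_retest_r=0.85), 2))
# |RCI| > 1.96 is conventionally read as reliable change at roughly the 5% level
```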

  6. Systematic approach for tolerance analysis of photonic systems

    Science.gov (United States)

    van Gurp, J. F. C.; Tichem, M.; Staufer, U.

    2011-08-01

    Passive alignment of photonic components is an assembly method compatible with a high production volume. Its precision performance relies completely on the dimensional accuracies of geometrical alignment features. A tolerance analysis plays a key role in designing and optimizing these passive alignment features. The objective of this paper is to develop a systematic approach for conducting such tolerance analysis, starting with a conceptual package design, setting up the tolerance chain, describing it mathematically and converting the misalignment to a coupling loss probability distribution expressed in dB. The method has successfully been applied to a case study where an indium phosphide (InP) chip is aligned with a TriPleX (SiO2 cladding with Si3N4 core) interposer via a silicon optical bench (SiOB).
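
    A Monte Carlo version of the tolerance-chain-to-coupling-loss conversion can be sketched as below, using the standard approximation for the lateral-offset loss between identical Gaussian modes; the tolerance budget and mode-field radius are illustrative, not the values of the InP/TriPleX case study.

```python
# Monte Carlo tolerance stack-up: independent Gaussian feature errors sum to a
# lateral misalignment d, converted to coupling loss with the identical-Gaussian-
# mode approximation loss_dB ~ 4.34 * (d / w0)**2. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(5)
n_trials = 100_000
w0_um = 4.5                                   # assumed mode-field radius (um)
tolerances_um = [0.3, 0.2, 0.4]               # 1-sigma error of each feature in the chain

d_um = sum(rng.normal(0.0, s, n_trials) for s in tolerances_um)   # stacked lateral offset
loss_db = 4.34 * (d_um / w0_um) ** 2

print(f"mean coupling loss      {loss_db.mean():.3f} dB")
print(f"95th percentile of loss {np.percentile(loss_db, 95):.3f} dB")
```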

  7. An Augmented Lagrangian Approach for Sparse Principal Component Analysis

    CERN Document Server

    Lu, Zhaosong

    2009-01-01

    Principal component analysis (PCA) is a widely used technique for data analysis and dimension reduction with numerous applications in science and engineering. However, the standard PCA suffers from the fact that the principal components (PCs) are usually linear combinations of all the original variables, and it is thus often difficult to interpret the PCs. To alleviate this drawback, various sparse PCA approaches were proposed in the literature [15, 6, 17, 28, 8, 25, 18, 7, 16]. Despite success in achieving sparsity, some important properties enjoyed by the standard PCA are lost in these methods, such as the uncorrelatedness of PCs and the orthogonality of loading vectors. Also, the total explained variance that they attempt to maximize can be too optimistic. In this paper we propose a new formulation for sparse PCA, aiming at finding sparse and nearly uncorrelated PCs with orthogonal loading vectors while explaining as much of the total variance as possible. We also develop a novel augmented Lagrangian method for solving a ...
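
    For a quick feel of the sparsity/interpretability trade-off described here, scikit-learn's SparsePCA (an l1-penalized formulation, which is a different algorithm from the augmented Lagrangian method proposed in this record) can be compared with ordinary PCA on synthetic data:

```python
# Compare dense PCA loadings with an l1-penalized sparse PCA: a different
# formulation from the augmented Lagrangian method above, shown only to
# illustrate sparse, more interpretable loadings. Data are synthetic.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 10))

dense = PCA(n_components=3).fit(X)
sparse = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)

print(np.round(dense.components_, 2))    # loadings mix essentially all variables
print(np.round(sparse.components_, 2))   # many loadings driven exactly to zero
```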

  8. Sensitivity analysis in linear programming approach to optimal SVM classification

    Directory of Open Access Journals (Sweden)

    Roberto Ragona

    2014-06-01

    Full Text Available At present, linear programming (LP) techniques for optimal one-class and two-class classification can be considered well established and feasible; they pose an alternative to the quadratic programming (QP) approach, which is usually credited with having greater complexity. Sensitivity analysis, well developed in the LP context, is generally employed to furnish answers describing how an optimal solution changes when varying the parameters in an LP problem; as a possible application in optimal classification, it can be employed for the definition of the free parameters present in LP procedures, reducing the events of computational restart from scratch when searching for a satisfactory classifier through repeated trials. The proposed method is demonstrated on a simple example which exhibits its effectiveness in reducing the computational burden, but this procedure can be extrapolated to large problems as well. Keywords: Linear Programming, Optimal Classification, Sensitivity Analysis, Support Vector Machines.
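
    A minimal 1-norm (LP) soft-margin linear classifier of the kind referred to above can be posed directly for SciPy's linprog as below; this is a generic sketch of the LP formulation only, and the paper's sensitivity-analysis procedure for choosing the free parameters is not shown.

```python
# Minimal 1-norm soft-margin linear classifier solved as an LP with scipy.linprog.
# Variables: w = w_plus - w_minus (to linearize |w|), bias b, slacks xi.
# Generic sketch; the paper's sensitivity-analysis procedure is not reproduced.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-1.0, 1.0, size=(20, 2)), rng.normal(1.0, 1.0, size=(20, 2))])
y = np.array([-1] * 20 + [1] * 20)
n, d = X.shape
C = 1.0                                         # slack penalty (a free parameter)

# minimize  sum(w_plus + w_minus) + C * sum(xi)
c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
# subject to  y_i (w . x_i + b) >= 1 - xi_i   <=>   -y_i(w . x_i + b) - xi_i <= -1
A_ub = np.hstack([-(y[:, None] * X), y[:, None] * X, -y[:, None], -np.eye(n)])
b_ub = -np.ones(n)
bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d]
print(w, b)
```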

  9. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful
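
    The distribution referred to here is built on the kappa-exponential of Kaniadakis statistics; a numerical sketch of its survival function, as commonly written in this literature, is given below with illustrative (not fitted) parameter values.

```python
# Sketch of the kappa-exponential and the survival function commonly used for the
# kappa-generalized income distribution, P(X > x) = exp_kappa(-beta * x**alpha).
# Parameter values are illustrative, not fitted to the US income data.
import numpy as np

def exp_kappa(x, kappa):
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def survival(x, alpha, beta, kappa):
    return exp_kappa(-beta * x**alpha, kappa)

x = np.array([0.5, 1.0, 2.0, 5.0, 10.0])      # income in units of the mean
print(survival(x, alpha=2.0, beta=1.0, kappa=0.6))
# The large-x behaviour decays like a Pareto power law x**(-alpha/kappa),
# while the bulk of the distribution stays close to a Weibull-like shape.
```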

  10. Thermal Structure Analysis of SIRCA Tile for X-34 Wing Leading Edge TPS

    Science.gov (United States)

    Milos, Frank S.; Squire, Thomas H.; Rasky, Daniel J. (Technical Monitor)

    1997-01-01

    This paper will describe in detail thermal/structural analyses of SIRCA tiles which were performed at NASA Ames under the Tile Analysis Task of the X-34 Program. The analyses used the COSMOS/M finite element software to simulate the material response in arc-jet tests, mechanical deflection tests, and the performance of candidate designs for the TPS system. The purposes of the analysis were to verify thermal and structural models for the SIRCA tiles, to establish failure criteria for stressed tiles, to simulate the TPS response under flight aerothermal and mechanical load, and to confirm that adequate safety margins exist for the actual TPS design.

  11. Analysis of integral circulation and decay heat removal experiments in the lead-bismuth CIRCE facility with RELAP5 code

    International Nuclear Information System (INIS)

    In this paper, the results of the post-test analysis of some integral circulation experiments conducted on the lead-bismuth CIRCE facility are presented in comparison with the experimental data. These experiments include the simulation of unprotected loss of flow and unprotected loss of heat sink transients in a pool-type heavy liquid metal reactor. Furthermore, the results of the pre-test analysis of a protected loss of heat sink and flow transient with decay heat removal by a heat exchanger immersed in the pool and operating in natural circulation is presented. All transient analyses have been performed with the RELAP5 thermal-hydraulic code. (author)

  12. Benefits of automatic multielemental analysis of zinc-lead ore slurries by radioisotope X-ray fluorescence

    International Nuclear Information System (INIS)

    The radioisotope X-ray fluorescence measuring system has been developed for automatic multielement analysis of zinc-lead ore slurries. The system consists of several XRF measuring probes, electronic unit and minicomputer with its peripherals. The system has been used for simultaneous determination of Fe, Zn and Pb in flotation streams with accuracy within 3-15%, depending on metal concentration. Improved control of the flotation process resulting from on-stream analysis has led to increases of up to 3.4% in metal recovery. (author). 3 refs, 4 figs, 1 tab

  13. A ZEUS next-to-leading-order QCD analysis of data on deep inelastic scattering

    CERN Document Server

    Chekanov, S; Adamczyk, L; Adamus, M; Adler, V; Aghuzumtsyan, G; Allfrey, P D; Antonioli, P; Antonov, A; Arneodo, M; Bailey, D S; Bamberger, A; Barakbaev, A N; Barbagli, G; Barbi, M; Bari, G; Barreiro, F; Bartsch, D; Basile, M; Behrens, U; Bell, M A; Bellagamba, L; Bellan, P M; Benen, A; Bertolin, A; Bhadra, S; Bloch, I; Bold, T; Boos, E G; Borras, K; Boscherini, D; Brock, I; Brook, N H; Brugnera, R; Brümmer, N; Bruni, A; Bruni, G; Bussey, P J; Butterworth, J M; Büttner, C; Bylsma, B; Caldwell, A; Capua, M; Cara Romeo, G; Carli, T; Carlin, R; Cassel, D G; Catterall, C D; Chwastowski, J; Abramowicz, H; Ciborowski, J; Ciesielski, R; Cifarelli, Luisa; Cindolo, F; Cole, J E; Collins-Tooth, C; Contin, A; Cooper-Sarkar, A M; Coppola, N; Corradi, M; Corriveau, F; Costa, M; Cottrell, A; Cui, Y; D'Agostini, G; Dal Corso, F; Danilov, P; De Pasquale, S; Dementiev, R K; Derrick, M; Devenish, R C E; Dhawan, S; Dobur, D; Dolgoshein, B A; Doyle, A T; Drews, G; Durkin, L S; Dusini, S; Eisenberg, Y; Ermolov, P F; Eskreys, Andrzej; Everett, A; Ferrando, J; Ferrero, M I; Figiel, J; Foster, B; Foudas, C; Fourletov, S; Fourletova, J; Fry, C; Gabareen, A; Galas, A; Gallo, E; Garfagnini, A; Geiser, A; Genta, C; Gialas, I; Giusti, P; Gladilin, L K; Gladkov, D; Glasman, C; Göbel, F; Goers, S; Goncalo, R; González, O; Gosau, T; Göttlicher, P; Grabowska-Bold, I; Graciani-Díaz, R; Grigorescu, G; Grijpink, S; Groys, M; Grzelak, G; Gutsche, O; Gwenlan, C; Haas, T; Hain, W; Hall-Wilton, R; Hamatsu, R; Hamilton, J; Hanlon, S; Hart, C; Hartmann, H; Hartner, G; Heaphy, E A; Heath, G P; Helbich, M; Hilger, E; Hochman, D; Holm, U; Horn, C; Iacobucci, G; Iga, Y; Irrgang, P; Jakob, H P; Jiménez, M; Jones, T W; Kagawa, S; Kahle, B; Kaji, H; Kananov, S; Karshon, U; Karstens, F; Kasemann, M; Kataoka, M; Katkov, I I; Kcira, D; Keramidas, A; Khein, L A; Kim, J Y; Kind, O; Kisielewska, D; Kitamura, S; Koffeman, E; Kohno, T; Kooijman, P; Koop, T; Korzhavina, I A; Kotanski, A; Kötz, U; Kowal, A M; Kowalski, H; Kramberger, G; Kreisel, A; Krumnack, N; Kulinski, P; Kuze, M; Kuzmin, V A; Labarga, L; Lammers, S; Lelas, D; Levchenko, B B; Levy, A; Li, L; Lightwood, M S; Lim, H; Limentani, S; Ling, T Y; Liu, C; Liu, X; Löhr, B; Lohrmann, E; Loizides, J H; Long, K R; Longhin, A; Lukasik, J; Lukina, O Yu; Luzniak, P; Ma, K J; Maddox, E; Magill, S; Malka, J; Mankel, R; Margotti, A; Marini, G; Martin, J F; Martínez, M; Mastroberardino, A; Matsuzawa, K; Mattingly, M C K; Melzer-Pellmann, I A; Menary, S R; Metlica, F; Meyer, U; Miglioranzi, S; Milite, M; Mirea, A; Monaco, V; Montanari, A; Musgrave, B; Nagano, K; Namsoo, T; Nania, R; Nguyen, C N; Nigro, A; Ning, Y; Noor, U; Notz, D; Nowak, R J; Nuncio-Quiroz, A E; Oh, B Y; Olkiewicz, K; Ota, O; Padhi, S; Palmonari, F; Patel, S; Paul, E; Pavel, Usan; Pawlak, J M; Pelfer, P G; Pellegrino, A; Pesci, A; Piotrzkowski, K; Plamondon, M; Plucinsky, P P; Pokrovskiy, N S; Polini, A; Proskuryakov, A S; Przybycien, M B; Rautenberg, J; Raval, A; Reeder, D D; Ren, Z; Renner, R; Repond, J; Ri, Y D; Rinaldi, L; Robins, S; Rosin, M; Ruspa, M; Ryan, P; Sacchi, R; Salehi, H; Santamarta, R; Sartorelli, G; Savin, A A; Saxon, D H; Schagen, S; Schioppa, M; Schlenstedt, S; Schleper, P; Schmidke, W B; Schneekloth, U; Schörner-Sadenius, T; Sciulli, F; Shcheglova, L M; Skillicorn, I O; Slominski, W; Smith, W H; Soares, M; Solano, A; Son, D; Sosnovtsev, V V; Stairs, D G; Stanco, L; Standage, J; Stifutkin, A; Stonjek, S; Stopa, P; Stösslein, U; Straub, P B; Suchkov, S; Susinno, G; Suszycki, L; Sutiak, J; Sutton, M R; Sztuk, 
J; Szuba, D; Szuba, J; Tapper, A D; Targett-Adams, C; Tassi, E; Tawara, T; Terron, J; Tiecke, H G; Tokushuku, K; Tsurugai, T; Turcato, M; Tymieniecka, T; Tyszkiewicz, A; Ukleja, A; Ukleja, J; Vázquez, M; Vlasov, N N; Voss, K C; Walczak, R; Walsh, R; Wang, M; Whitmore, J J; Whyte, J; Wichmann, K; Wick, K; Wiggers, L; Wills, H H; Wing, M; Wlasenko, M; Wolf, G; Yagues-Molina, A G; Yamada, S; Yamazaki, Y; Yoshida, R; Youngman, C; Zambrana, M; Zawiejski, L; Zeuner, W; Zhautykov, B O; Zhou, C; Zichichi, A; Ziegler, A; Zotkin, D S; Zotkin, S A; De Favereau, J; De Wolf, E; Del Peso, J

    2003-01-01

    Next-to-leading order QCD analyses of the ZEUS data on deep inelastic scattering together with fixed-target data have been performed, from which the gluon and the quark densities of the proton and the value of the strong coupling constant, alpha_s(M_Z), were extracted. The study includes a full treatment of the experimental systematic uncertainties including point-to-point correlations. The resulting uncertainties in the parton density functions are presented. A combined fit for alpha_s(M_Z) and the gluon and quark densities yields a value of alpha_s(M_Z) in agreement with the world average. The parton density functions derived from ZEUS data alone indicate the importance of HERA data in determining sea quark and gluon distributions at low x. The limits of applicability of the theoretical formalism have been explored by comparing the fit predictions to ZEUS data at very low Q^2.

  14. Modeling and analysis for determining optimal suppliers under stochastic lead times

    DEFF Research Database (Denmark)

    Abginehchi, Soheil; Farahani, Reza Zanjirani

    2010-01-01

    The policy of simultaneously splitting replenishment orders among several suppliers has received considerable attention in the last few years and continues to attract the attention of researchers. In this paper, we develop a mathematical model which considers multiple-supplier single-item inventory...... systems. The item acquisition lead times of suppliers are random variables. Backorder is allowed and shortage cost is charged based on not only per unit in shortage but also per time unit. Continuous review (s,Q) policy has been assumed. When the inventory level depletes to a reorder level, the total......, procurement cost, inventory holding cost, and shortage cost, is minimized. We also conduct extensive numerical experiments to show the advantages of our model compared with the models in the literature. According to our extensive experiments, the model developed in this paper is the best model in the...
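
    The short Monte Carlo sketch below is not the paper's analytical model; it only illustrates the order-splitting idea under a continuous-review (s, Q)-style policy with random lead times and a shortage cost charged both per unit and per unit of time. All cost rates, the demand process and the lead-time distribution are assumptions.

    # Illustrative Monte Carlo of order splitting among suppliers (assumed numbers).
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate(n_suppliers, days=20_000, demand_rate=10.0, s=60, Q=120,
                 hold=0.5, short_per_unit=5.0, short_per_unit_time=2.0,
                 lt_mean=6.0, lt_sd=2.0):
        inv = s + Q                      # net inventory (negative = backorders)
        outstanding = []                 # arrival days of outstanding split orders
        cost = 0.0
        for day in range(days):
            # receive any split orders that arrive today
            arrived = [a for a in outstanding if a <= day]
            inv += (Q / n_suppliers) * len(arrived)
            outstanding = [a for a in outstanding if a > day]
            # daily demand
            d = rng.poisson(demand_rate)
            newly_short = max(0, d - max(inv, 0))
            inv -= d
            # costs: holding on positive stock, shortage per unit and per unit-time
            cost += hold * max(inv, 0)
            cost += short_per_unit * newly_short
            cost += short_per_unit_time * max(-inv, 0)
            # reorder (split evenly) when the inventory position crosses s
            position = inv + (Q / n_suppliers) * len(outstanding)
            if position <= s:
                lead_times = np.maximum(1.0, rng.normal(lt_mean, lt_sd, n_suppliers))
                outstanding += list(day + lead_times)
        return cost / days

    for k in (1, 2, 3):
        print(f"{k} supplier(s): average cost/day ~ {simulate(k):.1f}")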

  15. Stability Region Analysis of PID and Series Leading Correction PID Controllers for the Time Delay Systems

    Directory of Open Access Journals (Sweden)

    D. RAMA REDDY

    2012-07-01

    Full Text Available This paper describes the stability regions of PID (Proportional + Integral + Derivative) controllers and a new PID with series leading correction (SLC) for networked control systems with time delay. The new PID controller has a tuning parameter ‘β’. The relation between β, KP, KI and KD is derived. The effect of plant parameters on the stability region of PID controllers and SLC-PID controllers in first-order and second-order systems with time delay is also studied. Finally, an open-loop zero was inserted into the plant, an unstable second-order system with time delay, so that the stability regions of the PID and SLC-PID controllers get effectively enlarged. The total system is implemented using MATLAB/Simulink.
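
    A rough simulation sketch of the underlying stability issue, assuming a first-order plant with dead time and a textbook PID law; the paper's SLC-PID structure and its β tuning parameter are not modelled, and all gains are illustrative.

    # Discrete-time PID on a first-order-plus-dead-time plant: the same gains that are
    # fine for a small delay can destabilize the loop as the delay grows.
    import numpy as np

    def simulate_pid(Kp, Ki, Kd, delay_s, T=20.0, dt=0.01, tau=1.0, gain=1.0):
        n = int(T / dt)
        d = max(1, int(delay_s / dt))       # dead time in samples
        u_hist = np.zeros(n + d)            # delayed control history
        y, integ, prev_err = 0.0, 0.0, 1.0
        out = np.zeros(n)
        for k in range(n):
            err = 1.0 - y                   # unit step setpoint
            integ += err * dt
            deriv = (err - prev_err) / dt
            prev_err = err
            u_hist[k + d] = Kp * err + Ki * integ + Kd * deriv
            # first-order plant: tau*dy/dt + y = gain*u(t - delay)
            y += dt * (gain * u_hist[k] - y) / tau
            out[k] = y
        return out

    for delay in (0.1, 0.5, 1.0):
        y = simulate_pid(Kp=4.0, Ki=2.0, Kd=0.5, delay_s=delay)
        print(f"delay {delay:.1f} s -> max |y| = {np.max(np.abs(y)):.2f}",
              "(bounded)" if np.max(np.abs(y)) < 10 else "(diverging)")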

  16. Top leads for swine influenza A/H1N1 virus revealed by steered molecular dynamics approach.

    Science.gov (United States)

    Mai, Binh Khanh; Viet, Man Hoang; Li, Mai Suan

    2010-12-27

    Since March 2009, the rapid spread of infection during the recent A/H1N1 swine flu pandemic has raised concerns of a far more dangerous outcome should this virus become resistant to current drug therapies. Currently, oseltamivir (Tamiflu) is used intensively for the treatment of influenza and is reported to be effective against the 2009 A/H1N1 virus. However, as this virus is evolving fast, some drug-resistant strains are emerging. Therefore, it is critical to seek alternative treatments and identify the roots of the drug resistance. In this paper, we use the steered molecular dynamics (SMD) approach to estimate the binding affinity of ligands to the glycoprotein neuraminidase. Our idea is based on the hypothesis that the larger the force needed to unbind a ligand from a receptor, the higher its binding affinity. Using all-atom models with the Gromos force field 43a1 and explicit water, we have studied the binding ability of 32 ligands to glycoprotein neuraminidase from swine flu virus A/H1N1. The electrostatic interaction is shown to play a more important role in binding affinity than the van der Waals one. We have found that four ligands 141562, 5069, 46080, and 117079 from the NSC set are the most promising candidates to cope with this virus, while peramivir, oseltamivir, and zanamivir are ranked 8, 11, and 20. The observation that these four ligands are better than existing commercial drugs has also been confirmed by our results on the binding free energies obtained by the molecular mechanics-Poisson-Boltzmann surface area (MM-PBSA) method. Our prediction may be useful for therapeutic applications. PMID:21090736

  17. Failed anterior cruciate ligament reconstruction: analysis of factors leading to instability after primary surgery

    Institute of Scientific and Technical Information of China (English)

    MA Yong; AO Ying-fang; YU Jia-kuo; DAI Ling-hui; SHAO Zhen-xing

    2013-01-01

    Background Revision anterior cruciate ligament (ACL) surgery can be expected to become more common as the number of primary reconstructions keeps increasing. This study aims to investigate the factors causing instability after primary ACL reconstruction, which may provide an essential scientific basis for preventing surgical failure. Methods One hundred and ten revision ACL surgeries were performed at our institute between November 2001 and July 2012. There were 74 men and 36 women, and the mean age at the time of revision was 27.6 years (range 16-56 years). The factors leading to instability after primary ACL reconstruction were retrospectively reviewed. Results Fifty-one knees failed because of bone tunnel malposition, with too-anterior femoral tunnels (20 knees), posterior wall blowout (1 knee), vertical femoral tunnels (7 knees), too-posterior tibial tunnels (12 knees), and too-anterior tibial tunnels (10 knees). Another knee had been treated with open surgery, where the femoral tunnel was drilled through the medial condyle and the tibial tunnel was too anterior. Five knees were found with malposition of the fixation. One knee with an allograft was suspected of rejection and a second surgery was performed to remove the graft. Three knees developed recurrent instability after postoperative infection. The other factors included trauma (48 knees) and unidentified causes (12 knees). Conclusion Technical errors were the main factors leading to instability after primary ACL reconstruction, while attention should also be paid to the risk factors of re-injury and failure of graft incorporation.

  18. An integrated sampling and analysis approach for improved biodiversity monitoring

    Science.gov (United States)

    DeWan, Amielle A.; Zipkin, Elise F.

    2010-01-01

    Successful biodiversity conservation requires high quality monitoring data and analyses to ensure scientifically defensible policy, legislation, and management. Although monitoring is a critical component in assessing population status and trends, many governmental and non-governmental organizations struggle to develop and implement effective sampling protocols and statistical analyses because of the magnitude and diversity of species in conservation concern. In this article we describe a practical and sophisticated data collection and analysis framework for developing a comprehensive wildlife monitoring program that includes multi-species inventory techniques and community-level hierarchical modeling. Compared to monitoring many species individually, the multi-species approach allows for improved estimates of individual species occurrences, including rare species, and an increased understanding of the aggregated response of a community to landscape and habitat heterogeneity. We demonstrate the benefits and practicality of this approach to address challenges associated with monitoring in the context of US state agencies that are legislatively required to monitor and protect species in greatest conservation need. We believe this approach will be useful to regional, national, and international organizations interested in assessing the status of both common and rare species.
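
    As a simplified stand-in for the community-level hierarchical models mentioned above, the sketch below fits a single-species occupancy model (occupancy probability psi, detection probability p) to simulated detection histories by maximum likelihood; the multi-species extension and covariates are omitted.

    # Single-species occupancy model fit by maximum likelihood on simulated data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit  # inverse logit

    rng = np.random.default_rng(42)
    n_sites, n_visits = 200, 4
    psi_true, p_true = 0.6, 0.4

    z = rng.binomial(1, psi_true, n_sites)                         # true occupancy
    Y = rng.binomial(1, p_true, (n_sites, n_visits)) * z[:, None]  # detection histories

    def neg_log_lik(theta):
        psi, p = expit(theta)                                      # keep parameters in (0, 1)
        det = Y.sum(axis=1)
        # sites with at least one detection are certainly occupied
        ll_detected = np.log(psi) + det * np.log(p) + (n_visits - det) * np.log(1 - p)
        # all-zero histories: occupied-but-missed or truly unoccupied
        ll_zero = np.log(psi * (1 - p) ** n_visits + (1 - psi))
        return -np.sum(np.where(det > 0, ll_detected, ll_zero))

    fit = minimize(neg_log_lik, x0=np.zeros(2), method="Nelder-Mead")
    psi_hat, p_hat = expit(fit.x)
    naive = np.mean(Y.sum(axis=1) > 0)
    print(f"naive occupancy {naive:.2f}  vs  model estimate psi = {psi_hat:.2f} (true {psi_true})")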

  19. An Ultrasound Image Despeckling Approach Based on Principle Component Analysis

    Directory of Open Access Journals (Sweden)

    Jawad F. Al-Asad

    2014-07-01

    Full Text Available An approach based on principal component analysis (PCA) to filter out multiplicative noise from ultrasound images is presented in this paper. An image with speckle noise is segmented into small dyadic lengths, depending on the original size of the image, and the global covariance matrix is found. A projection matrix is then formed by selecting the dominant eigenvectors of the global covariance matrix. This projection matrix is used to filter speckle noise by projecting each segment into the signal subspace. The approach is based on the assumption that the signal and noise are independent and that the signal subspace is spanned by a subset of a few principal eigenvectors. When applied to simulated and real ultrasound images, the proposed approach has outperformed some popular nonlinear denoising techniques such as 2D wavelets, 2D total variation filtering, and 2D anisotropic diffusion filtering in terms of edge preservation and maximum cleaning of speckle noise. It has also shown lower sensitivity to outliers resulting from the log transformation of the multiplicative noise.
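
    A rough patch-based simplification of the idea described above (not the authors' exact algorithm): log-transform the multiplicative speckle, project patches onto the leading eigenvectors of their covariance matrix, and map back. The synthetic image, patch size and number of retained components are assumptions.

    # Patch-based PCA despeckling sketch on a synthetic multiplicative-noise image.
    import numpy as np

    rng = np.random.default_rng(0)
    clean = np.outer(np.hanning(128), np.hanning(128)) + 0.2
    speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # mean-1 speckle

    def pca_despeckle(img, patch=8, n_keep=4):
        logimg = np.log(img + 1e-6)
        h, w = logimg.shape
        patches = (logimg.reshape(h // patch, patch, w // patch, patch)
                         .transpose(0, 2, 1, 3).reshape(-1, patch * patch))
        mean = patches.mean(axis=0)
        centered = patches - mean
        cov = centered.T @ centered / len(patches)
        eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalue order
        basis = eigvecs[:, -n_keep:]                    # leading principal directions
        denoised = centered @ basis @ basis.T + mean    # project onto signal subspace
        out = (denoised.reshape(h // patch, w // patch, patch, patch)
                       .transpose(0, 2, 1, 3).reshape(h, w))
        return np.exp(out)

    restored = pca_despeckle(speckled)
    mse = lambda a, b: float(np.mean((a - b) ** 2))
    print("MSE before:", round(mse(speckled, clean), 4), " after:", round(mse(restored, clean), 4))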

  20. Group theoretical approach to the analysis of interferometers

    International Nuclear Information System (INIS)

    A geometric approach to the analysis of interferometers is presented in which beam splitters and phase shifters are described as performing rotations in an abstract three space on the state vector of the light. This formulation allows one to take full advantage of representation theory for the group SU(2) in analyzing interferometer sensitivity. With these techniques it is easy to show that in conventional interferometry, where light enters only one input port, the phase sensitivity scales as the inverse square root of the number of photons that pass through the interferometer during measurement time. By injecting suitably correlated light into both input ports of an interferometer, a phase sensitivity approaching 1/n can be achieved, where n is the number of photons. Input states that allow this sensitivity can readily be identified using SU(2) representation theory. Methods of generating such states using four-wave mixers or parametric amplifiers are described. By employing the group SU(1,1) this formalism may be extended to describe active interferometers in which the beam splitters are replaced by four-wave mixers or parametric devices. Such interferometers can also achieve a phase sensitivity approaching 1/n. The potential advantages of these interferometers over conventional interferometers will be discussed
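
    For reference, the standard Jordan-Schwinger mapping that underlies this SU(2) picture, and the two sensitivity scalings discussed above, can be summarised as follows (textbook relations, not equations taken from the paper):

    % Two modes a, b map onto angular-momentum operators; a 50/50 beam splitter acts
    % as a rotation about J_x and a phase shifter as a rotation about J_z.
    \begin{align}
      J_x &= \tfrac{1}{2}\left(a^{\dagger}b + b^{\dagger}a\right), \quad
      J_y  = \tfrac{1}{2i}\left(a^{\dagger}b - b^{\dagger}a\right), \quad
      J_z  = \tfrac{1}{2}\left(a^{\dagger}a - b^{\dagger}b\right), \\
      \Delta\phi_{\text{coherent}} &\sim \frac{1}{\sqrt{n}}
      \qquad\text{(single-port input, shot-noise limit)}, \\
      \Delta\phi_{\text{correlated}} &\sim \frac{1}{n}
      \qquad\text{(suitably correlated two-port input, Heisenberg-type scaling)}.
    \end{align}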

  1. A QSAR approach for virtual screening of lead-like molecules en route to antitumor and antibiotic drugs from marine and microbial natural products

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-05-01

    Figure 1. The 15 previously unreported lead antibiotic MNPs and MbNPs from the AntiMarin database, selected using the best Rfs antibiotic model with a probability of being antibiotic greater than or equal to 0.8. Figure 2. The 4 selected lead antitumor MNPs and MbNPs from the AntiMarin database, selected using the best Rfs antitumor model with a probability of being antitumor greater than or equal to 0.8. The present work corroborates, on the one hand, the results of our previous work [6] and enables the presentation of a new set of possible lead-like bioactive compounds. Additionally, the usefulness of quantum-chemical descriptors in the discrimination of biologically active and inactive compounds is shown. The use of the εHOMO quantum-chemical descriptor in the discrimination of large-scale data sets of lead-like or drug-like compounds has never been reported. This approach results in a great reduction of the number of compounds used in real screens, and it reinforces the results of our previous work. Furthermore, besides virtual screening, the computational methods can be very useful for building appropriate databases, allowing for effective shortcuts in NP extract dereplication procedures, which will certainly result in increasing the efficiency of drug discovery.

  2. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we need to detect whether all possible exceptions are raised and caught or not. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying whether the test suite is effective or not. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
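
    A toy illustration of the mutation-analysis loop (written in Python rather than Java, and using generic arithmetic/guard mutations instead of the paper's exception-handling operators): generate mutants, run the test suite against each, and report the mutation score.

    # Mutation analysis in miniature: each mutant changes one construct of the program
    # under test; the mutation score is the fraction of mutants killed by the tests.
    ORIGINAL = "def safe_div(a, b):\n    if b == 0:\n        raise ValueError('b is zero')\n    return a / b\n"

    MUTATIONS = [
        ("b == 0", "b != 0"),                       # negate the guard
        ("a / b", "a * b"),                         # wrong operator
        ("raise ValueError('b is zero')", "pass"),  # swallow the exception path
    ]

    def test_suite(fn):
        """Return True if all tests pass for the given implementation."""
        try:
            assert fn(6, 3) == 2
            try:
                fn(1, 0)
                return False               # expected an exception
            except ValueError:
                pass
            return True
        except Exception:
            return False

    def load(src):
        ns = {}
        exec(src, ns)
        return ns["safe_div"]

    assert test_suite(load(ORIGINAL)), "tests must pass on the original program"

    killed = 0
    for old, new in MUTATIONS:
        mutant_src = ORIGINAL.replace(old, new, 1)
        if not test_suite(load(mutant_src)):
            killed += 1
    print(f"mutation score = {killed}/{len(MUTATIONS)}")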

  3. Information Retrieval and Graph Analysis Approaches for Book Recommendation

    Directory of Open Access Journals (Sweden)

    Chahinez Benkoussas

    2015-01-01

    Full Text Available A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models: probabilistic models such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach to a related-document network comprised of social links. We call Directed Graph of Documents (DGD) a network constructed with documents and the social information provided from each one of them. Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
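
    The sketch below shows plain power-iteration PageRank over a tiny directed document graph, as a stand-in for the paper's Directed Graph of Documents; how the authors combine PageRank with the InL2 and language-model retrieval scores is not reproduced, and the toy graph is an assumption.

    # Power-iteration PageRank over a toy directed graph of documents.
    import numpy as np

    # adjacency: edge i -> j means document i links (socially) to document j
    edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2), (3, 1)]
    n = 4
    A = np.zeros((n, n))
    for i, j in edges:
        A[i, j] = 1.0

    # column-stochastic transition matrix with uniform jumps from dangling nodes
    out_deg = A.sum(axis=1)
    P = np.where(out_deg[:, None] > 0, A / np.maximum(out_deg[:, None], 1), 1.0 / n).T

    d = 0.85                         # damping factor
    r = np.full(n, 1.0 / n)
    for _ in range(100):
        r_new = (1 - d) / n + d * P @ r
        if np.abs(r_new - r).sum() < 1e-10:
            break
        r = r_new

    print("PageRank scores:", np.round(r, 3))   # document 2 collects the most link mass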

  4. Genotypic and environmental variation in cadmium, chromium, lead and copper in rice and approaches for reducing the accumulation

    International Nuclear Information System (INIS)

    The field scale trials revealed significant genotypic and environmental differences in grain heavy metal (HM) concentrations of 158 newly developed rice varieties grown in twelve locations of Zhejiang province of China. Grain Pb and Cd contents in 5.3% and 0.4% samples, respectively, were above the maximum permissible concentration (MPC); none of samples had Cr/Cu exceeding MPC. Stepwise multiple linear regression analysis estimated soil HM critical levels for safe rice production. Low grain HM accumulation cultivars such as Xiushui817, Jiayou08-1 and Chunyou689 were recommended as suitable cultivars for planting in slight/medium HM contaminated soils. The alleviating regulator (AR) of (NH4)2SO4 as N fertilizer coupled with foliar spray of a mixture containing glutathione (GSH), Si, Zn and Se significantly decreased grain Cd, Cr, Cu and Pb concentrations grown in HM contaminated fields with no effect on yield, indicating a promising measurement for further reducing grain HM content to guarantee safe food production. - Highlights: • Field trials evaluated situation of grain HM in main rice growing areas of Zhejiang. • Forecasting index system to predict rice grain HM concentration was achieved. • Hybrid rice holds higher grain Cd concentration than conventional cultivars. • Low grain HM accumulation rice cultivars were successfully identified. • Developed alleviating regulator which effectively reduced grain toxic HM

  5. Genotypic and environmental variation in cadmium, chromium, lead and copper in rice and approaches for reducing the accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Fangbin; Wang, Runfeng [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China); Cheng, Wangda [Jiaxing Academy of Agricultural Sciences, Jiaxing 314016 (China); Zeng, Fanrong; Ahmed, Imrul Mosaddek; Hu, Xinna; Zhang, Guoping [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China); Wu, Feibo, E-mail: wufeibo@zju.edu.cn [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China)

    2014-10-15

    The field scale trials revealed significant genotypic and environmental differences in grain heavy metal (HM) concentrations of 158 newly developed rice varieties grown in twelve locations of Zhejiang province of China. Grain Pb and Cd contents in 5.3% and 0.4% samples, respectively, were above the maximum permissible concentration (MPC); none of samples had Cr/Cu exceeding MPC. Stepwise multiple linear regression analysis estimated soil HM critical levels for safe rice production. Low grain HM accumulation cultivars such as Xiushui817, Jiayou08-1 and Chunyou689 were recommended as suitable cultivars for planting in slight/medium HM contaminated soils. The alleviating regulator (AR) of (NH{sub 4}){sub 2}SO{sub 4} as N fertilizer coupled with foliar spray of a mixture containing glutathione (GSH), Si, Zn and Se significantly decreased grain Cd, Cr, Cu and Pb concentrations grown in HM contaminated fields with no effect on yield, indicating a promising measurement for further reducing grain HM content to guarantee safe food production. - Highlights: • Field trials evaluated situation of grain HM in main rice growing areas of Zhejiang. • Forecasting index system to predict rice grain HM concentration was achieved. • Hybrid rice holds higher grain Cd concentration than conventional cultivars. • Low grain HM accumulation rice cultivars were successfully identified. • Developed alleviating regulator which effectively reduced grain toxic HM.

  6. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    Full Text Available To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs. Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.

  7. An Approach for Economic Analysis of Intermodal Transportation

    Directory of Open Access Journals (Sweden)

    Bahri Sahin

    2014-01-01

    Full Text Available A different intermodal transportation model based on cost analysis considering technical, economical, and operational parameters is presented. The model consists of such intermodal modes as sea-road, sea-railway, road-railway, and multimode of sea-road-railway. A case study of cargo transportation has been carried out by using the suggested model. Then, the single road transportation mode has been compared to intermodal modes in terms of transportation costs. This comparison takes into account the external costs of intermodal transportation. The research reveals that, in the short distance transportation, single transportation modes always tend to be advantageous. As the transportation distance gets longer, intermodal transportation advantages begin to be effective on the costs. In addition, the proposed method in this study leads to determining the fleet size and capacity for transportation and the appropriate transportation mode.
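
    A toy break-even calculation in the spirit of the model described above; every rate (per tonne-km costs, transshipment charge, external-cost surcharge, road share of the intermodal chain) is an assumption for illustration, not a value from the paper.

    # Assumed unit costs: road vs a sea-road intermodal chain with a fixed transshipment
    # charge and a notional external-cost surcharge, to locate the break-even distance.
    def road_cost(distance_km, tonnes=1.0):
        variable = 0.08 * distance_km * tonnes      # EUR per tonne-km, assumed
        external = 0.02 * distance_km * tonnes      # congestion/emissions surcharge, assumed
        return variable + external

    def sea_road_cost(distance_km, tonnes=1.0, road_share=0.2):
        transshipment = 15.0 * tonnes               # fixed port handling, assumed
        sea_leg = 0.03 * distance_km * (1 - road_share) * tonnes
        road_leg = (0.08 + 0.02) * distance_km * road_share * tonnes
        return transshipment + sea_leg + road_leg

    for dist in (100, 300, 500, 1000, 2000):
        r, im = road_cost(dist), sea_road_cost(dist)
        print(f"{dist:>5} km: road {r:7.1f} EUR  intermodal {im:7.1f} EUR  "
              f"-> {'road' if r < im else 'intermodal'} cheaper")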

  8. CFD analysis of decay heat removal scenarios of the lead cooled ELSY reactor

    International Nuclear Information System (INIS)

    The lead-cooled European reactor concept ELSY is characterized by its innovative, compact design, in which all components of the primary loops are located inside the reactor vessel. The vessel includes 8 steam generators and pumps which generate a coolant flow of 126 tons/s. At nominal operating conditions the core releases about 1500 MW, converted to an electric power of 600 MW. Coolant temperatures of 400 C at the core inlet and 480 C at the core outlet are envisaged. If the reactor is shut down and the pumps are switched off, the decay heat is removed by isolation condenser (IC) systems acting on the SG secondary circuits. If the ICs are not available, heat removal by 4 dip coolers is foreseen, which can be operated by gravity-driven water flow or by air. Additionally, the outer vessel wall is permanently cooled by a RVACS (reactor vessel air cooling) system located between the outer vessel wall and the reactor cavity. The main intention of this work is the investigation of the passive cooling systems which are used for decay heat removal. The CFD vessel model of about 20 million cells takes advantage of the components' symmetry and simulates a 90 deg. section of the reactor. The spatial resolution of the computational grid varies between 5 mm close to walls and 100 mm in undisturbed regions where only small gradients are expected. The core, the pumps and the SGs are simulated with porous media models including volumetric source terms for momentum and energy. For the SGs, detailed CFD studies with at least one order of magnitude finer grid resolution are performed in order to obtain data for pressure losses. Solids like pipe walls or the core barrel are taken into account by heat conduction. As the model considers a closed system, the coolant flow is controlled by momentum sources of the pumps and frictional losses, mainly in the core and the SGs. The simulations are performed as single-phase flows, therefore the free lead surface at the unclosed upper part of the vessel

  9. Computerized two-lead resting ECG analysis for the detection of coronary artery stenosis

    OpenAIRE

    Eberhard Grube, Andreas Bootsveld, Seyrani Yuecel, Joseph T. Shen, Michael Imhoff

    2007-01-01

    Background: Resting electrocardiogram (ECG) shows limited sensitivity and specificity for the detection of coronary artery disease (CAD). Several methods exist to enhance sensitivity and specificity of resting ECG for diagnosis of CAD, but such methods are not better than a specialist's judgement. We compared a new computer-enhanced, resting ECG analysis device, 3DMP, to coronary angiography to evaluate the device's accuracy in detecting hemodynamically relevant CAD. Methods: A convenience sa...

  10. Heat Recovery in a Pasta Factory. Pinch Analysis Leads to Optimal Heat Pump Usage.

    OpenAIRE

    Staine, Frédéric; Favrat, Daniel; Krummenacher, Pierre

    1994-01-01

    In the previous issue of the IEA Heat Pump Centre Newsletter (Vol. 12, No. 3, pp. 29-31), an article by these authors described the use of pinch analysis (also known as pinch technology) in a buildings application. This article describes a similar procedure for integrating a heat pump into a pasta production process. Many industrial processes, and particularly those dealing with drying, are characterized by an overabundance of low-grade heat which often cannot be effi...

  11. Scientific publications from Arab world in leading journals of Integrative and Complementary Medicine: a bibliometric analysis

    OpenAIRE

    Zyoud, Sa’ed H; Al-Jabi, Samah W.; Sweileh, Waleed M.

    2015-01-01

    Background Bibliometric analysis is increasingly employed as a useful tool to assess the quantity and quality of research performance. The specific goal of the current study was to evaluate the performance of research output originating from Arab world and published in international Integrative and Complementary Medicine (ICM) journals. Methods Original scientific publications and reviews from the 22 Arab countries that were published in 22 international peer-reviewed ICM journals during all ...

  12. Application of CFD to Safety and Thermal-Hydraulic Analysis of Lead-Cooled Systems

    OpenAIRE

    Jeltsov, Marti

    2011-01-01

    Computational Fluid Dynamics (CFD) is increasingly being used in nuclear reactor safety analysis as a tool that enables safety-related physical phenomena occurring in the reactor coolant system to be described in more detail and accuracy. Validation is a necessary step in improving the predictive capability of a computational code or coupled computational codes. Validation refers to the assessment of model accuracy incorporating any uncertainties (aleatory and epistemic) that may be of importance....

  13. A life cycle analysis approach to D and D decision-making

    Energy Technology Data Exchange (ETDEWEB)

    Yuracko, K.L.; Gresalfi, M. [Oak Ridge National Lab., TN (United States); Yerace, P. [Dept. of Energy, Fernald, OH (United States). Fernald Environmental Management; Flora, J. [West Valley Demonstration Project, NY (United States); Krstich, M.; Gerrick, D. [Environmental Management Solutions, Mason, OH (United States)

    1998-05-01

    This paper describes a life cycle analysis (LCA) approach that makes decontamination and decommissioning (D and D) of US Department of Energy facilities more efficient and more responsive to the concerns of the society. With the considerable complexity of D and D projects and their attendant environmental and health consequences, projects can no longer be designed based on engineering and economic criteria alone. Using the LCA D and D approach, the evaluation of material disposition alternatives explicitly includes environmental impacts, health and safety impacts, socioeconomic impacts, and stakeholder attitudes -- in addition to engineering and economic criteria. Multi-attribute decision analysis is used to take into consideration the uncertainties and value judgments that are an important part of all material disposition decisions. Use of the LCA D and D approach should lead to more appropriate selections of material disposition pathways and a decision-making process that is both understandable and defensible. The methodology and procedures of the LCA D and D approach are outlined and illustrated by an application of the approach at the Department of Energy`s West Valley Demonstration Project. Specifically, LCA was used to aid decisions on disposition of soil and concrete from the Tank Pad D and D Project. A decision tree and the Pollution Prevention/Waste Minimization Users Guide for Environmental Restoration Projects were used to identify possible alternatives for disposition of the soil and concrete. Eight alternatives encompassing source reduction, segregation, treatment, and disposal were defined for disposition of the soil; two alternatives were identified for disposition of the concrete. Preliminary results suggest that segregation and treatment are advantageous in the disposition of both the soil and the concrete. This and other recent applications illustrate the strength and ease of application of the LCA D and D approach.

  14. A life cycle analysis approach to D and D decision-making

    International Nuclear Information System (INIS)

    This paper describes a life cycle analysis (LCA) approach that makes decontamination and decommissioning (D and D) of US Department of Energy facilities more efficient and more responsive to the concerns of the society. With the considerable complexity of D and D projects and their attendant environmental and health consequences, projects can no longer be designed based on engineering and economic criteria alone. Using the LCA D and D approach, the evaluation of material disposition alternatives explicitly includes environmental impacts, health and safety impacts, socioeconomic impacts, and stakeholder attitudes -- in addition to engineering and economic criteria. Multi-attribute decision analysis is used to take into consideration the uncertainties and value judgments that are an important part of all material disposition decisions. Use of the LCA D and D approach should lead to more appropriate selections of material disposition pathways and a decision-making process that is both understandable and defensible. The methodology and procedures of the LCA D and D approach are outlined and illustrated by an application of the approach at the Department of Energy's West Valley Demonstration Project. Specifically, LCA was used to aid decisions on disposition of soil and concrete from the Tank Pad D and D Project. A decision tree and the Pollution Prevention/Waste Minimization Users Guide for Environmental Restoration Projects were used to identify possible alternatives for disposition of the soil and concrete. Eight alternatives encompassing source reduction, segregation, treatment, and disposal were defined for disposition of the soil; two alternatives were identified for disposition of the concrete. Preliminary results suggest that segregation and treatment are advantageous in the disposition of both the soil and the concrete. This and other recent applications illustrate the strength and ease of application of the LCA D and D approach

  15. Observation and analysis of nanodomain textures in dielectric relaxor lead magnesium niobate

    Energy Technology Data Exchange (ETDEWEB)

    Bursill, L.A.; Qian, Hua; Peng, Julin; Fan, X.D.

    1995-10-01

    High-resolution (0.2nm) images are used to locate chemical domains occurring with length scales of 1-5nm in the dielectric relaxor lead magnesium niobate (PMN). The experimental HRTEM images are analysed using computer-simulations and image matching in order to clarify and characterize the nature of the chemical ordering. Madelung electrostatic energy calculations are used to rank a set of structural models for possible ordered and disordered distributions of Nb and Mg over the B-sites of perovskite ABO{sub 3}. Next, the chemical domain textures are modelled using next-nearest-neighbour Ising (NNNI) models and Monte Carlo methods. This results in a preferred model for the B-site distribution (the extended NNN-Ising model), which is used for image simulations. Both HRTEM many-beam bright-and dark-field and single-beam dark-field TEM images are obtained and compared with the experimental images. The final result is a realistic atomic model for the Nb, Mg distribution of PMN. 42 refs., 2 tabs., 10 figs.
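
    A schematic 2D Metropolis Monte Carlo with nearest- and next-nearest-neighbour couplings, to illustrate the NNNI-type modelling mentioned above; the couplings and temperature are arbitrary, the lattice is 2D rather than the 3D perovskite B-site sublattice, and the 1:2 Mg:Nb composition constraint is not enforced.

    # NNNI-type Metropolis sweep on a periodic 2D lattice (illustrative parameters).
    import numpy as np

    rng = np.random.default_rng(3)
    L = 32
    J1, J2 = -1.0, 0.4        # NN antiferro-like, NNN ferro-like (illustrative values)
    beta = 1.5                # inverse temperature (illustrative)
    spins = rng.choice([-1, 1], size=(L, L))   # +1 ~ Nb-like, -1 ~ Mg-like occupancy

    def local_field(s, i, j):
        nn = (s[(i + 1) % L, j] + s[(i - 1) % L, j] +
              s[i, (j + 1) % L] + s[i, (j - 1) % L])
        nnn = (s[(i + 1) % L, (j + 1) % L] + s[(i + 1) % L, (j - 1) % L] +
               s[(i - 1) % L, (j + 1) % L] + s[(i - 1) % L, (j - 1) % L])
        return J1 * nn + J2 * nnn

    # Energy convention E = -sum J s_i s_j, so flipping spin (i, j) costs dE = 2 s h.
    for sweep in range(200):
        for _ in range(L * L):
            i, j = rng.integers(L), rng.integers(L)
            dE = 2.0 * spins[i, j] * local_field(spins, i, j)
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1

    # Short-range order parameter: NN correlation (negative = chemical-ordering tendency)
    nn_corr = np.mean(spins * np.roll(spins, 1, axis=0))
    print("NN correlation after equilibration:", round(float(nn_corr), 3))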

  16. Observation and analysis of nanodomain textures in dielectric relaxor lead magnesium niobate

    International Nuclear Information System (INIS)

    High-resolution (0.2nm) images are used to locate chemical domains occurring with length scales of 1-5nm in the dielectric relaxor lead magnesium niobate (PMN). The experimental HRTEM images are analysed using computer-simulations and image matching in order to clarify and characterize the nature of the chemical ordering. Madelung electrostatic energy calculations are used to rank a set of structural models for possible ordered and disordered distributions of Nb and Mg over the B-sites of perovskite ABO3. Next, the chemical domain textures are modelled using next-nearest-neighbour Ising (NNNI) models and Monte Carlo methods. This results in a preferred model for the B-site distribution (the extended NNN-Ising model), which is used for image simulations. Both HRTEM many-beam bright-and dark-field and single-beam dark-field TEM images are obtained and compared with the experimental images. The final result is a realistic atomic model for the Nb, Mg distribution of PMN. 42 refs., 2 tabs., 10 figs

  17. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network, with the scope divided into three major categories: (1) integrity management, (2) operations, and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  18. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB based image analysis tool kit to analyze CT generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates, in contrast to standard principal component analysis (PCA), a set of components with sparse loadings, used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  19. Chemical characterization of tin-lead glazed ceramics from Aragon (Spain) by neutron activation analysis

    International Nuclear Information System (INIS)

    Majolica pottery was the most characteristic tableware produced in Spain during the Medieval and Renaissance periods. A study of the three main production centers in the historical region of Aragon during Middle Ages and Renaissance was conducted on a set of 71 samples. The samples were analyzed by instrumental neutron activation analysis (INAA), and the resulting data were interpreted using an array of multivariate statistical procedures. Our results show a clear discrimination among different production centers allowing a reliable provenance attribution of ceramic sherds from the Aragonese workshops. (orig.)

  20. Chemical characterization of tin-lead glazed ceramics from Aragon (Spain) by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Inanez, J.G. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Barcelona Univ. (Spain). Facultat de Geografia i Historia; Speakman, R.J. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Buxeda i Garrigos, J. [Barcelona Univ. (Spain). Facultat de Geografia i Historia; Glascock, M.D. [Missouri Univ., Columbia, MO (United States). Research Reactor Center

    2010-07-01

    Majolica pottery was the most characteristic tableware produced in Spain during the Medieval and Renaissance periods. A study of the three main production centers in the historical region of Aragon during Middle Ages and Renaissance was conducted on a set of 71 samples. The samples were analyzed by instrumental neutron activation analysis (INAA), and the resulting data were interpreted using an array of multivariate statistical procedures. Our results show a clear discrimination among different production centers allowing a reliable provenance attribution of ceramic sherds from the Aragonese workshops. (orig.)

  1. Analysis of physical mechanisms which lead to destruction of fuel containing materials

    International Nuclear Information System (INIS)

    An analysis of physical processes that may cause the fragility and destruction of the lava-like fuel containing materials (FCM) of the 4th unit of the Chernobyl nuclear plant has been carried out. The following processes are considered: 1) the influence of electric fields arising in a medium with incorporated radionuclides, 2) the role of defect creation by irradiation from the incorporated nuclides, 3) residual mechanical strains caused by cooling from the melting temperature at the time of the accident to the present ambient temperature. It is shown that mechanical strains of this kind might be one of the causes of degradation and destruction of the fuel containing materials

  2. Symmetric or asymmetric oil prices? A meta-analysis approach

    International Nuclear Information System (INIS)

    The analysis of price asymmetries in the gasoline market is one of the most widely studied in energy economics. However, the great variation in the outcomes reported makes the drawing of any definitive conclusions difficult. Given this situation, a meta-analysis serves as an excellent tool to discover which characteristics of the various markets analyzed, and which specific features of these studies, might account for these differences. In adopting such an approach, this paper shows how the particular segment of the industry analyzed, the characteristics of the data, the years under review, the type of publication and the introduction of control variables might explain this heterogeneity in results. The paper concludes on these grounds that increased competition may significantly reduce the possibility of occurrence of asymmetric behavior. These results should therefore be taken into consideration in future studies of asymmetries in the oil industry. - Highlights: ► I study price asymmetries in the gasoline industry through a meta-analysis regression. ► The asymmetries are produced mainly in the retail market. ► The asymmetries are less frequent when we analyze recent cases. ► There may be some degree of publication bias. ► The level of competition may explain the patterns of asymmetry

  3. Synthetic approach to MT analysis of long period data

    International Nuclear Information System (INIS)

    Complete text of publication follows. In magnetotelluric analysis of long period data, between 2 hours and 1 Day, the impedance tensor can be seriously influenced by the Sq-variation. The superposition of the plane wave variations arising from Dst and the magnetic daily variation arising from Sq-current results in a polarisation of the magnetic variational field. Our initial study shows that this polarisation is not a problem for the analysis, if there is a significant non-polarized magnetic component in the data. The study now focuses on the polarized part of the magnetic variational field because it contains much more energy which results in better estimates. While a complete and constant polarised field can not be analyzed using a bivariate approach, there are day-to-day changes in the polarisation direction of the daily variations which should allow a bivariate analysis. To get a better understanding of the effect of Sq-vortex on the impedance tensor, a program is used which generates synthetic magnetic and electric fields using a synthetic and therefore known impedance tensor. The advantage is that every part of the variational-fields are known, hence you can relate the outcome to its excitation.

  4. Analysis of induced electrical currents from magnetic field coupling inside implantable neurostimulator leads

    Directory of Open Access Journals (Sweden)

    Seidman Seth J

    2011-10-01

    Full Text Available Abstract Background Over the last decade, the number of neurostimulator systems implanted in patients has been rapidly growing. Nearly 50, 000 neurostimulators are implanted worldwide annually. The most common type of implantable neurostimulators is indicated for pain relief. At the same time, commercial use of other electromagnetic technologies is expanding, making electromagnetic interference (EMI of neurostimulator function an issue of concern. Typically reported sources of neurostimulator EMI include security systems, metal detectors and wireless equipment. When near such sources, patients with implanted neurostimulators have reported adverse events such as shock, pain, and increased stimulation. In recent in vitro studies, radio frequency identification (RFID technology has been shown to inhibit the stimulation pulse of an implantable neurostimulator system during low frequency exposure at close distances. This could potentially be due to induced electrical currents inside the implantable neurostimulator leads that are caused by magnetic field coupling from the low frequency identification system. Methods To systematically address the concerns posed by EMI, we developed a test platform to assess the interference from coupled magnetic fields on implantable neurostimulator systems. To measure interference, we recorded the output of one implantable neurostimulator, programmed for best therapy threshold settings, when in close proximity to an operating low frequency RFID emitter. The output contained electrical potentials from the neurostimulator system and those induced by EMI from the RFID emitter. We also recorded the output of the same neurostimulator system programmed for best therapy threshold settings without RFID interference. Using the Spatially Extended Nonlinear Node (SENN model, we compared threshold factors of spinal cord fiber excitation for both recorded outputs. Results The electric current induced by low frequency RFID emitter

  5. Detection and Analysis of Lead,Cadmium and Arsenic Content in Common Vegetables

    Institute of Scientific and Technical Information of China (English)

    Yining; HE; Peixia; CHENG; Ming; WANG; Minyu; HU

    2014-01-01

    This study was carried out to detect the content of heavy metals (Pb, Cd, and As) in vegetables, understand the current situation of heavy metal contamination in vegetables, and provide a scientific reference for further research. It randomly selected 6 large vegetable markets and 6 supermarkets in Changsha City, selected 8 types of typical vegetables, and tested 96 samples. In accordance with the maximum levels of contaminants in foods in the existing GB2762-2012 standard, the Nemerow composite pollution index (Pt) and its grading standards, samples were evaluated as: uncontaminated (Pt ≤ 1), mildly contaminated (1 < Pt ≤ 2), moderately contaminated (2 < Pt ≤ 3), and highly contaminated (Pt > 3). Among the 96 samples, the ranges of Pb, Cd and As content are (0.06-1.41), (0.06-1.26) and (0.00-0.91) mg/kg respectively; the over-limit rates of these metals exceeding the safety level are 78.13%, 45.83%, and 34.38% respectively; the composite pollution index lies in (0.90-6.05), with eggplant at 6.05 and hot pepper at 3.24; the content of Pb (F = 23.908, P = 0.001) and Cd (F = 64.908, P = 0.000) differs significantly between the 8 types of vegetables, while there is no significant difference in As content (F = 4.634, P = 0.705 > 0.05) among the 8 types of vegetables. The study shows that common vegetables in Changsha City have a problem of excess Pb, Cd and As, and the Pb over-limit rate is the highest. The composite pollution index indicates that most heavy metal contamination of vegetables is mild to moderate, that contamination of melon and fruit vegetables is high, and that Cd is the major factor leading to the contamination of melon and fruit vegetables.
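
    The grading above is based on the Nemerow composite pollution index; the sketch below computes it in its commonly used form, with placeholder limit values and a hypothetical sample rather than the study's data.

    # Nemerow composite pollution index: P_i = C_i / S_i per metal, then
    # P_t = sqrt((mean(P_i)^2 + max(P_i)^2) / 2). Limits and sample are placeholders.
    import math

    limits = {"Pb": 0.1, "Cd": 0.05, "As": 0.5}     # assumed limits (mg/kg), illustration only

    def nemerow(sample_mg_per_kg):
        ratios = [sample_mg_per_kg[m] / limits[m] for m in limits]
        mean_p, max_p = sum(ratios) / len(ratios), max(ratios)
        return math.sqrt((mean_p ** 2 + max_p ** 2) / 2.0)

    def grade(pt):
        if pt <= 1:  return "uncontaminated"
        if pt <= 2:  return "mildly contaminated"
        if pt <= 3:  return "moderately contaminated"
        return "highly contaminated"

    sample = {"Pb": 0.35, "Cd": 0.08, "As": 0.10}   # hypothetical sample
    pt = nemerow(sample)
    print(f"Pt = {pt:.2f} -> {grade(pt)}")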

  6. Numerical and experimental analysis of factors leading to suture dehiscence after Billroth II gastric resection.

    Science.gov (United States)

    Cvetkovic, Aleksandar M; Milasinovic, Danko Z; Peulic, Aleksandar S; Mijailovic, Nikola V; Filipovic, Nenad D; Zdravkovic, Nebojsa D

    2014-11-01

    The main goal of this study was to numerically quantify the risk of duodenal stump blowout after Billroth II (BII) gastric resection. Our hypothesis was that the geometry of the reconstructed tract after BII resection is one of the key factors that can lead to duodenal dehiscence. We used computational fluid dynamics (CFD) with finite element (FE) simulations of various models of the BII reconstructed gastrointestinal (GI) tract, as well as non-perfused, ex vivo, porcine experimental models. As the main geometrical parameters for the FE postoperative models we used duodenal stump length and the inclination between the gastric remnant and the duodenal stump. Virtual gastric resection was performed on each of the 3D FE models based on multislice computed tomography (CT) DICOM images. According to our computer simulation, the difference between maximal duodenal stump pressures for the models with the most and least preferable geometry of the reconstructed GI tract is about 30%. We compared the resulting postoperative duodenal pressure from the computer simulations with the duodenal stump dehiscence pressure from the experiment. The pressure at the duodenal stump after BII resection obtained by computer simulation is 4-5 times lower than the dehiscence pressure according to our experiment on an isolated bowel segment. Our conclusion is that if the surgery is performed technically correctly, geometry variations of the reconstructed GI tract by themselves are not sufficient to cause duodenal stump blowout. The pressure that develops in the duodenal stump after BII resection using an omega loop can cause duodenal dehiscence only in conjunction with other risk factors. Increased duodenal pressure after BII resection is a risk factor. Hence we recommend the routine use of a Roux-en-Y anastomosis as a safer solution in terms of the resulting intraluminal pressure. However, if the surgeon decides to perform a BII reconstruction, results obtained with this methodology can be valuable. PMID:25201585

  7. Pencil lead scratches on steel surfaces as a substrate for LIBS analysis of dissolved salts in liquids

    Energy Technology Data Exchange (ETDEWEB)

    Jijon, D; Costa, C, E-mail: judijival@hotmail.com [Departamento de Fisica, Escuela Politecnica Nacional, Ladron de Guevara E11-256, Apartado 17-12-866, Quito (Ecuador)

    2011-01-01

    A new substrate for the quantitative analysis of salts dissolved in liquids with Laser-induced Breakdown Spectroscopy (LIBS) is introduced for the first time. A steel surface scratched with HB pencil lead is introduced as a very efficient and sensitive substrate for quantitative analysis of dissolved salts in liquids. In this work we demonstrate the analytical quality of this system with the analysis of the crystalline deposits formed by the dried aqueous solutions of salts. We focused on analytical parameters such as sensitivity and linearity for the salt cations in each case. Four salts were studied (Sr(NO{sub 3}){sub 2}, LiSO{sub 4}, RbCl and BaCl), at nine different concentrations each. To improve linearity and lower the overall error in the calibration curves, we introduce a novel outlier removal method that takes into account the homogeneity of the dry deposits on the analytical surface.

  8. Pencil lead scratches on steel surfaces as a substrate for LIBS analysis of dissolved salts in liquids

    International Nuclear Information System (INIS)

    A new substrate for the quantitative analysis of salts dissolved in liquids with Laser-induced Breakdown Spectroscopy (LIBS) is introduced for the first time. A steel surface scratched with HB pencil lead serves as a very efficient and sensitive substrate for the quantitative analysis of dissolved salts in liquids. In this work we demonstrate the analytical quality of this system through the analysis of the crystalline deposits formed by dried aqueous salt solutions. We focused on analytical parameters such as sensitivity and linearity for the salt cations in each case. Four salts were studied (Sr(NO3)2, LiSO4, RbCl and BaCl), at nine different concentrations each. To improve linearity and lower the overall error in the calibration curves, we introduce a novel outlier removal method that takes into account the homogeneity of the dry deposits on the analytical surface.

  9. Microcanonical thermostatistics analysis without histograms: cumulative distribution and Bayesian approaches

    CERN Document Server

    Alves, Nelson A; Rizzi, Leandro G

    2015-01-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...
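
    The truncated abstract omits the details, but the histogram-free idea can be illustrated in its simplest setting: for a single canonical simulation at inverse temperature beta0, the sampled energy density satisfies H(E) proportional to Omega(E) exp(-beta0 E), so beta(E) = beta0 + d ln H(E)/dE, and H(E) can be estimated with a kernel density (a smoothed empirical distribution) instead of a binned histogram. The sketch below is only a schematic of that idea, not the ST-WHAM, cumulative-distribution or Bayesian formulations used by the authors.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Toy canonical sample at inverse temperature beta0, drawn from a model whose
# density of states is Omega(E) ~ E^2, so the exact result is beta(E) = 2 / E.
beta0 = 1.0
energies = rng.gamma(shape=3.0, scale=1.0 / beta0, size=50_000)

# Kernel density estimate of H(E): no explicit energy binning is required.
kde = gaussian_kde(energies)
E = np.linspace(np.quantile(energies, 0.05), np.quantile(energies, 0.95), 200)
lnH = np.log(kde(E))

# beta(E) = beta0 + d ln H(E) / dE, evaluated by finite differences on the grid.
beta_E = beta0 + np.gradient(lnH, E)

print(np.round(beta_E[:5], 3))   # histogram-free estimate
print(np.round(2.0 / E[:5], 3))  # exact beta(E) for this toy model
```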

  10. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution, depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen...

  11. Comparative Analysis for Polluted Agricultural Soils with Arsenic, Lead, and Mercury in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Yarto-Ramirez, Mario; Santos-Santos, Elvira; Gavilan-Garcia, Arturo; Castro-Diaz, Jose; Gavilan-Garcia, Irma Cruz; Rosiles, Rene; Suarez, Sara

    2004-03-31

    The use of mercury in Mexico has been associated with the mining industry of Zacatecas. This activity has polluted several areas currently used for agriculture. The main objective of this study was to investigate the heavy metal concentrations (Hg, As and Pb) in soil of Guadalupe, Zacatecas, in order to justify a further environmental risk assessment of the site. A 2 x 3 km grid was used for the sampling process and 20 soil samples were taken. The analysis was carried out using the EPA SW-846: 3050B/6010B method for arsenic and metals and EPA SW-846: 7471A for total mercury. It was concluded that heavy metals are present in agricultural soils used for corn and bean farming. An environmental risk assessment and a bioavailability study are therefore required to determine whether there is a risk of heavy metal bioaccumulation in animals or humans, or of metal leaching to aquifers.

  12. A comparative analysis of research and development in Iran and four leading countries

    Directory of Open Access Journals (Sweden)

    Tayeb Dehghani

    2015-09-01

    Full Text Available Research and Development (R&D) is indicative of a country's advancement, and investment in R&D units is considered a competitive advantage. Nowadays, only countries which attempt to pave the way for rapid scientific advance and economic success, by pursuing new technological advances and bridging their knowledge and technology gaps, can take part in global competition. The present paper aims at contributing to the extant literature and investigates various R&D success factors and their status in Iran and in developed countries including Japan, the U.S., China, and Germany. A comparative analysis of the countries follows, and ultimately some suggestions are put forward to develop the research status in Iran. In the light of the results, industrial countries, in comparison with Iran, view investment in R&D as an indispensable principle of the economy and have increasingly developed the requisite infrastructure for its growth as well as for the full bloom of science and technology.

  13. Design analysis of a lead-lithium/supercritical CO2 printed circuit heat exchanger for primary power recovery

    International Nuclear Information System (INIS)

    The Spanish National Program TECNOFUS is developing the functionalities of the dual-coolant breeding blanket (DCBB) design concept and its plant auxiliary systems for a future power reactor (DEMO). The dual-coolant He/liquid-metal breeding blanket option is the best placed option in terms of power conversion efficiency. Recent results for an optimum power conversion lay-out show gross efficiencies in the range of 47%, with eutectic lead-lithium as the main primary nuclear power recovery fluid and supercritical CO2 for the secondary circuit. This work presents a design exercise (sizing, pressure drop optimization and tritium permeation analysis) of a lead-lithium/CO2 printed circuit heat exchanger in the DCBB primary coolant power range.

  14. Meta-analysis for 2 x 2 tables: a Bayesian approach.

    Science.gov (United States)

    Carlin, J B

    1992-01-30

    This paper develops and implements a fully Bayesian approach to meta-analysis, in which uncertainty about effects in distinct but comparable studies is represented by an exchangeable prior distribution. Specifically, hierarchical normal models are used, along with a parametrization that allows a unified approach to deal easily with both clinical trial and case-control study data. Monte Carlo methods are used to obtain posterior distributions for parameters of interest, integrating out the unknown parameters of the exchangeable prior or 'random effects' distribution. The approach is illustrated with two examples, the first involving a data set on the effect of beta-blockers after myocardial infarction, and the second based on a classic data set comprising 14 case-control studies on the effects of smoking on lung cancer. In both examples, rather different conclusions from those previously published are obtained. In particular, it is claimed that widely used methods for meta-analysis, which involve complete pooling of 'O-E' values, lead to understatement of uncertainty in the estimation of overall or typical effect size. PMID:1349763
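
    A minimal Monte Carlo sketch of the generic hierarchical normal ('random effects') model described above is given below; it is not Carlin's implementation. Study estimates y_i with known within-study variances s_i^2 are assumed, the heterogeneity parameter tau is handled on a grid with a uniform prior, and the overall effect mu has a flat prior; all numbers are invented and are not the beta-blocker or smoking data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical study-level effect estimates (e.g., log odds ratios) and their
# standard errors -- illustrative numbers only.
y = np.array([-0.28, -0.12, -0.35, 0.05, -0.22, -0.40])
s = np.array([0.15, 0.20, 0.25, 0.30, 0.18, 0.22])

# Hierarchical normal model: y_i ~ N(theta_i, s_i^2), theta_i ~ N(mu, tau^2),
# flat prior on mu, uniform prior on tau over a grid.
tau_grid = np.linspace(1e-4, 1.0, 400)

def log_marginal(tau):
    """log p(y | tau) with mu integrated out under a flat prior."""
    w = 1.0 / (s**2 + tau**2)
    mu_hat = np.sum(w * y) / np.sum(w)
    V = 1.0 / np.sum(w)
    return (0.5 * np.log(V)
            - 0.5 * np.sum(np.log(s**2 + tau**2))
            - 0.5 * np.sum(w * (y - mu_hat)**2))

log_post = np.array([log_marginal(t) for t in tau_grid])
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Monte Carlo: draw tau from its (gridded) posterior, then mu | tau, y.
draws = []
for tau in rng.choice(tau_grid, size=5000, p=post):
    w = 1.0 / (s**2 + tau**2)
    mu_hat = np.sum(w * y) / np.sum(w)
    draws.append(rng.normal(mu_hat, np.sqrt(1.0 / np.sum(w))))

draws = np.array(draws)
print(f"posterior mean of mu: {draws.mean():.3f}, 95% interval: "
      f"({np.quantile(draws, 0.025):.3f}, {np.quantile(draws, 0.975):.3f})")
```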

  15. Equivalence of ADM Hamiltonian and Effective Field Theory approaches at next-to-next-to-leading order spin1-spin2 coupling of binary inspirals

    International Nuclear Information System (INIS)

    The next-to-next-to-leading order spin1-spin2 potential for an inspiralling binary, which is essential for accuracy to fourth post-Newtonian order if both components in the binary are spinning rapidly, has recently been derived independently via the ADM Hamiltonian and the Effective Field Theory approaches, using different gauges and variables. Here we show the complete physical equivalence of the two results, thereby providing the first proof of the equivalence of the ADM Hamiltonian and the Effective Field Theory approaches at next-to-next-to-leading order with the inclusion of spins. The main difficulty in the spinning sectors, which also prescribes the manner in which the comparison of the two results is tackled here, is the existence of redundant unphysical spin degrees of freedom, associated with the spin gauge choice of a point within the extended spinning object for its representative worldline. After gauge fixing and eliminating the unphysical degrees of freedom of the spin and its conjugate at the level of the action, we arrive at curved spacetime generalizations of the Newton-Wigner variables in closed form, which can also be used to obtain further Hamiltonians, based on an Effective Field Theory formulation and computation. Finally, we make use of our validated result to provide gauge-invariant relations among the binding energy, angular momentum, and orbital frequency of an inspiralling binary with generic compact spinning components to fourth post-Newtonian order, including all sectors known to date

  16. A Chemoinformatics Approach to the Discovery of Lead-Like Molecules from Marine and Microbial Sources En Route to Antitumor and Antibiotic Drugs

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-01-01

    Full Text Available The comprehensive information on small molecules and their biological activities in the PubChem database allows chemoinformatics researchers to access and make use of large-scale biological activity data to improve the precision of drug profiling. A Quantitative Structure-Activity Relationship approach, for classification, was used for the prediction of active/inactive compounds relative to overall biological activity, antitumor activity and antibiotic activity, using a data set of 1804 compounds from PubChem. Using the best classification models for antibiotic and antitumor activities, a data set of marine and microbial natural products from the AntiMarin database was screened, and 57 and 16 new lead compounds for antibiotic and antitumor drug design were proposed, respectively. All compounds proposed by our approach are classified as non-antibiotic and non-antitumor compounds in the AntiMarin database. Recently, several of the lead-like compounds proposed by us were reported as being active in the literature.

  17. THE EVOLUTION OF THEORETICAL APPROACHES OF THE CORPORATIONS’ ANALYSIS

    Directory of Open Access Journals (Sweden)

    Victor BAZILEVICH

    2013-12-01

    Full Text Available The article explores the methodological basis of the analysis of corporations. Theoretical approaches to determining their economic substance are summarized, and the characteristic features inherent in corporations are identified. The study of the evolution of scientific views on the formation and development of corporations in the world economic literature reveals the most significant theoretical and methodological concepts and underscores the need for further scientific work on the formation of corporations within the economic system. Nowadays scientists are united by a perception of corporations as a complex economic institution, but there is no unity of opinion on its essential features. Thus, a holistic theory of corporations is still at the stage of active formation.

  18. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive, parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangement and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  19. The Network Analysis of Urban Streets: A Primal Approach

    CERN Document Server

    Porta, S; Latora, V; Porta, Sergio; Crucitti, Paolo; Latora, Vito

    2005-01-01

    The network metaphor in the analysis of urban and territorial cases has a long tradition, especially in transportation/land-use planning and economic geography. More recently, urban design has brought its contribution by means of the "space syntax" methodology. All these approaches, though under different terms like accessibility, proximity, integration, connectivity, cost or effort, focus on the idea that some places (or streets) are more important than others because they are more central. The study of centrality in complex systems, however, originated in other scientific areas, namely in structural sociology, well before its use in urban studies; moreover, as a structural property of the system, centrality has never been extensively investigated metrically in geographic networks as it has been topologically in a wide range of other relational networks like social, biological or technological. After two previous works on some structural properties of the dual and primal graph representations of urban street ne...

  20. A Subjective Risk Analysis Approach of Container Supply Chains

    Institute of Scientific and Technical Information of China (English)

    Zai-Li Yang; Jin Wang; Steve Bonsall; Jian-Bo Yang; Quan-Gen Fang

    2005-01-01

    The 9/11 terrorist attacks, the lock-out of the American West Coast ports in 2002 and the outbreak of SARS in 2003 have further focused the minds of both the public and industrialists on taking effective and timely measures for assessing and controlling the risks related to container supply chains (CSCs). However, due to the complexity of the risks in the chains, conventional quantitative risk assessment (QRA) methods may not be capable of providing sufficient safety management information, as achieving such functionality requires the ability to conduct risk analysis in view of the challenges and uncertainties posed by the unavailability and incompleteness of historical failure data. Combining fuzzy set theory (FST) and an evidential reasoning (ER) approach, the paper presents a subjective method to deal with vulnerability-based risks, which are more ubiquitous and uncertain than the traditional hazard-based ones in the chains.

  1. Sensitivity analysis for nonrandom dropout: a local influence approach.

    Science.gov (United States)

    Verbeke, G; Molenberghs, G; Thijs, H; Lesaffre, E; Kenward, M G

    2001-03-01

    Diggle and Kenward (1994, Applied Statistics 43, 49-93) proposed a selection model for continuous longitudinal data subject to nonrandom dropout. It has provoked a large debate about the role for such models. The original enthusiasm was followed by skepticism about the strong but untestable assumptions on which this type of model invariably rests. Since then, the view has emerged that these models should ideally be made part of a sensitivity analysis. This paper presents a formal and flexible approach to such a sensitivity assessment based on local influence (Cook, 1986, Journal of the Royal Statistical Society, Series B 48, 133-169). The influence of perturbing a missing-at-random dropout model in the direction of nonrandom dropout is explored. The method is applied to data from a randomized experiment on the inhibition of testosterone production in rats. PMID:11252620

  2. Statistical approach to the analysis of cell desynchronization data

    Science.gov (United States)

    Milotti, Edoardo; Del Fabbro, Alessio; Dalla Pellegrina, Chiara; Chignola, Roberto

    2008-07-01

    Experimental measurements on semi-synchronous tumor cell populations show that after a few cell cycles they desynchronize completely, and this desynchronization reflects the intercell variability of cell-cycle duration. It is important to identify the sources of randomness that desynchronize a population of cells living in a homogeneous environment: for example, being able to reduce randomness and induce synchronization would aid in targeting tumor cells with chemotherapy or radiotherapy. Here we describe a statistical approach to the analysis of the desynchronization measurements that is based on minimal modeling hypotheses, and can be derived from simple heuristics. We use the method to analyze existing desynchronization data and to draw conclusions on the randomness of cell growth and proliferation.

  3. Analysis of patient diaries in Danish ICUs: a narrative approach

    DEFF Research Database (Denmark)

    Egerod, Ingrid; Christensen, Doris

    2009-01-01

    OBJECTIVES: The objective was to describe the structure and content of patient diaries written for critically ill patients in Danish intensive care units (ICUs). BACKGROUND: Critical illness is associated with physical and psychological aftermath including cognitive impairment and post-traumatic stress. Patient diaries written in the intensive care unit are used to help ICU-survivors come to terms with their illness. RESEARCH METHODOLOGY: The study had a qualitative, descriptive and explorative design, using a narrative approach of analysis. Data were analysed on several levels: extra-case level... The analysis of the narratives may pave the way for insights to improve critical care nursing and ICU rehabilitation.

  4. Stochastic approach to observability analysis in water networks

    Directory of Open Access Journals (Sweden)

    S. Díaz

    2016-07-01

    Full Text Available This work presents an alternative technique to the existing methods for observability analysis (OA in water networks, which is a prior essential step for the implementation of state estimation (SE techniques within such systems. The methodology presented here starts from a known hydraulic state and assumes random gaussian distributions for the uncertainty of some hydraulic variables, which is then propagated to the rest of the system. This process is repeated again to analyze the change in the network uncertainty when metering devices considered as error-free are included, based on which the network observability can be evaluated. The method’s potential is presented in an illustrative example, which shows the additional information that this methodology provides with respect to traditional OA approaches. This proposal allows a better understanding of the network and constitutes a practical tool to prioritize the location of additional meters, thus enhancing the transformation of large urban areas into actual smart cities.
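
    A schematic illustration of the underlying idea is sketched below: assumed Gaussian uncertainty on known variables is propagated by Monte Carlo through a (here deliberately trivial) hydraulic relation, and the propagation is repeated after an error-free meter fixes one variable, to see how much the uncertainty of an unmeasured variable shrinks. The one-pipe model, its head-loss coefficient and all numbers are invented; the paper's methodology operates on full network models.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20_000  # Monte Carlo samples

def downstream_head(h0, q, r=0.8):
    """Toy hydraulic model: head at the end of a single pipe, loss = r * q**2."""
    return h0 - r * q**2

# Case 1: both upstream head and flow are uncertain (Gaussian assumptions).
h0 = rng.normal(50.0, 1.0, N)   # upstream head [m]
q = rng.normal(2.0, 0.3, N)     # pipe flow [m3/s]
h1 = downstream_head(h0, q)
print(f"no flow meter  : std(h1) = {h1.std():.3f} m")

# Case 2: an (assumed error-free) flow meter pins q to its reading.
h1_metered = downstream_head(h0, 2.0)
print(f"with flow meter: std(h1) = {h1_metered.std():.3f} m")
```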

  5. Strategic Technology Investment Analysis: An Integrated System Approach

    Science.gov (United States)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results satisfy the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations, along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  6. Pediatric consent: case study analysis using a principles approach.

    Science.gov (United States)

    Azotam, Adaorah N U

    2012-07-01

    This article will explore pediatric consent through the analysis of a clinical case study using the principles of biomedical ethics approach. The principles of autonomy, nonmaleficence, beneficence, and justice will be applied in an attempt to resolve the ethical dilemma. The main conflict in this case study is whether the wishes of an adolescent for end-of-life care should be followed or whether the desire of his parents should outweigh this request. In terminal cancer, the hope of early palliative care and dignity in dying serve as priorities in therapy. Application of the moral principles to both sides of the dilemma aided in providing an objective resolution that upholds pediatric consent. PMID:22753459

  7. New program with new approach for spectral data analysis

    CERN Document Server

    Sochi, Taha

    2013-01-01

    This article presents a high-throughput computer program, called EasyDD, for batch processing, analyzing and visualizing of spectral data; particularly those related to the new generation of synchrotron detectors and X-ray powder diffraction applications. This computing tool is designed for the treatment of large volumes of data in reasonable time with affordable computational resources. A case study in which this program was used to process and analyze powder diffraction data obtained from the ESRF synchrotron on an alumina-based nickel nanoparticle catalysis system is also presented for demonstration. The development of this computing tool, with the associated protocols, is inspired by a novel approach in spectral data analysis.

  8. Random matrix approach to multivariate categorical data analysis

    CERN Document Server

    Patil, Aashay

    2015-01-01

    Correlation and similarity measures are widely used in all the areas of sciences and social sciences. Often the variables are not numbers but are instead qualitative descriptors called categorical data. We define and study similarity matrix, as a measure of similarity, for the case of categorical data. This is of interest due to a deluge of categorical data, such as movie ratings, top-10 rankings and data from social media, in the public domain that require analysis. We show that the statistical properties of the spectra of similarity matrices, constructed from categorical data, follow those from random matrix theory. We demonstrate this approach by applying it to the data of Indian general elections and sea level pressures in North Atlantic ocean.
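
    A minimal sketch of the basic construction is shown below: a similarity matrix for categorical data, here taken as the fraction of questions on which two respondents give the same answer, and its eigenvalue spectrum. The data are random, and the comparison with random matrix theory predictions carried out by the authors is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical categorical data: n respondents answering m questions,
# each question with a handful of possible categories.
n, m, n_categories = 200, 50, 5
data = rng.integers(0, n_categories, size=(n, m))

# Similarity between two respondents: fraction of identical answers.
S = (data[:, None, :] == data[None, :, :]).mean(axis=2)

# Spectrum of the (symmetric) similarity matrix.
eigenvalues = np.linalg.eigvalsh(S)
print("three largest eigenvalues:", np.round(eigenvalues[-3:], 3))
print("smallest eigenvalue      :", np.round(eigenvalues[0], 3))
```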

  9. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  10. A practical approach to object based requirements analysis

    Science.gov (United States)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  11. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The draw and diversity of a destination's offer are antecedents of growth in tourism visits. Differentiation of the destination's supply is achieved through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination's supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. When considering the return on investment, modifying the destination's tourism offer on the basis of a tourism whim is indeed a risky endeavour. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist vacations. Based on the research conducted into tourism trends, a model for evaluating tourism phenomena is proposed, one that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  12. Qualitative Organic Analysis: An Efficient, Safer, and Economical Approach to Preliminary Tests and Functional Group Analysis

    Science.gov (United States)

    Dhingra, Sunita; Angrish, Chetna

    2011-01-01

    Qualitative organic analysis of an unknown compound is an integral part of the university chemistry laboratory curriculum. This type of training is essential as students learn to approach a problem systematically and to interpret the results logically. However, considerable quantities of waste are generated by using conventional methods of…

  13. Study of the speciation of lead and zinc in industrial dusts and slags and in a contaminated soil: a spectroscopic approach

    International Nuclear Information System (INIS)

    As the study of the physicochemical forms of metals in polluted soils is necessary to understand their mobilisation, and therefore to assess the risk they represent for the environment, the objective of this research thesis is to determine the speciation of lead and zinc in a soil contaminated by particles (dust and slag) released by a lead production plant. This determination is performed using a spectroscopic approach: optical microscopy, X-ray diffraction, scanning electron microscopy, transmission electron microscopy, electron microprobe analysis, and Raman micro-spectrometry. In order to understand the evolution of the speciation of the metals and of their propagation in soils, dust and slag produced by the industrial process were sampled and morphologically characterized. Associations of the metals with other compounds such as iron oxides and carbonates have been highlighted. The author shows that contact with the ground results in a greater alteration of the particles and in metal mobilisation. She reports the study of lead and zinc localisation in various particles, and of the influence of a change in soil physicochemical conditions (pH decrease, reduction by soil clogging during humid periods)

  14. A Critical Analysis of Rational & Emotional Approaches in Car Selling

    Directory of Open Access Journals (Sweden)

    Krishn A. Goyal

    2010-12-01

    Full Text Available It is a well known fact that investment in a car is the costliest investment made in a lifetime, next only to the construction of a house. It is common knowledge that all of us are attracted to cars from childhood and develop our own perceptions of them. When we acquire the capacity to buy a car, the buying experience involves both emotional and rational aspects which lead to a purchase decision. Unlike other consumer durables, the decision to buy a specific brand of car is shaped over a long period of time. The period between recognition of the need to buy a car and the actual purchase may run into many weeks or even months. Considerable research has focussed on conceptually and operationally defining the various factors that lead to a purchase decision. However, because of the inherent difficulties in deciphering consumer behaviour, coupled with exponential changes in consumer aspirations, there is a need to constantly re-define our perceptions of consumer behaviour. Although the Revealed Preference Theory of Samuelson, the Bounded Rationality Theory of Herbert Simon and many others have provided a conceptual analysis of consumer behaviour from the perspective of economics, we have still not been able to pinpoint whether consumers are rational or emotional when it comes to buying cars. According to some early economic theorists (e.g., Adam Smith, Jeremy Bentham, Alfred Marshall), man's/woman's desire for goods and services exceeds his/her ability to pay. Therefore, buying decisions are made through a rational process during which we assign a value to each desired product or service offering based upon our assessment of the ability of that offering to satisfy our needs and desires. This want-satisfying ability is termed "utility." As different offerings possess different levels of utility, rational behavior dictates that one seek to maximize utility.

  15. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  16. Comparative proteomic analysis of Typha angustifolia leaf under chromium, cadmium and lead stress

    International Nuclear Information System (INIS)

    The present study investigated the Typha angustifolia leaf proteome in response to Cr, Cd and Pb stress. T. angustifolia plants 90 d (D90) and 130 d (D130) old were subjected to 1 mM Cr, Cd or Pb, and samples were collected 30 d after treatment. 2-DE coupled with MS (mass spectrometry) was used to analyze and identify Cr-, Cd- and Pb-responsive proteins. More than 1600 protein spots were reproducibly detected on each gel, of which 44, 46 and 66 spots in D90 samples and 33, 26 and 62 spots in D130 samples were differentially expressed by Cr, Cd and Pb relative to the control, respectively. Of these differentially expressed proteins, 3, 1 and 8 overlapped between D90 and D130, while 5, 8 and 5 had regulation factors above 3 in either the D90 or the D130 samples. A total of 22 up-regulated and 4 down-regulated proteins were identified using MS and database analysis. Cr induced the expression of ATP synthase, the RuBisCO small subunit and coproporphyrinogen III oxidase; Cd induced the RuBisCO large subunit; Pb up-regulated the carbohydrate metabolic pathway enzyme fructokinase and increased RuBisCO activase, the RuBisCO large subunit and Mg-protoporphyrin IX chelatase. Conversely, eIF4F was inhibited by Cr and Pb, while the chloroplast FtsZ-like protein and GF14omega were impeded by Cd and Pb, respectively.

  17. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Full Text Available Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those affected are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events, which leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic chokes, snores, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
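
    A compact sketch of the multiscale entropy (MSE) feature idea is given below: coarse-grain a signal at several scales, compute sample entropy at each scale, and feed the resulting coefficients to a random forest. The signal is synthetic, and the parameter choices (m = 2, r = 0.2 x SD, 4 scales) are conventional defaults rather than the study's settings; the naive O(N^2) sample entropy is for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

def sample_entropy(x, m=2, r_factor=0.2):
    """Naive O(N^2) sample entropy with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def matched_pairs(length):
        templates = np.array([x[i:i + length] for i in range(n - m)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (dist <= r).sum() - len(templates)  # exclude self-matches

    b = matched_pairs(m)
    a = matched_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.nan

def mse_coefficients(x, scales=(1, 2, 3, 4)):
    """Coarse-grain the signal at each scale and compute sample entropy."""
    x = np.asarray(x, dtype=float)
    coeffs = []
    for s in scales:
        trimmed = x[:len(x) - len(x) % s]
        coarse = trimmed.reshape(-1, s).mean(axis=1)
        coeffs.append(sample_entropy(coarse))
    return coeffs

def make_signal(disordered, length=800):
    """Synthetic audio-like envelope: a tone plus weak or strong noise."""
    t = np.arange(length)
    noise_level = 1.0 if disordered else 0.2
    return np.sin(2 * np.pi * t / 50) + rng.normal(0, noise_level, length)

labels = np.array([0] * 20 + [1] * 20)
X = np.array([mse_coefficients(make_signal(d)) for d in labels])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```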

  18. Service quality measurement. A new approach based on Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Valerio Gatta

    2013-03-01

    Full Text Available This article is concerned with the measurement of service quality. The main objective is to suggest an alternative criterion for service quality definition and measurement. After a brief description of the most traditional techniques, and with the intent to overcome some of their critical shortcomings, I focus my attention on choice-based conjoint analysis, a particular stated preference method that estimates the structure of consumers' preferences given their choices between alternative service options. Discrete choice models and the traditional compensatory utility maximization framework are extended by the inclusion of attribute cutoffs in the formulation of the decision problem. The major theoretical aspects of the described approach are examined and discussed, showing that it is able to identify the relative importance of the relevant attributes, to calculate elasticities and monetary evaluations, and to determine a service quality index. Simulations then enable the identification of potential service quality levels, so that marketing managers have valuable information with which to plan their best business strategies. We present findings from an empirical study in the public transport sector designed to gain insights into the use of choice-based conjoint analysis.
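
    The core of choice-based conjoint analysis is a discrete choice model such as the conditional (multinomial) logit, whose part-worth utilities are estimated by maximum likelihood from observed choices among alternative service options and can then be turned into monetary evaluations. The sketch below fits such a model to simulated choices with invented attributes; it does not include the attribute-cutoff extension discussed in the article.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

# Simulated choice tasks: n_obs tasks, each offering 3 service alternatives
# described by three invented attributes (e.g., frequency, comfort, price).
n_obs, n_alt, n_attr = 500, 3, 3
X = rng.normal(size=(n_obs, n_alt, n_attr))
true_beta = np.array([1.0, 0.5, -0.8])  # invented part-worth utilities

utility = X @ true_beta + rng.gumbel(size=(n_obs, n_alt))
choice = utility.argmax(axis=1)  # index of the chosen alternative in each task

def neg_log_likelihood(beta):
    v = X @ beta                           # systematic utilities
    v = v - v.max(axis=1, keepdims=True)   # numerical stability
    log_p = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_p[np.arange(n_obs), choice].sum()

fit = minimize(neg_log_likelihood, x0=np.zeros(n_attr), method="BFGS")
print("estimated part-worths:", np.round(fit.x, 2))
# Monetary evaluation of attribute 0, using attribute 2 as the price term.
print("implied willingness-to-pay:", round(-fit.x[0] / fit.x[2], 2))
```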

  19. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Tewfik Ahmed H

    2006-01-01

    Full Text Available Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.
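
    As a toy illustration of what a 'bicluster with constant values' is, the sketch below plants a constant submatrix in a random gene-by-condition matrix and recovers it with elementary boolean and arithmetic operations, without solving any optimization problem. This is not the authors' algorithm, only a way to fix ideas; the planted value and matrix sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(6)

# Expression-like matrix (genes x conditions) with a planted constant bicluster.
M = rng.integers(1, 9, size=(20, 12)).astype(float)
genes, conds, value = [2, 5, 7, 11, 16], [1, 4, 8, 10], 42.0
M[np.ix_(genes, conds)] = value

def constant_bicluster(M, v):
    """Rows and columns whose overlap on value v forms a constant submatrix."""
    mask = (M == v)
    rows = np.where(mask.any(axis=1))[0]            # rows containing v at all
    cols = np.where(mask[rows].all(axis=0))[0]      # columns equal to v on those rows
    rows = np.where(mask[:, cols].all(axis=1))[0]   # tighten rows on those columns
    sub = M[np.ix_(rows, cols)]
    assert sub.size == 0 or (sub == v).all()        # constant-value check
    return rows, cols

rows, cols = constant_bicluster(M, value)
print("recovered rows   :", rows.tolist())
print("recovered columns:", cols.tolist())
```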

  20. The approach to risk analysis in three industries

    International Nuclear Information System (INIS)

    It is the purpose of this paper to review how risk and safety analysis is performed in the three major industries of nuclear power, space flight, and chemical and petroleum processing. The underlying reason for such a review is the belief that efficiencies and safety enhancements may result from a greater exchange of risk assessment technology between these industries. The thrust of this discussion relates to the engineered systems involved in the three industries. The industries are very different. The chemical industry epitomizes the highly competitive private sector and its bottom-line emphasis; the nuclear power industry is unique in the degree to which it is regulated; and the space industry is essentially a government business just beginning to have commercial implications. Institutional differences are extreme; however, all three serve societal needs, and their safety implications have a far-reaching impact on public opinion and support. In reviewing the risk and safety analysis activities, particular attention is given to the use of quantitative approaches such as probabilistic risk assessment (PRA) as it has evolved in the nuclear power industry

  1. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Science.gov (United States)

    Tchagang, Alain B.; Tewfik, Ahmed H.

    2006-12-01

    Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.

  2. Mediman: Object oriented programming approach for medical image analysis

    International Nuclear Information System (INIS)

    Mediman is a new image analysis package which has been developed to quantitatively analyze Positron Emission Tomography (PET) data. It is object-oriented, written in C++, and its user interface is based on InterViews, on top of which new classes have been added. Mediman accesses data using an external data representation or an import/export mechanism, which avoids data duplication. Multimodality studies are organized in a simple database which includes images, headers, color tables, lists, objects of interest (OOI's) and history files. Stored color table parameters allow the user to focus directly on the interesting portion of the dynamic range. Lists allow the study to be organized according to modality, acquisition protocol, time and spatial properties. OOI's (points, lines and regions) are stored in absolute 3-D coordinates, allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOI's have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance and volume calculations on selected OOI's. An image calculator is connected to Mediman. Quantitation of metabolic images is performed via profiles, sectorization, time-activity curves and kinetic modeling. Mediman is menu and mouse driven, macro-commands can be recorded and replayed, and its interface is customizable through a configuration file. The benefits of the object-oriented approach are discussed from a development point of view

  3. Pathway analysis in attention deficit hyperactivity disorder: An ensemble approach.

    Science.gov (United States)

    Mooney, Michael A; McWeeney, Shannon K; Faraone, Stephen V; Hinney, Anke; Hebebrand, Johannes; Nigg, Joel T; Wilmot, Beth

    2016-09-01

    Despite a wealth of evidence for the role of genetics in attention deficit hyperactivity disorder (ADHD), specific and definitive genetic mechanisms have not been identified. Pathway analyses, a subset of gene-set analyses, extend the knowledge gained from genome-wide association studies (GWAS) by providing functional context for genetic associations. However, there are numerous methods for association testing of gene sets and no real consensus regarding the best approach. The present study applied six pathway analysis methods to identify pathways associated with ADHD in two GWAS datasets from the Psychiatric Genomics Consortium. Methods that utilize genotypes to model pathway-level effects identified more replicable pathway associations than methods using summary statistics. In addition, pathways implicated by more than one method were significantly more likely to replicate. A number of brain-relevant pathways, such as RhoA signaling, glycosaminoglycan biosynthesis, fibroblast growth factor receptor activity, and pathways containing potassium channel genes, were nominally significant by multiple methods in both datasets. These results support previous hypotheses about the role of regulation of neurotransmitter release, neurite outgrowth and axon guidance in contributing to the ADHD phenotype and suggest the value of cross-method convergence in evaluating pathway analysis results. © 2016 Wiley Periodicals, Inc. PMID:27004716

  4. Application of potentiometric stripping analysis with constant inverse current in the analytic step for determining lead in glassware

    Directory of Open Access Journals (Sweden)

    ZVONIMIR J. SUTUROVIC

    2002-03-01

    Full Text Available Trace amounts of lead in extraction glassware products were determined by potentiometric stripping analysis with constant inverse current in the analytic step (PSA-iR), an electrochemical technique of high sensitivity and selectivity. This paper investigates the effect of a number of factors on the results of PSA-iR determination of lead in glassware, such as the mercury electrodeposition time, the electrolysis potential, the solution stirring rate and the constant inverse current. Linearity of the lead analytical signal was achieved within the range of mass concentrations from 2.5 mg/dm3 to 4.5 mg/dm3. A detection limit of 0.64 mg/dm3 was obtained, with a reproducibility of 4.14% expressed as the coefficient of variation. The analyses were carried out using a computerized stripping analyzer of domestic design and manufacture (Faculty of Technology, Novi Sad, and “Elektrouniverzal”, Leskovac). The accuracy of the method was confirmed by parallel analyses using flameless atomic absorption spectrophotometry as the reference method.
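
    The reported linear range, detection limit and coefficient of variation are the usual figures of merit of such a calibration. A generic sketch of how these numbers are obtained from calibration data is shown below, with the detection limit taken as 3·s_blank/slope; the concentrations, peak areas and blank readings are invented and are not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration data: lead mass concentration (mg/dm3) vs. stripping
# peak area (arbitrary units); numbers are invented for illustration.
conc = np.array([2.5, 3.0, 3.5, 4.0, 4.5])
area = np.array([10.4, 12.5, 14.3, 16.6, 18.5])
blank_replicates = np.array([0.41, 0.38, 0.44, 0.40, 0.39, 0.43])

slope, intercept = np.polyfit(conc, area, 1)      # least-squares calibration line
lod = 3 * blank_replicates.std(ddof=1) / slope    # 3-sigma detection limit

replicates = np.array([14.1, 14.6, 14.3, 13.9, 14.5])   # repeated standard
cv = 100 * replicates.std(ddof=1) / replicates.mean()   # coefficient of variation, %

print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
print(f"detection limit ~ {lod:.2f} mg/dm3, CV = {cv:.1f} %")
```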

  5. Analysis of dijet events in diffractive ep interactions with tagged leading proton at the H1 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Polifka, Richard

    2011-08-15

    Inclusive dijet production in diffractive deep-inelastic scattering is measured. The diffractive selection is based on tagging of the leading proton in the Forward Proton Spectrometer. The statistics of events obtained during the HERA II running period (integrated luminosity of 156.7 pb{sup -1}) enable the measurement of jet final states with a leading proton for the first time. The data cover the phase space of x{sub P}<0.1, vertical stroke t vertical stroke {<=}1.0 GeV{sup 2} and 4{<=} Q{sup 2} {<=}110 GeV{sup 2}. The dijet data are compared with next-to-leading order predictions of quantum chromodynamics (QCD). The phase space of diffractive dijets in this analysis is larger by a factor of 3 in x{sub P} than in previous measurements. The QCD predictions based on the DGLAP parton evolution describe the measured data well, even in a non-DGLAP-enriched phase space where one of the jets goes into the region close to the direction of the outgoing proton. The measured single-differential cross sections are compared to several Monte Carlo models with different treatments of the diffractive exchange implemented. (orig.)

  6. A historical review and bibliometric analysis of research on lead in drinking water field from 1991 to 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jie; Ma, Yuwei [Faculty of Civil Engineering and Geosciences, Delft University of Technology (Netherlands); Zhang, Liang [Institute of Geodesy and Geophysics, Chinese Academy of Sciences, Wuhan 430077 (China); Gan, Fuxing [School of Resource and Environmental Science, Wuhan University, Wuhan 430079 (China); Ho, Yuh-Shan, E-mail: ysho@asia.edu.tw [Water Research Centre, Asia University, Taichung 41354, Taiwan (China); Department of Public Health, China Medical University, Taichung 40402, Taiwan (China)

    2010-03-01

    A bibliometric analysis based on the Science Citation Index (SCI) published by the Institute of Scientific Information (ISI) was carried out to identify global research related to lead in the drinking water field from 1991 to 2007 and to improve the understanding of research trends in that period. The results of this analysis indicate that the number of annual publications increased mainly during two periods: from 1992 to 1997 and from 2004 to 2007. The United States produced 37% of all pertinent articles, followed by India with 8.0% and Canada with 4.8%. Science of the Total Environment published the most articles, followed by Journal American Water Works Association and Toxicology. A summary of the most frequently used keywords is also provided; 'cadmium' was the most popular author keyword over the 17 years. Furthermore, based on the bibliometric results, four research aspects are summarized in this paper and a historical review of the research is also presented.

  7. Chemometric method of spectra analysis leading to isolation of lysozyme and CtDNA spectra affected by osmolytes.

    Science.gov (United States)

    Bruździak, Piotr; Rakowska, Paulina W; Stangret, Janusz

    2012-11-01

    In this paper we present a chemometric method of analysis leading to the isolation of Fourier transform infrared (FT-IR) spectra of biomacromolecules (HEW lysozyme, ctDNA) affected by osmolytes (trimethylamine-N-oxide and N,N,N-trimethylglycine, respectively) in aqueous solutions. The method is based on the difference spectra method primarily used to characterize the structure of solvent affected by a solute. The cyclical use of factor analysis allows precise information to be obtained on the shape of the "affected spectra" of the analyzed biomacromolecules. "Affected spectra" of the selected biomacromolecules give valuable information on their structure in the presence of the osmolytes in solution, as well as on the level of perturbation as a function of osmolyte concentration. The method also offers insight into the mechanism of interaction in these types of systems. It can easily be adapted to various chemical and biochemical problems where vibrational or ultraviolet-visible (UV-Vis) spectroscopy is used. PMID:23146186

  8. A historical review and bibliometric analysis of research on lead in drinking water field from 1991 to 2007

    International Nuclear Information System (INIS)

    A bibliometric analysis based on the Science Citation Index (SCI) published by the Institute of Scientific Information (ISI) was carried out to identify global research related to lead in the drinking water field from 1991 to 2007 and to improve the understanding of research trends in that period. The results of this analysis indicate that the number of annual publications increased mainly during two periods: from 1992 to 1997 and from 2004 to 2007. The United States produced 37% of all pertinent articles, followed by India with 8.0% and Canada with 4.8%. Science of the Total Environment published the most articles, followed by Journal American Water Works Association and Toxicology. A summary of the most frequently used keywords is also provided; 'cadmium' was the most popular author keyword over the 17 years. Furthermore, based on the bibliometric results, four research aspects are summarized in this paper and a historical review of the research is also presented.

  9. Saving the world by teaching behavior analysis: A behavioral systems approach

    OpenAIRE

    Malott, Richard W.; Vunovich, Pamela L.; Boettcher, William; Groeger, Corina

    1995-01-01

    This article presents a behavioral systems approach to organizational design and applies that approach to the teaching of behavior analysis. This systems approach consists of three components: goal-directed systems design, behavioral systems engineering, and performance management. This systems approach is applied to the Education Board and Teaching Behavior Analysis Special Interest Group of the Association for Behavior Analysis, with a conclusion that we need to emphasize the recruitment of...

  10. ANALYSIS, SELECTION AND RANKING OF FOREIGN MARKETS. A COMPREHENSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    LIVIU NEAMŢU

    2013-12-01

    Full Text Available Choosing the appropriate markets for growth and development is essential for a company that wishes to expand its business through international economic exchanges. In this case, however, foreign market research alone is not sufficient, even though it is an important part of the decision process and an indispensable condition for achieving the firm's objectives. Whereas in marketing on the national market the market is already defined and requires no more than prospection and segmentation, in the case of the international market the research process must be complemented by a selection of markets and their ranking. Companies with this intention know little or nothing about the conditions offered by one new market or another. Therefore, they must go, step by step, through a complex, multilevel analysis process composed of the selection and ranking of markets, followed by research proper through exploration and segmentation, which can lead to choosing the most profitable markets. In this regard, within this study we propose a multi-criteria model for the selection and ranking of international development markets, allowing companies access to those markets which comply with the company's development strategy.
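
    A minimal sketch of a multi-criteria ranking of candidate markets is given below: criteria are normalised, weighted according to the company's development strategy, and summed into a score. The criteria, weights and figures are invented for illustration, and the article's own model may define, weight or aggregate the criteria differently.

```python
import numpy as np

# Hypothetical candidate markets scored on three criteria (higher is better
# after normalisation): market size, growth rate, and ease of entry.
markets = ["Market A", "Market B", "Market C", "Market D"]
raw = np.array([
    [120.0, 2.5, 0.6],
    [ 45.0, 7.0, 0.8],
    [200.0, 1.0, 0.3],
    [ 80.0, 4.5, 0.7],
])
weights = np.array([0.5, 0.3, 0.2])   # strategy-dependent, illustrative only

# Min-max normalisation per criterion, then weighted sum.
norm = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))
scores = norm @ weights

for name, score in sorted(zip(markets, scores), key=lambda p: -p[1]):
    print(f"{name}: {score:.2f}")
```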

  11. VOLUMETRIC LEAD ASSAY

    International Nuclear Information System (INIS)

    This report describes a system for handling and radioassay of lead, consisting of a robot, a conveyor, and a gamma spectrometer. The report also presents a cost-benefit analysis of options: radioassay and recycling lead vs. disposal as waste

  12. Sequential injection analysis of lead using time-based colorimetric detection and preconcentration on an anionic-exchange resin.

    Science.gov (United States)

    Aracama, Nestor Zárate; Araújo, Alberto N; Perez-Olmos, Ricardo

    2004-04-01

    The development of a sequential injection analysis manifold for the colorimetric determination of lead in water samples is described. The concentration of lead was assessed from its catalytic effect on the reduction of resazurine by sulfide in an alkaline medium. To that effect, the reaction zone was stopped at the detector, and the time interval required to attain an absorbance decrease of 0.800 at a wavelength of 610 nm was estimated. Interference from other transition metals in the samples was minimized by adding potassium iodide to the sample and retaining the iodocomplexes formed on an on-line anionic resin (AG1 X8). Elution was performed with a 2 mol/L sodium hydroxide solution. The relationship [SIA] microg/L = 0.99 (+/- 0.11) x [ETAAS] microg/L + 0 (+/- 4) was obtained upon comparing the results given by the proposed system and by electrothermal atomization atomic absorption spectrometry (ETAAS) after the analysis of ten water samples. PMID:15116968

  13. Glycan Node Analysis: A Bottom-up Approach to Glycomics.

    Science.gov (United States)

    Zaare, Sahba; Aguilar, Jesús S; Hu, Yueming; Ferdosi, Shadi; Borges, Chad R

    2016-01-01

    facilitates relative quantification of individual glycan nodes in a sample. Although presently constrained in terms of its absolute limits of detection, this method expedites the analysis of clinical biofluids and shows considerable promise as a complementary approach to traditional top-down glycomics. PMID:27284957

  14. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    hybridization method (FISH). This approach was used to evaluate the contribution of each gram-negative bacteria group. No significant difference between the main soil gram-negative bacterial groups (phyla Proteobacteria and Bacteroidetes) was found under either aerobic or anaerobic conditions in the chernozem topsoil. Thus soil gram-negative bacteria, as a common group of microorganisms, play an important ecological role in natural polymer degradation. Another approach, using a cascade filtration technique for estimating bacterial population density in chernozem, was compared to the classical method of fluorescent microscopy. Quantification of soil bacteria by cascade filtration was performed with filters of different pore diameters and a fixed amount of filtered soil suspension. In comparison with the classical fluorescent microscopy method, the modification with filtration of the soil suspension allowed more bacterial cells to be quantified. Thus, biomass estimates of soil bacteria obtained by classical fluorescent microscopy may be underestimated, and combination with the cascade filtration technique helps avoid this potential experimental error. Thereby, the combination and comparison of several modifications of fluorescent microscopy methods established during the research provided complementary approaches to the quantification of soil bacteria and the analysis of the ecological roles of soil microorganisms.

  15. A performance-based approach to landslide risk analysis

    Science.gov (United States)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. Risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = p(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), and 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the areal extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
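    The convolution Risk = p(D >= d | S, V) can be made concrete with a small numerical example. The sketch below, with invented severity classes, annual frequencies and fragility values (none of them from the paper), combines a hazard probability mass function with a fragility curve for one limit state.

```python
# Hedged sketch of the hazard-fragility convolution Risk = p(D >= d | S, V)
# described above. Severity classes, annual frequencies and fragility values
# are invented for illustration; they are not the paper's calibrated values.
import numpy as np

severity_class = np.array([1, 2, 3, 4])               # e.g. landslide kinetic-energy bins
hazard_pmf = np.array([0.10, 0.05, 0.02, 0.005])      # annual probability of each class
fragility = np.array([0.05, 0.30, 0.70, 0.95])        # P(damage >= threshold | class), one limit state

annual_risk = np.sum(hazard_pmf * fragility)          # annual P(D >= d)
for s, p, f in zip(severity_class, hazard_pmf, fragility):
    print(f"class {s}: hazard {p:.3f} x fragility {f:.2f} = {p * f:.4f}")
print(f"annual probability of exceeding the damage threshold: {annual_risk:.4f}")
```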

  16. Investigation of mercury-free potentiometric stripping analysis and the influence of mercury in the analysis of trace-elements lead and zinc

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Andersen, Laust

    1997-01-01

    Application of Potentiometric Stripping Analysis (PSA), without any mercury, to determination of trace-elements lead and zinc, results in linear responses between stripping-peak areas and concentrations within the range 0-2000 ng/g. The best response, as determined by the size of stripping areas......, was obtained with an electrode prepared with mercury but without mercury ions in the electrolyte. In 0.09-0.1 M HCl lead is analysed by a freshly polished glassy-carbon electrode while analysis of zinc requires an electrode activation procedure. The electrode activation is performed by stripping zinc...... is proposed which explains the co-deposition of mercury and test metals in the electrolysis step in terms of a charge-distribution parameter. The model explains that the decrease of stripping peak area, as a function of concentration, is entirely due to mercury ions being simultaneously reduced...

  17. Extreme storm surges: a comparative study of frequency analysis approaches

    Directory of Open Access Journals (Sweden)

    Y. Hamdi

    2013-11-01

    Full Text Available In France, nuclear facilities were designed for very low probabilities of failure. Nevertheless, exceptional climatic events have given rise to surges much larger than previous observations (outliers) and have clearly illustrated the potential for the extreme water levels calculated with current statistical methods to be underestimated. The objective of the present work is to conduct a comparative study of three approaches: the Annual Maxima (AM), the Peaks-Over-Threshold (POT) and the r-Largest Order Statistics (r-LOS). These methods are illustrated in a real case study. All data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of the time series were used. Stability plots of the shape and scale parameters, the mean excess residual life plot and the stability of the standard errors of the return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on: (i) the degree of uncertainty, (ii) adequacy criteria and tests, and (iii) visual inspection. It was found that the r-LOS and POT methods reduced the uncertainty on the distribution parameters and return level estimates and systematically yielded 100 and 500 yr return levels smaller than those estimated with the AM method. The results also showed that none of the compared methods allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativity of outliers in the data sets. The findings are of practical relevance not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions with an appropriate level of risk.
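    To make the AM/POT contrast concrete, the sketch below fits a GEV distribution to annual maxima and a generalized Pareto distribution to threshold exceedances of a purely synthetic surge series, then compares the resulting 100-year return levels. The data, threshold choice and parameter values are illustrative assumptions, not the study's.

```python
# Hedged sketch contrasting the Annual Maxima (AM/GEV) and Peaks-Over-Threshold
# (POT/GPD) approaches compared above. The synthetic daily surge series is purely
# illustrative; a real analysis would use observed skew-surge data and a careful
# threshold-selection step.
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(0)
years = 50
surges = rng.gumbel(loc=0.8, scale=0.25, size=(years, 365))   # synthetic daily surges (m)

# --- Annual Maxima / GEV ---
am = surges.max(axis=1)
c, loc, scale = genextreme.fit(am)
rl100_am = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)

# --- Peaks-Over-Threshold / GPD ---
u = np.quantile(surges, 0.995)                 # threshold (its selection is itself a study step)
exc = surges[surges > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)
lam = exc.size / years                         # mean number of exceedances per year
rl100_pot = u + sigma / xi * ((lam * 100) ** xi - 1)
# (for xi close to 0 the exponential limit u + sigma * np.log(lam * 100) applies)

print(f"100-yr return level, AM/GEV : {rl100_am:.2f} m")
print(f"100-yr return level, POT/GPD: {rl100_pot:.2f} m")
```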

  18. Poverty Analysis of Rice Farming Households: A Multidimensional Approach

    Directory of Open Access Journals (Sweden)

    Adenuga A. H

    2013-12-01

    Full Text Available The official measurement and analysis of poverty in Nigeria has historically relied upon the single-dimension, consumption-based monetary approach, with little attention to multidimensional poverty assessment. This study was therefore carried out to assess the multidimensional poverty index of rice farming households in the Nasarawa/Benue Rice Hub, Nigeria. The study employed a stratified random sampling technique to select 149 rice farming households in the study area. Descriptive statistics, the Alkire and Foster Multidimensional Poverty Index methodology using two different cut-off points, and the Tobit regression model were the main analytical tools employed for the study. The results of the multidimensional poverty index analysis revealed that female-headed households were poorer than male-headed households. Overall, 66 percent of the rice farming households were multidimensionally poor. The study also showed that the rice farming households were deprived in 48 percent of the dimensions. A multidimensional poverty index of 0.32 was obtained for the rice farming households in the study area, with differing values for male- and female-headed households. The results of the Tobit regression model showed that the gender of the household head, health, marital status and membership of an association were the major determinants of multidimensional poverty among the rice farming households in the study area. The study concluded that the rice farming households in the study area were multidimensionally poor. It was recommended that the government give priority to the development of rural areas, with special consideration for women, through the provision of essential infrastructural facilities.
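    The Alkire-Foster index reported above (MPI = H x A) can be illustrated with a toy deprivation matrix. The indicators, weights and cutoff in the sketch below are invented for illustration and do not reproduce the study's dimensions or its two cut-off points.

```python
# Hedged sketch of the Alkire-Foster adjusted headcount ratio (MPI = H x A)
# used in the study above. The deprivation matrix, weights and poverty cutoff
# are invented; the study's own indicators and dual cut-off points are not
# reproduced here.
import numpy as np

# Rows = households, columns = indicators (1 = deprived, 0 = not deprived)
g0 = np.array([
    [1, 1, 0, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
])
w = np.array([0.25, 0.25, 0.25, 0.25])    # indicator weights summing to 1
k = 0.40                                  # poor if deprived in >= 40% of weighted indicators

scores = g0 @ w                           # weighted deprivation score per household
poor = scores >= k
H = poor.mean()                                   # multidimensional headcount ratio
A = scores[poor].mean() if poor.any() else 0.0    # average deprivation share among the poor
MPI = H * A                                       # adjusted headcount ratio
print(f"H = {H:.2f}, A = {A:.2f}, MPI = {MPI:.2f}")
```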

  19. Analysis of risk perception decision context and approaches

    International Nuclear Information System (INIS)

    Risk perception can be studied with various approaches. Some of their determinants are first briefly reviewed in this paper: the initial postulates, the decisional system, and the nature of the decisions concerned. Finally, perceived risk is examined from a constructionist perspective. (author)

  20. Analysis of ballistic transport in nanoscale devices by using an accelerated finite element contact block reduction approach

    Science.gov (United States)

    Li, H.; Li, G.

    2014-08-01

    An accelerated Finite Element Contact Block Reduction (FECBR) approach is presented for computational analysis of ballistic transport in nanoscale electronic devices with arbitrary geometry and unstructured mesh. Finite element formulation is developed for the theoretical CBR/Poisson model. The FECBR approach is accelerated through eigen-pair reduction, lead mode space projection, and component mode synthesis techniques. The accelerated FECBR is applied to perform quantum mechanical ballistic transport analysis of a DG-MOSFET with taper-shaped extensions and a DG-MOSFET with Si/SiO2 interface roughness. The computed electrical transport properties of the devices obtained from the accelerated FECBR approach and associated computational cost as a function of system degrees of freedom are compared with those obtained from the original CBR and direct inversion methods. The performance of the accelerated FECBR in both its accuracy and efficiency is demonstrated.

  1. Analysis of ballistic transport in nanoscale devices by using an accelerated finite element contact block reduction approach

    International Nuclear Information System (INIS)

    An accelerated Finite Element Contact Block Reduction (FECBR) approach is presented for computational analysis of ballistic transport in nanoscale electronic devices with arbitrary geometry and unstructured mesh. Finite element formulation is developed for the theoretical CBR/Poisson model. The FECBR approach is accelerated through eigen-pair reduction, lead mode space projection, and component mode synthesis techniques. The accelerated FECBR is applied to perform quantum mechanical ballistic transport analysis of a DG-MOSFET with taper-shaped extensions and a DG-MOSFET with Si/SiO2 interface roughness. The computed electrical transport properties of the devices obtained from the accelerated FECBR approach and associated computational cost as a function of system degrees of freedom are compared with those obtained from the original CBR and direct inversion methods. The performance of the accelerated FECBR in both its accuracy and efficiency is demonstrated

  2. A combined approach for comparative exoproteome analysis of Corynebacterium pseudotuberculosis

    Directory of Open Access Journals (Sweden)

    Scrivens James H

    2011-01-01

    Full Text Available Abstract Background Bacterial exported proteins represent key components of the host-pathogen interplay. Hence, we sought to implement a combined approach for characterizing the entire exoproteome of the pathogenic bacterium Corynebacterium pseudotuberculosis, the etiological agent of caseous lymphadenitis (CLA in sheep and goats. Results An optimized protocol of three-phase partitioning (TPP was used to obtain the C. pseudotuberculosis exoproteins, and a newly introduced method of data-independent MS acquisition (LC-MSE was employed for protein identification and label-free quantification. Additionally, the recently developed tool SurfG+ was used for in silico prediction of sub-cellular localization of the identified proteins. In total, 93 different extracellular proteins of C. pseudotuberculosis were identified with high confidence by this strategy; 44 proteins were commonly identified in two different strains, isolated from distinct hosts, then composing a core C. pseudotuberculosis exoproteome. Analysis with the SurfG+ tool showed that more than 75% (70/93 of the identified proteins could be predicted as containing signals for active exportation. Moreover, evidence could be found for probable non-classical export of most of the remaining proteins. Conclusions Comparative analyses of the exoproteomes of two C. pseudotuberculosis strains, in addition to comparison with other experimentally determined corynebacterial exoproteomes, were helpful to gain novel insights into the contribution of the exported proteins in the virulence of this bacterium. The results presented here compose the most comprehensive coverage of the exoproteome of a corynebacterial species so far.

  3. Analysis of resource efficiency: a production frontier approach.

    Science.gov (United States)

    Hoang, Viet-Ngu

    2014-05-01

    This article integrates the material/energy flow analysis into a production frontier framework to quantify resource efficiency (RE). The emergy content of natural resources instead of their mass content is used to construct aggregate inputs. Using the production frontier approach, aggregate inputs will be optimised relative to given output quantities to derive RE measures. This framework is superior to existing RE indicators currently used in the literature. Using the exergy/emergy content in constructing aggregate material or energy flows overcomes a criticism that mass content cannot be used to capture different quality of differing types of resources. Derived RE measures are both 'qualitative' and 'quantitative', whereas existing RE indicators are only qualitative. An empirical examination into the RE of 116 economies was undertaken to illustrate the practical applicability of the new framework. The results showed that economies, on average, could reduce the consumption of resources by more than 30% without any reduction in per capita gross domestic product (GDP). This calculation occurred after adjustments for differences in the purchasing power of national currencies. The existence of high variations in RE across economies was found to be positively correlated with participation of people in labour force, population density, urbanisation, and GDP growth over the past five years. The results also showed that economies of a higher income group achieved higher RE, and those economies that are more dependent on imports and primary industries would have lower RE performance. PMID:24632401
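    As a rough illustration of the production frontier idea described above, the sketch below computes an input-oriented, constant-returns-to-scale efficiency score by linear programming: each economy's aggregate resource input is contracted as far as the frontier spanned by all economies allows, for a given output. The data and the single-input/single-output setting are invented simplifications, not the article's emergy-based model of 116 economies.

```python
# Hedged sketch of an input-oriented, constant-returns-to-scale frontier
# efficiency score computed by linear programming: each economy's aggregate
# resource input is contracted as far as the frontier spanned by all economies
# allows, holding output fixed. The single-input/single-output data are invented
# simplifications, not the article's emergy-based inputs for 116 economies.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0], [6.0], [9.0], [5.0]])    # aggregate resource input per economy
Y = np.array([[8.0], [9.0], [10.0], [9.5]])   # output (e.g. GDP) per economy
n, m = X.shape
s = Y.shape[1]

def resource_efficiency(o):
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_ub, b_ub = [], []
    for i in range(m):       # inputs: sum_j lambda_j * x_ij <= theta * x_oi
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):       # outputs: sum_j lambda_j * y_rj >= y_or
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(n):
    print(f"economy {o}: resource efficiency = {resource_efficiency(o):.2f}")
```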

  4. Transdimensional Bayesian approach to pulsar timing noise analysis

    Science.gov (United States)

    Ellis, J. A.; Cornish, N. J.

    2016-04-01

    The modeling of intrinsic noise in pulsar timing residual data is of crucial importance for gravitational wave detection and pulsar timing (astro)physics in general. The noise budget in pulsars is a collection of several well-studied effects including radiometer noise, pulse-phase jitter noise, dispersion measure variations, and low-frequency spin noise. However, as pulsar timing data continue to improve, nonstationary and non-power-law noise terms are beginning to manifest which are not well modeled by current noise analysis techniques. In this work, we use a transdimensional approach to model these nonstationary and non-power-law effects through the use of a wavelet basis and an interpolation-based adaptive spectral modeling. In both cases, the number of wavelets and the number of control points in the interpolated spectrum are free parameters that are constrained by the data and then marginalized over in the final inferences, thus fully incorporating our ignorance of the noise model. We show that these new methods outperform standard techniques when nonstationary and non-power-law noise is present. We also show that these methods return results consistent with the standard analyses when no such signals are present.

  5. Smoothing spline analysis of variance approach for global sensitivity analysis of computer codes

    International Nuclear Information System (INIS)

    The paper investigates a nonparametric regression method based on a smoothing spline analysis of variance (ANOVA) approach to address the problem of global sensitivity analysis (GSA) of complex and computationally demanding computer codes. The two-step algorithm of this method involves an estimation procedure and a variable selection step. The latter can become computationally demanding when dealing with high-dimensional problems. Thus, we proposed a new algorithm based on Landweber iterations. Using the fact that the considered regression method is based on an ANOVA decomposition, we introduced a new direct method for computing sensitivity indices. Numerical tests performed on several analytical examples and on an application from petroleum reservoir engineering showed that the method gives competitive results compared to a more standard Gaussian process approach.
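    The sensitivity indices produced by an ANOVA decomposition are the first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y). The sketch below estimates them for a toy computer code by a simple binned conditional-mean approximation; it illustrates the quantity being computed, not the paper's smoothing-spline estimator or its Landweber-based selection step.

```python
# Hedged sketch of the quantity an ANOVA decomposition delivers: first-order
# Sobol indices S_i = Var(E[Y|X_i]) / Var(Y). They are estimated here by a crude
# binned conditional-mean approximation on a toy function; this is not the
# paper's smoothing-spline estimator or its Landweber-based selection step.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
X = rng.uniform(0.0, 1.0, size=(N, 3))
Y = 4.0 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2] ** 2    # toy "computer code"

def first_order_index(xi, y, bins=50):
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

for i in range(3):
    print(f"S_{i + 1} ~= {first_order_index(X[:, i], Y):.3f}")
```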

  6. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study

    Science.gov (United States)

    Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi

    2015-01-01

    Background. Univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was proposing four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least square (MGLS) method as a multivariate meta-analysis approach. Methods. We evaluated the efficiency of four new approaches including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC) on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard models coefficients in a simulation study. Result. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that MMC approach was the most accurate procedure compared to EC, CC, and ZC procedures. The precision ranking of the four approaches according to all above settings was MMC ≥ EC ≥ CC ≥ ZC. Conclusion. This study highlights advantages of MGLS meta-analysis on UM approach. The results suggested the use of MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients. PMID:26413142
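    The multivariate pooling underlying MGLS can be sketched as an inverse-covariance-weighted combination of study-level coefficient vectors. The example below uses the zero-correlation (ZC) approximation, in which each study's covariance matrix is taken as diagonal; the coefficients and standard errors are invented placeholders, not data from the simulation study.

```python
# Hedged sketch of fixed-effect multivariate pooling in the MGLS spirit:
# study-level coefficient vectors are combined with inverse-covariance weights.
# The zero-correlation (ZC) approximation builds each study covariance from the
# standard errors alone (diagonal). All numbers are invented placeholders.
import numpy as np

betas = [np.array([0.42, -0.15]), np.array([0.35, -0.10]), np.array([0.50, -0.22])]
ses = [np.array([0.10, 0.08]), np.array([0.12, 0.09]), np.array([0.09, 0.07])]

covs = [np.diag(se ** 2) for se in ses]               # ZC: off-diagonal terms set to zero
W = [np.linalg.inv(V) for V in covs]                  # precision weights

pooled_cov = np.linalg.inv(sum(W))
pooled_beta = pooled_cov @ sum(w @ b for w, b in zip(W, betas))

print("pooled coefficients   :", np.round(pooled_beta, 3))
print("pooled standard errors:", np.round(np.sqrt(np.diag(pooled_cov)), 3))
```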

  7. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study

    Directory of Open Access Journals (Sweden)

    Tania Dehesh

    2015-01-01

    Full Text Available Background. Univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was proposing four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least square (MGLS) method as a multivariate meta-analysis approach. Methods. We evaluated the efficiency of four new approaches including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC) on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard models coefficients in a simulation study. Result. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that MMC approach was the most accurate procedure compared to EC, CC, and ZC procedures. The precision ranking of the four approaches according to all above settings was MMC ≥ EC ≥ CC ≥ ZC. Conclusion. This study highlights advantages of MGLS meta-analysis on UM approach. The results suggested the use of MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients.

  8. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  9. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect damage ranged considerably

  10. ANALYSIS OF WORKING CAPITAL MANAGEMENT OF LEADING COMPANIES IN THE HUNGARIAN DAIRY SECTOR BETWEEN 2008 AND 2012

    Directory of Open Access Journals (Sweden)

    Dorisz Talas

    2014-07-01

    Full Text Available This study analyses trends in the working capital management of those Hungarian dairy companies that feature the highest levels of sales revenues in the domestic market and diversified product structures. In view of the significance of food industry in the national economy, a particularly important question to examine in this context is to see what impacts the economic crisis has had on the business operations of the dominant companies of the sector, what processes it has triggered in working capital management. In 2012, 44 of the 500 companies with the largest amounts of sales were operating in the food processing sector. Within the group of these 44 enterprises, dairy companies had an 11% share. On the other hand, 82% of the total sales revenues of the dairy industry was given by the 15 companies where the individual amounts of registered capital are over HUF 250 million, which reflects a strong sales revenue concentration. Therefore, it has been an interesting aspect to study how in such a concentrated sector the leading companies shape their working capital management, what processes can be observed in this respect. This study has relied on the annual reports, i.e. the balance sheets and profit & loss accounts of the companies for the fiscal period of 2008–2012. The research methodology of the analysis is based on the review of financial indicators internationally accepted and used in connection with working capital management. Moreover, the study determines a cash conversion cycle, and in this context the inventory and receivables turnover, as well as the payables turnover. It has been assessed what changes the above-mentioned indicators of the competitors belonging to the target group of analysis have undergone, if there has been a general tendency of changes to be identified. For outstanding values it has been analysed what the large-scale changes could have been caused by. Fundamentally, I have aspired to reveal whether there has been

  11. Impact of right-ventricular apical pacing on the optimal left-ventricular lead positions measured by phase analysis of SPECT myocardial perfusion imaging

    Energy Technology Data Exchange (ETDEWEB)

    Hung, Guang-Uei [Chang Bing Show Chwan Memorial Hospital, Changhua (China); China Medical University, Department of Biomedical Imaging and Radiological Science, Taichung (China); Huang, Jin-Long [Taichung Veterans General Hospital, Cardiovascular Center, Taichung (China); School of Medicine, National Yang-Ming University, Institute of Clinical Medicine, and Cardiovascular Research Institute, Department of Medicine, Taipei (China); Chung-Shan Medical University, Department of Medicine, School of Medicine, Taichung (China); Lin, Wan-Yu; Tsai, Shih-Chung [Taichung Veterans General Hospital, Department of Nuclear Medicine, Taichung (China); Wang, Kuo-Yang [Taichung Veterans General Hospital, Cardiovascular Center, Taichung (China); Chung-Shan Medical University, Department of Medicine, School of Medicine, Taichung (China); Chen, Shih-Ann [School of Medicine, National Yang-Ming University, Institute of Clinical Medicine, and Cardiovascular Research Institute, Department of Medicine, Taipei (China); Taipei Veterans General Hospital, Division of Cardiology, Department of Medicine, Taipei (China); Lloyd, Michael S.; Chen, Ji [Emory University, Department of Radiology and Imaging Sciences, Atlanta, GA (United States)

    2014-06-15

    The use of SPECT phase analysis to optimize left-ventricular (LV) lead positions for cardiac resynchronization therapy (CRT) has previously been performed at baseline, whereas CRT operates with simultaneous right-ventricular (RV) and LV pacing. The aim of this study was to assess the impact of RV apical (RVA) pacing on optimal LV lead positions measured by SPECT phase analysis. This study prospectively enrolled 46 patients. Two SPECT myocardial perfusion scans were acquired under sinus rhythm with complete left bundle branch block and RVA pacing, respectively, following a single injection of {sup 99m}Tc-sestamibi. LV dyssynchrony parameters and optimal LV lead positions were measured by the phase analysis technique and then compared between the two scans. The LV dyssynchrony parameters were significantly larger with RVA pacing than with sinus rhythm (p < 0.01). In 39 of the 46 patients, the optimal LV lead positions were the same between RVA pacing and sinus rhythm (kappa = 0.861). In 6 of the remaining 7 patients, the optimal LV lead positions were along the same radial direction, but RVA pacing shifted the optimal LV lead positions toward the base. The optimal LV lead positions measured by SPECT phase analysis were consistent regardless of whether the SPECT images were acquired under sinus rhythm or RVA pacing. In some patients, RVA pacing shifted the optimal LV lead positions toward the base. This study supports the use of baseline SPECT myocardial perfusion imaging to optimize LV lead positions to increase CRT efficacy. (orig.)
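    SPECT phase analysis derives, for each myocardial sample, an onset-of-contraction phase from the first Fourier harmonic of the regional count variation over the cardiac cycle; dyssynchrony indices such as the phase standard deviation and histogram bandwidth are then computed from the resulting phase distribution. The sketch below illustrates that first-harmonic idea on synthetic curves only; it is not the validated clinical algorithm used in the study.

```python
# Hedged sketch of first-harmonic phase analysis: each regional count curve over
# the cardiac cycle is approximated by its first Fourier harmonic and the phase
# of that harmonic is taken as the onset of mechanical contraction. Synthetic
# curves only; this is not the validated clinical algorithm used in the study.
import numpy as np

rng = np.random.default_rng(2)
n_regions, n_frames = 600, 8                        # sampled LV regions x gated frames
t = np.arange(n_frames)
true_phase = rng.normal(loc=140.0, scale=25.0, size=n_regions)   # degrees

# Synthetic regional count curves: baseline + first harmonic + noise
counts = (100.0
          + 20.0 * np.cos(2.0 * np.pi * t[None, :] / n_frames - np.deg2rad(true_phase)[:, None])
          + rng.normal(0.0, 2.0, size=(n_regions, n_frames)))

H1 = np.fft.rfft(counts, axis=1)[:, 1]              # first Fourier harmonic per region
phase = np.rad2deg(-np.angle(H1)) % 360.0           # recovered contraction phase (degrees)

phase_sd = phase.std()                              # dyssynchrony index: phase standard deviation
bandwidth = np.percentile(phase, 97.5) - np.percentile(phase, 2.5)   # approx. histogram bandwidth
print(f"phase SD ~= {phase_sd:.1f} deg, histogram bandwidth ~= {bandwidth:.1f} deg")
```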

  12. Lead poisoning

    Science.gov (United States)

    ... free solder, lead is still found in some modern faucets. Soil contaminated by decades of car exhaust ... NOT store wine, spirits, or vinegar-based salad dressings in lead crystal decanters for long periods of ...

  13. Lead Toxicity

    Science.gov (United States)

    ... in children over time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or ... avoid exposure to soil. Is there a medical test for lead exposure? • Blood samples can be tested ...

  14. Global analysis of nuclear parton distribution functions and their uncertainties at next-to-next-to-leading order

    CERN Document Server

    Khanpour, Hamzeh

    2016-01-01

    We perform a next-to-next-to-leading order (NNLO) analysis of nuclear parton distribution functions (nPDFs) using neutral current charged-lepton ($\\ell ^\\pm$ + nucleus) deeply inelastic scattering (DIS) data and Drell-Yan (DY) cross-section ratios $\\sigma_{DY}^{A}/\\sigma_{DY}^{A^\\prime}$ for several nuclear targets. We study in detail the parameterizations and the atomic mass (A) dependence of the nuclear PDFs at this order. The present nuclear PDF global analysis provides a complete set of nuclear PDFs, $f_i^{(A,Z)}(x,Q^2)$, with a full functional dependence on $x$, A, Q$^2$. The uncertainties of the obtained nuclear modification factors for each parton flavour are estimated using the well-known Hessian method. The nuclear charm quark distributions are also included in the analysis. We compare the parametrization results with the available data and the results of other nuclear PDF groups. We found our nuclear PDFs to be in reasonably good agreement with them. The estimates of errors provided by our glob...
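    The Hessian uncertainty estimation mentioned above propagates fit uncertainties to any observable by re-evaluating it on plus/minus eigenvector PDF sets and adding the half-differences in quadrature. The sketch below shows that standard symmetric master formula with placeholder numbers; it is not tied to this analysis's eigenvector sets.

```python
# Hedged sketch of the symmetric Hessian master formula for propagating PDF-fit
# uncertainties to an observable X: evaluate X on each plus/minus eigenvector set
# and add the half-differences in quadrature. The numbers are placeholders, not
# values from this analysis.
import numpy as np

X_central = 1.000
X_plus = np.array([1.012, 0.995, 1.020, 1.003])    # X on the "+" eigenvector sets
X_minus = np.array([0.990, 1.006, 0.985, 0.998])   # X on the "-" eigenvector sets

delta_X = 0.5 * np.sqrt(np.sum((X_plus - X_minus) ** 2))
print(f"X = {X_central:.3f} +/- {delta_X:.3f}")
```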

  15. Overview of the FEP analysis approach to model development

    International Nuclear Information System (INIS)

    This report heads a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed safety assessment. A five-stage approach has been adopted, which provides a systematic framework for addressing uncertainty and for the documentation of all modelling decisions and assumptions. The five stages are as follows: Stage 1: FEP Analysis - compilation and structuring of a FEP database; Stage 2: Scenario and Conceptual Model Development; Stage 3: Mathematical Model Development; Stage 4: Software Development; Stage 5: Confidence Building. This report describes the development and structuring of a FEP database as a Master Directed Diagram (MDD) and explains how this may be used to identify different modelling scenarios, based upon the identification of scenario-defining FEPs. The methodology describes how the possible evolution of a repository system can be addressed in terms of a base scenario, a broad and reasonable representation of the 'natural' evolution of the system, and a number of variant scenarios, representing the effects of probabilistic events and processes. The MDD has been used to identify conceptual models to represent the base scenario and the interactions between these conceptual models have been systematically reviewed using a matrix diagram technique. This has led to the identification of modelling requirements for the base scenario, against which existing assessment software capabilities have been reviewed. A mechanism for combining probabilistic scenario-defining FEPs to construct multi-FEP variant scenarios has been proposed and trialled using the concept of a 'timeline', a defined sequence of events, from which consequences can be assessed. An iterative approach, based on conservative modelling principles, has been proposed for the evaluation of

  16. Isotope ratio analysis of lead in blood and environmental samples by multi-collector inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    It is widely recognized that lead (Pb) affects children's cognitive function, even at relatively low blood lead levels. The determination of the source of Pb in children is essential for effective risk management. The use of multi-collector ICPMS (MC-ICPMS) for isotope ratio measurements of Pb in environmental and biological samples was examined for this purpose. MC-ICPMS with an instrumental mass fractionation correction by Tl allowed accurate isotope ratio measurements of the Pb isotopic reference material NIST SRM 981. However, the presence of matrix elements (Al, Ca, Fe and Na) at more than 10 mg/kg in the sample solution significantly deteriorated the accuracy. The separation of Pb from the matrix is necessary for accurate measurements of the isotope ratio of Pb in environmental and biological samples. Bromide complexation, followed by anion exchange, was found to be satisfactory in terms of the recovery of Pb (90 to 104%) and the efficiency of matrix separation. The procedure was applied to a preliminary source analysis of Pb in the blood of Japanese children, and a significant contribution of indoor dust was demonstrated. (author)
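    The Tl-based instrumental mass fractionation correction mentioned above is commonly implemented with an exponential law: a fractionation exponent is derived from the measured versus certified 205Tl/203Tl ratio and then applied to the measured Pb ratios. The sketch below shows that correction; the measured ratios are invented, while the certified Tl ratio and isotope masses are standard reference values.

```python
# Hedged sketch of an exponential-law mass-bias correction with a Tl internal
# standard, as commonly used for Pb isotope-ratio measurements by MC-ICPMS.
# The measured ratios are invented; the certified 205Tl/203Tl value (NIST SRM 997)
# and the isotope masses are standard reference data.
import math

R_Tl_true = 2.3871                 # certified 205Tl/203Tl
R_Tl_meas = 2.4120                 # measured 205Tl/203Tl (illustrative)
m205, m203 = 204.97443, 202.97234  # atomic masses

# Fractionation exponent from the Tl internal standard
beta = math.log(R_Tl_true / R_Tl_meas) / math.log(m205 / m203)

# Apply the same exponent to a measured Pb ratio
R_Pb_meas = 2.1800                 # measured 208Pb/206Pb (illustrative)
m208, m206 = 207.97665, 205.97446
R_Pb_true = R_Pb_meas * (m208 / m206) ** beta
print(f"beta = {beta:.3f}, corrected 208Pb/206Pb = {R_Pb_true:.4f}")
```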

  17. Comparative analysis of transcriptomic and hormonal responses to compatible and incompatible plant-virus interactions that lead to cell death.

    Science.gov (United States)

    Pacheco, Remedios; García-Marcos, Alberto; Manzano, Aranzazu; de Lacoba, Mario García; Camañes, Gemma; García-Agustín, Pilar; Díaz-Ruíz, José Ramón; Tenllado, Francisco

    2012-05-01

    Hypersensitive response-related programmed cell death (PCD) has been extensively analyzed in various plant-virus interactions. However, little is known about the changes in gene expression and phytohormone levels associated with cell death caused by compatible viruses. The synergistic interaction of Potato virus X (PVX) with a number of Potyvirus spp. results in increased symptoms that lead to systemic necrosis (SN) in Nicotiana benthamiana. Here, we show that SN induced by a PVX recombinant virus expressing a potyviral helper component-proteinase (HC-Pro) gene is associated with PCD. We have also compared transcriptomic and hormonal changes that occur in response to a compatible synergistic virus interaction that leads to SN, a systemic incompatible interaction conferred by the Tobacco mosaic virus-resistance gene N, and a PCD response conditioned by depletion of proteasome function. Our analysis indicates that the SN response clusters with the incompatible response by the similarity of their overall gene expression profiles. However, the expression profiles of both defense-related genes and hormone-responsive genes, and also the relative accumulation of several hormones in response to SN, relate more closely to the response to depletion of proteasome function than to that elicited by the incompatible interaction. This suggests a potential contribution of proteasome dysfunction to the increased pathogenicity observed in PVX-Potyvirus mixed infections. Furthermore, silencing of coronatine insensitive 1, a gene involved in jasmonate perception, in N. benthamiana accelerated cell death induced by PVX expressing HC-Pro. PMID:22273391

  18. What's the state of energy studies research?: A content analysis of three leading journals from 1999 to 2008

    International Nuclear Information System (INIS)

    We present the results of a content analysis conducted on 2502 papers written by 5318 authors published between 1999 and 2008 in three leading energy studies journals: Energy Policy, The Energy Journal, and The Electricity Journal. Our study finds that authors were most likely to be male, based in North America, possess a background in science or engineering, and affiliated with a university or research institute. Articles were likely to be written by authors working within disciplinary boundaries and using research methods from an economics/engineering background. The US was the most written about country among papers that adopted a country focus and electricity was the most frequently discussed energy source. Energy markets and public policy instruments were the most popular focus areas. According to these findings, we identify five thematic areas whose further investigation could enhance the energy studies field and increase the policy-relevance of contemporary research.

  19. Dynamic metabolic flux analysis using a convex analysis approach: Application to hybridoma cell cultures in perfusion.

    Science.gov (United States)

    Fernandes de Sousa, Sofia; Bastin, Georges; Jolicoeur, Mario; Vande Wouwer, Alain

    2016-05-01

    In recent years, dynamic metabolic flux analysis (DMFA) has been developed in order to evaluate the dynamic evolution of the metabolic fluxes. Most of the proposed approaches are dedicated to exactly determined or overdetermined systems. When an underdetermined system is considered, the literature suggests the use of dynamic flux balance analysis (DFBA). However the main challenge of this approach is to determine an appropriate objective function, which remains valid over the whole culture. In this work, we propose an alternative dynamic metabolic flux analysis based on convex analysis, DMFCA, which allows the determination of bounded intervals for the fluxes using the available knowledge of the metabolic network and information provided by the time evolution of extracellular component concentrations. Smoothing splines and mass balance differential equations are used to estimate the time evolution of the uptake and excretion rates from this experimental data. The main advantage of the proposed procedure is that it does not require additional constraints or objective functions, and provides relatively narrow intervals for the intracellular metabolic fluxes. DMFCA is applied to experimental data from hybridoma HB58 cell perfusion cultures, in order to investigate the influence of the operating mode (batch and perfusion) on the metabolic flux distribution. Biotechnol. Bioeng. 2016;113: 1102-1112. © 2015 Wiley Periodicals, Inc. PMID:26551676
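    The core idea above, bounded intervals for fluxes obtained from the stoichiometry plus measured extracellular rates without imposing an objective function, can be sketched as a pair of linear programs per flux (minimise and maximise it subject to the balance constraints). The toy network, measured-rate intervals and bounds below are illustrative assumptions, not the hybridoma HB58 model.

```python
# Hedged sketch of interval-based flux estimation: each flux is minimised and
# maximised subject to the stoichiometric balance, with measured extracellular
# rates constrained to their uncertainty intervals and no objective function
# imposed. The toy network and numbers are illustrative, not the hybridoma model.
import numpy as np
from scipy.optimize import linprog

# Internal metabolites A, B; reactions v1..v5 (columns)
#              v1    v2    v3    v4    v5
S = np.array([
    [1.0, -1.0, -1.0,  0.0,  0.0],   # A: uptake - (A->B) - (A->E1) = 0
    [0.0,  1.0,  0.0, -1.0, -1.0],   # B: (A->B) - (B->E2) - (B->biomass) = 0
])

bounds = [(0.90, 1.10),   # v1: measured uptake rate interval
          (0.00, None),   # v2: intracellular, unmeasured
          (0.25, 0.35),   # v3: measured excretion rate interval
          (0.00, None),   # v4: unmeasured
          (0.00, None)]   # v5: unmeasured

n_flux = S.shape[1]
for j in range(n_flux):
    c = np.zeros(n_flux)
    c[j] = 1.0
    vmin = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs").fun
    vmax = -linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs").fun
    print(f"v{j + 1}: [{vmin:.2f}, {vmax:.2f}]")
```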

  20. Managing Approach Plate Information Study (MAPLIST): An Information Requirements Analysis of Approach Chart Use

    Science.gov (United States)

    Ricks, Wendell R.; Jonnson, Jon E.; Barry, John S.

    1996-01-01

    Adequately presenting all necessary information on an approach chart represents a challenge for cartographers. Since many tasks associated with using approach charts are cognitive (e.g., planning the approach and monitoring its progress), and since the characteristic of a successful interface is one that conforms to the users' mental models, understanding pilots' underlying models of approach chart information would greatly assist cartographers. To provide such information, a new methodology was developed for this study that enhances traditional information requirements analyses by combining psychometric scaling techniques with a simulation task to provide quantifiable links between pilots' cognitive representations of approach information and their use of approach information. Results of this study should augment previous information requirements analyses by identifying what information is acquired, when it is acquired, and what presentation concepts might facilitate its efficient use by better matching the pilots' cognitive model of the information. The primary finding in this study indicated that pilots mentally organize approach chart information into ten primary categories: communications, geography, validation, obstructions, navigation, missed approach, final items, other runways, visibility requirement, and navigation aids. These similarity categories were found to underlie the pilots' information acquisitions, other mental models, and higher level cognitive processes that are used to accomplish their approach and landing tasks.
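    A common way to recover categories like the ten listed above from similarity judgements is to convert the pairwise similarity ratings into distances and apply hierarchical clustering (or multidimensional scaling). The sketch below shows that clustering step on a handful of invented chart items and ratings; it is not the study's psychometric procedure or data.

```python
# Hedged sketch of turning pairwise similarity judgements into information
# categories: similarities are converted to distances and grouped by hierarchical
# clustering. The chart items and ratings below are invented stand-ins, not the
# study's data or its ten reported categories.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

items = ["tower freq", "ATIS freq", "MDA", "DA", "runway length", "missed appr fix"]
sim = np.array([                    # similarity ratings on a 0-1 scale (illustrative)
    [1.0, 0.9, 0.2, 0.2, 0.3, 0.1],
    [0.9, 1.0, 0.1, 0.2, 0.2, 0.1],
    [0.2, 0.1, 1.0, 0.9, 0.4, 0.5],
    [0.2, 0.2, 0.9, 1.0, 0.4, 0.5],
    [0.3, 0.2, 0.4, 0.4, 1.0, 0.3],
    [0.1, 0.1, 0.5, 0.5, 0.3, 1.0],
])

dist = 1.0 - sim                                    # dissimilarity, zero on the diagonal
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")     # cut the tree into 3 groups
for item, lab in zip(items, labels):
    print(f"{item:16s} -> category {lab}")
```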