WorldWideScience

Sample records for analysis approach leading

  1. Safe job analysis in a lead refinery. A practical approach from the process side

    Energy Technology Data Exchange (ETDEWEB)

    Esser, Knut; Meurer, Urban [BERZELIUS Stolberg GmbH, Stolberg (Germany)]

    2011-09-15

    In order to increase safety and to comply with legal requirements, Berzelius Stolberg decided in 2009 to update and change its approach to safe job analysis (SJA). The new approach takes detailed Standard Operation Procedures (SOPs), which were also updated in the course of the project, as the basis for all subsequent documents. Together with supervisors and operators, all SOPs were structured into single working steps, because the subsequent safe job analysis is meaningful, and the risks correctly identified, only if the real work is properly described. After the SOPs had been updated, a draft of each SJA was discussed by representatives of the refinery management, the works council, safety officers and operators. For every identified risk, one or more measures to avoid it were agreed upon. An action plan was created for the technical and organisational measures. The behaviour-related measures were collected in a safety handbook, which represents the basis for future safety training of the operators. In addition to the safe job analysis, the SOPs are also the basis for training manuals and for FMEAs. All in all, the new approach to safe job analysis not only increases safety systematically in accordance with OHSAS guidelines but also satisfies all aspects of quality management. (orig.)

  2. GALA: Group Analysis Leads to Accuracy, a novel approach for solving the inverse problem in exploratory analysis of group MEG recordings

    Directory of Open Access Journals (Sweden)

    Vladimir Kozunov

    2015-04-01

    Although MEG/EEG signals are highly variable between subjects, they allow characterizing systematic changes of cortical activity in both space and time. Traditionally a two-step procedure is used. The first step is a transition from sensor to source space by means of solving an ill-posed inverse problem for each subject individually. The second is the mapping of cortical regions consistently active across subjects. In practice the first step often leads to a set of active cortical regions whose locations and timecourses display a great amount of interindividual variability, hindering the subsequent group analysis. We propose Group Analysis Leads to Accuracy (GALA), a solution that combines the two steps into one. GALA takes advantage of individual variations of cortical geometry and sensor locations and exploits the ensuing variability in the electromagnetic forward model as a source of additional information. We assume that for different subjects functionally identical cortical regions are located in close proximity, partially overlap, and have correlated timecourses. This relaxed similarity constraint on the inverse solution can be expressed within a probabilistic framework, allowing for an iterative algorithm that solves the inverse problem jointly for all subjects. A systematic simulation study showed that GALA, as compared with the standard min-norm approach, improves the accuracy of true activity recovery, when accuracy is assessed both in terms of spatial proximity of the estimated and true activations and of correct specification of the spatial extent of the activated regions. This improvement, obtained without using any noise normalization techniques for either solution, is preserved for a wide range of between-subject variations in both spatial and temporal features of regional activation. The corresponding activation timecourses exhibit significantly higher similarity across subjects. Similar results were obtained for a real MEG dataset of face
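
    The abstract benchmarks GALA against the standard min-norm inverse. As context, here is a minimal Python sketch of that baseline; the function name and the SNR-based regularization heuristic are illustrative assumptions, and GALA's joint probabilistic iteration is not reproduced here.

    ```python
    import numpy as np

    def min_norm_inverse(leadfield, data, snr=3.0):
        """Tikhonov-regularized minimum-norm source estimate.

        leadfield: (n_sensors, n_sources) forward matrix for one subject
        data:      (n_sensors, n_times) MEG/EEG measurements
        """
        gram = leadfield @ leadfield.T                   # sensor-space Gram matrix
        lam = np.trace(gram) / (gram.shape[0] * snr**2)  # heuristic regularization
        kernel = leadfield.T @ np.linalg.inv(gram + lam * np.eye(gram.shape[0]))
        return kernel @ data                             # (n_sources, n_times)
    ```

    GALA, by contrast, couples all subjects' inverse problems through the similarity prior described in the abstract instead of applying such an operator per subject.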

  3. Next-to-next-to-leading order QCD analysis of spin-dependent parton distribution functions and their uncertainties: Jacobi polynomials approach

    CERN Document Server

    Shahri, F Taghavi; Tehrani, S Atashbar; Yazdi, Z Alizadeh

    2016-01-01

    We present a first global QCD analysis of next-to-next-to-leading order (NNLO) contributions to the spin-dependent parton distribution functions (PPDFs) and their uncertainties using the Jacobi polynomial approach. Having the NNLO contributions of the quark-quark and gluon-quark splitting functions in perturbative QCD (Nucl. Phys. B 889 (2014) 351-400), one can obtain the evolution of longitudinally polarized parton densities of hadrons up to NNLO accuracy in QCD. Very large sets of recent and up-to-date experimental data on the spin structure functions of the proton $g_1^p$, neutron $g_1^n$, and deuteron $g_1^d$ have been used in this analysis. The predictions of the NNLO calculations for the polarized parton distribution functions as well as for the proton, neutron and deuteron polarized structure functions are compared with the corresponding results of the NLO approximation. We form a mutually consistent set of polarized PDFs due to the inclusion of the most available experimental data including the recently publis...
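
    As a sketch of the Jacobi polynomial technique named in the title (the paper's exact parameter choices are not given in this excerpt), the polarized structure function is typically reconstructed from its QCD-evolved Mellin moments as

    $$ x\,g_1(x,Q^2) \simeq x^{\beta}(1-x)^{\alpha}\sum_{n=0}^{N_{\max}} a_n(Q^2)\,\Theta_n^{\alpha,\beta}(x), $$

    where the Jacobi polynomials $\Theta_n^{\alpha,\beta}$ are orthogonal with weight $x^{\beta}(1-x)^{\alpha}$ on $[0,1]$, so the coefficients follow from $a_n(Q^2)=\int_0^1 dx\;x\,g_1(x,Q^2)\,\Theta_n^{\alpha,\beta}(x)$ and carry the perturbative evolution.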

  4. A strategic approach to physico-chemical analysis of bis (thiourea) lead chloride - A reliable semi-organic nonlinear optical crystal

    Science.gov (United States)

    Rajagopalan, N. R.; Krishnamoorthy, P.; Jayamoorthy, K.

    2017-03-01

    Good quality crystals of bis(thiourea) lead chloride (BTLC) have been grown by the slow evaporation method from aqueous solution. The orthorhombic structure and Pna21 space group of the crystals have been identified by single-crystal X-ray diffraction. Studies on the nucleation kinetics of grown BTLC have been carried out, from which the meta-stable zone width, induction period, free energy change, critical radius, critical number and growth rate have been calculated. The experimental values of the interfacial surface energy for the crystal growth process have been compared with theoretical models. Ultraviolet transmittance studies showed high transmittance and a wide band gap energy, indicating the optical transparency required of the crystal. The second harmonic generation (SHG) and phase matching nature of the crystal have been verified by the Kurtz-Perry method. The SHG nature of the crystal has been further attested by the high values of the theoretical hyperpolarizability. The dielectric nature of the crystals at different temperatures with varying frequencies has been thoroughly studied. The activation energy values of the electrical process have been calculated from an ac conductivity study. Solid state parameters including valence electron plasma energy, Penn gap, Fermi energy and polarisability have been obtained by a theoretical approach and correlated with the crystal's SHG efficiency. The values of hardness number, elastic stiffness constant, Meyer's index, minimum level of indentation load, load dependent constant, fracture toughness, brittleness index and corrected hardness obtained from the Vickers hardness test clearly showed that the BTLC crystal has the good mechanical stability required for NLO device fabrication.
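
    The nucleation quantities listed above (free energy change, critical radius, induction period) are normally obtained from classical nucleation theory; assuming the paper follows this convention, the standard expressions read

    $$ \Delta G^{*} = \frac{16\pi\,\sigma^{3}V_m^{2}}{3\,(k_B T\ln S)^{2}}, \qquad r^{*} = \frac{2\,\sigma V_m}{k_B T\ln S}, $$

    where $\sigma$ is the interfacial surface energy, $V_m$ the molecular volume and $S$ the supersaturation ratio; $\sigma$ itself is commonly extracted from the measured induction period $\tau$, since $\ln\tau\propto\sigma^{3}/\bigl(T^{3}\ln^{2}S\bigr)$.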

  5. Microdistribution of lead in bone: A new approach

    Energy Technology Data Exchange (ETDEWEB)

    Jones, K.W. [Brookhaven National Lab., Upton, NY (United States)]; Bockman, R.S. [Cornell Univ., New York, NY (United States). Medical Coll.]; Bronner, F. [Connecticut Univ. Health Center, Farmington, CT (United States)]

    1991-12-31

    A knowledge of the microdistribution of lead in bone is important in order to understand the mechanisms for accumulation and release of lead. The availability of the synchrotron x-ray microscope for sensitive measurements of bone content and distribution of lead provides a valuable tool which, when combined with kinetic, balance, and tissue measurements, can lead to better evaluation of lead toxicity. It may also provide the basis for the development of a suitable model of how lead behaves in the human body. An outline of an experimental protocol for exploitation of the x-ray microscope is given, along with synchrotron x-ray microscope measurements of the distribution of gallium in rat bone that demonstrate the feasibility of the experimental approach.

  6. Lead Tap Sampling Approaches: What Do They Tell You

    Science.gov (United States)

    There is no single, universally applicable sampling approach for lead in drinking water. The appropriate type of sampling is dictated by the question being asked. There is no reason when a customer asks to have their home water tested to see if it's "safe" that they s...

  7. Real analysis a constructive approach

    CERN Document Server

    Bridger, Mark

    2012-01-01

    A unique approach to analysis that lets you apply mathematics across a range of subjects This innovative text sets forth a thoroughly rigorous modern account of the theoretical underpinnings of calculus: continuity, differentiability, and convergence. Using a constructive approach, every proof of every result is direct and ultimately computationally verifiable. In particular, existence is never established by showing that the assumption of non-existence leads to a contradiction. The ultimate consequence of this method is that it makes sense-not just to math majors but also to students from a

  8. Neurological Fallacies Leading to Malpractice: A Case Studies Approach.

    Science.gov (United States)

    Johnston, James C; Wester, Knut; Sartwelle, Thomas P

    2016-08-01

    A young woman presents with an intracranial arachnoid cyst. Another is diagnosed with migraine headache. An elderly man awakens with a stroke. And a baby delivered vaginally after 2 hours of questionable electronic fetal monitoring patterns grows up to have cerebral palsy. These seemingly disparate cases share a common underlying theme: medical myths. Myths that may lead not only to misdiagnosis and treatment harms but to seemingly never-ending medical malpractice lawsuits, potentially culminating in a settlement or judgment against an unsuspecting neurologist. This article provides a case studies approach exposing the fallacies and highlighting proper management of these common neurologic presentations.

  9. A basic approach for wing leading edge deicing by smart structures

    Science.gov (United States)

    Struggl, Stephan; Korak, Johannes; Feyrer, Christoph

    2011-04-01

    An investigation of the de-icing of wing leading edges through the use of smart structures is performed. Piezoelectric actuators are used to excite the structures at their natural frequencies. This vibration induces shear stresses at the surface, which lead to the shedding of ice. For optimal excitation of the structure, the frequency and the placement of the piezo elements are determined so as to maximize the shear stress. First, experimental investigations on a clamped aluminum plate are carried out. With these findings, the transition to an aluminum sample of a wing leading edge is made. The structural behavior is determined by a modal analysis so that the natural frequencies and eigenmodes can be calculated; by FE simulation all parameter combinations can be evaluated, so the practical tests can be adapted accordingly. Practical experiments have been carried out under realistic conditions in terms of ice formation in an icing research tunnel. Different types of ice have been considered, which require different levels of shear stress for de-icing. Further investigations will concern the determination of the suitable frequency and an ongoing monitoring of the process to take account of different icing conditions. The studies point to a further possibility of energy-efficient de-icing.

  10. Analysis of lead toxicity in human cells

    Directory of Open Access Journals (Sweden)

    Gillis Bruce S

    2012-07-01

    Background: Lead is a metal with many recognized adverse health effects, and yet the molecular processes underlying lead toxicity are still poorly understood. Quantifying the injurious effects of lead is also difficult because of the diagnostic limitations that exist when analyzing human blood and urine specimens for lead toxicity. Results: We analyzed the deleterious impact of lead on human cells by measuring its effects on cytokine production and gene expression in peripheral blood mononuclear cells. Lead activates the secretion of the chemokine IL-8 and impacts mitogen-dependent activation by increasing the secretion of the proinflammatory cytokines IL-6 and TNF-α and of the chemokines IL-8 and MIP1-α in the presence of phytohemagglutinin. The recorded changes in gene expression affected major cellular functions, including metallothionein expression and the expression of cellular metabolic enzymes and protein kinase activity. The expression of 31 genes remained elevated after the removal of lead from the testing medium, thereby allowing for the measurement of adverse health effects of lead poisoning. These included thirteen metallothionein transcripts, three endothelial receptor B transcripts and a number of transcripts which encode cellular metabolic enzymes. Cellular responses to lead correlated with blood lead levels and were significantly altered in individuals with higher lead content, affecting the nervous system, the negative regulation of transcription and the induction of apoptosis. In addition, we identified changes in gene expression in individuals with elevated zinc protoporphyrin blood levels and found that genes regulating the transmission of nerve impulses were affected in these individuals. The affected pathways were G-protein mediated signaling, gap junction signaling, synaptic long-term potentiation, neuropathic pain signaling as well as CREB signaling in neurons. Cellular responses to lead were

  11. Interstage Flammability Analysis Approach

    Science.gov (United States)

    Little, Jeffrey K.; Eppard, William M.

    2011-01-01

    The Interstage of the Ares I launch platform houses several key components which are on standby during First Stage operation: the Reaction Control System (ReCS), the Upper Stage (US) Thrust Vector Control (TVC) and the J-2X with the Main Propulsion System (MPS) propellant feed system. Therefore, potentially dangerous leaks of propellants could develop. The Interstage leak analysis addresses the concerns of localized mixing of hydrogen and oxygen gases to produce deflagration zones in the Interstage of the Ares I launch vehicle during First Stage operation. This report details the approach taken to accomplish the analysis. Specified leakage profiles and actual flammability results are not presented due to proprietary and security restrictions. The interior volume formed by the Interstage walls and the bounding interfaces with the Upper and First Stages, and surrounding the J-2X engine, was modeled using Loci-CHEM to assess the potential for flammable gas mixtures to develop during First Stage operations. The transient analysis included a derived flammability indicator based on mixture ratios to maintain achievable simulation times. Validation of results was based on a comparison to Interstage pressure profiles outlined in prior NASA studies. The approach proved useful in bounding the flammability risk in support of program hazard reviews.

  12. How lead consultants approach educational change in postgraduate medical education.

    NARCIS (Netherlands)

    Fokkema, J.P.; Westerman, M.; Teunissen, P.W.; Lee, N.; Scherpbier, A.J.J.A.; Vleuten, C.P.M. van der; Dorr, P.J.; Scheele, F.

    2012-01-01

    CONTEXT: Consultants in charge of postgraduate medical education (PGME) in hospital departments ('lead consultants') are responsible for the implementation of educational change. Although difficulties in innovating in medical education are described in the literature, little is known about how lead

  13. Studies on the Analysis of Lead and Silica in Lead Processing Samples

    Directory of Open Access Journals (Sweden)

    R. Ravichandra Babu

    2014-01-01

    In practice, two samples have to be processed separately to determine both lead and silica in lead process samples such as concentrates, coarse sinter, etc. In the presently recommended procedure, the lead and silica are precipitated together by acid fuming, but, instead of using an ammonium acetate treatment to separate lead and silica, a mixture of hydrochloric acid and sodium chloride is used to achieve the same purpose. This helps in determining all the components of lead process samples from the same solution without loss of accuracy in any of the components. This method saves considerable time and is especially useful for routine analysis in a process control lab.

  14. Leading neutron production at HERA in the color dipole approach

    Directory of Open Access Journals (Sweden)

    Carvalho F.

    2016-01-01

    In this work we study leading neutron production in e + p → e + n + X collisions at high energies and calculate the Feynman xL distribution of these neutrons. The differential cross section is written in terms of the pion flux and of the photon-pion total cross section. We describe this process using the color dipole formalism and, assuming the validity of the additive quark model, we relate the dipole-pion cross section to the well-determined dipole-proton cross section. In this formalism we can estimate the impact of the QCD dynamics at high energies as well as the contribution of gluon saturation effects to leading neutron production. With the parameters constrained by other phenomenological information, we are able to reproduce the basic features of the recently released H1 leading neutron spectra.
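
    In the pion-flux picture summarized above, the cross section factorizes; a hedged sketch of the standard form (generic notation, not necessarily the paper's):

    $$ \frac{d\sigma}{dx_L\,dt}\bigl(\gamma^{*}p\to Xn\bigr) = f_{\pi/p}(x_L,t)\;\sigma_{\gamma^{*}\pi}\bigl(\hat W^{2},Q^{2}\bigr), \qquad \hat W^{2}\simeq(1-x_L)\,W^{2}, $$

    where $f_{\pi/p}$ is the pion flux associated with the $p\to\pi^{+}n$ splitting, and the additive quark model enters by relating the dipole-pion to the dipole-proton cross section, $\sigma_{q\bar q\,\pi}\simeq\tfrac{2}{3}\,\sigma_{q\bar q\,p}$.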

  15. A Mixed Approach Of Automated ECG Analysis

    Science.gov (United States)

    De, A. K.; Das, J.; Majumder, D. Dutta

    1982-11-01

    ECG is one of the non-invasive and risk-free techniques for collecting data about the functional state of the heart. All ECG data-processing techniques can be classified into two basically different approaches: the first- and second-generation ECG computer programs. Not the opposition, but the symbiosis of these two approaches will lead to systems with the highest accuracy. In our paper we describe a mixed approach which shows higher accuracy with a smaller amount of computational work. Key Words: Primary features, Patients' parameter matrix, Screening, Logical comparison technique, Multivariate statistical analysis, Mixed approach.

  16. Lead and Conduct Problems: A Meta-Analysis

    Science.gov (United States)

    Marcus, David K.; Fulton, Jessica J.; Clarke, Erin J.

    2010-01-01

    This meta-analysis examined the association between conduct problems and lead exposure. Nineteen studies on 8,561 children and adolescents were included. The average "r" across all 19 studies was 0.19 (p < 0.001), which is considered a medium effect size. Studies that assessed lead exposure using hair element analysis yielded…

  17. The Fault tree analysis of the lead acid battery’s degradation

    OpenAIRE

    K. BRIK; F. BEN AMMAR

    2008-01-01

    In this paper the authors present a reliability approach to analyzing lead-acid battery degradation. The construction of a causal tree offers a framework well suited to deductive analysis, which consists in seeking the various possible combinations of events leading to the loss of battery capacity. The description of the causality chain is completed by a fault tree analysis (FTA) established from the equivalent electric circuit of the battery.
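
    As an illustration of the FTA logic described above, here is a minimal Python sketch that propagates basic-event probabilities through OR/AND gates under an independence assumption; the event names and probabilities are hypothetical, not taken from the paper.

    ```python
    def gate_and(*p):
        """Probability that all independent input events occur."""
        out = 1.0
        for x in p:
            out *= x
        return out

    def gate_or(*p):
        """Probability that at least one independent input event occurs."""
        out = 1.0
        for x in p:
            out *= 1.0 - x
        return 1.0 - out

    # Hypothetical basic-event probabilities, for illustration only
    p_grid_corrosion = 0.03
    p_sulphation     = 0.05
    p_deep_discharge = 0.10
    p_dry_out        = 0.02

    # Illustrative top event: capacity loss via corrosion, OR dry-out,
    # OR sulphation combined with repeated deep discharge
    p_capacity_loss = gate_or(p_grid_corrosion,
                              p_dry_out,
                              gate_and(p_sulphation, p_deep_discharge))
    print(f"top event (capacity loss) probability: {p_capacity_loss:.4f}")
    ```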

  1. Climate Change Management Approaches of Cities: A Comparative Study Between Globally Leading and Turkish Metropolitan Cities

    Directory of Open Access Journals (Sweden)

    Solmaz Filiz Karabag

    2011-05-01

    Many studies have focused on climate change policies and action at the national level, but few have studied policies and action at the city level, especially cities in emerging economies. To address this gap, the present study analyzes the management strategies globally leading cities have developed to address climate change and related issues and compares them with the city strategies of one rapidly urbanizing emerging economy, Turkey. In the analysis, the strategic plans of five leading global cities are compared with those of sixteen Turkish cities. While the leading global cities have specific managerial approaches to mitigate climate change, none of the Turkish cities exhibits any comprehensive approach. Furthermore, while leading global cities modify urban services to reduce greenhouse gas (GHG) emissions, few Turkish cities adjust any services to address this challenge. Some Turkish cities propose an increased use of renewable energy sources and modification in their transportation system, but the focus in these plans is the current daily needs of their inhabitants. The findings of this study suggest several climate change strategies both for Turkish cities and cities in other developing countries.

  2. LEADING CHANGES IN ASSESSMENT USING AN EVIDENCE BASED APPROACH

    Directory of Open Access Journals (Sweden)

    J. O. Macaulay

    2015-08-01

    Introduction and objectives: It has been widely accepted that assessment of learning is a critical component of education and that assessment drives/guides student learning through shaping study habits and student approaches to learning. However, although most academics would agree that assessment is a critical aspect of their roles as teachers, it is often an aspect of teaching that is regarded as an additional task rather than an integral component of the teaching/learning continuum. An additional impediment to high quality assessment is a non-evidence-based approach to the decision making process. The overall aim of this project was to improve the quality of assessment in Biochemistry and Molecular Biology undergraduate education by promoting high quality assessment. Materials and methods: To do this we developed and trialled an audit tool for mapping assessment practices. The audit tool was designed to gather data on current assessment practices and identify areas of good practice in which assessment aligned with the learning objectives and areas in need of improvement. This evidence base will then be used to drive change in assessment. Results and conclusions: Using the assessment mapping tool we have mapped the assessment regime in a Biochemistry and Molecular Biology major at Monash University. Criteria used included: assessment type, format, timing, assessors, provision of feedback, level of learning (Bloom's), and approaches taken to planning assessment. We have mapped assessment of content and the systematic development of higher order learning and skills progression throughout the program of study. The data have enabled us to examine the assessment at unit (course) level as well as the vertical development across the major. This information is now being used to inform a review of the units and the major.

  3. Fascinating morphologies of lead tungstate nanostructures by chimie douce approach

    Energy Technology Data Exchange (ETDEWEB)

    George, Thresiamma; Joseph, Sunny; Sunny, Anu Tresa; Mathew, Suresh [Mahatma Gandhi University, School of Chemical Sciences (India)], E-mail: smathew_mgu@yahoo.com

    2008-04-15

    Lead tungstate occurs in nature as tetragonal stolzite of the scheelite (CaWO₄) type and as monoclinic raspite. In this work, we report the typical growth of snowflake-like tetragonal stolzite and bamboo-leaf-like monoclinic raspite nanocrystals of PbWO₄ via a simple aqueous precipitation method and a polyol (polyethylene glycol-200) mediated precipitation method at room temperature (27 °C). The synthesised PbWO₄ nanocrystals were characterised by XRD, SEM, EDAX and TGA-DTA. The UV-Vis absorption and photoluminescence studies of PbWO₄ nanocrystals in the two morphologies were performed. The nuclei of PbWO₄ nanocrystals in aqueous medium self-assemble in a tetragonal manner to form the snowflake-like crystals. In polyol medium, PbWO₄ nuclei preferentially grow by an oriented attachment process to form the bamboo-leaf-like morphology. The specific morphology of the regularly assembled PbWO₄ nanocrystals in the two phases finds applications in nanoelectronics and photonics. Compared to other well-known scintillators, PbWO₄ is most attractive for high-energy physics applications because of its high density, short decay time and high irradiation damage resistance.

  4. Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge

    Science.gov (United States)

    Yap, Keng C.

    2010-01-01

    This viewgraph presentation reviews Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge. The Wing Leading Edge Impact Detection System (WLE IDS) and the Impact Analysis Process are also described to monitor WLE debris threats. The contents include: 1) Risk Management via SHM; 2) Hardware Overview; 3) Instrumentation; 4) Sensor Configuration; 5) Debris Hazard Monitoring; 6) Ascent Response Summary; 7) Response Signal; 8) Distribution of Flight Indications; 9) Probabilistic Risk Analysis (PRA); 10) Model Correlation; 11) Impact Tests; 12) Wing Leading Edge Modeling; 13) Ascent Debris PRA Results; and 14) MM/OD PRA Results.

  5. Lead generation in crop protection research: a portfolio approach to agrochemical discovery.

    Science.gov (United States)

    Loso, Michael R; Garizi, Negar; Hegde, Vidyadhar B; Hunter, James E; Sparks, Thomas C

    2017-04-01

    The need for increased food and feed supply to support future global demand with the added challenges of resistance pressure and an evolving regulatory environment necessitates the discovery of new crop protection agents for growers of today and tomorrow. Lead generation is the critical 'engine' for maintaining a robust pipeline of new high-value products. A wide variety of approaches exist for the generation of new leads, many of which have demonstrated success. Each approach features some degree of merit or benefit while also having some inherent drawback or level of risk. While risk for any single approach can be mitigated in a variety of different ways depending on the approach, long-term viability of a successful lead generation program merits utilization of a portfolio of different approaches and methodologies for the generation of new leads. © 2016 Society of Chemical Industry.

  6. Deformation analysis: The Fredericton approach

    OpenAIRE

    Vrečko, Anja; Ambrožič, Tomaž

    2013-01-01

    In this article, the Fredericton approach to deformation analysis is presented. It is possible to use several deformation models to determine the differences between the geodetic observations or between the coordinates of points in a geodetic network in multiple epochs. The most appropriate deformation model has been chosen based on statistical testing and the available information about the dynamics of the area of interest. First, a theoretical background of the approach

  7. Real Analysis A Historical Approach

    CERN Document Server

    Stahl, Saul

    2011-01-01

    A provocative look at the tools and history of real analysis This new edition of Real Analysis: A Historical Approach continues to serve as an interesting read for students of analysis. Combining historical coverage with a superb introductory treatment, this book helps readers easily make the transition from concrete to abstract ideas. The book begins with an exciting sampling of classic and famous problems first posed by some of the greatest mathematicians of all time. Archimedes, Fermat, Newton, and Euler are each summoned in turn, illuminating the utility of infinite, power, and trigonome

  8. QCD analysis and effective temperature of direct photons in lead-lead collisions at the LHC

    CERN Document Server

    Klasen, M; König, F; Wessels, J P

    2014-01-01

    We present a systematic theoretical analysis of the ALICE measurement of low-$p_T$ direct-photon production in central lead-lead collisions at the LHC with a centre-of-mass energy of $\sqrt{s_{NN}}=2.76$ TeV. Using next-to-leading order perturbative QCD, we compute the relative contributions to prompt-photon production from different initial and final states and the theoretical uncertainties coming from independent variations of the renormalisation and factorisation scales, the nuclear parton densities and the fragmentation functions. Based on different fits to the unsubtracted and prompt-photon subtracted ALICE data, we consistently find an exponential, possibly thermal, photon spectrum from the quark-gluon plasma (or hot medium) with slope $T=304\pm 58$ MeV and $309\pm 64$ MeV at $p_T\in[0.8;2.2]$ GeV and $p_T\in[1.5;3.5]$ GeV as well as a power-law ($p_T^{-4}$) behavior for $p_T>4$ GeV as predicted by QCD hard scattering.
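
    In equation form, the fits described in the abstract correspond to

    $$ \left.\frac{dN}{dp_T}\right|_{\mathrm{excess}} \propto e^{-p_T/T}, \qquad \left.\frac{dN}{dp_T}\right|_{p_T>4\ \mathrm{GeV}} \propto p_T^{-4}, $$

    with the exponential slope $T$ fitted in the windows $p_T\in[0.8;2.2]$ GeV and $p_T\in[1.5;3.5]$ GeV quoted above (this is a restatement of the abstract's numbers, not an independent result).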

  9. Practical approaches in accident analysis

    Science.gov (United States)

    Stock, M.

    An accident analysis technique based on successive application of structural response, explosion dynamics, gas cloud formation, and plant operation failure mode models is proposed. The method takes into account the nonideal explosion characteristic of a deflagration in the unconfined cloud. The resulting pressure wave differs significantly from a shock wave and the response of structures like lamp posts and walls can differ correspondingly. This gives a more realistic insight into explosion courses than a simple TNT-equivalent approach.

  10. Lead optimization attrition analysis (LOAA): a novel and general methodology for medicinal chemistry.

    Science.gov (United States)

    Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos

    2015-08-01

    Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery.

  11. Corrosion by liquid lead and lead-bismuth: experimental results review and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jinsuo [Los Alamos National Laboratory]

    2008-01-01

    Liquid metal technologies for liquid lead and lead-bismuth alloy are under wide investigation and development for advanced nuclear energy systems and waste transmutation systems. Material corrosion is one of the main issues recently studied in the development of liquid metal technology. This study reviews corrosion by liquid lead and lead-bismuth, including the corrosion mechanisms, corrosion inhibitors and the formation of the protective oxide layer. The available experimental data are analyzed using a corrosion model in which oxidation and scale removal are coupled. Based on the model, the long-term behavior of steels in liquid lead and lead-bismuth is predictable. This report provides information for the selection of structural materials for typical nuclear reactor coolant systems in which liquid lead or lead-bismuth is selected as the heat transfer medium.
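
    The coupled oxidation/scale-removal model mentioned above can be illustrated with a minimal Python sketch; this is a generic Tedmon-type balance of parabolic oxide growth against removal by the flowing liquid metal, with purely illustrative rate constants, not the report's calibrated values.

    ```python
    def oxide_thickness(t_end, k_p=1e-3, k_r=5e-5, x0=1e-4, dt=1.0):
        """Integrate dx/dt = k_p/(2x) - k_r: parabolic oxide growth
        balanced against scale removal by the flowing liquid metal."""
        x = x0
        for _ in range(int(t_end / dt)):
            x += (k_p / (2.0 * x) - k_r) * dt
        return x

    # The thickness approaches the steady state k_p / (2 * k_r) (= 10 here,
    # arbitrary units), where growth and removal balance.
    print(oxide_thickness(1.0e6))
    ```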

  12. A new approach to evaluate the leading hadronic corrections to the muon g-2

    Energy Technology Data Exchange (ETDEWEB)

    Carloni Calame, C.M., E-mail: carlo.carloni.calame@pv.infn.it [Dipartimento di Fisica, Università di Pavia, Pavia (Italy)]; Passera, M., E-mail: massimo.passera@pd.infn.it [INFN, Sezione di Padova, Padova (Italy)]; Trentadue, L., E-mail: luca.trentadue@cern.ch [Dipartimento di Fisica e Scienze della Terra “M. Melloni”, Università di Parma, Parma (Italy); INFN, Sezione di Milano Bicocca, Milano (Italy)]; Venanzoni, G., E-mail: graziano.venanzoni@lnf.infn.it [INFN, Laboratori Nazionali di Frascati, Frascati (Italy)]

    2015-06-30

    We propose a novel approach to determine the leading hadronic corrections to the muon g-2. It consists in a measurement of the effective electromagnetic coupling in the space-like region extracted from Bhabha scattering data. We argue that this new method may become feasible at flavor factories, resulting in an alternative determination potentially competitive with the accuracy of the present results obtained with the dispersive approach via time-like data.
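
    The master integral of this space-like method can be written as follows; a sketch of the formula proposed in the paper, with conventions possibly differing in detail:

    $$ a_\mu^{\mathrm{HLO}} = \frac{\alpha}{\pi}\int_0^1 dx\,(1-x)\,\Delta\alpha_{\mathrm{had}}\bigl[t(x)\bigr], \qquad t(x) = -\frac{x^{2}m_\mu^{2}}{1-x} < 0, $$

    so that measuring the hadronic running of the effective coupling, $\Delta\alpha_{\mathrm{had}}(t)$, at the space-like momenta $t(x)$ accessed by Bhabha scattering directly yields the leading-order hadronic contribution $a_\mu^{\mathrm{HLO}}$.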

  13. Population attributable fraction analysis of leading chronic diseases in India

    Directory of Open Access Journals (Sweden)

    A. Choudhury

    2016-08-01

    Chronic diseases and their associated risk factors are increasing in India. We aim to quantify the Population Attributable Fractions (PAF) of leading chronic diseases in India associated with significant modifiable risk factors. In calculating the adjusted population attributable fraction, non-modifiable risk factors are taken as confounders. Our findings highlight that an agenda to improve public health in India must include effective interventions to control tobacco use for cancer and heart disease prevention. There is also an urgent need to educate the general public to maintain a proper BMI level, thereby reducing the diabetes burden in India. The analysis is based on a countrywide large-scale survey.
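
    For a single dichotomous risk factor, the unadjusted PAF is usually computed with Levin's formula,

    $$ \mathrm{PAF} = \frac{p\,(\mathrm{RR}-1)}{1 + p\,(\mathrm{RR}-1)}, $$

    where $p$ is the prevalence of exposure and $\mathrm{RR}$ the relative risk. As a purely illustrative example (not a figure from the survey), a tobacco-use prevalence of $p=0.3$ with $\mathrm{RR}=2$ gives $\mathrm{PAF}=0.3/1.3\approx 23\%$; adjusted PAFs with confounders are typically computed from variants such as Miettinen's case-based formula.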

  14. Identification of sources of lead exposure in French children by lead isotope analysis: a cross-sectional study

    Directory of Open Access Journals (Sweden)

    Lucas Jean-Paul

    2011-08-01

    Background: The amount of lead in the environment has decreased significantly in recent years, and so has exposure. However, there is no known safe exposure level and, therefore, the exposure of children to lead, although low, remains a major public health issue. With the lower levels of exposure, it is becoming more difficult to identify lead sources, and new approaches may be required for preventive action. This study assessed the usefulness of lead isotope ratios for identifying sources of lead, using data from a nationwide sample of French children aged from six months to six years with blood lead levels ≥ 25 μg/L. Methods: Blood samples were taken from 125 children, representing about 600,000 French children; environmental samples were taken from their homes and personal information was collected. Lead isotope ratios were determined using quadrupole ICP-MS (inductively coupled plasma mass spectrometry) and the isotopic signatures of potential sources of exposure were matched with those of blood in order to identify the most likely sources. Results: In addition to the interpretation of lead concentrations, lead isotope ratios were potentially of use for 57% of children aged from six months to six years with blood lead levels ≥ 25 μg/L (7% of all children in France, about 332,000 children) with at least one potential source of lead and sufficiently well discriminated lead isotope ratios. Lead isotope ratios revealed a single suspected source of exposure for 32% of the subjects and were able to eliminate at least one unlikely source of exposure for 30% of the children. Conclusions: In France, lead isotope ratios could provide valuable additional information in about a third of routine environmental investigations.
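
    The matching step described in the Methods can be sketched as a simple overlap test between the blood isotopic signature and each environmental sample; the ²⁰⁶Pb/²⁰⁷Pb values, uncertainties and the two-sigma criterion below are hypothetical illustrations in Python, not data from the study.

    ```python
    def plausible_sources(blood, sources, k=2.0):
        """Flag samples whose 206Pb/207Pb ratio overlaps the child's blood
        ratio within k combined standard uncertainties."""
        r_b, u_b = blood  # (ratio, standard uncertainty)
        hits = []
        for name, (r_s, u_s) in sources.items():
            if abs(r_b - r_s) <= k * (u_b**2 + u_s**2) ** 0.5:
                hits.append(name)
        return hits

    # Hypothetical 206Pb/207Pb signatures, for illustration only
    blood = (1.112, 0.004)
    sources = {"house dust": (1.115, 0.005),
               "tap water":  (1.165, 0.006),
               "paint chip": (1.109, 0.004)}
    print(plausible_sources(blood, sources))  # ['house dust', 'paint chip']
    ```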

  15. Strangeness $S=-1$ hyperon-nucleon scattering at leading order in the covariant Weinberg's approach

    CERN Document Server

    Li, Kai-Wen; Geng, Li-Sheng

    2016-01-01

    Inspired by the success of covariant baryon chiral perturbation theory in the one baryon sector and in the heavy-light systems, we explore the relevance of relativistic effects in the construction of the strangeness $S=-1$ hyperon-nucleon interaction using chiral perturbation theory. Due to the non-perturbative nature of the hyperon-nucleon interaction, we follow the covariant Weinberg's approach recently proposed by Epelbaum and Gegelia to sum the leading order chiral potential using the Kadyshevsky equation (Epelbaum, 2012) in this exploratory work. By fitting the five low-energy constants to available experimental data, we find that the cutoff dependence is mitigated compared with the results obtained in the Weinberg's approach for both partial wave phase shifts and the description of experimental data. Nevertheless, at leading order, the description of experimental data remains quantitatively similar. We discuss in detail the cutoff dependence of the partial wave phase shifts and cross sections in the Wei...

  16. [Subclavian vein puncture as a primary approach for pacemaker lead implantation].

    Science.gov (United States)

    Kronski, D; Haas, H

    2001-12-01

    At the beginning of transvenous pacemaker therapy, the external or, alternatively, the internal jugular vein was commonly used for lead implantation. Due to frequent long-term complications, both approaches are nowadays obsolete. In most pacemaker centers, implantation via the cephalic vein has become standard. As an alternative, in 1975 Sterz et al. introduced puncture of the subclavian vein using the Seldinger technique as an approach for lead implantation. At that time, the commonly used introducers for pacemaker leads had to be cut for removal; "peel-away" introducers only became commercially available in 1980. Since then, we have consistently used this technique for implantation of single- or dual-chamber pacemaker devices. In the course of the last seven years, merely 1.5-2% of implantations were performed via the cephalic vein; no jugular vein approach was used. Owing to routinely performed subclavian vein puncture, we were able to optimize the procedure, as shown by a marked reduction in implantation time (local anesthesia to skin closure), x-ray time and complication rate. In the year 2000 we performed 52 implantations of a single-chamber device with an average fluoroscopy time of 1.5 (0.3-9.3) minutes, radiation dose of 4.5 (0.1-47) Gycm² and implantation time of 17.6 (8-40) minutes, and 144 implantations of a dual-chamber device with an average fluoroscopy time of 2.86 (0.7-6.6) minutes, radiation dose of 8.31 (0.7-28) Gycm² and implantation time of 21.25 (10-45) minutes. Complications were rare, clinically irrelevant arterial punctures. Neither nerve damage nor pneumothorax requiring chest tube placement was seen in the above-mentioned time frame. The primary subclavian vein approach led to a marked reduction in overall procedure time without significant morbidity.

  17. Electrode alignment of transverse tripoles using a percutaneous triple-lead approach in spinal cord stimulation

    Science.gov (United States)

    Sankarasubramanian, V.; Buitenweg, J. R.; Holsheimer, J.; Veltink, P.

    2011-02-01

    The aim of this modeling study is to determine the influence of electrode alignment of transverse tripoles on the paresthesia coverage of the pain area in spinal cord stimulation, using a percutaneous triple-lead approach. Transverse tripoles, comprising a central cathode and two lateral anodes, were modeled on the low-thoracic vertebral region (T10-T12) using percutaneous triple-lead configurations, with the center lead on the spinal cord midline. The triple leads were oriented both aligned and staggered. In the staggered configuration, the anodes were offset either caudally (caudally staggered) or rostrally (rostrally staggered) with respect to the midline cathode. The transverse tripolar field steering with the aligned and staggered configurations enabled the estimation of dorsal column fiber thresholds (IDC) and dorsal root fiber thresholds (IDR) at various anodal current ratios. IDC and IDR were considerably higher for the aligned transverse tripoles as compared to the staggered transverse tripoles. The aligned transverse tripoles facilitated deeper penetration into the medial dorsal columns (DCs). The staggered transverse tripoles always enabled broad and bilateral DC activation, at the expense of mediolateral steerability. The largest DC recruited area was obtained with the rostrally staggered transverse tripole. Transverse tripolar geometries, using percutaneous leads, allow for selective targeting of either medial or lateral DC fibers, if and only if the transverse tripole is aligned. Steering of anodal currents between the lateral leads of the staggered transverse tripoles cannot target medially confined populations of DC fibers in the spinal cord. An aligned transverse tripolar configuration is strongly recommended, because of its ability to provide more post-operative flexibility than other configurations.

  18. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone's commitment to such a degree that they would have to go beyond what could be reasonably expected of them in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a theory complementary to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  19. Environmental health risk assessment of ambient lead levels in Lisbon, Portugal: A full chain study approach

    DEFF Research Database (Denmark)

    Casimiro, E.; Ciffroy, P.; Serpa, P.;

    2011-01-01

    The multi-causality interactions between environment and health are complex and call for an integrated multidisciplinary study approach. Emerging computational toxicology tools that link toxicology, chemistry, environmental sciences, biostatistics, and computer sciences are proving to be very useful for integrated full-chain human health risk assessments. In this study we use a newly developed computational tool, the 2FUN player, to conduct a full-chain assessment combining measured ambient air lead concentrations with multi-media modelling and PBPK simulations to estimate the health risks from ambient air levels of lead in air-borne particulates (PM10) in Lisbon, Portugal. Ambient air Pb concentrations were used together with local climate variables in the 2FUN atmospheric model to calculate the amount of Pb deposited (wet and dry) onto soil. The 2FUN environmental and PBPK models were...

  1. An Automated Microscale Thermophoresis Screening Approach for Fragment-Based Lead Discovery.

    Science.gov (United States)

    Linke, Pawel; Amaning, Kwame; Maschberger, Melanie; Vallee, Francois; Steier, Valerie; Baaske, Philipp; Duhr, Stefan; Breitsprecher, Dennis; Rak, Alexey

    2016-04-01

    Fragment-based lead discovery has proved to be an effective alternative to high-throughput screenings in identifying chemical matter that can be developed into robust lead compounds. The search for optimal combinations of biophysical techniques that can correctly and efficiently identify and quantify binding can be challenging due to the physicochemical properties of fragments. In order to minimize the time and costs of screening, optimal combinations of biophysical techniques with maximal information content, sensitivity, and robustness are needed. Here we describe an approach utilizing automated microscale thermophoresis (MST) affinity screening to identify fragments active against MEK1 kinase. MST identified multiple hits that were confirmed by X-ray crystallography but not detected by orthogonal methods. Furthermore, MST also provided information about ligand-induced aggregation and protein denaturation. The technique delivered a large number of binders while reducing experimentation time and sample consumption, demonstrating the potential of MST to execute and maximize the efficacy of fragment screening campaigns.

  2. Information-theoretic approach to lead-lag effect on financial markets

    Science.gov (United States)

    Fiedor, Paweł

    2014-08-01

    Recently the interest of researchers has shifted from the analysis of synchronous relationships of financial instruments to the analysis of more meaningful asynchronous relationships. Both types of analysis have concentrated mostly on Pearson's correlation coefficient, and consequently intraday lead-lag relationships (where one of the variables in a pair is time-lagged) are also associated with it. Under the Efficient-Market Hypothesis such relationships are not possible, as all information is embedded in the prices, but in real markets we do find such dependencies. In this paper we analyse lead-lag relationships of financial instruments and extend the known methodology by using mutual information instead of Pearson's correlation coefficient. Mutual information is not only a more general measure, sensitive to non-linear dependencies, but can also lead to a simpler procedure of statistical validation of links between financial instruments. We analyse lagged relationships using New York Stock Exchange 100 data not only on an intraday level but also for daily stock returns, which have usually been ignored.
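
    To make the lagged mutual information concrete, here is a minimal Python sketch of a histogram-based estimator applied to a pair of return series; the binning choice and helper names are illustrative, and the statistical validation step used in the paper is not shown.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=8):
        """Histogram estimate of mutual information (in bits)."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy /= pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y
        mask = pxy > 0
        return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

    def lead_lag_mi(returns_a, returns_b, lag=1):
        """MI between A's returns today and B's returns `lag` steps later."""
        return mutual_information(returns_a[:-lag], returns_b[lag:])

    # Synthetic demo: B follows A with a one-step delay
    rng = np.random.default_rng(0)
    a = rng.standard_normal(2000)
    b = 0.5 * np.concatenate([[0.0], a[:-1]]) + rng.standard_normal(2000)
    print(lead_lag_mi(a, b))  # noticeably larger than lead_lag_mi(b, a)
    ```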

  3. Flow Analysis: A Novel Approach For Classification.

    Science.gov (United States)

    Vakh, Christina; Falkova, Marina; Timofeeva, Irina; Moskvin, Alexey; Moskvin, Leonid; Bulatov, Andrey

    2016-09-01

    We suggest a novel approach for the classification of flow analysis methods according to the conditions under which the mass transfer processes and chemical reactions take place in the flow mode: dispersion-convection flow methods and forced-convection flow methods. The first group includes continuous flow analysis, flow injection analysis, all injection analysis, sequential injection analysis, sequential injection chromatography, cross injection analysis, multi-commutated flow analysis, multi-syringe flow injection analysis, multi-pumping flow systems, loop flow analysis, and simultaneous injection effective mixing flow analysis. The second group includes segmented flow analysis, zone fluidics, flow batch analysis, sequential injection analysis with a mixing chamber, stepwise injection analysis, and multi-commutated stepwise injection analysis. The proposed classification makes it possible to systematize a large number of flow analysis methods. Recent developments and applications of dispersion-convection flow methods and forced-convection flow methods are presented.

  4. Gamma radiation shielding analysis of lead-flyash concretes.

    Science.gov (United States)

    Singh, Kanwaldeep; Singh, Sukhpal; Dhaliwal, A S; Singh, Gurmel

    2014-11-04

    Six samples of lead-flyash concrete were prepared with lead as an admixture, varying the flyash content (0%, 20%, 30%, 40%, 50% and 60% by weight) by replacing cement while keeping a constant w/c ratio. Different gamma radiation interaction parameters used in radiation shielding design were computed theoretically and measured experimentally at 662 keV, 1173 keV and 1332 keV gamma-ray energies using narrow transmission geometry. The obtained results were compared with those for ordinary flyash concretes. The radiation exposure rate of the gamma radiation sources used was determined with and without the lead-flyash concretes.
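
    The narrow-beam transmission measurements above determine the linear attenuation coefficient through the Beer-Lambert law; a minimal Python sketch, where the counts and slab thickness are hypothetical, not measured values from the paper.

    ```python
    import math

    def linear_attenuation(i0, i, thickness_cm):
        """mu (1/cm) from narrow-beam transmission I = I0 * exp(-mu * x)."""
        return math.log(i0 / i) / thickness_cm

    def half_value_layer(mu):
        """Slab thickness that halves the beam intensity."""
        return math.log(2.0) / mu

    # Hypothetical counts for a 5 cm lead-flyash slab at 662 keV
    mu = linear_attenuation(i0=10000, i=4500, thickness_cm=5.0)
    print(f"mu = {mu:.3f} 1/cm, HVL = {half_value_layer(mu):.2f} cm")
    ```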

  5. The Nd Break-Up Process in Leading Order in a Three-Dimensional Approach

    CERN Document Server

    Fachruddin, I; Glöckle, W; Elster, Ch.

    2003-01-01

    A three-dimensional approach based on momentum vectors as variables for solving the three nucleon Faddeev equation in first order is presented. The nucleon-deuteron break-up amplitude is evaluated in leading order in the NN T-matrix, which is also generated directly in three dimensions avoiding a summation of partial wave contributions. A comparison of semi-exclusive observables in the $d(p,n)pp$ reaction calculated in this scheme with those generated by a traditional partial wave expansion shows perfect agreement at lower energies. At about 200 MeV nucleon laboratory energies deviations in the peak of the cross section appear, which may indicate that special care is required in a partial wave approach for energies at and higher than 200 MeV. The role of higher order rescattering processes beyond the leading order in the NN T-matrix is investigated with the result, that at 200 MeV rescattering still provides important contributions to the cross section and certain spin observables. The influence of a relativi...

  6. Nuclear microprobe analysis of lead profile in crocodile bones

    Science.gov (United States)

    Orlic, I.; Siegele, R.; Hammerton, K.; Jeffree, R. A.; Cohen, D. D.

    2003-09-01

    Elevated concentrations of lead were found in the bone and flesh of Australian free-ranging saltwater crocodiles (Crocodylus porosus). Lead shot was identified as a potential source of lead in these animals. ANSTO's heavy ion nuclear microprobe was used to measure the distribution of Pb in a number of bones and osteoderms. The aim was to find out whether elevated Pb concentrations remain in growth rings and whether the concentration is correlated with the blood levels recorded at the time. The results of our study show a very distinct distribution of accumulated Pb in bones and osteoderms as well as a good correlation with the level of lead in blood. To investigate the influence of ion species on detection limits, measurements of the same sample were performed using 3 MeV protons, 9 MeV He ions and 20 MeV carbon ions. Peak-to-background ratios, detection limits and the overall 'quality' of the obtained spectra are compared and discussed.

  7. GCM Analysis of the collective properties of lead isotopes with exact projection on particle numbers

    CERN Document Server

    Heenen, P H; Bender, M; Bonche, P; Flocard, Hubert

    2001-01-01

    We present a microscopic analysis of the collective behaviour of the lead isotopes in the vicinity of Pb208. In this study, we rely on a coherent approach based on the Generator Coordinate Method, including exact projection on the N and Z numbers, within a collective space generated by means of the constrained Hartree-Fock BCS method. With the same Hamiltonian used in the HF+BCS calculations, we have performed a comprehensive study including monopole, quadrupole and octupole excitations as well as pairing vibrations. We find that, for the considered nuclei, the collective modes which most modify the conclusions drawn from the mean-field theory are the octupole and pairing vibrations.

  8. Analysis of lead content in herbal preparations in Malaysia.

    Science.gov (United States)

    Ang, H H; Lee, E L; Matsumoto, K

    2003-08-01

    In Malaysia, the phase 3 registration for traditional medicines was implemented on 1 January 1992 under the Control of Drugs and Cosmetics Regulation 1984, emphasizing quality, efficacy and safety (including the detection of the presence of heavy metals) in all pharmaceutical dosage forms of traditional medicine preparations. Therefore, a total of 100 products in various pharmaceutical dosage forms of a herbal preparation were analysed for lead content using an atomic absorption spectrophotometer. Results showed that 8% (eight products) contained 10.64-20.72 ppm of lead and therefore do not comply with the quality requirement for traditional medicines in Malaysia. One of these products, M-Tongkat Ali (which exhibited 10.64 ± 0.37 ppm of lead), was in fact already registered with the DCA Malaysia. The rest, Sukarno Tongkat Ali, Eurycoma Madu, Super Pill Tongkat Ali, Force Pill Tongkat Ali, Tender Pill Tongkat Ali, Super Pill Tongkat Ali Plus and Great Pill Tongkat Ali Plus, had not been registered with the DCA Malaysia and exhibited 12.24-20.72 ppm of lead. Although this study showed that 92% of the products complied with the quality requirement for traditional medicines in Malaysia, they cannot be assumed safe from lead contamination because of batch-to-batch inconsistency.

  9. Transient analysis and burnout of high temperature superconducting current leads

    Science.gov (United States)

    Seol, S. Y.; Hull, J. R.

    The transient behaviour of high-temperature superconductor (HTS) current leads operated between liquid helium and liquid nitrogen temperatures is analysed for burnout conditions upon transition of the HTS into the normal state. Leads composed of HTS only and of HTS sheathed by pure silver or silver alloy are investigated numerically for temperature-dependent properties and analytically for temperature-independent properties. For lower values of shape factor (current density times length), the lead can be operated indefinitely without burnout. At higher values of shape factor, the lead reaches burnout in a finite time. With high current densities, the leads heat adiabatically. For a fixed shape factor, low current densities are desired to achieve long burnout times. To achieve a low helium boil-off rate in the superconducting state without danger of burnout, there is a preferred temperature dependence for thermal conductivity, and silver alloy sheaths are preferred to pure silver sheaths. However, for a given current density, pure silver sheaths take longer to burn out.
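
    The adiabatic limit mentioned in the abstract yields a simple estimate of the burnout time; a sketch, assuming a normal-zone energy balance with mass density $\delta$, specific heat $c(T)$, electrical resistivity $\rho(T)$ and current density $J$:

    $$ \delta\,c(T)\,\frac{dT}{dt} = \rho(T)\,J^{2} \quad\Longrightarrow\quad t_b = \frac{1}{J^{2}}\int_{T_0}^{T_b}\frac{\delta\,c(T)}{\rho(T)}\,dT, $$

    where $T_b$ is the temperature taken to define burnout; the $1/J^{2}$ scaling makes explicit why, at fixed shape factor, low current densities give long burnout times.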

  10. Crude to leads: a triple-pronged direct NMR approach in coordination with docking simulation.

    Science.gov (United States)

    Tanoli, Sheraz A K; Tanoli, Nazish U; Bondancia, Tatiani M; Usmani, Saman; Kerssebaum, Rainer; Ferreira, Antonio G; Fernandes, Joao B; Ul-Haq, Zaheer

    2013-09-01

    The screening of compounds that bind to a target of interest (a specific protein) plays a vital role in drug discovery. Usually, the identification of biologically active compounds is done from a library of structurally known compounds. However, we illustrate here that NMR techniques including saturation transfer difference (STD), transferred nuclear Overhauser effect spectroscopy (TrNOESY) and STD-TOCSY (total correlation spectroscopy), in combination with separation methods, enable not only the rapid and comprehensive screening of active components but also their unequivocal structural characterization, with a considerable saving of time in the recognition of leads. To probe the binding, a hydroethanolic fraction (1 mg) of crude extract from the natural product Rauia resinosa was used for an initial assessment with BSA protein. A docking simulation was performed with BSA in the region of Thr190, Arg198, Arg217, Trp213, Arg256, Ala290 and Tyr451 to further refine the active compound towards a lead. The docking results mimic the binding identified by STD, TrNOESY and STD-TOCSY. Isovitexin-2-rhamnoside (2) was found to be the most active compound according to both the group epitope mapping results and the docking simulation, with a relative free energy of -7.2770. This experiment provided excellent results through the direct NMR screening method. Using bovine serum albumin as a reference, we illustrate that this approach offers an excellent route to the first-hand detection of active constituents/inhibitors from natural remedies used in folk medicine.

  11. Neoclassic drug discovery: the case for lead generation using phenotypic and functional approaches.

    Science.gov (United States)

    Lee, Jonathan A; Berg, Ellen L

    2013-12-01

    Innovation and new molecular entity production by the pharmaceutical industry have been below expectations. Surprisingly, more first-in-class small-molecule drugs approved by the U.S. Food and Drug Administration (FDA) between 1999 and 2008 were identified by functional phenotypic lead generation strategies reminiscent of pre-genomics pharmacology than by the contemporary molecular targeted strategies that encompass the vast majority of lead generation efforts. This observation, in conjunction with the difficulty of validating molecular targets for drug discovery, has diminished the impact of the "genomics revolution" and has led to a growing grassroots movement, and now a broader trend in pharma, to reconsider the use of modern physiology-based or phenotypic drug discovery (PDD) strategies. This "From the Guest Editors" column provides an introduction and overview of the two-part special issue of the Journal of Biomolecular Screening on PDD. Terminology and the business case for the use of PDD are defined. Key issues such as assay performance, chemical optimization, target identification, and challenges to the organization and implementation of PDD are discussed. Possible solutions to these challenges and a new neoclassic vision for PDD that combines phenotypic and functional approaches with technology innovations resulting from the genomics-driven era of target-based drug discovery (TDD) are also described. Finally, an overview of the manuscripts in this special edition is provided.

  12. Lead identification for the K-Ras protein: virtual screening and combinatorial fragment-based approaches

    Directory of Open Access Journals (Sweden)

    Pathan AAK

    2016-05-01

    Akbar Ali Khan Pathan,1,2,* Bhavana Panthi,3,* Zahid Khan,1 Purushotham Reddy Koppula,4–6 Mohammed Saud Alanazi,1 Sachchidanand,3 Narasimha Reddy Parine,1 Mukesh Chourasia3,* 1Genome Research Chair (GRC), Department of Biochemistry, College of Science, King Saud University, 2Integrated Gulf Biosystems, Riyadh, Kingdom of Saudi Arabia; 3Department of Pharmacoinformatics, National Institute of Pharmaceutical Education and Research, Hajipur, India; 4Department of Internal Medicine, School of Medicine, 5Harry S. Truman Memorial Veterans Affairs Hospital, 6Department of Radiology, School of Medicine, Columbia, MO, USA *These authors contributed equally to this work Objective: Kirsten rat sarcoma (K-Ras) protein is a member of the Ras family, which belongs to the small guanosine triphosphatase superfamily. The members of this family share a conserved structure and biochemical properties, acting as binary molecular switches. Guanosine triphosphate-bound active K-Ras interacts with a range of effectors, resulting in the stimulation of downstream signaling pathways regulating cell proliferation, differentiation, and apoptosis. Efforts to target K-Ras have been unsuccessful until now, placing it among high-value molecules against which developing a therapy would have an enormous impact. K-Ras transduces signals when bound to guanosine triphosphate by directly binding to downstream effector proteins, whereas in the guanosine diphosphate-bound conformation these interactions are disrupted. Methods: In the present study, we targeted the nucleotide-binding site in the "on" and "off" state conformations of the K-Ras protein to find suitable lead compounds. A structure-based virtual screening approach was used to screen compounds from different databases, followed by a combinatorial fragment-based approach to design an apposite lead for the K-Ras protein. Results: Interestingly, the designed compounds exhibit a binding preference for the

  13. Differences between IC Analysis and TG Approach

    Institute of Scientific and Technical Information of China (English)

    张美玲

    2014-01-01

    Structuralism and the generative approach are two representative syntactic theories, which study language from different perspectives. They employ different methodologies, namely Immediate Constituent (IC) Analysis and the transformational-generative (TG) approach, to carry out syntactic analysis. In this paper, these two methods are applied to the analysis of sample sentences for further discussion and comparison. The analysis shows that both methods have their merits and inadequacies. To some extent, the TG method can help IC analysis solve some problems. However, TG grammar is by no means complete and perfect; improvements are needed to reach its ultimate goal of producing a universal grammar for all human languages.

  14. Document Analysis by Crosscount Approach

    Institute of Scientific and Technical Information of China (English)

    王海琴; 戴汝为

    1998-01-01

    In this paper a new feature for document analysis, called crosscount, is introduced. The crosscount feature is a function of white line segments starting at the edge of a document image. It reflects not only the contour of the image, but also the periodicity of white lines (background) and text lines in the document image. Complex printed-page layouts contain different kinds of blocks: textual, graphical, tabular, and so on. Of these, textual blocks show the most obvious periodicity, with their homogeneous white lines arranged regularly, and this important property of textual blocks can be extracted by crosscount functions. Here, document layouts are classified into three classes on the basis of their physical structure. The definition and properties of the crosscount function are then described. Following this classification of document layouts, the application of the new feature to the analysis and understanding of different types of document images is discussed.

  15. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) in which both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure for the glassy-carbon electrode. The activation is performed by pre-concentrating zinc on glassy carbon at -1400 mV (SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found ...
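
    As an illustration of the calibration step referred to in the closing sentence, the sketch below fits a straight line to invented peak-area/concentration pairs and inverts it for an unknown; none of the numbers are from the study.

    ```python
    import numpy as np

    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])         # ppm Zn2+ standards (invented)
    peak_area = np.array([0.9, 2.1, 4.0, 8.3, 16.1])   # derivative-mode peak areas (a.u.)

    slope, intercept = np.polyfit(conc, peak_area, 1)  # least-squares calibration line
    r = np.corrcoef(conc, peak_area)[0, 1]

    unknown_area = 5.6
    estimated_conc = (unknown_area - intercept) / slope
    print(f"slope={slope:.3f}, intercept={intercept:.3f}, r^2={r**2:.4f}")
    print(f"estimated Zn concentration: {estimated_conc:.2f} ppm")
    ```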

  16. Analysis Of Transport Properties of Mechanically Alloyed Lead Tin Telluride

    Science.gov (United States)

    Krishna, Rajalakshmi

    these inclusions would not be less than that expected in alloys without these inclusions, while the portion of the thermal conductivity not due to charge carriers (the lattice thermal conductivity) would be less than expected from alloys without these inclusions. Furthermore, it would be possible to approximate the observed changes in the electrical and thermal transport properties using existing physical models for the scattering of electrons and phonons by small inclusions. The approach taken to investigate this hypothesis was first to experimentally characterize the mobile carrier concentration at room temperature, along with the extent and type of secondary phase inclusions present, in a series of three mechanically alloyed Pb1-xSnxTe alloys with different Sn content. Second, a physically based computational model was developed. This model was used to determine what the electronic conductivity, Seebeck coefficient, total thermal conductivity, and the portion of the thermal conductivity not due to mobile charge carriers would be in these particular Pb1-xSnxTe alloys if there were no secondary phase inclusions. Third, the electronic conductivity, Seebeck coefficient and total thermal conductivity were experimentally measured for these three alloys, with inclusions present, at elevated temperatures. The model predictions for electrical conductivity and Seebeck coefficient were directly compared to the experimental elevated-temperature electrical transport measurements. The computational model was then used to extract the lattice thermal conductivity from the experimentally measured total thermal conductivity. This lattice thermal conductivity was then compared to what would be expected from the alloys in the absence of secondary phase inclusions. Secondary phase inclusions were determined by X-ray diffraction analysis to be present in all three alloys to a varying extent. The inclusions were found not to significantly degrade electrical

  17. Long lead-time flood forecasting using data-driven modeling approaches

    Science.gov (United States)

    Bhatia, N.; He, J.; Srivastav, R. K.

    2014-12-01

    In spite of the numerous structural measures taken against floods, accurate flood forecasting is essential to reduce damage in hazardous areas considerably. The need to produce more accurate flow forecasts motivates researchers to develop advanced, innovative methods. In this study, we propose a hybrid neural network model that exploits the strengths of artificial neural networks (ANNs). The proposed model has two components: i) a dual-ANN model developed using river flows; and ii) a multiple linear regression (MLR) model trained on meteorological data (rainfall and snow on ground). Potential model inputs that best represent the river basin processes were selected stepwise by identifying the input-output relationship using a linear approach, Partial Correlation Input Selection (PCIS), combined with the Akaike Information Criterion (AIC). The presented hybrid model was compared with three conventional methods: i) a feed-forward artificial neural network (FF-ANN) using daily river flows; ii) an FF-ANN applied to decomposed river flows (low flow, rising limb and falling limb of the hydrograph); and iii) a recursive method for daily river flows with a lead time of 7 days. The applicability of the presented model is illustrated with daily river flow data for the Bow River, Canada. Data from 1912 to 1976 were used to train the models, while data from 1977 to 2006 were used to validate them. The results indicate that the proposed model is robust enough to capture the non-linear nature of the hydrograph and is highly promising for forecasting peak flows (extreme values) well in advance (at higher lead times).
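
    A minimal sketch of such a hybrid forecaster, under assumed synthetic data, is given below: an ANN on lagged flows, a linear model on meteorological inputs, and a simple average of the two predictions. The lags, blend rule, and data generator are illustrative assumptions, not the authors' configuration.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    n, lead = 500, 7                                 # record length, 7-day lead time
    rain = rng.gamma(2.0, 2.0, n)                    # synthetic rainfall
    snow = rng.gamma(1.5, 1.0, n)                    # synthetic snow on ground
    flow = np.convolve(rain, [0.5, 0.3, 0.2])[:n] + 0.1 * snow + rng.normal(0, 0.2, n)

    m = n - lead - 2                                 # usable samples
    X_flow = np.column_stack([flow[i:i + m] for i in range(3)])  # three lagged flows
    X_met = np.column_stack([rain[2:2 + m], snow[2:2 + m]])      # meteorological inputs
    y = flow[2 + lead:]                              # flow `lead` days ahead

    ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
    mlr = LinearRegression()
    ann.fit(X_flow, y)
    mlr.fit(X_met, y)

    hybrid = 0.5 * (ann.predict(X_flow) + mlr.predict(X_met))   # naive blend
    rmse = float(np.sqrt(np.mean((hybrid - y) ** 2)))
    print(f"in-sample RMSE of the hybrid forecast: {rmse:.3f}")
    ```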

  18. Boolean approach to common event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Worrell, R.B.; Stack, D.W.

    1980-01-01

    Although different phenomena may be involved, the problem that must be solved for each kind of common event analysis is essentially the same: to determine the effect of common events on the behavior of a system. A Boolean approach to the problem is set forth. Because of the large equations that arise, processing must be done by computers. Vital location analysis is a particular kind of common event analysis that is used to study ways to prevent the sabotage of nuclear reactors. (RWR)

  19. Introduction to Real Analysis An Educational Approach

    CERN Document Server

    Bauldry, William C

    2011-01-01

    An accessible introduction to real analysis and its connection to elementary calculus Bridging the gap between the development and history of real analysis, Introduction to Real Analysis: An Educational Approach presents a comprehensive introduction to real analysis while also offering a survey of the field. With its balance of historical background, key calculus methods, and hands-on applications, this book provides readers with a solid foundation and fundamental understanding of real analysis. The book begins with an outline of basic calculus, including a close examination of problems illust

  20. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
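
    To make the box-counting computation concrete, here is a minimal sketch run on a synthetic binary image; it is a stand-in for the authors' confocal pipeline, and the companion lacunarity computation is omitted for brevity.

    ```python
    import numpy as np

    def box_count(img, size):
        """Count boxes of side `size` containing at least one foreground pixel."""
        h, w = img.shape
        return sum(img[i:i + size, j:j + size].any()
                   for i in range(0, h, size) for j in range(0, w, size))

    def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
        counts = [box_count(img, s) for s in sizes]
        # slope of log(count) vs log(1/size) estimates the box-counting dimension
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(1)
    img = rng.random((256, 256)) > 0.7       # synthetic binary "fibre" mask
    print(f"box-counting dimension ~ {fractal_dimension(img):.2f}")
    ```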

  1. Drag Coefficient of Water Droplets Approaching the Leading Edge of an Airfoil

    Science.gov (United States)

    Vargas, Mario; Sor, Suthyvann; Magarino, Adelaida Garcia

    2013-01-01

    This work presents results of an experimental study on droplet deformation and breakup near the leading edge of an airfoil. The experiment was conducted in the rotating rig test cell at the Instituto Nacional de Tecnica Aeroespacial (INTA) in Madrid, Spain. An airfoil model was placed at the end of the rotating arm and a monosize droplet generator produced droplets that fell from above, perpendicular to the path of the airfoil. The interaction between the droplets and the airfoil was captured with high speed imaging and allowed observation of droplet deformation and breakup as the droplet approached the airfoil near the stagnation line. Image processing software was used to measure the position of the droplet centroid, equivalent diameter, perimeter, area, and the major and minor axes of an ellipse superimposed over the deforming droplet. The horizontal and vertical displacement of each droplet against time was also measured, and the velocity, acceleration, Weber number, Bond number, Reynolds number, and the drag coefficients were calculated along the path of the droplet to the beginning of breakup. Results are presented and discussed for drag coefficients of droplets with diameters in the range of 300 to 1800 micrometers, and airfoil velocities of 50, 70 and 90 meters/second. The effect of droplet oscillation on the drag coefficient is discussed.
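
    As a back-of-the-envelope illustration of the non-dimensional groups named above, the sketch below evaluates them for a single droplet; the fluid properties, diameter, airspeed, and acceleration are assumed values, not INTA measurements.

    ```python
    import math

    rho_air, mu_air = 1.2, 1.8e-5       # air density [kg/m^3], viscosity [Pa*s]
    rho_w, sigma = 1000.0, 0.072        # water density [kg/m^3], surface tension [N/m]

    d = 500e-6                          # droplet diameter [m], within 300-1800 um
    U = 70.0                            # relative airspeed [m/s]
    a = 2.0e4                           # droplet acceleration [m/s^2], assumed

    We = rho_air * U**2 * d / sigma     # Weber number (deformation vs. tension)
    Re = rho_air * U * d / mu_air       # Reynolds number
    Bo = rho_w * a * d**2 / sigma       # Bond number using droplet acceleration
    mass = rho_w * math.pi * d**3 / 6   # droplet mass
    Cd = mass * a / (0.5 * rho_air * U**2 * math.pi * d**2 / 4)  # drag coefficient

    print(f"We={We:.1f}  Re={Re:.0f}  Bo={Bo:.2f}  Cd={Cd:.2f}")
    ```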

  2. Comprehensive analysis of 5-aminolevulinic acid dehydratase (ALAD) variants and renal cell carcinoma risk among individuals exposed to lead.

    Directory of Open Access Journals (Sweden)

    Dana M van Bemmel

    BACKGROUND: Epidemiologic studies have reported associations between lead exposure and human cancers. A polymorphism in the 5-aminolevulinic acid dehydratase (ALAD) gene affects lead toxicokinetics and may modify the adverse effects of lead. METHODS: The objective of this study was to evaluate single-nucleotide polymorphisms (SNPs) tagging the ALAD region among renal cancer cases and controls to determine whether genetic variation alters the relationship between lead and renal cancer. Occupational exposure to lead and risk of cancer was examined in a case-control study of renal cell carcinoma (RCC). Comprehensive analysis of variation across the ALAD gene was assessed using a tagging SNP approach among 987 cases and 1298 controls. Occupational lead exposure was estimated using questionnaire-based exposure assessment and expert review. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using logistic regression. RESULTS: The adjusted risk associated with the ALAD variant rs8177796 (CT/TT) was increased (OR = 1.35, 95% CI = 1.05-1.73, p-value = 0.02) when compared to the major allele, regardless of lead exposure. Joint effects of lead and ALAD rs2761016 suggest an increased RCC risk for the homozygous wild-type and heterozygous alleles (GG: OR = 2.68, 95% CI = 1.17-6.12, p = 0.01; GA: OR = 1.79, 95% CI = 1.06-3.04), with an interaction approaching significance (p-int = 0.06). No significant modification in RCC risk was observed for the functional variant rs1800435 (K68N). Haplotype analysis identified a region associated with risk, supporting the tagging SNP results. CONCLUSION: Common genetic variation in ALAD may alter the risk of RCC overall and among individuals occupationally exposed to lead. Further work in larger exposed populations is warranted to determine whether ALAD modifies RCC risk associated with lead exposure.
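
    For readers wanting the arithmetic behind figures such as "OR = 1.35, 95% CI = 1.05-1.73", the sketch below computes an unadjusted odds ratio with Woolf confidence limits from a 2x2 table; the study itself used covariate-adjusted logistic regression, and the counts here are invented.

    ```python
    import math

    # variant carriers / non-carriers among cases and controls (toy counts)
    a, b = 300, 687    # cases: carriers, non-carriers
    c, d = 330, 968    # controls: carriers, non-carriers

    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # Woolf standard error
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    print(f"OR = {or_:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
    ```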

  3. Index analysis approach theory at work

    CERN Document Server

    Lowen, R

    2015-01-01

    A featured review of the AMS describes the author’s earlier work in the field of approach spaces as, ‘A landmark in the history of general topology’. In this book, the author has expanded this study further and taken it in a new and exciting direction.   The number of conceptually and technically different systems which characterize approach spaces is increased and moreover their uniform counterpart, uniform gauge spaces, is put into the picture. An extensive study of completions, both for approach spaces and for uniform gauge spaces, as well as compactifications for approach spaces is performed. A paradigm shift is created by the new concept of index analysis.   Making use of the rich intrinsic quantitative information present in approach structures, a technique is developed whereby indices are defined that measure the extent to which properties hold, and theorems become inequalities involving indices; therefore vastly extending the realm of applicability of many classical results. The theory is the...

  4. Comparative analysis of using natural and radiogenic lead as heat-transfer agent in fast reactors

    Science.gov (United States)

    Laas, R. A.; Gizbrekht, R. V.; Komarov, P. A.; Nesterov, V. N.

    2016-06-01

    Fast reactors with lead coolant have several advantages over analogous designs. Performance can be further improved by replacing lead of natural isotopic composition with radiogenic lead. Such a replacement changes both the induced radioactivity of the coolant and the effective neutron multiplication factor of the core, and both effects need to be estimated. To address these issues, an analysis of the scheme of nuclear transformations in the lead heat-transfer agent under irradiation was carried out. The induced radioactivity of radiogenic and natural lead has been studied, and the effect of the replacement on the multiplication factor characterized. The results show that the use of radiogenic lead can significantly affect reactor operation.

  5. Publication Trends in Thanatology: An Analysis of Leading Journals.

    Science.gov (United States)

    Wittkowski, Joachim; Doka, Kenneth J; Neimeyer, Robert A; Vallerga, Michael

    2015-01-01

    To identify important trends in thanatology as a discipline, the authors analyzed over 1,500 articles that appeared in Death Studies and Omega over a 20-year period, coding the category of articles (e.g., theory, application, empirical research), their content focus (e.g., bereavement, death attitudes, end-of-life), and for empirical studies, their methodology (e.g., quantitative, qualitative). In general, empirical research predominates in both journals, with quantitative methods outnumbering qualitative procedures 2 to 1 across the period studied, despite an uptick in the latter methods in recent years. Purely theoretical articles, in contrast, decline in frequency. Research on grief and bereavement is the most commonly occurring (and increasing) content focus of this work, with a declining but still substantial body of basic research addressing death attitudes. Suicidology is also well represented in the corpus of articles analyzed. In contrast, publications on topics such as death education, medical ethics, and end-of-life issues occur with lower frequency, in the latter instances likely due to the submission of such work to more specialized medical journals. Differences in emphasis of Death Studies and Omega are noted, and the analysis of publication patterns is interpreted with respect to overall trends in the discipline and the culture, yielding a broad depiction of the field and some predictions regarding its possible future.

  6. Potentiometric stripping analysis of lead and cadmium leaching from dental prosthetic materials and teeth

    OpenAIRE

    GORAN M. NIKOLIC; BILJANA M. KALICANIN; RUZICA S. NIKOLIC

    2004-01-01

    Potentiometric stripping analysis (PSA) was applied for the determination of lead and cadmium leaching from dental prosthetic materials and teeth. The soluble lead content of finished dental implants was found to be much lower than that of the individual components used in their preparation. Cadmium was not detected in dental implants and materials under the defined conditions. The soluble lead and cadmium content of teeth was slightly lower than the lead and cadmium content in whole teeth (w...

  7. A New Approach for Aeroelastic Robust Stability Analysis

    Institute of Scientific and Technical Information of China (English)

    Wu Zhigang; Yang Chao

    2008-01-01

    Air vehicles undergo variations in structural mass and stiffness because of fuel consumption and the failure of structural components, which can seriously affect their aeroelastic characteristics. An approach for aeroelastic robust stability analysis that takes into account perturbations of structural mass and stiffness is developed. Applying the perturbation method and harmonic unsteady aerodynamic forces, a frequency-domain linear fractional transformation (LFT) representation of the perturbed aeroelastic system is modeled. The robust stability is then analyzed using the structured singular value (μ) method. The numerical results for a bi-spar wing show the method's effectiveness and low computational cost in dealing with robustness problems involving mass and stiffness perturbations. In engineering practice, the robust approach can be applied to flutter analysis of airplanes with fuel load variation, taking damage conditions into consideration.

  8. Potential of ayurgenomics approach in complex trait research: leads from a pilot study on rheumatoid arthritis.

    Directory of Open Access Journals (Sweden)

    Ramesh C Juyal

    BACKGROUND: Inconsistent results across association studies, including genome-wide association studies, have posed a major challenge in complex disease genetics. Of the several contributing factors, phenotypic heterogeneity is a serious limitation encountered in modern medicine. On the other hand, Ayurveda, a holistic Indian traditional system of medicine, enables subgrouping of individuals into three major categories, namely Vata, Pitta and Kapha, based on their physical and mental constitution, referred to as Prakriti. We hypothesised that conditioning association studies on prior risk, predictable in Ayurveda, would uncover much more variance and potentially open up more predictive health care. OBJECTIVES AND METHODS: Identification of genetic susceptibility markers by combining prakriti-based subgrouping of individuals with genetic analysis tools was attempted in a rheumatoid arthritis (RA) cohort. Association of 21 markers from commonly implicated inflammatory and oxidative stress pathways was tested using a case-control approach in a total cohort comprising 325 cases and 356 controls, and in the three subgroups separately. We also tested a few postulates of Ayurveda on disease characteristics in the different prakriti groups using clinico-genetic data. RESULTS: Inflammatory genes such as IL1β (C-C-C haplotype, p = 0.0005, OR = 3.09) and CD40 (rs4810485 allelic, p = 0.04, OR = 2.27) appear to be the determinants in the Vata subgroup, whereas oxidative stress pathway genes are observed in the Pitta (SOD3 rs699473, p = 0.004, OR = 1.83; rs2536512, p = 0.005, OR = 1.88; and PON1 rs662, p = 0.04, OR = 1.53) and Kapha (SOD3 rs2536512, genotypic, p = 0.02, OR = 2.39) subgroups. Fixed-effect analysis of the associated markers from CD40, SOD3 and TNFα with genotype x prakriti interaction terms suggests heterogeneity of effects within the subgroups. Further, disease characteristics such as severity were most pronounced in the Vata group. CONCLUSIONS: This exploratory study suggests discrete causal

  9. Linking cases of illegal shootings of the endangered California condor using stable lead isotope analysis

    Energy Technology Data Exchange (ETDEWEB)

    Finkelstein, Myra E., E-mail: myraf@ucsc.edu [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Kuspa, Zeka E. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States); Welch, Alacia [National Park Service, Pinnacles National Park, 5000 Highway 146, Paicines, CA 95043 (United States); Eng, Curtis; Clark, Michael [Los Angeles Zoo and Botanical Gardens, 5333 Zoo Drive, Los Angeles, CA 90027 (United States); Burnett, Joseph [Ventana Wildlife Society, 19045 Portola Dr. Ste. F-1, Salinas, CA 93908 (United States); Smith, Donald R. [Microbiology and Environmental Toxicology Department, University of California, Santa Cruz, CA 95064 (United States)

    2014-10-15

    Lead poisoning is preventing the recovery of the critically endangered California condor (Gymnogyps californianus), and lead isotope analyses have demonstrated that ingestion of spent lead ammunition is the principal source of lead poisoning in condors. Over an 8 month period in 2009, three lead-poisoned condors were independently presented with birdshot embedded in their tissues, evidencing that they had been shot. No information connecting these illegal shooting events existed, and the timing of the shooting(s) was unknown. Using lead concentration and stable lead isotope analyses of feathers, blood, and recovered birdshot, we observed that: i) lead isotope ratios of embedded shot from all three birds were measurably indistinguishable from each other, suggesting a common source; ii) lead exposure histories reconstructed from feather analysis suggested that the shooting(s) occurred within the same timeframe; and iii) two of the three condors were lead poisoned from a lead source isotopically indistinguishable from the embedded birdshot, implicating ingestion of this type of birdshot as the source of poisoning. One of the condors was subsequently lead poisoned the following year from ingestion of a lead buckshot (blood lead 556 µg/dL), illustrating that ingested shot poses a substantially greater lead poisoning risk than embedded shot retained in tissue (blood lead ∼20 µg/dL). To our knowledge, this is the first study to use lead isotopes as a tool to retrospectively link wildlife shooting events. - Highlights: • We conducted a case-based analysis of illegal shootings of California condors. • Blood and feather Pb isotopes were used to reconstruct the illegal shooting events. • Embedded birdshot from the three condors had the same Pb isotope ratios. • Feather and blood Pb isotopes indicated that the condors were shot in a common event. • Ingested shot causes substantially greater lead exposure compared to embedded shot.

  10. Practical Approach to Fragility Analysis of Bridges

    Directory of Open Access Journals (Sweden)

    Yasamin Rafie Nazari

    2012-12-01

    Damage during past earthquakes reveals the seismic vulnerability of bridge structures and the necessity of a probabilistic approach to the seismic performance evaluation of bridges and its interpretation in terms of decision variables such as repair cost, downtime and loss of life. This procedure involves hazard analysis, structural analysis, damage analysis and loss analysis. The purpose of the present study is to review the different methods developed to derive fragility curves for damage analysis of bridges, and to demonstrate a simple procedure for fragility analysis, using a Microsoft Office Excel worksheet, that yields the probability of reaching a predefined damage level at different levels of the seismic demand parameter. The input of this procedure is the intensity of ground motion and the output is an appropriate estimate of the expected damage. Observed damage to bridges is discussed and compared with practical definitions of damage states. Different methods of fragility analysis are discussed, and a practical step-by-step example is illustrated.
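
    A minimal sketch of the lognormal fragility form that is standard in this literature follows; the median capacity and dispersion below are placeholders, not values from any of the reviewed methods.

    ```python
    import math

    def fragility(im, median=0.45, beta=0.6):
        """P(damage >= state | IM = im): lognormal CDF in the intensity measure.

        median -- median capacity in the same units as the intensity measure
        beta   -- logarithmic standard deviation (dispersion)
        """
        z = (math.log(im) - math.log(median)) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    for pga in (0.1, 0.2, 0.45, 0.8):      # peak ground acceleration [g]
        print(f"PGA={pga:.2f} g -> P(damage) = {fragility(pga):.2f}")
    ```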

  11. The conformal approach to asymptotic analysis

    CERN Document Server

    Nicolas, Jean-Philippe

    2015-01-01

    This essay was written as an extended version of a talk given at a conference in Strasbourg on "Riemann, Einstein and geometry", organized by Athanase Papadopoulos in September 2014. Its aim is to present Roger Penrose's approach to asymptotic analysis in general relativity, which is based on conformal geometric techniques, focusing on historical and recent aspects of two specialized topics: conformal scattering and peeling.

  12. An Ethnographic Approach to Video Analysis

    DEFF Research Database (Denmark)

    Holck, Ulla

    2007-01-01

    European Music Therapy Congress, June 16-20, 2004 Jyväskylä, Finland. P. 1094-1110. eBook available at MusicTherapyToday.com Vol.6. Issue 4 (November 2005). Holck, U. (2007). An Ethnographic Descriptive Approach to Video Micro Analysis. In: T. Wosch & T. Wigram (Eds.) Microanalysis in music therapy...

  13. Approaches to Assessment in Multivariate Analysis.

    Science.gov (United States)

    O'Connell, Ann A.

    This paper reviews trends in assessment in quantitative courses and illustrates several options and approaches to assessment for advanced courses at the graduate level, especially in multivariate analysis. The paper provides a summary of how a researcher has used alternatives to traditional methods of assessment in a course on multivariate…

  14. Introduction to audio analysis a MATLAB approach

    CERN Document Server

    Giannakopoulos, Theodoros

    2014-01-01

    Introduction to Audio Analysis serves as a standalone introduction to audio analysis, providing theoretical background to many state-of-the-art techniques. It covers the essential theory necessary to develop audio engineering applications, but also uses programming techniques, notably MATLAB®, to take a more applied approach to the topic. Basic theory and reproducible experiments are combined to demonstrate theoretical concepts from a practical point of view and provide a solid foundation in the field of audio analysis. Audio feature extraction, audio classification, audio segmentation, au

  15. A factorization approach to next-to-leading-power threshold logarithms

    NARCIS (Netherlands)

    Bonocore, D.; Laenen, E.; Magnea, L.; Melville, S.; Vernazza, L.; White, C. D.

    2015-01-01

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in th

  16. Global analysis of nuclear parton distribution functions at leading and next-to-leading order perturbative QCD

    CERN Document Server

    Paukkunen, Hannu

    2009-06-01

    This is the introductory part of my PhD thesis, which consists of two parts: the separate introduction and four published articles. The introduction begins with a technically detailed description of the DGLAP evolution and of the fast numerical method for solving the DGLAP equations that has been used in the numerical work of the published articles of this thesis. A write-up of the next-to-leading order (NLO) calculations for the deeply inelastic scattering (DIS) and Drell-Yan (DY) dilepton production cross-sections is also included. The formalism of inclusive single hadron production at NLO is described as well, although less rigorously. The introductory part ends with a discussion of global QCD analyses in general, with special attention paid to the major work of this thesis, the NLO analysis of nuclear parton densities and their uncertainties.

  17. An International Pooled Analysis for Obtaining a Benchmark Dose for Environmental Lead Exposure in Children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce;

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack ... yielding lower confidence limits (BMDLs) of about 0.1-1.0 for the dose leading to a loss of one IQ point. We conclude that currently allowable blood lead concentrations need to be lowered and that further prevention efforts are needed to protect children from lead toxicity.
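
    The mechanics of a benchmark-dose calculation can be illustrated with a toy linear fit: invert the slope for the dose producing a one-IQ-point loss (BMD) and take the BMDL from the slope's confidence bound. The pooled analysis used a supra-linear dose-response model and covariate adjustment, so this simulated example shows only the shape of the computation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    blood_pb = rng.uniform(1, 30, 400)                 # ug/dL, simulated
    iq = 100 - 0.6 * blood_pb + rng.normal(0, 8, 400)  # assumed slope: -0.6 IQ/(ug/dL)

    X = np.column_stack([np.ones_like(blood_pb), blood_pb])
    beta, res, *_ = np.linalg.lstsq(X, iq, rcond=None)
    sigma2 = res[0] / (len(iq) - 2)                    # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    slope, se = beta[1], np.sqrt(cov[1, 1])

    bmd = 1.0 / abs(slope)                  # dose for a 1-point IQ loss
    slope_lower = slope - 1.645 * se        # steeper (more negative) 95% bound
    bmdl = 1.0 / abs(slope_lower)           # lower confidence limit on the BMD
    print(f"slope={slope:.3f} IQ/(ug/dL); BMD={bmd:.2f}, BMDL={bmdl:.2f} ug/dL")
    ```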

  18. Blood, urine, and hair kinetic analysis following an acute lead intoxication.

    Science.gov (United States)

    Ho, G; Keutgens, A; Schoofs, R; Kotolenko, S; Denooz, R; Charlier, C

    2011-01-01

    A case of lead exposure resulting from accidental ingestion of a lead-containing solution is reported. Because chelation therapy with sodium 2,3-dimercaptopropane sulfonate and meso-2,3-dimercaptosuccinic acid was administered promptly, the blood lead levels of this 51-year-old patient remained moderate (412.9 μg/L) and no clinical symptoms were observed. Numerous blood and urine samples were collected for kinetic analysis of lead elimination. To our knowledge, this is the first case in which hair samples were analyzed to determine the excretion of lead after an acute intoxication.

  19. Reclamation of lead/zinc processing wastes at Kabwe, Zambia: A phytogeochemical approach

    OpenAIRE

    Leteinturier, B.; Laroche, J.; Matera, J; Malaisse, François

    2001-01-01

    The lead/zinc mining industry of Kabwe (Central Province of Zambia), in operation from 1906 to 1994, generated metalliferous slag heaps covering an area of over 75 ha. The slag heaps are responsible for aerosol emissions with a high heavy metal content over the mine townships of Kasanda and Chowa, resulting in health risks for the local population. In this phytogeochemical investigation, soil samples showed very high lead, zinc and copper concentrations in topsoil. Plant surveys identified 39...

  20. A new analytical approach to understanding nanoscale lead-iron interactions in drinking water distribution systems.

    Science.gov (United States)

    Trueman, Benjamin F; Gagnon, Graham A

    2016-07-05

    High levels of iron in distributed drinking water often accompany elevated lead release from lead service lines and other plumbing. Lead-iron interactions in drinking water distribution systems are hypothesized to be the result of adsorption and transport of lead by iron oxide particles. This mechanism was explored using point-of-use drinking water samples characterized by size exclusion chromatography with UV and multi-element (ICP-MS) detection. In separations on two different stationary phases, high apparent molecular weight (>669 kDa) elution profiles for (56)Fe and (208)Pb were strongly correlated (average R(2)=0.96, N=73 samples representing 23 single-unit residences). Moreover, (56)Fe and (208)Pb peak areas exhibited an apparent linear dependence (R(2)=0.82), consistent with mobilization of lead via adsorption to colloidal particles rich in iron. A UV254 absorbance peak, coincident with high molecular weight (56)Fe and (208)Pb, implied that natural organic matter was interacting with the hypothesized colloidal species. High molecular weight UV254 peak areas were correlated with both (56)Fe and (208)Pb peak areas (R(2)=0.87 and 0.58, respectively). On average, 45% (std. dev. 10%) of total lead occurred in the size range 0.05-0.45 μm.

  1. An Overview of Focal Approaches of Critical Discourse Analysis

    OpenAIRE

    Maryam Jahedi; Faiz Sathi Abdullah; Jayakaran Mukundan

    2014-01-01

    This article aims to present detailed accounts of the central approaches to Critical Discourse Analysis. It focuses on the work of three prominent scholars: Fairclough's critical approach, Wodak's discourse-historical approach and Van Dijk's socio-cognitive approach. The study concludes that a combination of these three approaches can be useful for the critical analysis of texts. Keywords: Critical Discourse Analysis; Discourse-historical Approach; Socio-cognitive Approach

  2. The right side? Under time pressure, approach motivation leads to right-oriented bias

    NARCIS (Netherlands)

    M. Roskes; D. Sligte; S. Shalvi; C.K.W. de Dreu

    2011-01-01

    Approach motivation, a focus on achieving positive outcomes, is related to relative left-hemispheric brain activation, which translates to a variety of right-oriented behavioral biases. In two studies, we found that approach-motivated individuals display a right-oriented bias, but only when they are

  3. ANALYSIS APPROACHES TO EVALUATION OF INFORMATION PROTECTION

    Directory of Open Access Journals (Sweden)

    Zyuzin A. S.

    2015-03-01

    The article is devoted to the topical problem of assessing the security of information systems and the importance of obtaining objective, quantitative assessment results. The author proposes creating a comprehensive information security system using a systems approach applied at each stage of the information system's life cycle. On the basis of this approach, the author formulates a general scheme for assessing the information security of an information system, as well as principles for choosing the assessment method. Existing quantitative assessment methods based on object-oriented methods of systems analysis, and the objectivity of the estimates obtained with this approach, are considered. The analysis reveals serious shortcomings of the modern techniques used for assessing information system security, which motivates the creation of a scientific and methodological apparatus that increases the objectivity and comprehensiveness of information security assessment through the formalization of expert data. The applicability of this approach for rapidly obtaining a quantitative information security assessment under changing threat dynamics and during the operation and development of the information system is considered. The problem of assessing the security of automated information systems is then formulated, and a general technique for assessing the means of information protection in systems of this type is presented.

  4. A Modular Approach for Automating Video Analysis

    OpenAIRE

    Nadarajan, Gayathri; Renouf, Arnaud

    2007-01-01

    Automating the steps involved in video processing has yet to be tackled with much success by vision developers and knowledge engineers. This is due to the difficulty of formulating vision problems and their solutions in a generalised manner. In this collaborative work, we introduce a modular approach that utilises ontologies to capture the goals, domain description and capabilities for performing video analysis. This modularisation is tested on real-world videos from an...

  5. A factorization approach to next-to-leading-power threshold logarithms

    CERN Document Server

    Bonocore, D; Magnea, L; Melville, S; Vernazza, L; White, C D

    2015-01-01

    Threshold logarithms become dominant in partonic cross sections when the selected final state forces gluon radiation to be soft or collinear. Such radiation factorizes at the level of scattering amplitudes, and this leads to the resummation of threshold logarithms which appear at leading power in the threshold variable. In this paper, we consider the extension of this factorization to include effects suppressed by a single power of the threshold variable. Building upon the Low-Burnett-Kroll-Del Duca (LBKD) theorem, we propose a decomposition of radiative amplitudes into universal building blocks, which contain all effects ultimately responsible for next-to-leading power (NLP) threshold logarithms in hadronic cross sections for electroweak annihilation processes. In particular, we provide a NLO evaluation of the "radiative jet function", responsible for the interference of next-to-soft and collinear effects in these cross sections. As a test, using our expression for the amplitude, we reproduce all abelian-lik...

  6. Analysis approaches and interventions with occupational performance

    Science.gov (United States)

    Ahn, Sinae

    2016-01-01

    [Purpose] The purpose of this study was to analyze approaches and interventions targeting occupational performance in patients with stroke. [Subjects and Methods] Articles published in the past 10 years were searched using the key terms "occupational performance AND stroke" and "occupational performance AND CVA". A total of 252 articles were identified, and 79 articles were selected. All interventions were classified by approach according to 6 theories, and all interventions were analyzed for frequency. [Results] Regarding the approaches, the largest group, 25 articles (31.6%), provided interventions based on the biomechanical approach, including electrical stimulation therapy, robot therapy, and sensory stimulation training, among others. Analysis of the frequency of interventions revealed that the most commonly used interventions, reported in 18 articles (22.8%), made use of the concept of constraint-induced therapy. [Conclusion] The results of this study suggest an approach for selecting appropriate interventions for occupational performance in the clinic. PMID:27799719

  7. Recovering lead from cupel waste generated in gold analysis by Pb-Fire assay.

    Science.gov (United States)

    Cerceau, Cristiane Isaac; Carvalho, Cornélio de Freitas; Rabelo, Ana Carolina Silveira; Dos Santos, Cláudio Gouvea; Gonçalves, Sabrina Mayra Dias; Varejão, Eduardo Vinícius Vieira

    2016-12-01

    Because of its precision and accuracy, the Pb-Fire assay is the most widely employed method for gold analysis in geological materials. In the second stage of the method, namely cupellation, lead is oxidized to PbO, which is absorbed by the cupel, leaving metallic gold as a tiny bead at the bottom of the recipient. After cupellation, the cupel is highly contaminated with lead, making its disposal a serious risk of environmental contamination. In the present work, a leaching process for removing lead from cupel waste is proposed, which allowed 96% of the PbO by weight to be removed. After a precipitation step, 92.0% of the lead was recovered from the leachates in the form of PbSO4. Lead in the solid waste left by the extraction was below the limit established by Brazilian legislation, and this waste was classified as non-hazardous. Finally, the secondary effluents generated after the precipitation step had a lead content more than twenty times lower than that of the leachates from cupel waste. Tons of cupel waste are generated annually from gold analysis by Pb-Fire assay; the proposed method can thus help prevent the discharge of large amounts of lead into the environment. Recovery of the lead can also help to partially meet the industrial demand for lead compounds.

  8. Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY11 Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, Jonathan A.; Anderson, Kevin K.; Bowyer, Sonya M.; Casella, Andrew M.; Gesh, Christopher J.; Warren, Glen A.

    2011-09-30

    Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel, with an uncertainty considerably lower than the approximately 10% typical of today's confirmatory assay methods. This document is a progress report on PNNL's FY2011 analysis and algorithm development. The progress made by PNNL in FY2011 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel. PNNL developed an empirical model based on calibrating the LSDS to responses generated from well-characterized used fuel. The empirical model accounts for self-shielding effects using empirical basis vectors calculated from the singular value decomposition (SVD) of a matrix containing the true self-shielding functions of the used-fuel assembly models. The potential for direct and independent assay of the sum of the masses of 239Pu and 241Pu to within approximately 3% over a wide used-fuel parameter space was demonstrated. Also in FY2011, PNNL continued to develop an analytical model. These efforts included the addition of six more non-fissile absorbers to the analytical shielding function and accounting for the non-uniformity of the neutron flux across the LSDS assay chamber. A hybrid analytical-empirical approach was developed to determine the mass of total Pu (the sum of the masses of 239Pu, 240Pu, and 241Pu), which is an important quantity in safeguards. Results using this hybrid method were of approximately the same accuracy as the
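
    Conceptually, the SVD step described above can be sketched as follows: assemble self-shielding functions as the columns of a matrix and keep the leading left-singular vectors as an empirical basis. The curves below are synthetic stand-ins, not LSDS responses.

    ```python
    import numpy as np

    energy = np.linspace(0.1, 10.0, 200)          # slowing-down "energy" grid
    rng = np.random.default_rng(3)

    # synthetic self-shielding curves: smooth dips of varying depth/centre/width
    curves = np.column_stack([
        1.0 - a * np.exp(-((energy - c) ** 2) / (2 * w ** 2))
        for a, c, w in zip(rng.uniform(0.2, 0.8, 30),
                           rng.uniform(2.0, 8.0, 30),
                           rng.uniform(0.5, 2.0, 30))
    ])

    U, s, Vt = np.linalg.svd(curves, full_matrices=False)
    k = 3
    basis = U[:, :k]                              # empirical basis vectors
    explained = (s[:k] ** 2).sum() / (s ** 2).sum()
    print(f"{k} basis vectors capture {100 * explained:.1f}% of the variance")

    # any self-shielding function can then be approximated in this basis
    coeffs = basis.T @ curves[:, 0]
    err = np.linalg.norm(curves[:, 0] - basis @ coeffs)
    print(f"reconstruction error for one curve: {err:.3e}")
    ```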

  9. NASA Armstrong's Approach to Store Separation Analysis

    Science.gov (United States)

    Acuff, Chris; Bui, Trong

    2015-01-01

    This presentation gives an overview of NASA Armstrong's store separation capabilities and how they have been applied recently. Its objective is to brief Generation Orbit and other potential partners on NASA Armstrong's store separation capabilities. It includes discussions of the use of NAVSEP and Cart3D, as well as some Python scripting work to perform the analysis, and a short overview of this methodology applied to the Towed Glider Air Launch System. Collaboration with potential customers in this area could lead to funding for the further development of a store separation capability at NASA Armstrong, which would boost the portfolio of engineering expertise at the center.

  10. Affordability analysis of lead emission controls for a smelter-refinery. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Scherer, T.M.

    1989-10-01

    This document evaluates the affordability and economic impact of additional control measures deemed necessary for a smelter-refinery to meet the lead emission standard. The emphasis in the analysis is on the impact of control costs on the smelter-refinery's profitability. The analysis was performed using control-cost data from two different lead-smelter studies in conjunction with other existing industry data.

  11. Evaluating the All-Ages Lead Model Using SiteSpecific Data: Approaches and Challenges

    Science.gov (United States)

    Lead (Pb) exposure continues to be a problem in the United States. Even after years of progress in reducing environmental levels, CDC estimates at least 500,000 U.S. children ages 1-5 years have blood Pb levels (BLL) above the CDC reference level of 5 µg/dL. Childhood Pb ex...

  12. Dual-action Hybrid Compounds - A New Dawn in the Discovery of Multi-target Drugs: Lead Generation Approaches.

    Science.gov (United States)

    Abdolmaleki, Azizeh; Ghasemi, Jahan B

    2016-09-27

    Finding high-quality starting compounds is a critical job at the beginning of the lead generation stage of multi-target drug discovery (MTDD). Designing hybrid compounds as selective multi-target chemical entities is a challenge, an opportunity, and a new way to act more effectively against specific multiple targets. A hybrid molecule is formed from the participation of two (or more) pharmacophore groups, so these new compounds often exhibit two or more activities, acting as multi-target drugs (mt-drugs), and may have superior safety or efficacy. The integration of a range of information with sophisticated new in silico, bioinformatics, structural biology, and pharmacogenomics methods may be useful in the discovery, design, and synthesis of such hybrid molecules. In this regard, many rational and screening approaches have been followed by medicinal chemists for lead generation in MTDD. Here, we review some popular lead generation approaches that have been used for designing multiple ligands (DMLs). This paper focuses on dual-acting chemical entities that incorporate parts of two drugs or bioactive compounds to compose hybrid molecules. It also presents some key concepts and the limitations/strengths of lead generation methods, comparing the combination framework method with screening approaches. In addition, a number of examples representing applications of hybrid molecules in drug discovery are included.

  13. Chemical and instrumental approaches to cheese analysis.

    Science.gov (United States)

    Subramanian, Anand; Rodriguez-Saona, Luis

    2010-01-01

    Overcoming the complexity of the cheese matrix to reliably analyze cheese composition, flavor, and ripening changes has been a challenge. Several sample isolation and fractionation methods, chemical and enzymatic assays, and instrumental methods have been developed over the decades. While some of these methods are well-established standard methods, others still need to be researched and improved. This chapter reviews the chemical and instrumental methods available to determine cheese composition and to monitor the biochemical events (e.g., glycolysis, lipolysis, and proteolysis) during cheese ripening that lead to the formation of cheese flavor. Chemical and enzymatic methods available for the analysis of cheese composition (fat, protein, lactose, salt, nitrogen content, moisture, etc.) are presented. Electrophoretic, chromatographic, and spectroscopic techniques are also reviewed in the light of their application to monitoring cheese ripening and flavor compounds. Novel instrumental methods based on Fourier-transform infrared spectroscopy that are currently being researched and applied to cheese analysis are introduced.

  14. Leading Public Housing Organisation in a Problematic Situation: a critical soft systems methodology approach

    OpenAIRE

    Staadt, Jurgen

    2014-01-01

    The challenges ahead, such as climate change and social injustice, require governments and their public organisations to be adaptive and open to learning. This necessitates the adoption of new ways of thinking so as to cope with complexity, dynamics, and behavioural aspects. The leading public housing organisation used in this single case study is connected with disciplines such as transport, for example, which suggests the adoption of a systems thinking approac...

  15. Fostering the Capacity for Distributed Leadership: A Post-Heroic Approach to Leading School Improvement

    Science.gov (United States)

    Klar, Hans W.; Huggins, Kristin Shawn; Hammonds, Hattie L.; Buskey, Frederick C.

    2016-01-01

    Principals are being encouraged to distribute leadership to increase schools' organizational capacities, and enhance student growth and learning. Extant research on distributed leadership practices provides an emerging basis for adopting such approaches. Yet, relatively less attention has been paid to examining the principal's role in fostering…

  16. Do Different Approaches to Examining Construct Comparability in Multilanguage Assessments Lead to Similar Conclusions?

    Science.gov (United States)

    Oliveri, Maria E.; Ercikan, Kadriye

    2011-01-01

    In this study, we examine the degree of construct comparability and possible sources of incomparability of the English and French versions of the Programme for International Student Assessment (PISA) 2003 problem-solving measure administered in Canada. Several approaches were used to examine construct comparability at the test- (examination of…

  17. A geospatial analysis of soil lead concentrations around regional Oklahoma airports.

    Science.gov (United States)

    McCumber, Alexander; Strevett, K A

    2017-01-01

    Lead has been banned from automobile gasoline since 1995; however, it is still used as an additive in aviation gasoline (avgas). Airports are now among the greatest sources of airborne lead emissions in the US. The objectives of this study were (1) to evaluate soil lead levels radially from three regional airports; (2) to collect historical meteorological data; (3) to examine the soil organic matter content; and (4) to develop correlation coefficients to evaluate correlations among variables. Soil samples were collected from 3 different airports in Oklahoma, and the soil lead concentration was measured using X-ray fluorescence (XRF). The measured soil lead concentrations were plotted with the corresponding GPS locations in ArcGIS, and Inverse Distance Weighting (IDW) spatial analysis was used to create modeled isopleths of soil lead concentrations. One of the three airports was found to have soil lead concentrations that correlate with soil organic matter, and one other showed a correlation between soil lead concentration and distance from the airport. The spatially modeled isopleths showed elevated soil lead concentrations in the direction of the prevailing winds, with "hot spots" near the avgas fueling stations.
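
    A self-contained sketch of IDW interpolation, conceptually mirroring what the ArcGIS analysis does, is shown below; the sample coordinates and concentrations are fabricated for illustration.

    ```python
    import numpy as np

    def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
        """Interpolate z at query points as distance-weighted means of samples."""
        d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
        w = 1.0 / (d + eps) ** power            # closer samples weigh more
        return (w @ z_known) / w.sum(axis=1)

    rng = np.random.default_rng(4)
    xy = rng.uniform(0, 1000, (50, 2))          # sample locations [m]
    pb = rng.lognormal(3.0, 0.5, 50)            # soil Pb [mg/kg], skewed

    grid = np.array([[x, y] for x in range(0, 1001, 250)
                            for y in range(0, 1001, 250)], dtype=float)
    z = idw(xy, pb, grid)
    print(f"interpolated Pb range: {z.min():.1f}-{z.max():.1f} mg/kg")
    ```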

  18. A Systematic Approach for Engagement Analysis Under Multitasking Environments

    Science.gov (United States)

    Zhang, Guangfan; Leddo, John; Xu, Roger; Richey, Carl; Schnell, Tom; McKenzie, Frederick; Li, Jiang

    2011-01-01

    An overload condition can lead to high stress for an operator and further cause substantial drops in performance. At the other extreme, in automated systems, an operator may become underloaded, in which case it is difficult for the operator to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, a disengaged operator may neglect, misunderstand, or respond slowly or inappropriately to the situation. In this paper, we discuss a systematic approach to monitoring for extremes of cognitive workload and engagement in multitasking environments. Inferences of cognitive workload and engagement are based on subjective evaluations, objective performance measures, physiological signals, and task analysis results. The systematic approach developed in this paper aggregates these types of information collected under the multitasking environment and can provide a real-time assessment of engagement.

  19. A DSRPCL-SVM Approach to Informative Gene Analysis

    Institute of Scientific and Technical Information of China (English)

    Wei Xiong; Zhibin Cai; Jinwen Ma

    2008-01-01

    Microarray data based tumor diagnosis is a very interesting topic in bioinformatics. One of the key problems is the discovery and analysis of informative genes of a tumor. Although there are many elaborate approaches to this problem, it is still difficult to select a reasonable set of informative genes for tumor diagnosis only with microarray data. In this paper, we classify the genes expressed through microarray data into a number of clusters via the distance sensitive rival penalized competitive learning (DSRPCL) algorithm and then detect the informative gene cluster or set with the help of support vector machine (SVM). Moreover, the critical or powerful informative genes can be found through further classifications and detections on the obtained informative gene clusters. It is well demonstrated by experiments on the colon, leukemia, and breast cancer datasets that our proposed DSRPCL-SVM approach leads to a reasonable selection of informative genes for tumor diagnosis.
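
    The cluster-then-classify pipeline can be sketched in a few lines. Note that scikit-learn's KMeans is used here only as a stand-in for the DSRPCL clustering step (DSRPCL has no standard library implementation), and the microarray data are synthetic.

```python
# Cluster genes (columns), then score each gene cluster by how well an SVM
# separates the tumor classes using only that cluster's genes.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))      # 60 samples x 200 genes (synthetic)
y = rng.integers(0, 2, size=60)     # binary tumor labels (synthetic)
X[y == 1, :20] += 1.5               # make the first 20 genes informative

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X.T)
for k in range(5):
    genes = np.where(labels == k)[0]
    score = cross_val_score(SVC(kernel="linear"), X[:, genes], y, cv=5).mean()
    print(f"cluster {k}: {genes.size:3d} genes, CV accuracy = {score:.2f}")
```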

  20. Risk Analysis Approach to Rainwater Harvesting Systems

    Directory of Open Access Journals (Sweden)

    Nadia Ursino

    2016-08-01

    Urban rainwater reuse preserves water resources and promotes sustainable development in rapidly growing urban areas. The efficiency of a large number of urban water reuse systems, operating under different climate and demand conditions, is evaluated here on the basis of a new risk analysis approach. Results obtained by probability analysis (PA) indicate that maximum efficiency in low-demand scenarios is above 0.5, and a threshold distinguishing low- from high-demand scenarios indicates that in low-demand scenarios no significant improvement in performance may be attained by increasing the storage capacity of rainwater harvesting tanks. Threshold behaviour is displayed when tank storage capacity is designed to match both the average collected volume and the average reuse volume. The low-demand limit cannot be achieved under climate and operating conditions characterized by a disproportion between harvesting and demand volumes.
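
    The saturation of efficiency with storage capacity can be illustrated with a simple daily water-balance simulation of a harvesting tank; the rainfall and demand series below are synthetic, and the model is generic yield-after-spillage bookkeeping rather than the paper's probability analysis.

```python
import numpy as np

def reuse_efficiency(rain_volume, demand, capacity):
    """Fraction of demand met by the tank (yield-after-spillage bookkeeping)."""
    store, supplied = 0.0, 0.0
    for inflow, d in zip(rain_volume, demand):
        store = min(store + inflow, capacity)  # harvest, spill the excess
        use = min(store, d)                    # withdraw up to demand
        store -= use
        supplied += use
    return supplied / demand.sum()

rng = np.random.default_rng(1)
rain = rng.exponential(0.4, 365) * (rng.random(365) < 0.3)  # m3/day, synthetic
demand = np.full(365, 0.12)                                 # m3/day, assumed
for cap in (0.5, 1, 2, 5, 10):
    eff = reuse_efficiency(rain, demand, cap)
    print(f"capacity {cap:4.1f} m3 -> efficiency {eff:.2f}")
```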

  1. Potentiometric stripping analysis of lead and cadmium leaching from dental prosthetic materials and teeth

    Directory of Open Access Journals (Sweden)

    GORAN M. NIKOLIC

    2004-07-01

    Potentiometric stripping analysis (PSA) was applied for the determination of lead and cadmium leaching from dental prosthetic materials and teeth. The soluble lead content in finished dental implants was found to be much lower than that of the individual components used for their preparation. Cadmium was not detected in dental implants and materials under the defined conditions. The soluble lead and cadmium content of teeth was slightly lower than the lead and cadmium content in whole teeth (w/w) reported by other researchers, except in the case of a tooth with a removed amalgam filling. The results of this work suggest that PSA may be a good method for lead and cadmium leaching studies for investigation of the biocompatibility of dental prosthetic materials.

  2. High levels of Mercury and Lead detected by hair analysis in two Venezuelan environments

    OpenAIRE

    Marcano,Eunice; Labady,Mary; Gomes,Clara; Aguiar,Guillermina; Laine,Jorge

    2009-01-01

    Mercury and lead concentrations obtained by ICP-OES analysis of human hair from riverside communities along the Orinoco river in the Amazon state (Venezuela) were compared with those from Caracas, Venezuela. Taking into account the characteristics of these two environments and the values of the average concentrations of mercury and lead, baselines were established suggesting that gold mining activity near the Orinoco river is responsible for the high levels of mercury in hair from the Amazon ...

  3. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases. Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become "embedded" in a long process of reinterpretation by legal actors, and we demonstrate that the actual...

  4. Total materials consumption; an estimation methodology and example using lead; a materials flow analysis

    Science.gov (United States)

    Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.

    1999-01-01

    Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.
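
    The distinction between the two measures reduces to simple materials bookkeeping; a sketch with illustrative figures (not USGS data) is given below.

```python
# Apparent consumption counts only raw-material flows; total consumption
# also counts the material embedded in traded manufactured products.
# All figures are invented, in thousand metric tons.
production       = 1400   # primary + secondary lead production
raw_imports      = 250
raw_exports      = 90
stock_change     = -10    # net additions to stocks (negative = drawdown)
apparent = production + raw_imports - raw_exports - stock_change

embedded_imports = 320    # lead contained in imported products (e.g. batteries)
embedded_exports = 180    # lead contained in exported products
total = apparent + (embedded_imports - embedded_exports)

print(f"apparent consumption: {apparent} kt")
print(f"total consumption:    {total} kt")
```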

  5. Cadmium and lead interaction with diatom surfaces: A combined thermodynamic and kinetic approach

    Science.gov (United States)

    Gélabert, A.; Pokrovsky, O. S.; Schott, J.; Boudou, A.; Feurtet-Mazel, A.

    2007-08-01

    This work is devoted to the physico-chemical study of cadmium and lead interaction with diatom-water interfaces for two marine planktonic (Thalassiosira weissflogii, TW; Skeletonema costatum, SC) and two freshwater periphytic species (Achnanthidium minutissimum, AMIN; Navicula minima, NMIN) by combining adsorption measurements with surface complexation modeling. Adsorption kinetics was studied as a function of pH and initial metal concentration in sodium nitrate solution and in culture media. Kinetic data were consistent with a two-step mechanism in which the loss of a water molecule from the inner coordination sphere of the metal is rate limiting. Reversible adsorption experiments, with 3 h of exposure to metal, were performed as a function of pH (2-9), metal concentration in solution (10⁻⁹-10⁻³ M), and ionic strength (10⁻³-1.0 M). While the shape of the pH-dependent adsorption edge is similar among all four diatom species, the constant-pH adsorption isotherms and maximal binding capacities differ. Measurements of electrophoretic mobilities (μ) revealed a negative surface potential for the AMIN diatom; however, the absolute value of μ decreases with increasing [Pb²⁺]aq, suggesting metal adsorption on negative surface sites. These observations allowed us to construct a surface complexation model (SCM) for cadmium and lead binding by diatom surfaces that postulates the Constant Capacitance of the electric double layer and considers Cd and Pb complexation with mainly carboxylic and, partially, silanol groups. In the full range of investigated Cd concentration, the SCM is able to describe the concentration of adsorbed metal as a function of [Cd²⁺]aq without implying the presence of high affinity, low abundance sites that are typically used to model metal interactions with natural multi-component organic substances. At the same time, the fast initial Cd reaction requires the presence of "highly reactive sites" whose concentration represents only 2.5-3% of the

  6. Our On-Its-Head-and-In-Your-Dreams Approach Leads to Clean Energy

    Energy Technology Data Exchange (ETDEWEB)

    Kazmerski, Lawrence; Gwinner, Don; Hicks, Al

    2013-07-18

    Representing the Center for Inverse Design (CID), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of the CID is to revolutionize the discovery of new materials by design with tailored properties through the development and application of a novel inverse design approach powered by theory guiding experiment with an initial focus on solar energy conversion.

  7. Discovery and development of Seliciclib. How systems biology approaches can lead to better drug performance.

    Science.gov (United States)

    Khalil, Hilal S; Mitev, Vanio; Vlaykova, Tatyana; Cavicchi, Laura; Zhelev, Nikolai

    2015-05-20

    Seliciclib (R-Roscovitine) was identified as an inhibitor of CDKs and has undergone drug development and clinical testing as an anticancer agent. In this review, the authors describe the discovery of Seliciclib and give a brief summary of the biology of the CDKs it inhibits. An overview of the published in vitro and in vivo work supporting its development as an anti-cancer agent is presented, from in vitro experiments through animal model studies, ending with a summary of the clinical trial results and trials underway. In addition, some potential non-oncology applications are explored and the potential mode of action of Seliciclib in these areas is described. Finally, the authors argue that the therapeutic effects of kinase inhibitors such as Seliciclib could be enhanced using a systems biology approach involving mathematical modelling of the molecular pathways regulating cell growth and division.

  8. Li Meixiang leads women to become well-off. Integrated approach.

    Science.gov (United States)

    Gao, M

    1996-02-01

    This article describes the experiences of a family planning worker in Meihekou City, Jilin Province, in implementing the integrated approach to family planning. Madam Li Meixiang is credited with being an outstanding example in improving program implementation. The new approach has helped people accept a smaller family size norm. Madam Li married in 1981 and shared in the responsibility of supporting a six-member family and raising her own daughter. Their success in eliminating debt and improving their standard of living is attributed to their efforts to grow rice on 1.3 hectares of land, running a small grain-processing factory in the village, and raising pigs fed with the chaff from the processed rice. Li invested in a truck and the family moved into a larger house with modern conveniences. The Li family was the first to rise from poverty to a well-off position in the community. Li became interested in family planning issues after an IEC presentation in the village. In 1990 Madam Li was elected head of the women's association of the village and assumed leadership of family planning activities. Her objective was to teach other women about the advantages of a small family and to offer advice in solving economic problems. During the off-farming season Madam Li worked to establish income-generation activities for other women and acceptors. Li organized a village effort to shell walnuts for a local export and import company. About 30% of the village women were involved in the first year, and almost 80% have been involved in the past two years. The project showed villagers how ingenuity without any cash outlay could yield profits. Over 200 families join the project in the winter and receive an average income of over 2500 yuan. Madam Li has helped villagers obtain bank loans for small-scale projects and has given advice on how to increase profits.

  9. Ancillary Resistor leads to Sparse Glitches: an Extra Approach to Avert Hacker using Syndicate Browser Design

    Directory of Open Access Journals (Sweden)

    Devaki Pendlimarri

    2012-01-01

    Since the invention of the internet, most people all over the world have come to rely on it for information exchange, e-mail, e-commerce, etc., to ease their daily lives. On the other side, many people also use it for the purpose of hacking the information being communicated, because the data/information communicated through the internet travels via unsecured networks. This gives breaches to the hacker, known as the man-in-the-middle, to hack the data/information. In this paper, we describe some novel methodologies to prevent the hacker from hacking the data/information. The web browser design was carried out in our R&D lab, and we have found that the novel methodology provides a solution that prevents several man-in-the-middle attacks.

  10. Lead toxicity to Lemna minor predicted using a metal speciation chemistry approach.

    Science.gov (United States)

    Antunes, Paula M C; Kreager, Nancy J

    2014-10-01

    In the present study, predictive measures for Pb toxicity to Lemna minor were developed from bioassays with 7 surface waters having varied chemistries (0.5-12.5 mg/L dissolved organic carbon, pH of 5.4-8.3, and water hardness of 8-266 mg/L CaCO3). As expected based on water quality, 10%, 20%, and 50% inhibitory concentration (IC10, IC20, and IC50, respectively) values expressed as percent net root elongation (%NRE) varied widely (e.g., IC20s ranging from 306 nM to >6920 nM total dissolved Pb), with unbounded values limited by Pb solubility. In considering chemical speciation, %NRE variability was better explained when both Pb hydroxides and the free lead ion were defined as bioavailable (i.e., f{OH}) and colloidal Fe(III)(OH)3 precipitates were permitted to form and sorb metals (using FeOx as the binding phase). Although cause and effect could not be established because of covariance with alkalinity (p = 0.08), water hardness correlated strongly (r² = 0.998) with toxicity; using the regressions (%NRE vs. hardness and %NRE vs. f{OH}), the IC20 and IC50 values produced were within factors of 2.9 and 2.2 of those measured, respectively. The results provide much needed effect data for L. minor and highlight the importance of chemical speciation in Pb-based risk assessments for aquatic macrophytes.

  11. Lamina specific loss of inhibition may lead to distinct neuropathic manifestations: a computational modeling approach

    Directory of Open Access Journals (Sweden)

    Erick Javier Argüello Prada

    Introduction: It has been reported that inhibitory control at the superficial dorsal horn (SDH) can act in a regionally distinct manner, which suggests that regionally specific subpopulations of SDH inhibitory neurons may prevent one specific neuropathic condition. Methods: In an attempt to address this issue, we provide an alternative approach by integrating neuroanatomical information provided by different studies to construct a network-model of the SDH. We use Neuroids to simulate each neuron included in that model by adapting available experimental evidence. Results: Simulations suggest that the maintenance of the proper level of pain sensitivity may be attributed to lamina II inhibitory neurons and, therefore, hyperalgesia may be elicited by suppression of the inhibitory tone at that lamina. In contrast, lamina III inhibitory neurons are more likely to be responsible for keeping the nociceptive pathway separate from the mechanoreceptive pathway, so loss of inhibitory control in that region may result in allodynia. The SDH network-model is also able to replicate non-linearities associated with pain processing, such as Aβ-fiber mediated analgesia and frequency-dependent increase of the neural response. Discussion: By incorporating biophysical accuracy and newer experimental evidence, the SDH network-model may become a valuable tool for assessing the contribution of specific SDH connectivity patterns to noxious transmission in both physiological and pathological conditions.

  12. Experimental Finite Element Approach for Stress Analysis

    Directory of Open Access Journals (Sweden)

    Ahmet Erklig

    2014-01-01

    This study aims to determine strain gauge locations in problems of stress concentration, and it includes both experimental and numerical results. Strain gauges were positioned on the beam and blocks at locations corresponding to nodes of the finite element models. Linear and nonlinear cases were studied: a cantilever beam problem was selected as the linear case to validate the approach, and a conforming contact problem was selected as the nonlinear case. An identical mesh structure was prepared for the finite element and the experimental models. The finite element analysis was carried out with ANSYS. It was shown that the results of the experimental and the numerical studies were in good agreement.

  13. Re-analysis of fatigue data for welded joints using the notch stress approach

    DEFF Research Database (Denmark)

    Pedersen, Mikkel Melters; Mouritsen, Ole Ø.; Hansen, Michael Rygaard

    2010-01-01

    Experimental fatigue data for welded joints have been collected and subjected to re-analysis using the notch stress approach according to IIW recommendations. This leads to an overview regarding the reliability of the approach, based on a large number of results (767 specimens). Evidently, the welded joints agree quite well with the FAT 225 curve; however, a reduction to FAT 200 is suggested in order to achieve approximately the same safety as observed in the nominal stress approach.
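
    For orientation, a FAT class is the stress range (in MPa) corresponding to a fatigue life of two million cycles. Under the standard IIW S-N slope of m = 3 for normal stress, the suggested FAT 225 to FAT 200 reduction translates into allowable cycles as sketched below (the 300 MPa stress range is an arbitrary example).

```python
def cycles_to_failure(stress_range_mpa, fat_class_mpa, m=3.0):
    """IIW S-N curve: N = 2e6 * (FAT / delta_sigma)^m."""
    return 2e6 * (fat_class_mpa / stress_range_mpa) ** m

for fat in (225.0, 200.0):
    n = cycles_to_failure(stress_range_mpa=300.0, fat_class_mpa=fat)
    print(f"FAT {fat:.0f}: N = {n:,.0f} cycles at a 300 MPa notch stress range")
```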

  14. A Community-Based Approach to Leading the Nation in Smart Energy Use

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2013-12-31

    Project Objectives The AEP Ohio gridSMART® Demonstration Project (Project) achieved the following objectives: • Built a secure, interoperable, and integrated smart grid infrastructure in northeast central Ohio that demonstrated the ability to maximize distribution system efficiency and reliability and consumer use of demand response programs that reduced energy consumption, peak demand, and fossil fuel emissions. • Actively attracted, educated, enlisted, and retained consumers in innovative business models that provided tools and information reducing consumption and peak demand. • Provided the U.S. Department of Energy (DOE) information to evaluate technologies and preferred smart grid business models to be extended nationally. Project Description Ohio Power Company (the surviving company of a merger with Columbus Southern Power Company), doing business as AEP Ohio (AEP Ohio), took a community-based approach and incorporated a full suite of advanced smart grid technologies for 110,000 consumers in an area selected for its concentration and diversity of distribution infrastructure and consumers. It was organized and aligned around: • Technology, implementation, and operations • Consumer and stakeholder acceptance • Data management and benefit assessment Combined, these functional areas served as the foundation of the Project to integrate commercially available products, innovative technologies, and new consumer products and services within a secure two-way communication network between the utility and consumers. The Project included Advanced Metering Infrastructure (AMI), Distribution Management System (DMS), Distribution Automation Circuit Reconfiguration (DACR), Volt VAR Optimization (VVO), and Consumer Programs (CP). These technologies were combined with two-way consumer communication and information sharing, demand response, dynamic pricing, and consumer products, such as plug-in electric vehicles and smart appliances. In addition, the Project

  15. Lead Coolant Test Facility Systems Design, Thermal Hydraulic Analysis and Cost Estimate

    Energy Technology Data Exchange (ETDEWEB)

    Soli Khericha; Edwin Harvego; John Svoboda; Ryan Dalling

    2012-01-01

    The Idaho National Laboratory prepared preliminary technical and functional requirements (T&FR), a thermal hydraulic design, and a cost estimate for a lead coolant test facility. The purpose of this small-scale facility is to simulate lead coolant fast reactor (LFR) coolant flow in an open lattice geometry core using seven electrical rods and liquid lead or lead-bismuth eutectic coolant. Based on a review of current world lead or lead-bismuth test facilities and research needs listed in the Generation IV Roadmap, five broad areas of requirements were identified: (1) develop and demonstrate feasibility of a submerged heat exchanger; (2) develop and demonstrate open-lattice flow in an electrically heated core; (3) develop and demonstrate chemistry control; (4) demonstrate safe operation; and (5) provision for future testing. This paper discusses the preliminary design of systems, the thermal hydraulic analysis, and a simplified cost estimate. The facility thermal hydraulic design is based on a maximum simulated core power of 420 kW using seven electrical heater rods, with an average linear heat generation rate of 300 W/cm. The core inlet temperature for liquid lead or Pb/Bi eutectic is 420 °C. The design includes approximately seventy-five data measurements such as pressure, temperature, and flow rates. The preliminary estimated cost of construction of the facility is $3.7M (in 2006 $). It is also estimated that the facility will require two years to be constructed and ready for operation.
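
    As a rough plausibility check on such a loop (not a calculation from the report), the coolant temperature rise follows from the energy balance Q = ṁ·cp·ΔT; the mass flow rate and the heat capacity of liquid lead used below are assumed values.

```python
# Energy balance for the heated core: Q = mdot * cp * dT.
q_core = 420e3    # W, total power of the seven heater rods
cp_lead = 147.0   # J/(kg K), approximate cp of liquid lead (assumed)
mdot = 60.0       # kg/s, assumed coolant mass flow, not from the report

dT = q_core / (mdot * cp_lead)
print(f"coolant temperature rise ~ {dT:.0f} K "
      f"(outlet ~ {420 + dT:.0f} °C for a 420 °C inlet)")
```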

  17. Foreign Policy: Approaches, Levels Of Analysis, Dimensions

    Directory of Open Access Journals (Sweden)

    Nina Šoljan

    2012-01-01

    Full Text Available This paper provides an overview of key issues related to foreign policy and foreign policy theories in the wider context of political science. Discussing the origins and development of foreign policy analysis (FPA, as well as scholarly work produced over time, it argues that today FPA encompasses a variety of theoretical approaches, models and tools. These share the understanding that foreign policy outputs cannot be fully explained if analysis is confined to the systemic level. Furthermore, this paper conceptualizes foreign policy by comparing it to other types of policy. Although during the Cold War period foreign policy was equated with foreign security policy, in today’s world, security policy is only one dimension. Foreign policy’s scope has expanded to cover other issues such as trade, human rights and the environment. The growing number of domestic, international and transnational issues, stakeholders and inputs into the policy making process have made the formation and conduct of a coherent foreign policy increasingly challenging.

  18. Mercury-Free Analysis of Lead in Drinking Water by Anodic Stripping Square Wave Voltammetry

    Science.gov (United States)

    Wilburn, Jeremy P.; Brown, Kyle L.; Cliffel, David E.

    2007-01-01

    The analysis of drinking water for lead, which has well-known health effects, is presented as an instructive example for undergraduate chemistry students. It allows the students to perform an experiment and to evaluate risk factors and common hazards of everyday life.

  19. Circuit Board Analysis for Lead by Atomic Absorption Spectroscopy in a Course for Nonscience Majors

    Science.gov (United States)

    Weidenhammer, Jeffrey D.

    2007-01-01

    An analysis of circuit boards for lead content by atomic absorption spectroscopy, used in a course for nonscience majors, is presented. The experiment can also be used to explain the potential environmental hazards of unsafe disposal of used electronic equipment.

  20. Lead in Hair and in Red Wine by Potentiometric Stripping Analysis: The University Students' Design.

    Science.gov (United States)

    Josephsen, Jens

    1985-01-01

    A new program for training upper secondary school chemistry teachers (SE 537 693) depends heavily on student project work. A project in which lead in hair and in red wine was examined by potentiometric stripping analysis is described and evaluated. (JN)

  1. Multicriteria approach to data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Hélcio Vieira Junior

    2008-08-01

    With the aim of making Data Envelopment Analysis (DEA) more acceptable to the managers' community, the Weights Restrictions approaches were born. They allow DEA not to discard any data and permit the Decision Maker (DM) to have some control over the method. The purpose of this paper is to suggest a Weights Restrictions DEA model that incorporates the DM's preferences. In order to do that, we employed the MACBETH methodology as a tool to find the bounds of the weights to be used in a Weights Restrictions approach named Virtual Weights Restrictions. Our proposal achieved an outcome that has an expressive correlation with three widely used decision-aid methodologies: ELECTRE III, SMART, and PROMETHEE I and II. In addition, our approach was able to join the most significant outcomes of all three of the above Multicriteria decision-aid methodologies in one unique outcome.
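
    A weights-restricted DEA model of this kind can be sketched as a linear program. The input-oriented CCR multiplier form, the made-up data, and the single weight restriction (standing in for a MACBETH-elicited preference) below are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4, 3], [7, 3], [8, 1], [4, 2]], dtype=float)  # inputs
Y = np.array([[1, 2], [3, 1], [2, 2], [2, 1]], dtype=float)  # outputs
n, m = X.shape
s = Y.shape[1]

for j0 in range(n):
    c = np.concatenate([np.zeros(m), -Y[j0]])            # maximize u . y0
    A_eq = [np.concatenate([X[j0], np.zeros(s)])]        # v . x0 = 1
    A_ub = [np.concatenate([-X[j], Y[j]]) for j in range(n)]  # u.yj - v.xj <= 0
    A_ub.append(np.concatenate([np.zeros(m), [-1.0, 1.0]]))   # u2 - u1 <= 0, i.e. u1 >= u2
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(A_ub)),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(1e-6, None)] * (m + s))
    print(f"DMU {j0}: efficiency = {-res.fun:.3f}")
```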

  2. Development and characterisation of disposable gold electrodes, and their use for lead(II) analysis

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Mohd F. M. [Cranfield University, Cranfield Health, Silsoe (United Kingdom); Institute for Medical Research, Toxicology and Pharmacology Unit, Herbal Medicine Research Centre, Kuala Lumpur (Malaysia); Tothill, Ibtisam E. [Cranfield University, Cranfield Health, Silsoe (United Kingdom)

    2006-12-15

    There is an increasing need to assess the harmful effects of heavy-metal-ion pollution on the environment. The ability to detect and measure toxic contaminants on site using simple, cost-effective, and field-portable sensors is an important aspect of environmental protection and facilitates rapid decision making. A screen-printed gold sensor in a three-electrode configuration has been developed for analysis of lead(II) by square-wave stripping voltammetry (SWSV). The working electrode was fabricated with gold ink deposited by use of thick-film technology. Conditions affecting the lead stripping response were characterised and optimized. Experimental data indicated that chloride ions are important in lead deposition and subsequent analysis with this type of sensor. Linear concentration ranges of 10-50 µg L⁻¹ and 25-300 µg L⁻¹, with detection limits of 2 µg L⁻¹ and 5.8 µg L⁻¹, were obtained for lead(II) for measurement times of four and two minutes, respectively. The electrodes can be reused up to 20 times after cleaning with 0.5 mol L⁻¹ sulfuric acid. Interference of other metals with the response to lead was also examined to optimize the sensor response for analysis of environmental samples. The analytical utility of the sensor was demonstrated by applying the system to a variety of wastewater and soil sample extracts from polluted sites. The results are sufficient evidence of the feasibility of using these screen-printed gold electrodes for the determination of lead(II) in wastewater and soil extracts. For comparison purposes a mercury-film electrode and ICP-MS were used for validation. (orig.)

  3. Numerical simulation of soldered joints and reliability analysis of PLCC components with J-shape leads

    Institute of Scientific and Technical Information of China (English)

    Zhang Liang; Xue Songbai; Lu Fangyan; Han Zongjie; Wang Jianxin

    2008-01-01

    This paper deals with a study on SnPb and lead-free soldered joint reliability of PLCC devices with different lead counts under three kinds of temperature cycle profiles, based on a non-linear finite element method. By analyzing the stress of the soldered joints, it is found that the largest stress occurs in the area between the soldered joints and the leads, and the analysis results indicate that the von Mises stress at that location increases slightly with the increase of lead counts. For the PLCC with 84 leads, the soldered joints were modeled for three typical loadings (273-398 K, 218-398 K and 198-398 K) in order to study the influence of acceleration factors on the reliability of soldered joints. The estimation of the equivalent plastic strain of three different solder alloys (Sn3.8Ag0.7Cu, Sn3.5Ag and Sn37Pb) was also carried out.

  4. Analytical model and stability analysis of the leading edge spar of a passively morphing ornithopter wing.

    Science.gov (United States)

    Wissa, Aimy; Calogero, Joseph; Wereley, Norman; Hubbard, James E; Frecker, Mary

    2015-10-26

    This paper presents the stability analysis of the leading edge spar of a flapping wing unmanned air vehicle with a compliant spine inserted in it. The compliant spine is a mechanism that was designed to be flexible during the upstroke and stiff during the downstroke. Inserting a variable stiffness mechanism into the leading edge spar affects its structural stability. The model for the spar-spine system was formulated in terms of the well-known Mathieu's equation, in which the compliant spine was modeled as a torsional spring with a sinusoidal stiffness function. Experimental data were used to validate the model, and results show agreement within 11%. The structural stability of the leading edge spar-spine system was determined analytically and graphically using a phase plane plot and Strutt diagrams. Lastly, a torsional viscous damper was added to the leading edge spar-spine model to investigate the effect of damping on stability. Results show that for the undamped case the leading edge spar-spine response was stable and bounded; however, areas of instability appear for a range of spine upstroke and downstroke stiffnesses. Results also show that there exists a damping ratio between 0.2 and 0.5 for which the leading edge spar-spine system is stable for all values of spine upstroke and downstroke stiffnesses.
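
    The stability check described above can be reproduced numerically: integrate a (damped) Mathieu equation over one forcing period to build the monodromy matrix, then test whether its Floquet multipliers lie inside the unit circle. The parameter values below are illustrative, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

def monodromy(d, e, zeta):
    """Monodromy matrix of x'' + 2*zeta*x' + (d + e*cos t)*x = 0 over one period."""
    def rhs(t, y):
        return [y[1], -2 * zeta * y[1] - (d + e * np.cos(t)) * y[0]]
    T = 2 * np.pi
    cols = []
    for y0 in ([1.0, 0.0], [0.0, 1.0]):   # propagate the identity basis
        sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-9, atol=1e-12)
        cols.append(sol.y[:, -1])
    return np.column_stack(cols)

for zeta in (0.0, 0.3):
    M = monodromy(d=0.25, e=0.6, zeta=zeta)  # d=0.25 sits near a parametric resonance
    rho = max(abs(np.linalg.eigvals(M)))
    print(f"zeta = {zeta}: max |Floquet multiplier| = {rho:.3f} "
          f"({'stable' if rho <= 1 else 'unstable'})")
```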

  5. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Intelligence analysts routinely work with "wicked" problems: critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  6. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…

  7. Three Approaches to Data Analysis Test Theory, Rough Sets and Logical Analysis of Data

    CERN Document Server

    Chikalov, Igor; Lozina, Irina; Moshkov, Mikhail; Nguyen, Hung Son; Skowron, Andrzej; Zielosko, Beata

    2013-01-01

    In this book, the following three approaches to data analysis are presented: Test Theory, founded by Sergei V. Yablonskii (1924-1998), whose first publications appeared in 1955 and 1958; Rough Sets, founded by Zdzisław I. Pawlak (1926-2006), whose first publications appeared in 1981 and 1982; and Logical Analysis of Data, founded by Peter L. Hammer (1936-2006), whose first publications appeared in 1986 and 1988. These three approaches have much in common, but researchers active in one of these areas often have limited knowledge about the results and methods developed in the other two. On the other hand, each of the approaches shows some originality, and we believe that the exchange of knowledge can stimulate further development of each of them. This can lead to new theoretical results and real-life applications; in particular, new results based on a combination of these three data analysis approaches can be expected.

  8. A User Requirements Analysis Approach Based on Business Processes

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes in the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  9. Modal analysis of untransposed bilateral three-phase lines -- a perturbation approach

    Energy Technology Data Exchange (ETDEWEB)

    Faria, J.A.B. [Inst. Superior Tecnico, Lisboa (Portugal). Centro de Electrotecnia Teorica e Medidas Electricas; Mendes, J.H.B. [Univ. de Los Andes, Merida (Venezuela). Escuela de Ingenieria Electrica

    1997-01-01

    Modal analysis of three-phase power lines exhibiting bilateral symmetry leads to modal transformation matrices that closely resemble Clarke's transformation. The authors develop a perturbation theory approach to justify, interpret, and gain understanding of this well-known fact. Further, the authors show how to find new frequency-dependent correction terms that, once added to Clarke's transformation, lead to improved accuracy.
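
    The resemblance is easy to verify numerically: for a made-up bilaterally symmetric impedance matrix, the exact eigenvectors come out close to Clarke's columns (ordered here with the symmetry axis through the middle phase). The line data below are invented.

```python
import numpy as np

# Per-unit-length series impedance matrix of an untransposed flat line with
# bilateral symmetry: outer phases a and c are mirror images about phase b.
Z = np.array([[1.00, 0.42, 0.35],
              [0.42, 1.05, 0.42],
              [0.35, 0.42, 1.00]])

eigvals, eigvecs = np.linalg.eigh(Z)  # symmetric -> orthogonal eigenvectors

# Clarke-type columns for symmetry about the middle phase:
# homopolar [1,1,1]/sqrt(3), beta [1,0,-1]/sqrt(2), alpha [1,-2,1]/sqrt(6)
clarke = np.column_stack([np.ones(3) / np.sqrt(3),
                          np.array([1, 0, -1]) / np.sqrt(2),
                          np.array([1, -2, 1]) / np.sqrt(6)])

# Compare up to column ordering and sign (eigh sorts by eigenvalue)
print("eigenvectors:\n", eigvecs.round(3))
print("Clarke columns:\n", clarke.round(3))
```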

  10. Thermal analysis of selected tin-based lead-free solder alloys

    DEFF Research Database (Denmark)

    Palcut, Marián; Sopoušek, J.; Trnková, L.

    2009-01-01

    The Sn-Ag-Cu alloys have favourable solderability and wetting properties and are, therefore, being considered as potential lead-free solder materials. In the present study, tin-based Sn-Ag-Cu and Sn-Ag-Cu-Bi alloys were studied in detail by differential scanning calorimetry (DSC) and thermodynamic modelling. The cooling of each alloy was simulated using the Thermo-Calc software package. This approach enabled us to obtain the enthalpy of cooling for each alloy and to compare its temperature derivative with the experimental DSC curves.

  11. Approaches to data analysis of multiple-choice questions

    Directory of Open Access Journals (Sweden)

    Lin Ding

    2009-09-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  12. Sonochemical synthesis of two new nano lead(II) coordination polymers: Evaluation of structural transformation via mechanochemical approach.

    Science.gov (United States)

    Aboutorabi, Leila; Morsali, Ali

    2016-09-01

    Two new lead(II) mixed-ligand coordination polymers, [Pb(PNO)(SCN)]n (1) and [Pb(PNO)(N3)]n (2) (HPNO = picolinic acid N-oxide), were synthesized by a sonochemical method and characterized by scanning electron microscopy, X-ray powder diffraction, IR spectroscopy and elemental analysis. Compounds 1 and 2 were structurally characterized by single-crystal X-ray diffraction, and their thermal behavior was studied by thermal gravimetric analysis. Structural transformations of compounds 1 and 2 were evaluated through anion-replacement processes by a mechanochemical method. Moreover, the effects of sonication conditions, including time, concentrations of the initial reagents and power of irradiation, on the size and morphology of compounds 1 and 2 were evaluated.

  13. Analysis of radial basis function interpolation approach

    Institute of Scientific and Technical Information of China (English)

    Zou You-Long; Hu Fa-Long; Zhou Can-Can; Li Chao-Liu; Dunn Keh-Jim

    2013-01-01

    The radial basis function (RBF) interpolation approach proposed by Freedman is used to solve inverse problems encountered in well-logging and other petrophysical issues. The approach is to predict petrophysical properties in the laboratory on the basis of physical rock datasets, which include the formation factor, viscosity, permeability, and molecular composition. However, this approach does not consider the effect of the spatial distribution of the calibration data on the interpolation result. This study proposes a new RBF interpolation approach based on Freedman's RBF interpolation approach, in which the unit basis functions are uniformly populated in the space domain. The inverse results of the two approaches are comparatively analyzed using our datasets. We determine that although the interpolation effects of the two approaches are equivalent, the new approach is more flexible and beneficial for reducing the number of basis functions when the database is large, resulting in a simpler interpolation function expression. However, the predictions for data in the central region are less satisfactory when the data clusters are far apart.
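
    A minimal one-dimensional sketch of RBF interpolation itself (Gaussian basis, one basis function per data point, as in the classical formulation) is shown below; the proposed variant would instead spread the basis functions uniformly over the domain. The data are synthetic.

```python
import numpy as np

def rbf_fit(x, y, width):
    """Solve for weights so the interpolant passes through the data."""
    phi = np.exp(-((x[:, None] - x[None, :]) / width) ** 2)
    return np.linalg.solve(phi, y)

def rbf_eval(x_train, weights, x_new, width):
    phi = np.exp(-((x_new[:, None] - x_train[None, :]) / width) ** 2)
    return phi @ weights

x = np.linspace(0, 10, 15)        # calibration points
y = np.sin(x) + 0.1 * x           # measured property (synthetic)
w = rbf_fit(x, y, width=1.5)
x_new = np.linspace(0, 10, 5)
print(rbf_eval(x, w, x_new, width=1.5).round(3))   # interpolated values
print((np.sin(x_new) + 0.1 * x_new).round(3))      # ground truth for comparison
```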

  14. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and has caused billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  15. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    Science.gov (United States)

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  16. Approaches Towards the Minimisation of Toxicity in Chemical Solution Deposition Processes of Lead-Based Ferroelectric Thin Films

    Science.gov (United States)

    Bretos, Iñigo; Calzada, M. Lourdes

    The ever-growing environmental awareness in our lives has also extended to the electroceramics field during the past decades. Despite the strong regulations that have emerged (the RoHS directive), a number of scientists work on ferroelectric thin-film ceramics containing lead. Although the use of these materials in piezoelectric devices is exempt from the RoHS directive, successful ways of decreasing the toxic load must be considered a crucial challenge. Within this framework, a few significant advances are presented here, based on different Chemical Solution Deposition strategies. Firstly, the UV sol-gel photoannealing technique (Photochemical Solution Deposition) avoids the volatilisation of hazardous lead from lead-based ferroelectric films, usually observed at conventional annealing temperatures. The key point of this approach lies in the photo-excitation of a few organic components in the gel film, followed by annealing of the photo-activated film at temperatures low enough to prevent lead volatilisation but still allowing crystallisation of the pure perovskite phase. Ozonolysis of the films is also promoted when UV irradiation is carried out in an oxygen atmosphere, which is known to improve the electrical response. By this method, films derived from nominally stoichiometric solutions (i.e., solutions without PbO excess), with reliable properties and free of compositional gradients, may be prepared at temperatures as low as 450°C. A PtxPb interlayer between the ferroelectric film and the Pt silicon substrate is observed in the heterostructure of the low-temperature processed films when lead excesses are present in their microstructure. The influence of this interface on the compositional depth profile of the films will be discussed, and the feasibility of the UV sol-gel photoannealing technique in fabricating functional films while fulfilling environmental and technological requirements (like integration with silicon IC technology) will be evaluated. The second

  17. Machine learning approaches in medical image analysis

    DEFF Research Database (Denmark)

    de Bruijne, Marleen

    2016-01-01

    Machine learning approaches are increasingly successful in image-based diagnosis, disease prognosis, and risk assessment. This paper highlights new research directions and discusses three main challenges related to machine learning in medical imaging: coping with variation in imaging protocols...

  18. Determination of firing distance. Lead analysis on the target by atomic absorption spectroscopy (AAS).

    Science.gov (United States)

    Gagliano-Candela, Roberto; Colucci, Anna P; Napoli, Salvatore

    2008-03-01

    This paper reports a method for the determination of the firing distance. Atomic absorption spectroscopy (AAS) was used to determine the lead (Pb) pattern around bullet holes produced by shots on test targets from the gun. Test shots were made with a Colt 38 Special at 5, 10, 20, 25, 30, 35, 40, 45, 50, 60, 80, and 100 cm target distance. The target was created with sheets of Whatman no. 1 paper on a polystyrene support and subdivided into three carefully cut out rings (1, 2, and 3, with external diameters of 1.4 cm, 5 cm, and 10.2 cm, respectively). Each sample was analyzed with graphite furnace AAS. Analysis of the lead values for each ring yielded a linear relation between the firing distance l (cm) and the logarithm of the lead amount in definite target areas (areas 2 + 3): ln dPb(2+3) = a0 + a1·l, where dPb(2+3) is the lead amount (µg/cm²) in areas 2 + 3, and a0 and a1 are experimentally determined coefficients.
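
    In practice such a calibration is a straight-line fit that is then inverted for the unknown distance; the sketch below uses invented lead densities, not the paper's measurements.

```python
# Fit ln(dPb) = a0 + a1 * l on calibration shots, then invert the line to
# estimate the firing distance of a questioned target.
import numpy as np

l = np.array([5, 10, 20, 30, 40, 50, 60, 80, 100], dtype=float)  # cm
noise = 1 + 0.05 * np.random.default_rng(2).normal(size=l.size)
dpb = 8.0 * np.exp(-0.035 * l) * noise      # µg/cm2, synthetic calibration data

a1, a0 = np.polyfit(l, np.log(dpb), 1)      # slope a1, intercept a0

measured_dpb = 2.1                          # µg/cm2 on the questioned target
estimated_l = (np.log(measured_dpb) - a0) / a1
print(f"estimated firing distance ~ {estimated_l:.0f} cm")
```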

  19. Cadmium and lead residue control in a hazard analysis and critical control point (HACCP) environment.

    Science.gov (United States)

    Pagan-Rodríguez, Doritza; O'Keefe, Margaret; Deyrup, Cindy; Zervos, Penny; Walker, Harry; Thaler, Alice

    2007-02-21

    In 2003-2004, the U.S. Department of Agriculture Food Safety and Inspection Service (FSIS) conducted an exploratory assessment to determine the occurrence and levels of cadmium and lead in randomly collected samples of kidney, liver, and muscle tissues of mature chickens, boars/stags, dairy cows, and heifers. The data generated in the study were qualitatively compared to data that FSIS gathered in a 1985-1986 study in order to identify trends in the levels of cadmium and lead in meat and poultry products. The exploratory assessment was necessary to verify that Hazard Analysis and Critical Control Point plans and efforts to control exposure to these heavy metals are effective and result in products that meet U.S. export requirements. A comparison of data from the two FSIS studies suggests that the incidence and levels of cadmium and lead in different slaughter classes have remained stable since the first study was conducted in 1985-1986. This study was conducted to fulfill the FSIS mandate to ensure that meat, poultry, and egg products entering commerce in the United States are free of adulterants, including elevated levels of environmental contaminants such as cadmium and lead.

  20. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    Science.gov (United States)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean and overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions, the return period of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  1. Long-term dietary exposure to lead in young European children: Comparing a pan-European approach with a national exposure assessment

    DEFF Research Database (Denmark)

    Boon, P.E.; Te Biesebeek, J.D.; van Klaveren, J.D.

    2012-01-01

    Long-term dietary exposures to lead in young children were calculated by combining food consumption data of 11 European countries, categorised using harmonised broad food categories, with occurrence data on lead from different Member States (pan-European approach). The results of the assessment in children living in the Netherlands were compared with a long-term lead intake assessment in the same group using Dutch lead concentration data and linking the consumption and concentration data at the highest possible level of detail. Exposures obtained with the pan-European approach were higher than the national exposure calculations. For both assessments cereals contributed most to the exposure. The lower dietary exposure in the national study was due to the use of lower lead concentrations and a more optimal linkage of food consumption and concentration data. When a pan-European approach, using...

  2. Calculation of losses in a HTS current lead with the help of the dimensional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Douine, B.; Leveque, J.; Netter, D.; Rezzoug, A

    2003-12-01

    The calculation of losses is highly required to design any superconducting device. For that purpose, the analytical approach is the best way in terms of parameter analysis. Bean's model is based on the fact that the resistive transition is sudden. This assumption is more suitable for low critical temperature superconductors. For ceramics, the transition is smoother, so the variation of the electric field E with current density J is well approximated by a function of the form E = kJⁿ. Using this kind of function and a dimensional analysis, the authors propose a new analytic formula to calculate the losses in the case of incomplete penetration of current. Calculated results are compared to measured ones and the validity limit is shown.
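
    The contrast between the two constitutive laws is easy to visualise numerically; the Ec, Jc, and n values below are illustrative, not fitted to any particular ceramic.

```python
# Smooth E-J power law (ceramics) versus Bean's abrupt critical-state model.
import numpy as np

Ec, Jc, n = 1e-4, 1e8, 15           # V/m, A/m2, power-law index (assumed)
J = np.linspace(0.2, 1.4, 7) * Jc

E_power_law = Ec * (J / Jc) ** n    # E = k * J^n with k = Ec / Jc^n
E_bean = np.where(J < Jc, 0.0, np.inf)  # Bean: no E below Jc, unbounded above

for j, ep, eb in zip(J / Jc, E_power_law, E_bean):
    print(f"J/Jc = {j:.1f}: power law E = {ep:.3e} V/m, Bean E = {eb}")
```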

  3. Application of the Nernst-Planck approach to lead ion exchange in Ca-loaded Pelvetia canaliculata.

    Science.gov (United States)

    Costa, Joana F de Sá S; Vilar, Vítor J P; Botelho, Cidália M S; da Silva, Eduardo A B; Boaventura, Rui A R

    2010-07-01

    Ca-loaded Pelvetia canaliculata biomass was used to remove Pb²⁺ in aqueous solution from batch and continuous systems. The physicochemical characterization of algae Pelvetia particles by potentiometric titration and FTIR analysis has shown a gel structure with two major binding groups, carboxylic (2.8 mmol g⁻¹) and hydroxyl (0.8 mmol g⁻¹), with an affinity constant distribution for hydrogen ions well described by a quasi-Gaussian distribution. Equilibrium adsorption (pH 3 and 5) and desorption (eluents: HNO₃ and CaCl₂) experiments were performed, showing that the biosorption mechanism can be attributed to ion exchange among calcium, lead and hydrogen ions with stoichiometry 1:1 (Ca:Pb) and 1:2 (Ca:H and Pb:H). The uptake capacity of lead ions decreased with pH, suggesting that there is a competition between H⁺ and Pb²⁺ for the same binding sites. A mass action law for the ternary mixture was able to predict the equilibrium data, with the selectivity constants α(Ca/H) = 9 ± 1 and α(Ca/Pb) = 44 ± 5, revealing a higher affinity of the biomass towards lead ions. Adsorption (initial solution pH 4.5 and 2.5) and desorption (0.3 M HNO₃) kinetics were performed in batch and continuous systems. A mass transfer model using the Nernst-Planck approximation for the ionic flux of each counter-ion was used for the prediction of the ion profiles in batch systems and packed bed columns. The intraparticle effective diffusion constants were determined as 3.73×10⁻⁷ cm² s⁻¹ for H⁺, 7.56×10⁻⁸ cm² s⁻¹ for Pb²⁺ and 6.37×10⁻⁸ cm² s⁻¹ for Ca²⁺.

  4. [Partial least squares approach to functional analysis].

    Science.gov (United States)

    Preda, C

    2006-01-01

    We extend the partial least squares (PLS) approach to functional data represented in our models by sample paths of a continuous-time stochastic process. Due to the infinite dimension, when functional data are used as a predictor for linear regression and classification models, the estimation problem is an ill-posed one. In this context, PLS offers a simple and efficient alternative to the methods based on the principal components of the stochastic process. We compare the results given by the PLS approach and other linear models using several datasets from the economic, industrial and medical fields.
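    A minimal sketch of the idea, assuming scikit-learn and synthetic sample paths observed on a fine grid (the discretized curves are the high-dimensional, collinear predictors that make the problem ill-posed):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Functional predictor: each sample path X_i(t), observed on a fine time
# grid, becomes a long, highly collinear feature vector -- exactly the
# ill-posed setting where PLS is attractive. Data are synthetic.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)                      # observation grid
n = 100
scores = rng.normal(size=(n, 2))
X = scores[:, [0]] * np.sin(2 * np.pi * t) + scores[:, [1]] * np.cos(2 * np.pi * t)
X += 0.05 * rng.normal(size=X.shape)            # noisy sample paths
y = 2.0 * scores[:, 0] - scores[:, 1] + 0.1 * rng.normal(size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=2).fit(X_tr, y_tr)
print("R^2 on held-out curves:", round(pls.score(X_te, y_te), 3))
```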

  5. One step behind to step ahead – femoral approach to stabilize and to extract functional pacing lead to regain venous access

    OpenAIRE

    Maciąg, Aleksander; Syska, Paweł; Kuśmierski, Krzysztof; Broy, Beata; Sterliński, Maciej

    2013-01-01

    Transvenous lead extraction can be a method to regain venous access. We present the case of a 67-year-old man with indications to upgrade an ICD to a resynchronization therapy device. Since innominate vein occlusion was diagnosed and extraction of an abandoned ventricular pacing lead did not restore lumen patency, a functional atrial lead was extracted via the femoral approach for stabilization, and venous access was regained. Asymptomatic vein wall damage but no other complications were recorded…

  6. A Multiliteracies Approach to Materials Analysis

    Science.gov (United States)

    Rowland, Luke; Canning, Nick; Faulhaber, David; Lingle, Will; Redgrave, Andrew

    2014-01-01

    Multiliteracies pedagogy is an approach to literacy education emphasising the diverse ways in which people make meanings and communicate their understandings to others. Within this view of literacy teaching and learning, the construal and expression of meaning is considered a result of people engaging in various knowledge processes as they…

  7. Next-to-Next-to-Leading Order analysis of electroweak vacuum stability and rising inflection point

    CERN Document Server

    Iacobellis, Giuseppe

    2016-01-01

    We present an analysis of the gauge-independent observables associated with two stationary configurations of the Standard Model (SM) potential, extrapolated to high energy at Next-to-Next-to-Leading Order (NNLO): i) the value of the top mass ensuring stability of the SM electroweak vacuum and ii) the value of the Higgs potential at a rising inflection point. We examine in detail and reappraise the experimental and theoretical uncertainties which plague their determination, keeping alive the possibility of the SM being stable, and studying applications of such configurations to models of primordial inflation.

  8. Structural characterization of lead sulfide thin films by means of X-ray line profile analysis

    Indian Academy of Sciences (India)

    N Choudhury; B K Sarma

    2009-02-01

    X-ray diffraction patterns of chemically deposited lead sulphide thin films have been recorded and X-ray line profile analysis studies have been carried out. The lattice parameter, crystallite size, average internal stress and microstrain in the films are calculated and correlated with the molarities of the solutions. Both size and strain are found to contribute to the broadening of the X-ray diffraction lines. The crystallite size is found to lie in the range 22–33 nm and the strain in the range 1.0 × 10⁻³ to 2.5 × 10⁻³.
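    Separating the two broadening contributions is commonly done with a Williamson-Hall plot; a sketch under assumed inputs (illustrative peak data, not the paper's; Cu K-alpha wavelength assumed):

```python
import numpy as np

# Williamson-Hall separation of size and strain broadening:
#   beta * cos(theta) = K * lam / D + 4 * eps * sin(theta)
# A straight-line fit over several reflections gives crystallite size D
# (intercept) and microstrain eps (slope). Peak data are illustrative.
K, lam = 0.9, 0.15406                                  # Scherrer constant, Cu K-alpha (nm)
two_theta = np.array([26.0, 30.1, 43.1, 51.0])         # degrees, example peaks
beta = np.radians(np.array([0.40, 0.45, 0.55, 0.62]))  # FWHM in radians

theta = np.radians(two_theta / 2)
y = beta * np.cos(theta)
x = 4 * np.sin(theta)
slope, intercept = np.polyfit(x, y, 1)
print(f"crystallite size D ~ {K * lam / intercept:.1f} nm, strain ~ {slope:.2e}")
```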

  9. Next-to-leading order QCD analysis of polarized deep inelastic scattering data

    OpenAIRE

    Abe, K.; Akagi, Takami; Anderson, B. D.; Anthony, P.L.; Arnold, Raymond G.; Averett, T.; Brand, H.R.; Berisso, C. M.; Bogorad, P.; Borel, H.; Bosted, P.E.; Breton, V.; Buenerd, Michel J.; Cates, Gordon D.; Chupp, Timothy E.

    1997-01-01

    We present a Next-to-Leading order perturbative QCD analysis of world data on the spin dependent structure functions $g_1^p, g_1^n$, and $g_1^d$, including the new experimental information on the $Q^2$ dependence of $g_1^n$. Careful attention is paid to the experimental and theoretical uncertainties. The data constrain the first moments of the polarized valence quark distributions, but only qualitatively constrain the polarized sea quark and gluon distributions. The NLO results are used to de...

  10. Sentiment Analysis Using Hybrid Approach: A Survey

    Directory of Open Access Journals (Sweden)

    Chauhan Ashish P

    2015-01-01

    Sentiment analysis is the process of identifying people's attitudes and emotional states from language. The main objective is realized by identifying a set of potential features in the review and extracting opinion expressions about those features by exploiting their associations. Opinion mining, also known as sentiment analysis, plays an important role in this process. It is the study of emotions, i.e. sentiments and expressions, that are stated in natural language. Natural language techniques are applied to extract emotions from unstructured data. There are several techniques which can be used to analyse such data. Here, we categorize these techniques broadly as "supervised learning", "unsupervised learning" and "hybrid techniques". The objective of this paper is to provide an overview of sentiment analysis, its challenges, and a comparative analysis of its techniques in the field of Natural Language Processing.

  11. Comparative analysis of employment dynamics in leading and lagging rural regions of the EU, 1980-1997.

    NARCIS (Netherlands)

    Terluin, I.J.; Post, J.H.; Sjöström, Å.

    1999-01-01

    In this study a comparative analysis is made of factors hampering and encouraging the development of employment in 9 leading and 9 lagging rural regions in the EU during the 1980s and the first half of the 1990s. Derived from this comparative analysis, some lessons, which leading and lagging rural regions…

  12. Morphological Analysis as Classification: An Inductive-Learning Approach

    CERN Document Server

    Van den Bosch, Antal; Daelemans, Walter; Weijters, Ton

    1996-01-01

    Morphological analysis is an important subtask in text-to-speech conversion, hyphenation, and other language engineering tasks. The traditional approach to performing morphological analysis is to combine a morpheme lexicon, sets of (linguistic) rules, and heuristics to find a most probable analysis. In contrast, we present an inductive learning approach in which morphological analysis is reformulated as a segmentation task. We report on a number of experiments in which five inductive learning algorithms are applied to three variations of the task of morphological analysis. Results show (i) that the generalisation performance of the algorithms is good, and (ii) that the lazy learning algorithm IB1-IG performs best on all three tasks. We conclude that lazy learning of morphological analysis as a classification task is indeed a viable approach; moreover, it has the strong advantages over the traditional approach of avoiding the knowledge-acquisition bottleneck, and being fast and deterministic in learning and processing…

  13. Modeling the effect of levothyroxine therapy on bone mass density in postmenopausal women: a different approach leads to new inference

    Directory of Open Access Journals (Sweden)

    Tavangar Seyed

    2007-06-01

    Background: The diagnosis, treatment and prevention of osteoporosis is a national health emergency. Osteoporosis quietly progresses without symptoms until late-stage complications occur. Older patients are more commonly at risk of fractures due to osteoporosis. The fracture risk increases when suppressive doses of levothyroxine are administered, especially in postmenopausal women. The question is: "When should bone mass density be tested in postmenopausal women after the initiation of suppressive levothyroxine therapy?" Standard guidelines for the prevention of osteoporosis suggest that follow-up be done in 1 to 2 years. We were interested in predicting the level of bone mass density in postmenopausal women after the initiation of suppressive levothyroxine therapy with a novel approach. Methods: The study used data from the literature on the influence of exogenous thyroid hormones on bone mass density. Four cubic polynomial equations were obtained by curve fitting for Ward's triangle, trochanter, spine and femoral neck. The behaviour of the models was investigated by statistical and mathematical analyses. Results: There are four points of inflexion on the graphs of the equations, i.e. extrema of their first derivatives with respect to time, at about 6, 5, 7 and 5 months. In other words, there is a maximum speed of bone loss around the 6th month after the start of suppressive L-thyroxine therapy in postmenopausal women. Conclusion: It seems reasonable to check bone mass density at the 6th month of therapy. More research is needed to explain the cause and to confirm the clinical application of this phenomenon for osteoporosis, but such an approach can be used as a guide to future experimentation. The investigation of change over time may lead to more sophisticated decision making in a wide variety of clinical problems.
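    The underlying computation can be sketched as follows, with synthetic BMD values standing in for the literature data (the fitted cubic's inflection point, where its second derivative vanishes, marks the time of maximum bone-loss speed):

```python
import numpy as np

# Fit a cubic to bone mass density (BMD) versus months on suppressive
# levothyroxine, then locate the time of maximum bone-loss speed, i.e.
# the extremum of the first derivative (zero of the second derivative).
# The BMD values below are synthetic, not the paper's data.
months = np.array([0, 3, 6, 9, 12, 18, 24])
bmd = np.array([1.000, 0.985, 0.962, 0.945, 0.935, 0.926, 0.924])  # g/cm^2

p = np.polynomial.Polynomial.fit(months, bmd, deg=3).convert()
dp, ddp = p.deriv(), p.deriv(2)
t_star = ddp.roots()                 # a cubic has a single inflection point
t_star = t_star[(t_star > 0) & (t_star < 24)].real
print("fastest bone loss near month:", np.round(t_star, 1))
print("loss rate there (g/cm^2 per month):", np.round(dp(t_star), 4))
```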

  14. Application of risk analysis and quality control methods for improvement of lead molding process

    Directory of Open Access Journals (Sweden)

    H. Gołaś

    2016-10-01

    The aim of the paper is to highlight the significance of applying risk analysis and quality control methods to improve the parameters of a lead molding process. For this reason, a Failure Mode and Effect Analysis (FMEA) was developed in the conceptual stage of a new product, TC-G100-NR. However, the final product was faulty (a complete lack of adhesion of the brass insert to the lead) regardless of the previously defined potential problem and its preventive action. This contributed to the recognition of root causes, corrective actions and a change of production parameters. It showed how these methods, the level of their organization, and systematic and rigorous study affect molding process parameters.

  15. An Analysis of the Factors Leading to Rising Credit Risk in the Zimbabwe Banking Sector

    Directory of Open Access Journals (Sweden)

    Maxwell Sandada

    2016-02-01

    The study sought to analyse the factors that lead to rising credit risk in the Zimbabwean banking sector. The objective was to ascertain the impact of macroeconomic, industry and bank-specific factors on rising credit risk in Zimbabwe. The study aimed to contribute to the credit risk management literature by providing evidence from a Sub-Saharan context. Anchored in the positivist quantitative research approach, a survey was carried out to gather the data, which were analysed using descriptive, correlation and regression analyses. The results revealed that the most significant factors leading to credit risk in the Zimbabwean banking sector were macroeconomic and bank-specific factors; the industry factors did not show a significant influence on rising credit risk. The findings of this study are a valuable addition to the existing knowledge and provide a platform for further research on how credit risk problems can be dealt with. While credit risk is known as one of the risks inherent to any banking institution, the alarming levels of credit risk in the Zimbabwean banking sector motivated this study to critically analyse the factors that have led to such high levels.

  16. Study of Object Oriented Analysis and Design Approach

    Directory of Open Access Journals (Sweden)

    Sunil K. Pandey

    2011-01-01

    Problem statement: Object and component technologies, rapidly maturing branches of information technology, have become pervasive elements of systems development, especially in the recently popular Internet applications, leading to increased complexity and, at the same time, a broader range of applications. Approach: This needs to be understood in order to maximize its benefits and applications with consistent results. However, mainstream Object Oriented Systems Development (OOSD), consisting of Object Oriented Analysis and Design (OOAD) and Object-Oriented Programming (OOP), has a history of difficulties and is still struggling to gain prevalent acceptance. Results: There have been a number of studies and experiments conducted by experts and researchers in the past which provide a solid base for taking up this study and looking into the various intricacies present. There have been several studies and focused efforts in this direction which laid the basis for one segment of people to form the opinion that "technology adoption is mostly the result of marketing forces, not scientific evidence", whereas another segment believes that object technology is "still long on hype and short on results...". The gurus of OOSD continue to tout its vast superiority over conventional systems development, even to the extent of developing a unified software development process. Conclusion: The advocates of OOSD claim many advantages including easier modeling, increased code reuse, higher system quality and easier maintenance. It is well understood that analysis and design are extremely critical aspects of successful systems development, especially in the case of OOSD. As the development of any successful information system must begin with a well-conceived and implemented analysis and design, this study focuses on the most recent empirical evidence on the pros and cons of OOAD.

  17. Multivariate analysis of 2-DE protein patterns - Practical approaches

    DEFF Research Database (Denmark)

    Jacobsen, Charlotte; Jacobsen, Susanne; Grove, H.;

    2007-01-01

    Practical approaches to the use of multivariate data analysis of 2-DE protein patterns are demonstrated by three independent strategies for the image analysis and the multivariate analysis on the same set of 2-DE data. Four wheat varieties were selected on the basis of their baking quality. Two...

  18. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  19. Morphological Analysis and Solubility of Lead Particles: Effect of Phosphates and Implications to Drinking Water (Presentation)

    Science.gov (United States)

    Describes lead synthesis experiments conducted to model the impact of water quality on lead particles and solubility; develops a model system that can be used for lead solubility studies; and examines how phosphates impact morphology and solubility transformations with time.

  20. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2014-11-01

    We describe the use of Bayesian analysis methods applied to TOF-SIMS spectra. The method finds the probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions) in mass intervals over the whole spectrum. We discuss the results that can be expected from this analysis, and address in a new way the effects of instrument dead time in the COSIMA TOF-SIMS. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method in two ways: first, as a comprehensive tool to perform quantitative analysis of spectra; and second, as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only.

  1. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP-AES and Portable XRF Instruments: A Comparative Study.

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-03-30

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP-AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP-AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP-AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP-AES analysis data; to PXRF analysis data; to both ICP-AES and transformed PXRF analysis data, considering the correlation between the ICP-AES and PXRF analysis data; and co-kriging applied to both the ICP-AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP-AES and transformed PXRF analysis data is the most accurate approach, considering both the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 validation sampling points. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach, which incorporates the advantageous aspects of both ICP-AES and PXRF analysis data.
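    For readers who want to experiment with the kriging step, here is a minimal ordinary-kriging sketch, assuming the third-party pykrige package and synthetic Pb measurements standing in for the paper's combined ICP-AES/PXRF data:

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumes the pykrige package is installed

# Minimal ordinary-kriging sketch for a soil-contamination map: point
# measurements of Pb concentration (synthetic here) are interpolated
# onto a regular grid, yielding estimates and kriging variances.
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 100, 40), rng.uniform(0, 100, 40)
pb = 50 + 30 * np.exp(-((x - 40) ** 2 + (y - 60) ** 2) / 800) + rng.normal(0, 2, 40)

ok = OrdinaryKriging(x, y, pb, variogram_model="spherical")
gridx = np.arange(0, 101, 5.0)
gridy = np.arange(0, 101, 5.0)
z_hat, ss = ok.execute("grid", gridx, gridy)   # estimates and kriging variance
print("predicted Pb range (mg/kg):", float(z_hat.min()), "-", float(z_hat.max()))
```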

  2. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Science.gov (United States)

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-01-01

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data; to PXRF analysis data; to both ICP–AES and transformed PXRF analysis data, considering the correlation between the ICP–AES and PXRF analysis data; and co-kriging applied to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach, considering both the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 validation sampling points. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach, which incorporates the advantageous aspects of both ICP–AES and PXRF analysis data. PMID:27043594

  3. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    Directory of Open Access Journals (Sweden)

    Hyeongyu Lee

    2016-03-01

    Understanding spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify the proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: the application of ordinary kriging to ICP–AES analysis data; to PXRF analysis data; to both ICP–AES and transformed PXRF analysis data, considering the correlation between the ICP–AES and PXRF analysis data; and co-kriging applied to both the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that the application of ordinary kriging to both ICP–AES and transformed PXRF analysis data is the most accurate approach, considering both the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 validation sampling points. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach, which incorporates the advantageous aspects of both ICP–AES and PXRF analysis data.

  4. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment, so that they are readily available to geotechnical practitioners without requiring them to work through the underlying reliability algorithms. Readers will find the information a non-specialist needs to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.
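    In the spirit of the book's spreadsheet-based Monte Carlo approach, here is a minimal sketch using an infinite-slope factor of safety with illustrative parameter distributions (not an example taken from the book):

```python
import numpy as np

# Monte Carlo reliability sketch: propagate uncertain soil parameters
# through a simple infinite-slope factor of safety and estimate the
# probability of failure P(FS < 1). All distributions are illustrative.
rng = np.random.default_rng(42)
n = 100_000
c = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)   # cohesion, kPa
phi = np.radians(rng.normal(30.0, 3.0, size=n))           # friction angle
gamma, z, beta = 18.0, 5.0, np.radians(35.0)              # kN/m^3, depth m, slope

fs = (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (
    gamma * z * np.sin(beta) * np.cos(beta)
)
print("mean FS:", round(fs.mean(), 2), " P(failure):", round((fs < 1).mean(), 4))
```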

  5. Analysis of COSIMA spectra: Bayesian approach

    Directory of Open Access Journals (Sweden)

    H. J. Lehto

    2015-06-01

    …secondary ion mass spectrometer (TOF-SIMS) spectra. The method is applied to the COmetary Secondary Ion Mass Analyzer (COSIMA) TOF-SIMS mass spectra, where the analysis can be broken into subgroups of lines close to integer mass values. The effects of the instrumental dead time are discussed in a new way. The method finds the joint probability density functions of measured line parameters (number of lines, and their widths, peak amplitudes, integrated amplitudes and positions). In the case of two or more lines, these distributions can take complex forms. The derived line parameters can be used to further calibrate the mass scaling of TOF-SIMS and to feed the results into other analysis methods such as multivariate analyses of spectra. We intend to use the method, first as a comprehensive tool to perform quantitative analysis of spectra, and second as a fast tool for studying interesting targets for obtaining additional TOF-SIMS measurements of the sample, a property unique to COSIMA. Finally, we point out that the Bayesian method can be thought of as a means to solve inverse problems, but with forward calculations only and with no iterative corrections or other manipulation of the observed data.
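    A toy version of this style of analysis, assuming a single Gaussian peak near an integer mass and Poisson counting noise; the joint posterior of peak position and amplitude is evaluated on a grid by forward calculation only:

```python
import numpy as np

# Toy Bayesian line-parameter estimation: one mass peak near an integer
# mass value, joint posterior of position and amplitude on a grid (flat
# priors, Poisson likelihood). Data are synthetic, not COSIMA spectra.
rng = np.random.default_rng(3)
m = np.linspace(27.8, 28.2, 200)                      # mass window (u)
true_pos, true_amp, width = 28.006, 120.0, 0.01
counts = rng.poisson(true_amp * np.exp(-0.5 * ((m - true_pos) / width) ** 2) + 5)

pos_grid = np.linspace(27.95, 28.05, 101)
amp_grid = np.linspace(50, 200, 76)
logpost = np.empty((pos_grid.size, amp_grid.size))
for i, p0 in enumerate(pos_grid):
    model = np.exp(-0.5 * ((m - p0) / width) ** 2)
    for j, a in enumerate(amp_grid):
        lam = a * model + 5.0                               # Poisson rate
        logpost[i, j] = np.sum(counts * np.log(lam) - lam)  # up to a constant

i, j = np.unravel_index(np.argmax(logpost), logpost.shape)
print(f"MAP position = {pos_grid[i]:.4f} u, amplitude = {amp_grid[j]:.1f}")
```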

  6. Integrated micro-biochemical approach for phytoremediation of cadmium and lead contaminated soils using Gladiolus grandiflorus L cut flower.

    Science.gov (United States)

    Mani, Dinesh; Kumar, Chitranjan; Patel, Niraj Kumar

    2016-02-01

    The potential of vermicompost, elemental sulphur, Thiobacillus thiooxidans and Pseudomonas putida for phytoremediation is well known individually, but their integrated approach has not been explored so far. The present work highlights these so far overlooked aspects through an integrated treatment, growing the ornamental plant Gladiolus grandiflorus L in uncontaminated and sewage-contaminated soils (sulphur-deficient alluvial Entisols, pH 7.6–7.8) for phytoremediation of cadmium and lead in a pot experiment. Between vermicompost and elemental sulphur, the response of vermicompost was higher towards improvement in the biometric parameters of the plants, whereas the response of elemental sulphur was higher towards enhanced bioaccumulation of heavy metals in the soils. The integrated treatment (T7: vermicompost 6 g and elemental sulphur 0.5 g kg⁻¹ soil, with co-inoculation of the plant with T. thiooxidans and P. putida) was found superior in promoting root length, plant height and dry biomass of the plant. Treatment T7 enhanced accumulation of Cd up to 6.96 and 6.45 mg kg⁻¹ and of Pb up to 22.6 and 19.9 mg kg⁻¹ in corm and shoot, respectively, in the contaminated soil. T7 showed a maximum remediation efficiency of 0.46% and 0.19%, a bioaccumulation factor of 2.92 and 1.21, and uptake of 6.75 and 21.4 mg kg⁻¹ dry biomass for Cd and Pb, respectively, in the contaminated soil. The integrated treatment T7 was significantly better than the individual treatments in promoting plant growth and enhancing phytoremediation. Hence, the authors conclude that vermicompost, elemental sulphur and microbial co-inoculation should be integrated for the enhanced clean-up of Cd- and Pb-contaminated soils.

  7. A global optimization approach to multi-polarity sentiment analysis.

    Science.gov (United States)

    Li, Xinmiao; Li, Jing; Wu, Yukeng

    2015-01-01

    Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis, and the effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement, while the improvement of the two-polarity sentiment analysis was the smallest. We conclude that PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. From…
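    A bare-bones sketch of the global-optimisation idea, assuming scikit-learn; it tunes only the SVM hyperparameters on a stand-in dataset (the IG-based feature selection and the sentiment corpora of the paper are omitted):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Minimal particle swarm optimisation over (log10 C, log10 gamma) of an
# SVM, scored by cross-validated accuracy. Dataset and PSO constants
# are illustrative stand-ins, not the paper's configuration.
X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(7)

def fitness(p):
    svc = SVC(C=10 ** p[0], gamma=10 ** p[1])
    return cross_val_score(svc, X, y, cv=3).mean()

n_particles, n_iter = 8, 10
lo, hi = np.array([-1.0, -5.0]), np.array([3.0, -1.0])   # search box
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best (log10 C, log10 gamma):", gbest, " CV accuracy:", pbest_f.max())
```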

  8. Simulation Approach to Mission Risk and Reliability Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  9. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques. Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation…

  10. A hybrid transfinite element approach for nonlinear transient thermal analysis

    Science.gov (United States)

    Tamma, Kumar K.; Railkar, Sudhir B.

    1987-01-01

    A new computational approach for transient nonlinear thermal analysis of structures is proposed. It is a hybrid approach which combines the modeling versatility of contemporary finite elements in conjunction with transform methods and classical Bubnov-Galerkin schemes. The present study is limited to nonlinearities due to temperature-dependent thermophysical properties. Numerical test cases attest to the basic capabilities and therein validate the transfinite element approach by means of comparisons with conventional finite element schemes and/or available solutions.

  11. Microbial community analysis of soils contaminated with lead, chromium and petroleum hydrocarbons.

    Science.gov (United States)

    Joynt, Janet; Bischoff, Marianne; Turco, Ron; Konopka, Allan; Nakatsu, Cindy H

    2006-02-01

    The impact on the microbial community of long-term environmental exposure to metal and organic contamination was investigated. Twenty-four soil samples were collected along a transect dug in soils contaminated with road paint and paint solvents, mainly toluene. Chemical analysis along the transect revealed a range from high to low concentrations of metals (lead and chromium) and organic solvent compounds. Principal components analysis of microbial community structure, based on denaturing gradient gel electrophoresis of the V3 region of the 16S rRNA gene and on fatty acid methyl esters derived from phospholipids (phospholipid fatty acid analysis), showed that samples with similar fingerprints also had similar contaminant concentrations. There was also a weak positive correlation between microbial biomass and the organic carbon concentration. Results indicated that microbial populations are present despite some extreme contaminant levels in this mixed-waste contaminated site. Nucleotide sequence determination of the 16S rRNA gene indicated the presence of phylogenetically diverse bacteria belonging to the alpha-, beta-, gamma-, and delta-Proteobacteria, the high and low G + C Gram-positive bacteria, green nonsulfur, OP8, and others that did not group within a described division. This indicates that soils contaminated with both heavy metals and hydrocarbons for several decades have undergone changes in community composition, but still contain a phylogenetically diverse group of bacteria (including novel phylotypes) that warrant further investigation.

  12. Development of a digital signal processor-based new 12-lead synchronization electrocardiogram automatic analysis system.

    Science.gov (United States)

    Yang, Yuxing; Yin, Dongyuan; Freyer, Richard

    2002-07-01

    This paper presents a digital signal processor (DSP)-based multichannel electrocardiogram (ECG) system for automatic 12-lead synchronized ECG analysis in real time, with a high sampling rate of 1000 Hz and 12-bit precision. The dual-CPU hardware structure, based on the microprocessor (MPU) 89C55 and the DSP TMS320F206, combines the powerful control ability of the MPU with the fast computation ability of the DSP. Fully utilizing the dual-CPU resources, the system can allocate CPU time appropriately among the real-time tasks of multichannel synchronized ECG sampling, digital filtering, data storage, automatic waveform analysis and printing at a high sampling rate. The digital ECG system has the advantages of a simple structure, high-speed and high-precision sampling, powerful real-time processing ability and good quality. The paper discusses the system's principles and hardware design, and presents the ECG processing using a fast, simple integer-coefficient filter method together with automatic calculation algorithms for ECG parameters such as heart rate, P-R interval, Q-T interval and deflection angle of the ECG axis.
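    As an example of the kind of integer-coefficient filtering mentioned, here is the classic Pan-Tompkins low-pass (chosen for illustration, not necessarily the authors' filter), assuming NumPy/SciPy; such filters need only integer adds and shifts, which suits a fixed-point DSP:

```python
import numpy as np
from scipy.signal import lfilter

# Pan-Tompkins integer-coefficient low-pass:
#   y[n] = 2y[n-1] - y[n-2] + x[n] - 2x[n-6] + x[n-12]
# Designed for 200 Hz sampling; at 1000 Hz the delays would be rescaled.
b = np.zeros(13); b[[0, 6, 12]] = [1, -2, 1]   # feed-forward taps
a = np.array([1.0, -2.0, 1.0])                 # feedback taps

fs = 200
t = np.arange(0, 2, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 50 * t)  # + mains noise
filtered = lfilter(b, a, ecg) / 36.0           # 36 = DC gain of the filter

# Show the attenuation of the 50 Hz interference component.
spec = lambda x: np.abs(np.fft.rfft(x)) / len(x)
f = np.fft.rfftfreq(len(t), 1 / fs)
i50 = np.argmin(np.abs(f - 50))
print("50 Hz amplitude before/after:", spec(ecg)[i50], spec(filtered)[i50])
```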

  13. AN EXPLORATORY ANALYSIS ON HALF-HOURLY ELECTRICITY LOAD PATTERNS LEADING TO HIGHER PERFORMANCES IN NEU

    Directory of Open Access Journals (Sweden)

    K.A.D. Deshani

    2014-05-01

    Accurate prediction of electricity demand can bring extensive benefits to any country, as the forecasted values help the relevant authorities to take decisions regarding electricity generation, transmission and distribution appropriately. The literature reveals that, when compared to conventional time series techniques, improved artificial intelligence approaches provide better prediction accuracies. However, the accuracy of predictions using intelligent approaches like neural networks is strongly influenced by the correct selection of inputs and the number of neuro-forecasters used for prediction. Deshani, Hansen, Attygalle, & Karunarathne (2014) suggested that a cluster analysis could be performed to group similar day types, which contributes towards selecting a better set of neuro-forecasters in neural networks. Their cluster analysis was based on daily total electricity demands, as their target was to predict the daily total demands using neural networks. However, predicting half-hourly demand seems more appropriate due to the considerable changes of electricity demand observed during a particular day. As such, clusters are identified considering half-hourly data within the daily load distribution curves. Thus, this paper is an improvement on Deshani et al. (2014), illustrating how the half-hourly demand distribution within a day is incorporated when selecting the inputs for the neuro-forecasters.
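    A minimal sketch of the clustering step, assuming scikit-learn and synthetic half-hourly demand profiles (48 values per day), clustered by shape so that similar day types could each feed their own neuro-forecaster:

```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster days by their half-hourly load *shape* (48 values per day).
# The demand data below are synthetic: weekday-like evening-peak days
# and weekend-like flatter midday days.
rng = np.random.default_rng(5)
hours = np.arange(48) / 2.0
weekday = 600 + 250 * np.exp(-0.5 * ((hours - 19) / 2.0) ** 2)   # evening peak
weekend = 550 + 120 * np.exp(-0.5 * ((hours - 13) / 3.0) ** 2)   # flatter midday
days = np.vstack([weekday + rng.normal(0, 15, 48) for _ in range(50)] +
                 [weekend + rng.normal(0, 15, 48) for _ in range(20)])

profiles = days / days.sum(axis=1, keepdims=True)   # compare shapes, not totals
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
print("days per cluster:", np.bincount(labels))
```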

  14. Meta-analysis a structural equation modeling approach

    CERN Document Server

    Cheung, Mike W-L

    2015-01-01

    Presents a novel approach to conducting meta-analysis using structural equation modeling. Structural equation modeling (SEM) and meta-analysis are two powerful statistical methods in the educational, social, behavioral, and medical sciences. They are often treated as two unrelated topics in the literature. This book presents a unified framework for analyzing meta-analytic data within the SEM framework, and illustrates how to conduct meta-analysis using the metaSEM package in the R statistical environment. Meta-Analysis: A Structural Equation Modeling Approach begins by introducing the impo…

  15. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultu...

  16. Slurry analysis after lead collection on a sorbent and its determination by electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Baysal, Asli; Tokman, Nilgun [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469 Maslak-Istanbul (Turkey); Akman, Suleyman [Istanbul Technical University, Faculty of Science and Letters, Department of Chemistry, 34469 Maslak-Istanbul (Turkey)], E-mail: akmans@itu.edu.tr; Ozeroglu, Cemal [Istanbul University, Department of Chemistry, Faculty of Engineering, 34320 Avcilar-Istanbul (Turkey)

    2008-02-11

    In this study, in order to eliminate the drawbacks of the elution step and to reach higher enrichment factors, a novel preconcentration/separation technique is described for the slurry analysis of a sorbent loaded with lead prior to its determination by electrothermal atomic absorption spectrometry. For this purpose, lead was first collected on an ethylene glycol dimethacrylate-methacrylic acid copolymer (EGDMA-MA) treated with ammonium pyrrolidine dithiocarbamate (APDC) by a conventional batch technique. After separation of the liquid phase, a slurry of the sorbent was prepared and directly pipetted into the graphite furnace of the atomic absorption spectrophotometer. Optimum conditions for quantitative sorption and preparation of the slurry were investigated. A 100-fold enrichment factor could easily be reached. The analyte element in certified sea-water and bovine-liver samples was determined within the 95% confidence level. The proposed technique was fast and simple, and the risks of contamination and analyte loss were low. The detection limit (3δ) for Pb was 1.67 μg l⁻¹.

  17. Luminescence and phonon side band analysis of Eu3+-doped lead fluorosilicate glasses

    Science.gov (United States)

    Manasa, P.; Jayasankar, C. K.

    2016-12-01

    Lead fluorosilicate (SPbKNLFEu) glasses doped with different concentrations of Eu³⁺ ions have been prepared by the melt quenching technique. The structural and spectroscopic analysis has been carried out by Raman, absorption, excitation, emission and phonon side band (PSB) spectra and decay time measurements. The Judd-Ofelt theory has been used to predict the radiative properties of the emission levels of Eu³⁺ ions. The local structure around the Eu³⁺ ions and the phonon energy of the SPbKNLFEu glasses have been confirmed on the basis of the PSB associated with the ⁷F₀ → ⁵D₂ transition. The decay curves of the ⁵D₀ and ⁵D₁ levels exhibit single exponential nature with lifetimes of 2240 μs and 20 μs, respectively. The multiphonon relaxation rates (W_mp) from the excited levels to the next lower level of Eu³⁺ ions have been calculated. The higher stimulated emission cross-section and the strong red emission at 613 nm corresponding to the ⁵D₀ → ⁷F₂ transition suggest that the present lead fluorosilicate glasses could be useful for optical display devices.

  18. Spatial analysis on impacts of mining activities leading to flood disaster in the Erai watershed, India

    Energy Technology Data Exchange (ETDEWEB)

    Katpatal, Y.B.; Patil, S.A. [Visvesvaraya National Institute of Technology, Nagpur (India). Dept. of Civil Engineering

    2010-05-15

    Decisions related to mine management, especially pertaining to dumped material, might lead to several environmental hazards including flood risks in mining areas. Excavation and mine dumps are dominant factors of land use/land cover change in the Erai River watershed of Chandrapur district in Maharashtra, India. Identification and quantification of the extent of mining activities is important for assessing how this change in land use/land cover affects ecosystem components such as aesthetics, biodiversity and mitigation of floods in the Erai watershed. The present study utilizes satellite data of Landsat TM (1989), IRS LISS-3 (1999, 2007) and CARTOSAT (2007) to study the extent of surface mines and management of mine overburden (OB) dumps of Hindustan Lalpeth coal mines, Chandrapur, India. Image processing techniques in conjunction with GIS have been used to visualize the flood scenario, the reasons for floods and area under impact. The study indicates that the development of the mine OB dump within the river channel on both the sides has been responsible for the 2006 flood within the region. Further increase in OB dump heights may result in the risk of floods of greater potential during heavy rainfall in the future. The study presents a spatial analysis to assess the impacts of OB dumps in the recent flood in the area. The study also spatially represents the area under impact leading to a disastrous situation due to floods. The study also suggests the probable measures that must be adopted to avoid such situations in future in the mining areas.

  19. Spatial analysis on impacts of mining activities leading to flood disaster in the Erai watershed, India

    Energy Technology Data Exchange (ETDEWEB)

    Y.B. Katpatal; S.A. Patil [Visvesvaraya National Institute of Technology, Nagpur (India). Civil Engineering Department

    2010-03-15

    Decisions related to mine management, especially pertaining to dumped material, might lead to several environmental hazards including flood risks in mining areas. Excavation and mine dumps are dominant factors of land use/land cover change in the Erai River watershed of Chandrapur district in Maharashtra, India. Identification and quantification of the extent of mining activities is important for assessing how this change in land use/land cover affects ecosystem components such as aesthetics, biodiversity and mitigation of floods in the Erai watershed. The present study utilizes satellite data of Landsat TM (1989), IRS LISS-3 (1999, 2007) and CARTOSAT (2007) to study the extent of surface mines and management of mine overburden (OB) dumps of Hindustan Lalpeth coal mines, Chandrapur, India. Image processing techniques in conjunction with GIS have been used to visualize the flood scenario, the reasons for floods and area under impact. The study indicates that the development of the mine OB dump within the river channel on both the sides has been responsible for the 2006 flood within the region. Further increase in OB dump heights may result in the risk of floods of greater potential during heavy rainfall in the future. The study presents a spatial analysis to assess the impacts of OB dumps in the recent flood in the area. The study also spatially represents the area under impact leading to a disastrous situation due to floods. The study also suggests the probable measures that must be adopted to avoid such situations in future in the mining areas.

  20. Direct numerical simulation and statistical analysis of turbulent convection in lead-bismuth

    Energy Technology Data Exchange (ETDEWEB)

    Otic, I.; Grotzbach, G. [Forschungszentrum Karlsruhe GmbH, Institut fuer Kern-und Energietechnik (Germany)

    2003-07-01

    Improved turbulent heat flux models are required to develop and analyze the reactor concept of a lead-bismuth cooled Accelerator-Driven System. Because of the specific properties of many liquid metals, there are still no sensors for accurate measurements of the high-frequency velocity fluctuations, so the development of the turbulent heat transfer models required in our CFD (computational fluid dynamics) tools also needs data from direct numerical simulations of turbulent flows. We use new simulation results for the model problem of Rayleigh-Benard convection to show some peculiarities of turbulent natural convection in lead-bismuth (Pr = 0.025). Simulations of this flow at sufficiently large turbulence levels became feasible only recently, because the flow requires the resolution of very small velocity scales while recording long-wave structures for the slow changes in the convective temperature field. The results are analyzed regarding the principal convection and heat transfer features. They are also used to perform a statistical analysis showing that the currently available modeling is indeed not adequate for these fluids. Based on knowledge of the details of the statistical features of turbulence in this convection type and using the two-point correlation technique, a proposal for an improved statistical turbulence model is developed, which is expected to account better for the peculiarities of heat transfer in turbulent convection in low Prandtl number fluids. (authors)

  1. Review Essay: Conversation Analysis Versus Other Approaches to Discourse

    OpenAIRE

    Paul Ten Have

    2006-01-01

    This review discusses a recent book by Robin WOOFFITT in which conversation analysis (CA) is confronted with some other analytic approaches to "discourse." The author uses the term discourse analysis in a rather specific way, as a label for an analytic tradition that has become prominent in (social) psychology in the UK. Two other traditions, critical discourse analysis and Foucauldian discourse analysis are also discussed later in the book. The major criticism raised in the review is that th...

  2. BANSAI - An optofluidic approach for biomedical analysis

    Science.gov (United States)

    Knoerzer, Markus; Prokop, Christoph; Rodrigues Ribeiro, Graciete M.; Mayer, Horst; Brümmer, Jens; Mitchell, Arnan; Rabus, Dominik G.; Karnutsch, Christian

    2016-02-01

    Lab-on-a-chip based portable blood analysis systems would allow point-of-care measurements, e.g. in an ambulance or in remote areas with no fast access to medical care. Such a system would provide much faster information about the health of a patient. Here, we present a system that is based on absorption spectroscopy and uses an organic laser which is tunable in the visible range. The feasibility of the system is shown with a table-top setup using laboratory equipment. Measurements of human albumin show linear behaviour in a range from 2.5 g/L to 60 g/L. In a subsequent setup the system is implemented on a microfluidic chip and is capable of measuring transmitted and side-scattered intensities simultaneously, even with ambient light present. Air-suspended grating couplers on polymers are shown as the first element of a lab-on-a-chip implementation.
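    The reported linearity rests on Beer-Lambert calibration, A = ε·l·c, which can be sketched as follows (synthetic absorbance readings and an illustrative extinction coefficient, not the paper's measurements):

```python
import numpy as np

# Beer-Lambert calibration over the 2.5-60 g/L albumin range:
# fit A = slope * c + intercept, then invert for an unknown sample.
# Absorbance readings here are synthetic stand-ins.
rng = np.random.default_rng(9)
conc = np.array([2.5, 5, 10, 20, 40, 60])                 # g/L
absorbance = 0.011 * conc + rng.normal(0, 0.005, conc.size)

slope, intercept = np.polyfit(conc, absorbance, 1)
unknown_A = 0.30
print(f"epsilon*l = {slope:.4f} L/g, estimated c = {(unknown_A - intercept)/slope:.1f} g/L")
```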

  3. Towards a More Holistic Stakeholder Analysis Approach

    DEFF Research Database (Denmark)

    Sedereviciute, Kristina; Valentini, Chiara

    2011-01-01

    The argument stems from the need to assess stakeholder presence beyond the dyadic ties when finding stakeholders in new environments (social media), where connectivity and relationships play a key role. Consequently, the combination of the Stakeholder Salience Model (SSM) and social network analysis (SNA) is proposed as a more holistic solution for stakeholder identification, including stakeholders from social media. A process of finding "unknown" but important stakeholders from social media was identified, incorporating content search and the principles of SNA. Stakeholders from social media are identified based on the dimensions of connectivity and the content shared. Accordingly, the study introduces four groups of important actors from social media: unconcerned lurkers, unconcerned influencers, concerned lurkers and concerned influencers, and integrates them into the existing Stakeholder Salience Model.

  4. Sediment Analysis Using a Structured Programming Approach

    Directory of Open Access Journals (Sweden)

    Daniela Arias-Madrid

    2012-12-01

    This paper presents an algorithm designed for the analysis of a sedimentary sample of unconsolidated material, which seeks to identify very quickly the main features that occur in a sediment and thus classify it fast and efficiently. For this purpose, the weight of each particle-size class is entered into the program, and using the method of moments, based on four equations representing the mean, standard deviation, skewness and kurtosis, the attributes of the sample are found in a few seconds. With the program these calculations are performed effectively and more accurately, also providing explanations of the resulting features such as grain size, sorting, symmetry and origin, which helps to improve the study of sediments and, in general, the study of sedimentary rocks.
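    The method-of-moments computation described here reduces to a few weighted sums; a sketch with a made-up sample (phi units assumed for the size classes):

```python
import numpy as np

# Method-of-moments grain-size statistics: from the weight in each size
# class, compute mean, sorting (standard deviation), skewness and
# kurtosis. Sizes are in phi units; the sample weights are made up.
phi_midpoints = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])  # class midpoints
weight = np.array([5.0, 12.0, 30.0, 28.0, 15.0, 10.0])     # grams per class

f = weight / weight.sum()                                  # weight fractions
mean = np.sum(f * phi_midpoints)
std = np.sqrt(np.sum(f * (phi_midpoints - mean) ** 2))     # sorting
skew = np.sum(f * (phi_midpoints - mean) ** 3) / std ** 3
kurt = np.sum(f * (phi_midpoints - mean) ** 4) / std ** 4
print(f"mean={mean:.2f} phi, sorting={std:.2f}, skewness={skew:.2f}, kurtosis={kurt:.2f}")
```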

  5. Root-cause analysis and health failure mode and effect analysis: two leading techniques in health care quality assessment.

    Science.gov (United States)

    Shaqdan, Khalid; Aran, Shima; Daftari Besheli, Laleh; Abujudeh, Hani

    2014-06-01

    In this review article, the authors provide a detailed series of guidelines for effectively performing root-cause analysis (RCA) and health failure mode and effect analysis (HFMEA). RCA is a retrospective approach used to ascertain the "root cause" of a problem that has already occurred, whereas HFMEA is a prospective risk assessment tool whose aim is to recognize risks to patient safety. RCA and HFMEA are used for the prevention of errors or recurring errors to create a safer workplace, maintain high standards in health care quality, and incorporate time-saving and cost-saving modifications to favorably affect the patient care environment. The principles and techniques provided here should allow reviewers to better understand the features of RCA and HFMEA and how to apply these processes appropriately. These principles include how to organize a team, identify root causes, weed out proximate causes, graphically describe the process, conduct a hazard analysis, and develop and implement potential action plans.

  6. Terminal Performance of Lead-Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video

    CERN Document Server

    Courtney, Elijah; Andrusiv, Lubov; Courtney, Michael

    2016-01-01

    Due to concerns about environmental and industrial hazards of lead, a number of military, law enforcement, and wildlife management agencies are giving careful consideration to lead-free ammunition. The goal of lead-free bullets is to gain the advantages of reduced lead use in the environment while maintaining equal or better terminal performance. Accepting reduced terminal performance would foolishly risk the lives of military and law enforcement personnel. This paper uses the established technique of studying bullet impacts in ballistic gelatin to characterize the terminal performance of eight commercial off-the- shelf lead-free handgun bullets for comparison with earlier analysis of jacketed lead bullets. Peak retarding force and energy deposit in calibrated ballistic gelatin are quantified using high speed video. The temporary stretch cavities and permanent wound cavities are also characterized. Two factors tend to reduce the terminal performance of these lead-free projectiles compared to similar jacketed ...

  7. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part I, deterministic approaches

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Sang Kyu [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    The objective of this paper and a companion paper in this issue (part II, risk-informed approaches) is to derive technical insights from a critical review of deterministic and risk-informed safety analysis approaches that have been applied to develop licensing requirements for water-cooled reactors, or proposed for safety verification of the advanced reactor design. To this end, a review was made of a number of safety analysis approaches including those specified in regulatory guides and industry standards, as well as novel methodologies proposed for licensing of advanced reactors. This paper and the companion paper present the review insights on the deterministic and risk-informed safety analysis approaches, respectively. These insights could be used in making a safety case or developing a new licensing review infrastructure for advanced reactors including Generation IV reactors.

  8. An SQL-based approach to physics analysis

    Science.gov (United States)

    Limper, Maaike

    2014-06-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced "ROOT-ntuple" files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.

  9. A Project Risk Ranking Approach Based on Set Pair Analysis

    Institute of Scientific and Technical Information of China (English)

    Gao Feng; Chen Yingwu

    2006-01-01

    Set Pair Analysis (SPA) is a new methodology to describe and process system uncertainty. It is different from stochastic or fuzzy methods in reasoning and operation, and it has been applied in many areas recently. In this paper, the application of SPA to risk ranking is presented, which includes a review of risk ranking, an introduction to the Connecting Degree (CD), which plays a key role in SPA, the arithmetic and Tendency Grade (TG) of CDs, and a proposed risk ranking approach. Finally a case analysis is presented to illustrate the reasonability of this approach. The approach is found to be very convenient to operate, while the ranking result is more comprehensible.

  10. Text Analysis: A Functional Linguistic Approach of News Introduction

    Institute of Scientific and Technical Information of China (English)

    刘锦凤

    2009-01-01

    The past several decades have witnessed a phenomenal growth of interest in text analysis, in which different kinds of approaches have been studied and applied. This paper aims at analyzing the introduction of a chosen CNN news item from a functional linguistic approach, which is mainly realized through cohesive means and textual information. The study shows that in written text, well-organized semantic cohesive means and textual information are of great significance for readers to follow the movement of an idea from one sentence to another. Therefore, the functional approach plays a momentous role in the analysis of a text.

  11. In silico-screening approaches for lead generation: identification of novel allosteric modulators of human-erythrocyte pyruvate kinase.

    Science.gov (United States)

    Tripathi, Ashutosh; Safo, Martin K

    2012-01-01

    Identification of allosteric binding site modulators has gained increased attention lately because such modulators can potentially be developed as selective agents with a novel chemotype, targeting a new and unique binding site, with probably fewer side effects. Erythrocyte pyruvate kinase (R-PK) is an important glycolytic enzyme that can be pharmacologically modulated through its allosteric effectors for the treatment of hemolytic anemia, sickle-cell anemia, hypoxia-related diseases, and other disorders arising from erythrocyte PK malfunction. An in silico screening approach was applied to identify novel allosteric modulators of pyruvate kinase. A small-molecule database of the National Cancer Institute (NCI) was virtually screened based on a structure- and ligand-based pharmacophore. The virtual screening campaign led to the identification of several compounds with pharmacophoric features similar to those of fructose-1,6-bisphosphate (FBP), the natural allosteric activator of the kinase. The compounds were subsequently docked into the FBP-binding site using the programs FlexX and GOLD, and their interactions with the protein were analyzed with the energy-scoring function of HINT. Seven promising candidates were obtained from the NCI and subjected to kinetics analysis, which revealed both activators and inhibitors of the R-isozyme of PK (R-PK).
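
    As a flavor of the triage step in such campaigns, the sketch below applies a crude FBP-like property filter with RDKit (assumed installed); it stands in for, and is much weaker than, the pharmacophore search and FlexX/GOLD docking with HINT scoring used in the paper. The SMILES strings and thresholds are illustrative only.

      from rdkit import Chem
      from rdkit.Chem import Descriptors

      # FBP is small and very polar; mimic a few of its pharmacophoric
      # features with crude property thresholds (illustrative only, not the
      # paper's pharmacophore model).
      def fbp_like(smiles):
          mol = Chem.MolFromSmiles(smiles)
          if mol is None:
              return False
          return (Descriptors.MolWt(mol) < 500
                  and Descriptors.NumHDonors(mol) >= 2
                  and Descriptors.NumHAcceptors(mol) >= 4)

      candidates = ["OC(=O)CC(O)(CC(=O)O)C(=O)O",  # citric acid (hypothetical hit)
                    "c1ccccc1"]                    # benzene (should be rejected)
      print([s for s in candidates if fbp_like(s)])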

  12. Plant gravitropic signal transduction: A network analysis leads to gene discovery

    Science.gov (United States)

    Wyatt, Sarah

    Gravity plays a fundamental role in plant growth and development. Although a significant body of research has helped define the events of gravity perception, the role of the plant growth regulator auxin, and the mechanisms resulting in the gravity response, the events of signal transduction, those that link the biophysical action of perception to a biochemical signal resulting in auxin redistribution and those that regulate the gravitropic effects on plant growth, remain, for the most part, a "black box." Using a cold effect, dubbed the gravity persistent signal (GPS) response, we developed a mutant screen to specifically identify components of the signal transduction pathway. Cloning of the GPS genes has identified new proteins involved in gravitropic signaling. We have further exploited the GPS response using a multi-faceted approach including gene expression microarrays, proteomics analysis, bioinformatics analysis and continued mutant analysis to identify additional genes and physiological and biochemical processes. Gene expression data provided the foundation of a regulatory network for gravitropic signaling. Based on these gene expression data and related data sets/information from the literature/repositories, we constructed a gravitropic signaling network for Arabidopsis inflorescence stems. To generate the network, both a dynamic Bayesian network approach and a time-lagged correlation coefficient approach were used. The dynamic Bayesian network incorporated existing information on protein-protein interactions, while the time-lagged correlation coefficient allowed incorporation of temporal regulation and thus could exploit the time-course structure of the data set. The methods thus complemented each other and provided a more comprehensive evaluation of connections. Each method generated a list of possible interactions associated with a statistical significance value. The two networks were then overlaid to generate a more rigorous, intersected
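
    A minimal sketch of the time-lagged correlation half of such a pipeline, on synthetic data: an edge i -> j is proposed when expression of gene j at time t + lag tracks gene i at time t. The Bayesian-network half is not reproduced, and all data and thresholds here are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      expr = rng.normal(size=(5, 40))           # 5 hypothetical genes, 40 timepoints
      expr[1, 1:] += 1.2 * expr[0, :-1]         # gene 1 follows gene 0 with lag 1

      def lagged_corr(x, y, lag=1):
          # Correlation of x(t) with y(t + lag).
          return np.corrcoef(x[:-lag], y[lag:])[0, 1]

      edges = [(i, j, round(lagged_corr(expr[i], expr[j]), 2))
               for i in range(5) for j in range(5)
               if i != j and abs(lagged_corr(expr[i], expr[j])) > 0.6]  # illustrative cutoff
      print(edges)                              # should include an edge 0 -> 1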

  13. A novel approach for endocardial resynchronization therapy: Initial experience with transapical implantation of the left ventricular lead

    NARCIS (Netherlands)

    I. Kassai (Imre); A. Mihalcz (Attila); C. Foldesi (Csaba); A. Kardos (Attila); T. Szili-Torok (Tamas)

    2009-01-01

    textabstractBackground: Coronary sinus lead placement for transvenous left ventricular (LV) pacing in cardiac resynchronization therapy (CRT) has a significant failure rate at implant and a considerable dislocation rate during follow-up. For these patients epicardial pacing lead implantation is the

  14. Mycobiome: Approaches to analysis of intestinal fungi.

    Science.gov (United States)

    Tang, Jie; Iliev, Iliyan D; Brown, Jordan; Underhill, David M; Funari, Vincent A

    2015-06-01

    Massively parallel sequencing (MPSS) of bacterial 16S rDNA has been widely used to characterize the microbial makeup of the human and mouse gastrointestinal tract. However, techniques for fungal microbiota (mycobiota) profiling remain relatively under-developed. Compared to 16S profiling, the size and sequence context of the fungal Internal Transcribed Spacer 1 (ITS1), the most common target for mycobiota profiling, are highly variable. Using representative gastrointestinal tract fungi to build a known "mock" library, we examine how this sequence variability affects the quality of data derived from Illumina MiSeq and Ion Torrent PGM sequencing pipelines. Also, while analysis of bacterial 16S profiles is facilitated by high-quality, well-accepted databases of bacterial 16S sequences, no such accepted database has yet emerged to facilitate fungal ITS sequence characterization, and we observe that redundant and inconsistent ITS1 sequence representation in publicly available fungal reference databases affects the quantitation and annotation of species in the gut. To address this problem, we have constructed a manually curated reference database optimized for the annotation of gastrointestinal fungi. This targeted host-associated fungi (THF) database contains 1817 ITS1 sequences representing sequence diversity in genera previously identified in the human and mouse gut. We observe that this database consistently outperforms three common ITS database alternatives on comprehensiveness, taxonomy assignment accuracy and computational efficiency in analyzing sequencing data from the mouse gastrointestinal tract.
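
    To illustrate why the reference database dominates annotation quality, here is a toy best-hit classifier that assigns a read to the reference entry sharing the most k-mers. The sequences are synthetic placeholders (not real ITS1), and real pipelines use far more careful alignment and taxonomy handling.

      def kmers(seq, k=8):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      reference = {  # synthetic placeholder entries, hypothetical names
          "Candida_albicans": "TTTATCAACTTGTCACACCAGATTATTACTAATAGTCAAAACTTTCAACAACGGATCTCTTGGT",
          "Saccharomyces_cerevisiae": "AAACTTAATAGCAACAGTTTTTTTGGTTGGCACGGCCTTAGTGCTGGTAGACTATTTCTTCTTG",
      }
      ref_kmers = {name: kmers(seq) for name, seq in reference.items()}

      def annotate(read):
          # Best hit = reference entry with the largest shared-k-mer count.
          hits = {name: len(kmers(read) & ks) for name, ks in ref_kmers.items()}
          best = max(hits, key=hits.get)
          return best if hits[best] > 0 else "unclassified"

      print(annotate("CACACCAGATTATTACTAATAGTCAAAACTTT"))  # -> Candida_albicans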

  15. Conformal polishing approach: Tool footprint analysis

    Directory of Open Access Journals (Sweden)

    José A Dieste

    2016-02-01

    Polishing is one of the most critical manufacturing processes in metal part production because it determines the final quality of the product. Free-form surface polishing is a handmade process with many rejected parts, scrap generation, and high time and energy consumption. Two research lines are being developed: prediction models of the final surface-quality parameters, and an analysis of the amount of material removed as a function of the polishing parameters in order to predict the tool footprint during the polishing task. This research lays the foundations for a future automatic conformal polishing system. It is based on a rotational and translational tool with dry abrasive at the front, mounted at the end of a robot; a tool-to-part concept is used, which is useful for large or heavy workpieces. Results are applied to different curved parts typically used in the tooling, aeronautics or automotive industries. A mathematical model has been developed to predict the amount of material removed as a function of the polishing parameters, and the model has been fitted for different abrasives and raw materials. Results have shown deviations under 20%, which implies a reliable and controllable process. Smaller amounts of material can be removed in controlled areas of a three-dimensional workpiece.
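
    A minimal sketch of fitting such a removal model, assuming a Preston-type power law MRR = k * P^a * v^b; the paper's actual model form and fitted coefficients are not given here, and the data below are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      P = rng.uniform(1, 5, 30)        # contact pressure, arbitrary units
      v = rng.uniform(0.1, 2, 30)      # relative tool speed
      removal = 0.8 * P**1.1 * v**0.9 * rng.lognormal(0, 0.05, 30)  # mock measurements

      # Linearize with logs and solve by least squares:
      # log MRR = log k + a*log P + b*log v
      A = np.column_stack([np.ones_like(P), np.log(P), np.log(v)])
      coef, *_ = np.linalg.lstsq(A, np.log(removal), rcond=None)
      print("k =", np.exp(coef[0]), "a =", coef[1], "b =", coef[2])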

  16. Analysis of ECG Using Filter Bank Approach

    Directory of Open Access Journals (Sweden)

    S. Thulasi Prasad

    2014-01-01

    In recent years scientists and engineers have faced several problems in the biomedical field, and digital signal processing solves many of them easily and effectively. Signal processing of the ECG is very useful for detecting selected arrhythmia conditions from a patient's electrocardiograph (ECG) signals. In this paper we performed analysis of noisy ECG by filtering 50 Hz power-line interference using an adaptive LMS notch filter. This is very meaningful in the measurement of biomedical events, particularly when the recorded ECG signal is very weak. The basic ECG has a frequency range from 5 Hz to 100 Hz, and it becomes difficult for a specialist to diagnose diseases if artifacts are present in the ECG signal; methods of noise reduction therefore have a decisive influence on the performance of all electrocardiographic (ECG) signal processing systems. After removing the 50/60 Hz power-line interference, the ECG is lowpass filtered with a digital FIR filter. We designed a filter bank to separate frequency ranges of the ECG signal and enhance the occurrences of QRS complexes; the positions of the R-peaks are then identified and plotted. The results show the ECG signal before and after filtering, together with their frequency spectra, which clearly indicate the reduction of the power-line interference in the ECG signal, and a filtered ECG with identified R-peaks.
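
    A minimal sketch of the 50 Hz canceller, using the classic two-weight LMS structure with quadrature reference inputs; the FIR filter bank and QRS detection stages are not reproduced, and the "ECG" below is a crude synthetic waveform.

      import numpy as np

      fs, f0 = 500.0, 50.0                     # sampling rate and mains frequency (Hz)
      t = np.arange(0, 2, 1 / fs)
      ecg = np.sin(2 * np.pi * 1.2 * t) ** 21  # crude periodic ECG-like signal
      d = ecg + 0.5 * np.sin(2 * np.pi * f0 * t + 0.7)   # observed = signal + hum

      w = np.zeros(2)                          # two adaptive weights
      mu = 0.01                                # LMS step size
      clean = np.empty_like(d)
      for n in range(len(d)):
          x = np.array([np.cos(2 * np.pi * f0 * n / fs),
                        np.sin(2 * np.pi * f0 * n / fs)])
          e = d[n] - w @ x                     # error = cleaned ECG sample
          w += 2 * mu * e * x                  # LMS weight update
          clean[n] = e
      print("residual hum power:", np.mean((clean - ecg)[int(fs):] ** 2))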

  17. Developing a New Approach for Arabic Morphological Analysis and Generation

    CERN Document Server

    Gridach, Mourad

    2011-01-01

    Arabic morphological analysis is one of the essential stages in Arabic natural language processing. In this paper we present an approach to Arabic morphological analysis based on Arabic morphological automata (AMAUT). The proposed technique uses a morphological database realized using the XMODEL language. Arabic morphology represents a special type of morphological system because it is based on the concept of the scheme to represent Arabic words, and we use this concept to develop the Arabic morphological automata. The proposed approach also supports development standardization. It can be exploited by NLP applications such as syntactic and semantic analysis, information retrieval, machine translation and orthographical correction. The proposed approach is compared with the Xerox Arabic Analyzer and the Smrz Arabic Analyzer.
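
    In the root-and-pattern spirit of scheme-based analysis, the toy analyzer below treats each scheme as a small regular automaton over transliterated words; it only illustrates the idea and is not the AMAUT/XMODEL machinery of the paper. Schemes and glosses are simplified.

      import re

      schemes = {                       # transliterated; C1/C2/C3 = root slots
          "C1aC2aC3a": "perfect verb, 3rd person",   # e.g. kataba
          "C1aaC2iC3": "active participle",          # e.g. kaatib
          "maC1C2uuC3": "passive participle",        # e.g. maktuub
      }

      def analyse(word):
          # Each scheme compiles to a small regular automaton; the first
          # accepting scheme yields the extracted root and a gloss.
          for scheme, gloss in schemes.items():
              pattern = ("^" + scheme.replace("C1", "(.)")
                                     .replace("C2", "(.)")
                                     .replace("C3", "(.)") + "$")
              m = re.match(pattern, word)
              if m:
                  return "".join(m.groups()), gloss
          return None

      for w in ("kataba", "kaatib", "maktuub"):
          print(w, "->", analyse(w))      # all three share the root k-t-b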

  18. Inadvertent interchange of electrocardiogram limb lead connections: analysis of predicted consequences part II: double interconnection errors.

    Science.gov (United States)

    Rowlands, Derek J

    2012-01-01

    Limb lead connection errors are known to be very common in clinical practice. The consequences of all possible single limb lead interconnection errors were analyzed in an earlier publication (J Electrocardiology 2008;41:84-90). With a single limb lead interconnection error, 6 combinations of limb lead connections are possible. Two of these combinations give rise to records in which the limb lead morphology is uninterpretable. Such records show a "flat line" in lead II or III. Three of the errors give rise to records that are fully interpretable once the specific interconnection error has been identified (although one of the errors cannot reliably be recognized in the absence of a previous record for comparison). One of the errors produces no change in the electrocardiogram recording. In all cases, the precordial leads are interpretable, although there are very minor changes in the voltages. This communication predicts the changes in limb lead appearances consequent upon all possible double limb lead interchanges and illustrates these with records electively taken with such double interconnection errors. There are only 3 possible double limb lead interconnection errors. In 2 of the possible combinations, interpretation of the limb leads is impossible, and each of these errors gives rise to a flat line in lead I. In the third combination, the record is fully interpretable once the abnormality has been identified. In all 3 types, the precordial leads are interpretable, although there are very minor changes in the voltages.
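
    For intuition, the standard Einthoven definitions (I = LA - RA, II = LL - RA, III = LL - LA) make such predictions easy to verify numerically. The sketch below shows the simplest single RA/LA interchange (lead I inverts, leads II and III swap); the double interchanges analyzed in the paper follow the same algebra. The electrode potentials are synthetic.

      import numpy as np

      rng = np.random.default_rng(2)
      RA, LA, LL = rng.normal(size=(3, 8))     # toy electrode potentials over time

      def limb_leads(ra, la, ll):
          return {"I": la - ra, "II": ll - ra, "III": ll - la}

      normal = limb_leads(RA, LA, LL)
      swapped = limb_leads(LA, RA, LL)          # RA/LA cable interchange
      print(np.allclose(swapped["I"], -normal["I"]))    # lead I inverts
      print(np.allclose(swapped["II"], normal["III"]))  # leads II and III swap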

  19. Hierarchical equations of motion approach to transport through an Anderson impurity coupled to interacting Luttinger liquid leads

    Science.gov (United States)

    Okamoto, Jun-ichi; Mathey, Ludwig; Härtle, Rainer

    2016-12-01

    We generalize the hierarchical equations of motion method to study electron transport through a quantum dot or molecule coupled to one-dimensional interacting leads that can be described as Luttinger liquids. Such leads can be realized, for example, by quantum wires or fractional quantum Hall edge states. In comparison to noninteracting metallic leads, Luttinger liquid leads involve many-body correlations and the single-particle tunneling density of states shows a power-law singularity at the chemical potential. Using the generalized hierarchical equations of motion method, we assess the importance of the singularity and the next-to-leading order many-body correlations. To this end, we compare numerically converged results with second- and first-order results of the hybridization expansion that is inherent to our method. As a test case, we study transport through a single-level quantum dot or molecule that can be described by an Anderson impurity model. Cotunneling effects turn out to be most pronounced for attractive interactions in the leads or repulsive ones if an excitonic coupling between the dot and the leads is realized. We also find that an interaction-induced negative differential conductance near the Coulomb blockade thresholds is slightly suppressed as compared to a first-order and/or rate equation result. Moreover, we find that the two-particle (n-particle) correlations enter as a second-order (nth-order) effect and are, thus, not very pronounced at the high temperatures and parameters that we consider.

  20. Four Layered Approach to Non-Functional Requirements Analysis

    CERN Document Server

    Rao, A Ananda

    2012-01-01

    Identification of non-functional requirements is important for the successful development and deployment of a software product, since acceptance of the product by the customer depends on the non-functional requirements incorporated in the software. For this, we need to identify all the non-functional requirements required by all stakeholders, yet not many approaches are available in the literature for this purpose. Hence, we have proposed a four-layered analysis approach for the identification of non-functional requirements. The proposed layered approach has many advantages over a non-layered approach. As part of this approach, rules are also proposed for use in each layer. The approach is applied successfully to two case studies. The identified non-functional requirements are validated using a checklist, and in addition the completeness of the identified non-functional requirements is computed using a metric.

  1. The analysis of the effective lead-in helps optimize the English class

    Institute of Scientific and Technical Information of China (English)

    张蕾

    2014-01-01

    The lead-in, the first part of a class, guides students into the teaching and learning activities. It seems to be the simplest part and takes the least time of the whole class, but an effective lead-in helps optimize the English class. Lead-in methods for English classes in junior high school teaching are various. This thesis lists several effective and practical ways and notes some lead-in problems that need to be avoided.

  2. Higher-order terms in sensitivity analysis through a differential approach

    Energy Technology Data Exchange (ETDEWEB)

    Dubi, A.; Dudziak, D.J.

    1981-06-01

    A differential approach to sensitivity analysis has been developed that eliminates some difficulties existing in previous work. The new development leads to simple explicit expressions for the first-order perturbation as well as any higher-order terms. The higher-order terms are dependent only on differentials of the transport operator, the unperturbed flux, the adjoint flux, and the unperturbed Green's function of the system.
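
    In textbook notation such an expansion typically has the Neumann-series structure below (a hedged sketch; the paper's own operator notation is not reproduced). For a response R built from the flux,

      \delta R \;=\; -\langle \phi^{\dagger},\, \delta L\, \phi \rangle
                 \;+\; \langle \phi^{\dagger},\, \delta L\, G\, \delta L\, \phi \rangle \;-\; \cdots

    where \phi is the unperturbed flux, \phi^{\dagger} the adjoint flux, G the unperturbed Green's function and \delta L the differential of the transport operator, i.e. exactly the four ingredients the abstract says the higher-order terms depend on.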

  3. Morphological Analysis and Solubility of Lead Particles: Effect of Phosphates and Implications to Drinking Water Distribution

    Science.gov (United States)

    Objectives: • Describe lead synthesis experiments conducted to model the impact of water quality on lead particles and solubility • Develop a model system that can be used for lead solubility studies • Understand how phosphates impact the morphology and solubility transfo...

  4. A genetic algorithm approach to routine gamma spectra analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlevaro, C M [Instituto de FIsica de LIquidos y Sistemas Biologicos, Calle 59 No 789, B1900BTE La Plata (Argentina); Wilkinson, M V [Autoridad Regulatoria Nuclear, Avda. del Libertador 8250, C1429BNP Buenos Aires (Argentina); Barrios, L A [Autoridad Regulatoria Nuclear, Avda. del Libertador 8250, C1429BNP Buenos Aires (Argentina)

    2008-01-15

    In this work we present an alternative method for performing routine gamma spectra analysis based on genetic algorithm techniques. The main idea is to search for patterns of single nuclide spectra obtained by simulation in a sample spectrum targeted for analysis. We show how this approach is applied to the analysis of simulated and real target spectra, and also to the study of interference resolution.
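
    The core idea fits in a few lines: represent a candidate solution as a vector of nuclide weights, score it by how well the weighted sum of simulated single-nuclide templates matches the target spectrum, and evolve the population. The sketch below uses synthetic Gaussian "templates" and generic GA settings, not the authors' simulated library.

      import numpy as np

      rng = np.random.default_rng(3)
      channels = np.arange(256)

      def peak(center, width=4.0):             # Gaussian stand-in for a photopeak
          return np.exp(-0.5 * ((channels - center) / width) ** 2)

      templates = np.stack([peak(60), peak(120), peak(200)])   # three "nuclides"
      true_w = np.array([1.0, 0.3, 0.6])
      target = true_w @ templates + rng.normal(0, 0.01, channels.size)

      def fitness(w):                          # negative sum of squared residuals
          return -np.sum((w @ templates - target) ** 2)

      pop = rng.uniform(0, 2, (40, 3))         # initial random weight vectors
      for _ in range(200):
          scores = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(scores)[-20:]]              # keep the best half
          pa = parents[rng.integers(0, 20, 20)]
          pb = parents[rng.integers(0, 20, 20)]
          mask = rng.random((20, 3)) < 0.5
          children = np.where(mask, pa, pb)                    # uniform crossover
          children += rng.normal(0, 0.05, children.shape)      # mutation
          pop = np.vstack([parents, children])
      best = max(pop, key=fitness)
      print("estimated weights:", np.round(best, 2))           # ~ [1.0, 0.3, 0.6]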

  5. Rapid lead isotope analysis of archaeological metals by multiple-collector inductively coupled plasma mass spectrometry

    DEFF Research Database (Denmark)

    Baker, J.A.; Stos, S.; Waight, Tod Earle

    2006-01-01

    Lead isotope ratios in archaeological silver and copper were determined by MC-ICPMS using laser ablation and bulk dissolution without lead purification. Laser ablation results on high-lead metals and bulk solution analyses on all samples agree within error of TIMS data, suggesting that problems from isobaric interferences and/or mass bias variations due to the presence of matrix elements are insignificant. Inaccurate laser ablation analyses on low-lead copper reflect erroneous mass bias corrections from use of a non-matrix-matched standard. However, in most cases, silver and copper are analysable for lead isotopes by bulk dissolution or laser ablation MC-ICPMS with simplified sample preparation.

  6. A divergent synthetic approach to diverse molecular scaffolds: assessment of lead-likeness using LLAMA, an open-access computational tool.

    Science.gov (United States)

    Colomer, Ignacio; Empson, Christopher J; Craven, Philip; Owen, Zachary; Doveston, Richard G; Churcher, Ian; Marsden, Stephen P; Nelson, Adam

    2016-06-07

    Complementary cyclisation reactions of hex-2-ene-1,6-diamine derivatives were exploited in the synthesis of alternative molecular scaffolds. The value of the synthetic approach was analysed using LLAMA, an open-access computational tool for assessing the lead-likeness and novelty of molecular scaffolds.

  7. Classical mechanics approach applied to analysis of genetic oscillators.

    Science.gov (United States)

    Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha

    2016-04-05

    Biological oscillators are a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to specific oscillator topologies and/or give only qualitative answers, i.e., whether the dynamics of an oscillator are oscillatory or not for a given parameter space. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach is able to provide relatively accurate results regarding the type of behaviour a system exhibits (i.e., oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of an amplified negative feedback oscillator.

  8. Spectral Synthesis via Mean Field approach Independent Component Analysis

    CERN Document Server

    Hu, Ning; Kong, Xu

    2015-01-01

    In this paper, we apply a new statistical analysis technique, the Mean Field approach to Bayesian Independent Component Analysis (MF-ICA), to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few independent components (ICs), and a galaxy spectrum can then be reconstructed from these ICs. Compared to other algorithms, which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different tests are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) a consistency test on parameters from Sloan Digital Sky Survey galaxies. We find that our MF-ICA method not only fits the observed galaxy spectra efficiently, but also recovers the physical parameters of galaxies accurately. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find...
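
    As a stand-in for MF-ICA (whose mean-field Bayesian machinery is not reproduced here), the sketch below uses scikit-learn's FastICA to recover two independent spectral components from synthetic mixtures; it shows the compress-then-reconstruct idea only, and the "spectra" are invented.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(4)
      wave = np.linspace(3800, 9200, 500)              # wavelength grid (angstrom)
      s1 = np.exp(-0.5 * ((wave - 4861) / 40) ** 2)    # emission-like component
      s2 = 1.0 / (wave / 5000.0)                       # smooth continuum-like slope
      sources = np.stack([s1, s2])
      mixing = rng.uniform(0.2, 1.0, (30, 2))          # 30 mock "galaxy spectra"
      spectra = mixing @ sources + rng.normal(0, 0.01, (30, wave.size))

      ica = FastICA(n_components=2, random_state=0)
      recovered = ica.fit_transform(spectra.T).T       # independent components
      print(recovered.shape)                           # (2, 500)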

  9. Delayed amputation in lower limb trauma: an analysis of factors leading to delayed amputation.

    Science.gov (United States)

    Thiagarajan, P

    1999-03-01

    An in-depth analysis of the course of events leading to 49 delayed amputations of the lower extremity in 47 patients with open lower limb fractures is presented. Seventeen amputations were performed within one month, mainly for vascular reasons; eleven between one month and one year, due to persistent sepsis; and 21 more than a year after the original injury, for infected non-union. Below-knee amputation was done in 32 limbs, above-knee amputation in 13 limbs and Syme's amputation in 4 limbs. The delay in the timing of amputation was analysed with respect to the nature of the injury, the primary treatment and the Mangled Extremity Severity Score (MESS). The MESS was computed for all injuries, and a score of 7 or more predicted an early amputation. We suggest that in all severe lower limb injuries, particularly in Type IIIC fractures with associated neurological injury, the benefits of an early amputation be considered as an alternative to a limb salvage procedure.

  10. Metals and metalloids in atmospheric dust: Use of lead isotopic analysis for source apportionment

    Science.gov (United States)

    Felix Villar, Omar I.

    Mining activities generate aerosol in a wide range of sizes; smelting activities produce mainly fine particles (<1 μm). The adverse effects of aerosols on human health depend mainly on two key characteristics: size and chemical composition. One of the main objectives of this research is to analyze the size distribution of contaminants in aerosol produced by mining operations. For this purpose, a Micro-Orifice Uniform Deposit Impactor (MOUDI) was utilized. Results from the MOUDI samples show higher concentrations of toxic elements like lead and arsenic in the fine fraction. Mitigation strategies could be developed if the source of contamination is well defined. Environmental conditions such as wind speed, wind direction, relative humidity and precipitation play an important role in the concentration of atmospheric dust; dry environments with low relative humidity are ideal for the transport of aerosols. Results obtained from this research show the relationship between dust concentrations and meteorological parameters: dust concentrations are highly correlated with relative humidity and wind speed. With all the data collected on site and the analysis of the meteorological parameters, models can be developed to predict the transport of particles as well as the concentration of contaminants at a specific point. These models were developed and are part of the results shown in this dissertation.

  11. Ancient bronze coins from Mediterranean basin: LAMQS potentiality for lead isotopes comparative analysis with former mineral

    Science.gov (United States)

    Torrisi, L.; Italiano, A.; Torrisi, A.

    2016-11-01

    Bronze coins coming from the area of the Mediterranean basin, dated to the II-X Cent. A.D., were analyzed using different physical analytical techniques. Characteristic X-ray fluorescence was used with electrons and photons in order to investigate the elemental composition of both the surface layers and the bulk. Moreover, quadrupole mass spectrometry coupled to laser ablation (the LAMQS technique) in high vacuum was used to analyse typical material compounds from surface contamination. Mass spectrometry at high resolution and sensitivity, extended up to 300 amu, allowed measuring the 208Pb/206Pb and 207Pb/206Pb isotopic ratios in the coins. Quantitative relative analyses of these isotopic ratios identify the coin composition as a "fingerprint" that depends on the mineral used to extract the lead. Isotopic ratios in coins can thus be compared to those of the possible minerals used to produce the bronze alloy. A comparison between the measured isotope ratios in the analyzed coins and the literature database, relating minerals containing Pb to their geological and geophysical extraction mines, is presented. The analysis, restricted to old coins and the mines of the Mediterranean basin, indicates a possible correlation between the coin compositions and the possible geological sites of the extracted mineral.

  12. Raman analysis of ferroelectric switching in niobium-doped lead zirconate titanate thin films

    Energy Technology Data Exchange (ETDEWEB)

    Ferrari, P. [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile); Ramos-Moore, E., E-mail: evramos@fis.puc.cl [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile); Guitar, M.A. [Functional Materials, Materials Science Department, Saarland University, Saarbrücken D-66123 (Germany); Cabrera, A.L. [Facultad de Física, Pontificia Universidad Católica de Chile, Santiago 7820436 (Chile)

    2014-04-01

    Characteristic Raman vibration modes of niobium-doped lead zirconate titanate (PNZT) are studied as a function of ferroelectric domain switching. The microstructure of PNZT is characterized by scanning electron microscopy and X-ray diffraction. Ferroelectric switching is achieved by applying voltages between the top (Au) and bottom (Pt) electrodes while acquiring the Raman spectra in situ. Vibrational active modes associated with the paraelectric and ferroelectric phases are identified after measuring above and below the ferroelectric Curie temperature, respectively. Changes in the relative intensities of the Raman peaks are observed as a function of the switching voltage. The peak area associated with the ferroelectric modes is analyzed as a function of the applied voltage within one ferroelectric polarization loop, showing local maxima around the coercive voltage. This behavior can be understood in terms of the correlation between vibrational and structural properties, since ferroelectric switching modifies the interaction between the body-centered atom (Zr, Ti or Nb) and the Pb-O lattice. Highlights: • Electric fields induce structural distortions in ferroelectric perovskites. • A ferroelectric capacitor was fabricated to perform hysteresis loops. • Raman analysis was performed in situ during ferroelectric switching. • Raman modes show hysteresis and inflections around the coercive voltages. • The data can be understood in terms of vibrational-structural correlations.

  13. Proteomic analysis of the metabolic adaptation of the biocontrol agent Pseudozyma flocculosa leading to glycolipid production

    Directory of Open Access Journals (Sweden)

    Bélanger Richard R

    2010-02-01

    The yeast-like epiphytic fungus Pseudozyma flocculosa is known to antagonize powdery mildew fungi through proliferation on their colonies, presumably preceded by the release of an antifungal glycolipid (flocculosin). In culture conditions, P. flocculosa can be induced to produce flocculosin or not through manipulation of the culture medium nutrients. In order to characterize and understand the metabolic changes in P. flocculosa linked to glycolipid production, we conducted a 2-DE proteomic analysis and compared the proteomic profile of P. flocculosa growing under conditions favoring the development of the fungus (control) or conducive to flocculosin synthesis (stress). A large number of protein spots (771) were detected in protein extracts of the control treatment compared to only 435 matched protein spots in extracts of the stress cultures, which clearly suggests an important metabolic reorganization in slow-growing cells producing flocculosin. From the latter treatment, we were able to identify 21 protein spots that were either specific to the treatment or up-regulated significantly (2-fold increase). All of them were identified based on similarity between predicted ORFs of the newly sequenced P. flocculosa genome and the available annotated sequences of Ustilago maydis. These proteins were associated with carbon and fatty acid metabolism, and also with the filamentous change of the fungus leading to flocculosin production. This first look into the proteome of P. flocculosa suggests that flocculosin synthesis is elicited in response to specific stress or limiting conditions.

  14. VLSI architecture of leading eigenvector generation for on-chip principal component analysis spike sorting system.

    Science.gov (United States)

    Chen, Tung-Chien; Liu, Wentai; Chen, Liang-Gee

    2008-01-01

    On-chip spike detection and principal component analysis (PCA) sorting hardware in an integrated multi-channel neural recording system is highly desired to ease the bandwidth bottleneck from high-density microelectrode arrays implanted in the cortex. In this paper, we propose the first leading eigenvector generator, the key hardware module of PCA, to enable the whole framework. Based on the iterative eigenvector distilling algorithm, the proposed flipped structure enables a low-cost and low-power implementation by discarding the division and square-root hardware units. Further, the proposed adaptive level shifting scheme optimizes the accuracy-area trade-off by dynamically increasing the quantization parameter according to the signal level. With the specification of four principal components per channel, 32 samples per spike, and nine bits per sample, the proposed hardware can train 312 channels per minute at a 1 MHz operating frequency. It requires 0.13 mm² of silicon area and consumes 282 μW in a 90 nm 1P9M CMOS process.
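
    The software analogue of the iterative eigenvector-distilling step is plain power iteration on the spike covariance; a minimal sketch on synthetic spikes follows. The per-step normalization division shown here is the kind of operation the proposed flipped hardware structure avoids.

      import numpy as np

      rng = np.random.default_rng(5)
      spikes = rng.normal(size=(200, 32))      # 200 aligned spikes, 32 samples each
      X = spikes - spikes.mean(axis=0)         # mean-centre
      C = X.T @ X                              # spike covariance (up to a factor)

      v = rng.normal(size=32)
      for _ in range(50):                      # power iteration
          v = C @ v
          v /= np.linalg.norm(v)               # the normalization the chip avoids
      print("leading eigenvector (first 5):", np.round(v[:5], 3))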

  15. The mobility of Atlantic baric depressions leading to intense precipitation over Italy: a preliminary statistical analysis

    Directory of Open Access Journals (Sweden)

    N. Tartaglione

    2006-01-01

    The speed of Atlantic surface depressions that occurred during the autumn and winter seasons and led to intense precipitation over Italy from 1951 to 2000 was investigated. Italy was divided into 5 regions, as documented in previous climatological studies (based on Principal Component Analysis). Intense precipitation events were selected on the basis of in situ rain gauge data and clustered according to the region that they hit. For each intense precipitation event we tried to identify an associated surface depression, and we tracked it, within a large domain covering the Mediterranean and Atlantic regions, from its formation to cyclolysis in order to estimate its speed. Depression speeds were estimated with 6-h resolution and clustered into slow and non-slow classes by means of a threshold coinciding with the first quartile of the speed distribution, and depression centre speeds were associated with their positions. Slow speeds occurring over an area including Italy and the western Mediterranean basin showed frequencies higher than 25% for all the Italian regions but one. The probability of obtaining by chance the observed success rate of more than 25% was estimated by means of a binomial distribution. The statistical reliability of the result is confirmed for only one region; for Italy as a whole, results were confirmed at the 95% confidence level. The stability of the statistical inference, with respect to errors in estimating depression speed and changes in the threshold for slow depressions, was analysed and essentially confirmed the previous results.
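
    The significance computation can be sketched directly from the binomial tail: the chance of at least k slow 6-h positions out of n falling in the region if each landed there with probability p = 0.25. The counts below are made up for illustration.

      from math import comb

      def binom_tail(n, k, p=0.25):
          # P(X >= k) for X ~ Binomial(n, p)
          return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

      n, k = 40, 16          # hypothetical: 16 of 40 slow positions in the region
      print(f"P(X >= {k}) = {binom_tail(n, k):.4f}")   # small value -> significant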

  16. An Approach to Structural Approximation Analysis by Artificial Neural Networks

    Institute of Scientific and Technical Information of China (English)

    陆金桂; 周济; 王浩; 陈新度; 余俊; 肖世德

    1994-01-01

    This paper theoretically proves, based on Kolmogorov's mapping neural network existence theorem, that a three-layer neural network can exactly implement the mapping between the design variables of any elastic structure and its stresses and displacements. A new approach to structural approximation analysis with global characteristics, based on artificial neural networks, is presented. The computer simulation experiments in this paper show that the new approach is effective.

  17. Advanced approaches to failure mode and effect analysis (FMEA applications

    Directory of Open Access Journals (Sweden)

    D. Vykydal

    2015-10-01

    The present paper explores advanced approaches to the FMEA (Failure Mode and Effect Analysis) method which take into account the costs associated with the occurrence of failures during the manufacture of a product. The different approaches are demonstrated using an example FMEA application to the production of drawn wire. Their purpose is to determine risk levels while taking account of the above-mentioned costs. Finally, the resulting priority levels are compared with a view to developing actions that mitigate the risks.
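
    A minimal sketch contrasting the classic risk priority number RPN = S x O x D with one illustrative cost weighting; the paper's specific cost-based formulas are not reproduced, and the failure modes and numbers below are invented.

      failures = [
          # (mode, severity, occurrence, detection, failure cost per incident)
          ("surface cracks", 7, 4, 3, 1200.0),
          ("diameter drift", 5, 6, 2, 300.0),
          ("wire breakage", 9, 2, 4, 5000.0),
      ]

      for mode, s, o, d, cost in sorted(failures, key=lambda f: f[1] * f[2] * f[3],
                                        reverse=True):
          rpn = s * o * d                # classic risk priority number
          weighted = rpn * cost          # hypothetical cost weighting
          print(f"{mode:15s} RPN={rpn:3d} cost-weighted={weighted:10.0f}")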

  18. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    Science.gov (United States)

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation.

  19. A machine learning approach to nonlinear modal analysis

    Science.gov (United States)

    Worden, K.; Green, P. L.

    2017-02-01

    Although linear modal analysis has proved itself to be the method of choice for the analysis of linear dynamic structures, its extension to nonlinear structures has proved to be a problem. A number of competing viewpoints on nonlinear modal analysis have emerged, each of which preserves a subset of the properties of the original linear theory. From the geometrical point of view, one can argue that the invariant manifold approach of Shaw and Pierre is the most natural generalisation. However, the Shaw-Pierre approach is rather demanding technically, depending as it does on the analytical construction of a mapping between spaces, which maps physical coordinates into invariant manifolds spanned by independent subsets of variables. The objective of the current paper is to demonstrate a data-based approach motivated by Shaw-Pierre method which exploits the idea of statistical independence to optimise a parametric form of the mapping. The approach can also be regarded as a generalisation of the Principal Orthogonal Decomposition (POD). A machine learning approach to inversion of the modal transformation is presented, based on the use of Gaussian processes, and this is equivalent to a nonlinear form of modal superposition. However, it is shown that issues can arise if the forward transformation is a polynomial and can thus have a multi-valued inverse. The overall approach is demonstrated using a number of case studies based on both simulated and experimental data.

  20. Different approaches to proximate analysis by thermogravimetry analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mayoral, M.C.; Izquierdo, M.T.; Andres, J.M.; Rubio, B. [Instituto de Carboquimica, CSIC, Maria de Luna, n. 12, 50015- Zaragoza (Spain)

    2001-04-04

    The experimental optimization by the simplex method of the proximate analysis of coal and biomass by thermogravimetry analysis (TGA) is reported. Heating rate, final temperature, holding time, Ar flow rate and sample size were the control variables. The response function used was chosen to minimize the difference in percentage of volatile matter with the ASTM characterization. The relative accuracy of the method was demonstrated by determination of the volatile matter contents of a number of coals in parallel with the ASTM certified method. The method is successfully used with biomass samples.
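
    A minimal sketch of simplex optimization of the control variables, assuming a mock response function; in the actual study each "evaluation" is a TGA run and the response is the difference from the ASTM volatile-matter value. Only two of the five control variables are shown.

      import numpy as np
      from scipy.optimize import minimize

      ASTM_VM = 32.0                           # certified volatile matter (%)

      def tga_volatile_matter(x):
          # Mock instrument response; a real evaluation would run the TGA at
          # heating rate x[0] and final temperature x[1].
          heating_rate, final_temp = x
          return 32.0 - 4.0 * np.exp(-heating_rate / 40.0) + 0.004 * (final_temp - 900.0)

      def response(x):                         # quantity the simplex minimizes
          return abs(tga_volatile_matter(x) - ASTM_VM)

      res = minimize(response, x0=[20.0, 950.0], method="Nelder-Mead")
      print("best heating rate / final temperature:", np.round(res.x, 1))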

  1. A global optimization approach to multi-polarity sentiment analysis.

    Directory of Open Access Journals (Sweden)

    Xinmiao Li

    Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis, and the effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis, with IG for feature selection and an SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a globally optimal combination of feature dimensions and SVM parameters. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary; the comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement and the two-polarity sentiment analysis the smallest; we conclude that PSOGO-Senti achieves higher improvement for more complicated sentiment analysis tasks. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid
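
    A minimal sketch of the PSO-over-SVM idea on synthetic data: particles move through (log C, log gamma) space, scored by cross-validated accuracy. The joint search over the IG feature dimension that PSOGO-Senti also performs is omitted, and all PSO constants are generic choices.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=300, n_features=20, random_state=0)

      def accuracy(log_c, log_gamma):          # fitness: cross-validated accuracy
          clf = SVC(C=10.0**log_c, gamma=10.0**log_gamma)
          return cross_val_score(clf, X, y, cv=3).mean()

      rng = np.random.default_rng(6)
      pos = rng.uniform(-2, 2, (8, 2))         # 8 particles in (log C, log gamma)
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_val = np.array([accuracy(*p) for p in pos])
      gbest = pbest[pbest_val.argmax()]

      for _ in range(10):
          r1, r2 = rng.random((2, 8, 1))       # random cognitive/social factors
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, -3, 3)
          vals = np.array([accuracy(*p) for p in pos])
          better = vals > pbest_val
          pbest[better], pbest_val[better] = pos[better], vals[better]
          gbest = pbest[pbest_val.argmax()]
      print("best (log10 C, log10 gamma):", np.round(gbest, 2),
            "accuracy:", round(pbest_val.max(), 3))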

  2. Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY12 Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Kulisek, Jonathan A.; Anderson, Kevin K.; Casella, Andrew M.; Siciliano, Edward R.; Warren, Glen A.

    2012-09-28

    Executive Summary: Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel with an uncertainty considerably lower than the approximately 10% typical of today's confirmatory methods. This document is a progress report for FY2012 PNNL analysis and algorithm development. Progress made by PNNL in FY2012 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel assemblies. PNNL further refined the semi-empirical model developed in FY2011, based on singular value decomposition (SVD), to numerically account for the effects of self-shielding. The average uncertainty in the Pu mass across the NGSI-64 fuel assemblies was shown to be less than 3% using only six calibration assemblies with a 2% uncertainty in the isotopic masses. When calibrated against the six NGSI-64 fuel assemblies, the algorithm was able to determine the total Pu mass to within <2% uncertainty for the 27 diversion cases also developed under NGSI. Two purely empirical algorithms were developed that do not require the use of Pu isotopic fission chambers. The semi-empirical and purely empirical algorithms were successfully tested using MCNPX simulations as well as applied to experimental data measured by RPI using their LSDS. The algorithms were able to describe the 235U masses of the RPI measurements with an average uncertainty of 2.3%. Analyses were conducted that provided valuable insight with regard to design requirements (e
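
    A minimal sketch of an SVD-stabilized linear calibration, assuming each assembly is summarized by a feature vector extracted from its slowing-down time spectrum; the features, masses and noise below are synthetic stand-ins, not NGSI data or PNNL's actual algorithm.

      import numpy as np

      rng = np.random.default_rng(7)
      W = rng.normal(size=(10, 1))             # hidden feature-per-unit-mass response
      masses = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 4.5])   # six calibration "assemblies"
      features = masses[:, None] @ W.T + rng.normal(0, 0.02, (6, 10))

      # Truncated-SVD pseudoinverse: keep only well-determined directions,
      # a crude analogue of using SVD to stabilize the calibration.
      U, s, Vt = np.linalg.svd(features, full_matrices=False)
      k = int((s > 1.0).sum())                 # components above the noise floor
      coef = Vt[:k].T @ np.diag(1.0 / s[:k]) @ U[:, :k].T @ masses

      unknown = 3.2 * W.T + rng.normal(0, 0.02, (1, 10))   # mock "diversion case"
      print("estimated mass:", (unknown @ coef).item())    # ~ 3.2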

  3. The Existence Of Leading Islands Securing And The Border Areas Unitary State Of Indonesia An Analysis In Law Perspective

    Directory of Open Access Journals (Sweden)

    Nazali

    2015-08-01

    Abstract: The research was carried out with the aim of discovering the existence of the securing of the foremost islands and the border region of the Republic of Indonesia, reviewed from a legal perspective directly related to security, dispute resolution methods, and the governance of the foremost islands and the border region in Kalimantan bordering Malaysia. The study was conducted in Nunukan district and the surrounding provinces of Kalimantan. The research method used is normative legal analysis of the data, with a juridical and qualitative descriptive approach. The results showed that the securing of the foremost islands and the border region, viewed from a legal perspective in accordance with Law No. 34 of 2004 regarding the Indonesian National Army, has not been implemented to the fullest to realize the security of the foremost islands and the border region as the front line of the Republic of Indonesia. The securing of the leading islands and the border region of the Republic of Indonesia still contains many weaknesses, in terms of both governance and security.

  4. Temperature Distribution Analysis of JAERI 60 kA HTS Lead

    Institute of Scientific and Technical Information of China (English)

    FUYoukun; T.Isono

    2003-01-01

    A high temperature superconductor (HTS) current lead has an advantage in reducing the electric power consumption of the refrigerator for a large-current superconducting magnet system such as a fusion device. A fusion device requires more than 20 pairs of large-current leads, each with a current capacity of about 60 kA. A conventional 60 kA current lead needs 100 kW of electric power for refrigeration, and a 2/3 reduction (to roughly 33 kW) is available through the application of an HTS current lead.

  5. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  6. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications, offering a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  7. Modeling Vocabulary Loss——Approach leading to a comprehensive analysis of vocabulary attrition?

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Ⅰ. Introduction: The article the author has chosen, entitled Modeling Vocabulary Loss (Applied Linguistics, 2004), is by Prof. Paul Meara of the University of Wales Swansea. The reason it has been chosen here is definitely not because of the tentative move

  8. In Vitro And In Vivo Approaches For The Measurement Of Oral Bioavailability Of Lead (Pb) In Contaminated Soils: A Review

    Science.gov (United States)

    We reviewed the published evidence of lead (Pb) contamination of urban soils, soil Pb risk to children through hand-to-mouth activity, reduction of soil Pb bioavailability due to soil amendments, and methods to assess bioaccessibility which correlate with bioavailability of soil ...

  9. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    Science.gov (United States)

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  10. Why does electron sharing lead to covalent bonding? A variational analysis.

    Science.gov (United States)

    Ruedenberg, Klaus; Schmidt, Michael W

    2007-01-15

    Ground state energy differences between related systems can be elucidated by a comparative variational analysis of the energy functional, in which the concepts of variational kinetic pressure and variational electrostatic potential pull are found useful. This approach is applied to the formation of the bond in the hydrogen molecule ion. A highly accurate wavefunction is shown to be the superposition of two quasiatomic orbitals, each of which consists to 94% of the respective atomic 1s orbital, the remaining 6% deformation being 73% spherical and 27% nonspherical in character. The spherical deformation can be recovered to 99.9% by scaling the 1s orbital. These results quantify the conceptual metamorphosis of the free-atom wavefunction into the molecular wavefunction by orbital sharing, orbital contraction, and orbital polarization. Starting with the 1s orbital on one atom as the initial trial function, the value of the energy functional of the molecule at the equilibrium distance is stepwise lowered along several sequences of wavefunction modifications, whose energies monotonically decrease to the ground state energy of H2+. The contributions of sharing, contraction and polarization to the overall lowering of the energy functional and their kinetic and potential components exhibit a consistent pattern that can be related to the wavefunction changes on the basis of physical reasoning, including the virial theorem. It is found that orbital sharing lowers the variational kinetic energy pressure and that this is the essential cause of covalent bonding in this molecule.
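
    As standard background for this kinetic/potential bookkeeping (a textbook result, not specific to the paper), the virial theorem for a Coulomb system at an equilibrium geometry reads

      2\langle T \rangle \;=\; -\langle V \rangle, \qquad
      E \;=\; \langle T \rangle + \langle V \rangle \;=\; -\langle T \rangle,

    so a lower total energy at equilibrium is accompanied by a higher total kinetic energy overall; this is why the analysis must separate where the kinetic pressure is lowered (interatomic sharing) from where it is raised (orbital contraction).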

  11. Review Essay: Conversation Analysis Versus Other Approaches to Discourse

    Directory of Open Access Journals (Sweden)

    Paul Ten Have

    2006-03-01

    This review discusses a recent book by Robin WOOFFITT in which conversation analysis (CA) is confronted with some other analytic approaches to "discourse." The author uses the term discourse analysis in a rather specific way, as a label for an analytic tradition that has become prominent in (social) psychology in the UK. Two other traditions, critical discourse analysis and Foucauldian discourse analysis, are also discussed later in the book. The major criticism raised in the review is that the book's usefulness is limited by its restriction to approaches currently en vogue in Britain and its selective treatment of CA. In an Epilogue, the issues raised in the book are discussed in a wider perspective. URN: urn:nbn:de:0114-fqs060239

  12. Data Warehouse Requirements Analysis Framework: Business-Object Based Approach

    Directory of Open Access Journals (Sweden)

    Anirban Sarkar

    2012-01-01

    Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system, and the requirements analysis specifications are used as the prime input for the construction of the conceptual-level multidimensional data model. This paper proposes a business-object-based requirements analysis framework for DW systems that is supported by an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into the high-level design components of a graph-semantic-based, conceptual-level, object-oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business-process-driven approach and finally refines the requirements in further detail to map them into the conceptual-level DW design model, using either a demand-driven or a mixed-driven approach for DW requirements analysis.

  13. Radiometric trace analysis quantitative paper chromatography of lead with phosphate-32P

    NARCIS (Netherlands)

    Erkelens, P.C. van

    1961-01-01

    A method is described for the selective determination of lead in paper chromatograms, down to 1 μg (standard deviation 11%). After development and drying, the lead spot is sprayed with a Na2H32PO4 solution and dried. Excess reagent and alkaline earth phosphates are eluted with a borax—oxalate buffer

  14. Ground Truthing the 'Conventional Wisdom' of Lead Corrosion Control Using Mineralogical Analysis

    Science.gov (United States)

    For drinking water distribution systems (DWDS) with lead-bearing plumbing materials some form of corrosion control is typically necessary, with the goal of mitigating lead release by forming adherent, stable corrosion scales composed of low-solubility mineral phases. Conventional...

  15. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond the sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena about language in relation to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. On one dimension, each shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). On other dimensions, each approach holds distinctive characteristics that contribute to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on the strengths and weaknesses in the essence of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  16. Overview of the use of ATHENA for thermal-hydraulic analysis of systems with lead-bismuth coolant

    Energy Technology Data Exchange (ETDEWEB)

    C. B. Davis; A. S. Shieh

    2000-04-02

    The INEEL and MIT are investigating the suitability of a lead-bismuth cooled fast reactor for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The ATHENA code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors, and two modifications were made to the code as a result of this review. Specifically, a correlation to represent heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work that is being performed with the code and plans for future analytical work.

  17. Overview of the Use of ATHENA for Thermal-Hydraulic Analysis of Systems with Lead-Bismuth Coolant

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Cliff Bybee; Shieh, Arthur Shan Luk

    2000-04-01

    The INEEL and MIT are investigating the suitability of a lead-bismuth cooled fast reactor for producing low-cost electricity as well as for actinide burning. This paper is concerned with the general area of thermal-hydraulics of lead-bismuth cooled reactors. The ATHENA code is being used in the thermal-hydraulic design and analysis of lead-bismuth cooled reactors. The ATHENA code was reviewed to determine its applicability for simulating lead-bismuth cooled reactors, and two modifications were made to the code as a result of this review. Specifically, a correlation to represent heat transfer from rod bundles to a liquid metal and a void correlation based on data taken in a mixture of lead-bismuth and steam were added to the code. The paper also summarizes the analytical work that is being performed with the code and plans for future analytical work.

  18. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sozer, Hasan; Tekinerdogan, Bedir; Aksit, Mehmet; Lemos, de Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  19. Defibrillation lead placement using a transthoracic transatrial approach in a case without transvenous access due to lack of the right superior vena cava.

    Science.gov (United States)

    Otsuka, Yosuke; Okamura, Hideo; Sato, Syunsuke; Nakajima, Ikutaro; Ishibashi, Kohei; Miyamoto, Kouji; Noda, Takashi; Aiba, Takeshi; Kamakura, Shiro; Kobayashi, Junjiro; Yasuda, Satoshi; Ogawa, Hisao; Kusano, Kengo

    2015-06-01

    A 65-year-old woman with a history of syncope was diagnosed with hypertrophic cardiomyopathy. She had previously undergone mastectomy of the left breast owing to breast cancer. Holter electrocardiogram (ECG) and monitor ECG revealed sick sinus syndrome (Type II) and non-sustained ventricular tachycardia. Sustained ventricular tachycardia and ventricular fibrillation were induced in an electrophysiological study. Although the patient was eligible for treatment with a dual chamber implantable cardioverter defibrillator (ICD), venography revealed lack of the right superior vena cava (R-SVC). Lead placement from the left subclavian vein would have increased the risk of lymphedema owing to the patient's mastectomy history. Consequently, the defibrillation lead was placed in the right ventricle by direct puncture of the right auricle through the tricuspid valve. The atrial lead was sutured to the atrial wall, and the postoperative course was unremarkable. Defibrillation lead placement using a transthoracic transatrial approach can be an alternative method in cases where a transvenous approach for lead placement is not feasible.

  20. Screening for cardiovascular safety: a structure-activity approach for guiding lead selection of melanin concentrating hormone receptor 1 antagonists.

    Science.gov (United States)

    Kym, Philip R; Souers, Andrew J; Campbell, Thomas J; Lynch, John K; Judd, Andrew S; Iyengar, Rajesh; Vasudevan, Anil; Gao, Ju; Freeman, Jennifer C; Wodka, Dariusz; Mulhern, Mathew; Zhao, Gang; Wagaw, Seble H; Napier, James J; Brodjian, Sevan; Dayton, Brian D; Reilly, Regina M; Segreti, Jason A; Fryer, Ryan M; Preusser, Lee C; Reinhart, Glenn A; Hernandez, Lisa; Marsh, Kennan C; Sham, Hing L; Collins, Christine A; Polakowski, James S

    2006-04-06

    An inactin-anesthetized rat cardiovascular (CV) assay was employed in a screening mode to triage multiple classes of melanin-concentrating hormone receptor 1 (MCHr1) antagonists. Lead identification was based on a compound profile producing high drug concentrations in both plasma (>40 microM) and brain (>20 microg/g); when adverse cardiovascular effects were observed at these exposures, optimization activities on the corresponding classes of MCHr1 antagonists were terminated. After providing evidence that the cardiovascular liabilities were not a function of MCHr1 antagonism, continued screening identified the chromone-substituted aminopiperidine amides as a class of MCHr1 antagonists that demonstrated a safe cardiovascular profile at high drug concentrations in both plasma and brain. The high incidence of adverse cardiovascular effects associated with an array of MCHr1 antagonists of significant chemical diversity, combined with the stringent safety requirements for antiobesity drugs, highlights the importance of incorporating cardiovascular safety assessment early in the lead selection process.

  1. An electroanalytical approach for evaluation of biochar adsorption characteristics and its application for lead and cadmium determination.

    Science.gov (United States)

    Suguihiro, Talita Mayumi; de Oliveira, Paulo Roberto; de Rezende, Edivaltrys Inayve Pissinati; Mangrich, Antonio Sálvio; Marcolino, Luiz Humberto; Bergamini, Márcio F

    2013-09-01

    This work describes for the first time the use of electroanalytical techniques to evaluate the adsorptive properties of biochar, using it as an electrode modifier for the preconcentration and determination of lead(II) and cadmium(II) under differential pulse adsorptive stripping voltammetric conditions (DPAdSV). Samples of biochar were obtained from castor oil cake using a predefined set of experimental conditions varying the heating rate (V), final temperature (T) and warm-up period (P), and were subsequently used to prepare carbon paste modified electrodes (CPME). The proposed method was applied to lead(II) and cadmium(II) determination in spiked simulated industrial effluents, and the limits of detection obtained for both metals were adequate for the determination of these ions, taking into account the limits established by Brazilian legislation. For all samples analyzed, recoveries ranging from 95% to 104% were obtained, and no significant interferences from common cations in water samples were observed.
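
    As a hedged illustration of the calibration step implied here, the sketch below estimates a limit of detection as 3*sigma(blank)/slope, a common convention in stripping voltammetry. All concentrations and currents are made-up example numbers, not the paper's measurements.

    ```python
    # Limit of detection from a linear calibration: LOD = 3 * s_blank / slope.
    import numpy as np

    conc = np.array([5, 10, 20, 40, 80])                      # Pb(II), ug/L (hypothetical)
    peak_current = np.array([0.42, 0.85, 1.71, 3.36, 6.80])   # uA (hypothetical)
    blank_runs = np.array([0.021, 0.018, 0.025, 0.019, 0.023, 0.020])  # uA

    slope, intercept = np.polyfit(conc, peak_current, 1)
    lod = 3 * blank_runs.std(ddof=1) / slope
    print(f"sensitivity = {slope:.3f} uA per ug/L, LOD = {lod:.2f} ug/L")
    ```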

  2. The tradeoff analysis approach : lessons from Ecuador and Peru

    NARCIS (Netherlands)

    Antle, J.; Stoorvogel, J.J.; Bowen, W.; Crissman, C.; Yanggen, D.

    2003-01-01

    In the first part of this paper we identify some of the factors that have limited the success of impact assessment, including the use of benefit-cost analysis as the paradigm for impact assessment. We then present a new approach to integrated assessment of agricultural production systems called Tradeoff Analysis.

  3. A Morphogenetic Design Approach with Embedded Structural Analysis

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Kirkegaard, Poul Henning; Holst, Malene Kirstine

    2010-01-01

    The present paper explores a morphogenetic design approach with embedded structural analysis for architectural design. A material system based on a combined space truss and membrane system has been derived as a growth system with inspiration from natural growth of plants. The structural system is...

  4. A general approach to handling missing values in Procrustes analysis

    NARCIS (Netherlands)

    Albers, Casper J.; Gower, John C.

    2010-01-01

    General Procrustes analysis is concerned with transforming a set of given configuration matrices to closest agreement. This paper introduces an approach useful for handling missing values in the configuration matrices in the context of general linear transformations. Centring and/or standardisation

  5. Semiotic Approach to the Analysis of Children's Drawings

    Science.gov (United States)

    Turkcan, Burcin

    2013-01-01

    Semiotics, which is used for the analysis of a number of communication languages, helps describe the specific operational rules by determining the sub-systems included in the field it examines. Considering that art is a communication language, this approach could be used in analyzing children's products in art education. The present study aiming…

  6. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    Science.gov (United States)

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  7. An Eigencurrent Approach for the Analysis of Leaky Coaxial Cables

    NARCIS (Netherlands)

    Addamo, G.; Bekers, D.J.; Tijhuis, A.G.; Hon, B.P. de; Orta, R.; Tascone, R.

    2006-01-01

    An eigencurrent approach for the analysis of Leaky Coaxial Cables (LCX) is developed in this work. It is based on the determination of approximate eigenvalues and eigenfunctions of the integral operator of the scattering problem. Numerical results show a CPU-time saving with respect to a full-wave approach.

  8. An Object-Oriented Approach to Partial Wave Analysis

    OpenAIRE

    Cummings, John P.; Weygand, Dennis P.

    2003-01-01

    Partial Wave Analysis has traditionally been carried out using a set of tools handcrafted for each experiment. By taking an object-oriented approach, the design presented in this paper attempts to create a more generally useful, and easily extensible, environment for analyzing many different types of data.

  9. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    Science.gov (United States)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  10. An eigencurrent approach for the analysis of finite antenna arrays

    NARCIS (Netherlands)

    Bekers, D.J.; Eijndhoven, S.J.L. van; Tijhuis, A.G.

    2009-01-01

    An accurate description of typical finite-array behavior such as edge effects and array resonances is essential in the design of various types of antennas. The analysis approach proposed in this paper is essentially based on the concept of eigencurrents and is capable of describing finite-array behavior.

  11. The Process Writing Approach: A Meta-Analysis

    Science.gov (United States)

    Graham, Steve; Sandmel, Karin

    2011-01-01

    The process approach to writing instruction is one of the most popular methods for teaching writing. The authors conducted a meta-analysis of 29 experimental and quasi-experimental studies conducted with students in Grades 1-12 to examine whether process writing instruction improves the quality of students' writing and motivation to write. For students…

  12. An Approach to Scenario Analysis, Generation and Evaluation

    NARCIS (Netherlands)

    Chen, Y.; Van Zuylen, H.J.

    2014-01-01

    This article presents an operation-oriented approach to traffic management scenario generation, analysis and evaluation. We start by taking a few of the most commonly applied scenarios from a traffic control centre, analysing each component and the structure of the whole, and evaluating the impact of each component and structure.

  13. Practical approach on gas pipeline compression system availability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Sidney Pereira dos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Kurz, Rainer; Lubomirsky, Matvey [Solar Turbines, San Diego, CA (United States)

    2009-12-19

    Gas pipeline projects have traditionally been designed based on load factor and steady-state flow. This approach exposes project sponsors to sustainability risks due to potential losses of revenue and transportation contract penalties arising from pipeline capacity shortage as a consequence of compressor unit unavailability. Such unavailability should be quantified beforehand, during the design phase. This paper presents a case study and a methodology that highlight the practical benefits of applying Monte Carlo simulation to compression system availability analysis in conjunction with quantitative risk analysis and an economic feasibility study. The project's main economic variables and their impacts on the project NPV (Net Present Value) are evaluated with their respective statistical distributions to quantify risk and support decision makers in adopting mitigating measures that guarantee competitiveness while protecting project sponsors from otherwise unpredictable risks. This practical approach is compared to the load factor approach, and the results are presented and evaluated. (author)
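
    A minimal sketch of the kind of Monte Carlo availability calculation described, assuming exponential failure and repair times; the MTBF/MTTR figures and unit counts are invented, not the paper's data. Capacity shortfall occurs whenever fewer than the required number of compressor units are up.

    ```python
    # Monte Carlo estimate of the fraction of time the station meets full capacity.
    import random

    MTBF_H, MTTR_H = 6000.0, 120.0     # mean time between failures / to repair, hours (invented)
    N_UNITS, N_REQUIRED = 4, 3         # installed vs needed for full pipeline capacity
    HORIZON_H, N_RUNS = 20 * 8760.0, 2000

    def run_once(rng: random.Random) -> float:
        """Return the fraction of the horizon with at least N_REQUIRED units up."""
        t, ok_time = 0.0, 0.0
        up = [True] * N_UNITS
        next_evt = [rng.expovariate(1.0 / MTBF_H) for _ in range(N_UNITS)]  # first failures
        while t < HORIZON_H:
            i = min(range(N_UNITS), key=lambda j: next_evt[j])
            t_next = min(next_evt[i], HORIZON_H)
            if sum(up) >= N_REQUIRED:
                ok_time += t_next - t
            t = t_next
            up[i] = not up[i]          # unit fails or finishes repair
            mean = MTTR_H if not up[i] else MTBF_H
            next_evt[i] = t + rng.expovariate(1.0 / mean)
        return ok_time / HORIZON_H

    rng = random.Random(42)
    avail = [run_once(rng) for _ in range(N_RUNS)]
    print(f"system availability ~ {sum(avail) / len(avail):.4f}")
    ```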

  14. Increase in Organization Effectiveness Using Voice Analysis: The System Approach

    Directory of Open Access Journals (Sweden)

    Lina Bartkienė

    2011-04-01

    The main purpose of this article is to analyze the literature related to systems theory and to present a system for increasing organizational effectiveness using voice analysis. The concepts of the systems approach are analyzed, and the definition of the system, its components and its classification are discussed. Following the principles of systems theory, the system for increasing organizational effectiveness using voice analysis was designed. Each element is briefly discussed, i.e. processes influencing the employee, the environment, the voice analysis system, the expert system, and prime and final organizational effectiveness. In addition, the relations between these elements were identified. Article in Lithuanian

  15. EAP, Business English and Swales' approach to genre analysis

    OpenAIRE

    Revilla Vicente, Rosa

    2008-01-01

    One of the most influential genre scholars is John Swales. His 1990 book Genre Analysis is a point of reference in the field, and he is considered one of the most widely respected and cited researchers. The aim of this article is to demonstrate how John Swales' approach to genre analysis, which was originally designed to research English in academic and research settings, can also be used for the textual analysis of occupational genres. The theory he develops in his two latest books (1990, 2004)...

  16. A Case Study on Methodological Approaches to Conversation Analysis

    Institute of Scientific and Technical Information of China (English)

    LIAN Lian

    2014-01-01

    Taking a piece of social interaction as the object of study, this paper offers some basic, brief analysis of how meaning is negotiated, from both structural and functional perspectives. The underlying purpose is to provide readers with a rough but clear presentation of the assorted methods used in conversation analysis. Out of this presentation develops a possibility for language learners as well as teachers to become more aware of the differences, and also the interrelations, among these methodological approaches to conversation analysis, which may be of some relevance to teaching practice.

  17. Hot spot analysis for driving the development of hits into leads in fragment based drug discovery

    OpenAIRE

    Hall, David R.; Ngan, Chi Ho; Zerbe, Brandon S.; Kozakov, Dima; Vajda, Sandor

    2011-01-01

    Fragment based drug design (FBDD) starts with finding fragment-sized compounds that are highly ligand efficient and can serve as a core moiety for developing high affinity leads. Although the core-bound structure of a protein facilitates the construction of leads, effective design is far from straightforward. We show that protein mapping, a computational method developed to find binding hot spots and implemented as the FTMap server, provides information that complements the fragment screening...

  18. Environmental determinants of different Blood Lead Levels in children: a quantile analysis from a nationwide survey.

    OpenAIRE

    Etchevers, Anne; Le Tertre, Alain; Lucas, Jean-Paul; Bretin, Philippe; Oulhote, Youssef; Le Bot, Barbara; Glorennec, Philippe

    2015-01-01

    Background: Blood Lead Levels (BLLs) have substantially decreased in recent decades in children in France. However, further reducing exposure is a public health goal because there is no clear toxicological threshold. The identification of the environmental determinants of BLLs as well as risk factors associated with high BLLs is important to update prevention strategies. We aimed to estimate the contribution of environmental sources of lead to different BLLs in children...

  19. The Risk Factors of Child Lead Poisoning in China: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    You Li

    2016-03-01

    Background: To investigate the risk factors of child lead poisoning in China. Methods: A document retrieval was performed using MeSH (Medical Subject Headings) terms and key words. The Newcastle-Ottawa Scale (NOS) was used to assess the quality of the studies, and pooled odds ratios with 95% confidence intervals were used to identify the risk factors. We employed Review Manager 5.2 and Stata 10.0 to analyze the data. Heterogeneity was assessed by both the Chi-square and I2 tests, and publication bias was evaluated using a funnel plot and Egger's test. Results: Thirty-four articles reporting 13,587 lead-poisoned children met the inclusion criteria. Unhealthy lifestyle and behaviors, environmental pollution around the home and potential parental occupational exposure to lead were risk factors for child lead poisoning in the pooled analyses. Our assessments yielded no severe publication biases. Conclusions: Seventeen risk factors are associated with child lead poisoning and can be used to identify high-risk children. Health education and promotion campaigns should be designed to minimize or prevent child lead poisoning in China.
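
    For readers unfamiliar with the pooling step, here is a hedged sketch of an inverse-variance fixed-effect combination of study-level odds ratios with Cochran's Q and I2 heterogeneity. The three (OR, 95% CI) tuples are invented examples; the paper itself used Review Manager and Stata rather than hand-rolled code.

    ```python
    # Fixed-effect (inverse-variance) pooling of odds ratios on the log scale.
    import math

    studies = [(2.1, 1.3, 3.4), (1.6, 1.1, 2.3), (2.8, 1.5, 5.2)]  # OR, 95% CI low/high

    log_or, weights = [], []
    for or_, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log-OR recovered from the CI
        log_or.append(math.log(or_))
        weights.append(1.0 / se**2)

    pooled = sum(w * y for w, y in zip(weights, log_or)) / sum(weights)
    se_pooled = 1.0 / math.sqrt(sum(weights))
    q = sum(w * (y - pooled) ** 2 for w, y in zip(weights, log_or))  # Cochran's Q
    i2 = max(0.0, (q - (len(studies) - 1)) / q) if q > 0 else 0.0
    print(f"pooled OR = {math.exp(pooled):.2f} "
          f"(95% CI {math.exp(pooled - 1.96 * se_pooled):.2f}-"
          f"{math.exp(pooled + 1.96 * se_pooled):.2f}), I2 = {i2:.0%}")
    ```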

  20. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  1. Small molecule screening in zebrafish: an in vivo approach to identifying new chemical tools and drug leads

    Directory of Open Access Journals (Sweden)

    Patton E Elizabeth

    2010-06-01

    In the past two decades, zebrafish genetic screens have identified a wealth of mutations that have been essential to the understanding of development and disease biology. More recently, chemical screens in zebrafish have identified small molecules that can modulate specific developmental and behavioural processes. Zebrafish are a unique vertebrate system in which to study chemical genetic systems, identify drug leads, and explore new applications for known drugs. Here, we discuss some of the advantages of using zebrafish in chemical biology, and describe some important and creative examples of small molecule screening, drug discovery and target identification.

  2. Spin structure function g_1 at small x and arbitrary $Q^2$: Total resummation of leading logarithms vs Standard Approach

    CERN Document Server

    Ermolaev, B I; Troyan, S I

    2007-01-01

    The Standard Approach (SA) for describing the structure function g_1 combines the DGLAP evolution equations with standard fits for the initial parton densities. The DGLAP equations describe the region of large Q^2 and large x, so there are no theoretical grounds for exploiting them at small x. In practice, extrapolation of DGLAP into the region of large Q^2 and small x is done by complementing DGLAP with special, singular (~x^{-a}) phenomenological fits for the initial parton densities. The factors x^{-a} are wrongly believed to be of non-perturbative origin; actually, they mimic the resummation of logarithms of x and should be expelled from the fits when the resummation is accounted for. Contrary to SA, the resummation of logarithms of x is a straightforward and natural way to describe g_1 in the small-x region. This approach can be used at both large and small Q^2, where DGLAP cannot be used by definition. Confronting this approach with SA demonstrates that the singular initial parton densities and the power Q...

  3. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Directory of Open Access Journals (Sweden)

    Edi Sutoyo

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we elaborate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory.
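
    A toy sketch of the co-occurrence idea named above: each (issue, stance) pair defines a soft-set parameter mapping to the set of agents holding that stance, and the number of parameter sets two agents share approximates their alignment. The voting table is invented, and the scoring is a simplification of the paper's method.

    ```python
    # Soft-set co-occurrence of parameters as a rough alignment measure.
    from itertools import combinations

    votes = {            # agent -> stance per issue (+1 favour, 0 neutral, -1 against)
        "A": [+1, -1, 0, +1],
        "B": [+1, -1, -1, +1],
        "C": [-1, +1, 0, -1],
        "D": [-1, 0, +1, -1],
    }

    # Soft set: parameter (issue index, stance) -> set of agents holding it
    soft = {}
    for agent, stances in votes.items():
        for issue, s in enumerate(stances):
            soft.setdefault((issue, s), set()).add(agent)

    for a, b in combinations(votes, 2):
        co = sum(1 for members in soft.values() if {a, b} <= members)
        print(f"co-occurrence({a},{b}) = {co}")   # higher -> more allied
    ```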

  4. An Efficient Soft Set-Based Approach for Conflict Analysis

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we elaborate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory. PMID:26928627

  5. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we elaborate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared to rough set theory.

  6. A systematic approach to multifactorial cardiovascular disease: causal analysis.

    Science.gov (United States)

    Schwartz, Stephen M; Schwartz, Hillel T; Horvath, Steven; Schadt, Eric; Lee, Su-In

    2012-12-01

    The combination of systems biology and large data sets offers new approaches to the study of cardiovascular diseases. These new approaches are especially important for the common cardiovascular diseases that have long been described as multifactorial. This promise is undermined by biologists' skepticism of the spider web-like network diagrams required to analyze these large data sets. Although these spider webs resemble composites of the familiar biochemical pathway diagrams, the complexity of the webs is overwhelming. As a result, biologists collaborate with data analysts whose mathematical methods seem much like those of experts using Ouija boards. To make matters worse, it is not evident how to design experiments when the network implies that many molecules must be part of the disease process. Our goal is to remove some of this mystery and suggest a simple experimental approach to the design of experiments appropriate for such analysis. We will attempt to explain how combinations of data sets that include all possible variables, graphical diagrams, complementation of different data sets, and Bayesian analyses now make it possible to determine the causes of multifactorial cardiovascular disease. We will describe this approach using the term causal analysis. Finally, we will describe how causal analysis is already being used to decipher the interactions among cytokines as causes of cardiovascular disease.

  7. Study of lead accumulation in bones of Wistar rats by X-ray fluorescence analysis: aging effect.

    Science.gov (United States)

    Guimarães, Diana; Carvalho, Maria Luísa; Geraldes, Vera; Rocha, Isabel; Santos, José Paulo

    2012-01-01

    The accumulation of lead over time in several bones of Wistar rats was determined and compared across bone types. Two groups were studied: a control group (n = 20), not exposed to lead, and a contaminated group (n = 30), exposed to lead from birth, first indirectly through mother's milk and then directly through a diet containing lead acetate in drinking water (0.2%). Rats' ages ranged from 1 to 11 months, at approximately 1-month intervals, and each collection comprised 3 contaminated rats and 2 control rats. Iliac, femur, tibia-fibula and skull were analysed by Energy Dispersive X-ray Fluorescence (EDXRF). Samples of the formaldehyde used to preserve the bone tissues were also analysed by Electrothermal Atomic Absorption Spectrometry (ETAAS), showing that there was no significant loss of lead from the tissue to the preservative. Mean bone lead concentrations of exposed rats ranged from 100 to 300 μg g(-1), while those of control rats never exceeded 10 μg g(-1). Mean bone lead concentrations were compared and were higher in the iliac, femur and tibia-fibula than in the skull. However, across the different collections, only the concentrations in the skull were statistically significantly different (p < 0.05) from those in the other bones. Analysis of a radar chart also indicated that these differences tend to diminish with age. The Spearman correlation test applied to mean lead concentrations showed strong and very strong positive correlations between all bone types. This test also showed that mean lead concentrations in bones are negatively correlated with the age of the animals; this correlation is strong in the iliac and femur and very strong in the tibia-fibula and skull. It was also shown that the decrease of lead accumulation with age occurs in three plateaus of accumulation, which coincide, in all analysed bones, between the 2nd-3rd and 9th-10th months.

  8. Applications of Crown Ether Cross-Linked Chitosan for the Analysis of Lead and Cadmium in Environmental Water Samples

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A new type of crown ether cross-linked chitosan was synthesized by the reaction of chitosan with 4,4'-dibromodibenzo-18-crown-6 (Br-DBC). Its structure was characterized by FT-IR and NMR, and its adsorption behavior towards lead and cadmium in environmental water samples was studied by FAAS. In addition, the optimal analysis conditions were discussed and the adsorption mechanism explained. With an enrichment factor above 100, recoveries of 94%-106% for both metals, detection limits for lead and cadmium of 0.5 μg L-1 and 0.04 μg L-1, and relative standard deviations of 3.1% and 2.8% respectively, the new method was successfully applied to the determination of both metals in environmental water samples. The method is fast and simple, and it greatly enhances the ability of FAAS to determine lead and cadmium.

  9. A New Approach to Pointer Analysis for Assignments

    Institute of Scientific and Technical Information of China (English)

    HUANG Bo; ZANG Binyu; et al.

    2001-01-01

    Pointer analysis is a technique to identify at compile-time the potential values of the pointer expressions in a program, which promises significant benefits for optimizing and parallelizing compilers. In this paper, a new approach to pointer analysis for assignments is presented. In this approach, assignments are classified into three categories: pointer assignments, structure (union) assignments and normal assignments, which do not affect the points-to information. Pointer analyses for these three kinds of assignments respectively make up the integrated algorithm. When analyzing a pointer assignment, a new method called expression expansion is used to calculate both the left targets and the right targets. The integration of recursive data structure analysis into pointer analysis is a significant originality of this paper, which unifies the pointer analysis for heap variables and the pointer analysis for stack variables. This algorithm is implemented in Agassiz, an analyzing tool for C programs developed by the Institute of Parallel Processing, Fudan University. Its accuracy and effectiveness are illustrated by experimental data.

  10. A New Approach to Pointer Analysis for Assignments

    Institute of Scientific and Technical Information of China (English)

    HUANG Bo; ZANG Binyu; LI Jing; ZHU Chuanqi

    2001-01-01

    Pointer analysis is a technique to identify at compile-time the potential values of the pointer expressions in a program, which promises significant benefits for optimizing and parallelizing compilers. In this paper, a new approach to pointer analysis for assignments is presented. In this approach, assignments are classified into three categories: pointer assignments, structure (union) assignments and normal assignments, which do not affect the points-to information. Pointer analyses for these three kinds of assignments respectively make up the integrated algorithm. When analyzing a pointer assignment, a new method called expression expansion is used to calculate both the left targets and the right targets. The integration of recursive data structure analysis into pointer analysis is a significant originality of this paper, which unifies the pointer analysis for heap variables and the pointer analysis for stack variables. This algorithm is implemented in Agassiz, an analyzing tool for C programs developed by the Institute of Parallel Processing, Fudan University. Its accuracy and effectiveness are illustrated by experimental data.
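
    To illustrate the three assignment categories, here is a minimal Andersen-style, flow-insensitive points-to sketch over an invented mini-language; it is not the Agassiz implementation, and structure/union assignments are omitted for brevity.

    ```python
    # Toy flow-insensitive points-to analysis: only pointer assignments update
    # the points-to sets; "normal" assignments would simply be ignored.
    points_to: dict[str, set[str]] = {}

    def assign_addr(p: str, var: str):      # p = &x
        points_to.setdefault(p, set()).add(var)

    def assign_copy(p: str, q: str):        # p = q   (both pointers)
        points_to.setdefault(p, set()).update(points_to.get(q, set()))

    def assign_deref_rhs(p: str, q: str):   # p = *q
        for tgt in points_to.get(q, set()):
            points_to.setdefault(p, set()).update(points_to.get(tgt, set()))

    assign_addr("p", "x")
    assign_addr("q", "p")       # q -> p
    assign_copy("r", "p")       # r -> x
    assign_deref_rhs("s", "q")  # s -> whatever p points to, i.e. x
    print(points_to)  # {'p': {'x'}, 'q': {'p'}, 'r': {'x'}, 's': {'x'}}
    ```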

  11. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    The aim of the present paper is to introduce how to analyse the qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work, and it may be applied in various domains such as emergency services, the military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way of conducting the interviews, and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory or individual procedures tailored to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed here in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First the decision chart showing the main decision points is made, and then the incident summary. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis, and it is used for exploratory research designs.

  12. Hierarchical Cluster Analysis – Various Approaches to Data Preparation

    Directory of Open Access Journals (Sweden)

    Z. Pacáková

    2013-09-01

    The article deals with two different approaches to data preparation for avoiding multicollinearity. The aim of the article is to find similarities in the e-communication level of EU states using hierarchical cluster analysis. The original set of fourteen indicators was first reduced on the basis of correlation analysis: in cases of high correlation, the indicator with higher variability was included in the further analysis. Secondly, the data were transformed using principal component analysis, whose principal components are only weakly correlated. For further analysis, five principal components explaining about 92% of the variance were selected. Hierarchical cluster analysis was performed both on the reduced data set and on the principal component scores. Both times three clusters were assumed, following the Pseudo t-Squared and Pseudo F statistics, but the final clusters were not identical. An important characteristic for comparing the two results was the proportion of variance accounted for by the clusters, which was about ten percentage points higher for the principal component scores (57.8% compared to 47%). It can therefore be stated that when principal component scores with a high enough explained proportion of variance (about 92% in our analysis) are used as input variables for cluster analysis, the loss of information is lower compared to data reduction on the basis of correlation analysis.
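
    A short sketch of the second data-preparation route the article compares: standardise, replace the correlated indicators by principal-component scores retaining roughly 92% of variance, then cluster the scores. The data matrix below is a random stand-in for the fourteen e-communication indicators, not the study's data.

    ```python
    # PCA scores as decorrelated inputs to Ward hierarchical clustering.
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    X = rng.normal(size=(27, 14))              # 27 "EU states" x 14 indicators (fake)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise before PCA

    pca = PCA(n_components=0.92)               # keep ~92% of variance, as in the study
    scores = pca.fit_transform(X)
    print(f"{pca.n_components_} components keep "
          f"{pca.explained_variance_ratio_.sum():.0%} of variance")

    Z = linkage(scores, method="ward")                   # hierarchical clustering on scores
    labels = fcluster(Z, t=3, criterion="maxclust")      # assume three clusters
    print(labels)
    ```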

  13. Scaled Facility Design Approach for Pool-Type Lead-Bismuth Eutectic Cooled Small Modular Reactor Utilizing Natural Circulation

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangrok; Shin, Yong-Hoon; Lee, Jueun; Hwang, Il Soon [Seoul National University, Seoul (Korea, Republic of)

    2015-10-15

    In the low-carbon era, nuclear energy is the most prominent source of electricity. For a steady, eco-friendly nuclear energy supply, Generation IV reactors, the future nuclear reactors, must meet four criteria: safety, sustainability, economics and non-proliferation. The lead cooled fast reactor (LFR) is one such reactor type, and the Generation IV International Forum (GIF) adopted three reference LFR systems: a small, movable system with a long life without refueling; an intermediate-size system; and a large electricity generation system for the power grid. NUTRECK (Nuclear Transmutation Energy Center of Korea) has designed a reactor called URANUS (Ubiquitous, Rugged, Accident-forgiving, Non-proliferating, and Ultra-lasting Sustainer), a small modular reactor using lead-bismuth eutectic coolant. To prove the natural circulation capability of URANUS and to analyze design-basis accidents, a scaled mock-up experimental facility will be constructed. In this paper, basic specifications of URANUS are presented; then, based on these features, the scaling law and the scaled facility design results are presented. To validate the safety features and thermodynamic characteristics of URANUS, the scaled mock-up facility was designed based on the scaling law. The mock-up adopts two area scale factors: the core and lower parts are scaled for 3D flow experiments, while the upper parts are scaled to a different size to reduce electric power and LBE tonnage. This hybrid scaling method could distort some thermal-hydraulic parameters; however, the key parameters for the experiment will be matched for up-scaling. The detailed design of the mock-up will be determined through iteration for design optimization.

  14. Structural diversity of biologically interesting datasets: a scaffold analysis approach

    Directory of Open Access Journals (Sweden)

    Khanna Varun

    2011-08-01

    Background The recent public availability of the human metabolome and natural product datasets has revitalized "metabolite-likeness" and "natural product-likeness" as drug design concepts for designing lead libraries targeting specific pathways. Many reports have analyzed the physicochemical property space of biologically important datasets, with only a few comprehensively characterizing the scaffold diversity in public datasets of biological interest. With large collections of high quality public data currently available, we carried out a comparative analysis of current-day leads against other biologically relevant datasets. Results In this study, we note a two-fold enrichment of metabolite scaffolds in the drug dataset (42%) as compared to currently used lead libraries (23%). We also note that only a small percentage (5%) of natural product scaffold space is shared by the lead dataset. We have identified specific scaffolds that are present in metabolites and natural products, with close counterparts in the drugs, but are missing in the lead dataset. To determine the distribution of compounds in physicochemical property space, we analyzed the molecular polar surface area, the molecular solubility, the number of rings and the number of rotatable bonds, in addition to four well-known Lipinski properties. Here we note that, with only a few exceptions, most of the drugs follow Lipinski's rules. The average values of the molecular polar surface area and the molecular solubility are highest in metabolites, while the number of rings is lowest. In addition, we note that natural products contain more rings and rotatable bonds than any other dataset under consideration. Conclusions Currently used lead libraries make little use of the metabolite and natural product scaffold space. We believe that metabolites and natural products are recognized by at least one protein in the biosphere; therefore, sampling the fragment and scaffold

  15. Convergence Analysis of a Class of Computational Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Junfeng Chen

    2013-01-01

    Computational intelligence is a relatively new interdisciplinary field of research with many promising application areas. Although computational intelligence approaches have gained huge popularity, it is difficult to analyze their convergence. In this paper, a computational model is built for a class of computational intelligence approaches, represented by the canonical forms of genetic algorithms, ant colony optimization, and particle swarm optimization, in order to describe the common features of these algorithms. Two quantification indices, the variation rate and the progress rate, are then defined to indicate, respectively, the variety and the optimality of the solution sets generated in the search process of the model. Moreover, we give four types of probabilistic convergence for the solution set updating sequences, and their relations are discussed. Finally, sufficient conditions are derived for the almost sure weak convergence and the almost sure strong convergence of the model by introducing martingale theory into the Markov chain analysis.
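
    As a hedged illustration only, the sketch below runs a bare-bones particle swarm optimizer while tracking two ad-hoc proxies for the paper's indices: a "variation rate" (mean particle movement) and a "progress rate" (improvement of the best objective value). These are stand-in definitions for illustration, not the paper's formal ones.

    ```python
    # Bare-bones PSO on the sphere function, logging variation/progress proxies.
    import numpy as np

    rng = np.random.default_rng(1)
    n, d, iters = 20, 5, 50
    x = rng.uniform(-5, 5, (n, d)); v = np.zeros((n, d))
    f = lambda z: np.sum(z**2, axis=1)          # sphere objective
    pbest, pval = x.copy(), f(x)
    g = pbest[pval.argmin()].copy(); gval = pval.min()

    for t in range(iters):
        r1, r2 = rng.random((n, d)), rng.random((n, d))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x_new = x + v
        variation = np.linalg.norm(x_new - x, axis=1).mean()  # proxy "variation rate"
        x = x_new
        fx = f(x)
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        progress = gval - pval.min()                          # proxy "progress rate" (>= 0)
        gval = pval.min(); g = pbest[pval.argmin()].copy()
        if t % 10 == 0:
            print(f"t={t:2d} best={gval:.4f} variation={variation:.3f} progress={progress:.4f}")
    ```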

  16. ANALYSIS AND IMPROVEMENT OF LEAD TIME FOR JOB SHOP UNDER MIXED PRODUCTION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    CHE Jianguo; HE Zhen; EDWARB M Knod

    2006-01-01

    First, an overview is provided of the potential impact on work-in-process (WIP) and lead time when transfer lot sizes are undifferentiated from processing lot sizes. Simple performance examples are compared to those from a shop with one-piece transfer lots. Next, a mathematical programming model for minimizing lead time in the mixed-model job shop is presented, in which one-piece transfer lots are used. Key factors affecting lead time are found by analyzing the sum of the longest setup times of individual items among the shared processes (SLST) and the longest processing time of individual items among processes (LPT), and lead time can be minimized by cutting down the SLST and LPT. Reduction of the SLST is formulated as a traveling salesman problem (TSP), and the minimum SLST is found through job shop scheduling. Removing the bottleneck and leveling the production line optimize the LPT. If the number of items produced is small, the routings are relatively short, and items and facilities change infrequently, the optimal schedule will remain valid. Finally, a brief example serves to illustrate the method.
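
    The setup-sequence reduction described as a TSP can be pictured with a toy brute-force search for the production sequence that minimises total sequence-dependent setup time on a shared process; the setup-time matrix is invented, and a real instance would use a proper TSP solver rather than enumeration.

    ```python
    # Brute-force "TSP over items": minimise total sequence-dependent setup time.
    from itertools import permutations

    items = ["A", "B", "C", "D"]
    setup = {  # setup[i][j] = setup time when item j follows item i (minutes, invented)
        "A": {"B": 12, "C": 30, "D": 18},
        "B": {"A": 10, "C": 14, "D": 25},
        "C": {"A": 28, "B": 16, "D": 9},
        "D": {"A": 20, "B": 22, "C": 11},
    }

    def total_setup(seq):
        return sum(setup[a][b] for a, b in zip(seq, seq[1:]))

    best = min(permutations(items), key=total_setup)
    print(best, total_setup(best))
    ```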

  17. An Analysis of Song-Leading by Kindergarten Teachers in Taiwan and the USA

    Science.gov (United States)

    Liao, Mei-Ying; Campbell, Patricia Shehan

    2014-01-01

    The purpose of this study was to examine components of the song-leading process used by kindergarten teachers in Taiwan and the United States, including the critical matter of starting pitch. Five public school kindergarten teachers in Taipei, Taiwan, and five public kindergarten teachers in Seattle, USA, were invited to participate in this study…

  18. Method of analysis for the determination of lead and cadmium in fresh meat

    NARCIS (Netherlands)

    Ruig, de W.G.

    1980-01-01

    This report comprises the RIKILT results of an intercomparison on the determination of lead and cadmium in bovine liver and bovine kidney. The aim of this round robin was to check a wet ashing procedure followed by flame AAS determination, as also described in EEC doc. 2266/VI/77. Special attention

  19. Extending dynamic segmentation with lead generation : A latent class Markov analysis of financial product portfolios

    NARCIS (Netherlands)

    Paas, L.J.; Bijmolt, T.H.A.; Vermunt, J.K.

    2004-01-01

    A recent development in marketing research concerns the incorporation of dynamics in consumer segmentation. This paper extends the latent class Markov model, a suitable technique for conducting dynamic segmentation, in order to facilitate lead generation. We demonstrate the application of the latent class Markov model

  20. Information Retrieval and Graph Analysis Approaches for Book Recommendation.

    Science.gov (United States)

    Benkoussas, Chahinez; Bellot, Patrice

    2015-01-01

    A combination of multiple information retrieval approaches is proposed for the purpose of book recommendation. In this paper, book recommendation is based on complex user queries. We used different theoretical retrieval models, probabilistic ones such as InL2 (a Divergence from Randomness model) and a language model, and tested their interpolated combination. Graph analysis algorithms such as PageRank have been successful in Web environments. We consider the application of this algorithm in a new retrieval approach over a network of related documents built from social links. We call this network, constructed from documents and the social information provided by each of them, the Directed Graph of Documents (DGD). Specifically, this work tackles the problem of book recommendation in the context of the INEX (Initiative for the Evaluation of XML retrieval) Social Book Search track. A series of reranking experiments demonstrates that combining retrieval models yields significant improvements in terms of standard ranked retrieval metrics. These results extend the applicability of link analysis algorithms to different environments.
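
    A compact sketch of the two ingredients described: interpolating scores from two retrieval models, then reranking with PageRank over a directed graph of documents. The scores, edges, and the multiplicative fusion at the end are illustrative choices, not the paper's exact formulation.

    ```python
    # Interpolated retrieval scores combined with PageRank over a document graph.
    import networkx as nx

    retrieval_a = {"d1": 0.9, "d2": 0.4, "d3": 0.7}   # e.g. InL2 scores (invented)
    retrieval_b = {"d1": 0.5, "d2": 0.8, "d3": 0.3}   # e.g. language-model scores (invented)
    lam = 0.6
    combined = {d: lam * retrieval_a[d] + (1 - lam) * retrieval_b[d]
                for d in retrieval_a}

    # A tiny "DGD": directed edges from social links between documents (invented)
    G = nx.DiGraph([("d1", "d2"), ("d2", "d3"), ("d3", "d1"), ("d1", "d3")])
    pr = nx.pagerank(G, alpha=0.85)

    final = {d: combined[d] * pr[d] for d in combined}   # one simple fusion choice
    for d in sorted(final, key=final.get, reverse=True):
        print(d, round(final[d], 4))
    ```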

  1. Adaptive Molecular Resolution Approach in Hamiltonian Form: An Asymptotic Analysis

    CERN Document Server

    Zhu, Jinglong; Site, Luigi Delle

    2016-01-01

    Adaptive Molecular Resolution approaches in Molecular Dynamics are becoming relevant tools for the analysis of molecular liquids characterized by the interplay of different physical scales. The essential difference among these methods is the way the change of molecular resolution is made in a buffer/transition region. In particular, a central question concerns the possible existence of a global Hamiltonian which, by describing the change of resolution, is at the same time physically consistent, mathematically well defined and numerically accurate. In this paper we present an asymptotic analysis of the adaptive process, complemented by numerical results, and show that under certain mathematical conditions a Hamiltonian which is physically consistent and numerically accurate may exist. Such conditions show that molecular simulations in the current computational implementation require systems of large size, and thus a Hamiltonian approach such as the one proposed, at this stage, would not be practical...

  2. A COMPARATIVE ANALYSIS OF ASEAN CURRENCIES USING A COPULA APPROACH AND A DYNAMIC COPULA APPROACH

    Directory of Open Access Journals (Sweden)

    CHUKIAT CHAIBOONSRI

    2012-12-01

    The ASEAN Economic Community (AEC) is being shaped into a single market and production base in 2015, moving towards regional economic integration. These developments in international financial markets can lead to adverse costs for AEC-country borrowers. The specific objective is to investigate the dependence measures and the co-movement among selected ASEAN currencies. A copula approach was used to examine dependence measures between the Thai baht exchange rate and selected ASEAN currencies during the period 2008-2011. A dynamic copula approach was also used to investigate the co-movement of the Thai baht exchange rate with the selected ASEAN currencies during the same period. The results based on the Pearson linear correlation coefficient confirmed that the Thai baht exchange rate and each of the selected ASEAN currencies, excluding the Vietnam exchange rate, were linearly correlated during the period. Furthermore, based on the empirical copula approach, the Thai baht exchange rate had a dependence structure with each of the selected ASEAN currencies: the Brunei, Singapore, Malaysia, Indonesia, Philippine and Vietnam exchange rates respectively. The results of the dynamic copula estimation indicated that the Thai baht exchange rate co-moved with the selected ASEAN currencies. The research results provide informative insight into the ASEAN financial market for all users, including the global financial market.
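
    One standard step in such a copula study can be sketched as follows: estimate Kendall's tau between two return series and convert it to the implied Gaussian-copula correlation via rho = sin(pi*tau/2). The "returns" below are simulated stand-ins for THB and SGD series, not market data.

    ```python
    # Rank dependence -> Gaussian-copula parameter (rho = sin(pi * tau / 2)).
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(7)
    z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=1000)
    thb, sgd = z[:, 0], z[:, 1]          # stand-ins for THB and SGD returns

    tau, _ = kendalltau(thb, sgd)
    rho = np.sin(np.pi * tau / 2)        # Gaussian-copula correlation implied by tau
    print(f"Kendall tau = {tau:.3f}, implied copula rho = {rho:.3f}")
    ```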

  3. Environmental determinants of different blood lead levels in children: a quantile analysis from a nationwide survey.

    Science.gov (United States)

    Etchevers, Anne; Le Tertre, Alain; Lucas, Jean-Paul; Bretin, Philippe; Oulhote, Youssef; Le Bot, Barbara; Glorennec, Philippe

    2015-01-01

    Blood lead levels (BLLs) have substantially decreased in recent decades in children in France. However, further reducing exposure is a public health goal because there is no clear toxicological threshold. The identification of the environmental determinants of BLLs as well as risk factors associated with high BLLs is important to update prevention strategies. We aimed to estimate the contribution of environmental sources of lead to different BLLs in children in France. We enrolled 484 children aged from 6 months to 6 years in a nationwide cross-sectional survey in 2008-2009. We measured lead concentrations in blood and environmental samples (water, soils, household settled dust, paints, cosmetics and traditional cookware). We fitted two models: a multivariate generalized additive model on the geometric mean (GM), and a quantile regression model on the 10th, 25th, 50th, 75th and 90th quantiles of BLLs. The GM of BLLs was 13.8 μg/L (=1.38 μg/dL) (95% confidence interval (CI): 12.7-14.9) and the 90th quantile was 25.7 μg/L (CI: 24.2-29.5). Household and common area dust, tap water, interior paint, ceramic cookware, traditional cosmetics, playground soil and dust, and environmental tobacco smoke were associated with the GM of BLLs. Household dust and tap water made the largest contributions to both the GM and the 90th quantile of BLLs. The concentration of lead in dust was positively correlated with all quantiles of BLLs, even at low concentrations. Lead concentrations in tap water above 5 μg/L were also positively correlated with the GM, 75th and 90th quantiles of BLLs in children drinking tap water. Preventive actions must target household settled dust and tap water to reduce the BLLs of children in France. The use of traditional cosmetics should be avoided, whereas ceramic cookware should be limited to decorative purposes.
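
    A hedged sketch of the quantile-regression model class used here, with statsmodels; the data frame is simulated stand-in data (dust and tap-water lead predicting blood lead), not the survey data, and the covariate set is much smaller than the study's.

    ```python
    # Quantile regression of blood lead on two environmental predictors.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 484
    df = pd.DataFrame({
        "dust": rng.lognormal(1.0, 0.8, n),    # settled-dust lead loading (fake)
        "water": rng.lognormal(0.5, 0.7, n),   # tap-water lead, ug/L (fake)
    })
    df["bll"] = 10 + 0.8 * df.dust + 0.5 * df.water + rng.normal(0, 3, n)

    for q in (0.10, 0.50, 0.90):               # lower, median, and upper quantiles
        fit = smf.quantreg("bll ~ dust + water", df).fit(q=q)
        print(f"q={q:.2f}: dust={fit.params['dust']:.2f}, water={fit.params['water']:.2f}")
    ```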

  4. A CONTENT ANALYSIS ON PROBLEM-BASED LEARNING APPROACH

    OpenAIRE

    BİBER, Mahir; Esen ERSOY; KÖSE BİBER, Sezer

    2014-01-01

    Problem Based Learning is one of the learning models that embody the general principles of active learning and in which students can use scientific process skills. This research aimed to investigate in detail the postgraduate theses on the PBL approach produced in Turkey. The content analysis method was used in the research. The study sample consisted of a total of 64 master's and PhD theses completed between 2012 and 2013 and retrieved via the web. A "Content Analysis Template" prepared ...

  5. A CONTENT ANALYSIS ON PROBLEM-BASED LEARNING APPROACH

    OpenAIRE

    BİBER, Mahir; Esen ERSOY; KÖSE BİBER, Sezer

    2015-01-01

    Problem Based Learning is one of the learning models that embody the general principles of active learning and in which students can use scientific process skills. This research aimed to investigate in detail the postgraduate theses on the PBL approach produced in Turkey. The content analysis method was used in the research. The study sample consisted of a total of 64 master's and PhD theses completed between 2012 and 2013 and retrieved via the web. A "Content Analysis Template" prepared ...

  6. Scientific and methodical approaches to analysis of enterprise development potential

    Directory of Open Access Journals (Sweden)

    Hrechina Iryna V.

    2014-01-01

    The modern state of the Ukrainian economy urges enterprises to search for new possibilities for their development, which makes the subject of this study topical. The article systematizes existing approaches to the analysis of enterprise development potential and marks out two main scientific approaches: the first is directed at analysis of the prospects of self-development of the economic system; the second at analysis of the probability of growth opportunities. In order to increase the quality of the process of forming methods for analysing the potential of enterprise development, the article offers an organizational model of such methods and characterizes its main elements. It develops methods of analysis based on indicators of potentialogical sustainability. The scientific novelty of the obtained results lies in the possibility of identifying the main directions of enterprise development with the use of the enterprise development potential ratio: self-development or the probability of augmenting opportunities, which is traced through the interconnection of resources and profit.

  7. Spectral Synthesis via Mean Field approach to Independent Component Analysis

    Science.gov (United States)

    Hu, Ning; Su, Shan-Shan; Kong, Xu

    2016-03-01

    We apply a new statistical analysis technique, the Mean Field approach to Independent Component Analysis (MF-ICA) in a Bayesian framework, to galaxy spectral analysis. This algorithm can compress a stellar spectral library into a few Independent Components (ICs), and the galaxy spectrum can be reconstructed from these ICs. Compared to other algorithms, which decompose a galaxy spectrum into a combination of several simple stellar populations, the MF-ICA approach offers a large improvement in efficiency. To check the reliability of this spectral analysis method, three different tests are used: (1) parameter recovery for simulated galaxies, (2) comparison with parameters estimated by other methods, and (3) a consistency test of parameters derived for galaxies from the Sloan Digital Sky Survey. We find that our MF-ICA method can not only fit the observed galaxy spectra efficiently, but can also accurately recover the physical parameters of galaxies. We also apply our spectral analysis method to the DEEP2 spectroscopic data, and find it provides excellent fitting results for low signal-to-noise spectra.
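
    The paper's MF-ICA is a mean-field Bayesian ICA with no stock scikit-learn implementation, so the sketch below uses FastICA purely as a stand-in to show the compress-then-reconstruct workflow on fake spectra; the wavelength grid, "library", and mixture weights are all invented.

    ```python
    # Compress a toy spectral library into ICs, then reconstruct a "galaxy".
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    wave = np.linspace(3800, 9200, 2000)                 # wavelength grid (Angstrom)
    bases = np.array([np.exp(-0.5 * ((wave - c) / w) ** 2)
                      for c, w in [(4300, 400), (5200, 700), (6800, 900)]])
    # toy "stellar library": random mixtures of the three bases plus noise
    library = rng.dirichlet(np.ones(3), size=40) @ bases
    library += 0.005 * rng.normal(size=library.shape)

    ica = FastICA(n_components=3, random_state=0, max_iter=1000)
    ica.fit(library)                              # compress the library into 3 ICs

    galaxy = np.array([0.5, 0.3, 0.2]) @ bases    # a fake observed spectrum
    coeffs = ica.transform(galaxy[None, :])       # project onto the ICs
    recon = ica.inverse_transform(coeffs)[0]      # reconstruct from the ICs
    print("max reconstruction error:", float(np.abs(recon - galaxy).max()))
    ```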

  8. Total-System Approach To Design And Analysis Of Structures

    Science.gov (United States)

    Verderaime, V.

    1995-01-01

    Paper presents overview and study of, and comprehensive approach to, multidisciplinary engineering design and analysis of structures. Emphasizes issues related to design of semistatic structures for environments in which spacecraft are launched, with underlying concepts applicable to other structures within unique terrestrial, marine, or flight environments. Purpose of study is to understand interactions among traditionally separate engineering design disciplines with view toward optimizing not only structure but also overall design process.

  9. Data Analysis A Model Comparison Approach, Second Edition

    CERN Document Server

    Judd, Charles M; Ryan, Carey S

    2008-01-01

    This completely rewritten classic text features many new examples, insights and topics, including mediational, categorical, and multilevel models. Substantially reorganized, this edition provides a briefer, more streamlined examination of data analysis. Noted for its model-comparison approach and unified framework based on the general linear model, the book provides readers with a greater understanding of a variety of statistical procedures. This consistent framework, including consistent vocabulary and notation, is used throughout to develop fewer but more powerful model building techniques.

  10. Performance Analysis of STFT Based Timing Approach to OFDM Systems

    Institute of Scientific and Technical Information of China (English)

    KUANG Yu-jun; TENG Yong; YIN Chang-chuan; HAO Jian-jun; YUE Guang-xin

    2003-01-01

    This paper focuses on performance analysis of the previously proposed STFT-based 2-D timing approach to OFDM systems and presents simulation results on its performance in AWGN and multipath fading environments, as well as its robustness against the duration of the Channel Impulse Response (CIR) and frequency offset. Simulation results suggest that a revised version of the Short-Time Fourier Transform (STFT) can be used to greatly reduce computational complexity, especially at higher SNR.

  11. Generating function approach to reliability analysis of structural systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The generating function approach is an important tool for performance assessment of multi-state systems. Aiming at strength reliability analysis of structural systems, the generating function approach is introduced and developed here. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which describe the probability distributions of strength (resistance), stress (load) and fatigue life, and by defining composite operators of generating functions and the corresponding performance structure functions. When composition operators are executed, computational costs can be reduced by a large margin by collecting like terms. The results of theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures, owing to its unified form, compact expression, computer realizability and high universality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and act as a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
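
    A minimal sketch in the universal-generating-function style referenced here: each component's performance distribution is encoded as a {performance: probability} map, composition applies min (series) or sum (parallel) while collecting like terms, exactly the cost-saving step the abstract mentions. The capacity values are invented.

    ```python
    # Universal generating function (UGF) composition with like-term collection.
    from collections import defaultdict

    def compose(u1, u2, op):
        out = defaultdict(float)
        for g1, p1 in u1.items():
            for g2, p2 in u2.items():
                out[op(g1, g2)] += p1 * p2       # collecting like terms here
        return dict(out)

    pump_a = {0: 0.05, 50: 0.15, 100: 0.80}      # capacity distribution of unit A
    pump_b = {0: 0.10, 100: 0.90}                # capacity distribution of unit B
    valve  = {0: 0.02, 100: 0.98}

    parallel = compose(pump_a, pump_b, lambda a, b: a + b)   # parallel: capacities add
    system = compose(parallel, valve, min)                   # series: bottleneck
    demand = 100
    print("P(system capacity >= demand) =",
          sum(p for g, p in system.items() if g >= demand))
    ```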

  12. A Unified Approach for Process Quality Analysis and Control

    Directory of Open Access Journals (Sweden)

    Prof. Chandrakanth Biradar, Aruna Kawdi

    2012-09-01

    Full Text Available Abstract: The processes in a company ultimately result in the company's products, which represent its standard. Hence, the quality of a process needs to be attended to before and after the work is done. In this paper, a unified approach to quality analysis and control of process development is presented. This approach gives an overview of the tasks the company assigns to its employees. A process is defined as the set of action items required to complete the work; quality means grade of excellence, and quality analysis of a process means improving the process and ensuring that all standard procedures are followed. The unified approach designed in this paper combines software cost estimation and financial market forecasting, supported by historical data, statistical data-mining techniques and artificial neural networks, which helps developers as well as investors in strategic planning and investment decision making. The paper therefore describes a recommended process by which software managers can develop software (SW) cost estimates and perform financial market forecasting to control the quality of process development. As a result, process quality can be analyzed and improved from the basic level up to the corporate level. We conclude that process quality control can be made easier and more efficient than with older graphical analytics techniques.

  13. Further results on the L1 analysis of sampled-data systems via kernel approximation approach

    Science.gov (United States)

    Kim, Jung Hoon; Hagiwara, Tomomichi

    2016-08-01

    This paper gives two methods for the L1 analysis of sampled-data systems, by which we mean computing the L∞-induced norm of sampled-data systems. This is achieved by developing what we call the kernel approximation approach in the setting of sampled-data systems. We first consider the lifting treatment of sampled-data systems and give an operator-theoretic representation of their input/output relation. We further apply the fast-lifting technique, by which the sampling interval [0, h) is divided into M subintervals of equal width, and provide methods for computing the L∞-induced norm. In contrast to a similar approach developed earlier, called the input approximation approach, we use an idea of kernel approximation, in which the kernel function of an input operator and the hold function of an output operator are approximated by piecewise-constant or piecewise-linear functions. Furthermore, it is shown that the approximation errors in the piecewise-constant and piecewise-linear approximation schemes converge to 0 at the rates of 1/M and 1/M^2, respectively. In comparison with the existing input approximation approach, in which the input function (rather than the kernel function) of the input operator is approximated by piecewise-constant or piecewise-linear functions, we show that the kernel approximation approach gives improved computation results. More precisely, even though the convergence rates in the kernel approximation approach remain qualitatively the same as those in the input approximation approach, the new approach can lead to quantitatively smaller approximation errors than the input approximation approach, particularly when the piecewise-linear approximation scheme is taken. Finally, a numerical example is given to demonstrate the effectiveness of the kernel approximation approach with this scheme.
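
    A minimal numeric sketch of the two approximation orders discussed above (using a generic kernel function, not the paper's operators): the piecewise-constant approximation error shrinks like 1/M, the piecewise-linear one like 1/M^2.

    ```python
    import numpy as np

    h = 1.0
    f = lambda th: np.exp(-3.0 * th)      # stand-in kernel function on [0, h)

    def linf_error(M, order):
        """Max error of a piecewise approximation on M equal subintervals."""
        edges = np.linspace(0.0, h, M + 1)
        th = np.linspace(0.0, h, 20001)
        idx = np.minimum((th / (h / M)).astype(int), M - 1)
        if order == 0:   # piecewise constant (left-endpoint value)
            approx = f(edges[idx])
        else:            # piecewise linear interpolation between endpoints
            t0, t1 = edges[idx], edges[idx + 1]
            w = (th - t0) / (t1 - t0)
            approx = (1 - w) * f(t0) + w * f(t1)
        return np.max(np.abs(f(th) - approx))

    for M in (8, 16, 32, 64):
        print(M, linf_error(M, 0), linf_error(M, 1))
    # Doubling M cuts the order-0 error ~2x (rate 1/M) and the
    # order-1 error ~4x (rate 1/M^2).
    ```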

  14. Comparison of Machine Learning Approaches on Arabic Twitter Sentiment Analysis

    Directory of Open Access Journals (Sweden)

    Merfat.M. Altawaier

    2016-12-01

    Full Text Available With the dramatic expansion of information over the internet, users around the world express their opinions daily on social networks such as Facebook and Twitter. Large corporations now invest in analyzing these opinions in order to assess their products or services through people's feedback. The process of determining whether users' opinions toward a particular product or service are positive or negative is called sentiment analysis. Arabic is one of the common languages that have been addressed in sentiment analysis. In the literature, several approaches have been proposed for Arabic sentiment analysis, most of them using machine learning techniques. Machine learning techniques are varied and have different performances. Therefore, in this study, we aim to identify a simple but workable approach for Arabic sentiment analysis on Twitter. Hence, this study investigates machine learning techniques for Arabic sentiment analysis on Twitter. Three techniques have been used: Naïve Bayes, Decision Tree (DT) and Support Vector Machine (SVM). In addition, two simple pre-processing sub-tasks were also used: Term Frequency-Inverse Document Frequency (TF-IDF) and Arabic stemming, to obtain the heaviest-weight terms as features for tweet classification. TF-IDF aims to identify the most discriminative words, whereas stemming aims to retrieve the stem of a word by removing its inflectional derivations. The dataset used is the Modern Arabic Corpus, which consists of Arabic tweets. Classification performance was evaluated with the information retrieval metrics precision, recall and f-measure. The experimental results show that DT outperformed the other techniques, obtaining an f-measure of 78%.
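
    As a generic sketch of this kind of comparison (the scikit-learn names are real; the miniature English dataset is invented, and Arabic stemming is omitted), TF-IDF features feed the three classifiers and mean f-measure is compared:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import LinearSVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    # Tiny invented dataset standing in for labeled tweets (1 = positive).
    texts = ["great service", "awful product", "love it", "worst ever",
             "really happy", "very disappointed", "excellent", "terrible"] * 10
    labels = [1, 0, 1, 0, 1, 0, 1, 0] * 10

    for name, clf in [("NB", MultinomialNB()),
                      ("DT", DecisionTreeClassifier(random_state=0)),
                      ("SVM", LinearSVC())]:
        pipe = make_pipeline(TfidfVectorizer(), clf)
        f1 = cross_val_score(pipe, texts, labels, cv=5, scoring="f1").mean()
        print(f"{name}: mean f-measure {f1:.2f}")
    ```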

  15. A new approach to the long-term reconstruction of the solar irradiance leads to large historical solar forcing

    Science.gov (United States)

    Shapiro, A. I.; Schmutz, W.; Rozanov, E.; Schoell, M.; Haberreiter, M.; Shapiro, A. V.; Nyeki, S.

    2011-05-01

    Context. The variable Sun is the most likely candidate for the natural forcing of past climate changes on time scales of 50 to 1000 years. Evidence for this understanding is that the terrestrial climate correlates positively with solar activity. During the past 10 000 years, the Sun has experienced substantial variations in activity, and there have been numerous attempts to reconstruct solar irradiance. While there is general agreement on how solar forcing varied during the last several hundred years - all reconstructions are proportional to the solar activity - there is scientific controversy over the magnitude of solar forcing. Aims: We present a reconstruction of the total and spectral solar irradiance covering 130 nm-10 μm from 1610 to the present with annual resolution, and for the Holocene with 22-year resolution. Methods: We assume that the minimum state of the quiet Sun in time corresponds to the observed quietest area on the present Sun. We then use available long-term proxies of solar activity, namely 10Be isotope concentrations in ice cores and 22-year smoothed neutron monitor data, to interpolate between the present quiet Sun and the minimum state of the quiet Sun. This determines the long-term trend in the solar variability, which is then superposed with the 11-year activity cycle calculated from the sunspot number. The time-dependent solar spectral irradiance from about 7000 BC to the present is then derived using a state-of-the-art radiation code. Results: We derive a total and spectral solar irradiance that was substantially lower during the Maunder minimum than that observed today. The difference is considerably larger than other estimates published in the recent literature. The magnitude of the solar UV variability, which indirectly affects the climate, is also found to exceed previous estimates. We discuss in detail the assumptions that lead us to this conclusion. The Appendix is only available in electronic form at http://www.aanda.org
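
    A schematic sketch of the superposition described in the Methods (all numbers here are synthetic placeholders; the actual reconstruction uses calibrated proxies and a radiation code): a proxy-scaled long-term trend between a minimum quiet-Sun state and the present quiet Sun, plus an 11-year cycle term.

    ```python
    import numpy as np

    # Illustrative irradiance states [W/m^2]; placeholder values, not the
    # paper's results.
    S_min_quiet = 1359.0       # assumed minimum state of the quiet Sun
    S_present_quiet = 1361.0   # assumed present quiet Sun

    years = np.arange(1610, 2011)
    # Invented activity proxy normalized to [0, 1] (stands in for 10Be /
    # neutron-monitor composites).
    rng = np.random.default_rng(1)
    proxy = np.clip(0.5 + 0.3 * np.sin(2 * np.pi * (years - 1610) / 200.0)
                    + 0.1 * rng.standard_normal(years.size), 0.0, 1.0)

    # Long-term trend: interpolate between the two quiet-Sun states.
    trend = S_min_quiet + (S_present_quiet - S_min_quiet) * proxy

    # Superpose a toy 11-year activity cycle (amplitude is a placeholder).
    tsi = trend + 0.25 * (1 + np.sin(2 * np.pi * (years - 1610) / 11.0))
    print(tsi[:5])
    ```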

  16. Extending dynamic segmentation with lead generation: A latent class Markov analysis of financial product portfolios

    OpenAIRE

    Paas, L.J.; Bijmolt, T.H.A.; Vermunt, J.K.

    2004-01-01

    A recent development in marketing research concerns the incorporation of dynamics in consumer segmentation. This paper extends the latent class Markov model, a suitable technique for conducting dynamic segmentation, in order to facilitate lead generation. We demonstrate the application of the latent Markov model for these purposes using a database containing information on the ownership of twelve financial products and demographics for explaining (changes in) consumer product portfolios. Data we...

  17. Numerical Analysis of Wind Turbine Airfoil Aerodynamic Performance with Leading Edge Bump

    Directory of Open Access Journals (Sweden)

    Majid Asli

    2015-01-01

    Full Text Available Aerodynamic performance improvement of wind turbine blades is a key step toward improving wind turbine electricity generation and energy conversion among renewable energy sources. The flow behavior over wind turbine blade profiles, and related phenomena such as stall, can be improved by certain modifications. In the present paper, leading edge protuberances modeled on humpback whale flippers were investigated as a novel passive stall control method on the thick S809 airfoil. The airfoil was numerically analyzed by a CFD method at a Reynolds number of 10^6, and aerodynamic coefficients at static angles of attack were validated against the experimental data reported by Somers at NREL. Computational results for the modified airfoil with a sinusoidal wavy leading edge were then presented. The results revealed that, at low angles of attack before the stall region, the lift coefficient decreases slightly relative to the baseline model. However, the modified airfoil exhibits a smooth stall trend, while the baseline airfoil's lift coefficient drops sharply due to separation on the suction side. According to the flow physics over the airfoils, the leading edge bumps act as vortex generators: vortices carrying high momentum keep the flow attached to the airfoil surface at high angles of attack and prevent a deep stall.
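
    For intuition about the geometry (a hypothetical parameterization for illustration, not the paper's): a sinusoidal wavy leading edge is commonly described by an amplitude and a spanwise wavelength modulating the local chord.

    ```python
    import numpy as np

    # Hypothetical tubercle parameterization: local chord varies
    # sinusoidally along the span (amplitude A, wavelength lam).
    c0 = 1.0          # baseline chord [m]
    A = 0.05 * c0     # protuberance amplitude (illustrative)
    lam = 0.25 * c0   # spanwise wavelength (illustrative)
    span = np.linspace(0.0, 1.0, 201)

    c_local = c0 + A * np.sin(2 * np.pi * span / lam)
    print(c_local.min(), c_local.max())  # chord oscillates in [c0-A, c0+A]
    ```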

  18. Thermal-structural analysis of the platelet heat-pipe-cooled leading edge of hypersonic vehicle

    Science.gov (United States)

    Hongpeng, Liu; Weiqiang, Liu

    2016-10-01

    One of the main challenges for a hypersonic vehicle is its thermal protection, more specifically, the cooling of its leading edge. To investigate the feasibility of a platelet heat-pipe-cooled leading edge structure, thermal/stress distributions for steady-state flight conditions are calculated numerically. Studies are carried out for IN718/Na, C-103/Na and T-111/Li compatible material combinations of the heat pipe under nominal operation and central-heat-pipe-failure cases, and the influence of wall thickness on the design robustness is also investigated. The heat transfer limits (the sonic limit, the capillary limit and the boiling limit) are also computed to check the operation of the platelet heat pipes. The results indicate that, with a 15 mm leading edge radius and a wall thickness of 0.5 mm, the C-103/Na and T-111/Li heat pipe combinations are capable of withstanding both nominal and failure conditions for Mach 8 and Mach 10 flight, respectively.

  19. Application of six sigma and AHP in analysis of variable lead time calibration process instrumentation

    Science.gov (United States)

    Rimantho, Dino; Rahman, Tomy Abdul; Cahyadi, Bambang; Tina Hernawati, S.

    2017-02-01

    Calibration of instrumentation equipment in the pharmaceutical industry is an important activity to determine the true value of a measurement. Preliminary studies indicated that long calibration lead times resulted in disruption of production and laboratory activities. This study aimed to analyze the causes of calibration lead time. Several methods were used: Six Sigma, to determine the capability of the equipment calibration process; and brainstorming, Pareto diagrams, and fishbone diagrams, to identify and analyze the problems. The Analytical Hierarchy Process (AHP) method was then used to create a hierarchical structure and prioritize the problems. The results showed a DPMO value of around 40769.23, equivalent to a sigma level of approximately 3.24σ in equipment calibration, indicating the need for improvements in the calibration process. Strategies for reducing calibration lead time were then determined, such as shortening the preventive maintenance schedule, increasing the number of calibrator instruments, and training personnel. Consistency tests on the complete pairwise-comparison matrices showed consistency ratios (CR) below 0.1.
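
    The DPMO-to-sigma conversion quoted above can be reproduced with the conventional 1.5σ long-term shift (a standard Six Sigma convention, assumed here to be the one the authors used):

    ```python
    from scipy.stats import norm

    dpmo = 40769.23
    # Long-term sigma level with the conventional 1.5-sigma shift:
    sigma_level = norm.ppf(1 - dpmo / 1e6) + 1.5
    print(f"{sigma_level:.2f} sigma")   # ~3.24, matching the reported value
    ```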

  20. Lesson from Tungsten Leading Edge Heat Load Analysis in KSTAR Divertor

    Science.gov (United States)

    Hong, Suk-Ho; Pitts, Richard Anthony; Lee, Hyeong-Ho; Bang, Eunnam; Kang, Chan-Soo; Kim, Kyung-Min; Kim, Hong-Tack; ITER Organization Collaboration; KSTAR Team

    2016-10-01

    An important design issue for the ITER tungsten (W) divertor, and in fact for all such components using metallic plasma-facing elements exposed to high parallel power fluxes, is the question of surface shaping to avoid melting of leading edges. We have fabricated a series of tungsten blocks with a variety of leading edge heights (0.3, 0.6, 1.0, and 2.0 mm), from the ITER worst case to heights even beyond the extreme value tested on JET. They are mounted into adjacent, inertially cooled graphite tiles installed in the central divertor region of KSTAR, within the field of view of an infra-red (IR) thermography system with a spatial resolution of up to 0.4 mm/pixel. Adjustment of the outer divertor strike point position is used to deposit power on different blocks in different discharges. The measured power flux density on flat regions of the surrounding graphite tiles is used to obtain the parallel power flux q|| impinging on the various W blocks. Experiments have been performed in Type I ELMing H-mode with Ip = 600 kA, BT = 2 T, PNBI = 3.5 MW, leading to a hot attached divertor with typical pulse lengths of 10 s. Three-dimensional ANSYS simulations using q|| and assuming geometric projection of the heat flux are found to be consistent with the observed edge loading. This research was partially supported by the Ministry of Science, ICT, and Future Planning under the KSTAR project.
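
    The "geometric projection of the heat flux" mentioned above is, in its simplest form, the parallel flux times the sine of the angle between the field line and the surface, while an exposed leading edge facing the field line receives nearly the full parallel flux. The toy numbers below are illustrative, not KSTAR values.

    ```python
    import numpy as np

    q_par = 50.0e6        # parallel power flux [W/m^2] (illustrative)
    alpha_deg = 3.0       # field-line incidence angle on the top surface
    edge_height = 1.0e-3  # exposed leading-edge height [m] (one test block)

    # Wetted top surface receives the shallow-angle projection:
    q_top = q_par * np.sin(np.radians(alpha_deg))

    # The exposed leading edge is struck near-normally, so it sees
    # roughly the full parallel flux over its exposed height:
    q_edge = q_par * np.cos(np.radians(alpha_deg))
    print(f"top: {q_top / 1e6:.2f} MW/m^2, edge: {q_edge / 1e6:.1f} MW/m^2")
    ```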

  1. Cadmium, lead, mercury and arsenic in animal feed and feed materials - trend analysis of monitoring results.

    Science.gov (United States)

    Adamse, Paulien; Van der Fels-Klerx, H J Ine; de Jong, Jacob

    2017-03-02

    This study aimed to obtain insights into the presence of cadmium, lead, mercury and arsenic in feed materials and feed over time, for the purpose of guiding national monitoring. Data from the Dutch feed monitoring program and from representatives of the feed industry in the period 2007-2013 were used. Data covered the concentrations of cadmium, lead, mercury and arsenic in a variety of feed materials and compound feeds in The Netherlands. Trends in the percentage of samples that exceeded the maximum limit (ML), set by the European Commission, and trends in average, median and 90th percentile concentrations of each of these elements per feed material or compound feed were investigated. Based on the results, monitoring for cadmium, lead, mercury and arsenic should focus on feed material of mineral origin, feed material of marine origin, especially fish meal, seaweed and algae, as well as feed additives belonging to the functional groups of (i) trace elements (notably cupric sulphate, zinc oxide and manganese oxide for arsenic) and (ii) binders and anti-caking agents. Mycotoxin binders are a new group of feed additives that also need attention. For complementary feed it is important to make a proper distinction between mineral and non-mineral feed because the ML in the latter group is usually lower. In seaweed/algae products a relatively large number of samples contained arsenic concentrations that exceeded the ML. Forage crops in general do not need high priority in monitoring programs, although for arsenic, grass meal still needs attention.
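
    A sketch of the kind of trend summary such monitoring implies (pandas is real; the DataFrame columns `year`, `element`, `conc`, `ml` are hypothetical names invented for the example):

    ```python
    import pandas as pd

    # Hypothetical monitoring records: one row per sample.
    df = pd.DataFrame({
        "year":    [2007, 2007, 2008, 2008, 2009, 2009],
        "element": ["Pb", "Cd", "Pb", "Cd", "Pb", "Cd"],
        "conc":    [4.2, 0.3, 6.1, 0.2, 3.8, 0.5],   # mg/kg
        "ml":      [5.0, 0.5, 5.0, 0.5, 5.0, 0.5],   # maximum limit
    })

    summary = (df.assign(exceeds=df["conc"] > df["ml"])
                 .groupby(["element", "year"])
                 .agg(median_conc=("conc", "median"),
                      p90=("conc", lambda s: s.quantile(0.9)),
                      pct_exceeding=("exceeds", "mean")))
    print(summary)
    ```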

  2. Causality Analysis: Identifying the Leading Element in a Coupled Dynamical System

    Science.gov (United States)

    BozorgMagham, Amir E.; Motesharrei, Safa; Penny, Stephen G.; Kalnay, Eugenia

    2015-01-01

    Physical systems with time-varying internal couplings are abundant in nature. While the full governing equations of these systems are typically unknown due to insufficient understanding of their internal mechanisms, there is often interest in determining the leading element. Here, the leading element is defined as the sub-system with the largest coupling coefficient averaged over a selected time span. Previously, the Convergent Cross Mapping (CCM) method has been employed to determine causality and the dominant component in weakly coupled systems with constant coupling coefficients. In this study, CCM is applied to a pair of coupled Lorenz systems with time-varying coupling coefficients, exhibiting switching between dominant sub-systems in different periods. Four sets of numerical experiments are carried out. The first three cases consist of different coupling coefficient schemes: I) Periodic-constant, II) Normal, and III) Mixed Normal/Non-normal. In case IV, the numerical experiments of cases II and III are repeated with imposed temporal uncertainties as well as additive normal noise. Our results show that, by detecting directional interactions, CCM identifies the leading sub-system in all cases except when the average coupling coefficients are approximately equal, i.e., when the dominant sub-system is not well defined. PMID:26125157
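
    A minimal sketch of the kind of test system used (generic coupled Lorenz equations with a time-varying, one-directional coupling; the classic Lorenz parameters are real, while the coupling schedule and strength are invented for illustration):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def coupled_lorenz(t, u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x1, y1, z1, x2, y2, z2 = u
        c = 8.0 if t < 25.0 else 0.0       # time-varying coupling, 1 -> 2
        return [sigma * (y1 - x1),
                x1 * (rho - z1) - y1,
                x1 * y1 - beta * z1,
                sigma * (y2 - x2) + c * (x1 - x2),  # sub-system 2 driven by 1
                x2 * (rho - z2) - y2,
                x2 * y2 - beta * z2]

    sol = solve_ivp(coupled_lorenz, (0.0, 50.0),
                    [1.0, 1.0, 1.0, -1.0, 2.0, 0.5],
                    t_eval=np.linspace(0.0, 50.0, 5001), rtol=1e-8)
    x1, x2 = sol.y[0], sol.y[3]
    # While coupled (t < 25), sub-system 2 tends to track sub-system 1;
    # after the coupling switches off, they decorrelate.
    print(np.corrcoef(x1[:2500], x2[:2500])[0, 1],
          np.corrcoef(x1[2500:], x2[2500:])[0, 1])
    ```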

  3. LEADAT: a MATLAB-based program for lead-210 data analysis of sediment cores

    Institute of Scientific and Technical Information of China (English)

    LU Xueqiang; MATSUMOTO Eiji

    2006-01-01

    The program described herein (LEADAT) serves to calculate sediment dates and sedimentation rates using the 210Pb method for studies of environmental processes and pollution history on a time scale of 100-150 a. The program, written in MATLAB (Version 7.0), permits the user to select among the principal models of the 210Pb method, i.e., the constant fluxes of sediment and lead-210 (CFSL) model, the constant flux of lead-210 (CFL) model, the constant initial concentration of lead-210 (CICL) model and the two-layer mixing (TLM) model. Although appropriate model selection is essentially based on an understanding of the sedimentary processes, the pattern of the excess 210Pb profile is also helpful for model selection. The excess 210Pb profiles of two sediment cores, collected from a brackish lake and an embayment respectively, are used to demonstrate the application of the program. With a graphical user interface, the program can be easily executed. Both ASCII and graphical output can be generated by the program, and the code can be easily modified for extension.
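
    The simplest of the models named above, constant initial concentration (CIC), dates a layer directly from the radioactive decay law. A minimal sketch (the 22.3-year half-life of 210Pb is standard; the activity profile below is invented):

    ```python
    import numpy as np

    LAMBDA = np.log(2.0) / 22.3          # 210Pb decay constant [1/yr]

    # Invented excess-210Pb activities [Bq/kg] at increasing depth:
    surface_activity = 120.0
    activity = np.array([120.0, 85.0, 60.0, 42.0, 30.0])
    depth_cm = np.array([0.0, 5.0, 10.0, 15.0, 20.0])

    # CIC model: age t(z) = ln(A0 / A(z)) / lambda
    age = np.log(surface_activity / activity) / LAMBDA
    print(age)                            # years before coring

    # Mean sedimentation rate from a linear fit of depth against age:
    rate = np.polyfit(age[1:], depth_cm[1:], 1)[0]
    print(f"{rate:.2f} cm/yr")
    ```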

  4. Synthetic and DFT studies towards a unified approach to phlegmarine alkaloids: aza-Michael intramolecular processes leading to 5-oxodecahydroquinolines.

    Science.gov (United States)

    Bradshaw, Ben; Luque-Corredera, Carlos; Saborit, Gisela; Cativiela, Carlos; Dorel, Ruth; Bo, Carles; Bonjoch, Josep

    2013-10-04

    A diastereoselective synthesis of cis-5-oxodecahydroquinolines is described in which three stereocenters are generated in a one-pot reaction. The reaction involves a lithium hydroxide-promoted Robinson annulation/intramolecular aza-Michael domino process from an achiral acyclic tosylamine-tethered β-keto ester. The development and scope of this reaction were facilitated by DFT-based mechanistic studies, which enabled the observed diastereodivergent course of the azacyclization to be rationalized. The varying stereochemistry and stability of the resulting decahydroquinolines were found to depend on whether a β-keto ester or a ketone was embedded in the substrates undergoing aminocyclization. This synthetic approach gave access not only to both diastereomeric cis-decahydroquinolines from the same precursor, but also to the corresponding trans isomers, through an epimerization process of the corresponding N-unsubstituted cis-5-oxodecahydroquinolines. The described methodology provides advanced building blocks with the three relative stereochemistries required for the total synthesis of phlegmarine alkaloids.

  5. International Lead Zinc Research Organization-sponsored field-data collection and analysis to determine relationships between service conditions and reliability of valve-regulated lead-acid batteries in stationary applications

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, P.A. [Energetics, Columbia, MD (United States); Moseley, P.T. [International Lead Zinc Research Organization, Research Triangle Park, NC (United States); Butler, P.C. [Sandia National Labs., Albuquerque, NM (United States)

    1999-03-01

    The International Lead Zinc Research Organization (ILZRO), in cooperation with Sandia National Laboratories, has initiated a multi-phase project with the following aims: to characterize relationships between valve-regulated lead-acid (VRLA) batteries, service conditions, and failure modes; to establish the degree of correlation between specific operating procedures and premature capacity loss (PCL); to identify operating procedures that mitigate PCL; to identify best fits between the operating requirements of specific applications and the capabilities of specific VRLA technologies; and to recommend combinations of battery design, manufacturing processes, and operating conditions that enhance VRLA performance and reliability. In the first phase of this project, ILZRO has contracted with Energetics to identify and survey manufacturers and users of VRLA batteries for stationary applications (including electric utilities, telecommunications companies, and government facilities). The confidential survey is collecting the service conditions of specific applications and performance records for specific VRLA technologies. From the data collected, Energetics is constructing a database of the service histories and analyzing the data to determine trends in performance for particular technologies in specific service conditions. ILZRO plans to make the final report of the analysis and a version of the database (containing no proprietary information) available to ILZRO members, participants in the survey, and participants in a follow-on workshop for stakeholders in VRLA reliability. This paper presents the surveys distributed to manufacturers and end-users, discusses the analytic approach, presents an overview of the responses to the surveys and trends that have emerged in the early analysis of the data, and previews the functionality of the database being constructed. (orig.)

  6. Stability analysis for natural slope by kinematical approach

    Institute of Scientific and Technical Information of China (English)

    孙志彬; 覃长兵

    2014-01-01

    The stability of a natural slope was analyzed on the basis of limit analysis. The sliding model of a kind of natural slope was presented, and a new kinematically admissible velocity field for the new sliding model was constructed. Once the external work rate and internal energy dissipation are evaluated, the stability factor formulation from the upper bound theorem leads to a classical nonlinear programming problem, for which the constraint conditions are given. The upper bound optimization problem can be solved efficiently by applying a nonlinear SQP algorithm, and the stability factor obtained agrees well with previous results.
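
    For the flavor of the final optimization step (a toy objective and constraint standing in for the actual stability-factor formulation; SciPy's SLSQP is a standard SQP implementation):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy upper-bound problem: minimize a stability factor N(a, b) over two
    # failure-mechanism angles, subject to a kinematic admissibility
    # constraint b - a >= 0.1 (the functional form is invented).
    def stability_factor(x):
        a, b = x
        return (np.tan(a) + 1.0) / (np.sin(b) * np.cos(a))

    constraints = [{"type": "ineq", "fun": lambda x: x[1] - x[0] - 0.1}]
    bounds = [(0.1, 1.2), (0.2, 1.4)]

    res = minimize(stability_factor, x0=[0.3, 0.8], method="SLSQP",
                   bounds=bounds, constraints=constraints)
    print(res.x, res.fun)    # minimizing angles and the upper-bound value
    ```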

  7. A constructive approach for discovering new drug leads: Using a kernel methodology for the inverse-QSAR problem

    Directory of Open Access Journals (Sweden)

    Wong William WL

    2009-04-01

    Full Text Available Abstract Background The inverse-QSAR problem seeks to find a new molecular descriptor from which one can recover the structure of a molecule that possesses a desired activity or property. Surprisingly, there are very few papers providing solutions to this problem. It is a difficult problem because the molecular descriptors involved with the inverse-QSAR algorithm must adequately address the forward QSAR problem for a given biological activity if the subsequent recovery phase is to be meaningful. In addition, one should be able to construct a feasible molecule from such a descriptor. The difficulty of recovering the molecule from its descriptor is the major limitation of most inverse-QSAR methods. Results In this paper, we describe the reversibility of our previously reported descriptor, the vector space model molecular descriptor (VSMMD), based on a vector space model that is suitable for kernel studies in QSAR modeling. Our inverse-QSAR approach can be described in five steps: (1) generate the VSMMD for the compounds in the training set; (2) map the VSMMD in the input space to the kernel feature space using an appropriate kernel function; (3) design or generate a new point in the kernel feature space using a kernel feature space algorithm; (4) map the feature space point back to the input space of descriptors using a pre-image approximation algorithm; (5) build the molecular structure template using our VSMMD molecule recovery algorithm. Conclusion The empirical results reported in this paper show that our strategy of using kernel methodology for an inverse Quantitative Structure-Activity Relationship is sufficiently powerful to find a meaningful solution for practical problems.
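
    Steps (2)-(4) have a compact analogue in scikit-learn (the API is real; the descriptor data is a random stand-in, and this is not the authors' VSMMD pipeline): map descriptors to a kernel feature space, construct a new point there, and approximate its pre-image back in descriptor space.

    ```python
    import numpy as np
    from sklearn.decomposition import KernelPCA

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 16))        # stand-in molecular descriptors

    # (2) map descriptors into a kernel feature space (RBF kernel),
    # learning an inverse map for pre-image approximation.
    kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1,
                     fit_inverse_transform=True)
    Z = kpca.fit_transform(X)

    # (3) design a new point in feature space, e.g. midway between two actives:
    z_new = 0.5 * (Z[0] + Z[1])

    # (4) approximate its pre-image in the input descriptor space:
    x_new = kpca.inverse_transform(z_new.reshape(1, -1))
    print(x_new.shape)                        # (1, 16): a candidate descriptor
    ```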

  8. A QSAR approach for virtual screening of lead-like molecules en route to antitumor and antibiotic drugs from marine and microbial natural products

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-05-01

    Figure 1. The 15 unreported lead antibiotic MNPs and MbNPs from the AntiMarin database, selected using the best RFs antibiotic model with a probability of being antibiotic greater than or equal to 0.8. Figure 2. The 4 selected lead antitumor MNPs and MbNPs from the AntiMarin database, selected using the best RFs antitumor model with a probability of being antitumor greater than or equal to 0.8. The present work corroborates, on the one hand, the results of our previous work [6] and enables the presentation of a new set of possible lead-like bioactive compounds. Additionally, it shows the usefulness of quantum-chemical descriptors in discriminating biologically active from inactive compounds. The use of the εHOMO quantum-chemical descriptor to discriminate large-scale data sets of lead-like or drug-like compounds has not previously been reported. This approach greatly reduces the number of compounds used in real screens, and it reinforces the results of our previous work. Furthermore, beyond virtual screening, computational methods can be very useful for building appropriate databases, allowing effective shortcuts in the dereplication of NP extracts, which will certainly increase the efficiency of drug discovery.
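
    The screening criterion, keeping compounds whose predicted probability of activity is ≥ 0.8 under a random-forest model, looks like this in outline (the scikit-learn API is real; descriptors and labels are random placeholders, not the AntiMarin data):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    X_train = rng.standard_normal((200, 8))      # placeholder descriptors
    y_train = rng.integers(0, 2, 200)            # placeholder activity labels
    X_library = rng.standard_normal((1000, 8))   # compounds to screen

    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(X_train, y_train)

    # Keep library compounds with P(active) >= 0.8:
    p_active = rf.predict_proba(X_library)[:, 1]
    hits = np.flatnonzero(p_active >= 0.8)
    print(len(hits), "candidate leads")
    ```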

  9. A global sensitivity analysis approach for morphogenesis models

    KAUST Repository

    Boas, Sonja E. M.

    2015-11-21

    Background Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operative mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
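
    A generic global sensitivity workflow of this kind can be sketched with the SALib package (a real library; the three-parameter test model and parameter names below are invented, not the cellular Potts model):

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["adhesion", "chemotaxis", "elasticity"],  # invented names
        "bounds": [[0.0, 1.0], [0.0, 2.0], [0.5, 1.5]],
    }

    # Sobol' sampling and a stand-in scalar model output:
    X = saltelli.sample(problem, 1024)
    Y = X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 0] * X[:, 2]

    Si = sobol.analyze(problem, Y)
    print(Si["S1"])   # first-order effects of single parameters
    print(Si["ST"])   # total effects, including interactions
    ```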

  10. Sensitivity analysis approach to multibody systems described by natural coordinates

    Science.gov (United States)

    Li, Xiufeng; Wang, Yabin

    2014-03-01

    The classical natural coordinate modeling method, which removes the Euler angles and Euler parameters from the governing equations, is particularly suitable for the sensitivity analysis and optimization of multibody systems. However, the formulation imposes so many rules for choosing the generalized coordinates that it hinders the automation of modeling. A first-order direct sensitivity analysis approach to multibody systems formulated with novel natural coordinates is presented. First, a new selection method for natural coordinates is developed; the method introduces 12 coordinates to describe the position and orientation of a spatial object. On the basis of the proposed natural coordinates, rigid constraint conditions, the basic constraint elements, and the initial conditions for the governing equations are derived. Considering the characteristics of the governing equations, the newly proposed generalized-α integration method is used, and the corresponding algorithm flowchart is discussed. The objective function, the detailed procedure of first-order direct sensitivity analysis, and the related solution strategy are provided based on the preceding modeling system. Finally, to verify the validity and accuracy of the method presented, sensitivity analyses of a planar spinner-slider mechanism and a spatial crank-slider mechanism are conducted. The test results agree well with those of the finite difference method, with a maximum absolute deviation of less than 3%. The proposed approach is not only convenient for automatic modeling, but also helpful in reducing the complexity of sensitivity analysis, providing a practical and effective way to obtain sensitivities for the optimization of multibody systems.
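
    The essence of first-order direct sensitivity analysis, and its finite-difference check, can be shown on a scalar ODE (a toy stand-in for the multibody governing equations): differentiate the equation with respect to the parameter and integrate the state and its sensitivity together.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    p = 2.0  # parameter of the toy system x' = -p*x, x(0) = 1

    def augmented(t, u):
        x, s = u                    # s = dx/dp, the direct sensitivity
        return [-p * x,             # state equation
                -x - p * s]         # sensitivity equation: d/dp of the RHS

    sol = solve_ivp(augmented, (0.0, 1.0), [1.0, 0.0], rtol=1e-10)
    s_direct = sol.y[1, -1]

    # Central finite-difference check:
    def x_end(pv):
        return solve_ivp(lambda t, x: [-pv * x[0]], (0.0, 1.0), [1.0],
                         rtol=1e-10).y[0, -1]
    h = 1e-5
    s_fd = (x_end(p + h) - x_end(p - h)) / (2 * h)

    print(s_direct, s_fd, -np.exp(-p))  # analytic: dx/dp = -t*e^{-pt} at t=1
    ```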

  11. Analysis of electrolyte abnormalities and the mechanisms leading to arrhythmias in heart failure. A literature review.

    Science.gov (United States)

    Urso, C; Canino, B; Brucculeri, S; Firenze, A; Caimi, G

    2016-01-01

    About 50% of deaths from heart failure (HF) are sudden, presumably referable to arrhythmias. Electrolyte and acid-base abnormalities are a frequent and potentially dangerous complication in HF patients. Their incidence almost always correlates with the severity of cardiac dysfunction; furthermore, by leading to arrhythmias, these imbalances are associated with a poor prognosis. The frequency of ventricular ectopic beats and sudden cardiac death correlates with both plasma and whole-body levels of potassium, especially in alkalemia. Early recognition of these alterations and knowledge of the underlying pathophysiological mechanisms are useful for the management of these HF patients.

  12. Numerical Analysis of Lead-Bismuth-Water Direct Contact Boiling Heat Transfer

    Science.gov (United States)

    Yamada, Yumi; Takahashi, Minoru

    Direct contact boiling heat transfer of sub-cooled water with lead-bismuth eutectic (Pb-Bi) was investigated to evaluate the performance of steam generation by direct contact of feed water with the primary Pb-Bi coolant in the upper plenum above the core of a Pb-Bi-cooled direct contact boiling water fast reactor. An analytical two-fluid model was developed to estimate the heat transfer numerically, and the numerical results were compared with experimental ones for verification of the model. The overall volumetric heat transfer coefficient was calculated from the heat exchange rate in the chimney. It was confirmed that the calculated results agreed well with the experimental ones.

  13. Genotypic and environmental variation in cadmium, chromium, lead and copper in rice and approaches for reducing the accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Fangbin; Wang, Runfeng [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China); Cheng, Wangda [Jiaxing Academy of Agricultural Sciences, Jiaxing 314016 (China); Zeng, Fanrong; Ahmed, Imrul Mosaddek; Hu, Xinna; Zhang, Guoping [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China); Wu, Feibo, E-mail: wufeibo@zju.edu.cn [Institute of Crop Science, Department of Agronomy, College of Agriculture and Biotechnology, Zijingang Campus, Zhejiang University, Hangzhou 310058 (China)

    2014-10-15

    The field-scale trials revealed significant genotypic and environmental differences in grain heavy metal (HM) concentrations among 158 newly developed rice varieties grown in twelve locations of Zhejiang province, China. Grain Pb and Cd contents in 5.3% and 0.4% of samples, respectively, were above the maximum permissible concentration (MPC); no samples had Cr/Cu exceeding the MPC. Stepwise multiple linear regression analysis estimated soil HM critical levels for safe rice production. Low grain HM accumulation cultivars such as Xiushui817, Jiayou08-1 and Chunyou689 were recommended as suitable cultivars for planting in slightly/moderately HM-contaminated soils. An alleviating regulator (AR) of (NH₄)₂SO₄ as N fertilizer, coupled with a foliar spray of a mixture containing glutathione (GSH), Si, Zn and Se, significantly decreased grain Cd, Cr, Cu and Pb concentrations in HM-contaminated fields with no effect on yield, indicating a promising measure for further reducing grain HM content to guarantee safe food production. - Highlights: • Field trials evaluated the status of grain HM in the main rice growing areas of Zhejiang. • A forecasting index system to predict rice grain HM concentration was achieved. • Hybrid rice holds higher grain Cd concentration than conventional cultivars. • Low grain HM accumulation rice cultivars were successfully identified. • An alleviating regulator was developed which effectively reduced grain toxic HM.

  14. Results analysis of blood lead, urine lead and zinc protoporphyrin determination in workers exposed to lead

    Institute of Scientific and Technical Information of China (English)

    盛红艳

    2012-01-01

    [Objective] To compare blood lead, urine lead, erythrocyte zinc protoporphyrin and blood routine parameters between workers with and without lead exposure, and to explore the correlation between erythrocyte zinc protoporphyrin and blood and urine lead levels. [Methods] A total of 252 lead-exposed workers from a storage battery manufacturer were selected as the lead exposure group, and another 205 workers without lead exposure from an electronics factory as the control group. Blood lead level, urinary lead level, erythrocyte zinc protoporphyrin and blood routine examinations were performed, and the results were statistically analyzed. [Results] Hemoglobin content of the lead exposure group was significantly lower than that of the control group (P<0.05), while leukocytes, erythrocytes and blood platelets did not differ significantly (P>0.05). Erythrocyte zinc protoporphyrin, blood lead and urinary lead concentrations of the lead exposure group were significantly higher than those of the control group (P<0.01). The correlation coefficient between urine lead and erythrocyte zinc protoporphyrin was r=0.166, and that between blood lead and erythrocyte zinc protoporphyrin was r=0.406. [Conclusion] Long-term lead exposure can increase blood lead and urine lead concentrations and erythrocyte zinc protoporphyrin and decrease hemoglobin, with no obvious effect on leukocytes, erythrocytes or platelets; blood lead correlates better with erythrocyte zinc protoporphyrin than urine lead does.

  15. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Directory of Open Access Journals (Sweden)

    Xi Yu

    2014-01-01

    Full Text Available Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts through the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and its results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need for an approach that can reflect the relationship between design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.

  16. Space Shuttle Orbiter Wing-Leading-Edge Panel Thermo-Mechanical Analysis for Entry Conditions

    Science.gov (United States)

    Knight, Norman F., Jr.; Song, Kyongchan; Raju, Ivatury S.

    2010-01-01

    Linear elastic, thermo-mechanical stress analyses of the Space Shuttle Orbiter wing-leading-edge panels are presented for entry heating conditions. The wing-leading-edge panels are made from reinforced carbon-carbon and serve as part of the overall thermal protection system. Three-dimensional finite element models are described for three configurations: an integrated configuration, an independent single-panel configuration, and a local lower-apex joggle segment. Entry temperature conditions are imposed and the through-the-thickness response is examined. From the integrated model, it was concluded that individual panels can be analyzed independently, since minimal interaction between adjacent components occurred. From the independent single-panel model, it was concluded that increased through-the-thickness stress levels developed all along the chord of a panel's slip-side joggle region, and hence isolated local joggle sections will exhibit the same trend. From the local joggle models, it was concluded that two-dimensional plane-strain models can be used to study the influence of subsurface defects along the slip-side joggle region of these panels.

  17. In Silico Analysis of Newly Identified Potential Drug Lead Compound against VP40 for the Treatment of Ebola Virus Infection

    Directory of Open Access Journals (Sweden)

    Biswadip Bandyopadhyay

    2016-11-01

    Full Text Available Ebola Virus (EBOV), also referred to as Zaire Ebola Virus, a member of the filoviridae family, is a single-stranded, filamentous, enveloped virus of the order Mononegavirales. It causes acute hemorrhagic fever that is naturally resistant to various antibiotics. After the outbreak of Ebola virus, CADD (Computer Aided Drug Discovery) became necessary, as the classical model of drug discovery takes a long time to find a target protein and a potential lead compound; computational techniques have made this much easier. The matrix protein of Ebola, VP40, whose virulent activity and functions in pathogenesis are established, is affirmed as a potential drug target. To inhibit Ebola infection, CADD and molecular docking approaches are effective tools for discovering new drug leads against such sporadic targets. In this study, lead compounds were identified that match the drug-likeness criteria of Lipinski's rule of five for different crystalline structures of the target receptor protein. The discovery of such drug lead molecules, which inhibit those protein targets, may contribute to successful treatment of multidrug-resistant Ebola virus infection.
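
    Lipinski's rule of five is straightforward to check computationally; a sketch with RDKit (a real cheminformatics library; aspirin is used here as an arbitrary example molecule, not one of the study's candidates):

    ```python
    from rdkit import Chem
    from rdkit.Chem import Descriptors

    mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin, as an example

    # Lipinski's rule of five: MW <= 500, logP <= 5,
    # H-bond donors <= 5, H-bond acceptors <= 10.
    passes = (Descriptors.MolWt(mol) <= 500
              and Descriptors.MolLogP(mol) <= 5
              and Descriptors.NumHDonors(mol) <= 5
              and Descriptors.NumHAcceptors(mol) <= 10)
    print("drug-like by Ro5:", passes)
    ```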

  18. Occupational exposure and biological evaluation of lead in Iranian workers-a systematic review and meta-analysis

    Directory of Open Access Journals (Sweden)

    Kourosh Sayehmiri

    2016-09-01

    Full Text Available Introduction: Lead exposure is considered a global health problem. The irreparably harmful effects of this heavy metal on humans have been proven in various studies. Compared to the general population, workers in related industries are more exposed to lead. Several studies have investigated occupational lead exposure and its biological evaluation in Iran; however, there is no overall estimate. Thus, the present study was conducted to determine occupational exposure to lead and its biological evaluation in Iranian workers, using systematic review and meta-analysis. Material and Method: This study was carried out based on information obtained from databases including Magiran, Iranmedex, SID, Medlib, Trials Register, Scopus, Pubmed, Science Direct, Cochran, Embase, Medline, Web of Science, Springer, Online Library Wiley, and Google Scholar from 1991 to 2016, using standard key words. All of the reviewed papers which met the inclusion criteria were evaluated. Data combination was performed according to a random effects model using Stata software version 11.1. Result: In the 34 qualified studies, the mean blood lead level (BLL) in Iranian workers was estimated at 42.8 µg/dl (95% CI: 35.15-50.49). The minimum and maximum BLLs belonged to the western (28.348 µg/dl) and central (45.928 µg/dl) regions of Iran, respectively. Considering different occupations, the lowest mean value was reported in textile industry workers (12.3 µg/dl), while the highest was in zinc-lead mine workers (72.6 µg/dl). The mean airborne lead level for Iranian workers, reported in 4 studies, was estimated at 0.23 mg/m3 (95% CI: 0.14-0.33). Conclusion: Given the high concentrations of lead in blood and breathing air, increased protective measures and frequent screening are recommended; scheduled clinical and paraclinical examinations should also be performed for workers.
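
    The random-effects pooling behind such an estimate is commonly the DerSimonian-Laird method; a compact sketch (the per-study means and standard errors below are invented, not the review's data):

    ```python
    import numpy as np

    # Invented per-study mean BLLs [ug/dl] and their standard errors:
    y = np.array([35.0, 48.0, 41.0, 52.0, 38.0])
    se = np.array([3.0, 4.0, 2.5, 5.0, 3.5])

    # DerSimonian-Laird between-study variance tau^2:
    w = 1.0 / se**2
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
    tau2 = max(0.0, (q - (len(y) - 1))
               / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects weights, pooled estimate and 95% CI:
    w_star = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    ci = 1.96 / np.sqrt(np.sum(w_star))
    print(f"pooled mean {pooled:.1f} "
          f"(95% CI {pooled - ci:.1f}-{pooled + ci:.1f})")
    ```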

  19. Stability Analysis of a Model of Atherogenesis: An Energy Estimate Approach

    Directory of Open Access Journals (Sweden)

    A. I. Ibragimov

    2008-01-01

    Full Text Available Atherosclerosis is a disease of the vasculature that is characterized by chronic inflammation and the accumulation of lipids and apoptotic cells in the walls of large arteries. This disease results in plaque growth in an infected artery typically leading to occlusion of the artery. Atherosclerosis is the leading cause of human mortality in the US, much of Europe, and parts of Asia. In a previous work, we introduced a mathematical model of the biochemical aspects of the disease, in particular the inflammatory response of macrophages in the presence of chemoattractants and modified low density lipoproteins. Herein, we consider the onset of a lesion as resulting from an instability in an equilibrium configuration of cells and chemical species. We derive an appropriate norm by taking an energy estimate approach and present stability criteria. A bio-physical analysis of the mathematical results is presented.

  20. Data analysis with the DIANA meta-scheduling approach

    CERN Document Server

    Anjum, A; Willers, I

    2008-01-01

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing the fault-tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data-intensive bulk scheduling, is network aware, and follows a policy-centric approach to meta-scheduling. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present quality-of-service related statistics for physics analysis through the application of a policy-centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads and is dynamic and adapts to the volatile nature of t...

  1. Terminal Performance of Lead Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video

    Science.gov (United States)

    2016-04-04

    Terminal Performance of Lead-Free Pistol Bullets in Ballistic Gelatin Using Retarding Force Analysis from High Speed Video. ELIJAH COURTNEY, AMY ... quantified using high speed video. The temporary stretch cavities and permanent wound cavities are also characterized. Two factors tend to reduce the ... which reduces muzzle velocity and energy, and thus reduces the ability of the bullet to exert damaging forces in tissue simulant. Second, the lower ...

  2. Bioinformatics approaches to single-cell analysis in developmental biology.

    Science.gov (United States)

    Yalcin, Dicle; Hakguder, Zeynep M; Otu, Hasan H

    2016-03-01

    Individual cells within the same population show various degrees of heterogeneity, which may be better handled with single-cell analysis to address biological and clinical questions. Single-cell analysis is especially important in developmental biology, as subtle spatial and temporal differences in cells have significant associations with cell fate decisions during differentiation and with the description of a particular state of a cell exhibiting an aberrant phenotype. Biotechnological advances, especially in the area of microfluidics, have led to robust, massively parallel and multi-dimensional capturing, sorting, and lysis of single cells and amplification of related macromolecules, which have enabled the use of imaging and omics techniques on single cells. There have been improvements in computational single-cell image analysis in developmental biology regarding feature extraction, segmentation, image enhancement and machine learning, handling limitations of optical resolution to gain new perspectives from the raw microscopy images. Omics approaches, such as transcriptomics, genomics and epigenomics, targeting gene and small RNA expression, single nucleotide and structural variations, and methylation and histone modifications, rely heavily on high-throughput sequencing technologies. Although there are well-established bioinformatics methods for the analysis of sequence data, there are few bioinformatics approaches that address experimental design, sample size considerations, amplification bias, normalization, differential expression, coverage, clustering and classification issues specifically at the single-cell level. In this review, we summarize biological and technological advancements, discuss challenges faced in the aforementioned data acquisition and analysis issues, and present future prospects for the application of single-cell analyses to developmental biology.

  3. Intelligent Systems Approaches to Product Sound Quality Analysis

    Science.gov (United States)

    Pietila, Glenn M.

    As a product market becomes more competitive, consumers become more discriminating in the way they differentiate between engineered products. The consumer often makes a purchasing decision based on the sound emitted from the product during operation, using the sound to judge quality or annoyance. Therefore, in recent years, many sound quality analysis tools have been developed to evaluate consumer preference as it relates to a product's sound and to quantify this preference based on objective measurements. This understanding can be used to direct a product design process in order to help differentiate the product from competitive products or to establish an impression on consumers regarding a product's quality or robustness. The sound quality process is typically a statistical tool used to model subjective preference, or merit score, based on objective measurements, or metrics. In this way, new product developments can be evaluated objectively, without the laborious process of gathering a sample population of consumers for subjective studies each time. The most common model used today is Multiple Linear Regression (MLR), although recently non-linear Artificial Neural Network (ANN) approaches have been gaining popularity. This dissertation reviews the publicly available literature and presents additional intelligent systems approaches that can be used to improve on the current sound quality process. The focus of this work is to address shortcomings in the current paired-comparison approach to sound quality analysis. This research proposes a framework for an adaptive jury analysis approach as an alternative to the current Bradley-Terry model. The adaptive jury framework uses statistical hypothesis testing to focus on the sound pairings that are most informative, and it is expected to address some of the restrictions imposed by the Bradley-Terry model. It will also provide a more amenable framework for an intelligent systems approach.

  4. Analysis of a Mathematical Model of Emerging Infectious Disease Leading to Amphibian Decline

    Directory of Open Access Journals (Sweden)

    Muhammad Dur-e-Ahmad

    2014-01-01

    Full Text Available We formulate a three-dimensional deterministic model of an amphibian larvae population to investigate the cause of extinction due to infectious disease. The larvae population of the model is subdivided into two classes, exposed and unexposed, depending on their vulnerability to disease. The reproduction ratio ℛ0 has been calculated, and we have shown that the fate of the infection depends on whether ℛ0 exceeds unity. For ℛ0 > 1, we discussed different scenarios under which an infected population can survive or be eliminated, using stability and persistence analysis. Finally, we also used Hopf bifurcation analysis to study the stability of periodic solutions.

  5. Chemoinformatic Analysis as a Tool for Prioritization of Trypanocidal Marine Derived Lead Compounds

    Directory of Open Access Journals (Sweden)

    Yunjiang Feng

    2014-03-01

    Full Text Available Marine trypanocidal natural products are most often reported with trypanocidal activity and selectivity against human cell lines. The triaging of hits requires a consideration of chemical tractability for drug development. We utilized a combination of Lipinski's rule-of-five, chemical clustering and ChemGPS-NP principal component analysis to analyze a set of 40 antitrypanosomal natural products for their drug-like properties and chemical space. The analyses identified 16 chemical clusters, with 11 well positioned within drug-like chemical space. This study demonstrated that our combined analysis can be used as an important strategy for prioritization of active marine natural products for further investigation.

  6. Analysis of cadmium, nickel, and lead in commercial moist and dry snuff used in Pakistan.

    Science.gov (United States)

    Kazi, Tasneem Gul; Arain, Sadaf Sadia; Afridi, Hassan Imran; Naeemullah; Brahman, Kapil Dev; Kolachi, Nida Fatima; Mughal, Moina Akhtar

    2013-06-01

    The extent to which smokeless tobacco endangers human health is an ongoing subject of debate. In this study, concentrations of the toxic metals cadmium (Cd), lead (Pb), and nickel (Ni) were measured in different snuff products (dry brown and black, and moist green and brown) available and consumed in Pakistan. Concentrations of Cd, Pb, and Ni were determined in 23 samples of various brands of snuff by electrothermal atomic absorption spectrometry after microwave-assisted acid digestion. The reliability of the methodology was assured by analyzing certified reference material. The resulting data on toxic metals in different snuff products are comparable to existing information, with limited exceptions. It was estimated that a 10 g intake of different types of snuff could contribute 14-68, 17-47, and 20-73% of the provisional maximum tolerable daily intake for Cd, Ni, and Pb, respectively.

  7. A ZEUS next-to-leading-order QCD analysis of data on deep inelastic scattering

    CERN Document Server

    Chekanov, S; Adamczyk, L; Adamus, M; Adler, V; Aghuzumtsyan, G; Allfrey, P D; Antonioli, P; Antonov, A; Arneodo, M; Bailey, D S; Bamberger, A; Barakbaev, A N; Barbagli, G; Barbi, M; Bari, G; Barreiro, F; Bartsch, D; Basile, M; Behrens, U; Bell, M A; Bellagamba, L; Bellan, P M; Benen, A; Bertolin, A; Bhadra, S; Bloch, I; Bold, T; Boos, E G; Borras, K; Boscherini, D; Brock, I; Brook, N H; Brugnera, R; Brümmer, N; Bruni, A; Bruni, G; Bussey, P J; Butterworth, J M; Büttner, C; Bylsma, B; Caldwell, A; Capua, M; Cara Romeo, G; Carli, T; Carlin, R; Cassel, D G; Catterall, C D; Chwastowski, J; Abramowicz, H; Ciborowski, J; Ciesielski, R; Cifarelli, Luisa; Cindolo, F; Cole, J E; Collins-Tooth, C; Contin, A; Cooper-Sarkar, A M; Coppola, N; Corradi, M; Corriveau, F; Costa, M; Cottrell, A; Cui, Y; D'Agostini, G; Dal Corso, F; Danilov, P; De Pasquale, S; Dementiev, R K; Derrick, M; Devenish, R C E; Dhawan, S; Dobur, D; Dolgoshein, B A; Doyle, A T; Drews, G; Durkin, L S; Dusini, S; Eisenberg, Y; Ermolov, P F; Eskreys, Andrzej; Everett, A; Ferrando, J; Ferrero, M I; Figiel, J; Foster, B; Foudas, C; Fourletov, S; Fourletova, J; Fry, C; Gabareen, A; Galas, A; Gallo, E; Garfagnini, A; Geiser, A; Genta, C; Gialas, I; Giusti, P; Gladilin, L K; Gladkov, D; Glasman, C; Göbel, F; Goers, S; Goncalo, R; González, O; Gosau, T; Göttlicher, P; Grabowska-Bold, I; Graciani-Díaz, R; Grigorescu, G; Grijpink, S; Groys, M; Grzelak, G; Gutsche, O; Gwenlan, C; Haas, T; Hain, W; Hall-Wilton, R; Hamatsu, R; Hamilton, J; Hanlon, S; Hart, C; Hartmann, H; Hartner, G; Heaphy, E A; Heath, G P; Helbich, M; Hilger, E; Hochman, D; Holm, U; Horn, C; Iacobucci, G; Iga, Y; Irrgang, P; Jakob, H P; Jiménez, M; Jones, T W; Kagawa, S; Kahle, B; Kaji, H; Kananov, S; Karshon, U; Karstens, F; Kasemann, M; Kataoka, M; Katkov, I I; Kcira, D; Keramidas, A; Khein, L A; Kim, J Y; Kind, O; Kisielewska, D; Kitamura, S; Koffeman, E; Kohno, T; Kooijman, P; Koop, T; Korzhavina, I A; Kotanski, A; Kötz, U; Kowal, A M; Kowalski, H; Kramberger, G; Kreisel, A; Krumnack, N; Kulinski, P; Kuze, M; Kuzmin, V A; Labarga, L; Lammers, S; Lelas, D; Levchenko, B B; Levy, A; Li, L; Lightwood, M S; Lim, H; Limentani, S; Ling, T Y; Liu, C; Liu, X; Löhr, B; Lohrmann, E; Loizides, J H; Long, K R; Longhin, A; Lukasik, J; Lukina, O Yu; Luzniak, P; Ma, K J; Maddox, E; Magill, S; Malka, J; Mankel, R; Margotti, A; Marini, G; Martin, J F; Martínez, M; Mastroberardino, A; Matsuzawa, K; Mattingly, M C K; Melzer-Pellmann, I A; Menary, S R; Metlica, F; Meyer, U; Miglioranzi, S; Milite, M; Mirea, A; Monaco, V; Montanari, A; Musgrave, B; Nagano, K; Namsoo, T; Nania, R; Nguyen, C N; Nigro, A; Ning, Y; Noor, U; Notz, D; Nowak, R J; Nuncio-Quiroz, A E; Oh, B Y; Olkiewicz, K; Ota, O; Padhi, S; Palmonari, F; Patel, S; Paul, E; Pavel, Usan; Pawlak, J M; Pelfer, P G; Pellegrino, A; Pesci, A; Piotrzkowski, K; Plamondon, M; Plucinsky, P P; Pokrovskiy, N S; Polini, A; Proskuryakov, A S; Przybycien, M B; Rautenberg, J; Raval, A; Reeder, D D; Ren, Z; Renner, R; Repond, J; Ri, Y D; Rinaldi, L; Robins, S; Rosin, M; Ruspa, M; Ryan, P; Sacchi, R; Salehi, H; Santamarta, R; Sartorelli, G; Savin, A A; Saxon, D H; Schagen, S; Schioppa, M; Schlenstedt, S; Schleper, P; Schmidke, W B; Schneekloth, U; Schörner-Sadenius, T; Sciulli, F; Shcheglova, L M; Skillicorn, I O; Slominski, W; Smith, W H; Soares, M; Solano, A; Son, D; Sosnovtsev, V V; Stairs, D G; Stanco, L; Standage, J; Stifutkin, A; Stonjek, S; Stopa, P; Stösslein, U; Straub, P B; Suchkov, S; Susinno, G; Suszycki, L; Sutiak, J; Sutton, M R; Sztuk, 
J; Szuba, D; Szuba, J; Tapper, A D; Targett-Adams, C; Tassi, E; Tawara, T; Terron, J; Tiecke, H G; Tokushuku, K; Tsurugai, T; Turcato, M; Tymieniecka, T; Tyszkiewicz, A; Ukleja, A; Ukleja, J; Vázquez, M; Vlasov, N N; Voss, K C; Walczak, R; Walsh, R; Wang, M; Whitmore, J J; Whyte, J; Wichmann, K; Wick, K; Wiggers, L; Wills, H H; Wing, M; Wlasenko, M; Wolf, G; Yagues-Molina, A G; Yamada, S; Yamazaki, Y; Yoshida, R; Youngman, C; Zambrana, M; Zawiejski, L; Zeuner, W; Zhautykov, B O; Zhou, C; Zichichi, A; Ziegler, A; Zotkin, D S; Zotkin, S A; De Favereau, J; De Wolf, E; Del Peso, J

    2003-01-01

    Next-to-leading order QCD analyses of the ZEUS data on deep inelastic scattering together with fixed-target data have been performed, from which the gluon and the quark densities of the proton and the value of the strong coupling constant, alpha_s(M_Z), were extracted. The study includes a full treatment of the experimental systematic uncertainties including point-to-point correlations. The resulting uncertainties in the parton density functions are presented. A combined fit for alpha_s(M_Z) and the gluon and quark densities yields a value of alpha_s(M_Z) in agreement with the world average. The parton density functions derived from ZEUS data alone indicate the importance of HERA data in determining sea quark and gluon distributions at low x. The limits of applicability of the theoretical formalism have been explored by comparing the fit predictions to ZEUS data at very low Q^2.
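
    A common way to fold point-to-point correlated systematics into such a fit (a standard construction assumed here for illustration; the paper's exact definition is not quoted) is to introduce one nuisance parameter s_lambda per systematic source:

      \chi^2(p, s) = \sum_i \frac{\left[ F_i(p) - \sum_\lambda s_\lambda \Delta_{i\lambda} - D_i \right]^2}{\sigma_i^2} + \sum_\lambda s_\lambda^2

    Here D_i are the measured points with uncorrelated errors \sigma_i, F_i(p) are the NLO QCD predictions for parton-density parameters p, and \Delta_{i\lambda} is the correlated shift of point i induced by systematic source \lambda.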

  8. Safety analysis of the US dual coolant liquid lead lithium ITER test blanket module

    Science.gov (United States)

    Merrill, Brad; Reyes, Susana; Sawan, Mohamed; Wong, Clement

    2007-07-01

    The US is proposing a prototype of a dual coolant liquid lead-lithium DEMO blanket concept for testing in the International Thermonuclear Experimental Reactor (ITER) as an ITER test blanket module (TBM). Because safety considerations are an integral part of the design process to ensure that this TBM does not adversely impact the safety of ITER, a safety assessment has been conducted for this TBM and its ancillary systems as requested by the ITER project. Four events were selected by the ITER international team (IT) to address specific reactor safety concerns, such as vacuum vessel (VV) pressurization, confinement building pressure build-up, TBM decay heat removal capability, release of tritium and activation products from the TBM system, and hydrogen and heat production from chemical reactions. This paper summarizes the results of this safety assessment conducted with the MELCOR computer code.

  9. Stability Region Analysis of PID and Series Leading Correction PID Controllers for the Time Delay Systems

    Directory of Open Access Journals (Sweden)

    D. RAMA REDDY

    2012-07-01

    Full Text Available This paper describes the stability regions of PID (Proportional + Integral + Derivative) and a new PID with series leading correction (SLC) controllers for networked control systems with time delay. The new PID controller has a tuning parameter ‘β’. The relation between β, KP, KI and KD is derived. The effect of plant parameters on the stability region of PID and SLC-PID controllers in first-order and second-order systems with time delay is also studied. Finally, an open-loop zero was inserted into the plant-unstable second-order system with time delay so that the stability regions of the PID and SLC-PID controllers are effectively enlarged. The total system is implemented using MATLAB/Simulink.
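
    As a rough numerical illustration of mapping such a stability region (a generic sketch assuming a first-order-plus-dead-time plant and a first-order Pade approximation of the delay; it does not reproduce the paper's SLC structure or its β parameter):

      # Map the (KP, KI) stability region of a PID loop for
      # G(s) = K*exp(-L*s)/(T*s + 1), with the delay replaced by a
      # first-order Pade approximation so the closed-loop characteristic
      # equation becomes a polynomial whose roots can be checked directly.
      import numpy as np

      K, T, L = 1.0, 1.0, 0.4   # plant gain, time constant, dead time (assumed)
      KD = 0.1                  # fixed derivative gain for this 2-D slice

      def is_stable(KP, KI, KD=KD):
          # Pade(1,1): exp(-L*s) ~ (1 - L*s/2)/(1 + L*s/2); numpy polynomial
          # routines take coefficients highest power first.
          pade_num = np.array([-L / 2, 1.0])
          pade_den = np.array([L / 2, 1.0])
          plant_den = np.array([T, 1.0])
          # Characteristic polynomial of 1 + C(s)G(s), C(s) = (KD*s^2 + KP*s + KI)/s
          open_den = np.polymul([1.0, 0.0], np.polymul(pade_den, plant_den))
          open_num = K * np.polymul([KD, KP, KI], pade_num)
          char_poly = np.polyadd(open_den, open_num)
          return bool(np.all(np.roots(char_poly).real < 0))

      kp_grid = np.linspace(0.0, 10.0, 60)
      ki_grid = np.linspace(0.0, 10.0, 60)
      stable_pts = [(kp, ki) for kp in kp_grid for ki in ki_grid if is_stable(kp, ki)]
      print(f"{len(stable_pts)} of {kp_grid.size * ki_grid.size} grid points are stable")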

  10. Failed anterior cruciate ligament reconstruction: analysis of factors leading to instability after primary surgery

    Institute of Scientific and Technical Information of China (English)

    MA Yong; AO Ying-fang; YU Jia-kuo; DAI Ling-hui; SHAO Zhen-xing

    2013-01-01

    Background Revision anterior cruciate ligament (ACL) surgery can be expected to become more common as the number of primary reconstructions keeps increasing. This study aims to investigate the factors causing instability after primary ACL reconstruction, which may provide an essential scientific basis for preventing surgical failure. Methods One hundred and ten revision ACL surgeries were performed at our institute between November 2001 and July 2012. There were 74 men and 36 women, and the mean age at the time of revision was 27.6 years (range 16-56 years). The factors leading to instability after primary ACL reconstruction were retrospectively reviewed. Results Fifty-one knees failed because of bone tunnel malposition, with too anterior femoral tunnels (20 knees), posterior wall blowout (1 knee), vertical femoral tunnels (7 knees), too posterior tibial tunnels (12 knees), and too anterior tibial tunnels (10 knees). Another knee had undergone open surgery in which the femoral tunnel was drilled through the medial condyle and the tibial tunnel was too anterior. Five knees were found with malposition of the fixation. In one knee with an allograft, rejection was suspected and a second surgery had been performed to remove the graft. Three knees developed recurrent instability after postoperative infection. The other factors included trauma (48 knees) and unidentified causes (12 knees). Conclusion Technical errors were the main factors leading to instability after primary ACL reconstructions, while attention should also be paid to the risk factors of re-injury and failure of graft incorporation.

  11. Unified Approach to Vulnerability Analysis of Web Applications

    Science.gov (United States)

    Le, H. T.; Loh, P. K. K.

    2008-11-01

    Web vulnerabilities in web-based applications may be detected, classified and documented. Several Web scanners exist for vulnerabilities in Web applications implemented via different technologies. However, none of them provides technology-independent, generic coverage of possible vulnerabilities. In this project, funded by Mindef Singapore, we propose a new approach for Web application security and vulnerability analysis. The design addresses the categorization of scanner results with a generic data model and the design of a language-independent rule-based engine that detects, analyses and reports suspected vulnerabilities in web-based applications.
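
    A minimal sketch of the rule-based idea (everything below — the rule names, patterns, and scan interface — is invented for illustration; the abstract does not disclose the engine's actual rule syntax or data model):

      # Toy language-independent rule engine: each rule pairs a vulnerability
      # class with a regex over an HTTP response body, so findings are
      # reported uniformly regardless of the server-side technology.
      import re

      RULES = [
          ("sql_error_leak", re.compile(r"SQL syntax.*MySQL|ORA-\d{5}|SQLSTATE", re.I)),
          ("reflected_input", re.compile(r"<script>alert\(1\)</script>", re.I)),
          ("stack_trace_leak", re.compile(r"Traceback \(most recent call last\)|java\.lang\.\w+Exception")),
      ]

      def scan(url, body):
          """Return a technology-independent list of suspected findings."""
          return [{"url": url, "type": name} for name, rx in RULES if rx.search(body)]

      print(scan("http://example.test/item?id=1%27",
                 "You have an error in your SQL syntax; check the manual for MySQL"))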

  12. A MANAGERIAL AND COST ACCOUNTING APPROACH OF CUSTOMER PROFITABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    CARDOS Ildiko Reka

    2010-07-01

    Full Text Available In recent years many organizations have realized that market orientation is essential to their success. By satisfying the needs of customers and offering them products and services that meet their desires and demands, customer loyalty can be built, increasing profitability in the long term. After analyzing the existing journal literature in this field, we would like to emphasize that managerial accounting, cost calculation methods and techniques, and the analysis of costs provide relevant information when analyzing customer profitability. We pay special attention to cost systems. An activity-based costing approach takes customer profitability to new levels of accuracy and usefulness, and provides the basis for creating, communicating and delivering value to the customers.
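
    As a stylized illustration of how activity-based costing feeds such an analysis (all activity names and figures below are invented, not taken from the article):

      # Activity-based costing toy example: trace activity costs to customers
      # via cost drivers, then subtract the traced cost from each customer's
      # gross margin to see that equal-margin customers can differ in profit.
      activity_cost = {"order_handling": 10_000.0, "delivery": 6_000.0}
      driver_volume = {"order_handling": 500, "delivery": 200}  # total driver units
      rate = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

      customers = {
          # name: (gross margin, orders placed, deliveries received)
          "A": (4_000.0, 50, 10),
          "B": (4_000.0, 200, 60),
      }

      for name, (margin, orders, deliveries) in customers.items():
          traced = orders * rate["order_handling"] + deliveries * rate["delivery"]
          print(f"customer {name}: ABC profit = {margin - traced:,.0f}")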

  13. An Approach to Analysis of Stable Isotopes in Microsamples

    Institute of Scientific and Technical Information of China (English)

    韩友科; 安娜

    1991-01-01

    A new approach to isotopic analysis of carbon, oxygen, sulfur and nitrogen in microsamples has been established. Samples were conventionally prepared by mixing the microsamples to be analyzed with reference samples of known δ values in a proper proportion, and then analyzed for their stable isotopes as those at ordinary levels. According to the equilibrium relationships before and after mixing, the δ values of the unknown microsamples were calculated. The δ15N of the atmosphere was estimated at zero by this approach, which is concordant with the internationally recommended value.
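
    The underlying mass balance is simple; in schematic form (symbols ours, not the paper's):

      \delta_{\mathrm{mix}}\,(n_s + n_r) = n_s\,\delta_s + n_r\,\delta_r
      \quad\Longrightarrow\quad
      \delta_s = \frac{(n_s + n_r)\,\delta_{\mathrm{mix}} - n_r\,\delta_r}{n_s}

    where n_s and n_r are the amounts of microsample and reference, \delta_r is known, and \delta_{\mathrm{mix}} is measured on the mixture at an ordinary sample size.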

  14. Citation analysis: A social and dynamic approach to knowledge organization

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2013-01-01

    Knowledge organization (KO) and bibliometrics have traditionally been seen as separate subfields of library and information science, but bibliometric techniques make it possible to identify candidate terms for thesauri and to organize knowledge by relating scientific papers and authors to each other, thereby indicating kinds of relatedness and semantic distance. It is therefore important to view bibliometric techniques as a family of approaches to KO in order to illustrate their relative strengths and weaknesses. The subfield of bibliometrics concerned with citation analysis forms … The main difference between traditional knowledge organization systems (KOSs) and maps based on citation analysis is that the first group represents intellectual KOSs, whereas the second represents social KOSs. For this reason bibliometric maps cannot be expected ever to be considered superior for all purposes.

  15. A tessellated continuum approach to thermal analysis: discontinuity networks

    Science.gov (United States)

    Jiang, C.; Davey, K.; Prosser, R.

    2017-01-01

    Tessellated continuum mechanics is an approach for the representation of thermo-mechanical behaviour of porous media on tessellated continua. It involves the application of iteration function schemes using affine contraction and expansion maps, respectively, for the creation of porous fractal materials and associated tessellated continua. Highly complex geometries can be produced using a modest number of contraction mappings. The associated tessellations form the mesh in a numerical procedure. This paper tests the hypothesis that thermal analysis of porous structures can be achieved using a discontinuous Galerkin finite element method on a tessellation. Discontinuous behaviour is identified at a discontinuity network in a tessellation; its use is shown to provide a good representation of the physics relating to cellular heat exchanger designs. Results for different cellular designs (with corresponding tessellations) are contrasted against those obtained from direct analysis and very high accuracy is observed.

  16. The Peltier driven frequency domain approach in thermal analysis.

    Science.gov (United States)

    De Marchi, Andrea; Giaretto, Valter

    2014-10-01

    The merits of frequency domain analysis as a tool for thermal system characterization are discussed, and the complex thermal impedance approach is illustrated. Pure AC thermal flux generation with negligible DC component is possible with a Peltier device, unlike other existing methods, in which a significant DC component is intrinsically attached to the generated AC flux. This technique is here named Peltier Driven Frequency Domain (PDFD). As a necessary prerequisite, a novel one-dimensional analytical model for an asymmetrically loaded Peltier device is developed, which is general enough to be useful in most practical situations, both as a design tool for measurement systems and as a key for the interpretation of experimental results. Impedance analysis is possible with Peltier devices through the inbuilt Seebeck-effect differential thermometer, and is used in the paper for an experimental validation of the analytical model. Suggestions are then given for possible applications of PDFD, including the determination of thermal properties of materials.

  18. Sensitivity analysis in linear programming approach to optimal SVM classification

    Directory of Open Access Journals (Sweden)

    Roberto Ragona

    2014-06-01

    Full Text Available At present, linear programming (LP) techniques for optimal one-class and two-class classification can be considered well established and feasible; they pose an alternative to the quadratic programming (QP) approach, which is usually credited with having greater complexity. Sensitivity analysis, well developed in the LP context, is generally employed to furnish answers describing how an optimal solution changes when varying the parameters in an LP problem; as a possible application in optimal classification, it can be employed for the definition of the free parameters present in LP procedures, reducing the events of computational restart from scratch when searching for a satisfactory classifier through repeated trials. The proposed method is demonstrated on a simple example which exhibits its effectiveness in reducing the computational burden, but the procedure can be extrapolated to large problems as well. Keywords: Linear Programming, Optimal Classification, Sensitivity Analysis, Support Vector Machines.
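
    For concreteness, the LP classification formulation itself can be sketched as an L1-regularized soft-margin classifier solved with scipy's linprog (a generic textbook variant, not the authors' exact procedure or their sensitivity-analysis code):

      # L1-norm soft-margin linear classifier as a linear program.
      # Variables: w = u - v with u, v >= 0, a free bias b, slacks xi >= 0;
      # objective: min sum(u + v) + C * sum(xi)
      # subject to: y_i * (w . x_i + b) >= 1 - xi_i.
      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(+1, 0.5, (20, 2))])
      y = np.array([-1.0] * 20 + [+1.0] * 20)
      n, d = X.shape
      C = 1.0

      c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
      A_ub = np.hstack([-y[:, None] * X,   # u block
                        +y[:, None] * X,   # v block
                        -y[:, None],       # bias block
                        -np.eye(n)])       # slack block
      b_ub = -np.ones(n)
      bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      w, b = res.x[:d] - res.x[d:2 * d], res.x[2 * d]
      print("training accuracy:", np.mean(np.sign(X @ w + b) == y))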

  19. An algebraic approach to analysis of recursive and concurrent programs

    DEFF Research Database (Denmark)

    Terepeta, Michal Tomasz

    This thesis focuses on formal techniques based on static program analysis, model checking and abstract interpretation that offer means for reasoning about software, verification of its properties and discovering potential bugs. First, we investigate an algebraic approach to static analysis … the soundness or completeness results. Moreover, we present a new application of pushdown systems in the context of an aspect-oriented process calculus. The addition of aspect-oriented features makes it possible for a process to exhibit a recursive structure. We show how one can faithfully model and analyze … present a compact data structure as well as efficient algorithms for the semiring operations. Apart from that, we discuss an improvement to the Pre* and Post* algorithms for pushdown systems, making it possible to directly use program representations such as program graphs. We present a modular library …

  20. A Chemoinformatics Approach to the Discovery of Lead-Like Molecules from Marine and Microbial Sources En Route to Antitumor and Antibiotic Drugs

    Directory of Open Access Journals (Sweden)

    Florbela Pereira

    2014-01-01

    Full Text Available The comprehensive information on small molecules and their biological activities in the PubChem database allows chemoinformatics researchers to access and make use of large-scale biological activity data to improve the precision of drug profiling. A Quantitative Structure–Activity Relationship (QSAR) classification approach was used for the prediction of active/inactive compounds with respect to overall biological activity, antitumor and antibiotic activities, using a data set of 1804 compounds from PubChem. Using the best classification models for antibiotic and antitumor activities, a data set of marine and microbial natural products from the AntiMarin database was screened—57 and 16 new lead compounds were proposed for antibiotic and antitumor drug design, respectively. All compounds proposed by our approach are classified as non-antibiotic and non-antitumor compounds in the AntiMarin database. Recently several of the lead-like compounds proposed by us were reported as being active in the literature.

  1. Process and technoeconomic analysis of leading pretreatment technologies for lignocellulosic ethanol production using switchgrass.

    Science.gov (United States)

    Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E

    2011-12-01

    Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO2). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields.

  2. Heat Recovery in a Pasta Factory. Pinch Analysis Leads to Optimal Heat Pump Usage.

    OpenAIRE

    Staine, Frédéric; Favrat, Daniel; Krummenacher, Pierre

    1994-01-01

    In the previous issue of the IEA Heat Pump Centre Newsletter (Vol. 12, No. 3, pp. 29-31), an article by these authors described the use of pinch analysis (also known as pinch technology) in a buildings application. This article describes a similar procedure for integrating a heat pump into a pasta production process. Many industrial processes, and particularly those dealing with drying, are characterized by an overabundance of low-grade heat which often cannot be effi...

  3. Scientific publications from Arab world in leading journals of Integrative and Complementary Medicine: a bibliometric analysis

    OpenAIRE

    Zyoud, Sa’ed H.; Al-Jabi, Samah W; Sweileh, Waleed M.

    2015-01-01

    Background Bibliometric analysis is increasingly employed as a useful tool to assess the quantity and quality of research performance. The specific goal of the current study was to evaluate the performance of research output originating from the Arab world and published in international Integrative and Complementary Medicine (ICM) journals. Methods Original scientific publications and reviews from the 22 Arab countries that were published in 22 international peer-reviewed ICM journals during all ...

  4. Lead Poisoning

    Science.gov (United States)

    Lead is a metal that occurs naturally in the earth's crust. Lead can be found in all parts of our ... from human activities such as mining and manufacturing. Lead used to be in paint; older houses may ...

  5. Analysis of the decision-making process leading to appendectomy: a grounded theory study.

    Science.gov (United States)

    Larsson, Gerry; Weibull, Henrik; Larsson, Bodil Wilde

    2004-11-01

    The aim was to develop a theoretical understanding of the decision-making process leading to appendectomy. A qualitative interview study was performed in the grounded theory tradition using the constant comparative method to analyze data. The study setting was one county hospital and two local hospitals in Sweden, where 11 surgeons and 15 surgical nurses were interviewed. A model was developed which suggests that surgeons' decision making regarding appendectomy is formed by the interplay between their medical assessment of the patient's condition and a set of contextual characteristics. The latter consist of three interacting factors: (1) organizational conditions, (2) the professional actors' individual characteristics and interaction, and (3) the personal characteristics of the patient and his or her family or relatives. When the outcome of the medical assessment is ambiguous, the risk evaluation and final decision will be influenced by an interaction of the contextual characteristics. It was concluded that, compared to existing rational models of decision making, the model presented identifies potentially important contextual characteristics and an outline of when they come into play.

  6. Turbopump Design and Analysis Approach for Nuclear Thermal Rockets

    Science.gov (United States)

    Chen, Shu-cheng S.; Veres, Joseph P.; Fittje, James E.

    2006-01-01

    A rocket propulsion system, whether it is a chemical rocket or a nuclear thermal rocket, is fairly complex in detail but rather simple in principle. Among all the interacting parts, three components stand out: they are pumps and turbines (turbopumps), and the thrust chamber. To obtain an understanding of the overall rocket propulsion system characteristics, one starts from analyzing the interactions among these three components. It is therefore of utmost importance to be able to satisfactorily characterize the turbopump, level by level, at all phases of a vehicle design cycle. Here at NASA Glenn Research Center, as the starting phase of a rocket engine design, specifically a Nuclear Thermal Rocket Engine design, we adopted the approach of using a high level system cycle analysis code (NESS) to obtain an initial analysis of the operational characteristics of a turbopump required in the propulsion system. A set of turbopump design codes (PumpDes and TurbDes) were then executed to obtain sizing and performance characteristics of the turbopump that were consistent with the mission requirements. A set of turbopump analyses codes (PUMPA and TURBA) were applied to obtain the full performance map for each of the turbopump components; a two dimensional layout of the turbopump based on these mean line analyses was also generated. Adequacy of the turbopump conceptual design will later be determined by further analyses and evaluation. In this paper, descriptions and discussions of the aforementioned approach are provided and future outlooks are discussed.

  8. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

    Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. In order to develop highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we must detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying the effectiveness of the tests. We have applied our approach to a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
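
    To make the mutation-score idea concrete, here is a toy sketch in Python (the paper targets Java exception handling; the target function, the single operator-swap mutator, and the tests below are all invented; requires Python 3.9+ for ast.unparse):

      # Generate single-operator mutants of a tiny function with the ast
      # module, run a test suite against each, and report the mutation score
      # (fraction of mutants "killed", i.e. detected by a failing test).
      import ast, copy, textwrap

      SRC = textwrap.dedent("""
          def absdiff(a, b):
              if a > b:
                  return a - b
              return b - a
      """)

      TESTS = [((5, 3), 2), ((3, 5), 2), ((4, 4), 0)]  # ((args), expected)

      def passes(src):
          env = {}
          exec(compile(src, "<target>", "exec"), env)
          f = env["absdiff"]
          try:
              return all(f(*args) == want for args, want in TESTS)
          except Exception:
              return False  # a crashing mutant counts as killed

      def mutants(src):
          tree = ast.parse(src)
          n_ops = sum(isinstance(node, ast.BinOp) for node in ast.walk(tree))
          for i in range(n_ops):
              m = copy.deepcopy(tree)
              node = [x for x in ast.walk(m) if isinstance(x, ast.BinOp)][i]
              node.op = ast.Add() if isinstance(node.op, ast.Sub) else ast.Sub()
              yield ast.unparse(m)

      assert passes(SRC)  # the original must pass its own tests
      ms = list(mutants(SRC))
      killed = sum(not passes(m) for m in ms)
      print(f"mutation score: {killed}/{len(ms)}")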

  9. Adaptive molecular resolution approach in Hamiltonian form: An asymptotic analysis

    Science.gov (United States)

    Zhu, Jinglong; Klein, Rupert; Delle Site, Luigi

    2016-10-01

    Adaptive molecular resolution approaches in molecular dynamics are becoming relevant tools for the analysis of molecular liquids characterized by the interplay of different physical scales. The essential difference among these methods is in the way the change of molecular resolution is made in a buffer (transition) region. In particular a central question concerns the possibility of the existence of a global Hamiltonian which, by describing the change of resolution, is at the same time physically consistent, mathematically well defined, and numerically accurate. In this paper we present an asymptotic analysis of the adaptive process complemented by numerical results and show that under certain mathematical conditions a Hamiltonian, which is physically consistent and numerically accurate, may exist. Such conditions show that molecular simulations in the current computational implementation require systems of large size, and thus a Hamiltonian approach such as the one proposed, at this stage, would not be practical from the numerical point of view. However, the Hamiltonian proposed provides the basis for a simplification and generalization of the numerical implementation of adaptive resolution algorithms to other molecular dynamics codes.
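
    For orientation, the interpolated potential at the heart of such Hamiltonian adaptive-resolution (H-AdResS) schemes can be written schematically as (our paraphrase of the standard construction; the paper's precise formulation may differ):

      V = \sum_i \left[ \lambda(\mathbf{R}_i)\, V_i^{\mathrm{AT}} + \bigl(1 - \lambda(\mathbf{R}_i)\bigr)\, V_i^{\mathrm{CG}} \right], \qquad \lambda \in [0, 1],

    where \lambda switches smoothly from 1 in the atomistic region to 0 in the coarse-grained region across the buffer, and V_i^{AT}, V_i^{CG} are the atomistic and coarse-grained interaction energies of molecule i.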

  10. Nonlinear Static Analysis Of 3-D RC Framed Asymmetric Building With Lead Rubber Isolator Using Sap2000v15

    Directory of Open Access Journals (Sweden)

    Mohammed Asim Khan

    2014-09-01

    Full Text Available Many buildings in the present scenario have irregular configurations both in plan and elevation, which in the future may be subjected to devastating earthquakes. It is therefore necessary to enhance the seismic performance of asymmetric buildings by using seismic control techniques. In the present study a total of 9 models, asymmetrical in plan (L-shape), are taken for analysis to cover the broader spectrum of low, medium and high rise buildings. For the seismic control of the structures using pushover analysis, two different techniques were considered: lead rubber bearing isolators and masonry infill walls. The analysis has been carried out using SAP2000V15. The results of the bare frame and the other building models have been compared; with the lead rubber base isolator, top story drift is reduced compared with masonry infill walls. The trend was found to be reversed for high rise buildings, especially with the application of isolation systems, due to the massive increase in story displacements, suggesting the ineffectiveness of the base isolators for high rise buildings. Subsequently, the plastic hinge pattern formed after carrying out the pushover analysis was also studied, which indicated that structural performance was considerably improved.

  11. A life cycle analysis approach to D and D decision-making

    Energy Technology Data Exchange (ETDEWEB)

    Yuracko, K.L.; Gresalfi, M. [Oak Ridge National Lab., TN (United States); Yerace, P. [Dept. of Energy, Fernald, OH (United States). Fernald Environmental Management; Flora, J. [West Valley Demonstration Project, NY (United States); Krstich, M.; Gerrick, D. [Environmental Management Solutions, Mason, OH (United States)

    1998-05-01

    This paper describes a life cycle analysis (LCA) approach that makes decontamination and decommissioning (D and D) of US Department of Energy facilities more efficient and more responsive to the concerns of society. With the considerable complexity of D and D projects and their attendant environmental and health consequences, projects can no longer be designed based on engineering and economic criteria alone. Using the LCA D and D approach, the evaluation of material disposition alternatives explicitly includes environmental impacts, health and safety impacts, socioeconomic impacts, and stakeholder attitudes -- in addition to engineering and economic criteria. Multi-attribute decision analysis is used to take into consideration the uncertainties and value judgments that are an important part of all material disposition decisions. Use of the LCA D and D approach should lead to more appropriate selections of material disposition pathways and a decision-making process that is both understandable and defensible. The methodology and procedures of the LCA D and D approach are outlined and illustrated by an application of the approach at the Department of Energy's West Valley Demonstration Project. Specifically, LCA was used to aid decisions on disposition of soil and concrete from the Tank Pad D and D Project. A decision tree and the Pollution Prevention/Waste Minimization Users Guide for Environmental Restoration Projects were used to identify possible alternatives for disposition of the soil and concrete. Eight alternatives encompassing source reduction, segregation, treatment, and disposal were defined for disposition of the soil; two alternatives were identified for disposition of the concrete. Preliminary results suggest that segregation and treatment are advantageous in the disposition of both the soil and the concrete. This and other recent applications illustrate the strength and ease of application of the LCA D and D approach.

  12. An approach for economic analysis of intermodal transportation.

    Science.gov (United States)

    Sahin, Bahri; Yilmaz, Huseyin; Ust, Yasin; Guneri, Ali Fuat; Gulsun, Bahadir; Turan, Eda

    2014-01-01

    A different intermodal transportation model based on cost analysis considering technical, economic, and operational parameters is presented. The model covers such intermodal modes as sea-road, sea-railway, road-railway, and the multimode of sea-road-railway. A case study of cargo transportation has been carried out using the suggested model. Then, the single road transportation mode has been compared to the intermodal modes in terms of transportation costs. This comparison takes into account the external costs of intermodal transportation. The research reveals that, in short distance transportation, single transportation modes always tend to be advantageous. As the transportation distance gets longer, the advantages of intermodal transportation begin to be effective on the costs. In addition, the proposed method leads to determining the fleet size and capacity for transportation and the appropriate transportation mode.

  14. Gap Analysis Approach for Construction Safety Program Improvement

    Directory of Open Access Journals (Sweden)

    Thanet Aksorn

    2007-06-01

    Full Text Available To improve construction site safety, emphasis has been placed on the implementation of safety programs. In order to successfully gain from safety programs, factors that affect their improvement need to be studied. Sixteen critical success factors of safety programs were identified from safety literature, and these were validated by safety experts. This study was undertaken by surveying 70 respondents from medium- and large-scale construction projects. It explored the importance and the actual status of critical success factors (CSFs. Gap analysis was used to examine the differences between the importance of these CSFs and their actual status. This study found that the most critical problems characterized by the largest gaps were management support, appropriate supervision, sufficient resource allocation, teamwork, and effective enforcement. Raising these priority factors to satisfactory levels would lead to successful safety programs, thereby minimizing accidents.

  15. Chemical characterization of tin-lead glazed ceramics from Aragon (Spain) by neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Inanez, J.G. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Barcelona Univ. (Spain). Facultat de Geografia i Historia; Speakman, R.J. [Smithsonian Institution, Suitland, MD (United States). Museum Conservation Inst.; Buxeda i Garrigos, J. [Barcelona Univ. (Spain). Facultat de Geografia i Historia; Glascock, M.D. [Missouri Univ., Columbia, MO (United States). Research Reactor Center

    2010-07-01

    Majolica pottery was the most characteristic tableware produced in Spain during the Medieval and Renaissance periods. A study of the three main production centers in the historical region of Aragon during the Middle Ages and the Renaissance was conducted on a set of 71 samples. The samples were analyzed by instrumental neutron activation analysis (INAA), and the resulting data were interpreted using an array of multivariate statistical procedures. Our results show a clear discrimination among the different production centers, allowing a reliable provenance attribution of ceramic sherds from the Aragonese workshops. (orig.)

  16. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.

  17. Detection and Analysis of Lead, Cadmium and Arsenic Content in Common Vegetables

    Institute of Scientific and Technical Information of China (English)

    Yining HE; Peixia CHENG; Ming WANG; Minyu HU

    2014-01-01

    This study was carried out to detect the content of heavy metals (Pb, Cd, and As) in vegetables, understand the current situation of heavy metal contamination in vegetables, and provide a scientific reference for further research. Six large vegetable markets and six supermarkets in Changsha City were randomly selected, 8 types of typical vegetables were chosen, and 96 samples were tested. In accordance with the maximum levels of contaminants in foods in the existing GB2762-2012 standard and the Nemerow composite pollution index (Pt) with its grading standards, contamination was evaluated as: uncontaminated (Pt ≤ 1), mildly contaminated (1 < Pt ≤ 2), moderately contaminated (2 < Pt ≤ 3), and highly contaminated (Pt > 3). Among the 96 samples, the content ranges of Pb, Cd and As were 0.06-1.41, 0.06-1.26 and 0.00-0.91 mg/kg respectively; the rates of these metals exceeding the safety level were 78.13%, 45.83%, and 34.38% respectively; the composite pollution index was in the range 0.90-6.05, with eggplant at 6.05 and hot pepper at 3.24. The content of Pb (F = 23.908, P = 0.001) and Cd (F = 64.908, P = 0.000) differed significantly between the 8 types of vegetables, while there was no significant difference in As content (F = 4.634, P = 0.705 > 0.05). The study shows that common vegetables in Changsha City have a problem of excess Pb, Cd and As, with the Pb over-limit rate being the highest. The composite pollution index indicates that most heavy metal contamination of vegetables is mild to moderate, contamination of melon and fruit vegetables is high, and Cd is the major factor leading to contamination of melons, fruits and vegetables.
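
    The Nemerow composite index used above is conventionally computed as (standard definition, assumed to match the paper's usage):

      P_i = \frac{C_i}{S_i}, \qquad
      P_t = \sqrt{\frac{\bar{P}^{\,2} + P_{\max}^{2}}{2}}

    where C_i is the measured concentration of metal i, S_i its permissible limit, and \bar{P} and P_max are the mean and maximum of the single-factor indices P_i.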

  18. Analysis of induced electrical currents from magnetic field coupling inside implantable neurostimulator leads

    Directory of Open Access Journals (Sweden)

    Seidman Seth J

    2011-10-01

    Full Text Available Background Over the last decade, the number of neurostimulator systems implanted in patients has been growing rapidly. Nearly 50,000 neurostimulators are implanted worldwide annually. The most common type of implantable neurostimulator is indicated for pain relief. At the same time, commercial use of other electromagnetic technologies is expanding, making electromagnetic interference (EMI) with neurostimulator function an issue of concern. Typically reported sources of neurostimulator EMI include security systems, metal detectors and wireless equipment. When near such sources, patients with implanted neurostimulators have reported adverse events such as shock, pain, and increased stimulation. In recent in vitro studies, radio frequency identification (RFID) technology has been shown to inhibit the stimulation pulse of an implantable neurostimulator system during low frequency exposure at close distances. This could potentially be due to electrical currents induced inside the implantable neurostimulator leads by magnetic field coupling from the low frequency identification system. Methods To systematically address the concerns posed by EMI, we developed a test platform to assess the interference from coupled magnetic fields on implantable neurostimulator systems. To measure interference, we recorded the output of one implantable neurostimulator, programmed for best therapy threshold settings, when in close proximity to an operating low frequency RFID emitter. The output contained electrical potentials from the neurostimulator system and those induced by EMI from the RFID emitter. We also recorded the output of the same neurostimulator system programmed for best therapy threshold settings without RFID interference. Using the Spatially Extended Nonlinear Node (SENN) model, we compared threshold factors of spinal cord fiber excitation for both recorded outputs. Results The electric current induced by low frequency RFID emitter

  19. Development of a CT-guided standard approach for tined lead implantation at the sacral nerve root S3 in minipigs for chronic neuromodulation

    Directory of Open Access Journals (Sweden)

    Foditsch EE

    2016-09-01

    Full Text Available Elena Esra Foditsch,1 Reinhold Zimmermann2 1Urology, Spinal Cord Injury and Tissue Regeneration Center Salzburg, Paracelsus Medical University, 2University Clinic of Urology and Andrology, Salzburg General Hospital, Paracelsus Medical University, Salzburg, Austria Purpose: The aim of this study was to develop a controlled approach for sacral neuromodulation (SNM) to improve both nerve targeting and tined lead placement, for which a new computed tomography (CT)-guided implantation technique was analyzed in minipigs. Materials and methods: This study included five female, adult Göttingen minipigs. In deep sedoanalgesia, the minipigs were placed in an extended prone position. Commercially available SNM materials were used (needle, introduction sheath, and quadripolar tined lead electrode). Gross anatomy was displayed by CT, and the nerves were bilaterally identified. The optimal angles to puncture the S3 foramen, the resulting access path, and the site for the skin incision were defined subsequently. The needle puncture and the tined lead placement were followed by successive CT scans/3D-reconstruction images. Once proper CT-guided placement of the needle and electrode was established, the response to functional stimuli was intraoperatively checked to verify correct positioning. Results: Successful bilateral tined lead implantation was performed in four out of five minipigs. Implantation differed from the clinical situation in that the puncture was done from the contralateral side at a 30° angle to the midline and a 60° horizontal angle to ensure both passage through the foramen and nerve access. Surgery time was 50–150 minutes. Stimulation response comprised a twitch of the perianal musculature and tail rotation to the contralateral side. Conclusion: We have established a new, minimally invasive, highly standardized, CT-guided SNM electrode implantation technique. Functional outcomes are clearly defined and reproducible. All procedures can be

  20. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    Science.gov (United States)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

    In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis ability of laser-induced breakdown spectroscopy (LIBS) for liquid sample detection, using solutions of the heavy metal element Pb as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal. A series of experiments were carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the pulse energy influence and the temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm the LTE state of the plasma. Normalizing the intensities by the background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0-4,150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD of Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparison of the real and predicted concentrations of the samples. The lowest relative error was 0.043% and the highest was no more than 7.1%.
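
    The calibration-and-LOD arithmetic behind such figures can be sketched as follows (the intensities and blank noise below are invented placeholders, not data from the paper):

      # Fit a background-normalized line intensity against known
      # concentrations and take the detection limit as 3*sigma_blank/slope.
      import numpy as np

      conc = np.array([0, 50, 100, 500, 1000, 2000, 4150], dtype=float)  # ppm
      intensity = np.array([0.02, 0.13, 0.24, 1.11, 2.20, 4.35, 9.05])   # normalized

      slope, intercept = np.polyfit(conc, intensity, 1)
      pred = slope * conc + intercept
      r = np.corrcoef(intensity, pred)[0, 1]

      sigma_blank = 0.002  # std. dev. of repeated blank measurements (assumed)
      lod = 3 * sigma_blank / slope
      print(f"slope={slope:.3e}  R={r:.4f}  LOD={lod:.2f} ppm")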

  1. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process related risk faced by KOC and included root cause management system related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network, with the scope divided into three major categories: (1) integrity management, (2) operations, and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess for bottlenecks, to perform surge and erosion analysis, and to identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad and three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  2. Use of multi-criteria decision analysis in regulatory alternatives analysis: a case study of lead free solder.

    Science.gov (United States)

    Malloy, Timothy F; Sinsheimer, Peter J; Blake, Ann; Linkov, Igor

    2013-10-01

    Regulators are implementing new programs that require manufacturers of products containing certain chemicals of concern to identify, evaluate, and adopt viable, safer alternatives. Such programs raise the difficult question for policymakers and regulated businesses of which alternatives are "viable" and "safer." To address that question, these programs use "alternatives analysis," an emerging methodology that integrates issues of human health and environmental effects with technical feasibility and economic impact. Despite the central role that alternatives analysis plays in these programs, the methodology itself is neither well-developed nor tailored to application in regulatory settings. This study uses the case of Pb-based bar solder and its non-Pb-based alternatives to examine the application of 2 multi-criteria decision analysis (MCDA) methods to alternatives analysis: multi-attribute utility analysis and outranking. The article develops and evaluates an alternatives analysis methodology and supporting decision-analysis software for use in a regulatory context, using weighting of the relevant decision criteria generated from a stakeholder elicitation process. The analysis produced complete rankings of the alternatives, including identification of the relative contribution to the ranking of each of the highest level decision criteria such as human health impacts, technical feasibility, and economic feasibility. It also examined the effect of variation in data conventions, weighting, and decision frameworks on the outcome. The results indicate that MCDA can play a critical role in emerging prevention-based regulatory programs. Multi-criteria decision analysis methods offer a means for transparent, objective, and rigorous analysis of products and processes, providing regulators and stakeholders with a common baseline understanding of the relative performance of alternatives and the trade-offs they present.
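
    A toy weighted-sum multi-attribute utility ranking in the spirit of the study (criteria, weights, and scores are invented for illustration; the study's criteria tree, elicited weights, and outranking variant are richer):

      # Rank solder alternatives by weighted multi-attribute utility.
      import numpy as np

      weights = np.array([0.5, 0.3, 0.2])  # health, technical, economic (assumed)
      alternatives = {                     # normalized utilities in [0, 1]
          "SnPb":   np.array([0.2, 0.9, 0.8]),
          "SnAgCu": np.array([0.7, 0.8, 0.6]),
          "SnCu":   np.array([0.6, 0.6, 0.9]),
      }

      scores = {name: float(weights @ u) for name, u in alternatives.items()}
      for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{name}: {s:.2f}")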

  3. Pencil lead scratches on steel surfaces as a substrate for LIBS analysis of dissolved salts in liquids

    Energy Technology Data Exchange (ETDEWEB)

    Jijon, D; Costa, C, E-mail: judijival@hotmail.com [Departamento de Fisica, Escuela Politecnica Nacional, Ladron de Guevara E11-256, Apartado 17-12-866, Quito (Ecuador)

    2011-01-01

    A new substrate for the quantitative analysis of salts dissolved in liquids with laser-induced breakdown spectroscopy (LIBS) is introduced for the first time. A steel surface scratched with HB pencil lead is introduced as a very efficient and sensitive substrate for quantitative analysis of dissolved salts in liquids. In this work we demonstrate the analytical quality of this system with the analysis of the crystalline deposits formed by the dried aqueous solutions of salts. We focused on analytical parameters such as sensitivity and linearity for the salt cations in each case. Four salts were studied (Sr(NO3)2, LiSO4, RbCl and BaCl), at nine different concentrations each. To improve linearity and lower the overall error in the calibration curves, we introduce a novel outlier removal method that takes into account the homogeneity of the dry deposits on the analytical surface.

  4. Numerical 3D analysis of cloud cavitation shedding frequency on a circular leading edge hydrofoil with a barotropic cavitation model

    Science.gov (United States)

    Blume, M.; Skoda, R.

    2015-12-01

    A compressible density-based time-explicit low Mach number consistent viscous flow solver is utilised in combination with a barotropic cavitation model for the analysis of cloud cavitation on a circular leading edge (CLE) hydrofoil. For 5° angle of attack, cloud structure and shedding frequency for different cavitation numbers are compared to experimental data. A strong grid sensitivity is found in particular for high cavitation numbers. On a fine grid, a very good agreement with validation data is achieved even without explicit turbulence model. The neglect of viscous effects as well as a two-dimensional set-up lead to a less realistic prediction of cloud structures and frequencies. Comparative simulations with the Sauer-Schnerr cavitation model and modified pre-factors of the mass transfer terms underestimate the measured shedding frequency.

  5. Profit and Cost Efficiency Analysis in Banking Sector: A Case of Stochastic Frontier Approach for Vietnam

    Directory of Open Access Journals (Sweden)

    Le Thi Thanh Ngan

    2014-10-01

    Full Text Available Using the stochastic frontier analysis (SFA) approach to measure cost and profit efficiency on data for 45 Vietnamese commercial banks over the years 2007 to 2012, this paper highlights risk and asset quality factors related to the cost and profit inefficiency of the banks. Besides, cost inefficiency appears to be strongly related to bank concentration, mergers, and bank ownership. These results suggest that mergers and acquisitions can recover potential cost efficiency and foster competition in the banking system. Based on analysis of variance tests, state-owned commercial banks (SCOB) are more efficient than other domestic commercial banks (JSCB) and foreign banks in terms of profit efficiency; in contrast, for cost efficiency, international banks lead the national banks.
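
    The canonical stochastic cost frontier behind such estimates has the form (a standard specification, assumed rather than quoted from the paper):

      \ln C_{it} = f(y_{it}, w_{it}; \beta) + v_{it} + u_{it}, \qquad
      v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \geq 0,

    where C is observed cost, y outputs, w input prices, v symmetric noise and u one-sided inefficiency; cost efficiency is CE_{it} = \exp(-\hat{u}_{it}), and for the profit frontier the inefficiency term enters with the opposite sign.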

  6. Comparative Analysis for Polluted Agricultural Soils with Arsenic, Lead, and Mercury in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Yarto-Ramirez, Mario; Santos-Santos, Elvira; Gavilan-Garcia, Arturo; Castro-Diaz, Jose; Gavilan-Garcia, Irma Cruz; Rosiles, Rene; Suarez, Sara

    2004-03-31

    The use of mercury in Mexico has been associated with the mining industry of Zacatecas. This activity has polluted several areas currently used for agriculture. The main objective of this study was to investigate the heavy metal concentrations (Hg, As and Pb) in soil of Guadalupe, Zacatecas, in order to justify a further environmental risk assessment of the site. A 2x3 km grid was used for the sampling process and 20 soil samples were taken. The analysis was performed using EPA SW-846 method 3050B/6010B for arsenic and metals and EPA SW-846 method 7471A for total mercury. It was concluded that there are heavy metals in agricultural soils used for corn and bean farming. Therefore an environmental risk assessment and a bioavailability study are required to determine whether there is a risk of heavy metal bioaccumulation in animals or human beings, or of metal lixiviation to aquifers.

  7. Launch Vehicle Abort Analysis for Failures Leading to Loss of Control

    Science.gov (United States)

    Hanson, John M.; Hill, Ashley D.; Beard, Bernard B.

    2013-01-01

    Launch vehicle ascent is a time of high risk for an onboard crew. There is a large fraction of possible failures for which time is of the essence and a successful abort is possible if detection and action happen quickly enough. This paper focuses on abort determination based on data already available from the Guidance, Navigation, and Control system. This work is the result of failure analysis efforts performed during the Ares I launch vehicle development program. The two primary areas of focus are the derivation of abort triggers to ensure that abort occurs as quickly as possible when needed, but that false aborts are avoided, and the evaluation of success in aborting off the failing launch vehicle.

  8. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates a modified set of PCA components with sparse loadings, as compared to standard principal component analysis (PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.

  9. Lead Toxicity

    Science.gov (United States)

    ... including some imported jewelry. What are the health effects of lead? • More commonly, lower levels of lead in children over time may lead to reduced IQ, slow learning, Attention Deficit Hyperactivity Disorder (ADHD), or behavioral issues. • Lead also affects other ...

  10. Artificial intelligence approach in analysis of DNA sequences.

    Science.gov (United States)

    Brézillon, P J; Zaraté, P; Saci, F

    1993-01-01

    We present an approach for designing a knowledge-based system, called Sequence Acquisition In Context (SAIC), that will be able to cooperate with a biologist in the analysis of DNA sequences. The main task of the system is the acquisition of the expert knowledge that the biologist uses for solving ambiguities from gel autoradiograms, with the aim of re-using it later for solving similar ambiguities. The various types of expert knowledge constitute what we call the contextual knowledge of the sequence analysis. Contextual knowledge deals with the unavoidable problems that are common in the study of living material (e.g., noise in data, difficulties of observation). Indeed, the analysis of DNA sequences from autoradiograms belongs to an emerging and promising area of investigation, namely reasoning with images. The SAIC project is developed in a theoretical framework that is shared with other applications. Not all tasks have the same importance in each application. We use this observation for designing an intelligent assistant system with three applications. In the SAIC project, we focus on knowledge acquisition, human-computer interaction and explanation. The project will benefit research in the two other applications. We also discuss our SAIC project in the context of large international projects that aim to re-use and share knowledge in a repository.

  11. Network Analysis: A Novel Approach to Understand Suicidal Behaviour

    Science.gov (United States)

    de Beurs, Derek

    2017-01-01

    Although suicide is a major public health issue worldwide, we understand little of the onset and development of suicidal behaviour. Suicidal behaviour is argued to be the end result of the complex interaction between psychological, social and biological factors. Epidemiological studies have identified a range of risk factors for suicidal behaviour, but we do not yet understand how their interaction increases the risk for suicidal behaviour. A new approach called network analysis can help us better understand this process, as it allows us to visualize and quantify the complex associations between many different symptoms or risk factors. A network analysis of data containing information on suicidal patients can help us understand how risk factors interact and how their interaction is related to suicidal thoughts and behaviour. A network perspective has been successfully applied to the fields of depression and psychosis, but not yet to the field of suicidology. In this theoretical article, I introduce the concept of network analysis to the field of suicide prevention, and offer directions for future applications and studies.
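
    A minimal illustration of the idea on simulated data follows; it builds a thresholded correlation network over hypothetical risk-factor items and ranks nodes by strength centrality. Published symptom-network studies typically use regularized partial correlations instead, so this is only a sketch.

```python
# Toy symptom network: thresholded correlations among simulated risk-factor
# scores, with node strength centrality. Real analyses typically use
# regularized partial correlations (e.g., graphical lasso); this is a sketch.
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n, items = 300, 6
labels = ["hopelessness", "insomnia", "burden", "isolation",
          "impulsivity", "ideation"]               # assumed item names
latent = rng.normal(size=(n, 1))
data = 0.6 * latent + rng.normal(size=(n, items))  # correlated items

corr = np.corrcoef(data, rowvar=False)
G = nx.Graph()
G.add_nodes_from(labels)
for i in range(items):
    for j in range(i + 1, items):
        if abs(corr[i, j]) > 0.2:                  # arbitrary threshold
            G.add_edge(labels[i], labels[j], weight=abs(corr[i, j]))

# Strength centrality: sum of absolute edge weights incident to each node.
strength = {v: sum(d["weight"] for *_, d in G.edges(v, data=True)) for v in G}
print(sorted(strength.items(), key=lambda kv: -kv[1]))
```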

  12. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution, depending on the data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommendations ...

  13. Microcanonical thermostatistics analysis without histograms: cumulative distribution and Bayesian approaches

    CERN Document Server

    Alves, Nelson A; Rizzi, Leandro G

    2015-01-01

    Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with continuous energy spectra such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning to obtain continuous estimates for $H(E)$ in order to evaluate $\beta(E)$ by using ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distribution ...
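
    As a rough illustration of the quantity being estimated, the sketch below recovers $\beta(E)$ from canonical samples without histogram binning, using a Gaussian kernel density estimate for $H(E)$; this is one histogram-free route, whereas the record above proposes CDF-expansion and Bayesian estimators. The toy model and all parameters are assumptions.

```python
# Histogram-free sketch of the idea: for canonical samples at inverse
# temperature beta0, S(E) = ln H(E) + beta0*E up to a constant, so
# beta(E) = d ln H/dE + beta0. Here H(E) is estimated with a Gaussian KDE
# and differentiated numerically.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(10)
beta0 = 1.0
# Toy model: E ~ Gamma(shape=3) mimics a density of states Omega(E) ~ E^2
# sampled with weight exp(-beta0*E); then beta(E) = 2/E exactly.
E = rng.gamma(shape=3.0, scale=1.0 / beta0, size=50_000)

kde = gaussian_kde(E)
grid = np.linspace(E.min() + 0.5, E.max() - 0.5, 200)
lnH = np.log(kde(grid))
beta_est = np.gradient(lnH, grid) + beta0

# Columns: E, KDE-based estimate of beta(E), exact value 2/E.
print(np.c_[grid[::50], beta_est[::50], 2 / grid[::50]])
```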

  14. Single-cell analysis of mixed-lineage states leading to a binary cell fate choice.

    Science.gov (United States)

    Olsson, Andre; Venkatasubramanian, Meenakshi; Chaudhri, Viren K; Aronow, Bruce J; Salomonis, Nathan; Singh, Harinder; Grimes, H Leighton

    2016-09-29

    Delineating hierarchical cellular states, including rare intermediates and the networks of regulatory genes that orchestrate cell-type specification, remains a continuing challenge for developmental biology. Single-cell RNA sequencing is greatly accelerating such research, given its power to provide comprehensive descriptions of genomic states and their presumptive regulators. Haematopoietic multipotential progenitor cells, as well as bipotential intermediates, manifest mixed-lineage patterns of gene expression at a single-cell level. Such mixed-lineage states may reflect the molecular priming of different developmental potentials by co-expressed alternative-lineage determinants, namely transcription factors. Although a bistable gene regulatory network has been proposed to regulate the specification of either neutrophils or macrophages, the nature of the transition states manifested in vivo, and the underlying dynamics of the cell-fate determinants, have remained elusive. Here we use single-cell RNA sequencing coupled with a new analytic tool, iterative clustering and guide-gene selection, and clonogenic assays to delineate hierarchical genomic and regulatory states that culminate in neutrophil or macrophage specification in mice. We show that this analysis captured prevalent mixed-lineage intermediates that manifested concurrent expression of haematopoietic stem cell/progenitor and myeloid progenitor cell genes. It also revealed rare metastable intermediates that had collapsed the haematopoietic stem cell/progenitor gene expression programme, instead expressing low levels of the myeloid determinants, Irf8 and Gfi1 (refs 9, 10, 11, 12, 13). Genetic perturbations and chromatin immunoprecipitation followed by sequencing revealed Irf8 and Gfi1 as key components of counteracting myeloid-gene-regulatory networks. Combined loss of these two determinants 'trapped' the metastable intermediate. We propose that mixed-lineage states are obligatory during cell-fate specification.

  15. What leads Indians to participate in clinical trials? A meta-analysis of qualitative studies.

    Directory of Open Access Journals (Sweden)

    Jatin Y Shah

    Full Text Available BACKGROUND: With the globalization of clinical trials, large developing nations have substantially increased their participation in multi-site studies. This participation has raised ethical concerns, among them the fear that local customs, habits and culture are not respected while asking potential participants to take part in a study. This knowledge gap is particularly noticeable among Indian subjects, since despite the large number of participants, little is known regarding what factors affect their willingness to participate in clinical trials. METHODS: We conducted a meta-analysis of all studies evaluating the factors and barriers, from the perspective of potential Indian participants, contributing to their participation in clinical trials. We searched both international as well as Indian-specific bibliographic databases, including Pubmed, Cochrane, Openjgate, MedInd, Scirus and Medknow, also performing hand searches and communicating with authors to obtain additional references. We enrolled studies dealing exclusively with the participation of Indians in clinical trials. Data extraction was conducted by three researchers, with disagreement being resolved by consensus. RESULTS: Six qualitative studies and one survey were found evaluating the main themes affecting the participation of Indian subjects. Themes favouring participation included personal health benefits, altruism, trust in physicians, a source of extra income, detailed knowledge, and methods for motivating participants; barriers included mistrust of trial organizations, concerns about the efficacy and safety of trials, psychological reasons, trial burden, loss of confidentiality, dependency issues, and language. CONCLUSION: We identified factors that facilitate, and barriers that have negative implications for, trial participation decisions of Indian subjects. Due consideration and weight should be given to these factors while planning future trials in India.

  16. Stochastic approach to observability analysis in water networks

    Directory of Open Access Journals (Sweden)

    S. Díaz

    2016-07-01

    Full Text Available This work presents an alternative technique to the existing methods for observability analysis (OA) in water networks, which is an essential prior step for the implementation of state estimation (SE) techniques within such systems. The methodology presented here starts from a known hydraulic state and assumes random Gaussian distributions for the uncertainty of some hydraulic variables, which is then propagated to the rest of the system. This process is then repeated to analyze the change in network uncertainty when metering devices considered error-free are included, on the basis of which network observability can be evaluated. The method's potential is presented in an illustrative example, which shows the additional information that this methodology provides with respect to traditional OA approaches. This proposal allows a better understanding of the network and constitutes a practical tool to prioritize the location of additional meters, thus enhancing the transformation of large urban areas into actual smart cities.
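
    The flavour of the method can be sketched on a toy linearized network: propagate Gaussian input uncertainty by Monte Carlo, then condition on a hypothetical error-free meter and compare prior and posterior variances. The sensitivity matrix and tolerances below are invented for illustration, not taken from the study.

```python
# Sketch of the stochastic observability idea: propagate Gaussian demand
# uncertainty through a linearized network model, then condition on an
# error-free meter and compare the posterior variances. Toy 3-node system.
import numpy as np

rng = np.random.default_rng(2)
A = np.array([[1.0, 0.4, 0.1],     # assumed linearized sensitivity matrix
              [0.4, 1.0, 0.4],     # (heads w.r.t. nodal demands)
              [0.1, 0.4, 1.0]])
demands = rng.normal(0.0, 1.0, size=(100_000, 3))  # uncertain inputs
heads = demands @ A.T                              # propagated uncertainty
print("prior head std:", heads.std(axis=0))

# "Install" an exact meter at node 0: keep only samples consistent with it.
measured = 0.0
keep = np.abs(heads[:, 0] - measured) < 0.05       # tolerance band
print("posterior head std:", heads[keep].std(axis=0))
```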

  17. A Subjective Risk Analysis Approach of Container Supply Chains

    Institute of Scientific and Technical Information of China (English)

    Zai-Li Yang; Jin Wang; Steve Bonsall; Jian-Bo Yang; Quan-Gen Fang

    2005-01-01

    The 9/11 terrorist attacks, the lock-out of the American West Coast ports in 2002 and the outbreak of SARS in 2003 have further focused the minds of both the public and industrialists on taking effective and timely measures for assessing and controlling the risks related to container supply chains (CSCs). However, due to the complexity of the risks in the chains, conventional quantitative risk assessment (QRA) methods may not be capable of providing sufficient safety management information, as achieving such functionality requires enabling the possibility of conducting risk analysis in view of the challenges and uncertainties posed by the unavailability and incompleteness of historical failure data. Combining fuzzy set theory (FST) and an evidential reasoning (ER) approach, the paper presents a subjective method to deal with vulnerability-based risks, which are more ubiquitous and uncertain than the traditional hazard-based ones in the chains.

  18. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

    New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.
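
    A minimal stand-in for two of the listed capabilities, a parallel-coordinates view and visual correlation mining, can be sketched with pandas and matplotlib; this is not the MDX tool, and the data and class labels are synthetic.

```python
# Minimal parallel-coordinates view plus a simple correlation-mining step,
# in the spirit of the MDX canvas (pandas/matplotlib, not the MDX tool).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

rng = np.random.default_rng(11)
df = pd.DataFrame(rng.normal(size=(120, 4)), columns=list("ABCD"))
df["cls"] = np.where(df["A"] + df["B"] > 0, "high", "low")  # toy classes

# Correlation mining companion: rank variable pairs by |r|.
corr = df[list("ABCD")].corr().abs()
pairs = corr.where(np.triu(np.ones(corr.shape, bool), 1)).stack()
print(pairs.sort_values(ascending=False).head())

parallel_coordinates(df, "cls", alpha=0.4)
plt.show()
```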

  19. Strategic Technology Investment Analysis: An Integrated System Approach

    Science.gov (United States)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results satisfy the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations, along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.
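
    A toy version of the weighted multi-attribute scoring with a robustness check might look as follows; the technologies, criteria, weights and jitter level are all illustrative assumptions, not NASA's model.

```python
# Toy multi-attribute scoring with a weight-sensitivity check: rank candidate
# technologies under weighted criteria, then jitter the weights to see how
# robust the top choice is. All names and numbers are illustrative.
import numpy as np

rng = np.random.default_rng(9)
techs = ["sensorA", "propulsionB", "autonomyC", "commD"]  # hypothetical
# columns: mission impact (benefit), development risk and cost (penalties)
scores = np.array([[0.8, 0.3, 0.5],
                   [0.6, 0.6, 0.7],
                   [0.9, 0.7, 0.6],
                   [0.5, 0.2, 0.3]])
signed = scores * np.array([1.0, -1.0, -1.0])  # benefits minus penalties
w = np.array([0.5, 0.3, 0.2])
best = int(np.argmax(signed @ w))

# Sensitivity: perturb the weights and count how often the winner holds.
wj = np.abs(rng.normal(w, 0.05, size=(10_000, 3)))
wj /= wj.sum(axis=1, keepdims=True)
hold = np.mean((wj @ signed.T).argmax(axis=1) == best)
print(f"{techs[best]} stays top-ranked in {hold:.0%} of weight draws")
```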

  20. Analysis of patient diaries in Danish ICUs: a narrative approach

    DEFF Research Database (Denmark)

    Egerod, Ingrid; Christensen, Doris

    2009-01-01

    OBJECTIVES: The objective was to describe the structure and content of patient diaries written for critically ill patients in Danish intensive care units (ICUs). BACKGROUND: Critical illness is associated with physical and psychological aftermath including cognitive impairment and post-traumatic stress. Patient diaries written in the intensive care unit are used to help ICU-survivors come to terms with their illness. RESEARCH METHODOLOGY: The study had a qualitative, descriptive and explorative design, using a narrative approach of analysis. Data were analysed on several levels: extra-case level ... Analysis of the narratives may pave the way for insights to improve critical care nursing and ICU rehabilitation.

  1. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  2. The Network Analysis of Urban Streets: A Primal Approach

    CERN Document Server

    Porta, S; Latora, V; Porta, Sergio; Crucitti, Paolo; Latora, Vito

    2005-01-01

    The network metaphor in the analysis of urban and territorial cases has a long tradition, especially in transportation/land-use planning and economic geography. More recently, urban design has brought its contribution by means of the "space syntax" methodology. All these approaches, though under different terms like accessibility, proximity, integration, connectivity, cost or effort, focus on the idea that some places (or streets) are more important than others because they are more central. The study of centrality in complex systems, however, originated in other scientific areas, namely in structural sociology, well before its use in urban studies; moreover, as a structural property of the system, centrality has never been extensively investigated metrically in geographic networks as it has been topologically in a wide range of other relational networks like social, biological or technological. After two previous works on some structural properties of the dual and primal graph representations of urban street networks ...

  3. Adaptive Approach for Situational Analysis of Space Experiments

    CERN Document Server

    Atanassov, Atanas Marinov

    2010-01-01

    Solving scientific and practical problems connected with the conduct of satellite experiments and measurements demands analysis of geometric and physical conditions according to different kinds of models. This is driven by the need to optimize complex and expensive scientific projects. This article treats complex situational conditions described as conjunctions of elementary, mutually independent conditions. Two formal algorithms, based on alternative strategies for verification of the set of checked conditions, which increase the effectiveness of model calculations, are presented. Each algorithm is connected with an approach for rearranging the set of conditions. The inverse problem is presented as a disjunction of all inverse constraints. The possibilities for applying these algorithms are analyzed, and the behavior and effectiveness of the proposed algorithms are discussed.

  4. Factors influencing crime rates: an econometric analysis approach

    Science.gov (United States)

    Bothos, John M. A.; Thomopoulos, Stelios C. A.

    2016-05-01

    The scope of the present study is to research the dynamics that determine the commission of crimes in US society. Our study is part of a model we are developing to understand urban crime dynamics and to enhance citizens' "perception of security" in large urban environments. The main targets of our research are to highlight the dependence of crime rates on certain social and economic factors and on basic elements of state anticrime policies. In conducting our research, we use as guides previous relevant studies on crime dependence that have been performed with similar quantitative analyses in mind, regarding the dependence of crime on certain social and economic factors using statistics and econometric modelling. Our first approach consists of conceptual state-space dynamic cross-sectional econometric models that incorporate a feedback loop describing crime as a feedback process. In order to define the model variables dynamically, we use statistical analysis on crime records and on records about social and economic conditions and policing characteristics (like police force and policing results, i.e. crime arrests), to determine their influence as independent variables on crime, the dependent variable of our model. The econometric models we apply in this first approach are an exponential log-linear model and a logit model. In a second approach, we try to study the evolution of violent crime through time in the US, independently as an autonomous social phenomenon, using autoregressive and moving-average time-series econometric models. Our findings show that there are certain social and economic characteristics that affect the formation of crime rates in the US, either positively or negatively. Furthermore, the results of our time-series econometric modelling show that violent crime, viewed solely and independently as a social phenomenon, correlates with previous years' crime rates and depends on the conditions of the social and economic environment during previous years.
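
    A minimal sketch of the logit specification on synthetic cross-sectional data is shown below; the regressors (unemployment, police per capita, arrest rate) and coefficients are assumed stand-ins, not the study's actual variables.

```python
# Sketch of a logit crime model on synthetic cross-sectional data; variable
# names and effect sizes are assumptions, not the authors' regressors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
X = np.column_stack([
    rng.normal(6, 2, n),      # unemployment rate (%)
    rng.normal(2, 0.5, n),    # police per 1000 inhabitants
    rng.uniform(0, 1, n),     # arrest rate
])
logit_p = -1.0 + 0.4 * X[:, 0] - 0.8 * X[:, 1] - 1.5 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # 1 = high-crime area

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
print(model.params)                           # recovered coefficients
print(model.get_margeff().summary_frame())    # marginal effects
```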

  5. Learning from Monet: A Fundamentally New Approach to Image Analysis

    Science.gov (United States)

    Falco, Charles M.

    2009-03-01

    The hands and minds of artists are intimately involved in the creative process, intrinsically making paintings complex images to analyze. In spite of this difficulty, several years ago the painter David Hockney and I identified optical evidence within a number of paintings demonstrating that artists as early as Jan van Eyck (c. 1425) used optical projections as aids for producing portions of their images. In the course of making those discoveries, Hockney and I developed new insights that are now being applied in a fundamentally new approach to image analysis. Very recent results from this new approach include identifying from Impressionist paintings by Monet, Pissarro, Renoir and others the precise locations the artists stood when making a number of their paintings. The specific deviations we find when accurately comparing these examples with photographs taken from the same locations provide us with key insights into what the artists' visual skills informed them were the ways to represent these two-dimensional images of three-dimensional scenes to viewers. As will be discussed, these results also have implications for improving the representation of certain scientific data. Acknowledgment: I am grateful to David Hockney for the many invaluable insights into imaging gained from him in our collaboration.

  6. Novel computational approaches for the analysis of cosmic magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Saveliev, Andrey [Universitaet Hamburg, Hamburg (Germany); Keldysh Institute, Moscow (Russian Federation)

    2016-07-01

    In order to give a consistent picture of cosmic, i.e. galactic and extragalactic, magnetic fields, different approaches are possible and often even necessary. Here we present three of them: First, a semianalytic analysis of the time evolution of primordial magnetic fields from which their properties and, subsequently, the nature of present-day intergalactic magnetic fields may be deduced. Second, the use of high-performance computing infrastructure by developing powerful algorithms for (magneto-)hydrodynamic simulations and applying them to astrophysical problems. We are currently developing a code which applies kinetic schemes in massive parallel computing on high performance multiprocessor systems in a new way to calculate both hydro- and electrodynamic quantities. Finally, as a third approach, astroparticle physics might be used, as magnetic fields leave imprints of their properties on charged particles traversing them. Here we focus on electromagnetic cascades by developing a software based on CRPropa which simulates the propagation of particles from such cascades through the intergalactic medium in three dimensions. This may in particular be used to obtain information about the helicity of extragalactic magnetic fields.

  7. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The draw and diversity of a destination's offer are antecedents of growth in tourism visits. Destination supply is differentiated through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, distinguishing whims from trends during the tourism preseason. When considering the return on investment, modifying the destination's tourism offer on the basis of a tourism whim is a risky endeavour indeed. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist visits. With regard to tourism trend research, and based on the research conducted, a model for evaluating tourism phenomena is proposed that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  8. A practical approach to object based requirements analysis

    Science.gov (United States)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operations Division which supports the early identification of objects. This domain-oriented analysis and development concept is based on entity-relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object-oriented design is also promoted by having requirements described in terms of objects. Presented is a five-step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a baseline requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model, and a brief discussion is included on how this approach might be used in a large-scale development effort.

  9. Computational Approach to Dendritic Spine Taxonomy and Shape Transition Analysis

    Science.gov (United States)

    Bokota, Grzegorz; Magnowska, Marta; Kuśmierczyk, Tomasz; Łukasik, Michał; Roszkowska, Matylda; Plewczynski, Dariusz

    2016-01-01

    The common approach in morphological analysis of dendritic spines of mammalian neuronal cells is to categorize spines into subpopulations based on whether they are stubby, mushroom, thin, or filopodia shaped. The corresponding cellular models of synaptic plasticity, long-term potentiation, and long-term depression associate the synaptic strength with either spine enlargement or spine shrinkage. Although a variety of automatic spine segmentation and feature extraction methods were developed recently, no approaches allowing for an automatic and unbiased distinction between dendritic spine subpopulations and detailed computational models of spine behavior exist. We propose an automatic and statistically based method for the unsupervised construction of spine shape taxonomy based on arbitrary features. The taxonomy is then utilized in the newly introduced computational model of behavior, which relies on transitions between shapes. Models of different populations are compared using supplied bootstrap-based statistical tests. We compared two populations of spines at two time points. The first population was stimulated with long-term potentiation, and the other in the resting state was used as a control. The comparison of shape transition characteristics allowed us to identify the differences between population behaviors. Although some extreme changes were observed in the stimulated population, statistically significant differences were found only when whole models were compared. The source code of our software is freely available for non-commercial use. Contact: d.plewczynski@cent.uw.edu.pl. PMID:28066226
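
    The two ingredients, an unsupervised shape taxonomy and shape-transition estimates, can be sketched as follows; plain k-means stands in for the paper's statistically based clustering, and the features are synthetic.

```python
# Sketch of the two ingredients: (i) an unsupervised shape taxonomy (here
# plain KMeans, standing in for the paper's statistical clustering) and
# (ii) a transition matrix between shape classes at two time points.
# Features are synthetic stand-ins (e.g., length, head width, neck width).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
t0 = rng.normal(size=(500, 3))                  # spine features at time 0
t1 = t0 + rng.normal(scale=0.5, size=t0.shape)  # same spines, time 1

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(t0)
c0, c1 = km.predict(t0), km.predict(t1)

k = km.n_clusters
trans = np.zeros((k, k))
for a, b in zip(c0, c1):
    trans[a, b] += 1
trans /= trans.sum(axis=1, keepdims=True)       # row-stochastic transitions
print(np.round(trans, 2))
```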

  10. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Full Text Available Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events. This leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
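
    The MSE coefficients at the core of this approach can be computed as below; this is a generic multiscale sample entropy with the conventional defaults (m = 2, r = 0.2·SD), applied to synthetic noise rather than to audio recordings.

```python
# Minimal multiscale sample entropy, the kind of MSE feature set described
# above; parameters (m=2, r=0.2*std) are the conventional defaults, and the
# input here is synthetic noise rather than audio.
import numpy as np

def sample_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        # Count template pairs of length mm within Chebyshev distance r.
        templates = np.lib.stride_tricks.sliding_window_view(x, mm)
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += np.count_nonzero(d <= r)
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def mse(x, scales=range(1, 6), m=2):
    r = 0.2 * np.std(x)                      # fix r at scale 1, as is common
    out = []
    for s in scales:
        cg = x[: len(x) // s * s].reshape(-1, s).mean(axis=1)  # coarse-grain
        out.append(sample_entropy(cg, m, r))
    return out

signal = np.random.default_rng(5).normal(size=2000)
print(np.round(mse(signal), 3))              # MSE coefficients -> classifier
```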

  11. A Critical Analysis of Rational & Emotional Approaches in Car Selling

    Directory of Open Access Journals (Sweden)

    Krishn A. Goyal

    2010-12-01

    Full Text Available It is a well-known fact that investment in a car is the costliest investment made in a lifetime, second only to the construction of a house, for any human being. It is common knowledge that we are attracted to cars from childhood, and we develop our own perceptions of cars. When we acquire the capacity to buy a car, our buying experience involves both emotional and rational aspects which lead to a purchase decision. Unlike other consumer durables, the decision to buy a specific brand of car is shaped over a long period of time. The period between recognition of the need to buy a car and the actual purchase may run into many weeks or even months. Considerable research has focussed on conceptually and operationally defining the various factors that lead to a purchase decision. However, because of the inherent difficulties in deciphering consumer behaviour, coupled with exponential changes in consumer aspirations, there is a need to constantly re-define our perceptions of consumer behaviour. While Samuelson's Revealed Preference Theory, Herbert Simon's Bounded Rationality Theory and many others have provided a conceptual analysis of consumer behaviour from the perspective of economics, we have still not been able to pinpoint whether consumers are rational or emotional when it comes to buying cars. According to some early economic theorists (e.g., Adam Smith, Jeremy Bentham, Alfred Marshall), man's/woman's desire for goods and services exceeds his/her ability to pay. Therefore, buying decisions are made through a rational process during which we assign a value to each desired product or service offering based upon our assessment of the ability of that offering to satisfy our needs and desires. This want-satisfying ability is termed "utility." As different offerings possess different levels of utility, rational behavior dictates that one seek to maximize utility.

  12. Analysis of dijet events in diffractive ep interactions with tagged leading proton at the H1 experiment

    Energy Technology Data Exchange (ETDEWEB)

    Polifka, Richard

    2011-08-15

    Inclusive dijet production in diffractive deep-inelastic scattering is measured. The diffractive selection is based on tagging of the leading proton in the Forward Proton Spectrometer. The statistics of events obtained during the HERA II running period (integrated luminosity of 156.7 pb⁻¹) enable the measurement of jet final states with a leading proton for the first time. The data cover the phase space x_P < 0.1, |t| ≤ 1.0 GeV² and 4 ≤ Q² ≤ 110 GeV². The dijet data are compared with next-to-leading-order predictions of quantum chromodynamics (QCD). The phase space of diffractive dijets in this analysis is larger by a factor of 3 in x_P than in previous measurements. The QCD predictions based on DGLAP parton evolution describe the measured data well, even in a non-DGLAP-enriched phase space where one of the jets goes into the region close to the direction of the outgoing proton. The measured single-differential cross sections are compared to several Monte Carlo models with different treatments of the diffractive exchange implemented. (orig.)

  13. A historical review and bibliometric analysis of research on lead in drinking water field from 1991 to 2007.

    Science.gov (United States)

    Hu, Jie; Ma, Yuwei; Zhang, Liang; Gan, Fuxing; Ho, Yuh-Shan

    2010-03-01

    A bibliometric analysis based on the Science Citation Index (SCI) published by the Institute of Scientific Information (ISI) was carried out to identify the global research related to lead in the drinking water field from 1991 to 2007 and to improve the understanding of research trends in the same period. The results from this analysis indicate that there was an increasing number of annual publications, mainly during two periods: from 1992 to 1997 and from 2004 to 2007. The United States produced 37% of all pertinent articles, followed by India with 8.0% and Canada with 4.8%. Science of the Total Environment published the most articles, followed by Journal American Water Works Association and Toxicology. A summary of the most frequently used keywords is also provided. "Cadmium" was the most popular author keyword over the 17 years. Furthermore, based on the bibliometric results, four research aspects are summarized in this paper and a historical research review is also presented.
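
    The core bibliometric tallies (annual output, country shares, top author keywords) reduce to simple grouped counts; the sketch below uses a toy record table with assumed field names, not the actual SCI export.

```python
# Sketch of the core bibliometric tallies on a toy record table; the field
# names and records are assumptions, not the actual SCI data.
import pandas as pd

records = pd.DataFrame({
    "year":    [1992, 1995, 2004, 2006, 2007, 2007],
    "country": ["USA", "India", "USA", "Canada", "USA", "India"],
    "keywords": [["lead", "cadmium"], ["drinking water"], ["cadmium"],
                 ["lead"], ["cadmium", "pipes"], ["lead"]],
})

print(records.groupby("year").size())                    # annual output trend
print(records["country"].value_counts(normalize=True))   # country shares
print(records.explode("keywords")["keywords"].value_counts())  # top keywords
```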

  14. A historical review and bibliometric analysis of research on lead in drinking water field from 1991 to 2007

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Jie; Ma, Yuwei [Faculty of Civil Engineering and Geosciences, Delft University of Technology (Netherlands); Zhang, Liang [Institute of Geodesy and Geophysics, Chinese Academy of Sciences, Wuhan 430077 (China); Gan, Fuxing [School of Resource and Environmental Science, Wuhan University, Wuhan 430079 (China); Ho, Yuh-Shan, E-mail: ysho@asia.edu.tw [Water Research Centre, Asia University, Taichung 41354, Taiwan (China); Department of Public Health, China Medical University, Taichung 40402, Taiwan (China)

    2010-03-01

    A bibliometric analysis based on the Science Citation Index (SCI) published by the Institute of Scientific Information (ISI) was carried out to identify the global research related to lead in the drinking water field from 1991 to 2007 and to improve the understanding of research trends in the same period. The results from this analysis indicate that there was an increasing number of annual publications, mainly during two periods: from 1992 to 1997 and from 2004 to 2007. The United States produced 37% of all pertinent articles, followed by India with 8.0% and Canada with 4.8%. Science of the Total Environment published the most articles, followed by Journal American Water Works Association and Toxicology. A summary of the most frequently used keywords is also provided. 'Cadmium' was the most popular author keyword over the 17 years. Furthermore, based on the bibliometric results, four research aspects are summarized in this paper and a historical research review is also presented.

  15. Manufacturing and alignment tolerance analysis through Montecarlo approach for PLATO

    Science.gov (United States)

    Magrin, Demetrio; Ragazzoni, Roberto; Bergomi, Maria; Biondi, Federico; Chinellato, Simonetta; Dima, Marco; Farinato, Jacopo; Greggio, Davide; Gullieuszik, Marco; Marafatto, Luca; Viotto, Valentina; Munari, Matteo; Pagano, Isabella; Sicilia, Daniela; Basso, Stefano; Borsa, Francesco; Ghigo, Mauro; Spiga, Daniele; Bandy, Timothy; Brändli, Mathias; Benz, Willy; Bruno, Giordano; De Roche, Thierry; Piazza, Daniele; Rieder, Martin; Brandeker, Alexis; Klebor, Maximilian; Mogulsky, Valery; Schweitzer, Mario; Wieser, Matthias; Erikson, Anders; Rauer, Heike

    2016-07-01

    The project PLAnetary Transits and Oscillations of stars (PLATO) is one of the selected medium-class (M class) missions in the framework of the ESA Cosmic Vision 2015-2025 program. The main scientific goal of PLATO is the discovery and study of extrasolar planetary systems by means of planetary transit detection. According to the current baseline, the scientific payload consists of 34 all-refractive telescopes having a small aperture (120 mm) and a wide field of view (diameter greater than 37 degrees), observing over the 0.5-1 micron wavelength band. The telescopes are mounted on a common optical bench and are divided into four families of eight telescopes with overlapping lines of sight in order to maximize the science return. The remaining two telescopes will be dedicated to supporting the on-board star-tracking system and will be specialized in two different photometric bands for science purposes. The performance requirement, adopted as the merit function during the analysis, is specified as 90% enclosed energy contained in a square of 2 pixels on a side over the whole field of view, with a depth of focus of ±20 microns. Given the complexity of the system, we have followed a Monte Carlo analysis approach for manufacturing and alignment tolerances. We describe here the tolerancing method and the preliminary results, speculating on the assumed risks and expected performance.
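
    A stripped-down version of such a Monte Carlo tolerance run is sketched below; the perturbed parameters, the quadratic degradation model and the tolerance values are assumptions for illustration, not the PLATO optical model.

```python
# Toy Monte Carlo tolerancing in the spirit described above: perturb a few
# alignment parameters, evaluate a stand-in merit function, and report the
# fraction of trials meeting the requirement.
import numpy as np

rng = np.random.default_rng(6)
n_trials = 20_000
decenter = rng.normal(0, 20e-3, n_trials)   # mm, assumed 1-sigma tolerance
tilt     = rng.normal(0, 0.02, n_trials)    # deg
despace  = rng.normal(0, 30e-3, n_trials)   # mm

# Stand-in merit: nominal enclosed energy degraded quadratically by
# misalignments (invented coefficients, not a real optical model).
ee = 0.95 - 40.0 * decenter**2 - 80.0 * tilt**2 - 20.0 * despace**2
pass_rate = np.mean(ee >= 0.90)             # requirement: EE >= 90% in 2x2 px
print(f"fraction of compliant systems: {pass_rate:.3f}")
```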

  16. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Tewfik Ahmed H

    2006-01-01

    Full Text Available Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.

  17. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Science.gov (United States)

    Tchagang, Alain B.; Tewfik, Ahmed H.

    2006-12-01

    Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.
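
    The bicluster types named above can be characterized by simple variance/residual tests on a candidate submatrix, as in the sketch below; this is a verification step under assumed inputs, not the authors' full search procedure.

```python
# Sketch of the "constant bicluster" idea: given candidate row/column subsets,
# classify the submatrix as constant, constant-on-rows, constant-on-columns,
# or coherent (additive). Verification only, not the full search.
import numpy as np

def bicluster_type(M, rows, cols, tol=1e-9):
    S = M[np.ix_(rows, cols)]
    if np.ptp(S) <= tol:
        return "constant"
    if np.all(np.ptp(S, axis=1) <= tol):
        return "constant on rows"
    if np.all(np.ptp(S, axis=0) <= tol):
        return "constant on columns"
    # Additive coherence S_ij = a_i + b_j leaves zero double-centred residual.
    resid = S - S.mean(1, keepdims=True) - S.mean(0, keepdims=True) + S.mean()
    if np.all(np.abs(resid) <= tol):
        return "coherent (additive)"
    return "not a bicluster"

M = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [5., 6., 7.]])
print(bicluster_type(M, [0, 1, 2], [0, 1, 2]))   # coherent (additive)
```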

  18. Pathway analysis in attention deficit hyperactivity disorder: An ensemble approach.

    Science.gov (United States)

    Mooney, Michael A; McWeeney, Shannon K; Faraone, Stephen V; Hinney, Anke; Hebebrand, Johannes; Nigg, Joel T; Wilmot, Beth

    2016-09-01

    Despite a wealth of evidence for the role of genetics in attention deficit hyperactivity disorder (ADHD), specific and definitive genetic mechanisms have not been identified. Pathway analyses, a subset of gene-set analyses, extend the knowledge gained from genome-wide association studies (GWAS) by providing functional context for genetic associations. However, there are numerous methods for association testing of gene sets and no real consensus regarding the best approach. The present study applied six pathway analysis methods to identify pathways associated with ADHD in two GWAS datasets from the Psychiatric Genomics Consortium. Methods that utilize genotypes to model pathway-level effects identified more replicable pathway associations than methods using summary statistics. In addition, pathways implicated by more than one method were significantly more likely to replicate. A number of brain-relevant pathways, such as RhoA signaling, glycosaminoglycan biosynthesis, fibroblast growth factor receptor activity, and pathways containing potassium channel genes, were nominally significant by multiple methods in both datasets. These results support previous hypotheses about the role of regulation of neurotransmitter release, neurite outgrowth and axon guidance in contributing to the ADHD phenotype and suggest the value of cross-method convergence in evaluating pathway analysis results. © 2016 Wiley Periodicals, Inc.
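
    The cross-method convergence criterion can be sketched as a simple tally: keep pathways flagged by more than one method in both datasets. The method and pathway names below are made up for illustration.

```python
# Sketch of cross-method convergence: tally, per pathway, how many analysis
# methods flag it in each dataset, then intersect across datasets.
from collections import Counter

significant = {
    "dataset1": {"methodA": {"RhoA signaling", "GAG biosynthesis", "FGFR"},
                 "methodB": {"RhoA signaling", "K+ channels"},
                 "methodC": {"GAG biosynthesis", "RhoA signaling"}},
    "dataset2": {"methodA": {"RhoA signaling", "K+ channels"},
                 "methodB": {"RhoA signaling", "GAG biosynthesis"},
                 "methodC": {"axon guidance"}},
}

def convergent(ds, min_methods=2):
    counts = Counter(p for hits in ds.values() for p in hits)
    return {p for p, c in counts.items() if c >= min_methods}

replicated = convergent(significant["dataset1"]) & convergent(significant["dataset2"])
print(replicated)   # pathways implicated by >1 method in both datasets
```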

  19. A novel approach for system change pathway analysis

    Directory of Open Access Journals (Sweden)

    Walaa Ibrahim Gabr

    2016-03-01

    Full Text Available This paper presents a novel approach, based on "consolidity charts", for the analysis of natural and man-made systems during their change pathway or course of life. The physical significance of the consolidity chart (region) is that it marks the boundary of all system interactive behavior resulting from all exhaustive internal and external influences. For instance, at a specific event state, the corresponding consolidity region describes all the plausible points of normalized input-output (fuzzy or non-fuzzy) interactions. These charts are developed at each event step for zone scaling of system parameter changes due to affecting events or varying environments "on and above" their normal operation or set points, following the "time driven-event driven-parameters change" paradigm. Examples of the consolidity trajectory movement in the regions or pattern centers of the proposed charts are developed for various consolidity classes, showing situations of change pathways from the unconsolidated form to the consolidated one and vice versa. It is shown that comparisons of regions are based on the geometric shape properties of the consolidity regions. Moreover, it is illustrated that the centerlines connecting consolidity regions during the change pathway could follow certain types of trajectories, designated as "consolidity pathway trajectories", which could assume various forms, including zigzag patterns, depending on the consecutive affecting influences. Implementation procedures are elaborated for the consolidity chart analysis of four real-life case studies during their conventional and unconventional change pathways, describing: (i) the drug concentration production problem, (ii) the prey-predator population problem, (iii) the spread of infectious disease problem and (iv) the HIV/AIDS epidemic problem. These solved case studies lucidly demonstrate the applicability and effectiveness of the suggested approach.

  20. Extreme storm surges: a comparative study of frequency analysis approaches

    Science.gov (United States)

    Hamdi, Y.; Bardet, L.; Duluc, C.-M.; Rebour, V.

    2014-08-01

    In France, nuclear facilities were designed around very low probabilities of failure. Nevertheless, some extreme climatic events have given rise to exceptional observed surges (outliers) much larger than other observations, and have clearly illustrated the potential to underestimate the extreme water levels calculated with the current statistical methods. The objective of the present work is to conduct a comparative study of three approaches to extreme value analysis: the annual maxima (AM), peaks-over-threshold (POT) and r-largest order statistics (r-LOS) methods. These methods are illustrated in a real analysis case study. All data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on (i) the degrees of uncertainty, (ii) the adequacy criteria and tests, and (iii) visual inspection. It was found that the r-LOS and POT methods reduced the uncertainty on the distribution parameters and return level estimates, and systematically showed values of the 100- and 500-year return levels smaller than those estimated with the AM method. Results also showed that none of the compared methods allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativeness of outliers in data sets. Findings are of practical relevance, not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.
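
    A compact comparison of the AM and POT estimates on synthetic surge data might look as follows; the threshold is fixed at an arbitrary quantile here, whereas the study selects it with stability plots, and the r-LOS variant is omitted.

```python
# Sketch comparing AM/GEV and POT/GPD return-level estimates on synthetic
# surge data; threshold choice and all distributions are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
years, per_year = 50, 365
surges = rng.gumbel(loc=1.0, scale=0.3, size=(years, per_year))

# AM: fit a GEV to annual maxima; 100-yr level is the (1 - 1/100) quantile.
am = surges.max(axis=1)
c, loc, scale = stats.genextreme.fit(am)
rl_am = stats.genextreme.ppf(1 - 1 / 100, c, loc, scale)

# POT: fit a GPD to excesses over a high threshold.
x = surges.ravel()
u = np.quantile(x, 0.99)
exc = x[x > u] - u
cg, _, sg = stats.genpareto.fit(exc, floc=0)
lam = exc.size / years                        # exceedances per year
rl_pot = u + stats.genpareto.ppf(1 - 1 / (lam * 100), cg, 0, sg)

print(f"100-yr return level  AM/GEV: {rl_am:.2f}  POT/GPD: {rl_pot:.2f}")
```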

  1. ANALYSIS, SELECTION AND RANKING OF FOREIGN MARKETS. A COMPREHENSIVE APPROACH

    Directory of Open Access Journals (Sweden)

    LIVIU NEAMŢU

    2013-12-01

    Full Text Available Choosing the appropriate markets for growth and development is essential for a company that wishes to expand its business through international economic exchanges. But in this case foreign market research, although an important chapter in decision technology and an indispensable condition for achieving the firm's objectives, is not sufficient on its own. Whereas in marketing on the national market the market is given, requiring no more than prospection and segmentation, in the case of international markets the research process must be complemented by a selection of markets and their classification. Companies with this intention know little or nothing about the conditions offered by one new market or another. Therefore, they must go, step by step, through a complex, multilevel analysis process, composed of selection and ranking of markets followed by proper research through exploration and segmentation, which can lead to choosing the most profitable markets. In this regard, within this study, we propose a multi-criteria model for the selection and ranking of international development markets, allowing companies access to those markets which are in compliance with the company's development strategy.

  2. Dynamic Range Size Analysis of Territorial Animals: An Optimality Approach.

    Science.gov (United States)

    Tao, Yun; Börger, Luca; Hastings, Alan

    2016-10-01

    Home range sizes of territorial animals are often observed to vary periodically in response to seasonal changes in foraging opportunities. Here we develop the first mechanistic model focused on the temporal dynamics of home range expansion and contraction in territorial animals. We demonstrate how simple movement principles can lead to a rich suite of range size dynamics, by balancing foraging activity with defensive requirements and incorporating optimal behavioral rules into mechanistic home range analysis. Our heuristic model predicts three general temporal patterns that have been observed in empirical studies across multiple taxa. First, a positive correlation between age and territory quality promotes shrinking home ranges over an individual's lifetime, with maximal range size variability shortly before the adult stage. Second, poor sensory information, low population density, and large resource heterogeneity may all independently facilitate range size instability. Finally, aggregation behavior toward forage-rich areas helps produce divergent home range responses between individuals from different age classes. This model has broad applications for addressing important unknowns in animal space use, with potential applications also in conservation and health management strategies.

  3. Investigation of mercury-free potentiometric stripping analysis and the influence of mercury in the analysis of trace-elements lead and zinc

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Andersen, Laust

    1997-01-01

    Application of Potentiometric Stripping Analysis (PSA), without any mercury, to the determination of the trace elements lead and zinc results in linear responses between stripping-peak areas and concentrations within the range 0-2000 ng/g. The best response, as determined by the size of the stripping areas, was obtained with an electrode prepared with mercury but without mercury ions in the electrolyte. In 0.09-0.1 M HCl, lead is analysed with a freshly polished glassy-carbon electrode, while analysis of zinc requires an electrode activation procedure. The electrode activation is performed by stripping zinc in an electrolyte containing 0.1 M HCl and 2 mg/g Zn2+ and electrolysis at -1400 mV (SCE). It is suggested that the concentration range of linear response occurs where the electrode is not fully covered by metal clusters during the electrolysis step. The influence of mercury is investigated and a model is proposed.

  4. Glycan Node Analysis: A Bottom-up Approach to Glycomics.

    Science.gov (United States)

    Zaare, Sahba; Aguilar, Jesús S; Hu, Yueming; Ferdosi, Shadi; Borges, Chad R

    2016-01-01

    facilitates relative quantification of individual glycan nodes in a sample. Although presently constrained in terms of its absolute limits of detection, this method expedites the analysis of clinical biofluids and shows considerable promise as a complementary approach to traditional top-down glycomics.

  5. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    hybridization method (FISH). This approach was used to evaluate the contribution of each gram-negative bacteria group. No significant difference between the main soil gram-negative bacterial groups (phylum Proteobacteria and Bacteroidetes) was found, under either aerobic or anaerobic conditions, in the chernozem topsoil. Thus soil gram-negative bacteria play an important ecological role in natural polymer degradation as a common group of microorganisms. Another approach, using a cascade filtration technique for estimating bacterial population density in chernozem, was compared to the classical fluorescent microscopy method. Quantification of soil bacteria with cascade filtration was performed with filters of different diameters and filtration of a fixed amount of soil suspension. In comparison to the classical fluorescent microscopy method, the modification with filtration of soil suspension quantified more bacterial cells. Thus, biomass calculations of soil bacteria using classical fluorescent microscopy could be underestimates, and combination with the cascade filtration technique allows a potential experimental error to be avoided. Thereby, the combination and comparison of several fluorescent microscopy method modifications established during the research provided diverse approaches to soil bacteria quantification and to the analysis of the ecological roles of soil microorganisms.

  6. Impact of right-ventricular apical pacing on the optimal left-ventricular lead positions measured by phase analysis of SPECT myocardial perfusion imaging

    Energy Technology Data Exchange (ETDEWEB)

    Hung, Guang-Uei [Chang Bing Show Chwan Memorial Hospital, Changhua (China); China Medical University, Department of Biomedical Imaging and Radiological Science, Taichung (China); Huang, Jin-Long [Taichung Veterans General Hospital, Cardiovascular Center, Taichung (China); School of Medicine, National Yang-Ming University, Institute of Clinical Medicine, and Cardiovascular Research Institute, Department of Medicine, Taipei (China); Chung-Shan Medical University, Department of Medicine, School of Medicine, Taichung (China); Lin, Wan-Yu; Tsai, Shih-Chung [Taichung Veterans General Hospital, Department of Nuclear Medicine, Taichung (China); Wang, Kuo-Yang [Taichung Veterans General Hospital, Cardiovascular Center, Taichung (China); Chung-Shan Medical University, Department of Medicine, School of Medicine, Taichung (China); Chen, Shih-Ann [School of Medicine, National Yang-Ming University, Institute of Clinical Medicine, and Cardiovascular Research Institute, Department of Medicine, Taipei (China); Taipei Veterans General Hospital, Division of Cardiology, Department of Medicine, Taipei (China); Lloyd, Michael S.; Chen, Ji [Emory University, Department of Radiology and Imaging Sciences, Atlanta, GA (United States)

    2014-06-15

    The use of SPECT phase analysis to optimize left-ventricular (LV) lead positions for cardiac resynchronization therapy (CRT) was performed at baseline, but CRT works as simultaneous right-ventricular (RV) and LV pacing. The aim of this study was to assess the impact of RV apical (RVA) pacing on optimal LV lead positions measured by SPECT phase analysis. This study prospectively enrolled 46 patients. Two SPECT myocardial perfusion scans were acquired under sinus rhythm with complete left bundle branch block and RVA pacing, respectively, following a single injection of 99mTc-sestamibi. LV dyssynchrony parameters and optimal LV lead positions were measured by the phase analysis technique and then compared between the two scans. The LV dyssynchrony parameters were significantly larger with RVA pacing than with sinus rhythm (p < 0.01). In 39 of the 46 patients, the optimal LV lead positions were the same between RVA pacing and sinus rhythm (kappa = 0.861). In 6 of the remaining 7 patients, the optimal LV lead positions were along the same radial direction, but RVA pacing shifted the optimal LV lead positions toward the base. The optimal LV lead positions measured by SPECT phase analysis were consistent whether the SPECT images were acquired under sinus rhythm or RVA pacing. In some patients, RVA pacing shifted the optimal LV lead positions toward the base. This study supports the use of baseline SPECT myocardial perfusion imaging to optimize LV lead positions to increase CRT efficacy. (orig.)

  7. Poverty Analysis of Rice Farming Households: A Multidimensional Approach

    Directory of Open Access Journals (Sweden)

    Adenuga A. H

    2013-12-01

    Full Text Available The official measurement and analysis of poverty in Nigeria has historically relied upon the single-dimension, consumption-based monetary approach, with little attention to multidimensional poverty assessment. This study was therefore carried out to assess the multidimensional poverty index of rice farming households in the Nasarawa/Benue Rice Hub, Nigeria. The study employed a stratified random sampling technique to select 149 rice farming households in the study area. Descriptive statistics, the Alkire and Foster Multidimensional Poverty Index Methodology using two different cut-off points, and the Tobit regression model were the main analytical tools employed for the study. The results of the multidimensional poverty index analysis revealed that female-headed households were poorer than male-headed households. Overall, 66 percent of the rice farming households were multidimensionally poor. The study also showed that the rice farming households were deprived in 48 percent of the dimensions. A multidimensional poverty index of 0.32 was obtained for the rice farming households in the study area, with varying values obtained for the male- and female-headed households. The result of the Tobit regression model showed that gender of the household head, health, marital status and membership of an association were the major determinants of multidimensional poverty of the rice farming households in the study area. The study concluded that the rice farming households in the study area were multidimensionally poor. It was recommended that the government should give priority to the development of the rural areas, with special consideration for women, through the provision of essential infrastructural facilities.

  8. A cost minimisation analysis in teledermatology: model-based approach

    Directory of Open Access Journals (Sweden)

    Eminović Nina

    2010-08-01

    Full Text Available Abstract Background Although store-and-forward teledermatology is increasingly popular, evidence on its effects on efficiency and costs is lacking. The aim of this study, performed in addition to a clustered randomised trial, was to investigate to what extent and under which conditions store-and-forward teledermatology can reduce costs from a societal perspective. Methods A cost minimisation study design (a model-based approach) was applied to compare teledermatology and conventional process costs per dermatology patient care episode. Regarding the societal perspective, total mean costs of investment, general practitioner, dermatologist, out-of-pocket expenses and employer costs were calculated. Uncertainty analysis was performed using Monte Carlo simulation with 31 distributions in the cost model. Scenario analysis was performed using one-way and two-way sensitivity analyses with the following variables: the patient travel distance to physician and dermatologist, the duration of teleconsultation activities, and the proportion of preventable consultations. Results Total mean costs of the teledermatology process were €387 (95%CI, 281 to 502.5), while the total mean costs of the conventional process were €354.0 (95%CI, 228.0 to 484.0). The total mean difference between the processes was €32.5 (95%CI, -29.0 to 74.7). Savings by teledermatology can be achieved if the distance to a dermatologist is large (≥75 km) or when enough consultations (≥37%) can be prevented due to teledermatology. Conclusions Teledermatology, when applied to all dermatology referrals, has a probability of 0.11 of being cost saving to society. In order to achieve cost savings by teledermatology, teledermatology should be applied in only those cases with a reasonable probability that a live consultation can be prevented. Trial Registration This study is performed partially based on the PERFECT D Trial (Current Controlled Trials No. ISRCTN57478950).
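
    To illustrate the kind of uncertainty analysis described above, the sketch below runs a Monte Carlo comparison of two cost processes. The cost components, distributions and euro values are invented placeholders (the actual model used 31 distributions); only the structure of the calculation is meant to correspond.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000  # Monte Carlo draws

        # Illustrative cost components per care episode (EUR); all assumed.
        tele = (rng.normal(60, 10, n)      # GP teleconsultation time
                + rng.normal(250, 60, n)   # dermatologist time + investment share
                + rng.normal(75, 25, n))   # patient travel / employer costs
        conv = (rng.normal(230, 55, n)     # conventional dermatologist visit
                + rng.normal(125, 40, n))  # travel, out-of-pocket, employer costs

        diff = tele - conv
        lo, hi = np.percentile(diff, [2.5, 97.5])
        print(f"mean difference: {diff.mean():.1f} EUR (95% CI {lo:.1f} to {hi:.1f})")
        print(f"P(teledermatology is cost saving) = {(diff < 0).mean():.2f}")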

  9. Extreme storm surges: a comparative study of frequency analysis approaches

    Directory of Open Access Journals (Sweden)

    Y. Hamdi

    2013-11-01

    Full Text Available In France, nuclear facilities were designed for very low probabilities of failure. Nevertheless, exceptional climatic events have given rise to surges much larger than the observations (outliers) and have clearly illustrated the potential to underestimate the extreme water levels calculated with current statistical methods. The objective of the present work is to conduct a comparative study of three approaches: the Annual Maxima (AM), the Peaks-Over-Threshold (POT) and the r-Largest Order Statistics (r-LOS). These methods are illustrated in a real case study. All the data sets were screened for outliers. Non-parametric tests for randomness, homogeneity and stationarity of the time series were used. The shape and scale parameter stability plots, the mean excess residual life plot and the stability of the standard errors of return levels were used to select optimal thresholds and r values for the POT and r-LOS methods, respectively. The comparison of methods was based on: (i) the uncertainty degrees, (ii) the adequacy criteria and tests, and (iii) visual inspection. It was found that the r-LOS and POT methods reduced the uncertainty on the distribution parameters and return level estimates, and systematically gave values of the 100 and 500 yr return levels smaller than those estimated with the AM method. Results also showed that none of the compared methods allowed a good fit at the right tail of the distribution in the presence of outliers. As a perspective, the use of historical information was proposed in order to increase the representativity of outliers in data sets. Findings are of practical relevance not only to nuclear energy operators in France, for applications in storm surge hazard analysis and flood management, but also for the optimal planning and design of facilities to withstand extreme environmental conditions, with an appropriate level of risk.
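
    For the AM approach, the workflow reduces to fitting a generalized extreme value (GEV) distribution to the annual maxima and reading return levels off its quantiles. A minimal sketch with scipy follows; the surge series is synthetic stand-in data, not the study's tide-gauge records.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Stand-in annual-maximum surge series (m); a real study uses gauge records.
        am = stats.genextreme.rvs(c=-0.1, loc=0.8, scale=0.25, size=60, random_state=rng)

        shape, loc, scale = stats.genextreme.fit(am)   # maximum-likelihood GEV fit
        for T in (100, 500):                           # return periods in years
            level = stats.genextreme.ppf(1 - 1/T, shape, loc=loc, scale=scale)
            print(f"{T}-yr return level: {level:.2f} m")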

  10. A Multimodal Data Analysis Approach for Targeted Drug Discovery Involving Topological Data Analysis (TDA).

    Science.gov (United States)

    Alagappan, Muthuraman; Jiang, Dadi; Denko, Nicholas; Koong, Albert C

    2016-01-01

    In silico drug discovery refers to a combination of computational techniques that augment our ability to discover drug compounds from compound libraries. Many such techniques exist, including virtual high-throughput screening (vHTS), high-throughput screening (HTS), and mechanisms for data storage and querying. However, presently these tools are often used independently of one another. In this chapter, we describe a new multimodal in silico technique for the hit identification and lead generation phases of traditional drug discovery. Our technique leverages the benefits of three independent methods (virtual high-throughput screening, high-throughput screening, and structural fingerprint analysis) by using a fourth technique called topological data analysis (TDA). We describe how a compound library can be independently tested with vHTS, HTS, and fingerprint analysis, and how the results can be transformed into a topological data analysis network to identify compounds from a diverse group of structural families. This process of using TDA or similar clustering methods to identify drug leads is advantageous because it provides a mechanism for choosing structurally diverse compounds while maintaining the unique advantages of already established techniques such as vHTS and HTS.
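
    The selection step can be approximated without specialized TDA software: cluster compounds by structural fingerprint and keep the top-scoring member of each family. The sketch below uses hierarchical clustering as a simplified stand-in for the TDA network; the fingerprints, vHTS/HTS scores, and cluster count are all mock values.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(2)
        fps = rng.integers(0, 2, size=(200, 166))   # mock 166-bit structural fingerprints
        hts = rng.random(200)                       # mock HTS activity score per compound
        vhts = rng.random(200)                      # mock vHTS docking score per compound

        # Jaccard distance on binary fingerprints = 1 - Tanimoto similarity.
        dist = pdist(fps.astype(bool), metric="jaccard")
        clusters = fcluster(linkage(dist, method="average"), t=20, criterion="maxclust")

        # Pick the best-scoring compound from each structural family, which keeps
        # the selection structurally diverse.
        for c in np.unique(clusters):
            idx = np.where(clusters == c)[0]
            best = idx[np.argmax(hts[idx] + vhts[idx])]
            print(f"family {c}: best compound index {best}")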

  11. Analysis of Lotka's Law: The Simon-Yule Approach.

    Science.gov (United States)

    Chen, Ye-Sho

    1989-01-01

    Argues that a major difficulty in using Lotka's law in information science arises from the misuse of goodness of fit tests in parameter estimation. Three approaches for studying Lotka's law are presented: an index approach, a time series approach, and a generating mechanism incorporating these two influential variables to derive an equilibrium…

  12. Global analysis of nuclear parton distribution functions and their uncertainties at next-to-next-to-leading order

    CERN Document Server

    Khanpour, Hamzeh

    2016-01-01

    We perform a next-to-next-to-leading order (NNLO) analysis of nuclear parton distribution functions (nPDFs) using neutral current charged-lepton ($\ell^\pm$ + nucleus) deeply inelastic scattering (DIS) data and Drell-Yan (DY) cross-section ratios $\sigma_{DY}^{A}/\sigma_{DY}^{A^\prime}$ for several nuclear targets. We study in detail the parameterizations and the atomic mass (A) dependence of the nuclear PDFs at this order. The present nuclear PDF global analysis provides a complete set of nuclear PDFs, $f_i^{(A,Z)}(x,Q^2)$, with full functional dependence on $x$, A and $Q^2$. The uncertainties of the obtained nuclear modification factors for each parton flavour are estimated using the well-known Hessian method. The nuclear charm quark distributions are also added into the analysis. We compare the parametrization results with the available data and the results of other nuclear PDF groups, and find our nuclear PDFs to be in reasonably good agreement with them. The estimates of errors provided by our glob...
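
    For reference, Hessian PDF uncertainties on an observable $X$ are conventionally obtained from the fit's N eigenvector pairs $S_k^{\pm}$ via the symmetric master formula (the standard prescription, not a detail quoted from this record):

        $\Delta X = \frac{1}{2}\sqrt{\sum_{k=1}^{N}\left[X(S_k^{+})-X(S_k^{-})\right]^{2}}$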

  13. A combined approach for comparative exoproteome analysis of Corynebacterium pseudotuberculosis

    Directory of Open Access Journals (Sweden)

    Scrivens James H

    2011-01-01

    Full Text Available Abstract Background Bacterial exported proteins represent key components of the host-pathogen interplay. Hence, we sought to implement a combined approach for characterizing the entire exoproteome of the pathogenic bacterium Corynebacterium pseudotuberculosis, the etiological agent of caseous lymphadenitis (CLA) in sheep and goats. Results An optimized protocol of three-phase partitioning (TPP) was used to obtain the C. pseudotuberculosis exoproteins, and a newly introduced method of data-independent MS acquisition (LC-MSE) was employed for protein identification and label-free quantification. Additionally, the recently developed tool SurfG+ was used for in silico prediction of sub-cellular localization of the identified proteins. In total, 93 different extracellular proteins of C. pseudotuberculosis were identified with high confidence by this strategy; 44 proteins were commonly identified in two different strains, isolated from distinct hosts, then composing a core C. pseudotuberculosis exoproteome. Analysis with the SurfG+ tool showed that more than 75% (70/93) of the identified proteins could be predicted as containing signals for active exportation. Moreover, evidence could be found for probable non-classical export of most of the remaining proteins. Conclusions Comparative analyses of the exoproteomes of two C. pseudotuberculosis strains, in addition to comparison with other experimentally determined corynebacterial exoproteomes, were helpful to gain novel insights into the contribution of the exported proteins in the virulence of this bacterium. The results presented here compose the most comprehensive coverage of the exoproteome of a corynebacterial species so far.

  14. Analysis of resource efficiency: a production frontier approach.

    Science.gov (United States)

    Hoang, Viet-Ngu

    2014-05-01

    This article integrates material/energy flow analysis into a production frontier framework to quantify resource efficiency (RE). The emergy content of natural resources instead of their mass content is used to construct aggregate inputs. Using the production frontier approach, aggregate inputs are optimised relative to given output quantities to derive RE measures. This framework is superior to existing RE indicators currently used in the literature. Using the exergy/emergy content in constructing aggregate material or energy flows overcomes the criticism that mass content cannot capture the differing quality of different types of resources. The derived RE measures are both 'qualitative' and 'quantitative', whereas existing RE indicators are only qualitative. An empirical examination of the RE of 116 economies was undertaken to illustrate the practical applicability of the new framework. The results showed that economies, on average, could reduce the consumption of resources by more than 30% without any reduction in per capita gross domestic product (GDP). This calculation occurred after adjustments for differences in the purchasing power of national currencies. RE varied widely across economies and was positively correlated with labour force participation, population density, urbanisation, and GDP growth over the past five years. The results also showed that economies of a higher income group achieved higher RE, and that economies more dependent on imports and primary industries had lower RE performance.
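
    Production frontier optimisation of this kind is typically implemented as a data envelopment analysis (DEA) linear program. Below is a minimal input-oriented CCR sketch with invented toy data; the emergy-based inputs and GDP output are placeholders, not the paper's 116-economy data set.

        import numpy as np
        from scipy.optimize import linprog

        # Toy data: 5 economies, 2 aggregate (emergy-based) inputs, 1 output (GDP).
        X = np.array([[4, 2], [6, 3], [5, 5], [8, 4], [3, 6]], float)  # inputs
        Y = np.array([[10], [12], [11], [13], [9]], float)             # outputs

        def ccr_efficiency(j0):
            """Input-oriented CCR efficiency of unit j0:
            min theta  s.t.  X^T lam <= theta * x0,  Y^T lam >= y0,  lam >= 0."""
            n, m = X.shape
            s = Y.shape[1]
            c = np.r_[1.0, np.zeros(n)]                  # decision vector [theta, lam]
            A_ub = np.block([[-X[j0].reshape(m, 1), X.T],
                             [np.zeros((s, 1)), -Y.T]])
            b_ub = np.r_[np.zeros(m), -Y[j0]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
            return res.x[0]

        for j in range(len(X)):
            print(f"economy {j}: efficiency = {ccr_efficiency(j):.2f}")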

  15. Efficient Analysis of Pattern and Association Rule Mining Approaches

    Directory of Open Access Journals (Sweden)

    Thabet Slimani

    2014-02-01

    Full Text Available The process of data mining produces various patterns from a given data source. The most recognized data mining tasks are the discovery of frequent itemsets, frequent sequential patterns, frequent sequential rules and frequent association rules. Numerous efficient algorithms have been proposed for these tasks. Frequent pattern mining has been a focal topic in data mining research with a large literature, and important progress has been made, ranging from efficient algorithms for frequent itemset mining in transaction databases to more complex tasks such as sequential pattern mining, structured pattern mining and correlation mining. Association Rule Mining (ARM) is one of the most widely used data mining techniques; it groups objects together from large databases with the aim of extracting interesting correlations and relations among large amounts of data. In this article, we provide a brief review and analysis of the current status of frequent pattern mining and discuss some promising research directions. Additionally, this paper includes a comparative study of the performance of the described approaches.
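
    To make the frequent-itemset task concrete, here is a minimal Apriori-style miner over a toy transaction set; the item names and support threshold are invented for illustration.

        from itertools import combinations

        transactions = [{"bread", "milk"}, {"bread", "diapers", "beer"},
                        {"milk", "diapers", "beer"}, {"bread", "milk", "diapers"},
                        {"bread", "milk", "beer"}]

        def apriori(transactions, min_support):
            n = len(transactions)
            support = lambda items: sum(items <= t for t in transactions) / n
            # frequent 1-itemsets
            items = {i for t in transactions for i in t}
            level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
            frequent = {s: support(s) for s in level}
            k = 2
            while level:
                # candidate generation: join frequent (k-1)-itemsets, then prune by support
                candidates = {a | b for a, b in combinations(level, 2) if len(a | b) == k}
                level = [c for c in candidates if support(c) >= min_support]
                frequent.update({c: support(c) for c in level})
                k += 1
            return frequent

        for itemset, sup in sorted(apriori(transactions, 0.4).items(), key=lambda kv: -kv[1]):
            print(set(itemset), round(sup, 2))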

  16. A Multidisciplinary Approach to Mixer-Ejector Analysis and Design

    Science.gov (United States)

    Hendricks, Eric, S.; Seidel, Jonathan, A.

    2012-01-01

    The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector which will successfully address both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. From the developed process, sample results are given for a notional mixer-ejector design, thereby demonstrating the capabilities of the method.

  17. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  18. Exposures to lead.

    Science.gov (United States)

    Callan, Anna C; Hinwood, Andrea L

    2011-01-01

    The Pacific Basin Consortium for Environment and Health hosted a workshop on Exposures to Lead. Speakers from Australia and the United States of America addressed current research knowledge on lead exposures and health effects in children, risk assessment and communication issues in dealing with lead exposure sources, different methods for assessing exposure, and the variety of scenarios where lead still remains a pollutant of concern. Mining continues to be a source of lead for many communities, and approaches to reducing exposures in these settings present particular challenges. A Perth Declaration for the Global Reduction of Childhood Lead Exposure was signed by participants of the meeting and is aimed at increasing attention to the need to continue to assess lead in the environment and to develop strategies to reduce lead in the environment and exposure by communities.

  19. Analysis of the real EADGENE data set: Multivariate approaches and post analysis

    OpenAIRE

    Schuberth Hans-Joachim; van Schothorst Evert M; Lund Mogens; San Cristobal Magali; Robert-Granié Christèle; Pool Marco H; Petzl Wolfram; Nie Haisheng; Cao Kim-Anh; de Koning Dirk-Jan; Jiang Li; Jensen Kirsty; Hulsegge Ina; Jaffrézic Florence; Hornshøj Henrik

    2007-01-01

    Abstract The aim of this paper was to describe, and when possible compare, the multivariate methods used by the participants in the EADGENE WP1.4 workshop. The first approach was for class discovery and class prediction using evidence from the data at hand. Several teams used hierarchical clustering (HC) or principal component analysis (PCA) to identify groups of differentially expressed genes with a similar expression pattern over time points and infective agent (E. coli or S. aureus). The m...

  20. Analysis of variance of designed chromatographic data sets: The analysis of variance-target projection approach.

    Science.gov (United States)

    Marini, Federico; de Beer, Dalene; Joubert, Elizabeth; Walczak, Beata

    2015-07-31

    Direct application of popular approaches, e.g., Principal Component Analysis (PCA) or Partial Least Squares (PLS), to chromatographic data originating from a well-designed experimental study including more than one factor is not recommended. In the case of a well-designed experiment involving two or more factors (crossed or nested), data are usually decomposed into the contributions associated with the studied factors (and with their interactions), and the individual effect matrices are then analyzed using, e.g., PCA, as in the case of ASCA (analysis of variance combined with simultaneous component analysis). As an alternative to the ASCA method, we propose the application of PLS followed by target projection (TP), which allows a one-factor representation of the model for each column in the design dummy matrix. PLS is applied after proper deflation of the experimental matrix, i.e., to what are called the residuals under the reduced ANOVA model. The proposed approach (ANOVA-TP) is well suited for the study of designed chromatographic data of complex samples. It allows testing of the statistical significance of the studied effects, 'biomarker' identification, and enables straightforward visualization and accurate estimation of between- and within-class variance. The proposed approach has been successfully applied to a case study aimed at evaluating the effect of pasteurization on the concentrations of various phenolic constituents of rooibos tea of different quality grades, and its outcomes have been compared to those of ASCA.
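
    The decomposition step is easy to sketch in code: build per-factor effect matrices from the level means of the centered data, then analyze each effect (ASCA would use PCA; ANOVA-TP uses PLS plus target projection instead). The data dimensions, factor layout, and the PCA-style final step below are mock choices for illustration only, not the paper's rooibos data.

        import numpy as np

        rng = np.random.default_rng(3)
        # Mock chromatographic data: 24 samples x 50 retention-time variables,
        # a 2-level pasteurization factor crossed with a 3-level quality grade.
        past = np.repeat([0, 1], 12)
        grade = np.tile(np.repeat([0, 1, 2], 4), 2)
        X = rng.normal(size=(24, 50))

        def effect_matrix(X, factor):
            """ANOVA effect matrix: per-level means of centered data, one row per sample."""
            Xc = X - X.mean(axis=0)
            E = np.zeros_like(Xc)
            for lvl in np.unique(factor):
                E[factor == lvl] = Xc[factor == lvl].mean(axis=0)
            return E

        E_past = effect_matrix(X, past)
        E_grade = effect_matrix(X, grade)
        residuals = (X - X.mean(axis=0)) - E_past - E_grade

        # ASCA-style step: PCA (via SVD) of one effect matrix augmented with residuals.
        u, s, vt = np.linalg.svd(E_past + residuals, full_matrices=False)
        print("variance captured by first component: %.1f%%" % (100 * s[0]**2 / (s**2).sum()))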

  1. The 'Food Polymer Science' Approach to the Practice of Industrial R&D, Leading to Patent Estates Based on Fundamental Starch Science and Technology.

    Science.gov (United States)

    Slade, Louise; Levine, Harry

    2016-09-22

    This paper reviews the application of the 'Food Polymer Science' approach to the practice of industrial R&D, leading to patent estates based on fundamental starch science and technology. The areas of patents and patented technologies reviewed here include: a) soft-from-the-freezer ice creams and freezer-storage-stable frozen bread dough products, based on 'cryostabilization technology' of frozen foods, utilizing commercial starch hydrolysis products (SHPs); b) glassy-matrix encapsulation technology for flavors and other volatiles, based on structure-function relationships for commercial SHPs; c) production of stabilized whole-grain wheat flours for biscuit products, based on the application of 'solvent retention capacity' technology to develop flours with reduced damaged starch; d) production of improved-quality, low-moisture cookies and crackers, based on pentosanase enzyme technology; e) production of 'baked-not-fried', chip-like, starch-based snack products, based on the use of commercial modified-starch ingredients with selected functionality; f) accelerated staling of a starch-based food product from baked bread crumb, based on the kinetics of starch retrogradation, treated as a crystallization process for a partially crystalline glassy polymer system; g) a process for producing an enzyme-resistant starch, for use as a reduced-calorie flour replacer in a wide range of grain-based food products, including cookies, extruded expanded snacks, and breakfast cereals.

  2. $\bar{B}^0_s \to (\pi^0 \eta^{(*)}, \eta^{(*)}\eta^{(*)})$ decays and the effects of next-to-leading order contributions in the perturbative QCD approach

    CERN Document Server

    Xiao, Zhen-Jun; Lin, Dong-Ting; Fan, Ying-Ying; Ma, Ai-Jun

    2014-01-01

    In this paper, we calculate the branching ratios and CP violating asymmetries of the five $\bar{B}^0_s \to (\pi^0\eta^{(*)},\eta^{(*)}\eta^{(*)})$ decays, by employing the perturbative QCD (pQCD) factorization approach and with the inclusion of all currently known next-to-leading order (NLO) contributions. We find that (a) the NLO contributions can provide about 100% enhancements to the LO pQCD predictions for the decay rates of $\bar{B}_s^0 \to \eta\eta^\prime$ and $\eta^\prime \eta^\prime$ decays, but result in small changes to $Br(\bar{B}_s \to \pi^0 \eta^{(*)})$ and $Br(\bar{B}_s \to \eta\eta)$; (b) the newly known NLO twist-2 and twist-3 contributions to the relevant form factors can provide about 10% enhancements to the decay rates of the considered decays; (c) for $\bar{B}_s \to \pi^0 \eta^{(*)}$ decays, their direct CP-violating asymmetries $\mathcal{A}_f^{dir}$ could be enhanced significantly by the inclusion of the NLO contributions; and (d) the pQCD predictions for $Br(\bar{B}_s \to \eta \eta^{(*)})$ and...

  3. A "genome-to-lead" approach for insecticide discovery: pharmacological characterization and screening of Aedes aegypti D(1-like dopamine receptors.

    Directory of Open Access Journals (Sweden)

    Jason M Meyer

    2012-01-01

    Full Text Available BACKGROUND: Many neglected tropical infectious diseases affecting humans are transmitted by arthropods such as mosquitoes and ticks. New mode-of-action chemistries are urgently sought to enhance vector management practices in countries where arthropod-borne diseases are endemic, especially where vector populations have acquired widespread resistance to insecticides. METHODOLOGY/PRINCIPAL FINDINGS: We describe a "genome-to-lead" approach for insecticide discovery that incorporates the first reported chemical screen of a G protein-coupled receptor (GPCR) mined from a mosquito genome. A combination of molecular and pharmacological studies was used to functionally characterize two dopamine receptors (AaDOP1 and AaDOP2) from the yellow fever mosquito, Aedes aegypti. Sequence analyses indicated that these receptors are orthologous to arthropod D1-like (Gαs-coupled) receptors, but share less than 55% amino acid identity in conserved domains with mammalian dopamine receptors. Heterologous expression of AaDOP1 and AaDOP2 in HEK293 cells revealed dose-dependent responses to dopamine (EC50: AaDOP1 = 3.1±1.1 nM; AaDOP2 = 240±16 nM). Interestingly, only AaDOP1 exhibited sensitivity to epinephrine (EC50 = 5.8±1.5 nM) and norepinephrine (EC50 = 760±180 nM), while neither receptor was activated by other biogenic amines tested. Differential responses were observed between these receptors regarding their sensitivity to dopamine agonists and antagonists, level of maximal stimulation, and constitutive activity. Subsequently, a chemical library screen was implemented to discover lead chemistries active at AaDOP2. Fifty-one compounds were identified as "hits," and follow-up validation assays confirmed the antagonistic effect of selected compounds at AaDOP2. In vitro comparison studies between AaDOP2 and the human D1 dopamine receptor (hD1) revealed markedly different pharmacological profiles and identified amitriptyline and doxepin as AaDOP2

  4. Lead Test

    Science.gov (United States)

    ... months, and at 3, 4, 5, and 6 years of age. A blood lead level test should be done only if the risk ... recommended if the person is symptomatic at any level below 70 mcg/dL. Because lead will pass through the blood to an unborn child, pregnant ...

  5. Analysis of the institutional evaluation approach applied to the educational management of Costa Rica Christian School

    OpenAIRE

    Campos Campos, Ana Jenssie

    2011-01-01

    The following article corresponds to the synthesis of a research work about the analysis of the Institutional Evaluation Approach applied to the educational management of a private school in the San José Norte Educational Region. The objectives of the analysis lay in identifying the institutional evaluation approach from the characteristics of the three approaches proposed, as well as in determining the dimensions of the approach used, and a third objective was determining the staff perceptio...

  6. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom's Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite limited or no evidence for the design principles. We describe the challenges inherent in coding complex and dynamically changing common pool resource systems for the presence or absence of design principles and in determining "success" in large-N comparative analysis. Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of the absent design principles explained inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  7. Frequency domain analysis and design of nonlinear systems based on Volterra series expansion a parametric characteristic approach

    CERN Document Server

    Jing, Xingjian

    2015-01-01

    This book is a systematic summary of some new advances in the area of nonlinear analysis and design in the frequency domain, focusing on the application-oriented theory and methods based on the GFRF concept, mainly developed by the author over the past 8 years. The main results are formulated uniformly with a parametric characteristic approach, which provides a convenient and novel insight into nonlinear influence on system output response in terms of characteristic parameters and thus facilitates nonlinear analysis and design in the frequency domain. The book starts with a brief introduction to the background of nonlinear analysis in the frequency domain, followed by recursive algorithms for computation of GFRFs for different parametric models, and nonlinear output frequency properties. Thereafter the parametric characteristic analysis method is introduced, which leads to the new understanding and formulation of the GFRFs, and nonlinear characteristic output spectrum (nCOS) and the nCOS based analysis a...

  8. Analysis of Heart Diseases Dataset using Neural Network Approach

    CERN Document Server

    Rani, K Usha

    2011-01-01

    One of the important techniques of data mining is classification. Many real-world problems in various fields such as business, science, industry and medicine can be solved by using the classification approach. Neural Networks have emerged as an important tool for classification. The advantages of Neural Networks help in the efficient classification of given data. In this study a heart disease dataset is analyzed using a Neural Network approach. To increase the efficiency of the classification process, a parallel approach is also adopted in the training phase.
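
    A minimal version of such a neural-network classifier can be sketched with scikit-learn. The 13-attribute data set below is randomly generated stand-in data (the study's actual heart disease records are not reproduced here), and the parallel training mentioned above is omitted.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        # Stand-in for a heart-disease dataset: 13 clinical attributes, binary label.
        X = rng.normal(size=(300, 13))
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=300) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                          random_state=0))
        clf.fit(X_tr, y_tr)
        print("test accuracy: %.2f" % clf.score(X_te, y_te))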

  9. A Discussion of Water Pollution in the United States and Mexico; with High School Laboratory Activities for Analysis of Lead, Atrazine, and Nitrate.

    Science.gov (United States)

    Kelter, Paul B.; Grundman, Julie; Hage, David S.; Carr, James D.; Castro-Acuna, Carlos Mauricio

    1997-01-01

    Presents discussions of sources, health impacts, and methods of analysis, together with extended discussions of lead, nitrates, and atrazine as they relate to water pollution and the interdisciplinary nature of the modern chemistry curriculum. (DKM)

  10. The Dependent Converging Instrument Approach Procedure: An Analysis of its Safety and Applicability

    Science.gov (United States)

    1992-11-01

    scenarios, this feature leads to a slightly more conservative measure of the predicted separation (i.e., smaller separation). 3.4.2 Determination of Headwind...for the DCIA procedure may facilitate the maintenance of the independence of one parallel approach, with dependent converging approaches to the other

  11. An analysis of the impact of LHC Run I proton-lead data on nuclear parton densities

    Energy Technology Data Exchange (ETDEWEB)

    Armesto, Nestor; Penin, Jose Manuel; Salgado, Carlos A.; Zurita, Pia [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Galicia (Spain); Paukkunen, Hannu [Universidade de Santiago de Compostela, Departamento de Fisica de Particulas and IGFAE, Galicia (Spain); University of Jyvaeskylae, Department of Physics, P.O. Box 35, Jyvaeskylae (Finland); University of Helsinki, Helsinki Institute of Physics, P.O. Box 64, Helsinki (Finland)

    2016-04-15

    We report on an analysis of the impact of available experimental data on hard processes in proton-lead collisions during Run I at the large hadron collider on nuclear modifications of parton distribution functions. Our analysis is restricted to the EPS09 and DSSZ global fits. The measurements that we consider comprise production of massive gauge bosons, jets, charged hadrons and pions. This is the first time a study of nuclear PDFs includes this number of different observables. The goal of the paper is twofold: (i) checking the description of the data by nPDFs, as well as the relevance of these nuclear effects, in a quantitative manner; (ii) testing the constraining power of these data in eventual global fits, for which we use the Bayesian reweighting technique. We find an overall good, even too good, description of the data, indicating that more constraining power would require a better control over the systematic uncertainties and/or the proper proton-proton reference from LHC Run II. Some of the observables, however, show sizeable tension with specific choices of proton and nuclear PDFs. We also comment on the corresponding improvements as regards the theoretical treatment. (orig.)
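
    The Bayesian reweighting mentioned above can be summarised, in its simplest (Giele-Keller) form, as follows; the NNPDF prescription modifies the weights, and the exact variant used in the paper is not reproduced here. Given N replicas $f_k$ and the $\chi^2_k$ of each replica against the new data, predictions are updated as

        $\langle \mathcal{O} \rangle_{new} = \sum_{k=1}^{N} w_k \, \mathcal{O}[f_k], \qquad w_k = \frac{e^{-\chi^2_k/2}}{\sum_{j=1}^{N} e^{-\chi^2_j/2}}$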

  12. Predictive 1-D thermal-hydraulic analysis of the prototype HTS current leads for the ITER correction coils

    Science.gov (United States)

    Heller, R.; Bauer, P.; Savoldi, L.; Zanino, R.; Zappatore, A.

    2016-12-01

    We present an analysis of the prototype high-temperature superconducting (HTS) current leads (CLs) for the ITER correction coils, which will operate at 10 kA. A copper heat exchanger (HX) of the meander-flow type is included in the CL design and covers the temperature range between room temperature and 65 K, whereas the HTS module, where Bi-2223 stacked tapes are positioned on the outer surface of a stainless steel hollow cylindrical support, covers the temperature range between 65 K and 4.5 K. The HX is cooled by gaseous helium entering at 50 K, whereas the HTS module is cooled by conduction from the cold end of the CL. We use the CURLEAD code, developed some years ago and now supplemented by a new set of correlations for the helium friction factor and heat transfer coefficient in the HX, recently derived using Computational Fluid Dynamics. Our analysis is aimed first of all at a "blind" design-like prediction of the CL performance, for both steady state and pulsed operation. In particular, the helium mass flow rate needed to guarantee the target temperature at the HX-HTS interface, the temperature profile, and the pressure drop across the HX will be computed. The predictive capabilities of the CURLEAD model are then assessed by comparison of the simulation results with experimental data obtained in the test of the prototype correction coil CLs at ASIPP, whose results were considered only after the simulations were performed.
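
    The flavour of such a 1-D thermal model can be conveyed with a heavily reduced sketch: steady conduction with Joule heating in the copper section only, constant material properties, and no helium cooling, i.e. none of the CURLEAD physics beyond the basic energy balance. All parameter values below are assumptions for illustration, not design data.

        import numpy as np
        from scipy.integrate import solve_bvp

        L = 0.5      # length of the copper section, m (assumed)
        A = 5e-4     # conductor cross-section, m^2 (assumed)
        k = 400.0    # Cu thermal conductivity, W/(m K), taken constant
        rho = 2e-9   # Cu electrical resistivity, ohm m, taken constant
        I = 10e3     # operating current, A (from the abstract)

        def rhs(x, y):
            # y[0] = T, y[1] = dT/dx ;  balance: k*A*T'' + rho*I^2/A = 0
            return np.vstack([y[1], -rho * I**2 / (k * A**2) * np.ones_like(x)])

        def bc(ya, yb):
            # fixed end temperatures: 65 K at the HTS interface, 300 K at the warm end
            return np.array([ya[0] - 65.0, yb[0] - 300.0])

        x = np.linspace(0.0, L, 50)
        y0 = np.zeros((2, x.size))
        y0[0] = np.linspace(65.0, 300.0, x.size)
        sol = solve_bvp(rhs, bc, x, y0)
        print("peak temperature along the lead: %.1f K" % sol.y[0].max())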

  13. An analysis of the impact of LHC Run I proton-lead data on nuclear parton densities.

    Science.gov (United States)

    Armesto, Néstor; Paukkunen, Hannu; Penín, José Manuel; Salgado, Carlos A; Zurita, Pía

    2016-01-01

    We report on an analysis of the impact of available experimental data on hard processes in proton-lead collisions during Run I at the large hadron collider on nuclear modifications of parton distribution functions. Our analysis is restricted to the EPS09 and DSSZ global fits. The measurements that we consider comprise production of massive gauge bosons, jets, charged hadrons and pions. This is the first time a study of nuclear PDFs includes this number of different observables. The goal of the paper is twofold: (i) checking the description of the data by nPDFs, as well as the relevance of these nuclear effects, in a quantitative manner; (ii) testing the constraining power of these data in eventual global fits, for which we use the Bayesian reweighting technique. We find an overall good, even too good, description of the data, indicating that more constraining power would require a better control over the systematic uncertainties and/or the proper proton-proton reference from LHC Run II. Some of the observables, however, show sizeable tension with specific choices of proton and nuclear PDFs. We also comment on the corresponding improvements as regards the theoretical treatment.

  14. Scaling and disorder analysis of local I-V curves from ferroelectric thin films of lead zirconate titanate.

    Science.gov (United States)

    Maksymovych, Peter; Pan, Minghu; Yu, Pu; Ramesh, Ramamoorthy; Baddorf, Arthur P; Kalinin, Sergei V

    2011-06-24

    Differential analysis of current-voltage characteristics, obtained on the surface of epitaxial films of ferroelectric lead zirconate titanate (Pb(Zr0.2Ti0.8)O3) using scanning probe microscopy, was combined with spatially resolved mapping of variations in local conductance to differentiate between candidate mechanisms of local electronic transport and the origin of disorder. Within the assumed approximations, electron transport was inferred to be determined by two mechanisms depending on the magnitude of applied bias, with the low-bias range dominated by the trap-assisted Fowler-Nordheim tunneling through the interface and the high-bias range limited by the hopping conduction through the bulk. Phenomenological analysis of the I-V curves has further revealed that the transition between the low- and high-bias regimes is manifested both in the strength of variations within the I-V curves sampled across the surface, as well as the spatial distribution of conductance. Spatial variations were concluded to originate primarily from the heterogeneity of the interfacial electronic barrier height with an additional small contribution from random changes in the tip-contact geometry.

  15. A Thermoeconomic Approach for the Analysis of District Heating Systems

    Directory of Open Access Journals (Sweden)

    Michele Calí

    2001-12-01

    Full Text Available

    District heating is a rational way to use fossil fuels for domestic heating (and cooling) in towns, especially if it is joined with a cogenerative production of electricity. As with every other process it must be economically convenient for its realization, so technical considerations must be integrated into the economic ones. Thermoeconomic theories take into account these two aspects, representing a good tool for an optimized design and correct management (Bejan et al. 1996).

    The aim of this paper is to propose the use of thermoeconomic procedures for the analysis of district heating systems, in order to define criteria for the network design. The approach consists of the choice of significant design parameters, which can be varied in order to determine the optimized system.

    An application to the Turin district heating system is presented here. The system is composed of a steam power plant and a gas turbine power plant, both cogenerative, and of the pipe network. The effects of the choices in the network project on the working conditions of the system and on the cost of its products are shown. The optimization problem has been solved by evaluating the decision variable under some particular conditions, obtained by solving fluid-dynamic, thermal and thermoeconomic problems for the whole system, corresponding to different values of the supply fluid temperature.

    The application of the thermoeconomic theory to the network allows one to determine the effects of the parameters characterizing each user on the cost of the service provided. This information constitutes a tool for making management decisions, such as the opportunity and modality of future expansions of the served area.

  16. Assessment of lead pollution in topsoils of a southern Italy area: Analysis of urban and peri-urban environment.

    Science.gov (United States)

    Guagliardi, Ilaria; Cicchella, Domenico; De Rosa, Rosanna; Buttafuoco, Gabriele

    2015-07-01

    Exposure to lead (Pb) may adversely affect human health. Mapping soil Pb contents is essential to obtain a quantitative estimate of the potential risk of Pb contamination. The main aim of this paper was to determine the soil Pb concentrations in the urban and peri-urban area of Cosenza-Rende, to map their spatial distribution, and to assess the probability that soil Pb concentration exceeds a critical threshold that might cause concern for human health. Samples were collected at 149 locations from residual and non-residual topsoil in gardens, parks, flower-beds, and agricultural fields. The fine earth fraction of the soil samples was analyzed by X-ray fluorescence spectrometry. Stochastic images generated by sequential Gaussian simulation were jointly combined to calculate the probability of exceeding the critical threshold, which could be used to delineate the potentially risky areas. Results showed areas in which Pb concentration values were higher than the Italian regulatory values. These polluted areas were quite large and could likely create a significant health risk for human beings and vegetation in the near future. The results demonstrated that the proposed approach can be used to study soil contamination, to produce geochemical maps, and to identify hot-spot areas of soil Pb concentration.
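
    The probability-of-exceedance step is the simplest part of this workflow: given an ensemble of equiprobable simulated maps, the exceedance probability at each grid cell is the fraction of realizations above the threshold. The sketch below uses uncorrelated lognormal noise as a stand-in for real sequential-Gaussian-simulation realizations (which would be spatially correlated); the threshold and flagging criterion are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(5)
        threshold = 100.0   # illustrative Pb screening value, mg/kg (assumed)

        # Stand-in for 200 equiprobable simulated maps on a 50 x 50 grid.
        sims = np.exp(rng.normal(loc=4.5, scale=0.6, size=(200, 50, 50)))

        # Probability of exceeding the threshold, cell by cell.
        p_exceed = (sims > threshold).mean(axis=0)
        risk_area = p_exceed > 0.5   # e.g. flag cells more likely than not to exceed
        print("cells flagged as potentially risky:", int(risk_area.sum()))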

  17. Dynamic metabolic flux analysis using a convex analysis approach: Application to hybridoma cell cultures in perfusion.

    Science.gov (United States)

    Fernandes de Sousa, Sofia; Bastin, Georges; Jolicoeur, Mario; Vande Wouwer, Alain

    2016-05-01

    In recent years, dynamic metabolic flux analysis (DMFA) has been developed in order to evaluate the dynamic evolution of the metabolic fluxes. Most of the proposed approaches are dedicated to exactly determined or overdetermined systems. When an underdetermined system is considered, the literature suggests the use of dynamic flux balance analysis (DFBA). However, the main challenge of this approach is to determine an appropriate objective function which remains valid over the whole culture. In this work, we propose an alternative dynamic metabolic flux analysis based on convex analysis, DMFCA, which allows the determination of bounded intervals for the fluxes using the available knowledge of the metabolic network and the information provided by the time evolution of extracellular component concentrations. Smoothing splines and mass balance differential equations are used to estimate the time evolution of the uptake and excretion rates from these experimental data. The main advantage of the proposed procedure is that it does not require additional constraints or objective functions, and provides relatively narrow intervals for the intracellular metabolic fluxes. DMFCA is applied to experimental data from hybridoma HB58 cell perfusion cultures, in order to investigate the influence of the operating mode (batch and perfusion) on the metabolic flux distribution.
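
    The bounded-interval idea reduces to a pair of linear programs per flux: minimise and maximise each flux subject to the stoichiometric balance and the measured exchange rates, with no objective function for the culture itself. A toy sketch follows; the 3x5 network, bounds, and measured rate are invented, not the hybridoma model.

        import numpy as np
        from scipy.optimize import linprog

        # Toy stoichiometric network: S @ v = 0, 3 metabolites x 5 fluxes.
        S = np.array([[ 1, -1, -1,  0,  0],
                      [ 0,  1,  0, -1,  0],
                      [ 0,  0,  1,  0, -1]], float)
        b = np.zeros(3)                    # pseudo-steady-state balances
        bounds = [(0.0, 10.0)] * 5         # irreversibility / capacity constraints
        bounds[0] = (2.0, 2.0)             # a measured uptake rate pins the first flux

        # Bounded interval for each flux: minimise, then maximise it.
        for i in range(S.shape[1]):
            c = np.zeros(5); c[i] = 1.0
            lo = linprog(c, A_eq=S, b_eq=b, bounds=bounds).fun
            hi = -linprog(-c, A_eq=S, b_eq=b, bounds=bounds).fun
            print(f"v{i}: [{lo:.2f}, {hi:.2f}]")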

  18. Elevated Concentrations of Lead in Particulate Matter on the Neighborhood-Scale in Delhi, India As Determined by Single Particle Analysis.

    Science.gov (United States)

    Shen, Hongru; Peters, Thomas M; Casuccio, Gary S; Lersch, Traci L; West, Roger R; Kumar, Amit; Kumar, Naresh; Ault, Andrew P

    2016-05-17

    High mass concentrations of atmospheric lead particles are frequently observed in the Delhi, India metropolitan area, although the sources of lead particles are poorly understood. In this study, particles sampled across Delhi (August - December 2008) were analyzed by computer-controlled scanning electron microscopy with energy dispersive X-ray spectroscopy (CCSEM-EDX) to improve our understanding of the spatial and physicochemical variability of lead-rich particles (>90% lead). The mean mass concentration of lead-rich particles smaller than 10 μm (PM10) was 0.7 μg/m³ (1.5 μg/m³ std. dev.) with high variability (range: 0-6.2 μg/m³). Four samples (16% of 25 samples) with PM10 lead-rich particle concentrations >1.4 μg/m³ were defined as lead events and studied further. The temporal characteristics, heterogeneous spatial distribution, and wind patterns of the events excluded regional monsoon conditions and common anthropogenic sources from being the major causes of the lead events. Individual particle composition, size, and morphology analysis indicate informal recycling operations of used lead-acid batteries as the likely source of the lead events. This source is not typically included in emission inventories, and the observed isolated hotspots with high lead concentrations could represent an elevated exposure risk in certain neighborhoods of Delhi.

  19. An approach to rational ligand-design based on a thermodynamic analysis.

    Science.gov (United States)

    Ui, Mihoko; Tsumoto, Kouhei

    2010-11-01

    Thermodynamic analysis is an effective tool in the screening of lead compounds for the development of potential drug candidates. In most cases, a ligand achieves high affinity and specificity for a target protein by means of both favorable enthalpy and entropy terms, which are reflected in the binding profiles measured by Isothermal Titration Calorimetry (ITC). A favorable enthalpy change suggests the contribution of noncovalent contacts such as hydrogen bonding and van der Waals interactions between a ligand and its target protein. In general, optimization of the binding enthalpy is more difficult than optimization of the entropy in ligand design; it is therefore desirable to first choose a lead compound based on its binding enthalpic gain. In this paper, we demonstrate the utility of the thermodynamic approach to ligand screening using the anti-ciguatoxin antibody 10C9, which possesses a large hydrophobic binding pocket, as a model target protein. As a result of this screening, we identified three compounds that bind to the antigen-binding pocket of 10C9 with a few kcal/mol of favorable binding enthalpy. Comparison of their structures with the proper antigen ciguatoxin CTX3C revealed that 10C9 rigorously identifies their cyclic structure and a characteristic hydroxyl group. ITC measurement is thus useful and powerful for rational ligand screening and ligand optimization; the enthalpic gain is an effective index for ligand-design studies.
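
    The decomposition underlying this screening is the standard binding thermodynamics relation (textbook form, not specific to this paper): ITC measures $\Delta H$ directly and yields the association constant, hence $\Delta G$, with the entropic term obtained by difference:

        $\Delta G = \Delta H - T\Delta S = RT \ln K_d$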

  20. Lead Poisoning

    Science.gov (United States)

    ... menopause.) Once the lead is released from the mother's bones, it re-enters the blood stream and ... drinks. Avoid eating off any colorfully painted ceramic plates, and avoid drinking from any ceramic mugs unless ...

  1. Lead Poisoning

    Science.gov (United States)

    ... Topics Environment & Health Healthy Living Pollution Reduce, Reuse, Recycle Science – How It Works The Natural World Games ... OTHERS: Lead has recently been found in some plastic mini-blinds and vertical blinds which were made ...

  2. The Soft Constraints Hypothesis: A Rational Analysis Approach to Resource Allocation for Interactive Behavior

    Science.gov (United States)

    2006-01-01

    Combines an analysis approach from signal-detection theorists (Geisler, 2003; Macmillan & Creelman, 2004) with rational analysis (Anderson, 1990, 1991) to present an ideal performer analysis. The ideal performer analysis combines elements of an ideal observer analysis with those of rational analysis; the ideal observer analysis is used to "determine the

  3. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    Science.gov (United States)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected.

  4. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    Science.gov (United States)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-01

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive-quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic-level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by the exorbitant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models, as we demonstrate using CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on the linear response theory for calculating the sensitivity measure for non-critical conditions, which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences for evaluating the sensitivity measure for lattice-based models. This allows for an efficient evaluation even in critical regions near a second-order phase transition that have hitherto been difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.
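
    The sensitivity measure in question is the degree of rate control, X_i = d ln(TOF) / d ln(k_i). The sketch below evaluates it by central finite differences on a deterministic toy TOF (two steps in series); it is a stand-in illustration of the measure itself, not of the paper's three-stage estimator for stochastic kMC output.

        import numpy as np

        def tof(k):
            """Toy two-step turnover frequency (two rate processes in series);
            a stand-in for a kMC-estimated TOF, not the RuO2(110) model."""
            return 1.0 / (1.0 / k[0] + 1.0 / k[1])

        k0 = np.array([2.0, 5.0])
        h = 0.01  # step in ln(k)
        for i in range(len(k0)):
            kp, km = k0.copy(), k0.copy()
            kp[i] *= np.exp(h); km[i] *= np.exp(-h)
            drc = (np.log(tof(kp)) - np.log(tof(km))) / (2 * h)
            print(f"X_{i} = {drc:.3f}")   # the two values sum to ~1 for this toy model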

  5. Strategic analysis for the MER Cape Verde approach

    Science.gov (United States)

    Gaines, D.; Belluta, P.; Herman, J.; Hwang, P.; Mukai, R.; Porter, D.; Jones, B.; Wood, E.; Grotzinger, J.; Edgar, L.; Hayes, A.; Hare, T.; Squyres, S.

    2009-01-01

    The Mars Exploration Rover Opportunity has recently completed a two year campaign studying Victoria Crater. The campaign culminated in a close approach of Cape Verde in order to acquire high resolution imagery of the exposed stratigraphy in the cliff face. The close approach to Cape Verde provided significant challenges for every subsystem of the rover as the rover needed to traverse difficult, uncharacterised terrain and approach a cliff face with the potential of blocking out solar energy and communications with Earth. In this paper we describe the strategic analyses performed by the science and engineering teams so that we could successfully achieve the science objectives while keeping the rover safe. © 2009 IEEE.

  6. Identification of leads through in silico approaches utilizing benzylthio-1H-benzo[d]imidazol-1-yl acetic acid derivatives: A potent CRTh2 antagonist

    Science.gov (United States)

    Babu, Sathya; Kulkarni, Seema A.; Sohn, Honglae; Madhavan, Thirumurthy

    2015-12-01

    Chemoattractant Receptor-homologous molecule expressed on Th2 cells (CRTh2) is considered a potential therapeutic target for the treatment of asthma and allergic rhinitis. Herein, we describe pharmacophore-based virtual screening combined with molecular docking and 3D-QSAR methods to identify new potent CRTh2 inhibitors. Several pharmacophore models were generated and validated by the Guner-Henry scoring method. The best models were utilized as 3D pharmacophore queries to screen the ZINC database, and the retrieved hits were further validated by fitness score, Lipinski's rule of five, Surflex docking and the Comparative Molecular Field Analysis (CoMFA) process. The optimum CoMFA model was developed using known inhibitors, and the predictive ability of the model was examined by statistical parameters such as q² = 0.552 and r²pred = 0.636. The biological activities of the screened compounds were calculated using the generated CoMFA model. Finally, nine compounds were found to have good potential and high inhibitory activities, and they may act as novel lead compounds for CRTh2 inhibitor design.
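
    The q² quoted above is a leave-one-out cross-validated R², q² = 1 - PRESS/SS. A generic sketch of its computation with PLS regression follows; the descriptor matrix and activity values are random stand-ins, not the paper's CoMFA fields, and the component count is an arbitrary choice.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(6)
        # Mock CoMFA-style field descriptors for 30 inhibitors and their activities.
        X = rng.normal(size=(30, 200))
        y = X[:, :5].sum(axis=1) + rng.normal(scale=0.3, size=30)

        press = 0.0
        for tr, te in LeaveOneOut().split(X):
            model = PLSRegression(n_components=3).fit(X[tr], y[tr])
            press += ((y[te] - model.predict(X[te]).ravel()) ** 2).item()
        q2 = 1 - press / ((y - y.mean()) ** 2).sum()
        print("LOO q2 = %.3f" % q2)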

  7. Influence of social and environmental factors on dust lead, hand lead, and blood lead levels in young children

    Energy Technology Data Exchange (ETDEWEB)

    Bornschein, R.L.; Succop, P.; Dietrich, K.N.; Clark, C.S.; Que Hee, S.; Hammond, P.B.

    1985-10-01

    The roles of environmental and behavioral factors in determining blood lead levels were studied in a cohort of young children living in an urban environment. The subjects were observed at 3-month intervals from birth to 24 months of age. Repeated measurements were made of the children's blood lead levels, environmental levels of lead in house dust, and in the dust found on the children's hands. A qualitative rating of the residence and of the socioeconomic status of the family was obtained. Interviews and direct observation of parent and child at home were used to evaluate various aspects of caretaker-child interactions. Data analysis consisted of a comparison of results obtained by (a) simple correlational analysis, (b) multiple regression analysis, and (c) structural equations analysis. The results demonstrated that structural equation modeling offers a useful approach to unraveling the complex interactions present in the data set. In this preliminary analysis, the suspected relationship between the levels of lead in house dust and on hands and the blood lead level was clearly demonstrated. Furthermore, the analyses indicated an important interplay between environmental sources and social factors in the determination of hand lead and blood lead levels in very young children.

  8. Real Time Analysis and Display of Aircraft Approach Maneuvers

    Science.gov (United States)

    Lynch, Robert E. (Inventor); Chidester, Thomas R. (Inventor); Lawrence, Robert E. (Inventor)

    2007-01-01

    Method and system for monitoring and comparing, in real time, performance of an aircraft during an approach to touchdown along a conventional approach path and along a contemplated modified approach path to touchdown. In a first procedure, a flight parameter value at a selected location is compared and displayed for the planned path and for the modified path. In a second procedure, flight parameter values FP(t_n) at a sequence {t_n} of measurement times are compared and displayed for the planned path and for a contemplated or presently executed modified path. If the flight parameters for the planned path and for the modified path differ too much from each other, the pilot in command has the option of terminating the approach along the modified path.

  9. Analysis of Kernel Approach in Fuzzy-Based Image Classifications

    Directory of Open Access Journals (Sweden)

    Mragank Singhal

    2013-03-01

    This paper presents a framework for a kernel approach in the field of fuzzy-based image classification in remote sensing. The goal of image classification is to separate images according to their visual content into two or more disjoint classes. Fuzzy logic is a relatively young theory. A major advantage of this theory is that it allows the natural description, in linguistic terms, of the problems to be solved, rather than in terms of relationships between precise numerical values. This paper describes how remote sensing data with uncertainty are handled with fuzzy-based classification using a kernel approach for land use/land cover map generation. The introduction of fuzzification using a kernel approach provides the basis for the development of more robust approaches to the remote sensing classification problem. The kernel explicitly defines a similarity measure between two samples and implicitly represents the mapping of the input space to the feature space.
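
    The kernel-induced similarity and feature-space distance described above can be sketched in a few lines. This is an illustrative Python example with an RBF kernel, hypothetical band reflectances and class prototypes; it is not the authors' classifier.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Explicit similarity measure; implicit feature-space mapping
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_distance_sq(x, v, gamma=0.5):
    # Distance in the induced feature space: ||phi(x) - phi(v)||^2
    return rbf_kernel(x, x, gamma) - 2 * rbf_kernel(x, v, gamma) + rbf_kernel(v, v, gamma)

def fuzzy_memberships(x, prototypes, gamma=0.5, m=2.0):
    # Fuzzy c-means style memberships from kernel-induced distances
    d = np.array([kernel_distance_sq(x, v, gamma) for v in prototypes]) + 1e-12
    inv = d ** (-1.0 / (m - 1.0))
    return inv / inv.sum()

pixel = np.array([0.3, 0.5, 0.2])   # hypothetical band reflectances of one pixel
protos = [np.array([0.2, 0.6, 0.1]), np.array([0.7, 0.3, 0.4])]  # class prototypes
print(fuzzy_memberships(pixel, protos))
```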

  10. QCD analysis of nucleon structure functions in deep-inelastic neutrino-nucleon scattering: Laplace transform and Jacobi polynomials approach

    Science.gov (United States)

    Nejad, S. Mohammad Moosavi; Khanpour, Hamzeh; Tehrani, S. Atashbar; Mahdavi, Mahdi

    2016-10-01

    We present a detailed QCD analysis of nucleon structure functions xF3(x, Q2), based on Laplace transforms and the Jacobi polynomials approach. The analysis corresponds to the next-to-leading order and next-to-next-to-leading order approximations of perturbative QCD. The Laplace transform technique, as an exact analytical solution, is used for the solution of the nonsinglet Dokshitzer-Gribov-Lipatov-Altarelli-Parisi evolution equations at small- and large-x values. The extracted results are used as input to obtain the x and Q2 evolution of the xF3(x, Q2) structure functions using the Jacobi polynomials approach. In our work, the values of the typical QCD scale ΛMS¯(nf) and the strong coupling constant αs(MZ2) are determined for four quark flavors (nf=4) as well. The uncertainties of the valence-quark distributions originating from the experimental errors are carefully estimated using the Hessian method. We compare our valence-quark parton distribution function sets with those of other collaborations, in particular with the CT14, MMHT14, and NNPDF sets, which are contemporary with the present analysis. The obtained results are in good agreement with those from the literature.
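
    As a generic illustration of the Jacobi polynomials approach (a toy expansion, not the authors' NNLO fit), the sketch below expands a smooth function in Jacobi polynomials on [-1, 1] using their orthogonality relation; the weight parameters α and β are arbitrary.

```python
import numpy as np
from math import gamma as G
from scipy.special import jacobi
from scipy.integrate import quad

alpha, beta = 3.0, 0.7                  # hypothetical Jacobi weight parameters
w = lambda t: (1 - t) ** alpha * (1 + t) ** beta

def norm_sq(n):
    # ||P_n^{(a,b)}||^2 under the Jacobi weight on [-1, 1]
    return (2 ** (alpha + beta + 1) / (2 * n + alpha + beta + 1)
            * G(n + alpha + 1) * G(n + beta + 1) / (G(n + alpha + beta + 1) * G(n + 1)))

def jacobi_coeffs(f, nmax):
    # Project f onto each polynomial via the weighted inner product
    return [quad(lambda t: f(t) * jacobi(n, alpha, beta)(t) * w(t), -1, 1)[0] / norm_sq(n)
            for n in range(nmax + 1)]

# Toy stand-in for a smooth structure-function shape, mapped onto [-1, 1]
f = lambda t: np.sin(0.5 * np.pi * (t + 1))
c = jacobi_coeffs(f, 6)
approx = lambda t: sum(cn * jacobi(n, alpha, beta)(t) for n, cn in enumerate(c))
print(f(0.3), approx(0.3))   # the truncated expansion should be close
```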

  11. [Analysis of lead in unknown samples based on the standard addition method using laser induced breakdown spectroscopy].

    Science.gov (United States)

    Fang, Li; Zhao, Nan-jing; Meng, De-shuo; Yuan, Jing; Tang, Jie; Wang, Yin; Yu, Yang; Ma, Ming-jun; Hu, Li; Zhang, Da-hai; Xiao, Xue; Wang, Yu; Liu, Jian-guo; Liu, Wen-qing

    2015-01-01

    The standard addition method with laser induced breakdown spectroscopy was used to analyze an unknown sample taken from a lead battery factory. Compared with the external or internal standard methods, the matrix influence on the results was effectively avoided, and the pretreatment of samples was simple and quick. A Nd:YAG pulse laser with wavelength 1064 nm was used as the excitation source, an echelle spectrograph with high resolution and wide spectral range as the spectral separation device, and an intensified charge coupled device (ICCD) as the spectral detection device. The characteristic line at 405.78 nm was chosen as the analysis line to measure Pb concentration, and the Fe I 404.58 nm line was chosen as the internal standard. A pre-experiment was carried out to confirm the appropriate conditions. Under a laser energy of 128.5 mJ, a delay time of 2.5 μs, and a gate width of 3 μs, it was determined that, with the addition of Pb to the sample in the range of 0 to 25 000 mg·kg-1, there was no self-absorption, and there was a good linear relationship between the intensity of the 405.78 nm spectral line and the added Pb. The appropriate concentration of Pb to add into the sample for analysis was determined from this series of samples. On this basis, four samples (5 000, 10 000, 15 000 and 20 000 mg·kg-1 of added Pb) were prepared, with three parallel samples for each, in order to verify the repeatability and reliability of the method. The results were compared with the result of ICP-MS. The relative errors of the twelve samples were between -24.6% and 17.6%, and the average result was 43 069 mg·kg-1 with a relative error of -2.44%.
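
    The standard addition extrapolation at the core of this method is a one-line fit: the Pb concentration of the original sample is the magnitude of the x-intercept of the intensity ratio versus added concentration. A sketch with hypothetical intensity-ratio readings, chosen only so that the extrapolation lands near the reported value:

```python
import numpy as np

# Added Pb (mg/kg) and measured line-intensity ratio I(Pb 405.78) / I(Fe I 404.58);
# the ratio values are hypothetical readings for illustration.
added = np.array([0.0, 5000.0, 10000.0, 15000.0, 20000.0, 25000.0])
ratio = np.array([0.86, 0.96, 1.06, 1.16, 1.26, 1.36])

slope, intercept = np.polyfit(added, ratio, 1)
c0 = intercept / slope   # standard addition: |x-intercept| = original concentration
print(f"estimated Pb in original sample: {c0:.0f} mg/kg")
```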

  12. Modified Stoppa Approach versus Ilioinguinal Approach for Anterior Acetabular Fractures; A Systematic Review and Meta-Analysis

    Science.gov (United States)

    Meena, Sanjay; Sharma, Pankaj Kumar; Mittal, Samarth; Sharma, Jyoti; Chowdhury, Buddhadev

    2017-01-01

    Introduction: The modified Stoppa approach was introduced as an alternative to the ilioinguinal approach for the management of anterior fractures of the acetabulum, in order to reduce the complications of the latter. However, the efficacy of either approach over the other is not well established. The aim of this meta-analysis is to compare the efficacy of the modified Stoppa and ilioinguinal approaches in the management of acetabular fractures in terms of (a) quality of reduction achieved, (b) complication rates, (c) functional outcomes, (d) operative time, and (e) intra-operative blood loss. Methods: The PubMed, EMBASE and Cochrane registry of controlled trials databases were searched for studies of the modified Stoppa approach versus the ilioinguinal approach for the treatment of anterior acetabular fractures. Dichotomous variables were presented as risk ratios (RRs)/odds ratios (ORs) with 95% confidence intervals (CIs), and continuous data were measured as mean differences with 95% CIs. Result: Four studies involving 375 patients were included in this meta-analysis; 192 patients were managed with the ilioinguinal approach and 183 with the modified Stoppa approach. Anatomical reduction was significantly higher in the Stoppa group (p=0.052, RR=1.19 (1.02, 1.37), p=0.90, I2=0%). The complication rate was significantly higher with the ilioinguinal approach than with the Stoppa approach (p=0.01, RR 0.63 (0.44 to 0.91), p=0.73, I2=0%). The operative time was significantly shorter with the modified Stoppa approach (MD = -48.79 (-80.29 to -17.30), p=0.002). No significant differences were found between the two groups in terms of functional outcomes (p=0.63, RR 0.96 (-0.80 to 1.15), p=0.56, I2=0%) and blood loss (MD = -212.89 (-476.27 to 50.49), p=0.06, I2=71%). Conclusion: Anterior acetabular fractures operated with the modified Stoppa approach were found to have better reduction and lower complication rates, with less operative time, compared to the ilioinguinal approach.
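
    The inverse-variance pooling behind such risk-ratio summaries can be sketched as follows; the per-study estimates below are hypothetical, not the four included studies.

```python
import numpy as np

# Hypothetical per-study risk ratios with 95% CIs (not the paper's data)
rr = np.array([0.55, 0.70, 0.60, 0.75])
lo = np.array([0.30, 0.45, 0.35, 0.50])
hi = np.array([1.00, 1.10, 1.05, 1.15])

log_rr = np.log(rr)
se = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
w = 1.0 / se ** 2                             # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)       # fixed-effect pooled log RR
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
print(f"pooled RR = {np.exp(pooled):.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```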

  13. Analysis on the anisotropic electromechanical properties of lead magnoniobate titanate single crystal for ring type ultrasonic motors

    Directory of Open Access Journals (Sweden)

    Xiang Shi

    2016-11-01

    This work discusses the optimized cut of single crystal lead magnoniobate titanate (PMNT) for use in ring type travelling wave ultrasonic motors (USMs), based on an anisotropic analysis of electromechanical properties. The selection criterion for crystal orientation relies on the circular uniformity of the induced travelling wave amplitude on the stator surface. By calculating the equivalent elastic coefficient c11 and the lateral piezoelectric constant d31, optimal crystal orientations were proposed for PMNT single crystals poled along different directions. For single crystals poled along the c direction, the optimal orientation lies along [001]c with d31 = -1335 pC/N and k31 = 0.87. The crystallographic orientation [025]c is the optimized orientation for single crystals poled along the c direction with d31 = 199 pC/N and k31 = 0.55. The optimal orientation of the 1R configuration is [332̄]c with a large enhancement of d31 = 1201 pC/N and k31 = 0.92.

  14. Analysis on the anisotropic electromechanical properties of lead magnoniobate titanate single crystal for ring type ultrasonic motors

    Science.gov (United States)

    Shi, Xiang; Huang, Wenbin; Li, Fei; Li, Zhenrong; Xu, Zhuo; Jiang, Xiaoning; Wei, Xiaoyong

    2016-11-01

    This work discusses the optimized cut of single crystal lead magnoniobate titanate (PMNT) for use in ring type travelling wave ultrasonic motors (USMs), based on an anisotropic analysis of electromechanical properties. The selection criterion for crystal orientation relies on the circular uniformity of the induced travelling wave amplitude on the stator surface. By calculating the equivalent elastic coefficient c11 and the lateral piezoelectric constant d31, optimal crystal orientations were proposed for PMNT single crystals poled along different directions. For single crystals poled along the c direction, the optimal orientation lies along [001]c with d31 = -1335 pC/N and k31 = 0.87. The crystallographic orientation [025]c is the optimized orientation for single crystals poled along the c direction with d31 = 199 pC/N and k31 = 0.55. The optimal orientation of the 1R configuration is [332̄]c with a large enhancement of d31 = 1201 pC/N and k31 = 0.92.

  15. Enabling Design for Affordability: An Epoch-Era Analysis Approach

    Science.gov (United States)

    2013-04-01

    The use of tradespaces, instead of simple tradeoffs of several point designs, can lead to better lifecycle results (Neches, Carlini, Graybill, Hummel, & McGrath, 2012).

  16. Occupational exposures to solvents and lead as risk factors for Alzheimer's disease: A collaborative re-analysis of case-control studies

    NARCIS (Netherlands)

    A.B. Graves; C.M. van Duijn (Cock); V. Chandra; L. Fratiglioni (Laura); A. Heyman; A.F. Jorm; E. Kokmen (Emre); K. Kondo; J.A. Mortimer; W.A. Rocca; S.L. Shalat; H. Soininen; A. Hofman (Albert)

    1991-01-01

    A meta-analysis, involving the secondary analysis of original data from 11 case-control studies of Alzheimer's disease, is presented for occupational exposures to solvents and lead. Three studies had data on occupational exposure to solvents. Among cases, 21.3% were reported to have been exposed to solvents…

  17. Ecotoxicology: Lead

    Science.gov (United States)

    Scheuhammer, A.M.; Beyer, W.N.; Schmitt, C.J.; Jorgensen, Sven Erik; Fath, Brian D.

    2008-01-01

    Lead (Pb) is a naturally occurring metallic element; trace concentrations are found in all environmental media and in all living things. However, certain human activities, especially base metal mining and smelting; combustion of leaded gasoline; the use of Pb in hunting, target shooting, and recreational angling; the use of Pb-based paints; and the uncontrolled disposal of Pb-containing products such as old vehicle batteries and electronic devices have resulted in increased environmental levels of Pb, and have created risks for Pb exposure and toxicity in invertebrates, fish, and wildlife in some ecosystems.

  18. Automatic symbolic analysis of SC networks using a modified nodal approach

    NARCIS (Netherlands)

    Zivkovic, V.A.; Petkovic, P.M.; Milanovic, D.P.

    1998-01-01

    This paper presents a symbolic analysis of Switched-Capacitor (SC) circuits in the z-domain using the Modified Nodal Approach (MNA). We have selected the MNA method as one of the widely established approaches in circuit analysis. The analyses are performed using the SymsimC symbolic simulator, which also enables…

  19. A digraph permanent approach to evaluation and analysis of integrated watershed management system

    Science.gov (United States)

    Ratha, Dwarikanath; Agrawal, V. P.

    2015-06-01

    In the present study a deterministic quantitative model based on graph theory has been developed for the better development and management of watersheds. Graph theory is an integrative systems approach that models the structural components of a watershed management system along with the interrelationships between them, concurrently and integratively. The factors responsible for the development of the watershed system are identified, and the degree of interaction of each subsystem with the others is determined. An eigenvalue formulation is used to handle the inconsistencies arising from inaccurate judgement of the degree of interaction between the subsystems. In this model a visual analysis is done to abstract the information using a directed graph, or digraph; a matrix model is then developed for computer processing. A variable permanent function in the form of a multinomial represents the watershed system uniquely and completely by an index value. Different terms of the multinomial represent all possible subsystems of the integrated watershed management system, and thus different solutions for watershed management, leading to the optimum solution. This index value is used to compare the suitability of the watershed with the different alternatives available for its development. Graph theory analysis thus presents a powerful tool to generate optimum solutions for the decision maker, for the benefit of the local people living in the watershed as well as the stakeholders. The proposed methodology is demonstrated by a suitable example and is applied to the ecosystem and environment subsystem of the Lake Qionghai watershed in China.
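
    A minimal sketch of the variable permanent function for a small interaction matrix follows; the matrix values are hypothetical, and a brute-force expansion over permutations is adequate for the handful of subsystems typical of such models.

```python
from itertools import permutations
import numpy as np

def permanent(A):
    # Naive expansion over permutations; fine for a handful of subsystems
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)]) for p in permutations(range(n)))

# Hypothetical 4-subsystem matrix: diagonal = subsystem inheritances,
# off-diagonal = degree of interaction between subsystems
A = np.array([[5, 2, 1, 3],
              [1, 4, 2, 1],
              [2, 1, 6, 2],
              [1, 3, 1, 5]], dtype=float)
print(permanent(A))   # single index value characterising the whole system
```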

  20. Design and analysis of experiments classical and regression approaches with SAS

    CERN Document Server

    Onyiah, Leonard C

    2008-01-01

    Introductory Statistical Inference and Regression Analysis; Elementary Statistical Inference; Regression Analysis; Experiments, the Completely Randomized Design (CRD) - Classical and Regression Approaches; Experiments; Experiments to Compare Treatments; Some Basic Ideas; Requirements of a Good Experiment; One-Way Experimental Layout or the CRD: Design and Analysis; Analysis of Experimental Data (Fixed Effects Model); Expected Values for the Sums of Squares; The Analysis of Variance (ANOVA) Table; Follow-Up Analysis to Check for…

  1. A systematic approach to initial data analysis is good research practice.

    Science.gov (United States)

    Huebner, Marianne; Vach, Werner; le Cessie, Saskia

    2016-01-01

    Initial data analysis is conducted independently of the analysis needed to address the research questions. Shortcomings in these first steps may result in inappropriate statistical methods or incorrect conclusions. We outline a framework for initial data analysis and illustrate the impact of initial data analysis on research studies. Examples of reporting of initial data analysis in publications are given. A systematic and careful approach to initial data analysis is needed as good research practice.

  2. Comparison and analysis of the efficiency of heat exchange of copper rod and copper wires current lead

    Science.gov (United States)

    Fang, J.; Yu, T.; Li, Z. M.; Wei, B.; Qiu, M.; Zhang, H. J.

    2013-11-01

    Current leads are the key components that connect the low-temperature and high-temperature parts of a cryogenic system. Owing to the wide range of temperatures they span, current leads are the main sources of heat leakage. Since HTS tapes have no resistance and generate almost no Joule heat, HTS binary current leads can reduce heat leakage compared to conventional leads. However, heat is still generated and conducted into the cryogenic system through the copper parts of the HTS current leads. In order to reduce the heat leakage of these copper parts, this paper presents an optimized design of the copper parts of HTS binary current leads: inside the leads, copper wires are used as an alternative to a copper rod without changing the overall dimensions. Firstly, the differential equation of heat transfer was derived. By solving this equation, the optimum number of copper wires and the temperature distributions of the two different current leads were obtained. An experiment on the temperature distribution was then carried out, and the experimental results were essentially consistent with the calculated results. The simulation and related experiments proved that the copper wires can increase the security margin and reduce the maximum temperature at the same shunt current.
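
    A rough sketch of the kind of heat-transfer calculation involved, assuming a single homogeneous copper conductor with constant properties and illustrative dimensions (not the paper's design): the steady-state balance k·A·T'' + I²ρ/A = 0 is solved by finite differences.

```python
import numpy as np

# Sketch: steady 1-D conduction in a copper lead with Joule heating,
# k*A*T'' + I^2 * rho / A = 0, T(0) = 300 K (warm end), T(L) = 77 K (cold end).
# All parameter values below are illustrative, not from the paper.
L, n = 0.5, 101                 # lead length [m], grid points
k, rho = 400.0, 1.7e-8          # Cu conductivity [W/m/K], resistivity [ohm m]
A, I = 1.0e-5, 100.0            # cross-section [m^2], current [A]
x = np.linspace(0, L, n)
h = x[1] - x[0]
q = I ** 2 * rho / A            # Joule heating per unit length [W/m]

# Tridiagonal system: k*A*(T[i-1] - 2T[i] + T[i+1])/h^2 = -q at interior nodes
M = np.zeros((n, n)); b = np.zeros(n)
M[0, 0] = M[-1, -1] = 1.0; b[0], b[-1] = 300.0, 77.0   # Dirichlet ends
for i in range(1, n - 1):
    M[i, i - 1] = M[i, i + 1] = k * A / h ** 2
    M[i, i] = -2 * k * A / h ** 2
    b[i] = -q
T = np.linalg.solve(M, b)
print(f"peak temperature along the lead: {T.max():.1f} K")
```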

  3. Integrated Risk-Capability Analysis under Deep Uncertainty: an ESDMA Approach

    OpenAIRE

    Pruyt, E.; Kwakkel, J. H.

    2012-01-01

    Integrated risk-capability analysis methodologies for dealing with increasing degrees of complexity and deep uncertainty are urgently needed in an ever more complex and uncertain world. Although scenario approaches, risk assessment methods, and capability analysis methods are used, few organizations and nations use truly integrated risk-capability approaches, and almost none use integrated risk-capability approaches that take dynamic complexity and deep uncertainty seriously into account. Thi...

  4. Simulation Approach for Timing Analysis of Genetic Logic Circuits.

    Science.gov (United States)

    Baig, Hasan; Madsen, Jan

    2017-02-01

    Constructing genetic logic circuits is an application of synthetic biology in which parts of the DNA of a living cell are engineered to perform a dedicated Boolean function triggered by an appropriate concentration of certain proteins or by different genetic components. These logic circuits work in a manner similar to electronic logic circuits, but they are much more stochastic and hence much harder to characterize. In this article, we introduce an approach to analyze the threshold value and timing of genetic logic circuits. We show how this approach can be used to analyze the timing behavior of single and cascaded genetic logic circuits. We further analyze the timing sensitivity of circuits by varying the degradation rates and concentrations. Our approach can be used not only to characterize the timing behavior but also to analyze the timing constraints of cascaded genetic logic circuits, a capability that we believe will be important for design automation in synthetic biology.
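
    A deterministic toy version of the threshold-and-timing question can be set up with a Hill-type ODE for a single repressor-based NOT gate; the rate constants below are hypothetical, and the article's stochastic analysis is considerably richer.

```python
import numpy as np
from scipy.integrate import odeint

# Sketch: timing of a repressor-based genetic NOT gate (hypothetical parameters).
# Output protein P is produced unless repressor R is above its threshold K.
def not_gate(P, t, R, K, n, beta, gamma):
    production = beta * K ** n / (K ** n + R ** n)   # Hill repression
    return production - gamma * P                    # first-order degradation

t = np.linspace(0, 500, 2000)                        # minutes
beta, gamma, K, n = 2.0, 0.02, 40.0, 2.0
P = odeint(not_gate, 0.0, t, args=(10.0, K, n, beta, gamma)).ravel()  # low input R

threshold = 0.5 * beta / gamma * K ** n / (K ** n + 10.0 ** n)  # half steady state
switch_time = t[np.argmax(P >= threshold)]
print(f"time for output to reach half its steady level: {switch_time:.0f} min")
```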

  5. Analysis of the real EADGENE data set: Multivariate approaches and post analysis (Open Access publication

    Directory of Open Access Journals (Sweden)

    Schuberth Hans-Joachim

    2007-11-01

    The aim of this paper was to describe, and when possible compare, the multivariate methods used by the participants in the EADGENE WP1.4 workshop. The first approach was for class discovery and class prediction using evidence from the data at hand. Several teams used hierarchical clustering (HC) or principal component analysis (PCA) to identify groups of differentially expressed genes with a similar expression pattern over time points and infective agent (E. coli or S. aureus). The main result from these analyses was that HC and PCA were able to separate tissue samples taken at 24 h following E. coli infection from the other samples. The second approach identified groups of differentially co-expressed genes, by identifying clusters of genes highly correlated when animals were infected with E. coli but not correlated more than expected by chance when the infective pathogen was S. aureus. The third approach looked at differential expression of predefined gene sets. Gene sets were defined based on information retrieved from biological databases such as Gene Ontology. Based on these annotation sources the teams used either the GlobalTest or the Fisher exact test to identify differentially expressed gene sets. The main result from these analyses was that gene sets involved in immune defence responses were differentially expressed.

  6. Discourse analysis in general practice: a sociolinguistic approach.

    Science.gov (United States)

    Nessa, J; Malterud, K

    1990-06-01

    It is a simple but important fact that as general practitioners we talk to our patients. The quality of the conversation is of vital importance for the outcome of the consultation. The purpose of this article is to discuss a methodological tool borrowed from sociolinguistics--discourse analysis. To assess the suitability of this method for analysis of general practice consultations, the authors have performed a discourse analysis of one single consultation. Our experiences are presented here.

  7. Lead grids

    CERN Multimedia

    1974-01-01

    One of the 150 lead grids used in the multiwire proportional chamber γ-ray detector. The 0.75 mm diameter holes are spaced 1 mm centre to centre. The grids were made by chemical cutting techniques in the Godet Workshop of the SB Physics.

  8. Leading men

    DEFF Research Database (Denmark)

    Bekker-Nielsen, Tønnes

    2016-01-01

    Through a systematic comparison of c. 50 careers leading to the koinarchate or high priesthood of Asia, Bithynia, Galatia, Lycia, Macedonia and coastal Pontus, as described in funeral or honorary inscriptions of individual koinarchs, it is possible to identify common denominators but also...

  9. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
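
    The core of the approach is compact enough to sketch: build a recurrence matrix from a logistic-map series, read it as the adjacency matrix of a network, and compute standard network measures. The threshold and parameters below are illustrative.

```python
import numpy as np
import networkx as nx

# Sketch: recurrence matrix of a logistic-map series used as a network adjacency matrix
N, r, eps = 500, 3.9, 0.05
x = np.empty(N); x[0] = 0.4
for i in range(N - 1):
    x[i + 1] = r * x[i] * (1 - x[i])

D = np.abs(x[:, None] - x[None, :])     # pairwise distances (1-D phase space)
R = (D < eps).astype(int)               # recurrence matrix
np.fill_diagonal(R, 0)                  # drop self-loops before reading R as adjacency

G = nx.from_numpy_array(R)
print("average clustering:", nx.average_clustering(G))
print("average degree:", R.sum() / N)
```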

  10. Thermo-mechanical analysis using a multiphysics approach

    Energy Technology Data Exchange (ETDEWEB)

    Delprete, C; Rosso, C [Dipartimento di Meccanica, Politecnico di Torino, Corso Duca degli Abruzzi, 24, Torino (Italy); Freschi, F; Repetto, M, E-mail: cristiana.delprete@polito.i [Dipartimento di Ingegneria Elettrica, Politecnico di Torino, Corso Duca degli Abruzzi, 24, Torino (Italy)

    2009-08-01

    In the paper the Cell Method, a discrete method for solving partial differential equations, is applied to a time-dependent thermo-mechanical problem. The basic equations are developed with a multiphysics approach, and results for both two-dimensional and three-dimensional models are presented. By means of a comparison with the finite element approach, some advantages of the proposed methodology are highlighted, in particular quick model construction, the capability to separate the thermal strain from the mechanical one and, as a consequence, the capability to model the strain and stress evolution in a time-dependent problem, considering possible mutual effects between the thermal and mechanical fields.

  11. Fatigue Damage Analysis by Use of Cyclic Strain Approach

    DEFF Research Database (Denmark)

    Andersen, Michael Rye

    1996-01-01

    A number of cracks were reported in a bulk carrier (approx. 300 m long) after 23 years of trade. The ship was still in good condition; no significant corrosion was found by inspection. The trade routes of the vessel were in the North Atlantic, usually one voyage in fully loaded condition followed ... of the deck. The initiation of the cracks was probably due to fatigue damage. In this paper the cracks will be investigated using the cyclic strain approach, and the obtained results will be compared with fatigue lives estimated by the S-N approach.

  12. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC operation
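
    The weight-perturbation step of the MCS/WLC combination can be sketched as follows, with a hypothetical miniature set of criteria layers and central weights; each Monte Carlo run renormalises a perturbed weight vector and recomputes the weighted linear combination.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardised criteria layers (rows = map cells), e.g. slope,
# lithology, distance to faults, all rescaled to [0, 1]
criteria = rng.random((6, 3))
w_mean = np.array([0.5, 0.3, 0.2])        # AHP-style central weights (illustrative)

# Monte Carlo simulation: perturb the weights, renormalise, recompute the WLC
runs = []
for _ in range(1000):
    w = np.clip(w_mean + rng.normal(0, 0.05, 3), 0, None)
    w /= w.sum()
    runs.append(criteria @ w)             # weighted linear combination per cell
runs = np.array(runs)

print("susceptibility mean:", runs.mean(axis=0))
print("susceptibility std (weight uncertainty):", runs.std(axis=0))
```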

  13. Analysis of lead concentration in forager stingless bees Trigona sp. (hymenoptera: Apidae) and propolis at Cilutung and Maribaya, West Java

    Energy Technology Data Exchange (ETDEWEB)

    Safira, Nabila, E-mail: safira.nabila@ymail.com; Anggraeni, Tjandra, E-mail: tjandra@sith.itb.ac.id [School of Life Science and Technology, Institut Teknologi Bandung – Jalan Ganesha 10, Bandung (Indonesia)

    2015-09-30

    Several studies have shown that lead (Pb) in the environment can accumulate in bees, which in turn can affect the quality of the resulting products. In this study, forager stingless bees (Trigona sp.) and their product (propolis) were collected from a stingless bee apiculture with two apiary sites distinguished by their environmental settings: the apiary site in Cilutung had a forest environmental setting, while the apiary site in Maribaya was located beside a main road. The objective of this study was to determine the extent of lead concentration in propolis originating from both apiary sites and to establish the correlation between the lead concentration in propolis and the lead level in forager stingless bees. Forager bee and propolis samples originated from 50 bee colonies (Cilutung) and 44 bee colonies (Maribaya), and were analyzed using AAS-GF (Atomic Absorption Spectrometry-Graphite Furnace) to determine the level of lead concentration. The results showed that the average level of lead in propolis originating from Cilutung (298.08±73.71 ppb) was lower than that in propolis originating from Maribaya (330.64±156.34 ppb); however, these values did not show a significant difference (p>0.05). There was likewise no significant difference (p>0.05) between the average level of lead in forager bees originating from Cilutung (118.08±30.46 ppb) and Maribaya (128.82±39.66 ppb). In conclusion, the average level of lead concentration in propolis at both sites exceeded the maximum permitted standard of lead for food in Indonesia. There was no correlation between the lead concentration in propolis and that in forager stingless bees.

  14. A State Space Modeling Approach to Mediation Analysis

    Science.gov (United States)

    Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio

    2014-01-01

    Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…

  15. An Integrated Approach to Thermal Analysis of Pharmaceutical Solids

    Science.gov (United States)

    Riley, Shelley R. Rabel

    2015-01-01

    A three-tiered experiment for undergraduate Instrumental Analysis students is presented in which students characterize the solid-state thermal behavior of an active pharmaceutical ingredient (acetaminophen) and an excipient (α-lactose monohydrate) using differential scanning calorimetry, thermogravimetric analysis, and thermal microscopy. Students are…

  16. Genetic Approaches to Appearance and Ancestry : Improving Forensic DNA Analysis

    NARCIS (Netherlands)

    L.C. Chaitanya (Lakshmi)

    2016-01-01

    Traditionally, routine forensic casework is based on comparative grounds. DNA profiles obtained from crime scenes are compared with those of potential suspects or with DNA profiles deposited in forensic DNA databases. The principal limitation of such a comparative approach is that trace donors…

  17. Child Psychotherapy, Child Analysis, and Medication: A Flexible, Integrative Approach.

    Science.gov (United States)

    Whitman, Laura

    2015-01-01

    For children with moderate to severe emotional or behavioral problems, the current approach in child psychiatry is to make an assessment for the use of both psychotherapy and medication. This paper describes integration of antidepressants and stimulants with psychoanalytically oriented techniques.

  18. Approaches and Methods in Language Teaching: A Description and Analysis.

    Science.gov (United States)

    Richards, Jack C.; Rodgers, Theodore S.

    Each major trend in 20th-century second language teaching is explained, and similarities and differences are highlighted. An introductory chapter offers a brief history of second language teaching. The second chapter outlines a model for examining and comparing the different approaches. This model is used in subsequent chapters to describe methods…

  19. Perturbation Experiments: Approaches for Metabolic Pathway Analysis in Bioreactors.

    Science.gov (United States)

    Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk

    2016-01-01

    In the last decades, targeted metabolic engineering of microbial cells has become one of the major tools in bioprocess design and optimization. For successful application, detailed knowledge is necessary about the relevant metabolic pathways and their regulation inside the cells. Since in vitro experiments cannot properly reproduce process conditions and behavior, process data about the cells' metabolic state have to be collected in vivo. For this purpose, special techniques and methods are necessary; most techniques enabling in vivo characterization of metabolic pathways rely on perturbation experiments, which can be divided into dynamic and steady-state approaches. To avoid any process disturbance, approaches which enable perturbation of cell metabolism in parallel to the continuing production process are reasonable. Furthermore, the fast dynamics of microbial production processes amplifies the need for parallelized data generation. These points motivate the development of a parallelized approach for multiple metabolic perturbation experiments outside the operating production reactor. An appropriate approach for in vivo characterization of metabolic pathways is presented and applied, as an example, to a microbial L-phenylalanine production process on a 15 L scale.

  20. Analysis of the Romanian employment rate. A panel data approach

    Directory of Open Access Journals (Sweden)

    Larisa APARASCHIVEI

    2012-07-01

    This paper examines the evolution of the employment rate in Romania. I employ a panel data approach, considering the 42 counties, in order to explore the relationship between labour productivity, average earnings, investments and employment. The results reveal a positive impact of the average wage and gross investments, and a negative impact of labour productivity, on the employment rate.
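
    A minimal sketch of the within (fixed-effects) panel estimator on simulated county-year data follows; the regressors mirror the paper's variables but all values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
counties, years = 42, 10
N = counties * years

# Hypothetical panel: employment rate vs productivity, wage, investment
ids = np.repeat(np.arange(counties), years)
X = rng.random((N, 3))
y = -0.4 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] \
    + np.repeat(rng.normal(0, 0.3, counties), years) + rng.normal(0, 0.05, N)

# Within (fixed-effects) transformation: demean within each county, then OLS
def demean(a):
    out = a.copy().astype(float)
    for g in np.unique(ids):
        out[ids == g] -= a[ids == g].mean(axis=0)
    return out

beta, *_ = np.linalg.lstsq(demean(X), demean(y), rcond=None)
print("FE estimates (productivity, wage, investment):", beta.round(2))
```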

  1. Electromigration and solid state aging of flip chip solder joints and analysis of tin whisker on lead-frame

    Science.gov (United States)

    Lee, Taekyeong

    Electromigration and solid state aging in flip chip joints, and whisker growth on lead frames, of Pb-containing (eutectic SnPb) and Pb-free solders (SnAg3.5, SnAg3.8Cu0.7, and SnCu0.7) have been studied systematically, using Scanning Electron Microscopy (SEM), Energy Dispersive X-ray Analysis (EDX), and synchrotron radiation. The high current density in a flip chip joint drives the diffusion of atoms of eutectic SnPb and SnAgCu. A marker is used to measure the diffusion flux in a half cross-sectioned solder joint. SnAgCu shows higher resistance against electromigration than eutectic SnPb. In the half cross-sectioned solder joint, void growth is the dominant failure mechanism. However, for whole solder balls in the underfill, the failure mechanism results from the dissolution of the electroless Ni under bump metallization (UBM), about 10 μm thick. The growth rates of intermetallic compounds in molten and solid solders differ by four orders of magnitude: in liquid solder the growth rate is about 1 μm/min, while in solid solder it is only about 10^-4 μm/min. The difference does not result from thermodynamic factors, i.e. the change of Gibbs free energy before and after intermetallic compound formation, but from kinetic factors, i.e. the rate of change of Gibbs free energy. Although a difference in growth rate between eutectic SnPb and Pb-free solders during solid state aging was found, the reason behind it remains unclear. The orientation and stress levels of whiskers are measured by white X-ray synchrotron radiation. The growth direction is nearly parallel to one of the principal axes of tin. The compressive stress level is quite low because the residual stress is relaxed by the whisker growth.

  2. LIB spectroscopic and biochemical analysis to characterize lead toxicity alleviative nature of silicon in wheat (Triticum aestivum L.) seedlings.

    Science.gov (United States)

    Tripathi, Durgesh Kumar; Singh, Vijay Pratap; Prasad, Sheo Mohan; Dubey, Nawal Kishore; Chauhan, Devendra Kumar; Rai, Awadesh Kumar

    2016-01-01

    The responses of wheat seedlings treated with silicon (Si; 10 μM) and lead (Pb; 100 μM) for 7 days were investigated by analyzing growth, Pb uptake, chlorophyll fluorescence, oxidative stress, antioxidants and nutrient regulation. The results indicated that Pb significantly (P<0.05) reduced the growth of seedlings, which was accompanied by uptake of Pb. Under Pb stress, the fluorescence parameters Fv/Fm and qP were significantly (P<0.05) decreased while NPQ was increased. Si addition alleviated the Pb-induced decrease in growth and alterations in photosynthesis, and also significantly (P<0.05) lowered Pb uptake. Under Pb treatment, the oxidative stress markers hydrogen peroxide and lipid peroxidation were enhanced while DPPH(•) scavenging capacity and total phenolic compounds (TPCs) were significantly decreased; however, Si addition improved the status of antioxidants. The non-protein thiols (NP-SH) showed an enhanced level under Pb stress. Pb stress considerably disturbed the status of the nutrients: decreases in Ca, P, Mg, Zn and Ni contents and increases in K, S, B, Cu, Fe, Mn and Na contents were noticed. Si addition remarkably maintained the status of all the nutrients. LIBS spectra, one of the quickest methods of element analysis, revealed significantly lower uptake of Pb in seedlings grown under the Si and Pb combination, and this was correlated with the AAS data. Overall, the results pointed out that excess Pb uptake disturbed nutrient status, photosynthetic performance and antioxidant capacity, and hence severe oxidative damage to lipids occurred. Si supplementation successfully regulated these parameters by inhibiting Pb uptake and hence maintained the growth of wheat seedlings. The similar pattern of data recorded by LIBS, AAS and ICAP-AES confirmed that LIBS may be one of the promising and authentic tools to monitor mineral and metal distribution in plants without disturbing the environment, owing to its eco-friendly and non-invasive nature.

  3. Real time analysis of lead-containing atmospheric particles in Beijing during springtime by single particle aerosol mass spectrometry.

    Science.gov (United States)

    Ma, Li; Li, Mei; Huang, Zhengxu; Li, Lei; Gao, Wei; Nian, Huiqing; Zou, Lilin; Fu, Zhong; Gao, Jian; Chai, Fahe; Zhou, Zhen

    2016-07-01

    Using a single particle aerosol mass spectrometer (SPAMS), the chemical composition and size distributions of lead (Pb)-containing particles with diameters from 0.1 μm to 2.0 μm in Beijing were analyzed in the spring of 2011 during clear, hazy, and dusty days. Based on the mass spectral features of the particles, cluster analysis was applied to the Pb-containing particles, and six major classes were obtained, consisting of K-rich, carbonaceous, Fe-rich, dust, Pb-rich, and Cl-rich particles. Pb-containing particles accounted for 4.2-5.3%, 21.8-22.7%, and 3.2% of the total particle number during clear, hazy and dusty days, respectively. K-rich particles are a major contribution to Pb-containing particles, varying from 30.8% to 82.1% of the total number of Pb-containing particles, lowest during dusty days and highest during hazy days. The results reflect that the chemical composition and amount of Pb-containing particles have been affected by meteorological conditions as well as by emissions from natural and anthropogenic sources. K-rich and carbonaceous particles can be mainly assigned to emissions from coal combustion; other classes of Pb-containing particles may be associated with metallurgical processes, coal combustion, dust, waste incineration, etc. In addition, Pb-containing particles during dusty days were studied for the first time by SPAMS. This method could provide a powerful tool for real-time monitoring and control of Pb pollution.

  4. Approach to conceptual data integration for multidimensional data analysis in e-commerce

    Institute of Scientific and Technical Information of China (English)

    Zhang Zhe; Huang Pei

    2006-01-01

    In e-commerce, multidimensional data analysis based on Web data requires integrating various data sources, such as XML data and relational data, on the conceptual level. A conceptual data description approach to the multidimensional data model, the UML galaxy diagram, is presented in order to conduct multidimensional data analysis for multiple subjects. The approach is illustrated using a case of a 2_roots UML galaxy diagram that considers marketing analysis of TV products involving one retailer and several suppliers.

  5. Network analysis and synthesis a modern systems theory approach

    CERN Document Server

    Anderson, Brian D O

    2006-01-01

    Geared toward upper-level undergraduates and graduate students, this book offers a comprehensive look at linear network analysis and synthesis. It explores state-space synthesis as well as analysis, employing modern systems theory to unite the classical concepts of network theory. The authors stress passive networks but include material on active networks. They avoid topology in dealing with analysis problems and discuss computational techniques. The concepts of controllability, observability, and degree are emphasized in reviewing the state-variable description of linear systems. Explorations

  6. Next-to-next-to-leading order QCD analysis of the revised CCFR data for $xF_3$ structure function and the higher twist contributions

    CERN Document Server

    Kataev, A L; Parente, G; Sidorov, A V

    1998-01-01

    We present the results of the next-to-next-to-leading order QCD analysis of the recently revised experimental data of the CCFR collaboration for the $xF_3$ structure function using the Jacobi polynomial expansion method. The effects of the higher twist contributions are included in the fits following the infrared renormalon motivated model. Special attention is paid to checks of the predictive abilities of the infrared renormalon model and to the independent extraction of the $x$-shape of the twist-4 contributions to the $xF_3$ structure function in the process of the leading order, next-to-leading order and next-to-next-to-leading order fits of the revised CCFR data. At the next-to-next-to-leading order we obtain $\alpha_s(M_Z)^{NNLO}=0.117 \pm 0.002 (stat) \pm 0.005 (syst) \pm 0.003 (theory)$. The comparison of the outcomes of our next-to-leading order and next-to-next-to-leading order analysis indicates that the theoretical QCD uncertainties were u...

  7. New Approach of Envelope Dynamic Analysis for Milling Process

    CERN Document Server

    Bisu, Claudiu-Florinel; Gérard, Alain; Vijelea, V; Anica, Marin

    2012-01-01

    This paper proposes a vibration analysis method for on-line monitoring of milling process quality. Adapting envelope analysis to characterize the milling tool materials is an important contribution to the qualitative and quantitative characterization of milling capacity, and a step toward modeling the three-dimensional cutting process. An experimental protocol was designed and developed for acquiring, processing and analyzing the three-dimensional signal. Vibration envelope analysis is proposed to detect the cutting capacity of the tool, with application to the optimization of cutting parameters. The research is focused on optimization of FFT-based vibration analysis and of the vibration envelope to evaluate the dynamic behavior of the machine/tool/workpiece system.
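
    Envelope analysis itself reduces to Hilbert-transform demodulation followed by an FFT of the envelope. A sketch on a synthetic amplitude-modulated vibration signal (all parameters illustrative):

```python
import numpy as np
from scipy.signal import hilbert

# Sketch: envelope spectrum of a vibration signal (synthetic stand-in for the
# accelerometer signal measured on the milling tool)
fs, T = 10_000, 1.0
t = np.arange(0, T, 1 / fs)
carrier = np.sin(2 * np.pi * 900 * t)               # tool resonance
modulation = 1 + 0.5 * np.sin(2 * np.pi * 35 * t)   # cutting-induced modulation
signal = modulation * carrier + 0.1 * np.random.randn(t.size)

envelope = np.abs(hilbert(signal))                  # analytic-signal envelope
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency: %.1f Hz" % freqs[spectrum.argmax()])
```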

  8. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  9. Caprock integrity analysis in thermal operations: an integrated geomechanics approach

    Energy Technology Data Exchange (ETDEWEB)

    Khan, Safdar; Han, Hongxue; Ansari, Sajjad; Vishteh, Morteza; Khosravi, Nader [Schlumberger Data and Consulting Services (Canada)

    2011-07-01

    In heavy oil fields, thermal processes such as steam assisted gravity drainage (SAGD) and cyclic steam stimulation (CSS) are used to recover oil. Unfortunately, these methods can induce caprock integrity problems. It is commonly thought that to maintain caprock integrity the net injection pressure should be kept under the fracture pressure as measured by mini-fracturing tests, but cases have shown that failures can occur below this pressure. This paper presents a new geomechanics-based approach to evaluating caprock integrity in thermal processes. A case study was conducted on an SAGD pad development in northern Alberta. The results confirmed that caprock integrity cannot be guaranteed by injecting steam at a pressure lower than the fracture pressure; in addition, the geomechanical model can be used to take measures to avoid catastrophic events. A new integrated geomechanical approach to caprock integrity in thermal operations was developed herein and successfully tested in a field study.

  10. Operational Risk Management A Practical Approach to Intelligent Data Analysis

    CERN Document Server

    Kenett, Ron

    2010-01-01

    This book introduces modern Operational Risk (OpR) management and illustrates the various sources of OpR assessment and OpR mitigation. It discusses how various data sources can be integrated and analyzed, and how OpR is synergetic with other risk management activities such as financial risk management and internationalization. Topics include state-of-the-art techniques such as semantic analysis, ontology engineering, data mining and statistical analysis.

  11. Analysis of the Story of an Hour from Feminist Approaches

    Institute of Scientific and Technical Information of China (English)

    任立立

    2014-01-01

    The Story of an Hour is a short story written by Kate Chopin in 1894. It is mainly about a wife's unexpected response to the news of her husband's death. In this paper, the plot, main characters, and conflicts are first illustrated, and the story is then analyzed from feminist approaches. We see clearly that women at that time did not have the right to choose their own marriages, and the main character is just one of the victims.

  12. An Efficient Approach for Fuzzy Project Network Analysis

    Institute of Scientific and Technical Information of China (English)

    HU Jing-song

    2002-01-01

    In this paper we present a two-level linear programming method for determining latest dates and slack times in project network models whose activity durations are triangular or trapezoidal fuzzy numbers. Compared with the well-known fuzzy network techniques in the literature, the approach always produces meaningful latest dates and slack times. In practice, we have generalized the critical path method by accepting imprecise, fuzzy data for the durations of the activities.
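
    A sketch of fuzzy path durations with triangular fuzzy numbers, summed component-wise along each path and ranked by centroid; the activities and durations are hypothetical, and this illustrates only the fuzzy arithmetic, not the paper's two-level linear programming method.

```python
# Sketch: triangular fuzzy numbers (TFN) for activity durations; the fuzzy
# duration of a path is the component-wise sum, compared here by centroid.
def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def centroid(a):
    return sum(a) / 3.0

# Hypothetical activity durations as (optimistic, most likely, pessimistic)
paths = {
    "A-B-D": [(2, 3, 5), (4, 5, 7), (1, 2, 3)],
    "A-C-D": [(3, 4, 6), (2, 3, 4), (1, 2, 3)],
}
for name, acts in paths.items():
    total = (0, 0, 0)
    for a in acts:
        total = tfn_add(total, a)
    print(name, total, "centroid:", round(centroid(total), 2))
```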

  13. Similarity transformation approach to identifiability analysis of nonlinear compartmental models.

    Science.gov (United States)

    Vajda, S; Godfrey, K R; Rabitz, H

    1989-04-01

    Through use of the local state isomorphism theorem instead of the algebraic equivalence theorem of linear systems theory, the similarity transformation approach is extended to nonlinear models, resulting in finitely verifiable sufficient and necessary conditions for global and local identifiability. The approach requires testing of certain controllability and observability conditions, but in many practical examples these conditions prove very easy to verify. In principle the method also involves nonlinear state variable transformations, but in all of the examples presented in the paper the transformations turn out to be linear. The method is applied to an unidentifiable nonlinear model and a locally identifiable nonlinear model, and these are the first nonlinear models other than bilinear models where the reason for lack of global identifiability is nontrivial. The method is also applied to two models with Michaelis-Menten elimination kinetics, both of considerable importance in pharmacokinetics, and for both of which the complicated nature of the algebraic equations arising from the Taylor series approach has hitherto defeated attempts to establish identifiability results for specific input functions.

  14. Combined approach based on principal component analysis and canonical discriminant analysis for investigating hyperspectral plant response

    Directory of Open Access Journals (Sweden)

    Anna Maria Stellacci

    2012-07-01

    Hyperspectral (HS) data represent an extremely powerful means for rapidly detecting crop stress and thus aiding the rational management of natural resources in agriculture. However, the large volume of data poses a challenge for data processing and for extracting crucial information. Multivariate statistical techniques can play a key role in the analysis of HS data, as they may allow both the elimination of redundant information and the identification of synthetic indices which maximize differences among levels of stress. In this paper we propose an integrated approach, based on the combined use of Principal Component Analysis (PCA) and Canonical Discriminant Analysis (CDA), to investigate HS plant response and discriminate plant status. The approach was preliminarily evaluated on a data set collected on durum wheat plants grown under different nitrogen (N) stress levels. Hyperspectral measurements were performed at anthesis through a high resolution field spectroradiometer, ASD FieldSpec HandHeld, covering the 325-1075 nm region. Reflectance data were first restricted to the interval 510-1000 nm and then divided into five bands of the electromagnetic spectrum [green: 510-580 nm; yellow: 581-630 nm; red: 631-690 nm; red-edge: 705-770 nm; near-infrared (NIR): 771-1000 nm]. PCA was applied to each spectral interval. CDA was performed on the extracted components to identify the factors maximizing the differences among plants fertilised with increasing N rates. Within the green, yellow and red intervals only the first principal component (PC) had an eigenvalue greater than 1 and explained more than 95% of the total variance; within the red-edge and NIR ranges, the first two PCs had an eigenvalue higher than 1. Two canonical variables explained cumulatively more than 81% of the total variance, and the first was able to discriminate wheat plants fertilised differently, as confirmed also by the significant correlation with aboveground biomass and grain yield parameters. The combined…
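
    The two-stage pipeline can be sketched on simulated spectra as follows, with scikit-learn's LinearDiscriminantAnalysis standing in for CDA (both seek linear combinations maximising between-class separation); all data below are synthetic, not the wheat measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

# Hypothetical reflectance spectra: 60 plants x 5 spectral bands x 20 wavelengths,
# 3 nitrogen levels; higher N shifts each band's reflectance slightly
n_plants, levels = 60, np.repeat([0, 1, 2], 20)
bands = [rng.random((n_plants, 20)) + 0.1 * levels[:, None] for _ in range(5)]

# Stage 1: PCA per spectral band; keep components with eigenvalue > 1 (Kaiser rule)
components = []
for band in bands:
    pca = PCA().fit(band)
    keep = max(1, int(np.sum(pca.explained_variance_ > 1)))
    components.append(pca.transform(band)[:, :keep])
scores = np.hstack(components)

# Stage 2: discriminant analysis on the extracted components
lda = LinearDiscriminantAnalysis().fit(scores, levels)
print("canonical variates shape:", lda.transform(scores).shape)
print("classification accuracy:", lda.score(scores, levels).round(2))
```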

  15. Universal screening test based on analysis of circulating organ-enriched microRNAs: a novel approach to diagnostic screening.

    Science.gov (United States)

    Sheinerman, Kira S; Umansky, Samuil

    2015-03-01

    Early disease detection leads to more effective and cost-efficient treatment. It is especially important for cancer and neurodegenerative diseases, because the progression of these pathologies leads to significant and frequently irreversible changes in the underlying pathophysiological processes. At the same time, developing specific screening tests for the detection of each of the hundreds of human pathologies in the asymptomatic stage may be impractical. Here, we discuss a recently proposed concept: the development of a minimally invasive Universal Screening Test (UST) based on analysis of organ-enriched microRNAs in plasma and other bodily fluids. The UST is designed to detect the presence of a pathology in particular organ systems, organs, tissues or cell types without diagnosing a specific disease. Once a pathology is detected, more specific, and if necessary invasive and expensive, tests can be administered to precisely define the nature of the disease. We also discuss recent studies and analyze the data supporting the UST approach.

  16. Sampling and Analysis for Lead in Water and Soil Samples on a University Campus: A Student Research Project.

    Science.gov (United States)

    Butala, Steven J.; Zarrabi, Kaveh

    1995-01-01

    Describes a student research project that determined concentrations of lead in water drawn from selected drinking fountains and in selected soil samples on the campus of the University of Nevada, Las Vegas. (18 references) (DDR)

  17. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    Science.gov (United States)

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration.
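
    A sketch of fitting a marginal beta-binomial model to simulated study counts by maximising the marginal likelihood follows; the data are simulated, not from the meta-analyses in the paper, and only one margin is shown for brevity.

```python
import numpy as np
from scipy.special import betaln
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Hypothetical diagnostic meta-analysis data: diseased subjects n_i and
# true positives y_i in each of 10 studies
n = rng.integers(40, 120, size=10)
p = rng.beta(8, 2, size=10)                 # between-study heterogeneity
y = rng.binomial(n, p)

def neg_loglik(theta):
    a, b = np.exp(theta)                    # keep alpha, beta positive
    # marginal beta-binomial log-likelihood (combinatorial constant dropped)
    return -np.sum(betaln(y + a, n - y + b) - betaln(a, b))

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
a, b = np.exp(res.x)
print(f"marginal sensitivity estimate: {a / (a + b):.3f}")
```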

  18. A knowledge discovery approach to urban analysis: Beyoglu Preservation Area as a data mine

    Directory of Open Access Journals (Sweden)

    Ahu Sokmenoglu Sohtorik

    2016-05-01

    targeted researchers and practitioners, these can lead to the development of more informed intervention proposals. Thereby the knowledge discovery approach to urban analysis developed in this thesis may help to improve the quality of urban intervention proposals and, consequently, the quality of built environments. On the other hand, the implementations carried out in the thesis also exposed the major limitation of the knowledge discovery approach to urban analysis through data mining: the findings discoverable by this approach are limited by the relevant data that can be collected and accessed.

  19. Deterministic and risk-informed approaches for safety analysis of advanced reactors: Part II, Risk-informed approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Inn Seock, E-mail: innseockkim@gmail.co [ISSA Technology, 21318 Seneca Crossing Drive, Germantown, MD 20876 (United States); Ahn, Sang Kyu; Oh, Kyu Myung [Korea Institute of Nuclear Safety, 19 Kusong-dong, Yuseong-gu, Daejeon 305-338 (Korea, Republic of)

    2010-05-15

    Technical insights and findings from a critical review of deterministic approaches typically applied to ensure the design safety of nuclear power plants were presented in the companion paper (Part I) included in this issue. In this paper we discuss the risk-informed approaches that have been proposed to make a safety case for advanced reactors, including Generation-IV reactors such as the Modular High-Temperature Gas-cooled Reactor (MHTGR), the Pebble Bed Modular Reactor (PBMR), and the Sodium-cooled Fast Reactor (SFR). Also considered herein are a risk-informed safety analysis approach suggested by Westinghouse as a means to improve conventional accident analysis, together with the Technology Neutral Framework recently developed by the US Nuclear Regulatory Commission as a high-level regulatory infrastructure for the safety evaluation of any type of reactor design. The insights from a comparative review of the various deterministic and risk-informed approaches can usefully inform the development of a new licensing architecture for the enhanced safety of evolutionary or advanced plants.

  20. Contradiction analysis: towards a dialectical approach in ergonomics field interventions

    Directory of Open Access Journals (Sweden)

    Dimitris Nathanael

    2015-03-01

    The present paper is a methodological contribution to the ergonomics field intervention process. It proposes a perspective on work analysis based on the dialectical notion of contradiction. Contradiction analysis is proposed as complementary to more established work decomposition methods. The aim of including such an analysis is to frame the various heterogeneous determinants of a work activity in practical terms, swiftly and in a manner that preserves its multifaceted unity and essence. Such framing is of particular value when considering alternative design solutions, because it provides a practical means for anticipating the effects and side effects of proposed changes. The proposed method is inspired by two theoretical constructs: (i) contradiction, as used in Cultural Historical Activity Theory, and (ii) regulation, as developed and used by the francophone tradition of the ergonomics of activity. Two brief examples of its use are presented, and its usefulness, possible pitfalls and the need for further developments are discussed.